Factually! with Adam Conover - Your Ring Doorbell is Working with the Cops with Johana Bhuiyan

Episode Date: July 14, 2021

LA Times reporter Johana Bhuiyan joins Adam to explain how Ring built a private surveillance empire by promising kickbacks to an unlikely accomplice: the cops. Learn more about your ad choice...s. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Discussion (0)
Starting point is 00:00:00 You know, I got to confess, I have always been a sucker for Japanese treats. I love going down a little Tokyo, heading to a convenience store, and grabbing all those brightly colored, fun-packaged boxes off of the shelf. But you know what? I don't get the chance to go down there as often as I would like to. And that is why I am so thrilled that Bokksu, a Japanese snack subscription box, chose to sponsor this episode. What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds. Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Starting point is 00:00:29 Plus, they throw in a handy guide filled with info about each snack and about Japanese culture. And let me tell you something, you are going to need that guide because this box comes with a lot of snacks. I just got this one today, direct from Bokksu, and look at all of these things. We got some sort of seaweed snack here. We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
Starting point is 00:01:15 chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this is so much fun. You got to get one of these for themselves and get this for the month of March. Bokksu has a limited edition cherry blossom box and 12 month subscribers get a free kimono style robe and get this while you're wearing your new duds, learning fascinating things about your tasty snacks. You can also rest assured that you have helped to support small family run businesses in Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight to your door.
Starting point is 00:01:45 So if all of that sounds good, if you want a big box of delicious snacks like this for yourself, use the code factually for $15 off your first order at Bokksu.com. That's code factually for $15 off your first order on Bokksu.com. I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything. Hello and welcome to Factually. I'm Adam Conover. feed you ads or figuring out how to otherwise monetize it. And you know, that business model, I'm not gonna say it's all bad. However, the creation of all that data has risks and consequences
Starting point is 00:02:58 that those companies rarely think through. In fact, sometimes they drive straight into those chasms, head first, straight on, the consequences be damned. Let me give you an example. Take the home security company Ring. You might remember Ring from its star turn on Shark Tank a couple years ago. I love Shark Tank. Yeah, I don't care how communist you are. Shark Tank is a great show. Bernie Sanders himself sits at home yelling at the TV on a Friday night. No, no, no, Sharon. Don't go with Mr. Wonderful. He's going to screw you. Get a deal with Cuban.
Starting point is 00:03:28 You got to work with Cuban. Okay, that was my horrible little Bernie Sanders impression. I apologize for it. Look, Ring now is a massive success. One of the biggest successes in Shark Tank's history. I'm sure you know them. Your neighbors have them. They make indoor and outdoor security cameras and smart doorbells.
Starting point is 00:03:43 That's right, that little blue circle you see winking at you from all the doorsteps as you're on your afternoon walk, that's Ring. Amazon bought the company a few years ago for a billion dollars. And there are now millions and millions of Ring devices on doorbells, on the outsides of buildings, and inside our own homes, constantly recording video of us and our communities. Now, that is kind of dangerous. You know, if you get together that amount of private footage in one place, it becomes a pretty big target for some bad actors. So you need the people who are in charge of keeping all that data to be very, very careful about it. And Ring, let's just say, has not been. First of all, a couple of years ago, they had to fire some of their own employees after they were found
Starting point is 00:04:30 accessing customer video feeds they shouldn't have been. And then, of course, Ring's entire security system has been compromised again and again, allowing hackers to look in on people's private lives and in some cases, shout abuse at people who are watching TV in their living rooms. Now, all those stories are well covered, but it gets a lot worse because Ring has also partnered with law enforcement agencies around the country, giving them easy access to users' videos as part of investigations. You know, if the cops want to go search a private residence or private physical records of some sort, they have to get a warrant. But Ring has created a system in which the bar is much, much lower.
Starting point is 00:05:10 They have, in effect, created the private surveillance network cops could only dream of. And that's not just dangerous to the people who own the cameras. It's even more dangerous to the people on the footage who might be wrongly picked up or targeted by the police. And you know, as we know from pretty much all of American history, when police get extra powers to wield, they usually wield them at black and brown people first. And recently, there was a blockbuster story in the LA Times that exposed how Ring has been working with police departments, not just here where I live in Los Angeles, but across the country in a truly disturbing way. And I'm not going to reveal it to you now. I'm going to let the reporter who broke that story tell you herself. It's her second time on the show. She's
Starting point is 00:05:53 one of our very favorite guests, and I am thrilled to have her back. She writes about technology and accountability for the Los Angeles Times. Please welcome Johanna Buya. Johanna, thank you for joining us on the show once again for your second time. Thanks for having me. I'm so excited to be here. So last time you were on the show, we talked about the gig economy. We're not going to talk about the gig economy today. I mean, we can talk a little bit. Yeah. The gig economy pervades our entire society. Maybe it'll come up. But the reason we started talking again was you had a really blockbuster piece in the L.A. Times about ring doorbells, which are a I find to be a constant source of concern. They're a plague upon the lands. I see them everywhere I go. I'm being recorded by a little camera and I've always been concerned about it. So tell me, what is the issue that that you are covering with ring doorbells?
Starting point is 00:06:44 about it. So tell me, what is the issue that you were covering with ring doorbells? Yeah. So this was an issue specific to the LAPD, but to be clear, it's happening at police departments across the country. But basically Ring was trying to get a bigger foothold in LA, and they also wanted to prove that they were like an effective crime fighting tool. So they worked directly with the LAPD and they gave the LAPD at least 100 officers free ring doorbells or they have, you know, these other cameras that they have and solar panels for those cameras. And they would give these free devices to them, which at the time retailed for like $200 a pop. And then or coupon codes, you know, so discounts for these devices. And then they would encourage them or ask them to promote Ring to not just other officers in their station, but other officers in other stations, neighborhood watch associations, community members. So they basically, at the end of the day, were trying to get officers to kind of lend their credibility to this claim that like, you know, our Ring device will help stop burglaries or theft. And yeah, I mean, at least, you know,
Starting point is 00:07:47 15, we got a hold of 3000 or so emails between the LAPD and ring. And so 15 is like a very conservative number. But you know, in terms of being able to definitively say these officers after receiving a ring device promoted ring to other people, there were a little over 15 people who, you know, in their email said, oh, I did this. And so at least 15 officers did promote Ring after receiving a free device. More than 100, though, received free Ring devices, which in and of itself could run afoul of LAPD rules.
Starting point is 00:08:20 Yeah, I mean, my first question was going to be, is this in any way legal to give? I mean, just a private company giving away free stuff to like a free expensive electronics or discounts, which are, you know, not quite cash equivalent, but are getting closer to it in exchange for like services. Like, OK, so what what was their request specifically to the police officers? Hey, we'd love to give you a free doorbell in exchange. We'd love what or was it that much of a tit for tat? Yeah, it was. It was actually that was kind of the thing that really blew me away was the like very direct request of like, we want to give you this ring device so that you can get a feel for how good of a crime fighting tool it is. And can you please tell these community members how good of a
Starting point is 00:09:10 tool it is? Can you tell other officers how good of a tool it is? In some cases, officers literally, it was kind of like, you know, MLM style, like they would get a discount code and then send it out to a bunch of officers. And for every 15 uses of their discount code, they'd get another free device. Like there were some situations like that where again, it's like maybe not, they're not necessarily like unclear, like whether they push the message, but they did get people to purchase Ring devices using their personal discount code and then got free devices in exchange for that. So there was like a little bit of officers kind of like pushing the message. They're like, you know, there were emails of officers saying, I so believe in this product.
Starting point is 00:09:48 I'm telling everyone I know about it. There was one email where an officer said, I really love this product. I've been using it. I recommended it recently to people who were burglarized in the last few months. And I said, you should get this because it'll stop that. You know, there was, so there are people pushing the message, but there are also just people pushing the product. Well, I mean, you're making the LAPD sound like they're going around like Girl Scouts selling cookies, but civil servants armed with guns who have a really strong position of authority
Starting point is 00:10:15 in the community. And this is a national program? They're doing this in other cities too? So this is a program that they said, the ring said in response to our story that they stopped in 2019. So they no longer request officers. They don't no longer donate products to officers and request them to promote it in exchange. And that's their words. Right. So they like admit that that's what they were doing. But what's happening now instead is potentially more nefarious if you have concerns about surveillance, right? Like they, officers now have access to Ring's neighbors platform, which is basically a social network, kind of like a next door for Ring and other security cameras. So next door plus surveillance footage, right?
Starting point is 00:10:58 Like, so users can use surveillance footage, but there's a law enforcement portal that gives law enforcement like direct access to this feed of videos. They could request like the videos directly from the users on the platform. And that like, you know, if you don't really know much about the process, that doesn't sound like that big of a deal. But typically, in order for an officer to get that footage, they'd have to go through Ring, go through the company, and then get a subpoena or get a warrant. And there'd be a paper trail. You'd be able to say, here's how many times officers requested this information from Ring. And there might be a right of refusal or something like that from the user. There's just a much more judicial process here. And you have to get a judge
Starting point is 00:11:38 to sign off on it. Whereas this doesn't require any of that. You can go directly to the user. And if they say yes, you can get that footage. And again, there's this concept of consent. People are like, oh, well, if they say yes, then it's not a big deal. If an officer asked you for footage, would you feel comfortable saying no? Not necessarily. And so up until literally the day that the day two or three days after we reached out to Ring for comment on this story, But the day, two or three days after we reached out to Ring for comment on this story, Ring did not have any way of really tracking how many requests they got from different police departments.
Starting point is 00:12:23 And so, but on like, it was like a few weeks, a few days before we actually published it, they announced that they were finally allowing users to go into the app and see under each police department how many times they requested video footage. So this is like a new level of transparency that they hadn't had in the, you know, four or five years that they've had this platform. And when we asked LAPD, just to give you an idea of like how, you know, opaque it is, when we asked LAPD how many requests they have sent to ring for this footage, they were like, it would be, you know, too much of a burden to count. We don't keep track of it. It'd be too much of a burden to count how many times we've asked of it. It'd be too much of a burden to count how many times we've asked for it. So, cause I was going to ask like, what are the police getting out of this? Like, could I, if I've got some product, if I'm hawking my new, like kombucha energy drink, can I just like offer free discounts to cops and they'll start selling
Starting point is 00:12:58 it for me on the street? Like, are they really that hard up for money that any company is going to do this? And like, no, the answer is that in exchange for this ring was also what basically building like a back door that allowed them access to surveillance footage without having to go through the legal process that would result in a paper trail and, you know, transparency and reporting and all those sorts of things. Instead, they just have like a fast lane to getting vast amounts of user footage. I can understand why the police would want that. Like, yeah, I think genuinely like the short term perk for a lot of officers were like this free,
Starting point is 00:13:36 like high tech device. You know, this started in 2016. The emails we have are from 2016 and 2019. And in 2016 is when they were getting the products. And at that time, Ring was like fairly new, right? Like it was like the hot product. It was three years after they were on Shark Tank. Like it was an expensive product. It was like a cool product and they were going to get it for free. They wouldn't have to spend $200 on it.
Starting point is 00:13:54 So I think for a lot of officers, it definitively was this short-term perk. And it's unclear at that time how many officers were thinking about like, oh, well, we're going to be able to get access to this sort of like easier, less transparent way of accessing surveillance footage. But certainly in the end, right, the thing that Ring did give them was this like much easier, low barrier access to personal surveillance footage. to personal surveillance footage. Well, let me just ask in terms of, I understand why Ring would want to promote their product as being helpful for law enforcement. I've certainly seen tons of Ring videos on Nextdoor that's like the most popular form of
Starting point is 00:14:37 Nextdoor content is like, you know, my package was stolen from my front porch and here's footage of the package thief. I had a friend who bought a ring doorbell specifically because his packages kept getting stolen. And then what ended up happening was now he had footage of the packages getting stolen and nothing changed, right? He told the police I've got footage and they were like, wow, yeah, that's sure a guy stealing your packages. And like, that was, you know, that was it. It did nothing. And he eventually took it down. So my question is, are ring doorbells actually useful for stopping crime in any way that we'd be interested in? Well, there's no studies that prove it is, right? And that's kind of like the issue with surveillance footage generally, right?
Starting point is 00:15:21 Like there are so many studies that are like, I don't know if this really does anything. And when it does do, like there's like the added risk of actually misidentifying people, right? Like there are so many studies that are like, I don't know if this really does anything. And when it does do, like there's like the added risk of actually misidentifying people, right? Because that keeps happening to black and brown people. But Ring at the time, and this was kind of another part of the story is they did a pilot with the LAPD and that the numbers that they came, that came out of that pilot, that study was used in marketing materials all across the country for months, if not more than a year. But we saw the email that showed that Ring kind of exaggerated the number. So they basically looked at, they themselves say it was a randomly determined geographical zone that they called Wilshire Park, right?
Starting point is 00:16:03 And then they said, this is where we're going to give people a bunch of free rings. And so they took the number of break-ins in that area during those months of when they gave like six months of, I think it was like a six month period where they like gave people a bunch of rings and they compared it to the vicinity, right? Like just nearby neighborhoods that didn't, they didn't give free rings to, right? And so rather than just doing percent difference, right, like just like the very straightforward and there still would have been a reduction. Like, you know, if you did that math, they did some really wild thing where they like made did the percent difference between what the actual number was in the in the region where they had the pilot and what the number would have been if that changed.
Starting point is 00:16:46 Like it had the same percent. It was just very like not the way that you do math. And the LAPD was like, oh, yeah, that's fine. Sure, go ahead. And so what they found was that there was a 55% decrease in burglaries in the area that they had the pilot in. But the numbers were like nine to like five um and so that's a 44 difference if you actually do the math um just the way that they did math was very weird but even
Starting point is 00:17:11 that it's like not statistically significant it's such a small yeah it's 55 really there were four less where previously there were nine that's like extremely small sample size very clearly and then also there's no like indication of like what other correlations there were, you know, just like no. And like, what if some of the, like the nearby places also had rings? Like nothing, nothing like that.
Starting point is 00:17:34 But that statistic was used everywhere else. It was used within emails, officers who were promoting ring as like at the behest of ring in some cases were using that statistic. They're saying there's a 55% decrease in some cases, we're using that statistic. They're saying there's a 55% decrease in this place, blah, blah, blah. And it just shows you just like, you know, why a relationship like that can also have, I mean, the downfalls or the downsides of that relationship,
Starting point is 00:17:56 because you're giving credibility again to this claim of crime reduction that isn't really true, right? Like it's not true or it's at the very, very least exaggerated. Yeah. Now, I don't let me say, first of all, I don't like have a enormous problem with people simply like having a security camera. Like I understand a product that gives you a security camera with your doorbell, especially it's like useful to see who's at your door when you're in a if you're in a multi-story home, stuff like that. But I've also noticed like, there's been an immense proliferation of ring doorbells, at least where I live in Los Angeles, I walked down the street and I just see those glowing blue lights like pointed at me, you know,
Starting point is 00:18:38 and I'm aware that I'm on camera and, you know, it feels like it gives a different feeling than uh one of crime stopping you know i also here in los angeles a lot of scientology buildings and they're all covered with security cameras you know and when i walk by those i feel intimidated and i don't think those cameras are like up there to stop crime you know it's like creating a as sort of a fortress mentality or that sort of thing that's that's my personal experience of them. I'm curious in your reporting, what what what have you seen as the consequences of this like pervasive amount of surveillance that we now have on our streets? this specific situation, right? The issue with the cops accepting free ring devices, specifically a personal security device, and then selling it and encouraging people to buy it is the concern that they're going to use fear of crime to like help it to essentially act as a long arm of a corporation, right? And we don't want officers, you know, we want officers to be ethical. We want
Starting point is 00:19:39 to make sure that we can trust their advice. You know, we're not, we want to make sure that when they say, oh, this will help prevent burglaries, they're saying it because it actually will and not because they're, you know, acting on behalf of a corporation, right? So that's the concern with the way that the officers were acting. But I think, you know, on the personal end, right,
Starting point is 00:19:57 for individuals, this goes back to kind of what I keep harping on in all of my reporting. And it's something that I'm really trying to focus on is the stakes of surveillance, right? Like, of course you want to be safe. Like, of course you want to make sure that your personal belongings or your packages, like, get to your home and nobody's stealing it. A hundred percent, that makes sense, right? But the way that these systems often are disproportionately targeting black and brown people can be really dangerous, right? And so the, you know, it's sort of like balancing personal safety with sort of a personal
Starting point is 00:20:33 responsibility to people who are disproportionately targeted by law enforcement and surveillance systems and understanding how those two things kind of interact, right? Like I'll often like ring neighborhood watch period without technology has historically criminalized black and brown people. There are people who will see a black person or a brown person walking down the street and immediately either immediately assume or think, you know what, maybe I'll keep an eye on them just in case they do something, you know? Half of the videos that you see of people posting ring doorbells on next door are literally just someone saying, I saw a black person in my neighborhood. I mean,
Starting point is 00:21:09 this is, we could have a whole other episode about next door and their problems with this issue. But like, I've seen those posts myself. I've seen my neighbors, you know, post such things. And like, it's very, if you have your eyes open, you see it happen. Yeah. And so, well, the thing with this partnership with Ring and the LAPD is, and other police departments, is not only are you going to say, you know, I'm, you know, Mary Jo from a small town in Minnesota, and I see a black person in my neighborhood, and I'm like, you know what, that person looks suspicious. I don't know.
Starting point is 00:21:38 I've never seen them before. Now, the police have direct access. Like, you have a very easy way of, saying, going to the police and being like, hey, this person is suspicious. And then the police might then go after this person or keep an eye on this person after literally doing nothing, right? They're literally being criminalized
Starting point is 00:21:54 and watched and surveilled simply because they looked suspicious and the basis of their, you know, suspiciousness is being black or brown. And so, you know, it's, it, you're adding basically like convenience, right? The way that tech democratizes everything and it makes everything more convenient. It also has made targeting vulnerable groups much, much more convenient. Yeah. Wow. I just want to return
Starting point is 00:22:20 again to the point that you made, because I asked you a question that you're like i just got to make this point about the police and i want to like emphasize it the fact that the the company i think we all know intuitively that using fear to sell a product is wrong you know you see one of those commercials that has a little old lady saying i don't want my family to be saddled with my funeral expenses when i die so i bought this ripoff life insurance, you know, like, oh, this is exploiting an old person's fear of being a burden. We don't like this kind of ad. We have a revulsion against that kind of thing. So first of all, Ring about like just very validly has been selling their products using that. But then to employ, you know, public servants who are in a position of privilege and power over the issue of, uh,
Starting point is 00:23:08 of public safety and crime that are in a prime position to exploit people's fear is, yeah, that's deeply unethical. Um, and really, really concerning. Um, uh, I'm sorry, I could go on about it for a while, but thank you for reminding me why to be mad about that. Yeah. And I mean, you know, you asked earlier whether it's illegal and like the police, the LAPD in particular, have a code of ethics and their ethic code of ethics specifically prohibits them from accepting gifts that would even give the appearance of impacting any sort of city business. And also there's a separate rule that says that you can't use your position to like ingratiate yourself essentially. Right. And, and like this situation is both those things in my mind. Right. And when they first, when they first responded, they were like, you know what, we haven't looked through all 3000 emails. These
Starting point is 00:24:00 are, these emails came out of a public records request. Right. So someone looked through it, but they said, we didn't, we haven't gotten a chance to look through all of it, but upon These emails came out of a public records request, right? So someone looked through it. But they said, we haven't gotten a chance to look through all of it, but upon preliminary review, it doesn't look too bad, right? But then there are emails where an officer literally is communicating with a ring representative and is like, oh, yeah, I haven't been able to convince my neighbors of getting a ring yet. They're elderly and they are really scared of technology. So I've just been watching them to see when their adult children come by so that I can convince their adult children to convince them.
Starting point is 00:24:32 And I'm like, that's not unethical. But LAPD officers who had already gotten ring devices emailed Ring and were like, hey, we're having a family picnic. You know, we usually like scrounge up some money to get some like measly like raffle gifts or whatever. Do you see where I'm going with this?
Starting point is 00:24:49 And then the ring representative was like, oh, yeah, I can like get you a free device to raffle out to your family members. Wow. And so, you know, a few weeks after or about a week after we published the LAPD police chief did talk to the police commission and they were like, we are investigating this and we want to make sure that officers actually really do realize that you can't accept gratuities that like, you know, compromises your position as a civil servant, which is like probably the right move. I mean, you can, there's a lot of people being like, well, the LAPD is going to investigate themselves and they're not going to find any wrongdoing.
Starting point is 00:25:24 You know what, at the very least least like they're giving the impression of you know some sort of accountability yeah god um thank you well thank you for doing the investigation to like actually spur any kind of action on this but the like baldness of the transactional nature between ring and the cops is like, it's truly shocking to me that it would be that it would be that blatant. But like, let's return to the point of, you know, black and Brown folks and other marginalized communities being affected by this. I mean, the, I don't know. I've,
Starting point is 00:26:03 I've seen it again in my own community. Like a couple months ago, there was a whole series of events that happened on my street, which is that in my little complex, one of my neighbors was like, hey, I saw kind of a weird guy on my porch. And there was a person of color. And he didn't do anything. He was just like kind of being weird, kind of erratic on the porch. You know, we live in the city.
Starting point is 00:26:23 There's unhoused folks around. There's folks, you know, there's los angeles you know what i mean yeah um and i was like hey this looks looks like no cause for concern i hope somebody i hope this guy gets some help right um and then like half an hour later we hear helicopters around and uh it turns out that a bunch of cops roll up pull out their guns on this guy like a bunch of cops all at once helicopters the whole nine yards and then uh because i think someone further down the street had like seen him on their ring doorbell and called the cops and then there's multiple people on the street filming the whole thing on citizen and this was
Starting point is 00:27:00 just a guy walking around with a stick you know like nothing actually happened on the street you know like nobody's nobody's house was broken into he didn't try to jump a fence he was just a guy walking around with a stick, you know, like nothing actually happened on the street. You know, like nobody's nobody's house is broken into. He didn't try to jump a fence. He was just like being weird on the street. This is on my street. And like all this was spurred by people sitting inside their houses, looking at these cameras, jumping to conclusions, having, you know, fear based reactions. But fear based reactions that were like given to them by the technology. Yeah, no, for sure. I mean, I think like it's, it's kind of wild to look at the history of Ring and, and, and use it as sort of like a mirror of like where society has gone because Ring started as, you know, purely a convenience thing, right? Like Jamie Siminoff, the CEO of Ring went on Shark Tank and was like, I was always working in my garage and I couldn't
Starting point is 00:27:44 see who was at my front door. So I made this Ring doorbell. And also it actually helps me like watch my packages. Right. But like, it was like purely like convenience. And then they made this like really, really sharp turn into crime and like made this, it made it all about how you can like, you know, protect your community, protect your own home. And, and I, yeah, I think capitalized on, you know, a growing fear of black and brown people, honestly, like other people. Right. And, and you're seeing this so much in tech, you mentioned Citizen, there are so many other companies that are kind of capitalizing on this, like real fear of crime. And I, you know, oftentimes you can trace it back to like one or two things, right? Like counterterrorism stuff, like anti-Muslim stuff post 9-11, right? Like there's like a moment of crisis that sort of spurs a lot of this stuff, you know, insurrection, for example, like people are like calling for, you know, all this facial recognition technology in response to this crisis. Even people who like are against facial recognition technology are like, you know,
Starting point is 00:28:43 let's use it on the bad guys. The issue is we have seen historically that when the government and law enforcement get to choose and tell us who the bad guys are, oftentimes they're black and brown people. Oftentimes that infrastructure that was, you know, introduced to respond to a very specific moment or a very specific crisis are just used disproportionately against black and brown people somewhere down the line, you know? And so, yeah, I think like, there's just like this real fear of crime. And there's, you know, in a lot of cities, there's like actual crime reduction happening right now, right? So it's like not really based in logic. It's just like, it's a great selling point. How do you sell a security camera? Not just for your packages, but also for, you know, safety generally. Burglary. Right. When Ring was promoting its devices in L.A. or to the LAPD, it wasn't just package theft. They were like specifically talking about burglary and like home break ins, which is a very different thing than what they like launched as.
Starting point is 00:29:44 home break-ins, which is a very different thing than what they like launched as. Right. So they're selling crime, like they're selling crime prevention without like, and without any real, like there's not, it's not clear that there was like any real basis for that fear in the first place or increased fear in the first place. And the weirdest thing is I've always felt that these products that are fear-based, like the purchase is fear-based. It's like, hey, aren't you afraid of your home getting broken into? So buy this product. The products almost always create more fear in the people using them. That like everyone I know who has a ring doorbell is constantly, I got it. There's someone on my porch. What is it? I got an alert. And like, you know, if some weird guy wanders by my front door, I don't know about it, you know, and if they try to break in while my door is locked, you know what I mean?
Starting point is 00:30:29 Like it like the locks work pretty well. And the constant vigilance that it gives you. Citizen is another example of this where people are like, like, I mean, there's been plenty of reporting about how citizen like specifically juices their notifications to like constantly keep people like addicted to the, to the phone. And, and, you know, they, they like come up with things to make alerts, suspicious person seen in, you know, just to like, maybe we'll go, Oh, watch out, watch out. Oh, there's a person a couple of blocks away. Watch out, you know, to make them feel like they need the app where like, we didn't need to know about these non-events happening around our community and having them presented in this way just like hooks us on this fear receptor.
Starting point is 00:31:10 And so you end up like it's like the cure is worse than the disease, you know? Yeah, no, I mean, and so I haven't reported a ton on Citizen, but like the reporting on it is a great, great example of like the extreme version of this, right? great, great example of like the extreme version of this, right? I mean, when I did have Citizen, because I was testing it out when it first came on or first launched, I thought like, oh my God, like San Francisco and New York are just like trash cities. Like I'm from New York. I was like visiting San Francisco all the time. So I had alerts for both of them on. And I'm like, what? Like there's a man taking his like dick out in the middle in a parking lot of McDonald's. That's horrible. But again, like you said, like, why the hell do I need to know that like why do I need to know how does that impact me at literally at all and like they have my location right like I don't need to know that um but yeah
Starting point is 00:31:54 I mean you know there was that that recent the reason why citizen came out like was like in the news recently I mean the CEO saw this man who put like supposedly could have committed a crime, like basically was like like foaming at the mouth to like catch this man and like put all of their resources to catch this man and publicized it. And it ended up not being the guy who committed the crime. Yeah, they put out a fake like APB to everyone on Citizen. We're looking for this man. They had like I think like literally like news anchors like on the app everyone on citizen. We're looking for this man. They had like, I think like literally like news anchors, like on the app talking about it, like our man hunt for so-and-so and it wasn't the right person.
Starting point is 00:32:33 It was, it's like despicable and, and like citizen employee, the vice did a lot of great reporting on this about how, about how also specifically like the people who run the notification system, there are specifically encouraged to like juice the notification reports and all that stuff. I mean, that's just like, and the original name of the app was Vigilante. Like the whole thing is, again, we could go on about it forever. This is not a thing that the tech industry was doing five years ago or that I thought that they were doing in the early days of really exploiting people's fears of crime and specifically people's like false fears of crime. Like our cities are, you know, there was a crime spike during the pandemic, but like compared to like the 70s and 80s, like every city in America is vastly safer than it once was.
Starting point is 00:33:25 And, you know, if you got good locks on your door, you're pretty much, you know, like if anyone's a victim of a crime, we've been listening to this.
Starting point is 00:33:33 I'm not discounting that experience. But if we look at the numbers, like our fear as a society is not in pace with the actual reality on the ground. And these products
Starting point is 00:33:43 are specifically trying to create more fear in people. Yeah. and using the cops to do it. Yeah. I think like, to be fair, like, again, like you were saying, there's, if you have been burglarized and, or if you have been a victim of a crime or anything like that, of course, you're going to try to do things to ensure that it doesn't happen again. Right. And I, and I totally respect that. And I think there are a lot of products on the market that do help you do that. I think all it. Right. And I, and I totally respect that. And I think there are a lot of products on, on the market that do help you do that. I think all it is, is that like,
Starting point is 00:34:08 we want, don't want to make it easier for the officers. You know, we don't want to create a system where officers just like are freely able to access all of that information and data with no transparency, no bureaucracy, no clear due process. Right.
Starting point is 00:34:22 It's not, it's not that we're saying, Oh, like also like if you don't feel safe you shouldn't do anything about it we're saying well should we have like more guardrails for how you know officers law enforcement the government whoever else other private companies have access or can access that information right so that's like the big thing right because a lot of people their response will be well you know on the one, like we want to make sure that we're safe.
Starting point is 00:34:45 On the other end, like if you're not doing anything wrong, then why does it matter if people are watching you? Right. And both those arguments, like to an extent, make sense. But with on the on the second part of it, it's like, well, do you want to feel like you're constantly being watched because you are brown or black? Like if you if we decided that, like, oh, well, actually, not we decided we have, the government has decided that domestic terrorism is actually one of the biggest threats to the country. The FBI has released reports about this. And so if we decided to then like, you know, broadly discriminate against people who like look like the insurrectionists, and we were constantly watching them based on just what they look like, like at a certain point, you're going to be like, you know what, this is actually bad. Like, it's not great to just watch people based on what they look like. So I think those are like the two important points, but you mentioned, you know, like tech companies haven't, weren't really doing this five years ago. I think like similarly,
Starting point is 00:35:37 what I was saying about like the ring pivot, right? Like it was a lot of companies kind of came about, they were either subtly responding to that, like they were either subtly creating services that were like next door. Great example, right? Like they weren't actually like launching as like a crime fighting thing. I think what they saw was that it turned into that organically. Like people turned that into like this sort of neighborhood watch crime thing. And I think what's happening right now is a lot of companies are responding to that, like desire to like make sure that like, you know, all crime in their neighborhood is stopped.
Starting point is 00:36:12 And Citizen for sure is an example of that. But I think, you know, and again, I haven't done reporting on this. This is just literally based on like motherboard reporting, like kudos to them, amazing publication. But this is what happens when you privat. Amazing publication. But this is what happens when you privatize law enforcement. Like this is what happens when you create a really weird private police that have like private and like like business incentives and motivations.
Starting point is 00:36:35 Like they are going to like try to get the criminal to prove that their service is amazing, to prove that, you know, law enforcement should work with them or that like they're weird little like network of, I mean, they basically have private police cars, right? Citizen is doing that as pilot programs. I don't know the current status of them, but they at least had been running them and had bigger plans to roll them out
Starting point is 00:36:57 of like basically a private police force in Los Angeles, maybe other cities that would be like, you know, we've got, there's plenty of private security forces driving around all the time, armed to various degrees. But that was going to be like, I guess on Citizen, you could like summon like a Uber fake cop to if you saw a scary person, you press the button and they show up if you pay a fee like looks like that was the business model that they were going to. And that's like there doesn't seem to be anything illegal about it.
Starting point is 00:37:27 But it's deeply frightening, the idea that like especially because, you know, you say that these companies are are, you know, they saw that it was people in the neighborhoods who are using the products this way. And, yeah, it's certain types of people. It's fearful, paranoid, often very comfortable people, you know, the sort of folks who, you know, peek out from behind the curtains and say, there's a man I don't recognize, except now they have technology to record those people, blast it out on social media, maybe summon a fake cop one day. Like, and, and it's, you know, a lot of times these products are being marketed to paranoid racists. Like, and, and it's, you know, a lot of times these products are being marketed to paranoid racists. Like what else, what else can we say about it than that? And it's, it's deeply
Starting point is 00:38:12 weird. Yeah. I mean, I think I'm like, so yeah, I live in, I live in the Bay area. I'm from New York, right? Like largely diverse cities. Right. So of course, and obviously this is a podcast, like I wear hijab. Like I am like very vividly and explicitly Muslim. And in those cities, I don't really feel like that targeted or I don't feel really ostracized or anything like that. Of course, I get stares. Of course, there's been situations, but much less than in other places. I'm on vacation in Minnesota right now. And downtown Minnesota, downtown Minneapolis, all of that, like very diverse, you know, but I'm in the suburbs of Minnesota. I'm in like northern
Starting point is 00:38:49 Minnesota right now. And never have I felt like so fearful of the way that people are reacting to me. And imagine that like constantly and imagine someone being like, oh, I actually now can call this like, I can like on like via app, Uber Uber and a cop right now to like follow this person and like see what she's up to. Like I went for a run in the neighborhood and I felt like I was worried what people might think. And I was worried people might call the cops on me because I'm not from that neighborhood and I'm wearing a hijab and it's just something that they're not used to. Like it's a very real, real fear. that they're not used to. Like, it's a very real, real fear. Like if I'm in a neighborhood, honestly, with like tons of security cameras, I'm just like, I'm going to run through the middle of the street. I'm not going to get too close to the houses because I know that a lot of people do have
Starting point is 00:39:36 this like very real fear of anyone who doesn't look like them, black and brown people. They're just like constantly on alert about anyone from outside their community. And it, again, it's like, it's, it's real. Like it's like a, it's a real fear to know that like you are being watched and one wrong move, one thing that could look potentially like, I'm like, I don't even try. I literally don't even look at the homes. Cause I'm like, I don't want them to think that I'm like scoping their homes out. I don't want them to think that like, oh, I'm going to come back later. And so you like you, you know, I'm like I'm an American. Like I was born in New York.
Starting point is 00:40:13 I'm born. I was born in Queens. Like, why should I be walking in any part of America and feel that way? That's kind of the situation, you know, of course, for black people more than me. You know, like that's the situation and that's kind of the environment that we've created. And that tech, I'm not going to say is like the core of it and the root of it and the cause of it. But I think like some tech companies have profited off of it. Some tech companies have enhanced and enabled this sort of like fear based environment. And when you're jogging on that street and you look up at the houses and you
Starting point is 00:40:45 see that row of little blue circles staring at you, like that's all the more intimidating and it's all the more a marker of these people are afraid of people outside and they're putting up a threatening front. And that's, in fact, part of the point, because that's mostly the point of any security camera. Well, look, I really want to ask you. We've talked a lot about law enforcement. I want to talk more about the privacy implications for ourselves as a society about this. But we got to take a really quick break. We'll be right back with more Johanna Booyah.
Starting point is 00:41:25 OK, we're back with Johanna Booyah. We've talked extensively about the discriminatory potential and actual reality of, you know, this sort of wide-scale consumer surveillance. I want to talk about the privacy implications of it. I mean, ring has had a lot of press for the last couple years for like they're generating this enormous amount of surveillance footage um that is you know stored on their servers but they have apparently in the
Starting point is 00:41:58 past event like very lax security protocols for what to do with it like they're basically creating huge amounts of very, very volatile data that they seem to be very bad at protecting. I remember there being a lot of news about security holes. Do you share those concerns? I mean, is that like, that's a way in which they're dangerous to even the people who own the devices?
Starting point is 00:42:18 Yeah, I mean, I think there aren't enough regulations and policies about how long you can store, you know, data and personal information. You know, like if there are hacks, like, yeah, I think I am really concerned about the security of all of that. I do also think that like privacy does, you know, we talked a lot about law enforcement, but privacy, like law enforcement access to things is also a privacy issue, not necessarily for the user, sometimes for the user itself, but for the people walking around. You're like, I'm just going to walk
Starting point is 00:42:47 around, now my image is going to be in some police database, or it's going to be in a ring database that police can access at any point. And so, you know, I think there's privacy implications on all sides of this. And of course, like, hacking anything is just about, like, how much money you have
Starting point is 00:43:04 and how motivated you are to hack it and you can hack it. Any cybersecurity firm will tell you that. Like it's not a matter of like whether they are going to be able to hack it. It's about how much time they have, how much resources they have, and if they really, really want to hack something. So it is really important for there to be policy or privacy regulations that specifically talk about how you can store information, how long you can store that information. And I think, you know, going back to what I was saying before, this is again kind of a part of like personal consumer willingness to give your information to companies and then just kind of deciding like, you know what, it's a cost of doing business with these
Starting point is 00:43:39 tech companies for them to take my information and do whatever the hell they want with it. You know, we're only seeing in places like California, like the CCPA and stuff like that coming about right now. It's still not perfect. It's not, companies are still resisting it. But I think a lot of it falls back to the consumer. The consumer not really feeling like it's that important anymore or feeling like, maybe not that they don't feel it's important.
Starting point is 00:44:04 Maybe they feel like it's just impossible to to reel it back in. Like the beast is out. There's no way to put that privacy beast back in. We're never going to have privacy again. But what my goal is with my beat is just to like consistently emphasize that it's so important to at least try to regulate how our data is used because the stakes are the highest for the most vulnerable communities. Like it always comes back to that. You know, I did a story recently about ICE requesting information from Google. And so basically what happens is there's a legal request process. And this is the same legal request process that the LAPD would have had to go through to get footage from Ring.
Starting point is 00:44:50 But you can, you know, there's like if it's a federal agency, there's like national security requests. If it's a local agency, they have to do like subpoenas and warrants. But oftentimes, like, you know, tech companies aren't super incentivized to say no to these requests. Like, why would they? And so in this case, ICE requested the information of a user. We got a hold of an email where Google reached out to the user and said, like, the DHS requested your information. You have seven days to get a court-ordered motion to quash this subpoena, or else we're probably going to give up all of your information. And this is their Google account. So not Gmail, not one specific Google service, Google, everything, Google Maps and whatever.
Starting point is 00:45:32 All their searches, all of their driving direction history, all of their, like these are people's entire lives. Yeah. And so, you know, Google will say that they gave them the opportunity to fight it, but who has, not everyone has a lawyer on hand. Not everyone knows what it means to get a court ordered motion to quash. And I saw this email. If I saw this in my inbox, I would have been like, oh, some weird terms of service.
Starting point is 00:45:54 Like, yeah, it's a phishing attack or something. Yeah, totally. And so there's just so many barriers to being able to quash that information. And again, it goes back to, oh, like, why wouldn't I give Google online information? Why does it matter how long Google stores that information? Why don't we have rules, you know, over whether like how much information Google is allowed to give to law enforcement or how do we like, it doesn't matter if we try to incentivize, like, why should we incentivize tech companies to not give our information up? And that was, you know, this was a very like unique case where ICE was
Starting point is 00:46:25 using an administrative subpoena, which is different than a regular subpoena because there's no judge, right? So it's like they come up, it's like their own subpoena. They don't go to a judge. They're just like, hey, like we want your information. And, but still like to any like lay person, you're like, oh yeah, I'm being subpoenaed. I have to give my information. But there's actually no, you know, judicial oversight over it. It's not self-enforcing. But in most cases, like if it's a federal government or federal agency, like asking for your information, they do it through national security letters or any other methods like that, that typically come with a gag order. It typically comes with
Starting point is 00:47:04 like a year long gag order. So typically comes with like a year-long gag order. So you actually never know, like for a year or so, even more, because they might extend it if your information is being given up. So that's why it's so important to like care that your personal information is being stored forever, like on these like cloud servers of all these major tech companies. So this is like, I read this piece by you, but hearing you describe it like blows my mind more about it because you're describing like, I mean, we have a presumption of being able to defend yourself in America, you know, from something along these lines. That's why we have search warrants, right? That like there's a process if the government wants to
Starting point is 00:47:42 come to my home, presumably they need to go to a judge and they need to prove why they can come into my home and go through the whole song and dance. Right. Before they are actually able to. At least that's what I understand from TV. That's like that's like what we it's like part of our civic education in America that that's required. Like what we, it's like part of our civic education in America that that's required. But this is like an agency coming to your digital home, coming to actually someplace that has a lot more private information than your actual home does. It has your entire search history. It has your entire email history.
Starting point is 00:48:20 It has your entire, like your whereabouts. All those sorts of things. It has photos, potentially thousands and thousands of photos. And what the government just comes and says, they don't need a warrant. They don't need anything else. They just say, give us this. And Google literally says, you have to go get a court order to stop this. And yeah, how would you go about doing that? Like if I go to the LA County courthouse, is there a Google court order desk that I can go to? Like, who do I call? I have a lawyer, but he's an entertainment lawyer and he doesn't get back to me within a week about anything. So I think I would be up Schitt's Creek as well if I were to if I were to have this.
Starting point is 00:48:56 Like there's a there's a presumption of innocence and a presumption of people can't just go through my shit in America that this really appears to violate. Yeah. Like the only real like analogy to this is, you know, cops, some government agency, whatever, knocking on your door, presenting you with a subpoena, saying I have to go through your home. Right. But that again, like you can't fight it. You don't really get notice about it or whatever. But you can you know what's happening. Right. And you see them going through it. Whereas this is just like your information, like who the hell knows, like the CCPA, if you're in California, you could be lucky enough to like request your data and see what information they have on you. But like, you don't really know what information they're looking
Starting point is 00:49:36 through with an administrative subpoena. The one thing that I should say is that they can only request a specific type of information, subscriber information. So they're not really supposed to ask for location and stuff like that. We got a hold of the actual subpoena for the story, and they were asking for location and stuff like that. And it's just a matter of Google deciding whether or not to give it to them. So they're not supposed to ask for it, but if Google gives it to them, then Google gives it to them. They didn't do anything wrong.
Starting point is 00:50:03 But an administrative subpoena, the reason why you're able to get information about it and stuff like that, again, is because it really should not go beyond subscriber information. And there's no gag order affiliated with it because it's not reviewed by a judge at all. But yeah, it's just such a black box of a situation, and you have very few remedies for it. And those two things really matter. Like they really, really matter. Particularly, you know, this is ICE, right? So conceivably they're going after an immigrant. Conceivably they're trying to use this information, you know, in order to detain this person or whatever it is for some
Starting point is 00:50:42 ICE investigation. And so that person is a vulnerable person. That person is in a vulnerable situation, and this government agency, this law enforcement agency, is being given the tools to potentially detain this person or investigate this person or surveil this person by Google. Like Google is giving them the tools to do that. Right. And it's like, what do we expect tech companies to do in response to that? Right. Like, of course you have to work with them, especially if it's a criminal situation. It's like, what are they going to do? They get subpoenaed, there's not always a ton for them to do. But we have seen situations
Starting point is 00:51:23 where Twitter has fought back when, you know, I forget what agency it was and all the details of it, but Twitter has fought requests for information about one Twitter account. I think it was like the alt DOJ or the alt account of some government agency or something like that. So there have been instances where they will fight it. And we want to make sure, like, again, this is about consumer behavior. We should make sure as consumers, as individuals, that we emphasize to these tech companies that our privacy is paramount. Like our privacy really, really matters to us. And we're not willing to just give up any semblance of privacy just to use your services. Because in those cases, right, the tech companies will have to be like, actually, in order to at least give the impression that we care about privacy, that we're protecting our users' privacy, we should
Starting point is 00:52:08 fight this. Because in their response to me, and then in that story, they're like, yeah, we really care about people's privacy, blah, blah, blah. And it's like, well, you might. I'm not sure, they were limited in what they were able to offer, and lawyers I spoke to also weren't sure if they were able to get more time to quash this motion. But, you know, Google could ask for more time. Like, yeah, just try. We want to make sure that as consumers we're able to incentivize tech companies to fight for our privacy as well. I completely agree with that.
Starting point is 00:52:39 But we have limited power as consumers, you know. Like, I mean, I personally, a couple of years ago, de-Google-ified. I don't use Google for anything. I have a separate email service. I use DuckDuckGo for my searches. And I try to keep stuff a little dispersed, because I don't like everything all being in one place. But I know there's somewhere, some repository I'm not thinking about, that has a lot of private information. And the problem is, I don't know how all these different companies are storing this information. Like, for instance, the company that has most of my shit is Apple, right? And Apple has responded to the growing groundswell of desire for privacy by really being privacy forward in their marketing and saying, we're the privacy company.
Starting point is 00:53:18 And their privacy practices actually are, to some extent, better than other companies'. They have the most secure instant messaging and texting service. You know, the iPhones are truly encrypted. And, you know, there was a big thing a couple of years ago about Apple refusing to unlock, you know, build a backdoor for the FBI and other federal agencies. And so, like, it's better than other companies. Right.
Starting point is 00:53:44 But to what extent? I don't know, you know, how many law enforcement requests they complied with. I have no idea. And despite that, hey, that's maybe 40% of my shit that's on Apple. A whole bunch of it is on companies where I have no understanding of their data control policies. You know, like Ring again, at some point appeared to just be putting all the videos on some unencrypted server somewhere where anyone could grab them, or I forget what the story was. It was like any Ring employee could look at any video from anyone at any time, because there was no, you know, there was no encryption on Ring's side, which is an obviously massive, awful security hole. But there are no laws around any of this. There's no regulation.
Starting point is 00:54:27 And so any one of these companies can just keep a big sloppy bucket of my data out on the shop floor, ready to get kicked over whenever any klutz walks by. And I have no control over it, nor do I even know which ones have good policies or not. And what this is highlighting for me is how much we desperately
Starting point is 00:54:45 need actual regulation around user data, because we're basically allowing all these companies to stockpile large amounts of hazardous material that is, you know, a harm, right? When you have enough of other people's personal information in one place, it becomes a target for hackers, for law enforcement, for bad actors of any kind. Not that law enforcement's always bad actors, but often we want to be protected from law enforcement. It becomes a target. It becomes a honeypot.
Starting point is 00:55:18 It becomes something that those people want. And so what is done with that information is like really paramount. And we need like some basic standards in our fucking society around it. Like, we have to care. Like, we have to care what companies are doing with our data. We have to care where they're putting it. And we have to care who has access to it. Because it might seem like, oh, we have no way of, you know, bringing this back in.
Starting point is 00:55:52 We have no way of engaging in today's society with technology without giving up that data. But actually, there are regulatory methods to at least create guardrails. Companies may run afoul of them, but at the very least we'll have a means to keep them accountable. And part of it is policy, and part of it is us giving an F as a society, because I think that's the real issue here. There is some movement right now with policy and stuff like that. But in response to almost every single one of my articles, I will always get people being like, why does it matter that they have our information? Like, how else are you going to live?
Starting point is 00:56:34 Or like, why does it matter if people are watching if you're not doing anything wrong? I'm like, it matters. It matters. It matters. It may not matter for you today, but it could matter for you in a few years. And it does matter for so many vulnerable people. I mean, you know, I mentioned like counterterrorism efforts after 9-11. Like
Starting point is 00:56:51 one story that I'm looking at right now is just how so many of those surveillance tactics, like the surveillance playbook that was really expanded post-9-11, were used disproportionately on Muslims for years and years and years. But so many of those tactics are now being used on Black and brown people. So things that get introduced in moments of crisis will then be proliferated to the rest of society and oftentimes disproportionately target Black and brown people, immigrants, queer groups, any marginalized groups. Yeah. I mean, the idea that we can't do anything about it is so weirdly pervasive, and it's bizarre, because the only reason for it is that the tech industry, the internet,
Starting point is 00:57:30 everything that comes along with it has only been around on a consumer level for what, 30 years. And so there were no laws about it because it didn't exist yet. And we just need to write some, like we have them in other areas, in the medical field. We have HIPAA regulations that very carefully dictate under what conditions medical information can be taken. You know, the group I do homelessness services with, we're very cognizant of HIPAA regulations. Whenever we're taking anyone's medical information, we actually avoid taking it for that reason, because we know it's a hot thing.
Starting point is 00:58:03 Right. And anyone who's dealing with that knows the same, you know, talk to any social worker, anyone else. And that's because, I don't know, at some point we in society were like, oh yeah, this is something that we need to make sure everyone's very careful with. And we passed a couple laws. And I think we all have a presumption about our doctor's office that they're going to, you know, treat our medical records with confidentiality. And like, why can't we have the same expectation around our doorbell footage or anything else?
Starting point is 00:58:36 I think the reality is, people don't know how high the stakes are. They don't know how high the stakes are for a lot of people, and the stakes may not be high for them right now. And so, you know, my hope is that if I continue to highlight human stories and the human impact of all of these surveillance issues and privacy issues, people will start coming around to it. And it's not just me. There's so many amazing reporters doing this work. It's just, you know, you talk about surveillance and privacy and people's eyes glaze over. So, you know, we've got a lot of work to do, but I think, yeah, it's just a matter of getting across really, really,
Starting point is 00:59:10 really how much harm could be caused to so many people if you don't start regulating a lot of the ways our data is being used. Well, and the point that you make about it, it might not harm me that much, right, to have the Ring doorbell out and to use an unsafe service that is giving the footage to law enforcement and, you know, exposing my footage to hackers and all these sorts of things. And maybe I, as like a, you know, able-bodied white guy, will say, hey, what does it matter to me? But it harms others. It harms other people in my
Starting point is 00:59:45 community that I should and do care about, even if it's not always visible to me. Yeah. And again, yeah, exactly. Like it just goes back to balancing my personal need for safety and other people's, because it's also safety for them too, right? Like it is a safety issue for a Black man to be surveilled by cops, misidentified as a criminal, and then be targeted in some way, shape, or form. We've seen that it's a safety issue. And so it's our safety versus, like, it doesn't have to be us or them. There just needs to be a better way to make sure that there are at least some sort of guardrails or regulations around both those things. You know, what it reminds me of is an issue we talked about on the show before in regards to
Starting point is 01:00:29 car safety, that, you know, we have NHTSA, we have the carmakers, we have everyone working on keeping the person inside the car safe. There's much less attention paid to the safety of the person who is hit by the car, right? If we were paying attention to that, we wouldn't have these big flat-front SUVs with a really flat grille, because those are much more dangerous to get hit by than a sloping-roofed car
Starting point is 01:00:52 where you roll over the top, right? Or the sort of A-pillars that prevent us from seeing, et cetera. If we were focusing on the actual vulnerable person who's getting hit by the car, we would do things a little bit differently. And, you know, I can grant, hey, maybe for where you are, the neighborhood you live in, for the history that you have,
Starting point is 01:01:10 it makes you safer to have a camera pointed on the outside of your home, pointed at the sidewalk. But we also need to consider how it makes the people walking down the sidewalk less safe. Like there is another person on the other end of that camera. And those people are never talked about by Ring or by the LAPD or any police department really, or by the tech industry. It's really only folks like you who are talking about those people's safety. Yeah. I mean, you know, I'm not going to say, I don't know, maybe they have talked about it. I don't know. Like never say never. Yeah. But I do think, yeah, exactly. I think it's just that people, they're sold on their own personal safety, and they should
Starting point is 01:01:51 be allowed to protect that, you know, their personal safety. But we also have to be very, very aware of how much our consumer, our individual behavior actually impacts other people, almost systemically. Yeah. Well, I can't thank you enough, Johana, for coming on the show and for doing this reporting and coming out to talk to us about it. It's been great to have you back. And we'll have to have you again next time you crack something huge like this. Thanks. Thanks for having me. This was so fun. I am sure that I misspoke sometimes, but I just started reporting on this. So hopefully my reporting will prove it out.
Starting point is 01:02:22 Well, you've done a lot in a very short period of time. Where can people find out more about you and your work? You can find me at latimes.com. I'm in the business section, so typically my stories are there. Hopefully my stories are also on the front page. But you can also find me on Twitter at JMBooyah, B-O-O-Y-A-H. Awesome. Thank you so much, Johana, for coming on the show. Thank you. This was really fun. Well, thank you once again to Johana Bhuiyan
Starting point is 01:02:54 for coming on the show. If you want to check out her work, go to the LA Times. If you want to support all of the authors that you hear on this show, remember you can access our special bookstore at factuallypod.com slash books. That's factuallypod.com slash books. And when you buy books there, you'll be supporting
Starting point is 01:03:11 not just this show, but your local bookstore. That is it for us this week on Factually. I want to thank our producers, Chelsea Jacobson and Sam Roudman; Ryan Connor, our engineer; Andrew WK for our theme song; and the fine folks at Falcon Northwest for building me the incredible custom gaming PC that I'm recording this very episode for you on. You can find me online at Adam Conover, wherever you get your social media, or at adamconover.net. Until next week, we'll see you next time on Factually. Thank you so much for listening. That was a HeadGum podcast.
