CyberWire Daily - Apple CSAM: well-intentioned, slippery slope. [Caveat]

Episode Date: August 25, 2021

Guest David Derigiotis, Corporate Senior Vice President at Burns & Wilcox, joins Dave and Ben for an in-depth discussion this episode. Departing from our usual format, we take a closer look at the implications of Apple’s recent announcements that they will be enabling scanning for Child Sexual Abuse Materials, CSAM, on iOS devices. We devote the entire episode to this topic and hope you will join us. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 You're listening to the Cyber Wire Network, powered by N2K. Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions. This coffee is so good. How do they make it so rich and tasty? Those paintings we saw today weren't prints. They were the actual paintings. I have never seen tomatoes like this. How are they so red? With flight deals starting at just $589, it's time for you to see what Europe has to offer.
Starting point is 00:00:31 Don't worry. You can handle it. Visit airtransat.com for details. Conditions apply. AirTransat. Travel moves us. Clear your schedule for you time with a handcrafted espresso beverage from Starbucks.
Starting point is 00:00:46 Savor the new small and mighty Cortado. Cozy up with the familiar flavors of pistachio. Or shake up your mood with an iced brown sugar oat shaken espresso. Whatever you choose, your espresso will be handcrafted with care at Starbucks. Starbucks. If we don't continue to fight for that, we're going to lose every possibility of living in a more secure, in a more protective environment without Big Brother, without big technology, knowing every single step that we take when we leave our homes, when we're sleeping, who we're talking to. We're going to get to the point where we're going to be fully exposed and there'll be nothing left that we can do about it. Hello, everyone, and welcome to Caveat, the CyberWire's privacy,
Starting point is 00:01:34 surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yellen from the University of Maryland Center for Health and Homeland Security. Hello, Ben. Hello, Dave. We are taking a bit of a departure from our usual format this week, and we're going to take a closer look at Apple's recent announcements that they will be enabling scanning for child sexual abuse materials on iOS devices. We're going to be spending the entire episode on this topic, and joining us is going to be our special guest, David Darajotis. He's Corporate Senior Vice President at Burns & Wilcox. While this show covers legal topics and Ben is a lawyer,
Starting point is 00:02:10 the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. And now, a message from Black Cloak. Did you know the easiest way for cybercriminals to bypass your company's defenses is by targeting your executives and their families at home? Black Cloak's award-winning digital executive protection platform secures their personal devices, home networks, and connected lives. Because when executives are compromised at home, your company is at risk.
Starting point is 00:02:46 In fact, over one-third of new members discover they've already been breached. Protect your executives and their families 24-7, 365 with Black Cloak. Learn more at blackcloak.io. All right, so as I said at the outset here, we're taking a little bit of a different approach to this week's show rather than go through our own individual stories. I think this announcement from Apple warrants its own conversation. And before we dig in, Ben, of course, we want to welcome our special guest this week.
Starting point is 00:03:22 Happy to have David Derajotis return. David, welcome. Thank you, Dave. Thank you, Ben. Great to be here. David is a Corporate Senior Vice President at Burns & Wilcox, and we're happy to have him with us. I want to start the conversation by saying that we're going to do our best this week to focus on the policy side of this issue. I think a lot of other people have addressed the technical side of this, how the engineering, the encryption, you know, all of those things that are going on behind the scenes. And there are plenty of places to hear conversations about that side of things. there are plenty of places to hear conversations about that side of things. We'll have some links to some other podcasts, the ATP podcast, the Accidental Tech podcast did an excellent discussion.
Starting point is 00:04:12 We'll have links to those in the show notes. Let me start off, let's just sort of do a little taking the temperature here. Ben, why don't I start with you? Can you just give us a little bit of an overview of your understanding of what exactly is going on here? So Apple introduced two new programs to monitor sexually exploitive material from children, so people who are minors. The first one I have fewer civil liberties issues with. That is an algorithm that recognizes nude images in the iOS messaging application. It requires a parent to opt in. So it's not the default setting.
Starting point is 00:04:58 And once a parent opts in, that parent, if that child is under 13, would be notified if that child sends or receives a nude image. That doesn't present civil liberties concerns from my perspective because the parent is opting in, and we're talking about very young children here, and it's obviously a very worthy policy goal to prevent child exploitation. The other program that was announced certainly does present more significant civil liberties concerns, especially as we get into some of these more slippery slope arguments about what might happen. As part of that program, Apple is monitoring photos that have been posted to the iCloud for images that match exploitive child pornography images that are in national databases like the one maintained by the National Center for Missing
Starting point is 00:05:45 and Exploited Children. That program is going to be used at the outset in the United States. It's going to be rolled out over the next several months. And presumably, if Apple finds a photo through their algorithm that matches a photo in that database, then law enforcement is going to be notified that could potentially lead to criminal prosecution. And I think that program is the one that's causing Fourth Amendment advocates and civil liberties groups the most concern. And the concern is that Apple has created a backdoor where if the government seeks to request information and seeks to do so in a way where they're not asking for a warrant issued by an impartial judicial magistrate, then Apple has created the technology where they can go into somebody's private photos and extract that information. I think that just
Starting point is 00:06:38 strikes people the wrong way. It cuts against our values of digital privacy. David, when you heard the news about this and saw some of the initial reactions, what was your response? The concern is more overreach by large technology companies. There's already so much invasive operations. You look at Facebook, you look at Google. And for the last several years, Apple has really built a reputation on privacy. I mean, if you look back to the 2015 San Bernardino shooting, the pressure that they received from the FBI to create that backdoor to get into the phone, to be able to bypass the
Starting point is 00:07:19 four-digit code at the time and get access to the content. So they received such pressure from the FBI, but they remained firm. So my question to Apple is what's changed since then? You see the advertising, the billboards that they put up, what happens on your phone stays on your phone. This is moving in the complete opposite direction. And while I think the spirit of it is well-intentioned, I think we can all band together and say that explicit images with minors, children, that's something that's horrendous, despicable. Nobody wants to see that. But unfortunately, once you open up the door to this kind of access on private citizens' devices, there's no closing it, and it will only get wider going forward. Could the argument be made that Apple's response in the San Bernardino case gives them credibility here, that they have demonstrated that they will resist government pressure to open things up? Ben, what do you think about that? case of the government not only exerting pressure on Apple through the courts, but also in the court
Starting point is 00:08:25 of public opinion saying they're protecting information that might lead us to prosecute terrorists who killed, you know, 15 individuals in this horrific attack. And Apple did stand strong, although eventually FBI was able to break their encryption anyway. So the issue did become moot. But I also think that's one of the reasons the backlash has been so swift, because it is Apple, because it is this company that has presented itself as the foremost protector of private user data. I don't think the reaction would have been the same if it had come from another provider, just because Apple carries that sort of cachet as being this company that presents itself as the utmost protector of our private information.
Starting point is 00:09:09 So I think it goes both ways. They've earned credibility in the past, but just by making this decision, that cuts against their reputation as a company. And I think it might throw some of their past actions into question as well. throw some of their past actions into question as well. Are they really as protective of private information, of end-to-end encryption, as they've claimed to be over the past several years? You know, David, it seems like part of the outrage here is the fact that Apple has chosen to do this on device rather than scanning images that are in the cloud, which is what many other providers are doing. Google does this, Facebook does this, Dropbox does this. So that is a routine, uncontroversial thing, but it seems like Apple did not expect the backlash of actually doing
Starting point is 00:10:00 the scanning on the device. In your mind, what's the difference here? Well, I think there's a big difference. And Apple promises, they make promises that they won't bend the knee to government intervention, requesting access to data that's on the phone, looking at photos and other content. But we also have to remember, Apple sells iPhones without FaceTime in Saudi Arabia because local regulation prohibits encrypted phone calls. And if you look at one of the most stringent cybersecurity laws in the world and really most invasive, if you take China, for example, they have, other apps from the store, because in China, the government needs access to that information and they require audits as well. So I think it's very difficult for Apple to uphold the promise that they will not give up information that's located on the device in the iCloud when we've already seen instances
Starting point is 00:11:04 of them following regulations, being compliant with governments all over the world? Who's to say that this won't be next? And who's to say that the government won't ask for a little bit more as opposed to it just being right now sexually explicit material? I think that's a great point, David. And I know this is cliche. We just had our baseball game in the Iowa cornfields reflecting Field of Dreams. And the line from that movie is, if you build it, they will come. And I think that's unfortunately applicable here, is that once this technology is built, you can't absolutely assure users across the hundred-some-od odd countries where Apple sells its products,
Starting point is 00:11:47 that their information is going to be protected into perpetuity. Because once this technology exists, it can be used not just to scan photos, but to scan other applications, to scan social media applications, although that's already, of course, public anyway, but to scan messaging applications, to crack down on political dissidents, or to foster censorship. And as you've said, we've seen examples where they've bent the knee to more totalitarian countries who have made these requests because they want to continue to sell their devices
Starting point is 00:12:19 in those countries. So once this technology exists, there are going to be governments across the world and not governments that are particularly friendly to civil liberties who want to exploit this technology for their own purposes. And it's going to be much harder for Apple to go to these authorities and say we're not going to do that when now these authorities are fully aware that Apple has the technological capabilities to do so. Apple has the technological capabilities to do so. You know, child sexual abuse material is a special category. And the National Center for Missing and Exploited Children, who heads up the database for this, they have a special status when it comes to this sort of thing. I mean, they are, it's my understanding, they are the only organization in the U.S. who is allowed to house these materials because they have to, you know, they have to have to, in order to create the database, they have to do this. How does the fact that this is a special category of crime play into this, if at all? Ben?
Starting point is 00:13:30 So, from a legal perspective, the First Amendment does not protect child pornography. And that's something that's very unique for that area of the law. The First Amendment is protective of all types of lewd images, even, you know, obscenities, things that are offensive to most of our eyeballs. That is protected under our First Amendment. There is this carve-out that the Supreme Court has reaffirmed over and over again that child pornography does not merit First Amendment protection, partially because it is its separate category in that it is extremely exploitative of children. It can have real-world effects. It's not theoretical. So I think we have to keep that perspective in mind, and that's why Apple is rolling out this technology in these
Starting point is 00:14:17 circumstances to apply to these exploitative images, is that we know that sexually exploitative images of children does carry this sort of extra weight. It falls outside areas of First Amendment protection in a way that all other types of speech, be it political speech, personal expression, do not. So I do think it carries some sort of extra meaning, not just morally, but within our legal system as well. not just morally, but within our legal system as well. David, do you suppose that this adds a layer of anxiety to folks who are using iOS devices, knowing that, I mean, it's hard to imagine, this is such a horrible crime that I think the possibility of being falsely accused of it could cause anxiety for people. And I'm not one who believes in that
Starting point is 00:15:06 argument that, you know, if you're not doing anything wrong, you have nothing to hide. I mean, you know, that's, I think, runs counter to many of the things in our constitution. But I can see, you know, this is a new level of surveillance and it's on your device. It's there all the time. and it's on your device. It's there all the time. If you're using iCloud photo services, it's there. That's sort of the ballgame, yes? I completely agree, Dave. And look, Apple has been, for the last countless years, they've been a company that promotes privacy. They've been a company that's used their edge in this space to gain a competitive advantage over the likes of Google, over the likes of Facebook. So they've been the one leading the privacy charge. And I think that the American public, we've just seen countless data breaches. We've seen constant
Starting point is 00:15:55 issues of rogue employees, threat actors that are accessing our personal information. There's not a day that goes by where you don't see a new headline surrounding ransomware, exposing sensitive and confidential medical information, whatever it may be. And Apple has really been the leader. They've been at the forefront of promoting privacy, of promoting keeping your information secure. So I think that there is a heightened level of anxiety because now you have big tech reaching directly onto your device in ways that I think a lot of people don't realize can happen. So what photos you have, the text messages you're sending, the contacts that you have, so many companies constantly scrape. They go after this information. They share it. And now we have another instance of a program. Again, it's in the best spirit that the intentions are very favorable in terms of stopping the spread of this type of information, stopping the spread of sexually explicit material as it relates to children.
Starting point is 00:17:06 of course. But again, if somebody is wrongfully accused, if there's a threat actor internally, if there's a rogue employee, there are just so many ways for this to go sideways, for it to go wrong. And I think Apple has announced, they did say that there's, the number was astronomical. I think it was one in a trillion that there would be no false positives or one in a trillion false positives. So it's very unlikely. My concern here is, again, they are coming onto your device, onto your personal information. What's going to happen if somebody gets into Apple, if they're able to access information
Starting point is 00:17:34 that they wouldn't have been able to access previously had it not been for this program? What if a rogue employee decides to do something that they shouldn't do if they're going to get some type of payoff from a criminal? And then again, what's the next step from here? As Ben mentioned, what does this lead to? Once this door is opened, it can only get wider. You can't go backwards. Yeah. And as we're recording this, this morning I saw on social media some folks in cybersecurity. There are GitHub projects underway where people are starting to experiment with creating adversarial images to counter this, to confuse this, to cloud this, to make this more difficult.
Starting point is 00:18:21 So that's a real issue as well. It's a possibility. All right, we're going to take a quick break here. We're going to pay some bills and let our advertiser share their message with you. We'll be right back in just a moment. Thank you. and data into innovative uses that deliver measurable impact. Secure AI agents connect, prepare, and automate your data workflows, helping you gain insights, receive alerts, and act with ease through guided apps tailored to your role. Data is hard. Domo is easy. Learn more at ai.domo.com.
Starting point is 00:19:22 That's ai.domo.com. And we are back. Ben and I joined by our special guest, David Derajotis. He is Corporate Senior Vice President at Burns & Wilcox. Apple has been very specific and clear about the levels of technology they have put in place here to try to keep things on the straight and narrow when it comes to this. Apple's Craig Federici sat down with Joanna Stern from the Wall Street Journal. They had a good, what I describe as a clarifying conversation in response to a lot of the backlash. And one of the points that Craig made was Apple, who has been accused of being late to the table with this sort of thing, as we mentioned earlier, Facebook, Google, Dropbox, they've all been doing this sort of scanning for quite a while, and Apple has been lagging. And Apple has gotten pressure from folks in Congress that they needed to step up and do a better job. So part of this, I think part of the why now question might be that this is a response to Apple. If Apple didn't do this themselves,
Starting point is 00:20:45 be that this is a response to Apple. If Apple didn't do this themselves, they would have been forced to do something through legislation. So all that said, the technical steps that Apple has put in place are significant. Craig made the point that, you know, you have to have 30 images or so on your device before Apple even gets notified that there's an issue here. There has to be this critical mass reached before Apple is even able to access any of the images. And only then does a human intervene and view a low-res version of the images to verify that they are actually what the machines think that they are. So what I'm getting at here is that it seems as though Apple, in good faith, tried to come at this problem from a technological point of view. It seems to me that Apple themselves are surprised at the amount of backlash that they are getting here.
Starting point is 00:21:43 Ben, does that surprise you? backlash that they are getting here. Ben, does that surprise you? It surprises me a little bit, just because you'd think that Apple, the most privacy conscious of these companies, would understand why this would cause a backlash. I'm not surprised by the backlash, just because I think it conforms with values that we have. You know, our Fourth Amendment jurisprudence in this country is built around this idea of reasonable expectation of privacy. If we display a subjective expectation of privacy by trying to keep our information, whether that's photos, messages, etc., to ourselves, and if that expectation is reasonable, then the government needs a warrant to obtain that information. I think even though,
Starting point is 00:22:26 you know, that's a provision of the Constitution, it's also a fundamental value that if we seek to keep something private, then the government can't get access to it. And so I think just the possibility, either whether it's through this or something like the EARN IT Act, a piece of proposed legislation that would have compelled companies like Apple to take a similar action. The concern is that it would be a very significant violation of that expectation of privacy. And I think that offends us on a pretty deep moral level just because it's such a fundamental value of our country. It has to do with, you know, going back to our English legal ancestors, where you would have these general warrants, where minions of, you know, the ruling king or queen would come in
Starting point is 00:23:16 and look for incriminating information in people's houses, even if they, you know, didn't have any suspicion that somebody had done something wrong. And so I think our ears perk up a little bit when we hear something like this, where we know that Apple is going to be accessing our private digital space. So I just think based on our own political culture, you can understand why there was such a backlash. David, what are your thoughts on that? I agree with Ben there fully. I mean,
Starting point is 00:23:46 again, how much information are we willing to give to these large technology companies that already encapsulate so much around the way that we live our lives? They already know our location. They already know the different apps that we're using, when we're using them, how long we're using them. They already know all of the contacts that are we're using, when we're using them, how long we're using them. They already know all of the contacts that are in our phone, who we're corresponding with. So we're giving up inch by inch, sometimes mile by mile, as a matter of fact, bigger pieces of our private life. We're going to get to the point if there's no action taken and people aren't mindful.
Starting point is 00:24:22 And I think this is why there has been such a backlash because people have become much more mindful of what it means to be private and to have that expectation of privacy. If we don't continue to fight for that, we're going to lose every possibility of living in a more secure, in a more protective environment without big brother, without big technology, knowing every single step that we take when we leave our homes, when we're sleeping, who we're talking to. We're going to get to the point where we're going to be fully exposed and there'll be nothing left that we can do about it. Hmm. It kind of strikes me that, you know, most of the intrusive technological surveillance that
Starting point is 00:25:03 we've been subjected to over the past several years has been more of the metadata variety. You talk about contacts, you know, who we're messaging, what the duration of that message was. Even something like historical cell site location information doesn't, you know, reveal our most private communications. So I think that's where this is a very prominent step in that direction is we're talking about content here. We're talking about the actual photos and the actual messages. So I think there really is a distinction in terms of the private information Apple has had access to in the past and the private information now that they have access to the content of our devices. What about the fact that this is all opt-in? information now that they have access to the content of our devices.
Starting point is 00:25:53 What about the fact that this is all opt-in? I mean, in other words, Apple has said that if you're not using iCloud photo services, they're not even going to be scanning your photos. It effectively turns off the switch for any of this scanning to happen. The hashes for the images are going to be in the operating system. That's going to be baked in now. But unless you turn on your iCloud functionality, they're not even going to take a look at it. And yet people don't seem to be calmed down by that. Why do you think that is, David? Well, again, I think it's the level of awareness that Apple has really created in their marketing strategy, in their campaigns, their focus on privacy. I do think that it is a positive and a great step that they're taking in terms of making sure that the consumer has to specifically and explicitly opt in to this service. But again, we've seen a number of cases,
Starting point is 00:26:46 not Apple, but we've seen cases throughout the years where companies make retroactive changes to their privacy notice, to the actions that they're taking, and they just automatically start collecting information or they're automatically opting people in to the various programs or data collection methods that the company employs. So this is where you as a consumer have to pay very close attention to the terms and conditions to any future changes that could occur. Because right now it may be opt-in, but who's to say two years from now that doesn't change and you're automatically opted in when you do a software update for the newest Apple release.
Starting point is 00:27:24 Those are things that people need to pay very close attention to because it can come back to bite you if you're not aware. You mean, David, you don't read all 200 pages of the terms and conditions when you download that iOS update? I know, you know, being involved in the privacy space, I hate reading all of those terms and conditions and privacy notices. Actually, I guess it's part of your job that sometimes you have to, right? Poor guy. That's right.
Starting point is 00:27:50 You know, it also strikes me that as much as Apple has built a reputation on supporting privacy as being, you know, the privacy company among these big tech giants, you know, the privacy company among these big tech giants. They also have a reputation for a certain amount of arrogance. You know, Apple knows what's best for you, right? You don't, you know, when the iMac initially came out, you don't need that floppy drive anymore. You know, with recent iPhones, you don't need that headphone jack anymore. And while I think that's largely worked out for Apple, certainly, you know, their products are selling well. They are, if not the, among the
Starting point is 00:28:33 most valuable companies in the world. I think this is a case where that attitude, where they didn't float this publicly before they just came out and announced it fully baked, right? They came out and said, this is what we're doing. It is a technological marvel. It's going to solve all of these problems and it's a done deal. I think that attitude from Apple in this case isn't really serving them well. Do you agree with me, David? this case isn't really serving them well. Do you agree with me, David? I agree.
Starting point is 00:29:10 I mean, Apple has sold, they have over a billion iPhones in use today. The iCloud has over 850 million users across the world. They've built a reputation around user experience, around privacy. And I think anything that they do, any steps that they take that are contrary to that perception, to that kind of cachet that they've created, it's going to create outrage. And I think that's why we're seeing the backlash here, because they have built a reputation on protecting their consumers. They have built a reputation on going in the opposite direction that other large tech companies have chosen to take. Apple's on a different path. And again, the question I would ask, what's changed now
Starting point is 00:29:49 from your stance in 2015, 2016, when you were willing to put up that fight, when you were not willing to turn over the contents of the phone and not willing to create that backdoor to break the encryption? Because then it was about privacy. And Apple stated, if we make these changes, if we create that backdoor, privacy will be lost going forward for everyone. So now that they're taking this new step, the question I ask is, what's changed since 2015? Yeah, and I think it's worth just mentioning as a point of clarification that Apple does not encrypt your iCloud backups.
Starting point is 00:30:29 And that has been a way for law enforcement to get at information that people had stored on their phones. It's also benefited Apple because it gives them the ability to help people restore their phones. them the ability to help people restore their phones. If a phone's been locked or broken or lost, you know, they can get at those backups that they otherwise wouldn't. But, you know, in the past, Apple has made noises that they were going to start encrypting that, and they've got pushback from folks like the FBI, and they haven't done it so far. Ben, I'm curious. We hear stories about these tech companies receiving demands from the government to turn over information. And part of that demand includes not being allowed to tell anyone that the demand was made. Right, a gag order. Is that coming into play here?
Starting point is 00:31:25 Is the specter of that hanging over this as well? Absolutely. In a bunch of different circumstances, when the government requests information, those requests come with a gag order. And depending on the circumstance, especially if it has to do with something national security related, it might be really hard to even seek judicial review to reveal information that's part of that gag order. That could be very dangerous for the public because Apple is, in a sense, muzzled. They're not able to share the type of requests that they've been given, whether it's been informal or whether it's been an official subpoena. And that can certainly impact public debate on this. If Apple is receiving a bunch of these requests,
Starting point is 00:32:06 but they're bound by gag orders, how are we to know as the public exactly what Apple is asking for and exactly what they are receiving? And that really does depend on the context of the information that's being sought, but it's certainly an issue that's in play here. Hmm.
Starting point is 00:32:23 David, in your mind, where could Apple go from here? Is there a way that they could salvage this situation? I think for true privacy advocates, there's no turning back for Apple at this point. I don't know if there's any salvaging that the messaging that's already gone out, the steps that they're taking. They're not launching this update until the end of this year. And it's going to be in the U.S., United States only. So I think it can only snowball and become more of a privacy nightmare going forward as they introduce this in other countries, as they roll it out in other places across the world. My worry is what's going to happen for the
Starting point is 00:33:05 citizens in those countries, what's going to happen for us here in the United States years down the line. Because again, they are taking, as it appears, very strong security measures. The hashing methods that they're doing on the phone, matching them back to the NCMEC, the National Center for Missing and Exploited Children, they're doing it in a private manner in a way that makes sense. My fear is what's the next version of this going to look like for citizens in China and Saudi Arabia? What's going to happen for the citizens when the government says, well, we want to now see different types of communications, whether it's political, whether it's religious? see different types of communications, whether it's political, whether it's religious, what's going to happen when they start matching hashes to those types of conversations and those types of images that are being communicated and shared between the citizens. I don't think there's any
Starting point is 00:33:54 going back from here. It's only going to get broader and wider in the years to come. A question for either of you. I mean, do the service providers themselves, the Apples, the Googles, the Facebooks of the world, do they have liability for potentially having these images on their servers? Are they protected by, say, Section 230 of the Communications Decency Act, or are they obligated to look for and scrub and report these sorts of images? Well, Ben, maybe I can jump in here for part of this. Section 230 is not all-encompassing. There are some exceptions to what's covered, federal criminal liability, electronic privacy, intellectual property claims. So I think any type of information that an organization collects, stores, no matter how sensitive, you can look at the most sensitive medical information, procedures, billing information that are stored and collected on individuals. You can look at social security numbers. If they're choosing to store very
Starting point is 00:34:56 sensitive, critical data on consumers, it's their obligation to protect it and to store it securely because they're the ultimate data owner. They are the ones that have this information within their network. They need to protect that data as well. So I don't see a scenario where they will be protected from any type of liability. The more information they collect, it only increases the liability that they have for the consumers that they service. Yeah, that's my read of the situation as well. It also is the Communications Decency Act that we're talking about. So the intention of that act was to limit the exposure of offensive content to children. I mean, Section 230 is one part of that law. But, you know, I think while there is a general shield of liability for images or content posted on these third-party networks,
Starting point is 00:35:55 as David says, it's not all-encompassing. And there are ways in which they can be held liable. Hmm. All right. Well, so final word, Ben. Let me give it to you. I mean, suppose Apple came at this and said, you know what, we're going to reconsider this. We're going to come at this the same way that some of our colleagues have. We're just going to,
Starting point is 00:36:18 we're simply going to scan things on our cloud servers. We're not going to do this on your actual device. Do you think that would put them in the clear? Would people be okay with that? I kind of agree with David that the cat's already out of the bag. You can't, you know, stuff this back in just because they have already decided to make the decision to go into people's devices. Now we know, A, that they're willing to engage in such a step, even if they were to retract under a severe public backlash, and B, that the technological capability exists. So even if they were to retract these particular programs, we know that they have the technological capability, and if they were ever sufficiently pressured, whether it's by our government or by Saudi Arabia or by China, into making use of this technology, that the technology exists. And it's now all on the record that this is something that they've developed and are willing to use. You know, I think we can
Starting point is 00:37:16 surmise that this effort is the result of political pressure because members of Congress, you know, as evidenced by the proposed legislation of the EARN Act, think that tech companies have not done enough to protect against childhood exploitation. So in a sense, this announcement is a reaction to public pressure, which means what's to stop them from similarly reacting to public pressure in the future, whether it's in this country or a country that cares even less about civil rights and civil liberties. So I think it would be really hard for them to fully undo what they have already done by making this announcement. David, your final thoughts? I agree with Ben 100% on that. And I think this is just a reminder, if we truly want to obtain personal privacy and protection, we cannot
Starting point is 00:38:05 give so much power to any one technology organization, any one company in general. It's important to segment your digital life. Having all of the information on your Apple device, having all of the communications, the phone calls that you're making, contacts, photos, it puts you at risk. And you're really at the mercy of that company, no matter how strong of a privacy reputation they want to promote. So I think that the biggest learning lesson from all of this is segment your exposure. Don't put all of your eggs in one basket and make sure you're mindful of how the company operates, what they're collecting and who they're sharing that information with. All right. Well, gentlemen, a good conversation on a difficult issue. Thanks to you, David Derajotis, Corporate Senior Vice President at Burns & Wilcox. Thank you so much for taking
Starting point is 00:38:55 the time to join us today. Dave, it was my pleasure. Thank you, and thank you, Ben. Thank you, David. Cyber threats are evolving every second, and staying ahead is more than just a challenge. It's a necessity. That's why we're thrilled to partner with ThreatLocker, a cybersecurity solution trusted by businesses worldwide. ThreatLocker is a full suite of solutions designed to give you total control, stopping unauthorized applications, securing sensitive data, and ensuring your organization runs smoothly and securely. Visit ThreatLocker.com today to see
Starting point is 00:39:33 how a default deny approach can keep your company safe and compliant. Thank you. where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Iben. Our executive editor is Peter Kilby. I'm Dave Bittner. And I'm Ben Yellen. Thanks for listening.
