CyberWire Daily - Rigging the game. [Caveat]

Episode Date: October 12, 2020

Ben describes a decades-long global espionage campaign alleged to have been carried out by the CIA and NSA, Dave shares a story about the feds using cell phone location data for immigration enforcement, and later in the show our conversation with Drew Harwell from the Washington Post on his article on how colleges are turning students’ phones into surveillance machines. Links to stories: ‘The intelligence coup of the century’; ‘Rigging the Game’; ‘Spy sting’; ‘Federal Agencies Use Cellphone Location Data for Immigration Enforcement’. Got a question you'd like us to answer on our show? You can send your audio file to caveat@thecyberwire.com or simply leave us a message at (410) 618-3720. Hope to hear from you. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 You're listening to the CyberWire Network, powered by N2K. Calling all sellers. Salesforce is hiring account executives to join us on the cutting edge of technology. Here, innovation isn't a buzzword. It's a way of life. You'll be solving customer challenges faster with agents, winning with purpose, and showing the world what AI was meant to be. Let's create the agent-first future together. Head to salesforce.com slash careers to learn more.
Starting point is 00:00:55 Is this a level of surveillance too much? Is this privacy invasive? Is this something that the students even really understand what we're doing? And so there's this big debate on a lot of colleges as to whether this is an appropriate level of student supervision or whether this has gone too far. Hello, everyone, and welcome to Caveat, the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. Hi, Dave. On this week's show, Ben describes a decades-long global espionage campaign alleged to have been carried out by the CIA and NSA.
Starting point is 00:01:23 I share a story about the feds using cell phone location data for immigration enforcement. And later in the show, my conversation with Drew Harwell from The Washington Post. We're going to talk about his article about how colleges are turning students' phones into surveillance machines. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. Hey, everybody. Dave here.
Starting point is 00:01:57 Have you ever wondered where your personal information is lurking online? Like many of you, I was concerned about my data being sold by data brokers. So I decided to try Delete.me. I have to say, Delete.me is a game changer. Within days of signing up, they started removing my personal information from hundreds of data brokers. I finally have peace of mind knowing my data privacy is protected. Delete.me's team does all the work for you with detailed reports so you know exactly what's been done. Take control of your data and keep your private life private by signing up for Delete.me. Now at a special discount for our listeners. Today, get 20% off your Delete.me plan when you go to joindeleteme.com slash N2K and use promo code N2K
Starting point is 00:02:48 at checkout. The only way to get 20% off is to go to joindeleteme.com slash N2K and enter code N2K at checkout. That's joindeleteme.com slash N2K, code N2K. And we are back. Ben, why don't you start things off for us? A big story dropped this week. Yes, this is from the Washington Post in an article entitled The Intelligence Coup of the Century. And the headline is certainly appropriate as this article was indeed eye-opening. Yeah. So the history of this story goes back to the end of World War II and to the beginning of the Cold War.
Starting point is 00:03:30 Many countries around the world, including some of our adversaries, trusted this one company, Crypto AG, to keep the communications of their spies, soldiers, and diplomats secret. And we found out through this article, through information that was made public to the Washington Post and to a German publication, that company was actually working directly for the CIA where other countries around the world would think that they were purchasing this advanced cryptological product, but really they were purchasing a product that would allow them to be spied on by the U.S. government. And Crypto AG is not a U.S. company. It is not. It is a company that is based in Switzerland. In fact, one of the things that was eye-opening about the article, even though this company has been bought out and liquidated as of 2018, there is still actually a building in Switzerland that has Crypto AG lit up on the side of it. So it's a company that
Starting point is 00:04:36 exists in a modified form, but it is a company that has existed basically as long as modern cryptography has existed dating back to World War II and the Cold War. So this was a joint agreement between the CIA, West German intelligence at the time, and this company. And it was unknown to not only foreign governments, but also to the US media and media around the world. And these documents for the first time shed some light on that arrangement, which is really quite remarkable. It sort of confirms not some of the conspiracies, but some of the allegations and the Edward Soden disclosures of 2013 that the U.S. runs a pervasive global surveillance network. And nothing can be more pervasive than tricking countries into buying cryptography products that are actually monitored by our intelligence agencies.
Starting point is 00:05:28 And the upshot of this is that our intelligence agencies were able to gain quick access to our adversaries' conversations. And that was obviously a major boon for our intelligence during the Cold War, during our conflicts in the 1990s, and even the early parts of the war on terrorism in the early 2000s. So certainly something that opened my eyes. Now, there were some interesting, first of all, the whole article is fascinating from the Washington Post, but some things that caught my eye were that not only were these devices sold to our adversaries, but to our allies as well. And there were some points where the Germans got a little cold feet about that, that maybe they thought the Americans might have been doing a little too much peeking at allies' communications. Yeah, so we were far less bashful
Starting point is 00:06:17 than our German counterparts were in using this as an espionage tool. Even though this was a joint operation between us and BND, which is the West German Intelligence Agency, you're right that the German government did not like the fact that we, not only was this tool being used by adversaries of our government, but also some of our allies during the Cold War.
Starting point is 00:06:37 I think part of that was to maintain the cover of the secret espionage technique. If this was a product, crypto AG, that was only sold to our adversaries, perhaps that would open some eyes in foreign intelligence communities, including maybe some of our adversaries, that this was all a ruse. This was all an effort for the United States to breach the private communications of foreign countries. So in order to maintain our cover, I think part of that included selling this service or watching the service be sold to some of our allies as well. Another fascinating aspect of this, I saw come by this morning, a reporter for the Baltimore Sun,
Starting point is 00:07:16 who said that back in 1995, they had done research and had written a story about this very thing. And at the time, the CIA and NSA said, this is baseless speculation. It's a ridiculous story. Of course, we don't do any of this. And interesting to see how, as time marches on, we look back, the Baltimore Sun had it right. Yeah, our local newspaper did quite enterprising work in 1995. And frankly, looking back on it, it's now very impressive of course the CIA and the NSA are gonna deny it because they had been able to pray off the gullibility of foreign countries to use this product and were able to make a profit part of the arrangement we made with the
Starting point is 00:07:57 individual who owned this cryptography company crypto AG was that we would share profits of the sales so he would get a stipend, but it was actually the United States government that was collecting profits of the sale of this technology. So of course, in 1995, when this was discovered by this Baltimore Sun expose, it's understandable that our intelligence agencies would want to deny it. And that sort of leads me to wonder why they didn't quite deny it for this Washington Post article. Now, neither the CIA nor the NSA confirmed the contents of this article, but it was sort of a non-confirmation confirmation, nod of the head saying, well,
Starting point is 00:08:34 we're not saying this is not, not true. We are saying this is not, not, not true. And I think there are a couple of reasons for that. The main reason is that, as I mentioned, this company has been liquidated. So this arrangement, at least in the form that it existed for 60 years, is no longer in existence. I think that's probably the main reason. The other key reason is that the cryptography industry has sprouted up. Now there are a million enterprising companies that provide these services. Crypto AG had a monopoly on cryptography services for a long time. And we were able to take advantage of that,
Starting point is 00:09:09 or at least a semi-monopoly. Now, with a bunch of competitors, it can't control the market in the same way. So it's not going to be as effective of a tool. If there's one supermarket in a town, then we could put a surveillance camera in front of that market and see every single person that goes shopping. Suddenly, when there are 20 supermarkets, that becomes much more difficult. So I think that's one of the reasons why this program no longer exists in its identical form and why the CIA and NSA aren't exactly denying it at this point. Yeah, I suppose as time marches on, this sort of cryptography is no longer the exotic thing that it once was. Pretty routine to use strong encryption when you're sending messages back and forth these days just through software. Absolutely. And it wasn't always
Starting point is 00:09:55 like that. I mean, that's why it was so secretive and so lucrative for so many years is this was an industry that was still in its infancy as electronic communications themselves were in their infancy. Now, cryptography itself has always existed. People were writing on with feathers and pens using secret code and there were people trying to decrypt that encrypted information. But at least in its current form, this is something that's relatively new. And in the last 10 years or so, more of an industry has developed around strong cryptography. Do you suppose there'll be any fallout from this? Because the program is pretty much wound down, is this kind of something where probably all the
Starting point is 00:10:36 players in this likely kind of knew it was going on, and so it's sort of more historical than anything? Yes, I do. I don't see there being any repercussions. If this were the 1970s and we were still a country that was obsessed with pervasive secretive government surveillance efforts, then we could see something like the church committee, which was the congressional committee set up to unveil all these secret CIA programs. Nixon, you know, killing his enemies, that sort of thing, whatever the CIA was doing in foreign countries in the 1960s. Right, right. You know, and we started to see a little bit of that political will show up after the Snowden disclosures.
Starting point is 00:11:15 But my guess is this is sort of historical in nature. I think there's also kind of an assumption that we sort of look the other way with what our intelligence agencies do. It we sort of look the other way with what our intelligence agencies do. It's sort of better left unknown. I mean, the CIA is subject to a lot of federal regulations. Many of them are classified, but it's sort of an area that lawmakers are generally very hesitant to touch. And that's why they get away with a lot, including this pretty massive heist that they were able to engineer for several generations. Yeah, well, it's definitely an article worth checking out.
Starting point is 00:11:49 It's from the Washington Post, written by Greg Miller, The Intelligence Coup of the Century. Do check it out. It is, and it's also just a very fascinating historical lesson. You could almost learn a lot about foreign policy in the last 60 years by simply reading this article. So yeah, I would recommend it. Yeah. My story this week comes from the Wall Street Journal, and this is written by Byron Tao
Starting point is 00:12:11 and Michelle Hackman. And it's titled, Federal Agencies Use Cell Phone Location Data for Immigration Enforcement. A couple of interesting aspects of this. So Immigration and Customs Enforcement, ICE, part of DHS, they have been buying cell phone location data from private companies and using it to track down various folks of interest to them who may be crossing the border. They're saying they're using it for people entering the U.S. illegally, people who might be running drugs. This story says that they used this data to reveal a tunnel that went underneath the border. So on the one hand, I suppose you could say from a law enforcement's point of view, the purchase of this data from a private contractor, it has useful functionality for law enforcement purposes. But I'm going to guess that you have your own opinions
Starting point is 00:13:02 about what's going on here, Ben. Yeah, I mean, certainly this presents civil liberties concerns both from a Fourth Amendment perspective and people who are concerned about overbroad authorities from our immigration enforcement would certainly be concerned about this article. Now, DHS didn't exactly tell the Wall Street Journal, understandably, what they were doing with this location data. But as you mentioned, we're suspecting that they're using it to track suspects,
Starting point is 00:13:28 figure out where these illegal entry ports are, and you referenced this potential tunnel under the border. They're claiming that they purchased data that was anonymized, and it is. All different types of advertisers purchase anonymized location data to make decisions on advertising targets. This is something we've talked about on the podcast many times. Right. Legal, legal to do. It's absolutely legal to do. DHS can say that it's anonymized, but we talked about that New York Times article several times on this podcast. Well, per se, the information is anonymized. There
Starting point is 00:14:02 are various ways to de-anonymize it. You can probably figure out who a person is if you spend all of your time tracking their location. So that's one concern. The other concern is that this is sort of a workaround, the Fourth Amendment to the United States Constitution. So we've talked about in 2018 there was a Supreme Court decision, Carpenter v. United States. Yep, comes up a lot. It sure does. And it held that cell phone location data is protected by the Fourth Amendment. Law enforcement needs warrants to collect it.
Starting point is 00:14:33 Right. But here we essentially have a workaround. It's not the government asking for judicial approval to access this data. They're simply purchasing it on a commercially available market. And there are a lot of companies that sell this data. It're simply purchasing it on a commercially available market. And there are a lot of companies that sell this data. It's available through many commercial exchanges. So you don't really have to get the lawyers involved. They can buy the data. It's theirs. It's anonymized. And, you know, it's a way to get around what they would otherwise have to do,
Starting point is 00:15:00 which is go to a magistrate judge or a federal judge and get a warrant to get location data, historical self-state location data on any individual. So that's another reason why it's concerning. Another thing I'll mention is from the Fourth Amendment perspective, per a 1989 Supreme Court decision, Verdugo or Quidez, I'm sure I'm pronouncing it wrong. The Fourth Amendment does not apply to people who are not part of the so-called national community. And that generally includes unlawful, undocumented immigrants. That was my next, that was going to be my next question to you. You're reading my mind. Jinx. Go on. So, you know, that's another element to this. The Carpenter case,
Starting point is 00:15:42 as is true for all Fourth Amendment jurisprudence, certainly applies when we're talking about U.S. citizens or U.S. persons who are in this country lawfully. But that standard is very different, for better or worse, when we're talking about people who are undocumented. Now, where this gets complicated is they're purchasing a lot of location data. Presumably, most of that location data concerns U.S. persons. There are going to be certain segments of it that concern people who are undocumented who are in this country. So the government has obtained a lot of location information they could use to identify U.S. persons, even though maybe DHS and Immigration and Customs Enforcement is not using it for that
Starting point is 00:16:22 purpose. So if you want to trust that once the government purchases this data, they're only using location tracking as it relates to undocumented immigrants, then feel free to do so. But I think there's reason for doubt on that front. How would this workaround get shut down? How would someone come at this, the folks who believe that this is problematic? What would be their avenue to prevent agencies like ICE from using this sort of data? So I don't know if we have a bleep feature on our podcast here, but I would say they are S out of luck. One of the reasons is standing. In order to sue the government, you have to have standing. And in order to have standing, you'd have to prove that your specific location data was collected. That comes back to a Supreme Court decision,
Starting point is 00:17:09 Clapper v. Amnesty International. It's going to be almost impossible for someone to A, know that they're being surveilled, and B, be able to prove it with the type of specificity needed to get oneself into court. And then that doesn't even get to the problem of non-US persons who basically would not have any legal cause of action to challenge this type of surveillance. So I guess you're not entirely SOL. Your one avenue is to go to Congress and perhaps write your legislator, tell them that they should prohibit ICE and DHS from purchasing location information from third-party vendors. I'm not guessing that there is a groundswell of support for such a policy in the current United States Congress,
Starting point is 00:17:49 but that doesn't stop people from contacting their legislators in other contexts. So go for it if that's something that you believe in. Or contact the executive branch. They get letters. You have a First Amendment right to tell the government to redress your grievances. So that is one avenue. But it's certainly not something where I would expect there would be some sort of legal injunction against this type of collection. Interesting. All right. Well, those are our stories this week. It is time to move on to our listener on the line.
Starting point is 00:18:23 Our listener this week is Tim. He calls in from Vancouver, Washington, and he's got a question about blockchain and GDPR. Hi, my name is Tim from Vancouver, Washington. I am calling about the implications of GDPR for things like the blockchain and specifically the right to be for guts and that sort of a thing. Now GDPR specifically, as I recall, calls out any sort of unique identifier of a user could be used to add personally identifiable information. The blockchain is supposed to be anonymous,
Starting point is 00:18:59 but as we know, it can easily be de-anonymized. And your, like, Bitcoin address, for example, is a thing that you deeply identify. If not you as a person, at least you as a user within the system. And that can be de-anonymized to you as a person. My question is, is the blockchain be necessarily something where you're not able to remove transactions or to be quote-unquote forgotten what about all these startups that are like facebook but on the blockchain or whatever you know they try to put everything on the blockchain because it sounds cool in their startup package in their business plan does gdpr have anything to say about that kind of a thing and does gdpr
Starting point is 00:19:41 put something about a railroad bike through the ideas using blockchain for everything, which seems like a lot of companies are trying to do nowadays. Thanks a lot for your time, guys. I love the caveat, love CyberWire, and keep doing what you're doing. I appreciate it. Thank you. Alright, so interesting question from Tim. Tim, thanks for sending that in. To me, this comes down to that basic tension that I've heard people describe with the says that an individual has the right to be forgotten. They can have their personal data erased. But then we have blockchain technology, which is immutable. And so a lot of analysts have concluded based on this dichotomy that it's impossible to
Starting point is 00:20:40 store any kind of personal information on blockchain and still comply with GDPR. I've seen a couple of articles out there point out some nuance in this that might give policymakers some clues as to how to resolve this apparent conflict. One is that blockchain isn't always immutable. So some of the new protocols make it so that data stored on blockchain isn't as immutable as it once was. So that's one consideration. Yeah, I've seen that too, where you can say, within the blockchain, you can go back and
Starting point is 00:21:10 say, there used to be data here, and it's not here anymore, and here's why, and so on and so forth. I'm sure I'm getting the details wrong, and there's probably people screaming at their computers right now. But yes, to your point, there's some interesting innovation there. Yes, yeah, they can scream at their computers right now. But yes, to your point, there's some interesting innovation there. Yes. Yeah. They can scream at their computers all they want. The thing that's probably more in my wheelhouse in terms of knowledge, since I probably know far less about blockchain than you do, is the details of the right to be forgotten. GDPR is not absolute
Starting point is 00:21:41 on the right to be forgotten. So there are a couple of exceptions to the general rule that a person has a right to be forgotten according to GDPR. If processing is still necessary for the performance of a contract, for scientific or historical reasons in the public interest, or to comply with a legal obligation, or if there's a legitimate interest that overrules the interest of the data subject, then the right to be forgotten does not apply. So that could open up some potential avenues for blockchain users if they can figure out how to fit within one of those exceptions. The last thing I'll mention is that the definition of personal data in GDPR isn't entirely clear. There's been a lot of scholarship about some confusion as to what counts as personal data. So, you know, some things that might be on-chain. So there have been questions about pseudonymized data. And GDPR has tried to, through EU regulations, define exactly what that means.
Starting point is 00:22:36 But those regulations have not yet been finalized. So, you know, until that definition is finalized, we can't be entirely sure that whatever is on the blockchain counts as personal data for the purposes of GDPR. So while there is this apparent conflict, I think when we take these other factors into consideration, there's some nuance there that could allow the right to be forgotten and blockchain to coexist. Now, you mentioned an exception for fulfilling legal obligations. Could this be something that through the typical gymnastics that take place with something with a EULA, with an end user license agreement, that could provide some cover here? Yeah, it is. Although those license agreements would still have to abide by the regulations inherent in GDPR.
Starting point is 00:23:21 I see. And because companies have had to adapt to GDPR, that's going to apply to users outside of the European Union, including those in the United States. And now that we have the CCPA in California, those sort of end-user license agreements, the compliance has become more difficult for these technology companies.
Starting point is 00:23:42 So it is something you can account for there, but those are still subject to probably whatever the highest level of regulation that these companies would have to comply with, whether that's the European Union or the state of California. Yeah. All right. Well, Tim, thank you for sending in your question. It was a good one. We would love to hear from you if you have a question for us. Our call-in number is 410-618-3720. That's 410-618-3720. You can also email us an audio file at caveat at thecyberwire.com.
Starting point is 00:24:14 We would love to hear from you. Coming up next, my conversation with Drew Harwell from the Washington Post. We're going to be discussing his article on how colleges are turning students' phones into surveillance machines. But first, a word from our sponsors. Your business needs AI solutions that are not only ambitious, but also practical and adaptable.
Starting point is 00:24:38 That's where Domo's AI and data products platform comes in. With Domo, you can channel AI and data into innovative uses that deliver measurable impact. Thank you. role. Data is hard. Domo is easy. Learn more at ai.domo.com. That's ai.domo.com. And we are back. Ben, you and I recently spoke about Drew Harwell's article in the Washington Post about how colleges are using cell phone information to track their students, some of the concerns there. We reached out to Drew and he agreed to come on the show. Here is my conversation with Drew Harwell. This was an issue that was starting to bubble up from campus newspapers. Some of these college journalists were seeing, you know, strange emails alerting them to what in some cases was just called a new attendance monitoring system.
Starting point is 00:25:49 And so there were a couple of colleges that were starting to write about this, but I didn't really have a sense of how widespread this was. Once we sort of dug in a bit, we found out that it was a lot of colleges, actually, maybe 60 or so across the country just using one of these two companies. You know, so we talked to the companies and found that there's kind of the creation of this new infrastructure for monitoring students' attendance, but also their location in a really intricate way across campus using sort of Bluetooth beacons and these campus-wide Wi-Fi networks to process en masse where students were going, when they were going there, and kind of their behavior along the way. What is the exchange here? I mean, are these companies coming to the university and saying, hey, here's a capability that we can offer you, and these are the benefits that you could get from it? Yeah, they're going to the campus and saying, you have all of these students, you have a problem of professors having to gather attendance in this kind of old school way. And you want data on your student body as well.
Starting point is 00:26:57 You want to know when they're going to the library and how often. And so here we have this product that can use pretty basic technology or the technology you already have on campus and give you that extra power and that extra knowledge in a really simple way. And so for a lot of the administrators, they feel like, hey, what's the harm, right? I mean, this is a fairly cheap system to turn on. We get a ton of information out of it. And the company, you know, does all the heavy lifting. And so it's for some of these administrators been an easy sell. But there's also been this ethical argument from some of the professors and administrators as well, too, saying like, hold on a second. Is this a level of surveillance too much? Is this privacy invasive?
Starting point is 00:27:46 hold on a second, is this a level of surveillance too much? Is this privacy invasive? Is this something that the students even really understand what we're doing? And so there's this big debate on a lot of colleges as to whether this is an appropriate level of student supervision or whether this has gone too far. Are the students given any ability to opt out? In most of these cases that I've seen, yes, they are given the ability to opt out. With the Bluetooth beacons, they can say no and then kind of check in in a much sort of slower, less efficient way in person with the professor. They still have to sort of mark their attendance in some way. But with the Wi-Fi one, if you connect onto campus internet, which is pretty much every student on campus, you are given an opt-out, but it's sort of in a way that it suggests the opt-in window says, do you want to, you know,
Starting point is 00:28:30 increase campus security and, you know, make the experience better for everybody? If you choose to opt out of that, that's when you opt out of the program. But, you know, as with every kind of conversation about informed consent for these, the feeling is that a lot of these students are probably seeing the window pop up and just clicking yes, as with all of us, right, when we install a new app. And so there's a feeling from even when I talked to a lot of these students that they didn't really understand what the trade-off was. They didn't understand what data was being collected and how sort of granular their schedules were being monitored. Yeah, one of the things you highlight in your story is how this could be applied to student athletes. And I can see there being extra attention to student athletes who are on scholarship.
Starting point is 00:29:18 Obviously, the universities have a big investment in them. Yeah, and this system actually got its roots in monitoring student athletes. The feeling was for a lot of these athletes, you know, like you said, they are volunteers. They are already pretty hyper surveilled. They already have human class checkers kind of go into their lecture and make sure that they're in class because, you know, the team wants them to be eligible to play on the weekend. And part of that is fulfilling some basic academic requirements. So the system kind of got its start by being, hey, let's automate this. Let's make it so you don't need a human to check in. They just sort
Starting point is 00:29:56 of go on their phone and tell the team. And so right now that system is in play to the tune of if a student doesn't go to class, one of these student athletes, or if they're more than a couple minutes late, one of the advisors on the team or one of these sort of academic specialists will text the student athlete and say, where are you? How can I get you to class? So, you know, there was a feeling that for these student athletes, it's okay, right? Because, you know, they get something out of the arrangement too. And, you know, for a lot of them, they're on scholarship, but the school wants them to be there. But there's also been kind of this slide into the more general student population. And so this will be students that maybe are paying their way through school, or maybe there's no real
Starting point is 00:30:38 academic requirement except their own personal sense of responsibility. And it's obviously a bigger base of students as well. This isn't just the couple hundred that are kind of playing athletically to represent the school. It could be tens of thousands across campus. And so it becomes sort of a question of, is there a level of surveillance creep here to where it goes from student athletes to all students to maybe all faculty? And you see this going, not just, you know, this kind of surveillance systems used not just on school campuses, but increasingly in workplaces. And you just kind of see a couple colliding factors here. The technology is so cheap, the desire to
Starting point is 00:31:16 gain that amount of sort of mass data by campus leaders, employers is really high. And it's just extremely easy to turn on. We all have sort of location trackers in our pockets at all times. So all of those dynamics are sort of colliding and, you know, you see it happen on school, but who knows where it goes to next. What's the reaction to this story been? What sort of feedback have you been getting? There's been kind of an alarm from a number of students and parents and professors even that feel like this is incredibly invasive to students' privacy. You know, there's been a number of professors, though, too, and parents as well saying,
Starting point is 00:31:59 you know, these are young adults that we want to make sure are doing the right thing. If we can increase their attendance, if we can increase their student performance, make them more successful graduates, maybe a couple of nudges aren't that bad. And, you know, there's also sort of this fatalism in terms of our locations are being tracked all the time by our phones anyway. If the school I go to thinks it's important to use that data in a way that can maybe even benefit me or benefit my college education, then what's the issue? But I think one sort of interesting kind of societal reaction has been from people who feel like this is like a technological nanny state, right? You're taking these young adults who are put onto a college campus,
Starting point is 00:32:50 expected to sharpen their own sense of personal responsibility, go through this period of adolescence, make mistakes, you know, do all the things that you're expected to do in a university environment. And they're taking that and making it so there's this extra layer of oversight by this technical system. And you're almost supplanting that sort of self-starter mentality that you're trying to grow in college and saying, you know, it's just another box to check. It's just another technical system to satisfy. So I think that part has been really interesting to me, this feeling that maybe these students are being infantilized along the way by having to make sure that their attendance point average is satisfactory, but not really having to think about what's my sense of self-responsibility? Should I just be doing this because I want it, not because I need to police some other system?
Starting point is 00:33:33 I also can't help wondering about a chilling effect on people's ability to explore things, you know, or even, you know, visit a particular religious organization on campus or go to the health clinic or, you know, check out the LGBT community or something like that. If you know someone's looking over your shoulder, it could have a chilling effect on those sorts of things. Yeah, absolutely. And a college is full of sensitive places, right? And young adults do all sorts of sensitive things that they may not want a school leader or a parent or a professor or an administrator to know about. And we've often celebrated that, right? This is their independence. We want them to build that sense of autonomy. And there has been research on this too, that students and many of us really
Starting point is 00:34:22 change our behavior when we know we're being watched. We don't pursue the kinds of activities or behaviors that we would do if we felt like our privacy was preserved. So it changes our behavior. You can see kind of the benefit in a system that could get students in class more, right? But you can also see the peril of a system that can be used to prevent somebody from, like you said, I mean, going to a health clinic for a health scare or pursuing a different faith. The danger there is just so real. And even if these colleges aren't misusing the data, right, that implicit fear from the students, that feeling that someone's always watching, you just have to worry about what that could subtly do to these young adults as they're kind of building out
Starting point is 00:35:07 the rest of their life. You know, my co-host and I were joking when we were discussing your article that there's an opportunity here for some enterprising students to gather up other people's mobile devices and take them to the library and take them to places where there's healthy food
Starting point is 00:35:24 and check them into their classes, you know, be that person. There's always money in fooling the system, right? I can imagine. If I were a student entrepreneur, that's what I would be pursuing. Right, right. All right. Well, Drew Harwell, thank you so much for taking the time for us. The article is Colleges are Turning Students' Phones into Surveillance Machines, Tracking the Locations of Hundreds of Thousands. This is in the Washington Post. We appreciate you taking the time for us.
Starting point is 00:35:50 Yeah, thanks for having me. All right, Ben, what do you think? Well, first of all, thanks for Mr. Harwell for coming on the pod. His article was fascinating enough for us that we did a segment on it, so I'm glad we were able to talk to him. that we did a segment on it. So I'm glad we were able to talk to him. I think a couple of things stick out at me. One is that this is much more of a human interest story in some ways than it is a technology story. We're talking about surveillance that is so pervasive
Starting point is 00:36:13 that students are under constant fear that they're going to be watched. And that can change people's behavior. As Drew was saying, they may not seek to join certain religious institutions or they might decide not to go to the health clinic for something, you know, that they might be embarrassed about. Right. And something that could actually end up being a huge health scare, like, for example, some sexually transmitted disease.
Starting point is 00:36:35 So in that sense, it could really have a profoundly negative impact on an entire generation. And the question, of course, is granting that it would have that impact, is it worth it? You know, I guess it depends on what your priorities are. He did talk about some of the benefits of it. We can figure out if student athletes are going to class. We can get predictive data on, you know, students who might have depression, that sort of thing. But, you know, it seems, at least from my view, the costs from a human interest perspective are so significant that they would seem to outweigh the benefits. And we all survived in college. We all made mistakes and we all went to places we shouldn't have gone and stayed up to hours to which we should not have stayed up.
Starting point is 00:37:21 But we made it. Right. have stayed up, but we made it. And I just think, is this big brother type system necessary to the extent that it would justify what's really a major invasion on civil liberties? And at the very least, it's good that the Washington Post and before that, some campus newspapers have shined the light on this, because at least we can start to have this conversation. Yeah. One of the things I think about is that just the very fact that this data is being collected and there's a third party involved and everybody, of course, says we're being safe about this and it's all very confidential and we would never use it for anything wrong. But as we know,
Starting point is 00:38:00 time passes, things leak, things get cracked, encryption gets broken. And I can imagine 20 years down the line, that presidential candidate who's there at a debate and someone says, you know, Madam Senator, back in college, we have this tracking information that says that, you know, you visited the health clinic half a dozen times over the course of a few days. You know, please tell us about that sexually transmitted disease you had. And off we go. Or, you know, this senator tells us she knows about political history, but she only showed up to one out of 24 political science 101 classes her freshman year. Right, and she stayed up all night in her boyfriend's dorm.
Starting point is 00:38:42 You know, I mean, yeah, we could do this all day. I'm already thinking about what my college could potentially have on me that might affect my future political career. But at least I was not subject to this type of surveillance. Absolutely. Like I said, it's good that the conversation is starting. One thing that is particularly bothersome about all of this is it's not like there's a general announcement on the first day of school, you know, when you're checking into your dorm, like, hey, just so you know, we're conducting this very intensive personalized surveillance. Or even just make it opt in. Absolutely.
Starting point is 00:39:17 Right? Because it's not opt in and you can't really opt out. You could decide not to get on campus Wi-Fi, but then you can't really do anything. You can't do your homework. You can't do research. You can't make social arrangements. So, yeah, I mean, to me, it's very problematic. It seems like a solution that's going to end up just causing more problems than it fixes. But then again, my kids are too young to go to college, so maybe I'll feel differently when they're 18.
Starting point is 00:39:42 And, you know, I want to know how many times they're going to the intro to polycycloids. It does happen, Ben. It does happen. Cyber threats are evolving every second, and staying ahead is more than just a challenge. It's a necessity. That's why we're thrilled to partner with ThreatLocker, a cybersecurity solution trusted by businesses worldwide. ThreatLocker is a full suite of solutions designed to give you total control, stopping unauthorized applications, securing sensitive data,
Starting point is 00:40:22 and ensuring your organization runs smoothly and securely. Visit ThreatLocker.com today to see how a default deny approach can keep your company safe and compliant. The Caveat Podcast is proudly produced in Maryland at the startup studios of Data Tribe, where they're co-building the next generation of cybersecurity teams and technologies. Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. Our coordinating producers are Kelsey Bond and Jennifer Ivan. Our executive editor is Peter Kilby. I'm Dave Bittner. And I'm Ben Yellen. Thanks for listening.
