CyberWire Daily - Exploring the cultural values of personal privacy. [Caveat]
Episode Date: September 7, 2020
Dave shares a story about our own state of Maryland trying to crack down on ransomware, Ben shares a New York Times story about facial recognition software, and later in the show our conversation with Stuart Thompson from the New York Times on the article, Twelve Million Phones, One Dataset, Zero Privacy.
Links to stories:
How ransomware bill would tighten focus on the threat in Maryland
The Secretive Company That Might End Privacy As We Know It
Got a question you'd like us to answer on our show? You can send your audio file to caveat@thecyberwire.com or simply leave us a message at (410) 618-3720. Hope to hear from you.
Transcript
You're listening to the CyberWire Network, powered by N2K.
Calling all sellers.
Salesforce is hiring account executives to join us on the cutting edge of technology.
Here, innovation isn't a buzzword.
It's a way of life.
You'll be solving customer challenges faster with agents, winning with purpose, and showing
the world what AI was meant to be.
Let's create the agent-first
future together. Head to
salesforce.com slash careers
to learn more.
But it's too late.
Like, you can't get your data back.
You don't know where it's gone.
You don't know who has access to it.
You can't delete it.
You can't request it.
There's no method for you to find out anything about what's known about you.
Hello, everyone, and welcome to Caveat, the CyberWire's law and policy podcast.
I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Hi, Dave.
On this week's show, I have a story about our own state of Maryland trying to crack down on ransomware. Ben shares a New York Times story about facial recognition software. And later
in the show, my conversation with Stuart Thompson, also from the New York Times.
We're going to be discussing his recent article,
Twelve Million Phones, One Dataset, Zero Privacy. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any
of the topics we cover, please contact your attorney. We'll be back after a word from our sponsors.
Hey everybody, Dave here.
Have you ever wondered where your personal information
is lurking online?
Like many of you,
I was concerned about my data
being sold by data brokers.
So I decided to try DeleteMe.
I have to say,
DeleteMe is a game changer.
Within days of signing up, they started
removing my personal information from hundreds of data brokers. I finally have peace of mind
knowing my data privacy is protected. DeleteMe's team does all the work for you with detailed
reports so you know exactly what's been done. Take control of your data and keep your private
life private by signing up for
DeleteMe. Now at a special discount for our listeners. Today, get 20% off your DeleteMe plan
when you go to joindeleteme.com slash n2k and use promo code n2k at checkout. The only way to get
20% off is to go to joindeleteme.com slash N2K and enter code N2K at checkout.
That's joindeleteme.com slash N2K, code N2K.
And we are back.
Ben, before we jump into this week's stories, we got some feedback from a listener who wrote me over on Twitter. His name is Tom, and he said, Dave, fan of your podcasts. Question on
today's caveat podcast. It sounded like the caller question was regarding revenge porn.
You both said that there's nothing that can be done. I'm not sure that's entirely accurate
anymore. I believe a number of states have passed anti-revenge porn laws. Is it possible to have
that confirmed by Ben and amended if true? Also possibly make mention of the different groups,
such as Badass Army, that work to help victims of it. Keep up the great work across all these
shows. Enjoy them all from Tom. So yes, Ben, you and I had a conversation. We were talking about
revenge porn and this notion of whether or not someone steals photos or shares photos that
were shared with them. But there is some nuance. Can you clarify here? So first of all, thank you
for writing us as always. Everybody is welcome to do that on your social media platform of choice.
Tom is correct. There are actually 41 states that have some version of revenge porn statutes on the books. The statutes vary in terms of the punishment. So
some of them prescribe injunctions where the person who posted the revenge porn is forced to take it down, formal warnings, takedown notices, other infringement notices; that varies across
states as well. One thing that's notable that I think gets to this listener's question is revenge porn has a
very specific definition and not to, you know, get too X-rated on our G-rated podcast. But revenge
porn, the way it's defined in most states has to do with depicting sexual acts, explicit stuff,
explicit stuff. Yeah, well, let's just put it at that, which means that posting, you know, racy photos of your
ex-girlfriend or ex-boyfriend, well, ethically questionable to say the least, wouldn't count
as revenge porn for the purposes of these statutes.
Even though obviously that image could be used for blackmail purposes, for some of the
same purposes that people use revenge porn for.
I think part of it is lawmakers have to draw the line somewhere. And the line they've drawn is
around explicit acts. So, for example, if someone posted a picture of a woman in a bikini,
not revenge porn. In most states, it is not. What if she were nude? Then in most states,
that would be revenge porn. Okay. They talk about showing
private body parts as another part of what defines a typical revenge porn statute. Okay.
You know, I think like if you wanted to look for guidance, most websites, social media websites,
terms of services would probably be a good place to start. You know, you can post bikini photos on
Facebook. You can't post naked pictures. And I think that's the general reasoning behind these statutes. Yeah, there's also a lot of
variety in terms of what we would call the mens rea or the criminal intent of the person posting
revenge porn. So states often look at whether a person actually intended to harm the reputation
of the person whom they were depicting. Some states don't take that into
consideration. The simple act of posting the image itself violates the statute.
Oh, that's interesting.
Obviously, consent is a huge element of it. So most states have elements where if there's been
explicit or implied consent by the person whose images are being posted, that does not count as
a violation of the statute.
And also, I think, as Tom points out, there are organizations, one that I know of that he
mentions here, they call themselves the Badass Army. They are on Twitter if you search for
Badass Army. And they help people who've been victims of this sort of thing, help guide them
through what their options are,
provide support. But from everything I've seen, a good organization out there doing good work to try to help people who find themselves victims of this sort of thing. Yeah. You know, whether it
violates the law or not, when you are a victim of any type of online harassment like this,
and that's what this is, it's harassment. It's good to know that there are resources out there. Yeah. All right. Well, Tom, thank you for sending in your kind note
and for asking us for some clarification. Always good to do that. Let's move on to our stories.
I'll kick things off for us this week. My story comes from Delmarva Now, which is a publication
that covers the Maryland, Delaware, Virginia area.
And this is about Maryland legislators who are introducing a ransomware bill that's going to
tighten down the rules when it comes to ransomware, increase some penalties,
just sort of try to bring this into focus. This was written by Wesley Brown from Capital News Service.
So we've got some Maryland legislators. Senator Susan Lee, a Democrat from Montgomery County,
sponsored the bill. And it sounds to me like what they're doing here is just trying to put the word out to criminals that even just possession of ransomware would be a crime in the state of Maryland.
So this would be a radical change in the law around ransomware in Maryland.
The way the current statute works, it is only illegal to actually use ransomware. This statute, if enacted, would criminalize the simple possession of ransomware unless
the person possessed it for research purposes.
So if you were in an academic setting and you were trying to study different types of ransomware, that would not be a violation of the statute.
But as we do in other areas of the law, oftentimes we criminalize the possession of something because the threat that the person will use whatever that thing is, is particularly dangerous.
That's why we prohibit, in many instances, the possession of certain types
of firearms, even if a person does not use the firearm. So we're criminalizing somebody for
simply possessing what can be very dangerous. And Maryland has experience, obviously, in dealing
with the dangers of ransomware. Most famously, Baltimore City was the victim of a ransomware
attack back in May of last year, crippled city systems for
several months. Took me a while to get my water bills. It was not fun to get a bill three months
later for $300. So obviously, this is a very serious problem. And the key with this piece
of legislation, and I think this is something that Senator Lee would agree to, is it's a deterrent. You want to make it as undesirable as possible for people who are even considering possessing ransomware to realize that their actions would
violate this law and they would be subject to pretty significant penalties. I think the statute
calls for something like a $10,000 fine or five years in prison as a maximum penalty. So it's a misdemeanor, but it's
a serious misdemeanor. So we are creating this deterrent effect. A couple of things noteworthy
here. One, just to shout out to one of your colleagues and certainly friend of the CyberWire, Markus Rauschecker. He testified on this bill? He sure did. Yeah, he is a friend of the show, also a fellow employee at the Center for Health and Homeland Security. Markus testified on this last year. The bill was not enacted in the 2019 legislative session. It was all Markus's fault. Just kidding, Markus.
So we're back here in 2020. They have made some changes to the bill to make it more palatable.
And from what I heard from Marcus, the hearing went very
well. And there seems to be broad increasing support for this type of legislation, largely
because not only Baltimore City, but also the smaller city of Salisbury, Maryland also suffered
a ransomware attack. So there's just more of an awareness for how much of a danger it is to our
localities. And this is a proactive measure for there to be
a deterrent. Another thing worth noting, I don't know if you were going to mention this,
is that a couple of other states, and this is what Marcus mentioned in his testimony,
Michigan and Wyoming have experimented with this approach. The approach is so new that we don't
actually have any quantitative research on whether this is an effective deterrent to
cyber attacks. That remains to be seen.
Well, let me play devil's advocate on that. A Maryland law would affect citizens of Maryland,
right? Do we think that the people who attacked the city of Baltimore were Marylanders? I don't.
Probably not.
What I'm getting at is how much of this ransomware is even coming from within the United States.
So I agree with your devil's advocate take. That's what lawyers do. Let's play devil's advocate. A couple of things, though. That sort of principle is applicable to a lot of different
state laws. OK, you could say why pass a state law on anything because somebody from Virginia
could come in and, you know, do this illegal behavior and then would go beyond
our jurisdiction, we wouldn't be able to catch them. Okay. So it's still a deterrent factor,
just because the state of Maryland is making a statement, this type of behavior
is criminal in nature. So it's sometimes just simply stating that the possession of something
is criminal itself acts as a deterrent. That ne'er-do-well whose bank account is almost empty, who's weighing their
options and saying, you know, I have some computer skills. Maybe I'll just
spray out some ransomware here. They may see this and think twice.
Absolutely. Yeah. I mean, it's the deterrent effect, I think, is something that, you know,
we use for all criminal statutes, even in instances where
the perpetrators are beyond our jurisdiction. I think the state of Maryland is just doing what it can in the absence of federal law that criminalizes ransomware and the absence of some sort of, you know, international law. We can only control what happens within our own borders. And so this is a start. And it's also beyond the tangible effects of the statute. And
I don't think I have an estimate of how many people per year or whatever would be prosecuted.
It's about the intangible, which is sending a message that possession of ransomware,
even if it is not used in and of itself, is both wrong and potentially dangerous.
So stay in school, kids.
Stay in school, kids. There are better ways to
make money. Betting on sports. No, just kidding. Don't possess ransomware. Yes, absolutely.
All right. Well, that is my story. Ben, what do you have for us this week?
So my story comes from the New York Times. It got a lot of play over the weekend. It was written by Kashmir Hill, and it's entitled The Secretive Company That Might End Privacy As We Know It. It's about a company called Clearview AI, started by a gentleman named Hoan Ton-That, a 31-year-old Australian. The way this application works that he created is
you can take photos of anybody on the street and using this technology's facial recognition
software,
it can match that person's photo to publicly available information. So if that person has posted social media pictures, or if that person is featured in a YouTube video, if their photo
is anywhere on the internet, it can be matched to the photo taken on the street.
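The article doesn't reveal Clearview's internals, but face-search systems like this generally follow a standard pattern: convert every scraped photo into a numeric embedding with a face-recognition model, then match a query face by nearest-neighbor search. Here is a minimal sketch of that generic pattern; the gallery, URLs, and vectors are synthetic stand-ins for illustration, not Clearview's actual code or data.

```python
# Generic face-search sketch: nearest-neighbor lookup over embeddings.
# Everything here is synthetic; a real system would compute embeddings
# from scraped photos with a face-recognition model.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in gallery of 10,000 unit-length embedding vectors, each tied to
# a made-up photo URL.
gallery = rng.normal(size=(10_000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)
urls = [f"https://example.com/photo/{i}" for i in range(len(gallery))]

def best_matches(query: np.ndarray, k: int = 5) -> list[tuple[str, float]]:
    """Return the k photo URLs whose embeddings are closest to the query."""
    scores = gallery @ query            # cosine similarity for unit vectors
    top = np.argsort(scores)[::-1][:k]  # indices of the highest scores
    return [(urls[i], float(scores[i])) for i in top]

# A query embedding would come from the photo taken on the street; here we
# perturb a gallery vector so the demo has a true match.
query = gallery[42] + 0.05 * rng.normal(size=128)
query /= np.linalg.norm(query)
print(best_matches(query))
```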
And what was previously unknown about this relatively small startup company is that 600 law enforcement agencies across the United States have started to make use of this technology, have used Clearview AI to help solve crimes.
As you can guess, it's an incredibly effective crime fighting tool. They mentioned an instance, I believe it was in South Carolina, where two individuals got
into a fight in a park.
A bystander took a video of the fight.
The police were trying to find the perpetrators.
They used Clearview AI, matched the person's face with publicly available information,
got that person's name, and were able to effectuate an arrest.
So that side of it's very good.
The other side of it, of course,
is frankly very disturbing. We've talked a lot about technologies that are invasive of privacy
that would create a perpetual surveillance state. And tell me if you disagree. I don't think we've
ever come across a technology potentially as invasive as this. Largely because it's facial recognition, it means the thing that's most personal to us, our face, can be photographed in public, matched up to publicly available information online, and potentially used by law enforcement.
Right. So for years, for decades, we have been okay with our image being captured when we're out and about in public.
Surveillance cameras, which I think we all agree are reasonable for protecting retail places, for security.
Sure, security, yeah.
All reasonable. And I'm just imagining, you know, walking into
my local fast food joint and walking up to the counter and having the person behind the counter
say, oh, welcome back, Mr. Bittner. And they actually mentioned that in this article. They
say, you know, when they were first brainstorming, these guys came up with this startup and they were
brainstorming ways to use it. And one of their suggestions was, well, why don't we give it to hotels?
They can take a, you know, use security footage of a person walking into the hotel and then
make them feel very at home and welcome.
When they get to the front desk, you'd say, hello, Mr. Bittner.
Good to see you again.
At home and welcome is not the feeling that I'm going with here.
But all right.
This reminds me of the reaction to that location data article from about a couple of weeks ago that we're going to talk about in our interview segment today.
There's just this sort of depressed, angry reaction to the fact that this technology exists,
the fact that it's used by law enforcement agencies across the country. And there was sort of a tone of resignation among the founders of Clearview AI. Mr. Ton-That basically said,
social media sites themselves are scraping
users' images. You know, Facebook does it all the time. It's not us that invented the use of facial
recognition software. We are just sort of augmenting this tool. And then there was this other kind of
Orwellian quote. So one of the investors, one of the early investors in the startup, is a guy by the name of David Scalzo, founder of Kirenaga Partners, and he was interviewed as part of this article.
He dismissed concerns about Clearview making the internet searchable by face,
which is sort of the logical conclusion of where this would all be going if
Clearview AI expanded. And he said that he's come to the conclusion that, quote,
because the information constantly increases, there's never going to be privacy. Laws have to determine what's legal, but you can't ban
technology. Sure, this might lead to a dystopian future or something, but you can't ban it. I mean,
that's sort of a really eye-opening and frankly shocking statement of resignation, in my view.
And, you know, it's sort of up to the general public whether they want that viewpoint to be
the viewpoint of our policymakers. And in the absence of robust federal data privacy legislation,
Mr. Scalzo's view is the prevailing view. That's sort of what we've decided to accept
as a society. There is no privacy out there. We can have laws here and there that protect
personal privacy, personally identifiable information. But technology is limitless.
And whether the technology in question sounds dystopian or not, it doesn't make sense to ban it.
And so I think as a society, we're going to have to kind of reckon with that viewpoint.
All right. So a couple of things I want to ask you about.
First of all, there is no expectation of privacy when you're out and about in public.
Right.
Correct.
Right.
That is correct.
Yep.
So we know that we're overall probably pretty comfortable with that.
Yes.
So I'm thinking about the earlier example you were talking about of the two gentlemen
who got into a tussle.
It was Indiana, by the way. I got the state wrong. All right. My apologies to the great state of Indiana. Indiana wants me, Lord, I can't go back there. Yeah. So two gentlemen get in a tussle in the
park. They use the video to find them. How is this not a faster version of a police officer
canvassing door to door with a description saying, you know, it's a guy with bushy eyebrows,
a mustache and a sports jersey. Do you recognize this guy? This is just that, but faster.
Yeah. I mean, I think the whole issue is that it's faster and it's instantaneous.
Right.
So something that used to take hours of police work, you know, a lot of monetary resources,
human capital, staff resources now can take place with the click of an iPhone camera.
Casually.
Right.
Which means no downside.
Yeah.
And it's so easily done and replicated.
I think sometimes the only thing protecting our personal privacy in the past was the fact that it took a lot of work to conduct this type of surveillance. When it doesn't take a lot of work and it's cheap and it doesn't cost a lot of money or resources, then that type of
surveillance is going to be in mass use. So the fact that it is easier to identify people using
this technology in and of itself is the reason why it's so dangerous for personal privacy,
in my view. Yeah. You know, when I've talked to law enforcement about these
sorts of things, I remember specifically having a conversation with a chief of police about
license plate scanners. And his point was, well, don't you want that criminal to be tagged? Don't
you want I'm trying to make your family safer if that criminal drives by one of our scanners and
we see there's an outstanding warrant for that person, we're going to go get that bad guy. And I'm thinking with this sort of thing,
don't you want to know when that sexual predator gets too close to that elementary school? Don't
you want that information? That's the argument from law enforcement. And I think they would say
it's compelling. It is compelling. I absolutely agree with it. One thing they mentioned in this
article, and this was actually gleaned from Clearview's sales presentation, is that the app has helped identify some really bad people. Somebody accused of sexually abusing a child. That person appeared in the mirror of somebody else's gym photo, which is almost a stroke of luck.
It's like kind of a TV show.
It is.
Zoom in, enhance. Exactly. Yeah, it's a Law & Order
episode waiting to happen. A person behind a string of mailbox thefts in Atlanta. That one,
do we need Orwellian technology for mailbox theft? I don't know. But for something more serious,
you know, a John Doe found dead on an Alabama sidewalk. Absolutely. From law enforcement's perspective,
any tool they can have to help solve and prosecute crimes is useful not only for law enforcement,
but for the rest of us. Okay. So in your view, what's the balance we could strike with this?
I mean, that's the billion dollar question. I think, and I know we've talked about this before,
the legal protections and personal privacy protections need to be made robust in order to catch up with the technology. So you have to try to keep things in equilibrium. Because we can capture billions of people's faces just by strangers taking iPhone pictures on the street, the laws have to protect personal privacy in a way that's
just as robust. And whether that's Congress passing, you know, a type of CCPA, California Consumer Privacy Act law that protects personal privacy, or it's the court stepping in and saying,
you know, something like the Fourth Amendment, which protects us against unreasonable searches and seizures, has to be extended to cover this type of technology that
previously would have required some sort of invasive search or seizure into a person's
property or into a person's private life. So the laws have to be adjusted to keep up with these
rapid changes in technology. Now, that doesn't happen
for a variety of reasons. Legal bodies are not startups, in case you haven't noticed.
They tend to move at a different pace.
At a snail's pace. Yeah, sort of a snail versus the cheetah thing here. So that's one part of
the problem. And, you know, from a policymaker's perspective, law enforcement is in their ear too,
saying,
yeah, we know that this technology potentially does sound Orwellian, but here are all the bad
people we've locked up because of what Clearview AI has done. And if you can get in a lawmaker's
ear on that, it's almost hard to turn this down as a potential resource. I think I was struck by
the outrage I saw on my social media feeds by people who otherwise just aren't outraged about this type of thing.
So maybe by taking it a step too far, you know, it will raise that public consciousness so that Congress or the court step in and say, we have to extend privacy protections to match the scale of the technology that exists.
Well, and how important that organizations like the New York Times are out there
bringing these sorts of things to light.
Yes, I subscribe mostly for the crossword puzzles.
But yeah, I mean, they've done some very, very important work in this area.
This was investigative work in and of itself.
This person was trying to hound down the founder of Clearview AI for several months.
The founder was getting very good at evading questions, had set up, I believe, a fake LinkedIn profile for himself. Yeah, he listed himself on LinkedIn as somebody named John Good, which is, you know, a little on the nose, a half step better than John Doe. Right. Right. But I think, you know, eventually they agreed to speak with the author of this article just because I think having an article posted without their input would be worse. Right. Than one with their input.
All right. Well, certainly another interesting piece of work from The New York Times. We'll have a link to that in the show notes. It is time to move on to our listener on the line.
And had a caller call in this week with an interesting, somewhat specific question.
Here is this week's listener on the line.
Hello, I'm calling from Virginia and my job is in the field of digital forensics.
I wanted to know what are
the legal requirements for obtaining a private investigator's license when performing digital
forensics? Thanks. All right, Ben, I know you probably have the answer to this right off the
top of your head, right? Yes. No, that is absolutely not true. It's a great question.
Yeah, I did a little research on behalf of that question. I know this is probably
the least satisfying legal answer people ever get, but it really does depend on the state.
So some states like Texas do require a private investigator's license for most types of
computer forensic specialists. Other states like California generally do not require the license
because the nature of the work by computer forensics experts is so fundamentally different.
In the course of their work, you know, computer forensics examiners, they're not engaged in the type of activity that falls within the purview of the definition of private investigators. You know, sometimes they are professional engineers conducting experiments. You know, they may be doing things like chemical testing, and this isn't just true for forensic technology; all different types of scientists may be testifying in court or continuing their forensic research. And so that was the basis of the California law. I would note that that California
law seems to descend from a state court decision. I was reading an article on this; the state department that has jurisdiction over this issue didn't quite know how to answer the question you presented. But they ultimately, based on California court precedents, said that the work done by computer forensics experts falls outside the jurisdiction of the Private Investigator Act. So I would recommend
that you look up your own state statute on what's required of private investigators and whether a forensics expert
has to get one of those private investigators licenses. There are a lot of resources out there
that have done 50 state surveys on the scope of private investigator licensing laws across all 50
states. So I would encourage you to check that out. All right. Yeah. Interesting stuff. And
thank you for sending in that question. We would love to hear from you. If you have a question for us, you can call and leave a message. Our number is 410-618-3720. That's 410-618-3720. You can also email us an audio file. That would be caveat@thecyberwire.com. Send in your audio file and perhaps we will use it on the air.
Coming up next, we've got my interview with Stuart Thompson from the New York Times. He's
here to talk about his article, Twelve Million Phones, One Dataset, Zero Privacy. That bombshell article,
Ben, you and I have discussed previously. Really interesting conversation. So stick
around for that. But first, a word from our sponsors.
That's where Domo's AI and data products platform comes in. With Domo, you can channel AI and data into innovative uses that
deliver measurable impact. Secure AI agents connect, prepare, and automate your data workflows,
helping you gain insights, receive alerts, and act with ease through guided apps tailored to
your role. Data is hard. Domo is easy. Learn more at ai.domo.com. That's ai.domo.com.
And we are back. Ben, I recently spoke with Stuart Thompson. He is one of the co-authors
of the New York Times article, Twelve Million Phones, One Dataset, Zero Privacy, a bit of a bombshell article you
and I spent some time discussing recently. Here is my conversation with Stuart Thompson.
The source came to us after reading some of the privacy project work that we've been doing this
year, basically shining a spotlight on privacy and technology sort of again after, you know,
sort of that's been a topic for a while, but we've come back to it. And, you know,
they had access to this data and were concerned with what they saw and wanted someone to shed
light on it and really give a sense of the volume and scale of the collections going on. And this is,
you know, mobile phone locations and sort of tracking where you go and what you do in your
life. And it's not really treated as a serious or concerning thing in the industry. And the sources were concerned and wanted somebody to argue for change. And that's
what we tried to do. What can you tell us about the data set itself? I know your source wanted
to remain anonymous. So of course, we'll respect that. But the data set itself, how unusual is it
among the people who trade in this sort of thing? It's pretty common. It's a huge industry.
We're talking about millions and millions of dollars, lots of investment going on. And,
you know, we counted, you know, probably around 70 companies trading in this data in some capacity
or working around it. There's probably more than that. You know, it's probably not too unfair to
say it's a shadowy business. They're not very public facing. They're not companies that have, you know, household names, although some of the household
names do participate in location tracking, like, you know, Facebook and Google.
But there's a lot of companies you've never heard of, like Teemo and Fidzup and Cuebiq and
Factual and these kinds of companies that, you know, they're not household names.
And they've been collecting this data for a decade since the App Store
pretty much launched and allowed companies to get access to GPS data on your phone.
They've been collecting it. And one of the more popular companies that people probably know is
Foursquare. And if you were like me in the sort of beginning of the App Store days, you might have
been checking into a business and trying to be the mayor of your local Starbucks or something.
But little did we know at the time. That has some kind of utility, sort of like a social network kind of thing.
But actually way more valuable is turning that data into business insights.
And that's what's been going on for a decade.
But we've never really been able to see what that looks like.
We kind of have an idea of tracking and we kind of think that we understand what tracking is like for targeted advertising,
but to actually see like a huge map of, you know, the nation's capital with so many dots
that it completely fills the screen is a different thing entirely.
Can you walk us through how you approach this data set? I mean, how did you get started?
How did you decide where you were going to begin?
Yeah, the data is actually really simple, which is maybe another threat, given how easily it can circulate around.
It's just, you know, think of it like an Excel spreadsheet,
if you've ever used that, or, you know, a table, a simple table.
It's got latitude, longitude, date, time, a user ID,
and the duration of their stay in one place in seconds.
And, you know, with just a couple of simple tools, you can start mapping that out.
So we built a couple of different dashboards and things that let us analyze and filter
it.
And basically what we were trying to do is assess risk.
And we more or less came up with a list of things like what might be risky in here.
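To make that structure concrete, here is a minimal sketch of the kind of loading and filtering described above, assuming a CSV with the columns Thompson lists. The file name, column names, and bounding-box coordinates are hypothetical, not details of the Times' actual tooling.

```python
# A minimal sketch, not the Times' actual tooling: load a ping table with
# the columns Thompson describes and filter it to a bounding box around a
# sensitive site. File name, column names, and coordinates are made up.
import pandas as pd

pings = pd.read_csv(
    "location_pings.csv",          # hypothetical file
    parse_dates=["timestamp"],     # date/time of each ping
    dtype={"user_id": str},        # the anonymous advertising ID
)

# Rough bounding box around the Pentagon (approximate, for illustration).
in_box = pings[
    pings["lat"].between(38.868, 38.876)
    & pings["lon"].between(-77.062, -77.050)
]

print(f"{in_box['user_id'].nunique()} distinct devices pinged inside the box")
```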
Unlike previous releases of data, including the stuff that the Times has reported on,
which is sort of the foundation for a lot of our reporting. This data included Washington, and that raised some national
security questions like, is the Pentagon in there? Are the nation's spies being tracked?
Are congressmen being tracked? Are government buildings excluded from this or not? And we
basically found no matter where we zoomed into or how we filtered, there was a data point somewhere.
So we just started looking at, we zoomed into the Pentagon and we thought, okay, maybe there
won't be any dots here because it's a secure facility.
But that wasn't the case at all.
There were hundreds and hundreds of phones being tracked.
The one exception is maybe the CIA building, which had no pings inside of it, but it didn't
really matter because there were pings in the parking lot and that led back to people's homes and sort of gave clues to the IDs. So that's what we used instead.
Help me understand here, once you have established who an ID is associated with,
I mean, is that really the key to being able to track an individual?
Yeah. So the data includes an ID. And if you look at any individual point, that's not that useful.
Like a point in a house might give you something.
But if you can't connect that to a bunch of other points, then it doesn't give you very much.
But that's useful for marketers to know where individual people go.
So they attach an ID to it.
And if you pull out one person's or all the points associated with that ID,
you basically get a bunch of points around the city.
And it catalogs where they go. And the most common cluster of location pings was
their house usually, because their phone is just sitting at their house most of the time at night
or when they're home eating dinner and it's just pinging their locations. There's a huge cluster
right at their house. And the second most common cluster is at their workplace, because also where
they spend a lot of time. So they'll be in meetings and having lunch in the break room, whatever, and their phone is also pinging a lot. So those two points are actually super revealing to an identity. And the data doesn't have a name, it doesn't have a phone number. Like a lot of people ask us, so can you look up my phone number in there? It doesn't work like that. You know, the industry calls it anonymous and that's really fundamental to their business
plan to be like, you know, don't worry about us.
Don't regulate us too hard.
It's anonymous data.
And they're true in a way because there aren't really any identifiable features in the data
itself other than the location.
But if you think about where you go, you probably travel from home to your workplace.
And how many people make that journey every single day?
And that's the kind of data that's in here. So we can look at a house and back up from there and
look at public information that we have access to that shows their name, who owns the house.
And then people post stuff online. They have a LinkedIn profile, they have a Twitter account,
whatever. There's a newsletter somewhere, someplace on the internet, there's some breadcrumb
that mentions your name and your workplace. And really, that's a huge spotlight on who you are potentially.
One or two other data points is enough to feel very confident about who it is.
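As a rough illustration of the home-and-work pattern described here, the sketch below snaps each user's pings to a coarse grid and keeps the two densest cells per user as candidate home and work locations. The grid size and column names are assumptions for illustration; the article doesn't document the reporters' actual method at this level.

```python
# A rough sketch of the home/work clustering idea, not the reporters'
# actual method: snap pings to a ~100 m grid and take the two cells
# where each user has the most pings.
import pandas as pd

def top_dwell_cells(pings: pd.DataFrame, cell_deg: float = 0.001) -> pd.DataFrame:
    """Return each user's two most-visited grid cells (likely home and work)."""
    df = pings.assign(
        cell_lat=(pings["lat"] / cell_deg).round() * cell_deg,
        cell_lon=(pings["lon"] / cell_deg).round() * cell_deg,
    )
    counts = (
        df.groupby(["user_id", "cell_lat", "cell_lon"])
          .size()
          .rename("n_pings")
          .reset_index()
    )
    # The densest cell is usually home (nights); the second, work (days).
    return (
        counts.sort_values("n_pings", ascending=False)
              .groupby("user_id")
              .head(2)
    )
```

From there, the re-identification step Thompson describes is manual: property records for the home cell, public profiles and other breadcrumbs for the work cell.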
As you combed through the data, were there any particular individuals you were able to key into that you found particularly chilling or unsettling, and how easy it was to track someone?
Yeah. So we basically started looking at the data thinking, can we de-anonymize
the data? And you're looking at basically just a bunch of points, like just a bunch of dots on a
map. It's not really too scary, but the first time I was successful in tying a dot to a person by
looking at a house and a workplace and a couple
other places the person went to confirm who they were. It was really scary. It was freaky.
I got goosebumps. I sat back on my chair, took a deep breath. I think I went for a walk.
It turned the data from something that's just a spreadsheet and a bunch of points on a map to
someone's diary. It felt pretty invasive at that
moment to turn something that's innocuous and sold and traded for profit to something very
revealing about someone's life. So it didn't have to be a prominent person to feel chilled by what
we were seeing, but then there were prominent people in there. So we listed in the story a
couple of people, senior defense
department officials, secret service agents. And we found one secret service agent, who we believe
is a secret service agent, following Trump around on a high profile weekend with a foreign prime
minister visiting at Mar-a-Lago. And you see the path in the story and it's like, whoa, that's pretty crazy.
But you have to think we started out being like,
maybe this is fully airtight.
Maybe there's no tracking at all of people
within a literal arm's reach of the president,
but that's not the case.
Everyone was sort of involved in this.
Now, the people who are buying and selling these data sets,
how restrictive are they in terms of who they will sell to?
If I went to one of these companies and I said, hey, I want to buy a bunch of this data,
how hard would it be for me to get my hands on it?
It's a great question. And we can only sort of speculate based on what companies say. We've had
we talked to some sort of insiders, former employees and so on. But the truth is there's
nothing legally stopping them from selling to anybody. So you can imagine, and the former CEO of Foursquare says he was turning down
million dollar deals to sell batches of data. Now that's easy for Foursquare to do because they have
a big business doing other kinds of analysis. But if you're a startup, a small company with 12
people, you have an incentive to make some money and there's nothing legally stopping you from
selling this.
And we had a former employee of one of the companies tell us, contracts like this go
for millions of dollars.
So yeah, I mean, the companies say they work with trusted businesses, they have a vetting
process that includes references, you have to have a plausible business case.
Those are the kinds of parameters that companies tell us about. But the truth is, there's no way to really say that that's all that happens. Of course, companies want to say that, and it's hard to evaluate because there's no obligation for them to report who they sell to, where they get data from, or where it goes. And we can piece it together by looking at some of the big players that, you know, analyze the data and, you know, boast about the analysis they're
able to do. But it's a black box in a real way. What's the reaction been to the story? I'm
thinking specifically, have you heard anything from anyone, say, on Capitol Hill?
So Congress seems extremely distracted right now.
That's a fair point.
Yes.
So we haven't seen a bunch of hearings triggered right away, but I think there's definitely
a number of senators who are interested in privacy and have put forward their own privacy
bills.
And this kind of thing I think adds to the sense of unease that people have.
And when privacy returns to the forefront, we expect some sort of federal privacy bill,
definitely within the next couple of years,
I hope this year, but it's an election year,
so we never know, but there's definitely going to be
a federal privacy law coming, and this kind of exposure
is really important for people to understand the scale
of what's going on in the industry that pretty much
operates invisibly, like in the background of your phone.
How has this changed your attitude to this sort of thing?
Do you have a different perspective, having been through the work that you've done here
on how you interact with your own mobile devices?
Oh, yeah.
I'm a nut now.
I'm totally freaked out.
My editor was joking that I was going to file this story by carrier pigeon because I would
be living in the woods somewhere.
But yeah, I mean, when you see it and, you know, I started off the privacy project when they first told me like, oh, we're going to do a series on privacy.
I was like, OK, like I remember writing about privacy like a decade ago.
And I was like, I don't know if I really care.
And like maybe, you know, if you consent, they give you a screen and you consent to it.
So what's the problem? I think it's a little bit like we're all brainwashed a little bit to think that the status quo is the right kind of, the only way it can work.
And when I started looking at this data, I felt really sort of trapped.
I couldn't escape the industry if I wanted to. And a lot of people,
they don't want to right now. But, you know, maybe after seeing the story or maybe after
seeing some more stuff, they might feel like they don't want to participate. But it's too late. Like
you can't get your data back. You don't know where it's gone. You don't know who has access to it.
You can't delete it. You can't request it. There's no method for you to find out anything about
what's known about you. And, you know, you might change your mind down the road if you don't care, like I did. I used to use
Foursquare. I used to think location services were a cool thing on a phone. It's like, what a cool
way, what a cool additional piece of technology we have access to now, where you can monitor your
life, but it's extremely valuable to businesses and there's no oversight. There are very few legal limitations on what they can do with it.
So yeah, I'm totally freaked out.
I turn off my location services at all times on my phone, for anyone who wants it.
And yeah, I mean, it's scary,
but I know a lot of people don't do that
and they're not totally equipped to deal with it.
The other thing I'll say is like we published our story,
which did great and got a lot of attention, but the second most read piece in the whole series wasn't our
expose on the national security threats. It was how to protect your phone. It was the
three steps to protect your phone. People want to know that stuff, and I think it's
the responsibility of the companies like Apple and Google to make that much easier to adjust,
because, you know, we need to publish an article that gives you three complicated steps and a bunch of videos to protect your phone, when it should be much easier than that.
But what are your thoughts on potential mitigations for this sort of thing? Is this
something where we should see regulations or legislation? Do you have any insights there?
It's complicated, the solutions that Congress can put forward.
I think generally, I'd love to see more transparency. I'd like companies to have to
publish where the data goes and what they do with it. If Congress looks at this issue and makes a
federal privacy law, and the end result is longer privacy policies, it's going to be a total
disaster. And so far, a lot of the laws that get proposed are essentially more notice and more consent. And that's not really effective. I mean, we've
seen that, you know, you might consent to share your location with an app, but you don't know
where that's going. And it shouldn't really be up to you to have to decide all that. So I'd really
like to see more pressure put on companies in a tangible way. You could really put a lot of limitations,
like the timeframe that you can keep the data for, the use cases that you can use it for,
is another borrowing from the European privacy law. So if you state that you want to use it
for advertising, you then can't use it for hedge fund analysis or something. It has to be for the
strict reasons that you state up front. And yeah, I mean, the companies right now, they can keep the data forever.
That's, you know, privacy risk for sure.
So it'd be nice to have some limitations on that.
And there's people arguing that this stuff just shouldn't exist at all.
There's a use case where, you know, you give your location to an app and it might use it
within the app environment.
But what we're reporting on are companies that sit inside that app, have nothing to
do with the app, and then use it basically as like a mining tool to collect your location data.
And that seems to me like something that maybe should stop.
All right, Ben, what do you think?
Well, it's just great to hear from Stuart.
First of all, we spent a lot of time on this podcast, and I've spent a lot of time in my own brain wrestling with the implications of this article.
So it was just nice to hear from him. And also nice to hear that he too has been wrestling with this.
What was interesting to me is that he was put on this privacy project, not being somebody who was obsessed with digital privacy.
He came at it from a perspective that I think a lot of people come at it with, which is we all press, I agree to the terms and conditions.
Right. If I have nothing to hide, why should I be scared?
And then he did this extraordinary research project and realized that there are a lot of reasons to be disturbed. Not only the
violations of personal privacy, but some of the things he uncovered about dangers to national security, that phones were pinging inside the Pentagon, outside Langley, the CIA headquarters.
And so it took him doing this extensive research to be at the place where he is now, where
he turns off location services on all of his applications.
And his colleagues are joking about him doing the story from in the woods. And I think sort of the logical leap to make from that is the more all of us would get into the weeds on how invasive this technology is, the lack of transparency, the more it would concern all of us. So sometimes it's just a matter of, I wouldn't necessarily use the word ignorance, but just not understanding.
Resignation is the word I think of.
Yeah. A lot of it is resignation because we, and I know we've talked about this too, we
like having location services for all of the conveniences it provides. And it does provide
a lot of conveniences. And another point I thought was very meaningful that he made is, yes, we do have the opportunity to consent to location sharing within each application. But unless you do the
type of research that he's done, most users not only have no idea where that information is going,
but they shouldn't really have an idea where that information is going. It should not be up to the
users of these devices to figure
out the dangers of sharing their location. Just like if you think of, you know, a non-digital
product, it should not be up to the users, you know, to figure out where, you know.
I shouldn't have to keep a Geiger counter at my house to test the foods I bring into my home for
radioactivity. That is a perfect example. Mine was going to be, you know, if your dishwasher
stops working,
it shouldn't be up to you
to figure out
what cog in the machine
is causing that problem to exist.
But I like your example.
Yeah, well, you're right.
We have general rules
about radioactivity
and consumer products.
We agree that's a bad thing.
And so this would just be
another type of consumer protection.
He shares my skepticism
that Congress is going to act quickly on this, even though there's some interest. But I think
the place he wants to start, which is laws around transparency, just so that people,
if they choose to be aware of where their information is going, have the option to be
aware. They may not choose to be aware, but at least giving them that option.
Cyber threats are evolving every second, and staying ahead is more than just a challenge.
It's a necessity.
That's why we're thrilled to partner with ThreatLocker,
a cybersecurity solution trusted by businesses worldwide.
ThreatLocker is a full suite of solutions designed to give you total control, stopping unauthorized applications, securing sensitive data,
and ensuring your organization runs smoothly and securely.
Visit ThreatLocker.com today to see how a default-deny approach
can keep your company safe and compliant.
The Caveat Podcast is proudly produced in Maryland at the startup studios of DataTribe,
where they're co-building the next generation of cybersecurity teams and technologies.
Our thanks to the University of Maryland Center for Health and Homeland Security
for their participation.
You can learn more at mdchhs.com.
Our coordinating producers are Kelsey Bond
and Jennifer Iben.
Our executive editor is Peter Kilpie.
I'm Dave Bittner.
And I'm Ben Yelin.
Thanks for listening.