The Daily - The End of Privacy as We Know It?

Episode Date: February 10, 2020

A secretive start-up promising the next generation of facial recognition software has compiled a database of images far bigger than anything ever constructed by the United States government: over three billion, it says. Is this technology a breakthrough for law enforcement — or the end of privacy as we know it?

Guest: Annie Brown, a producer on "The Daily," spoke with Kashmir Hill, a technology reporter for The New York Times. For more information on today's episode, visit nytimes.com/thedaily.

Background reading: Federal and state law enforcement officers are using one company's app to make arrests in 49 states. So what is Clearview AI, and what influence does it hold? Clearview's app is being used by police to identify victims of child sexual abuse. Some question both the ethics and the accuracy of the results.

Transcript
From The New York Times, I'm Michael Barbaro. This is The Daily. Today, a secretive company promising the next generation of facial recognition software has compiled a database of images far bigger than anything ever constructed by the U.S. government. The Daily's Annie Brown speaks to reporter Kashmir Hill about whether the technology is a breakthrough for law enforcement or the end of privacy as we know it. It's Monday, February 10th. Kashmir, how did this story come to you?
So I got an email. It was a Wednesday morning. I was checking my phone. And it was from a tipster who had gotten a bunch of documents from police departments. And one of the police departments had sent along this memo about a private company that was offering a radical new tool to solve crimes using facial recognition. And what would make a facial recognition tool radical? So law enforcement has for years had access to facial recognition tools. But what this company was offering was unlike any other facial recognition tools that police have been using because they had scraped the open web of public photos from Facebook, from Venmo, from Twitter, from education sites, employment sites, and had a massive database of billions of photos. So the pitch is that you can take a picture of a criminal suspect, put their face into this app, and identify them in seconds. And when you read this memo, what do you make of what this company
Starting point is 00:02:00 is offering? So I've been covering privacy for 10 years, and I know that a technology like this in public hands is the nightmare scenario. This has been a tool that was too taboo for Silicon Valley giants who are capable of building it. Google in 2011 said that they could release a tool like this, but it was the one technology they were holding back because it could be used in a very bad way. And why exactly is this kind of technology, this line in the sand that no one
Starting point is 00:02:39 will cross? What makes it so dangerous? So imagine this technology in public hands. It would mean that if you were at a bar and someone saw you and was interested in you, they could take your photo, run your face through the app, and then it pulls up all these photos of you from the internet. It probably takes them back to your Facebook page. So now they know your name. They know who you're friends with. They can Google your name. They can see where you live, where you work, maybe how much money you make. Let's say you're a parent and you're walking down the street with your three-year-old. Somebody can take a photo of you they know everything about you and you can face repercussions for just trying to exercise your political opinions. You know, if this app were
Starting point is 00:03:32 made publicly available, it would be the end of being anonymous in public. You would have to assume anyone can know who you are anytime they're able to take a photo of your face. who you are anytime they're able to take a photo of your face. And so that technology is what this company is pitching these police departments. Exactly. And what do you know about this company at this point? So at this point, all I really know is that the company is called Clearview AI. And so the first thing I do is Google it and I find their website, which is clearview.ai. And the website is pretty bare, but there's also an office address listed there,
Starting point is 00:04:18 145 West 41st Street, which happens to be just a couple of blocks from the New York Times office. So I decided to, you know, walk over there and there just is no 145 West 41st Street. So that was weird. So now I have this company that's offering this radical new tool. It's got a fake address. It's got a fake address, which is a, you know, a huge red flag. So what do you do next? I found the company on LinkedIn.
Starting point is 00:04:49 It only had one employee listed, a sales manager named John Good. John Good. John Good. It seemed like it could also be fake. And I sent that person a LinkedIn message and never heard back. So one of the things I find online is a website called PitchBook that lists investments in startups. And so it says that this Clearview AI has received $7 million from a venture capital firm and from Peter Thiel, you know, a big name
Starting point is 00:05:20 in Silicon Valley, invested in Facebook and Palantir. So I reach out to his spokesperson and he says, I'll get back to you. I never hear from him again. And then one day I open up Facebook and I have a message from a friend whose name I don't recognize. And he says, hey, I hear you're looking into Clearview AI. You know, I know them. They're a great company. How can I help? And you don't know who this guy is? I don't. I mean, I know them. They're a great company. How can I help? And you don't know who this guy is? I don't. I mean, it's a guy I met once 10 years ago, and somehow he knows that I'm looking into this company. But I'll take it, you know, finally someone wants to talk to me about Clearview AI. And so I say, hey, can I give you a call? And then he doesn't respond,
Starting point is 00:06:02 which I'm getting used to. You just can't catch a break. I'm like, I cannot believe this is another dead end. So phone and email are not working for me. So I just need to figure out another door to knock on to try to talk to a real human being. And one of the investors in the company is this venture capital firm that has an office in Bronxville, New York. So on a cold, rainy Tuesday, I got on the train and headed to Bronxville. I get to the company's address. It's just like in a retail space and go inside. There's this long, quiet hallway of office suites.
Starting point is 00:06:45 And this venture capital firm is at the very end. And I knock on the door and there's no one there. So I start trying to talk to their neighbors. And a woman who works next door says, oh, yeah, they're never here. So I'm walking down the stairs to go back out of the building. And two guys walk through the door. They're both in dark suits with, like, lavender and pink shirts underneath, and they just kind of look like VCs to me. So I say, hey, are you with this venture capital firm? And they say,
Starting point is 00:07:18 we are. Who are you? And I was like, I'm the New York Times reporter who's been trying to get in touch with you. And they said, you know, the company has told us not to talk to you. And I said, well, I've come all the way out to Bronxville. You know, can we just chat for a little bit? They say, OK. It probably helps. I'm like very pregnant. And they offered me water. And they just start telling me everything. And what do they tell you? They confirm that they've invested in Clearview AI and that Peter Thiel has also invested. They identified the genius coder behind the company, this guy named Juan Tontat. And they say he's Vietnamese royalty, but he's from Australia.
Starting point is 00:08:04 Thanh Tat, and they say he's Vietnamese royalty, but he's from Australia. And they also tell me that Juan is the one that was using the fake name John Good on LinkedIn. He's John Good. He's John Good. And they confirmed that law enforcement is already using the app, and that law enforcement loves it and that it's spreading like wildfire. Wow. So I've learned some stuff from these two investors,
Starting point is 00:08:34 but no one from the company is talking to me still. So in the meantime, I am also reaching out to law enforcement because I want to know if this app really works as well as the company claims. By this point, I had learned that over 600 law enforcement agencies had tried the app, including the Department of Homeland Security and the FBI. Wow. It's not just local police departments. This is being used by the federal government already. Yeah. I mean, I was just shocked to discover how easily government agencies can just try a new technology without apparently knowing much about the company that provides it. So I talked to a retired police chief from Indiana who was actually one of the first departments to use the app.
Starting point is 00:09:18 And they solved a case within 20 seconds, he said. A case they hadn't been able to solve. That they hadn't been able to solve. A case within 20 seconds, he said. A case they hadn't been able to solve. That they hadn't been able to solve. One of the officers told me that he went back through like 30 dead-end cases that hadn't had any hits on the government database. And he got a bunch of hits using the app.
Starting point is 00:09:38 So they were really excited about it. This is way more effective than what they were using before. Exactly. With the government databases they were previously using, they had to have a photo that was just a direct, you know, full face photo of a suspect, like mug shots and driver's license photos. But with Clearview, it could be a person wearing glasses or a hat or part of their face was covered or they were in profile.
Starting point is 00:09:59 And officers were still getting results on these photos. Wow. But the most astounding story I was told was that investigators had this child exploitation video and there was an adult who was visible in the video just for a few seconds in the background. So they had this person's face, they had run it through their usual databases and not gotten anything back. But then they ran his face through Clearview's app and he turned up in the background of someone else's gym selfie. Like you could see his face in the mirror. And so they figured out what gym this photo was taken at. They went to the gym,
Starting point is 00:10:39 they asked the employees, do you know who this is? And the employee said, you know, we can't tell you, we have to protect our members' privacy. But then later, the detectives got a text from somebody who worked there identifying the person. And that, I mean, that's just something that would not have been possible without Clearview's app. So because officers were telling me the tool works so well, I wanted to see it for myself, on myself. And I asked them if they would run my photo through the app. Every time I did this, things would get weird. The officers would tell me that they ran my photo and there were no results. No pictures of you. There were no pictures of me, which was really weird because I have a lot of photos of myself online. And then officers would just stop responding to me or talking to me.
Starting point is 00:11:32 And I had no idea what was going on until one officer was kind enough to explain to me. Hello, how are you? Hey, it's Kashmir. Yeah, sure. I'm keeping this officer anonymous because he could get in serious trouble for talking to me so openly about Clearview. If you could just describe yourself to the extent that you can describe yourself. I'm a police officer at a large metropolitan police department. So he's a cop who was doing a 30-day free trial of the app, and he was really impressed with it. So I asked him if he wouldn't mind running my photo.
Starting point is 00:12:12 And what did he tell you happened when he sent your picture through? Yeah, nothing. I didn't get a response at all. No results? No results. And within a couple of minutes of me putting your photo up there, maybe five, less than ten, I got a phone call from the Clearview company. They wanted to know why I was uploading a New York Times reporter's photo. That is so wild. I don't know. It creeps me out as a reporter. I mean, yeah, it just... It kind of creeped me out as a user. my face, which I found, you know, very alarming because this is telling me for the first time that this company is able to monitor who law enforcement is looking for and not just know
Starting point is 00:13:13 who they're looking for, but manipulate the results. And so then that made me go back to the earlier officers who had run my photo and they all confirmed, yes, I got a call from the company and they said, you know, we're not supposed to be talking to the media. So were you able to keep using the app after that? My account was deactivated. Did you ever get access back? I never did. But, you know, I have colleagues that have access.
Starting point is 00:13:46 So if I were to need a picture searched, I could just email it to them and they could email me the results. And you think the tradeoffs are worth it in terms of, you know, what the company has access to? I do. I think it's worth it. So from a law enforcement perspective, it's worth it. You know, we get a lot of cases and we don't usually have a lot of leads. And so anything that can honestly, anything that can help us solve a crime is a win for us. From a privacy perspective, it's rather frightening, the amount of information that they were able to get and provide. As long as they're doing it for the right reasons,
Starting point is 00:14:36 everything will work out. Let's put it that way. But the problem is, we don't know anything about the company at this point. We don't know if there's any kind of oversight. We don't know who the people are that are operating this and what their intentions are with their product. You know, the person in charge of the company won't talk to me. But then, it's the end of December when I get a call from the company's spokeswoman, and she says that the founder, Juan Tontat, is ready to talk. We'll be right back.
Starting point is 00:15:35 be right back. Do you have a hard stop? No, I don't actually. 12 noon. I have no hard stop. Oh, and I have lots of questions. So I'll take as much time as you can give me. So Kashmir, you finally got an interview with the founder of Clearview, this man named Juan Tontat. Where do you meet him? So we met in a WeWork in Chelsea. He came down to the lobby. And his appearance surprised me because I had Googled him online and there are a lot of photos of him. And he's usually pretty eccentric, like a lot of paisley shirts. He's a burning man. But in person,
Starting point is 00:16:07 he was very conservative. He was in this dark blue navy suit with a white button-up and leather shoes. So he looked very much like the security startup entrepreneur. He was looking the part. He was looking the part. When were you born? How old were you? 88, so I'm 31. Okay. And what do you learn about him?
Starting point is 00:16:33 So he is 31. He grew up in Australia. You can't hear that in his voice. I love computers, obviously. Yeah, so how did you get interested in technology? We had a computer, of course, when I was four or five years old. So his family got a computer when he was three or four, and he was always tinkering with computers growing up. We got the internet when I was 10, I think. And then you could discover all these things online. But Linux, I was like, I have to get this thing. It's the nerdiest thing ever.
Starting point is 00:16:54 I convinced my dad, we installed it, and I would spend the whole summer reinstalling and learning Linux stuff. Stayed home from high school and learning programming for fun. So that's, I just really liked it. He enrolled in college, decided to drop out like many technologists do, home from high school and learning programming for fun. So that's, I just really liked it. He enrolled in college, decided to drop out like many technologists do, and moved to San Francisco when he was 19. 2007. Before it was a big thing, right? It was kind of getting there, but it wasn't huge. This is 2007, and this is kind of a boom time. You know, the iPhone has just come out. That's the Facebook app era. Remember that?
Starting point is 00:17:26 Yeah. People are becoming millionaires by making Facebook games. And he wants to be the next big app guy. Being there is a lot different from reading about it online. You absorb a lot more of how people get things done. And you learn a lot more secrets. What did he build? So the Facebook apps were like
Starting point is 00:17:45 would you rather apps and kind of like romantic gifts. Some of the first iPhone games as well. One of his most recent apps was called Trump hair and it was an app for adding Trump's hair to your photos. That's it. That's it. The tagline was it's gonna be huge. Okay. That's it. The tagline was, it's going to be huge. Okay. So how do you move from a Donald Trump hair app to something that seems like it could revolutionize police work? Well, he moved to New York, and that seemed to be a big change for him. And he started meeting very different people.
Starting point is 00:18:27 One of the most important people he met was Richard Schwartz. I ended up meeting Richard at a party. This 61-year-old guy who worked for Mayor Rudy Giuliani in the 1990s, who was just very politically connected. I really loved that. He had a lot of stories. And then we talked for like an hour about different ideas, because I was like, this is what I do, technology. I can make anything. And it went from there. And the two of them decided with Juan Tontat's tech know-how and Richard's Rolodex that they wanted to try to start a facial recognition company together. And why facial recognition?
Starting point is 00:19:00 Why did the two of them choose that? I think it was because Juan had started reading a lot of papers about facial recognition and machine learning. I had never really studied AI stuff before, but I could pick up a lot of it. And I think they realized they could make money doing it. What were you thinking in terms of like the range of ideas at first? What were you thinking? A lot. I could go on really crazy. There's a lot of face recognition algorithms out there and a lot that work pretty well.
Starting point is 00:19:26 What was different about what Juan Tantat and Richard Schwartz were doing is they had been willing to scrape all of these photos from the internet. So they just had a huge database of photos. Right, billions of photos. Exactly. And then we hit this
Starting point is 00:19:42 point where we got to like 99% accuracy. I remember that we were just in the office and just like, wow, it works. Try that one again. Try that one again. And just every time I would pick the right person up and that's when we knew this is crazy. This actually works. Is that legal? Can you just take photographs from anywhere on the internet and use them for this kind of thing? There was a ruling in a federal court this fall that said, yeah, this kind of public scraping seems to be legal. And what are they hoping to do with this software at this point?
Starting point is 00:20:18 I mean, they're just trying to figure out how they can make money off of the app. And so they eventually end up settling on law enforcement. And they started solving cases from grainy ATM photos, cases they would have never solved. So this kind of spread to different departments and then from one agency to other agencies. And do you ask him about that thing that happened with the officer who couldn't find your photos? Yeah. So that was one of my questions and I wasn't entirely satisfied by his answer. One thing that surprised me,
Starting point is 00:20:48 some of the officers I talked to tried to run my photo through it, and they got no hits. And I have, like, tons of photos online. It must have been a bug. Did you guys block me from, like, getting results? I don't know about that. Because I was like, this doesn't make any sense. He said, oh, yeah, that was a software bug.
Starting point is 00:21:05 But he laughed. I was like, I have a thousand any sense. He said, oh, yeah, that was a software bug. But he laughed. I was like, I have a thousand photos online. This can't work as well as they say it works. Yeah, well, it must have been a bug in the software or something. Why did you do that? Maybe it doesn't work. You never know, right? This could be the long con.
Starting point is 00:21:21 It works. What do you think that was about? I don't think it was a software bug. I don't know. You have no idea, huh? Huh. Yeah. So he said the software bug is now fixed.
Starting point is 00:21:33 Oh, yeah. So this is the iPhone version. And he took a photo of me. Oh, it does work. Oh, that's so surprising. I know. And the results included a bunch of photos of me online. Oh, my God, I totally forgot.
Starting point is 00:21:46 That was 10 years ago. Including some I had never seen before. I didn't know were online. So he's just brushing off this weird thing that happened to you. But do you get the sense that he's thinking at all about privacy? So I asked him, you know, this is a very powerful app. And I asked him what restrictions know, this is a very powerful app. And I asked him, what restrictions is he thinking about for it? And he said, you know, one, that they were only selling it to
Starting point is 00:22:11 law enforcement right now, though it does turn out that they're also selling it to a few private companies for security purposes. But he said they wouldn't sell it to bad actors or bad governments. And our philosophy is basically, if it's a U.S.-based, both like a democracy or an ally of the U.S., we will consider it. But like, no China, no Russia, or anything that wouldn't, you know,
Starting point is 00:22:38 be good. So if it's a country where it's just governed terribly or whatever, I don't know if we'd feel comfortable, you know, selling to certain countries. So it doesn't sound like he has much of a rubric for deciding who to sell to. And it sounds like there's no one really overseeing how he's making these decisions. At this point, it's just up to Clearview to decide who they want to sell the app to.
Starting point is 00:23:03 No pressure, but like when we talk to some venture capitalists, they're like, why don't you make this consumer? You know, law enforcement is such a small market, you won't make that much money. And we've considered it. And it was just like, what's the use case here? And there are, you know, right now we can help catch pedophiles. What if a pedophile got access to this, goes around the street? But when I was talking to one of their investors, he says, you know, we want to dominate the law enforcement market, and then we want to move into other markets like hospitality, like real estate. And he predicted that one day, you know, all consumers will have access to this app. I can tell you that one of your investors hopes that you guys are going to go into the consumer market. Yeah, he talks too much. But like, we're not going to do that.
Starting point is 00:23:51 Juan seems to be saying, yeah, there's pressure on us to sell to private consumers, but we're not going to do that. And how reasonable is it to think that he has control or the company has control at this point over where this technology goes? I mean, one point that I made when I was talking to him is that oftentimes the tools that law enforcement use end up in the hands of the public. I personally feel like you guys have kind of opened the door to now this becoming more normalized. now this becoming more normalized. Just because a lot of tools that law enforcement have eventually make their way into the public hands. Not always. Everyone has a gun.
Starting point is 00:24:34 Right? Anyone who wants one can get one in the U.S., basically. His response was strange. He said, well, look at guns. Law enforcement has guns, but not everybody has a gun. And I don't know if that's because he's from Australia. Yeah, he's proving your point, by at guns. Law enforcement has guns, but not everybody has a gun. And I don't know if that's because he's from Australia. Yeah, he's proving your point. It seemed like he was proving my point rather than rebutting it.
Starting point is 00:24:55 You know, we've been building the technology to make this possible for years now. Facebook building this huge database of our photos with our names attached to it, advances in image recognition and search technologies. It all led us here. But there's been no accompanying regulation or rules around how the technology should be used. There's no real law or regulation that makes this illegal. The scraping seems to be okay. We don't have a big ban on face recognition. We don't need to give consent for people to process our faces. And so in terms of holding this tool back, we're just relying on the moral compasses of the
Starting point is 00:25:42 companies that are making this technology and on the thoughtfulness of people like Juan Tontat. But yeah, but what do you think about that? Do you think that this is kind of too dangerous a tool for everybody to have? Um, I have to think about that and get back to you on an answer because it's a good question yeah I've thought about it a little bit you haven't thought about it
Starting point is 00:26:08 I have I have but I need to really you know come up with a better a good answer for that like honestly like yeah Thanks, Kashmir. Thank you.
Starting point is 00:26:36 Since Kashmir began reporting on Clearview AI, several major social media companies, including Facebook, Twitter, and Venmo, have demanded that the company stop using photos scraped from their websites. But it's unclear what, if any, power those social media companies have to force Clearview to comply. A few weeks ago, the state of New Jersey barred law enforcement from using Clearview's technology. But police remain free to do so in 49 other states. We'll be right back. Firing Gordon Sondland, his ambassador to the European Union, who called the president's actions toward Ukraine a quid pro quo. And Lieutenant Colonel Alexander Vindman, a member of the National Security Council, who expressed alarm over the president's phone call with the leader of Ukraine.
Starting point is 00:28:06 The Times reports that several Republican senators urged Trump not to fire the witnesses, fearing it would send a dangerous message, but that the president ignored their advice. And the global death toll from the coronavirus has reached more than 800, surpassing that of the SARS epidemic, which killed 774 in 2003. The number of confirmed infections from the coronavirus now stands at more than 37,000. Finally, new polling in New Hampshire, which will hold its primary tomorrow, shows Mayor Pete Buttigieg neck-and-neck
Starting point is 00:28:45 with Senator Bernie Sanders and former Vice President Joe Biden slipping into fourth place. Vice President Biden, the first question is for you. In the last few days, you've been saying that Democrats would be taking too big a risk if they nominate Senator Sanders or Mayor Buttigieg, but they came out on top in Iowa. What risks
Starting point is 00:29:06 did the Iowa Democrats miss? The poll, conducted by the Boston Globe, WBC, and Suffolk University, suggest Buttigieg is benefiting from a strong performance in the Iowa caucuses, and that Biden may perform poorly for the second time in a row, a prediction Biden confirmed during Friday night's debate on ABC. Well, they didn't miss anything. This is a long race. I took a hit in Iowa, and I'll probably take a hit here. That's it for The Daily.
Starting point is 00:29:51 I'm Michael Barbaro. See you tomorrow.
