The Daily - A Criminal Underworld of Child Abuse, Part 1

Episode Date: February 19, 2020

Note: This episode contains descriptions of child sexual abuse. A monthslong New York Times investigation has uncovered a digital underworld of child sexual abuse imagery that is hiding in plain sight.... In part one of a two-part series, we look at the almost unfathomable scale of the problem — and just how little is being done to stop it. Guests: Michael H. Keller, an investigative reporter at The New York Times, and Gabriel J.X. Dance, an investigations editor for The Times. For more information on today’s episode, visit nytimes.com/thedaily. Background reading: Last year, tech companies reported over 60 million online photos and videos of children being sexually abused. Lawmakers foresaw this crisis years ago, but enforcement has fallen short. Our reporters investigated the problem and asked: Can it be stopped? Tech companies detected a surge in online videos of child sexual abuse last year, with encrypted social messaging apps enabling abusers to share images under a cloak of secrecy. Here are six takeaways from The Times’s investigation of the boom in online child sex abuse.

Transcript
Starting point is 00:00:00 It's not altogether uncommon in investigations for us to turn up information that is shocking and disturbing. The challenge is when, in the course of your reporting, you come across something so depraved and so shocking that it demands attention. People have to know about this, but nobody wants to hear about it. How do you tell that story? From The New York Times, I'm Michael Barbaro. This is The Daily.
Starting point is 00:00:44 Today, a months-long Times investigation uncovers a digital underworld of child sexual abuse imagery that is hiding in plain sight. In part one, my colleagues Michael Keller and Gabriel Dance on the almost unfathomable scale of the problem and just how little is being done to stop it. It's Wednesday, February 19th. Gabriel, tell me how this investigation first got started.
Starting point is 00:01:34 So it all began with a tip. Early last year, we got a tip from a guy, and this guy was looking up bullets. Bullets for guns. Bullets for guns. And he was actually looking for a very specific weight of bullet on Microsoft's Bing search engine. And while he was looking up these bullets,
Starting point is 00:01:56 he started getting results of children being sexually abused. And the guy was horrified. He didn't understand why he was seeing these images, couldn't stand to look at them, and so he reported it to Bing and heard nothing. And full of outrage, he writes us a letter to our tip line describing what he was looking for
Starting point is 00:02:22 and the kind of images he was getting back. He says, New York Times, can you please look into this for me? So I actually emailed my colleague, Michael Keller, and I asked him to look into it. So in the tip, they had listed the search terms they'd used. So we tried to replicate it. We put it into Bing. And we saw a flash on the screen of images of children. And so I wrote back to Gabe and said, yeah, this checks out.
Starting point is 00:03:06 You could type words into Bing and get back explicit images of children. So this is not the dark web. This is just a regular, commonplace search engine. That's right. So a few things went through my head. First of all is, well, we need to document this because as most of us know, things on the internet change all the time. It's possible they came down soon after, etc. But we were very unsure what kind of legal liabilities we had when it came to documenting anything regarding this imagery. So we emailed Dave McCraw, who's the head counsel at the New York Times, to ask him, you know, what can we do? What can't we do? How do we go about investigating where this imagery is online? And doing it without somehow violating the law. That's right. And David wrote back immediately and said, there is no journalistic privilege when investigating
Starting point is 00:03:57 this. You have no protections and you have to report it immediately to the FBI. And so that's what we did. We submitted a report both to the FBI and to the National Center for Missing and Exploited Children, which is the kind of government-designated clearinghouse for a lot of these reports. And what did they tell you? They weren't able to tell us anything about the report we submitted, but it made us wonder how common is it that they get these kinds of reports? How many images are out there? How many images are flagged to them each year?
Starting point is 00:04:28 And they were able to tell us that. And that number was frankly shocking to us. The handful of images that the tipster stumbled across was just a tiny portion of what the National Center sees every single day. They told us that in 2018 alone, they received over 45 million images and videos. Wow. 45 million images a year.
Starting point is 00:04:59 That's more than 120,000 images and videos of children being sexually abused every day. Every single day. But to put it in perspective, 10 years ago, there were only 600,000 images and videos reported to the National Center. And at that time, they were calling it an epidemic. So in just a decade, it went from 600,000 reports to 45 million. Yeah. So we were really curious.
Starting point is 00:05:43 How does a problem called an epidemic 10 years ago become such a massive issue now? And one of the first things we learned was that we did try and tackle it back then. The national epidemic of grown men using the internet to solicit underage teens for sex. As more and more parents become aware of the dangers, so have lawmakers in Washington. In the mid to late 2000s, as the Internet was being more widely adopted, this issue of online child sexual abuse really got on the radar of Congress. There was even a bill being introduced by Debbie Wasserman Schultz. The Internet has facilitated an exploding multi-billion dollar market for child pornography.
Starting point is 00:06:21 There were multiple hearings. I'm here today to testify about what many of my law enforcement colleagues are not free to come here and tell you. They heard from law enforcement. We are overwhelmed. We are underfunded. And we are drowning in a tidal wave of tragedy. They were overwhelmed with the number of reports that were coming in. Unless and until the funding is made available to aggressively investigate and prosecute possession of child pornography, federal efforts will be
Starting point is 00:06:45 hopelessly diluted. They, in many cases, had the tools to see where offenders were, but not enough staff to actually go out and arrest the perpetrators. We don't have the resources we need to save these these children. Alicia Kozakiewicz, Pittsburgh Resident, 19 Years Old, Sophomore in College Hello. Thank you for inviting me to speak today. My name is Alicia Kozakiewicz, a Pittsburgh resident. I am 19 years old and a sophomore in college. There was also very chilling testimony from a victim of child sexual abuse. For the benefit of those of you who don't know, don't remember those headlines,
Starting point is 00:07:29 I am that 13-year-old girl who was lured by an Internet predator and enslaved by a sadistic pedophile monster. In the beginning, I chatted for months with Christine, a beautiful red-haired 14-year-old girl, and we traded our school pictures. Too bad that hers were fake. Yeah, Christine was really a middle-aged pervert named John. And he had lots of practice at his little masquerade because he had it all down. The abbreviations, the music, the slang, the clothes.
Starting point is 00:07:59 He knew it all. John slash Christine was to introduce me to a great friend of hers This man was to be my abductor, my torturer I met him on the evening of January 1st, 2002 Imagine, suddenly you're in the car Terrified and he's grabbing onto your hand and crushing it And you cry out but there's no one to hear In between the beatings and the raping
Starting point is 00:08:24 He will hang you by your arms while beating you it and you cry out but there's no one to hear. In between the beatings and the raping, he will hang you by your arms while beating you and he will share his prized pictures with his friends over the internet. The boogeyman is real and he lives on the net. He lived in my computer and he lives in yours. While you're sitting here, he's at home with your children. Tax forces all over this country are poised to capture him, to put him in that prison cell with the man who hurt me. They can do it. They want to do it.
Starting point is 00:09:18 Don't you? Alicia's testimony really moved people. People responded. And eventually, about a year later, the bill passes unanimously. And what is this new law supposed to do? So this law, the 2008 Protect Our Children Act, is actually a pretty formidable law with some pretty ambitious goals. It was supposed to, for the first time ever, secure tens of millions of dollars in annual funding for investigators working on this issue. And it required the Department of Justice to really study the problem and put out reports to outline a strategy to tackle it. And what has happened since this ambitious law was put into place?
Starting point is 00:10:03 In many ways, the problem has only gotten worse. Even though the number of reports has grown into the millions, funding is still pretty much what it was 10 years ago. And even though the government was supposed to do these regular reports, they've only done two in 10 years. And that's an issue because if you don't have anyone studying the size of the problem, you don't have anyone raising alarm bells and saying, hey, we need more resources for this. So they didn't study it and they didn't increase the funding in a way that would match the scale at which the problem is growing. Yeah, it really looked like they had this law in 2008,
Starting point is 00:10:47 and then everyone really took their eye off the ball. So we called Congresswoman Debbie Wasserman Schultz, who was one of the leading proponents of this bill, to figure out what happened. And we are really gobsmacked to hear that she's unaware of the extent of the failings. She sends a letter to Attorney General William Barr laying out a lot of our findings, requesting an accounting. As far as we know, she hasn't heard anything. So even the person who co-wrote the law was unaware that it
Starting point is 00:11:17 was pretty much failing. She knew about the funding, but even she didn't know the things had gotten this bad. she didn't know the things had gotten this bad. And we wanted to figure out, now 10 years later, what kind of effect is this having to law enforcement, to the people on the ground working these cases? And what we heard from them really shows what happens when everyone looks away. We'll be right back. Gabriel, Michael,
Starting point is 00:11:57 what happens when you start reaching out to law enforcement? So, Mike and I probably spoke with about 20 different internet crimes against children task forces. And these are the law enforcement agencies responsible with investigating child sexual abuse. To be honest, most of the time as an investigative reporter, generally law enforcement, I mean generally anybody, but especially law enforcement, is not particularly interested in speaking with us. Usually, they don't see much to gain.
Starting point is 00:12:32 But surprisingly, when it came to this issue, they were not only willing to speak with us, but they were interested in speaking with us. Why do you think that was? It's partly because we were using the right terminology. And by that, I mean, we were asking them about child sexual abuse imagery, not child pornography. And what exactly is the distinction? Well, legally, they're basically the same. But for the people who deal with this type of crime day in, day out, who see these images and who speak with survivors, they know that calling it child pornography
Starting point is 00:13:15 implies a bunch of things that are generally incorrect. One is that it equates it with the adult pornography industry, which is legal and made up of consenting adults, whereas children cannot consent to the sexual behavior. The other thing is that the crimes depicted are heinous, and that each one of them is essentially looking at a crime scene. And for that reason, they prefer to call it child sexual abuse imagery. But I think they also talk to us because for the law enforcement who deal with this, they very much feel that the issue is undercovered and under-discussed, especially considering the seriousness of the crime.
Starting point is 00:14:13 I mean, we had the kind of coordination and cooperation from these law enforcement officers that we rarely see from anybody whatsoever. They let us go out on raids with them. They provided us with detailed reports. They talked to us about specific cases, they were really, really open because they felt that as a nation, we were turning our backs on children, essentially. And once you have that access, what do you find? What we learned talking with all these law enforcement officers was just how this world operates. A lot of the departments told us about the high levels of turnover they have. operates. A lot of the departments told us about the high levels of turnover they have. We had one commander who said back when he was an investigator, he saw one image that was so shocking to him, he quit and served a tour in Iraq. That was his escape. To even see this imagery
Starting point is 00:14:58 once changes your life. And these people look at it all day long. And then on top of that, they have to deal with the fact that, you know, their funding has not gone up whatsoever. They're being funded at a level that means they can't do proactive investigations anymore. So they're not sitting in chat rooms trying to catch what many of them think are the worst criminals. They're unable to do anything really than respond to the utter flood of reports coming in from the National Center. And because of the sheer number of reports coming in, they're forced to make some truly terrible decisions on how to prioritize who they're investigating. Like what?
Starting point is 00:15:41 The FBI told us that in addition to, of course, looking for anybody who's in immediate danger, they have to prioritize infants and toddlers. When we first got into this, we didn't even consider the idea of infants. And later LAPD would say the same thing. We're prioritizing infants and toddlers and essentially not able to respond to reports of anybody older than that. I mean, it really left us pretty speechless. So we're learning a lot from speaking with law enforcement, but they also only have a small part of the picture. We also are thinking about this tip that we got, where the tipster was able just to find these images on search engines very easily. And so we still have this big question of how easy is it to come across this material?
Starting point is 00:16:53 And how do you go about answering that question? When we initially started trying to replicate the tipster search, we had to stop because we didn't want to be searching for these illegal images. But then we discovered a technology that would allow us to keep investigating without having to look at the images. It's called photo DNA, and it essentially creates a unique digital fingerprint for each image. And as the National Center is receiving these reports and identifying these illegal images of child sexual abuse, they keep track of these digital fingerprints. And other companies can tap into that database to see, is this image that I have, is it in that database of known images? And we stumbled upon a service from Microsoft
Starting point is 00:17:40 that actually allows you to do just that. It'll take a URL of an image and tell you if it matches an image already in that database. So we wrote a computer program that would replicate that initial search from the tipster, record all of the URLs. The key part of it, though, was that it blocked all images. So suddenly you can now search for these images and find where they live on the internet without illegally looking at them. Right. So we started doing this test
Starting point is 00:18:14 across a number of different search engines. And to sit there and watch as the program starts ticking away and see the first time it flashes that it got a match and then to see it match again. Bing from Microsoft. And match again. Yahoo. And match again.
Starting point is 00:18:32 Another one called DuckDuckGo. I mean, I think both our jaws dropped. We didn't find any on Google, but on other search engines powered by Microsoft data, we found a number of matches. Dozens, dozens of images. You're saying the company that came up with this technology to help track these images
Starting point is 00:18:55 is the very same company whose search engine allows people to find them, view them, keep viewing them. Right. As soon as we started running this computer program using Microsoft to detect illegal imagery, we were finding illegal imagery on Microsoft. So it was very clear that they were not using their own services to protect users from this type of image. And Microsoft told us that this was the result of a bug that they fixed. But about a week later, we re-ran the test and found even more images.
Starting point is 00:19:32 Wow. So it sounds like you're finding all these images on Microsoft-powered search engines. So how much of this in the end is just a Microsoft problem? It's more complicated than that. We were performing this limited search just to test search engines on images. But we're also, at the same time, reading over 10,000 pages of court documents. These are search warrants and subpoenas from cases where people were caught trading this material. And what becomes clear pretty quickly is that every major tech company
Starting point is 00:20:06 is implicated. In page after page after page, we see social media platforms, Facebook, Kik, Tumblr. We see cloud storage companies, Google Drive, Dropbox. We read one case where an offender went on a Facebook group to ask for advice from other people. Say, hey, how do I share this? How do I get access to it? How do I get access to children? And they say, download Kik to talk to the kids
Starting point is 00:20:36 and download Dropbox and we can share links with you. Wow. And from these documents, it becomes clear that the companies know. I mean, there's so many cases for each company that they all know. And so the question becomes, what are they doing about it?
Starting point is 00:21:08 Tomorrow on The Daily, a victim's family asks that same question. We'll be right back. Here's what else you need to know today. President Trump has granted clemency to a new round of high-profile individuals, including Bernie Kerik, the former New York City police commissioner who was convicted on eight felony charges, including tax fraud, Michael Milken, the financier convicted of insider trading. And Rod Blagojevich, the former governor of Illinois,
Starting point is 00:21:52 who was convicted of trying to sell Barack Obama's Senate seat after he became president. Yes, we have commuted the sentence of Rod Blagojevich. He served eight years in jail. It's a long time. Asked about Blagojevich, who Trump ordered released from prison four years early, the president mentioned his wife's public pleas on Fox News and Blagojevich's appearance on Trump's NBC TV show, Celebrity Apprentice.
Starting point is 00:22:23 I watched his wife on television. I don't know him very well. I've met him a couple of times. He was on for a short while of The Apprentice years ago. Seemed like a very nice person. And on Tuesday, after a new poll showed him surging in the Democratic primary race, Michael Bloomberg qualified for the next presidential debate, scheduled for tonight in Las Vegas.
Starting point is 00:22:50 The poll, conducted by NPR, PBS, and Marist College, showed Bloomberg with 19% support, putting him in second place behind Bernie Sanders. That's it for The Daily. I'm Michael Barbaro. See you tomorrow.
