Making Sense with Sam Harris - #213 — The Worst Epidemic
Episode Date: August 4, 2020

Sam Harris speaks with Gabriel Dance about the global epidemic of child sexual abuse. They discuss how misleading the concept of “child pornography” is, the failure of governments and tech companies to grapple with the problem, the tradeoff between online privacy and protecting children, the National Center for Missing and Exploited Children, PhotoDNA, the roles played by specific tech companies, the ethics of encryption, “sextortion,” the culture of pedophiles, and other topics. If the Making Sense podcast logo in your player is BLACK, you can SUBSCRIBE to gain access to all full-length episodes at samharris.org/subscribe.
 Transcript
    
                                         Welcome to the Making Sense Podcast.
                                         
                                         This is Sam Harris.
                                         
                                         
                                         Okay, the long-awaited episode on the most depressing topic on Earth.
                                         
Child sexual abuse, which in the form of its public consumption is otherwise known as child pornography.
                                         
                                         As many of you know, I've delayed the release of this episode for several months.
                                         
                                         It just never seemed like the right time to drop it.
                                         
                                         When is the right time to talk about this, really?
                                         
    
                                         In the tech space, it was probably 20 years ago.
                                         
                                         Anyway, this is an episode that many of you will find difficult to listen to, understandably.
                                         
                                         If you work in tech, I think you have a moral responsibility to listen to it.
                                         
                                         If you work at a company like Facebook or AWS or Dropbox or Zoom or any company that facilitates the spread of so-called child pornography,
                                         
                                         you really have a responsibility to listen to this conversation
                                         
                                         and figure out how you can help solve this problem.
                                         
                                         As you'll hear, we've gone from a world where pedophiles were exchanging Polaroids in parking lots to a world in which there is an absolute
                                         
                                         deluge of imagery that provides a photographic and increasingly video record of the rape of
                                         
    
                                         children. And as you'll hear, the tech companies have been terrible at addressing this problem, and law enforcement is completely under-resourced
                                         
and ineffectual here. Now, as I said, I recorded this conversation some months ago.

As an indication of how long ago: when Zoom came up in the conversation, I felt the need to define
                                         
                                         it as a video conferencing tool used by businesses. I've since cut that. But everything we discuss
                                         
                                         is all too current. In fact, the problem has only gotten worse under the COVID pandemic,
                                         
because the children being abused are more often than not at home with their abusers, and the people who consume this material are at home with much less to do. So both the supply side and the demand side of this problem have increased.
                                         
                                         I will add a short afterword to mention a few things that the government is now doing,
                                         
    
                                         but nothing of real substance has changed to my
                                         
                                         knowledge. Today I'm speaking with Gabriel Dance. Gabriel is the Deputy Investigations Editor at the
                                         
                                         New York Times, where he works with a small team investigating technology, from the topic at hand,
                                         
                                         online sexual abuse imagery, to the companies that trade and sell our data,
                                         
                                         and this business model that's increasingly known as surveillance capitalism. Before working at the
                                         
                                         Times, Gabriel helped launch the criminal justice news site The Marshall Project, where he focused
                                         
on the death penalty, prisons, and policing. And before that, he was the interactive editor
                                         
                                         for The Guardian, where he was part of the
                                         
    
                                         group of journalists who won the 2014 Pulitzer Prize for coverage of the widespread secret
                                         
                                         surveillance by the NSA. In this episode, I speak with Gabriel about the global epidemic of child
                                         
                                         sexual abuse. We discuss the misleading concept of child pornography, the failure of governments
                                         
                                         and tech companies to grapple with
                                         
                                         the problem, the trade-off between online privacy and protecting children, the National Center for
                                         
Missing and Exploited Children, the difficulty in assessing the scope of the problem, PhotoDNA
                                         
                                         and other tools, the parts played by specific tech companies, the ethics of encryption,
                                         
sextortion, the culture of pedophiles,
                                         
    
                                         and other topics. Again, this episode is not a barrel of laughs, but it's an important conversation and is yet another PSA, so not paywalled. If you want to support the work I'm doing here,
                                         
                                         you can subscribe at SamHarris.org.
                                         
                                         And now I bring you Gabriel Dance.
                                         
                                         I am here with Gabriel Dance.
                                         
                                         Gabe, thanks for joining me.
                                         
                                         Thank you so much for having me on.
                                         
                                         So, undoubtedly, in the intro to this, I will have prepared people to not listen to the podcast if they find the topic truly unbearable.
                                         
                                         Right.
                                         
    
                                         But I guess I should just reiterate here that we're going to speak about probably the most
                                         
                                         depressing topic I can think of.
                                         
                                         The real gravity of it tends to be concealed by the terms we use to describe it.
                                         
                                         So we're going to talk about, quote, child pornography and the exploitation of children.
                                         
And yet these phrases can conjure images for people that don't really capture
                                         
                                         what's going on here. Because they can remind people of things like, you know, there are
                                         
                                         teenagers who get into the porn industry before their 18th birthday, right? And that gets found
                                         
                                         out. Or, you know, teenagers send naked photos or videos of themselves to one another, or even with
                                         
    
                                         strangers online, and these images get out. And, you know, all of that gets binned into this category of child pornography. But
                                         
                                         at the bottom of this morass that you and I are going into, we're talking about the rape and
                                         
                                         torture of young children, either by family members or caregivers or by people who have
                                         
                                         abducted them. And obviously, I'm going to want to know from you just what the scale of this problem actually is.
                                         
                                         But then we're talking about a vast audience of people who are willing to pay to watch these
                                         
                                         children raped and tortured because they find the rape and torture of children to be the sexiest
                                         
                                         thing in the world, apparently. So psychologically and socially, we're just in a horror movie here, and people need to
                                         
                                         understand that's where we're going, and you can pull the ripcord now if you don't want to go there
                                         
    
                                         with us. So that's a fairly grave introduction to you, Gabe, but you have been covering this topic
                                         
                                         for the New York Times in a series of long and very disturbing articles. So, you know,
                                         
                                         welcome to the podcast and thank you for doing the work you're doing because, I mean, but for
                                         
                                         your articles, I really, again, would have just the vaguest notion of what appears to be going on
                                         
                                         in our world and even in our own neighborhoods. So thank you for lifting the lid on this horror
                                         
                                         show because it can't be especially fun to be doing this work.
                                         
                                         Well, thank you. Thank you for having me on. And I really appreciate you starting with an
                                         
                                         introduction that discusses the terminology around this horrible, horrible crime. We, I also didn't know anything about what I came into this calling
                                         
    
child pornography. Our investigation started in February of 2019. I investigated this with a colleague of mine, Michael Keller, primarily, and it was pretty quick that we learned the proper terminology used by people
                                         
                                         in the industry and law enforcement is child sexual abuse material. And I think for the purposes of
                                         
                                         this podcast, it'll be easier to refer to this as CSAM,
                                         
rather than constantly using what I think is the
                                         
                                         inaccurate and inelegant term, child pornography.
                                         
                                         Maybe let's just linger on the terminology for another minute or so, because it really is one
                                         
    
of those terms that just reliably misleads people. Another example of this is how people talk about male
                                         
                                         circumcision and female circumcision, right? As though the term circumcision were interchangeable
                                         
                                         in those phrases, right? And so this is a social issue that is being obfuscated by some common words.
                                         
                                         And so just to give people a sense of what should be obvious but strangely isn't, let's
                                         
                                         just consider how different this is from normal pornography.
                                         
                                         Because there's a lot that could trouble us and perhaps should trouble us about normal
                                         
                                         pornography. You can ask questions
                                         
                                         like, how did these women in particular find themselves in a situation where they're performing
                                         
    
                                         sex on camera? I mean, are they getting paid? How much are they getting paid? Are some of them
                                         
                                         actually not getting paid and being exploited or even coerced? Are they private videos that
                                         
                                         were meant to be kept private that just got leaked?
                                         
                                         Is there some backstory of suffering that would make the average person feel terrible about
                                         
                                         watching what purports to be a video of consenting adults having sex? So, I mean,
                                         
                                         these are totally reasonable questions to ask, but it's also understandable that most people
                                         
                                         don't really think about these things when they're watching
                                         
                                         normal adult pornography because human suffering isn't being directly shown on the video. I mean,
                                         
    
                                         even if it's edgy pornography, who knows, it could be horrendous stuff out there that
                                         
I can't imagine, but normal pornography, even edgy pornography, within its frame seems to be the work
                                         
                                         of consenting adults doing something they want to do for whatever reason.
                                         
                                         But anything involving kids does not function by that logic at all, right?
                                         
                                         So any image or video of an adult having sex with a five-year-old is simply the record
                                         
                                         of a crime, right? Just full stop. And
                                         
                                         it is obviously a crime to anyone watching it. And yet, you know, as appalling as it is that
                                         
                                         these crimes occur, it's almost more appalling that there's a vast market for them. I mean,
                                         
    
                                         I'm prepared to believe that, you know, one guy in a million is going to
                                         
                                         abuse his stepchild, right? But the idea that there are millions and millions of people with whom this
                                         
                                         person could find an online dialogue and sell them the video record of this abuse, I mean, that's just
                                         
                                         completely shocking to me, and the scale of it is completely
                                         
                                         shocking as you report. So let's just, let's talk about the nature of the problem. What is,
                                         
                                         what's going on and how much of it is out there? Well, I think you're absolutely right to draw a
                                         
distinction between what we call adult pornography and the misnomer, child pornography. And what you said
                                         
                                         several times hit the nail on the head, which is consent. I mean, these are, as you said,
                                         
    
children. I mean, even if we set aside, and we can come back and speak about, self-produced material by teenagers, or 17-year-olds who might
                                         
                                         be engaging in sexual acts on film before turning the legal age.
                                         
                                         These are not what we are discussing in the majority of our reporting.
                                         
We are talking about sexual crimes committed against prepubescent children.
                                         
                                         There is no consent.
                                         
                                         They are unable to consent.
                                         
                                         And there's no illusion of consent. You have to get your head around the depravity of the audience here. And again, I mean, this is going to sound very judgmental. Let's bracket for
                                         
                                         a moment some kind of compassionate and rational understanding of pedophilia that we might
                                         
    
want to arrive at. It should be obvious that no one chooses to be a pedophile, or to be someone who finds this imagery titillating. But there's one example in one of your articles that is not an especially lurid description of the crime, but the details just give you a sense of how
                                         
                                         insane all this is. So this is lifted from
                                         
                                         one of your articles. In a recent case, an offender filmed himself drugging the juice boxes of
                                         
                                         neighborhood children before tricking them into drinking the mix. He then filmed himself as he
                                         
                                         sexually abused unconscious children. So that's part of the titillating material for this audience, the imagery, the video of this guy
                                         
                                         putting whatever narcotic he used into juice boxes
                                         
    
                                         and feeding it to unsuspecting children
                                         
                                         and then performing sex acts on them.
                                         
                                         The criminality of this and the evil of it
                                         
                                         is absolutely on the surface.
                                         
                                         The details are mind-boggling.
                                         
                                         There are definitely, I mean, a variety of extremely depraved things that may come up in our discussion that I have learned are extremely hard for people to hear.
                                         
                                         They were hard for me to even begin to comprehend when I was learning these things from law enforcement,
                                         
                                         from survivors, from child advocates. I mean, the one you described was actually an example given by Special Agent Flint Waters, who at the time was a criminal investigator for the state of Wyoming.
                                         
    
                                         He was appearing before Congress when he was describing that video.
                                         
And that was in 2007, actually, before this crime had exploded in the way that it has.
                                         
                                         I mean, for reference in 2007, and I'm sure we'll get more into the total numbers, but there were fewer than 100,000 reports of online child sexual abuse material.
                                         
We just published a story on this past Friday. In 2019, there were almost 17 million reports. So the explosion in content being found is staggering.
                                         
                                         And to talk a little bit, I mean, the examples are all horrendous, hard to hear,
                                         
harder to imagine, nothing you want to think about or read about. But just to convey the extent of what we've learned is going on, there's also an active community engaged in committing sexual
                                         
    
crimes against what they, the criminals, call pre-verbal children, which is to say children who cannot speak yet. And that means obviously usually children younger
                                         
than two, younger than one: instances of children days and months old being molested, raped, and filmed being raped. And it is truly beyond shocking. As we started to speak with the people who regularly engage with this content, it became clear their lives are forever changed. Anybody who deals with this issue cannot get it out of their minds. And that really speaks to why it has become such an interesting issue when it comes to law enforcement
                                         
and the Department of Justice and tech companies, and these very interesting new privacy issues and some of the other questions that naturally arise out of this subject.
                                         
Yeah, so I want to talk about the scale of the problem insofar as you understand it, and how ineffectual the government and the tech companies have been thus far in dealing with it. But just to talk about
                                         
    
                                         your experience for a moment, how do you go about reporting on this? And in the course of reporting,
                                         
are you exposed to any of this material? Or can you actually do all your reporting without feeling the same
                                         
                                         kind of psychological contamination that the law enforcement people you speak with experience?
                                         
Great question, and one we also had no idea about going into this. So it might be helpful if I talk a little bit about how we stumbled into this subject and then how we learned how to report on
                                         
                                         it. So I've been working here at the Times investigating tech companies for several years
                                         
                                         now. And that has been everything from bots and fake followers, Cambridge Analytica, data deals
                                         
                                         that Facebook has struck. So I've been immersed in this field along with several
                                         
    
                                         colleagues where these mammoth companies are tracking all sorts of things about you, web pages
                                         
                                         you like, who your friends are, where you are, using this data to target you, things I know that
                                         
                                         you've discussed at length with many of the people you've had on the show. But still, I felt often, both in conversations with editors here as well as people outside the building, that I was having difficulty gaining traction on issues surrounding privacy online and how important and how high the stakes are.
                                         
                                         And so I started asking the small team I work with questions about, you know, what kind of actual harm can we show?
                                         
                                         Because many people would argue that whether it be Facebook or any other company violating our privacy by sharing our data with another company or selling our data or whoever might be doing what with our information.
                                         
                                         Many would argue that the harm is in violating our privacy.
                                         
                                         But that is still an abstract concept for many.
                                         
                                         And especially sometimes in a place like a news agency,
                                         
    
harm, when I'm working with people like Megan Twohey and Jodi Kantor, who are investigating Harvey Weinstein and crimes against women,
                                         
                                         and there's tangible harm there,
                                         
                                         and there's harm for some of my
                                         
                                         colleagues investigating what's going on in Myanmar and the Facebook disinformation and people dying
                                         
                                         from that. I mean, there's harm there. And so gaining traction around online privacy and harm
                                         
was something that I was looking for. What topic is really going to bring this to a point where people can actually start having that conversation? It was like a fast-forward, right? I wanted to short-circuit this conversation about privacy online to a point where we could actually begin discussing it in a way that has very, very real harm and consequences.
                                         
And so in that... But here you're talking about the flip side of this, which is our commitment to maintaining privacy at all costs, if we ever achieve that, the full encryption of
                                         
                                         Facebook Messenger, for instance. One of the knock-on effects of that will be to make these crimes more or less undiscoverable.
                                         
    
                                         Absolutely. Absolutely. And we'll come back, I'm sure, to encryption and some of the potential,
                                         
I don't know, solutions for people's privacy, and very, very high-stakes decisions for children
                                         
                                         suffering this abuse and people trying to rid the internet of this type of material.
                                         
                                         So you're right, I was coming at it from a privacy side.
                                         
                                         But I also knew that it was more complicated than that.
                                         
                                         And so we wanted to figure out
                                         
                                         where does this privacy line start actually,
                                         
                                         like where does the rubber meet the road?
                                         
    
                                         And one of the ideas
                                         
                                         was what we were calling at the time child pornography. And that was not only because
                                         
                                         of the privacy thing, but we were also talking about what has technology done over the last 10
                                         
                                         years that has completely changed the world. And one of those things is the ability to create and
                                         
                                         share imagery. I mean, look at Facebook, look at Instagram, all of these different types of social platforms, and other things like YouTube that have

                                         spun up. I mean, so many more images and videos are being created and shared

                                         and stored, et cetera. It was just a hunch. I mean, what's going on with child pornography? And A, nobody wants to talk about it.
                                         
                                         So as an investigative reporter,
                                         
    
                                         that is actually helpful when you encounter a subject
                                         
                                         that really nobody wants to touch.
                                         
                                         But the second thing that happened,
                                         
                                         that also...
                                         
                                         I want to return to that point, though.
                                         
                                         I don't want to derail you,
                                         
                                         but we have to return to why people don't want to talk about this and the consequences of that.
                                         
                                         Absolutely. Absolutely.
                                         
    
                                         But the second thing that came in, which actually in its own way interestingly ties back to the encryption discussion and everything, is the New York Times has a tip line that I actually helped set up in 2016.
                                         
                                         And this tip line has multiple ways people can send us information.
                                         
                                         Some of those ways are encrypted.
                                         
                                         Some of those ways are regular emails.
                                         
                                         Some of them are through the paper mail.
                                         
                                         And we received a tip from a man.
                                         
                                         And I believe it just came in over email.
                                         
                                         I don't think he was concerned with protecting his own identity.
                                         
    
                                         And this tip said, look, I was on Bing, Microsoft Bing search engine, and I was looking up bullet weights.
                                         
                                         So literally the weight of bullets, which I understand are measured in grains. And I'm not
                                         
                                         going to say the specific term he was looking up, but he was actually looking at bullets

                                         of a certain weight.
                                         
                                         And he said, you know, I typed this in,
                                         
                                         and all of a sudden I'm seeing images
                                         
                                         of children being sexually assaulted.
                                         
                                         And I've reported this to Bing,
                                         
    
                                         and it's been days or weeks, and they're still there, and I don't know what to do,
                                         
                                         so I'm telling you. And so we had already been thinking about this issue, and here in mid-February
                                         
                                         we get this tip, and I ask my colleague. Luckily, as the leader of this small technology
                                         
                                         investigations team, I'm sometimes able to pass a tip on and
                                         
                                         ask one of my fellow reporters to try to run it down. And so in this instance, I was happy to do
                                         
                                         that. And in this instance, it was Mike Keller. And I said, Mike, check it out. Do me a favor
                                         
                                         and check it out. So Mike writes me back maybe half an hour later and says, yeah, I typed in the exact terminology that the
                                         
                                         guy sent in. And I only looked at it for half a second, but there were definitely very disturbing
                                         
    
                                         images that came up. And so we were shocked, first of all. But second of all, we immediately
                                         
                                         reached out to our head legal counsel at the New York Times. And there's a lot of benefits to working at the New York Times. But one of really the best things is that we have excellent legal representation in-house. In this case, it's David McCraw, who's relatively famous in his own right for his dealings both with President Trump as well as many other people, Harvey Weinstein, etc. And David says, look, it is extremely important that
                                         
                                         you both understand that there is no journalistic privilege when it comes to child pornography.
                                         
                                         And he sent us the statute and he sent us some news stories where reporters had, in fact, gotten in trouble for reporting on this subject.
                                         
                                         And so what we had to do immediately, because Mike had, in fact, seen images, is report those images to the FBI and the National Center for Missing and Exploited Children.
                                         
                                         Because not many people know this, I don't think, but it is one of the only crimes, if not the only crime, that you have to report if you see it.
                                         
                                         I mean, you don't have to report a murder if you see it.
                                         
                                         But if you see an image of child sexual abuse, you have to report it or you are breaking the law.
                                         
    
                                         And that stands for everybody.
                                         
                                         So we filed a report with the
                                         
                                         National Center and we filed a report with the FBI. And we then began embarking on this
                                         
                                         investigation knowing, first of all, that A, of course, we did not want to see any of this
                                         
                                         material. But B, if we did see it, we had to report it. And along the way, we even received
                                         
                                         emails from the FBI saying, hey, reminder, you're not allowed to collect this information. You're
                                         
                                         not allowed to have this material. You're not allowed to look at this material. There is nothing you can
                                         
                                         do around this that is legal, which really did cause a complicated reporting process.
                                         
    
                                         That's interesting.
                                         
                                         It somehow seems less than optimal, but it's also understandable.
                                         
                                         Do you think they have the dial set to the right position there, or would there be some better way to facilitate your work or whatever role you as a journalist can play in
                                         
                                         solving this problem? I mean, I think probably it's in the right spot, to be honest. I think that
                                         
                                         while it was difficult... I mean, we reviewed hundreds, if not thousands, of court
                                         
                                         documents. And these court documents include search warrants and complaints
                                         
                                         and a variety of other things. And when you have a search warrant, so when an investigator,
                                         
                                         let's say based on a tip from the National Center or based on investigative undercover work,
                                         
    
                                         discovers somebody with this type of material on their system, they file a search
                                         
                                         warrant.
                                         
                                         And when they file the search warrant, they have to describe probable cause.
                                         
                                         And this probable cause nearly always is descriptions of a handful of the photos and videos.
                                         
                                         And I've been speaking with a variety of both advocates and people involved. And while
                                         
                                         I have been personally lucky enough to have never, ever seen one of these images or videos,
                                         
                                         I've read descriptions of hundreds, if not more than a thousand. And it is a terrible,
                                         
                                         terrible, terrible thing to read. And some people have said reading it is worse than seeing it. Now,
                                         
    
                                         
                                         I don't know. And I can't make that comparison, but I don't feel like I would gain much in the reporting process by actually seeing these things. I mean, just as you, you've only read them

                                         in our reports. Right. And I'm sure that's even more than enough for you to understand the gravity. And so I don't see what would be helpful in my being able to see them, in any kind of journalistic privilege. And I think that would also likely be abused if it existed.
                                         
                                         Yeah, I guess the only analogy I can think of is the ISIS videos, you know, the decapitation videos and the other records of their crimes, which, you know, journalists have watched and anyone can watch. And, you know, I've spent a lot of time, as you might know, railing about the problem of jihadism.
                                         
                                         And, you know, I'm just aware that, to know, you know, how bad ISIS was, I'm reliant on people who are paying, you know, firsthand attention
                                         
                                         to their crimes. You know, someone like Graeme Wood over at The Atlantic is actually watching
                                         
                                         these videos and confirming that they're as bad as is rumored, so I don't have to. And so,
                                         
                                         essentially, you have the cops doing that work for you. It seems, I can't imagine the information
                                         
    
                                         is getting lost or corrupted
                                         
                                         there, given that there's so much of it. But it just would be odd if someone like Graeme,
                                         
                                         in the process of writing his articles on Abu Bakr al-Baghdadi and ISIS and our misadventures
                                         
                                         in the Middle East, had to at every turn worry that he could be thrown in jail for having discovered an ISIS video
                                         
                                         online. That seems like an extra piece of overhead that he doesn't need to do his work.
                                         
                                         Yeah, it was, I mean, it was nerve-wracking. It was uncomfortable. And again, I mean, we

                                         did every single bit of our reporting in consultation with our lawyers. And we were also in close
                                         
                                         contact with the FBI, the Department of
                                         
    
                                         Justice, you know, local law enforcement throughout the country, international agencies dealing with
                                         
                                         this. So that doesn't provide any cover, certainly. But I was hoping or I hope that it raised flags
                                         
                                         everywhere to say like, you know, because I was Googling some pretty crazy terms
                                         
                                         at points trying to learn about this issue. And I mean, if you Google just child pornography
                                         
                                         on Google, literally search it on Google, they will return messages telling you that this is
                                         
                                         an illegal thing to look for, providing resources if you're inclined to look at this type of material.
                                         
                                         I mean, there is active messaging around people looking for this type of imagery.
                                         
                                         So I wanted to make sure I didn't end up on some list somewhere,
                                         
    
                                         which I hope I'm not on.
                                         
                                         But basically, we wanted to make kind of as much noise as we could as
                                         
                                         investigative reporters. We're not trying to tip other people off that we're doing this story.
                                         
                                         But so that law enforcement knew that we were actually trying to engage in this in a real
                                         
                                         journalistic way, and that there wasn't anything else going on.
                                         
                                         Okay, so what do we know about the scale of the problem? You know,
                                         
                                         you explained that at one point in 2007, we had 100,000 reports. I'll remind people 2007 was a
                                         
                                         time when we were all online. That's not 1997. You know, 2007 is certainly well into the period where
                                         
    
                                         the internet has subsumed all of our lives,
                                         
                                         so that you have 100,000 reports then, and now we're up to 18, 19 million reports.
                                         
                                         But how much of this is just more looking and finding the ambient level of abuse that was always there?
                                         
                                         Or how much do we know about the growth of the problem?
                                         
                                         Because it seems like, judging from your articles, the reporting around
                                         
                                         the issue is increasing something like exponentially, whereas the number of arrests
                                         
                                         and the amount of resources being put towards solving the problem are more or less flat,
                                         
                                         which is a terrible juxtaposition there. So what do we know about how big a problem this is?
                                         
    
                                         Well, it's the right question. And
                                         
                                         unfortunately, I don't think I'm going to have a completely satisfying answer. And part of that,
                                         
                                         everything around this subject in some way goes back to the fact that nobody wants to talk about
                                         
                                         this subject. And so there isn't a lot of transparency for a variety of reasons, whether
                                         
                                         it's the federal government
                                         
                                         not keeping the records and reports that they should be, whether it's the lack of transparency
                                         
                                         from the National Center, which is responsible for the collection and serves as the clearinghouse
                                         
                                         for this type of imagery, or a variety of other things. So I can't answer your question completely, but I can give you some idea. The tip line, the CyberTipline, is run by the National Center for Missing and Exploited Children, commonly referred to as NCMEC.
                                         
                                         When people started becoming aware of kind of what you were saying, like 97, 98, people are coming online and law enforcement and Congress, other leaders are realizing that child sexual abuse imagery is also coming online.
                                         
                                         And the Internet was the biggest boon to child sexual abuse imagery since the Polaroid camera. And so...
                                         
                                         Let's just spell that out for people that can't do the psychological math so quickly there. So
                                         
                                         that the significance of the Polaroid camera was that you didn't have to figure out how to get your
                                         
                                         film developed by a stranger anymore. You could just develop it yourself. And that took a lot of friction out
                                         
                                         of the system of documenting the abuse of children, because unless you had a dark room,
                                         
                                         it was kind of mysterious how people could produce a ton of this material in the first place.
                                         
    
                                         Right. And according to the law enforcement we've spoken with, in the 80s and early 90s, before the advent of the internet, they were pretty comfortable saying that they had a good handle on this problem and they were actually stomping it out. I mean,
                                         
                                         child pornography, child sexual abuse material used to really be the domain of law enforcement
                                         
                                         and the U.S. Postal Service, because that is how it was traded. It was
                                         
                                         traded in parking lots. It was mailed. And that is how the majority of it was caught and detected.
                                         
                                         But with the advent of the internet, and this is, again, this is even before
                                         
    
                                         digital cameras for the most part, and certainly before cell phones. So they opened this tip line in 1998. In 1998, they received
                                         
                                         just over 3,000 reports of what is legally termed child pornography, which is also why it's

                                         a bit confusing when talking terminology. Most of the laws refer to it as child pornography. So there's just over 3,000 in 1998. By 2006, 2007, we're at 83,000
                                         
                                         reports, 85,000. And then something happens. And nobody can say with certainty, but the numbers start exploding with the invention of the smartphone.
                                         
                                         The iPhone is introduced in 2007.
                                         
                                         A bunch of other phones also start to be produced that have high-quality cameras, broadband connections.
                                         
                                         And so by about 2015, actually 2014, we break a million for the first time.
                                         
                                         And it's a big jump. 2013, there's less than half a million reports. 2014, that number doubles.
                                         
    
                                         2015, that number quadruples. We're over 4 million reports. And by 2018, we're at 18 and a half million reports.
                                         
                                         So the numbers are growing exponentially. But there's something we need to tease apart here,
                                         
                                         which is there are reports to the National Center. And the vast majority of these reports,
                                         
                                         more than 99%, come from what they call electronic service providers.
                                         
                                         Facebook, Twitter, Google, etc.
                                         
                                         But each report can contain a number of files.
                                         
                                         So this is not a one-to-one. So when there's 18.5 million reports in 2018, that does not mean there were 18.5 million pieces of content found.
                                         
                                         In fact, there were 45 million pieces of content found in 2018. And it was about split between
                                         
    
                                         images and videos. And we'll certainly come back to the discussion of videos because there's
                                         
                                         something startling going on there. But the numbers that we just published a few days ago for the 2019 numbers
                                         
                                         really start to tease apart these differences between reports and numbers of files. So in 2019,
                                         
                                         for the first time in over a decade, the number of actual reports went down.
                                         
                                         So the number of reports received by the National Center in 2019 was just shy of 17
                                         
                                         million. So we're looking at a drop of about one and a half million. And we can talk about why
                                         
                                         that happened in a minute. But the number of files reported in 2019 was just under 70 million.
                                         
                                         So we've gone from 45 million in 2018 to 70 million in 2019.
                                         
    
                                         And again, as recently as 2014, that number was less than 3 million.
                                         
                                         So I want to talk about why this is so difficult to even focus on and what explains the failure of our response thus far.
                                         
                                         But I don't want to lose the thing you just flagged. What's the distinction between still images and videos that you wanted to draw there? The thing that we've seen is a rise in all

                                         photos and videos detected. And we should very much get to the fact that these are only known images

                                         and videos that they are detecting.
                                         
                                         The systems they have to catch this content
                                         
                                         are trained to match only images and videos
                                         
                                         that have been identified previously as illegal material.
                                         
    
                                         So we're not talking about new material almost whatsoever.
                                         
                                         This is, in near completeness, previously seen images and videos.
                                         
                                         But to speak specifically to videos, the technology for detecting video child sexual abuse is nascent compared to image abuse.
                                         
                                         And for that reason, as recently as 2017,
                                         
                                         there were only three and a half million videos reported to the National Center as compared to 20 million images. Last year, there were 41 million videos reported as compared to 22 million. No, I'm sorry, 27 million images.
                                         
                                         So I know these are a lot of numbers, but what we're seeing is videos are exploding. Well,
                                         
                                         the number of videos detected. And that's almost wholly due to Facebook. And Facebook started scanning aggressively for videos in late 2017. And by
                                         
                                         2019, they were responsible for by far the majority of video reports. I think they were
                                         
    
                                         responsible for 38 million of the 41 million videos reported. So the numbers are rising. The reports,
                                         
                                         we'll come back to in a second, but the numbers of files and videos are rising.
                                         
                                         But as to your initial question, what does this tell us about, A, how much content is online,
                                         
                                         and B, how much is being produced? It tells us nothing about either of those for a few reasons. Not nothing, but it paints a
                                         
                                         very incomplete picture. The first reason is, as I said, they're only detecting previously identified
                                         
imagery, which means they're not detecting anything that is being newly created. Getting an image added to these lists of previously identified imagery is a very slow process, because of funding issues and a variety of other things.
                                         
    
                                         The list of previously identified imagery is growing very slowly.
                                         
                                         But the number of recirculating images and videos is as high as ever.
                                         
                                         So we don't know a lot about how much new content is being produced.
                                         
And because of that, we also don't know whether this problem has, as you said, always been there and we're just finding it because more companies are actively looking for it, or whether it's actually growing. Now, conversations with law enforcement, amongst others, say that the problem's growing.
                                         
And even common sense, as I said, suggests it's growing, with cell phones, broadband, cloud storage, social media.
                                         
                                         I mean, the internet is built to share videos and content and files.
                                         
    
There are platforms, billion-dollar platforms, completely dedicated to this.
                                         
                                         The fact that we don't know exactly how much is out there is evident in Facebook being responsible for 90% or so of all reports.
                                         
                                         Right, right.
                                         
And other companies, we're not sure exactly what they are doing and not doing. The whole industry, certainly before our reporting, and still to a certain extent, was very cloaked in secrecy. And people were happy for that to be the case, because nobody wanted to ask.

But one wonders whether Facebook is responsible for much of the problem,
                                         
                                         or just given their scale, given that they've got three billion people on the platform,
                                         
    
and given the sophistication of their tools that allow them to find the problem to the degree that they do, it's hard to know whether we're just penalizing them for looking and discovering
                                         
                                         the enormity of the problem on their side. But, you know, you would have a similar problem
                                         
                                         anywhere else you looked if you deployed the tools, you know, on any of these other platforms,
                                         
                                         whether it's Dropbox or Tumblr or any of these other companies you mentioned in your articles.
                                         
                                         That's totally right. And I actually want to make sure that it's clear that I don't
                                         
                                         think Facebook should be penalized for having the highest number of reports. I mean, there's a lot
                                         
                                         of nuance around the number of reports. And for example, we fought tooth and nail with the National
                                         
                                         Center for them to disclose the number of reports by company in 2018, and they would not do it.
                                         
    
And none of the other tech companies would disclose it either. All we knew was that there were nearly 18 and a half million reports. We didn't know who they came from.
                                         
We didn't know what companies were detecting imagery versus video. We didn't know when they were scanning for it.
                                         
And there were a variety of reasons for that. But there were two biggest reasons. One is that the National Center for Missing and Exploited Children is actually a private
                                         
                                         nonprofit. And that has come under judicial review. And we can talk about that more later
                                         
if we want. But what that status means is that we are not able to file Freedom of Information Act requests
                                         
                                         to receive information from them. So even though they're sitting on the canonical database of
                                         
    
child sexual abuse reports, reports of a federal crime, an extremely important statistic that in most instances we would be able to learn about through a freedom of information request, we cannot file such a request with NCMEC, and they would not tell us.
                                         
                                         So that was the number one challenge.
                                         
                                         And then none of the other tech companies would tell us either.
                                         
And finally, we had a source, whom I can't disclose, come to us and say, look, the number just from Facebook Messenger was 12 million.
                                         
                                         After we reported that number, the federal government had a conference or a presentation.
                                         
                                         They said that it was in total 16 million from all parts of Facebook.
                                         
    
And at first blush, you think, damn, I mean, Facebook is absolutely riddled with this content.
                                         
                                         Now, let me be clear.
                                         
                                         Any company online that has images or videos is infested with this content.
                                         
That is just the case.
                                         
                                         So Facebook does not stand alone in having this issue. The very interesting part about those numbers is that they actually reflect Facebook taking
                                         
                                         an extremely aggressive stance in looking for this imagery.
                                         
                                         I mean, they're scanning every photo that gets uploaded.
                                         
    
                                         Since late 2017, they're scanning every video that gets uploaded.
                                         
                                         And they're aggressively reporting it to the National Center. So those very high numbers actually reflect a very conscientious
                                         
effort to find and remove this imagery. I mean, we spoke with Alex Stamos, who is a former chief security officer for Facebook; he held the same position at Yahoo. And he said that
                                         
if other companies were reporting the same way that Facebook was reporting, we wouldn't have, you know, 16 million reports last year. We'd have 50 or 100 million.
                                         
And we can come back to Facebook Messenger, because that's where things get interesting with Facebook.
                                         
                                         But I think by any measure, Facebook is actually an industry leader when it comes to finding and reporting this content.
                                         
    
Right. I know that people hearing this, once they absorb the horror of it, will feel somewhat powerless to do anything to help solve
                                         
                                         the problem. And so one question I was going to ask you at the end is, you know, are there any
                                         
                                         nonprofits that you recommend we support who are working on the front lines of this? And so you
                                         
just said something somewhat equivocal about the National Center, which is really at the center of this, and they're a nonprofit.
                                         
                                         What do you recommend people do here?
                                         
I mean, should we be giving money to the National Center for Missing and Exploited Children?
                                         
                                         Or is there some better option for people who want to help here?
                                         
                                         Sure. It's a good question.
                                         
    
You know, I'm generally not in the business of recommending solutions. It's lucky, as a reporter, that a lot of my job is pointing out problems and not necessarily finding the solutions to them. But the people doing this work are compassionate people. Really, I mean, you can't work on this and not be a compassionate person. This is a labor of love that these people are doing. That said, there are definite issues. I mean, there's the fact that it has this quasi-governmental status, which has come up. Justice Gorsuch, when he was a judge in the 10th Circuit,
                                         
                                         ruled that the National Center was, in fact, a part of the government. They get 75% of their
                                         
                                         funding in general from the federal government. That's about $30 million a year. But at the same
                                         
                                         time, they are absolutely overwhelmed. I mean, this problem is overwhelming the National
                                         
                                         Center. It's overwhelming law enforcement. It's overwhelming a lot of tech companies.
                                         
                                         So while it's complicated, I do think that their heart absolutely is in the right place and their
                                         
    
                                         efforts are in the right place. They're just behind. They're behind. Now, you could give
                                         
                                         money to them and that would be good.
                                         
There are other nonprofits that are also doing great work. The Canadian Center for Child Protection, which we reported on, is one of the leaders in starting this idea of what they call naming and shaming tech companies. Because of the cloak of silence that's been around this, we haven't been able to hear what companies are actually doing to combat the problem. The Canadian Center
                                         
                                         has taken the lead in trying to push that process forward. You can donate money there. You can
                                         
donate money to Thorn, which is also a nonprofit, one that is developing software for smaller companies. Building these kinds of systems to scan for and detect content is expensive, and smaller companies are sometimes unable to do that.
                                         
                                         Why wouldn't there be an open-source effort
                                         
                                         to develop this technology that anyone could use?
                                         
                                         I mean, why would there be any proprietary angle on this at
                                         
                                         all? Why wouldn't Google or Palantir or, you know, Facebook just break off some of their expertise
                                         
                                         and say, here are the tools, this is how you find child pornography in your databases, you know,
                                         
    
                                         use them freely? Right. Again, a great question. Now, part of what's going on is, and look, Google
                                         
sits on NCMEC's board. Facebook sits on NCMEC's board. NCMEC gets in-kind donations
                                         
                                         from Palantir. They've essentially depended in large part on Palantir and Google to upgrade
                                         
                                         their systems over the past few years, even though they have a not insignificant sum of money coming in from the federal government.
                                         
But the detection system most commonly used is something called PhotoDNA. So PhotoDNA was
                                         
                                         invented in 2009, which many experts would say is at least five years too late when they knew
                                         
                                         what the problem was. But all the same, invented in 2009, it was a partnership
                                         
between Microsoft and a person named Dr. Hany Farid, who was at Dartmouth at the time, now
                                         
    
                                         Berkeley. And it is proprietary, and we'll talk about that in a second. But basically what it is,
                                         
is it's fuzzy image matching. And by fuzzy image matching, I mean this: many of your listeners, who I know are adept at technology, will know that you can take what are called cryptographic hashes of any type of file. And a cryptographic hash will shoot out a string of characters, and that string of characters is
                                         
                                         unique to that file. And if any small thing changes, that cryptographic hash will change.
                                         
                                         
                                         And so for a while, they were using cryptographic hashes to try to match known images of child sexual abuse.
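As an aside, the property he's describing, that any small change produces a completely different cryptographic hash, is easy to see. Here is a minimal Python sketch using stand-in bytes rather than a real image file:

```python
# A cryptographic hash (here SHA-256) maps a file's bytes to a short,
# effectively unique string. Change even one byte and the output changes
# completely (the "avalanche effect"), which is why a re-compressed or
# slightly cropped copy of an image no longer matches a hash list.
import hashlib

original = b"stand-in bytes for an image file"
tweaked = b"Stand-in bytes for an image file"  # a single byte differs

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()

print(h1 == h2)  # False: the two digests share almost nothing
```

That brittleness is exactly what makes exact hashing easy to defeat, and is the gap PhotoDNA was built to close.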
                                         
The challenge became that the people who are trading this type of imagery are often also pretty technologically literate. And so they knew that if they saved the image with a different type of compression, or if they cropped it even slightly, the cryptographic hash
                                         
would no longer match. So PhotoDNA was the solution to this. And PhotoDNA is, again,
                                         
                                         they call it a fingerprint, you can call it a fuzzy match, but basically it takes into account a lot of these
                                         
                                         minor changes that can be made to an image. So even if you change the color a little bit, you
                                         
                                         crop it a little bit, you write on it maybe a little bit, it's still going to catch that image.
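PhotoDNA itself is proprietary, so the sketch below is not its algorithm. It's a toy "average hash" in Python, a much simpler perceptual fingerprint, that illustrates the fuzzy-matching idea: reduce an image to a tiny grayscale grid, record which cells are brighter than the mean, and compare fingerprints by Hamming distance, so small edits barely move the fingerprint.

```python
# Toy perceptual hash in the spirit of fuzzy image matching (NOT PhotoDNA).
# Minor edits (e.g. a uniform brightness shift) flip few or no bits, so
# near-duplicates stay close in Hamming distance, unlike cryptographic hashes.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string,
    one bit per cell: 1 if the cell is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Number of positions where two equal-length fingerprints differ."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 30]]
brightened = [[20, 210], [230, 40]]   # every pixel +10: a "minor edit"
different = [[200, 10], [30, 220]]    # a genuinely different image

h0, h1, h2 = (average_hash(img) for img in (original, brightened, different))
print(hamming(h0, h1))  # 0 -> the slightly edited copy still matches
print(hamming(h0, h2))  # 4 -> the different image does not
```

Real perceptual fingerprints operate on larger grids with more robust transforms, but the matching logic, where a small distance means "same image", is the same in spirit.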
                                         
That was invented in 2009. Now, the question of why it isn't open source is a good one.
                                         
    
They would say that opening it up would give people who are looking to dodge or manipulate the system access to the algorithm, which would then allow them to find out
                                         
                                         how to do that. I don't know enough to say whether that's for sure the case or not. For example,
                                         
                                         Facebook last year released an open source algorithm for video detection.
                                         
Now, a couple weeks ago, I asked some cryptologists, why would Facebook do that? And they said, well, it's probably not that sophisticated an algorithm, to be honest. Dr. Farid will tell you that PhotoDNA is not some kind of top-secret, incredibly complex thing, but they still do keep it under wraps. Now, Microsoft, which owns PhotoDNA, will license it to most companies, from what we understand, if they ask. Now, there's been
                                         
    
                                         some movement around that lately that complicates things, but for the most part, Facebook has a
                                         
                                         license for PhotoDNA. Google has a license for PhotoDNA. All of the big companies have
                                         
licenses for PhotoDNA. And they use it on their systems so that they can all share this list of what they call hashes, a hash list, between themselves, where they fingerprint photos and check them against it. Now, with video, which I mentioned previously, that technology is still being developed, and unfortunately there is no standard. That has been confusing to us. It remains confusing to us.
                                         
                                         The National Center has said that they would prefer there's a video standard just the same
                                         
                                         way there is an imagery standard. But there is no video standard. So Google has their own
                                         
    
fuzzy hash fingerprint system for video. Facebook has their own system. Microsoft actually evolved PhotoDNA into their own system. And law enforcement uses a different system still. So now all of a sudden you have this issue of a bunch
                                         
                                         of different proprietary technologies generating different types of fingerprints that are
                                         
incompatible, with no
                                         
                                         master list of these fingerprints. So there's really a rabbit hole to go down, which is not
                                         
                                         uncommon to technology as a whole. But again, in this instance, the ramifications of it are stark.
                                         
                                         Well, there's the effort of the tech companies and the effort of government.
                                         
    
                                         And there's something mysterious here around how noncommittal people have been
                                         
                                         toward solving this problem.
                                         
                                         Because as you say, in at least one of your articles, the US government has approved $60
                                         
million a year for this problem, which is really a drop in the bucket. I mean, that on its face is just not enough. But they don't even spend that amount every year.
                                         
                                         They set aside that amount of money, but they spend something like half of it.
                                         
                                         And this is just totally mystifying to me. I mean, if ever there were a nonpartisan issue
                                         
                                         where, you know, you could get Christian conservatives on
                                         
    
the one hand and progressives on the far left on the other to be equally animated,
                                         
                                         it's got to be this. So the issue is, let's figure out how to prevent kids from being raped and
                                         
                                         tortured for money. And let's figure out how to hold people accountable
                                         
                                         who do this and who traffic in this imagery.
                                         
                                         And yet it seems that even the money
                                         
                                         that's reserved to fight this problem
                                         
                                         isn't being deployed.
                                         
                                         How do you understand that?
                                         
    
                                         It's hard to explain,
                                         
                                         but I do think that this is perhaps the right time
                                         
                                         to talk about people just not wanting to discuss the issue.
                                         
                                         So in 2008, people knew this was a problem.
                                         
As I said, that testimony that you had quoted earlier from Flint Waters, where he's talking about the man who gave juice boxes to the children and then raped them, that was 12 years ago as of last year, so it's 13 years ago now: 2007.
                                         
                                         In 2007, everybody knew this was a huge problem.
                                         
    
                                         And so a bill was put on the floor by Joe Biden,
                                         
                                         Debbie Wasserman Schultz, bipartisan.
                                         
I believe Cornyn was involved either at that time
                                         
                                         or at least by 2012.
                                         
                                         As you say, it was a bipartisan issue. I think it passed unanimously. It was called the 2008 Protect Our Children Act.
                                         
And it wasn't until probably a month into our reporting that we realized that there was legislation meant to confront this issue. And the more we dug into that legislation,
                                         
                                         what we saw is it was pretty good.
                                         
    
                                         It really foresaw a lot of the issues.
                                         
But then what we saw, which was really disappointing, to put it mildly, was that most of the major aspects of the legislation
                                         
                                         had not been fulfilled by the federal government.
                                         
                                         So there were three main provisions that were not followed through on.
                                         
                                         The first, and perhaps the most consequential, is the one you discussed,
                                         
which is that Congress allocated only half of the $60 million that the bill authorized
                                         
    
                                         to fight this. That money is supposed to go directly to state and local law enforcement
                                         
                                         in order that they can deal with this problem. And we haven't even spoken about them, but
                                         
                                         the short of it is they're completely overwhelmed. They're having to do total triage.
                                         
                                         
For many of them, that means they focus only on infants and toddlers,
                                         
                                         leaving the rest of the cases unexamined.
                                         
                                         That's true with the FBI.
                                         
    
                                         That's true in L.A.
                                         
                                         So you have these, they're called Internet Crimes Against Children Task Forces, ICACs.
                                         
                                         All these ICACs begging for money.
                                         
                                         The money has been appropriated. For the last 10 years,
                                         
                                         it stayed almost wholly at $30 million of the $60 million. I might be using appropriated wrong,
                                         
                                         might be authorized. I'm not sure what the term is. But basically, they're allowed to give up to $60 million, and they're only given $30 million.
                                         
                                         We found another thing: the Justice Department is supposed to produce biennial, every two years, reports on this problem.
                                         
                                         Now, these reports are supposed to have several pieces of information.
                                         
                                         They're supposed to compile data about how many reports, where the reports are coming from, in order that we have an idea of the scope of this problem.
                                         
    
                                         And they're supposed to set some goals to eliminate it.
                                         
                                         Well, only two of what should now be seven reports have been produced.
                                         
                                         And finally, they were supposed to have an executive level appointee,
                                         
                                         at least by 2012 when the bill was reauthorized for the first time.
                                         
                                         essentially a quarterback who's in charge of this issue.
                                         
                                         That position has never been filled with the executive level person.
                                         
                                         It's been a series of short-term appointees leading the efforts.
                                         
    
                                         And so it was stunning to see that they had foreseen this problem,
                                         
                                         and they had actually set up a pretty good law
                                         
                                         meant to address it.
                                         
                                         And the only reason that we can think of
                                         
                                         that these things were not followed through on
                                         
                                         is people were very happy to put the law in place
                                         
                                         and then turn their backs.
                                         
                                         And I can only chalk that up to people
                                         
    
                                         just literally not wanting to pay any mind to this issue
                                         
                                         after feeling like they dealt with it.
                                         
                                         It is truly mysterious.
                                         
                                         I mean, I don't know.
                                         
                                         Again, what we're talking about is a source of suffering that is as significant as any
                                         
                                         we can think of happening in our own neighborhoods, right?
                                         
                                         This is not happening in some distant place in a culture very unlike your own for which
                                         
                                         the normal levers of empathy are harder to pull, right?
                                         
    
                                         This is happening to, if not your kids, your neighbor's kids,
                                         
                                         and some guy down the block is paying to watch it. And it's all being facilitated by technology that
                                         
                                         is producing more wealth than any other sector on earth, right? So you're talking about the
                                         
                                         richest companies whose wealth is scaling
                                         
                                         in a way that normal businesses never do, and the money's not being allocated to solve this problem.
                                         
                                         It's just, we need something like a Manhattan Project on this, where all the tech companies
                                         
                                         get together and realize this is not something the government is especially good at.
                                         
    
                                         Look at those Facebook hearings.
                                         
                                         And, you know, you have a bunch of geezers up there trying to figure out what Facebook is while also trying to hold Zuckerberg to account for having broken our democracy.
                                         
                                         And it's just a completely fatuous exercise, right?
                                         
                                         So clearly we need the best and brightest to
                                         
                                         break off 1% of their bandwidth and wealth and figure out how to solve this problem. Because
                                         
                                         what seems to be happening, based on your reporting, correct me if I'm wrong, is that
                                         
                                         there are troubling signs that tech is moving in the opposite direction. They're creating technology
                                         
                                         based on other concerns that will make the problem harder to discover. And the example of this that
                                         
    
                                         you've written about is that Facebook is planning to fully encrypt Facebook Messenger, which is
                                         
                                         one channel where a lot of this material streams. And if you
                                         
                                         do that, well, then Facebook will be able to take the position that, you know, Apple has taken around
                                         
                                         unlocking its iPhone, right? Like, we can't unlock the phone because not even we can get into your
                                         
                                         iPhone. So if that person's phone is filled with evidence of crimes against children, well, it really can't be our problem.
                                         
                                         We've built the technology so that it will never become our problem.
                                         
                                         And there are many people who are understandably part of a cult of privacy now that have so fetishized the smartphone in particular and other channels of information as, you know, sacrosanct and have
                                         
                                         to be kept forever beyond the prying eyes of government, no matter how warranted the search
                                         
    
                                         warrant is, that a lot of people will line up to say, yeah, I really don't care what might be in
                                         
                                         the Facebook Messenger streams of others or on another person's iPhone, I do not
                                         
                                         want it to be the case that you can ever get into my iPhone or my encrypted messaging. And I don't
                                         
                                         know how you feel about that. I mean, I think I've heard the arguments specifically with the case
                                         
                                         of the iPhone. Frankly, my intuitions have been kind
                                         
                                         of knocked around there such that I actually don't have a settled opinion on it. But I'm
                                         
                                         pretty sure that if you tell me that, you know, there's somebody who we know is raping and
                                         
                                         torturing children, and we have the evidence on his iPhone, but we can't open it, 99% of my
                                         
    
                                         brain says, okay, that's unacceptable. No one has an absolute right to
                                         
                                         privacy under those conditions. Let's figure out how to open the iPhone. But many people will
                                         
                                         disagree there for reasons that, you know, in another mood I can sort of dimly understand.
                                         
                                         But, you know, for the purposes of this conversation, those reasons seem sociopathic to me.
                                         
                                         How do you view the role of tech here and
                                         
                                         our looming privacy concerns? Right. Well, it's interesting to hear somebody such as yourself,
                                         
                                         who I know has a lot of experience with many of these issues, not child sexual abuse, but privacy,
                                         
                                         technology, and the tech companies. But let me go back to a few things
                                         
    
                                         you said, and then I'll address the encryption bit. We were shocked to find out how many people
                                         
                                         actually are engaged or looking at this type of material. Just one statistic or one quote I can
                                         
                                         actually give you is we were speaking with a guy, Lieutenant John Pizzaro, who's a task force
                                         
                                         commander in New Jersey dealing with this type of
                                         
                                         content. So Lieutenant Pizzaro says, look, guys, you got 9 million people in the state of New
                                         
                                         Jersey. Based upon statistics, we can probably arrest 400,000. Okay. So he's just saying that
                                         
                                         5% of people look at child pornography online. Is that, I mean, that's how I interpret that.
                                         
                                         That's right.
                                         
    
                                         Okay.
                                         
                                         So, I mean, that just seems, it just seems impossible, right?
                                         
                                         It's like.
                                         
                                         I mean, you've struck on one of the challenges with reporting on it.
                                         
                                         You know, like it's A, nobody's going to tell me they look at this stuff.
                                         
                                         I actually did have a series of encrypted chats with somebody who ran some websites
                                         
                                         that did have this material. But
                                         
                                         figuring out how many people look at it or don't is very difficult for a reporter. But law
                                         
    
                                         enforcement, and there is an agenda on law enforcement's side, which we'll get to when we talk about
                                         
                                         encryption, what they say is three to five percent of any random population will be looking at this material. And that's not all pedophiles.
                                         
                                         And in fact, a large number of those people are not pedophiles. And that's one of the issues with
                                         
                                         having this kind of content even available is that many of the child advocates will say,
                                         
                                         you spoke a little bit about adult pornography earlier, and the wide range of adult pornography, and just the insane types of things. And, you know, again, according to interviews with law enforcement and specialists we've spoken with, they say that this will drive people towards child sexual abuse.
                                         
                                         But you noted earlier that nobody seemed to be asking questions about it.
                                         
                                         I mean, I know there's been articles written about this in the past several years, but there has not been an investigation such as ours in probably a decade or so.
                                         
    
                                         It's a very, very easy subject to look away from.
                                         
                                         But in the course of my reporting, I did go back years and found employees, former employees
                                         
                                         at Twitter, former employees at Snapchat, former employees at Facebook, because those are the people
                                         
                                         who had insight in, let's say, 2012, 13, 14, when the problem started really getting big.
                                         
                                         And from every single person at every one of those companies, I heard the same thing, which is that
                                         
                                         the teams responsible for dealing with that material,
                                         
                                         which are generally called trust and safety teams, are totally underfunded and basically ignored.
                                         
    
                                         So an example: one former Twitter employee told me that in 2013, when Vine, which was Twitter's short-lived video product, like video tweets, six-second, eight-second tweets,
                                         
                                         there were gigabytes of child sexual abuse videos
                                         
                                         appearing on Vine,
                                         
                                         and they were appearing more quickly than the one person charged with this could take them down.
                                         
                                         So the idea that this is a new problem is totally absurd.
                                         
                                         All the companies have known about it for a long time, but they've been happy to not answer questions about it.
                                         
                                         I think this appeared in a few of your articles, but there's one sentence in one of these articles that I read and reread, and I'll just read it here, and you'll have to explain
                                         
    
                                         this to me. So I'm just quoting one of your articles. Police records and emails, as well as
                                         
                                         interviews with nearly three dozen law enforcement officials, show that some tech companies can take
                                         
                                         weeks or months to respond to questions from authorities, if they respond at all. Now, to my eye, that sentence doesn't make
                                         
                                         any fucking sense, right? I mean, how is it that the FBI could be calling Tumblr or Facebook or
                                         
                                         Dropbox or AWS or any other platform saying, listen, we've got crimes in progress being documented on your servers. Toddlers are being
                                         
                                         raped, and this information is being spread to the world through your servers. Call us back.
                                         
                                         How is it that the cops aren't showing up with guns, kicking in the door,
                                         
                                         if they don't get a timely response?
                                         
    
                                         I mean, part of the problem, Sam, is that there's too much.
                                         
                                         The cops are overwhelmed.
                                         
                                         So what's often occurring, we found, is that there are so many reports coming in to a local task force that they have to spend a significant portion of their time triaging these reports,
                                         
                                         trying to find, I mean, the number one thing they want to do is identify if there is a
                                         
                                         child in imminent harm. Because again, a lot of this material is recirculated material.
                                         
                                         These are children who were abused 10 or 15 years ago who have been since rescued, saved, and the images
                                         
                                         are being found and reported, but there's no imminent harm. So they're going through and
                                         
                                         they're triaging. And again, we're talking tens of thousands or sometimes hundreds of thousands
                                         
    
                                         of reports for a task force. So what we found was occurring, which I agree is incredibly disturbing,
                                         
                                         is the law as it stands now, and there has been a bill
                                         
                                         that has been introduced subsequent to our reporting to address this, but as it stands now,
                                         
                                         the law says that tech companies, as soon as they become aware of this content, must report it.
                                         
                                         So first of all, tech companies are not legally obligated to look for this content. And there are
                                         
                                         real and difficult to manage Fourth
                                         
                                         Amendment issues around that. But putting those aside, tech companies are not legally obligated
                                         
                                         to look for this content, but they are legally obligated to report it as soon as they know about
                                         
    
                                         it. So they will report this content, and then they are only required to store it, whether it's the imagery or anything about it, for 90 days. After 90 days, they have to get rid of the imagery. There's no
                                         
                                         way they can keep a hold of the imagery, which leads to significant challenges for training things
                                         
                                         like artificial intelligence classifiers. Okay, so they have to get rid of this. And that in itself
                                         
                                         poses a challenge. And second, a lot of the time, because law enforcement is so overwhelmed, because there's so much content, because they're having to figure out what's an actual threat, by the time they go to the tech company, many times the company has already gotten rid of any logs or data or anything around the imagery.
                                         
                                         So there were several instances where law enforcement would go to Snapchat and they wouldn't have it.
                                         
                                         We found cases where Tumblr, for a certain period of time, I think in 2016, was in fact informing the people whose accounts contained this content that they had been reported.
                                         
                                         And we talked with law enforcement who said that that gave criminals ample opportunity to delete
                                         
    
                                         the evidence, to destroy devices. So yeah, it's absolutely nuts. And it's because
                                         
                                         of the overwhelming amount of content, and the inability of the National Center,
                                         
                                         whose technology is literally 20 years old, to properly vet and filter and triage the content.
                                         
                                         So they do try. Some of that then falls on local law enforcement, who, again, are overwhelmed.
                                         
                                         And by the time they get to some of these things, it's often gone.
                                         
                                         So a bill was introduced in December that would double the amount of time, at least I believe, that the companies are required to hold on to this information.
                                         
                                         So that's one positive step, I think. But I want to get back really quick to the idea of why tech companies maybe
                                         
    
                                         aren't inclined to deal with this issue the way I think most of us would expect.
                                         
                                         So think about these trust and safety teams, right? Their job is to identify pieces of content
                                         
                                         and users to remove from the platform. Now, I'm sure you know that the way that many of these
                                         
                                         companies, if not all of these companies, certainly the public ones and the ones who are looking for
                                         
                                         funding, report their success is by number of users. Daily active users, monthly active users,
                                         
                                         whatever it is. So you have this team within your organization whose job it is to remove users
                                         
                                         and to flag users. And yes, I think it's very easy for all of us to say,
                                         
                                         well, no shit, and we're better off without them. But I mean, unfortunately, what reporting,
                                         
    
                                         and again, this is across several different organizations, several different people,
                                         
                                         This is not one anecdote. This is not two anecdotes.
                                         
                                         This is several people saying we were underfunded.
                                         
                                         It wasn't a priority.
                                         
                                         And as I think we've seen with Facebook in recent years,
                                         
                                         as well as other companies,
                                         
                                         until it becomes a public issue, a public problem for them,
                                         
                                         they're not inclined to put any resources
                                         
    
                                         towards anything that is not in some way driving the bottom line. And so that brings us to encryption.
                                         
                                         Before we go there, let me just kind of message to the audience here, because I know that many
                                         
                                         people who work at these companies listen to the podcast. In fact, I know many of the people who
                                         
                                         started these companies. I can reach out and talk to many of the principal people here.
                                         
                                         So I know many of you are listening.
                                         
                                         At every level in these companies, you have to figure this out.
                                         
                                         The fact that this is the status quo, that so little attention and so few resources are
                                         
                                         going to solving this problem when the problem itself
                                         
    
                                         is being materially facilitated by your companies, right? I mean, the problem couldn't exist
                                         
                                         at this scale, anything like this scale, but for the infrastructure you have built and upon which
                                         
                                         you're making vast wealth, right? It's just, it's completely understandable
                                         
                                         that it's a daunting task, right? But if you're working for these companies and you're spending
                                         
                                         all your time trying to increase their profit and spending no time at all, I mean, when was the last time you as an employee at Twitter or Tumblr or
                                         
                                         Facebook or AWS or Dropbox, any of these companies, have thought about the problem
                                         
                                         we're now talking about? Please do something. You know better than I do what you might do
                                         
                                         to make noise within your company about this, but prioritize this.
                                         
    
                                         Google lets its employees spend some significant percentage of time just thinking about problems
                                         
                                         that interest them. Well, become interested in this one. We're going to look back on this period,
                                         
                                         I mean, of all the things that are going to seem crazy in retrospect, the deformities in our
                                         
                                         politics and in culture at large, born of our just not
                                         
                                         figuring out how to navigate our use of these tools. You know, the fact that we spend half of
                                         
                                         our lives ignoring our friends and families because we're looking at what's happening to
                                         
                                         our reputations on Twitter, because we've put this slot machine in our pockets and take it out 150
                                         
                                         times a day, right? All of that is going to seem
                                         
    
                                         insane. And once we correct for it and find some psychological balance, we'll be better for it.
                                         
                                         But nothing will seem more insane than the fact that we did not address this problem in a timely
                                         
                                         way. So with that PSA to my friends in tech, back to you, Gabe, what do you have to say about the prospects of encryption and related issues?
                                         
Sure, let me give a quick rundown, just so people know. So there's a pretty big distinction between a company like Facebook, a social media company that's scanning every single thing, doing it
                                         
                                         aggressively, and places like cloud storage. So Dropbox, Google Drive, they tend to have very
                                         
                                         similar policies. And those policies are, they don't scan anything you upload. They're only
                                         
                                         going to scan a file when you share that file.
                                         
                                         That's their policy.
                                         
    
                                         Now, that's an interesting policy.
                                         
                                         But in our reporting, we found that people easily circumvent that policy.
                                         
                                         They do that by sharing logons.
                                         
                                         They do that by leaving public folders open and available.
                                         
So that's the choice they're making, and these are all what the companies would say are privacy-based policies.

So Dropbox and Google only scan on share.
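To make the distinction concrete, here is a minimal sketch of what a "scan on share" policy looks like compared to scanning on upload. Everything in it is invented for illustration: real systems match perceptual hashes such as PhotoDNA signatures, which survive resizing and re-encoding, not the cryptographic hash used below.

```python
import hashlib

# Illustrative stand-in for an industry hash list of known abuse imagery.
# (Real deployments use perceptual hashes like PhotoDNA, not SHA-256;
# this set exists purely for the demo.)
KNOWN_HASHES = {hashlib.sha256(b"flagged-example-bytes").hexdigest()}

def on_upload(data: bytes) -> None:
    """Under a scan-on-share policy, uploading triggers no scan at all."""
    pass  # the file is stored unexamined

def on_share(data: bytes) -> bool:
    """The scan fires only when the user shares the file.

    Returns True if the file matches a known hash."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

The circumvention Gabe describes next falls straight out of this design: if users trade account logins or leave a folder publicly open instead of using the share button, the `on_share` hook never fires, so nothing is ever scanned.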
                                         
Now, let me tell you a little about these numbers for 2019, which were first released only to the New York Times.
                                         
Dropbox filed only 5,000 reports in 2019.
                                         
    
                                         Now, while we were doing our reporting in 2019, we said to Dropbox, do you scan images? Do
                                         
                                         you scan videos? And after weeks of them saying, well, we can't tell you that, we won't tell you
                                         
that for whatever reason, at one point, I believe in July of 2019, they said scanning videos
                                         
                                         is not a priority, not a priority for us. We don't feel that videos are the medium of choice necessarily,
                                         
                                         and that's not a priority.
                                         
                                         By the time we published our article, literally in the days before,
                                         
                                         Dropbox said, oh, oh, we're scanning video now.
                                         
                                         Okay, so they start scanning video, let's say, in the last quarter of 2019.
                                         
    
                                         What the numbers show is that of the 5,000 reports Dropbox filed to the National Center,
                                         
                                         there were over 250,000 files. 5,000 reports, 250,000 files. The majority of those files were video. Okay,
                                         
                                         so Dropbox starts scanning for video, they start finding a lot of video. Amazon, Amazon's cloud
                                         
                                         services handle millions of uploads and downloads every second.
                                         
                                         Millions every second.
                                         
                                         They don't scan at all.
                                         
                                         They scan for no images.
                                         
                                         They scan for no videos.
                                         
    
                                         Last year, they reported zero images, zero videos.
                                         
                                         You know, we could go on.
                                         
                                         Those are some of the bigger ones. You have Apple.
                                         
Apple cannot scan their Messages app, and they elect not to scan iCloud. So once

again, that's their cloud storage, and they don't scan it. Now, I've gone back to them.
                                         
                                         Some of these companies are starting to do it. I think that there's nothing like, you know,
                                         
                                         the exposure of a company to motivate them to begin doing this. But there are certainly things
                                         
                                         they can be doing, and they will
                                         
    
                                         tell you that they do dedicate a significant amount of resources. But let me address that as
                                         
well. So Microsoft, who again sponsored the invention of PhotoDNA, the image-scanning

technology, has long been seen as a leader in this field. And remember, this all started with a tip
                                         
                                         from a user saying that
                                         
                                         they were able to find child sexual abuse on Microsoft. So my colleague, Michael Keller,
                                         
                                         both of us have computer science backgrounds, he wrote a program. And this computer program
                                         
                                         used what's called a headless browser, which means you can't actually see the browser.
                                         
                                         And he programmed this headless browser to go on Bing, to go on Yahoo, to go on DuckDuckGo,
                                         
    
                                         and to go on Google and search for child sexual abuse imagery using terms that we both knew
                                         
                                         were related to child sexual abuse, as well as some others that were sent to us as tips.
                                         
                                         And the program, I mean, again, we had it very heavily vetted by our
                                         
                                         lawyers. It even blocked the images from ever being loaded, just so you know. So not only could
                                         
                                         we not even see a browser window, but the images were stopped from ever loading. But what we did
                                         
                                         is we took the URLs that were returned from these image searches, and we sent those URLs to Microsoft's own PhotoDNA cloud service. So essentially,
                                         
                                         this is a cloud service that we signed up for with Microsoft, saying very clearly,
                                         
                                         we're New York Times journalists, we're reporting on this issue, and we'd like access to your API
                                         
    
                                         to check for images of child sexual abuse. They gave us access to the API. We wrote a computer program that searched Microsoft
                                         
Bing using terms. We then sent those URLs to Microsoft's PhotoDNA and found dozens of images.
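Stripped of any real endpoints, the pipeline Gabe describes has a simple two-stage shape: collect result URLs from a search engine, with image loading disabled so nothing is ever rendered, then submit each URL to a matching service. The sketch below is only illustrative; the Times' actual program is not public, and the stubbed search index and matcher here are invented stand-ins for Bing and the PhotoDNA cloud API.

```python
from typing import Callable, Iterable, List

def collect_urls(search: Callable[[str], Iterable[str]],
                 terms: Iterable[str]) -> List[str]:
    # Stage 1: gather image-result URLs for each search term.
    # In the reporters' setup this ran in a headless browser with
    # image loading blocked, so only URLs were ever kept.
    urls: List[str] = []
    for term in terms:
        urls.extend(search(term))
    return urls

def flag_matches(check_url: Callable[[str], bool],
                 urls: Iterable[str]) -> List[str]:
    # Stage 2: ask a matching service (a PhotoDNA-style API) whether
    # each URL points to known imagery, and keep only the hits.
    return [u for u in urls if check_url(u)]

# Demo with invented stand-ins; no real services are contacted.
fake_index = {"term-a": ["http://example.com/1.jpg", "http://example.com/2.jpg"]}
fake_known = {"http://example.com/2.jpg"}

urls = collect_urls(lambda t: fake_index.get(t, []), ["term-a"])
flagged = flag_matches(lambda u: u in fake_known, urls)
```

The point of the two-stage design is that the journalists' code never needed to touch an image: the search stage returns only URLs, and the matching stage delegates identification entirely to the remote service.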
                                         
Dozens of images. This is a trillion-dollar tech company. And that's before we just cut
                                         
                                         it off. I mean, again, with letters coming in from the FBI saying, be careful. And we weren't
                                         
                                         trying to make a massive collection or prove that there are millions. We found 75 before we were
                                         
                                         like, okay, there's plenty here. So then what we did is we went and told Microsoft. We said,
                                         
                                         this is what we did. This is exactly what we did. These are the search terms we used.
                                         
    
They said it was something akin to, you know, a bug, a problem.
                                         
                                         Okay.
                                         
Three weeks later, we did it again, and we found them all over again.
                                         
                                         We found different ones.
                                         
                                         We found more.
                                         
                                         So the idea that these tech companies cannot find,
                                         
                                         I mean, they should be able to do this themselves, obviously, when two journalists at the New York Times can do that. So the idea that they're doing, and this isn't just Microsoft, right? It was also found on Yahoo and DuckDuckGo. Now, both of those are powered by Microsoft's search engine. So the fault lies largely with Microsoft. We did not find any on Google.
                                         
                                         So the fault lies largely with Microsoft.
                                         
    
                                         We did not find any on Google.
                                         
                                         So that says two things.
                                         
One, Microsoft doesn't realize that its own system is indexing and serving up imagery that its own technology can identify.
                                         
                                         And two, it's doable.
                                         
                                         You can stop this.
                                         
                                         Google's done it.
                                         
                                         However, Google did it in their search.
                                         
                                         And I'm not saying it's impossible to find it.
                                         
    
                                         Again, we didn't do some kind of exhaustive search, but it wasn't turning up on Google. So there is some extremely
                                         
                                         uneven commitment to this issue. And also there's this, the issue we flagged in discussing Facebook
                                         
a while back, where if you don't look, you don't have any bad news to report. If Facebook looks, they find 16 million instances of the problem. AWS doesn't look, and they don't pay a significant price for not looking. The not looking has to become salient, an instance of terrible PR, for anyone to be incentivized to look, it sounds like, beyond actually caring about this issue.

Right. Well, now we're running up against exactly what you described earlier, which is
                                         
the privacy advocates and, essentially, the encryption absolutists. And let me start
                                         
                                         this part of the conversation by saying, I'm a reporter. I don't offer my opinion on exactly how this problem should be solved.
                                         
                                         My point is that this is the clearest example of where privacy has stark and terrible consequences
                                         
                                         for a group of people. Okay. But that said, you're right. Amazon, Apple, they seem to pay
                                         
    
                                         very little price for filing almost zero reports of child sexual abuse. And meanwhile, Facebook
                                         
                                         gets a bunch of initially negative headlines for filing an enormous amount. Now, as we've discussed,
                                         
                                         those numbers are actually indicative of them doing a
                                         
                                         very good job. But as you said, in March of last year, Mark Zuckerberg announced plans to encrypt
                                         
                                         Facebook Messenger. Now, let me put some context around Facebook Messenger and just how commonly it's used to share images of child
                                         
                                         sexual abuse. In 2018, of the 18 million, a little more than 18 million reports made to the National
                                         
Center, nearly 12 million of those, about 65%, two out of three, were from Facebook Messenger,

right? In 2019, Facebook Messenger was responsible for even more: 72% of all reports made to the National Center. So, I mean, whenever I tell people these facts,
                                         
    
                                         the response is almost always, who are these idiots that are trading child sexual abuse on
                                         
                                         Facebook? I don't know the answer to that, but there's lots
                                         
                                         of them. Now, if Facebook encrypts Messenger, which again, Mark Zuckerberg has said they're
                                         
going to do, they will almost completely lose the ability to do any kind
                                         
                                         of automatic image detection, which is what everybody fundamentally relies on to do this.
                                         
                                         And while
                                         
                                         they will say that they're going to use other signals, the experts and people I've talked to
                                         
                                         anticipate that there will be nearly 100% decrease in reports from Messenger. You know, maybe they'll
                                         
    
                                         be able to use some of these other types of indicators, which I would actually be encouraging
                                         
                                         them to be using anyways. Maybe they are, but to find this,
                                         
and these are what they call signals, which are messages that are sent from one person to many people,
                                         
                                         or adults messaging children, things that are, again, I think they should hopefully be using
                                         
anyways. But consider the fact that they plan to encrypt Messenger. Jay Sullivan, product manager,

I'm sorry, product management director for messaging privacy at Facebook, said in prepared remarks in the fall of last year, you know,
                                         
                                         private messaging, ephemeral stories, and small groups are by far the fastest growing areas of
                                         
    
                                         online communication. And so by saying that, what he's saying is that this is what our users want.
                                         
                                         Our users want encrypted messaging.
                                         
                                         Our users want privacy.
                                         
                                         Our users want everybody to stay out of their living room, to use an analogy that they often
                                         
                                         use.
                                         
                                         But the truth is, and people are really terrified by this, that if they encrypt it, not only
                                         
                                         are they not going to be able to see CSAM, they're not going to be able to see all the other kinds of crime and grooming and sextortion and everything else that is occurring all the time on their platform.
                                         
                                         So obviously there's a serious conversation that has to be had around tech's role in this and the incentives and this cult of privacy and its consequences.
                                         
    
And we have, I mean, that's its own ongoing topic of conversation that we're
                                         
                                         certainly not going to exhaust it here. I guess the best use of our remaining time is just to
                                         
give as clear a picture as we can of the urgency and scope of this problem. Because, again, take that factoid

you gave me from New Jersey:

you have a law enforcement official in New Jersey
                                         
                                         who says, you know, we've got 9 million people
                                         
                                         in the state of New Jersey,
                                         
                                         and based upon our statistics,
                                         
    
                                         we could probably arrest 400,000 of them, right?
                                         
                                         These are 400,000 people who he imagines have looked at this material in
                                         
                                         some way online. Now, whether they saw it inadvertently, whether some of them are New
                                         
                                         York Times reporters doing their research, discount all of that, it appears that there's
                                         
                                         an extraordinary number of people who seek this material out because this is what gets them off, right? These are not research
                                         
                                         papers. And we have a culture. I mean, what do we know about the culture of pedophiles and
                                         
all of the elaborate machinations they have to take to not get caught producing this material,

trading in it, viewing it. First of all, how do predators get access to kids that they abuse in the making of these videos? I mean, yes,
                                         
    
                                         apparently there are truly evil parents and step-parents and caregivers, but how much of
                                         
                                         this is a story of abductions and other horrific details? What do we know about what's happening, you know, among the culture of people who produce and consume this content?
                                         
But let me just raise a few things that I think would be helpful to the conversation, especially with your audience. Because often when you come to the idea of encryption, it's this position of either yes encryption or no encryption, right? And I think the government is using our reporting and using the issue of child sexual abuse as kind of the new terrorism.
                                         
                                         That now, like a week after we put out our first report, Attorney General William Barr held an
                                         
                                         event at the Department of Justice. And the event was entirely about how encryption is enabling child sexual abuse
                                         
                                         and how they need a backdoor into encryption because of this.
                                         
                                         Now, what that event did not discuss at all
                                         
                                         were the multiple other failures of the federal government
                                         
    
                                         in dealing with this issue.
                                         
                                         So I do feel like there is some disingenuous behavior,
                                         
not only on their part, but also on the part of other people; a lot of this is becoming a weaponized topic around encryption.
                                         
Well, this is so grim, because if ever there were an attorney general who did not inspire confidence in his ethical and political intuitions and his commitment to protecting civil liberties,

it's William Barr. So yeah, I mean, it just makes me think that the answer has to come more from
                                         
                                         tech than from government, at least government at this moment, right? I mean, obviously government
                                         
                                         has to be involved because we're talking about crime in the end. But yeah, it's easy to see how fans of Edward Snowden and everyone else who wouldn't trust the current
                                         
                                         administration as far as they can be thrown will just say this is completely unworkable.
                                         
    
                                         You can't let these people in because they obviously can't be trusted to protect civil
                                         
liberties.

Right. And even Snowden has weighed in on this series specifically, saying he thought one

of the stories we wrote was particularly credulous toward law enforcement and toward this argument against
                                         
                                         it. But you're right. I do think there are things to be done. And we'll focus on Facebook solely
                                         
                                         because of this messenger example, right? Now, I think one of the most compelling things
                                         
                                         I've heard from people who are really willing to engage on this issue is maybe encryption should
                                         
not be deployed on all types of platforms. So for example, Facebook is a platform where
                                         
                                         children are at a distinct disadvantage to adults. Not only for all the
                                         
    
                                         reasons that children are always at a distinct disadvantage, they're younger, they haven't
                                         
                                         learned as much, they don't have as much life experience, but literally I found dozens, at least,
                                         
and that was far from an exhaustive search, of examples of adults going on Facebook, creating profiles that say they're 13 or 14, befriending other children,
                                         
                                         getting that child to send them an image of what is child sexual abuse, whether it's self-generated
                                         
                                         or not, coercing them into it, sometimes by sending that child other images of child sexual
                                         
                                         abuse that they've collected. And then as soon as the child sends them an image, they'll say,
                                         
                                         actually, I'm a 25-year-old guy, and if you don't send me more images, I'm going to post this image
                                         
                                         on your Facebook wall and tell your parents. So then you have a 12 or 13-year-old, I guess you're
                                         
    
                                         not supposed to be under 13 on Facebook, although we know those rules get bent and broken all the time. Now you have a 12-year-old, a 13-year-old saying,
                                         
                                         holy shit, I don't know what to do. I'm going to send them more photos. I'm so terrified if
                                         
                                         they tell my parents, if they show, if they post that on my wall. This happens all the time. It's
                                         
                                         called sextortion. It's one of the biggest issues coming up. So now you have this platform where
                                         
                                         adults and children
                                         
can interact, with the adults at a distinct advantage. Children are discoverable despite the
                                         
                                         fact that Facebook says you can't search for them in certain ways, which is true. There's still
                                         
                                         plenty of ways for an adult to find children on Facebook and message them and coerce them.
                                         
    
                                         And Facebook knows it's a huge problem. We're not starting from a place
                                         
                                         where they don't know it's a problem. They know it's a huge problem. Now, at the same time,
                                         
                                         Facebook has an encrypted messaging service, WhatsApp. And if we look at the number of reports
                                         
                                         to the National Center from WhatsApp versus Facebook, of course, it's not even close.
                                         
                                         WhatsApp isn't even a fraction of a percent of the reports that Facebook sends. But that said,
                                         
                                         
Here's one hypothesis, one possibility, not something I'm advocating necessarily, but an interesting thought. Facebook could direct people, say, look, Messenger is not
                                         
                                         encrypted. We're not encrypting it. These are not encrypted messages. Law enforcement has access to
                                         
    
                                         these messages. Shout it from the hills.
                                         
                                         Everybody knows. If you want to have an encrypted chat, we own a company. Go use WhatsApp. We'll
                                         
                                         kick you right over to WhatsApp. Use WhatsApp. Now, that would make it substantially harder
                                         
                                         to coerce children because at that point, what you have to do, in order to even have WhatsApp,
                                         
                                         you have to have a phone number. So the child has to have a phone, the child has to have WhatsApp
                                         
                                         on it. And WhatsApp, as opposed to Facebook, doesn't have the same sort of discoverability
                                         
                                         issue. You can't just go on WhatsApp and start finding children, right? Certainly not in the
                                         
                                         same way you can on Facebook. So maybe there should be
                                         
    
                                         more discussion around what types of platforms should be encrypted. What types of platforms
                                         
are children at a distinct disadvantage? Like, I believe privacy is a fundamental

human right. So I absolutely believe that there should be encrypted messaging. But

has this course of reporting shaken to my core my sense of how that should happen? Absolutely.
                                         
And has it caused me to say, like, wow, how do we have a technology such as encryption,

which by definition cannot have a back door,

and still protect children? And what I find to be counterproductive when I start having

these discussions, Sam, is that privacy absolutists, which is a term I use for them,
                                         
    
                                         who have been thinking about this issue for years, they will immediately chastise me. I mean,
                                         
                                         I tweeted out this idea probably in October after we had started thinking about it and developing
                                         
                                         it, my colleague and I, and I'm sure we're not the first people to think about this, but I said,
                                         
                                         shouldn't there be a discussion around what platform should be encrypted? And I was attacked.
                                         
                                         I was attacked by people who said, I've been thinking about this problem for 20, 30 years.
                                         
                                         I've analyzed it from every single point of view.
                                         
                                         I've run every scenario down
                                         
                                         and every single one ends in encrypt everything.
                                         
    
                                         Now, I don't know if that's the right answer.
                                         
                                         I don't know what the right answer is.
                                         
                                         But what I do know is that at this point in time,
                                         
                                         when the general public is just starting to become aware of
                                         
                                         privacy implications online, just starting to understand what it means to have private messages,
                                         
                                         what it means to have social media platforms, what it means to broadcast your life to everybody or
                                         
                                         choose not to, the worst thing you can do is come in and tell people they're idiots
                                         
for thinking about these things out loud. And so I would just like to offer that message to a
                                         
    
                                         community that I very much respect and enjoy communicating with. Like, again, I helped start
                                         
the New York Times tip line. We implemented SecureDrop
                                         
                                         technology, which is encrypted technology. You can send us messages on WhatsApp and Signal.
                                         
                                         You can get to us in untraceable ways. I very, very much understand the importance and value
                                         
                                         of encryption and private communications. But I do think there is room to discuss where those
                                         
are implemented, how those are implemented. Should those be implemented when you know there is a problem? Should those be implemented when you know children are at a distinct disadvantage? And if you've already run every scenario and you know the answer,
                                         
                                         well, help people get there in a way that's constructive
                                         
                                         because the other ways are going to drive people away from it.
                                         
    
                                         I should echo a few of those thoughts in that, you know,
                                         
                                         I am also very concerned about privacy
                                         
                                         and I would be the first to recoil from the prospect of someone like A.G. Barr having any more oversight of our lives than he currently has.
                                         
                                         So it's easy to see how an ability to pry into our private lives can be misused by an ideological government.
                                         
                                         And yet, we're talking about more than that. We're talking about the fact
                                         
                                         that at this moment, you have some sadistic lunatic mistreating children who, for whatever
                                         
                                         reason, he has access to, and shooting his exploits on an iPhone, uploading it to AWS,
                                         
posting some of this stuff on Tumblr or wherever else. And Apple built the phone, and Amazon runs the server, and Tumblr just got
                                         
    
acquired by Automattic. And you have people getting fantastically wealthy on this technology,

and this technology is what is enabling the lone maniac in his basement to accomplish all of this.
                                         
                                         If he just had a Polaroid camera, yes, he could walk that single photo in a brown paper
                                         
                                         bag to the parking lot of a shopping mall and trade it for $100 with some
                                         
                                         other nefarious stranger and risk getting caught that way. But these companies have built the tools
                                         
                                         to bring all of this to scale. So presumably people are making a fair amount of money trading
                                         
                                         this material now, and they're managing to essentially groom a very large
                                         
                                         audience of vulnerable people. I mean, you have to imagine that the audience for this
                                         
    
                                         is growing the way the audience for kind of weird adult porn is also apparently growing,
                                         
                                         because people are getting numbed, you know, to various stimuli, right?
                                         
                                         People have access to every possible image online, and certain people are vulnerable to just
                                         
                                         needing more and more extreme images to even find them to be salient, right? So I would imagine at
                                         
                                         the periphery, if we're talking about, you know, 400,000 people in New Jersey downloading this stuff, not every one
                                         
                                         of those people 30 years ago would have been trying to find who they could exchange Polaroids
                                         
                                         with in a parking lot.
                                         
                                         And so this is a kind of a cultural contamination, again, much of which is redounding to the
                                         
    
                                         bottom line of these companies that are getting fantastically wealthy for the use of their bandwidth for these purposes. So you can't be a privacy absolutist here.
                                         
                                         We have to figure out how to change the incentive structure around all this so that the companies
                                         
                                         themselves find some way to make this much harder to do and to make it much more likely that someone
                                         
will get caught for doing it. Yeah. And I know that I do want to speak about pedophiles and education and finish up there.
                                         
                                         Yeah.
                                         
But we haven't even discussed a few things, when you're talking about

what people could use their 10% or 20% time on, or just what these

extremely bright people who work at these companies could do, right?
                                         
    
                                         We have a part of one of our stories that's just
                                         
                                         almost too terrible to talk about, about live streaming. And this live streaming is going on
                                         
                                         on Zoom, where there's over a dozen men sitting around watching another man rape a child and
                                         
                                         cheering him on. And the only reason, the only reason I was able to report this case out is because a
                                         
                                         Canadian undercover officer happened to be sitting in the Zoom chat room because it was
                                         
                                         known that this was a problem on Zoom and recorded it.
                                         
                                         And the police the next day went and saved that child, right?
                                         
                                         So that's a wonderful story.
                                         
    
                                         But the fact is, when that case went to trial, which is kind of unbelievable that it went to trial, but it did go to trial, what the prosecutor said, the federal prosecutor, a man named Austin Barry, he said that the offenders know that live streams are harder to detect and that they leave no record and that, quote, that's why they go to Zoom. It's the Netflix of child pornography.
                                         
So there are things we haven't even discussed, like live streaming and new content, and classifiers

that need to be built for that. It's hard.

You know, like this is a hard, complicated, technical task.
                                         
And the implications, and what people were absolutely going to respond to, is the idea of

walking into a surveillance state, which I know you've had multiple conversations about. And that is

why we did the reporting on this subject. That is why we did it, because it brings these questions

to a head: how do we deal with this? Now, the answer to how we deal with this right now, as far as I'm concerned, is education. So I was in
                                         
    
                                         Amsterdam, I was actually in The Hague in the Netherlands, late last year doing some reporting
                                         
because this is an international problem. And some of the laws in the Netherlands

make it more common for this type of material to be stored on servers in that country.
                                         
                                         But that said, while I was there, I ran into some law enforcement, some international law
                                         
                                         enforcement, and I ran into a Finnish, basically, homeland security agent. And we were having a
                                         
                                         couple of drinks, and I was talking to him. I told him what I was there for. He didn't believe me for
                                         
                                         a while, thought I was a Russian spy. That was interesting. Once I had finally convinced him
                                         
that, no, I'm actually a reporter and I'm reporting on this subject, he told me that he was actually there
                                         
                                         for a conference on that subject, which I knew. I was there for the same reason.
                                         
                                         And he said he had two small children. And I said, all right, man, you know, so what do we do?
                                         
                                         Like, what is the answer to this? And he said, the only thing
                                         
                                         you can do is educate your children. Like you need to sit down with your children. You need
                                         
                                         to explain to them that they don't know who they're talking to online, that they cannot assume
                                         
                                         that that's another child, that they should not be sending images of themselves. They should not
                                         
                                         be live streaming images of themselves. And even more
                                         
    
                                         importantly, if they do, they should feel totally comfortable knowing that you will protect them
                                         
                                         and support them even if they've made that mistake. That as of right now, there is no tech solution to
                                         
                                         this problem. And honestly, it's not in the near future. We also didn't get the opportunity to talk very much about survivors.
                                         
                                         And that is, I mean, absolutely heartbreaking.
                                         
                                         I remember I spoke with a 17-year-old girl who had been raped by her father, who invited
                                         
                                         another man to rape her when she was seven years old, videotaped it and put it online.
                                         
The confusion this young woman feels, that her father would do something like that.
                                         
                                         She then lost her father.
                                         
    
                                         And not many people know this, but people who have been identified in these images and videos,
                                         
                                         every time their image or video is found on an offender's computer or cloud drive or whatever,
                                         
                                         they get a notice from the FBI.
                                         
                                         And this notice is to allow them a chance for financial compensation during the trial,
                                         
                                         which is often a few thousand dollars maybe.
                                         
                                         But they get these hundreds of notices a year.
                                         
                                         So every year or every day, their parents, or often a lawyer because their parents cannot
                                         
                                         bear it, get these notices saying that it's been found again.
                                         
    
                                         It's been found again.
                                         
It's been found again, it's been found again, it's been found again. So it's really important, as we talk about the technology companies and the efforts that need

to be made and everything, to remember that the first generation of children who have been sexually

abused this way now have to deal with that every day of their lives. Constant reminders. And this isn't a reminder that, like,
                                         
                                         one, you were once physically assaulted, like in a fistfight and that video is online,
                                         
                                         which I'm sure would be terrible. This is the knowledge that you being sexually assaulted as
                                         
a child is serving the sexual pleasure of other deviants online. It's just, I mean, I came out of that
                                         
                                         interview and I was devastated. And so it's very important we keep the survivors in mind because
                                         
    
it's not just the child that it's happening to when it's happening. It's, again, a whole

generation now of children who are growing up with this every day.
                                         
                                         I mean, they change their appearance because people try to find them later into the future.
                                         
                                         They can't speak out about it because it is such a stigmatized thing.
                                         
                                         It's just, it's an unbelievable pain.
                                         
And so, while law enforcement, and yes, we hope tech companies, have huge battles ahead, whether it be encryption, whether it be building classifiers that can detect new content, whether it be trying to figure out how to stop children and young adults from sending photos of themselves that can then be weaponized against them, all of those things, fundamentally, for the next few years at the very least, the
                                         
                                         onus is on the parents to do a better job educating their children, to realize that
                                         
                                         when their children are playing Minecraft or Fortnite, that there are adults on there
                                         
    
                                         trying to coerce them, that no matter what platform your child is on, the unfortunate
                                         
                                         truth is there are monsters on that platform
                                         
trying to do terrible things. And so while the tech companies, I really hope,

figure out how to deal with this, it is on the parents to educate themselves and their children
                                         
                                         on how to be aware and avoid these problems.
                                         
                                         Hmm. Yeah. Although that really only addresses a peripheral problem here.
                                         
                                         The sexploitation thing is a problem, I'll grant you,
                                         
                                         and obviously any parent should communicate with their kids around this.
                                         
    
                                         Don't send images of yourself.
                                         
                                         Realize that you could be talking to an adult.
                                         
                                         Obviously don't agree to meet people in the physical world
                                         
                                         based on these
                                         
                                         online contacts and all that. But apart from the problem of possible stranger abduction being
                                         
                                         facilitated by that kind of online communication, that doesn't really bring us to the bottom of this
                                         
                                         hellscape, which is the sort of thing you described happening on Zoom, right? Where you
                                         
                                         have an adult who, for whatever reason, has access to a child who is then raping that child to
                                         
    
                                         produce video for people who have happily gathered to consume it, right? So there's a culture of this.
                                         
                                         You're right. You're right. It doesn't address that, but I don't want to stray from the idea that A, a large part of the community feels
                                         
                                         that any of these types of videos and images, the more that circulate, whether they're coerced or
                                         
                                         self-produced, the more it drives people down that hole that we just discussed to more and more
                                         
                                         extreme content. And B, there are several examples, many examples of children,
                                         
                                         again, it's almost impossible for me to even imagine, but being 11 years old, having made
                                         
                                         the mistake of sending an image of my genitals to somebody thinking they were 12, and then having
                                         
                                         them say that they're going to tell everybody in my whole world that I've done this, and not only
                                         
    
that, show them, that has resulted in children

being coerced into further abuse. Often, it is very common for them to be made to bring in their siblings, I mean, that's the way it spreads,

and to actually sexually abuse their own siblings, which then leads to more extortion.
                                         
                                         So I completely agree with you. It does not solve the dark, dark, dark depraved things that we mentioned
                                         
quite a bit in our articles. But sexploitation and sextortion are the fastest-growing category of

child sexual abuse images online. So it is in no way a panacea, but it is one opportunity to help
                                         
                                         stem the problem. So take me back to the Zoom case. What was
                                         
                                         revealed about it? Were the people who were watching also arrested or was it just the
                                         
    
perpetrator? No, they were. So let me see, I have it in front of me here. So what happened was it was a
                                         
                                         man in Pennsylvania who was, it was not the first time this had happened, in fact,
                                         
                                         that he had, I believe it was his nephew, honestly, but it was a six-year-old boy.
                                         
                                         And there was, I think, about more than a dozen men.
                                         
                                         And these men were from all around the world, speaking to the point of what we're talking
                                         
                                         about, the technology.
                                         
                                         They were all around the world.
                                         
                                         Now, I think a dozen or so were in the United States, but all around the world.
                                         
    
And what was this? Was this monetized in some way? I mean, how does...

No, no. What has helped shut down other types of dark market crimes, like drugs and some of those things that are traded on the dark market, is that those are by Bitcoin transactions, and like, even those can be

traced to a certain extent. So there are certain types of things that go on, like Bitcoin mining,

where they leverage other people's computers to do it. They do sell some of this stuff online, but
                                         
                                         actually what we found,
                                         
                                         certainly on the open web
                                         
                                         and on the platforms we've been talking about,
                                         
                                         is a much greater predilection
                                         
    
                                         to just share it with one another,
                                         
                                         to share and stockpile.
                                         
                                         So they create these huge, huge stockpiles,
                                         
                                         often stored on their cloud storage.
                                         
                                         I mean, we found cases
                                         
                                         where there are millions of files
                                         
on their cloud storage.
                                         
But these people, I mean, it is truly horrendous. They are sitting around, almost always men.

They have a rule in these rooms. It's known that they have to have their webcam

on, because in their minds, a police officer would never sit there with their webcam on.
                                         
                                         So the rule is cams on.
                                         
                                         So they're all sitting there.
                                         
                                         They're rooting this man on while he rapes the boy.
                                         
                                         They're masturbating as they do it.
                                         
The detective, I think it's Detective Constable

Janelle Blacketer, who is with the Toronto Police Department,
                                         
    
                                         she was in the room.
                                         
                                         She recorded the stream.
                                         
                                         That night, she sent the file
                                         
                                         to Special Agent Austin Barrier,
                                         
                                         Homeland Security Investigations.
                                         
                                         They then subpoenaed Zoom,
                                         
                                         who was very compliant.
                                         
                                         I mean, when the companies learn about this,
                                         
    
                                         they almost always are very quick to react.
                                         
                                         Zoom sent him information.
                                         
                                         Turned out the man is in Pennsylvania.
                                         
                                         The man's name is William Byers Augusta.
                                         
                                         He's 20 years old.
                                         
                                         The next day, Homeland Security shows up,
                                         
                                         is able to identify the setting
                                         
                                         based on the video footage they've seen.
                                         
    
                                         They identified certain objects that were also in the room.
                                         
                                         Saved the six-year-old boy.
                                         
                                         14 men from multiple states have been arrested and sent to
                                         
                                         prison. And Mr. Augusta, he received a sentence of up to 90 years in prison.
                                         
                                         Okay. So it worked in that case, but...
                                         
                                         It didn't work because the tech companies caught it.
                                         
                                         Right.
                                         
                                         It worked because law enforcement caught it.
                                         
    
Exactly. And I mean, first of all, just to say something to the privacy absolutists here: imagine you could prevent this sort of thing from happening, or you could actually bring the people who are doing this sort of thing to justice. For me, it would be trivially easy to agree to that in the terms
                                         
of service. It's like, of course. I just don't understand how you get there. What is it that you're

doing in your life that you think absolute privacy under any conceivable scenario is important to you? What are you doing on Zoom
                                         
                                         that you can't imagine the government or the tech company ever being able to search it,
                                         
                                         even just algorithmically, to vet its content? It's a religion. It's a fetish of some kind of
                                         
absolute...
                                         
First of all, human beings have never had a right to this kind of privacy.
                                         
                                         I mean, there's no place in the real world where you've ever done anything or said anything that has given you an absolute right to privacy.
                                         
    
                                         It's physically impossible, right?
                                         
                                         There's no room in your house that could hold all your secrets
                                         
                                         and never be unlocked by a third party, no matter what you had done in the world, right? And yet
                                         
somehow in digital space, some of us have convinced ourselves that we need these rooms,
                                         
                                         right? And it's, again, for the purposes of this conversation, I've completely lost touch with the
                                         
                                         ethical intuitions that suggest that we need an absolute right to privacy.
                                         
And it's the reason we're doing this reporting: because, again, it was a shortcut to the

conversations that I think need to be had around privacy on the internet, around, you
                                         
                                         know, should companies be scanning people's photos? Should companies be
                                         
                                         scanning people's videos? Should they be detecting, doing natural language processing to detect
                                         
                                         grooming? Should they be doing all these things? Like, let's have these conversations. And as
                                         
                                         you're saying, should Zoom, at what point does your expectation of privacy go away? Like, so you're in
                                         
                                         a room with 16 other people around the world. Is there an
                                         
                                         expectation of privacy up until 30 people, up until 50 people? At what point? And again, these
                                         
                                         are just, I'm sure that people are going to attack me for even raising these questions, but they're
                                         
    
                                         honest questions about at what point do these things start to affect people in ways
                                         
                                         that are in fact detrimental, if that is the case, if that would happen.
                                         
But I think we need to move a little bit beyond that conversation. Yes, there's some harm in
                                         
                                         Facebook, let's say, giving our likes to Cambridge Analytica. But there's far, far greater harm, I think we'd all
                                         
                                         agree, in people being able to trade child sexual abuse material under the cloak of encryption.
                                         
                                         So let's have that conversation. One other question here, which trips a lot of ethical
                                         
                                         intuitions one way or the other. What do you think about the prospect of allowing entirely fictional production of similar material,
                                         
                                         you know, animated child pornography or the CGI version of it, such that it could answer to this
                                         
    
                                         apparent appetite in many people without being at all derived from the actual victimization of children.
                                         
                                         At the moment, I assume all of that material is just as illegal as anything else that is a real record of a crime.
                                         
                                         Is anyone arguing that if we could only produce this stuff fictionally, the real problem would be at least diminished?
                                         
                                         There are people arguing that.
                                         
                                         I'm not going to say the name of the company
                                         
                                         because I think that is very questionable.
                                         
It is illegal in the United States

even for any kind of drawings or depicted imagery, I believe.
                                         
    
                                         But I think this gets to a very interesting point.
                                         
                                         And I want to talk specifically about pedophiles.
                                         
                                         And so before we did this reporting, and even in our first story,
                                         
                                         as soon as we published, we got lambasted by several people
                                         
                                         saying that we had used the term pedophile inappropriately.
                                         
                                         So to speak specifically, pedophiles are people who are attracted,
                                         
                                         sexually attracted to children.
                                         
                                         There is a whole other group of people who look at child sexual abuse imagery.
                                         
    
                                         These are people who are not necessarily attracted to children. They are extremists. They are
                                         
                                         wandering down rabbit holes. They are addicts. And they are a huge part of the problem. But let's
                                         
                                         speak about pedophiles, because I do think there is, when I'm talking with child advocates and some
                                         
                                         of the other people, I say, in grappling with this problem, I think the same way that you're starting to grapple with it or have been grappling with it,
                                         
                                         which is, holy shit, what do we do? I think you have to think about attacking it from all angles,
                                         
                                         right? And that also means dealing with the people whose natural attraction is to children.
                                         
I do want to say, I mean, sympathy is probably the only word I have for it. There is a group of people, and I'm not going to get into citing any of the studies. As soon as you cite a study, you have a million people telling you why that study was garbage. But there is a group of people who, when they go through puberty, begin to realize that they remain attracted to children of a certain age.
                                         
And that is the very, very common report of a true pedophile: they turn 12, 13, 14, and as they continue to grow older, as they go through puberty,
                                         
    
                                         they realize something is wrong. They realize they are still attracted to children. So how do
                                         
                                         you deal with that? Now, first of all, according to some of these studies, you then have a few years, right, where this child, this young adult now knows that they have some sort of issue. And so that's an opportunity. That's an opportunity to intervene if we can find out a way to do that.
                                         
The second thing that I often think about, and this is a bit tangential to what you're saying,

I don't know. It's a good question, and I put the same question to these people: should there be imagery that would help satisfy this? I mean, imagine as soon as we
                                         
                                         get to virtual reality, the different types of implications there. I don't know. I think it's
                                         
worth talking to scientists, talking to people who study this, to see if that would stop them from offending. If it stops them from offending against children, that's a good thing. But I could see the argument, or perhaps a study might be done, saying it would drive them to actually want to do this in real life.
                                         
                                         I'm not really sure.
                                         
                                         But what I do think adds another layer of complexity, because it's very easy.
                                         
    
                                         What I just told you is that a bunch of these men who were arrested for watching another man assault somebody on Zoom, they
                                         
                                         received very lengthy sentences. I mean, they're getting sentences of 30, 40, 50 years for simply
                                         
                                         consuming and trading this material. And I don't mean simply to say that it's not serious. I just
                                         
                                         mean they're not actually doing the abuse. Now, I will get jumped on for that as well. Yes, they are re-victimizing
                                         
                                         the person in it, but simply to say they are not the person physically abusing the child.
                                         
And they're getting prison sentences of 30, 40, 50 years. Previously,

I helped start something called the Marshall Project, which is a criminal justice website.
                                         
                                         And we dealt a lot with this idea of rehabilitation, crime, punishment, rehabilitation.
                                         
    
                                         I do not know if a true pedophile, somebody who's truly attracted to children, is going to be any
                                         
                                         less attracted to a child or any less able to constrain themselves from doing this type of thing
                                         
                                         when they get out of prison 30 years later.
                                         
And in fact, the sentencing is all over the map, whether it's at the state or federal level.
                                         
                                         So some of our survivors who we spoke with, they had somebody who went to prison. Remember,
                                         
                                         they get these notices, went to prison because he had their imagery on his computer, got out,
                                         
went to prison again, and again, their imagery was found on it. So I don't know exactly how to honestly help people who have attractions to
                                         
                                         children. Because if it was you or I, and I think about the people I'm attracted to, there's nothing
                                         
    
                                         I do to be attracted to that person. And I don't
                                         
                                         think there's anything I could do to not be attracted to some of the people I'm attracted
                                         
to. I mean, this is an instinct, it's natural, it's whatever it is. And I do feel sympathy for
                                         
                                         people who, for whatever reason, are attracted to children. And I see that as an opportunity to somehow get
                                         
                                         in front of the issue at that point. And whether it's with animated, 3D models, virtual reality,
                                         
                                         whatever it might be to help them live as normal a life as possible, and with the number one
                                         
absolute goal of not harming a child, then I think those options should
                                         
                                         be explored. Yeah. Well, I think we, again, we have to differentiate pedophilia from
                                         
    
                                         some of the rest of what we're talking about. Because pedophilia is a very unhappy sexual
                                         
                                         orientation, essentially, right? It's one, the implications of which pitch you into something that's illegal, non-consensual, and, you know, therefore non-actionable if you're an ethical
                                         
                                         person, right? So you didn't pick who you're attracted to. As far as I know, the research
                                         
on this suggests that it's undoubtedly partially genetic, but I think it also has to

do with what happens to babies in utero. I think
                                         
                                         it's developmental. But obviously, we don't understand exactly how someone becomes a
                                         
                                         pedophile, but we should understand that they didn't make themselves. So they have this,
                                         
                                         they're profoundly unlucky on some level to find that their sexual attraction never matures to
                                         
    
                                         being attracted to adults who could consent to have sex with them.
                                         
                                         And yet that doesn't fully capture or even explain the picture we're seeing when you describe something like that Zoom atrocity,
                                         
                                         which is you have people who know that they're watching a child getting raped and they're happy
                                         
to do this. I mean, that's analogous to a heterosexual man,

you know, who's attracted to women, being invited to watch a woman being raped on camera,

you know, in a Zoom session. What sort of heterosexual man is a part of that
                                         
                                         project, right? That's the culture of unethical pedophiles that has to exist for this whole
                                         
                                         problem to exist. You have to know that what you're doing is facilitating, motivating, enabling the mistreatment of,
                                         
    
                                         and in many cases, torture is not the wrong word for it, torture of children.
                                         
                                         That's where we can be far more judgmental of what's happening here.
                                         
                                         I don't know if you have anything to say about that.
                                         
                                         No, just absolutely.
                                         
I don't mean in any way, by saying that I have sympathy for somebody who is born a pedophile...
                                         
                                         Yeah, I wasn't taking it that way. I share your sympathy.
                                         
                                         And I totally agree that even if somebody is born a pedophile, there is no room to trade or share or actually abuse a child.
                                         
                                         I am deeply sorry if that is your position, if you are a pedophile.
                                         
    
                                         I am sorry.
                                         
                                         I still feel extremely strongly there is absolutely no circumstance in which it is ever okay,
                                         
                                         whether you film it or not, to abuse a child.
                                         
                                         There is no consent.
                                         
                                         It's right where we started.
                                         
                                         There is no consent.
                                         
There is
                                         
                                         no opportunity for this child to agree. I mean, some of these, and whether they're pedophiles or
                                         
    
                                         just terrible people, not to say that those are the same thing, whether they're terrible people
                                         
                                         or not, some of them will bend over backwards to say that the children like it, that this is
                                         
                                         called loving a child, that these are things that if you could only see it,
                                         
                                         I mean, you wouldn't imagine the amount of times that some of these people told me,
                                         
                                         if you could only see it, you would see how much they enjoy it. To that, I say, you're
                                         
                                         doing terrible things and you need to be punished for them. And we need to figure out a system.
                                         
                                         Well, clearly, so the people who are saying that sort of thing, and that's why I have these questions around the culture of this,
                                         
                                         because anyone who's saying, listen, we pedophiles are a happy lot and we treat children well. And
                                         
    
if you go back to ancient Greece, this was a norm, right? Presumably Plato was doing this to the boys on the block and no one minded.
                                         
                                         So, you know, get over yourselves, 21st century people. Presumably even these people can't say
                                         
                                         with a straight face that, as you report in one of these articles, you know, an infant being anally
                                         
                                         raped is enjoying this, right? I mean, it's just like, there's no way. I mean,
                                         
                                         I put the question to you. I mean, are there pedophiles who are saying, who are acknowledging
                                         
                                         that part of this picture is every bit as horrific as we think it is? Then they're pointing to some
                                         
                                         other part of the picture that they consider benign, or are they not making those concessions? I mean, the one who I spoke with most extensively
                                         
                                         insists that the children enjoy it.
                                         
    
                                         And the only distinction I could start to get them to draw
                                         
                                         is prepubescent versus postpubescent.
                                         
                                         I mean, I said, okay, let's leave aside postpubescent,
                                         
even though it's still incredibly wrong to take advantage of any child. But let's leave that aside. Like, how can you say that these prepubescent children are consciously making the decision, understand the ramifications, and even further enjoy this activity?
                                         
And I mean, if there's such a thing as privacy absolutists,
                                         
                                         there are child sexual abuser absolutists.
                                         
                                         And actually, Sam, it's a big part of the culture.
                                         
                                         It's similar to many other internet cultures where they radicalize one another.
                                         
    
                                         That's what's going on in that Zoom room.
                                         
                                         That's what's going on in there.
                                         
They're radicalizing one another.
                                         
                                         They're trying to normalize their behavior.
                                         
                                         They're trying to share it amongst other people in order to make themselves feel like it's more normal.
                                         
                                         And when I was speaking with this person
                                         
                                         and he finally came to understand
                                         
                                         that there was no way in hell
                                         
    
                                         I was gonna look at any of this type of imagery
                                         
                                         and that all I was trying to do,
                                         
                                         honestly, all I was trying to do is find out more information about how he was managing to keep his
                                         
site up and running, and listening to his belief system unfortunately happened to come along with
                                         
                                         that bit of reporting. But there are people who fundamentally are telling themselves that this
                                         
                                         is an okay thing. Well, Gabe, we have gotten deep into the
                                         
                                         darkness together, and I just want to thank you for taking the time to educate me and our listeners
                                         
                                         and, again, anyone out there who has even a semblance of a privileged position with respect to
                                         
    
                                         working in tech, having a brother or sister who works in tech, please start putting
                                         
                                         your shoulder to the wheel here and figure out how to make this a prominent problem that will be
                                         
                                         emphatically solved at some point in the near future. Because clearly, if we don't have the
                                         
                                         technology that can solve it today, that's coming. And if we incentivize ourselves to
                                         
                                         produce it, we'll do so and we can get the policy right. But clearly what we have now is something
                                         
                                         bordering on a moral catastrophe. So again, Gabe, thank you for all your hard work on this.
                                         
                                         Thank you so much, Sam. I'm sincerely grateful for the opportunity to discuss it with you.
                                         
                                         Well, as I said at the top, this conversation was recorded a few months ago,
                                         
    
                                         and I've gone back and asked Gabriel if anything has happened in the meantime. The biggest update was actually the results of all the New York Times coverage Gabriel produced.
                                         
                                         Apparently there are two additional bills that have been introduced in Congress. The first was
                                         
                                         introduced by Lindsey Graham and Richard Blumenthal, and it's called the EARN IT Act. And if it passes in its current form, companies will lose their Section 230 protections when it comes to child pornography.
                                         
                                         The second bill was introduced by Ron Wyden, and it seeks five billion dollars in funding,
                                         
                                         which would be amazing for law enforcement and others who are on the front lines.
                                         
                                         And I believe that funding would be over 10 years. So this is a hopeful sign. Once again,
                                         
                                         thank you to Gabriel and his colleagues for doing so much work here. They certainly brought
                                         
    
                                         this problem to my attention, and now I've brought it to yours. Thanks for listening.
                                         
