Site-wide Ad

Premium site-wide advertising space

Monthly Rate: $1500

Podcast Page Sponsor Ad

Display ad placement on specific high-traffic podcast pages and episode pages

Monthly Rate: $50 - $5000

Modern Wisdom - #956 - Laila Mickelwait - How Pornhub Became The Internet’s Biggest Crime Scene

Episode Date: June 19, 2025

Laila Mickelwait is an anti-sex-trafficking activist, founder, and author. One of the most visited websites on the planet is more than just a site; it's a crime scene. As Pornhub rose to global dominance, a lack of regulation allowed thousands to be exploited against their will. Now, Laila Mickelwait is leading the charge to expose the truth, demand justice, and bring real accountability to an industry built on harm.

Expect to learn why Pornhub isn't just a porn site but a crime scene, the story of Pornhub across the years and where it all went wrong, the major significance of the #Traffickinghub hashtag, the most uncomfortable truth people ignore about online sexual abuse, why regulators decided not to act despite obvious red flags from the site, the fallout of trying to get Pornhub shut down, what changes need to occur in tech regulation to stop abuse from occurring, and much more…

Sponsors:
See discounts for all the products I use and recommend: https://chriswillx.com/deals
Sign up for a one-dollar-per-month trial period from Shopify at https://shopify.com/modernwisdom
Get the best bloodwork analysis in America at https://functionhealth.com/modernwisdom
Get a free sample pack of LMNT's most popular flavours with your first purchase at https://drinklmnt.com/modernwisdom

Extra Stuff:
Get my free reading list of 100 books to read before you die: https://chriswillx.com/books
Try my productivity energy drink Neutonic: https://neutonic.com/modernwisdom

Episodes You Might Enjoy:
#577 - David Goggins - This Is How To Master Your Life: https://tinyurl.com/43hv6y59
#712 - Dr Jordan Peterson - How To Destroy Your Negative Beliefs: https://tinyurl.com/2rtz7avf
#700 - Dr Andrew Huberman - The Secret Tools To Hack Your Brain: https://tinyurl.com/3ccn5vkp

Get In Touch:
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
YouTube: https://www.youtube.com/modernwisdompodcast
Email: https://chriswillx.com/contact

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 Pornhub is not a porn site, it's a crime scene. What's that mean? It means exactly what you just said. So what I discovered about five years ago was what millions of people already knew: how little it took to upload to the world's YouTube of porn. So this is user-generated porn, the biggest porn site in the world at the time. Actually, it was the fifth most trafficked website in the world at the time. I made this discovery that all it took to upload
Starting point is 00:00:28 to Pornhub was an email address, that anybody in under 10 minutes could upload to the site, and they were not verifying ID to make sure that these were not children, and they were not verifying consent to make sure that these were not rape or trafficking victims. And because of that, the site had actually become infested with videos of real sexual crime.
Starting point is 00:00:47 So we're talking about child abuse, child sexual abuse material we call it. This is child rape. It's also self-generated child sexual abuse material where children would be filming themselves and sharing it and then that would get uploaded to PornHub, which is completely illegal to be viewing and distributing that content, to adult rape, unconscious women, completely drunk, non-consenting, all the way to what we used to call revenge porn. This would be image-based sexual abuse, so all kinds of non-consensual content, and even copyright violations, where this is illegal content because it was stolen material that was being uploaded to the site. So that was the state of Pornhub.
Starting point is 00:01:32 And at the time, they had 6.9 million videos that were uploaded in 2019. And my fight to hold them accountable for these crimes started in 2020. And at that time, they had 56 million pieces of content uploaded to the site, and they had actually 170 million visitors per day, 62 billion visits per year, and enough content being uploaded that it would take 169 years to watch if you put those videos back to back. So that's how much content was being uploaded. And mind you, this is now anybody with an iPhone. So anybody anywhere in the world that had a camera could film a sex act and with no checks whatsoever,
Starting point is 00:02:19 using a VPN even to be even more anonymous, they could upload this content to Pornhub and it was infested with videos of crime. Why was this your job to find? There were 62 billion visits per year; one of those 62 billion visits could have spurred somebody else into action. Why was this you? I mean, that's just one of the things that really kind of amazes me even now, that this was something that was hiding in plain sight. So this was under everybody's noses. Like really anybody could have sounded the alarm on this.
Starting point is 00:02:59 And it's amazing that it took from 2007 to 2020 for it to get any attention, that this was actually going on. And, you know, I am actually honored to have the opportunity to shine a light on this and to, you know, be helping the many, many countless victims now who've had their trauma immortalized on the site. And so I don't know why it took this long for it to come to light, but there's this saying that I love: an idea whose time has come. And I think that that's actually true.
Starting point is 00:03:37 It was an idea. Traffickinghub, which is the movement that I started to hold Pornhub accountable, started with a hashtag on social media and grew and went viral. But I think Traffickinghub was an idea whose time had finally come. And enough was enough, and it was time to expose what was going on. What's the story of Pornhub across the years?
Starting point is 00:03:57 Huge company owned by an even bigger parent company. What's the arc of how they ended up where they are? Sure. Yeah, it's so interesting because you think about PornHub as a solo site often. But really what people don't know is that PornHub is owned by a parent company. Most people in the world have probably heard the name PornHub. They had spent millions and millions of dollars to become a household brand name for porn. They had done things like massive PR campaigns to save the bees and save the whales
Starting point is 00:04:29 and clean the oceans and even donate to breast cancer awareness. And they had this whole arm of Pornhub called Pornhub Cares, which was like this philanthropic arm. They're walking New York Fashion Week. They have faux commercials on Saturday Night Live. So everybody pretty much understood the name Pornhub. Now if you talk about the company that owned Pornhub, well, that's a different story.
Starting point is 00:04:52 So most people had never heard of the parent company of Pornhub, which was called MindGeek. Now MindGeek had essentially rolled up the global porn industry under one huge international, multi-billion dollar corporation, because with a $362 million loan, they had actually bought up most of the world's most popular porn sites and brands. So there was a hedge fund called Colbeck Capital that had 125 secret investors
Starting point is 00:05:22 that included JP Morgan Chase, that included Cornell University, and they had loaned these hundreds of millions of dollars to the company, which was Manwin at the time, so I'll tell you the history of that, to buy up the world's most popular porn sites and brands. So they actually owned everything from Playboy Digital
Starting point is 00:05:42 to Pornhub and all of its sister tube sites. So that would include RedTube, YouPorn, GayTube, ExtremeTube, XTube, PornMD. I mean, I could go on and on. Massive amounts of tube sites that operated all the same way. But MindGeek used to be a company called Manwin. And before Manwin, it was a company called Mansef. And there were, you know, these men in Montreal that started Mansef in 2007. They had purchased
Starting point is 00:06:13 the website Pornhub.com for about $2,500 at a Playboy Mansion party and they launched Pornhub. But it was a man named Fabian Thylmann, a German entrepreneur, that actually put Pornhub on the map. So he, you know, was a very shrewd businessman. And he had this idea that he wanted the world to be able to access free porn. And so he kind of took Pornhub from, you know, a somewhat popular site to the brand name that it is today. And then he actually got in trouble for tax evasion. So the company was sold because originally the owners at Mansef were in trouble for money
Starting point is 00:06:53 laundering so they had to suddenly sell the site and they sold it to Fabian. And then Fabian got in trouble for tax evasion and then he suddenly had to sell the site. And so he sold it to VPs at his company and now they're in trouble again. So criminally charged by the US government and now they're trying to sell the site again to a hastily concocted private equity firm. And so that's kind of the story of Pornhub where you have this history where they get in trouble with the law, they rebrand each time they name the company something else and sell it and sell it again.
Starting point is 00:07:27 And this time they were criminally charged for intentionally profiting from the sex trafficking of over 100 women in California. So now they've sold MindGeek and now they call it Aylo. So it's a new name, but it's the same company, it's the same website. And many of the same owners and executives have been involved from the very beginning and are still there in Montreal today running the site. It's kind of like a dirty penny that keeps on getting passed around. Or like a cursed penny or something, where, you know, the holder ends up getting in hot water in some way. And what's the story of how you found yourself embroiled in this?
Starting point is 00:08:08 I don't understand how somebody just sort of stumbles upon creating a movement that causes the biggest porn site in the world to end up being basically shut down. Yeah. So I had been in the fight for many, many years before the fight to hold Pornhub accountable began: the fight against sex trafficking. It's almost 20 years now that I've been involved in the fight against sex trafficking and child sexual exploitation. So it was in the context of that work that I happened to test the upload system for Pornhub
Starting point is 00:08:39 as I was investigating the site, because I was really concerned by some news stories that I had heard at the end of 2019. So I was paying attention to the headlines because this is my work, right? And so there was a very concerning story that I credit with sparking the launch of the Traffickinghub movement. And it was about a 15-year-old girl from Broward County, Florida, who was missing for an entire year, and she was finally found when her distraught mother was tipped off by a Pornhub user who recognized her daughter on the site.
Starting point is 00:09:13 And she was found in 58 videos being raped and trafficked under an account named DaddySlut. And he had impregnated the teen girl, and she was finally rescued from his apartment when surveillance footage from a 7-Eleven was matched up with the perpetrator's face in the Pornhub videos, and they actually found the girl and rescued her. And then at the same time, the London Sunday Times had done an investigation into Pornhub and they found dozens of illegal videos on the site within minutes, even of children as young as three years old. And at the time, they had called out Heinz and Unilever for advertising on Pornhub, shamed them for doing that.
Starting point is 00:09:51 They actually apologized. They took their ads off the site. At that time, PayPal cut ties with Pornhub. And it was just this really, you know, kind of shocking moment for anybody in the anti-trafficking and anti-child exploitation space to finally say, hey, what's actually going on here? We're hearing stories of children being abused on PornHub, very young children, in fact. And that is when I just couldn't get those stories out of my mind. I kept thinking about them and thinking about them. And at the time, you see PornHub in the headlines all the time, just getting so much press.
Starting point is 00:10:25 And I said, how in the world is this happening? And that's when, late one night, I was putting my own fussy baby back to sleep in the middle of the night on February 1st of 2020. And I was thinking about the story of that 15-year-old girl. And that's when I said, look, I'm just going to see what it takes to upload to Pornhub. And I just took a video of the rug in the dark room and the laptop keyboard and tried it for myself, and realized it only took 10 minutes. It took a few clicks, no ID, no consent form. And then I started really paying attention. So that's when I launched the
Starting point is 00:11:02 Traffickinghub hashtag on social media. I mean, I only had a few thousand followers at the time from all of my advocacy work that I was doing, and I shared the Traffickinghub hashtag. And the reason why Traffickinghub is what came to my mind is because anytime you monetize a sex act that involves a minor, so anyone under the age of 18 who's involved in a commercial sex act, or anybody that has been induced into a sex act by force, fraud
Starting point is 00:11:33 or coercion, so this is non-consensual, is a victim of trafficking when it's monetized. Now on Pornhub, it's free porn, but it's not free. These are heavily monetized porn videos and they're monetized mostly with ads. So they were selling 4.6 billion ad impressions on Pornhub every single day. And that's how they were monetizing these millions and millions of videos, including child sexual abuse,
Starting point is 00:12:00 rape, and all forms of non-consensual content. So that's why I said, hey, this is a trafficking hub. We have to hold this company accountable. And the hashtag just started to catch on. And then I said, okay, this has to go bigger than my tiny social media following. I'm going to write an op-ed about it. So I wrote an op-ed. I sent it to a few different outlets, and the Washington Examiner actually decided
Starting point is 00:12:22 to publish it. And then that kind of started to get a bit of virality. And so people were reading it, they were horrified as to what was going on. And then one of my followers just said, hey, you need to start a petition. And if you don't start one, I'll start it. So I said, okay, I'll do it. And so I started the Traffickinghub petition to shut down Pornhub
Starting point is 00:12:41 and hold its executives accountable for enabling trafficking, and that started to go viral. And today we have 2.3 million signatures on the petition from every country in the world. We've had 600 organizations involved. Hundreds of survivors have come forward. At one point, survivors were coming forward to me on a daily basis saying, I was exploited on Pornhub. My videos are still on Pornhub. I was a child. I can't get the videos down. Please help me. I'm unconscious in this video. All of these things, and we were able to
Starting point is 00:13:15 connect them with lawyers. And since then, thousands of media articles have been written on this, exposing the criminality of Pornhub. And one of the most important things that we did throughout this campaign to hold them accountable was go after the credit card companies. So we knew that the Achilles heel of Pornhub was the credit card companies, because, you know, without credit card companies, you don't have a very profitable online business. And actually, the former owner of Pornhub reached out to me in the midst of, you know, this viral campaign.
Starting point is 00:13:47 And he said, listen, if you really want to get Pornhub, you have to go after the credit card companies. And that's what we did. And eventually, with enough pressure through litigation, through lawsuits, through public pressure, through the articles that were being written about this, especially The Children of Pornhub in the New York Times, the credit card companies finally cut off Pornhub, and they were forced to delete 91% of the entire website. Why did the credit card companies cutting them off result in them deleting content? Did they hope that the credit card companies would reactivate payments? So they were hoping that would be the case. So they understood that their site was completely infested with crime. They didn't know what was consensual and what was not consensual. They didn't know who was 16 and who was 18.
Starting point is 00:14:33 They were just guessing. So, you know, one of the things that we understood was that their moderation system was a joke. So we had moderators who came forward, and they exposed the inner workings of how they were, you know, trying to vet the content, and basically, they weren't. What was the process? So it was 10 people in Cyprus. So imagine this.
Starting point is 00:14:54 So 10 people who were in charge of millions of videos. Now it wasn't just Pornhub. Remember I described MindGeek, right? They were in charge of vetting all of the porn tube sites. So again, YouPorn, RedTube, Tube8, XTube, all of them, 10 people per shift, and they were just clicking through. They were actually reprimanded if they didn't click through at least 700 videos per eight-hour shift, but some of the more experienced moderators were clicking through 2,000 videos per eight-hour shift
Starting point is 00:15:29 with the sound off. And they were told, essentially the moderator said, our job was not to make sure that illegal content wasn't getting on the site, our job was to just make sure that as much content could go through as possible. So think about that.
Starting point is 00:15:44 And so that's how the site was run; they actually had no idea how many of these videos were rough sex and how many of them were rape. So the only thing they could do... Am I right in saying that the two main issues here, one is consent and the other is age? Are those sort of the two big buckets?
Starting point is 00:16:05 Yes, both. Yeah, it's because age, again, like a pediatrician can't even guess on a consistent basis who's 16 and who's 18, right? I mean, they had very young children on the site. One of the stories that is probably one of the worst that I've heard is of a 12-year-old boy from Alabama who was drugged and overpowered and raped by a man named Rocky Shay Franklin. Franklin filmed 23 videos of the assaults and he uploaded those videos to Pornhub, and the police went after the site to take those videos
Starting point is 00:16:40 down when they found out and they were ignored multiple times for seven months. Those videos stayed up even though police were demanding that they come down, getting hundreds of thousands of views, monetized views, mind you, making money for the owners of Pornhub. But so there's 12 year olds on the site, three to six year olds on the site, but most of the victims that came forward were underage teens.
Starting point is 00:17:02 So they were young teens and teens who were under 18. And again, that's because, you know, like they were just obviously not vetting the videos at all, but even if they were looking at a video, there's no way they could tell who was 15 and who was 18. So yeah, so you're right in saying it's underage issues and then it's consent issues, right? Because also, how in the world could they tell if a video was non-consensually uploaded
Starting point is 00:17:29 but consensually recorded? There's literally no way they could tell. So the only choice they had at that point, when the credit card companies cut them off, in an attempt to woo back the credit card companies, was to say, we have to delete all of the unverified content off our site. And so today they have actually taken down 91% of Pornhub. So they took down over 50 million images and videos
Starting point is 00:17:57 in what the Financial Times has called probably the biggest takedown of content in internet history. And they still have to delete more. So they're going to delete more next month. So by June 30, they're going to have to take even more content off the site. And that's because the content that they left on the site was from verified uploaders. Okay, so listen, the verified uploader doesn't take care of the problem. So they had some videos on the site where they had actually verified who the uploader was. And Rocky Shay Franklin, who I just told you about, he was a verified uploader.
Starting point is 00:18:34 But that didn't mean he wasn't uploading victims in his videos. So they're going to have to take down a lot of the remaining content in the next month. So we'll see how much that ends up being. How much do you think will be left? I mean, it's hard to say. You know, as of September of 2024, they've been forced to start verifying the age and consent of the people who are in the videos, so the individuals in the videos
Starting point is 00:19:04 for the new content being uploaded. And that's because they've been sued. So they've been sued now by nearly 300 victims in 27 lawsuits, and that includes class actions on behalf of tens of thousands of child victims. These are certified class-action lawsuits. They have one in Alabama, one in California. I mean, they could have potentially billions of dollars in damages for what's happened to these victims. And as to the damages, sometimes people think of it and kind of minimize it as, oh, this is just online.
Starting point is 00:19:38 These are just, you kind of think of it as pixels on the screen and the actual victim is not humanized in the way that they really should be. But I think one of the things that we have to think about is the trauma that they face when these videos are uploaded online, because it's one thing to be raped or abused as a child. But then when that's recorded, and then it's distributed to the world, and it's distributed with a download button. So they had a download button on every single video on that site.
Starting point is 00:20:09 So anybody could then download onto their device the worst moment of that victim's life, and then re-upload it again and again and again, forever, so that they just have to engage in this sadistic game of whack-a-mole, where they're constantly in fear of who's going to upload their video to the internet now. And they call it the immortalization of their trauma. You know, one victim said, my abuser put me in a mental prison, but Pornhub gave me a life sentence.
Starting point is 00:20:41 And so the severity of this, when you think about the lawsuits and this going to trial and the facts being put before a jury, I mean, this could be massive, massive damages. In other news, Shopify powers 10% of all e-commerce companies in the US. They are the driving force behind Gymshark and Skims and Alo and Neutonic. And that is why I've partnered with them, because when it comes to converting browsers into buyers, they're best in class. Their checkout is 36% better on average compared with other leading commerce platforms, and with ShopPay, you can boost your conversions by up to 50%.
Starting point is 00:21:15 They've got award-winning support that's there to help you every step of the way. Look, you're not going into business to learn how to code or build a website or do backend inventory management. Shopify takes all of that off your hands and allows you to focus on the job that you came here to do, which is designing and selling an awesome product. You can upgrade your business and get the same checkout that we use at Neutonic with Shopify by going to the link in the description below and signing up for a $1 per month trial period, or by heading to shopify.com
Starting point is 00:21:43 slash modern wisdom, all lowercase. That's shopify.com slash modern wisdom to upgrade your selling today. It's, I mean, horrifying, but there's sort of two big buckets again of crimes that are happening: one being the actual incident, presuming that somebody isn't of age, didn't consent during the act, didn't consent to the recording; the other being the actual distribution on Pornhub's side. I get the sense that a lot of the ire and hatred and vitriol that's directed at Pornhub is also because Pornhub are a conduit for whoever did the crime too.
Starting point is 00:22:28 And a lot of the times we can't, who is this person? How do we find them? Where are they? Investigation and so on and so forth. Very difficult to do. I know sometimes people wear masks or purposely blur faces or, you know, do things that mean that you can't see who the potential perpetrator is.
Starting point is 00:22:46 So yeah, Pornhub are definitely going to feel an awful lot of wrath from everyone. Yeah. And I mean, to your point, there is, there's multiple levels of perpetration in this issue and what's happening. And for sure, you know, the person who actually did that abuse, who filmed it, they have to be held accountable 100%. I mean, we wanna see accountability across the board. When it comes to Pornhub,
Starting point is 00:23:14 I mean, the facts that have been uncovered in legal discovery, I mean, Nick Kristof of the New York Times, I mean, he wrote a scathing expose in 2020 called The Children of Pornhub that featured the story of one particular victim. Her name was Serena. I'll just share her story because it's an important story. She was a young teen, so she was 13 years old and she was from Bakersfield, California. An innocent teen, I mean,
Starting point is 00:23:38 she'd never even kissed a boy before. She was a straight-A student. She had a crush on a boy older than her and he coerced her and convinced her to send him some nude images and videos of herself, which she did. And she shared those with him, and then he shared them with classmates, and then they got uploaded to Pornhub, where they got millions and millions of views. And she would beg for those videos to come down, and she would be ignored. Because they only had one person. So we uncovered through the legal discovery process
Starting point is 00:24:07 that out of the 1,800 employees that they had working for Pornhub and MindGeek, they employed one person to review videos flagged by users as containing rape, child abuse or other terms of service violations. So they had one person, and they had a backlog of 706,000 flagged videos. So they also had a policy where they wouldn't even put a video in line for review unless it had over 15 flags. So a victim could actually flag their video 15 times and it would never even have been put in line for review.
Starting point is 00:24:42 So Serena would beg for them to come down. If she would get a hold of anybody, they would hassle her and say, prove that you're a victim, prove that you're underage in this video. And if she eventually got it down again, it would just get uploaded again. So this sent her on a spiral of despair. She ended up dropping out of school because she was being bullied. She got addicted to drugs to try to numb the pain. She ended up trying to kill herself multiple times. This is very common among victims of image-based sexual abuse. So the suicide ideation rate for these victims is about 50%.
Starting point is 00:25:17 And then she ended up homeless, living out of a car. 50%? Yes, 50% have suicidal ideation. So they think about it. So they think that ending their life might be better than enduring the pain of constantly having their trauma on the internet. And so that was the trajectory of Serena.
Starting point is 00:25:38 But if you think about the intentionality, so going back to like who's responsible in this situation, right? We have the individual perpetrator, but then the executives making the decisions, the intentional policy decisions, and we know they're intentional because we've uncovered email exchanges and messages and all of the communications and policies that they had put in place to enable this abuse to happen. So, I mean, even all the way from having a VPN,
Starting point is 00:26:08 where they offer a VPN to people. So they weren't just not checking ID and consent; they were allowing you to anonymously upload, but then you could also access the site with a VPN. So law enforcement need an IP address in order to locate a perpetrator. That's how you actually locate a device.
Starting point is 00:26:26 So if you use a VPN, well, then you're masking your location. But not only that, they were not reporting child sexual abuse that they were aware of to authorities for 13 years, until we finally held their feet to the fire and exposed them. So it's actually mandatory in Canada, where they have headquarters, to report: when you know about child sexual abuse
Starting point is 00:26:46 you have to report it to authorities, and they were not reporting. They were not reporting for 13 years, even though they were aware of children who were being abused on the site. And so then you think about that, it's like, how many perpetrators could have been apprehended, and how many children could have been saved from years of abuse, if they were actually reporting the videos to authorities like they should have been? But they were hiding it from the public. How damning are the internal documents? How does anyone get a hold of the emails of a company?
Starting point is 00:27:19 What's the story of getting behind the scenes? Yeah, so one of the amazing tools of civil litigation is being able to get a hold of your opponent's communications, exchanges, emails, text messages, all kinds of internal policy documents. So as a civil litigation progresses, they have this period of what they call discovery. Basically, they can compel the company, and they have compelled the company, to release
Starting point is 00:27:52 documents. It's hard for the lawyers; the company does not give this stuff up easily. They put up a fight, but these are amazing attorneys that are representing these victims, and they've been able to get this information, these messages. Now, an amazing thing happened a few weeks ago, and this was the basis of Nick Kristof's recent article. So I told you about The Children of Pornhub, but he actually just released a follow-up. And it was because the court in Alabama, for the child trafficking class action lawsuit, accidentally... So the court accidentally released thousands of pages of internal documents and communications
Starting point is 00:28:31 and messages and emails that were supposed to be sealed. So they had actually accidentally unsealed all of this information. So now we have, I mean, an amazing amount of information: depositions where they actually deposed, under oath, the managers, the employees, the executives, the owners. It's a crime if they tell a lie in these depositions. And, you know, one deposition is like a 500-page deposition. And with all of this put together, the question becomes, how in the world are Pornhub's executives not in prison?
Starting point is 00:29:11 And I honestly think after this release of this evidence, they will go to prison. I feel confident that we will see this company properly criminally prosecuted. Why aren't they in prison? Sounds to me like relatively open and shut case. There's already been investigations. This weird lily padding thing where something goes wrong and then it's rebranded over here and then we rebrand a little bit more, and there's a tiny exact change,
Starting point is 00:29:39 but most of the people that behind the scenes all stay the same. It doesn't really matter who's been switched in and switched out. Is it just taking a long time? I guess it's only been five years to do this. It's a big investigation. Is it, is it just the kind of slow lumbering behemoth that is legislation happening? What's going on?
Starting point is 00:29:57 I mean, there is this saying that's, you know, the wheels of justice turn slowly. Right. And I think that's true. I think the wheels of justice turn slowly, but they turn. And I think especially if we keep the pressure on, they turn. I think that when we focus on something, when we give it attention,
Starting point is 00:30:12 and when there's a public outcry about something, then like the squeaky, what's it? The squeaky wheel gets the oil or whatever the saying is. But yeah, if we can continue to put pressure on those in power to do their job, then I think that we will see it happen. And so I think that's a matter of time. There was a company called Backpage
Starting point is 00:30:34 and the fight to hold them accountable for child trafficking on that website. I think that was a 10 year fight. So we're at five years now from really starting to shine a light on this. And I really believe that if we can keep it up, that's public pressure coupled with civil litigation to continue to hit these companies where it hurts in their bank accounts, that we will see the outcome of justice really being served.
Starting point is 00:30:59 And why is that important? I think, you know, why is it important to hold Pornhub and its parent company accountable? You know, people might say, well, this is just one of so many different sites. This is just one website. That's true. But there is something that is real and it's called deterrence. And one of the most important things that we can do to prevent abuse is to deter future abusers because at the end of the day, this is a risk benefit calculation for what I call corporate traffickers.
Starting point is 00:31:29 And this is about money for them. And they're just saying, you know, is what's going to happen to me, the cost of doing business, or is it worse than that? Like, will I face real and serious consequences? And when they understand that they will face real and serious consequences, they when they understand that they will face real and serious consequences, they'll make different decisions.
Starting point is 00:31:48 They don't have to distribute illegal content on that site. They can, although it's expensive and it's not easy, they can make the decisions to prevent that, to put in those safety policies and they have to be forced to do it. And so what we're seeing right now is the power of deterrence. Like right now, Pornhub's biggest competitors are proactively seeing what's happening on Pornhub and they're actually taking down illegal content from their sites.
Starting point is 00:32:16 They're changing the way that the upload process works. Fear of being hit with the same kind of litigation. Right. Okay. Exactly. of being hit with the same kind of litigation. Right. Okay. Exactly.
Starting point is 00:32:24 I think, yeah, the obvious question is Pornhub and even Aloe X-Mine Geek aren't the only adult website in the world. So you shut this thing down and it goes elsewhere. And I know that you're pushing for Pornhub itself to be shut down entirely as opposed to just meeting the standards of moderation that you would be happy with. Is this because taking down Pornhub would be a very loud shot across the bow for everybody else and then presumably moving forward, you want what kind of moderation, why shut down, not
Starting point is 00:33:05 moderation exclusively on PornHub? And then what does a, what does a healthy porn moderation process look like? Yeah, those are great questions. From the beginning, the call to action has been to shut down PornHub. And I absolutely mean that. I didn't say it lightly when we started. And that is because the level of harm that has been done by this company to so many victims since 2007 with impunity, with intentionality, on purpose, for profit is absolutely unacceptable. And the only
Starting point is 00:33:38 just outcome is for the site to be shut down, for reparations to be paid to all victims, significant reparations, and for there to be criminal prosecution. And that's what justice served looks like. And justice is important because that's how victims can heal, when they see that what happened to them was recognized and it was paid for. And so that's important.
Starting point is 00:34:02 It's also important, like I said, to be a deterrent to future abusers. So they understand that there will be consequences if they act in the same way. And in that way, we're going to help other websites not act in the same way that Pornhub has. But again, it's not enough to hold one company accountable. And don't forget holding Pornhub accountable
Starting point is 00:34:23 is also holding probably most of the world's most popular tube sites accountable because they're all owned by the same company. But going forward, we need policy to make sure that this doesn't happen in the future. And that's why I am a strong advocate for age and consent verification policies, age and consent verification policies. Because the crux of the problem here was unfettered, unmonitored uploading on user generated sites, right? And so the solution is pretty simple. It's verifying the age ID and consent,
Starting point is 00:34:57 documented consent of every person in every video on every website that per terms of service allows user generated porn. And this can be done at scale. So we have the technology to be able to do this at scale. How do you, how do you do it? What's, what's the technology do? Yeah.
Starting point is 00:35:13 So there's, there's numerous companies that do this. One of them that PornHub is currently being forced to use is called YOTI. And what they do is they do a biometric scan coupled with verification of government issued ID in order to verify that the person in the video includes a liveness scan. And so there's a liveness scan. So it's like, yeah, so you move when you're doing the scan of your face. So you can make sure that you're not just putting somebody's picture up there. So I mean, there's different ways that this can be done, but the
Starting point is 00:35:48 technology is there and they can do this quickly, efficiently. Now it costs money, right? So the one who's going to pay for this is the porn companies who have to implement these third party checks. And I think third party is so essential because I would never ever want anybody to give their ID to Pornhub. I mean, they're actually facing a class action lawsuit for the exploitation of user data. So what have they done? What's the story behind the user data? Yeah. So apparently what was happening was that they were without consent obtaining and selling the user data of millions and millions of people who are visiting their sites to third parties without consent.
Starting point is 00:36:27 And so they're facing a class action lawsuit for that. In other news, this episode is brought to you by Function. Did you know that your annual physical only screens for around 20 biomarkers, which leaves a ton of gaps when it comes to understanding your health, which is why I partnered with Function. They run lab tests twice a year that monitor over 100 biomarkers. They even screen for 50 types of cancer at stage one. And then they've got a team of expert physicians that take the data, put it
Starting point is 00:36:52 into a simple dashboard and give you actionable recommendations to improve your health and lifespan. They track everything from your heart health to your hormone levels and your thyroid function. Getting your blood work drawn and analyzed like this would usually cost thousands, but with Function, it is only $499. And for the first thousand Modern Wisdom listeners, you get $100 off, making it only 399 bucks. So right now you can get the exact same blood panels that I get and save $100 by going to the link in the description below or
Starting point is 00:37:18 heading to functionhealth.com slash modern wisdom. That's functionhealth.com slash modern wisdom. They are fully fucked, aren't they? Like they are so fucked, dude. Holy shit. Like how many different ways? I don't know. Maybe, maybe it's the case that we will look back on Pornhub and think that they were kind of the first through the door, wild west frontier style porn company that just made all of the errors.
Starting point is 00:37:52 Right. It was, look, this was before we had the, the, uh, Lila Mikkelweight act of fucking 2028 or whatever. You know what I mean? Um, it was before we had the correct barriers in place. Technology had enabled this kind of user-generated porn uploading and it had done it at such a pace and no one had any idea what was going on
Starting point is 00:38:14 and everyone was making money and lots of people were enjoying free access to porn on the internet from mobile devices and their laptops. And then we realized just how sort of rotten the core of this was and then we realized just how sort of rotten the core of this was. Uh, and maybe we'll look back and go, wow, Pornhub and Aloe are a shining example of all of the different ways that you can get this stuff wrong online. But it is kind of impressive.
Starting point is 00:38:39 It actually genuinely is impressive to have one company that has accumulated they're like the neutron star of making errors with this stuff. Like how many, they're the LeBron James of getting, you know, like the goat of fucking up. Um, I mean, it's funny that you say that though, because if they had just been left alone, I mean, they were so popular. I mean, people were wearing their apparel proudly in public. Culturally. I mean, it's like, that's the power of brand. That just shows, obviously you need a
Starting point is 00:39:11 product that backs it up. But if you are, they're the apple of porn, right? They're the first mover advantage. You think mobile phone, you think apple, you think porn, you think Pornhub. Exactly. That's absolutely true. And the thing of it is this, is that I think a lot of people today still have no idea that this ever happened, that they're facing all of these consequences for the horrific actions that they have deliberately done. And we're talking about like the wild west of the internet
Starting point is 00:39:41 and this and that. And the thing that really really I think is important for people to understand about PornHub based on all of the evidence that we've uncovered, like I said, is the knowing intentionality. It's that these were decisions that were made where it's not like they were completely oblivious to the children that were being exploited on the site
Starting point is 00:40:03 or the rape victims. I mean, there's pages and pages of in these recently released documents, accidentally released, where there's just years and years where they had people filling out their contact form and saying, please take these videos down. I was unconscious in this video. I was raped in this video. I was a child this video I was a child in this video or like I this is my friend. She's 15 in that video She doesn't know that this was uploaded you take it down and these were this was for years that they knew about that So, you know if you knowingly distribute Whatever underage sexual material look at me trying to sound like I know in terms of
Starting point is 00:40:46 legislation and stuff. I've heard this sentence before, right? Not at me. I've seen other people have this sentence, a lot of shit, than before. Um, if you knowingly chair underage, something you get in trouble, right? Like it's, you're really, really fucked.
Starting point is 00:41:01 It's actually a crime. Yeah. Is there, is there a particular different type of carve out or was there a particular different type of carve out in the same way as whatever that article was that said we are not a curation site. We are a pipeline utility. That was a thing that all of the social media. Right, you're talking about section 230.
Starting point is 00:41:21 Yeah. So, yes. Section 230. Thank you. Yes. Section, yes. Section 230. Thank you. Yes. Section 230 of the CDA. So the communications decency act that essentially created a loophole for sites
Starting point is 00:41:33 that allow users, right, to upload things. And so they say that they're not responsible for things that other people upload it. Basically they consider themselves to be like a neutral party. They're just the telephone, they're the wires of the telephone and they're not responsible for what people are saying. That's not the case with Pornhub. So Pornhub has actually tried to argue Section 230. They tried to get all of their lawsuits dismissed.
Starting point is 00:42:05 I should have been on the legal team. I could have been a part of the legal team. I don't want to be a part of the legal team, but you know what I mean? They tried it. Yeah. No, they tried it and they've lost. And the reason why they've lost every single, except for one lawsuit that unfortunately it was brought by a victim herself.
Starting point is 00:42:22 She didn't have an attorney. Um, but all of the others, and again, and again, there's been dozens of these lawsuits. And in every case, the judge has said, absolutely not. You do not get dismissed based on Section 230. And that is because they were actually creating. So they were part of creating the content and curating the content and promoting it and duplicating it. So they were taking the content from Pornhub and they were actually uploading it to their sister sites.
Starting point is 00:42:51 They were also creating thumbnails of the content both legal and illegal. They were recommending and they were helping people reach illegal content by suggesting things like, you know, minor and childhood and whatnot in titles and tags. In fact, in those uncovered legal discovery documents, we have communications where they actually refused. They were recommended by employees to take certain words and keywords off the site, and they actually refused minor, childhood, wasted, things like that where they were actually intentionally and they were tracking to the dollar how much they were making on these categories that included illegal content like teen, which was one of the most popular categories on Pornhub
Starting point is 00:43:39 and the most profitable of all the categories and so they didn't want to take down any of that. profitable of all the categories. And so they didn't want to take down any of that. Um, so because of that, they have lost section 230 protection. Right. What was some of the most or more surprising ally ships that you made when going through this, I have to assume, I mean, like Bill Ackman got involved, but I have to assume that there were other, even more I mean, like Bill Ackman got involved, but I have to assume that there were other, even more left field people than Bill Ackman.
Starting point is 00:44:09 Yeah. I mean, probably the most surprising to me was when the former owner of Pornhub came forward to help, you know, in the summer of 2020. Question on that, got to interject on that. Yeah. How much is that someone begging like, oh, please, like the lady doth protest too much. I must be the white knight that can come in and save you. It's the dude. Yeah. So, yes.
Starting point is 00:44:36 Covering your own tracks here. It was, there was definitely a self-interest in that that I discovered later on. I didn't really care what his motivation was in coming forward. Just be effective. What I wanted was information. And so one of the things that he actually disclosed was the hidden shareholder. So there was a hidden majority shareholder for years that nobody knew who he was. And the former owner, Fabian, told me who it was. So he just, I asked him who's the secret shareholder of Pornhub now. And he told me who it was. So he just, I asked him, who's the secret shareholder of PornHub now?
Starting point is 00:45:08 And he told me, his name was Bernd Bergmeyer. And he was a businessman that grew up in Austria and then lived part-time in Hong Kong and part-time in London. But he was the majority shareholder that was hiding his identity forever. And so he was able to be found and exposed. And today he's being sued personally by dozens of victims. But then again, also he said, listen, go after the credit card companies
Starting point is 00:45:31 because they're the Achilles heel of Pornhub. And that's exactly what we did. Other than that, obviously Bill Ackerman, he was a surprising ally. He had read the Children of Pornhub, New York Times article, and he was incensed because he said his daughters of his own. And so he wanted to get involved and help. And so not only did he start tweeting about this, but he also reached out to the CEO of MasterCard because he knew him from the tennis circuit. And so he actually stepped in to help
Starting point is 00:46:00 convince the credit card companies to cut ties with Pornhub. And actually they did at the end of 2020. But what we found out was that two weeks later, they quietly snuck back. So they actually snuck back to the advertising arm of Pornhub. And it was another two year fight to finally get them to cut all ties with Pornhub. And Bill Ackman played an important role that whole time. He helped put the pressure on. There was a lawsuit against Visa. So this was key to this whole story.
Starting point is 00:46:31 So Serena, who I told you about, the victim, she not only sued Pornhub, she also sued Visa. And Visa actually lost their motion to dismiss their case. And it was the pressure of Visa losing their motion to dismiss their case, coupled with Bill Ackman helping myself and Serena's lawyer get on CNBC Squawk Box with Andrew Sorkin, which was this, you know, in America it's a very popular, you know, morning financial show. And for 17 minutes, we were able to call out the CEO of Visa, Al Kelly, saying, What in the
Starting point is 00:47:06 world are you doing? Why are you monetizing child sexual abuse? And it was just enough pressure that finally, he actually made a personal statement. And he said, I'm a father, you know, we're going to withdraw our services from Pornhub. And so that's what happened. But other than him and Fabian, I think probably the porn stars and the porn performers were just very helpful allies that maybe people wouldn't normally think would be part of this. But they were essential in the fight against PornHub. I mean, there was allies that were porn performers and porn producers. And they came to me and we would talk for hours. And one of the things that they would share is their own struggle with trying
Starting point is 00:47:50 to spend hours on Pornhub every day, trying to scour their Pornhub and the Tube site. So the other sister sites, trying to take down their own stolen content. And when they were going through Pornhub, they actually were finding so much illegal content and they were sending it to me. And so then we documented. Yeah. I suppose they are, um, they have an incentive to, to get content that they didn't consent to being uploaded from a monetization copyright standpoint.
Starting point is 00:48:27 But you're observing professionals, adults, right? That maybe even have a management company behind them and have preferential access to Pornhub's moderation team and stuff like that. They are struggling to keep on top of this game of whack-a-mole. So if you are a 19-year-old ex old ex 15 year old girl, trying to chase down your thing, trying to keep it from your parents. You don't want mom to find out. You don't want dad to find out.
Starting point is 00:48:53 You're already ashamed. You're doing this on your own. You're spending all of this time. You're in your room. You're worrying. You're, you know, yeah, it's insane. It is absolutely insane. And you're absolutely right.
Starting point is 00:49:04 They're, I mean, imagine the professionals who do this for a living struggling and frustrated at the fact that every day they're having to scour these sites. And so, you know, the adult industry actually hated Pornhub and the porn tube sites for what they did by allowing just unregulated free porn to just flood the internet, which included so much of their copyrighted material. So they actually want age and consent verification when they're talking about kind of the professional porn industry. And they've abided by that for many, many years.
Starting point is 00:49:40 Like it has been the standard in studio produced, you call it brick and mortar, it's corn valley, right? Like, many years. It has been the standard in studio produced, you call it brick and mortar, porn valley, right? Good old brick and mortar porn. The brick and mortar porn, like porn valley porn, where they've actually had to abide by this law called USC 2257. And basically in the United States, and they abide by this internationally because they want to not be in trouble for distributing it in the US. And the DOJ can actually inspect a porn company's records and they are mandated to make sure that they had the ID of everybody who's in a scene
Starting point is 00:50:16 to make sure that they're of age. And this was enacted in 1988. And the DOJ can inspect those records. And if you don't have those records, it's actually a criminal offense and you can be criminally charged for not having that and they've Accepted that and actually have for the most part abided by that It's just with the advent of the internet and the free user generated
Starting point is 00:50:41 porn site model that things have not been abided by with regard to the law. Now thinking about criminality, one of the things that's illegal is to distribute content under USC 2257. So I mentioned the download button. So porn have had a download button on every single video. They were distributing that content from their servers onto the devices of people around the world, including illegal content, and they were not checking. So they could also be held criminally liable for millions of violations of USC 2257.
Starting point is 00:51:17 That's something to note as well. Have you ever got to sit down with the people behind Pornhub? Have you ever been in a room face to face with them? I have not. No. No. How do you imagine that would go? Well, I imagine it would never happen because one of the things that they did when I started
Starting point is 00:51:34 this campaign to hold PornHub accountable, one of the things they did was engage in attacks, smears. Like they just tried to discredit the work that we were doing. They tried to discredit the trafficking hub movement, whatever ways that they could, they were doing that. I mean, they've done, they've, they've what we call, uh, dirty tricks. Like they've engaged in dirty tricks to try to silence instead of address it. They wanted to silence it because they knew it would be expensive.
Starting point is 00:52:02 They actually made the changes that were necessary to stop the illegal content from being uploaded. So no, they didn't want to engage. They wanted to silence. And they've done some pretty horrible things, not only to me, but, you know, victims have faced some real hardships as well for speaking out. This episode is brought to you by Element. Summer hits different when you're actually hydrated.
Starting point is 00:52:26 That tired, foggy, muscle crampy feeling most people chalk up to heat or exhaustion. It might be electrolytes and plain water alone won't fix it, which is why Element is such a game changer. It's a zero sugar electrolyte drink mixed with sodium, potassium and magnesium in the exact ratio your body needs. No sugar, no food dyes, no artificial junk. And this summer they've dropped something new, lemonade salt. It's like a lemon popsicle grew up,
Starting point is 00:52:49 got its life together and started supporting your adrenal system and your hydration. And best of all, there's a no questions asked refund policy. So you can buy it, try it for as long as you want. And if you don't like it, they'll give you your money back and you don't even need to return the box. That's how confident they are that you'll love it.
Starting point is 00:53:03 Plus they offer free shipping in the U S right now. You can get a free sample pack of all eight flavors with your first box by going to the link in the description below or heading to drink lmnt.com slash modern wisdom that's drink lmnt.com slash modern wisdom. What has, has there ever been direct Pornhub response to your work? Have you, has there ever been, have they interacted with that stuff directly? I mean, one of the things that they, are you talking about like their responses in the media or?
Starting point is 00:53:34 Everything. I mean, have they, have they tailed you with private investigators? Have they tried to counter Sue for you accessing stuff? Well, they've never been, they've never tried to countersue because here's the thing, if they were subject to legal discovery, I mean, they know that they're going to be in just hot water. And the problem with, you know, if they were to engage in a defamation lawsuit, right, if from the very beginning, you know, they could have done that.
Starting point is 00:54:01 But the problem is when you're telling the truth, I mean, that's the ultimate defense. And so, I mean, an absolutely 100% everything that I've been saying, and it's not just me, I mean, really, this has been a movement of so many people, hundreds of organizations, hundreds of survivors, attorneys and lawmakers and law enforcement and lawyers and businessmen and so many people coming together that no, they have not done that. But yes, I've had faced, and some of this cannot be tied directly to the company. Some of it can, and some of it has been,
Starting point is 00:54:41 but yeah, there's been a lot of backlash from doxing, hacking, you know, online smear campaigns, media smear campaigns, letters being sent to my house with my children's names, middle names saying, we're watching you, you're going to get somebody killed, you know, even things like getting reported for child sexual abuse material distribution myself. So they, you know, people who we know are directly tied to Pornhub put in fake police reports about me to actually get me investigated, but it didn't go well for them because obviously,
Starting point is 00:55:20 you know, when they looked, they didn't find anything. But besides that, they did, they heard a lot about Pornhub. And so the police that were investigating me ended up becoming allies and saying, Hey, we're on the same page, we're on the same team. And how can we help you? Um, so that didn't go well for them. I have to imagine that the valuation arc of Pornhub looks like the saddest investing could have sold at the top opportunity of all time?
Starting point is 00:55:49 Well, we know some numbers now. So it was a multi-billion dollar corporation and just a few weeks ago, some information was released from court documents, uh, where we understand that the site was sold and I put sold in quotation marks because it wasn't actually sold, it was just sold on paper, but it wasn't paid for. But the sale price was $400 million. So it has lost a significant amount of value as a company for sure. Okay.
Starting point is 00:56:22 I guess the question, Pornhub is the biggest, largest mold that needs to be whacked. It's a shot across the bow of people who are maybe going to do something similar. It hopefully will be a massive deterrent. Presumably there needs to be some changes in tech and or regulation to make this more scalable, so they're like scalable protection. I'm aware that what, you know, the purest approach would be this is on the tube sites, they just need to be very strict with their moderation and so on and so forth.
Starting point is 00:57:03 But we need to be realistic and kind of enable that, enable moderation to be very strict with their moderation and so on and so forth. But we need to be realistic and kind of enable that, uh, enable moderation to be made as easy as possible from a tech side and then increase the level of deterrence from a regulation side. It seems like kind of those are two important routes to go down. So what's the, what does the future look like with that? Yeah. Yeah. So that is such an important question.
Starting point is 00:57:27 And we have to think that way because we have to, at the end of the day, we need to make the internet a safer place. And yes, we need to hold these porn sites accountable. And like you said, the justice and the deterrence, but how do we at scale help prevent this across the many different user generated porn websites and other sites that may not be porn websites, but per terms of service, they allow and they distribute user generated porn.
Starting point is 00:57:52 And that is mandatory third party agent consent verification for every person in every video. But the scale solution, the at scale solution isn't just that governments implement this policy because these are international corporations, right? Every website is pretty much operating in every place, every country in the world. So if we have that policy in the US, well, we have to implement it in Canada and then
Starting point is 00:58:15 we'll have to implement it in everywhere, which we should do. But I think that the at scale solution is to have the financial institutions. So Visa, MasterCard, Discover, PayPal, say we don't do business with user generated porn sites that don't verify the agent consent of every individual in every video. And just like they have anti-money laundering policy, they can have anti-online exploitation policy. And we know that these websites are highly motivated by credit card company demands. I mean, that's exactly why PornHub took down 91% of the entire website
Starting point is 00:58:52 was because of the credit card companies. So we know the power that they have. And when they enact that policy, it's instant and it's global. And I think that's going to be the most effective way to get all of these websites into compliance to start verifying agent consent. Yeah, in Texas, where I am at the moment, there's all manner of age verification stuff
Starting point is 00:59:18 being debated right now. It seems like that's even in the news, sort of at the moment, there's stuff bouncing back and forth. What is, you know, I have to assume restricting access as well. We haven't even talked about that. We literally haven't talked yet about, and what about exposing porn to people who are underage? That's like an entire other world too.
Starting point is 00:59:40 That's the other world. And that's the debate right now in Texas that's going on. And there's this movement across the United States and in other countries. So we're seeing it in Europe, we're seeing, you know, in the UK and Canada, where countries are understanding legislators and the population is understanding the harm that is being done to children through unfettered free access to these tube sites, to these porn sites, and just to porn online. And so yeah, there's age verification laws that have been enacted.
Starting point is 01:00:11 And right now in Texas, so they enacted mandatory age verification for users. So that's, you know, people who go to that site, they're going to have to verify they're an adult to get onto the site. And PornHub obviously does not want this to happen. So they're shutting themselves down in states across the US that are implementing age verification for users in protest of this policy. And why? It's because it's expensive for them.
Starting point is 01:00:39 They have to pay to get every user verified, but that's the cost of running a porn site and making sure that children aren't all over your site, both behind the screen and in front of the screen, because this is a form of secondhand sexual abuse for a child to have to access and witness what's happening on these sites. Like I said, so much of it is legal, so much of it's illegal, some of it's illegal on the home pages of these sites, and some of it's legal, but it's pretend illegal. So there was a study published in the British Journal of Criminology in 2021, and they
Starting point is 01:01:16 looked at 150,000, I think was the number of videos on the home pages of the most popular porn tube sites, Xvideos, Xhamster, Pornhub, these free porn sites, and they were analyzing what's showing up to just anybody who may accidentally or intentionally land on the home page, and they were finding that one in eight of the videos was depicting sexual violence. So, you know, some of this may be pretend,
Starting point is 01:01:41 like, you know, what I said, there's no way to tell what's rough sex and what's rape. There's no way to tell who's 15 and who's 18. But so much of this was also the teen content that children are witnessing as their sex education from as young as eight or 10 years old. I mean, I get messages all the time, especially from men who say, I was addicted to the free porn tube sites when I was, you know, even six years old, eight, 10, and they've been addicted ever since.
Starting point is 01:02:13 And it's shaping their sexual template, right? This is where they're saying, what's normal? What is sex supposed to look like? What is it supposed to be like? And they're seeing so much of, you know, things that I wouldn't wish on my worst enemy to have to witness, that I've seen on these sites. I have to imagine that this is going to get even more complex as AI renderings of porn spread. What about non-consensual AI porn?
Because there hasn't been anybody's consent that has been crossed, but there is, like, an ethical question of whether this potentially increases real-world harm by changing people's expectations. There's something where just the virtue of a person being represented in this way is also something that should be protected. The ability to take photos of people and then recreate videos that aren't them but are like them? Do you own your own likeness when it comes to this sort of protection? So I mean, it is like the real front lines
Starting point is 01:03:32 of this at the moment. Absolutely. And there was just a law that was passed in the US called the Take It Down Act, where now it's federally a crime to upload even AI generated non-consensual content. So this would be like deep fakes where people could have their face superimposed onto porn and it looks realistic and it's being distributed.
Starting point is 01:03:56 So that's illegal. But also, what's important too that parents might not even realize when it comes to AI-generated content like this, is that now there's the ability for predators, abusers, anybody to take an image even of a child. So if you have an open social media account and you're posting pictures of your children, they could take that image of the child's face and put that, you know, into an AI-generated child sexual abuse material video and make it look like it's actual abuse of that child. And that's actually happening. Now, in certain countries, even the depiction of child sexual abuse is illegal.
Starting point is 01:04:42 So in Australia, in the UK, in Canada, even if it is somebody over the age of 18 that is being used in a video and they look like they're a child, that's illegal. In the US, you know, and we're not talking about AI right now, but the depiction of a child by a person that is over the age of 18 is not
Starting point is 01:05:05 illegal. That was made legal in a case called Ashcroft versus the Free Speech Coalition, unfortunately. But in other countries, that's illegal. But yeah, there's this whole frontier of what's going to happen with all of this AI-generated content. And I actually think that at least on websites that distribute user-generated content, by having age and consent verification policies in place, you can actually prevent even the AI-generated content from being distributed.
Starting point is 01:05:35 Because that would have to pass the same verification. Yeah, so if you have somebody's face superimposed on a deepfake, how are you going to get the ID and actually have a verification of that person, to verify their government-issued ID and consent? It would stop that from being uploaded. That's an interesting single solution to multiple problems. Yeah.
Starting point is 01:06:01 How do you think that sort of user-monetized platforms, OnlyFans, Admini.VIP, stuff like that, how do you think of that as contributing to this ecosystem at the moment? There's a lot of moral panic around the normalization of regular people becoming sex workers online and, you know, all of the objectification that comes along with that. Do you think about, have you got concerns around that?
Starting point is 01:06:31 How do you think about that sort of working into this world? Yeah, well, I know that we have seen some really concerning reports from the BBC, multiple reports from the BBC and from other news outlets that have been investigating the subscription sites. And so some of those are like the OnlyFans model, where they've actually had children and victims who have been abused, and even had that content behind the paywall on the subscription sites. And even victims that I have had come forward to me
Starting point is 01:07:01 that have been abused on Pornhub or the tube sites, some of them have also been abused on OnlyFans. I think that they've tightened up a lot of their regulation now. Again, the power of deterrence, right? Starting in 2020, when they saw what was happening with Pornhub, I definitely know that there was a change in policies in the way that they were checking who's in those videos, but by no means does it seem like it's perfect. And I think minors are being abused on the subscription sites for sure. I know that that's true. And so that, I mean, it's just a real
Starting point is 01:07:38 concern that a lot of this, again, is self-generated, where it's not that a child is out there getting raped and having an abuser post their content. It's now become very normalized for children to be sharing nude images, not realizing the harm that that could do to them, the way that the internet is forever. And there was a study done by Thorn. So Thorn is a big child protection organization in the United States.
Starting point is 01:08:09 They focus on CSAM online, child sexual abuse material, and they surveyed over a thousand children. And they found that one in seven, nine to 12 year olds said that they had shared a nude image or video of themselves with somebody else. One in seven. One in seven, nine to 12 year olds had shared a nude image or video of themselves with somebody else. Holy fuck.
Starting point is 01:08:35 I am so glad that I had a Nokia 3410 when I was 14 years old. Like, you know what I mean? Yeah, I think the same thing. I do. I mean, it's so hard to be a child these days. Perilous minefield of bullshit. That's a question I had. Do you know who Jeffrey Katzenberg is? He's the guy that founded DreamWorks with Steven Spielberg. He did Aladdin. He did The Lion King. He is now pushing this thing called Aura, A-U-R-A, and it is, I mean, to be honest, it's pretty mind-blowing what it can do.
Starting point is 01:09:08 It's a security app, I guess, but it allows parents, you just install it on your child's phone, and it uses sentiment analysis to work out whether kids are accessing or messaging stuff that isn't good. If they receive adult images or send adult images or take photos of adult images, it sort of pings the parent immediately. So it doesn't restrict the use of the phone all that much, and it's not an overbearing level of supervision, but it allows the parent to sort of be notified about what's going on.
Starting point is 01:09:41 It can do stuff. It can even work out the mood of the kids based on the geolocation of where they've been. So it'll say when you go to football for an hour, you type less hard, you hit the screen less aggressively and less aggressive screen hitting has been associated with lower cortisol, which means that you're typically in a better mood. On the nights when your child doesn't use their phone for half an hour before they go to bed, they stay in bed, i.e. they don't use their phone for a bigger window,
Starting point is 01:10:07 which we can correlate with better quality sleep. It turns a phone into a wearable, like a biometrically informed wearable device. The whole thing, I think pretty much everything's done locally, so it's not like they're sending this up to the cloud, security-wise. So I just think, you know, with these kinds of things, I know there's always this arms race of tech versus tech. Um, but that was the first time that I sat down, and he had Hari, his co-founder at this company.
Starting point is 01:10:38 And they just kept on telling me more and more of this, oh yeah, we can work out, you know, how hard they hit the screen is an indication of their level of autonomic arousal and whether they're stressed. So there are some cool- Amazing. Yeah. I mean, the capability of technology to help solve the problems that technology creates is amazing.
Starting point is 01:11:03 And I mean, there are even some apps and different programs now for children's devices where they can prevent even the filming. So the camera itself could detect whether it's filming a nude image or video and actually stop it from ever being filmed in the first place if it's a child's phone, that kind of thing. I mean, that level of prevention at that level. But yeah, I mean, the ways that we can implement technology to help is amazing. But the harm that children face, the danger that children face online, and young people and, you know, even adults, right, is at incredible levels. And I think the more that we can talk about it and at least have young people understand the consequences of distributing content online, the better. And yeah, it's nude images and videos,
Starting point is 01:11:58 but it's also things that we say online that we take so lightly, sometimes things we say or things we distribute online, not realizing the internet can be forever, and that maybe you'll regret saying that 10 years from now when your employer goes and there's a screenshot of you saying whatever. But just to have more of a sense of, I don't know, I guess a gravity about the way that things do get immortalized online, and the consequence of that, especially when it's a nude image or video, because sometimes there's so many people that think, you know, it's fun right now and I'm
Starting point is 01:12:36 going to do this, but they don't realize that maybe they won't want those nude images and videos online 10 years from now, but then, you know, it's forever. So informed consent, I think, is a really important thing as well, for people to not just consent, but understand exactly what they're consenting to when they're sharing those kinds of images online. There is a weird type of, I think in the history of ideas it's called conceptual inertia. So you can imagine there is a time when it's proposed that maybe the entire solar system and
Starting point is 01:13:15 universe doesn't orbit around the earth, perhaps the earth orbits around the sun, and this is a total heresy and we can't believe that this is the truth. And then, you know, evidence continues to come forward and you can say, somebody proposes a thing that most people aren't sure whether they agree with. And then slowly maybe, uh, evidence or data or science catches up with this. And they go, okay, this person wasn't talking bullshit. This is actually legit. This is the way it is.
Starting point is 01:13:38 But there is still this huge lag and it even happened with that revolution, this huge lag for just most people to use the right language. And you know, given that we're talking about the internet being around for two decades, sort of widespread porn being around for one, one and a half, something like that. And you think, okay, is it any surprise that cultural norms and expectations and understandings of behavior and the way the parents communicate with their kids, that these things are taking time to catch up? And, you know, it's people like you that are applying a nitro boost, like turbocharged
Starting point is 01:14:22 thing to, hey, these are all of the areas, these are all of the different bits of weakness and vectors where shit can go awry, and don't fall down that fissure over there, and we need to be worried about this thing. And yeah, I mean, you're a trooper. You're a real hero for putting this stuff together. I think, you know, God knows what would have happened if it hadn't been for you. And it certainly seems like the hashtag and the movement that you put behind this has definitely expedited this process.
Starting point is 01:14:50 So, yeah. Well, thank you. I always want to pass that on, because shout out to the survivors who have spoken up. Without their voices and their bravery to speak up and to share their stories, their powerful, powerful stories, none of this would have been possible. And so many of them have done that at risk. It's hard to talk about your own exploitation. Um, but they have done it because they don't want this to happen to others. And so I just, yeah, thank you for that.
Starting point is 01:15:22 And I would love to pass that on. They've got a powerful ally in you. And my intention is to spend the rest of my life not being the subject of an investigation that you do. I do not want to be on the other side. I'm sure you will not. I guarantee that. Yeah.
Starting point is 01:15:39 Look, tell people where they can check out your stuff online, support you, do all of the things. Of course. Yeah. So many people are still signing the Trafficking Hub petition, and it is still a powerful awareness tool, and it is a way that so many people are getting this message. So you can go to traffickinghubpetition.com and sign that and join others. You can also, so I wrote a book about this story that was released last summer called Takedown: Inside the Fight to Shut Down Pornhub, and you can buy that book, and all proceeds, 100% of author proceeds from the sale of the book, go to the cause, go to the Justice Defense Fund, an organization that I
Starting point is 01:16:18 founded. And in the book, you will go through this journey with me. It's written, a lot of people are calling it a true crime thriller, where, you know, it's first person, present tense. And from that moment on February 1st, that night when I tested the upload system, you go on the journey with me all the way through, and you will understand this issue not only in your head, but you'll understand it in your heart. You will experience it with me. And so hopefully you'll be inspired by that.
Starting point is 01:16:45 So you can do that at takedownbook.com, and you can join what we created called Team Takedown. So this is a team of dedicated activists that are saying, look, yes, we're going to take down Pornhub, but we're going to work to take down illegal content across the internet, to make the internet actually a safer place for our children for generations to come. So you can do that. And my organization is called the Justice Defense Fund, and you can go to justicedefensefund.org.
Starting point is 01:17:11 So you are doing God's work. I appreciate you very much. I appreciate you too. Thank you. Thank you. I get asked all the time for book suggestions. People want to get into reading fiction or nonfiction or real life stories. And that's why I made a list of 100 of the most interesting and impactful books that I've ever read.
Starting point is 01:17:32 These are the most life-changing reads that I've ever found. And there's descriptions about why I like them and links to go and buy them. And it's completely free. And you can get it right now by going to chriswillx.com slash books. That's chriswillx.com slash books.
