Front Burner - Front Burner Presents: Deepfake Porn Empire

Episode Date: April 6, 2026

Deepfake porn is a billion-click industry built on stolen faces, while the people making it hide theirs behind screens. Hosted by journalist Sam Cole, Understood: Deepfake Porn Empire traces the decades-long rise of synthetic porn, the targets who are fighting back, and the global investigation that led to its Canadian kingpin.

Understood takes you deep inside the seismic shifts reshaping our world right now. From online porn and crypto chaos to the rise of tech oligarchs, deepfake AI, and the broken promises of the internet — we explore the stories that define our digital age with hosts and characters embedded in the heart of the action. More episodes of Deepfake Porn Empire are available wherever you get your podcasts, and here: https://link.mgln.ai/DPExFB

Transcript
Starting point is 00:00:00 All right, I got a big question. How do you feel about aging? Maybe you've got a fear of death, fear of the unknown, fear of change. Because certainly how we look starts to change. The actor Amanda Peet has given it a lot of thought over the years. She's been in the public eye since the 90s. She'll tell you why her new film Fantasy Life has her thinking about validation, about vanity, and what she thinks about cosmetic surgery and whether or not to get it.
Starting point is 00:00:23 You'll hear that conversation now. Just search for Q with Tom Power, wherever you get your podcasts. Hey everybody, Jamie here. We have a special bonus episode for you all today from the brand new season of Understood, a deep-dive miniseries feed covering everything from crypto chaos to the rise of tech oligarchs and the broken promises of the internet. This new season is called Deepfake Porn Empire. Deepfake porn is a billion-click industry built on stolen faces while the people making it hide theirs behind screens. Host Sam Cole traces the decades-long rise of synthetic porn,
Starting point is 00:01:04 the targets who are fighting back, and the global investigation that led to its Canadian kingpin. Sam's been on the show a bunch of times, a fantastic reporter, and this season is really excellent. Now here's the first episode, The Dawn of Fake Porn. If you like it, all four episodes of the season are available to follow right now. Have a listen. It's January 31st, 2023,
Starting point is 00:01:33 when a streamer who goes by QTCinderella wakes up. And something strange is happening. She's trending on Twitter, which even when you're internet famous like QT is, is really never a good sign. And I'm like, just waking up, trying to figure out what is going on. And then I get a call from one of the women involved, and she's like,
Starting point is 00:01:55 oh, have you seen everything? I'm like, no. And then she really broke it down for me. And we went through it. And then that's the first time I got on Twitter. And I was like, oh, my God. Like, Ah.
Starting point is 00:02:08 Cudy Cinderella is a Twitch streamer. We're doing something fun today. I think it's genuinely been a few years since we have done and asked me anything about me. Cutty Cinderella, hi, that's me. Hi, guys. So we're going to do that today. Streaming is pretty simple. You broadcast yourself live, usually playing video games, to thousands of people watching in real time.
Starting point is 00:02:29 Oh, boy. Oh, sorry. Cutie got her start playing League of Legends. It's one of the biggest video games. on the planet, a kind of online gladiator arena where teams battle it out. Yeah, you're dead. Oh, my Lord, he'd be camping. They ate you for long.
Starting point is 00:02:45 From there, she became known for bringing people together, co-hosting podcasts, arranging meetups in real life. Cutie Cinderella is used to being online, and she's used to people talking about her online, but she wasn't prepared for what happened that January morning in 23. It all started the day before with a guy who goes by Atriac. Atriac is in QD's circle. They've streamed together, so he's a colleague, a friend, really. And he was live on Twitch when he wanted to check a different window.
Starting point is 00:03:21 What time is it where I am? What time is it PST? That's the moment that upended Qudy's life. Atriac hits Alt tab, and for barely a second, every window he has opened flashes on screen to the thousands of people watching. Among them, a YouTube video of Lovely Day by Bill Withers, a Google search for Jennifer Garner in Catch Me If You Can,
Starting point is 00:03:47 his recent Uber Eats orders, and a website selling porn. Not just any porn, but porn of women he knows, women he works with, including cutie Cinderella. Someone takes a screenshot and posts it online. And then it just creates,
Starting point is 00:04:07 to wildfire. The pictures ripple across Discord, Reddit, Twitter, and by the next morning, Qudy's phone is blowing up. I was already getting DM to the photos or replying in my tweets and stuff like that before I even had a full grasp of what the hell is going on. A bunch of the messages have links.
Starting point is 00:04:30 And when she clicks through, she finds a porn site she's never seen before. Never even heard of. And she's on it. Her face, doing things she never did. Because here's the thing. It wasn't cutie. Not really.
Starting point is 00:04:49 The porn? It was a deep fake. It is so convincingly my body, but not my body. And holy shit, it hits you like the truck. You feel so violated. Deep fake videos, which are manipulated using AI, can make someone appear as though they're saying, or doing something that they're not.
Starting point is 00:05:12 Tech experts are sounding alarm bells over the rapid spread of AI-generated explicit images of women online. We're moving into a future where you really won't know what's real online. We've seen a significant increase in deep fakes and as our research has shown, 96% of these are pornographic.
Starting point is 00:05:32 I'm Sam Cole. I'm a tech journalist at 4-4 Media where I write about sex and the internet. And I've been reporting on deep fake porn, since the very beginning. And here's what I've learned. Deepfakes didn't just come out of nowhere. They were built by people, on platforms, inside subcultures.
Starting point is 00:05:55 They were allowed to spread while governments dragged their feet, and tech companies shrugged. And at every step, someone profits while the targets, almost always women, pay the price. So how did we get here? And if you follow the trail of deepfake porn all the way to the source, who does it lead to? This is understood, DeepFake Porn Empire. Episode 1, the dawn of fake porn.
Starting point is 00:06:32 Deepfakes. You've seen them everywhere. AI videos of politicians, celebrities, random people on TikTok, sometimes so seamless that you might not even realize you're looking at a fake. And we have this idea that it's a fast-moving, new. threat. But fake images, yeah, they're not exactly new. Basically, as soon as the camera is invented in the 19th century, people are faking their photos. This is Walter Schreier. He's a professor of engineering at Notre Dame, where he researches AI, the tech, but also the cultural history. And when it comes to fake things online, he wrote the book. I am the author of a history of
Starting point is 00:07:15 fake things on the internet. I think a lot of folks today believe film photography was rather cut and dry, right? It's like you took the photo and it captured an objective picture of reality. But that was never really the case. In fact, the very first fake photo showed up in 1840, made by an early camera inventor. This inventor, Hippolydi Bayard, is in France, and he's really, really annoyed that his camera process is not receiving the attention he thinks it should. So he comes up with a publicity stunt. He fakes his own death in a photograph.
Starting point is 00:07:53 Writes a little story on the back of this photo, right? Protesting all of this, explaining why he died. But again, it was all this sort of hysterical exaggeration. The kind of thing we associate with the Internet today. From there, fakes only multiplied. One of the most famous came decades later in 1917. So you have two cousins in the English countryside that have access to a camera, and they produced a remarkable series of photographs where they appeared to be,
Starting point is 00:08:27 you know, in different natural settings, the woods, with a bunch of dancing fairies. Yeah, like little winged tinkerbells. Now, the fairies generated this enormous debate. Notable figures get involved in this debate. Most prominently, Sir Arthur Conan Doyle, the author of the Sherlock Holmes series. And when the creator of the most famous detective in literature saw the photos, he was fully convinced they were real. He absolutely defended these photos until he died. Now, flash forward several decades later, one of the cousins steps forward and says,
Starting point is 00:09:06 you know what, those were just cardboard cutouts. They were fake. The incident became known as the Coddingly Fairies. And a century later, we're still at it, still faking images. only our tools got a serious upgrade. In the 1970s, digital cameras entered the scene. Flash forward then just a couple more years, you get the appearance of Photoshop,
Starting point is 00:09:31 a piece of software that has stood the test of time. In 1990, Adobe was best known for making printer software and tools for font nerds. Photoshop put them on the map and got them on the news. Technology makes it difficult. maybe even impossible to tell what's real and what's not. Recently, the issue of altering photographs came up when Playboy added its logo to a picture of Rosanne Arquette, without her knowledge or her approval.
Starting point is 00:10:00 And then there was a huge publicity outcry when TV guide tacked Oprah Winfrey's head onto Anne Margaret's body. Suddenly, the kind of image editing power once reserved for million-dollar studios was available to every graphic designer. and everybody else. If you look at the actual examples of how people were using the earlier tools, right, like Photoshop, in very bad context, it was almost always pornography. And we were able to track down a guy who was making Photoshop porn in this era. I actually started playing around with this stuff at home, strictly privately.
Starting point is 00:10:42 And yes, that obviously already involved nudity. It was very clumsy and even limited to gray scale. images. But at the time, it was cutting edge and easier to suspend disbelief than taping a cut-out physical picture of a head over a Playboy centerfold, I guess. This is TMFU. That's his username. And he has a digital footprint in the fake porn scene going back decades. TMFU didn't want to be interviewed, and he wouldn't share his real name. But when we emailed him, he responded with a multi-thousand word manifesto. Those emails are what my colleague is reading from. And the story goes right back. It got all started when Luxlucer, a Canadian guy, libertarian type, created a group dedicated to
Starting point is 00:11:30 celebrity nude faking in 1996. Luxlucer's real name was Carrie Pearson, and he ran two Usenet groups dedicated to fake nudes of celebrities. Quick refresher, Usenet was basically the early internet's version of Reddit. Think of it like a message board but run through email. I'd say there were probably a good hundred regular people from all over the world sharing their photoshops. This was at the time of Britney Spears, Baywatch girls, and every American sitcom girl you can imagine. Luxlucer also ran a website, which is still archived all these decades later. Luxlucer.com. Lux's website is peak 90s internet. A blackbacker.
Starting point is 00:12:16 background, decorated with neon, naked lady silhouettes, half a dozen different shades of neon text, and comic sands everywhere, all of it dedicated to fake celebrity porn and the community churning it out. My sound designer is going to read from it. We all love seeing the heads of some of our favorite stars and other prominent people placed on top of a nude model. The sheer godlike power of exposing them to the world for our own fantasies is a healthy and thanks to the easy availability of advanced image manipulation programs, relatively easy task to undertake. When TMFU found Luxlucer's website, he'd found his people. When you can share a particular interest like that, especially a naughty one, with other people who are also fascinated by it,
Starting point is 00:13:04 it gets an entirely different dimension. I probably spent every night participating in the news group, working on fakes and posting them, learning, commenting on fakes by others, exchanging tips. I probably finished and posted 150 fakes in the first year. It completely consumed me. The celebrity fakes groups did have rules about who they could fake and how. You can still read these on Lux's website. No real nudes of celebrities.
Starting point is 00:13:37 These have plenty of their own groups. No advertisements for commercial porn photo sites, also known as spam. No fakes of bestiality and no fakes of minors. artists will not post or take requests for celebs under 18. If that feels somewhat reassuring, it doesn't last. Right below that warning, there's a chart listing underage celebrities. All teenage girls with the years they would turn 18. Lux died in 2004 from complications related to diabetes.
Starting point is 00:14:14 This means that we can't ask him how he'd look back on the ethics of all of this now, but he had a statement about it on his website. website. Are these pictures ethical? My personal opinion is that it would be unethical to present them as being a real representation of the person or the activities depicted. A projection of personal fantasy, rightfully identified as such, should not cause us to lose any sleep, except of course to masturbation. In 2003, the year before he died, Luxlucer told Wired that he believed there were about 300,000 celebrity porn fakes out there. And even once he was gone, the community kept growing.
Starting point is 00:14:54 Meanwhile, everyone else was discovering digital fakery with a little help from Hollywood. Boy, no, I hate being right all the time. Yeah, I remember seeing Jurassic Park when it first came out in the theater. And there was this game to play, you know, what scene was shot with a puppet versus what scene was computer graphics. Some of the T-Rex shots, right, when the kids are in the Jeep. Keep absolutely still. His vision is based on movement. You know, it's like the head, the puppet, right?
Starting point is 00:15:31 Was it the body that's CG? You know, there were a lot of questions. And when you have that kind of ambiguity, you're doing a really good job with the computer graphics. Jurassic Park wasn't the first film to use CGI. But in 1993, the film proved computer-generated creatures could look real. or real enough. And that opened the door to another Holy Grail.
Starting point is 00:15:55 Digital actors. Hey, thought you could leave without saying goodbye. The most famous early example happened in 2013. When Paul Walker died midway through filming Fast and Furious 7,
Starting point is 00:16:08 the studio needed a way to finish the movie. So they got Paul's brothers and CGIed his face over theirs. What do you think? Parking brake slide right at to the school.
Starting point is 00:16:20 Philosophically, it kind of really creeped people out. But visually, it worked. This is where you start to get these debates around, like, what is real? Like, what do you trust, right? And then you really do start to wonder, you know, like, where could this veer off course? But there were limitations. The dinosaurs, Paul Walker's face. This was labor intensive.
Starting point is 00:16:45 It was all hand-animated by teams of people, frame-by-frame. So that's when you start to see researchers in this space discussing the possibility of automating this process, right? Again, turning to computer vision and moving away from the sort of, you know, very manual process of computer graphics. And that's where this story tips into something new. You've got Hollywood's painstaking process on one side, testing the public's tolerance for fakes, fake celebrity porn photoshoppers on the other. and soon a brand new technology about to smash through both. On big lives, we take a single cultural icon. People like Jane Fonda, George Michael, little Richard.
Starting point is 00:17:41 And we pull apart the story behind the image. And we do this by digging through the BBC's vast archives. Discovering forgotten interviews that change exactly how we see these giants of our culture. We're here for the messy, the brilliant, the human version of our heroes. I'm Emmanuel Jochi. I'm Kai Wright. And this is Big Lives. Listen to Big Lives, wherever you get your podcasts.
Starting point is 00:18:07 The future arrived in 2014 at a Montreal MicroBurray, where a grad student named Ian Goodfellow was getting drunk. I don't want to be someone who goes around promoting alcohol for the purposes of science, but in this case, I do actually think that drinking helped a little bit. That's him talking on the Lex Friedman podcast. Today, Ian Goodfellow is an AI researcher at Google Deep Mind. But at the time, he was a PhD student at the University of Montreal. He was basically done.
Starting point is 00:18:37 He'd just handed in his thesis. And that night, out drinking with friends, they kept circling the same question about something they were all studying. Generative models. Here's Walter Shrier again, author of a history of fake things on the internet. All a generative model is is an algorithm that generates data that is new. At the time, there were a few generative model. models that produced images, but the results looked wrong. Blurry, distorted, half apples,
Starting point is 00:19:09 mishap and faces. What researchers wanted was a system where you could dump in data, and it would invent totally new images, ones that looked real. At the time, Walter Schreier was also a PhD. He was at Stanford, working in a similar field, and he remembers this being the big goal. Yeah, I think all the researchers wanted was useful output, like photorealistic output. That's all they were looking for. So this was the problem Ian and his friends were debating that night. How to build a program that could do this. And then finally, when I was arguing about chartered bottles with my friends in a bar, something clicked into place. And I started telling them, you need to do this, this, and this. And I swear it'll work. And my friends didn't believe me that it would work. But I believed strongly enough
Starting point is 00:19:57 that it would work, then I went home and coded it up the same night. And it worked. It worked. Ian invented a system that could produce new, photorealistic images. He called his invention generative adversarial networks, or GANS, for short. And the breakthrough was this. Two systems playing a game. One makes fakes, the other tries to spot them. You can think of it as kind of like an art critic.
Starting point is 00:20:26 At the start, the generator makes images that aren't real at all, and the discriminator doesn't know what's real or fake. But when they play the game against each other where they have to try to fool each other all the time, eventually the generator learns to make very realistic images. Within months, Ian Goodfellow published a paper. I remember specifically my lab mate showing me this paper. He's like, look what Goodfellow just, you know, release.
Starting point is 00:20:48 It was like, what? It was like, this is incredible. I mean, it was basically a turning point for the entire field. When Ian cracked this technology, he had big dreams for what his invention might unlock. I thought about a lot of important problems in science that I could solve that would help people. He was excited about how GANS could be used to help design medicine, how they could fix bias in data. Like if there was an underrepresented category, a GAN could create realistic examples to help fill in the gaps.
Starting point is 00:21:20 And then there was what they could do for teeth. Specially trained technicians spend about two weeks to make each patient's dental crown. Now with generative adversarial networks, it's possible to design them basically instantly and then 3D print them. But no one was really thinking about porn. No, nobody was thinking about porn. Here's the thing about science. Researchers love to share, make their findings reproducible.
Starting point is 00:21:59 That's how the field moves forward. So after his paper was published, Ian Goodfellow quickly made the source code for GANS available on GitHub, the public code sharing platform. So as the technology just got more accessible, You see amateurs on the internet taking this code and doing things researchers never imagine what happened because they expected only other researchers to use the open source code. Surprise, surprise.
Starting point is 00:22:25 That did not happen. And so you see the appearance of like the original deep fake algorithm. You know, it's like a pseudonym on Reddit saying I have this new porn generating, you know, AI system, check it out. And, you know, it spirals from there. I remember that spiral very clearly because I was the journalist who broke that story. It was December of 2017 and I was a reporter at Motherboard. That's VICE's tech outlet. And my editor found a post in a subreddit called Celeb Fakes, where people were doing the old school Photoshop thing.
Starting point is 00:23:05 The post he sent me was by a user named Deepfakes. This was a play on Deep, as in Deep Learning, like Gantz, and Fake, as in celebrity fakes. This one guy would go on to inadvertently name the whole phenomenon, but at the time, he was just some rando. Deepfakes had posted a video. It was of actor Gao Godot, you know, Wonder Woman, lying on a bed, waving a sex toy around.
Starting point is 00:23:35 I clicked, and the page opened to what looked like a porn video of Godot. Only, it wasn't quite her. It was her face stitched onto a porn performer's body. I researched porn in the internet for a living, and I'd never seen anything like it. At that point, deepfakes had posted porn videos featuring the faces of Scarla Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza,
Starting point is 00:24:05 and now, Gal Godot, on Reddit. The tech wasn't seamless. Gow's face glitched, and every now and then you could see the real actor's face under hers, But I remember thinking, this is going to get so much worse. So, I sent him this message. Do you think this technology could be used with bad motivations in the future? And he got back to me.
Starting point is 00:24:30 Every technology can be used with bad motivation. I don't think this is any different than recreating Paul Walker and Fast and Furious. I wrote it up for Motherboard, and in 2017 became the first journalist to publish a story on AI porn. My headline was, AI-assisted fake porn is here, and we're all fucked. When my story came out, I'm not sure what I expected to happen, outrage maybe. And there was some of that. But mostly what I got were shrugs. Within the computer vision community, deepfakes comes out.
Starting point is 00:25:08 No one really wants to comment on that. It makes the field look bad, obviously. It's sort of just like, you know, it's not our problem, right? We didn't create it. We're not the deepfakes pseudonym on Reddit. you know, and that's just not, that was not a good response. And again, I think also the hype about the political stuff, right? Like, the field was happy to continue to talk about that.
Starting point is 00:25:28 People were paying attention to deep fakes. Just not porn. These manipulated images can pose a very real national security threat. Fake videos could become a real and present danger to our democracy. The presidential election season is ramping up, and so are the warnings about deep fake technology being used to disrupt In 2018, BuzzFeed and Jordan Peel made a fake video of Barack Obama. Obama looked straight into the camera and called Donald Trump.
Starting point is 00:25:58 The total and complete dipship. Of course, none of it was real. Jordan Peel provided the voice, the AI filled in the lips. It was a warning. This is what's coming. The warning landed. Newsrooms ran headlines about the end of trust. Congressional hearings predicted fake candidates and
Starting point is 00:26:20 fake wars, DARPA, the Pentagon's research arm, launched a massive media forensics program. Tens of millions of dollars were poured into building deepfake detectors, algorithms that could spot tiny lighting inconsistencies, missing frames, spliced pixels. During Trump's first term, the panic went nuclear. Analysts went on cable news to freak out about a hypothetical apocalypse triggered by a hypothetical deep fake of Trump saying he'd launched nukes at North Korea. Which is like the most implausible scenario in foreign politics, if you ask me. The hardest thing to do is like launch a nuclear missile, right?
Starting point is 00:26:59 Like an anonymous account online, right? Like saying things is not going to like trigger World War III. Of course, political deepfakes do happen. There's a notorious YouTube ad featuring a deep fake of Canadian Prime Minister Mark Carney trying to lure people into a shady investment scheme. It's a special investment system backed by the government and designed for the average citizen to earn money. And in 2022, a deep fake of Ukrainian president,
Starting point is 00:27:25 Vlodemir Salinsky, surrendering to Putin, went viral. But nobody in the trenches threw down their weapons and ran off the battlefield, because political fakes are very quickly delegitimized. Some people are genuinely fooled, right? which is a problem, right? But for the most part, again, I don't think that's quite the case
Starting point is 00:27:47 with a lot of this material. They're almost always released on like an anonymous account or through a pseudonym, right? It's not a major media outlet reporting it as fact. You know, if it were real, we would have heard about it
Starting point is 00:27:59 through a legitimate source. But for the people targeted by deep fake porn, it doesn't matter if it's delegitimized or not. It doesn't matter that it's not technically real, whatever that means. The violation feels, the same, which brings me to something else. Calling non-consensual deepfakes porn, even when they're technically pornographic, is not quite right. It's like how there's no such thing as abusive porn or
Starting point is 00:28:26 non-consensual porn. That's sexual assault or rape. But the phrase non-consensual AI generated sexually explicit images is a mouthful. So deep fake porn, for better or worse, is the name that's stuck. And the scope is massive. Back in 2018, researchers at the cybersecurity firm Deep Trace found that 96% of all deep fake videos online were pornographic and non-consensual. Five years later, a 2023 study by Sensity and Identity Verification Company had the same findings. And yet, it takes like several years after the appearance of deepfakes till we're really talking about the problem of porn, right? And the moment that really happened,
Starting point is 00:29:20 2023 with QD Cinderella. This is probably the stupidest thing I've ever done. That morning, after finding out that she'd been deep faked and that the videos were all over the internet, QD Cinderella had to decide what to do. For her, the answer was almost inevitable. She's a live streamer. So, she lost.
Starting point is 00:29:47 I'm sure everyone in the world would tell me not to go live right now. But I want to go live because this is what pain looks like. This is what it looks like. Fuck the fucking internet. Fuck the constant exploitation and objectification of women. It's exhausting. It's exhausting. Fuck Atriac for showing it.
Starting point is 00:30:24 thousands of people. Fuck the people DMing me pictures of myself from that website. Fuck you all. And I think you guys need to know what pain looks like, because this is it. This is what it looks like to feel violated. This is what it looks like to feel taken advantage of.
Starting point is 00:30:45 This is what it looks like to see yourself naked against your will being spread all over the internet. And QD's pain, her anger, it goes viral. The streaming world was shaken to its core recently. Her name is QD Cinderella with a heartbreaking and infuriating video that she posted. AI deep faked porn of his streaming coworkers. QD's stream and the story about what happened, it's written about in Wired, The Washington Post,
Starting point is 00:31:18 Business Insider, Entertainment Tonight. The list goes on. But the ramifications from this are going to last much longer. Hello. Oh. Hi. Can you hear me? Hello. And shortly after that, she talked to me. I know a lot of the women a part of this are refusing to do interviews,
Starting point is 00:31:39 and it's because they don't want this a part of their narrative. I don't want this part of my narrative either, but it's the only option I have to hopefully do something about in the future. I hope that in 10 years when my niece is more on the internet, she does have to deal with something like this, just for existing as a woman on the internet. For QD, the violation of being deepfake, wasn't the end of the invasion.
Starting point is 00:32:01 The harassment has been relentless. You go to my YouTube cards. You search my name. You do anything. It's just there. It's like, it sucks. Something that's really important to me is it to be known that like,
Starting point is 00:32:12 I am not opposed to sex work. I just don't want to be a sex worker. The problems could send. It's just fucking miserable that people could take whatever they want for me and turn it to whatever they want. Beyond the harassment, the videos and the comments,
Starting point is 00:32:28 everyone could see. Cutie told me about all the things that happened behind the scenes. The dominoes, the deepfakes knocked over. The things they dredged up. I said this in one interview before, but I was sexually assaulted as a child, and it was the same, feel it.
Starting point is 00:32:46 Like, we feel guilty, you feel dirty, you feel what just happened. And it's bizarre that it makes that resurface. I genuinely didn't realize, it would. When I first heard about it, I think I was born and whirlwind. I didn't get hit with it psychologically until I saw the photos. I think I potentially could have been someone on the internet that was lacking empathy
Starting point is 00:33:08 before it did happen to me. Yeah. Yeah. It's so scary how like, you know, even it's like years and years later and you're like, I'm healing. And then something like that can just be like right back there. It did. Another later too is my family seen it.
Starting point is 00:33:26 It went so viral and my family saw it. but some people went out of their way to send it intentionally. Some of the girls, the photos were used this blackmail to try to get stuff out of them. It's just like, it's been so gross. And none of this is stuff that we've created. These aren't any of photos of ourselves. But I'll tell you what, my 65-year-old dad, it'd be a hard time explaining to him that that's not real. Have you had to, like, explain it to your parents?
Starting point is 00:33:53 I've literally avoided calling my dad. I've had conversations with a few of my cousins. one of my aunts as well as my sister and they all get it but they also think that the internet's evil and you know they it's so sad because of all the good things I've done in this career
Starting point is 00:34:12 I've done so much. I've raised tons of money for charity. I've made community events that tried to highlight people who maybe haven't had the opportunity to be highlighted. I've done so much. But this is what my family now knows as my job.
Starting point is 00:34:26 For many people, news coverage of QTCinderella's story was their first time hearing about deepfake porn. Or if they had heard of it before, this was their first time coming face to face with the real emotional impact it had on the people who were used in it. It was also one of the first times someone, in this case Atrioc, the streamer who exposed the porn site, was caught and called out for watching it. It was a shock, genuinely a shock, to see a face behind someone that paid for this and someone that saw this. And so I think the taboo of that is one reason that it got so large.
Starting point is 00:35:05 I think also there's even more taboo because he is my boyfriend's best friend. My boyfriend was in his wedding party. And so the fact that he was able to scroll past naked photos of me and have no emotional or visceral reaction, to exit out of the page. Right. Bizarre.
Starting point is 00:35:25 The fallout came quickly. Atrioc streamed a tearful apology, his wife in the background, also tearful. Later, he donated $60,000 to a law firm to help with takedowns and legal support for the streamers who'd been deepfaked. QT filed takedown notices. Some of the content came down. Not all of it. The site Atrioc had been looking at pulled the videos offline and posted an apology. Today, QT and Atrioc still stream together.
Starting point is 00:36:02 By all appearances, they've moved past it. There's a cynical way to look at this: he got rapped on the knuckles, and she had to get over it to keep her career in a space they shared. But there's also a hopeful one: that people can apologize and come back from this. But here's the thing. Yes, QT saw who was watching deepfakes of her, but she never found out who made them.
Starting point is 00:36:34 For years, fake porn has been thought of as a celebrity curiosity, a weird fringe corner of the internet: pop stars, movie stars, streamers, as if that was just the price of fame. But anyone can make a deepfake of anyone, which means the person uploading it could be your neighbor, your co-worker, your best friend, and the person in the video could be you. We were still living together, so we still had to share common spaces.
Starting point is 00:37:07 We still had to share a bathroom. I was terrified. The reality is, it's very easy to anonymize yourself online. There is no woman in the world who is safe from this technology. I'm not just going to, like, sit here and take it. This season of Understood, we are burrowing into the digital world of non-consensual deepfake porn. The people impacted, the political battles, and the legal loopholes. And we'll follow the trail, because behind the people watching and making deepfake porn, there's someone else.
Starting point is 00:37:40 A spider at the center of a web. A kingpin. An investigation that begins in Denmark will ignite a chase across three countries and four newsrooms as a CBC journalist closes in on the very real person behind the world's largest fake porn website. Mr. Deepfakes himself. Mr. Deepfakes. Mr. Deepfakes. Mr. Deepfakes, or the man behind Mr. Deepfakes. Mr. Deepfakes is the most notorious non-consensual deepfake porn site in the world.
Starting point is 00:38:09 He has been using women's faces for years, exploiting them, earning money on them. Now it's his turn to be on the front page. In this episode, you heard archival tape from CBC and QTCinderella, Ludwig React, Mintberry Crunch, The Kim Komando Show, La Duke Leadership, the Today show from NBCUniversal, Jurassic Park from Universal Pictures and Amblin Entertainment, Furious 7 from Universal Pictures, the Lex Fridman Podcast, Preserve Knowledge, The AI Podcast, Rework, the Association for Computing Machinery, Fox News, BuzzFeed, NPR, Dexerto, Penguins, and Another Body, My AI Nightmare.
Starting point is 00:38:52 Understood: Deepfake Porn Empire is written and produced by our showrunner, A.C. Row, and me, your host, Sam Cole. Arman Agbali is the producer. Sound design by Julian Uzielli, who also voiced Carrie Pearson, aka Lux Luka. TMFU was voiced by Greg Kelly. Our story editor is Veronica Simmons, and our executive producer is Nick McCabe-Lokos. If you enjoyed this episode, make sure you check out previous seasons of Understood, like season two, The Pornhub Empire, which I also hosted. I'll take you inside the rise and reckoning of one of the biggest porn platforms in the world. You'll find that wherever you found this podcast. So go check it out and make sure you hit follow. Then meet me back here for the next episode of Understood: Deepfake Porn Empire.
Starting point is 00:39:43 That was the first episode of Deepfake Porn Empire. If you like what you heard, all episodes are available right now. Just search for the Understood feed wherever you get your podcasts. There's a brand new season coming later this spring, so be sure to follow the feed so you don't miss an episode. For more CBC podcasts, go to cbc.ca slash podcasts.
