Rotten Mango - #210: The Influencer Deepfake Sex Club - Blackmail & Torture of Taiwanese YouTubers

Episode Date: October 30, 2022

Zamy, a Taiwanese influencer, had worked hard to build a wholesome, reputable image. Yet it was all going to come crashing down in the most nightmarish way possible. One day she woke up flooded with messages from people asking her, “What are you doing this for? Do you not make enough money on YouTube?” It was a “leaked” video of Zamy. She was engaging in sexual activity, yet she didn’t remember anything about the encounter. But clearly it had happened, because there was no denying that was her face. The more she studied the video, though, the more she realized a few small things were off. Her body didn’t look the same. She didn’t speak Japanese, but in the video she did. She was so confused. She had to dig further. What she found was terrifying: a group of thousands of men who were trading and purchasing deepfake p*rn videos of YouTubers and celebrities. None of the people depicted consented to their faces being used like this. And this was just the start of a nightmare. Full Source Notes: rottenmangopodcast.com

Transcript
Starting point is 00:00:32 Welcome to this week's mini-sode of Rotten Mango. I'm your host, Stephanie Soo. And let's talk about a Taiwanese YouTuber who goes by the name of Zamy. She's really pretty. And apparently she has a very attractive body, because in a netizen poll that put all the Taiwanese influencers against each other, she was voted the one most people wanted to have a one night stand with. Who is making these polls? They need to get fired.
Starting point is 00:00:59 So great, I love the objectification. But Zamy is also known for having this super-lovable personality. She is well-loved and well-respected in the online community from what I can tell, no crazy scandals, and it seems like her life was going really well. I mean, she's conventionally beautiful, she had the dream job, she got to travel the world and make content, and then in November 2021, her life blew up in her face. She started getting these messages from viewers saying that her porn video and nudes had been leaked, and some of them were even shaming her for it.
Starting point is 00:01:33 Asking her, you don't make enough money on YouTube, you don't make enough money as an influencer, you've got to resort to this? What's wrong with you? What kind of example are you setting for your young followers? But the thing is, Zamy knew damn well there could be no such video in existence out there. She never took any videos like that, she never took nude photos, and of course she's not judging people who do, because this is still a violation, but she never let people take photos of her.
Starting point is 00:01:57 So unless somebody secretly and illegally recorded her, maybe the viewers just thought that she looked like a porn star? Maybe they looked similar? So she did some digging. She found the videos, her alleged porn videos, and out of pure curiosity she clicked one, and she was shocked. She did not remember the setting of this video. She did not remember filming this video, and she didn't even know the person she was doing things with in this video. Now, I know what you're thinking: maybe she was drugged, maybe she has amnesia.
Starting point is 00:02:30 But when she looked closely, she realized, that doesn't look like my body. I know my body. I see my body every single day in the mirror. That's not my body. But that's my face. And it's so realistic. It's not like someone just pasted a Photoshopped picture of her onto somebody else's body; the face was moving, and every time the actress in the video made a specific face or
Starting point is 00:02:54 moaned or did something, her face corresponded with it. It looked like her. And most alarmingly, Zamy was like, wait, I don't speak Japanese. And the woman is speaking Japanese in this video. So do I have no memory of this? Did I just learn Japanese in one night and then completely forget about it? Or is there something fishy going on? The clip of her alleged porn video was about 30 seconds long. And at the end of it, it said, hey, if you enjoyed this preview, go to the Telegram group and pay money to watch it in full. The entry fee started at
Starting point is 00:03:28 $14. So Zamy infiltrated the Telegram group, acted like some strange man, and literally paid for her own alleged porn video. She also recorded all of this for a YouTube video, because she wanted to prove, hey guys, it's not me in this porn. This is, you know, the best way to show it to you: they're selling it in this chat. And if you guys don't know, this is a deepfake. Someone made a deepfake porn video of me. And she wanted the world to know, be on the lookout. This is incredibly serious.
Starting point is 00:03:59 She infiltrated the group. She saw that they had hundreds of the most famous celebrities and influencers who had been deepfaked into porn videos, and you could pay to see all of them. Essentially, the editors and the company would, quote, steal the faces of celebrities, photoshop them onto porn stars in porn videos with sophisticated AI-run software to make it look as realistic as possible, then repackage it and sell it to perverts. Zamy said it felt like someone had gouged out my face to sell as sexual merchandise.
Starting point is 00:04:32 And it wasn't just that someone out there was making these types of videos, it's that there was a demand for them. The group consisted of thousands and thousands of people who were excitedly throwing money at fake porn videos of their favorite influencers and celebrities. Every single day, there was a poll sent out by the administrators of the chat, the admins, to ask,
Starting point is 00:04:55 which one do you wanna see next? You can vote. And it'd be like, A, this A-list celebrity; B, this influencer; C, this YouTuber. There would even be female politicians, local mayors, senators on there. What? Yeah. And on average, 500 people or more participated in each poll every single week.
Starting point is 00:05:14 I mean, it was just so sick. Sometimes the admins would even host giveaways where one lucky winner could request a deep-fake video of literally anyone, even a close friend, a classmate that they liked, a coworker perhaps? Maybe your boss! Anyone, no rules, could potentially even be a minor. Or if luck was not in your favor, or you just didn't have the patience, you could pay the admins to make a porn video of your classmates. You just have to provide money, thousands of dollars,
Starting point is 00:05:45 and photos and videos of the person that you want to victimize. The more photos and videos that you provide, the more realistic the porn is going to be. And if you ever dared ask why it was so expensive, the admins would respond: for every one minute of content, we spend about 15 to 20 minutes manually adjusting the face to make it look super realistic.
Starting point is 00:06:05 In addition to the electricity bills for all the software that we run and the collection of video data, we have to charge that much, you know, thousands of dollars for one porn video on special request. Now, if you just voted in their poll and they released, you know, a porn video of a celebrity every couple of days, you could just pay like $14 to watch it. But if you wanted a custom one, you would have to pay thousands. So it didn't take long for Zamy to post the video of her infiltrating the group chat to pay for her own, quote, leaked porn, and her end message was that she hoped with enough
Starting point is 00:06:37 outrage they could get the chat removed and deleted, or whoever was doing this would be caught and arrested. Zamy mentioned that she even consulted a lawyer to help her handle it, but the lawyer told her, no, I mean, if the admins are creating deepfakes at this level of sophistication, they're most likely hiding their IP addresses, let's be real. And the social media platforms might not want to cooperate with potential police investigations for something like this. So it's going to be really, really hard to find the perpetrators.
Starting point is 00:07:10 Meanwhile, Zamy multiplied how many people had watched her video by the ticket price, and whoever was making these had earned nearly $43,000 from her video alone, just for victimizing her. As always, full show notes are available at rottenmangopodcast.com. This is an international case. I did hire professional translators for this one, but there's just so much going on. There are so many people involved, so many victims, and it's not just about this specific case in Taiwan.
Starting point is 00:07:36 Deepfakes are terrifying. So with that being said, let's talk about it. Even with Zamy drawing attention and awareness to the group chats on Telegram, they were starting to take off more and more. Like, more people were joining. It almost kind of had the opposite effect, because everyone in her comments was like, oh my god, disgusting. And I'm sure all the people watching were like, oh my god, disgusting. But it also led to people who didn't know her going, wait, that's what I want to see. So they were gaining more and more traction. Something along the lines of Taiwan Internet Celebrity Face Swap would
Starting point is 00:08:09 be like the names of these group chats. And the business model of the creator of the group was pretty simple. You post a 30-second preview of a celebrity's or an influencer's quote-unquote sex tape into the group, a deepfake, and everybody in the chat knows it's a deepfake. It's not real. And if you wanted, you could pay to watch the full thing. But the thing is, none of the primarily female celebrities or influencers knew that this was happening.
Starting point is 00:08:33 No, the videos weren't real. They were deepfaked, but they were so realistic that if they were shared outside of the group, you would assume that someone hacked into those celebrities' iCloud, or that their nudes were leaked, or that their porn video was leaked. I mean, it was so realistic that nearly 6,000 new members joined the chat within two to three months. And this isn't just a free-for-all. Like, you have to request to join the chat, and they usually vet you to make sure that you are indeed a pervert, that you're not some sort of cop or
Starting point is 00:09:00 investigator or reporter. So a reporter who was researching the case snuck into the group chat, pretending to be a little perv, and they found out that it wasn't just celebrities, though. It wasn't just influencers. Sometimes regular people's faces would be turned into porn, and sometimes minors were even being depicted as being raped. So I guess they would take fake rape porn videos and insert minors' faces onto them.
Starting point is 00:09:25 So yeah, the reporters started to talk about it. They started to write about it in Taiwan, and the group got smarter. They blew up their original chat, meaning they deleted everything so there was no proof, then they made a new chat where it was even harder to become a member. When the reporters tried to join the new chat group, they were asked by the operators, how did you hear about us? How do you know about us? Oh, um, a friend. Oh cool. Well, we're going to need your friend's ID and name to verify
Starting point is 00:09:50 that they are a member, and can you ask him for his transaction history with us to prove that he is a buyer of our services? So, some reporters, they were lucky, or unlucky, I guess, but they got in, and they said not only was the group selling hour-long influencer deepfake porn videos, they were selling Nth Room content. Do you remember the illegally leaked videos from the Korean Nth Room incident? It was basically a Korean group of dudes that would go around raping minors, raping people, and then posting those videos and blackmailing those minors into sending in more nudes or more porn. And they created this Telegram account where they would sell those videos or just share them with people.
Starting point is 00:10:30 So they were leaking those after all of those got banned. Things were not slowing down after the initial scare. They were only getting worse. Which, if you're like, wait, tell me more about deepfakes, let's get into it. Photoshop is freaking insane these days. You can swap your face in real time with whoever's sitting next to you. We've seen those TikTok and Snapchat filters. I mean, yeah, they don't look that realistic, but it's still kind of crazy. Sometimes you can even swap your face with a dog, or maybe you can merge your face with a celebrity's face using a filter or an app. Whatever it is, it feels like the options for Photoshop and filters are limitless.
Starting point is 00:11:07 You can be whoever you want to be on the internet. I've seen videos of Photoshop work where people would transplant a whole set of hair and makeup onto their face, and it looks so believable. So it felt almost inevitable that this was going to happen, right? Even though it's giving sci-fi dystopian, I-would-rather-die-than-live-another-day-in-this-world type of vibes. I feel like there was a point in time where everyone was like, deepfakes? Oh no, what does this mean for our future? The future looks horrible. And then so much shitty shit started happening in life that we
Starting point is 00:11:39 collectively just forgot about the existence of deepfakes for a second. It's coming back now with, I don't remember the company name, but Logan Paul and a bunch of venture capitalists invested in it. They make a bunch of Tom Cruise deepfakes. Yeah. And they're trying to get into production and, you know, working with ad companies. Yeah, it's a whole thing. It's like an AI deepfake company, and they're trying to separate deepfakes from the porn industry, where it's mainly been a problem.
Starting point is 00:12:08 I'm gonna get into it. But you know, it keeps popping back up in the news here and there, whether it's Emma Watson or Scarlett Johansson or even Elon Musk. Deepfakes look like they're gonna be with us for a while. Yeah, there have been so many with Elon Musk. There are the fake Elon Musk scams, yes. Oh, but I thought you were talking about that TikTok account with the Elon Musk lookalike. That one's honestly really good; I genuinely thought he looked like Elon Musk, like just a double of the guy. So yeah, there have been so many deepfakes. It looks like they're going to be with us for a really long time.
Starting point is 00:12:49 I mean, forever. Now, the word deepfake, by the way, is a hybrid of deep learning, aka AI, and fake. So anyway, the big conversation that took this super mainstream started with a Reddit user by the name of deepfakes who started posting videos and images onto Reddit. He would take a porn movie and use AI technology to swap the faces of the porn stars with celebrities
Starting point is 00:13:15 like Gal Gadot, Emma Watson, Scarlett Johansson. And this wasn't just another Photoshop attempt. I mean, it was so believable. A deepfake can make someone look like another person in a video. So if the porn star is making a specific face, then whoever you've swapped their face for is going to make similar facial expressions. And because you've compiled and fed the AI software
Starting point is 00:13:39 so many videos, it looks so incredibly realistic. Everything about it. It's incredibly difficult to distinguish the real from the fake. And as long as you have a ton of photos or videos of someone, it is so easy. And it's especially easy if you're trying to make a deepfake of a celebrity or a vlogger, because think of all the footage that they've put out there. And it won't even look edited if you do it really well. When you see the deepfake, again, it is so hard to distinguish between real and fake. Some deepfake victims have said, even I was getting confused. Like, wait, there's no way. I know it's not me, but am I going crazy? Is that me?
Starting point is 00:14:20 So the Reddit user said that at first it was just for self-amusement, to play with AI in his spare time and make these videos. But then it started getting out of control, like, really fast. People started seeing the interest in the market and the demand, and they started releasing apps for everyone to download where you can create your own celebrity porn. One of them was marketed as, and I quote, the founder of this app's goal is to make deepfake technology available to people without a technical background or programming experience. That sounds kind of dangerous. So scary.
Starting point is 00:14:48 The founder continued, I think the current version of the app is a good start, but I hope to streamline it in the coming days and weeks. Eventually, I want to improve it to the point where users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library,
Starting point is 00:15:04 and swap the video for a different face with one click. So you're telling me you want people to make deepfake porn in one click? Some users even bragged on Reddit and other forums about how they turned their classmates and friends into porn videos. And the crazy part is, I always thought deepfakes were some crazy hacker shit, okay? You know, very hard to do. Maybe only 0.01% of the population can create a deepfake. Apparently not. The software to create deepfakes is readily available on the internet if you look hard enough, which isn't that hard,
Starting point is 00:15:35 honestly. And it's not illegal. There's not that many laws around deep fakes. It's crazy, which I know you might be like, Stephanie, deepfakes can be used for good too, because there are a lot of companies that are using it for quote unquote good. Disney's using it to save on production cost. So they're using a lot of deepfake technology to make their movies better. Kim Kardashian posted a deepfake of her dad
Starting point is 00:15:57 who passed away. You know, the hologram technology that they use, that also uses deepfake technology. So if you see, like, Michael Jackson give a hologram concert performance, it's a mixture of deepfakes as well. Or like when deepfake technology was used in a whistleblowing documentary to protect the identity of the people involved, right, rather than just censoring their faces.
Starting point is 00:16:17 I mean, fine, I guess I get it, there can be good in this. And I'm not saying that we should get rid of deep fake technology entirely. I mean, I don't even have the authority to say or suggest something like that, but a Dutch cybersecurity firm investigated and found that 96% of all deep fakes were pornographic. And a quarter of the subjects were South Korean women. What? Primarily K-pop stars. Oh my gosh.
Starting point is 00:16:43 So yeah, you just get the software, download photos and videos of your victim, feed it all to the AI, and it will create a deepfake. So after the invention of these apps, they started getting traction. A subreddit emerged called celebrity pornos. There was even a list of the celebrities that had been used for the porn videos, like a giant index of deepfakes, but the most popular victims in the US were Emma Watson, Natalie Portman, Scarlett Johansson, and Jessica Alba. Once the subreddit gained enough traction, though, the media picked it up.
Starting point is 00:17:15 The public was outraged, Reddit went and shut down the deepfakes forum, and other platforms announced, you know, we are also going to delete any deepfake-related content that is sexual in nature. So it seemed, momentarily, that the horn dogs had been shut down, right? All the perverts had been shut out. Then the scammers entered the equation and decided it was their turn. They started using deepfake technology and related software to scam people out of their money. One of the related pieces of software is an audio version of a deepfake.
Starting point is 00:17:44 So you insert audio and video and just voice recordings of anybody, which is so easy if you're looking at business executives, politicians, influencers, celebrities, right? Podcasters. Podcasts have so much of their voices out there. You feed it to the deepfake technology and then it's almost like a text-to-speech thing. You type it and their voice says it. Are you kidding me? The really good ones are really good at the perfect tone, the cadence, copying all of
Starting point is 00:18:10 that, the pauses. You can do voice too. Yes. In 2019 in the UK, there was this huge British energy company that was bought out by a huge German company, okay? See, that's scary, because like, banks and... Because I feel like with a face, sometimes you can still kind of tell.
Starting point is 00:18:27 I feel like with the face, there is uncanny valley. You remember we talked about uncanny valley where humans are really good at sensing when something doesn't look quite human and it kind of unnerves you. Yeah, yeah, yeah. It's only gonna get better, the technology. So one day, nobody can tell anymore.
Starting point is 00:18:44 Yeah. And then the voice, if you add a voice on top of the face, how do you explain this? This is not me. Like, who's going to believe you? Exactly. No one's going to believe you. Oh, okay. In 2019, let me tell you, it had already happened. And they haven't been caught. There was this huge British energy company that was bought out by this huge German company. So just think UK energy company, they have a parent company, right? And the company remained the same, they just have a parent company.
Starting point is 00:19:09 And someone called the executive at the British company with the deep fake audio of the CEO of the parent company. Being like, hey, I need you to wire $250,000 into this Hungarian account for this business purpose. This guy thinking, well, that's my freaking boss. I mean, I gotta do it, it's business related. So he wired a quarter million dollars. I mean, it wasn't personal like, hey friend,
Starting point is 00:19:32 I need to borrow some cash. It was from one business account to the Hungarian account. So he wired it over thinking nothing of it. And then the CEO of the German parent company called back asking for another wire. So that's when he was like, okay, now it's getting a little bit suspicious. So he had IT look into it.
Starting point is 00:19:49 It was a freaking deepfake. The parent company CEO never called for a quarter-million-dollar wire. And the scammer hasn't been caught. What? Authorities listened to the voice that had talked to the executive, and the deepfake audio was very realistic.
Starting point is 00:20:06 It got the cadence, the punctuation, the tone, everything was spot on, even the German accent was so incredibly believable.
Starting point is 00:21:23 So with the success of that scam, other scammers tried. Which, by the way, they said a lot of people will even use deepfakes in Zoom meetings to try to scam people.
Starting point is 00:21:56 What does that mean? Like, you can turn yourself into Elon Musk in a Zoom meeting. So a lot of these big companies, they put out like an email and they said, hey, if you ever are talking to someone and you're kind of worried they're a deepfake, ask them to turn their head to the side. Usually, the deepfakes don't work well with that. Mr. Elon Musk, can you turn your head to the right, please?
Starting point is 00:22:16 Pretty much, yeah. Welcome to my podcast, by the way. So, with the success of that scam, other scammers tried. An employee of a massive US tech company received a voicemail from the CEO of that company saying, I need your help with an urgent business deal right now. Is this the new age of, hello, this is the prince? Yes, this is the next phase of email phishing. Straight up. Yeah. But the thing is, the employee was like, my boss would never leave a voicemail.
Starting point is 00:22:45 The CEO of this giant tech company, he would just fucking text me nonstop, or threaten to fire me, or send his bodyguards to find me. Like, he's not gonna leave a voicemail. No, I'm kidding, but probably. So he reported the voicemail to the fraud department, and like, imagine if it really was your boss. Oh God. But the company was quick to jump into action.
Starting point is 00:23:03 They even hired a US security firm, Nisos, to help investigate, but it seems like they got nothing out of it. They never found out who it was. The main thing that they noticed, and they tried to raise awareness for it, was that the deepfake audio had a hard time creating realistic background noise.
Starting point is 00:23:19 And the pitch and tone of the voice were a bit too smooth. I mean, you wouldn't really notice it, but it's very smooth, like a voice actor. But it's still incredibly believable. And then Elon Musk was hit. There was a video, a short 30-second video of him, where he talks about how he launched a platform called BitVex, where you insert your Bitcoin, literally give your Bitcoin to the platform, and you will earn 30% of whatever you invested every single day.
Starting point is 00:23:48 Okay. The deepfake was not the most convincing thing in the world, I'm gonna be honest with you. But if you've only ever periodically watched Elon Musk talk, you might be inclined to believe it. Because, okay, so in the deepfake he looks a bit bizarre. He doesn't really have, like, a lively cadence. He's very monotone.
Starting point is 00:24:05 So it wasn't the most advanced deep fake, but it was still good enough to scam people. Elon Musk had to come out and let everyone know. That wasn't him. But a lot of people say, this is just an indication of what's to come. This is the beginning stage. It's like when the internet first started,
Starting point is 00:24:20 the software, the memes, the jokes, the scams, they weren't that advanced, but soon it's gonna get to a point where governments, police departments, social media platforms are gonna need to dedicate a whole team to fight this type of fraud. That's what I'm saying. I feel like the police are not keeping up with this. The police can barely keep up with cyber bullying, where I'm like, hi, this person literally,
Starting point is 00:24:40 someone that I know that lives right next door, sent me a death threat on their verified Instagram account. Here you go. And they're like, we can't find this person. Who is at handle suck your mom? I don't, we don't know. We don't know. I mean, it's impossible.
Starting point is 00:24:55 And you know, the death threat, how can we be sure that they sent it? You know what I mean? I mean, I see it's their face. They even sent a selfie with it. But we don't know. Where's the metadata? What's metadata?
Starting point is 00:25:05 It's my lunch break. The US started holding House Intelligence Committee hearings in 2019 about the dangers of deepfakes in the political world, in terms of extortion and blackmail and overall just fake news and false narratives. I'm sure a lot of them already have departments for this in the political world, but maybe not the police, because who cares if regular civilians are getting deepfaked. But you know what I'm saying. An AI researcher, Alex Champandard, said people should know how fast things can be corrupted
Starting point is 00:25:36 with deep fake technology. Deep fakes can be leveraged to defame and impersonate and spread disinformation. The primary pitfall is that humanity could fall into an age in which it can no longer be determined whether a medium's content corresponds to the truth. So just to give you some more context on how serious this is getting, and it feels like we haven't really heard of deepfakes in a while.
Starting point is 00:25:57 But recently, in March of 2022, a one-minute video started circulating of the Ukrainian president seemingly telling his soldiers to lay down their arms and surrender to Russia. It was promptly taken down by social media platforms and debunked, and hackers had even inserted the fake video onto the TV station Ukraine 24. It appeared briefly, but President Zelensky had to respond with his own video where he announced, we don't plan on laying down any arms until our victory. It's still unclear who created this video. So yeah, it's really serious. It's not just
Starting point is 00:26:31 hee-hee-haha videos of Barack Obama making a funny joke, deepfake-wise. It's like, you know, you're talking about wars that are going on. I mean, I think before getting into this research, I thought deepfakes were not a trend per se, but had their moment to shine, where, again, everyone was trying to deepfake Obama and Trump, and, oh, did you see that one where they insert Nicolas Cage into every single movie ever? Nicolas Cage is like that funny actor, so they started putting him into like Harry Potter and all of these different movies.
Starting point is 00:27:00 It was honestly kind of hee-haha, right? But also terrifying, because the more that you start reading about deepfakes, the more the future sounds horrendous. In 2020, Microsoft made public that they're developing a deepfake detection software tool, and it seems like a lot of tech-informed people are incredibly worried about the future. More crypto scammers came out with deepfakes of well-known
Starting point is 00:27:21 investors like Ark Invest CEO Cathie Wood, to make it seem like they supported some crypto platform or crypto coin. And again, the deepfakes were quickly debunked by the investors. And this isn't even the worst part of deepfakes. The billionaires, they have resources. They can hire full-time staff to debunk rumors and deepfakes, but it's the victims who fall for the scams that matter. And because the scams haven't been that successful to date, there's not really much concern
Starting point is 00:27:48 about, okay, should we even be scared? But we should be very scared of revenge-porn deepfakes. I think when it comes to celebrities and influencers, yes, when a deepfake video gets leaked, there is the argument that even though it's a deepfake, someone used their public videos and photos to victimize another person. And I think that there's going to be trauma, there's going to be, you know, loss of income, there's going to be all of this backlash over things that they never did. But I think a lot of their supporters will rally behind them.
Starting point is 00:28:20 However, think about the non-famous citizens. Imagine you break up with the toxic partner, and suddenly there's videos and photos of you that you know aren't you, but they're being shared around. Nobody's gonna believe that they're not you, because why would someone take the time to make a deep fake out of an ordinary person that they can't even profit off of? At least with celebrities, people can charge money to watch them, right? It's already bad enough that real photos and videos are being shared. Did you know there's a website for revenge porn? No.
Starting point is 00:28:50 It keeps getting taken down, and they come up with a new website, and people just post videos and photos of every single nude that they've ever received. But now, now it's getting worse. I mean, with enough photos and videos being shared online on TikTok, Instagram, YouTube, you could have no connection with someone. You could have never even met someone in real life. And they could make a deepfake and pretend like they met you and you guys filmed this together. So with that being said, let's get back to Taiwan.
Starting point is 00:29:15 Because this is arguably the biggest-scale deepfake porn production scam that we've seen in a while. Who the fork did it, you know? In 2021, the reporters who had infiltrated the group chat reached out to the victimized women who had been deepfaked, and they started to interview them one by one for their thoughts and how they felt. Only a few of them wanted to talk about it.
Starting point is 00:29:33 The rest said, I just, I wanna ignore it, because if I don't, then I'm just gonna draw more attention to it, and more importantly, I'm kinda scared. If I say something, I don't know who these people are, they already know that I exist. They made this video of me. If I say something, what if they make more videos?
Starting point is 00:29:49 What if they make weirder videos? Or what if they make videos of me saying things I shouldn't be saying? That could end my career for real. How would I prove that that's not me? So no, I don't want to talk about it in public. One YouTuber who mainly focuses on food content, her name is Qua Qua, she said,
Starting point is 00:30:04 you know, I usually post food vlogs and I don't talk about serious stuff. So to even bring up me being deepfaked, it's not on brand, it's not what my viewers want to talk about, but I've wanted to talk about it. I want to face up to this and I want to take the opportunity to speak out. In the beginning, I was super uncomfortable.
Starting point is 00:30:22 I tried to explain to people that the video was fake and I thought, okay, well, I told everyone it's fake so it should be okay, right? Well, not really, because some people still believed it was me. They thought that I was lying, that I was embarrassed that it was leaked, and I was claiming it was a deep fake. Wow.
Starting point is 00:30:38 Some people would even say things like, you're not famous enough for anyone to want to deepfake you. And on top of that, she was getting death threats; she was terrified. And she said, you know, the worst were people who would say things like, why are you doing all of this? I mean, first of all, it's not shameful to do sex work, but also, I didn't do this.
Starting point is 00:30:57 So it's just like that extra feeling of injustice. The worst part is, her own father is a police officer. And he told her, there's nothing we can even do. She said she was so confused. She sought out help, she broke down in front of her psychiatrist, she was dealing with depression, anxiety, insomnia, she even engaged in self-harm. Everyone around her kept telling her, it's not your fault. And to her it just felt like, obviously it's not my fault, but it happened.
Starting point is 00:31:25 So why did it have to happen? Maybe I should have never been on social media in the first place. Like, what did I do to deserve something like this? Meanwhile, she said it's so hard because her job is to make videos. And nobody wants to see a depressed person in a video. So she would sit there, smile, talk to the camera, act happy, and the minute that she turned off the camera, she would remember the deepfakes and break down. There was another big YouTuber who was hit with the deepfakes.
Starting point is 00:31:49 She came out to tell everyone that it was fake, and she said, you know, if I had a body like that, I would look at myself in the mirror for an hour every single day. But on a serious note, guys, what do I do? Should I press charges, or even call the police? As a semi-public figure, I want everyone to know that it's not a good thing to be deepfaked. Please don't watch these types of videos or purchase something like this. You're only feeding the cycle. Shockingly enough, of course the good people supported her, but some were like, well, shit.
Starting point is 00:32:15 Other YouTubers and even more celebrities were deepfaked, so get over it. You're not the only one. Some people were like, stop making a big deal about it. It's not even you. Which made her even more upset, because she said, if someone makes a porn in the future with a body that looks exactly like mine, how could I ever refute it? How could I ever say that's not me? So in this situation she got really lucky compared to the other YouTuber, because I'm assuming that she had enough proof to show that it wasn't her body without revealing too much about her body.
Starting point is 00:32:43 Yeah, so it's really clear that the proportions weren't right, like this wasn't her. But what if next time she wasn't so lucky? She said, if I speak up and say, no, this is not me, then next time, if someone uses a better body that looks like me, would anyone even believe me? I don't even think my family would believe me. How would I clarify that it's not me? Those were the concerns of female celebrities and influencers and even non-famous citizens, because it's terrifying to think about. Meanwhile, the group just got ballsier and ballsier, and of course the government didn't
Starting point is 00:33:15 care or intervene when these women were being deepfaked, until one of their own was. The group released a deepfake video of the female president of Taiwan. Now, she was not performing anything sexual, there was no, it wasn't a sex video, it was more like a GIF. The president's face was swapped onto an e-girl's body, so she's wearing a tank top, with her hair in pigtail braids, and wearing cat ears, but the face is the president of Taiwan. And she's smiling. And I mean, it wasn't perfect.
Starting point is 00:33:48 You can kind of see flaws when the president's face is blinking, but it's pretty realistic. And although it wasn't porn or something explicit, it was removed almost immediately and the video was traced and the government decided, now it was time for a crackdown. Now. Wow.
Starting point is 00:34:04 A lot of commenters wrote, yeah, that's serious. You're trying to mess with the president. You got a little bit too cocky. You messed with someone you shouldn't have. And when the police finally found the culprit and he was arrested, it turned out he had been the kingpin of the illegal video group. When he was caught, he said to the police,
Starting point is 00:34:22 I'm so glad that you guys came to arrest me. I've been so freaking tired. I did need a break. I've been editing non-stop every day, and finally, in prison, I can rest a little. So who was the culprit? Everyone was expecting a computer tech genius, some unknown figure who had been in their room coding since they were three years old, who didn't talk much, all they did was code, code, code, and they knew this crazy AI software, right? Mm-hmm. No, it was one of them.
Starting point is 00:34:50 It was formerly one of the most famous Taiwanese YouTubers. What? Yes. A Taiwanese YouTuber with millions of followers quit his YouTube career to create an AI deepfake illegal porn ring targeting other YouTubers, influencers, and celebrities. And apparently politicians. You're kidding. I mean, it was such a shock.
Starting point is 00:35:15 So let's talk about it. There is a guy on YouTube named Shaoyu, formerly known as Chu Yuchen. He's a YouTuber in Taiwan with millions of subscribers. He mainly focuses on extreme, extreme pranks and challenges. Like I'm not talking couples cooking challenges, I'm talking drinking his own urine and his own sperm challenges. Yeah, those types of challenges, and it obviously paid off. The guy had like 1.4 million subscribers and amassed more than 567 million views, but let me take you way
Starting point is 00:35:45 back before this guy made it on YouTube. He was just a normal dude. Shaoyu is the type of guy that felt like he was born to be a star. Even though he was incredibly normal, he was like, yeah, I'm going to be a freaking star for sure. It's just that he really was somebody who craved money, fame, and attention. Like kind of your stereotypical clout chaser. Will do anything for clout.
Starting point is 00:36:08 Will put out any video, even if it pisses off their own community, and will keep posting videos as long as it's bringing in views. Like, he was one of those people where you're like, I don't know if this guy has morals. Like, is this a show that he's putting on? Like, what's going on? He definitely was the type to choose fame over everything. But he needed to make a living.
Starting point is 00:36:28 So he gets into construction work, and he's pursuing his dreams on the side. He tried to join this idol training academy, where at the end of the training you might get recruited into an agency that might make you a star. So he spent all of his free time doing that, but nobody recruited him. So then he spent the rest of his free time uploading videos of himself playing popular
Starting point is 00:36:48 songs on his guitar. So this was back when covers were the shit on YouTube. Do you remember that time when everybody was singing acoustic covers of pop songs, and those were like the days, right? He would post them on YouTube, but they never blew up. So he decides one day, you know what, I'm gonna try something else. August 7th, 2016, he started posting more, quote, YouTube-esque videos. He did some challenges. He unboxed strange products that he found on the Internet.
Starting point is 00:37:16 He said that he was so broke at the time that he didn't even have Wi-Fi in his house. He would go to the Internet cafe to edit and upload his videos. He didn't have a good camera. He didn't have good editing software. He just knew that he had to keep trying. So he poured his soul into uploading three times a week, and it worked. Most of his viewers were of the younger age group, which is alarming considering his content gets very bizarre very quickly. So he mainly had like elementary schoolers and high schoolers watching him. Meanwhile, this guy is taking shots of his own semen, like, it was weird. In two years, he went from living in a tiny apartment to doing really, really well in life, almost too well.
Starting point is 00:37:52 Shaoyu seemed to get lost near the end of his YouTube career, I'm not gonna lie. His content went from somewhat wholesome to him drinking his own semen and flexing all day. He had titles like Buying a Super Expensive Villa. He even bathed in a million yuan in a bathtub. It was just a really obscene display of wealth. He filled an entire bathtub with cash and bathed in it, which doesn't really make for thrilling content, but that's what he did.
Starting point is 00:38:13 He even bragged about how much money he made. Straight up, he disclosed his income, but not in like a, hey, this is my way of teaching you finance stuff or how I did it, but just straight up, like, guys, look at how much money I make. And that attracted the attention of the Taiwanese IRS. Oh yeah, this was a whole scandal. So he was bragging about how much money he's making, and the IRS was like, oh yeah,
Starting point is 00:38:40 looks like we should look into it. And it made things really, really bad, because they did not just investigate Shaoyu, they investigated a ton of the other top YouTubers in Taiwan. So they were all pissed. They were like, fuck you, Shaoyu, okay? Yeah, but he did not learn his lesson from that. The guy kept going. He did a video which was titled something along the lines of Unboxing a Murder Weapon. And he showed off what he called an assassination knife that was used to assassinate someone.
Starting point is 00:39:10 This was so controversial that he was investigated by the local police department, until they quickly realized that this guy just has no morals, but it was not a murder weapon. Again, nothing happened to him. If anything, he got more attention from the investigation, and he just kept going. In 2018, he unboxed an aphrodisiac, which is something that you would eat or consume to make you more sexually potent or more sexually excited, right?
Starting point is 00:39:36 And he always films his videos with his female assistant, and he's like trying to get his female assistant to take it. And the fact that he's promoting an aphrodisiac is already controversial because his audience is so young, and it was made worse by the fact that he took way more than the dosage that was recommended. And he talked about it.
Starting point is 00:39:54 So in the video, he's all like, I just checked, it's harmless. It's not really an aphrodisiac, it's just for fun. To which the assistant responds, it is an aphrodisiac. Hey, but this small bottle costs like $80. How do you think it's gonna taste? I think you're disgusting, Shaoyu. Then he starts pouring it into a glass. Wait, it just says one drop in the manual. I read it. Yeah, but come on, just a few more drops. We already paid for it. So he went over the dosage, drank it all, and he said, oh wow, it's very sweet. Then a few minutes later, you see him giggling and saying, oh wow, it's getting so hot in here. My body is burning, can you feel it? Yeah,
Starting point is 00:40:29 you're starting to blush, Shaoyu. Then 10 minutes later: oh, my heart feels really hot. 15 minutes later: I have a headache and it's still really, really hot. 25 minutes later: is there a horse in front of me? I feel dizzy. I feel like I'm on a high-speed Ferris wheel. You start seeing his eyes look dazed, and he said, it's so noisy in here. And he finally joked at the end, if you're afraid of being cold in the winter, this is going to make you hot. Or if you think a regular Ferris wheel is too slow, just take this. Basically promoting it to his elementary school audience.
Starting point is 00:41:05 The video was ultimately taken down pretty swiftly. And let's talk about the pee and sperm videos. The title for the sperm video was, How Much Sperm Does Shaoyu Have? Again, he has an elementary school audience. So he bought this male sperm detector, this kit that you essentially ejaculate into, and it will measure the amount of liquid to tell you if you have a normal amount of liquid
Starting point is 00:41:29 that comes out, or a large amount, or a small amount, which honestly doesn't really have that much correlation with your sperm count, but he was trying to be like, oh, my sperm count, right? He leaves the camera rolling to go ejaculate into this sample cup. Oh my gosh. And then he puts it straight up into the camera frame, like, you know, a YouTuber would do, with their hand
Starting point is 00:41:48 behind it as if they're showcasing a product, to get it to focus on camera. Showing what? His sperm inside of this clear tube. Yeah. What? And he said, wait, seriously? I think that my sperm count seems a little low. It's only up to number three.
Starting point is 00:42:03 Well, I guess the standard is 1.5 milliliters according to the instructions. So I guess I'm average. Now, I'm going to squeeze all the liquid into this measurement and wow, look, the liquid is so thick and sticky. You can see his assistant visibly uncomfortable on the side screaming at how disgusting he is.
Starting point is 00:42:21 Then he puts his sperm into a petri dish where he hooks it up to his phone so that he can see a quote microscopic version of his sperm swimming around. And he stared at it and aww. Wow, it's so beautiful. It's disgusting! How can that be beautiful? You don't think it's beautiful? No! Oh my god! How can you watch your own sperm for so long? You're so weird. Because it's my sperm. Let me count how many are in it. It's estimated that there's 50, but from this perspective,
Starting point is 00:42:50 I think my sperm motility is pretty good. But anyway, it's getting pretty boring now. So Shaoyu decided to make the video more fun by testing his spit. So he did the same thing with his spit. And later, at the end, he was like, you know what? Since we're already here, let's eat the sperm. So he had his assistant and himself dip a finger into his sperm
Starting point is 00:43:11 and try it on camera. No. Yeah, really, really disgusting. Yeah. The assistant almost immediately spit it out into a tissue, which again, this is somebody that works for him. So I don't care what kind of friendly relationship that they have. Forcing her to taste his sperm just, I mean, in the video, you can tell that she's against
Starting point is 00:43:34 it and she's uncomfortable, so it was just really questionable activity. To even put this online, to even do this on camera or off camera, it's just so bizarre. She almost immediately spit it out onto a tissue, but Shaoyu smacked his lips carefully as if he had just tasted a fine wine at a Michelin-star restaurant. He was savoring it, he was tasting the more subtle notes of the flavors. And finally he said, it tastes salty, but it's okay, just a little fishy, but it's still quite delicious.
Starting point is 00:44:07 Then in another video, he drank his own urine after filtering it through a super water filter. Yeah, but um, Shaoyu really went far. He even drank from his own toilet bowl. And yeah, it's bad. And if you're thinking, God, how did anyone put up with this, it sounds bizarre, but Shaoyu did try to connect with his audience. He would talk consistently about how his mom had passed away when he was only five, and how his whole life he had been
Starting point is 00:44:27 insecure and had low self-esteem. And that's why he just always wanted approval from others, which is why he wanted to be popular and famous. He talked about how the number of views he gets in videos directly correlates with how he feels about himself. So some people felt like it was this broken guy that was just working hard and, you know, in this toxic cycle of trauma, some people were even cheering him on.
Starting point is 00:44:50 He said, you know, YouTube is hard because it's like a poison. It gives you the taste of fame, and then it squeezes you dry, and then it abandons you. In order to not be abandoned and forgotten, you have to do crazier and crazier things every day. He just wanted more and more fame, so he resorted to doing really questionable things. Now to be fair, he did have some positive videos. He had one video promoting water conservation, and in the video he's even seen giving away food and water to people without homes. But when he goes shopping for the food in the convenience store, at the end, he pours like all the food that he had in his basket. It's still
Starting point is 00:45:28 packaged, don't get me wrong, but he pours it onto the ground and just rolls around in it, seemingly like, I'm so happy about this. But the audience was kind of split by that whole part. It just seemed unnatural and bizarre, like he was trying to hype it up, and some thought it was disrespectful for him to throw the food on the ground and then pass it out to people without homes. I mean, yeah, the food was in packaging, but would you do that if that was your food that you were buying? Would you throw your packaged food onto the ground at H Mart and roll around in it? Probably not. It's like just a subtle lack of respect for people that
Starting point is 00:46:02 don't have homes. It's a, well, they should be grateful I'm bringing them food, type of vibe. Then other people were saying, he's giving food away. What are you doing to help people? Then people would argue, there's tons of ways to give back. You don't have to do it like this.
Starting point is 00:46:16 So regardless of the charity work or not, it seemed like everyone agreed that, even though he was doing something nice, Shaoyu seemed really into showing off his wealth. That was like his main thing in every video. Like, it was less of, I really want to help people, and more of, look at how much food I bought to give away for free.
Starting point is 00:46:33 And then his downfall happened. It wasn't his arrest, interestingly enough. Shaoyu posted a video that offended a ton of people. April 2020, height of lockdowns, or I guess beginning of lockdowns, Shaoyu thought that he would up his usual content, because everybody's stuck at home, everybody wants some crazy videos to watch.
Starting point is 00:46:53 So he went online and ordered a shipment of breast milk to his house. So if you don't know, there are a lot of women who will sell breast milk, not because they're trying to, like, profit off of breast milk, but because there are some moms who are able to produce a lot of breast milk. Then there are some moms who can produce none,
Starting point is 00:47:09 or very little breast milk, or for whatever reason can't, you know, breastfeed their child. And so these moms will sell their breast milk. And a lot of babies can be allergic to things that are in formula. So breast milk is very important. It's one of those natural resources where you're like, holy shit, this is so, so delicate. Like, we need to be very careful with this,
Starting point is 00:47:29 right? Freshly squeezed. The video was titled, I Drank a Mom's Human Milk Bubble Tea Made Out of Breast Milk. Nobody really cared about the title or even the concept of the video, because, you know, a lot of YouTubers had tried breast milk before. There have been a lot of YouTubers whose wives or sisters or people in their family had, yeah. But I've seen people try breast milk, and it's like the tiniest little bit that they take, like not even a shot glass of breast milk, and it's just, I guess it's just intriguing because people are curious, right? But Shaoyu's disrespect for breast milk was the problem. He bought a large supply of it, right, that a baby could have used.
Starting point is 00:48:08 He didn't heat it up like you're supposed to heat up breast milk. And he was just being very disrespectful. I mean, breast milk is not sold to be another bubble tea. It's not sold to be the most delicious thing in the world. It's literally a woman's blood, sweat, and tears. Like, breast milk is not an easy thing. But they were screaming, it smells like dog. It smells like dog.
Starting point is 00:48:30 And then when they drink it, Shaoyu and his assistant, they start screaming, oh, disgusting, I'm getting nauseous. It's like stinky old sweet milk, or like really bad tea. Shaoyu takes another sip and pretends to gag. I swear it smells like dog, it stinks like a little chihuahua, it's so gross. They kept going and they're like, well, let's see if we can make it better. They added chocolate chips and chocolate chip cookies to it.
Starting point is 00:48:55 They tried making a cereal out of breast milk. They even tried mixing the breast milk into bubble tea, and they hated it all. They said the milk was so, so bad it made them uncomfortable. Shaoyu even ended it with, the milk of a mom does not taste very good. And a lot of moms were angry and upset, because breastfeeding takes such a mental, physical, and emotional toll on you. Like you're like, what? Emotional, mental? Okay, so you're in pain. It's not pleasant. Every two hours for an hour, there is another human being that is biting your breast apart.
Starting point is 00:49:27 And then your ducts can get clogged and you have to squeeze it out. It's like so painful to breastfeed. Not only that, but imagine you're at work, you've got to go and pump, because if you don't pump, your breast milk's just going to leak everywhere or it's going to get clogged and it's going to cause you a world of pain. So your whole life turns into breastfeeding for like months. A lot of moms say, you know, the pressure to breastfeed is really high. If you say that you're not breastfeeding, people are going to judge you. Even your husband might be like, you should breastfeed, it's better for the baby. You just gave birth, your hormones are all over the place.
Starting point is 00:49:59 You feel like a straight-up cow. You can't even take any psychiatric medication for anxiety or depression because it'll end up in the breast milk. So you literally give up your mental health, your whole body, to breastfeed after you already gave birth. So yeah, I mean, I can see why they're so mad. It's not even that they made the video.
Starting point is 00:50:18 It was just how disrespectful they were. A mom commented, you know, it's a good topic that you could have talked about, but the content is just so unpleasant and not informative at all. Another commented, breast milk is made from a mother's own blood, but you're just making fun of all of it. So after being dragged to filth,
Starting point is 00:50:37 Shaoyu disappeared for about two months before coming back in a full suit and tie with a serious look on his face, and for the first time on his channel, a video was titled, I kneeled down and apologized to all the mothers in Taiwan. He said, I've been silent for a while, and I've been reflecting on myself for a long time, and now I fully understand what I did wrong. I just want to apologize to all the mothers in Taiwan and all over the world. On my birthday, which is a very odd detail to add.
Starting point is 00:51:03 And then he bowed his head. As a public figure, my words and actions are likely to have hurt some people, even if they are petty. Yes, I was really wrong. The video style does go against my original intention. I sincerely hope that everyone can give me another chance. I wish to continue to bring laughter to people and join hands with everyone to protect the more important things and people in front of us. That's what we should be doing.
Starting point is 00:51:23 I'm very sorry. Then in April of 2021, he announced that he quit his YouTube channel and would be making a living by investing in the stock market. But did he really quit? Or was he just operating different channels on different platforms? It turned out the guy had started dabbling in deepfakes.
Starting point is 00:51:38 Even before he quit his YouTube channel, he was posting deepfake videos of the mayor of his city saying some wild, weird, quote unquote funny stuff. When that video went viral, Shaoyu started a, quote, content entertainment company and started heavily marketing on Telegram and Twitter. He made group chats called Taiwan Internet celebrity face swap, where he made deepfake porn videos of influencers and celebrities.
Starting point is 00:52:02 But at this point, nobody knew it was him? No. Okay, okay. Yeah. So he did post a few where people knew that it was him. So like the mayor face swap? Yeah, he posted with his face on it. And people thought, oh, it's just Shaoyu being Shaoyu.
Starting point is 00:52:17 But there's so many people making deepfake porn. They thought Shaoyu would never do that because, I mean, at the end of the day, he's a YouTuber with very low morals, but he's making money off of pretending to be the mayor. Like, it felt like very YouTube content. Does that make sense? Like the YouTube deepfake world. And then this felt like a different world,
Starting point is 00:52:35 of like the underground, seedy, illegal porn world. So nobody imagined it was him. Okay. And by October 2020, the videos were getting a ton of traction. And, well, you know the rest of the story. So finally, October 18th of 2021, very recent, Shaoyu was arrested.
Starting point is 00:52:52 And again, he told the police, I wasn't even having fun doing this anymore. The pressure to keep working was so high, like I had no personal life, all I did every day was sit there and edit. That's a lot. It's like, what, are we supposed to feel bad for you? Meanwhile, the victims are still
Starting point is 00:53:07 reeling from the reality of what happened. One YouTuber said, one day my mom called me and told me very nervously that she had seen my video, my naked video. And my dad was so angry he couldn't sleep all night, thinking that his daughter had become someone that people would masturbate to. Once they realized it was a deepfake,
Starting point is 00:53:24 they urged me to quit my job. And I said, no, I worked so hard, like I built all of this up myself, what do you mean you want me to quit? Another YouTuber said, I just hate him for what he did. This whole incident ripped me apart. I don't even know how to express myself, it's so painful, there are literally no words to describe it, to explain it. Because of this, brands stopped working with me, I lost tens of thousands of dollars,
Starting point is 00:53:48 my income was cut, I mean, it was really hard. And even after these statements were made public, people were commenting horrible things like, well, you should actually thank Shaoyu, because without him, who would even know your name now? Shaoyu was sentenced to five years and six months in prison, and he was fined more than $400,000. He tried to apologize and he said, I confess to my crime, and I sincerely apologize to all the victims out there. Thank you. But what now, are deepfakes over? Hardly. I don't think we're ever going to live in a world without deepfakes.
Starting point is 00:54:19 And ultimately, it's being used as another tool to objectify people's bodies and victimize people. There's more software targeting women. There's an app that was released called DeepNude. It's AI-powered software where you upload someone's picture. They can be in a bikini, they can be fully clothed, and it will render an image that shows them nude. Now, it's not real.
Starting point is 00:54:41 They use the skin tone, they use the body proportions to show you what's supposedly underneath. And it's said that the fake nude images can be passably realistic, especially if you use like a bikini picture of someone. It can look real. It can look like a nude photo. Like even the angles of the breast beneath the clothing, the nipples, the shadows, everything. The creator of the app said, first, the software locates the clothes, masks them, speculates the anatomical positions, and renders it.
Starting point is 00:55:08 And all this processing is pretty slow, maybe 30 seconds on a normal computer, but it can be improved and accelerated in the future. In comparison, deepfake videos take hours or days to render a believable, face-swapped porn video, but with this bikini picture, even if it's not a bikini picture, you can get someone's, quote, fake nudes in 30 seconds. But here's the crazy thing. It won't work with pictures of men. If you insert a picture of a man, the private parts will still look like a woman's.
Starting point is 00:55:35 I mean, just think about that. It could take a few seconds for a coworker to try and see what you look like nude. And I get it. It's not real. But that's so objectifying, and that's terrifying. It's just another avenue for primarily women to be mistreated and victimized. And once the technology gets better, imagine all it takes is three seconds for someone to circulate your, quote, nudes at work or in a friend group or at school.
Starting point is 00:55:58 The terrifying part is, for this to happen to someone in a place like the United States, it would be traumatic. It could possibly result in the loss of someone's entire income, professionally, socially, and potentially their life, because it could lead someone to take their own life. We've seen that before. But imagine if this happened to women in countries where women can be killed for showing a little bit too much hair or skin. They do nothing wrong, and yet it is a death sentence. Imagine the blackmail
Starting point is 00:56:26 power, the extortion, that a photo like this, a fake photo like this could be an automatic death sentence to women in other parts of the world. And it's not even real. Katelyn Bowden, the founder and CEO of a revenge porn activism group, said, now anyone can find themselves a victim of revenge porn without ever having taken a nude photo. This tech should not be available to the public. A law school professor said, this is an invasion of sexual privacy. Yes, it's not your vagina, but others will think that they are seeing you naked.
Starting point is 00:56:58 And as a deepfake victim said to me, it felt like thousands of people saw me naked and it felt like my body wasn't mine anymore. Now, thank God, the creator of the app got so much hate that he was forced to take it down, but I highly doubt that this will be the last that we see of apps like this. Even the original app developer said, technology is technology. If someone wants to do something bad, it doesn't make any difference whether they use my app to do it or not. Even if it's not me, someone else could make the app within a year.
Starting point is 00:57:24 What kind of logic is that? Yeah, it's just like, well, bad things are going to happen, so I might as well make an app before someone else does. Yeah, I don't know what kind of logic that is. And he said, you can ban software, but technology can never be banned. Then a scandal hit in China with an app that allowed men to provide a picture of a woman, right? And the software would scan that woman's face to look for nudes and videos on porn sites or even private social media platforms and like forums. The app developer said this, and it's so gross.
Starting point is 00:57:59 My intention of creating this app was to prevent men from being cheated by, you know, a specific type of woman. Yeah. People who have posted nudes, I guess, or sent a nude. The backlash was swift. Okay, most netizens believed it's a disregard for privacy, but it's also another tool to threaten and violate a woman's privacy. The argument was, what if these girls are already victims? What if they're not putting out their porn videos because they want to?
Starting point is 00:58:24 What if they were a victim of revenge porn or someone leaked something? Or what if they didn't even consent to being recorded or deepfaked, and now, whatever relationship she's in, you're saying that you need to protect the men? What are you saying? The app developers tried to argue that women who were secretly recorded or found that someone had released revenge porn could file a complaint to the app so that they would not show that result. Which, all I have to say to that is, fuck you, who the fuck do you think you are to take down illegal videos, you shouldn't have been showing them in the first place. And the worst part is, you could use an app like DeepNude to develop a photo of someone seemingly nude, post it somewhere, and now this stupid ass app will tell your scumbag boyfriend that you've got a nude out there.
Starting point is 00:59:08 But there's more. There was a new piece of software that hit the market. A VR deepfake software that lets you create a deepfake of a celebrity, but you can interact with it with a VR headset. It's terrifying. So you can move around a virtual naked body that looks exactly like a celebrity. And the celebrity will follow you with their eyes while you walk around them. And on the side is a long drop-down menu of sex positions that you can put the non-consenting 3D model into.
Starting point is 00:59:35 Some of them include, but are not limited to: a backward bend, squatting, touching herself, several versions of kneeling, on her back holding her legs over her head. A Vice reporter who saw a video created with this VR software said, I feel like this is one of the creepiest deepfake videos I've seen. It looked like you had the actresses trapped inside paralyzed bodies, where all they could do was move their eyes and be put into positions that whoever was controlling it put them into. But another user wrote about the app,
Starting point is 01:00:05 I think it's pretty damn interesting and it's gonna sell like hotcakes if someone ever decided to market it. Sorry, what? Yeah, some of these people using this software, they believe their own shit. One creator of non-consensual deepfake porn said, I don't see the problem with deepfakes
Starting point is 01:00:19 because as the name suggests, they're fake. They're not real, so there are no real victims. Another one said, if there was like a worldwide ban on deepfakes, I guess I would stop making them, but that won't happen. We have more pressing issues on our hands than this, because it's just a deepfake. People get manipulated every day.
Starting point is 01:00:37 We do this because we like these celebrities, and most, if not all, men fantasize about sleeping with celebrities and having sex with them. So we make these videos. A victims' rights lawyer and activist, Carrie Goldberg, said, at heart, this is a product that gamifies the violation of sexual consent. Anyone depicted is going to feel violated.
Starting point is 01:00:55 Anybody who says the internet isn't real life or virtual reality is fake is just constructing excuses for doing bad shit. There is no question that building a bot to rape in VR delivers a different type of injury to the depicted person than actually going and attacking her. Yes, it's going to be a different type of trauma. However, two dissimilar things can be wrong and unethical at once. Because a lot of people argue, it's not like I'm raping her in real life. Yes, she might not feel the same trauma as a rape victim, but she's gonna be traumatized.
Starting point is 01:01:27 There are forms out there of people learning to deep-fake celebrities and to porn videos, and they ask each other questions that are so, I mean, it's chilling, the way that they talk about creating another victim in this world, and the way that they talk about women being victimized in this manner. Hey, guys, let's say you have some frames
Starting point is 01:01:44 where the porn star has a dick in front of her face. Do you mask the whole face cutting through the dick or do you mask just the dick? Just technical questions about how to put a celebrity's unconcending face onto a porn star. Yeah, who cares about people? It's about the specs of their violation, you know? In another vice piece, which by the way, vice actually did a lot to try and draw attention
Starting point is 01:02:04 to the overwhelmingly female victims of deepfakes. And has tried to argue that the biggest threat to deepfake technology right now is not a politician being deepfaked, but it's women being deepfaked non-consensually in pornographic ways. They interviewed a deepfake creator who charges people to create deepfaked porn videos. And the creator said, I make the deep fakes without question, as long as both the source and the target, he said target.
Starting point is 01:02:29 I mean, that's gotta imply something, no. As long as both the source and the target are clearly and obviously 18 and up. So do you ask and confirm that they're 18 and up? Do you ask for legal proof? No. But I have morals, you know. He claimed that he wouldn't deep fake people into a gas chamber or stuff like that.
Starting point is 01:02:48 When asked about how he would expect people to react if they discovered non-consensual deep fake porn of themselves, he said, I think guys would laugh or take it as a compliment and girls would probably freak out in screen rape. I think that last sentence tells us all we need to know about this guy, no? And the worst part is, most states don't even have laws where you can go after someone for creating deepfakes of you in a pornographic manner. It's said it's because deepfakes defy most state-revenge porn laws because it's not the
Starting point is 01:03:16 victim's own nudity depicted. And on top of that, most of the deepfakes nowadays don't include celebrities' names next to it, because technically, you can't be legally sought after for creating a deepfake, but if you use a celebrity's face and likeness and name to gain profit, then you might be able to go after them. What makes it even wilder is that if you see all the laws being put in place for deepfakes, a vast amount of them are in place to protect politicians, and billionaires, and massive business owners
Starting point is 01:03:48 against slander and the spread of misinformation, which don't get me wrong. Misinformation collectively has been the bane of the US's existence recently, but deepfakes at the end of the day are primarily used to victimize women. But sure, let's protect the politicians first. What are your thoughts?
Starting point is 01:04:06 Do you think it's just gonna get worse and worse? There is even a whole community on this forum, which I forgot to mention, but they have this rule that you can't deepfake a celebrity that just turned 18, because then the police are gonna come after them, because all the pictures and videos that the AI would source from are from when they're minors. So they have to be well into their 20s, so you can argue that most of the imagery that was used was from when they were not minors.
Starting point is 01:04:32 Is that not disgusting? Wow, wow, wow. So it's not because they don't want to victimize a freshly 18-year-old. Welcome to the adult world, let's give you some trauma. No, it's because they don't want to get shut down for depicting minors in porn. What are your thoughts on all of this? Let me know. Please stay safe out there.
Starting point is 01:04:50 Enjoy your Halloween, but stay safe. And I will see you guys on Wednesday for the main episode. Bye!
