Rotten Mango - #386: New Nth Room: Middle Schoolers Deepfake Videos Of Mom, Sister, Classmates In "Humiliation Room"

Episode Date: August 29, 2024

South Korea is going through an erase-all-faces panic. Every single selfie is being wiped from the internet just in case. In response, a male college student wrote on a forum: “Put a finger down and ask yourself: Have I ever been asked out or confessed to? (Joke confessions do not count.) Do a lot of men try to talk to me even though I don’t talk to them first? Have I ever looked into the mirror and resented my parents? Can other people recognize me without makeup? Do I weigh more than 145 pounds? Do I frequent feminist online forums? When I walk on the street, do children avoid me? If you answered yes to more than 3 of these questions - don’t worry. Your rate of being a victim of sexual deepfakes is very low. Men have standards too. We don’t just deepfake any woman.” This is the case of the “new Nth Rooms,” where over 400 schools have been found to be operating illegal chat rooms in which students are getting deepfaked into sexually explicit photos and videos. A whole network of deepfake trading and creation has been found. Most victims and perpetrators? Teenagers. Full Source Notes: rottenmangopodcast.com

Transcript
Starting point is 00:00:00 ...to wager, Ontario only. Gambling problem? Call ConnexOntario at 1-866-531-2600. BetMGM operates pursuant to an operating agreement with iGaming Ontario. Bada bing, bada boom. Paris, France. August 24th of this year, literally last Saturday, a man named Pavel steps off his private jet at the airport right outside of Paris. He's mainly flying into Paris for a dinner. He's there with maybe a business partner, maybe a relative, maybe a dinner date.
Starting point is 00:00:53 We don't know because he's very private, but he's there with a woman and his bodyguard. And right as he steps off, almost as if they're waiting for him, as he steps out of the airport, Paris authorities rush in and they detain this man. Within hours, if not minutes, honestly, his detainment is going to be news all around the world. Every major news publication is talking about the detainment of Pavel Durov. Pavel Durov is being investigated for the circulation of child abuse images, illegal drug peddling, and the refusal to cooperate with authorities. So why do people care? Sometimes it feels like this type of news happens every day. Why is this making international headlines?
Starting point is 00:01:36 Because Pavel Durov is worth $15.5 billion and Pavel is the founder of Telegram. Meanwhile, approximately 5,500 miles away in Seoul, South Korea, an erase-all-faces panic has ensued. Middle schoolers, high schoolers, college students, teachers, military soldiers, professionals, they're busy trying to wipe their faces from the internet. Not specific photos, literally any single selfie off of social media, no matter how innocent it is, they're going dark. They're deleting it all. Even local universities sent out these mass emails, warnings to their students. The emergency notification urged students to please remove or avoid posting photos of themselves online, no matter how innocent.
Starting point is 00:02:24 Photo deleted, photo deleted, photo can't be found, account can't be found, everybody is just deleting their accounts. One high schooler said, most of the girls in my whole class took all of their photos off of social media. In response, a male university student wanted to help calm down the chaos. He's like, you don't need to be that scared. He posted on the university forum for all the students, because he noticed all of the people are just taking down all their photos. He said, read this first. Put a finger down
Starting point is 00:02:53 and ask yourself. If you say yes, put a finger down. 1. Have I ever been asked out or confessed to? Joke confessions do not count. 2. Do a lot of men try to talk to me even though I haven't talked to them first? 3. Have I ever looked in the mirror and resented my parents? 4. Can other people recognize me without makeup on? 5. Do I weigh more than 145 pounds? 6. Do I frequent feminist online forums? 7. When I walk on the street, do children avoid me? If you said yes to three or more of the above questions, be relieved. You don't need to delete your photos off the internet. Your rate of being a deepfake victim is very, very low. We men have standards too. We don't just deepfake any woman. Why do women on
Starting point is 00:03:45 Instagram and social media sites throw a fit as if they're victims of every situation? If you've answered yes to more than three questions, quit making such a fuss and commotion. But whatever, go ahead, hurry and delete your Instagram pictures and mock us men. That's crazy. Is his name on there? No. Wow. Some girls said they were messaged by male classmates. Hey, you're safe. What do you mean? I told you you're safe. They only go after the pretty ones. Don't worry about it.
Starting point is 00:04:16 Because the breaking news has been that while the Telegram founder has been arrested in Paris, South Korea is going through another Nth Room case, where users are flocking to Telegram to share non-consensual, sexually explicit material of victims. We have two episodes on the first Nth Room cases in South Korea, where women and girls were being blackmailed into providing, quote, content for sick men who are willing to pay to watch these girls get blackmailed and tortured. But now the girls don't even know they're victims until it's too late. Some of them were even as young as
Starting point is 00:04:52 middle schoolers, elementary schoolers, and it's very different this time because it's all deepfaked. We would like to thank today's sponsors who have made it possible for Rotten Mango to support the Korea Sexual Violence Relief Center. They have assisted and supported sexual violence survivors through various methods, including providing more than 87,000 one-on-one counseling sessions and fighting for the revision of laws. This episode's partnerships have also made it possible to support Rotten Mango's growing team, and we'd also like to thank you guys, our listeners, for your continued support as we work on our mission to be worthy advocates. As always, full show notes are available at rottenmangopodcast.com. A few disclaimers for today's case. It involves identity theft, non-consensual filming, rendering of explicit material, CP, and the essay and exploitation of students. This case is currently ongoing, so we're gonna try to keep you guys updated. If there are any massive updates, we're going to leave pinned comments, but the case itself will be current
Starting point is 00:06:09 through the publication date of this episode. If you feel like today's case is a little bit too intense, please take a break, take a bath, and take a breather. We will see you in the next one. Now with that being said, let's get into it. A set of Telegram rooms was recently exposed in South Korea. Each room is named after a middle school, high school, or college in South Korea. The way to get into these Telegram chat rooms is you submit a set of photos of any girl or woman. It has to be someone that you know personally.
Starting point is 00:06:38 So think classmates, siblings, friends. Then you have to provide personal details of that person in the picture. Name, date of birth, occupation, school, grade, age, Instagram handle, phone number, and if possible, a full-blown home address. In addition to that, you need to provide 10 pictures of them. So usually the pictures are a collection of that person's Instagram, Naver, Facebook, Kakao, social media posts, as well as pictures you might have taken in person of them
Starting point is 00:07:08 because these are people they're supposed to know. To get entry into these telegram chat rooms, you have to send in pictures of your acquaintances, friends, or family. Members of the group chat will then make sexual deep fake videos using those pictures. That's why it requires 10 because the more pictures you provide, the better the videos. Netizens have made a list of all the schools that are being impacted, and there are dozens of middle schools and high schools, and even some of the top Ivy League universities in South Korea listed on there.
Starting point is 00:07:37 Middle schools. Yes, but the youngest ones that netizens have found to be victim are elementary schoolers. What? Yes, but the youngest ones that netizens have found to be victim are elementary schoolers. What? Yeah. There's even a picture that includes the names of 66 different victims, and next to their name they list their school, and it's almost like an inventory count. Someone in the chat room even responds, if you send me the names and the pics of the bitches who are not listed above, I'll make more humiliating pics and videos of them. If I'm horny enough, I'll even go and find them in their schools and stalk them. If you don't go to any of the schools above, don't message me personally. If you can't
Starting point is 00:08:12 message me personally, just press a heart. What does that mean if you don't go to those schools? Because you have to be acquaintances of other victims of that school. So they're saying if you're just a random person, don't be messaging me. Like, you gotta go to the schools. You've got to be a student so that you can send in pictures of your acquaintances that are not already listed above. He's saying, I have 66 victims. I want more from these schools, but if you don't go to these schools, don't even bother. Oh, so he doesn't want other schools. No, these schools. Yes. Now in one Telegram channel, there are more than 70 individual chat rooms you can go into. Each chat room is, like I said, another university, another school.
Starting point is 00:08:53 So they have over 70 schools that you can, quote, visit through the chat rooms. And that's just that one particular Telegram channel. In total, it's slowly been uncovered that over 400 schools have been affected, which is about 7% of the entire nation's middle schools, high schools, and colleges put together. That's an insane percentage to be affected by this, and that's just what we know so far. I do want to note that these figures are gathered by netizens trying to follow this case, and not by the South Korean authorities. It has not been confirmed yet by them. But if you remember the way that the authorities handled the original Nth Room case, I don't know if you want to go by the authorities on this one.
Starting point is 00:09:33 I mean, if they found 7%, that only means there's more. Yeah. It could be a lot more. A lot more. It's called the deepfake damage school map. There seems to be a strong implication that each school was rather well organized into these chat rooms. So it's by school name, chat room, and then all the deepfake videos of the students of that said school are in that chat room.
Starting point is 00:09:59 Meaning, some people will specifically go to certain chat rooms to try and find very specific people. They're not just browsing through. They go to these schools, they're parents of kids at these schools, they're teachers at these schools, who knows? They're going into these chat rooms specifically looking for these pictures and videos. If a male peer at university goes into that chat room to specifically look for one girl's photos that have been deepfaked, think about how dangerous he is in person. Once you
Starting point is 00:10:28 get into those chat rooms and you now have possession of these videos, people say you can now decide if you want to blackmail someone. So you find videos of that said person, that specific person, and you can try to blackmail them because like I said, every single person's school, name, location, age, everything is listed. They're not just like, hey, look at this girl from my school. They list their full names. For universities, they list their student ID number.
Starting point is 00:10:56 They hunt them down and show them the videos, asking, is this you? Now, the potential here being to try and blackmail them into doing more things, or just getting a sick thrill out of them being panicked and scared and fearful, this is clearly done so often that there's a name for it. It's called jineunyok. It's not a super commonly used term from what I can find, but it's becoming more frequently used.
Starting point is 00:11:20 To put it simply, the word means to create sexual photos of acquaintances. Usually it's done by pulling pictures off a social media account and deepfaking their faces onto very degrading, humiliating sexual acts; it's to instigate other men to sexually harass that woman, or to spray biological fluid onto the woman's videos and pictures. It's oddly specific. It's suspected that these human trash people get the first thrill, the first kick, from seeing the pictures and videos that are deepfaked of their classmates and coworkers. Likely these are people that have been rejected by them or have zero chance with them, or maybe they're people that they're too scared to even talk to.
Starting point is 00:12:03 They will get off on knowing that they're doing this to them. Then the second rush hits. When they see other men degrading the victims in the chat rooms. Perhaps a third rush comes from either showing other people that know the victim, Hey, look at our classmate. Hey, look at our boss. And this thrill of feeling superior to the victim, like you have something on them, something that could ruin their whole life
Starting point is 00:12:25 You are the one that holds the power, or maybe you even show it to the victim to scare her. And the fourth rush comes each and every time they see the victim. They may feel an immense sense of power just coursing through them. And who knows how long these feelings will last. That's why it has to be acquaintances. Strangers are no good. They need to know these people and feel that superiority, like they have the power to control their fate. On top of that, they probably get another kick out of it when the victim finds out what's happened and realizes that her classmates are doing this to her and her classmates have seen it. Clearly the trauma and impact is going to be very deep. A lot of the victims report they have no idea who to even trust now. Their worldview is completely
Starting point is 00:13:09 shattered after this, and with every interaction that they've had, they're so paranoid. They even start feeling disgusted because they feel like they need to act nicer to all the guys in the school, because they hope that if they're nice to them, they're not going to victimize them. Wow. As for the perpetrators, it also seems like they have no shame; they're not hiding what they're doing. The types of pictures are easily viewable. They just straight up use hashtags. Hashtag acquaintance, hashtag kids, hashtag high school, hashtag deepfakes, hashtag slave,
Starting point is 00:13:40 hashtag free. On Telegram? Yes. On social media platforms. I think they're blurred on other social media platforms that don't allow explicit content, but there's usually ways to get onto the Telegram chat rooms for more. The way they talk about women and honestly literal children in the threads is, well you decide the word.
Starting point is 00:14:03 Someone comments, is there anyone who can help with humiliating (redacted victim's full name)? Anyone willing to humiliate acquaintances with me? Is there anyone who can share their video editing sources? It's difficult for me to collect so many body photos. Is there anyone who wants to exchange deepfake photos of acquaintances? I will provide the best ones. Perpetrators have started threads that read, I'm getting some information about acquaintances, sex slaves, bitches, and friends, and it's been a while since someone submitted a report.
Starting point is 00:14:33 If you want more pictures, just leave a comment. There are some rooms just titled Kim's Sexual Bathroom, so you can kind of take a guess on the theme of that Telegram room. A journalist went undercover into one of the deepfake Nth Room chat rooms, and they said some of these chat rooms are just run by bots. Right when you enter into the chat room, it immediately sends this automated message.
Starting point is 00:14:55 Hello, I am Magic Photo Bot. Let's try to send the photo of the woman you like, right now. You get one free photo to Deepfake, but by the second one, you need to pay one diamond. One diamond per photo is the structure here. The journalists create a fictional picture of a woman that they make through AI. Then within 5 seconds of uploading that into the magic bot, the chatroom bot sends an illegal video with that person's deepfaked photo on there. They said you could even change and adjust some of the
Starting point is 00:15:25 body parts. Let's say you want a fuller chest. You could micro adjust little things like that. This particular chat room alone had 227,000 people on there. Then another message reads, for additional photos after the first free one you can purchase diamonds in units of 10 diamonds. 10 diamonds is $5. To put it simply, for $5 you can get 10 pictures turned into deepfakes. For $5. Wait, there's 220,000 people, users, active. Yes.
Starting point is 00:15:59 You do have to purchase the diamonds through crypto, which means I truly don't know what the conversion rate would be in a chat room like this. It's not a normal situation of going onto a website and clicking checkout now for your cart so I don't even know how to do the conversion rate. A few things to consider. Usually in these types of environments people are more prone to spend a bit of money to satisfy their perverted fantasies and the entry into buying the product is $5. So it's even easier to buy in, if you will. Which means if you have a conversion rate of 10 to 50%,
Starting point is 00:16:32 that means this bot-run chatroom is generating anywhere between $100,000 and maybe $500,000. Maybe even more, depending. One expert from the Sexual Violence Counseling Center said, the fact that there's a profit structure in place means that there's already a lot of demand. This chat room is considered a production chat room. So you go and you submit a photo, you get deepfakes back, but you're not here to talk to other people, which doesn't make things better or worse. You're literally not allowed to have discussions in the chat room.
Starting point is 00:17:03 It's just a magic photobot. But it's important because that means 200,000 plus people are on here for the sole purpose of creating illegal deep fakes. Even if they all just create one free one, which is highly doubtful, what's the next phase? The next phase is the distribution phase. The damage is expected to be huge because there's so many chat rooms available for them to go to to exchange these deepfakes, to post these deepfakes even if they just go to group chats made with their classmates. The damage is immense.
Starting point is 00:17:36 Yeah, so if there's 220,000 people, if they only each make one, that's 220,000 victims. And if they make more, they make five, they make 10, you're talking about millions of victims. Yes, and there are questions of whether there could be foreigners in this chat, so we don't know if this is specific to South Korea, but I will say the whole chat room is in Korean. So I mean, the odds of foreigners being in this chat room,
Starting point is 00:18:04 this is not me saying that this doesn't exist in other languages. I'm pretty certain it does. But for this specific one, for these 227,000 people to be exact, it seems like a good chunk are South Korean users. Someone had commented that there are approximately 260,000 taxis in South Korea. They wrote, that means the likelihood of encountering a sex criminal in Korea is almost as high as finding a cab on the streets. Pretty much everyone that you can think of is getting affected in Korea. Like I said, anywhere from elementary schoolers to teachers and soldiers, even Olympians,
Starting point is 00:18:41 the athletes are getting deepfaked. Nurses from hospitals are getting sexualized and deepfaked. There are reports that some men are even sending in pictures of their own mother, their mom, to get deepfaked. Oh my gosh. How did this, like, I mean, these rooms have been around for a while? Some of these deepfaking rooms have been around since the original Nth Room case. A lot of them were uncovered in the process of uncovering, like, GodGod and Paksa's chat rooms,
Starting point is 00:19:10 but people didn't take it as seriously because they were deepfaked. Whereas there were real... I don't want to say real, because these are real people, but... Does that make sense? I think it was kind of put on the back burner. And then in all of the commotion about the Nth Room chat rooms, this wasn't included. So now it's getting brought up again because it's just gotten even worse. That was in 2020, when people first started reporting a little bit here and there on it. So it's only gotten worse four years down the line.
Starting point is 00:19:40 The technology has only gotten faster, quicker, cheaper. And so the amount of victims is getting crazier. So why is it, like, blowing up right now? Top universities were found to have operating chat rooms that have been active for three years, of some of the female college students getting deepfaked. Okay, so just one person exposed it, or was it multiple high-profile colleges? So think Harvard and Yale, they had their own chat rooms. They had their own deepfakes being made. Victims came forward. They kept urging the police to do something, and I guess because their voices might hold more weight, considering how society works, then the police got involved, and they realized that there's actually a lot of middle school reports that have been coming in of middle schoolers deepfaking their classmates. And it's become this big snowball, like every single day, every single hour, there's more developments. I
Starting point is 00:20:33 see, it just starts unraveling. This show is sponsored by BetterHelp. When I spend time with my nieces, Sophie and Mia, I am always amazed by how curious they are about the world. I know that we all dread the endless why, why, why from kids, but it's also so fascinating to see how curious they are, and it made me realize that I've lost a lot of that curiosity because I'm just always a little bit stressed about anxiety, and what if I sound weird, and what if I am doing this, and I've pushed a lot of hobbies off to the side. My therapist has made me realize that I need to take more time to take care of myself to help grow that inner child
Starting point is 00:21:15 and open up that curiosity again. So with my therapist, I've been trying to think of all the things I wanted to do as a kid, and we essentially made this mini bucket list for myself of things that I want to try and learn: painting, pottery, finally beating my husband at badminton. And making the list with my therapist has made me feel very excited. I feel like I've reconnected with this sense of wonder that I used to have as a kid, and I just always wanted to learn new things. It sounds really cliche, but therapy is very helpful for everyone. Therapy can be useful for big and small things in your life. If you need help with setting healthy work boundaries
Starting point is 00:21:47 or even just wanting to communicate with your partner, therapy can help with that. If you're thinking of starting therapy, give BetterHelp a try. The best thing about it is that it's entirely online, it's designed to be convenient, flexible, and suited to your schedule. Just fill out a brief questionnaire to get matched with a licensed therapist. If you feel like your therapist isn't the right fit for you, no worries because you can switch your therapist at any time for no additional charge.
Starting point is 00:22:09 Rediscover your curiosity with BetterHelp. Visit BetterHelp.com slash rotten today to get 10% off your first month. That's BetterHelp.com slash rotten. Now at Joe Fresh, get 25% off all children's apparel, only until Monday, September 2nd. Shop smart with one cart and check everything off your back-to-school list all in one place. Now that's some smart shopping. Conditions and exclusions apply. See in store or joefresh.com for details. Sarah, who is a fake name, she studies at Inha University. So this is one of the two schools I was telling you about.
Starting point is 00:22:48 It's one of the top 20 universities in the entire nation. They're known to have one of the best physics programs in the entire country, but that's not the problem right now. She gets a random DM on Instagram from this account that has no followers, following nobody, it's set on private. She clicks open the message. Photos of your face and personal information are being shared in a Telegram chat room.
Starting point is 00:23:10 I'm going to send them to your friends, family. I'm going to drug you, abduct you, and essay you. The anonymous Instagram account even tells her which department she studies in. It seems to know everything about her while she knows nothing about them and nothing about this Telegram chat room, which means she's now dealing with figuring out what the hell is even happening.
Starting point is 00:23:30 She's never really even been familiar with Telegram, and now she's anxious every day, even going to class to keep up with her schedule. She tries to go to the police. They tell her Telegram is hard to track. There's nothing we can do. She starts doing her own investigation. She finds out many of them, the chat room members, they would even create audio files of a bunch of her classmates, female classmates, girls that go to her college,
Starting point is 00:23:55 saying sexually provocative phrases created by AI, saying things of how they all wanna be slaves, they wanna be enslaved by an owner, they're looking for owners, and they love their owners. When the case finally gets the attention of the local police force, they realize that over 1,200 people inside the chat rooms were active. So 1,200 people are active in this Inha University Telegram chat room, sharing pictures, sharing deepfakes of current female students. So that means 1,200 people are connected to the school, right? Yes. Wow. They also took it to the
Starting point is 00:24:33 next level by circulating private information for each victim, which major, which classes they're in, sometimes even class schedules, their phone numbers, even their student ID numbers. This Telegram chat room targeting this specific university and targeting the female students had been operating and active for the past three years. They also discovered chat rooms of nearby high schools and even middle schools. Some text messages in those chat rooms just read, literal victim's name. Obviously I'm redacting it.
Starting point is 00:25:06 High school, born in 2007, 17 years old. Does anyone know? Give me a personal DM if you know her. So now people are submitting. They see someone they like, whether it's on social media or maybe they've heard it's like a friend of a friend of a friend, and they want deep fakes of them, but they don't have access to their photos. Maybe their Instagram is private. Maybe they don't know them close enough to take pictures of them in
Starting point is 00:25:28 real life, so they're asking this chat room, do you guys know her because I need pictures of her to create deep fakes. One 17 year old high schooler received an anonymous message once, just like the college student. She said, I looked at my phone after school when I saw that I had this message from this unidentified account. The message just read, do your friends and parents know about this side of your life? It had three photos, sexually explicit deep fakes of her, but she said it was so good you couldn't even tell. I mean, the only reason she knew they were fake is she never took those photos. She said, I was so shocked and terrified when I first saw the photos,
Starting point is 00:26:05 even though I knew they were fake, I knew they were generated images of me. They seemed so real. I wanted to believe it was a whole bad dream that I would eventually wake up from, but I didn't. She responds back, who are you? That's not me. Where did you get those? The anonymous account messages back, what if I send this photo to your parents? Tell them that this is what you're doing instead of studying. I think you need to be taught a little lesson
Starting point is 00:26:30 They then start trying to call her phone. She of course is not picking up, but they would write, why aren't you answering? I'm bored. Relieve me of my boredom. Entertain me. I just want to hear your voice, lol. If they're calling, that also means they could track them, right? It was calling through Instagram DMs. Like, I think you can call, yes. The students themselves would end up investigating, and they found out that most of the time, it's not creepy strangers on the internet that they were expecting.
Starting point is 00:27:01 It's their friend. It's their boyfriend. It's some random guy in the same class as them or two grades below them targeting all the female students at the school. Wait, they find out? They join the chat rooms. There's one case of a middle school boy
Starting point is 00:27:16 that deepfaked 12 of his female classmates and two teachers. Wait, middle schooler? Middle schooler. Like how old is that? 13, 14. There are- There are no consequences for a middle schooler, right? There aren't even consequences for adults, really. Oh, we're gonna get there. Yeah. There are further reports that some elementary school girls had been targeted as well by middle schoolers. Even teachers are getting targeted with risks of losing
Starting point is 00:27:44 their job because in Korea and in certain schools, first of all who knows if the teacher even feels comfortable working there knowing all the students likely watched or maybe even created these types of illegal videos. But even if she needs to stay in that position for her livelihood, some schools regardless of the fact that she's a victim, they might think she's too much of a hassle to keep around. The kids can't focus on their studies because all they can do is snicker about her video, or the parents feel uncomfortable even though they know it's fake.
Starting point is 00:28:14 She might lose her job regardless, so now she's got financial stress in addition to emotional and sexual trauma. The National Teachers Workers Union stated that since this news broke, they were able to confirm that at least 170 teachers were impacted by the deepfake videos. They even found a Telegram chat room that reads, Toilet bowl faces are basically degrading, obscene content because you urinate and defecate into toilet bowls. Which is obviously a very gross sexual reference here. One teacher wrote on a forum, My life as a teacher collapsed. I don't know if I can even trust the children now.
Starting point is 00:28:58 She's a middle school teacher in her 30s and when she- Middle school. She- Crazy. She found out the kids were spreading deepfakes of her around the school, and not only that, they're spreading them through Telegram. She said, when I think of the perpetrator and the other students that were on their side, I felt disappointed and betrayed. As a teacher, I should be
Starting point is 00:29:17 able to embrace all students, but I can't even do that anymore, she says. It's really hard to face the children properly, even, because we don't know who all the perpetrators were, who knew, who are the ones that saw and did nothing. She found out that even her wedding photos were used to deepfake sexual videos of her. An online chat room with 900 participants was also uncovered in which people were exchanging deepfake videos of female soldiers. We don't have screenshots of this specific conversation in the chat room, but it's been reported that someone will post a deepfake video of a female soldier and they will just get bombarded with degrading comments.
Starting point is 00:29:55 They rinse and repeat. They call the female victims munitions. Just a tool for the military. One text in the chat group reads, I can't forgive women anymore. I'm gonna ruin everything. The reason they have such a sense of superiority is because they're wearing military uniforms. But all military uniforms will remain with a sense of humiliation for them from now on. Not a sense of superiority.
Starting point is 00:30:19 Aren't you curious about the commanders? The platoon leaders that you've worked with? All of them? Aren't you curious? Then in the same chat thread, the operator asks for military supplies, aka pictures of female soldiers, but he says, don't just give me pictures. I'm not gonna deepfake them if you just provide pictures. I need names, their ranks, personal phone numbers, Instagram handles, age, photos of them in military uniform, and then daily photos of them.
Starting point is 00:30:47 Outside of uniform. When someone would send in those photos, it would be deepfaked and sent back into the group where they would all quote enjoy it together. The Military Violence Counseling Center has come out to state, Recently, I have confirmed a telegram room that systematically insults female soldiers and spreads deepfakes of them. I can't help but be angry at the wickedness. Discrimination and hatred for women is so rampant in Korean society
Starting point is 00:31:10 that we are discriminating and hating female soldiers in the military that serve and protect our country. The perpetrators have used photos of female soldiers wearing military uniforms to treat them solely as sexual beings. They also said, if you keep this up, and if the government does nothing to respond to this, then they're going to lose their female soldiers, which is the last thing South Korea wants because they're already facing huge challenges in military manpower. Is that what you want? Because these female soldiers, I mean, even joining the military, they're not having a
Starting point is 00:31:41 blast. Yeah. They're already facing discrimination. Look at the way that they talk about even their platoon blast. They're already facing discrimination. Look at the way that they talk about even their platoon leaders. These are ranks above them. So you think they're doing it because they love their job? No, they want to protect the country.
Starting point is 00:31:54 And this is how the country protects them. But with this new Nth Room case, or however they call it, it's not solely about deepfakes. They also found very alarming Telegram chat rooms reminiscent of the former Nth Room case that prosecutors and police vowed to never let happen again. They discovered humiliation chat rooms. There's one chat room with almost 2,000 people in there, with categories that read, acquaintance room, so friend-of-a-friend room, cousin room, older sister room, younger sister room, mom room. When you enter, it is exactly what you imagine. Either guys
Starting point is 00:32:31 posting pictures of their family members to get entry into the room so that other members can deepfake their relatives or they're literally getting off on essaying and harming their family members. There's evidence of guys touching their own sisters and mothers and posting the videos of them into the chat rooms. In one instance, a perpetrator posted a video of himself sneaking into his sister's room. She's asleep. He lifts up her skirt and starts touching her inappropriately. He uploads the video into the chat room with the caption my little sister's thighs are so fucking soft
Starting point is 00:33:07 I failed at making her take sleeping pills today, so I'm gonna try again tomorrow. Someone responds, I will be waiting for it, holding my breath. In another chat room, someone asks another room member, are you going after your mom? No, I gave up. Why? Because my mom's asshole would be fucking nasty. Why her asshole? Do you think you're not gonna like it? I felt like I wanted to fuck her a few months ago, but I don't want to anymore because I got caught with my spy camera. LOL. You were caught while spying on her? Like filming her?
Starting point is 00:33:41 Yeah, when I was filming her secretly while she was taking a shower. Oh, and your phone was caught. Another room member comes in to try and cheer up the others. Well, if you guys ever want to jerk off together, I have a video using my older sister's underwear. Just personally message me. One victim writes about what happened to her and it reads, there was a time when my younger brother, who's just a year younger than me, made me feel so awful I wanted to die. We were both in middle school at the time. We were actually so close as siblings
Starting point is 00:34:09 that people were jealous of us. The first time it happened, I was so shocked. I didn't even know what to do. I just pretended not to notice. I really regret that. One weekend afternoon, I was taking a nap and our parents were home. My brother touched me and even tried to film it
Starting point is 00:34:25 I woke up and I caught him in the act, but I was just so shocked at the time, I couldn't even think straight. Even though he kept profusely apologizing and acting like nothing happened, I felt so embarrassed, like I couldn't even stand looking at him. I just told him to get out, leave. My mind was racing with all sorts of thoughts. I kept wondering, is he gonna post it somewhere? Did he post it somewhere? How many times has this happened? And I just now noticed. Who else did he send this to? Did he sell it somewhere? Does he use it to self-pleasure himself? What does he really think of me? I went into a panic. My hands and feet were shaking and I couldn't even cry or breathe properly. I never thought my brother would do something like this. I mean, especially not my brother.
Starting point is 00:35:04 Apparently OP told her mom about what happened, and her parents and brother went out, so she starts freaking out about what's gonna happen, and she writes, my parents went out for a bit, and after about two hours I thought something was wrong and I sent a message to my mom. I asked her, don't tell my dad, because he really cares about me and I'm just so embarrassed. About an hour later, my brother left the house. When my mom came home and asked about it, she said that dad had called my brother. So my concerns were just completely ignored. LOL.
Starting point is 00:35:30 What does that mean? Just I guess the parents did nothing. In response to a follow up question, because I think it was a little confusing, people were asking about the family dynamics and she says, so my family is usually pretty close and gets along well. We're financially comfortable and have a good time together, but that evening my mom had been drinking and ended up hitting my brother a few times. What really got to me was I heard everything and then my mom said,
Starting point is 00:35:55 you're part of the problem too. Who wears such short pajamas at home? I've never dealt with something like this before, so I didn't even know what to do. My pajamas back then were just these short sleeve, short pants ones that were pretty trendy at the time. She did post a photo of what her pajamas look like. For my audio listeners, they look honestly like home pajamas that you would not wear if your boyfriend comes over for the first time.
Starting point is 00:36:23 They look very home pajamas. They're not reminiscent of lingerie, not that it would matter, but it really doesn't even make sense that her mom is even saying this. They have teddy bears on them. Okay. Yeah, I mean, I guess you could say it was all my fault. I was pretty slim and my chest was C-cup, so it wasn't exactly small. But does that really mean it's okay for something like this to happen?
Starting point is 00:36:47 I don't know, maybe it is my fault. I was just wearing normal pajamas in my own home and went to sleep comfortably. Does that mean it's my fault? After that day, I couldn't really live a normal life for about a year. Even when I was in class or school or eating a meal or even at home and especially when I was showering those
Starting point is 00:37:05 moments, feelings, and sensations were so vivid, and I just wanted to cut myself out of my own body. And my family, they all seemed like they were doing fine. It was only me who felt like a complete mess while everybody else acted like nothing had happened. It felt like they were even too afraid to even talk about it. Later, I had to go on this family trip that was already planned. It was less than a week after the incident. And since it was a trip with my grandparents, I couldn't really say no. Throughout the trip, my parents just kept trying to get me and my brother to make up,
Starting point is 00:37:34 and they looked like they were just having a great time laughing and enjoying themselves. In the end, I reached out to my brother first and we became close again and nothing has ever happened again. Everyone around us keeps praising him, saying things like, he's such a great brother, you're so lucky to have him. I once brought up the whole situation in front of the family again and my mom said, I didn't realize how far it went, I'm sorry. But because it was during an argument, it didn't feel like a genuine apology.
Starting point is 00:38:00 And my dad was like, how long are you going to keep talking about this? Are you going to bring it up every time you're in a tough spot? This was the first time I mentioned it to my dad since the incident, but he acted like I talk about it every day. I'm still struggling with it, even after years. It's not something I can just forget. What is he trying to do? My dad had even called me and my brother fucking assholes, and then he couldn't control his own anger and took it out on us. I don't know. It's suspected that that video could have been used for the Telegram chat rooms. So there is a panic in South Korea right now where everyone is resorting to deleting everything they can off social media.
Starting point is 00:38:38 Even archiving Instagram photos is not enough, because when you archive them, you can still see them as the account holder. The public can't see them; even if they're following you and you have a private Instagram page, they cannot see them. Literally only you can see them. But some victims have archived their posts and then their Instagram accounts have gotten hacked just to steal those photos to be deepfaked. It's that bad. Some people are just stealing even the profile pictures off of Instagram, the small circular photos. They're taking those to be deepfaked. Even the photo booth locations in South Korea,
Starting point is 00:39:13 it's kind of part of the custom that some people would take the pictures with their friends and then stick them onto the wall as memories and leave them inside of the building. Though some people are warning others, don't put those up, or go back and take those down. Wow. It's so widespread that in one city,
Starting point is 00:39:32 so think Irvine, California, they've had so many rounds of chat rooms that they're now on Irvine season five. That's what they call them, seasons. The terrifying part is even if your acquaintance is the one posting your photo and getting them deep faked, they are now being spread to all the other evil people in the Telegram chat rooms that are in Irvine because people join the communities that they're a part of or the schools that they go to.
Starting point is 00:39:58 So how do you feel safe? You don't know who in town has seen those pictures and videos? I mean, remember, one of the requirements is you have to post name, age, school as well. What if people try to do something in person? It's likely that these victims can't even go outside and walk around without feeling incredibly paranoid. One netizen comments, it gives me goosebumps at the thought that people can just commit a crime with my photos at any given time, even if my account is private. I don't know how and where the picture is being shared, so I'm even more at a loss.
Starting point is 00:40:30 We're not whores or sluts. We don't exist to satisfy someone's sexual urges. We're dignified human beings, each with our own careers and dreams. Some students, younger students, have taken to a more depressing approach. And by depressing, I mean what they're saying is very depressing. One middle schooler says, I'm worried, but to be honest, what can I do? It's kind of out of my control.
Starting point is 00:40:52 It's just like a natural disaster. A group of men in South Korea were also freaking out too. They're posting onto forums about everything going on and how scared they feel about their futures. They write, if I delete every trace of Telegram, will I still get caught? posting onto forums about everything going on and how scared they feel about their futures. They write, if I delete every trace of Telegram, will I still get caught? Someone comments and actually goes out of their way to make a full forum post, a must read for kids who are worried about deleting their Telegrams because they're anxious right
Starting point is 00:41:18 now. It's very noisy right now, there's lots of yapping and chatter, there are probably a lot of kids worried about this becoming a public issue and they're gonna get caught, but I'll just let you know. Unless you left something in that room that could identify your identity, it doesn't matter if you delete Telegram. This will be caught anyway. Hopefully it all dies down soon, which it does look like it will.
Starting point is 00:41:40 If you left not a single piece of identifying information in the chat room, you don't have to worry. Even if the police dig, they're not going to find you. Telegram will never cooperate with the police. They didn't catch the Nth Room people previously. Only the people who revealed their own personal information were caught in the chat rooms. They're talking about the original Nth Room case. If you leave no information behind, there is a 0% chance of being caught.
Starting point is 00:42:05 Don't worry and enjoy. They also go on to post about how women are tipping off police officers to the Telegram chat rooms and they're misandrists, basically like misogynists, but the opposite. Yeah, a person with a strong prejudice and dislike for men. To which a netizen comments, protecting ourselves from malicious men who feel entitled to our bodies? Really? Common sense has left us. Some men have accused the story of being super fake or exaggerated, as it's part of media's secret sinister goal to grow the hatred for men in the country.
Starting point is 00:42:39 Honestly, many Korean men's reactions to the crimes are pretty alarming. One comments, I am just realizing that Korean women have gotten to a situation where there's no solution for them. This nation really is pussy-rea. Pussy plus Korea. What can I expect from a country whose justices on the Supreme Court are from the researching groups for feminism and gender laws, LOL.
Starting point is 00:43:06 Another one writes, Even deepfake producers only choose pretty people, okay? So they're saying, no need for everyone to be concerned? Another one writes, Starting today, I'm gonna be gathering women's photos and turning them into deepfakes. There are just too many shit female pig bitches. Whether it's their pathological tendency to exaggerate or their overinflated egos, they would honestly be rejected by deepfake makers. But I will make them, because I need to be the abyss to fight the abyss. This is crazy. This is not real.
Starting point is 00:43:40 Netizens have responded to these comments saying, saying you're so ugly, you don't deserve to be sexually assaulted is no different from saying, quote, you should be grateful that you deserve to be sexually assaulted. These men are literally out here trying to say essay and sexual harassment are pretty people, quote, privileges. One chat room between a few guys has been exposed and they're all reacting to the news of the deepfakes. One writes, I don't know if I'm allowed to say this, but to be honest, it's disgusting that women are protesting right now after their human rights were just established. Meaning they just got their rights established so please sit down and be grateful. You don't need to do all of this right now.
Starting point is 00:44:20 Someone responds to that, their pussies will just be dehydrated. Another one comments, am I the only one who thinks like this? Another one joins the chat, they received too many rights. For real lol, women's rights should have been promoted moderately, much less than what it's been now. Now this is a world where whores are so confident, you know? Another one just mocks women by writing, give us human rights, treat us like humans. In another chat room, a guy is talking about how he wants to quote, I want to fuck my younger sister without a condom the day she hooks up with a guy. I miss the old days. So he's responding to the military chat rooms and
Starting point is 00:45:01 he says, back then, military guys would have no phones, so they would just have to self-pleasure imagining another military member's older sister, and the guys who can't get married, they would hit on her, and if she didn't like it, she would have been essayed. Wait, what is he saying? He said, in the old days, what, what are they doing? You don't have phones to get into all this trouble, deepfaking, go through all of this. You would just self-pleasure imagining your military member's older sister, and then if that didn't do enough, if you couldn't get married after that, you would just essay her. Yeah.
Starting point is 00:45:36 There have also been surprising videos made by YouTubers who, I guess, some of their branding is making controversial videos, or sometimes often being just against women in general, arguing that women, or rather Korean feminists, are taking this story and going to foreign media networks to spread the news like some sort of man-hating propaganda machine. That the Korean feminists get a sparkle in their eyes when stories like this come up, so they can talk about it, spread the news. They're a well-oiled machine. They have the platforms in place. They go to all the social media platforms where the women like to gather. They have these beautifully designed graphics and eye-catching titles to make sure everything and everyone stays tuned to the news.
Starting point is 00:46:19 Some people have commented across the board about this case. Korean men. Why are our country's birth rates declining so fast? Also Korean men. Another one writes, imagine being a quote grown man and still managing to say something as embarrassingly pointless and thoughtless as this. I've heard literal babies say things with more insight than whatever these entire videos are supposed to be. Another comment just reads, none of these men have ever felt the touch of a woman before. But you also see comments like, Korean feminists are not the ones who fight for women's rights. They only hate on Korean men and they assume that almost every Korean man is a criminal. They cause major gender conflict in Korea which leads to the low birth rate. Wake up to reality and don't be brainwashed. Or the fact of the matter is, women should never have been even given the right to vote.
Starting point is 00:47:11 Another one reads, I think this whole deepfake Nth Room thing is the same trend as the mad cow disease propaganda in 2008. So at the time, there was propaganda that said importing American beef would cause mad cow disease to the entire country. And there were a lot of people who fell for it. But if you looked at the results, did you get mad cow disease from eating American beef? It was all nonsense. It was all propaganda. Did the media companies covering it up apologize?
Starting point is 00:47:39 Did they say sorry for fear mongering and causing all this incitement? No. I'm not defending the perpetrators, but it's just, there are no confirmed facts, the statistics feel very inflated and exaggerated, and you know, the women are always in big trouble. It's the same as the mad cow disease. But the problem is this type of propaganda continues to work. You're mad. In another comment, where a commentator is expressing their frustrations with the Korean government, it seems like someone who is living through a lot of this, and they're writing and almost ranting about how unfair it is, and someone
Starting point is 00:48:16 comments, fact, definition of the word fact, writing bullshit at length to make it sound true. Another comment reads, I'm also a woman, but these Korean feminists, listen, the funny thing is they're so obsessed with numbers and their man-hating sports that once the truth is revealed, they cannot refute that and they just spam things in English saying,
Starting point is 00:48:37 oh, we are with Korean women. So there's lots of comments on articles and YouTube videos of foreigners commenting, we are with Korean women. But Korean men are accusing Korean women of writing that in English to make it sound like they have international support. Really? Yeah. Yeah.
Starting point is 00:48:57 This is freaking crazy. The responses to that comment are just making fun of the poster. Stop trying to be a woman. We know you're not a woman. Come on now, let's be freaking real. And then other comments are like, you are also doing the same thing in English, but the only difference is your English is bad, so we know you're a Korean man. Like, we don't.
Starting point is 00:49:16 I do want to mention, it seems like this is the small minority. I think most men in Korea, I would hope, are taking it seriously. I mean, I would assume that if you're at least a husband, father, son, like, I mean, you should take it seriously regardless. I just don't know if their voices are as loud. There's a lot of men going on these hysterical tirades about Korean feminists, saying they are so busy trying to make this case international so the hatred for men can just grow everywhere like a global disease. But they say Korean women don't even know what it's like for men in their 20s to live in Korea, to give up one to two years of their lives for national defense
Starting point is 00:49:56 They don't know what that's like. Yeah, which, I mean, what does that have to do with anything? If that has something to do with anything, then this has something to do with everything, okay? Apparently, there was a recent video that had gone viral of a little girl in Korea dancing, and she's just dancing to music,
Starting point is 00:50:16 truly just like a chill day dancing to music, and some men took it upon themselves to comment, uh-oh, why is it going up? Talking about their private parts. No. The girl looks five. Another one reads, do you want to be secret friends with ajusshi, which means old man?
Starting point is 00:50:36 So maybe take a deep look at what's going on and use some critical thinking skills before you go on these feminist-hating rants online. One netizen comments, Also, the big issue here, in my personal opinion, is parents' attitudes. A lot of the perpetrators are kids, and even when they're adults, parents have the attitude of their child can do no wrong. When the child does do wrong, someone else has to take the blame.
Starting point is 00:51:02 What were the teachers doing to stop my child from doing this? What was the bus driver doing? What were they doing when this happened? Why was she wearing makeup? Did she seduce my son? It's always something or someone else. Someone comments on that. Actually, correction. Not a child does no wrong, a son does no wrong in South Korea. Another netizen says, Korea lives off the saying, boys will be boys, girls should be more careful and responsible. I can honestly tell you that my least favorite part of the day is planning and cooking dinner. There are just so many steps for it too. It just becomes draining. I need to prepare the ingredients, take out all my pots and pans for the dish, and it just makes me
Starting point is 00:51:47 dread the hour or two that I'm gonna spend in the kitchen. It's why I've developed a very bad habit of ordering takeout lately, and at the end of each month I always regret how much I spent on takeout alone because it's just so expensive, and if I'm being honest, it's probably not the healthiest. That is why I am obsessed with Hungry Root. Hungry Root sends you fresh, high-quality groceries with simple, delicious recipes. I love it because it's just as easy as ordering takeout. I can place my order on my phone and they will deliver it straight to my door. The best thing about Hungry Root though is that they have recipes that take five minutes or less.
Starting point is 00:52:19 I don't even have to worry about making this huge mess in my kitchen because some recipes are as simple as three ingredients. My go-tos have been the curry chicken and zucchini stir-fry; the coconut curry sauce just tastes so delicious, and with the quinoa and chicken, it's super filling. But Hungry Root can be tailored just for you. They get to know your personal health goals, your dietary restrictions, favorite foods, how much time you want to spend cooking, because I know some people love it, and so much more. Once they have all of that, they will build you a personalized cart with all your grocery needs for the week. It's like having someone else do all the planning and shopping so you don't even have to think about it. But even if you don't like all the recommendations, each order is fully customizable,
Starting point is 00:53:00 meaning you can take out their suggestions or choose to replace it with anything you want. And Hungry Root has everything you could want. They've got fresh produce, high-quality meat and seafood, healthy snacks, smoothies, and even snacks and meals made specifically for kids. Everything from Hungry Root follows a simple standard: it's got to taste good, be quick to make, and contain whole, trusted ingredients. Right now Hungry Root is offering Rotten Mango listeners 40% off your first delivery and free veggies for life. Just go to HungryRoot.com slash rotten to get 40% off your first delivery and get your free veggies. That's HungryRoot.com slash rotten. Don't forget to use our link so they know we sent you. For a few years, investing was the one subject that I was too scared to
Starting point is 00:53:40 touch. Everyone was telling me how important it is to learn. This is before I met my husband. I didn't have the time to look into it and understand it, and like, what kind of account should I be putting my money into? Am I putting in enough money? What's good? What's bad? It's just so overwhelming that I would push it off until later and later and later, except investing can actually be so easy to the point that I wish I started it sooner. Today's episode is sponsored by Acorns. Acorns makes it easy to start automatically saving and investing for you, your kids, your retirement.
Starting point is 00:54:12 You don't need a lot of money or expertise to invest with Acorns. In fact, you can get started with just your spare change. Acorns recommends an expert-built portfolio that fits you and your money goals, then automatically invests your money for you, and now Acorns is putting their money into your future. Open an Acorns Later IRA and get up to a 3% match on new contributions. That's extra money for your retirement. Acorns has helped me take the stress out of investing. With just this one app, I'm able to save, invest, and learn all at the same time.
Starting point is 00:54:43 They make it so convenient because I'm the type of person to forget to invest, but Acorns lets me set up automatic transfers, allowing me to meet my investment goals. Head to acorns.com slash rotten or download the Acorns app to start saving and investing for your future today. Paid client testimonial, compensation provides incentive to positively promote Acorns. Investing involves risk. Acorns Advisers, LLC, an SEC-registered investment advisor. View important disclosures at acorns.com slash rotten. There's also discourse about the fact that sending deepfaked videos is almost always seen as some sort of prank or play culture by some young students
Starting point is 00:55:23 and even their parents, giving the energy of, why is she so emotional? It's not even her. It's not even real. He's just joking around. Should we ruin our son's future for a joke? One officer even said, it seems that students lack the awareness that deepfaking is a crime, which, okay, but ignorance is not a defense in a court of law. So where are we going with this? You can't just defend yourself with, it's a joke, I didn't know.
Starting point is 00:55:49 So, to all of that, a lot of Korean women have stated, we feel like we have no country. This feels like social collapse. How low can an ordinary Korean man go? How much is our government going to tolerate? Women are treated as objects by male classmates, seniors, co-workers, even very close acquaintances. To add to that, a Korean feminist who was sharing information for victims on X, formerly Twitter, had her account mass-reported by a bunch of incels and suspended. Some women have even stated that they want to protest to raise more awareness, to show the government how angry they are, because
Starting point is 00:56:22 they're angry but they're scared. Their identities could be leaked by the incels that come to almost all the women-organized protests. If you're labeled a feminist in Korea, it could make you a social outcast and even cause problems for you at work, depending on your company. They won't tell you directly, hey, you're fired because you're a feminist, but they don't like when women are outspoken.
Starting point is 00:56:50 Someone tweets in response to this, In every country, there have been misogynistic crimes that shake the entire country and cause protest. In all of those countries, regardless of gender, people ran out onto the streets to protest and condemn the perpetrators. But what about our country? They mock the faces of the female protesters. They do binge watching protests in front of them. Incels will come and just taunt female protesters.
Starting point is 00:57:15 That is crazy. So they're also in public? Yeah, but they're masked up, covered everything. That is crazy. The commentator asked, does this still make it seem like there's hope for the men of Korea? This is also, surprisingly, happening in America as well. A lot of middle schoolers, high schoolers, they will use, quote, nudification apps, get pictures of all the girls from
Starting point is 00:57:39 the last homecoming dance. This really happened last year in Washington state. No, this year, but it was at the last homecoming dance. Get the pictures of the homecoming dance of all the girls in their gowns, run them through these apps to produce deepfaked pictures of them completely nude. Then those quickly get spread around to the whole school. It seems like the main difference so far is that in the American cases, they stay within each school, so they will spread it to classmates and maybe neighboring schools, but they don't go on Telegram to spread it. Which, I mean, doesn't make it better or worse, but just
Starting point is 00:58:15 noting a difference I've seen so far, I don't know. Now, the terrifying part is one expert stated, one boy with his phone, in the course of a single afternoon, can victimize 40 girls. Minor girls. And then their images are forever out there. We need to make this so painful for anyone to even contemplate doing it, because this is harm that you cannot simply undo. Even if it seems like a prank to a 15-year-old boy, this is deadly serious. It's also happening to American adults. There are always social media posts popping up here and there of people wanting help on Reddit, confused on what to do. One from just four months ago is from a 27-year-old in America.
Starting point is 00:58:54 She writes, my boyfriend and I had been in a nearly perfect relationship for five years. Five years. I never doubted that he loved me or cared about me. Everyone considered him pretty kind and genuine, just all around a good guy. I trusted him so much that I was never really suspicious of him, never snooped through his phone or stuff. That's until about two months ago, when my phone was stolen and I briefly borrowed his. Whilst borrowing his phone,
Starting point is 00:59:17 I essentially opened Pandora's box. Secure folders, private browsers. In his photo library I discovered pictures and videos of my friends, my family, my coworkers, roommates, as well as his best friend's girlfriend. I also found deepfaked pictures of his cousin, who was 15 at the time. These were then posted all over the internet on various porn sites, Reddit, you name it. He posted these pictures with each person's first and last name, captioned with words I could never even imagine coming out of his mouth. Officials have stated that fake images can really put these young women at risk of being barred from future employment
Starting point is 00:59:57 and making them vulnerable to physical violence if they're recognized. The president of South Korea addressed this situation by saying, we will eradicate these digital crimes through investigations. I hope that education plans will be made so that a healthy digital culture can be established. What? He honestly said nothing, basically. Also, he's not really a credible person, considering he's the same man that has been actively trying to abolish the Ministry of Gender Equality and Family in South Korea, so I would take anything he says about this with a grain of salt. But the craziest part is, to some people, even his nothing-burger statement was too much of a statement. A politician responds to the
Starting point is 01:00:53 president's statement by stating the threat is being overestimated right now. He says it's good that the president said that measures need to be established early on, but on the other hand, it seems that there is a possibility that excessive regulation may appear. He's like, we might regulate deepfakes too much. Calm down. Oh my God. Investigate this guy.
Starting point is 01:01:14 Yes, exactly. And maybe make deepfakes of him if you must. I'm just kidding. Don't do that. I don't condone that, but maybe he needs a taste of what it feels like. He's concerned that people are going to get too anxious, fearmongered into thinking the deepfake problem is massive, and that that leads to over-regulation, which is so bizarre. I hope he keeps the same energy if he ever gets deepfaked. He says, I think it's important to make sure the anxiety about this content
Starting point is 01:01:40 is not exaggerated more than it actually is. Okay? Let's all be aware, you know. Most people want him to never open his mouth again. Many activists argued, if you don't do anything, the state, the government, is also an accomplice in this. The country, the politics, the media close their eyes. This land is a paradise for criminals. One female politician, I don't know why she said this, but she said, the fear of deepfake pornography is spreading. However, I can't help but mention the behavior of some opportunists who exploit this anxiety and fear as material for another gender conflict.
Starting point is 01:02:17 I don't know why women fighting for their rights and not to be exploited gets turned into a, quote, gender war. It's not a war. We just want rights. But that's not all. She continues, the majority of people are taking this opportunity to demonize one side, aka men, and promote gender conflicts. Basically saying that men are being demonized, but obviously this statement did not sit well with most women. Nobody is demonizing men for behavior that hasn't repeatedly taken place in this country. No one's going home to their husbands demonizing them. They're demonizing the perpetrators, most of whom happen to be men. What do you mean? It doesn't make any sense. Clearly nobody is saying all Korean men, and that's coming from myself, a Korean
Starting point is 01:03:05 with a good chunk of my loved ones being Korean men, but to say that this is not a case of women being targeted, but rather a case of deep fake technology being abused, is a bit thick-skulled. It's a bit pick-me-energy. Another professor at a department of sociology from a university said, there is a stereotype that men become perpetrators and women become victims, but women can also become perpetrators if they really want. What? What does that even mean? So there were some cases of men deep faking other men and women deep faking other women as a way to bully them,
Starting point is 01:03:44 but the vast majority are men victimizing women. Wow. These are crazy. What a strange thing to say. It doesn't even make sense. So a lot of people, including people that I personally know in Korea, have been mass taking down all their social media pictures, honestly just any picture of them. They're trying to wipe it from the internet, for good reason, obviously.
Starting point is 01:04:13 But that's not even good enough. That still puts a lot of professionals who don't have that choice at risk, people who can't avoid being on the internet. Which, did you know, a vast majority of deepfake explicit videos are of Korean idols, K-pop idols? And it's not even just being produced in South Korea; it's being produced globally. I would say, I think it's either 96 or 99 percent of all deepfake technology right now is being used to create non-consensual explicit videos, and almost 99 percent of the subjects being used are women. But also, even if you are trying to wipe everything from the internet, there have been cases, a lot of cases, of people just taking pictures of you in real life and also pictures from
Starting point is 01:04:55 yearbooks. There are reports of teachers who refuse to even take yearbook photos now. They don't want that in there. Perpetrators will take pictures from your yearbook and make deepfakes. In one instance, there was a Telegram chatroom that was recently busted where the perpetrators took graduation photos from Seoul National University, which is the top university in South Korea, and made deepfaked videos of alumni. In that one school alone, it's speculated that there are 61 victims.
Starting point is 01:05:22 There were a total of 2,000 deepfaked pieces of content that were distributed amongst the group and outside of the group. There were allegedly men asking in the chat rooms, is it okay if I rape her? Is that the prey this season? One of the victims from Seoul Dae, she was the first victim to go to the police. She said she received a video file of her being sexually assaulted by a group of men. She couldn't even bring herself to open the video file. She just saw the thumbnail, had this weird, unsettling feeling. She sees this creepy smile on her deepfaked face.
Starting point is 01:05:54 And it's so terrifying. I'm sure at first it's a moment of, am I drugged? I don't remember this. How come I don't remember this? To finally realizing it's a deepfake, which, honestly, I don't even know if that's better in that situation. It just becomes a whole other problem entirely. The craziest part is she never had social media. So whoever did this to her took her KakaoTalk, which is like WhatsApp, WeChat,
Starting point is 01:06:20 Line, tiny circular profile picture, which is basically like your Instagram profile picture, tiny circle. Technically, even if you're private, people can see it. She updated it a few times in the past two years. Whoever did this screenshotted it every time she changed it to get more pictures of her and used those to create the deepfakes. When she ran to the police station to report it, she got a creepy message on her phone saying, report me. You can't even catch me if you report me. And deepfakes are getting that good now, huh? Really good.
Starting point is 01:06:52 It's very hard to tell when something is deepfaked. Truly hard to tell. She felt like she was being followed. They said, try to answer me. I can tell you who I am so you can try and catch me. More messages keep coming in reading things like, I can't forget the first day I saw you. It went up really hard. You're so pretty. A lot of people believe, and this is likely true, that these perpetrators get off on the fact that
Starting point is 01:07:19 this type of trauma stays with the victim forever, because now, every single person in her life, every interaction she's ever had, she's trying to figure out who the hell did this to her, who saw it. Because when she told the police this, they told her they can't do anything because it's on Telegram. A few months later, she was grabbing coffee with an old college friend. She had graduated years ago. She's grabbing coffee with this old college friend that she hadn't seen in such a long time, and she's trying to tell him, she's nervous, she's trying to tell him everything that's been going on in her life, and he puts his coffee down, looks very grim, hesitates. Why, what's wrong? What is it? I have something to tell you.
Starting point is 01:07:59 Actually, I got a picture, I got a picture like that a while back. Someone sent it to him? Yes. He takes out his phone to show her, and it's a picture of her and two other female classmates. Three nude bodies in a row. And it's not just the picture. The deepfaked picture is on an iPad. It's like the full screen of the iPad.
Starting point is 01:08:22 And someone is taking a picture of the iPad, right? But their genitals are on top of the tablet. Which is part of the humiliation. Right, so it's been surfaced to someone else, and someone else took a photo of that and then sent it over again. So it's like multiple layers of like... Yeah. And she asked, who sent this to you? He said, just another guy from school. I mean, he told her everything. These are people that she hasn't spoken to in so many years.
Starting point is 01:08:55 And this photo was sent recently. So this is years after she forgot about these people. The caption that came with it was, let's enjoy together. It's good, huh? She actually went on to become a lawyer. So this was all during her law school days. She went to Seoul National Law School, and whoever submitted her photo into the group chat was also in the law school program, meaning that person is likely a lawyer right now. Potentially could go on to be a prosecutor
Starting point is 01:09:24 or a judge. Another perpetrator was revealed to have been working in the blockchain technology sector. So they're all very established people in positions of power, and this is what they're doing. There's one deepfaked photo of a Zohide student lying down in a very suggestive pose, and engraved on her body are the words, women who are amazing not only on the desk but also in bed. One of the ho-ri-dae accomplices was caught; he was sentenced to five years in prison recently, like this month. The judge said that the content this specific perpetrator made was so, quote, disgusting and humiliating,
Starting point is 01:10:05 it's not something the public can even digest. So I'm assuming it was either very violent or very degrading. The first victim to find out from Seoul Dae, the lawyer, she wrote to the judge and she said, first of all, I would like to express my sincere gratitude for allowing me to share my story as a victim before the court.
Starting point is 01:10:23 When dozens of pornographic images digitally altered to include my face and videos of men self-pleasuring themselves to them were dropped in my lap by an anonymous account, when I saw multiple perpetrators insulting and mocking me in a chat room where my photographs and personal information had been shared and when not long after that I came to realize that all of this had been perpetrated by people I studied with at university, my own university acquaintances had been calling me a cumbucket, a whore, a slave behind my back, the world I thought I knew had come crashing down around me. In consideration of the immense harm this incident has caused me and dozens of other
Starting point is 01:11:04 victims, I earnestly petition you to give the defendants the most severe punishment available. When was this? This year. This month. August 2024. Wow. So she, did she catch the people? She caught the people. She was the one that investigated and caught the people. The police were like, we can't do anything, it's Telegram. Wow.
Starting point is 01:11:28 Yeah, so a lot of the politicians have now initiated these conversations of better education so that younger teenagers stop doing this, because, alarmingly, a vast majority of the perpetrators are minors. One professor from the Department of Police Administration writes, the problem is that these perpetrators who are committing such crimes don't realize that their actions could be harmful to others. It's just- Yeah. This is the dumbest thing I've ever heard. It's just another way to relieve themselves of their curiosities and interests.
Starting point is 01:11:58 Proper education methods must be in place to help more people realize this is a crime, not some form of entertainment. That's just dumb. I don't know. I've never seen so much sympathy given for girls who are curious. And why is it that satisfying male curiosities almost always includes harming women? Why are we using that word, curiosity? You think they don't know what it means to do this? I think we're past that. If you're using deepfake technology, we're past that. I think the only thing at this point is the fear of being in prison for a very long time. That's what most Korean netizens, men and women, agree with, at least the ones who have firing neurons in their brain. Apparently, one school in Korea, and I don't know how accurate this is because this was on social media,
Starting point is 01:12:40 but I trust social media these days over authorities. There were allegations that this school held an emergency seminar about the deepfaking issue. This one school was like, we've got to do something about this, let's have a seminar. They forced all the girls to the auditorium to give them, quote, education on deleting all pictures, never taking another photo again, never posting a singular photo, nothing, to avoid somehow being victimized in something that they can't even control, while the boys played soccer outside.
Starting point is 01:13:09 And I don't even think the punishment is that scary. Many of the Telegram chatroom creators, the ones who run the rooms, even with all this news going around, have posted, regional season 4 is closed; once everything quiets down, regional chatroom season 5 will resume. Another Telegram chatroom owner started a new room in the midst of all of this to deepfake. They write in the chatrooms, don't be ashamed, insult reporters who are publishing articles about this. Many of the chatroom members are praising the chatroom owner, stating,
Starting point is 01:13:41 you made a room during such harsh conditions. We're proud of you. They're not scared at all. Which makes sense, because another netizen writes, a person who makes deepfakes for the purpose of distribution can be punished by imprisonment of up to five years or a fine of 50 million won. Which is how much? Roughly $37,000. $37,000.
Starting point is 01:14:02 But in reality, most people are punished with probation and suspended sentences. No fine. In fact, only five people were sentenced to prison for purely deepfake-related crimes last year. The others that were sentenced to prison all had other sexual offenses on top of the deepfake crimes. Which, let's be real, none of these are strong deterrents. Even though technically, under the law, creating deepfakes with the intention to distribute carries up to five years and roughly a $37,000 fine.
Starting point is 01:14:30 But, I mean, if you're not actually getting that sentence, it's not really a punishment. One netizen comments, one thing I will say, though, is that crimes similar in severity to this are usually given a slap on the wrist here in South Korea. So if perpetrators think they can still live a normal life afterwards, there's really no incentive not to do it. They all have the same excuse: I'm a first-time offender, I'm young, I'm sorry. The magic words. The judge will literally write it in the court docs. In one case, a high schooler victimized four girl students in his school by spreading deepfaked videos of them.
Starting point is 01:15:07 The court only sentenced him to one year in prison. He's 19. And they wrote, relatively young, vows not to reoffend, has no criminal record. They state that's why they went easy on him. Are we really rewarding people because it's the first time they're caught doing something? In another case, a high schooler not only made deepfakes of one of his classmates, but took them to her, shoved them in her face, and threatened her: if you don't do as I say, I will spread these everywhere.
Starting point is 01:15:34 He received six months in prison because he said, and I quote, I'm admitting to my mistakes and reflecting, I have no criminal record. An overwhelming 75% of people involved in making, distributing, and watching deepfakes last year were teenagers. 20% were in their 20s, so truly a vast majority in their teens and 20s. Wow. A lot of police officers won't even help the victims.
Starting point is 01:15:56 They'll just say, well, Telegram has servers that are not in South Korea. Telegram doesn't work with the police. What are we supposed to do? Or quote, I can't catch it, it's on Telegram. Which leads victims to try and infiltrate the chat rooms, just like we saw in the original Nth room. And it just adds to the trauma. It's a weird feeling. Most victims have resorted to joining these chat rooms where they go undercover as fellow perpetrators, which is a traumatic feeling. Or they'll even direct message the chat room owners and ask for more pictures
Starting point is 01:16:28 so that they can get direct evidence this guy is committing a crime. And many of the victims end up getting deepfakes of their friends. Because again, these are all acquaintances. Pavel Durov, his arrest is not related to South Korea. Paris had a warrant out for him. The founder of... Telegram. Recently.
Starting point is 01:16:49 Yes, it's all unfolding at the same time. Like, both stories are developing at the same time, but he was not arrested for what's going on with Telegram in South Korea. I mean, potentially that's part of what's going on. He's just being arrested for not co-op- Not arrested, detained. He's being detained for not cooperating with the authorities over Telegram's potential part in distributing CP
Starting point is 01:17:13 as well as drugs, selling drugs. So it is kind of related, but not necessarily. It's not a direct correlation, but it's all in the big scheme of what's going on with Telegram. And there have been so many conversations recently of how good and bad Telegram is. One netizen writes, very complicated issue. We need communication channels in the world that government officials can't access. So Telegram is used a lot in war zones. And a lot by oppressed parties as well. So that they can't get censored, so that they can get news out there. They can talk to journalists that are not on the ground.
Starting point is 01:17:50 You know, it's for activists, journalists, it's pretty much vital. But at the same time, good is never just used for good. Essentially, everything that makes Telegram good is also what makes Telegram bad. There was also a point where Telegram was called Terrorgram because a lot of terrorist groups were on there. One netizen was just pointing out how much trauma it is for a 13-year-
Starting point is 01:18:16 old to even just be told, hey, be careful, be on the lookout, someone might create deepfakes of you on the internet. Others write, is this really gonna be a thing now where parents have to sit their teenagers down and say, someone at school might make these explicit videos of you? Others are concerned about how bad this is gonna get. I mean, this is already really, really bad, but one commentator writes, imagine if someone hates you, puts your face on some bestiality or some very illegal videos involving children or something. This is an ongoing, developing case.
Starting point is 01:18:49 As of August 27th, police in South Korea said they will carry out extensive crackdowns on deepfake images. They say for the next seven months, they will aggressively hunt down those who produce and spread those images, especially those of children. Which, I don't know if that's even gonna do anything. Right now, a study found that 96% of all deepfake videos are X-rated and non-consensual. So, I mean. And on August 28th, I guess the public was like, yeah, that's not enough, police. The Ministry of Science, Technology, Information and Communications in South Korea announced that it was going to organize a budget of 1.5 million dollars, which is nothing, to prevent, quote, deepfakes used to commit sex crimes.
Starting point is 01:19:35 The whole country? Yeah. So this specific Ministry of Science, Technology, Information and Communications has an expected annual budget in 2025 of $14 billion. And they said, guys, we're going to spend $1.5 million to prevent deepfakes. That's about 0.01 percent of the budget. Proportionally, that's like someone with $14,000 spending a dollar fifty. That's crazy. That's nothing. Yeah. I mean, I know $1.5 million sounds like a lot, but like, 14 billion. But also, I feel like that's not going to do anything. No, it's so little it can't do anything.
Starting point is 01:20:14 Yeah, with the amount of people that are on there and how many resources they... it's like they're saying, oh, we're gonna do something, but their actions aren't doing anything. Nothing. A lot of these politicians, I won't say all of them, because some of them are really upset and they're trying to get laws passed, but it's really hard when they're not cooperating. I will say a good chunk of them are just saying empty things. They're like, we're gonna take this seriously, you guys. It's like, okay, maybe you should have for the past however many years. It's a little too late to just be like, we'll take it seriously.
Starting point is 01:20:48 What are your thoughts? Do you think Telegram should be banned? Is Telegram the problem? Do you think it's okay that Pavel Durov was detained, or is deepfake technology the problem? I mean, I tried to look into deepfakes being used for good. I saw a few small things where they were helping some people in certain medical studies, but I don't know. What are your thoughts? Do you think it's just taking away jobs and/or just creating victims? Should the founder have been detained? Is this a matter of free speech? If you were the president, how would you even stop the spread of deepfakes? Let me know in
Starting point is 01:21:22 the comments and please stay safe and I will see you in the next one.
