Money Rehab with Nicole Lapin - Sextortion: The Darkest Deepfake Scams, How to Protect Yourself and Laurie Segall's Hunt for “Mr. Deepfakes”

Episode Date: June 3, 2025

In the U.S., the cost of cybercrime in 2025 is estimated to hit $639.2 billion, up from $452.3 billion in 2024, an alarming surge fueled in large part by advancements in AI. Today, Nicole sits down with tech journalist Laurie Segall to uncover one of the darkest AI-driven scams: a disturbing scheme where scammers generate fake nude images to extort their victims. In this conversation, Laurie breaks down the most common deepfake crimes and scams, how to spot them, and how to protect yourself. She also shares the jaw-dropping story of her investigation into a shadowy figure known as “Mr. Deepfakes,” a man she describes as one of the most dangerous people on the internet, and what happened when she tracked him down and confronted him face-to-face. Nicole and Laurie zoom out to ask even bigger questions: What does this mean for women, for democracy, and for the future of AI? Spoiler alert: we are still very much in the Wild West. Follow Laurie’s work, and her investigation into Mr. Deepfakes, here.

Transcript
Starting point is 00:00:00 Imagine if you had a co-host in your life. You know, someone who could help manage your every day and do the things that you don't have time for. Unfortunately, that's not something we can opt into in life, but it is something you can opt into as an Airbnb host. If you find yourself away for a while, like I do, maybe for work, a long trip, or a big life adventure, a local co-host can help you manage everything. From guest communications, to check-in, to making sure your place stays in tip-top shape, they have got you covered. These are trusted locals who know your area inside and out,
Starting point is 00:00:33 giving your guests a warm welcome while you focus on your own starring role, whatever that might be. You know that I love how hosting on Airbnb helps you monetize your home, an asset that you already have, that is a holy grail. And as a longtime fan of Airbnb, I have been telling everyone I know that they should be hosting too.
Starting point is 00:00:52 But some of my busiest friends have been overwhelmed by this whole idea of hosting. But thanks to the new co-host offering, they have finally signed up. So if you've got a secondary property or an extended trip coming up and you need a little help hosting while you're away, you can hire a co-host to do the work for you. Find a co-host at Airbnb.com slash host. I'm Nicole Lapin, the only financial expert you don't need a dictionary to understand. It's time for some money rehab. Today I'm joined by one of the bravest voices in tech journalism, Lori Siegel. And as you're about to hear, I've known her for about a hundred years.
Starting point is 00:01:34 But you know her too. You've seen her on CNN, 60 Minutes. And if you've been following her reporting over the last few years, you've probably found yourself both captivated and terrified. Lori's latest work uncovers one of the darkest corners of the internet. Deepfakes. Specifically the dangers of AI-generated images of real people in fake sexual acts. In our conversation, she explains the common deepfake crimes and scams and how to protect yourself. We also talk about her totally insane investigation into Mr. Deepfakes, a man that she calls one of the most dangerous people online,
Starting point is 00:02:07 and what happened when she actually tracked him down and confronted him. And finally, we talk about the bigger picture, what this means for women, for our democracy, and the future of AI. Honestly, my takeaway? It's definitely still the Wild West. It's definitely still the Wild West. -♪ SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! SHH! Ah, Laurie Siegel! I'm so happy to be here with you. I'm so happy to say welcome to Money Rehab.
Starting point is 00:02:32 Thank you. We've known each other for 100,000 years. Correct. We worked together at CNN 50,000 years ago. Correct. But when I saw online that you were sex-torted, first of all, I wanted to kill that person. Yeah. And then I was like, what is sex-tortion? And can I were sex-torted. First of all, I wanted to kill that person. And then I was like, what is sex-tortion?
Starting point is 00:02:48 And can I be sex-torted? Yeah, the answer is, unfortunately, like, any of us can, which is terrifying. That's like the reality of the world we're entering. So what is sex-tortion? This would be like, if you're a child, and someone reaches out on, like, Instagram, and pretends to be your friend,
Starting point is 00:03:04 and let's say you have a teenage boy, by the way, all this is going to seem really dark. So I just, sorry. Let's say you feel like a teenage boy, pretty girl reaches out, starts trying to get them to send some kind of a provocative image. And the next thing you know, they say, if you don't pay me X amount of money,
Starting point is 00:03:19 I'm going to send this to every single person you know. But your child never even took the image, right? They were never even tricked into it. It's a deep fake of them. And it doesn't matter that it's not real, because it looks real. And these types of, like, sex-dorsing campaigns are so horrific, and children are ending their lives.
Starting point is 00:03:34 And this isn't just children. I say this is happening to adults. This has been an ongoing thing for a while. So what's happening with the rise of artificial intelligence and deep fakes is basically the democratization of these types of scams, and none of us are safe, right? So someone could say, I have a nude image of you,
Starting point is 00:03:49 and I'm going to pass it around to everyone. And I always look at this from a victim standpoint. You can't just say, oh, it's just not me, because you'll see that image. It looks like you. And it looks like you to an untrained eye. It's you. It could tarnish your reputation.
Starting point is 00:04:02 People might not know it's real. So these images aren't real, but the impact is real. So we always want to like shout this from like, the mountaintops, this is what's coming. And we need tech companies to do much better jobs. So let's talk about it. So somebody reached out to you, and then what happened? Apparently, this is like a very common scam.
Starting point is 00:04:18 And I just happened to know a lot of good guy hackers from my days going to hacker conferences as a tech journalist. So I was able to forward this and be like, what is happening? But someone reached out and said, I have images of you. I've been able to hack into your device. If you don't give me X amount of money, implying they had intimate images of me, I will put this out there.
Starting point is 00:04:37 And what they do is I think they put some identifiable information about you, like your home address or something like that. You thought it was like you, you thought maybe somebody... For a second, and this is what happens, and I'm like a long-time tech journalist, right? Like, I am like, I got this. Like, I don't have to freak out. And even me, I was like, oh my God, do they have some? What do they have?
Starting point is 00:04:57 And you start questioning. Your breath gets short. I literally had a security guy come onto my computer. I like had him remote into my computer and look for any malware. I wanted to be completely sure. And he was like, no, these scams are actually going around. So I posted about it. And the next thing you know, I get all these people
Starting point is 00:05:15 messaging me privately, being like, this is happening to me. I was so scared. And one of the things they say, this is so dire. I love that we're really starting out strong on this. They'll be like, we saw you on a porn site or something, and now we've remote it in. People are embarrassed to talk about it.
Starting point is 00:05:30 So it's just like a wild, I would say, like... It is a wild west right now, and there are so many of these different types of scams going around. Like, we are in the wild west of scams that are only made so much more believable by artificial intelligence, right? Like, parents getting calls from what they believe are their children, because their voice has been imitated using artificial intelligence, because it takes 30 seconds of a voice sample
Starting point is 00:05:52 to be able to mimic that voice. This is, I think, the world we're entering at all these different levels, where our identities are up for grabs, and AI can just mimic our most intimate features, our face, our bodies, our voice. And so, it's a bit of a Wild West. And I think we have a long way to go with educating people on it. For sure. My husband and I even had this conversation recently
Starting point is 00:06:14 where we said we needed a safe word. So if somebody gets called saying that they were kidnapped, or I don't know, I can't even imagine. We probably don't even know what could happen or will happen. Like, say, strawberry, that's not our safe word. But say something like that. 100%. Hilariously, we all need human passwords. This is literally what one of the top security guys said to me.
Starting point is 00:06:35 I was like, how do people protect themselves? He's like, human passwords, safe words. I called my mom and I was like, if you get something like this from me, which it seems crazy to have to call our parents and say like, if you hear an AI-generated voice, or you're not sure if it's me, this is the word you need to say. This is our human password. In an interesting way, it's almost like our humanity is the thing that we're hoping will help us pull through in this weird time.
Starting point is 00:06:57 The analog way, I guess, in this brave new world. So then what happened? You brought in the security expert, you are used to tracking down the criminals and the scammers. So, this was all kind of a precursor to... We did a larger investigation that we've been working on. When I say, like, I get obsessed with topics. Like, this is for better and for worse. And I think three years ago, I became obsessed with this idea. Someone had mentioned to me there was this really shady site online.
Starting point is 00:07:26 And it's a deepfake pornography site. So, literally, it looked like, I mean, this is dark, but it's like you were watching sex tapes, essentially, of many women in the public. Even though they never made them, even though they never would ever consent to something like this, but you were looking at hyper-realistic, deep fake pornography of,
Starting point is 00:07:48 if you are even kind of a public figure, there's a chance you were on this site. And I remember going to this, being like, wait, this is insane. And then I started looking into it at the peak of it. 18 million people were going to this site on a monthly basis. So I'm like, none of these women consented
Starting point is 00:08:04 to having their image and likeness used and like, you couldn't tell if it was real or fake, although like we know it's not real, but that harm is very real. And I just remember being like, why does this exist? Like why are 18 million people allowed to go and see this and these women have no control? And ironically, it was like a lot of women in power.
Starting point is 00:08:22 So- Was it Taylor Swift? Yeah, Taylor Swift was, there were so many people that were like, their likeness was taken and used on the site. And this site, I became obsessed with it because I was like, okay, it's not just like about this shady site on the internet, but it was a platform, right? So it's not like people just went and saw these two horrific videos.
Starting point is 00:08:41 They could also like create them or pay people to create them. And so, like, it became a whole platform and an ecosystem where the idea of sexually explicit deepfakes of saying, oh, I like that woman. I want her doing this with this person. I don't care what she says. I'm just gonna use AI to make my dreams come true. Like, your wish is AI's command. That was what this site was, and it was called Mr. Deepfakes. And I remember they also had like training manuals.
Starting point is 00:09:06 So it wasn't just about these public figures, these women, it was about training young men how to do this and take this into their schools, or take you this into their workplace. So you look on the message boards and it'd be like, oh, I wanna do this to my sister-in-law. I love tech, I love artificial intelligence. I think it's gonna do incredible things,
Starting point is 00:09:23 but this is a ground zero for what happens when it is misused and it's used as a weapon against women and girls and eventually all of us, right? So I became very obsessed with Mr. Deepfakes and tracking him and it took us all on an investigation that was very wild and never a dull moment. spoiler alert, we found Mr. Deepfakes, yay! Yes, it was probably a couple years ago,
Starting point is 00:09:46 and I'm like, we should just start talking about this on the internet and explaining why people need to care about this shady site. So we said, okay, I believe this is one of the most dangerous men on the internet, the person behind this, and we need to know his name before it's too late. Because why should this person who has harmed so many women be afforded anonymity?
Starting point is 00:10:05 This site had been up and running for seven or eight years and he was anonymous. So you had no idea who was doing this. And I just thought, let's find him. Let's just try. On my team, we have like some incredible investigative journalists that came with me for my 60 minutes days.
Starting point is 00:10:20 One of my colleagues, Nicole, she could be an FBI agent if she wanted, she's wonderful. We started talking about it and we went out and I remember I started talking about it like a moms' conference and all these moms got behind us with this idea that this might be about this shady deep fake porn site, but actually this is about the future of bullying.
Starting point is 00:10:37 This is about what could happen in your schools with young men doing this to women, thinking it's okay. Like, this is normalizing a new type of abuse. And so I think a lot of people really resonated with that message. And I remember, I was getting my nails done. And all of a sudden, I didn't even know I had another inbox on TikTok, but I was on TikTok
Starting point is 00:10:55 looking at other inbox, which is like messages that sometimes they filter. And this security company, security legal company called Sidenti, reached out out and a guy named Jordy was like, we have a tip. We believe we have found him. And so I'm like, okay, this feels, I'm not sure if this is real, like 100%,
Starting point is 00:11:13 but I'm like, obviously we're vetting it. And we ended up like going on this, we got like a dossier that had, I want to say, 17, 18, 19 different data points. I brought in another security firm, like we all basically tracked down, like via social media, via the names we were given. And there were so many connections,
Starting point is 00:11:30 because any time you do something on the internet, like you're just not hidden. This is what I have learned through all my years in investigative reporting. Like covering your traces is actually like very difficult, and you will make mistakes. And like, you know, he made mistakes years ago. There was an 8chan post with him, like, an 8chan.
Starting point is 00:11:47 It's like 4chan, but like this message board, where people put like crazy theories and memes and cultural things, and it's like a place where a lot of like, you know, internet lovers, for better and for worse, go and say some of the weirdest stuff. And great stuff too, but it's a weird place. He had an 8chan post, we had him talking about a car, like a red Mitsubishi, we were able to track,
Starting point is 00:12:08 and we ended up in front of his parents' home with the red Mitsubishi. Like, all sorts of crazy investigation went into it. And we tried to reach out to him many, many times. He wouldn't answer, he took down all his social media. We reached out to friends and family, and then finally we said, let's go, let's try to find him and talk to him in person.
Starting point is 00:12:29 Hold onto your wallets. Money rehab will be right back. And now for some more money rehab. I found out he worked as a pharmacist in a hospital, like helping people. I found out that he as a pharmacist in a hospital, like, helping people. I found out that he was the man that had really helped create this site that enabled so much, I would say, digital abuse against women.
Starting point is 00:12:54 Had a wife, he had a new baby. Like, he was really living a double life. And we showed up outside the hospital. We were able to call the floor he worked on, figure out exactly when his shift was starting. We were there the next day, and we showed up outside the hospital. We were able to call the floor he worked on, figure out exactly when his shift was starting. We were there the next day and we confronted him. And so it's been a pretty wild journey just to say we shouldn't live in a world
Starting point is 00:13:13 where this type of thing is enabled. And it's interesting because when we confronted him, I knew we would have 30 seconds. I knew that he wasn't gonna wanna speak to us and I knew he would know exactly who I am because I had been reaching out to him for months before. And he saw me, and he just started walking incredibly quickly towards the door.
Starting point is 00:13:31 And I just remember asking for comment. Legally, I want to ask for comment, right? We have all this evidence. I asked him, I said, I want to negotiate how someone who's a father and a son can create this type of thing that perpetuates this type of abuse. And as the doors were closing, I said, the harm is real. And did he say something?
Starting point is 00:13:49 He wouldn't say a word. I've interviewed like some categorically sketchy folks in my career, but I was really shaken by how he looked at me. And that was just part of our investigation. We did so many things to be able to really fan out. And we presented our findings to lawmakers around the world. Started talking about why this mattered.
Starting point is 00:14:07 I think when this happened to Taylor Swift, I want to say January or something of 2024. I hated this thought, but I thought maybe now people will pay attention. It's happened to one of the most powerful women in the world, which is horrific and it shouldn't take this type of abuse happening to Taylor Swift for people and lawmakers to pay attention. But it did help, I would say, people be like,
Starting point is 00:14:31 oh, this is the language behind it. This is why it's bad. And I think we were able to speed up our investigation. And so it's been never a dull moment. And then I got pregnant and have had a child in the process. But did that change how you viewed this? And bringing a child into this world? It's a really good question. I think when we were initially out, I was thinking about it because we just had Mother's Day. And I was thinking about having a child. And I remember thinking like, if we are not careful,
Starting point is 00:15:00 it's not just about the victims, right? We are gonna train a whole new generation of abusers, of young men who grow up and think that, I can nudify this girl from class in a couple clicks, using artificial intelligence. And I think that actually was very much, as I was thinking about wanting to have a child, like, God, I remember not,
Starting point is 00:15:19 this is like probably way too much information, but like, when we were in the hotel room the day before tracking him, I was literally tracking ovulation. Like, I was like, it was so top of my mind This is probably way too much information, but when we were in the hotel room the day before tracking him, I was literally tracking ovulation. Like, I was like, it was so top of my mind of thinking, what happens for our children? I just feel like we have to do better for them.
Starting point is 00:15:37 And so, it was wild. We went out there and a couple months later, I found out I was pregnant, and this felt so personal to me. I just don't want my child to grow up in a world where people think they can control women and girls. It spreads out. And we had a team of women in the field, which is pretty incredible. And the producer I was working with,
Starting point is 00:15:58 who worked on Mostly Human, which was my show at CNN, she was six months pregnant in the field. And we had this moment. She still wanted to come. I was like, are you sure you want to come? She's like, 100%. I'm like, okay. We're doing like car stakeouts, and she's literally six months pregnant at the time. And I remember we had confronted him at the hospital,
Starting point is 00:16:15 and he left through another door, like he was able to get out. He took some kind of car out, because we were right near where he had parked. We didn't know his home address was. I remember feeling like a little bit of a dead end. We came all the way out here. We wanted to get some answers. We wanted to ask for some kind of comment and understanding of how you could have created
Starting point is 00:16:33 this thing that became so big without any accountability. I'll never forget, we were in the car, and Roxy, who's the producer I was working with, she was like... Because we had just figured out he was a dad, because we had gone to his parents' home and we saw a baby seat, like a car seat in the car. And I was like, is Mr. Deepfakes a dad?
Starting point is 00:16:52 And she was like, let's call a local toy store and see if he's registered. Like, pregnant Roxy is saying this, and I'm like, oh, that's actually probably not a bad idea. And we ended up calling. Mr. Deepfakes was registered, I guess, for his child, and like, we were able to somehow get his home address from that, which was just this... It took, I think, a lot of women,
Starting point is 00:17:09 just in the only way I feel like a pregnant person would think. We're trying to figure out a better future for our children. And the reason I focus so heavily on Mr. Deepfakes, because it's not just about Mr. Deepfakes, it's about the future of consent and bullying and being able to create like, create a better world for our children.
Starting point is 00:17:26 And I think that was really personal to me because I was thinking about having a child during this investigation. Then I got pregnant, and then I had a child. It's been a wild journey, but it makes it, I think, really meaningful that the site is now down. As of the last couple weeks, the site was down, and I would say it took probably part of it.
Starting point is 00:17:45 Us showing up at his door, other people beginning to understand who he was. It took people creating friction, Google deranking the site. So it took all this friction, but it was such a win, because I think so many people sometimes say, oh, it's a game of whack-a-mole. It's what? You take one down, there's gonna be so many others, and I just don't buy it. Do you know how many women are gonna sleep better tonight
Starting point is 00:18:05 because of this? And if it's like a game of whack-a-mole, we just whacked like a giant one. So, that makes me sleep better. л Yes, me too. Do you know if he had a boy or a girl? I don't know. It's messed up in both ways. I think he might have had a boy.
Starting point is 00:18:18 And I'm not positive when we did a little investigating, which is just crazy to me. And I might sound like a total crazy person now. I always try to understand the why. I think it's too simple to be like, you're just this terrible person and you've done this thing. I think it's actually in trying to understand the why, maybe the more interesting reasons.
Starting point is 00:18:38 He reminded me a little bit of Ross Ulbricht from Silk Road, the guy who created one of the largest sites on the dark web where illegal things were bought and sold. Ross very much had this libertarian ethos of, this is kind of the future of the internet and all these things. I can't speak for David. That's the name of one of the creators of Mr. Deepfakes, according to all of our evidence.
Starting point is 00:19:00 I can't speak for the why, but I do think that it started as more of a hobby and an interest. Deep fakes and also porn and all this stuff. And I don't know if there was just a lack of empathy. If maybe he didn't believe that the harm was real. I think that those walls closed in on him. I think the stakes got higher as the psych got bigger and as people started talking about it more.
Starting point is 00:19:22 And as more people started saying, this is really harmful. He never shut it down until a couple weeks ago as the site got bigger and as people started talking about it more, and as more people started saying, this is really harmful. He never shut it down until a couple weeks ago when it was shut down. So, I have no idea where he is now. I've... You get fired? I don't know if they fired him.
Starting point is 00:19:39 There was a report that he could potentially be overseas. I don't know. Does his wife know I have... I had that question too. I mean, so I reached out to her after all this happened and his name is out there and the site is down. She hasn't responded. I did at one point show up at his parents' home. It didn't feel like, I always think it's important
Starting point is 00:19:57 for it not to feel like, oh, gotcha, I'm gonna get the bad guy. It felt sad. We grew up in like a beautiful neighborhood where kids are playing on the street. I didn't get the sense his parents knew, but I don't know. Like, he spoke to his father very briefly before he went inside.
Starting point is 00:20:14 And I didn't want to, how do I say this? I didn't want to stay for too long and be harassing at all. There's always this fine line of asking. But I never wanted to be that. We've seen that just without any empathy. And I'm not saying like I need to have empathy for this, but I think like to be that. We've seen that. Just without any empathy. And I'm not saying, like, I need to have empathy for this, but I think, like, empathy is the thing that we lack.
Starting point is 00:20:32 So many instances, it's the whole reason, I think, we're seeing a problem in sexually explicit deepfakes. People don't realize that there's real harm here. And so I... A person and a family. Yeah. I like to think that we showed up with a certain amount of empathy and being inquisitive without harassing his family. I think I walked away feeling really sad.
Starting point is 00:20:51 How did it affect you? I think I get frustrated sometimes because it's like... I thought for so long, this is why it's so big, right? It starts here, then it goes to schools. Then it goes to democracy where we can make anyone say anything. And then it goes to conspiracy. So it goes to democracy, where we can make anyone say anything. And then it goes to conspiracy. So I always like to be like, how do I frame this to different people?
Starting point is 00:21:12 And I think sometimes it can get frustrating to be able to be like, no, it's not about the shady site. It's actually about safety and consent. And it's about a tech threat that you don't realize. It's not what all the tech bros are talking about, which is AI becoming conscious and like Terminator. I'm like, no, no, no, this threat is already here and it's impacting your children.
Starting point is 00:21:32 And I think sometimes that can be frustrating to me because I'm sometimes a couple years ahead on this and I feel like I talk about it and people are like, huh? But I do think people are really understanding and I don't blame them. It's a weird one to wrap your head around. But yeah, I think it's, you have to, like, divide in certain ways. I said this to my colleague this morning,
Starting point is 00:21:51 because we were speaking to a woman whose son ended his life after an A.I. sex-torsion. Someone using A.I. did exactly what I explained at the beginning of this. And he ended his life. And... How old? I think he was, like, 14. He was a teenage boy. And I said to Nicole, I'm like, because we're going into turbo mode, we're like, go, go, go.
Starting point is 00:22:09 And I think sometimes if I sit, I'm like, man, like, I have a boy, right? That's so messed up. And I can't almost, whatever to say, like, I can't sit in it for too long, but I think feeling it is probably the most important thing. And how do we, as a tech, like, for my company, and like trying to like tell stories about technology, like how do we produce humanity and how do we produce empathy and just use tech as our way to do that? I think part of that is you have to feel it and you have to like not just be outraged,
Starting point is 00:22:40 but be organized about that outrage and be able to tell that story and let other people tell their stories and see them. So, I don't know, that's a roundabout answer to say, I think I do okay with it. Good days, bad days. And I think it's weird when you have a child and you just look at, like a child is so innocent and amazing and like, you're just obsessed with your kid and you're like, I don't want you to see this world. I want, I want to, I want you to have the best world.
Starting point is 00:23:09 Hold onto your wallets. Money rehab will be right back. And now for some more money rehab. It's a reality I think that we're gonna see in a few years because you're always ahead on these trends. Like, when you and I were growing up, guys still looked at Playboy. And then they moved into porn online.
Starting point is 00:23:36 And we've seen how that's affected men. And so is the next generation gonna be involved with user-generated AI porn? I think that's the thing that's so scary, which is like, at least like Playboy, they consented. There are all these issues that we think about when it comes to this. But now it's like the big thing and one of the biggest questions about the future of artificial intelligence,
Starting point is 00:23:55 and we're seeing this play out in Hollywood. We're seeing this play out literally with writers. We're seeing this play out everywhere is consent. Did you consent to have your materials uploaded? Did you consent to all of these things? I think when we look at this through the lens of consent, it's should anyone be able to have the power to make anyone do anything without their consent?
Starting point is 00:24:18 I feel like this is like a no-brainer, the answer is no, but it's a wild west. Oftentimes, by the time we're having the conversation, it's too late to have the conversation, because in the time that Mr. Deepfakes has risen and also fallen, there are all these new-tifying apps, right? There are all of these apps that have been popping up that allow people to do this with so low friction.
Starting point is 00:24:38 You don't have to be high-tech to do this. It's just a couple clicks. And now we're seeing the conversation around that, and thankfully, the laws are catching up, but the genie is certainly out of the bottle at this point. SHANNON. In that time, it sounds like this horrific story of a young boy killing himself came from another site.
Starting point is 00:24:57 So you're playing this game of Whack-a-Mole. There are obviously other moles. TAMI. Yeah. And I think it's, how do we educate parents now to say, okay, what are the conversations we need to have with our children so we can keep a really open environment? If something like this happens to them, they don't feel embarrassed.
Starting point is 00:25:13 They don't feel like ashamed to come to us and say, hey, I received this photo and I didn't take it, or I did, who cares? Being able to even be prepared for these types of things so we can get in front of what's gonna be inevitable. You have these groups online that are now targeting folks. And so it's like Mr. Deepfakes was just our way in
Starting point is 00:25:32 to talking about like a deep fake world where we can't really believe anything we see, where our likeness is weaponized against us, where our most intimate qualities are mimicked by artificial intelligence. And that can seem scary, but the biggest thing for me, honestly, is giving people agency.
Starting point is 00:25:51 There's also stuff we can do to get in front of it. The idea that Mr. Deepfakes is down, there was so much friction created that like legally made it very difficult for this site to operate the way it was. That's agency. That's like saying, we're just not gonna live in that world. We can actually make changes, and AI can work for us.
Starting point is 00:26:10 It doesn't have to work against us. It's a tool. There's so many amazing things that AI can do. You're so into the tech world. You've covered it for a couple of decades. You know all the major tech founders was Meta, TikTok. What did they say? Some of these companies have done better than others. It's also like a closed model,
Starting point is 00:26:32 so it's harder to have AI generate these types of images. They have worked very hard against this. But some of these other open source models make it easier for this type of thing to happen. The thing that I've been obsessed with, I feel like this is my next thing, is as I started digging into Mr. Deepfakes, and I was like, I want to talk to
Starting point is 00:26:50 other women who have been victims of this, and survivors of this. I spoke to a woman recently, her name is Bree, and she is so hard to describe, other than joy in the morning. She is a local meteorologist outside of Nashville. She is loved in her community. I feel like she's the person who walks down the street
Starting point is 00:27:07 and people hug her because in news right now, the meteorologist is the least controversial figure ever. And they are in your basement with you when there's a tornado telling you what to do. So she's really loved. And I got in touch with her after seeing her, she was trying to get a law passed in Tennessee, after she was like, all of a sudden on Facebook Meadow,
Starting point is 00:27:29 she would post something and then a fake Brie, who seemed like her, fake profile, would respond to her fans and say, hey, reach out to me on Telegram more soon. And she had someone message her and say, I think your husband has been sending nude photos of you out, and she was, no, he hasn't. And they were using deep fakes of her to make it look like she was nude. And they would get her fans to go on a Telegram account. And they
Starting point is 00:27:55 would send this. And one of these scammers said, meet me at a hotel in Nashville, pay me like X amount of money. And here's a taste, and sent these images of her. And there was literally, she was shocked by this. And then another one reached out to one of her fans, got them on Telegram, they think it's her, and said, join my VIP fan experience. There's another one that said, I'm in a terrible relationship and I can't get out,
Starting point is 00:28:20 there's abuse, like, lies. All of these are lies. But preying on her fans and like utilizing AI and sexually explicit deep fakes, and they used an AI-generated video of her to say, no, it's really me. And all of a sudden, we started looking into it. And we worked with a security company called Vermilio, and they did like an analysis of how many fake profiles
Starting point is 00:28:40 of her were out there. And 5,000 and counting. So she was living this whole fake life on the internet fake profiles of her were out there. And 5,000 and counting. So she was living this whole fake life on the internet where people were profiting off of her likeness. They were sexualizing her. They were doing all this stuff. And she reached out to Meta many times with profiles.
Starting point is 00:28:59 And she told me the woman said to her, I don't know what Telegram is. I was like, you know what Telegram is. Oh, I would love to look in your Telegram, by the way. Oh, my God. I've been talking to one of her scammers for weeks now. It's wild. As yourself? As a fan, to try to understand.
Starting point is 00:29:15 But I think, like, the biggest thing is we don't even realize it, but our identity has been taken and AI is front row center here, and we could be living these fake lives on the internet, we don't even realize it. Selling crypto, selling sexually explicit deep fakes, all of these things, because it wasn't just her. She started talking about this and all these other meteorologists came out and said, this is happening to me.
Starting point is 00:29:33 I realized it was happening to me. There's multiple fake lorries out there selling crypto scams. Can we check if it's happening to me? Yes, 100%. This is like my latest obsession. I think we are living these fake lives out there, and I am sure the tech companies know, they are, people are reporting it. I think they know about this.
Starting point is 00:29:51 And I think this is tip of the iceberg. Deepfakes are our way into talking about a whole deep reality where none of us are immune, and we're just beginning to see that. WHOO! Thank you so much for the work you do. I am officially scared. And a lot of this extortion or sextortion is around money. They want money in crypto.
Starting point is 00:30:13 And so we end our episodes by asking all of our guests for a tip that listeners can take straight to the bank. So how can you protect yourself? SHANNON COFFEY I would go back and say what we were talking about at the beginning, because it's a real tangible thing, this idea of a human password, and also monitoring your accounts
Starting point is 00:30:29 and making sure there aren't those small charges, right? If there's a small charge on your account, you're not sure where it comes from. Like, oftentimes, this is what scammers will do. They'll try to see if they can get away with a little, and then they'll go and charge a lot, so that's definitely one thing. And I think really trying to talk to people you love,
Starting point is 00:30:47 tell your parents, tell your friends, like, these links that are coming up, these text messages that you are getting, these emails, like, you have to be 1,000% sure before clicking and sending your information, because now they are personalized. These scammers are getting better and better. They make it seem high stakes.
Starting point is 00:31:06 And I hate to say this, because I don't want to end it in a sad way, but like, question everything. If you need to, if you're getting some stuff from the bank, call the bank. Actually, not the number from the text message where they send it, but look up online, the number to your bank and call it, or go in person, right, and triple check.
Starting point is 00:31:20 Because these scams are getting really sophisticated. They feel very personal, and they're coming from all directions. And I think being able to understand that is gonna be really important for the future. Listen, I just got identity theft-ed. And so, I'll just tease this. We're going after you, Mr. Identity Theft.
Starting point is 00:31:37 We are coming for you. That's our next episode. Laurie's gonna find you. I'm going to your rack. I'm going to your parents' house. I know, friends and security. Yes. Yes. So So you're going down. Yeah. Money Rehab is a production of Money News Network. I'm your host, Nicole Lapin.
Starting point is 00:31:53 Money Rehab's executive producer is Morgan LeVoy. Our researcher is Emily Holmes. Do you need some Money Rehab? And let's be honest, we all do. So email us your money questions, moneyrehabatmoneynewsnetwork.com to potentially have your questions answered on the show or even have a one-on-one intervention with me. And follow us on Instagram at MoneyNews and TikTok at MoneyNewsNetwork for exclusive video
Starting point is 00:32:16 content. And lastly, thank you. No, seriously, thank you. Thank you for listening and for investing in yourself, which is the most important investment you can make.
