The Daily - Real Teenagers, Fake Nudes: The Rise of Deepfakes in American Schools

Episode Date: June 7, 2024

Warning: this episode contains strong language and descriptions of explicit content and sexual harassment.

A disturbing new problem is sweeping American schools: students are using artificial intelligence to create sexually explicit images of their classmates and then share them without the person depicted even knowing. Natasha Singer, who covers technology, business and society for The Times, discusses the rise of deepfake nudes and one girl's fight to stop them.

Guest: Natasha Singer, a reporter covering technology, business and society for The New York Times.

Background reading: Using artificial intelligence, middle and high school students have fabricated explicit images of female classmates and shared the doctored pictures. Spurred by teenage girls, states have moved to ban deepfake nudes.

For more information on today's episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.

Transcript
Starting point is 00:00:01 From The New York Times, I'm Sabrina Tavernise, and this is The Daily. A disturbing new problem is sweeping American schools. Students are using artificial intelligence to create sexually explicit images of their classmates and then share them without them even knowing. Today, my colleague Natasha Singer on the rise of deepfake nudes and one girl's fight to stop them. It's Friday, June 7th.
Starting point is 00:01:06 So, Natasha, for the past year, you've been reporting on artificial intelligence in American schools. And your most recent reporting has uncovered a pretty new and very disturbing trend involving AI. Tell me what you've been finding. It's been a really striking year on my beat because we're seeing kind of a rapidly spreading new form of peer sexual exploitation and harassment in schools, which is called deep fake nudes. And what that means is that we're seeing middle and high school boys across the United States using these widely available nudification apps to allow people to surreptitiously take clothed photos of girls and women and without their knowledge or consent, use these apps to remove their clothes. And so it looks like a simulated nude of a ninth grader who, you know, has posed for a photo at, say, a high school football game or her high school prom in a dress,
Starting point is 00:01:58 except when these boys use these apps, it looks like the girls have posed for these nude photos at these events. Right. And the thing is that boys have been sexually harassing girls online for decades, including using Photoshop to make fake nude pictures. But the difference here is that generative AI is an exponentially more powerful tool. It makes it look realistic. But second of all, it enables the mass production of these kind of fake nude images of teen girls very, very quickly. And one expert I interviewed pointed out to me that one boy with his phone in the course of an afternoon can victimize 40 girls and then the images are out there. Right. And so the possibility of harming many girls instantly and permanently is what is the concern here.
Starting point is 00:02:53 Okay, so this is a very disturbing and very new problem. Where did your reporting to understand this problem start? So one recent afternoon, Sydney Harper, a producer at The Daily, and I traveled to Westfield, New Jersey. Do you want to tell us a little bit about what we're looking at, what we're seeing? We are driving through downtown Westfield, which is a small, affluent community of a few thousand people. It's this idyllic, tiny suburb outside of Newark. Very manicured lawns. Very manicured. Definitely landscapers employed. With manicured lawns and lush dogwood trees in full bloom. So here we are at the Manis'. And then we went to see Francesca Mani and her mom, Dorota.
Starting point is 00:03:49 Oh my God, I thought you guys were coming at 4. You said 3.30. Did I? Come in. Give me three seconds. Dorota was surprised to see us at first because she'd been expecting us a little later in the day. Hi, Francesca. Hi. It was like a half hour after school was over and Francesca was in the back, you know, sun tanning outside with the dog.
Starting point is 00:04:09 And after we said our hellos, we sat in the living room and Francesca started to tell us more about herself. So I'm Francesca Mani. I'm 15 years old and I'm a sophomore at Westfield High School. Francesca is a typical teen in many ways. I love playing sports. My main sport is fencing. I love hanging out with my friends. She plays sports. She is training in fencing.
Starting point is 00:04:34 She's really gregarious, and she's really confident. What was your first inkling that something had happened that had to do with the girls in your class? Well, Francesca began to tell us what happened to her at school last October. So I found out that like there was a group of kids in my school and they were taking like girls' faces, pictures of girls' faces and putting them on like AI generated like pictures. And it was like not rumors, but it was like word to mouth. It was like, you know, like the game telephone was like going around.
Starting point is 00:05:09 Yeah. She found out that boys in her class had created these AI-generated fake nude images of some of the girls in her class. It was in history class, the second period. And I was just sitting down, and then I came back from the bathroom, and, like, the girls were talking about it. I was like, hey, what's going on? And then people were, like, talking about it, and I was like, oh.
Starting point is 00:05:28 And we were all super surprised. The girls hadn't seen the images. They'd heard about them from boys who had seen them and who knew who was in them. Francesca said the girls made a Snapchat group to discuss what they should do, and then they decided that some girls needed to go down to the principal's office and notify administrators about these deepfake nude images. And what did you tell the principal? We were like, there's this group of kids taking pictures of girls' faces and putting them on AI-generated bodies.
Starting point is 00:05:56 And we were just worried, and we didn't know if it was, like, one of us. So we just went down, like, even for the sake of the other girls, we just went down to, like, tell the principal. So the school isn't even aware these images exist until Francesca and her friends notify administrators. That's right. So then Francesca goes back to class and goes about her school day. Like, the day went on, but a lot of girls were worried. A lot of girls were, like, crying. And, you know, I didn't really think it was going to be me.
Starting point is 00:06:23 So I was doing my own thing. I was going to my classes. There was a lot of like counseling or something, but I needed to stay on my work. And at this point, Francesca isn't thinking that any of this is going to affect her directly. But as time goes on, she begins to hear something troubling. It was during fourth period. I was going to go print something out with one of my friends, and I see one of my other friends, and I was talking to them, and they said that they think there was one of me. She heard that she might be one of the students who had these deepfake nude photos made of her. So as she's worried about this, the school begins to announce the names of girls in her class over the loudspeaker and ask them to come down to the office. And I hear my name over the intercom telling me that I have to go down to the principal's office. And at that moment,
Starting point is 00:07:21 I knew it was one of me because I think all of the girls were getting called down from the intercom. So I went down, and then the principal told me I was one of the victims, and then that's how I found out. And how did you feel about that? So basically, in the beginning, I felt shocked because I didn't think it could happen to me, and that's the funny part because it could happen to anyone. I'm kind of sad, too, because it's, like, surprising
Starting point is 00:07:49 because I didn't think my classmates could do this to me. Francesca is upset and also feeling like there's a double standard, like the girls are not being treated by the school with the same sensitivity and privacy as the boys who allegedly made these fake nudes. What I found weird is that they announced, like, school with the same sensitivity and privacy as the boys who allegedly made these fake nudes. What I found weird is that they announced like the victims over the intercom while the the boys that were getting investigated for it were like pulled out of class privately. So no one could know who it was. But we girls had to get, you know,
Starting point is 00:08:20 called over the intercom, which I think is like, like kind of violating our privacy. Who heard that announcement? Everyone. Because the class is finished and usually people speak over the intercom, so everyone heard it while they were in the hallway. And as she walks out of the principal's office. So I'm walking down the hallway to my math class and I see these group of boys laughing at these group of girls crying about the situation and I'm just like, are you kidding me? The first thing she sees is a group of girls standing together crying
Starting point is 00:08:53 and a group of boys laughing at them. So that's when like I started like boiling. I was super mad and like I just thought it was not fair because it's a serious thing. Like why are you laughing at it? And what did it make you want to do when you were mad? Well, I wanted it to stop because girls shouldn't be, like, laughed at because of something that had happened to them. And especially by, like, you know, a guy.
Starting point is 00:09:19 Like, I really don't think that's fair. You know, it is a pivotal moment for Francesca. She sees the girls crying, and she talks about how she was just pissed off, and she wanted to do something about it. And she says to her mom, how are you going to help me do something about this? And I was like, Mom, this is what happened at school. We need to do something because I'm super mad this happened to me and to other girls, and I think we could do something to fix it.
Starting point is 00:09:54 And what does Francesca's mom do when she hears what happened to her daughter and her friends? Well, the first thing her mom, Dorota, sets out to do is find out what consequences there will be for the boys who did this. One of the vice principals called me and informed me that, yes, there has been a situation where boys created nude images, but, you know, the boy has been suspended. So I asked the legitimate question of for how long? Dorota says she learned from the school that it's primarily one boy who has made and circulated these images and that he will be suspended for just one day. You know, there was no consternation. There was no, you know, we're thinking about it. We're still investigating. We're going to decide that for now he has been suspended.
Starting point is 00:10:46 for now he has been suspended. And Dorota's had several conversations with administrators about what has happened and what they're doing. And she is beginning to feel that they're not taking seriously and they're not doing enough. She's like, but don't worry, he is gone. And I'm sitting there and I'm saying, gone where? Gilligan's Island? He's going to be back on Wednesday. Again, you know, I'm composed at this point. I said, well, he's back on Wednesday, isn't he? Do you think as a mother, as a woman, do you think this is the right approach? She says, well, you know, this is what it is. You know, this is what we have decided. And I said, well, then I want to inform you that I'm not going to let it go. And the combination of the idea that, A, the school has announced the names of the girls
Starting point is 00:11:28 over the loudspeaker, further humiliating them and violating their privacy. A, B, the boy who has instigated this has been suspended for one day and maybe is taking another day off, and then he will go back to school with the girls he did this to. These things in combination make Dorota feel that the school is not doing enough to protect the girls and is not going to
Starting point is 00:11:52 do enough to punish the boys. And that's how the whole ordeal started. It completely activates her. And so one of the first things she does is Dorota writes a letter to the local online kind of village news website. And she explains what happened to Francesca. And she says something like, am I the only one who thinks that the school's response is inadequate? And she puts her phone number on this letter to the editor of the local Westfield publication. I think more than 300 people contacted me. It was like a hotline. Older women to young mothers, to parents of Westfield High School, students to counselors, some teachers. So we all connected together through that. She hears from the parents of
Starting point is 00:12:44 other girls. She hears from the parents of girls who have been subject to other kinds of sexual harassment in schools. She hears from local politicians. And she decides that she's going to hold a town meeting about this in her own living room. Put armchairs and dining chairs. And, you know, it was a full room of beautiful people that came to support us. And so she has this meeting and dozens of folks come, including the mayor and other parents and local legislators. And they talk about what needs to be done. And then we need to put code of conduct, AI policies and bullying policies in place. And there should be just one approach, right?
Starting point is 00:13:23 If you've done the wrong thing, you've done the wrong thing. It's as simple as that. And what comes out of that meeting is Dorota and Francesca decide they want to push school districts nationwide to update their policies to better protect girls from deep fakes and prevent this from happening again. again. So how does the school respond to all of this? So when I reached out to the Westfield Public School District, they said they had opened an immediate investigation into the incident and consulted with police, but that they couldn't comment on the specifics of the case or talk about any disciplinary actions against the boys for privacy reasons because the students were under 18. And when I asked them earlier this week whether they changed any of their policies
Starting point is 00:14:13 to address AI abuse, the district declined to comment. So, Natasha, the story of the Mannies, I mean, it really shows how unprepared schools can be. It seems like the school was completely blindsided by this and really fumbled it. But I guess in the school's defense, this is all really new, right? I think that's an important point. This is brand new AI technology. And maybe school administrators, principals, superintendents didn't even know that deep fake nudes existed or that there are apps you
Starting point is 00:14:48 can use specifically to nudify photos of women, right? So you could say they weren't prepared for this kind of technological abuse. And I spoke to parents in a number of districts and I reached out to schools in a number of states. And mostly the way that Westfield High handled this is typical. The typical response was there will be minimal discipline for the guy or guys who did this. And, you know, girls, this is just a fact of life. Or like, girls, yeah, this is really bad, but it will pass. And there did not seem to be understanding in many districts that actually this could be devastating for mental health and also that it could have long-term consequences for these girls. I guess the thing that's bothering me is it seems pretty clear that what these boys are doing is wrong. But is it illegal? I mean,
Starting point is 00:15:47 are they breaking the law when they're doing this? You would naturally think that it should be illegal for anyone to make deep fake nudes of 12 and 13 year old girls. But the answer to this question is a lot more complicated than you might think. We'll be right back. So, Natasha, you talked about the question of legality as being very complicated. What do you mean by that? Well, what I mean is that there are different laws that could cover this. And so let me break it down.
Starting point is 00:16:46 There's a whole section of the law that outlaws the possession and distribution of what's called child sexual abuse material. And that's explicit video or photos or images of underage children engaged in sexual acts. But that doesn't mean that every photo of an unclothed child is automatically illegal. For example, a photo of a naked child in and of itself may not be illegal. Like if you think of a parent taking a photo of their toddler in a bathtub. But if the image shows a child that is sexually suggestive or engaging in explicit sexual conduct, that is illegal. That's the red line. The big question, as we're starting to see more and more of these deep fakes, is where do these images fall on the spectrum of material? If an image shows a 17-year-old without clothes on, does that meet the federal definition of illegal material? It came up in my reporting because there is a case in a high school where boys made these deep fake nudes of their female classmates.
Starting point is 00:17:50 And according to the police reports, local police heard about it from the parents of the girls who complained, but they didn't hear about it from the school. And so the police detective went to the school and said, you know, these images fall under the child sexual abuse material statute. And you as a school are mandatorily required to report such images. And eventually the school district does report the deepfake nudes. But when I reached out to them to ask the school district about this, the school district sent me a note saying that they had reported the deepfake nudes of female students out of an abundance of caution because the school district's lawyers said these were fake, they weren't real, and maybe it wasn't necessary to report them. So even within school districts, there is confusion about the legality or illegality of these deepfake nude images.
Starting point is 00:18:52 Okay, so some of these deepfakes are actually illegal. Sexually explicit AI-generated images of minors. But some images are actually still pretty unclear. So what is being done to clear up that gray area, to make it more black and white? A number of families whose daughters have been subject to these deepfake nudes want to see a standard law or policies to protect their daughters and other people's daughters. daughters and other people's daughters. And so you see that some families are lobbying for states to pass laws to prohibit these images. And the Manis are one family who have been lobbying for New Jersey to pass a law to prohibit these deepfakes. But many states are introducing new bills and there are different approaches to doing this. Some states are expanding their laws on child sexual abuse material to include these AI-generated nudes and make those illegal.
Starting point is 00:19:58 Got it. there's another approach where states have laws on revenge porn, and that prohibits people who have taken consensual sexual images with their partners. You know, when they break up, if they're pissed off, it's illegal to post those online without the other partner's consent. And so states are taking these revenge porn laws and adding AI-generated nudes. And the third approach is to come up with entirely new laws that are specific to these deepfake nudes and to prohibit the possession or production or distribution of these AI-generated sexually explicit images of minors and sometimes both of minors and adults. So you're talking here about state laws. Is there anything happening on the federal front? So to be clear, federal law on child sexual abuse material applies here. It covers computer-generated
Starting point is 00:20:58 and real images. And the FBI has recently said that it covers these deepfake AI-generated images as well. And there are also efforts to address this more directly. The White House issued an executive order last fall, making it a priority to find ways to stop AI from producing these abusive images. And Congress also recently introduced bills to prohibit these AI-generated nude images and to allow victims to sue. But these efforts are not very far along. And really, the momentum for change is happening primarily at the state level right now. And it's important for states to do this because these cases are often prosecuted at the state and local level. at the state and local level. Okay, so you talked about the third approach, creating entirely new laws. What does that process look like on the ground?
Starting point is 00:21:50 What are you seeing pop up around the country? We've seen over the last two years, lawmakers in two dozen states introducing these bills that are specific to AI generated deep fake news. And these bills really vary in content and in the severity of the punishments. Massachusetts lawmakers are hammering out a new bill that would criminalize the sharing of these explicit images, but it takes a different approach to adults and minors. It would be a crime for adults to make these deep, fake nudes of minors.
Starting point is 00:22:28 But if it's teen boys making the same kind of images, they could first be sentenced to a diversion program, which means that they'd have to learn about what's problematic about these images. They'd have to learn about the responsible use of generative AI. They'd have to learn what the punishments are. So Massachusetts is taking a more graduated approach. You know, by comparison, Louisiana passed a law, and under the new Louisiana law, anybody who knowingly makes or shares or sells these sexually explicit deepfake nudes of minors could face a minimum prison sentence of five to 10 years. Oh, wow. Pretty tough. Right. And it raises the question of, should a teen boy who makes images of a teen girl face the same punishment as a 50-year-old pedophile who makes these images? Right. Important question. And that is very tricky.
Starting point is 00:23:23 It is really tricky. And it came up in my reporting because at the end of last year, police officers in the Miami area arrested two middle school boys for allegedly making and sharing these simulated nude images of two female classmates who were 12 and 13. And according to the police documents we got, the boys were charged with third degree felonies under a 2022 state law like the one that the Massachusetts lawmakers are working on. Or maybe, you know, they would have to register as sex offenders for something they did when they were 13 years old and that would follow them for the rest of their lives. And so there's a really complicated question about as new laws are being passed, how should students who are doing this in middle and high school be treated as opposed to pedophiles. So, Natasha, at the end of the day, we're left with a pretty complicated picture here and with a very hard question. This technology has resulted in real harm for young girls and their families. But at the same time, addressing that harm raises the question of criminalizing young boys. I think that's the crucial question, Sabrina.
Starting point is 00:24:48 And it underscores how schools and parents and lawmakers and law enforcement are just in the opening stages of figuring out how to deal with AI abuse. And I've come to believe, based on my reporting over the last year, that it's actually part of a much larger problem. Young people's increasingly toxic relationship with technology. Children and teens' relationship with technology changed during the pandemic. We all know that. Many kids and teens began spending more and more time online, and some kids became much more isolated. Schools tell me they believe that this has led to a spike in cyberbullying, and some psychologists blame it on kids' compulsive use of social media. And I think the technology issue is also connected to a whole constellation of problems that schools are dealing with post
Starting point is 00:25:45 pandemic, like increased depression among students and increased absenteeism. And so I think that this is not just a story about AI abuse in schools. It's a story about how technology is fundamentally changing childhood, adolescence, and education. And it's a story about how the grownups, whether it's school administrators or state lawmakers, parents, or members of Congress, are all racing to catch up. Natasha, thank you. Thank you, Sabrina. We'll be right back.
Starting point is 00:26:41 Here's what else you should know today. On Thursday, an Israeli airstrike hit a United Nations school complex in the central Gaza Strip that had become a shelter for thousands of displaced Palestinians and, Israel said, Hamas militants. Gaza health officials said 40 people were killed in the attack, including 14 children and nine women. The Israeli military said that its forces had targeted a group of about 30 militants, using three classrooms as a base, including some who
Starting point is 00:27:13 had taken part in the October 7th attacks. And Stephen Bannon, a longtime advisor to former President Donald Trump, was told by a federal judge to surrender to authorities by July 1st to start serving a four-month prison term. Bannon was sentenced in October 2022 for contempt of Congress after he disobeyed a subpoena to give testimony to the House committee that investigated the January 6, 2021 attack on the Capitol. to the House committee that investigated the January 6, 2021 attack on the Capitol. Bannon also faces a trial later this year on charges of misusing money that he helped raise for a group backing Trump's border wall. A reminder, we'll be sharing a new episode of the interview tomorrow. Those few days that we shot the pivotal scenes in the movie, I had to call home a lot.
Starting point is 00:28:08 I really was a tad unhinged. This week, Lulu Garcia Navarro talks with comedy legend Julia Louis-Dreyfus about what it was like to go to much darker places in her latest film. was like to go to much darker places in her latest film. Today's episode was produced by Sydney Harper and Shannon Lin. It was edited by Mark George, contains original music by Marian Lozano, Alicia Baetube, and Dan Powell, and was engineered by Chris Wood. Our theme music is by Jim Brunberg and Ben Landsberg of Wonderly.
Starting point is 00:28:56 That's it for The Daily. I'm Sabrina Tavernisi. See you on Monday.
