The Journal. - Teens Are Falling Victim to AI Fake Nudes
Episode Date: July 12, 2024
Last fall, nude photos of a 14-year-old student started spreading around her high school. But they weren't real… they'd been created with AI. WSJ's Julie Jargon breaks down how fake photos like these are a growing trend among teens and why it's difficult to deal with.
Further Reading:
- 'I Felt Shameful and Fearful': Teen Who Saw AI Fake Nudes of Herself Speaks Out
- AI Fake Nudes Are Now a Frightening Reality for Teens
Further Listening:
- Artificial: Episode 1, The Dream
- He Thought Instagram Was Safe. Then His Daughter Got an Account.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Last October, Elliston Berry was heading into her final week of school before fall break.
At the time, she was 14, a freshman at Aledo High School in Texas.
I had just had my homecoming dance, and it was peak volleyball season, and I had a game the next day.
So can you tell me about the day it happened?
I had woken up on that Monday morning with a whole bunch of calls and text messages from
one of my really, really good friends, and she had told me that these photos of her and I,
as well as another one of our friends, was going around.
And at first, I was completely shocked.
The photos of Elliston and her friends were fake nude photos,
and a classmate had made them using artificial intelligence.
In total, there were pictures of nine girls.
One of the photos showed Elliston naked on a cruise ship.
Another girl appeared to be naked at the beach.
They were doctored versions of real photos that had been taken from their social media profiles.
It was terrifying knowing that these were going around and I had no idea.
And the photos were realistic.
I hate to admit it, but at first glance, if you hadn't seen the original photo, it's very, very realistic.
Could you tell how much these images were spreading around online?
I wasn't really able to know who had seen them. I was told that the entire school had seen them, and my freshman class alone is 600 kids. So in the entire high school, that
just sounded awful. So it was really scary, and to this day, I have no idea who all has seen them.
Elliston was victim to a growing trend of AI photo manipulation,
as the tools that make these images have become more accessible.
And for many parents, schools, and the government, this is uncharted territory.
Welcome to The Journal, our show about money, business, and power.
I'm Jessica Mendoza. It's Friday, July 12th.
Coming up on the show, a new threat for teens, fake nude photos made with AI.
In the past few years, there have been a lot of viral moments
when fake videos, photos, and audio have put famous people in awkward situations.
Like earlier this year, when someone made an audio deepfake of President Joe Biden telling people not to vote in the New Hampshire primary election.
The message, which was sent out yesterday, apparently used an artificial intelligence voice to simulate President Biden's voice.
Or when someone made fake nudes of Taylor Swift.
Sexually explicit deepfake photos of Taylor Swift have been circulating online.
And now, the problem has made its way into schools.
Our colleague Julie Jargon has been reporting on it.
I first heard about fake nudes through AI last fall when I was working on a story about an incident in Westfield, New Jersey, at a high school there, where a teenage boy had allegedly created fake nude images of some of his female classmates and then shared them in group chats.
There have been a number of middle schools and high schools around the country where this has
happened. So several that have made news, and probably a lot more that haven't made headlines. So it's something that's definitely happening
across the country at a variety of different schools.
Were you surprised by what you found when you were doing this reporting?
I was really surprised when I heard about it. This was the first time I'd really heard about this happening among teenagers. And
it was really kind of shocking that this technology had gotten into the hands of
young people. It wasn't just, you know, sophisticated, technically savvy people that
knew how to create really realistic looking images. This was now available to anyone.
All users have to do is upload a photo to an app
and ask the AI to make a version where the subject is naked. And in many cases, it looks real.
And this is just these apps, these sites, they're just available online or on the app store,
that sort of thing?
Well, I think they're becoming harder to find on the app stores.
Google and Apple have been removing nude-generating AI apps from their app stores,
but they have been there at least in the past. And, you know, maybe they're going by different
names now. They're probably getting sneakier. Wall Street Journal reporting shows that while
some apps have been taken out of app stores, more have shown up recently. And once the images are made, users can spread them quickly through social media.
They're sending them in group chats, either through Snapchat or other messaging platforms,
you know, sending these pictures to their friends and it's circulating that way.
And like, why is it so hard to stop or control that spread?
Well, once these images are out there, they can circulate very quickly and very broadly.
So I think that's the fear here is where do these images go?
And do they ever really disappear?
Back in Aledo, Texas,
Elliston Berry learned about the fake nude photos of her on a Monday morning before school.
And at first, she was nervous about telling her parents.
I honestly wanted to push
it under the rug and not tell them. But as I was getting ready that morning, my friend who had sent
the photos said that she told her mom. So then I came to the conclusion that I need to tell my mom.
And I was so scared.
So she comes in my room crying. You know, I'm immediately disturbed by it.
That's Anna McAdams, Elliston's mom.
And I'm like, what's going on, you know?
And she shows me the pictures, and I didn't know what to do.
I mean, really, in that moment, it was like, okay, I'm mom.
I'm supposed to protect her.
So, you know, I just went into mama mode.
Anna took Elliston to school.
While Elliston started her day, Anna went to the administration and told them what had happened.
Soon after, Elliston gave a statement to the police.
And then she went home and did the rest of her classes remotely.
So, Elliston, you stayed home on Tuesday.
What was that day like for you?
I was getting text messages from people all throughout the day just asking me what happened,
or they have more information, or they're trying to figure it out. I just got so many responses,
and I was very overwhelmed.
Were you, like, talking or messaging with your friends who were also part of this?
Yes. We all have a group chat and we kept in contact with each other all day and we're
checking up on each other and if there was any new leads or any new information, like we were the
first to talk about it.
At the time, did you have any idea or any guesses as to who could have done this?
Well, that Tuesday, one of my friends had reached out,
and he gave me a list of boys and said that,
Oh, well, we're figuring it out. We're going to do it. We're going to figure this out.
Elliston stayed at home for a few days,
but by Friday, she decided to go back to school.
And that day, the student behind the fake photos was caught.
Here's Elliston's mom again.
This kid gets on school Wi-Fi and starts posting more pictures on his account.
And so the tech with the school was able to figure out who it was
because he went on the Wi-Fi.
So they caught him.
And without a lot of precedent for these kinds of situations, it's not always clear what to do next.
That's coming up.
Because fake nude photos made through AI are a relatively new phenomenon,
schools don't really have a typical way to deal with them.
Here's our colleague Julie again.
There's no one standard that's being used across school districts.
But it's usually that parents and schools speak with their local law enforcement,
either their county sheriff or local police,
and maybe it goes to a district attorney.
So it tends to follow the local sort of criminal reporting paths.
And what can schools do in these situations?
From what I have found,
there's usually some sort of school disciplinary action,
but schools are usually pretty behind in terms of their policies around the use of AI. And so, as typically happens, technology moves faster than laws or school policies do.
And so schools are struggling to figure out, you know, what does this fall under in terms of a disciplinary action?
At Elliston's school, using AI to generate fake nude photos of fellow classmates wasn't in the code of conduct. Her mom, Anna, said that it was difficult to get information about the student
who made the photos and to learn what the school was going to do about it.
They ended up putting him in suspension, but the school never would tell us, like, who it
was, how long he'd be in suspension. You know, like, we were afraid every day that semester, okay, we're
going to show back up to school and he's going to be in class. And really, the girls just felt so
unprotected in all of it, not knowing what was going on. A spokesperson for the Aledo Independent
School District said the school assisted the local sheriff's department with its investigation and punished the teenager who made the photos in accordance with school policy.
The spokesperson also said, quote, the district is reviewing its student code of conduct with the possible use of AI by students in mind.
Eventually, Elliston and her mom guessed the student's identity when his parents
took him out of school.
Did you know who this boy was?
I did. I had a couple classes with him in
eighth grade, but he was a classmate and we were mutuals on social media. I didn't see him as a
threat. It was really shocking knowing that he did this because he was a peer, he was a classmate.
The school district declined to comment
about the enrollment status of the teenager
who made the photos.
The local district attorney said
he couldn't give specifics because the student was a minor,
but that the boy was sanctioned
within the juvenile justice system.
Meanwhile, Elliston's mom, Anna, decided to take some steps on her own.
Could you walk me through your decision to come forward publicly and talk about this?
You know, I was like, we have to do something about this. I just called anybody and everybody
that would listen because I was like, this is not going to go away. So, and it could happen to her
again. You know, it's like, I couldn't prevent it the first time,
but if I could do something to help prevent it in the future.
Anna eventually got in touch with the office of Senator Ted Cruz,
who listened to Elliston's story.
In June, Cruz, along with Democratic Senator Amy Klobuchar and others,
introduced a new bill.
It's called the Take It Down Act.
Here's Cruz talking about it.
This is an issue that really cuts across partisan lines, ideological lines.
There's a real need here.
People are being victimized.
We're seeing thousands upon thousands of people being targeted, predominantly women.
The bipartisan bill would criminalize the publication of non-consensual nude images, real or fake.
It would also require social media companies to take down images and the accounts that posted them within 48 hours.
Have you seen the text of the bill itself, the Take It Down Act?
And how do you feel about it?
I think it's first steps. You know, the two things that are so huge about it are, one,
that there would be consequences. And then the second part is getting accounts down. So we have
no idea how many times they were shared. You know, the everyday person should be able to know that if there's
something that happens to them like that, they should know that those images can be taken down.
For Elliston, the incident has left her feeling anxious. She says that she cleaned up her social
media accounts, removing followers and contacts she's not close friends with, and that she's less
social than she used to be.
Do you think about whether these photos could impact you later in life?
Absolutely. That was one of my biggest fears.
I was scared that these photos will come about when I'm applying for college or for a job
application, or if my future kids were to search my name, these would come up. I will live with these photos for the rest of my life
and have that fear of them resurfacing.
And that was terrifying,
knowing that there was nothing I could do in that moment.
And it was scary.
Does this make you all, like, anti-AI?
I mean, AI has always been, like, a weird concept,
but it's very aggravating knowing that in the school conduct,
we have regulations on AI when it comes to cheating and creating.
If you use AI to cheat on a test or write its own essay,
we have regulations and punishments for that.
But for a more serious
matter like this, they have nothing. So it's frustrating knowing that the school is able to
put regulations when it comes to academics, but when it comes to like mental health and physical
and emotional things like that, they don't have anything to do. So I'm kind of more angry at
like the school.
Well, thank you very much, both of you,
for taking the time to chat,
for sharing the story
and being so open about it.
So thank you.
Thank you.
Thank you a lot.
That's all for today, Friday, July 12th.
Our engineers are Nathan Singapak and Peter Leonard.
Our theme music is by So Wiley.
Additional music this week from Katherine Anderson,
Marcus Begala,
Peter Leonard,
Bobby Lord,
Emma Munger,
Nathan Singapak,
Griffin Tanner,
and Blue Dot Sessions.
Fact-checking by Mary Mathis
and Najwa Jamal.
Thanks for listening. See you on Monday.