Your Undivided Attention - Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet
Episode Date: February 1, 2024

Over the past year, a tsunami of apps that digitally strip the clothes off real people has hit the market. Now anyone can create fake non-consensual sexual images in just a few clicks. With cases proliferating in high schools, guest presenter Laurie Segall talks to legal scholar Mary Anne Franks about the AI-enabled rise in deep fake porn and what we can do about it.

Correction: Laurie refers to the app 'Clothes Off.' It's actually named Clothoff. There are many clothes remover apps in this category.

RECOMMENDED MEDIA
Revenge Porn: The Cyberwar Against Women
In a five-part digital series, Laurie Segall uncovers a disturbing internet trend: the rise of revenge porn.
The Cult of the Constitution
In this provocative book, Mary Anne Franks examines the thin line between constitutional fidelity and constitutional fundamentalism.
Fake Explicit Taylor Swift Images Swamp Social Media
Calls to protect women and crack down on the platforms and technology that spread such images have been reignited.

RECOMMENDED YUA EPISODES
No One is Immune to AI Harms
Esther Perel on Artificial Intimacy
Social Media Victims Lawyer Up
The AI Dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
Hey everyone, it's Tristan.
And this is Aza.
And for this episode of Your Undivided Attention,
Aza and I are going to be handing over our hosting duties to our friend, Laurie Segall.
Laurie is a highly respected tech journalist,
and she's done some really powerful reporting on the topic of deepfakes and AI-generated porn.
And we are thrilled to have her insights and expertise for this episode.
So over to you, Laurie.
Thanks, Tristan.
You know, I've been covering the human impact of technology for 15 years, and I'm here today to talk about what I truly believe is one of the most profound risks with the recent rise in artificial intelligence.
And that is non-consensual, sexually explicit deepfakes that are generated by AI.
This is also commonly referred to as deep fake pornography.
These are AI-generated images and videos of real people, and an overwhelming majority of those people are women.
According to the Cyber Civil Rights Initiative, one in 12 adults on social media has been a victim of non-consensual pornography.
And that was before AI came along.
Yet, for some reason, we have a hard time talking about this.
Deep fake pornography receives far less attention than other risks like disinformation or plagiarism or cyber attacks.
But the impact is extraordinary.
And now, because AI-generated images of Taylor Swift went viral on X, the conversation is officially mainstream.
Swift might be one of the most famous women on the planet,
but she represents every woman and every girl when it comes to what's at stake.
I hate to see this happen to anyone.
But my hope here is that what happened to Taylor Swift
will finally spark a much-needed conversation
about the risk of deep-fake pornography,
like the one you're about to hear me have with Dr. Mary Anne Franks.
Mary Anne is an internationally recognized expert at the intersection of civil rights, free speech, and technology.
She's a legal scholar, writer, and an activist who specializes in issues of online harassment, discrimination, and violence.
She's the president of the Cyber Civil Rights Initiative, and she's a professor at George Washington University Law School.
So this is quite the resume, and I couldn't think of anyone better to walk us through all of this.
Mary Anne, thank you so much for joining.
Thank you.
I have to say I am so excited for this conversation.
It's weird because I feel like I've covered your career.
I've been following you for the last decade.
You've been at the forefront of all of this,
but I would say, like, for the last seven or eight years,
I called you, and I don't even know if it's okay to call you this,
but I've called you one of the angels of revenge porn,
or non-consensual pornography,
because I remember back in 2015, I did a series for CNN when I was their senior tech correspondent.
And I did this on non-consensual pornography, on revenge porn.
And I remember knocking on doors of dudes in the offices there being like, we have to pay attention.
There's this emerging threat online because of social media that's happening.
And that threat was men were taking sexually explicit photos of women and posting them to these sites that were popping up all over the internet that were devoted to
shaming women. But there was this small group of really incredible female lawyers. There were
like five or six of you guys. And the names just kept coming up. You were in this small group, who I
referred to as the Angels of Revenge Porn. People would come to you and you would fight against
this. And you knew how to fight against this. And you have helped change laws when it came to
this type of harassment. So that was my nickname for you since you've always been on the front lines.
I am delighted by this, really.
I think that I'd probably have to explain it at dinner parties,
but if I could fit this on a business card, that would be amazing.
You were just these modern-day female superheroes.
So I want to take that concept.
I mean, you were at the forefront of the fight against revenge porn.
And I say revenge porn, we could call it non-consensual pornography,
which is probably a better term for it.
How has this threat evolved with advances in AI?
Well, it's gotten a lot scarier, and it's gotten a lot more common.
So, you know, we're talking about the terminology to start with, right, what do you call this?
Revenge porn, non-consensual pornography.
We've tended to start using terms like image-based sexual abuse because I think that captures more of the range of really terrible things that are happening using technology.
But five or six or seven years ago, the main priority and what we were really seeing was actual intimate images of primarily women and girls
being taken either without their knowledge, or maybe images that they had shared consensually
with an intimate partner. We were seeing those being exposed without the permission of the
people depicted. And that's still happening. That problem has not gone away, but there are these
new variations that are also forms of image-based sexual abuse, including the deep fake phenomenon.
And what that really has done to transform the landscape is it doesn't matter if you
ever had a nude photo of you anywhere, actually. It doesn't matter if you shared it, didn't share
it. It doesn't matter if it never existed, because now it's possible for anyone to use very
easily accessible technology to make it seem as though you are naked or engaged in some kind
of sexually explicit activity. And all they really need is a few photos or videos of your face,
things that they can get from innocuous places like social media sites. And the next thing you know,
a person can produce an image or a video of someone that makes it really look as though
it's an intimate depiction when, in fact, it never took place.
When I try to explain this to people and I try to say, hey, this is coming, it is so bad.
But now with what happened to Taylor Swift, it is officially here.
So this is definitely a risk.
I think about it like this.
Like, imagine if every time I met you, I wondered if you'd seen me having sex with a stranger.
Like, that is the feeling of this.
And that's fueled now by advances in technology that are becoming increasingly available to everyone.
That's right. And the story doesn't start with technology. There are very few things that people can do to each other that are completely dependent on technology. It's the accessibility of it, right? That it used to take a lot of time and a lot of effort, a lot of focus on someone to be able to torment them, to be able to try to exploit their image.
If you ran into that kind of person in your life, you were going to experience that abuse.
But now you have what I think of sometimes as kind of the whole world of opportunistic abusers
who, were it not for how easy technology has made this, would probably have never thought about
exploiting somebody this way.
But now that it's so easy, it's something that doesn't feel really unethical because it's
hitting buttons and it's looking at screens, it's so easy, it doesn't feel like it could
actually be that bad.
And you also have so much of this imagery in circulation that you realize, as a young man in particular, that this is social capital, right?
You produce these images, you exchange these images, you commission these images, and it becomes something that you can get validated for.
You can earn actual money, but you can also get the admiration of your peers, or you can feel superior, or you can get out your frustrations from being rejected.
It's all of these things that are now so much easier to do and so much more tempting to do.
And it's being done.
Right? It's like, it's happening in high schools near you. You were recently at a news conference
supporting a New Jersey high schooler. Her name was Francesca and her mom. Can you talk to us a little bit
about what happened to her? Yeah, and this, as you say, is becoming increasingly common.
Francesca Mani is one of the girls who was targeted at her high school by her peers. So we know
that it's one or more boys at the same high school who have done exactly what we're describing.
They've taken innocuous photos of their peers. Francesca, at least, was 14 at the time that this
happened. She's now 15. And they have created imagery that depicts them nude or in sexually
explicit positions and have distributed them in ways that Francesca is not even entirely sure
what the scope of this is because no one's talking about it. And the school hasn't been
particularly forthcoming about exactly what this imagery is like.
I just felt, like, betrayed, because I never thought it would be my classmate.
And I just felt very uncomfortable knowing that he was walking like the hallways of my school.
And when I came home, I told my mom and I said, we need to do something about this because it's not okay.
And people are making it seem like it is.
And so when you hear Francesca talking about this, she described
how knowing that there's this image of you out there that looks just like you, that the only
person who knows it isn't you is you. And it portrays you doing all kinds of things that are
either incredibly personal or just things that you wouldn't do and that people can just have those
on hand. They can just have them at any point. They can do whatever they want to with them.
They could use them against you when you are applying for college. They could be looking at them
on their phones while you're sitting next to them in the classroom,
the sense of you could never feel as though you could just go to school,
be a kid, enjoy your life without having to think,
is someone looking at a photo of me that's really intimate
that I never agreed to be depicted in?
Is someone thinking about me in this really dehumanizing kind of way?
And I'm not even sure who it is.
Is it my teacher?
Is it the person down the street?
Is it a predator?
That you can never actually feel like you own yourself anymore.
And that's what I think Francesca has really,
really bravely spoken to about how disorienting and upsetting that is. And she and her mother both
have decided that they want to come forward on this issue and talk about the real world impact
because it seems at least at the moment it's very abstract for a lot of people.
As long as I've covered technology, it's this idea that like what happens to us online,
it's not supposed to impact us offline, but of course it does. And now you add in AI in these
advances. And by the way, like even looking at that report, it wasn't even
just her. It was like 30 other girls at this high school. I was talking to a lawyer recently,
and she, you know, she specializes in this topic. And she said to me, she's like, Laurie,
this just happened at my child's school with 15 young women. And so we're maybe hearing
Francesca talk about it. And thank God, she's talking to us in a way that is human and talks about
the humiliation. But my gut, I'm going to say my journalistic instinct is that this is rampant. And we are
barely scratching the surface here.
That's what's so concerning about this, right?
That for every one of those that has become a scandal,
we have no idea how many other groups like this there are.
We have so many apps that are developing every day
that are specifically designed to try to solicit this kind of imagery
and to offer services to people so that they can personally commission
their next-door neighbor or their peer.
You have many people who are engaging in this kind of abuse
who are quite motivated not to ever have the victim find out about it.
You know, there's this thing that happens if you talk about this, like, publicly.
I've done this in many different ways.
I'm like, we have to care about image-based, sexual abuse, all these things,
deep fake pornography.
And I promise you, like, people's eyes glaze over, right?
Like, they're just like, I don't even know what you were talking about.
Deep fake porn.
This is a whole other world.
What do you tell people about why they should care about this right now?
What is your argument as a lawyer who argues very well?
You know, it really depends on the audience.
So if people care about children, and most people do, right, we should never be putting a 14-year-old child in harm's way like this and making that 14-year-old have to grapple with the consequences of being sexualized without her consent, right?
And producing material that can be infinitely distributed.
It can end up literally anywhere.
And in many cases, we're talking about situations
where the person who's produced this material or has received it,
then uses that fake imagery to extort actual sexually explicit imagery from these minors.
So the horrific cases that we're hearing about extortion,
about how there have been some teenagers who have been driven to suicide
because something like this happens.
Someone creates this deep fake, topless photo,
that person says, I have this photo of you,
everyone's going to think it's you.
In order for me not to spread it everywhere,
you're going to have to actually give me much more graphic actual imagery.
And the next thing you know, these children are caught in this kind of unwinnable situation
and they're ashamed and they're scared and they have no idea what to do.
And I would hope that people hearing that, understanding that that is the reality we're dealing with,
I would hope that people would care about those kinds of situations.
And that's even before we think about what often happens in these situations, which is that
you also start getting all of these overtures, strangers trying to contact you to say,
I saw your photo, I've read that you're into X, Y, or Z, because often this will be accompanied
by a disclosure of that person's name, their real name, where they go to school, where they live.
And so all of that information is being connected out there and sent out to a bunch of strangers
who sometimes in real life are trying to find you and trying to communicate with you.
I'd love to talk about the terminology.
I know we spoke about it a little bit before, but I can't help but think, with
advances in AI, we are living in this world where our most human qualities can now be mimicked
by an algorithm. My voice, the way I speak, my face, my images. And interestingly, you don't
call this deep fake pornography. You call this digital forgery. Why is that? It's somewhat similar
to the question about revenge porn, right? That that term was never really the right term to use
because, you know, that's the abuser's term for what it is. And there's so many things about it
that seem wrong, and rewarding that person with this terminology just seems like a really
backwards thing to do. So I try to avoid it also because it's not particularly explanatory as a
term. So I prefer the term digital forgery so that people can think more about what this is really
doing, what it resembles. There's this tendency to think that technology is so kind of beyond
our understanding and something new is happening every day. And really, oftentimes it's variations
on themes of things that happen all the time already. And the concept of forgery, of impersonation,
of counterfeiting, all of those things, I think, are much more evocative to explain to people why it is
that this is not about someone talking about you. This is someone trying to take over your identity
and make it seem as though they are you. And I really want people to understand that as part of the
stakes, because it heads off some of these typical kinds of objections about how, well, this person
is just engaging in some kind of free speech activity or satire parody. And a digital forgery is something
very different. This is someone who is hijacking your image that is taking over your identity, right? And that's
how we need to see it. And I can't help but think, like, our identities online are increasingly relevant
in the real world. We are spending so much of our time online. And these worlds are blurring.
You can't just say, well, this is happening here and it's not real. And so, you know, it's not going to
impact you here. That's right. And that was going to be true about really all of the terrible things
that people do to each other. The story is pretty much always the same, which is that the really
terrible things happen to women and girls first. But they're going to come for everyone.
So we've just been hearing about how there are fake robocalls pretending to be Joe Biden,
right, and telling people not to vote. That's possible now, right? There are infinite applications that
can be used here for this technology to cause incredible amounts of harm to make it seem as though
people have committed crimes when they haven't,
or to make it seem that they haven't committed crimes when they have, right?
To distort everybody's perception of what is true and what is false
and to leave us with just this chaos.
Yeah, we could view this as this is the way into talking about
the future of democracy and misinformation and fraud and all of these things.
It just so happens that women and children,
this thing is happening to them right now.
We've got to pay attention for obvious reasons.
And also it has this whole other effect that we're beginning to talk about.
One of the reasons I was excited to talk to you right now is I feel like we're in this moment where we have the democratization of technology that makes this easier and easier to do.
So we could have had this conversation, you know, five years ago, but we wouldn't be in a place where it was so easy to create non-consensual images.
In the past year, there's been a mass adoption of artificial intelligence image generators.
A lot of teenagers have these on their phones, but our audience might not
fully understand those capabilities.
Could you talk us through how easy it is to create non-consensual images now?
And it really takes no kind of skill at all anymore.
Something that even just a few years ago would have required you to be really technologically
sophisticated and have access to software that was really quite obscure or expensive.
Now you maybe need a handful of photos.
Videos are even better, right?
Just being able to find clips of someone.
And you can really just sort of push a few buttons, and then you can have this image or video produced that is virtually indistinguishable from something that looks real.
So things that you could have only seen in movies, you know, 10 years ago you can do if you have an app.
And then if you don't even want to do that, you can send in the raw material to someone else and they can do it for you.
So it's incredibly easy.
Literally anyone can do it now.
And I think about this now and I'm like, okay, you get rejected if you're a teenage guy.
Oh, there's an app for that.
You can use Clothes Off, which is an app that enables you to just digitally undress the person you want.
I mean, I was looking through all of the different apps for that when it comes to this.
And I have to tell you, I was horrified, shocked.
I mean, these tools make it really easy for anyone to become a victim.
but maybe even more noteworthy,
it makes it really easy for anyone to become an abuser.
Right.
And that's the part I think we don't spend enough time thinking about.
It's said that technology just sort of reflects our society.
So if bad things are happening in technology or on the internet or through apps,
well, that just means that society has problems.
And that's such an incredibly short-sighted way of looking at it
because technology obviously also creates our impulses and rewards our impulses
and teaches us what kinds of things are possible.
And so if you, especially as a younger person,
are being bombarded by this kind of cottage industry
of, hey, have you ever thought about creating a naked photo
of the girl who said no to you?
You may have never thought about this,
but now this is something that's coming to you.
You don't have to seek this sort of thing out.
You don't have to have a particular vested interest in it.
You don't have to be someone who is struggling
with some kind of intense obsession.
It can just be that you're bored
and this is an option for you. And when you have this entire machinery at their disposal to think of different ways to dehumanize women and girls and use them for purposes of entertainment, you've got a real nightmare on your hands, right?
Because I don't think that people just naturally come to these things, but now you've got an industry that is monetized and incentivized to get to as many people as possible and turn them into predators.
I mean, I have empathy for a parent who's trying to navigate this.
Of course, right? Because this is uncharted territory for a lot of people. And, you know, one of the other challenges of developing technology is that the younger generation always knows more than the older generations do. So parents are kind of in a double bind.
I'm curious, because we're talking about incentives, right? I know the folks who host this podcast, they talk about social media and incentives that race to the bottom all the time. I'm curious, how do you think algorithms on social media and these platforms are making
the problem worse? I mean, you really have to back up and think about incentive structures as a
whole, right? And not to get too far in the weeds about the legal reasons for this, but the way that
we have allowed the tech industry to do its own thing, right, to take care of its own problems
for more than 20 years through federal protections to say you don't have to deal with liability
and negative consequences the way that other industries do, right? So you are not going to be treated like
let's say, a hotel owner who chooses not to have lights in their parking lot and women keep getting
assaulted there, right? There's room to say to that hotel owner, we know that you're not the person
who's causing the assaults, but you have a responsibility to provide safe environments for the people
who are here, especially now that you've been informed that that's a problem. And if you sit back and
do nothing, there's a chance that you should be held accountable for that. And you think about what
happens online, and it's the opposite. You can never be held accountable. So even if you, a Google or
a Meta or what have you isn't investing specifically in these kinds of apps that are targeted at
saying let's create a non-consensual deepfake, these things end up in search engines, they end up on
Facebook, they end up in all these places, and then they become content, right? They become monetizable
content. And so these companies are benefiting from this, and we keep asking, you know, the same
questions of these major companies: why aren't you doing more to stop X? And the answer is, because they don't
have to.
I would love, I mean, I think this is actually the perfect time to walk into legal, right?
And what can you do if this were to happen?
And how does that differ where you are in the States abroad?
I know in the time that I covered non-consensual pornography, we now have laws in the United States to battle this.
How do you think we need to rethink those laws with these new threats, what we've just been talking about?
Yeah, I do hope that this will
be a moment where we assess how the limited progress we've made when it comes to traditional,
authentic, non-consensual pornography, how that progress has really been compromised, right?
That we are definitely in a better world where we can say 48 states and D.C. and others
have laws against this non-consensual pornography. That's much better than 2013, when there were
basically none, right? But when you look at what those laws actually do, several of them
are defining this kind of abuse as requiring some kind of malicious intent, that you have to have
some conscious desire to hurt someone else. Now, obviously, if people do have that intent,
we should be able to punish them. That makes sense. But when you think about how many other
motivations people have, now that we're talking about these apps and these sites, the fact that
these are used for kind of social bonding, the fact that they are used to make money, the fact that
they're used, sometimes not in any kind of deliberate sense of, I hate you and I want to destroy
your life, but I don't see you as a full human being. So if you have defined the crime as you
have to want to hurt that person, well, it turns out a lot of people aren't actually trying
to hurt other people. They just don't see them as human beings. And so now you add to this the
deep fake problem of manipulation. This should be a moment where we recognize and get rid of any of those
extraneous elements that require some kind of animus towards this person. It shouldn't matter whether
the reason that you did this was because you are so angry at this person for however they have
disappointed you or rejected you or if it's just that you have never learned to see women and
girls as human beings. It shouldn't matter, right? Either way, this is a problem. And the problem is
in using them, using those people without their consent. Yeah, it seems to me that if we're trying
to prove malicious intent here, right? With deepfakes, it's just
harder to prove, but yet, if we look at the history of all of these types of harassment,
there's proven impact, right? So it just doesn't seem like the law is matching what's actually
going on. Exactly. And we understand that in other contexts, right? When we think about how you can be
responsible for something like reckless driving. So you get in a car and you choose to be distracted,
you're looking at your phone, you're whatever it is. You don't intend to hurt anybody,
but because you have chosen to be careless,
you've chosen to be reckless,
you end up hurting someone.
We do not offer that as a defense to say,
well, you didn't mean to hit that person
and kill that pedestrian, so it's fine.
It doesn't work like that.
You did make certain choices
that you were conscious of.
You chose to take certain actions
that were going to benefit you personally,
but it requires you dismissing some kind of harm
and a risk of harm to another person.
We can punish people for that,
and we should punish people for that.
And in other contexts, when we're not talking about sexually based offenses, people understand that.
From a legal standpoint, if you could say very clearly, this will move the needle.
This will give future victims protection.
What would this be?
I mean, I do think when it comes to deepfakes specifically, we need a law that prohibits on both a criminal level
and also creates a possibility of suing to say you cannot engage in digital manipulation of a person
without their consent, with the clarification that we're only talking about creating images
that are virtually indistinguishable from real ones. So if we had a law that says you cannot do that,
right? Because that is a criminal offense. I think that actually would move the needle quite a lot.
And it would be ideal if that law were at the federal level as well as at every state so that you
wouldn't have this confusion of, well, how is it defined here? And will that apply if the perpetrator is
in another state and I happen to be not in the same state as they are?
I mean, put simply, it's like you're saying, look, if people think they're going to go to jail for doing this, less people are going to do it.
Right. And the caveat is incarceration is not the right or good response for a lot of things.
I'll just stipulate that, that there's good reason to be skeptical about bringing the very troubled criminal justice system that we've got to bear on these issues.
When we're talking about image-based sexual abuse, you cannot actually undo what has been done.
We have to have a situation where a would-be perpetrator thinks twice and decides not to do this.
The point is to have it be a criminal prohibition that puts people on notice how serious this is
because not only will it have negative consequences for them,
but one would hope that it would communicate that the incredibly negative consequences for their victim will never end.
What is the psychological impact this has on victims when this
happens to you? I know that you talk to folks every day who've had this happen to them. What is the
impact? So we're seeing psychological distress. We're seeing reputational harm. We're seeing
girls leave their school situations because they can't concentrate anymore or they're told
that they're a disruption and so they have to leave their schools. There are women who get
fired. There are people who have to leave their homes because they're not safe anymore. So
all of those things that we have seen in the more classic sense of non-consensual pornography, we
see playing out with the digital forgery situation. And it's a really deeply disturbing experience
for the victims that I've spoken with who've said, what that does to your identity, what that does
to your sense of self is really hard to explain. But it's extremely disruptive.
It could be a good time for me to bring up what happened to me. I gave the folks from the
Center for Humane Technology my consent to do a live demo
in front of lawmakers, where they would use my voice and my images in a public experiment
with a deep fake. And so what they did, and I feel like this is relevant because it really
speaks directly to what you just said, I remember I'm sitting on stage with them, and everybody's
riveted by their conversation on the impact of artificial intelligence. And I come up as
this longtime journalist, and they had broken Facebook's large language model.
And then they were also using ChatGPT.
And they said, name three or four things about the tech journalist Laurie Segall that she could be known for.
And ChatGPT, to its credit, was saying so many lovely things.
I loved it.
It was like, Laurie Segall is known for hard-hitting interviews with folks like Mark Zuckerberg.
And Laurie Segall is known for covering the human impact of tech.
I was like, oh, my God, I feel so seen by this.
algorithm. This girl network. It's amazing. And then they went further and they were like, okay, come up with a personalized attack based off of these specific examples. The next thing you know, it said, well, you could imply that Laurie Segall has an intimate relationship with Mark Zuckerberg. I want to go ahead and say that's upsetting for so many reasons, but like that, not, you know, not my choice. And so what they did was they started generating articles that
used my real images of me interviewing Mark Zuckerberg, because I've interviewed him many times,
with what looked almost like New York Times-style reporting, which was too close for comfort:
Laurie Segall in a relationship with Mark Zuckerberg.
And I was like, oh, that's weird.
But then all of a sudden they started generating tweets with cultural memes that actually talked about this.
So I say this to set up kind of the final thing, which was they essentially showed deep fake images of me to discredit me.
And there was one of me, I just want to say it.
Like, I never thought I'd say half naked with a bong or whatever it is on an intellectual
podcast, but here it goes.
They had, they deepfaked my image, it looked like my body, right?
You would have just to someone looking at it, you would have thought it was me.
It wasn't, but holding some kind of bong, right, like to discredit me.
And then they also deepfaked me walking with Zuckerberg, holding his hand, right?
Which is like, I mean, you hear it in my voice even talking about it.
And then I would say the grand finale, Mary Anne, like the grand finale of all this was when they, quote, leaked audio of my voice, which actually wasn't my voice.
It was my voice saying someone else's words, which was a conversation that sounded like it happened in private, where I was saying, Mark, I don't want people to find out about us.
I'm worried it'll ruin my credibility.
And I look back at that demo.
What was interesting about that demo is I've covered this stuff for like a decade, almost 15 years, actually, at this point, which is insane to think about.
And I remember sitting there in front of this audience and, like, I felt shame.
I was embarrassed.
And, I mean, I get chills even thinking about it because I'm like, if I feel shame
looking at something like that and feeling like I have to justify it, even though it's
clearly not me and it was set up with consent that it wasn't me, I can't even imagine
how other people must feel.
So I guess my question to you is, like, I felt shame, but like, am I unique when it comes
to this? I don't think so. And thank you for sharing that, although I'm horrified for you,
because, you know, yes, I can hear it in your voice that, you know, you knew what was happening,
you knew. And yet that, it is such a jarring, it's such a destructive kind of experience of seeing
your likeness, you know, something that looks so much like you being used in this way. I can't
emphasize enough just how much it depersonalizes someone and makes them doubt who they are, right?
And knowing, too, that even if you're able to say and have the chance to say, that's not me, let me explain, when we see something, especially a video, but images and video, and then you add audio, we are experiencing it as if it's real, even if we rationally know that it's not.
And in many of these situations, that rational knowledge won't even be there.
So I think that that's very telling and very illuminating about how destructive an experience this really is.
And I emphasize they were doing this to show, and I thought it was important
and I volunteered myself for this, but they were showing these deepfake realities that could be built,
that could take all those things that I've worked for over the last 15 years and shatter them.
When we look at the trajectory of where this technology is headed, what are you afraid of?
What is keeping you up at night when you think of the future of AI and consent?
Really just all of these kinds of technologies that are being developed, to some extent,
right now separately coming together.
We think about augmented reality
and virtual reality combined with this really privacy-invading, personality-hijacking kind
of technology, yeah, I'm really worried not only about that existing as content for
people, but of course, then the demand that rises with that content and what we said
before about what does that do to people's impulses, right?
That it creates a desire that maybe wasn't there before.
And people start to think of each other in this very instrumental way of, oh, well,
she said no to me now or she humiliated me in a meeting or she got a job that I thought
that I should have gotten. Well, I know how I can sort of reassert my feelings of power and
adequacy. Yeah, like in the past, you could go anonymously talk about it on Reddit, whatever. Now you
can literally create a nightmare. And I think the thing about these worlds is that the idea is that
they feel real, that they're integrated with the real world. And I just can't even envision what
that's going to look like, because then you add in things like the gamification of these types
of things, audience participation, all these things that are trends in the internet, when applied
to this, are terrifying. Yeah. Why is it that, every time there is a new and exciting
innovation or form of technology, someone is going to come along and think about how they can
use it to hurt or to humiliate a woman? You have said before that deepfakes threaten free speech.
Why is this a free speech issue?
When you think about what it is that makes an effective deepfake or how it is that people create
deepfakes of other people, they have to take images of you doing, saying something, right?
So to put another way, the source material for their abuse of you is actually your own speech.
It's your own expression.
And if we're trying to think about ways to navigate around that or to avoid situations where that
can happen, the literal response, or the way we'd have to do that, is to not speak, to not
express ourselves, to not appear. We'd have to disappear if we wanted to make sure that no one
could do this to us. And so when you think about how perverse that is, that it's people's own
actions and expression in the world that are being used in this way for these really sinister
purposes, I think it makes pretty clear that there's such a cost there to people's expression.
And then it's the question of, well, what happens every time someone is humiliated or
exploited in this way? It makes them feel as though they can't speak
any longer in the environments where they used to be comfortable. It makes them retreat.
And that's oftentimes the conscious objective of the person who's engaging in this abuse.
But even if it's not, that is the effect.
When we talk about tech companies, what have tech companies done so far? What are they saying
they will do about this issue, about fake non-consensual pornography?
Yeah, we've seen some progress on these fronts. Certainly, there have been more efforts
to address these issues seriously in the last,
let's say, five or seven years than there were before, when, you know, the door would just be slammed
in people's faces. So most major reputable companies now have policies that relate
at least to actual images. And many of them are catching up to say, whether it's a sexually
explicit private image or whether it's one that's manipulated to seem private, that is one of
the policy changes that seems pretty easy to update and say, we're going to forbid that. But, you know,
that's just really the very first step. The question then is, well, what are you doing to proactively
prevent that from happening because, as we've been discussing, if you wait for it to happen
and you say, well, we've got some measures in place that you as the victim must take the burden
of trying to navigate, that's not really helping, right? The last thing that a person in the
situation of being depicted in this way has time or desire to do is to trawl through all the
images, all the sites this may have appeared on, and report them to the search engine and say,
please take this down. That shouldn't be what we're asking victims to do. So there are
some companies at least claiming, and we'll see if these play out, to be proactive in their
prevention of this kind of material being uploaded to begin with. And I think at a minimum,
that's what we should be requiring of the platforms. You have to have some kind of policy that is not
merely reactive, but actually tries to make it harder for people to engage in this behavior
and punishes people for engaging in this behavior. And if there were two to three things
that you could get tech companies to change right now, what would they be?
If I could get tech companies to change...
You know, I can say I would ask them to adopt one concept.
And that is, when you are designing anything, a policy, a building, a whatever it is
or a product, you should think about the most vulnerable person in society and how whatever
it is that you are building is going to affect that person.
The kinds of vulnerabilities that that person might face, the kinds of impact that it
might have on them, whether or not they'd be able to use it effectively the way that other
people could, and to try to design for them. And so I would suggest if I could snap my fingers
and just have the tech industry adopt that principle first and foremost, as they are producing,
creating these products and services that are going to affect all of our lives, that would be
the one.
Last year, Congress introduced a bill called the Preventing Deepfakes of Intimate Images Act.
Can you talk me through the bill, and what is actually holding it up from being passed?
Where does it stand?
Yeah.
So, Congressman Joe Morelle from New York has proposed a bill that my organization, the Cyber Civil Rights Initiative,
and several other organizations and experts who worked on these issues were asked to give feedback on.
And I think the approach of the deepfakes bill was a really solid start,
which I hadn't really seen up to that point with other attempts
to legislate on this issue, because it was very clear to say the problem here is the creation,
is the distribution without consent of these highly realistic, sexually explicit, digitally manipulated
images. And it basically straightforwardly said, you shouldn't be able to do that. In addition to that
being something that is prohibited, people who are depicted this way should have a cause of action
that will make it possible for them to sue that person for doing this to them. And so creating the
possibility of getting some kind of compensation.
Now, in terms of what's holding it up, it depends on who you ask, but it would seem as though
this is a bipartisan issue.
I was pleased to see at the press conference with Francesca and her mother that this is being
sponsored not only by Congressman Morelle, who's a Democrat, but also by Congressman
Kean, who's a Republican.
And so I would hope that that's a signal that there really will be genuine bipartisan sponsorship,
but then also the urgency to get this through the appropriate committees and get it voted on as soon as possible.
And, you know, many people who listen to this, they might not see, and we talked about this a little bit at the beginning,
but they might not see how this directly impacts their lives.
But if you care about women and you care about young girls in your life and you want them to avoid becoming victims,
I would say, like, the argument is you've got to care. We've got to get people to care.
How do we get people to care?
We have to care not just about women and girls in an abstract sense, but we have to care more about them than we care about sexual entertainment, more than we care about sexual objectification, more than we care about profit, more than we care about shiny new objects. That's the problem, right? Because in the abstract, it's easy to care about women and girls. But for society to really say, oh, but I'm contributing to something that is actually harming women and girls, but I want to keep doing it because I benefit from it. That's the tricky conversation we should perhaps be
having. So I think getting people to understand that caring about other people means not always
putting your interests first and the fact that it isn't affecting you in the same negative way that
it affects them shouldn't be a reason not to care about it. And to also think about the fact that
whatever abusive technology is out there, it's not, as we've been discussing, just about the impact
it has on the victims. It's about who it's turning all of us into, right? Worry about the fact
that your son may turn into the kind of person who looks at his classmate and sees an opportunity,
right?
Worry about the fact that men who would otherwise maybe want fulfilling respectful relationships
with women are now sort of turning towards this community that tells them, no, disregard their
feelings and just think of them as objects and we will celebrate you, right?
Worry about that because nobody should be happy with that as where we land as a society.
Mary Anne, I cannot thank you enough.
Thank you for all the work that you do.
And honestly, part of why I was excited to have you on here is the audience really listens.
We have influential people inside tech companies and in the halls of power in D.C. who listen to this podcast.
And I think you definitely laid out a number of concrete steps, which is always important.
I hate leaving people feeling hopeless.
And I don't think we're actually hopeless when it comes to this.
This is something we can work towards.
And I want to especially thank Tristan and Aza for inviting
me to take over the mic to host this episode of Your Undivided Attention.
Your Undivided Attention is produced by the Center for Humane Technology, a non-profit
working to catalyze a humane future. Our senior producer is Julia Scott. Kirsten
McMurray and Sarah McRae are our associate producers. Sasha Fegan is our executive producer,
mixing on this episode by Jeff Sudakin, original music and sound design by Ryan and Hayes Holiday.
And a special thanks to the whole Center for Humane Technology team for making this podcast
possible. You can find show notes, transcripts, and much more at HumaneTech.com. If you like the
podcast, we'd be grateful if you could rate it on Apple Podcasts, because it helps other people find the show.
And if you made it all the way here, let me give one more thank you to you for giving us
your undivided attention.
