TED Talks Daily - How a deepfake almost ruined my political career | Cara Hunter

Episode Date: December 6, 2024

A nightmare scenario happened to politician Cara Hunter: just weeks before her election, she became the victim of a deepfake scam that threatened to upend her life and career. In a fearless talk, she explores AI's potential to undermine truth and democracy — and offers a path forward to harness this powerful technology as a force for good. (This talk contains mature language.)

Transcript
Starting point is 00:00:00 TED Audio Collective You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day. I'm your host, Elise Hu. A nightmare scenario happened to politician Cara Hunter. In her 2024 talk, she details how she became the unwitting victim of a deepfake scam and what that experience taught her about a path forward for AI that doesn't harm people. A heads up that today's talk contains sensitive content and may not be suitable for all audiences.
Starting point is 00:00:38 It's coming up after the break. Support for this show comes from Airbnb. As 2024 comes to a close, I've been reflecting on my travels this past year. And of course, the highlights include several great Airbnb stays you've heard me mention. Palm Springs, Sedona, Tokyo. In 2025, perhaps it's the year I finally host on Airbnb.
Starting point is 00:01:03 With the amount of time I spend away from home, it just seems like the practical thing to do. I love the idea of looking back this time next year having hosted several great stays and enjoying the extra income I saved. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. And now our Ted Talk of the day.
Starting point is 00:01:26 You're a little whore and we've all seen your little video. That was the text message that was sent to me in April of 2022. I'm sitting in my grandmother's living room on what is her 90th birthday, surrounded by family and friends, as my phone blows up with messages from strangers right across the country who say they have seen a video of me engaging in hardcore pornographic activity with a man. I knew this was impossible. With just three weeks out from my election, I felt as though my career was crumbling before my very eyes. My heart pounded, my mind
Starting point is 00:02:11 raced, sweat beaded on my skin and then I watched the video and my worst fear was realised. Although this woman in the video was not me, she looked exactly like me, impossibly like me, eerily like me. I had so many questions running through my mind. Was this AI? Was it not? Who made this? How did they make it?
Starting point is 00:02:39 Why did they make it? So I did what anyone would do, and I approached my local police service to ask for advice, for guidance, and really, where did I go from there? But they informed me that they wouldn't have the cybercrime technology to assist in finding out where this video came from, and it was from that moment I knew that I was on my own. Now, to set the stage, as you can probably tell, I'm from Ireland, and to be exact, I'm from Northern Ireland, which is an even smaller place. We have just 1.8
Starting point is 00:03:10 million people, very similar to size of Vienna. So you can imagine a rumor of this sinister nature, particularly in the world of politics, can go very far, very fast and that old saying, seeing is believing, began to haunt me. And in the weeks leading up to my election, this video, this false video, was shared thousands and thousands of times across WhatsApp. And attached to this video was photos of me at work, smiling, campaigning, building a sense of trust with my constituents. And as the weeks went on, messages flooded in
Starting point is 00:03:50 faster and faster. And they were of a very vile and sexual nature. Ding, we've all seen your little video. Ding, you should be ashamed of yourself. Ding, ah, now I see how you got your position in politics. It was very difficult. And having been in politics since the age of 23, and at this point I've been in it
Starting point is 00:04:18 for about four to five years, and I'm from Northern Ireland, which is a post-conflict society, still very deeply divided. So I anticipated challenges, I anticipated disagreements, I even anticipated attacks, it's politics after all. But what I did not anticipate was this moment. This was different. This was the moment where misogyny meets the misuse of technology, and even had the potential to impact the outcome of a democratic election.
Starting point is 00:04:50 And the sad thing for me was this lie spread so far, so fast, that even my own family started to believe it. Some would say that they'd heard it at a golf club, others would say they heard it at the bar, and of course some even said they heard it in a locker room. A really good example of how far this went was people that I knew my entire life would pass me in the street without whispering a word. People like school teachers, people I had a sense of trust with and, you know, an affinity with, and that was really hard.
Starting point is 00:05:25 It felt like overnight I was wearing a scarlet letter. And as things moved on, and we're about two, three weeks out from the election, I kept receiving messages and it got wider and wider. It was global. Not only was I receiving messages from Dublin and from London, but I was also receiving messages
Starting point is 00:05:43 from Massachusetts, Manhattan, and I was getting so many follows on my political social media, predominantly from men hungry for more of this scandal. And this intersection of online harms impacting my real life was something I found utterly strange and surreal, but it got to the point where I was recognized on the street and approached by a stranger who asked me for a sexual favor. And for me, it was like, in the blink of an eye, everything had just changed. And it was utterly humiliating. I didn't want to leave the house, and I had turned notifications off on my phone just so I could kind of catch my breath, but this wasn't ideal in
Starting point is 00:06:28 the lead-up of course to an election and for me I think that was the the purpose of this false video was to do just that. But what hurt the most for me was sitting down my father and having to explain to him this strange surreal situation. And my father is an Irishman, completely disconnected from tech. And so having to explain this horrific situation was an entire fabrication was very hard to do. This was this strange moment where the online world met my life, my reality. Not only having the impact to ruin my reputation but have the capacity to change the outcome of a democratic election. And you know for years I spent so much time
Starting point is 00:07:21 building trust with my constituents. I mean, we all know how much people like politicians, and you know, were as likable as the tax man. So for me, it was hard. It was really hard because it was years of hard work. You know, I'm so passionate about my job and this video, this complete falsehood had the ability to just undermine years of hard work in mere seconds.
Starting point is 00:07:46 But instead of succumbing entirely to victimhood, I ask myself today, you know, where do we go from here and how can AI evolve to prevent something like this happening again? Not only has it happened to me, but we want to future proof and ensure that this doesn't happen to the woman of tomorrow. How can we, you and I, people who care about people, ensure that this is a tech for good? How can we, the policymakers, the creators, the consumers, ensure we regulate AI and things like social media, putting humans and humanity at the centre of artificial intelligence. And now back to the episode. AI can change the world. In fact, as we've heard today, it already has.
Starting point is 00:08:40 In a matter of seconds, people who speak completely different languages can connect and understand one another. And we've even seen the Pope as a style icon in a puffer jacket. So some really important uses right there. But then in my case as well, we can also see how it is weaponized against the truth. And good examples of this would be art that appears like reality, AI-generated reviews unfairly boosting certain products, and things like chatbot propaganda. And then politically speaking, we've seen over the years deepfakes of Nancy Pelosi slurring, Joe Biden cursing, and even President Zelensky asking his soldiers to surrender their weapons. So when AI is used like this to manipulate, it can be a threat to our democracy. And the tech is becoming so advanced that it's hard
Starting point is 00:09:34 to differentiate fact from fiction. So how does AI interfere with politics? And for us as politicians, what should we be worried about? Could truth and democracy become shattered by AI? Has it already? Well to dive a little deeper here I think firstly we need to talk about the concept of truth. Without truth democracy collapses. Truth allows us to make informed decisions, it enables us to hold leaders accountable which is very important and it also allows us to make informed decisions, it enables us to hold leaders accountable, which is very important, and it also allows us as political representatives
Starting point is 00:10:11 to create a sense of trust with our citizens and our constituents. But without that truth, democracy is vulnerable to misinformation, manipulation, and of course, corruption. When AI erodes truth, it erodes trust, manipulation, manipulation, and of course, corruption. When AI erodes truth, it erodes trust, and it undermines our democracy. And for me, in my experience with a deepfake,
Starting point is 00:10:36 I've seen what a fantastic distortion tool that deepfakes really are. So how can we safeguard democracy from this ever-advancing technology? It's becoming ever harder to distinguish between real and synthetic content. And politically, what role does AI play in the future? And I can't talk about politics without talking about media as well. They're undeniably linked, they're intertwined. And I think journalism has its own battle here as well. From AI algorithms boosting articles unfairly,
Starting point is 00:11:09 to clickbait headlines, and then also moments where they can manipulate the public as well. But politically speaking, we've seen AI tailored political messaging, influencing voters, we've seen it adding to existing bias. And definitely definitely I think we all have that aunt that's on Facebook and kind of believes anything. So for me as a politician I think it's really important we dive a little deeper into
Starting point is 00:11:33 the relationship of AI, journalism and media. But it also puts us at risk of creating a more divided and reactionary society because falsehoods can create a lot of reaction. And for myself, coming from Northern Ireland, which is that post-conflict society, I do have concerns about how it could shape our political landscape and other places across the globe. Sadly, this deep fake video is not the only instance of me having experienced abuse with AI.
Starting point is 00:12:06 Just six months ago, I received 15 fake deep fake images of myself in lingerie, posing provocatively. And I thought to myself, here we go again. And, you know, I spoke with some other female politicians. Thankfully, where I represent, we have more women getting into politics. But I had a really good conversation with them and it's around, if this is happening to you now, what happens to me tomorrow?
Starting point is 00:12:34 And I think this really strikes at the heart of the climate of fear that AI can create for those in public life. And I don't blame women, it's very sad, I don't blame women for not wanting to get into politics when they see this kind of technology come forward. So that's so important that we safeguard it. What also concerned me was the position of elderly voters,
Starting point is 00:12:57 perhaps their media literacy, their comprehension of this technology, people who don't use the internet, who perhaps are not aware of AI and its many, many uses. So that was really concerning as well. But it doesn't have to be this way. We can be part of the change. For me and my video, I still don't know to this day
Starting point is 00:13:18 who did this, I can imagine why they did it, but I don't know who did it. And sadly for me and for across the globe, it still wouldn't be considered a crime. So from being the enemy of fair, transparent, good-natured politics, to warfare, to international interference, despite its beauty and its potential,
Starting point is 00:13:40 AI is still a cause for concern, but it doesn't have to be this way. I feel passionately that AI can be a humanistic technology with human values that complements the lives that we live to make us the very best versions of ourselves. But to do that, I think we need to embed ethics into this technology to eliminate bias, to install empathy, and make sure it is aligned with human values and human principles.
Starting point is 00:14:12 Who knows? Our democracy could depend on it. And today I know we have some of the brightest minds in this room. I heard some of the talks earlier and I'm certain of that. And I know each and every one of you have weight on your shoulders when looking to the future. But I also know each and every one of you want to see this tech for good.
Starting point is 00:14:34 And what gives me hope is witnessing the movement right across the globe to see this hard journey begin, to regulate this ever-advancing technology, which can be used for bad or for good. But perhaps we, each and every one of us in this room, can be part of finding the solution. (*applause*) Support for the show comes from Airbnb.
Starting point is 00:15:01 As 2024 comes to a close, I've been reflecting on my travels this past year. And of course, the highlights include several great Airbnb stays you've heard me mention. Palm Springs, Sedona, Tokyo. In 2025, perhaps it's the year I finally host on Airbnb. With the amount of time I spend away from home,
Starting point is 00:15:20 it just seems like the practical thing to do. I love the idea of looking back this time next year, having hosted several great stays and enjoying the extra income I saved. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. That was Kira Hunter speaking at TED AI Vienna in 2024. If you're curious about TED's curation, find out more at TED.com slash curation guidelines.
Starting point is 00:15:51 And that's it for today. TED Talks Daily is part of the TED Audio Collective. This episode was produced and edited by our team, Martha Estefanos, Oliver Friedman, Brian Green, Autumn Thompson, and Alejandra Salazar. It was mixed by Christopher Faisy-Bogan. Additional support from Emma Taubner and Daniela Ballarezo. I'm Elise Hu. I'll be back tomorrow with a fresh idea for your feet.
Starting point is 00:16:14 Thanks for listening.
