TED Talks Daily - The real-world danger of online myths | Vidhya Ramalingam

Episode Date: December 5, 2024

How do we protect ourselves from being misled online? Counter-terrorism expert Vidhya Ramalingam reveals how disinformation is being weaponized to justify violence — increasingly against climate scientists — and introduces a powerful tool called "prebunking": a proactive approach that empowers people to recognize and reject manipulative messages before they take root.

Transcript
Starting point is 00:00:00 TED Audio Collective. You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day. I'm your host, Elise Hu. Misinformation and disinformation have altered societies and fractured realities around the world. I think about it a lot, especially as what we read, see, and hear is so filtered through algorithms. In her 2024 talk, disinformation disruptor Vidhya Ramalingam unpacks how disinformation works, how it can lead to violence,
Starting point is 00:00:38 and crucially, how to protect ourselves and our loved ones in the face of information manipulation. After the break. Support for the show comes from Airbnb. As 2024 comes to a close, I've been reflecting on my travels this past year. And of course, the highlights include several great Airbnb stays you've heard me mention.
Starting point is 00:01:01 Palm Springs, Sedona, Tokyo. In 2025, perhaps it's the year I finally host on Airbnb. With the amount of time I spend away from home, it just seems like the practical thing to do. I love the idea of looking back this time next year having hosted several great stays and enjoying the extra income I saved. Your home might be worth more than you think.
Starting point is 00:01:23 Find out how much at airbnb.ca slash host. At Radiolab, we love nothing more than nerding out about science, neuroscience, chemistry. But we do also like to get into other kinds of stories. Stories about policing or politics, country music, hockey, the sex of bugs. Regardless of whether we're looking at science or not science, we bring a rigorous curiosity to get you the answers. And hopefully make you see the world anew. Radiolab, adventures on the edge of what we think we know. Wherever you get your podcasts. And now our TED Talk of the day. You are a disgusting liar. Someone somewhere will hunt you down.
Starting point is 00:02:11 I hope someone puts a bullet between your eyes. These are messages received by climate scientists. According to a recent survey, 39 percent of climate scientists have faced online abuse. Eighteen percent of those faced threats of physical violence. At the end of the day, we're going to see just how much you believe in your global warming and whether you're willing to die for your so-called research.
Starting point is 00:02:52 No scientist should have to fear for their lives, but this is just another day in the life of a climate scientist. I'm not a climate scientist. I'm not a climate change activist. I'm a counterterrorism expert. I started my journey meeting with white supremacists in basements in Sweden and went on to lead a global policy effort after Europe's first major terrorist attack perpetrated by a white supremacist.
Starting point is 00:03:22 I went on to found Moonshot, an organization that works to end violence online. I care about climate change denial because it's so often weaponized to serve as a justification for violence. It would be easy to think that if only we could get people to understand climate change is real, we could put an end to this. Unfortunately, it's not that simple. In 2019, a gunman walked into a Walmart in El Paso, Texas.
Starting point is 00:04:01 He killed 23 people, many of immigrant background. He called himself an eco-fascist. He believed in climate change, but he had bought into mis- and disinformation that immigrants were the root cause of it, that sustainability would only be possible with the elimination of people of color. Mis- and disinformation are so often weaponized to serve as a justification for violence. Although they're often used interchangeably, misinformation is information that's false or misleading. Disinformation is spread intentionally to cause harm. It's so powerful because it taps into your grievances,
Starting point is 00:04:54 what makes you really angry. And it offers simplistic solutions. There's typically a villain and a hero. Over the last two years, my team and I have been researching different kinds of manipulation tactics used all over the world to spread disinformation. Two of the most common were decontextualization and fear-mongering.
Starting point is 00:05:19 Decontextualization is the practice of taking information out of its original context to deliberately mislead people. For example, earlier this year, Europe experienced a series of protests by farmers against a range of proposed environmental regulations. There were street blockades and protests, demonstrations, occupations. Adding to an already tense moment, several inauthentic images circulated.
Starting point is 00:05:51 This one purported to show the Ukrainian embassy in Paris getting pummeled with manure. This was actually footage taken months earlier from an entirely different protest about an entirely different issue in Dijon, not even in Paris. And this effort to mislead the public, it wouldn't be complete without the use of new technology.
Starting point is 00:06:17 Here's an image showing the streets of Paris lined with bales of hay. This never happened. It was entirely generated by AI. And this isn't just happening in Europe. Last year, after wildfires raged in Hawaii, a disinformation network linked to the Chinese Communist Party spread inauthentic images,
Starting point is 00:06:44 purporting that the US government had intentionally spread the wildfires using a so-called weather weapon. Can you imagine? Over 100 people died in those wildfires, and the idea that those fires were deliberately set by their own government against their own people, it's terrifying. These kinds of conspiratorial narratives can sow widespread fear,
Starting point is 00:07:13 which takes us to the next powerful tactic of disinformation, fear-mongering: deliberately exaggerating an issue so that you can provoke fear and alarm. We know that emotion-driven information processing can overtake evidence-based decision-making, which is what makes this form of disinformation so effective.
Starting point is 00:07:37 It's for these reasons that a recent MIT study found a false story will travel six times quicker to reach 1,500 people than a true story will. And we know Facebook fact-checkers take up to 72 hours on average to identify and remove this content. By that time, most impressions have already been made. Now, I know we have all seen this online, and when you see it happen,
Starting point is 00:08:06 it can be really tempting to respond with the facts. I get it. We pride ourselves on logic and science. The truth matters. So when someone is so obviously spreading false information, just correct them, right? Unfortunately, this doesn't always work. Believe me, I spent the last two decades learning how to have conversations
Starting point is 00:08:31 with people buying into white supremacy. That is disinformation at its worst. Disinformation wins because of the emotions it inspires, because of the way it makes people feel. So if someone is so bought into disinformation, getting into debates on the facts with them can just risk pushing them even further into a corner, so that they get really defensive.
Starting point is 00:09:07 And now back to the episode. Okay, so if we can't debate the facts endlessly, what can we do? Last year, Moonshot partnered with Google to test an approach known as pre-bunking. Pre-bunking is a proven communication technique designed to help people spot and reject efforts to manipulate them in the future. By giving them forewarning and giving them tools to be able to reject a manipulative message, you lessen the likelihood that they will be misled. This is not about telling people what is true or false or right or wrong. It's about empowering people to protect themselves. We've tapped into the universal human desire not to be manipulated.
Starting point is 00:09:52 And this method has been tried and tested for decades, since the 1960s. All pre-bunking messages contain three essential ingredients. One, an emotional warning. You alert people that there are others out there who may be trying to mislead or manipulate them. Be aware, you may be targeted. Two, stimulus.
Starting point is 00:10:20 You show people examples of manipulative messaging so that they will be more likely to be able to identify those in the future. And three, refutation. You give people the tools to be able to refute a message in real time. For example, if you see a headline that's really sensational and it either seems too good to be true or it makes you really angry, always Google around for other sources. Always Google around.
Starting point is 00:10:55 Okay, so we knew the steps to take, but we also knew if we were going to really get at this problem around the world, a one-size-fits-all approach wouldn't work. We knew we needed to get local. So we partnered with civil society organizations in countries around the world, from Germany to Indonesia to Ukraine.
Starting point is 00:11:18 And we started first with the evidence. We met with dozens of experts, we surveyed the online space, and we identified the most common manipulation tactics being used in each country. We then partnered with local filmmakers to create educational videos that would teach people about those manipulation tactics
Starting point is 00:11:40 that were being used in their home country. In some contexts, we found that people trust close peers and relatives the most. So in Germany, we filmed close friends chatting in a park. In Ukraine, we filmed family dialogues around a kitchen table, a setting that's so familiar to so many of us, where so many of us have had those difficult conversations.
Starting point is 00:12:07 We wanted to encourage people to have these kinds of conversations within their own trusted circles, whether they're in El Salvador or Indonesia, and to do so before pivotal moments where online manipulation efforts intensify, like elections. So as we prepared to head into the EU elections, we knew that distrust in climate science had already emerged as a critical misinformation theme. Now, one study had found that adults over the age of 45 are less likely to investigate false information when they
Starting point is 00:12:46 stumble across it online. Now we also know that adults over the age of 45 have higher voter turnout, which means if it wins, disinformation can have a disproportionate impact on the outcomes of elections. So as we prepared to head into the EU elections, we created content for every EU country in 27 languages, aiming to empower Europeans to spot and reject efforts to manipulate them before the elections. Over the last year, we have reached millions of people
Starting point is 00:13:23 around the globe with these videos. In Germany alone, we reached 42 million people. That's half the German population. And we found, on average, viewers of these videos were up to 10% more likely to be able to identify manipulation efforts than those who hadn't seen those videos. This is a winning formula. The evidence shows us that pre-bunking is effective at building resistance to disinformation. It begs the question, how do we make that resistance last?
Starting point is 00:14:01 How do we build long-term societal resilience to disinformation efforts? There is an ongoing effort to use disinformation to undermine our democracies. Just last month, the U.S. Justice Department seized 32 internet domains secretly deployed by the Russian government to spread disinformation across the US and Europe. This included deliberate efforts to exploit anxieties and fear across the public about the energy transition, specifically to encourage violence. Now it's not just the Russian government that we need to be worried about.
Starting point is 00:14:45 Easy access to generative AI tools means that anyone, not just those with resources, money, and power, can create high-quality, effective, powerful disinformation content. And the sources of disinformation are varied. They can come from our elected officials all the way through to our neighbors down the road. Many of us don't need to look further than our own families. But so many of the tools we tested online are even more powerful when they come directly
Starting point is 00:15:20 from the people that you trust and love the most in real life, IRL. So instead of endlessly debating the facts, give your loved ones the tools that they need to protect themselves online. Information manipulation is, unfortunately, the new norm. But that doesn't mean we need to accept our loved ones being misled. And we shouldn't accept our climate scientists living in fear. So if we can't fact check our way out of this problem, we need to beat disinformation at
Starting point is 00:15:56 its own game by reaching people before disinformation does and giving them all the tools that they need to protect themselves online. Thank you so much. Support for the show comes from Airbnb. As 2024 comes to a close, I've been reflecting on my travels this past year. And of course, the highlights include several great Airbnb stays you've heard me mention. Palm Springs, Sedona, Tokyo. In 2025, perhaps it's the year I finally host on Airbnb.
Starting point is 00:16:31 With the amount of time I spend away from home, it just seems like the practical thing to do. I love the idea of looking back this time next year, having hosted several great stays and enjoying the extra income I saved. Your home might be worth more than you think. Find out how much at airbnb.ca slash host. That was Vidhya Ramalingam at the TED Countdown Dilemma event in Brussels.
Starting point is 00:16:59 If you're curious about TED's curation, find out more at TED.com slash curation guidelines. And that's it for today. TED Talks Daily is part of the TED Audio Collective. This episode was produced and edited by our team, Martha Estefanos, Oliver Friedman, Brian Green, Autumn Thompson, and Alejandra Salazar. It was mixed by Christopher Faisy-Bogan. Additional support from Emma Taubner and Daniela Ballarezo. I'm Elise Hu.
Starting point is 00:17:24 I'll be back tomorrow with a fresh idea for your feed. Thanks for listening.
