The NPR Politics Podcast - How To Spot Misinformation
Episode Date: November 27, 2019
In this special collaboration with NPR's Life Kit, the NPR Politics team breaks down what misinformation is and how you can spot it. This episode: Congressional correspondent Susan Davis, political reporter Miles Parks, and national security editor Philip Ewing.
Connect: Email the show at nprpolitics@npr.org. Join the NPR Politics Podcast Facebook Group. Subscribe to the NPR Politics Newsletter. Find and support your local public radio station.
Learn more about sponsor message choices: podcastchoices.com/adchoices
NPR Privacy Policy
Transcript
Hey, guys, before we get started, we have two live shows coming up. One is in Chicago on January 10th,
and the other one is at Drew University in Madison, New Jersey. That one's on January 22nd.
We would love to see you there. To grab a ticket, just head over to NPR Presents dot org.
OK, here's the show.
Hey there, it's the NPR Politics Podcast. I'm Susan Davis. I cover Congress.
I'm Miles Parks. I cover voting.
And I'm Phil Ewing, election security editor.
Miles, we have you in studio to talk about the work you've been doing for our sister podcast, Life Kit.
Our brother podcast, however you want to put it.
Our sibling podcast.
Sibling podcast.
Life Kit.
Yeah, yeah, yeah. So this is basically NPR teaches you how to do stuff.
And I have been helping people walk through the democratic process in a bunch of different ways. We have one on how to vote, one on how to run for office at the local level. And then the one we're talking about today, which is how to spot misinformation, which is obviously a pretty hot issue right now.
Okay, so let's just talk about and define misinformation. For the purposes of the podcast episode you did,
how did you define misinformation? I think the definition often gets too specific. We focus so much on 2016 and the Russian interference effort before that election, right? And that's part of it.
There is misinformation during election seasons about candidates on both sides.
Which is like weaponized misinformation. Exactly. Which is basically like this person
said this or this person feels this way about this thing.
And either it didn't happen or it's focusing too much on it without giving context, things like that.
So, Miles, tell us about some of the misinformation campaigns you came across in your reporting.
There's a whole misinformation campaign every election around when election day is, how you can vote, what you need to bring.
There'll be people online in 2020, I promise,
in almost every state who will post something on social media that says,
Democrats vote on Tuesday, Republicans vote on Wednesday. That is like one of the most famous
misinformation campaigns that's been happening since election days existed years and years and
years ago. And then the story we focus on in the podcast is about this local government in Idaho. It's not a national misinformation campaign, but something that actually happened at the local level. I talked to Caitlin
Dickerson, who's an immigration reporter for the New York Times. And she basically brought me to
this time in 2016, in Twin Falls, Idaho, where basically the entire local government was thrown
upside down by a fake news scheme about a crime that actually didn't happen. People were showing up
at community meetings saying that a group of Syrian refugees had sexually assaulted a five-year-old,
and that campaign was completely devoid of facts. Here's Caitlin talking about the city officials
in Idaho after this misinformation campaign takes root online. Members of the local government,
the mayor, the city council members,
local judges, the county prosecutor, they were basically inundated for months on end with
threats, violent threats, very visceral and descriptive threats from all over the world.
It does sound like misinformation, Phil, is something that can be, in that case,
dangerous or disruptive beyond just being a nuisance or feeding sort of misunderstandings of issues.
I mean, there's a whole range of impact that misinformation can have in our politics.
It can be very dangerous. There was a man who went into a pizza parlor in Northwest D.C. with guns and was going to shoot the place up and actually, I think, fired a couple of rounds in the restaurant because he believed a conspiracy theory online
that it was involved with some kind of completely false child sex ring, which was also connected
with politicians in 2016. When people believe things that are not true strongly enough to take
action, that can be very consequential in real life if there's violence involved, or even just if it
animates the actions of people who are important and influential because they believe something
that isn't correct, and they have the power to act on that belief in ways that can be
important for the rest of us. So, Miles, who are some of the people you talked to to explain how you can identify misinformation? So, we talked to a number of folks. We talked to Caitlin from the New York Times, which is a really interesting case study. She covers immigration. Immigration is a huge topic for misinformation.
We also talked to a couple of professors from the University of Washington who actually teach a class. I can't actually say the name of the class on public radio, but it's basically Calling BS. It's like one of the most popular classes at the University of Washington. And they basically walk through, from a numbers perspective, how people can skew numbers to tell a story that isn't actually there, and how you can discern news that is real and true.
So how do you spot misinformation?
It's a really complicated answer. But I think the starting point is something Caitlin told me, which I think is really helpful: skepticism is like the number one thing you just have to bring with you whenever you take in information. That means when I am picking up the Washington Post, even if it's a reporter I like, I need to just read it with a slight bit of, okay, I expect this reporter to tell
me how they know what they know and also
how they got to that conclusion. You want the reporter to show their work. And then you also have to bring that same skepticism to social media. When I'm scrolling through Facebook or I'm scrolling through Twitter, it's not just taking things and assuming, because I see them in a picture and someone's photoshopped something to say, you know, Bernie Sanders said this or
something like that, that that means Bernie Sanders actually said that. I just have to have that question in the back of my mind that says,
oh, wow, did he really say that? And do a quick Google search or something like that,
bringing that skepticism even when you're at a backyard barbecue or something, and your friend who you've known for 10 years says some science thing, like, did you know vaccines cause autism? Or something even more mundane. I was in a conversation the other day where a friend told me something about beer. And my first response, I think because I've been reporting this episode, was like, where'd you find that out? Because in my head, the real question I had was, is that true? But the way I expressed it was like, oh, where did you read that? Or where'd you see that, or something like that.
And so I think bringing that skepticism to all information gathering in your life is kind of the step number one.
But it does seem, and you brought this up in the podcast because it's so important, that a huge factor in misinformation, especially in politics, is social media. And as someone in the podcast says, social media has brought a lot of good into our lives, but it has also been a main channel for misinformation in the modern era.
And so much of what makes misinformation pernicious is confirmation bias.
A lot of the reason people get into stories they find on Facebook and Instagram and Twitter is that they confirm something they already think or believe about the way the world works.
And the way social media works is by rewarding you for interacting with things that it can tell you like. So the more time you spend on a story, the more links that you click on of a certain kind, the more Facebook, for example, is going to show you things like that.
There's a big factor on YouTube as well with extremist content, where we've had a lot of reports about people who all of a sudden find themselves going down a rabbit hole toward violent extremism or white identity types of politics, because all YouTube wants to do is
keep you on YouTube. I think the professor from the University of Washington, his name's Carl Bergstrom, put it in a way that captures exactly what Phil's talking about, but it freaked me out even further.
The content that we're delivered has been curated and selected by a set of machine learning algorithms that are basically running large-scale experiments on all of the users of the platform to see what keeps people clicking, what keeps people on the site.
Large-scale experiments on our brain.
And this is what we are...
That is some spooky talk.
Yeah, this is what we've decided.
On a global scale.
Yeah.
This is how we spend, not just like minutes, hours of every day.
God, that's so depressing.
Take that out of the pot.
I'm like, does anybody have a cigarette?
It's not good.
We should be drinking a whiskey and having a Marlboro
while we're having this conversation.
All right, we're going to take a quick break,
but when we come back,
we'll talk about how to spot misinformation.
Support for this podcast
and the following message come from Uber.
Uber is committed to safety and to continuously raising the bar to help make safer journeys for everyone.
For starters, all drivers are background checked before their first ride and screened on an ongoing basis.
And now Uber has introduced a brand new safety feature called RideCheck,
which can detect if a trip goes unusually off course and check in to provide support.
To learn more about Uber's commitment to safety, visit uber.com slash safety.
This week on Bullseye, Lin-Manuel Miranda on His Dark Materials, hip-hop, and life after Hamilton.
I know it's the first line of my obituary. So if that line is handled,
then what else can I do with my time here? It's Bullseye for MaximumFun.org and NPR.
And we're back. And Miles, one of the points you make in the podcast in your reporting
is that one of the hallmarks of misinformation in social media is oftentimes it is a meme or
a story or something that triggers an intense emotional response.
Yeah, there are like three factors. So you bring your skepticism to every news story. That's
like step one. And then like step two is actually knowing the types of information that are going
to be most ripe for misinformation. Like immigration. Like immigration. And why is
immigration such a good place for
misinformation? Because it triggers fears about things like refugee resettlement. They're going
to take my jobs or they're going to hurt my kids or something like that. That emotional response
turns off your brain's critical thinking about the information you're taking in. You just
get angry. And when you're angry, you're tunneling in on an emotion and you're not taking in all
sides at that point. So emotional response is something that you need to be aware of.
Why else is immigration such a successful place for misinformation?
Because it's complicated.
People have jobs.
They don't want to go read, you know, 17 books on the immigration laws in this country.
They don't understand it completely.
And so that's another reason why voting rules are so good for misinformation, because they're different in every state, they're different in every locality, and people don't know all of the intricacies. So bad actors can come in and take advantage of that knowledge gap, throw in things that are slightly untrue or slightly biased, and people aren't able to discern the difference.
The third thing you have to look out for in misinformation is breaking news situations.
Times when people are really hungry, if a shooting is happening,
something's exploded in a train station, people want information,
and they're looking for the information that's coming the quickest.
They're not necessarily discerning whether that information is true or believable.
I want to focus on that emotional trigger thing, though,
because one thing we have learned, especially in politics,
is that that is what people use in political misinformation campaigns, too. And one thing we learned from the 2016 interference in our elections by Russia is that the types of issues they used to fuel that misinformation through social media preyed on some of our most divisive
social issues, things like racial attitudes, immigration. I mean, they did it because it's
effective. Correct. And in the case of 2016, there's a long history of foreign governments,
especially Russia, exploiting racial divisions within the United States. And this goes back
decades. It goes back to the civil rights era of Malcolm X and Martin Luther King and other civil
rights leaders being the focus of, in those days, printed material or reports that would surface through global news agencies
and then make their way into American newspaper reports.
And then gradually those same efforts became more and more sophisticated.
One of the most famous from history is this story that the AIDS virus was created by the CIA
as a bioweapon for use against black populations in the United States and Africa. That story retained so much danger as a misinformation storyline that there are still episodes every once in a while where someone will talk on TV and say that they believe this. A total lie, but one that took hold so strongly
in the United States that it's still doing damage even now. That's such a good point because once
misinformation takes root, it's really hard to rip the roots out. Peter Adams from the News Literacy Project is another expert I talked to specifically about this.
He made a good point that when we talk about that emotional response, I think the first thing people think about is like fear or anger.
But I think that same emotional response sometimes can be even stronger in the sense of like fighting for these values like equality and things like that.
Those responses can also tunnel your vision to
make it so you're not seeing 100% of the truth. A lot of misinformation exploits our values. It
exploits our patriotism. It exploits our religious faith. It exploits our dedication to ideals like
equality. Is the takeaway that you should be the most skeptical about information online that makes you the angriest?
I think that's right. I think what Adams said basically is if I'm getting angry, that shouldn't be an indication that I should share this information.
That should be an indication that I should check this information.
One other factor is a call to action. In 2016, one of the most consequential documented cases was of an account on social media called Blacktivist, which was talking within the Black Lives Matter movement.
And the messages, all of which were coming from Russian influence mongers, were,
we black voters have been so marginalized, and these candidates don't speak to us at all,
we shouldn't vote for any of them. So we need to stay home on election day. How many people did
that actually change the behavior of on election day? We don't know that. But if you're looking at these stories and the coda of the message is do something or don't do something, that's another thing to be suspicious about.
All right, we're going to leave it there for today. But you can find Miles's episodes on how to vote, how to run for office, and how to spot misinformation, and all other episodes of Life Kit, at npr.org slash Life
Kit. I'm Susan Davis. I cover Congress. I'm Miles Parks. I cover voting. And I'm Phil Ewing,
election security editor. And thank you for listening to the NPR Politics Podcast.