Cautionary Tales with Tim Harford - Bonus: Why We Believe What Isn't True (with Axios Today)
Episode Date: February 12, 2021
We're no strangers to stories about misinformation or deliberate disinformation. We live in a world where, now more than ever, you have to be skeptical. That skepticism can be healthy, but it can also be used to cast doubt and spread misinformation about data and statistics that are very real. Tim Harford talks to Niala Boodhoo, from the news podcast Axios Today, about why people believe things that aren't true.
Check out Axios Today, where Niala delivers the news every weekday, in just 10 minutes. Subscribe to Axios Today wherever you get your podcasts.
Transcript
Pushkin
Hello, Cautionary Tales listeners. As I'm getting ready to kick off my second season
on the 26th of February, I wanted to share my appearance on a different show from my friends at Pushkin and Axios.
On this episode of Axios Today, I talk with host Niala Boodhoo about my book The Data
Detective and why people believe things that aren't true. We live in a world where now,
more than ever, you have to be skeptical. That skepticism can be healthy, but it can also be used to
cast doubt on data and statistics that are very real and to spread misinformation.
Listen to my conversation with Niala and subscribe to Axios Today, wherever you get your podcasts.
Here's the show. Tim Harford is a senior columnist with the Financial Times, and he's also the author of The Data
Detective, which is just out this week. Hi Tim, welcome to Axios Today.
Oh thank you very much for having me.
So I have to first just ask you, the title of this is How to Make the World Add Up outside the US,
but it's The Data Detective here in the US. This is like a Harry Potter situation. Why do we have a different title?
Yeah, I'm hoping for the Harry Potter sales.
That would be nice.
Yeah, it's as simple as the UK publisher
didn't like the US title, the US publisher
didn't like the UK title,
and I just have to explain it to everybody that I talk to.
Oh, I thought it had some greater significance,
like statistically, about the way
that Americans interpret words.
It's exactly the same book all over the world, just a different title.
And it's all about trying to help people think clearly about the world, using, among
other things, the tools of statistics.
And you're right that we might be at a bit of a fork in the road, or a moment when it
comes to statistics, particularly when we think about the pandemic.
Yeah, I think so because we've seen a tremendous amount of
misinformation and even deliberate disinformation,
but we've also seen an incredible appreciation of just how
life-saving accurate numbers can be.
All of the questions we want answering, like, you know, where is the virus?
Who's got it?
How's it spreading?
What are the risky activities?
Do masks help?
What treatments work?
Do the vaccines work?
All of these life or death questions,
you can't answer any of them without good data.
And so I think people have started to appreciate that,
while there is a lot of polarization,
while there is a lot of misinformation,
the numbers are helping us make incredibly important
and consequential decisions.
But of course our societal problems
and the polarization aren't about the statistics themselves.
They're about whether we believe them.
Today, in 2021, the idea that statistics are a lie
is almost accepted fact.
Yeah, although it's a lot easier to lie without statistics, let me tell you.
So, I mean, that idea goes back way, way, way, way back.
Since the time of Mark Twain, people have been talking about lies,
damned lies, and statistics.
But for me, the moment that I really identified as significant was 1954,
because two different things happened in 1954.
In the same year, you've got this,
to me, incredibly dramatic illustration
of these different views of statistics.
There's this one guy, Darrell Huff,
who wrote How to Lie with Statistics,
who's saying, yeah, it's like a stage magician's trick.
You can never trust them.
It's fun to figure out how the trick is done, and I'll show you how statistics are used
to deceive you.
And then you've got these two epidemiologists, Richard Doll and Austin Bradford Hill, who
were saying, this is not a trick.
This is life or death.
And their discovery that smoking cigarettes dramatically increases your risk of lung cancer
has helped to save hundreds of millions of lives.
It's not a game at all. The irony of that bifurcation back in 1954 is that pretty soon, Darrell Huff,
the How to Lie with Statistics guy, ended up testifying in front of Congress, basically saying,
well, you couldn't really believe all the statistics that showed you that cigarettes were dangerous. So it was a very short trip from, here's a
fun book exposing statistical fallacies, to, I'm standing in front of Congress and I'm telling
you that there's no evidence that cigarettes are dangerous. It's pretty dark.
And how do you see that direct line from that moment, casting doubt on scientists' work when it comes to tobacco and cancer, to climate science deniers, or to what we see now?
Now that we've got more than 400,000 Americans dead, and in my own country more than 100,000
Brits dead, now that those reassurances have been proved to be false, the defense mechanism
is to say, oh, well, look, the scientists got a load of stuff wrong as well.
So they'll, for example, point to the WHO, and they'll say the WHO told us that the infection fatality
rate was about 3.5% at the beginning of the pandemic and nobody now thinks it's 3.5%,
it's below 1%.
But that smear on the WHO is actually a deliberate distortion.
WHO never said that the infection fatality rate was 3.5%.
They said something else was 3.5%, the case fatality rate.
And the difference really matters.
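To make the distinction concrete, here is a minimal sketch with made-up numbers (purely illustrative assumptions, not figures from the WHO or from the episode): the case fatality rate divides deaths by confirmed cases, while the infection fatality rate divides deaths by all infections, including undetected ones, so the CFR is typically much larger.

```python
# Purely illustrative, made-up numbers -- not WHO figures.
deaths = 35
confirmed_cases = 1_000      # infections that were detected and reported
total_infections = 10_000    # all infections, including mild/undetected ones

case_fatality_rate = deaths / confirmed_cases        # deaths per confirmed case
infection_fatality_rate = deaths / total_infections  # deaths per infection overall

print(f"CFR: {case_fatality_rate:.2%}")  # 3.50%
print(f"IFR: {infection_fatality_rate:.2%}")  # 0.35%
```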
What I think is interesting is you've got that same tactic being used,
which is, I've been caught out, I've been discredited,
and I'm going to lash out and I'll attack the scientists
and claim that they've got stuff wrong, when in fact they haven't.
Why is it easier to discredit arguments, then? It's almost like an easier fight to discredit
something than to support something and prove it right.
There are so many ways to answer that question. I think, really, I'm not sure why it's easier,
but it is easier. We've got good evidence that it's easier. You just have to look around
at the preponderance of negative campaigning, for example.
But we've got some really nice evidence
in experiments conducted by political scientists
and psychologists.
So there's one from the mid 90s
that just showed people a bunch of arguments
about real hot button issues like the death penalty,
gun control, abortion
rights, the sorts of things that people get really heated about and feel very passionately
about.
And they asked people to evaluate the strengths and weaknesses of these different political
arguments.
And what the researchers found was, yeah, people find it a lot easier to come up with arguments
supporting what they
already believe, and they find it harder to come up with arguments supporting the opposing
point of view. But they found that that's doubly true when it comes to negative arguments.
People found it incredibly easy to produce negative arguments, reasons to disbelieve
the political positions that they disagreed with.
And I think that that's behind the tobacco strategy, the climate change strategy, now
the COVID denial strategy, the same basic approach.
If you don't want to believe this, it's very easy for me to give you reasons to doubt,
very easy indeed.
Doubt has this special kind of power, it seems.
It's very tempting. Even for people who really respect the numbers and respect evidence, it's easy for us to fall into
the trap of constantly focusing on errors and mistakes, and that I think just
feeds into this narrative that the numbers are always lying, that they'll never
tell you anything useful, and that's just not right.
So how do we, for example, for someone like you who obviously loves statistics, how do we
not take numbers for granted?
I think it's just to notice how important they have been in the pandemic.
The metaphor, for me, is like radar. So when we developed radar in the late 1930s, that turned out to be an incredibly important innovation.
In the UK, it helped us turn back the German Luftwaffe.
Then we took radar technology, took it to the States,
and the United States poured an incredible amount of money
into perfecting radar and perfecting that technology,
because it's just so incredibly important
to be able to see what's coming at you.
And for me, statistics are like that. They're showing us the threats. They're showing us the weaknesses
in our own system. They're showing us where policy is working and where policy is failing to work.
And where the supplies of PPE are going. Where the supplies of vaccine are going. Who is suffering most
and who needs support? All of these things. You've got no chance of figuring out any of this stuff
without good statistics, without good data.
And so it frustrates me when we sit around going,
oh yeah, lies, damned lies, and statistics,
and we treat it as though it's just a weapon
in a political argument, and it's so much more important than that,
and so much more useful.
I think part of this is just natural, as we've talked about, human nature and just
sort of our tendency to doubt. But I think a lot of this also is how overwhelmed we are
with the amount of information and statistics that are coming at us. And so how do you personally
manage that? How do you keep from being overwhelmed with information and statistics?
The first piece of advice that I give in the book I think has surprised quite a lot of people.
It's nothing to do with technical tips on correlations or r-squared or sampling bias or any of that stuff.
I just say, whenever you see a claim, a statistical claim, a newspaper headline, ask yourself how you are feeling when you see that.
Ask yourself what your emotional reaction is to the claim.
Because so many media headlines, so many social media posts are designed to arouse an emotional reaction.
That's what makes a good newspaper headline.
That's what gets the clicks, that's what gets the shares and the likes.
But if you're processing information and you're in an emotionally hot state, you're feeling
angry, you're feeling vindicated, joyful, any emotion at all, you're not thinking clearly.
So my advice is just, you don't even need to count to 10, just count to three. Notice your emotional reaction and then go back and look at the claim a second time
and you'll already be thinking in a calmer and clearer way. I know I sound like some yoga instructor.
Do you do that? When's the last time you did that?
Well, I do that all the time. It's like a total habit of mind for me,
because I'm as vulnerable to emotional thinking as anybody else.
But it's just a complete reflex when I'm on Twitter,
and Twitter is a place where there's a lot of angry stuff going on.
The moment I see something,
and I'm minded to retweet it, to comment, to share,
I just go, hang on
a moment. Just notice my own reaction to it. And then I may, of course, go on and share
it. But I've already started to spot the potential errors and the ways in which it's not just
that other people are fooling me. It's that I am fooling myself. And I'm always going
to keep fooling myself
if I'm feeling highly emotional when I see these claims.
I mean, you have talked about this.
We have talked about this many times
on the podcast Axios Today.
And I'm sure anyone who is even a mild consumer of news
is aware of the idea of checking your sources, right?
Checking your emotions.
So I wonder if you feel like that's just the world that we live in now,
where we have to remember to be vigilant about all of these things,
knowing that people are probably exhausted from being vigilant about a lot of other things.
Yes, I mean, it would be nice if every journalist, if everyone who ever posted on social media
did all that work for us,
put everything into context, gave all the sources, linked us to complementary or opposing points of view
so we could really sort of evaluate everything.
And the best journalists do, the best sources really do that.
But if they're not going to do it, you need to at least be aware that someone is
trying to get you to feel something. But it's important to be vigilant and to be skeptical
about the news stories that we consume. I think it's just as important to be vigilant and
skeptical about our own filters and biases. Because you can consume a diet of really excellent
news and information, but if you're constantly processing it in a very biased way, if you're really
yearning to reach a particular conclusion, you're still going to come out
thinking the wrong things.
Can I just ask, what is the one thing you want people to take away from your book?
If you are curious about the world and you want to understand what you're being told and
how it fits into a bigger picture, it's not that hard.
Ask the right questions, be open-minded, not too open-minded, but be open-minded, and
ask whether what you're being told is making you smarter.
When you view information like that,
rather than as a weapon that might help you win some stupid argument,
you're going to be smarter about the world.
Tim Harford is the author of The Data Detective,
which is out this week in the US,
and you can also catch him on the Cautionary Tales podcast,
produced by our partners at Pushkin.
Tim, thanks very much for being with us. I appreciate it.
Thank you. Thank you.