Cautionary Tales with Tim Harford - You Have Reached Your Destination
Episode Date: December 27, 2019
We may mock our ancestors for seeking the advice of oracles, soothsayers and psychics, but today we rely heavily on computer programs and math formulas to help us navigate our world. If we continue to follow them unthinkingly, should we be surprised when we end up in unexpected and dangerous places? Read more about Tim's work at http://timharford.com/
Transcript
Pushkin
As the night draws in and the fire blazes on the hearth, we warn the children by telling
them stories.
The little mermaid warns them, he's just not worth it, sister.
But my stories are for the education of the grownups. And my stories are all true. I'm Tim Harford.
Two and a half thousand years ago, King Croesus of Lydia ruled a mighty empire. Lydia had endured for more than six centuries in the lands we now call Turkey. Croesus was legendarily wealthy, yet he felt threatened by the growing strength of the first Persian Empire on his borders. What should Croesus do? Should he strike against
the Persians in an attempt to seize their lands and destroy their power, or should he aim
for a peaceful trading relationship?
King Croesus yearned for something we've all wanted from time to time, to see into the
future.
And this being the ancient world, that desire could be satisfied.
Croesus could consult an oracle. That meant travelling to a temple,
making an offering, and asking the advice of a god.
The most famous oracle was at Delphi in the heart of ancient Greece, where the god Apollo
would possess the temple's appointed priestess and give divine answers to the questions asked of her.
Croesus had tested many oracles.
As the Greek historian Herodotus tells it, Croesus sent messengers to far-flung temples,
asking each oracle what the king was doing on a particular date.
What he was actually doing was, frankly, unguessable, boiling a stew of lamb and tortoise in a bronze
cauldron.
The messengers returned from each oracle, and when Croesus heard the pronouncement from Delphi,
which was correct in every detail, he bowed his head in respect. Truly, the priestess at Delphi spoke with the voice of Apollo.
And so, Croesus asked,
Should I seek war with the Persian Empire?
If King Croesus attacks the Persians, he shall destroy a mighty empire.
And how long shall my kingdom endure?
Till the time shall come when a mule is monarch of Persia.
The kingdom of Lydia then would last forever, since the Persian Empire would never be ruled
by a half-breed animal.
The Persian Empire, in any case, was destined to be destroyed by King Croesus himself.
So the Oracle foretold.
And since the Oracle guaranteed victory, Croesus attacked and was defeated.
Croesus was the last king of Lydia. He was vanquished by the Persian king Cyrus,
whose complex heritage apparently qualified him as a mule in the eyes of the oracle.
Croesus had indeed destroyed a mighty empire: his own.
It's not easy to see into the future, but the tale of King Croesus tells us that even
when we do see the future, that doesn't guarantee we'll make wise decisions in the present.
In the modern world, we have our own oracles, and they too tell us what the future holds.
We call them by mysterious names, such as Alexa, Google and Siri.
And we need to think much harder about how we use them.
August 2009. Death Valley, California, one of the hottest places on earth.
National Park Ranger Amber Nattrass has been following the tracks of a car along a rough country trail that's barely a road at all. The tracks shouldn't be there. The road has fallen into disrepair and been covered by the shifting sands of the Mojave Desert.
In fact, the route is officially closed. The closure marked by small rocks and bushes laid across the road,
but the tracks go straight through those slight barriers.
Then, Ranger Nattrass sees the Jeep.
It's up to its axles in soft sand.
It has SOS spelled out in medical tape on the windshield, and there's someone lying
beside it in the 115-degree shade.
Are you okay?
Ranger Nattrass asks.
The prone figure scrambles to her feet.
She's alive.
Her tongue is swollen, her lips are bleeding and blistered.
She's not okay.
She's not okay at all.
Alicia Sanchez and her six-year-old son Carlos had been stuck in the sand
and the heat for five days. Carlos hadn't made it. He had drifted away, telling his mother
that he had been speaking to his grandfather in heaven. How did a young mother and her son end up so terribly, fatally lost?
They'd been relying on the directions of a dashboard computer.
Rangers in Death Valley National Park have a phrase for it, death by GPS.
There was probably a point where she said, oh my god, I don't know where I am.
I'm going to keep going because I think I'm going in the right direction.
People are so reliant on their GPS that they fail to look out the windshield and make wise decisions based on what they're seeing.
Many of us have done the same thing that got Alicia Sanchez stranded in Death Valley.
We've relied on GPS, only to find ourselves lost in one way or another. I know I have. Once I typed in the wrong address; another time, the route was blocked.
More than once, I've just lost the signal, and because I've relied on the computer's guidance,
I've been helpless. I've just never been so cruelly punished for my mistake.
The story of Croesus and the Oracle is the last and most ancient cautionary tale in our series,
but it holds a very modern lesson for us. I think it teaches us what might go wrong when we ask computers to predict the future
for us.
Most of the economic forecasts we see are made by computers.
So are the weather forecasts, and so are many computerised decisions that we don't even
think of as forecasts, such as when an algorithm recommends who should get a mortgage approved and who
shouldn't, or even which criminal suspects should get bail.
When you ask a GPS system or your phone to plot you a route, that's a forecast too.
The computer makes a prediction of which roads are open based on a map database that may or may not be accurate, and then unleashes an algorithm to forecast which route through that map will be the swiftest.
If the map is wrong, the prediction will be wrong too.
But even when the prediction is right, you may still end up far from where you wanted
to be.
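To give a rough sense of what that routing step involves, here is a minimal sketch with made-up towns and travel times, using a standard shortest-path search rather than any particular device's software. The point is simply that if the map data wrongly lists a closed track as open, the algorithm will confidently recommend it.

```python
# A hypothetical illustration: route-finding is a forecast built on top of a
# map database. If the database is wrong, the "fastest route" is wrong too.
import heapq

def fastest_route(roads, start, goal):
    """Dijkstra's shortest path over a dict of {town: {neighbour: minutes}}."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        minutes, town, path = heapq.heappop(queue)
        if town == goal:
            return minutes, path
        if town in seen:
            continue
        seen.add(town)
        for neighbour, extra in roads.get(town, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (minutes + extra, neighbour, path + [neighbour]))
    return None  # no route found

# The database says the desert track is drivable in 30 minutes...
map_in_database = {
    "trailhead": {"desert_track": 30, "paved_detour": 90},
    "desert_track": {"campsite": 20},
    "paved_detour": {"campsite": 45},
}
print(fastest_route(map_in_database, "trailhead", "campsite"))
# -> (50, ['trailhead', 'desert_track', 'campsite'])
# ...but if the track is actually closed, the only real route is the detour,
# and the confident answer above leads straight into the sand.
```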
Ponder the predicament of a Swedish couple on holiday in Italy. They went to the tourist office in the small town of Carpi, near Bologna in the industrial north of Italy, and asked for directions.
Which way is it to the Grotta Azzurra, please?
The Grotta Azzurra? Sorry, the Blue Grotto is a sea cave.
Yes. The Blue Grotto, exactly.
But the Blue Grotto is in Capri.
We are in Carpi.
Instead of driving four hours south from Rome
to the beautiful island of Capri,
they had driven four hours north to Carpi,
where they were about as far away from the sea
as it's possible for a little
Italian town to be.
Ah, you say, it's hard to understand how they managed it. I mean, Capri is an island.
Well, yes. But it's not hard to understand at all, is it? It was a typo. A typo that meant
a long drive in exactly the wrong direction. Compared to the suffering of Alicia and Carlos Sanchez, that was a small enough misadventure. If the Swedish tourists had known anything about which cities in Italy are north of Rome and which lie south, or had checked a map or a compass, then they would never have made the mistake. But why would they
have done any of that? They'd asked their GPS
to take them to Carpi, and it did. Just like the Oracle of Delphi, it produced exactly the right answer. And just like Croesus, they acted on that answer without pausing to think.
The concerns of King Croesus two and a half thousand years ago may seem very different from our problems today as we type an address into a GPS or check the weather forecast.
But they really aren't.
Oracles, like computer algorithms, are mysterious black boxes.
We ask them questions about the future, we receive answers, and then we have to work out
what those answers mean.
There's a fierce debate raging about the use of algorithmic predictions.
How can we trust a computer to decide whether a criminal is at high risk of reoffending, or whether a teacher is promoted or fired?
That debate tends to focus on whether the algorithms deliver predictions that are fair
and accurate.
Which of course is an important question, but it leaves out a point that we tend to overlook,
a point that is the lesson of our cautionary tale: just because you get
a good forecast doesn't mean you're guaranteed to make a good decision.
After all, the Oracle at Delphi told King Croesus the truth about the future.
The truth didn't help him.
And the GPS that delivered the Swedish tourists to Carpi also gave the correct answer to the question they'd asked. They'd still have been much better off if they'd used a road map.
And as for Alicia Sanchez, we don't know whether her GPS had an outdated map or an intermittent
signal, or maybe it worked fine, and she made some trivial mistake in the way she used
it. What we do know is that she didn't have a map, and she didn't stop when she got to the
barrier of stones that was supposed to show the road was closed.
Trusting the GPS had truly awful consequences.
A good forecast can lull you into believing it's infallible, creating a crisis when it fails. Or it can give you an accurate answer that you misunderstand, perhaps because you don't know what you've really asked.
One of the gurus of futurology was a French economist called Pierre Wack.
He once wrote, forecasts are not always wrong. More often than not, they can be reasonably accurate.
And that is what makes them so dangerous.
I don't want to suggest that computers always produce good forecasts. In fact, behind the scenes, a mathematical algorithm was responsible for what I think has a strong claim to being
the worst forecast anyone has ever made. It's August 2007, the early days of the great financial crisis. An insurance
executive called Joseph Cassano is trying to reassure the world that his company, AIG,
is doing just fine. You might remember AIG from our episode about giving the Oscar to the
wrong movie. AIG was an insurance company at the epicenter of the financial crisis.
It had been writing contract after contract, insuring other companies against debts not being repaid. It's hard for us, without being flippant, to see a scenario with any kind of realm of reason
that would see us losing one dollar in any of those transactions. Not a single dollar, not in any conceivable scenario.
18 months later, AIG announced that it had lost more than 60 billion dollars in a single
quarter.
What on earth had happened?
Simple.
AIG had wagered billions, no, trillions, on the financial markets' equivalent of a GPS,
and the GPS had led it astray.
The financial GPS was a mathematical formula that attempted to forecast the risk that two
bad things happened together.
Let me take a moment to explain.
Let's say I lend money to two businesses.
I hope that they're both going to pay me back, but they might not.
One of them might default and not repay the loan.
Both of them might default.
So here's a question.
What's the chance that the second one defaults, given that the first one does?
It's a subtle question.
If one business is a food truck in Cancun and the other is a food truck in Amsterdam,
their fates are presumably totally unrelated.
If you're trying to figure out whether the food truck in Amsterdam will fail to pay back
the loan, the fate of the truck in Cancun is irrelevant.
The technical term for this is that they're uncorrelated.
But if the food trucks are in the same city, their fates might be linked. They both have
to deal with the same local economy, the same licenses, and the same weather. If a big
local business closes, that's bad for both of them. If so, their fates are correlated.
What happens to one is likely to happen to the other.
But even then, the link between the two isn't obvious.
If one truck gets bad reviews or is shut down for hygiene violations, maybe that's good
news for the other truck.
One truck going bankrupt might make the other truck less likely to go bankrupt.
If so, their fates are negatively correlated.
What happens to one is less likely to happen to the other.
If you're on Wall Street,
it's important to figure out
whether loans are positively correlated,
negatively correlated or uncorrelated.
One of the things that Wall Street likes to do
is build big financial structures out of
these individual loans.
How safe the financial structure is depends a lot on these correlations.
If they're all highly correlated, then either none of them are going to go bad.
Or they all are.
That's a lot of risk.
But if they're uncorrelated or even negatively correlated, then I can pretty much guarantee
that when some go bad, the others will be fine.
Predicting the correlation tells you all you need to know about predicting the risk.
Given how useful it is to predict the correlation between loans, Wall Street's finest mathematicians, often dubbed the quants, turned to the challenge about two decades ago.
They deployed a formula known as the Gaussian copula function.
You don't need to know what the Gaussian copula function actually is, just that it's
a way of predicting correlations automatically.
Plug your historical data into a computer and out comes the prediction.
It's the financial GPS.
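To make that concrete, here is a minimal sketch of the idea, with invented default probabilities and correlations. It is not the model any bank actually ran; it simply shows that the chance of two loans going bad together depends heavily on the correlation you feed in.

```python
# A toy two-loan Gaussian copula: each loan defaults when a latent "health"
# factor falls below a threshold, and the two factors are correlated normals.
# All numbers here are made up for illustration.
import numpy as np
from scipy.stats import norm

def joint_default_prob(p1, p2, rho, n=500_000, seed=0):
    """Estimate P(both loans default) when the loans default with
    probabilities p1 and p2 and their latent factors have correlation rho."""
    rng = np.random.default_rng(seed)
    factors = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    thresholds = norm.ppf([p1, p2])          # default cut-offs for each loan
    defaults = factors < thresholds          # True where a loan defaults
    return (defaults[:, 0] & defaults[:, 1]).mean()

# Two loans that each default 5% of the time:
for rho in (0.0, 0.5, 0.9):
    print(f"rho={rho}: P(both default) ~ {joint_default_prob(0.05, 0.05, rho):.4f}")
# rho=0.0 gives roughly 0.0025 (independent: 5% of 5%);
# rho=0.9 gives a figure several times higher, so a structure priced on a
# low correlation estimate can be far riskier than the formula suggests.
```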
And that financial GPS, the Gaussian copula function, is now commonly described in the
business press as the formula that destroyed Wall Street in the great financial crisis
of 2008. The problem wasn't just that the formula gave the wrong answer, although it did.
The problem was that AIG and other big Wall Street institutions had bet everything, absolutely
everything, on the assumption that the formula couldn't be wrong.
That was like trusting a GPS in Death Valley.
It's fine, until it isn't.
And then you're in terrible, terrible trouble.
When people trusted the Gaussian copula function, they didn't realise that they were making a knife-edge bet.
If the copula function produced the right prediction, you could bet a billion dollars and be absolutely certain that you wouldn't lose anything, not a cent. And if the copula function produced the wrong prediction, you could lose absolutely everything.
It's hard for us, without being flippant, to see a scenario with any kind of realm of reason that would see us losing one dollar in any of those transactions.
$60 billion, Joe Cassano. $60 billion.
And if you're confident that this sort of thing isn't still happening in the financial
system, you're more confident than I am. Before you declare war on the Persian Empire, or drive into death valley, or bet trillions
on a mathematical function you don't really understand, how about this?
Stop and think.
Think hard.
Don't just trust what comes out of the black box, whether that black box is a priestess possessed by Apollo, or a GPS-enabled device, or a financial spreadsheet.
Stopping and thinking is what most people did in ancient Greece.
Esther Eidinow is a professor of ancient history who studies what oracles did and what they
meant to the Greeks.
She says that people would prepare diligently
before they asked for divine advice.
They'd phrase their questions carefully.
They'd think about different possibilities.
They'd ponder the meaning of the answer.
In 2008, a group of Japanese researchers, led by Professor Toru Ishikawa, ran an experiment to test what GPS does to our capacity to notice the world around us. They directed people on a route through Kashiwa, a small city near Tokyo. Some of the experimental subjects, chosen at random, had first
been taken on the route by a guide. Others had been asked to follow the route on a paper map, and others had GPS guidance
instead.
After walking the route, everyone was asked to do it all again, this time without help.
The ones who'd followed a guide or used a map generally managed this task just fine.
The ones who'd used GPS had a hard time of it. They stopped
more often, walked more slowly, and rated the task as more difficult. The GPS may
have gotten them around first time, but it hadn't got them to engage with the world
at all. They didn't pay attention because they didn't have to, and then they regretted it. An automated decision doesn't have to work like that. In the mid-1980s,
a group of British doctors and medical statisticians carried out an intriguing experiment.
They wanted to test out a computerised diagnostic system for patients suffering acute abdominal pain. Such pain could have a lot of different causes: a kidney infection, a heart attack, appendicitis, even an ectopic pregnancy, and that means a lot of different possible
treatments. So, getting the diagnosis right really matters. This being the 1980s, the computers in question were old school: Apple IIes, big beige plastic bricks with 64K of memory and software that you loaded using a five-and-a-quarter-inch floppy disk. The doctor, or perhaps a medical assistant, would type away on the clunky keys,
laboriously entering data into the computer and ticking yes, no boxes, using the cursor,
no mouse or touchscreen, obviously.
This diagnostic system wasn't bad.
It wasn't particularly good, either.
It took a lot of effort to use, and it only gave the correct diagnosis about two-thirds of the time.
And yet, it was a huge success.
The proportion of patients who died fell by more than 20%.
The number of cases in which a serious medical error was made fell from about 9,000 to just 2,000.
There was a huge drop in unnecessary surgeries. Why, given that the
computer diagnosis wasn't all that good? Simple. The computer prompted the doctors to
stop and think, to work through different possibilities rather than to leap to the most
obvious answer. It was like the opposite of the GPS units used to navigate around Kashiwa, which were
effortless and let people switch off.
The computer diagnostic was anything but effortless. We prefer effortless thinking, and usually that's fine.
It would be exhausting to have to think about everything we do every time we do it, but
sometimes being prodded to stop and think can make you realise something important. I don't want to get too meta here, but let's stop and think for a moment about stopping and thinking.
There's a beautiful little experiment about what we do and don't notice.
It was conducted by two psychologists, Leonid Rozenblit and Frank Keil, who gave their
experimental subjects a simple task.
Here's a list of everyday objects. As you'll see, there's a flush lavatory, a zip fastener, and several others. I'd just like you to rate your understanding of each object
on a scale of 1 to 7. But after people had written down their ratings, the researchers would launch a gentle but devastating
ambush.
I see you've rated your knowledge of the flush lavatory at 6 out of 7.
That's great.
Here's a pen and a piece of paper.
Please would you write out your explanation in as much detail as possible?
Feel free to use diagrams, that sometimes helps.
Ah, not so easy now, is it?
And it wasn't that people had been lying to the researchers.
They'd been lying to themselves.
They felt they understood zippers and lavatories, but when invited to elaborate, they realized
they didn't understand at all.
Rozenblit and Keil called this the illusion of explanatory depth.
And when people were asked to reconsider their previous one-to-seven rating,
they marked themselves down, acknowledging that their knowledge had been shallower than they'd realized.
What King Croesus really needed was someone to gently ask him what, exactly, he thought the oracle might have meant.
This research is about more than zippers. It might even offer a way to make our political
discourse less polarised and bitterly partisan. How? Well, other researchers have adapted the flush lavatory question to ask about
policies such as a cap and trade system for carbon emissions or a proposal to impose
unilateral sanctions on Iran. The researchers, importantly, didn't ask people whether
they were in favour or against these policies. They just asked them the same simple question.
I'd just like you to rate your understanding
on a scale of one to seven.
And the same thing happened.
People said, yes, they basically understood
these policies fairly well. Then, when prompted to explain, the illusion faded.
They realized that perhaps they didn't really understand at all.
Another thing that faded?
Political polarisation.
People who would have instinctively described their political opponents as wicked
and who would have gone to the barricades to defend their own ideas
tended to be less strident when forced to admit to themselves
that they didn't really understand what it was that they were so passionate about in the first place.
It's a rather beautiful discovery. In a world where so many people seem to hold extreme views with strident certainty,
you can deflate someone's overconfidence and moderate their politics simply by asking them to explain the details. Whether we're asking people
to walk through a city in Japan or talk about their political differences, it really helps
to call their attention to what they're actually doing.
Asking the Oracle, or the computer, might give us a better prediction, but it discourages
us from thinking hard about the world around us.
And that's not something we should be giving up lightly.
Sometimes it's not the forecast that matters.
It's whether that forecast helps you think harder, or encourages you to stop thinking altogether.
Nearly 70 years after the fall of King Croesus, another great civilization was at war with the Persians. An alliance of Greek city-states faced an invasion by a vast Persian army, and turned once again to the Delphic Oracle for advice. Ambassadors from Athens received the following prophecy, which didn't sound encouraging.
Fly to the ends of creation, all ruined and lost. Lo, from the high roofs trickleth black blood, sign prophetic of hard distresses impending. Get ye away from the temple, and brood on the ills that await ye.
The ambassadors were dismayed; they couldn't go back to Athens with that.
And so, they asked the Delphic oracle for another answer.
The second prophecy was vivid, but confusing, and included some lines that hinted at safety.
Safe shall the wooden wall continue for thee and thy children.
Wait not the tramp of the horse, nor the footmen mightily moving over the land.
But turn your back to the foe and retire ye,
yet shall a day arrive when ye shall meet him in battle.
When this message was brought back to Athens, it was the subject of heated debate.
What did the Oracle mean?
Different factions saw different meanings and made different arguments.
The Oracle mentioned a battle at Holy Salamis, an island near the coast of Athens, where
many men would perish.
That didn't sound good.
But maybe that was King Croesus' error in reverse: maybe the men who would die would be the Persian invaders.
I realize this all sounds absurd to modern ears. Poetic and completely ambiguous predictions
from a Greek god possessing a young priestess. But the Athenians did what we should still be doing when faced with any
forecast. They discussed and debated. They asked, what does this really mean? Are we sure? How seriously should we take it? And what are we going to do about it?
The Athenian general Themistocles successfully argued that the wooden wall referred to the Greek navy, and that the Greeks should seek a sea battle.
They did and destroyed the large Persian fleet at Salamis.
It's a strange old story about a very different culture, but we could learn from it.
While we humans might not be very good at seeing into the future, thinking seriously about
what the future holds might just make us slightly better humans.
If you've been with me for the entire first season of cautionary tales, thank you.
You'll have heard what airships teach us about the downsides of competition.
What an apocalyptic cult shows us about changing our minds.
And what a charismatic con artist tells us about the power of persuasion, one small step
at a time.
I hope that my strange stories have made you wiser, and I hope that they've been fun to
listen to.
I've certainly enjoyed making them.
Thanks again for joining me.
Please tell your friends.
I hope to be back with a second season of cautionary tales before long.
There is, alas, no shortage of calamities from which we can all learn.
Cautionary Tales is written and presented by me, Tim Harford. Our producers are Ryan Dilley and Marilyn Rust.
The sound designer and mixer was Pascal Wyse,
who also composed the amazing music.
This season stars Alan Cumming, Archie Panjabi, Toby Stephens and Russell Tovey, with Enzo Cilenti, Ed Gaughan, Melanie Gutteridge, Masey M. Ro, Rufus Wright, and introducing Malcolm Gladwell.
Thanks to the team at Pushkin Industries: Julia Barton, Heather Fain, Mia Lobel, Carly Migliori, Jacob Weisberg, and of course the mighty Malcolm
Gladwell.
And thanks to my colleagues at the Financial Times.