Hidden Brain - You Don't Need a Crystal Ball

Episode Date: March 28, 2022

When disaster strikes — from the explosion of a space shuttle to the spread of a deadly virus — we want to know whether we could have avoided catastrophe. Did anyone speak up with concerns about the situation? And if so, why didn't someone listen? This week, we revisit a favorite episode about the psychology of warnings, and how we can all become better at predicting the future. If you like this show, please check out our new podcast, My Unsung Hero! And if you'd like to support our work, you can do so at support.hiddenbrain.org.

Transcript
Starting point is 00:00:00 T-15 seconds. This is Hidden Brain. I'm Shankar Vedantam. T minus 10... 9... 8... On January 28, 1986, the space shuttle Challenger blasted off from Cape Canaveral. Two, one, and liftoff. Liftoff of the 25th space shuttle mission, and it has cleared the tower. Seconds after liftoff into a clear blue sky, something went wrong. Flight controllers are looking very carefully at the situation.
Starting point is 00:00:36 Obviously a major malfunction. It was much more than a malfunction. It was a disaster. As millions of people watched in horror, trails of smoke and debris flew off in different directions. We have a report from the flight dynamics officer that the vehicle has exploded. The space shuttle blew up. Those words flashing across America today. The Challenger exploded this morning just after liftoff from Cape Canaveral, the crew of seven now presumed dead. After the explosion, NASA officials were hauled to Capitol Hill and subjected to questioning. Who finally made that decision to go?
Starting point is 00:01:17 What was the chain? How did that link? What happened then specifically? Members of Congress asked the tough questions. How could this have happened? Who's to blame? These are the kinds of questions we have when any catastrophe occurs, whether a spacecraft has exploded, a new war has broken out, or the world is swept up in a fast-moving
Starting point is 00:01:36 pandemic. In all these cases we want to know, who screwed up? Your government failed you. Those entrusted with protecting you, failed you. And I failed you. We ask, who knew what? When? The State Department had received repeated warnings that the situation was getting worse.
Starting point is 00:02:00 And we try to figure out how... How do we make sure that this kind of breach... That such an accident... ...does not ever happen again. ...with many of these incidents, there's a sense that they could have been avoided. That someone knew something, but didn't say anything.
Starting point is 00:02:15 Or if they said it, they weren't believed. Or if they were believed, nothing was done. This week on Hidden Brain, we revisit a favorite 2019 episode about the psychology of warnings and how we can all become better at predicting the future. Why some warnings get heard, why many are ignored, and the pitfalls of being a prophet. We begin the tale of warnings made and warnings ignored in the middle of Alaska. It's a balmy day, barely jacket weather.
Starting point is 00:02:52 I'm in a car riding on a weather-beating road that leads out of Fairbanks. Chris Heimstra is driving and he points out something. You can see in the road here, you see all these bumps and all these curves. These aren't your standard potholes. The state of the road is a sign of something far more significant. Beneath the pavement, the ground is disintegrating. What lies below the asphalt is the Alaskan permafrost, and it's melting. Permafrost is any soil or ice or rock that's frozen for more than two years, like two consecutive years.
Starting point is 00:03:28 Picture the plants and animals that lived in Alaska over the past tens of thousands of years. After they died and fell to the ground, they froze. Pumifrost is made up of layer upon layer of this organic frozen material. There's a place where you can see what's happening deep inside this pomephrost. It's a tunnel that's hundreds of feet long. It's built into the side of an Alaskan hill. Chris is a research scientist at the Army Corps of Engineers. He works in the cold regions research and engineering laboratory out of Fairbanks.
Starting point is 00:04:00 He has spent a lot of time in this tunnel and he's taking me there today to show me something important. There is 40,000 years of earth history that's stored and frozen in time. To get to that frozen history we walk through a set of gates over to a wood cabin, a whole bunch of hard hats. We head over to what looks like a nondescript wooden shed. It has a sign on the outside that says US Army Corps of Engineers and a warning to watch your step. Inside, it's pitch black. At the far end, there's another door that leads into the actual tunnel. Chris turns on the lights and leads me through.
Starting point is 00:04:39 We walk down to a lower area. The deeper we go, the lower down you go into the tunnel, the older you are in sediment. The further back we move in time. Okay, so like back over here, when Chris describes what's in the tunnel, he's even toned. But what I can see all around me is completely extraordinary. So on that, that's like a 43,000 year old piece of, probably willow, that's been sitting down here for quite a while.
Starting point is 00:05:09 43,000 years old. It actually looks like it could have fallen last year. It's amazing the cold temperatures and how long it can be organic matter can be preserved here. If you're picturing this like a real life trip on the Magic School bus, you're right. Except there's one part of the experience that doesn't exactly fill me with wonder. I think the aromas, one thing that people notice right away, I was just going to ask you, is it just me or does this place sting? It's suddenly got an unusual smell and that's your organic matter that's coming back into the atmosphere. All around me, the decaying plants and animals smell like food gone bad in a freezer.
Starting point is 00:05:53 The smell is unpleasant, but what it ought to be is terrifying. Something is happening here that has consequences for the entire planet. The organic matter trapped in the pomephrast, fungi, plants, animals, it's towing. As a thaw, it decays, and as it decays, it releases extraordinary amounts of carbon into the atmosphere. How much carbon, scientists say the amount of carbon that's stored in the pomephrast, is about double of what's in the entire atmosphere. Let me say it again. Double. It's in the deep freeze. What happens if that temperature goes up, or for some reason it thaws a little bit more, what happens at that carbon? A place you don't
Starting point is 00:06:38 necessarily want it is back in the atmosphere. We don't want it back in the atmosphere because carbon dioxide contributes to climate change. But a vicious cycle has already started. As the planet warms, the permafrost thaws, all those dead animals and plants and fungi start to decompose. More decomposition means more carbon released into the atmosphere, which means warmer temperatures, which means even more melting in the permaphrost. What follows is disaster. The firestorm in Australia has burned in area
Starting point is 00:07:10 as large as West Virginia. So it started raining and no big deal. And then we see the water going down the street start to get a little bit higher. At the rate, global temperatures are rising 60% of all the glaciers here in Xinjiang. Nearly 11,000 glaciers will be gone within 50 years. Chris doesn't need to turn on the news to see what rising temperatures are doing to the planet.
Starting point is 00:07:34 When he's in the tunnel, he can see it right in front of his eyes. He can smell it. He's reminded of it every time he drives on the cracked pavement. And there's a feeling that he can't escape. It comes when he's at work, when he talks to strangers, even when he's in the comfort of his own car. That feeling is futility. He can see a catastrophe unfolding in front of him,
Starting point is 00:07:58 but no one seems to be listening. Or people seem to be worried about the wrong things, like in a conversation he had earlier in the day. The claim was made that most of the CO2 in the atmosphere comes from volcanoes, which isn't the case. Who made the claim? It was somebody on the school field trip earlier today. And I pushed back against that politely.
Starting point is 00:08:22 It's hard because you gotta be really, it's so hard to be, you don't wanna offend people, you don't wanna, because making someone angry isn't gonna change your mind at all. So you gotta be really careful about how you wait into things and you don't know this person necessarily and there's not a familial tie.
Starting point is 00:08:44 All we have really in common is that our kids go to the same school and they're in the same class. But it was a crucial piece of misinformation that wasn't true, and then you just say flat out, that's not true. That's going to shut everything down, and it's not going to help me in any way. It's just going to make somebody who thinks I'm a jerk. To Chris, it seems as if the pushback he gets is driven by a logic contempt that some people feel towards science and scientists.
Starting point is 00:09:13 Like it's not relegated or limited to your work. It's a critique of you as a person as if you're just trying to lie to people about the work you do, which doesn't make any sense. I mean, your stock as a scientist is because of your honesty. If you're not honest as a scientist, your career is over. And it should be over. Chris feels invisible. He walks in a remote place, 40 feet below the surface,
Starting point is 00:09:44 in a part of the country that's unfamiliar to most Americans. A lot of them will never make it up here. A lot of them won't go into the Permphrost tunnel. A lot of them won't go run up above the tunnel or throughout Alaska where Permphrost exists or even understand what it looks like, what it smells like. So how do you communicate that? How do you say like there's a value and you understanding what you don't currently understand? How do you get people to see that there's value in understanding what they don't understand? Chris's question is an ancient question. For millennia, we often shunned and shamed people who have warned us of looming disaster. Why does this happen? Why do human beings who care about their survival ignore warnings of doom?
Starting point is 00:10:31 When we come back, we are going to do something unusual. We are going to look for answers to that question, not in science, but in literature. We are going to dive into Greek mythology and talk about a doomed prophet. The lessons from her story still resonate today. You're listening to Hidden Brain, I'm Shankar Vedantam. This is Hidden Brain, I'm Shankar Vedantam. In Greek mythology, the gods loom large, so large that many of them fill our imaginations even today, Zeus with his thunderbolts, Aphrodite the goddess of love, Hades in charge of the underworld, Athena goddess of wisdom.
Starting point is 00:11:19 The gods of ancient Greece moved among humans, some of those humans were themselves touched with divine powers. One of the most striking was the Prophet Cassandra. Cassandra remained so memorable that her story has inspired movies, television, even campy pop music. In 1982, the Swedish band Abba dedicated a song to her more than 2,500 years after she was first memorialized by Homer, Eskilis, and Euripides. This power that Cassandra had was incredible. Except, there was one problem. No one believed her. It's not Cassandra's fault that she's not believed.
Starting point is 00:12:11 This is Emily Wilson. I'm a professor of classical studies at the University of Pennsylvania. Emily is going to take us back to the legend of the Kingdom of Troy. She's going to help us understand what this ancient myth still has to teach us. Cassandra's father, Pryam, was the king of Troy. It's safe to say that Cassandra didn't suffer from only child syndrome. Pryam has, according to different accounts, either 50 or 100 children. She's one of the many children of Pryam.
Starting point is 00:12:42 Cassandra has a blessing that is really a curse. She can see the future, but no one will believe her. But even if you set aside the curse, it turns out she also did several things that made it less likely she would be believed. For one thing, her prophecies were a pick. When she revealed her visions, she spoke in language that was a little hard to understand. Let the mob endlessly gorging on this clan, raise a shriek over the sacrifice on which stones
Starting point is 00:13:29 will fall in their turn. She speaks primarily in symbols and metaphors that she can foresee the ox killed at the altar. She can foresee blood and slaughter. She doesn't speak in a way that spells out exactly as soon as I walk into this house, Clayton Nesteros is going to take the axe and hack us both to death. Because that's not the way oracles speak. She speaks in prophetic language. Cassandra's best known prophecy had to do with one of the world's most famous carpentry
Starting point is 00:13:59 projects, the Trojan horse. There's a war between the Trojans and the Greeks and it's dragging on. It keeps going for 10 years without decisive victory on either side. The turning point comes when the Greeks have an idea. That they should build a great big wooden horse and have their best warriors on the Greek side hide inside the wooden horse. They leave the great big wooden horse outside the walls of Troy. As the Trojans watch from their city, the Greeks get in their ships, seemingly in defeat, and sail away. Their mysterious gift remains on the beach, and the Trojan's debate whether
Starting point is 00:14:34 to throw up in the city gates and bring in the horse. Is this some kind of gift for the gods? Is it a holy offering? Should we mistrust it? Should we not mistrust it? Cassandra knows that this gift is not a gift at all. She tells her fellow citizens not to open the gates of Troy. In her own convoluted way, she says, this is a terrible idea. Don't do it.
Starting point is 00:14:58 But she doesn't have any real power. Her fellow children don't recognize her as a prophet. And so she's not taken seriously. The consequences are disastrous. The childrens decide in the end to bring the horse in, and then, of course, in the middle of the night, the Greek warriors spring out of the horse and start slaughtering the citizens inside the city.
Starting point is 00:15:23 You think that after Cassandra accurately predicts the fall of Troy, some people might start believing her. You'd be wrong. She makes another prediction that fails to gain any traction. As the city is burning and the women of Troy are forced to board Greek ships as slaves, all the prisoners are distraught, all except Cassandra, who seems weirdly happy. Her mother seems deeply concerned about her, worried for her, as one would be for a daughter who keeps saying crazy stuff, and who also seems to have this perverse idea that being taken into slavery could be a good thing.
Starting point is 00:15:58 There's a reason why Cassandra has this oddly positive reaction. Whereas everyone else can only see what's right in front of them, she can see two steps ahead. She already knows something about her situation. She knows she's going to die, but so is her captor, the Greek general Agamemnon. Revenge is coming, even if it means her own death. Cassandra sees a gleam of hope in the fact that she knows Agamemnon's going to be murdered horribly. She can see that there's going to be bad things for the Greeks down the line. So, to recap, there are several things besides the curse that make Cassandra less likely to be believed. She speaks in cryptic language, doesn't have any formal authority, and is too far ahead of everyone else.
Starting point is 00:16:42 There's one more thing. She asks too much of the people she warns. This happens when Agamemnon takes Cassandra back to Greece with him as a slave. They go to his home. Agamemnon doesn't know that during his absence, his wife, Clytemnestra, has started an affair. He doesn't know that Clytemnestra is not pleased to have him back, but she pretends to be happy. So she welcomes him into the house, along with Cassandra as his human property. He goes first into the house and then Cassandra pauses to give these prophecies, these riddling prophecies. Cassandra foresees his death and her own.
Starting point is 00:17:21 Look at this! Look! Keep the bull away from the heifer! She's caught him in her dress, her engine on her black horn, striking! Into the basin, he falls, with the water lies. He met his death in the bath. It's lay and wait for him, I tell you. Or, to translate, Clayton Nestres about to hack her husband to death with an axe. For Agamemnon to take a Sandra seriously, he would have to see that his life was in danger, that his wife despised him, that far from being a victorious warrior, he was walking into a death trap.
Starting point is 00:18:03 To save his own life, he would have to change his entire outlook. He wants to think of himself as a strong triumphant city-saccer and he's only going to process the information that confirms that belief about himself and he's going to ignore all the signs both from Clyde and Nesterah and from Cassandra that might suggest that reality is not the only reality and in fact you're missing a whole lot of information here. Things play out just as Cassandra that might suggest that reality is not the only reality and in fact you're missing a whole lot of information here. Things play out, Justice Cassandra predicted. She's caught him in her dress, her engine on her black horn, striking.
Starting point is 00:18:35 Clitam Nestra gives him a lovely bath and she entraps him in a net as well in order to make sure that if he struggles, after the first blow, he won't get away, so she strikes him multiple times till he's good and dead and also hats Cassandra to death. The soundtrack it was a particularly gruesome murder, but maybe I'm wrong, maybe this is part for the course in Athenian tragedy. I think it's part for the course, I mean,
Starting point is 00:18:56 it's much less gruesome than the death of Pentheus, say, yes. That was usually some gory death, yeah. Why else would you go to the theater? Now of course, Cassandra was cursed. By definition, it didn't matter how persuasive she was. She was never going to be believed. But her failed attempts to warn those around her
Starting point is 00:19:19 can give us insights into how warnings are heard, whether they're taken seriously, and when they're acted upon. So what does an effective Cassandra sound like? I speak loudly all the time, because I'm kind of aggressive person, you know, even though I'm all, I'll be 70 next year. My wife said, will you please calm down? She's been saying it for 43 years now, and it hasn't happened. The actual name of this Cassandra is Andrew Nazios. His moment of prophecy involved a life
Starting point is 00:19:49 and death choice that affected hundreds of thousands of people. Before we get to that, we need to understand the formative moments in Andrew's career. In many ways, he was an unlikely hero. In the 1980s, he served in the Massachusetts House of Representatives. At a Republican National Committee meeting in 1988, he didn't exactly get the star treatment. Chair recognizes Nassio from Maryland. Massachusetts. Nassio from Massachusetts.
Starting point is 00:20:22 Nassio, I'm quarantined here, sorry. Before he got to his Cassandra moment, Andrew was brought into salvage a boondoggle of a transportation project in Boston. The big dig. It redirected a massive highway into a tunnel under the city. Andrew came in several years into the project. At that point, it was a mess, with huge cost overruns. Andrew had two things going for him.
Starting point is 00:20:46 He had experienced leading big institutions, and he had his temperament. I'm sort of a type A personality, very kind of aggressive and a dominant figure in any institution that I run. So I could get the institution to do what I wanted it to do, what I thought was right to do. He was, in fact, able to figure out why the project's finances were out of whack.
Starting point is 00:21:11 But his efforts weren't always appreciated. It was the most difficult year of my career. Actually, I actually felt safer in Sudan and Iraq and Afghanistan that I did in Boston. One person threatened to break my neck while I was investigating the big dig. Andrew had spent much of his career overseas. His experience leading several national and international organizations, Coffee Eye of President George W. Bush, who appointed him to lead USAID. That's the agency responsible for America's involvement in international development.
Starting point is 00:21:48 It's here that Andrew had his Cassandra moment. In 2003, Andrew briefed top members of the Bush administration about escalating violence in Sudan. By that point, marauders were charging into villages, setting them on fire, sending civilians fleeing. They were known as a janguede. They quickly took over a vast and dusty region on the edge of the Sahara. Darfur. Andrew's warnings and advice had an impact.
Starting point is 00:22:17 The US supplied billions of dollars in aid to Sudan over the next several years. Inspired the economic difficulties, our aid will continue to flow. President Bush also put political pressure on Sudanese leaders. Some of his measures were behind the scenes, others more public. The news media began to take note. Today President Bush announced tough new economic sanctions against Sudan over the continued persecution of the minority population in Darfur. Andrews actions at USAID have become a case study on effective warnings. That's according to
Starting point is 00:22:51 Christoph Mayer. I'm a professor of European and international politics at Kingscovich, London. Christoph has spent many years studying how warnings are made and which ones manage to break through the noise. He says there are several reasons Andrew Nazio's managed to persuade the Bush administration to act. One is he was able to show how further escalation with many hundreds of thousands of people that was highly likely and he was able to put that into a presentation, you know, chart the escalation of the conflict into the future. Andrew commissioned a study that predicted how many people would die if the US didn't intervene. He also got US spy satellites to take pictures of Darfur.
Starting point is 00:23:35 To photograph the ground every day to show the villages that were being burned from day to day. And these photographs were so clear that the photographs were unimpeachable in terms of their quality, in terms of what the atrocities were and they weren't 30 at 100 villages. They displaced two million people. So Andrew laid out clear evidence. There was something else that helped him make the case to the president. He was an insider. Was I taken seriously yes because I had a relationship with the president? That's presidents plural. He had campaigned twice
Starting point is 00:24:14 for George H. W. Bush and he had worked on the George W. Bush campaign in 2000. He was not seen as one of these, you know, do-gooders, one of these kind of liberal NGO types. Yes, he had a history in the NGO sector. He was an expert, but he was also a kind of conservative pedigree. He had experience in the armed forces. He was seen as someone who understood that the president's time was precious and understood the preferences and he was seen as kind of part of us. Besides his political credentials, Andrew was also an insider in another way. He was a Christian who'd spent years leading an international Christian NGO.
Starting point is 00:24:53 He knew that George W. Bush's identity as a Christian was important to him and important to his re-election chances in 2004. Christoph Mayer says Andrew laid out the political consequences of inaction in Darfur, where many of the civilians being attacked were Christian. He was able then to show how that kind of escalation would be politically relevant to the Bush administration at the time, how it would impact on the re-election chances, and how it would connect to Christian constituency in the US, which of course are one of the supporting constituencies of the Bush administration because it was
Starting point is 00:25:30 also Christians who were in Sudan were largely affected by this violence. Andrew implicitly understood a widespread psychological bias. We all tend to look more sympathetically at suffering when the people who are suffering have something in common with us. There's another part to this. Andrew was successful because he didn't ask President Bush to make a major U-turn. From the very beginning of his administration, the president had been interested in what was happening in Sudan. The first presidential review that the president ordered was on Sudan policy. Now what came to be known as a genocide in Darfur did claim hundreds of thousands of lives. So it may be hard to see Andrews warning as effective. And yet if Andrew had not acted, Christophe says he believes things would have turned out worse.
Starting point is 00:26:19 I think the Bush administration did act, it didn't act very early, but it did act politically to put pressure on the conflict parties. And they did act, I think, as soon as could be probably expected on the humanitarian front, therefore saving lives through humanitarian action and probably preventing the conflict from being even more disastrous than would have been otherwise. I couldn't have done that alone, but President Bush did it, and I have to say I said in my book, I think he ended what could have been another Rwanda genocide. So in Andrew Nazios, we have a plain-spoken leader.
Starting point is 00:26:57 He had the insider credentials to get others on board. And he didn't ask policymakers to do something that was greatly at odds with what they wanted to do anyway. Contrast that with the doomed prophet we heard about earlier. Cassandra spoken riddles. Prophetic riddles, but riddles nonetheless. Let the mob endlessly gorging on this clan. Raise a shriek over the sacrifice on which stones will fall in their turn." Unlike Andrew Natios, Cassandra wasn't an insider. As Emily Wilson says, of an unauthoritative person for somebody who's dismissed as other than that can make it possible
Starting point is 00:27:45 to dismiss even a very clear articulation of a scary truth. And Cassandra was too far ahead of everyone else. When she and the other children women are being captured as slaves, she doesn't explain that she is happy because she knows the Greeks are going to be killed. She can see into the future, but she doesn't take others along with her. killed. She can see into the future, but she doesn't take others along with her. There are lessons here for our own time. Kristoff says many modern Cassandra's forget its hard for most people to look far into the future. Leaders especially are often pulled in different directions. Paying attention to one risk means fewer resources for others. If you come in with a vague warning about
Starting point is 00:28:24 a distant problem, you're going to get sidelined. Samantha Powell wrote in a book, a problem from hell, about the response from one administrator to the warning that was given that unless his telephones were ringing, he couldn't do anything. So even if he believed that what she was saying is right, he was so constrained by the lack of intense public clamoring in the beltway that comes support for acting that he
Starting point is 00:28:49 couldn't do something. Cassandra also asked the people she was trying to warn to stretch too far outside their comfort zone. Remember what Clitam Nestra is giving Agamemnon a lovely bath before she hacks him to death. Cassandra saw it coming. Look at this! Look! Keep the bull away from the heifer! But Agamemnon wasn't in a headspace where he could hear the warning. Kristoff says this happens with real-life Cassandra's and real-life policy makers. If leaders have to reject some foundational belief to act on a warning, there's a strong chance that they will simply ignore the warning. Quite often, what makes warnings so difficult to believe is they're politically inconvenience.
Starting point is 00:29:40 In fact, this is exactly what happened in the case of the Challenger Space Shuttle disaster. In fact, this is exactly what happened in the case of the Challenger Space Shuttle disaster. A scientific inquiry found that several engineers had had concerns about the safety of what came to be known as the O-rings on the shuttle. They told NASA managers to delay the launch, but the managers overruled the engineers, and the Challenger took off his plan. Like a troller's here looking very carefully at the situation. Obviously a major malfunction. We've painted a picture of warnings that is at odds with the way most of us think about them. In the conventional telling, someone raises an alarm and everyone jumps up and does something about it. In reality, warnings are likely to be heard when they're made by someone who's part of our in-group, when the warning is so imminent that nearly
Starting point is 00:30:30 everyone can see the danger, and when the solution doesn't require a radical shift in existing strategy. Unsurprisingly, this means that many warnings will go unheeded, and many Cassandra's will be dismissed. After the break, why you don't need Cassandra-like vision to predict what's to come. Psychologist Phil Tetlock tells us about the traits of super-frocasters. You're listening to Hidden Brain, I'm Shankar Vedantin. This is Hidden Brain, I'm Shankar Vedantin. We're surrounded by people who tell us they know what's going to happen in the future. A lot of people have no idea that Trump is headed for a historic defeat.
Starting point is 00:31:30 Bearsters is fine! Don't move your money from there! That's just being silly! These predictions have a few things in common. The commentators have complete confidence in themselves. We as the audience love to hear them make a complicated world seem simple. And finally, no one ever pays a serious price for being wrong. Donald Trump wins the presidency. There's stirs in the bargain bin. Sold to rival JP Morgan Chase for just two dollars a share. Making predictions is hard, even for the so-called experts.
Starting point is 00:32:07 Ironically, the people who are the best at forecasting the future tend to be ordinary people who happen to know a very important secret. Predicting the future isn't about being unusually smart or especially knowledgeable. It's about understanding the pitfalls in the way we think, and practicing better habits of mind. Phil Tattlach is a psychologist at the University of Pennsylvania. In his book, Super Forecasting, The Art and Science of Prediction, Phil explores how we can learn from these people to become better forecasters ourselves. Phil, welcome to Hidden Brain. Thank you very much.
Starting point is 00:32:48 So Phil, lots of people watch television at night and millions of people feel like throwing things at their television set each evening as they listen to pundits and prognosticators explain the day's news and predict what's going to happen next. Of all the people in the country, you probably have more cause than most to hurl your coffee cup at the television set, because starting in 1984, you conducted a study that analyzed the predictions of experts in various fields.
Starting point is 00:33:15 What did you find? Well, we found that pandits didn't know as much about the future as they thought they did. But it might be useful before we start throwing things at the poor pundits on the on the TV screen to consider their predicament. They're under pressure to say something interesting. So they resort to interesting linguistic gambits. They say things like, well, I think there's a distinct possibility that Putin's next move will be on a stonium.
Starting point is 00:33:47 Now, that's a wonderful phrase, distinct possibility. It's wonderfully elastic, because if Putin does move into Estonia, they can say, hey, I told you there was a distinct possibility he was going to do that. And if he doesn't, they can say, hey, I just said it was possible. So they're very well positioned. Now, if you play the forecasting game the way it really should be played and you use actual probabilities, so you play it the
Starting point is 00:34:09 way Nate Silver plays it, and you wind up with, say, a 70% probability that Hillary will win the election a few days before the election in November 2016, you're much more subject to embarrassment. If he'd said there's a distinct possibility that Hillary will win, he would have been very safely covered. Because when you ask people to translate distinct possibility into numbers, it means anything from about 20% to about 80%. The truth is making predictions is difficult. But many biases also get in the way of making accurate forecasts. When we make a prediction and it turns out wrong, most of us don't remember that we predicted something different. In fact, the hindsight bias prompts us to believe we'd gotten it right all along. We also hail people who make predictions that turn out right, whether in the stock market
Starting point is 00:35:01 or politics or sports, but that keeps us from seeing the role of luck. Many people who get the right answer are just lucky, and some people who get it wrong are just unlucky. Over time, the laws of probability mean luck can only take you so far. One reliable way to check if someone's success at predictions is driven by skill or by luck is to have them make lots of predictions and see how they turn out over time. A few years ago, the federal government launched such an experiment. They conducted a forecasting tournament where thousands of people logged into computers
Starting point is 00:35:37 to make thousands of predictions. As time passed, the forecast could be checked for accuracy. We were one of five academic research teams that were competing to pull together the best methods of making probability estimates of events that national security professionals cared about. What kind of questions were they asking? All over the map, quite literally. So there would be questions about violent clashes
Starting point is 00:36:05 in the East or South China Sea. There would be questions about the Syrian civil war, about Russian-Ukrainian relations, about the Iranian nuclear program, Colombian narco-traffickers, literally all over the map. If you were asked to pick someone to answer a difficult question about foreign affairs, you might turn to an Oxford-educated public intellectual who writes a column for a very important
Starting point is 00:36:29 newspaper. You probably wouldn't turn to a retiree in Nebraska who spends his time bird-watching. But Phil Tetlock says maybe you should. He was the opposite of Tom Friedman. Tom Friedman, of course, being an eminent New York Times columnist, well known for his explanations, but nobody has any idea how good a forecaster he is. And Bill Flack is an anonymous, retired irrigation specialist in Nebraska, working out of the public library or out of his home, and doing a fabulous job making probability estimates in the intelligence community's forecasting tournament.
Starting point is 00:37:14 Superforecasters like Bill Flack turn out to have some things in common. Tell me about the kinds of philosophies they have and the kinds of thinking styles that you seem to find in common among many of these super forecasters. I would say the most distinctive attribute of the super forecasters is their curiosity and their willingness to give the idea a try. And when I say the idea, I mean the idea that forecasting is a skill that can be cultivated and is worth cultivating. Because it doesn't matter how intelligent you are, or how knowledgeable you are,
Starting point is 00:37:50 if you believe that it's essentially impossible to get better at these kinds of tasks, you're never going to try and it's never going to happen. It's as simple as that. Superforecasters tend to gather information and update their beliefs in a very particular way. Phil Tetlock points to Aaron Brown, the chief risk officer of the hedge fund AQR. Before he was a big shot in finance, he was a big shot in the world of poker. He was a world-class poker player. And we quote him as saying that you can tell the difference between a world-class poker player and a talented amateur, because the world-class player
Starting point is 00:38:26 knows the difference between a 60/40 bet and a 40/60 bet. And he pauses and says, oh, maybe more like 55/45 and 45/55. So distinguishing more degrees of maybe is an important skill. Why is that? Well, the very best forecasters are well calibrated. So when they say events are 80% likely, those events happen about 80% of the time. When they say things are 90% likely, they happen about 90% of
Starting point is 00:38:52 the time. So it makes a difference how frequently you update your forecasts. If you don't update your forecasts reasonably frequently, you're going to fall out of phase with events. And that means often making adjustments that are relatively small. You suggest that forecasters should do something that doesn't seem to be very intuitive. Instead of looking at the particulars of an individual case, you say forecasters should zoom out and figure out how often something has happened historically.
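Calibration, in the sense Phil describes, can be checked from any track record of probabilistic forecasts: group past predictions by their stated probability and compare each group against how often those events actually happened. A minimal sketch (the forecaster and the data here are invented for illustration):

```python
from collections import defaultdict

def calibration_table(forecasts):
    """forecasts: iterable of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened and 0 if it didn't."""
    buckets = defaultdict(list)
    for prob, outcome in forecasts:
        buckets[round(prob, 1)].append(outcome)  # bin forecasts to the nearest 10%
    # observed frequency of the event within each probability bin
    return {p: sum(outcomes) / len(outcomes) for p, outcomes in sorted(buckets.items())}

# Invented toy track record: five "80%" calls and three "30%" calls.
record = [(0.8, 1), (0.8, 1), (0.8, 1), (0.8, 1), (0.8, 0),
          (0.3, 0), (0.3, 0), (0.3, 1)]
print(calibration_table(record))  # this forecaster's 0.8 calls came true 80% of the time
```

A well-calibrated forecaster's table sits close to the diagonal: the 0.8 bucket near 0.8, the 0.3 bucket near 0.3. In practice you need many forecasts per bucket before the comparison means much.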
Starting point is 00:39:24 So Daniel Kahneman is probably one of the greatest psychologists of the last hundred years, and he calls that the outside view. He says people rarely take the outside view when they do forecasting. They normally start from the inside and they work out, but there's a big advantage to you as a forecaster from starting with the outside view and working in.
Starting point is 00:39:47 Take another example. Let's say you're at a wedding and you're sitting next to somebody who has the bad taste to ask you, how likely do you think it is this couple is going to stay married? And you look at the person, it's bad taste and all that. You see how happy the couple is and you can see it's a joyous occasion. You say, I can't imagine these people
Starting point is 00:40:01 who are so happy together getting divorced. I think maybe a 5% chance they're going to get divorced. Now, if you'd asked that question of a super forecaster, they'd say, well, let's look at the sociodemographics of the couple. And let's see, what's the base rate of divorce within this sociodemographic group? Let's say it's 35 or 40% over the next 10 years. Okay, I think it's about a 40% chance they'll get divorced in the next 10 years. Now, that's not the end of the forecasting process.
Starting point is 00:40:27 That's just the beginning. The real value, though, of doing it this way, of starting from the outside and working in, is it puts you in the ballpark of plausibility right away. 40% is a much more plausible number than 5%. Now, then you could start adjusting the 40%. So if you discover things about the couple that suggest they really are deeply bonded to each other and they've known each other a long time and they really
Starting point is 00:40:48 understand each other and they've done great things for each other, you're going to lower your probability. If you discovered that the husband is a sociopathic philanderer, you're going to raise the probability. Those are inside-view sorts of pieces of data that would cause you to adjust. Or you might just see them having a small fight and say, well, okay, I'm going to move from 40 to 41 percent. And that's one of the interesting things about super forecasters: they do a lot of updating in response to relatively small events. And most of the news, most of the time, is what statisticians would call low-diagnosticity news.
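The outside-in process described here can be sketched numerically. One common way to make "big evidence moves the estimate a lot, small news moves it a little" precise is to work in log-odds, where each piece of evidence becomes an additive nudge to the base rate. That representation, and every number below, is an illustrative assumption, not something prescribed in the interview:

```python
import math

def to_log_odds(p):
    # probability -> log-odds
    return math.log(p / (1 - p))

def to_prob(lo):
    # log-odds -> probability
    return 1 / (1 + math.exp(-lo))

# Outside view first: anchor on the ~40% base rate for the group.
estimate = to_log_odds(0.40)

# Inside view second: strong evidence (a deeply bonded couple) is a big nudge down...
estimate -= 0.8
# ...while low-diagnosticity news (one small fight) is only a tiny nudge up.
estimate += 0.05

print(round(to_prob(estimate), 2))  # 0.24 -- still in the ballpark of the base rate
```

Working in log-odds keeps repeated small updates well-behaved near 0% and 100%, which matches the super forecaster habit of moving from 40 to 41 percent rather than lurching between extremes.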
Starting point is 00:41:23 It doesn't change things dramatically, but it does change things a little bit. And appreciating how the news gradually builds up toward one conclusion or another is a very valuable skill. I'm wondering if one reason many of us start with the inside view rather than the outside view is that just at an emotional level, that's how our minds think. You see a couple and you put yourself in the shoes of that couple and you try and imagine what's happening in their lives. We think in stories and we imagine what life must be like for that couple and we're trying
Starting point is 00:41:55 to see how that story will turn out and we're trying to follow the narrative where it leads, rather than do this much more abstract, remote process of saying, let me start with a rough estimate of how often something happens. Is there something about that that in some ways requires us to step outside this narrative frame that we often use to understand the world? I think that's right. We're quite readily seduced by stories. Another example might be, let's say I ask you how likely is it that in the next 10 years
Starting point is 00:42:26 there will be a flood in North America that kills more than a thousand people and ask you to make an estimate on that. Let's say I ask another person to make the estimate, how likely is it that there will be a flood in California that will be caused by an earthquake cracking a dam leading to a massive outflow of water. Now when I put the two questions together like that, it's pretty obvious that a flood anywhere in North America, due to any cause, has got to be more likely than a flood in California caused by an earthquake cracking a dam.
Starting point is 00:42:57 The California event is obviously a subset of the more general North American flood event. But people don't see it that way. The California earthquake dam story is more like a story. You can actually put it together in a more meaningful way, whereas a flood anywhere in North America is kind of abstract and vague. So people can transport themselves into the world,
Starting point is 00:43:17 and you can imagine it's like a movie, like a Hollywood movie playing out. And they can see it happening. Yes, I can see that happening. And that pumps up the probability, and that screws up your forecasting track record. So again, think of forecasting as a skill that can be improved with practice. When you're making a prediction, start with the base rate, the outside-in view. Beware of the risks of storytelling. Finally, amateurs make three kinds of predictions: yes, no, and maybe.
Starting point is 00:43:48 Professionals have many gradations of maybe, and they attach specific probability estimates to their predictions, allowing them to go back and learn where they went wrong. But even if you do all these things, I asked Phil how you can be really sure that predictions that turn out correct are because of good technique. You know, there's an old trick that they play in business schools to talk about the role of luck, where they divide the class into pairs and they say, you know, have a coin toss between each person, and then the winner of each of those coin tosses competes against another winner, and after 12 rounds there's one person who is declared the winner. And of course that person in a business sense might seem to be extraordinarily good, but really all that's happened is that they've happened to win 12 coin tosses in a row. They've just
Starting point is 00:44:34 been very, very lucky. How do you distinguish between people who are lucky and people who are actually very good? Well, that is indeed the $64,000 question. And it comes up in finance, too. I mean, there are some finance professors out there who would argue that the really famous super investors, Warren Buffett or Ray Dalio and people like that, are in some sense like coins that come up heads 20 or 30 or 40 times in a row. We have a lot of people competing in financial markets and they're making a lot of predictions over long stretches of time. So you're going to expect some streaks, and when we get really
Starting point is 00:45:07 streaky performance we declare we found a genius. So the skeptics would say, well, Phil Tetlock is doing essentially the same thing here. He's anointing people who are essentially lucky. So we built in a lot of statistical checks, but you know, you can never be 100% sure. One of the other critiques of Phil's work is that the kinds of questions that super forecasters are answering are not really the questions that people want answered. Most of us are interested in the big questions. Who is going to win the current standoff between the United States and Russia? Super forecasters tend to answer much more narrow questions.
Starting point is 00:45:44 Is Russia going to invade Ukraine in the next 18 months? I think that's a very fair criticism of the first generation of forecasting tournaments that we put all of our effort into improving forecasting accuracy. Now I'm not going to say the questions that people were trying to answer were trivial, but could we have made the questions more relevant to deep policy questions? I think the answer is yes. I think we should be focusing as much
Starting point is 00:46:09 on the insightfulness of the questions as the accuracy of the answers. I'm gonna ask you one final question, and this is also, I think, a potential critique of super forecasting, but it comes in the form of a forecast that I'm gonna make. The reason I think many of us make forecasts or look to prognosticators and pundits to make forecasts is that it gives us a feeling like we have a handle on the future. It gives us a sense of reassurance. And this is why liberals like to watch the
Starting point is 00:46:38 pundits on MSNBC and conservatives like to watch the pundits on Fox. A more cautious style, one that sort of says the chance that you're going to die from cancer is 65.3%, runs up against a very powerful psychological impulse we have for certainty: we actually want someone to hold our hand and tell us you're not going to die. We don't want the probability estimate. We want actually an assurance that things are going to turn out the way we hope. So here's my last question for you.
Starting point is 00:47:09 If someone advises people to do something that runs against their emotional need for well-being and reassurance, I'm going to forecast that that advice, however well-intentioned, however accurate, is likely not going to be followed by most people. What do you make of my forecast, Phil? Well, I think there's a lot of truth to what you say. I would say this. I would say people would be better off if they were more honest with themselves about the functions that their beliefs serve. Do I believe this because it helps me get along with my friends or my boss, helps me fit in, helps me feel good about myself,
Starting point is 00:47:45 or do I believe this because it really is the best synthesis of the best available evidence? But you're right, when people sit down in their living room and they're watching their favorite pundits, they're cheering for their team. It's a different kind of psychology. They're playing a different kind of game. So all I'm saying is you're better off
Starting point is 00:48:03 if you're honest with yourself about what game you're playing. Psychologist Phil Tetlock is the author of Superforecasting: The Art and Science of Prediction. Phil, thank you for joining me today on Hidden Brain. My pleasure. Hidden Brain is produced by Hidden Brain Media. Our audio production team includes Bridget McCarthy, Annie Murphy Paul, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes, and Andrew Chadwick. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor.
Starting point is 00:48:38 Our unsung hero this week is Sophia Dawkins. Sophia has spent years studying the conflict in South Sudan. She patiently helped us understand the complicated history of the region to get the nuances right. Sometimes, the most important contribution someone can make to a story is to get things out of the story. Thank you, Sophia. If you like Hidden Brain, please consider making a financial contribution to help us make the show. You can do so by going to support.hiddenbrain.org. Any amount really helps and we truly appreciate your support.
Starting point is 00:49:14 I'm Shankar Vedantam. See you soon.
