Freakonomics Radio - 50. The Truth Is Out There…Isn’t It?

Episode Date: November 22, 2011

There’s a nasty secret about hot-button topics like global warming: knowledge is not always power. ...

Transcript
Starting point is 00:00:00 Yes, life was pretty good until I got a phone call from my broker. That's Stephen Greenspan. He's an emeritus professor of psychology at the University of Connecticut. Hi, Catherine. Hey. And this is Catherine Wells. She's one of the producers on our show. Hi, Catherine.
Starting point is 00:00:20 Hi, Stephen. So you are here today with a story for us, yes? Right, a story about Stephen Greenspan. He has an interesting specialty. He's an expert in what he calls social incompetence, which we all feel. What he means is he studies why people do dumb things. Presumably that means why smart people do dumb things. Right, that included. And when he told you there that life was pretty good, what did he mean? What was so good exactly about his life? Well, it was December 2008, and he had a book coming out called Annals of Gullibility. The other thing that seemed pretty
Starting point is 00:00:55 good was his financial situation. About a year earlier, he had invested in a hedge fund that was doing pretty well. So he was getting nicely set up for retirement, too. So one day in December, he gets this first pre-release copy of the book, the gullibility book, and two days later, his broker calls. I said, how are you? He said, terrible. It's the worst day of my life. Now, this is a man who'd lost his son. So when he said it's the worst day of my life, that got my attention.
Starting point is 00:01:22 And I said, why? He said, well, Bernard Madoff just admitted he was running a Ponzi scheme. And I responded, who is Bernard Madoff and what's it have to do with me? So, Catherine, I think we can kind of smell where this is headed. Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff's Ponzi scheme. And Greenspan had no idea. He didn't even remember having heard Madoff's name. Oh, man. So the gullibility expert has been gulled. Right. Gulled in a big, ironic way. He lost $400,000. Now, this was just about a third of
Starting point is 00:01:56 his savings, so it wasn't the total end of the world. And he should get some money back eventually from settlements. But he's 70 now. He has two college-aged kids, and he'd really hoped to be retired by now. And he certainly didn't want to be remembered in this way. There was a columnist, a financial columnist in Canada, who in his blog wrote, the first Greenspan, Alan, will be remembered as the economist who didn't see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility. Ouch, right?
Starting point is 00:02:30 Poor guy. Yeah. I mean, it's ironic because Greenspan's own research shows that even the smartest people can be duped. I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars in modern dollars in the South Sea bubble. And so he wrote, I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.
Starting point is 00:02:54 In reference to losing this money? In reference to his own foolishness in putting all of his fortune at risk in something that he wasn't really, in spite of his incredible brilliance, able to really understand or adequately calculate the risk of. So in a way, you joined kind of an elite club of brilliant, informed, educated people who can be fooled. I joined the human race, basically. Like Sir Isaac Newton and the South Sea Bubble,
Starting point is 00:03:33 I knew nothing about Madoff and just basically went along with the crowd. And that's powerful. We tend to take our cues from other people, especially in situations where we don't quite know what to do. So it may no longer surprise us to learn that smart people sometimes make dumb decisions. Right. It's like Greenspan says, it's the instinct to go along with the crowd and to take our cues from other people. And that's really what today's show is about. Right. And I want to talk about something else Greenspan said, an even more elemental
Starting point is 00:03:59 issue, which is how we make decisions about a risk that we just aren't equipped to calculate. But here's the thing. If it can't be calculated, then maybe it's not exactly a risk. About 100 years ago, the economist Frank Knight argued that risk and uncertainty are nearly identical, but for one key difference. Risk can be measured. Uncertainty by its nature cannot. But what happens when you can't tell the two of them apart? From WNYC and APM American Public Media, this is Freakonomics Radio. Today, the truth is out there, isn't it? Here's your host, Stephen Dubner. So Stephen Greenspan, the gullibility expert, loses a third of his life savings in what turns out to be a Ponzi scheme.
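An aside on Knight's distinction, since it is exactly the problem Greenspan faced: a measurable risk can be plugged into an expected-value calculation, while Knightian uncertainty gives you no probabilities to plug in at all. Here is a minimal sketch of the difference; it is not from the episode, and the numbers are invented for illustration.

```python
# A minimal sketch of Frank Knight's risk-vs-uncertainty distinction.
# Illustrative only; the numbers are invented, not from the episode.

def expected_value(outcomes):
    """Expected payoff of a gamble whose probabilities are KNOWN (Knightian risk)."""
    return sum(p * payoff for p, payoff in outcomes)

# Risk: a fair coin flip paying $100 or nothing. The distribution is
# known, so the risk can be measured.
coin_flip = [(0.5, 100.0), (0.5, 0.0)]
print(expected_value(coin_flip))  # 50.0

# Uncertainty: a fund whose true mechanics are hidden (say, a Ponzi
# scheme you don't know is a Ponzi scheme). No credible probabilities
# exist, so there is nothing to pass to expected_value() at all.
# That is Knight's point: the calculation cannot even be set up.
```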
Starting point is 00:05:07 Now, even if you feel sympathetic toward him, you might say, hey, you know, he's just one person. Bad things happen to people every day. At least the world didn't end. But what if we were worried about something that might end the world? No, I'm not talking about an alien invasion. Not yet, at least. That'll come later in the program. I'm talking about climate change. How are people like you and me supposed to calculate the threats from something like climate change? There's so much complexity, so much uncertainty.
Starting point is 00:05:46 So most of us do what Stephen Greenspan did when he was looking to invest. We take our cues from other people. It's not a question of debate. It's like gravity. It exists. The reason that you know you're right is that you know things they don't know. And because they don't even have that baseline of knowledge to chat with you, they can't even understand where you're coming from.
Starting point is 00:06:10 And that's exactly how I feel talking to people who believe this global warming crap. The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers, and rising sea level providing the proof. When the University of Madison, Wisconsin comes out with their definitive study, do I believe that? No. Do I believe scientists? No. They lied to us about global warming. Who do you believe? Who do you believe? That was Glenn Beck, by the way. Before him, from the top, you heard Al Gore and then Rush Limbaugh and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty and emotion, who do you believe?
Starting point is 00:06:55 And more important, why? You know, my personal perception is that I don't know enough about it, believe it or not. This is an issue that I think – Wait, could you just say that again so everyone in the world can hear an honest response? It's so rare for some version of I'm not quite sure or I don't know. So, sorry, say it again and then proceed. What I was saying, I'm not sure exactly what I believe on it in terms of the risk perceptions of climate change. It's something that I don't think I am personally educated on enough to have a really firm opinion about that. That was Ellen Peters.
Starting point is 00:07:31 She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues like nuclear power and gun control, and then they try to figure out how much those views are shaped by people's cultural values. That is, not by empirical evidence, but by what they call cultural cognition. So they recently did a study on climate change. How was it, they wanted to know, that the vast majority of scientists think the earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be perhaps that people just don't trust scientists? Here's Dan
Starting point is 00:08:19 Kahan. He's another cultural cognition researcher and a professor at Yale Law School. Well, in fact, scientists are the most trusted people in our society. The Pew Foundation does research on this. This has been a consistent finding over time. Okay, so there goes that theory. That explanation won't work for us then. Correct. All right, so maybe people just don't understand the science. Surveys have found that fewer than 30% of Americans are scientifically literate.
Starting point is 00:08:49 Ellen Peters again. People have the belief that the reason that people don't believe the risks of climate change are high enough is because they're not smart enough. They're not educated enough. They don't understand the facts like the scientists do. And we're really interested in that idea and whether that's really what was going on or whether something else might matter. So Peters and Kahan started out their climate change study by testing people on their scientific literacy and numeracy, how well they knew math. And the items are things like it is the father's gene that decides whether the baby is a boy or a girl, true or false.
Starting point is 00:09:24 True. So fairly simple. Is it true? You know, actually, I'm not even positive on that one. I think it's the, oh, no, it has to be the father's gene. I'm putting my money on father true. Father is true there. Absolutely.
Starting point is 00:09:36 Second question, antibiotics kill viruses as well as bacteria. Negative. That one is absolutely false. You can see why they wanted to know how people did on these kind of questions before asking them about climate change. Numeracy in general, what it should do is it should help you to better understand information, first of all. And that kind of comprehension is sort of a basic building block of good decisions across a variety of domains. Right, right. But numeracy should also do other things.
Starting point is 00:10:03 It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts. All right. That makes perfect sense. But you have found something that kind of flies in the face of that, haven't you? We have. It's the idea that people who are highly numerate and highly scientifically literate, they seem to actually rely on preexisting beliefs on these sort of underlying cultural cognitions they have about how the world should be structured more than people who are less scientifically literate or less numerate.
Starting point is 00:10:40 So if I wanted to be wildly reductive, I might say that the more education a culture gets, the more likely we are to have intense polarization, at least among the educated classes. Is that right? Based on our data, that's what it looks like. It's so interesting and so disturbing at the same time. It is interesting, isn't it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other. That is, to be either very, very worried about the risks of climate change or to be almost not worried at
Starting point is 00:11:33 all. In this case, more knowledge led to more extremism. Now, why on earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know. My activities as a consumer, my activities as a voter, they're just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax, and these are colleagues of mine, the people in my community, that's going to have an impact on me. They're going to form a certain kind of view of me because of the significance of climate change in our society,
Starting point is 00:12:31 probably a negative one. Now, if I live, I don't know, in Sarah Palin's Alaska or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won't go as well. People who are science literate are even better at figuring that out, even better at finding information that's going to help them form, maintain a view that's consistent with the one that's dominant within their cultural group. So you're saying that if I believe that climate change is a very serious issue, and I want to align my life with that belief, that it's actually more important that I align my life with that belief, not because of anything I can do, but because it helps me fit in better in my circle.
Starting point is 00:13:14 There's more currency to my belief there. What about you? You're in New Haven, Connecticut at Yale. I gather you haven't walked into a classroom and publicly declared that you believe climate change or global warming is a hoax. Have you? No, I haven't done that. This makes sense, doesn't it? But it's also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs, even about something as scientifically oriented as climate change, are driven by a psychological need to fit in. And so we create strategies for doing this.
Starting point is 00:14:06 Here's my Freakonomics friend and co-author Steve Levitt. One of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources, which will confirm what they'd like to believe in the first place. And so the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believed before you ever started.
Starting point is 00:14:37 Aha. So if you're already scared of something, you tend to read more about how scary it is. And if you're not worried, then you don't worry, right? So if there's one thing that human beings are terrible at, it's assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So my wife is terrified of snakes, mice, flies, butterflies,
Starting point is 00:15:08 everything small that flies or that runs she's terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean, the things that you should be afraid of are french fries and double cheeseburgers and getting too much sun for skin cancer. Those are the kind of things that really end up killing us in the modern world. Coming up, since we're so bad at figuring out what's really dangerous, let's bring in the professionals, shall we?
Starting point is 00:15:43 I'm Mr. Skeptic. Anything that can be analyzed critically and skeptically, that's what we do. And a cautionary tale about siding with the conspiracy theorists. I think somebody actually thought I was an alien myself. From WNYC and APM American Public Media, this is Freakonomics Radio. Here's your host, Stephen Dubner. So as Steve Levitt sees it, we seek out information that confirms our pre-existing biases. And we're congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here's Levitt again. To know what to be afraid of, you need to go
Starting point is 00:16:45 through an in-depth data collection process. You need to be properly informed. And people are too busy, rightfully too busy, leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. And so it's sensible for people to be uninformed, and it's sensible to rely on the media. It just turns out that the media is not a very good source of information. If you really wanted to make sure that every one of your beliefs was worth holding, you'd have to spend so much time gathering primary data that you'd have no time for anything else in life. You'd practically have to become a professional skeptic. And that's not a job, is it?
Starting point is 00:17:33 Yeah, I'm Mr. Skeptic. Anything that can be analyzed critically and skeptically, that's what we do. So anything from UFOs and alien abductions to Bigfoot and conspiracy theories, all the way up to things like global warming and climate change and autism and vaccinations, and we cover it all. Michael Shermer, a professor at Claremont University, has a master's degree in experimental psychology and a PhD in the history of science. He's also the publisher of Skeptic magazine, and he writes books. His latest is called The Believing Brain. Now, as a professional skeptic, I'm guessing a lot of people look at you
Starting point is 00:18:16 or hear about a guy like you or read a book by you and think, oh, man, that's like the dream job. You know, people think, well, I'm a skeptic. I don't believe anything. So what do you have to do to be you, Michael? Well, we actually do believe all sorts of things, and you have to have all sorts of beliefs just to get out of bed in the morning. And so the question then becomes, well, which of your host of beliefs are the ones that are really supported by evidence or questionable or probably not true? And which are those that we base on
Starting point is 00:18:46 instinct and intuition? And which are we basing on, you know, solid evidence? And so that's where the rubber meets the road is not do you believe something or not? Of course, we all believe all sorts of things. The question is, are they true? And what's the evidence? And what's the quality of the evidence? Talk to me about how we end up believing what we believe in. I was going to say how we choose to believe what we believe in, but it sounds like it's not really a choice, right? It isn't really a choice, no. Our brains are designed by evolution to constantly be forming connections,
Starting point is 00:19:17 patterns, learning things about the environment, and all animals do it. You think A is connected to B, and sometimes it is, sometimes it isn't, but we just assume it is. So my thought experiment is imagine you're a hominid on the plains of Africa three and a half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it's just the wind, you've made a type 1 error in cognition, a false positive.
Starting point is 00:19:43 You thought A was connected to B, but it wasn't. But no big deal. That's a low-cost error to make. You just become a little more cautious and vigilant, but that's it. On the other hand, if you think the rustle in the grass is just the wind and it turns out it's a dangerous predator, you're lunch. Congratulations, you've just been given a Darwin Award for taking yourself out of the gene pool before reproducing.
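A quick aside on the arithmetic behind Shermer's point, which is not spelled out in the episode: when a false negative (ignoring a real predator) costs far more than a false positive (fleeing from the wind), fleeing at every rustle minimizes expected cost even if predators are rare. A back-of-the-envelope sketch, with invented costs and probabilities:

```python
# Back-of-the-envelope sketch of the asymmetric-error-cost argument.
# Costs and probability are invented for illustration, not from the episode.

p_predator = 0.01             # rustles are almost always just wind
cost_false_positive = 1.0     # fleeing from wind: a little wasted effort
cost_false_negative = 1000.0  # ignoring a real predator: you're lunch

# Expected cost per rustle under each policy:
always_flee = (1 - p_predator) * cost_false_positive  # ~0.99
never_flee = p_predator * cost_false_negative         # 10.0

print(f"always flee: {always_flee:.2f}  never flee: {never_flee:.2f}")
# Even at 1% odds of a predator, fleeing every time is roughly 10x
# cheaper on average, so selection favors over-detecting patterns.
```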
Starting point is 00:20:02 So we are the descendants of those who were most likely to find patterns that are real. And we tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so that's the basis of superstition and magical thinking. But then we get to something like climate change, which is theoretically an arena bounded entirely by science, right? You would think so. I mean –
Starting point is 00:20:27 Yeah, you would think so. So what do we find? Either the earth is getting warmer or it's not, right? I mean it's just a data question. Well, because it also has ideological baggage connected to it, left-wing versus right-wing politics. And so the data goes out the window. It's like whatever the data is, I don't know, but I'm going to be against it now.
Starting point is 00:20:44 I can't just say I'm against it because my party is, or I just do what other people tell me. Nobody says that. What you do is you make the decision, I'm skeptical of that, or I don't believe it. And then you have to have arguments. So then you go in search of the argument. It doesn't sound like it surprises you at all then that education, level of education doesn't necessarily have a big impact on whether you're pro or con something, correct? That's right. It doesn't. And giving smart people more information doesn't help. It actually just confuses things.
Starting point is 00:21:14 It just gives them more opportunity to pick out the ones that support what they already believe. So being educated and intelligent, you're even better at picking out the confirming data to support your beliefs after the fact. Let's talk now for a bit about conspiracy theories, which we're nibbling around the edges of. How would you describe, if you can generalize, the type of person who's most likely to engage in a conspiracy theory? Their pattern-seeking module is just wide open. The net is indiscriminate. They think everything's a pattern. If you think everything's a pattern, then you're kind of a nut. I suppose I'm best known for having had a job at the government where my duties were investigating UFOs. That's Nick Pope. Until 2006, he worked for the British Ministry of Defence. And in the early 90s, he headed up the office that handled reports of UFO sightings.
Starting point is 00:22:13 Flying saucer sightings, as they were called then. His job was to figure out if any of these sightings had merit and if, perhaps, there were extraterrestrial visitors. To satisfy ourselves that there was no threat to the defense of the UK. Pope came into the job as a skeptic. But some UFO reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence, but Pope started talking in the media about the possibilities. You say that you believe with 99% certainty that we're not alone. So tell us what you've discovered.
Starting point is 00:22:57 Well, I think it's inconceivable in this infinite universe that we're alone. And then that begs the question, if we're not alone, are we being visited? It's a related question. When I started speaking out on this issue, I think some people in the UFO community thought that I might be some sort of standard bearer for them. Meaning one of them? Yes, absolutely. That I could be a spokesperson for the movement. Of course, I had the huge advantage that whilst everyone else had done this as a hobby, I'd done it as a job. Did that make you a bit of a hero in the UFO community?
Starting point is 00:23:32 It did, and a lot of people still hold that view. They want me to come out and say, yes, it's all real, and yes, I was part of a cover-up. Their fantasy is what they call disclosure with a capital D, as if there's going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, you know what, I don't think that spaceship exists. So in a sense, I managed to upset everyone. I go too far for a lot of the sceptics by being open to the possibility, but I don't go far enough for the believers, particularly the conspiracy theorists. And I
Starting point is 00:24:08 get called things like shill. And that's one of the more polite things I've been called. Yeah. I've looked at some of the comments on YouTube from a speech you gave. I'll read you a bit of it. We'll have to employ our bleeping technician later. Nick Pope, what a f***ing spastic. He, quote, works for the government. Why else is he constantly on every bloody UFO program on every f***ing channel? He talks enough bulls*** to keep the UFO nutters happy while never actually saying anything of importance.
Starting point is 00:24:38 Let's unpack that one a little bit. Shall we, Mr. Pope? Yes. It says you, quote, work for the government. Do you still work for the government? No, I don't. This is in itself one of the great conspiracy theories that in 2006 I didn't really leave. I just went under deep cover and that they're passing me wads of banknotes in a brown paper bag or something.
Starting point is 00:24:59 But here's my favorite. There's one claim on a UFO blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it. Well, yes, I've heard that one. I've even seen one which I think you might have missed. I think somebody actually thought I was an alien myself. Ah, that would explain a lot, wouldn't it? Nick Pope discovered a sad truth. The more transparent he tried to be, the more information he released about himself and his work, the more worked up his attackers became.
Starting point is 00:25:45 They took facts that would plainly seem to work in his favor, and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe and then finding evidence for it. So what's the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here's Ellen Peters again from the Cultural Cognition Project. So I guess the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism. I think that we are seeing some evidence for that in this study, but I don't think that that has to be the final answer. I think that policymakers, communicators need to start paying attention to some of these cues that deepen cultural polarization.
Starting point is 00:26:44 So, for example, telling the other side that they're scientifically inept, probably a bad idea, probably not the best way to continue people coming together on what the basic science really does say, or coming up only with solutions that are antagonistic to one side. And you know it if you're listening to them, that those are just antagonistic solutions. Again, probably not the best idea. It's a sign or a signal that we're not listening maybe as well to beliefs on the other side. Dan Kahan agrees that whatever the solution, none of us are able to go it alone. What's clear is that our ability to acquire knowledge is linked up with our ability to
Starting point is 00:27:22 figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. Maybe you know that the motto for the Royal Society is Nullius in verba, which means don't take anybody's word for it. And it's kind of admirable and charming, but obviously false. Not very practical, is it? Can't be right. I mean, what would I do? I would say, you know, don't tell me what Newton said
Starting point is 00:27:57 in the Principia. I'm going to try to figure out how gravity works on my own. And speaking of Isaac Newton, you remember what Stephen Greenspan told us earlier, how Newton was suckered into this terrible investment? Well, it's heartening to learn that even Newton, the scientific sage, was able to acknowledge the flaws, the shortcomings, in his own thinking. And he left behind some advice that might be helpful for us all. He wrote, to explain all nature is too difficult a task for any one man
Starting point is 00:28:41 or even for any one age. It is much better to do a little with certainty and leave the rest for others that come after you, than to explain all things by conjecture without making sure of anything. In other words, don't get too cocky. Freakonomics Radio is produced by WNYC, APM, American Public Media, and Dubner Productions. This episode was produced by Catherine Wells. Our staff includes Suzie Lechtenberg, Diana Huynh, Bourree Lam, Collin Campbell, and Chris Bannon. Our interns are Ian Champ and Jacob Bastian. David Herman is our engineer. Special thanks to John DeLore.
Starting point is 00:29:40 If you want more Freakonomics Radio, you can subscribe to our podcast on iTunes or go to Freakonomics.com where you'll find lots of radio, a blog, the books, and more.
