Hidden Brain - The Vegetable Lamb

Episode Date: January 22, 2019

We like to think that science evolves in a way that is...rational. But this isn't always the case. This week, we look at how information and misinformation spread in science. ...

Transcript
Starting point is 00:00:00 This is Hidden Brain, I'm Shankar Vedantam. During the Middle Ages, word spread to Europe about a peculiar plant found in Asia. This plant had a long stalk with heavy pods attached. When you cut those pods open, inside you would find a tiny little lamb, complete with flesh and wool, like a live animal. This creature, half plant, half animal, came to be known as the vegetable lamb of Tartary. Various travel writers wrote that they had either heard about this or that they
Starting point is 00:00:43 had eaten one of these lambs. And many of them said they had seen the kind of downy wool from the lamb. When these narratives made their way to Europe, people felt they had a view of a different world. Of course, no one in Europe had ever seen the vegetable lamb of Tartary, because there was no such thing. But for centuries, people kept talking about this fantastical creature as if it were real. It even came up in scholarly works, right next to pictures of oak trees and rabbits. If people hadn't been telling each other about these things, nobody would believe that there were vegetable lambs, because nobody had ever seen them, right? And this is by no means a unique happening at that time.
Starting point is 00:01:37 In our time, of course, we would never fall for vegetable lambs. We live in an era of science, of evidence-based reasoning, of calm, cool analysis. But maybe there are vegetable lambs that persist even today, even among highly trained scientists, physicians, and researchers. Maybe there are spectacularly bad ideas that we haven't yet recognized as spectacularly bad. This week on Hidden Brain, we're going to look at how information and misinformation
Starting point is 00:02:15 spread in the world of science, and why evidence is often not enough to convince others of the truth. Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how information, both good and bad, can pass from person to person. She is co-author, with James Weatherall, of the book The Misinformation Age: How False Beliefs Spread. Cailin, welcome to Hidden Brain. Oh, thank you for having me. So, one of the fundamental premises in your book
Starting point is 00:02:59 is that human beings are extremely dependent on the opinions and knowledge of other people. And this is what creates channels for fake news to flourish and spread. Let's talk about this idea. Can you give me some sense of our dependence on what you call the testimony of others? So one reason we wrote this book is that we noticed that a lot of people thinking about fake news and false belief were thinking about problems with individual psychology: the way we have biases in processing information, the fact that we're bad at probability.
Starting point is 00:03:34 But if you think about the things you believe, almost every single belief you have has come from another person. And that's just where we get our beliefs because we're social animals and that's really wonderful for us. That's why we have culture and technology, you know, that's how we went to the moon. But if you imagine this social spread of beliefs as opening a door when you open a door for true beliefs to spread from person to person. You also open the door for false beliefs to spread from person to person. So it's this kind of double-sided coin. And what's
Starting point is 00:04:14 interesting of course is that if you close the door you close the door to both and if you open the door you open the door to both. That's right so if you want to be social learners who can do the kinds of cultural things we can do, it has to be the case that you also have to have this channel by which you can spread falsehood and misinformation to. So as I was reading the book, I was reflecting on the things that I know, or the things that I think I know, and I couldn't come up with a good, good answer for how I actually know that it's the earth that revolves around the sun and not the other way around.
Starting point is 00:04:47 Yeah, that's right. 99% of the things you believe probably you have no direct evidence of yourself. You have to trust other people to find those things out, get the evidence, and tell it to you. And so one thing that we talk a lot about in the book is the fact that we all have to ground our beliefs in social trust. So we have to decide what sources and what people we trust, and therefore what beliefs we're going to take up, because there's just this problem where we cannot go verify everything that we learn directly.
Starting point is 00:05:26 We trust the historian who teaches us about Christopher Columbus. We trust the images from NASA showing how our solar system is organized. Now we say we know Columbus was Italian and we know the earth revolves around the sun. But really what we mean to say is is we trust the teacher and we trust NASA to tell us what is true. And the social trust and ability to spread beliefs, I mean it's remarkable what it's let humans do. You know, no other animal has this ability to sort of transfer ideas and knowledge
Starting point is 00:06:02 dependably from person to person over generation after generation to accumulate that knowledge but you do just see sometimes very funny examples of false beliefs being spread in the same way. Now many of us believe there is a way to separate fact from fiction. Science. But as Kaelin points out, science rarely offers permanent truths. Well, first, I would say that in the book, we really encourage trust in science. It's not a book trying to undermine scientific knowledge.
Starting point is 00:06:41 But if you look at the history of science, there have been a lot of examples of cases where people believe something and then they discovered that that wasn't true. So one classic example is the Myasmatheria of Disease. So before we had the germ theory of disease, everyone thought diseases were caused essentially by bad vapors in the air and that you would have these bad vapors near swamps, for example. But this led to all sorts of problems, of course, for diagnosing various medical issues. If you believe that illness is coming from bad vapors in the air and there's a cholera outbreak, you're not going to go check the local well to see if there's some kind of, you know, germ or bacteria in there. So that's one kind of example. Of course, there have been really dramatic changes in
Starting point is 00:07:34 the way we understand the physical world. So Aristotle believed that things fall to the ground because they have the element of earth in them and they're trying to go to their natural position in the center of the earth. Newton argued, no, they fall to the ground because there's a force of gravity that happens between any two massive bodies and it pulls them together. Now we don't believe in that force anymore. We trust Einstein's theory of general relativity, which is that we're all on a curved space time. And when something goes to Earth, it's moving along its natural trajectory in that curved space time.
Starting point is 00:08:12 As a philosopher of science, Kaelin studies how scientists communicate and share information. If we rely on scientists to tell us what to believe, who do they rely on? Turns out, other scientists. Now showing that this is the case isn't easy. The process by which scientists change their minds on questions such as the spread of disease or the movement of objects through space is very complex. Studying this complex process can be mind-boggling. Say, for instance, Dr. A talks to Dr. B one day about her research. It also turns out that Dr. B is collaborating
Starting point is 00:08:51 with Dr. C, who recently met Dr. D at a conference. Now, Dr. D frequently reads Dr. A's papers, but doesn't know about Dr. C's research. A couple of years later, Dr. E reads what Dr. B has written about what Dr. A said in an article that Dr. C's research. A couple of years later, Dr. E reads what Dr. B is written about what Dr. A said in an article that Dr. C cited before Dr. F had even published her results. Imperically, it's hard to study scientists because things like theory change will happen over the course of 10 or 20 years and involve thousands and thousands of interactions between different scientists.
Starting point is 00:09:29 How would you ever study that? How would you ever study that? Because Kaelin can't follow all these interactions, she recreates them in a computer simulation. You'd want to think of it as a really kind of simplified representation of what's happening in the real world. She creates groups of fictional scientists and she gives them a series of rules like who they can talk to and who they trust.
Starting point is 00:09:55 These simulated scientists collect data and discuss their simulated research. Kaelin sits back and watches what happens. Even if you look at completely idealized agents, so you would think of these as simple representations of totally rational scientists or totally rational people testing the world that sometimes they do end up, you know, coming to a false belief about the world, even though they're able to experiment in the model, and they're able to draw really good inferences based on those experiments. Now, one factor in this is that sometimes in the model what you have is spurious results. So if you think about scientific data, usually it's
Starting point is 00:10:45 So if you think about scientific data, usually it's equivocal. You know, it doesn't just tell you what the answer is. If it did, we wouldn't have to do science on it. Instead, it's probabilistic. You have to use statistics to figure out what's true. So one thing we find sometimes in these models is that one agent or scientist will get data supporting the false belief. They'll share it with the entire community of scientists. And then everyone will come to all believe the false thing at once
Starting point is 00:11:11 and sort of ignore a better theory. And part of what happens there is this social spread of knowledge and belief causing everyone to turn away from a good theory. So if you have almost too much social influence within a community, that can be really bad, because everyone can stop gathering data since the entire community is exposed to the same spurious results. You know, we've talked on Hidden Brain about the psychological reasons people sometimes believe in fake news.
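To make this concrete, here is a minimal sketch, in Python, of the kind of network model described above. It is not the authors' actual code; the agent count, success rates, and trial sizes are all illustrative assumptions.

```python
import random

# A minimal sketch (not the authors' actual code) of the kind of model
# described above: fictional scientists test a new theory, share their
# data with the whole community, and update their credences by Bayes'
# rule. All parameters here are illustrative assumptions.

N_AGENTS = 10            # fully connected community of simulated scientists
P_OLD, P_NEW = 0.5, 0.6  # true success rates: the new theory really is better
TRIALS = 20              # experiments each believer runs per round

def simulate(rounds=50, seed=None):
    rng = random.Random(seed)
    # Each agent's credence that the new theory is the better one.
    credences = [rng.random() for _ in range(N_AGENTS)]
    for _ in range(rounds):
        # Agents who lean toward the new theory test it and share results.
        shared = [sum(rng.random() < P_NEW for _ in range(TRIALS))
                  for c in credences if c > 0.5]
        # Everyone sees all shared data (maximal social influence) and
        # updates by Bayes' rule between the two hypotheses.
        for s in shared:
            like_new = P_NEW ** s * (1 - P_NEW) ** (TRIALS - s)
            like_old = P_OLD ** s * (1 - P_OLD) ** (TRIALS - s)
            credences = [c * like_new / (c * like_new + (1 - c) * like_old)
                         for c in credences]
    return credences

# Most runs end with every credence near 1 (a true consensus), but an
# unlucky early string of spurious results can push the whole community
# below 0.5, after which no one tests the new theory again: a false
# consensus locked in by social influence.
print([round(c, 3) for c in simulate(seed=3)])
```

The fully connected network here is the extreme case of "too much social influence": because every agent sees the same data, one misleading batch of results can swing the entire community at once, which is exactly the failure mode described above.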
Starting point is 00:11:44 We've talked about irrationality and biases and tribalism. We've featured cognitive scientists like Tali Sharat and Danny Kahneman. If I hear you correctly, what you're saying is that psychological factors can have an effect, but you can have the spread of bad information even in the absence of biases or stupidity. Yeah, so one way that the models we look at are really useful, is that you can kind of pair away things that are happening in the real world and see, well, suppose we didn't have any psychological biases, suppose we were perfectly rational, would we always come to the right answer in science and in our day-to-day lives and see that the answer is no?
Starting point is 00:12:27 day-to-day lives and see that the answer is no. Coming up, how the real world compares to the models that Kaelin bills in her lab. We explore case studies from science that show how good information can sometimes fail to spread, even as bad information metastasizes. Mathematician and philosopher Kailin Okhner studies how information spreads through social networks. People who know and trust one another efficiently pass information back and forth and learn from one another efficiently pass information back and forth and learn from one another. Unfortunately, the same rules of social trust can sometimes be a roadblock for the truth.
Starting point is 00:13:15 Mary Wartley Montague learned this lesson hundreds of years ago. She was an English aristocrat who found herself living for a while in what is modern day Turkey. Mary Montague seems to have been really enchanted by Turkish culture. You know, she was coming from England and aristocratic culture there. In Turkey, she discovered these beautiful shopping centers, bath houses.
Starting point is 00:13:40 She seems to have been enchanted by bath houses where there would be a lot of women sort of lounging, naked, going in the hot water, drinking hot drinks together. Another thing that struck Mary about Turkish women, they used an innovative technique to limit the spread of smallpox. It was called Variolation. What this involved, I mean, it's a bit like vaccination now. You would scratch maybe the arm of a patient
Starting point is 00:14:11 and take pus from a smallpox postual and put that pus into the scratch. So what would happen after you did that is that the patient would get a very mild smallpox infection. Some small percentage of patients would die, but many, many fewer than who would die of an actual smallpox infection. And after they had that more mild infection, they would actually be immune to smallpox. So this was practiced commonly in Turkey, basically unheard of in England at the time. Mary Montague had
Starting point is 00:14:47 herself had smallpox and survived when she was younger. She had lost a brother to smallpox. And so when she encountered variolation in Turkey, she decided, well, you know, why don't we do this in England? She had her own son very elated and she decided she was gonna try to spread this practice in her native country. So when she returns to Britain, in some ways Mary Montague here functions like one of your agents in your computer models
Starting point is 00:15:16 because you have one cluster over here in Turkey and one cluster over here in Britain. And essentially you have an agent walking over from Turkey to Britain. And Mary Mont you have an agent walking over from Turkey to Britain. And Mary Montague says, here's this wonderful idea. We can limit the spread of smallpox in Britain. Britain, in fact, at the time was actually facing a smallpox crisis.
Starting point is 00:15:36 How are her ideas received? So her ideas were not received very well when she first came back. One thing we talk a lot about in the book is that almost everyone has what you might call a conformist bias. We don't like to publicly state things that are different from the people in our social networks. We don't like to have beliefs that are different from the people around us.
Starting point is 00:16:02 It's somehow very socially uncomfortable to do that. And we don't like our actions to not conform with the people who we know and love. So when she got back to England, it was already the case that all these physicians in England didn't believe in variation. They thought this was a crazy idea, and none of them were going to stand out from the
Starting point is 00:16:25 pack of physicians and say, yeah I'm the person who's gonna try this or going to believe that this practice works because they were all busy conforming with each other. And of course these ideas were coming from another country, a country with very different cultural practices that seemed in some ways very foreign. The idea and the country itself seemed very foreign. The idea and the country itself seemed very foreign. That's right. So it's not just that it's a weird new idea that none of them believe in their kind of in-group. It's also that it's coming from Turkey and furthermore it's coming from women in Turkey. So it was a practice mostly done by women and a woman is bringing it to England as well.
Starting point is 00:17:07 So they also don't really trust her as a woman and someone who's not a physician. So social trust is a really important aspect in understanding how people form beliefs. Because we can't go out and figure out ourselves, whether the things people tell us are true, usually we just always have to decide who to trust. And people have little shortcuts in how they do this. They tend to trust those who are more like them. They also tend to trust those who share beliefs
Starting point is 00:17:41 and values and practices with them. So for example, if you are a physician, you might tend to trust a physician. If you believe in homeopathy, you might tend to trust someone who believes in homeopathy. We all use these kinds of tricks. So what we saw in the variation case with Mary Montague,
Starting point is 00:18:04 the physicians aren't going to trust this woman who doesn't share their beliefs and practices, who isn't much like them. Now you could argue that the physicians who rejected Mary Montague's ideas were not behaving like real scientists. They weren't being dispassionate. They weren't being objective. Theyate, they weren't being objective. They were bringing psychological biases into the picture.
Starting point is 00:18:29 Sexism, xenophobia, tribalism. In the real world, misinformation spreads because of some combination of network effects and psychological incognitive biases. You see the same thing in the case of the Hungarian physician Ignat Semmelweis. He was an insider, a man, and a doctor. He even had the assistance of scientific evidence to support his claims. But it turned out even these were not enough to overcome the barriers that confront the truth. Ignat Semayas was a physician living in Vienna. He was put in charge of this clinic, the first obstetrical clinic in Vienna.
Starting point is 00:19:11 Next door was the second obstetrical clinic of Vienna. He was in charge of training new doctors and obstetrics, and at the second clinic they were training midwives. And shortly after he took over, he realized that something really terrible was going on because in his clinic, 10% of the women were dying mostly of childbed fever. While the midwives next door, who presumably, they would have thought they were less expertise, only 3 to 4% of their patients were dying. So, Somalwise was obviously really worried about this. He had patients who would be begging on their
Starting point is 00:19:51 knees to be transferred to the other clinic. He had this kind of breakthrough moment when a colleague of his was conducting an autopsy and accidentally cut himself, and then shortly thereafter he died of something that looked a lot like childbed fever. Some of us realized, well, I've got all these physicians who are conducting autopsy's on cadavers and then immediately going and delivering babies. And he thought, well, maybe there's something transferred on their hands, and he called this cadaverist particles.
Starting point is 00:20:24 Of course, now we know that that is bacteria, but they didn't have a theory of bacteria at the time. So he started requiring the physicians to wash their hands in a chlorinated solution, and the death rate in his clinic dropped way down. And of course, the way we think about science, we say, all right, we've someone's discovered something wonderful. Everyone must have instantly adopted this brilliant new idea.
Starting point is 00:20:48 You would think, right? And he has this wonderful evidence, right? It was 10%. He introduced the practice goes down to 3%. But that's not what happened. So he published his ideas. And the other gentleman physicians did not take them up. In fact, they found them kind of offensive.
Starting point is 00:21:06 They thought, this is, you know, he's writing that we have dirty hands. We have unclean hands, but in fact, we're gentleman. They also thought it was just really far out of the range of theories that could possibly be true. So, they didn't believe him despite the really good evidence and the deep importance, you know, people's lives were really at stake. And it took, I mean, decades for his handwashing practice to actually spread.
Starting point is 00:21:36 In fact, I understand that Semmelweis himself eventually suffered a nervous breakdown. How did his own story end? So the way the story goes, though this is a little hard to verify, is that he was so frustrated that people weren't adopting his handwashing practice that he had a nervous breakdown as a result. He was put into a V&E's mental hospital where he was beaten by guards and died of blood poisoning a few weeks later. We've seen how being an outsider or breaking with tradition can be barriers to the spread of good scientific information.
Starting point is 00:22:21 But you could argue that these examples were from a long gone era of gentlemen physicians and amateur scientists. But even in the modern day of science where researchers demand hard evidence to be convinced, it turns out that false, inaccurate and incomplete information can still take hold. In 1954, ED Palmer published a paper that changed how doctors thought about stomach ulcers. So what he did was looked at a lot of stomachs. I believe somewhere in the range of a thousand. And he found that there were no bacteria whatsoever in the stomachs that he
Starting point is 00:23:03 investigated. A lot of people at that time had been arguing over whether stomach ulcers were caused by stomach acid or some kind of bacteria. This was taken as really decisive evidence showing that, okay, well, it can't be bacteria because everyone thought Palmer's study showed there are no bacteria in stomachs, so it absolutely must be stomach acid. And of course, in this case, Palmer was not trying to fabricate his data or make up data. He was sincerely arriving at what he thought was a very good conclusion. That's right.
Starting point is 00:23:36 And it seems that it just was a problem with his methodology. Of course, there are bacteria in our stomachs. He just didn't see them because of the way he was doing his particular experiment. This was not a fabrication at all. One of the things that's interesting about this episode involving Palmer and the Stomach ulcers is that as individuals essentially came over to believe what Palmer was telling them, there was a consensus that started to grow. And as each new person added to the consensus, it became a little bit stronger, which made it even harder to challenge.
Starting point is 00:24:10 Yeah, so although they had been arguing for decades about whether ulcers were caused by acid or by bacteria, at this point, people started to share palm-ish results pretty much everybody saw them. And this consensus was arrived at okay it's acid and everyone who had been studying the possibility that bacteria caused stomach ulcers stopped studying that and many people turned to looking at okay how can we treat stomach acid in order to treat ulcers. When Australian physician Barry Marshall came along a few decades later to challenge this theory, he was met with stony face resistance. He couldn't get his articles published.
Starting point is 00:24:56 Scientists sniped at him behind his back, even though, as it turned out, his data was far better than E.D. Palmer's stomach studies. Coming up, what Barry Marshall did to fight misinformation, and what we can learn from his story about how to spread the truth. The Australian physician Barry Marshall tried and failed for years to convince doctors that stomach ulcers were caused by bacteria. Like Ignaz Semmelweis, he found that mere evidence was no match for conventional wisdom. People were bleeding in my practice and dying from ulcers in my hospital. I could see it. I underwent a baseline endoscopy. I drank the bacteria, ten to the ninth colony-forming units.
Starting point is 00:26:07 Then I had this vomiting illness, no acid present in my vomit. And when I vomited early in the mornings, still half asleep, but it was just like water coming out. What on earth was he doing, Keena? So he decided that his idea that in fact ulcers are caused by bacteria wasn't spreading fast enough for his taste in the scientific community. And so he did this demonstration. He gave himself H. Pyl, and gave himself stomach ulcers, and then he later cured them
Starting point is 00:26:46 with antibiotics in this publicity stunt almost to convince people that in fact ulcers were caused by bacteria. Eventually, Barry Marshall and Robin Warren went on to win the Nobel Prize in Medicine for their discoveries. Mary Martin Giu, the woman who faced resistance in bringing variation to England, never won a prestigious prize, but she also found a way to spread the truth. Like Barry Marshall, she found it had more to do with her sales pitch than with the evidence. So in the end, she did something really smart, which took advantage of the ways that we
Starting point is 00:27:28 use our social connections to ground our beliefs and our trust. So she ended up convincing Princess Carolyn of Onesbach to regulate her own two small daughters and to do it in this kind of public way. So she got one of the most influential people in the entire country to engage in this practice. So that did two things. So number one, it made clear, you know, because she did in this kind of public way and her daughters were fine, it gave people evidence that this is in fact a safe practice and it's a good idea.
Starting point is 00:28:02 But it also made clear to people that if they want to conform to the norm, if they want to share a practice with this really influential person, then they should do the same thing. And after Princess Carolyn did this, Variation spread much more quickly, especially among people who had a personal connection to either Mary Montague or to the princess. What's fascinating here is that this wasn't in some ways a rational way to solve the problem. It wasn't saying, look, there's really convincing evidence here. You're almost using a technique that's pretty close to propaganda. It is a propaganda technique, absolutely. So,
Starting point is 00:28:42 propaganda technique, absolutely. So, propagandists tend to be very savvy about the ways that people use their social connections to ground trust and knowledge and choose their beliefs, and they take advantage of those. In this case, it was using that social trust for good, but in many cases, people use it for bad. And if you look at the history of industrial propaganda in the US, or if you look at the way Russia conducted propaganda before the last election,
Starting point is 00:29:12 people have taken advantage of these kinds of social ties and beliefs to try to convince us of whatever it is they're selling. One last idea and how you counter bad information. Semilvice as we saw did not succeed in persuading other doctors during his lifetime to wash their hands thoroughly before they were treating patients. But of course, now that idea is widely adopted, what does that tell us, Kaelin, about how signs in some ways might be self-correcting?
Starting point is 00:29:43 It might not be self-correcting at the pace that we want, but over time, it appears that good ideas do beat out the bad ones. Yeah, so we have thousands and thousands of examples in science of exactly that happening, of good ideas beating out the bad ones. Of course, now we can look back and say, oh, well, that could idea one out and that could idea one out. We can't actually look at right now and know which of the ideas we believe now are correct ones or good ones. So there are actually philosophers of science like Larry Loudon and Kyle Stanford who
Starting point is 00:30:21 argue for something called the pessimistic meta-induction, which is something like this because scientific theories in the past have always eventually been overturned, we ought to think that our theories now will probably be overturned as well. But there's actually an optimistic side to this, which is that if you look at many theories in the past, ones that were overturned, often the reason people believe them is that even if they were wrong, they were a good guide to action. So Newtonian physics got us to the moon. It's not right, but it was really successful.
Starting point is 00:31:00 Even the theory of stomach acid causing ulcers, well if you treat stomach acid, it actually does help with ulcers You know, it wasn't a completely unsuccessful theory. It's just that it wasn't totally right and it wasn't as successful as the Bacteria theory of ulcers because antibiotics do better Of course when it comes to something like handhing, you know, you can say that over the last hundred and fifty years, or you know, people have adopted that idea, but it didn't actually mean that the people in Semmelvice's time changed their minds. It really was that those people essentially left the stage and new people came along. There's an old joke in science, which says science progresses funeral by funeral. In some ways, that's what you're talking about here.
Starting point is 00:31:47 Yeah, so that can be. So theory change can happen because ideas that are good ideas spread throughout a community and then more people start to test them and then they communicate them to more people. And eventually you reach a consensus. One thing that the philosopher Thomas Kuhn really argued is that when you're having these kind of big paradigm shifts in science often it's young people coming up with a new paradigm and then switching to it because they don't have any skin in the game in the old one You know they haven't spent their life
Starting point is 00:32:18 defending this older theory and then you know maybe eventually the people who are defending metal or theory, retire or die. And then you have theory change. One of the interesting implications about all of this is how we should think about the truth. And in some ways, I think the picture that I'm getting from you is a picture that says the truth is not a binary question. It's not, you know, is it true? Is it false? I mean, some questions, of course,
Starting point is 00:32:49 perhaps can be reduced to, is it true? Is it false? But really, scientists in the business are producing probability estimates for various claims. And I think what you're saying is that for us to actually be on the right side of the misinformation, information divide, it's helpful for us to think in probabilistic terms rather than in binary terms. Yeah, that's absolutely right. So we do think it's really important to think about belief in terms of degrees and evidence and believing something strongly enough. And part of the reason is that there has been this strategy where people who are trying to subvert our beliefs will say, but we're not sure about something.
Starting point is 00:33:31 They'll say, evolution's just a theory, or there's some doubt about global warming. But ultimately not being sure about something is not what matters. We're never really 100% sure about anything. And if you think about it, think about any belief you could have, you know, that the sun will come up tomorrow. Well, it always has in the past, but that doesn't mean that 100% sure it will tomorrow.
Starting point is 00:34:01 There's a really good chance it will tomorrow. We shouldn't be looking for certainty. Instead, we need to be saying to ourselves, when do we have enough evidence to make good decisions? Kaelin O'Connor is a philosopher and mathematician at the University of California, Irvine. She studies how social networks can spread both good information and bad. Along with James Weatherall, she is co-author of the book, The Misinformation Age,
Starting point is 00:34:31 How Falls Beliefs Spread. Kaelin, thank you for joining me today on Hidden Brain. Kaelin, thank you so much for having me. This week's show was produced by Kimela Vargas Restrepo and edited by Tara Boyle and Jenny Schmidt. Our team includes Raina Cohen, Laura Quarelle, Parts Shah and Thomas Liu. Our unsung heroes this week don't work at Hidden Brain or even at NPR, but the longer we do this show, the more we realize how many helping hands go into building it. We often found we turn to the
Starting point is 00:35:05 scientists and forecasters at the National Weather Service to tell us when we need to get an episode wrapped up by Friday because it's a storm arriving on Monday that could keep us from getting to the office. Government departments like the National Oceanic and Atmospheric Administration do their work so quietly and so well, we often take them for granted. Today we recognize the folks at NOAA as it's called for their vital work. For more hidden brain, you can find us on Facebook and Twitter. You can find information about the research we discussed on this show on our website, npr.org slash hidden brain. If you liked this episode, please think of one
Starting point is 00:35:46 friend who might enjoy our show and drop them a word about it. I'm Shankar Vittantum and this is NPR.
