Hidden Brain - In The Air We Breathe

Episode Date: June 6, 2017

After a police-involved shooting, there's often a familiar blame game: Maybe the cop was racist. Maybe the person who was shot really was threatening. Or maybe, the bias that leads cops to shoot affects us all. This week on Hidden Brain, we explore how unconscious bias can infect a culture — and how a police shooting may say as much about a community as it does about individuals.

Transcript
Starting point is 00:00:00 This is Hidden Brain, I'm Shankar Vedantam. On a September evening in 2016, Terence Crutcher's SUV stopped in the middle of a road in Tulsa, Oklahoma. A woman saw him step out of the car. The doors of the car were open, the engine was still running. The woman called 911. Officer Betty Shelby was on her way to an unrelated incident when the call came in. Terence was 40, African-American, born and raised in Tulsa.
Starting point is 00:00:30 He was a church-going man with four children. Betty was 42, white, a mother. She was born in a small town not far from Tulsa. In an ideal world, these two Oklahoma natives, close in age, ought to have had more to bring them together than hold them apart. But on this evening, there was no small talk or friendly chatter. The police officer told Terence to take his hands out of his pockets. According to her attorney, he first complied.
Starting point is 00:01:00 He then put his hands up in the air. Moments later, he put his hands back in his pockets. By this point, multiple police officers had gathered and drawn their guns and tasers. Overhead, the police chopper filmed events as they unfolded. From the video, it's hard to tell exactly what's happening on the ground, but an officer in the helicopter thinks Terence isn't cooperating. Time for a taser, I think. Moments later, one officer on the ground does fire a taser.
Starting point is 00:01:33 Betty Shelby fires her gun. She kills Terence Crutcher. Later, police discover that he was unarmed. Soon, accusations are flying. Some said maybe the victim was high on drugs. Others said maybe the police officer was racist. At a press conference after the shooting, a journalist asked Scott Wood, Betty Shelby's attorney, about that.
Starting point is 00:02:06 Did him being a big black man play a role in her perceived danger? No, him being a large man played a role in her being in danger. She's worked in this part of town for quite some time. The week before, she was at an all-black high school homecoming football game. She's not afraid of black people. Terence Crutcher's sister Tiffany sees it very differently.
Starting point is 00:02:35 She thinks her brother was shot because he was black. That big bad dude was my twin brother. That big bad dude was a father. That big bad dude was a son. That big bad dude was enrolled at Tulsa Community College. Just wanting to make us proud. Betty Shelby's daughter, Amber, defended her mother. I am here to reveal to you the side of my mother
Starting point is 00:03:08 that the public does not know. My mother is an incredible, supportive, loving and caring woman. She is a wife, a mother, and a grandmother, with a heart of gold. She has always fought for the underdog and stands up for the weak. And so it went, with accusations against, and defenses of, the person who pulled the trigger and the person who got shot. Betty Shelby was recently acquitted of manslaughter charges. Still, the tenor of the back and forth, the psychological accusations and psychological
Starting point is 00:03:39 defenses, it's very revealing. When an incident like this occurs, we want to hear the story of what happened. We want to know what was going on in the mind of the shooter and the mind of the victim. Was Terence Crutcher truly a threat? Did Betty Shelby dislike black people? What clues explain the behavior of these individuals? We home in, dig for facts, and look for psychological explanations. But what if there's another way to think about what happened, one that has less to do with the individuals involved, and more to do with the context in which the shooting occurred?
Starting point is 00:04:16 What we're discovering here is that the individual mind sits in society, and the connection between mind and society is an extremely important one that should not be forgotten. Individual behavior and the mind of the village, this week on Hidden Brain. I'm Mahzarin Banaji. Mahzarin is a psychology professor at Harvard. She's made a career out of studying the invisible. For the past 30 years, I've been interested in studying those aspects of our minds that are hidden from our own conscious awareness. Mahzarin's interest began in graduate school.
Starting point is 00:05:19 She was teaching psychology at Yale and looking for a way to measure people's biases. There was debate over the right scientific method to do this. You could simply ask people their views, but because prejudice is a sensitive topic, you often don't get anything. You couldn't walk up to somebody and say, do you agree that Italians are lazy and have them say yes or no? They'll just refuse to answer that question. A deep-seated discomfort about discussing prejudice was one hurdle for a researcher
Starting point is 00:05:48 looking to study the phenomenon. Mahzarin realized there was another barrier. What if some forms of prejudice are so deeply buried that people don't even realize they harbor such bias? Perhaps we behave in ways that are not known to our own conscious awareness, that we are being driven to act in certain ways not because we are explicitly prejudiced, but because we carry in our heads the thumbprint of the culture. Was there a way to decipher this thumbprint and expose people's hidden biases?
Starting point is 00:06:21 Eventually Mahzarin, with the help of Tony Greenwald and then-graduate student Brian Nosek, developed a simple, ingenious test. It's called the Implicit Association Test, or the IAT. It's based on the way we group things in our minds. When you say bread, my mind will easily think butter, but not something unrelated to it, like, say, a hammer. Our brains, it turns out, make associations, and these associations can reveal important things about the way we think.
Starting point is 00:06:52 So the way the IAT works is to simply ask people to sort things. So imagine that you're given a deck of playing cards and you're asked to sort all the red cards to the left and all the black cards to the right. I'll predict that it will take you about 20 seconds to run through that deck. Next, Mahzarin says, shuffle the deck and resort the cards.
Starting point is 00:07:14 This time, I'd like you to put all the spades and the diamonds to one side and the clubs and the hearts to the other side. And what we'll find is that this will take you nearly twice as long to do. Why? Because a rule that your brain had learned, that red and red go together and black and black go together, is no longer available to you.
Starting point is 00:07:38 Remember that in both scenarios, you are grouping two suits together. In the first scenario, you are grouping hearts with diamonds and clubs with spades. In the second scenario, you are grouping hearts with clubs and diamonds with spades. Because there's a simple rule for the first task, group red with red and black with black, that task is easy. In the second scenario, you need a fraction of a second to think about each card. You can't follow a simple rule of thumb. Mahzarin and Tony and Brian had an important insight. These rules of association apply to many subjects, including the way we think about other human beings.
Starting point is 00:08:18 So they created a new sorting task. Sort for me faces of black people and bad things together, words like devil and bomb and vomit and awful and failure. Sort those on one side of the playing deck. On the other side, put the faces of white people and words like love and peace and joy and sunshine and friendly, and so on. This turns out to be pretty easy for us to do because, as my colleagues and I will argue, the association of white and good and black and bad has been made for us in our culture.
Starting point is 00:08:58 The test doesn't end there. After sorting white and good into one group and black and bad into another, you now have to do it again, this time grouping black with good and white with bad. And when you try to do that and when I try to do that, the data show that we will slow down, that we can't do it quite as fast, because black and good are not practiced responses for us. They're not habitual responses for us. We have to exert control to make that happen because it doesn't come naturally and easily to us. That's the IAT.
Starting point is 00:09:35 By the way, if you're wondering whether the order of the test makes a difference, it doesn't. The researchers have presented tests to volunteers in different ways. It doesn't make a difference if you ask people to first group black with bad or first ask them to group black with good. In both cases, people are faster to associate white faces with positive words and black faces with negative words.
Starting point is 00:09:58 Mahzarin thinks the IAT is measuring a form of bias that is implicit or unconscious. Mahzarin herself has taken the IAT many times. To her dismay, the test shows she has robust levels of unconscious bias. My brain simply could not make the association of black with good as quickly as I could make the association of white with good. And that told me something. It told me, it's not the IAT that screwed up, it's my head that screwed up. Because the IAT is a timed test, the results can be precisely measured.
Starting point is 00:10:33 Implicit bias, in other words, can be quantified. Most psychological tests are only available in the lab, but Mahzarin and her colleagues decided to do something radical. They put their test on the internet. You can find it today at implicit.harvard.edu. Millions of people have taken this test. The data has been collected, shared, disseminated. The IAT is widely considered today to be the most influential test of unconscious bias.
Starting point is 00:11:06 As Mahzarin and Tony and Brian were developing the IAT, other researchers were developing different ways to measure bias. Psychologist Joshua Correll found himself diving into the field shortly after a black man was shot and killed in New York City in 1999. His name, Amadou Diallo. Diallo was standing unarmed on the front stoop of his apartment building when the police thought he looked suspicious, and they approached him, and they ended up shooting him. And the question that everybody was asking, and this was, I mean, something that people, you know, across
Starting point is 00:11:42 the country were wondering about, was: was he shot because he was black? At the time, Joshua was starting graduate school. I took that question pretty seriously, and we tried to figure out how we could test it in a laboratory. Joshua and his colleagues eventually developed a video game. It was a pretty bad video game, but it did the trick. It's more like a slideshow where there are a series of backgrounds that pop up on the screen, and then in one of those critical backgrounds, a person will suddenly appear. So we've got photographs of, say, 25 or so white men and 25 black men,
Starting point is 00:12:18 and we've photographed these guys holding a variety of different objects, cell phones, a can of Coke, a wallet, a silver pistol, a black pistol. And so we've just edited the photograph so that the person pops up in the background, holding an object, and the player has to decide how to respond. And they're instructed, if the guy on the screen has a gun, he's a bad guy and you're supposed to shoot him. And you're supposed to do that as quickly as you possibly can. What Joshua wanted to know was whether players would respond differently, depending on the race of the target on the screen. Say, a black guy pops up holding a wallet and a white guy pops up holding a wallet.
Starting point is 00:12:58 What's the likelihood that the black guy gets shot and the white guy doesn't? If current events are any clue, you may guess the answer. Here we're looking at, say, you know, the player is responding to a target who's holding a wallet, and the correct decision is to say don't shoot, and what we found is that they are faster to say don't shoot if the target is white rather than black. The same held true for armed targets. Test takers were faster to shoot black targets, slower to shoot white ones. Now you might think that Joshua would conclude that his test takers were
Starting point is 00:13:30 just racist, but one important similarity between Joshua's test and Mahzarin's test is that they do not presume that the people with such biases have active animosity toward African Americans. These are not members of the Ku Klux Klan. It was just exactly what we had predicted, and I guess both kind of hoped and feared, right? I mean, it's an exciting scientific moment, but it also suggests something kind of deeply troubling
Starting point is 00:13:55 that these participants, who are presumably nice people with no bone to pick, they're not bigots, they're not angry at black people in any way, but what we saw in their data very clearly is a tendency to associate black people with threat and to shoot them more quickly. You could say that both these psychological tests are academic exercises. Do they say anything about how people behave in real life? Joshua Correll is very clear that his video game experiment cannot replicate real life.
Starting point is 00:14:28 It's impossible, he says, to recreate in a lab the fear and stress that a real world police confrontation can generate. The IAT, too, has been criticized for a somewhat hazy link between test results and real world behavior. Hundreds of studies have been conducted looking at whether the IAT explains or predicts how people will act. The results have been mixed. In some studies, unconscious racial bias on the test seems to predict how people will behave. Researchers found, for example, that doctors who score high on implicit bias are less likely to prescribe clot-busting
Starting point is 00:15:05 heart drugs to black patients compared to white patients. But other studies, also looking at doctors and black and white patients, find no correlation between results on the bias test and actual behavior. This discrepancy bothers psychologist Phil Tetlock at the University of Pennsylvania. He is a critic of the IAT. It's a test that is enormously intuitively appealing. I mean, I've never seen a psychological test take off the way the IAT has, and grip the popular imagination the way it has, because it just seems on its surface to be measuring something like prejudice.
Starting point is 00:15:41 Tetlock and other critics are concerned that just because someone shows bias on the IAT doesn't mean that they're going to act in biased ways in real life. If a test cannot predict how you're going to act, isn't it just an academic exercise? There is the question of whether or not people who score as prejudiced on the IAT actually act in discriminatory ways toward other human beings in real world situations. And if they don't, if there is very close to zero relationship between those two things, what exactly is the IAT measuring? It turns out, a lot.
Starting point is 00:16:16 There's new evidence that suggests the IAT does in fact predict behavior. But to see it, you have to zoom out. You have to widen the lens to look beyond the individual and into the community. Hello, my name is Eric Hehman. He's a psychology professor at Ryerson University. Eric got interested in the IAT as he was researching the use of lethal force in policing. He was trying to design a statistical model that would predict where in the United States people of color are disproportionately likely to be shot and killed by police.
Starting point is 00:16:47 First he needed some baseline data. This proved hard, since the federal government does not require police departments to report deadly shootings by officers. We really had no idea about really basic questions such as how often they were happening, where they're happening, and who they were happening to. But in 2015, some news outlets, including the Washington Post and the British newspaper the Guardian, began to compile their own databases on police homicides in the United States. According to official terminology, these are called justifiable homicides.
Starting point is 00:17:19 So what they were putting together was the most comprehensive list of these justifiable homicides in the United States. Eric used this data to pinpoint where disproportionate police shootings of minorities were most likely. Then he turned to the IAT data. Eric suspected that if bias was a factor in police shootings, it was likely that implicit bias, rather than overt racism, was at play. Traditionally, the field has found that explicit biases predict behaviors that are under our conscious control, whereas implicit biases predict things that are a little bit more automatic,
Starting point is 00:17:53 a little bit more difficult to control. And this is exactly the sort of behavior that we thought might be involved in police shootings. People take the IAT anonymously, but they need to provide some information, like their race and where they live. With the millions of data points the IAT provided, Eric painted a map of bias across the United States. Some places seem to have lots of bias. Others, very little. Now he had two databases. He cross-referenced them to see if there was any connection between communities with disproportionate
Starting point is 00:18:27 numbers of police shootings of minorities and communities showing high levels of implicit bias. A powerful correlation emerged. So, we find that in communities in which people have more racial biases, African Americans are being killed more by police than their presence in the population would warrant. Let me repeat this because it's important. In places where implicit bias in a community is higher than average, police shootings of minorities are also higher than average. Eric's analysis
Starting point is 00:19:00 effectively pinpoints where police shootings are likely to happen. But here's what makes the finding crazy. Most people who take the IAT are not police officers. So we're predicting police behavior by not measuring police at all themselves. Coming up, we explore how a test can predict how people will behave, even when they are not the people who take the test. Stay with us. Psychologist Eric Hehman found a way to predict police behavior by comparing places that have high levels of implicit bias with places where police shootings of minorities are higher than average. Since police don't typically take the IAT, how could the IAT be predicting
Starting point is 00:20:01 how police would behave? Eric thinks the test has tapped into the mind of the community as a whole. Say there's a neighborhood that's traditionally associated with threat or danger and the people who live in that neighborhood have these associations between African Americans and threat or African Americans and danger. And these would be anybody in this community. This could be my mother or the person who lives down the street, not necessarily the police officers themselves. But there's this idea that this attitude is pervasive across the entire area. And that when officers are operating in that area, they themselves might share that same attitude. That might influence their behaviors in these split-second challenging life and death decisions.
Starting point is 00:20:47 Implicit bias is like the smog that hangs over a community. It becomes the air people breathe. Or as Mahzarin might say, the thumbprint of the culture is showing up in the minds of the people living in that community. There are many examples of this idea that individual minds shape the community and the community shapes what happens in individual minds. Seth Stephens-Davidowitz is a data scientist who used to work at Google. We featured him on a recent episode of our show. In his book, Everybody Lies, Seth explains how big data from Google searches can predict
Starting point is 00:21:21 with great accuracy things like the suicide rate in a community or the chances that a hate crime will take place. We've shown that you can predict hate crimes against Muslims based on searches people make. People make very, very, very disturbing searches, searches such as kill Muslims or I hate Muslims. And these searches can predict on a given week how many hate crimes there will be against Muslims. But I think the right
Starting point is 00:21:45 approach to this is not to target any particular individual, to show up at the door of any particular individual who makes these searches. But if there are many, many searches in a given week, it would be wise for police departments to put extra security around mosques because there is greater threat of these attacks. In other words, what the Google search data is doing is effectively taking the temperature of an entire community. That's what you're really saying, that you're picking up on things that are in the ether, if you will, in the community that might not show up in the individual, but are likely to
Starting point is 00:22:18 show up in the aggregate. Yeah, and I think you don't really know the reason that any particular person makes a search, right? Someone could be searching kill Muslims because they're doing research, or they're just curious about something, or they made a mistake in their typing. There are a lot of reasons that an individual can make these searches.
Starting point is 00:22:34 But if twice as many people are making these searches, well, if I were a Muslim American, I'd want some extra security around my mosque, right? Asking whether implicit bias affects the behavior of every individual is a little like investigating everyone who types an offensive search term into Google. A lot of the time, you're going to find nothing. And yet, when you look at the use of search terms in aggregate, it can tell you with great precision which areas will see the most hate crimes.
Starting point is 00:23:03 For her part, Mahzarin Banaji believes Eric's work is a key link between her psychological data on how individuals behave and sociological insights on how communities behave. What we're discovering here is that the individual mind sits in society, and the connection between mind and society is an extremely important one that should not be forgotten, and that more than any other group of people, social psychologists owe it, to the beginnings of their discipline, to do both and to do it evenhandedly, to be focused on the individual mind and to be talking about how that mind is both
Starting point is 00:23:45 influenced by and is influencing the larger social group around her. This is why, says Mahzarin, when a problem has spread throughout a community, when it has become part of the culture, you can't fix it by simply focusing on individuals. One of the difficulties we've had in the past is that we have looked at individual people and blamed individual people. We've said, if we can remove these 10 bad police officers from this force, we'll be fine. And we know as social scientists, and I believe firmly, that that is no way to change anything. This new way of thinking about bias showed up in the last presidential election.
Starting point is 00:24:27 Democrat Hillary Clinton said implicit bias probably played a role in police shootings. I think implicit bias is a problem for everyone, not just police. I think unfortunately too many of us in our great country jumped to conclusions about each other. Republican Mike Pence, now vice president, bristled at the idea. He said that Clinton was calling cops racist. When an African-American police officer is involved
Starting point is 00:24:53 why would Hillary Clinton accuse that African-American police officer of implicit bias? I guess I can't believe you are defending a position that there is no bias. But as Mahzarin says, it's not quite right to think of people with implicit bias as harboring the kind of racial hostility we typically think of when we say someone is a racist. Small kids show implicit bias. African Americans themselves show implicit bias against other African Americans.
Starting point is 00:25:22 The test isn't picking up the nasty thoughts of a few angry outliers. It's picking up the thumbprint of the culture on each of our minds. So what can we do? Mahzarin is skeptical of those who offer training courses that promise quick-fix solutions. There are many people across the country who say that they offer such a thing, called implicit bias training. And what they do is explain to large groups of people what might be going on that's keeping them from reaching their own goals and being the good people that they think they are.
Starting point is 00:25:58 And my concern is that when I'm an old woman, I will look back at this time and think, why didn't I do something about this? Because I don't believe this training is going to do anything. In Mahzarin's view, you can't easily erase implicit bias because you can't erase the effect of the culture when people are living day in and day out in that same culture. But she and others argue that there might be ways to prevent such biases from influencing your behavior.
Starting point is 00:26:27 Let's return to psychologist Joshua Correll. Remember, he created the active shooter video game that found test takers were more likely to shoot black targets than white ones. Many of Joshua's initial test takers were students. Eventually, he decided to see what would happen if police officers took the test. So he went to the Denver police. We brought down a bunch of laptops and button boxes, a bunch of electronic equipment that we were using to do this study, and we would set it up in their roll call room. And it was just complete chaos and really, really fun. And some of the police really wanted
Starting point is 00:27:03 nothing to do with us, but a huge number of them volunteered and they wanted to talk with us afterwards. At first, the police officers performed exactly the same as everyone else. Their levels of implicit bias were about the same as laypeople who take the test, both in response times and in mistakes. But when it came to the actual shooting of targets,
Starting point is 00:27:24 the police were very different. The police officers did not show a bias in who they actually shot. Those early test takers, college students and other laypeople, displayed their bias in response times, mistakes, and who they shot. Not the police. But whereas those stereotypes may influence the behavior of the college students and of you and me, the police officers are somehow able to exert control. So even though the stereotype, say, of threat may come to mind, the officer can overcome that stereotype and respond based on the information that's actually present in the scene rather than information that the officer is bringing to it through his or her stereotypes.
Starting point is 00:28:06 Joshua wondered whether there were certain factors that might keep police officers from exerting this kind of cognitive control over their biases. He found, among other things, that sleep made a difference. Those who were getting less sleep were more likely to show racial bias in their decisions to shoot. And again, that's just consistent with this idea that they are, they might be able to exert control, to use cognitive resources to avoid showing stereotypic bias in their decisions, but when those resources are compromised, they can't do it. They could be compromised in a variety of ways.
Starting point is 00:28:43 Sleep is just one way that we can compromise it. This, once again, is evidence that you can't train people not to have unconscious bias. But as Joshua suggests, you can do things to make it less likely that people will be affected by their bias. To be clear, Joshua's experiments are laboratory experiments. We know in real life that beat police officers do shoot people in error. Now this could be because in a real life encounter, stuff happens that makes it very difficult for you to actually think about what you're doing. So on the street, when somebody pulls a gun on you, it's scary, right? Like, cops when
Starting point is 00:29:21 they're involved in these firefights, report some crazy, crazy psychological distortions because they're legitimately freaked out. If they think somebody poses a life and death threat, they may panic, and it may be hard to bring those cognitive resources online. In several recent high profile cases, as with the case of Terence Crutcher and Betty Shelby, police officers shoot people who are unarmed. Of course, officers do not always know whether someone is armed; it's only in hindsight that we know the officer was
Starting point is 00:29:50 or wasn't in real danger. Joshua's larger point is that police encounters can be inherently stressful. The uncertainty embedded in a confrontation can make it very difficult to think objectively. To put it another way, if you're running a police department and want to reduce errors and shootings, it may be less useful to give cops a lecture on how they shouldn't be racist and more useful to build procedures that give cops an extra half-second when they are making decisions under pressure. With practice and a bit of time to exercise conscious control, people can reduce the risk of falling prey
Starting point is 00:30:26 to their implicit biases. There is the potential to control it, right? The performance of the regular police officers, or even people that we train in our lab, suggests that people don't have to succumb to those stereotypic influences. They can exert control in certain circumstances. Mahzarin Banaji has a similar solution. She thinks we need more of what she calls in-the-moment reminders.
Starting point is 00:30:55 For example, it's been found that some doctors prescribe painkillers to white patients far more often than they do to black patients who are reporting exactly the same levels of pain. The only difference is the patient's skin color. This suggests that bias is at work. Mahzarin says if the bias is implicit, meaning physicians are acting biased without intending to be biased, a timely reminder can help doctors exercise cognitive control over their unconscious associations. You type in a painkiller that you want to prescribe to a patient into your electronic system while the patient is sitting next to you.
Starting point is 00:31:31 And it seems to me quite simple that when you type in the name of any painkiller, let's say, codeine, that a little graph pops up in front of you that says, please note, in our hospital system, we have noticed that this is the average amount of painkiller we give to white men. This is the average amount we give to black men for the same reported level of pain. In other words, giving doctors an opportunity
Starting point is 00:31:58 to stop for a second to make a decision consciously and deliberately, instead of quickly and automatically, this can reduce the effects of implicit bias. Psychology has spent many years understanding the behavior of individuals, but tools such as the IAT might give us a way to understand communities as a whole, maybe even countries as a whole.
Starting point is 00:32:20 We did a study some years ago, Brian Nosek led this particular project, in which we looked at gender stereotypes across many countries in the world, how strongly do we associate female with science and male with science. And then we looked at the performance of girls and boys, roughly around eighth grade, on some kind of a standardized test. And what we discovered is that the stronger the gender bias in a country, that is to say the stronger the association of male with science in a country, the less well girls in that country did on that mathematics test. It's very similar to the Hehman kind of result, because we didn't measure the gender bias in the girls and the boys who took the test. We were measuring something at the level of a country
Starting point is 00:33:12 in that case. And yet it did predict something systematic about the difference in performance between boys and girls. When we look at an event like a police shooting, we invariably seek to understand it at the level of individuals. If something bad happens, we think it has to be because someone had bad intentions. Implicit bias certainly does act on individuals, but it's possible that its strongest effects are at the level of a community as a whole. This might be why some police shootings of African-American men are carried out by African-American police officers,
Starting point is 00:33:49 and why some physicians who are not prescribing pain medications to people of color might themselves be people of color. Individuals can do their part to limit the effects of bias on their behavior, but if you want to fix the bias itself, well, that takes the whole village. This episode of Hidden Brain was produced by Jenny Schmidt, Maggie Penman and Rhaina
Starting point is 00:34:15 Cohen. It was edited by Tara Boyle. The music in today's show was composed by Ramtin Arablouei. Renee Klahr does our social media. This week our unsung hero is Izzy Smith. He directs the promotion of NPR's podcasts and works tirelessly to get the content we're making to reach new ears. If you ever discovered a new favorite show because it was recommended by another podcast you love, you have Izzy to thank. Izzy recently led the Trypod initiative that encouraged all of you to share the podcasts
Starting point is 00:34:46 you listen to on social media, and though that campaign is technically over, it's never too late to tell a friend about Hidden Brain. You can find us on Facebook, Twitter and Instagram and listen for my stories each week on your local public radio station. I'm Shankar Vedantam and this is NPR. Hey podcast listeners, a couple of things before we go. If you enjoyed this show on implicit bias and the IAT, make sure you check out the June 15th episode of Invisibilia, which looks at where our biases come from and whether we can do anything about them. The show visits a group of people in a new program
Starting point is 00:35:36 in California called Racists Anonymous, talks to scientists who are exploring concrete ways to counter prejudice, and rides along with a Ferguson police officer who's grappling with his own experiences as a victim and a perpetrator of bias. Finally, we're working on a show about regrets. If you have a funny story or a not-so-funny story about something you regret and you're willing to talk about it, please give us a call and tell us your story. Leave a message at 661-772-7246. That's 661-77-BRAIN. You can also record a voice memo and email it to us at hiddenbrain@npr.org. Please put the word regret in the subject line.
