StarTalk Radio - AI, Autonomous Vehicles, and Race – Oh my! with Malcolm Gladwell

Episode Date: July 12, 2021

How do self-driving cars change our psychology? On this episode, Neil deGrasse Tyson and comic co-host Chuck Nice philosophize about science’s role in society with prolific author Malcolm Gladwell. ... NOTE: StarTalk+ Patrons can watch or listen to this entire episode commercial-free. Thanks to our Patrons Tobias Malmborg, Andy Pattinson, Adam Lenda, Naomi Martin, Johan Fredrik Oldervik, and Scott Heflen for supporting us this week. Photo Credit: Grendelkhan, CC BY-SA 4.0, via Wikimedia Commons

Transcript
Starting point is 00:00:00 Welcome to StarTalk, your place in the universe where science and pop culture collide. StarTalk begins right now. This is StarTalk. I'm Neil deGrasse Tyson, your personal astrophysicist. And today, we haven't figured out a title for this. Chuck, you want to call it Mafia Malcolm? Mafia Malcolm, yes. That's where you're going? And we're sticking with that.
Starting point is 00:00:29 Oh, my God. How catchy is that? I should get... Some advertising company should be recruiting me right now. Chuck, nice. My co-host, Chuck. Always good to have you here. Always a pleasure.
Starting point is 00:00:41 So the Malcolm of which you speak is... I got to call him a special guest because this guy is out and about i'm afraid to put him on my show because that means he's not otherwise writing his next book like i'm in in the way of this man's productive career in progress malcolm gladwell welcome back to star talk thank you so real yeah we. We had you on stage live. We might have been five or six years ago. Back when those things happened, yeah. And just one quick correction, Neil. Malcolm is so prolific.
Starting point is 00:01:14 He's actually writing a book right now. While we're talking, Malcolm is in the midst of composing his next work. Because I'm very serious. I don't like taking time away from people that they could be doing some productive.
Starting point is 00:01:32 But a whole string of bestsellers, Blink. Forgive me, I haven't read all of them, but I've read many of them. And Blink, Outliers, The Tipping Point. There's some very famous books in here. David and Goliath, checking the list here. What the Dogs Saw. Why don't we just start with the stuff Malcolm hasn't written? And then...
Starting point is 00:01:55 Back into whatever it is. Right, it might take a less time. You have one of my favorite podcasts online right now, Revisionist History. And now in its sixth season. Ooh. And this is where you sort of dissect a commonly held truth and figure out, well, maybe it's not as true as people think. It's a brilliant format.
Starting point is 00:02:18 And you're going strong. And you've got a new book, The Bomber Mafia. Okay. Now I see your connection. See, The Bomber Mafia. Okay, now I see your connection. See? The Bomber Mafia, a dream, a temptation, and the longest night in the Second World War. Man. Man, we'll get into that in a bit.
Starting point is 00:02:38 So let me ask you, Malcolm. Last time you were on, we discussed sort of sociology and the human condition, about which you've written quite a bit. And what I've found is that you, instead of just teaching people textbook style, instead of only telling stories, you've found a way to stitch together first person narrative or a narrative that directly relates to another human being who we can think about their existence and the science related to it and that's an art the way you've that that's an art so would you suggest that if scientists want to communicate they should adopt your methods here all right i'm gonna handle this one for you malcolm yes okay yes okay well okay i don't you
Starting point is 00:03:34 know i would only say that the if you're a scientist i mean you were saying earlier jokingly that you didn't you you were worried that if you interviewed me, I wouldn't have time to write. I don't want my scientists spending all their time writing popular books. I want them doing science, right? I mean, I want people to specialize in what they do best. We're not selecting out our astrophysicists and our chemists and our biologists for their ability to explain their ideas to the public. We're selecting them for their ability to push the boundaries of science. And it may well be that the thing that makes them good at pushing the boundaries of science
Starting point is 00:04:16 makes them bad at explaining their ideas to the general public. I'm fine with that trade-off. You know, the reason God invented journalists is that we can come along and do that job for them. I was wondering where journalists came from. God invented them. God,
Starting point is 00:04:33 in his infinite wisdom, said there was a role for someone to play, to be intermediary between the genius and the public, right? That's what we're doing. I'm not pushing the boundaries of science. I'm just telling stories and
Starting point is 00:04:49 standing between the public and the science. And that's what we're supposed to do. But you have to boost your science literacy so that you don't fumble over the content, right? Yeah. Although, you know, what I've discovered is that first of all i steer clear
Starting point is 00:05:06 of those areas where i don't feel you'll find that i you know i i don't write about the the most complicated corners of medicine or biology i stick pretty much to those areas of science that i find personally accessible so i do think it's tricky. Not everything can be explained to a general public. You know, it's like the, remember when Stephen Hawking's A Short History of Time was a bestseller? And there was kind of an open joke about whether anyone who was buying it was reading it.
Starting point is 00:05:41 Because, like, that is not a, you know, the idea that, you know, a standard, normally educated human being could read that book and make sense of it is a little bit of a stretch. I didn't have a prayer book. I got like 10 pages in. I was like, you know what, this is just so far above my pay grade.
Starting point is 00:05:58 So, you know, there are real limits to whether certain parts of science can be made accessible. I'm not sure they should be. Malcolm, you're deep in the publishing world. Was it true or was it just a story? I heard that the publishers wanted to actually find out if people were reading A Brief History of Time by Stephen Hawking. So in some set of books, they put in a coupon and said,
Starting point is 00:06:26 mail this in and get $100. And like, nobody mailed it in because they never got that far in the book. No, yeah. No, it's like, that was, there's a famous story about the guy who in, Ron Popeil, remember him? The pitch man on TV? Yes. His dad invented the pocket fisherman.
Starting point is 00:06:42 And when his father, his name was S.J. S.J. was once asked, someone once said to him, the pocket fisherman, you know, it doesn't work. And S.J. said very wisely, it's not for using, it's for giving. That's an excellent quote. And that, you know, I feel like the Stephen Hawking book was for giving, not for reading. Do you think that as a society, we are pulling away from scientific literacy? And if so, why would that be?
Starting point is 00:07:11 Well, I wonder about this. So this question comes up a lot about, is there something distinctive about the moment we're living in and popular attitudes towards expertise of one kind or another. I would say, Malcolm, your books, they bring people into the science and they show how the science manifests. So I think you are a force of good on this landscape. But go on.
Starting point is 00:07:39 I was going to say, I think the people who say that we're living in a uniquely anti-scientific moment have a short memory. So let me give you two stories from my mom. My mom grows up in Jamaica, gets a scholarship to a boarding school in the 1930s, and she's reading the Encyclopedia Britannica and she learns to her, reading some section on, I don't know what the section was, on race or something, she learns to her horror that the Encyclopedia Britannica considers black people to be genetically inferior to white people, right? So this is 1930, whatever it is, seven in Jamaica. Then she tries to marry my father who is white and discovers to her whore that a good majority of people think that it is
Starting point is 00:08:26 not just kind of casually a bad idea for white people to marry black people, but like something that is a profound problem for the future of civilization. Because they believe that the mixing of the races represents. Those are two pretty firmly held
Starting point is 00:08:41 views of that period. And eugenics was flying high. And eugenics was flying high. And eugenics was flying high. First of all, that's 70 years ago, not that long ago. And those are two crazy notions that were held by, among other things, the Encyclopedia Britannica.
Starting point is 00:08:57 So, have we made progress? I kind of think so. No one's saying stuff that crazy today. At least not on that level of the Encyclopedia Britannica today does not have those kind of wh so. Like, no one's saying stuff that crazy today. At least not on that level of... The Encyclopedia Britannica today does not have those kind of whoppers in it. Wait, wait, wait. But just to be fair to Chuck's question,
Starting point is 00:09:13 there were scientists saying that at the time. So the fact that the Encyclopedia Britannica echoes this is not being anti-science. They're just promulgating the science of the anthropologists of the day. That's different. All right, all right. So this opens up some categories here because this concept of skin color and race,
Starting point is 00:09:40 as we know in the history of this country, is quite colorful, to put it mildly. And that is a very deep, deeply endemic feature of society, as so many other topics are that you have addressed in your books. So do you have a certain priority order of which you say, I want to do this topic because it's the most inflammatory or because you have the best stories to tell. And that then shapes your books.
Starting point is 00:10:11 I think it starts from what question is meaningful to me. Not sort of globally meaningful, but meaningful at the moment. So, for example, since we're on the subject of race, which is meaningful for me for personal reasons, you know, because I come from a mixed race background. In this current season of Visionist History, I have a whole thing about HBCUs. Historically black colleges and universities.
Starting point is 00:10:36 And about the particular role they play in American education that's largely unrecognized. And it was super interesting to me, both because, you know, for personal reasons, that's near and dear to my heart, but also I've always been interested in kind of managing
Starting point is 00:10:56 or learning to understand the relationship between the kind of the dominant community in the United States and minorities. And when I think about my mother's evolution for a moment, so she grows up in a world where any Jamaican would, or any West Indian would, where people in positions of authority
Starting point is 00:11:18 are as likely to be, almost more likely to be black as white. So she grew up in a world where the doctors were black, where the teachers were black, where the politicians were black, where the lawyers were black. And so it never occurred to her that these were opportunities that weren't open to black people
Starting point is 00:11:34 or the black people couldn't do those jobs. And then she comes to North America and suddenly she doesn't see any black doctors or black lawyers or black politicians, at least in the 70s. And that transition is super interesting, right? And I was, you know, I'm a kid and I'm observing this. I'm observing my mother go through this transition and struggling with it.
Starting point is 00:12:00 I mean, we lived in a town where there were two black people, my mom and actually three, and these twin adopted girls of the local Lutheran minister, the Goosey twins. That's it. Three people. A town of 7,000 people. Right? Then my mother has to navigate that fact. All of a sudden, she's the only person who looks like her. And it's weird.
Starting point is 00:12:20 And it's also strange for the other people who've never seen a well-spoken, well-educated, you know, black person before, right? Can I touch her hair? Yeah. Oh, no. It wasn't. You know what it was, Neil? I wonder whether you've ever gotten this. When my mom was in England going to school in the 50s, what she got was people wanting to touch her skin to see whether, quote, it would come off.
Starting point is 00:12:42 It would rub off. Yeah. Yeah. That was the. Yeah. So there's a whole hierarchy there. I would just tell them that it's pink. I tell, it would come off. It would rub off. Yeah, that was the... I would just tell them that it's paint. I'd tell them it does come off, but quite frankly, you need a special solution, and I can't share that with you, unfortunately.
Starting point is 00:12:56 So let me just, not as a pushback, but as an addendum on what you just said about inequality, being able to exist anywhere among anybody. It may be an eight universally, but it's like a 12 for black people in America. That's the only difference. You know, that's the difference is that- Okay, it's an extra.
Starting point is 00:13:20 It goes extra. It's extra in America. And like you said about your mom, you know, here she is seeing a society where black people do hold authority and positions of authority. I mean, that was anathema in America and subsequently North America. I had a pediatrician and my mother's doctor were black and they were both West Indian. So how funny is that? This is part of my, every American child, school child, should be required to spend at least one year of their life attending a school in Atlanta. Why? Because that's the
Starting point is 00:14:05 American city where they are most likely to see black people in positions of authority. Lots of them, right? A really good experience. Up and down the spectrum. That's right. Up and down the spectrum. You go to a restaurant in Atlanta, you will see, you know, 50% of the seats in the restaurant will be occupied by black people, chances are. You go to the doctor, chances are your doctor will be a black person. You need to be exposed. People should be exposed to that, to understand that their little world is not representative of the world out there. You know, I think internal years abroad would be a really good idea for America.
Starting point is 00:14:36 Right? I like that. Internal years abroad. Yeah. You don't have to go to Africa to see black people. Go to Atlanta. Go to Atlanta. No, I was just in Atlanta.
Starting point is 00:14:46 I love Atlanta for that very reason. I just think Atlanta, like, you know, it's a different place. Same thing with Washington, D.C., by the way. Very similar kind of worlds where you just get a, that's to me a much more realistic picture. Okay, we've got to take a quick break. When we come back, more of our exclusive interview with Malcolm Gladwell
Starting point is 00:15:05 and all that makes his brain tick on Stock Talk. We're back. StarTalk. I got Malcolm Gladwell in the house. And Chuck Nice, my co-host here. Chuck, you're tweeting Chuck Nice Comics still. Well, thank you, sir. Yes. And the more you say it, the more I have to keep that handle.
Starting point is 00:15:45 Okay. Not that I care. I love that you say it. And Malcolm, tell me your social media handles. Do I know them? Hold on. I think I'm at Gladwell and Twitter. I'm really bad on social media. You know, I'm an old dude. Oh, you want people to like read your books. What the hell's wrong with you? I think it's at Gladwell. That's it. It's at Gladwell.
Starting point is 00:16:13 I'm the only Gladwell on Twitter, as far as I can tell. Gladwell. Okay. All right. Cool. All right. So let's get back into this. We left off with the recommendation that everyone take a
Starting point is 00:16:25 staycation in the United States and just visit Atlanta. No, they've got to go to school in Atlanta. You've got to live in Atlanta. You can't just visit. You need to stay there for a while. Move around. You know, I was
Starting point is 00:16:42 on this trip when I was in Atlanta. I was going for a run. Just to be clear, you're an avid runner, correct? I trip when I was in Atlanta. I was going for a run. And I, a guy. Just to be clear, you're an avid runner, correct? I am an avid runner. And I'm running around. And this guy starts running next to me. We start chatting.
Starting point is 00:16:52 And he's a white guy. And I said, oh, what do you do? He goes, I'm a, he's at my medical resident. I was like, oh, where'd you go to med school? He goes, Morehouse. And I said, wait a second. How many white people go to Morehouse Medical School? He goes, well, it's a small number.
Starting point is 00:17:06 But I love that fact. And he wanted, he got interested in medicine because he went to Africa, spent 10 years in Nigeria and Kenya, came back and said, I'd like to become a doctor to help people. But it seemed at that point, after 10 years in Africa, it was natural for him
Starting point is 00:17:25 to want to go to a place like Morehouse Medical School. He was comfortable around people who didn't look like him. And that just tells me that that's the advantage of people who live a cosmopolitan life, who get out of their own little worlds, that they think they'll think twice about.
Starting point is 00:17:41 And Malcolm, there's got to be a lot of unmind stories about going fish out of water, or really they're in water, it's just a different water, right? I mean, I remembered my elementary school, and I don't want this to be about me here because we don't have much, you know, how often do we have Malcolm Gladwell in the house?
Starting point is 00:18:01 But in elementary school, by whatever measures of exams that were administered, I was not put in the special, the advanced class. The advanced class got to take a foreign language. They got special teachers. All this was going on. And I knew I wanted to do that, but the system prevented it. I was one of three black people in my elementary school. All right. In middle school, I go to Lexington, Massachusetts, because my father has an appointment
Starting point is 00:18:32 up at Harvard, a one-year fellowship. And so I'm living in Lexington, Massachusetts, and I go to the regional middle school. There are three black people in that school once again. I get the highest grades in the school in four semesters. And I graduate getting accolades and awards. And I got pins. And I'm thinking to myself, okay, I don't know what's going on here. Okay. Why in one place they say I'm not deserving and in another place I'm at the top of the class. But I was not thinking, and I should have, you know, this is good for the white people to see this.
Starting point is 00:19:17 I mean, right, right. Because there's a black kid, one of three black children in this middle school and untouchable. Plus, I was on all the sports, so I could outrun them. And so I think maybe every white kid who's been told from the Encyclopedia Britannica
Starting point is 00:19:40 and its derivatives over the decades that they somehow represent some platform of superiority. You need stories like this. They need to get their clock cleaned by you in middle school. There's another one. And I, you know, I don't want to like rub it in, but, but, you know, I grew up my time up until then was in the Bronx. Right. And so there's a basketball court. And so I'm there and I go to take a pickup game of basketball and someone is ready to shoot. And I jumped to block the shot and I end up blocking his shot with my elbow
Starting point is 00:20:16 because I jumped 18 inches higher than was necessary. Wait, that's funny. You expect to block it with your hand but i kept rising and the boy he didn't jump higher with me and my elbow blocked the shot and i said where am i neil wait i have a question for you neil um how tall are you i'm six two okay because i was reading just this is a sobering note in this discussion i was reading just the other day this big discussion of school suspensions. For some reason, for long reasons which I can get into or not, I've become really fascinated by school suspensions.
Starting point is 00:20:55 And as you know, the data says that black kids, particularly black males, are way more likely to be suspended for the same offense than their white equivalents, right? But there is a nuance to this that broke my heart, which is when you look at all of the variables that most predict being suspended, being black is one, being a black male is two. Being a tall black male is the overwhelming predictor. There is something about, so I can equalize everything. And if you're tall and black and male, you are getting singled out for punishment in a way that outstrips every other conceivable variable.
Starting point is 00:21:43 Well, it's hard to miss you. It's hard, but there's something... You're tall and black. But it's triggering something. And the reason it's so important, it's important for a million different reasons, but this is the kind of thing that in a million years, the teachers or principals making those decisions
Starting point is 00:22:01 would never think... They might think, am I singling out a kid because he's black? Or am I singling out a kid because he's black or am I singling out a kid because he's a black boy? They would never think they were singling out someone unfairly because they were tall. So I was typically the second tallest kid in my class. And so, but I wasn't oddly tall.
Starting point is 00:22:21 But I had very gentle demeanor. I wasn't oddly tall, but I had very gentle demeanor. So I was never at risk of sort of a truant conduct. I'd never got into fights. Other people, I'd never got into fights. Were you also a good student? Well, it depends on what you mean by good. Well, I'm saying.
Starting point is 00:22:40 Very average grades. I had very average grades for most of my life. Well, when you're black, the expectations are lower. So I was a great student. You were a really good student. I'm sorry I had to do that. That's a joke, people. That's a joke.
Starting point is 00:22:55 Okay. So, Malcolm, let's talk about your current book. Yes. It has many more words in the title than any of your other books. A lot of words. So please defend yourself there. It has an unwieldy subtitle. A Dream of Temptation The Longest Night of the Second World War.
Starting point is 00:23:16 It is and it's a different book than any I've ever written. It's not a, it doesn't go in a million directions. It's not a it's a different book than any I've ever written. It's not a, it doesn't go in a million directions. It's not a, it's a, it's a, it's a, it's a, not a pure history book, but very close to it. It's a story about this weird little bit of history in the Second World War, which was a group of pilots down in Alabama who thought they had solved one of the most difficult physics problems in modern war,
Starting point is 00:23:47 which is how to drop a bomb from a plane with a high degree of accuracy, which if you think about it, is an insanely hard problem, right? You're six miles up, you're going 250 miles an hour, you're dropping a large heavy object through a constantly changing air pressure, weather, blown by winds, different temperatures, to a small target on the ground that might be obscured by clouds. I mean, it's insanely difficult, right? And most people don't know. I'd say correctly if I'm wrong, Malcolm, because surely you research this. Was it something like 80%
Starting point is 00:24:28 of bombs dropped in the Second World War missed their target? Oh, yeah. And not just by a little bit, by an enormous amount. The British do a study early in the war called the Hull Report, and they think they're doing a really good job of hitting German targets, and they discover they're, like, off by as much
Starting point is 00:24:44 as seven miles. That's how bad it is. Gentlemen, I'm sorry to report, we're actually bombing ourselves. That's right. And you want to be high enough so that anti-aircraft can't get you. So they're competing forces on your accuracy. So these guys think they've solved this problem. And they then go to the next stage and say,
Starting point is 00:25:04 if we've solved this problem, we can solve the problem of war, right? Because we can wage war in a way that doesn't result in hundreds of thousands of civilian casualties. We can bring a city to its knees without destroying it. We can take on our enemy without having to use armies and infantry and navy. We can just use bombers that just precisely take out the thing we want to take out, the aqueducts, the power plant, the bridges, and they'll surrender, right? And this incredible dream, which is hatched in Montgomery of Alabama,
Starting point is 00:25:39 and they take it into the Second World War, and they ascend the ranks of the Air Force in the Second World War with this incredibly ambitious, disruptive idea that the only thing that matters is the bomber because the bomber can drop bombs with perfect accuracy wherever they want. And this dream and its confrontation with reality ends in spectacular fashion over the skies of Tokyo in 1945. And my book is about that journey from the hatching of the dream in Montgomery in the mid-30s to its kind of cataclysmic failure over Japan in the summer of 1945. All right, that's the dream. What's the temptation? You don't mean the firebombing of Tokyo, do you? No. Well, the firebombing is what results from the failure of their dream. So the temptation is, the dream is
Starting point is 00:26:36 we can clean up modern warfare and use these perfectly aimed bombs. The temptation is when that strategy doesn't work, it's just to say you know, F it. Let's just firebomb everything down and burn everything down. And my dreamers won't take up
Starting point is 00:26:52 the temptation, but somebody else does. And that's the darkest chapter of the Second World War. Chuck, I interrupted you. What did you answer? No, he answered my question. I said, if that's the dream, what is the temptation? But now I'm intrigued to know, who is this Darth Vader figure who comes in and creates the darkest chapter in the Second World War? Which, God, I mean, that's such a bloody war.
Starting point is 00:27:15 I mean, to say that this person is the author of the darkest chapter, wow. Wow. Darkest chapter of the darkest war. Yeah, the darkest chapter of the darkest war. Who is this Darth Vader-like creature? A man named General Curtis LeMay, one of the most infamous or famous, depending on your perspective,
Starting point is 00:27:32 military aviators of the 20th century, who's just an extraordinary character. I mean, the most kind of brutal, ruthless, unsentimental, brilliant. If you remember, he was, I lost count, I think at nine or ten, the number of Hollywood movies that had Curtis LeMay, a villain based on Curtis LeMay in there. I mean, Dr. Strangelove.
Starting point is 00:27:56 There's a Curtis LeMay figure in Dr. Strangelove. I see. So he's influenced a, what do you call it, character type. Yes. He stands for. He's become a trope. Yes, he stands for the crazy coal warrior, the nuclear finger on the button.
Starting point is 00:28:12 He's the guy, so LeMay is the guy who, napalm is invented in the Second World War, not Vietnam, for the express purposes of burning down Japanese cities because they were made of wood
Starting point is 00:28:24 and they were highly flammable. And the first person to really... Wood and paper. A lot of walls are paper. And paper, yes, exactly. And the first person to use napalm to its fullest extent was this man, Curtis LeMay. The first person with the kind of gumption
Starting point is 00:28:39 to say, I'm going to give up on all traditional forms of warfare and I'm just going to drop so many firebombs on these flammable cities that I will literally burn my enemy out. And that's what he does. And he kills, in the summer of 1945, he firebombs, he burns to the ground 66 Japanese cities and maybe kills close to a million civilians.
Starting point is 00:29:01 I mean, it's an astonishing. If you list the people in the 20th century who were responsible for the most civilian deaths, Mao's one, Stalin's two, Hitler's three, Pol Pot's four, and Curtis LeMay is five. An American is number five. Wow. Well, I'm certainly adding him to my list of people
Starting point is 00:29:23 I'd like to have at a cocktail party. Just one thing I'll add, just to end the segment on a completely down note. Napalm, at least of the variety they used in Vietnam, maybe the early versions weren't quite this potent. The napalm doesn't just simply burn. It burns at an extremely high temperature, like thousands of degrees. And so what happens is where napalm is dropped, it sucks all of the oxygen out of that region. So even if you were not burned by the flames and the temperatures, you suffocated from the absence of oxygen. And in firebombing tactics in tokyo
Starting point is 00:30:06 i don't know if you've ever done this this is uh i've done this i used to do it when i still put as many candles in a birthday cake as my age which of course i've stopped but if you put candles around the rim of a cake all right you know get at least 10, but ideally like 20, just do that and then ignite them and just watch what happens. All candles point directly radially into the center of the cake. Are you serious? Because what you've heated the air, not only outside the cake, but inside the middle of the cake, but there's more heat in the inside of the cake than outside, right? Because they're all sort of focused there. And so if you heat the air there, that air rises. Other air comes in from outside to replace it,
Starting point is 00:30:50 pushing the candle flames inward. So while they were firebombing these cities, you don't have to ignite every square inch. You just have to make a circle. And then the air will come in and push the flames and completely incinerate everything in the center of the circle. Is that what they call a firestorm or a conflagration? It is that physical effect?
Starting point is 00:31:11 Those are elements of it, yes. But what I'm describing is you're using the forces of physics and chemistry to reduce how many actual bombs you would have to drop. And these are people who think this stuff up. reduce how many actual bombs you would have to drop. And these are people who think this stuff up. Neil, I'm very grateful that you didn't use your formidable gifts for ill. If you decide to become a bad guy,
Starting point is 00:31:33 this world will be over. No. It's about time we had a black supervillain. I'm going for Neil deGrasse Tyson. That's true, a black supervillain. It doesn't happen. One thing, again, just to end it on a completely down note, I have more than once, and it's never received well.
Starting point is 00:31:53 I don't know why. Malcolm, maybe you have the answer to this before we go to our third segment. But people talk about sort of violence in the cities or violence during wars or terrorist attacks. And there doesn't appear to be bandwidth to hear the following fact, that from 1939 to 1945, so the duration of the Second World War, 1,000 people per hour were killed in the name of that war. Oh, my God.
Starting point is 00:32:22 That's terrible. Per hour. And there is nothing today where people are dying at that rate. And so much lower things get people's, you know, astonishment. And I'm just thinking, you know, people alive today were alive at a time where that went every hour from 1939 to 1945. And so, yes, the world used to be worse than it is today. I'm just, you know, put that out there.
Starting point is 00:32:50 We got to take a quick break. When we come back, we'll try to land this plane. It might be hard because Malcolm is a fount of enlightened insight. You're the one who just told the birthday cake story. I don't know why you're telling it. Why are you pointing at me? That's supplemental to your story. You just taught a generation of listeners
Starting point is 00:33:11 how to create a firestorm on their birthday cake. I guess I did. You're telling me that. I'm just very observant if I see stuff, that's all. So you could actually put candles adjacent to those candles inward and they would each light their way into the center. I mean, you could do that experiment.
Starting point is 00:33:30 Anyhow, we're going to take a break. When we come back more with the one and only Malcolm Gladwell on StarTalk. Hi, I'm Chris Cohen from Hallward, New Jersey, and I support StarTalk on Patreon. Please enjoy this episode of StarTalk Radio with your and my favorite personal astrophysicist, Neil deGrasse Tyson. Malcolm, I feel like I know you, even though we've only met a couple of times, only because your books are so accessible. It's like you're a friend of mine sitting next to me. And that's a talent that you've honed ever since your days at The New Yorker.
Starting point is 00:34:34 That's a dangerous talent. That's a very dangerous talent, man. Only in the hands of the wrong person. That's the theme. That's the theme of this conversation. That's right. We all turn evil. It's an issue.
Starting point is 00:34:50 So in Revisionist History, like I said, it's one of my favorite podcasts out there. You open your season with a discussion of self-driving cars. And this is a little weird, and I don't know, do you need to be checked out on this? Where you try to get a self-driving car to hit somebody. What did you try? Tell me what happened in this episode. First of all, let me just say bravo. Well, for some reason I stumbled across this paper by a guy who said he was like an urban planner, and he writes this brilliant paper where he says,
Starting point is 00:35:26 now, wait a minute, if all the cars on the road are self-driving, then that changes the psychological dynamic between cars and pedestrians. The reason you don't jaywalk now, or we do jaywalk, but we jaywalk pretty sparingly, right? The reason everyone doesn't jaywalk all the time is that we're really worried that someone who's driving a car is either crazy and will hit us or distracted and will hit us. We don't jaywalk all the time because we know that the human drivers of cars are imperfect. Self-driving cars, autonomous vehicles, they're not absolutely perfect, but they're really close, right? So this guy says, well, if the fear of an imperfect driver
Starting point is 00:36:11 is removed, what do you do if you're a pedestrian? You jaywalk all the time. You can walk out into the middle of a freeway. You absolutely, 100%. And they will all stop. They will all stop. So I thought this was such a hilarious observation. So you need to program in some psychopathy. You need some psychopathy. You guys, wait a second, you guys are absolutely right. Wait, you've jumped ahead of me. We're getting there. Yes, yes, yes. The answer is yes, yes, yes. So right now, so I go to Phoenix where Waymo operates,
Starting point is 00:36:50 Google's Waymo. And that is the only place in the country where you can hail, like an Uber, a self-driving car. It'll come pick you up and drive you around. So we got ourselves a Waymo
Starting point is 00:36:59 and we drove around in it for a while. And it is clear, the driver, the AI that's driving Waymos, is the nicest, kindest, most patient, most long-suffering driver in the history of drivers. This is a driver who never gets angry, who will never flip you the bird,
Starting point is 00:37:16 who will never lean on the horn, who will never speed up when he should be slowing down. It's a perfect driver. So then I said, oh, let me just... Well, they actually slow down when they're approaching a yellow transition between a green and a red. Never run. They'll never
Starting point is 00:37:30 do anything. They're perfect. So I said, oh, this means that I can do whatever I want. So I got out of the car. We were in a parking lot of an Alamo Drafthouse in Chandler, Arizona. And I got out and I started to mess with the Waymo. And I'm not going
Starting point is 00:37:46 to, I'm not going to give it away. But I was like running next to the Waymo, and I was trying to see, if I behaved like an idiot, what would the Waymo do? Right now I know what a human driver would do if I did that: hit me. But what would happen here, you know. So, it was this great reminder, first of all, that people who come up with new technologies don't always think through all of the implications.
Starting point is 00:38:17 No one thought when they came up with the idea for autonomous vehicles that what they were really doing was liberating pedestrians to do whatever the hell they want. That wasn't on the table. That was not on their mind. That was not on their mind. That was not there.
Starting point is 00:38:30 And I realized, you know, if you make every car in Manhattan self-driving, you realize that no car will be able to make it down the street. That's true. It would be impossible. My track club meets on the Lower East Side and we're competing for space on one little lousy little track
Starting point is 00:38:49 with all of the walkers and whatever. If every car in the road was self-driving, we would get on the FDR drive and do our workouts at rush hour on the FDR drive and every car would stop and wait until we were finished. So it's like I found this idea so fantastic. It's amazing.
Starting point is 00:39:11 It means the really, really, really cool thing and the good thing is it means you'll be able to ride your bicycle through any American city without fear. That means you will always ride your bicycle, right? The only thing that's preventing us
Starting point is 00:39:27 from riding our bicycles everywhere is that we're legitimately scared of getting hit and killed. Now, I've removed that possibility. What will you do? Okay, so just to be clear, okay? You know, when, as they say, when they invented American football and people were cracking their heads open, this is a joke from Jerry Seinfeld, right?
Starting point is 00:39:49 Not a joke. It's a comedic observation of reality. Cracking their heads open. So rather than say maybe we shouldn't play this sport, they said let's still play this sport, but now wear a helmet, okay? So maybe you just have to add extra rules that constrain pedestrians so that the traffic can continue. And that's a solvable problem, right? Yeah. I thought you were going to say what you guys said before,
Starting point is 00:40:14 which is maybe the response is you program the AI into being crazy. Just a little bit crazy. So you don't know. Just a little bit. You have 90... That could be the one that doesn't stop at the red light. Yes. 95% certainty they're not going to hit you.
Starting point is 00:40:30 But the minute there's 5%, then you're like, you're back. Everybody's thinking now. Everybody's reconsidering. Malcolm, let me ask you something I don't have an answer to, and I've thought deeply about it, and you're exactly the kind of guy to think about it and write about it. So right now we lose at least 30,000 people a year to traffic deaths. That includes pedestrians, I think, 35,000 a year.
Starting point is 00:40:51 So that's 100 a day. And it's been that way for decades. You introduce self-driving cars, and that number drops to near zero. But initially it won't be near zero. It might be thousands of deaths. And these are deaths from errors in the software, where the pattern recognition software thinks it's a clear road, but there's a truck in the way. That's actually happened. It has actually happened. So how do you convince people that 2,000 deaths are better than 35,000 deaths if those 2,000 deaths are
Starting point is 00:41:28 from the errors of a machine built by somebody in Detroit? You're right. It requires some persuasion, but if it's framed properly, I don't think it's a difficult proposition for people. I mean, I am 100% in favor of the autonomous driving revolution, even as I recognize it creates
Starting point is 00:41:52 an interesting world where you can't actually drive a car in a city anymore, which I'm fine with. Get rid of them as far as I'm concerned. But I do, and also,
Starting point is 00:42:02 I think the transition, you said eventually we'll get to zero. I think we'll get to zero really quickly because I think these AI systems learn really quickly. Real fast. They're fast. You know what has essentially gone to zero?
Starting point is 00:42:15 Our airplane deaths. Yes. It's essentially zero. They're essentially zero. And I was talking to the engineers at Google who created Waymo, and if you ask them, they'll say, just in the last two years, Waymo has gotten so much better.
Starting point is 00:42:31 The experience of sitting in the back of a Waymo in Arizona is you can perceive the difference between now and two years ago. Oh, wow. The AI is driving it a lot more smoothly. There are fewer situations where it seems to be confused. I was stunned riding in the back of this Waymo. It's amazing.
Starting point is 00:42:59 I mean, it's... And I think that's the other thing, that once people experience what it feels like to be in these vehicles, what you quickly realize is how much better it is than you are. Yeah. And you're much less likely, I think, to be concerned about the occasional mistake. Plus, they don't consume ethanol.
Starting point is 00:43:15 Right. Yes. Yeah, right. That's right. Exactly. But the other thing— The active ingredient in your highball drink. The other thing is the insurance companies benefit from this so much
Starting point is 00:43:26 that what they'll be able to do is put together a fund where you pay people a lot more if they are harmed in an accident or if they're killed. Waymo won't run you down. Like, there was the initial accident that happened. Remember, there was a fatality involving an autonomous vehicle in Arizona, in Phoenix, I think two or three years ago. And it was because a woman who was jaywalking crossed, and she had a bicycle.
Starting point is 00:43:56 And what happened is the autonomous vehicle approached, and the AI had a category for bicycle and a category for human jaywalker, but no category for human and bicycle. And it was confused. It had never seen this before, and it didn't know what to do. And it was going back and forth between, is it a bicycle, is it a human, is it a bicycle, is it a human. And while it was undergoing this... So that's artificial idiocy, because any human would know the difference. Yeah. It hits the woman. Now the point is, that only happens once, right? If your system is set up properly, the next time you have a category
Starting point is 00:44:32 for that. And that's what's going on now. It's like building codes. Every code in there is there because something happened, because people died in collapses and fires. Now I'm wondering why there's a warning to not put my cat in the microwave.
Starting point is 00:44:48 Somebody did it. All right, so I agree. So this would rapidly converge on something being completely safe. So what about the case, you know, there are these trolley problems, right, in sort of moral philosophy. So if it goes left, it kills one person. If it goes right, it kills two people.
Starting point is 00:45:09 But it will have to kill somebody, so it goes left and kills one person. Is this the kind of decisions it's going to have to make? Or is it going to be so good it never has to have a trolley problem? Well, it's interesting. The trolley problem
Starting point is 00:45:24 assumes away the possibility that the trolley can just stop, right? So the third strategy is you avoid having to make the decision at all by sacrificing the efficiency of the journey. And these are called, in AI parlance, in autonomous driving parlance, corner cases. A corner case is this difficult, tricky-to-decipher kind of case. And my understanding is that in most of these corner cases, what the car does is it just stops. Got it. Okay. And I think... It's always safer to just stop. It will do what we're reluctant to do as human drivers,
Starting point is 00:46:07 which is to compromise the efficiency of the journey. The AI will always do that. That's one of its great advantages, by the way. It's never in a hurry, right? I would imagine that if somebody figured out what percentage of fatal accidents happen because someone is in a hurry, it would be an enormous number.
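The corner-case fallback described above, when the perception system can't settle on a confident hypothesis, sacrifice the efficiency of the journey and just stop, can be sketched as a tiny decision rule. This is a toy illustration of the idea only; the names and thresholds are invented and bear no resemblance to any real planner:

```python
# Toy sketch of a corner-case fallback: if no hypothesis about an object
# ahead is confident enough, the safe default is to stop. All names and
# thresholds here are hypothetical.
CONFIDENCE_THRESHOLD = 0.9

def plan_action(detections):
    """detections: list of (label, confidence) pairs for objects ahead."""
    if not detections:
        return "proceed"          # clear road
    for label, confidence in detections:
        # The "corner case": the classifier keeps flip-flopping, so no
        # single hypothesis is confident. Compromise efficiency and stop.
        if confidence < CONFIDENCE_THRESHOLD:
            return "stop"
    return "proceed"

# The scene Malcolm describes: is it a bicycle? a pedestrian? both?
print(plan_action([("bicycle", 0.5), ("pedestrian", 0.45)]))  # stop
print(plan_action([("pedestrian", 0.97)]))                    # proceed
```

The design point is that stopping is a cheap, always-available action for a machine that is never in a hurry, which is exactly why it makes a good default.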
Starting point is 00:46:26 An enormous number. 110% of them. Yeah, that's right. But if you have a car... In fact, that's why they change the warnings on signs. They say, traffic in, you know, 20-minute delay, plan for being late. Yeah. Because once you plan for it, then you don't have to rush anymore
Starting point is 00:46:41 because now everyone expects you to be late. I mean, it's a psychological dimension of the helpful sign information on the freeways. Yeah. But so let me ask you, Malcolm, humans are programming these AI and then they sort of continue to program themselves, perhaps. So what about the possibility of bias introduced? And let me just give a fast example. The possibility of bias introduced. And let me just give a fast example.
Starting point is 00:47:09 You surely read the news articles in the last year or so where they had racist sinks at airports. Okay. Because... So you go to the sink, and I always wondered, I put my hand there, and I said, well, I guess it doesn't work. And then a white person comes behind me, and then the water dispenses once they put their hand under it. So it's checking the reflectivity of the skin.
Starting point is 00:47:28 If I have dark skin, the signal does not get back to the sensor. So it thinks nobody's there, right? So I always thought it was just... Sorry, black man. That's right. You're going to have to have dirty hands. Dirty hands.
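What Neil is describing is a reflective proximity sensor with a fixed trigger threshold: the faucet emits infrared light and dispenses only when enough of it bounces back. A minimal sketch of the failure mode, with made-up numbers (real sensors and calibrations differ):

```python
# Toy model of an IR faucet sensor. The threshold and reflectivity values
# are invented for illustration; actual hardware behaves differently.
EMITTED_IR = 100.0          # arbitrary units of emitted infrared light
TRIGGER_THRESHOLD = 40.0    # reflected signal required to dispense water

def dispenses(skin_reflectivity):
    """skin_reflectivity: fraction of emitted IR reflected back to the sensor."""
    reflected = EMITTED_IR * skin_reflectivity
    return reflected >= TRIGGER_THRESHOLD

print(dispenses(0.6))   # lighter skin reflects more IR -> True
print(dispenses(0.3))   # darker skin reflects less IR  -> False

# The fix isn't exotic: calibrate the threshold against the full range of
# skin tones the sensor will actually encounter, not just the lightest.
```

If the threshold is only ever tuned on high-reflectivity hands, everyone below it is simply invisible to the sensor, which is exactly the experience described here.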
Starting point is 00:47:42 So this made the news, all right? Yeah. So I'm just wondering if I'm a black man crossing the street at night, is the AI going to know I'm there or is it going to think, clear street ahead,
Starting point is 00:47:54 keep going. Okay, now let me ask this. I'm a dead black man in the street. Here's the caveat to that, Neil. What? Are you Malcolm Gladwell black? Are you Neil deGrasse Tyson black?
Starting point is 00:48:04 Or are you Miles Davis black? Guess what? One of y'all is dying. You know, this is real. I always tell my girlfriend, who is a lot darker than I am, she's black. I always tell her, if you're out at night in Manhattan, I was like, you can't wear black. I'm sorry, you cannot wear black.
Starting point is 00:48:23 Don't. Are you nuts? Like, wear a bright color, for God's sake. She just looks at me like I'm a madman. But, you know, anyone who's driven at night knows, if someone, now this is not just black people, but when
Starting point is 00:48:39 a person dressed all in black crosses a street that's badly lit at night, it's a problem. It's a problem. Correct. Why do people, my fellow runners, we would run at night on a winter's night on the streets,
Starting point is 00:48:56 and they would be wearing all black outfits. I'm like, you're crazy. What are you doing? I always wear a yellow jacket for a reason. I don't want to get killed by some car that can't see me. Right, right. So these are humans making that mistake, but when your machine makes the mistake,
Starting point is 00:49:11 or you don't know, as a programmer, to put in some test, or if you test it only on white people and think that it's good for all humans, then this is a bias, even if the people themselves are not racist in any way.
Starting point is 00:49:28 It was sort of an unintended consequence of only thinking that white people are who's... You know what's a great example of this, Neil? I was listening to something on these AI systems, and they were developing an AI system for dermatologists to diagnose,
Starting point is 00:49:44 to figure out whether something on your skin was cancerous or not. And they went to all this work, and they thought it worked really well. And then they realized that what the AI was doing was looking for the presence of a ruler. Because all of the images it was using to learn from were textbook images, where there was a little ruler next to the spot to measure how long it was. There's a ruler, it's cancerous,
Starting point is 00:50:06 clearly. But no, but I do think that's a transition problem, right? That's the thing. Totally, it is 100% true that AI systems, when you're starting out, reflect the biases of the people who are programming them.
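The ruler story is a classic spurious-correlation failure: the model latches onto whichever feature most cheaply predicts the label in its training set. A tiny self-contained demonstration of the mechanism, using fabricated toy data, not the actual dermatology study:

```python
# Toy demo of shortcut learning: in the "textbook" training set, the label
# correlates perfectly with an artifact (a ruler in the frame), so a naive
# learner keys on the ruler instead of the lesion. All data is invented.
def train(examples):
    """examples: list of (has_ruler, is_malignant). Learns P(malignant | ruler)."""
    counts = {True: [0, 0], False: [0, 0]}   # has_ruler -> [malignant, total]
    for has_ruler, malignant in examples:
        counts[has_ruler][0] += int(malignant)
        counts[has_ruler][1] += 1
    return lambda has_ruler: counts[has_ruler][0] / max(counts[has_ruler][1], 1) > 0.5

# Training set: every malignant image happens to include a measuring ruler.
textbook = [(True, True)] * 50 + [(False, False)] * 50
model = train(textbook)

# In the clinic, a malignant lesion photographed WITHOUT a ruler is missed.
print(model(True))    # True: "ruler present, must be cancer"
print(model(False))   # False: the shortcut fails once the artifact is gone
```

The transition-problem point follows directly: the shortcut only survives until the training data stops carrying the artifact, which is why auditing what the model actually conditions on matters more than its headline accuracy.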
Starting point is 00:50:21 Okay, we've got to kill a few black people first and then we're on cue. But you don't, but you know, over time, I think it's reasonable to assume, by the way, here I am on a podcast with Neil Tyson, and I'm the one defending science. You're the one. What is going on here?
Starting point is 00:50:41 I'm like, guys, no, scientists know what we're doing. And Neil's like, these guys are crazy. I can't wash my hands in the airport. It's Chuck. If I spend too much time around Chuck, I become an angry black man. It is true. It is true.
Starting point is 00:50:54 I will do that to you. You have, this is, yeah, it's nuts. So I want to end, we got to, again, land this plane. Could you retell the story that I first heard you tell us on stage? What happened when you first started wearing the afro? Oh, I grew my hair out. Yeah, you grew your hair out, and you got that sort of Art Garfunkel kind of afro there.
Starting point is 00:51:19 Tell me, could you please retell that story? Well, I mean, I always feel sheepish telling this story, because... But you're in a safe space here. Go on. I'm going to say, so my hair reflects the fashion of the times. So when every black person's wearing their hair real short, I was wearing my hair real short. And then there was a moment, you know, when everyone started to grow their hair long again. So my hair can grow, can be quite considerable. So about 15 years ago or so, I grew my hair out a lot and discovered, of course,
Starting point is 00:51:58 that I was getting stopped left and right by cops. On the most trumped up of, like, the most absurd, you know, I wasn't even speeding half the time. I'm sorry, sir, I'm going to have to ask you to step out of the car. Sir, I'm in the back of a cab. No! Excuse me, officer,
Starting point is 00:52:18 but I think you're talking to the wrong guy. Afro on a Sunday morning. It's in the rule book. And also after 9-11 in airport security lines, I was getting pulled out. And I realized that part of the informal algorithm used by cops and security officers was, you know,
Starting point is 00:52:41 big afro codes for something questionable, right? I don't think it was a kind of active malignant racism. It was just like, there are things... And lazy, actually. It's laziness. There are certain things that trigger. So you're a cop and someone goes past you
Starting point is 00:52:57 at 55 miles an hour. Okay, maybe I was going a little bit faster. But, you know, it's not that you're focused on the individual. You're picking up a few key things. Is the person driving a sports car? Is the car shiny and new or old and ratty? In my case, they see me drive by and they see a big afro, and they're like, oh, all right, let's go get that dude. I think that's what was going on. But it was just like, the last time I'd had a big afro was before I had my driver's license, so I'd never been through
Starting point is 00:53:24 this very American rite of passage, which is if you resemble something even remotely black, you get pulled over a lot. So I just was introduced to driving while black. That's okay. My first... Welcome to the club. Welcome to the club.
Starting point is 00:53:36 I was like... Well, I have always been able to alleviate that problem personally because I drive with a driver's cap on all the time. Oh, that's a good butler. It just makes me look like I'm somebody's chauffeur. You're driving Miss Daisy. That's who you are. That is quite a defense.
Starting point is 00:53:59 I need one of those hats on my dashboard. That's like a throwback. It's a throwback defense. Right. Yeah, take it back to the day. You're like a porter in the airport. Guys, we got to call it quits there. Malcolm, it's always a delight to chat with you.
Starting point is 00:54:13 We love your perspectives on the world, and your capacity to share those perspectives is unmatched in all of your media now, with podcasts and books, even books with long titles. Good luck on that book. Thanks for bringing that story to the front. I think the world needs it, and anyone thinking of the future of war needs it. So, Chuck, always good to have you as co-host. Always a pleasure.
Starting point is 00:54:42 So, Malcolm, I want to put, if it's possible, you know how some people have a standing table at a restaurant, you know, once a month. I want a standing invitation with you for every next project you do, you come on the show. Oh, I would love that. Thank you. This is so much fun.
Starting point is 00:54:57 In that case, we'll be seeing you once a week. Once a week. Geez. Be careful what you wish for. I know. I know. Oh my gosh. All right, we're going to call it quits there. This has been StarTalk, the Malcolm Gladwell edition of StarTalk. Neil deGrasse Tyson here, your personal astrophysicist. Bye.
