3 Takeaways - The Science of Failure – Right Kind of Wrong with Harvard Business School’s Amy Edmondson (#161)

Episode Date: September 5, 2023

Failure will happen. Count on it. Especially in today's complex, uncertain world. Here, Harvard professor Amy Edmondson explains how we can transform our relationship with failure, how people and organizations can fail wisely, and how we can use failure as powerful fuel for success. You may never look at failure the same way again.

Transcript
Welcome to the Three Takeaways podcast, which features short, memorable conversations with the world's best thinkers, business leaders, writers, politicians, scientists, and other newsmakers. Each episode ends with the three key takeaways that person has learned over their lives and their careers. And now your host and board member of schools at Harvard, Princeton, and Columbia, Lynn Thoman. Hi, everyone. It's Lynn Thoman. Welcome to another Three Takeaways episode. The world is growing ever more complex and changing ever more quickly. From climate change to the economy, from work to parenting, life is fraught with complicated
decisions, and navigating change and failing well is critical. Failures are an unavoidable part of progress, which is why they're so important for both our personal lives as well as for the vital institutions that shape society. That's why I'm so excited to be joined by Amy Edmondson and to learn about the science of failing well. Amy is a Harvard Business School professor who studies teams and failing well.
Her new book, which is wonderful, is Right Kind of Wrong, The Science of Failing Well. Welcome, Amy, and thanks so much for joining Three Takeaways today. Lynn, thank you so much for having me. Amy, it is really my pleasure. So a team of researchers at NASA ran an experiment to test the effects of pilot fatigue. Can you tell us about that experiment, why it's an example of the science of failing well and how it actually changed air travel? This is an experiment that took place 30 years ago. And indeed, it did change air travel. So let's back up. So the question, simple question, if you think about it, was, do fatigued pilots make more errors in the cockpit?
They decided to have some pilots who had just flown three days of their normal travel be one condition, and then freshly rested pilots would be in the other condition. And so two groups of pilots, one that had just finished flying for a few days, the other freshly rested, went into the simulator. And guess what? The fatigued pilots indeed made more mistakes. So that wasn't surprising. But here's what was surprising.
The fatigued pilots happened to be with the teams that they had just been flying with. And the teams, as teams, made fewer consequential errors in the simulators than the well-rested pilot teams who had not flown together. So this was a surprise. And the big takeaway was teams can catch and correct each other's errors when they're used to working together. In other words, a good team will help compensate for human error. So that led to the revolution. They didn't just come up with better technology, but the revolution was we have to train cockpit
crew members to operate well as teams. So interesting. Why is psychological safety important for failing well? And what role did that play in the pilot example? Well, it's at the very heart and soul of the pilot example, although they weren't using that term or calling it that. But essentially, the good teams, whose members still made human errors, and even slightly more errors than the well-rested teams, were able to catch and correct each other's errors. That meant they had no problem calling each other out when they saw something wrong. Now, keep in mind that the cockpit is a very hierarchical environment. There's a pilot and a co-pilot. And the hierarchical relationship, which is reflective of the military training that nearly all of them have,
Starting point is 00:03:51 is very well entrenched in the human psyche and even more well entrenched in the aviation community. So what we have here is people willing to speak up and say to the boss, hey, you're doing that wrong, or hey, we're about to make a mistake here. And that's not easy for human beings to do, as all your listeners already know. But what I call an environment of psychological safety makes it possible. So an environment of psychological safety is one in which it's not easy, but you understand it to be acceptable, even welcome, that you speak up with concerns, with dissenting views, with requests for help. And when you can do all of those things, you are better able to learn as a team. And of course, you're better
Starting point is 00:04:38 able to perform as a team as well. Amy, can you give another example of failing well and what happens when there isn't what you call psychological safety? The classic example of failing well would be an inventor in her lab or a scientist in her lab. These are environments where people know they're on the leading edge. They know they can't just look up the answer on the internet. They experiment for a living. But they don't just experiment randomly, right? They experiment thoughtfully. They've done their homework.
Starting point is 00:05:13 They know what we know and what we don't know. And they put some thought into designing just the right experiment to push the field a little farther along. Yet, of course, some healthy portion of the time, they're wrong in their hypothesis. And so they experience not a success, but a failure. So that's kind of the ultimate example of failing well is when you're in brand new territory, you're trying to discover something new. And you might have been right, but alas, you were wrong. Those are good kinds of failures. They bring us slightly closer to new knowledge,
Starting point is 00:05:45 new frontiers. But the whole domain of failing well is larger than just invention and science. It's having the knowledge to avoid the failures that can be avoided, to prevent failures in known territory, including among hospitalized patients. The science of failing well in that context is really about speaking up, which means it's also about psychological safety. It's about creating the environment where everyone knows that the work we do here is fraught with the potential for breakdown. It's fraught with the potential for error. It's fraught with the potential for inadvertent harm to patients. And the only way to reduce that to near zero occurrence is for people to be speaking up all
Starting point is 00:06:33 the time. So the science of failing well covers a lot of territory, but I think the most important way to summarize it is this, prevent the preventable failures that we can. Use the best practices that we know and have for preventing preventable failures and learn to welcome the intelligent ones in new territory with open mind, open heart, open arms. That is great advice. Amy, you have some wonderful quotes in Right Kind of Wrong. And two of my favorites are Wayne Gretzky and from soccer star and Olympic gold medalist Abby Wambach. Can you tell us about those quotes? Abby Wambach says that failure means you're in the game. And she did a 2018 commencement speech at Barnard College in New York. And she told graduates to make failure their fuel, which is a wonderful metaphor. She says failure, and this is the quote, failure is not something to be ashamed of.
Starting point is 00:07:35 It's something to be powered by. Failure is the highest octane fuel your life can run on. That is wonderful. Fuel is a wonderful metaphor for two reasons. Failures are your fuel because the lessons you get from them literally take you to the next step. But they're also your fuel because they have emotional resonance and they make you sit up and want to do better next time. So I think they really do drive ultimate success. Wayne Gretzky, the Canadian ice hockey superstar, said you miss 100% of the shots you don't take. And of course, that points to another really important part of the science of failing well,
Starting point is 00:08:20 which is taking risks. If you only act when you know you're 99% likely to get it right, to get the outcome you want, you're really going to miss out on an awful lot. How do we create a healthy failing culture? I think the most important thing is to make failure discussable in the following sense, to remind yourself, to remind your colleagues, your family, your friends, whoever is the relevant group in your life that we live in a complex, uncertain world. By definition, that means things will go wrong, right? It's just a given. So once you talk about it, it takes so much of the emotional heat away from it.
Starting point is 00:09:06 Talk about it, preview it, know it's a possibility before we even get going. So a healthy failure culture, first and foremost, is one that recognizes the inevitability of some failure. Then it's also one that is open about it. It's that psychological safety again, where people are speaking up early and often about not only the failures that happen, but the concerns. Let's go back to that NASA study where if co-pilots weren't speaking up about the concerns they had, then really bad things can happen.
Starting point is 00:09:41 So a healthy failure culture is really one that recognizes the inevitability of failure, but also is keen on preventing the ones we can prevent. And that often requires speaking up and being thoughtful about the risks and experiments you do run. And then I think finally, it's how do you respond and learn from the failures that do happen? And a healthy failure culture is sort of constantly and coolly looking at what happened. What was my contribution to it? What are some of the things I did? What are some of the things I didn't do that helped create that failure? And this is considered just a kind of normal activity around here. Amy, you've spent years studying complex failures in many industries, in healthcare, aerospace, and other businesses.
Starting point is 00:10:29 What are complex failures? And can you give some examples and explain why they are what you call, quote, the real monsters that loom large in our work, lives, organizations, and societies? Yes. Complex failures are multi-causal failures. They're the kind of failure or breakdown or perfect storm that happens when a number of factors line up in just the wrong way. A small example would be a hospitalized patient gets the wrong dose of a drug and it happened not because of one person making a mistake, but because two medications are marked similarly or there's a brand new nurse on the shift at the same time
Starting point is 00:11:11 and there's not a good lighting in the corner where the drugs are stored, right? Any one of these factors, if you changed it, would lead to the failure not having happened. Again, it's that perfect storm of factors coming together. Much bigger example would be the tragic Boeing 737 MAX accidents that happened when engineers and accountants and competitive forces and technology sort of all came together to lead to the production of a new airplane that left the pilots quite vulnerable to a certain kind of failure. And the results are,
Starting point is 00:11:55 like so many complex failures, enormously tragic. I say they're on the rise, complex failures, because our world is more complex than ever. There's higher levels of uncertainty, but more importantly, there's greater levels of interdependence and interconnectedness. You know, a small internet outage in one part of a company can lead to massive failures in another part. So our interconnection leaves us without buffers that would ordinarily catch and correct certain small problems before they spiral out of control into larger problems. what he called normal accidents, which were the kind of failures, the kinds of massive breakdowns that happened when systems were too interactively complex and had what he called tight coupling, where once a certain failure starts, you can't stop its effects on other parts of the system.
Starting point is 00:12:59 They're too tightly coupled. And I'm not talking about normal accidents when I talk about complex failures, but a complex failure definitely has something not talking about normal accidents when I talk about complex failures, but a complex failure definitely has something in common with normal accidents. It has that interactive complexity, but you don't need to have tight coupling. In fact, for me, one of the most important things about complex failures is that many of them can be prevented. All you need to do is be alert and aware that things are going wrong. And then oftentimes all you have to do is be alert and aware that things are going wrong. And then oftentimes, all you have to do is remove one of the causes and then the rest will be okay. Charles Perrault, in his groundbreaking book that you mentioned, argued that certain systems
Starting point is 00:13:37 are essentially accidents waiting to happen and that accidents are unavoidable. He believed that it was simply a matter of time before such systems failed. Back when he wrote it in the 1980s, there were only a few industries like nuclear that fell into what he considered the danger zone. But now you believe many more organizations of all kinds have moved into that danger zone. Can you talk more about that? The danger zone is where there's high risk of breakdown. And that happens when there's a lot of complexity, a lot of interactive complexity. That just describes our world today. There's very few businesses anymore that aren't deeply digital, aren't vulnerable to their IT systems functioning
Starting point is 00:14:25 as expected. But when you have a breakdown, terrible failures can easily happen. So it's our digitally enabled world that has made so many things work well, but has also left us far more vulnerable to those kinds of breakdowns. Now, when Perot was arguing that certain systems, those with high interactive complexity and tight coupling are simply too dangerous to operate safely. So we shouldn't have things like nuclear power or we shouldn't have things like aircraft carriers. There was only one small problem with that argument, which was that most of the time, really 99.99% of the time, such systems were in fact operating safely, error-free, or at least failure-free, not error-free, in that certainly with human beings around, there will be errors. But so long as those errors can be caught and corrected before they spiral out of
Starting point is 00:15:19 control, you can have failure-free operations. So Perot's great insight gave rise to another stream of research called high reliability organizations that took on that question. And it took on the question of how is it that these incredibly high risk, dangerous kinds of activities, just think of air traffic control, for example, do in fact operate safely nearly all the time. And their answer was rather than a technical, it's interesting because Perot's insights were quite technical and yet he was a sociologist. And many of the HRO researchers really were engineers by training, but their insights end up being quite psychological and sociological. Their insights
Starting point is 00:16:05 end up being about human beings and culture, sort of organizational cultures, the kind that you need to operate safely despite being in a danger zone. So these sort of engineers by training gave us so much insight into actually the human behaviors and the cultures that we need to do the very best, you know, to be alert and vigilant and to not let hierarchy get in the way of sharing important information and many other insights like that. Interesting. Amy, before I ask for the three takeaways you'd like to leave the audience with today, is there anything else you'd like to mention that you haven't already touched upon? What should I have asked you that I have not? I want to emphasize that this can be fun.
Starting point is 00:16:55 I think once you say the word failure, it doesn't sound like we're going to have any fun at all. But I think there's a lot of fun to sort of mastering these concepts and getting excited about preventing preventable failures, but also getting excited about taking more risks to experience more failures in your own life, in your work, and the good kind, right? The kinds of failures from which new knowledge comes, new adventure comes. And ultimately, I think even joy is found there. When we stretch and try, sometimes it works and sometimes it doesn't. It's all the sweeter when it does. Yes, I agree with you. It's so important. Anybody who's a perfectionist, whether a person
Starting point is 00:17:38 or an organization, they can't progress. So what are the three takeaways you'd like to leave the audience with today? Well, number one, maybe this is the gloomy one, but it really isn't, I promise. You are a fallible human being. We all are, but that's just a given. So let's get over ourselves, right? I'm a fallible human being, FHB, start there. Number two, that means we absolutely have to be willing to forgive ourselves, but as importantly, others for the mistakes we make and the missteps we make. We're just that spirit of forgiveness is absolutely key to thriving as a fallible human being. And finally, get curious. I think there's much more joy and much more adventure and yes, more failure if you can reinvigorate your own spirit of curiosity and use it to drive you forward. That's the real fuel, I think, in the science of failing well. Amy, this has been
Starting point is 00:18:41 wonderful. Thank you so much. I really enjoyed your book, Right Kind of Wrong. Lynn, I'm so grateful for the chance to talk to you about it. It is a pleasure. If you enjoyed today's episode and would like to receive the show notes or get new fresh weekly episodes, be sure to sign up for our newsletter at 3takeaways.com or follow us on Instagram, Twitter, and Facebook. Note that 3takeaways.com is with the number three. Three is not spelled out. See you soon at 3takeaways.com.
