Motley Fool Money - Harvard Business Prof on Failing Well
Episode Date: February 18, 2024
Why would Eli Lilly put on a failure party? Deidre Woollard talked about the art of failing with Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School and author of “Right Kind of Wrong”. They discuss: - The complex failure at Boeing. - What to do after something goes wrong. - The problem with “move fast and break things.” Companies discussed: BA, LLY Host: Deidre Woollard Guest: Amy Edmondson Producer: Ricky Mulvey Engineers: Dan Boyd, Chace Przylepa Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
This episode is brought to you by Indeed.
Stop waiting around for the perfect candidate.
Instead, use Indeed sponsored jobs to find the right people with the right skills fast.
It's a simple way to make sure your listing is the first thing candidates see.
According to Indeed data, sponsored jobs have four times more applicants than non-sponsored jobs.
So go build your dream team today with Indeed.
Get a $75 sponsored job credit at Indeed.com/podcast.
Terms and conditions apply.
No, no, no.
No, because that's too scattershot, right?
I mean, I think fail fast, fail often can imply that it means just try everything and eventually something will work.
No, it's much more of an iterative process.
It's a thoughtful process.
It's like you try something that, honestly, you believe it might work or why waste your time?
I'm Ricky Mulvey, and that's Amy Edmondson, the Novartis Professor of Leadership and Management at the Harvard Business School.
She's the author of Right Kind of Wrong: The Science of Failing Well. My colleague Deidre Woollard caught up with Edmondson to talk about the complex
failure at Boeing, what companies and you can do after something goes wrong, and how investors
can apply a smart failure strategy.
I love this book. I was expecting to feel a little better, and I think I do feel a little
better about some of my own failures. And I want to get right into that because when we fail,
we just, we want to cover it up, right? And I've noticed as I've studied businesses, and I sort of, I have
a little love of kind of business failures, nearly every big dramatic one I've ever seen
comes from someone noticing something and making the decision to not expose it. So how do we get
better at failure, especially in the business world? Wow. Well, music to my ears, because that is
something I have noticed as well. Most of the big failures that I have studied can be traced
back to somebody, usually someone who is genuinely expert at a relevant aspect,
of the phenomenon, feeling unable to voice a concern at a crucial time, or in some cases,
voicing a concern, but really just not being heard for various reasons. And so those are at least
theoretically, practically preventable when we're at our best. And when I say we, I mean,
we as human beings, but more importantly, we in terms of the organizations that we create and
lead. So I think the goal is to lead an organization where no one ever believes that they're not
supposed to speak up, you know, with a relevant concern. Because catching something in time is worth untold economic and sometimes human safety value. So that's got to be the idea. So then you said,
you know, how do we do better? How do we do better at failing well? And I realize,
I love the subtitle, but every time I hear it said aloud, it sounds like feeling well.
It's like a mini failure. It's not feeling well. It's failing well. But how do we do? Well, I think
we really do have to do something that's quite hard, which is to create organizations where people are
honest and agile and straightforward and, you know, willing to tell the truth in a timely way,
even in the face of real doubt and uncertainty.
Well, and part of that is this idea of what you call intelligent failure,
and that intelligent failure is actually what leads to success.
So I love this part in the book where you talk about Eli Lilly.
They're hosting failure parties,
and they want different teams to learn from small failures.
So getting comfortable with that idea of these little failures.
So how do companies kind of bring that failure out?
You're absolutely right.
Let's go right to intelligent failures because the art or the science of failing well means you should be engaging in intelligent failures.
So I'll clarify what that means in just a moment. And you should be doing your part individually and organizationally to prevent basic and complex failures as often as possible.
So what's an intelligent failure? Well, it's a failure that is in new territory, where we really don't yet have the knowledge we need to ensure success. So, like it or not, we have to
experiment. We have to do something risky, something uncertain. So it's in new territory. It's in pursuit
of a goal, whether that goal is a new drug as at Eli Lilly or a life partner or you name it.
We're trying to make progress on something we care about. And number three, you've done your
homework. You know, you've done enough background work to know what is known, what might work, what's worth trying next. And then finally, the failure is
only intelligent if it's no bigger than it has to be, meaning you didn't waste or use resources
that were larger than necessary to get to the next step, to get the knowledge that we need.
So you said something just a minute or two ago about size, right? That it's, you know,
that we should have these small failures. And in truth,
intelligent failures are generally small failures, but the distinction between intelligent
failures and the others isn't just size. It isn't even largely size. It's more type. You know,
is this new territory? Is it in pursuit of a goal? Have we done our homework? Is it as small as it can
be and still get the knowledge? So at Eli Lilly, you know, one of the failures I write about was the
failure of a clinical trial of a new drug called Alimta. This was some years ago. And, you know,
everything had gone well in the laboratory, in the phase one trials for safety to show the drug
was safe for people. And then this phase two trial is about to show with a large enough sample
of people, no bigger than it has to be, that it works, that it has efficacy. And alas, it didn't
work. It was a cancer drug, and it failed to show the improvement in health that they had hoped. So that's a failure. So have a failure party. Now, why would you have
a failure party? Well, for several reasons. But one is to celebrate the hard work that got us that
far. There was literally no way to prevent that failure. The only way to learn whether it worked
was to do a trial. They did a trial, no bigger than it had to be. It didn't work. Number two,
when you celebrate things like that, other people tend to show up. That helps prevent the
terrible waste of someone else engaging in the same failure a second time. That's never intelligent,
right? So we need to share the knowledge about our failures within the organization so that we can
help our colleagues not replicate them. And number three, it allows people to call something
a failure in a timely way. So there's always this temptation, okay, it's not going well. Everybody
knows it's not going well, but I don't really want to admit it. So I'll just try harder and, maybe, magical thinking, maybe something good will happen. That's never a good
use of, you know, time and resources. So, and then, you know, if you do your homework, you know,
if you do it right and you say, okay, we fail, that's disappointing, let's understand why. And so in
this case, the Alimta story, the physician in charge of the trials looked into the data
closely as you would, and discovered that, in fact, some patients in the trial did very well.
They were very helped by the drug, but others showed no impact. So, well, wait a minute,
why? Like, why did some do so well and others didn't? So let's look into that. And what he discovered was that the patients who didn't do well had a folic acid deficiency. That's a simple
B vitamin. So what that means, you know, not all failures end up with such a happy story, but
what that means is all they had to do was add folic acid to the drug, and then it worked for everybody, right? So then, you know, that ended up becoming a very, very successful drug, both clinically and economically, for the company. So I'm not saying that every time you take the time to analyze a failure you will magically pull a, you know, multi-million dollar product out of a hat, or a billion dollar product out of a hat. But I am saying if you don't take the time to analyze the failure, then you have no hope of gaining that extra
possibility of success. I like what you said there about magical thinking because I think that
plays into failure a lot is that we just think it's just going to get better. We'll just
try harder or, you know, what is the definition of insanity? Doing the same thing over and over and expecting a different result. It's kind of magical thinking. So you talked about the intelligent
failure. Let's go to the other side, the basic failure. So the basic failure, I mean, you describe it
in the book. It can be anything small. I mean, we have tons of basic failures every day, but sometimes
those basic failures can lead to something really tragic. So when you're doing the same thing over and over,
how do you avoid some of the basic failures? Yeah, so you try not to do the same thing over and over.
So a basic failure is a failure caused by a single cause, usually human error. You know, I forgot to
plug my cell phone in and then the battery died and then I missed the meeting. It's a basic failure,
right? And it's completely my fault. I mean, not fault in a bad way, but just I was distracted and I
forgot to plug it in or worse, you know, a much more sort of blameworthy basic failure as I was
texting while driving and I got into a car accident. That's not me, I didn't do that. But I mean, that happens, right? That happens. And the cause of that failure, that car accident, couldn't be more simple. It couldn't be more preventable. I mean, I'm not saying, again,
I'm not saying it's blameworthy per se. We always want to understand the whole story before we
start assigning blame. But it's not the kind of failure we celebrate or have a party for. It's the
kind of failure we do our very best to avoid. We recognize that people make mistakes, but we want to,
we want to find ways, especially in organizations, we want to find ways to make it easier for people
to do the right thing and harder to do the wrong thing.
Well, that's interesting because you talked about not doing the same thing over and over,
but then in some situations you have to.
Oh, yeah.
No, I mean, checklists and things like that.
Yes, yes, yes.
No, I mean, I think we should do the same thing over and over when it works to get the
result we want, but we don't want to do the same thing over and over that is clearly not
working for us.
Yeah.
But yes, checklists are a marvelous tool for helping us do things that we,
really truly do need to be done the right way, whether that's an airplane taking off or a cake
recipe. You want to use the protocol. You want to use the checklist. Yeah, absolutely. And part of your
book, you talk so much about psychological safety, which is allowing mistakes to happen, even these
basic failures where we should know better and to be addressed. And you've got some examples in the
book about how businesses can build that psychological safety in. It seems like it's an easy
fix, but it's one that I don't see a lot of companies doing. Yeah, so first I'll define psychological
safety as a belief that the environment is safe for taking interpersonal risks like speaking up
with a mistake. Right. So that's one. Another interpersonal risk would be asking for help
when you're in over your head. And that too can help avoid many a failure. Or, um,
pointing out that your colleague is about to do something dangerous, right? That is, that can feel very
hard to do, both at work and at home. And yet it's really crucial to feel that that's okay
around here. That's what we do around here because we care about each other, about the product,
what have you. So psychological safety defines this sort of environment where you just believe
that you won't be rejected, humiliated, or punished for speaking up. Now, how do you create it, particularly with respect to failure, so that you are invited and feel invited to be a failure
preventer and an intelligent failure producer. So I'll be a basic failure preventer and I'll be
an intelligent failure producer, but I need to have psychological safety. So one of the ways that
I think good organizations try to create psychological safety, well, first and most importantly,
is by calling attention to uncertainty,
calling attention to the nature of the work,
calling attention to the reality of human error
or the reality of system complexity.
So you say things,
if you're a leader of a team or an organization,
you say things like,
you know, we've never done a project like this before.
Things are definitely going to go wrong on the way to,
I hope, our magnificent success.
We need to hear from you.
So you're issuing that invitation that is logical and rational, to say this is the kind of project or this is the kind of organization where
you are expected to speak up because of what's at stake. Then you put policies in place, like one
that I write about is called blame-free reporting, which is not the same as blame-free action.
It doesn't say, oh, anything goes, do whatever you want. You'll never be blamed for it.
No, it says when you report something that's out of whack or that you don't think is right or that you don't understand, you'll never be blamed for the act of reporting.
The act of reporting is always valued around here.
Now, if we do our research and we get into something that went wrong and we discover that someone showed up for work, you know, drunk, that's a blameworthy act.
And they'll be held to some kind of standard for that.
So there'll be some kind of consequence for that.
But there will never be negative consequences for reporting an error or reporting a problem or reporting a deviation.
Yeah, that's really important, I think, for businesses especially. I mean,
even things you talk about in the book about assembly lines and factories
and being able to stop the line at any time and not, you know, be worried you're costing the company money or something like that.
Right.
And you asked why is it so hard?
And I think it's because many managers equate blame-free reporting or the idea of reporting with a kind of lax or anything-goes environment.
And so they just, they haven't recognized the distinction, you know, between people behaving in a problematic way and people being willing to speak up honestly about what they see.
And those are very different phenomena.
Well, and another phenomenon that has happened with Silicon Valley, this whole idea of, like, fail fast, fail better, just keep failing. And it almost seems like it's sort of the opposite of what you're talking about, which is really sort of analyzing every failure as something to learn from. And it's just like, if you just keep failing, you'll eventually get there. That doesn't always work, does it?
No, no, because that's too scattershot, right? I mean, I think fail fast, fail often can imply that it means just
just try everything and eventually something will work. No, it's much more of an iterative process.
It's a thoughtful process. It's like you try something that, honestly, you believe might work, or why waste your time, whether it's starting a new company or, you know,
trying to invent something or design something. You earnestly believe it could work and here's why.
And guess what? You were wrong. And it's not your fault for being wrong. It's new territory.
Nobody's ever been here before.
Now your job is to figure out why it was wrong quickly and what that implies for what to try next.
So it's kind of a scientific process.
It's a thoughtful process.
So I don't mind the rhetoric, fail fast, fail often, but it needs to be clarified a little.
It needs to be sharpened so that people understand it's not that scattershot process.
It's that very thoughtful learning process.
And the more we try and learn, the smarter we get.
And our next experiment is a little better.
And the one after that's better still.
And it's also not about, okay, we failed just tossing the whole thing out.
It's also about, you know, that part of that after process is about figuring out what can be saved.
Yeah.
It's like you've got this failure.
Like you've invested in it.
You might as well get your money's worth.
Figure out what that failure taught you. What new information do you have that you didn't have before, and how do you put it to work?
You also in the book, you talk about complex failure. I found this part particularly challenging
because people want to figure out, okay, here's the one thing we did, it's wrong, we just don't do that
again. But there's often a lot of things that go into that. So when I'm looking at businesses
from an investor standpoint and I see there's something wrong with the company, I maybe can't
figure it out, I keep looking for that one thing they're doing wrong, but it doesn't go that way, does it? No. And, you know, some failures really are the perfect storm, just the
unpredictable breakdown caused by a handful of factors coming together in just the wrong way,
in an unpredictable way. So, you know, let's say you started a new business in February of 2020,
you know, that, and it was something that involved customer service, you know, in the real world,
right? Your business was very likely to not get off the ground very fast because of the timing, something you could not have seen coming. It was sort of a complex failure related to a global
pandemic, for instance. Other times complex failures are, you know, you own some of the causal
responsibility, but they're still not that simple. Sort of the Boeing 737 MAX failures, both the first two very visible tragic crashes in 2018 and 2019,
and the more recent challenges are the product of a handful of factors, some of them
self-imposed by Boeing's management and board, you know, some of them external, like there was an unexpected announcement of a new product by Airbus that led them to hurry the development of a new plane to compete with this product, right?
So it was, you know, some of it's external, some of it's internal, but they put themselves in a position where they dramatically increased the chances of complex failures, unbeknownst to them, really; at least they weren't thoughtful enough to see it coming.
Yeah, that's a fascinating example, because we tend to just zero in on the bolts, you know, well, they just have to fix the bolts.
It sounds like what you're saying is there's a lot of things they need to fix in order to make
sure that this type of thing doesn't happen again. Right. Because you really have to ask yourself,
you know, it's the why behind the why. Why would the bolts have been
not put in properly? Like what are the conditions that led that to happen? And then you get to
such things as capacity problems in the plant, excess hurry, you know, customers in a rush,
a design that's more complicated than is optimal, and on and on it goes, right? So you have to sort of,
you have to look at each of these factors and then step back and say, how do we design a system
that is less vulnerable to this kind of breakdown? Well, and the other factor there, too,
is that people want to fix the immediate problem without fixing the long-term problem.
So it sounds to me like with Boeing, okay, they're trying to focus on just this one piece of the plane that's having this issue.
But what you're talking about is a much larger issue with the company that could lead to something else happening again.
I think that happens in a lot of companies that they fix the immediate, but they don't necessarily fix the larger issues.
That's right.
They fail to check whether this problem is the problem or whether it's a symptom of a larger problem.
Interesting. Well, you have this line in the book that I found really interesting was about playing
to win versus playing not to lose. And as an investor, I know I'm probably going to lose sometimes.
And the more risks I take, the more, you know, smaller companies that I get excited about,
the more likely I might lose. So how do you adjust your mindset about that? I think of investing as a sort of classic
context where you get good at this because you sort of understand that if you are taking on some
sort of highly risky but potentially very profitable investments, you will want to balance some of
those out with safer investments that are less likely to give you a huge return but are less
likely to fail as well. And so I think you're intuitively doing a smart failure strategy
in that context when you're good at it.
But there is always a pressure or a force that leads you to want to play it safe,
because none of us like to fail.
And if we give in in any field, not just investing,
but if we give in to that instinct to just want to play it safe,
like only do things that we're pretty sure, you know, 95% sure are going to succeed, we may see a fair amount of success, but we don't know how far below the success we could
have had will be. So you could think about that in athletics or in, you know, in the job search.
You know, if you sort of take a job that you know you can get, and it's not a stretch, then maybe you will get it. But maybe you could have gone out for that dream job, and maybe you would have gotten that too.
As always, people on the program may have interests in the stocks they talk about,
and the Motley Fool may have formal recommendations for or against.
So don't buy or sell anything based solely on what you hear.
We will be off tomorrow for President's Day, and we will be back on Tuesday.
Thanks for listening. I'm Ricky Mulvey. We'll see you then.
