Passion Struck with John R. Miles - Dr. Amy Edmondson on Why Failure Is the Key to Success EP 343
Episode Date: September 8, 2023Have you ever wondered how embracing failure could become a stepping stone to growth and enlightenment? What if we told you that there's a paradoxical concept known as "intelligent failures" that can ...drive progress and personal development? Today, we journey down this path with Dr. Amy Edmondson, a distinguished professor at Harvard Business School and author of the transformative book "Right Kind of Wrong." Want to learn the 12 philosophies that the most successful people use to create a limitless life? Pre-order John R. Miles’s new book, Passion Struck, which will be released on February 6, 2024. Full show notes and resources can be found here: https://passionstruck.com/dr-amy-edmondson-failure-key-to-success/ Understanding Failure as a Pathway to Growth: A Dialogue with Amy Edmondson In our discussion, we explore various facets of failure and learning. From the significance of teamwork and checklists in high-stress situations to the complexity of failure to the vital role of self-awareness, we leave no stone unturned. We also discuss the importance of systems thinking in preventing failure and the necessity of embracing failure for personal growth. Our conversation wraps up with a deep understanding of managing emotions and the necessity of building psychological safety in the workplace. Join us on this journey of learning and growth. Brought to you by Netsuite by Oracle. Download NetSuite’s popular KPI Checklist, designed to give you consistently excellent performance at https://www.netsuite.com/passionstruck. Brought to you by Indeed: Claim your SEVENTY-FIVE DOLLAR CREDIT now at Indeed dot com slash PASSIONSTRUCK. Brought to you by Lifeforce: Join me and thousands of others who have transformed their lives through Lifeforce's proactive and personalized approach to healthcare. Visit MyLifeforce.com today to start your membership and receive an exclusive $200 off. Brought to you by Hello Fresh. 
Use code passion 50 to get 50% off plus free shipping! --► For information about advertisers and promo codes, go to: https://passionstruck.com/deals/ Like this show? Please leave us a review here -- even one sentence helps! Consider including your Twitter or Instagram handle so we can thank you personally! --► Prefer to watch this interview: https://youtu.be/DyWiruBo7ms --► Subscribe to Our YouTube Channel Here: https://youtu.be/QYehiUuX7zs Want to find your purpose in life? I provide my six simple steps to achieving it - passionstruck.com/5-simple-steps-to-find-your-passion-in-life/ Catch my interview with Dr. Caroline Leaf on Parenting for a Healthy and Confident Mind. Watch the solo episode I did on the topic of Chronic Loneliness: https://youtu.be/aFDRk0kcM40 Want to hear my best interviews from 2023? Check out my interview with Seth Godin on the Song of Significance and my interview with Gretchen Rubin on Life in Five Senses. ===== FOLLOW ON THE SOCIALS ===== * Instagram: https://www.instagram.com/passion_struck_podcast * Facebook: https://www.facebook.com/johnrmiles.c0m Learn more about John: https://johnrmiles.com/ Passion Struck is now on the Brushwood Media Network every Monday and Friday from 5–6 PM. Step 1: Go to TuneIn, Apple Music (or any other app, mobile or computer) Step 2: Search for "Brushwood Media" Network
Transcript
Coming up next on PassionStruck.
When you say failure is not an option,
what you really mean is we are going to do our very best
with what we have to produce success.
We're going to use best practices.
We're going to use our skills.
We're going to help each other.
We're going to be a great team,
because that's what we truly need in this execution moment.
When we say fail fast, well, often we should be,
I hope, referring to context
in which there's no known solution yet.
And the faster we get
to some kind of viable solution,
the better off we all are.
Welcome to PassionStruck.
Hi, I'm your host, John R. Miles.
And on the show,
we decipher the secrets,
tips and guidance of the world's most inspiring people
and turn their wisdom into practical advice for you and those around you.
Our mission is to help you unlock the power of intentionality so that you can become the best version of yourself.
If you're new to the show, I offer advice and answer listener questions on Fridays. We have long-form interviews the rest of the week with guests
ranging from astronauts to authors, CEOs, creators, innovators, scientists, military leaders,
visionaries, and athletes. Now, let's go out there and become PassionStruck.
Hello everyone, and welcome back to episode 343 of PassionStruck.
Consistently ranked by Apple as one of the top 10 most popular health podcasts in the world. And thank you to all of you who come back to the show every single week
to listen and learn how to live better, be better, and impact the world. PassionStruck is now
on syndicated radio on the Brushwood Media Network, and you can catch us every Monday and Friday
from 5 to 6 pm Eastern time. Links will be in the show notes. If you're new to the show,
thank you so much for being here. Or if you simply want to introduce this to a friend or a family member, we now have episode
starter packs, which are collections of our fans' favorite episodes that we organize into
convenient topics.
For any new listener, they're a great way to get acclimated to everything we do here on the show. Either
go to Spotify or passionstruck.com slash starter packs to get started.
In case you missed it, earlier this week I had two great interviews, and I'm using this
momentum Friday episode today to do a third
interview for a book release this week.
The first was with Todd Rogers, a behavioral scientist and professor of public policy at Harvard
University.
Todd has spent over three decades studying the science of writing and has authored, with Jessica Lasky-Fink, the
groundbreaking book Writing for Busy Readers: How to Communicate
More Effectively in the Real World.
I also interviewed Eduardo Briceño, and we explored how to escape the performance paradox and embrace intentional
learning for higher-level results. Eduardo has coined the term Chronic Performance
Trap to describe the counterintuitive phenomenon that often occurs when we relentlessly
work harder, only to find ourselves exhausted and unfulfilled. But fear not, Eduardo
brings a wealth of strategies from world-renowned individuals and companies that have cracked the code to peak performance.
Please check them all out, and I also wanted to say thank you for your ratings and reviews.
If you loved either of those episodes or today's, we would so appreciate you giving it a 5-star review and sharing it with your friends and families.
I know we and our guests certainly love to see comments from our listeners.
Now let's talk about today's episode. In a world where complexity is the norm and decisions often come with a side of confusion,
we find ourselves at a crossroads of progress and uncertainty, from navigating climate shifts
to deciphering the economy. From personal choices to professional paths, there's one constant
companion: the notion of failure. In this episode, I'm privileged to have Dr. Amy Edmondson,
the respected Novartis Professor of Leadership and Management at the Harvard Business School.
With a career spanning over two decades, Dr. Edmondson's research has illuminated intriguing
facets of human behavior and how organizations tick.
You might have stumbled upon her award-winning insights in the New York Times, the Wall Street
Journal, and the Financial Times.
Her thoughts have also found their way into the pages of Psychology Today, Fast Company, and the Harvard Business Review. Recognized as the number one
management thinker in the world by Thinkers50 in 2021, Dr. Edmondson's impact on
management and leadership is undeniable. We discuss her new book,
Right Kind of Wrong, that released earlier this week. Her book challenges the way we perceive
failure, offering a nuanced perspective that goes beyond extreme avoidance or reckless pursuit.
Dr. Edmondson redefines failure as a source of insight and personal growth, a mindset shift
that could change how we navigate life's challenges.
During our interview, I explored tantalizing insights from her book, gems like why we hate
to fail, and four essential tools for failing well.
The enigmatic realm of a fail-well mindset comes to life as she unearths the paradox
of intelligent failures.
Those accidental breakthroughs that drive progress yet remain a rarity.
She dives deep into the chasm between catastrophic failures and preventable ones, shedding light
on the nuances that spell the difference.
Fear, shame, blame, these invisible barriers stand between us and the untold benefits of
failure.
Dr. Edmondson unravels the intricate psychology that binds us to these impediments,
compelling us to embrace humility and honesty as the cornerstones of a resilient,
fail-well mentality.
Get ready for insights that will spark your passion and transform your perspective.
Thank you for choosing PassionStruck and choosing me to be your host and guide
on your journey to creating an intentional life now.
Let that journey begin.
I am so excited today to welcome Dr. Amy Edmondson to PassionStruck. Welcome, Amy.
Thanks for having me.
Today we are going to discuss your brand new book that I'm holding up here, Right Kind of Wrong, for the audience who might not be watching this on YouTube,
which releases the week that this podcast is coming out.
Congratulations on its release.
Thank you so much.
Amy, I have my own book coming out in February
and one of the first chapters is all about
the science of life crafting, or said another way,
finding a problem worth solving for you to pursue in your life.
How did you discover that your problem worth solving was helping people and organizations learn
that they can thrive in a world that keeps changing? I discovered it slowly, but it started many
years ago right out of college when I was working for the inventor, designer, educator, Buckminster Fuller, who was just a phenomenally creative person.
And in a sense, he was clear that his purpose, which he had
discovered in maybe his 30s, was to figure out what, if anything,
one human being could do, what problems they could solve on behalf
of all of humanity, that giant corporations or governments, etc. couldn't solve.
Now that's not my purpose, but it was quite meaningful to me to discover that this wonderful, creative man, at the time in his 80s,
who had such a full and impactful life, did in fact have a purpose, or to articulate that this was his purpose.
And he was quite sure that all of us are here on this earth to use our brains to solve
problems on behalf of humanity, on behalf of others.
I didn't have any sort of grandiose ideas about what those problems would be on my own.
I worked with him as an engineer after that time.
I wrote a book about his work.
And then it was time to move forward into what, I did not know.
But I met a very inspiring entrepreneur named Larry Wilson, who hired me to come be in charge of research at his kind of boutique consulting firm.
And what that company, which was called Pecos River Learning Center
and no longer exists, was doing was helping companies change their culture,
fundamentally, to be learning organizations.
I didn't have that terminology right off the bat, but soon in that spirit, we brought in some
wonderful thinkers like Peter Senge and Ed Schein and Chris Argyris and others to help us think.
Like, how do we help companies do this? And this was the late 90s, and one thing was absolutely clear,
now it seems almost quaint,
but it was clear to me that the world keeps changing
and that organizations, especially large ones,
have trouble shifting their complex systems
and operations and product lines fast enough
and effectively enough to meet the changing needs of their markets and of the world.
So I was fascinated by how hard that was. Even when well-meaning people, smart people,
in leadership roles really saw the need and wanted to do it, it still remained hard.
And at a certain point, I became so flummoxed by the challenge that I thought, I guess I'd
better go back to school and figure out what's known and do a PhD and get a little smarter.
And so in the PhD, I kept going with this premise that learning at work is hard.
And I studied that in healthcare, I studied that in manufacturing.
Okay, so it's hard, but how do we make it easier?
Thank you for that backdrop. Today we're going to be discussing all things failure.
I love this topic, because I had a peer when I was working at Lowe's who would always tell our boss, who
was the chief information officer, that we need to learn from failure and we need to fail all the
time in order to get better. And I remember our boss absolutely hated it.
He would always say to us, why would we want to fail when we can succeed?
I watched this battle progress over two years. Both of them were named Steve, and
Steve Shirley would never give up the argument, because he was in charge of innovation, that
we needed to keep failing and do all kinds of projects to fail.
And the boss who wanted success was always fighting him on it.
But I think it's an interesting dilemma. And yes, you're right. Having been in companies like Lowe's and Dell and other
behemoths, and being myself a Big Four consulting practitioner,
I've seen just how hard it is for these systems to change in these large organizations.
And it's interesting, because one of the very first research projects that you did when
you were in your PhD program ended in failure.
And I wanted to ask, how did that failure end up changing the course of your academic
career?
Because it led to a research paper called Learning From Mistakes Is Easier Said Than Done.
Yeah, so it's one of those stories
that in retrospect is such a blessing,
but at the time feels awful.
My research project, I was part of a larger team
of medical researchers, and they were studying
medication error rates.
And one of their primary aims for the study
was to fully assess how frequently this problematic and sometimes
tragic event happened: medication errors that either harmed or, in worst-case scenarios, killed
patients.
This was the early 90s, and the topic was coming into awareness really for the first time.
It had not been looked at much before.
And so they thought, well, while we're at it, why don't we invite in a social psychologist who can help us understand what if anything team dynamics have to do with
the creation of errors in patient care? And so my simple hypothesis, but I think reasonable hypothesis
was that better teamwork, which I would assess with a validated team diagnostic survey, would
be associated with fewer error rates, right?
Makes sense, right?
If you have better teamwork, you're catching and correcting, you're coordinating, you're
speaking up, all of that stuff.
So fast forward, I've got my team data from month one of the study.
I wait six months for the medical investigators to have collected the error data, as best they could, going from unit to unit every day or every other day.
And lo and behold, there's a significant correlation between the teamwork measures and the error rates, but in the wrong direction. The data seemed to be saying that better teamwork is associated with higher, not lower, error rates.
Now, this was at the time devastating. It was such a failure. I just thought, I'm not going to make it out of graduate school.
I'm going to have to drop out, find another career.
Because it's just, how can you be so wrong about something so fundamental?
And so of course, before dropping out, I decided to give it a little thought. And the more I thought
about it, it suddenly occurred to me that maybe the better teams, and I did not doubt the
veracity of the team diagnostic survey, but maybe the better teams, with higher quality relationships and reported better leadership and so forth.
Maybe they were not making more mistakes.
Maybe they were more willing and able to report them, right?
You could say it was a blinding flash of the obvious.
But what I was suggesting was that there were interpersonal
climate differences across groups that had not been anticipated.
Teams in these large hospitals were not the same. That local leadership and local climate
might in fact vary so substantially that it would change a behavior as fundamental as error
reporting, or willingness to be honest about what's really going on. Now, again, I think in retrospect,
it seems quaint because of course, now I think we take that for granted. So the title of the paper
I ultimately wrote, which was not the title of the paper I would have written had my hypothesis been
supported, learning from mistakes is easier said than done, essentially refers to this phenomenon
that you cannot learn from mistakes
if you're not willing to talk about them. Pretty simple, right? So that was the seeds of a concept
called psychological safety. That was an accident, right? That was like a retrospective sense-making
of the data. But in my next study, I had to test that on purpose. Are there interpersonal climate differences
across groups?
If so, is it associated with learning behaviors
like speaking up about errors?
If so, does it lead to better performance?
And I was able to say yes to all of those questions
in a much more structured systematic way later.
So that error, that failure led me
to a much more interesting research path than I was on at first.
You touch on failures and errors. Can you level-set for the audience: what is the difference
between failures, errors, and violations?
An error, which is synonymous with mistake, is when
you do something wrong where the knowledge already exists for how to do it.
That doesn't mean that you're bad or lazy or anything like that.
We all make mistakes.
It can only be called a mistake if there is a right way to do it.
You're in familiar territory.
A failure is a broader concept, and an error can lead to a failure.
But you can also have a failure in new territory where literally no one could
have told you in advance what would happen when you engaged in a certain action or experiment.
A failure can in fact be truly the discovery of new knowledge where no error was involved at all.
It was a smart, thoughtful experiment, and alas, you were wrong,
like, in a sense, my research project. And so that was a failure, but not an error. A violation is when
you deliberately do something wrong. A deliberate mistake is nonsensical; a violation is a
deliberate act of doing something wrong to cause damage or mischief in some way.
Okay, well, I think that's really helpful.
And I thought it might be helpful to just look at your profession,
because I know, as a researcher, a lot of the work that you do is on research, but it also involves looking at lots of different failures
before you come up with successes. Why is it so important
for researchers to have failure along the way in order for them to learn and create success?
I think the answer to that is that if you're not having any failures along the way,
your questions aren't bold enough. You're not taking the kind of intellectual risks that you need to take to be a truly leading-edge
scientist or professional in any field, really. Hopefully many of the things that you do don't
end in failure, but some portion of them should, because then that means you're really on the
leading edge of your craft. That means you're trying to learn something that hasn't yet been learned.
You've thought about it, you've done your homework, but it turns out still you were wrong.
It was new territory.
And I think it's so ironic that the majority of people fail to learn the valuable lessons
that failures can offer.
Why is that such an issue in society?
The first reason is that as human beings, we don't enjoy looking at our failures. Sometimes we don't even recognize them, because of confirmation bias and various other things. We assume we did
well and nobody tells us otherwise. But other times we're aware we've had a failure, but we just
would rather move on. Wouldn't you? I would. I'd rather move on and not pause to really look
deeply at it, try to understand the ways in which I contributed to that failure, the things I did
that contributed to it, the things I failed to do that contributed to it. It's just unpleasant
and not fun. I think that's the sort of first and most visceral reason why we struggle to learn
the lessons that failures offer. And the second reason is really the more cognitive or
intellectual reason, which is you can't just do it quickly and superficially. You have to be thoughtful
and analytical about, okay, let's start by saying what happened and think systematically and scientifically about it so that we can
not pause at the first most superficial lesson. Oh, you didn't try hard enough. But get a deeper
understanding of what happened and most people don't really want to do that work.
Well, thank you for that. And I released an interview today with Dr. Jud Brewer, who's a professor at Brown. I believe he went to Harvard Medical School
and got his PhD there as well.
And we ended up discussing how to break bad habits
to overcome anxiety.
And in your book, you discuss how when we're kids,
we learn to dodge blame by pointing the finger elsewhere.
This becomes habitual.
These habits over time end up leading us
to avoid stretch goals
and challenges as we get older where we might fail. My question for you is: why does this combination
of human psychology, socialization, and institutional rewards make mastering the science of failing
more difficult than it needs to be?
I think you've just said it so well. So let me underline a
couple of things. I think there's this emotional aversion to failure.
It's just, it's quite instinctive.
We prefer to be associated with success than failure.
There's a real worry about what others will think of us,
or this sort of this stigma.
We want people to think well of us, not badly of us,
also for quite instinctive reasons.
You want to be important.
You want to be part of the group.
That's where our survival comes from.
And then also, it's natural psychologically
to point the finger elsewhere.
There's something called the fundamental attribution error
studied by Lee Ross at Stanford.
It says that, rather spontaneously, when something goes wrong,
when you, let's say, come up short in some way,
I will think that's either an ability problem or a
character problem. I don't think, I wonder what the situation was that led to that failure.
Whereas if I come up short in some way, I will spontaneously think factors outside my control
led to that outcome. So we start off with the deck slanted against us in terms of doing the real work of learning.
And as part of learning, I know that psychological safety comes into play, and this is something
that you were very well known for studying. How does this concept of psychological safety play a
powerful role in the science of failing well? It starts with the premise, which I deeply believe,
that the science of failing well is a team sport. Most,
you know, really good failures, let's say in R&D or in science or other,
you know, leading-edge activities, happen by teams and with teams.
Our ideas are richer when we're learning from people with
different backgrounds and expertise
than ours. And the process of diagnosing failures is going to be higher quality when we have more
minds looking at it at the same time. So that means it's a social activity. And psychological
safety comes into play there because we can't do that social learning activity effectively
if we don't feel able to take interpersonal risks, and know that they're
expected and welcome and necessary for us to, say, do our job or be a strong family unit.
Psychological safety describes this climate where candor is welcome and expected. We can
be direct, we can be honest with each other, we can roll up our sleeves and do the hard work
of experimenting and then learning
from the failures that do happen.
Yes, I think we all experience the psychological safety net that we want to have around us.
And this term failure often carries such a negative connotation.
What is one of the best ways that we can reframe our perception of failure to see it as an essential aspect of success?
Let me go back to your story of the two Steves, right? Because the boss Steve was saying, no, why would you want failure if you can have success? Please, by all means, let's have it.
We don't want failures where success was a viable, realistic option.
Meaning, we don't want failures when we have a recipe, or we have a good process, we have
best practice.
What we want is to support people and enable them to do what they need to do to get those outcomes. However, the
other Steve was right also, because what he was saying was, but we need R&D, we need innovation.
Yes, absolutely. Otherwise, our business won't thrive over the long term. And he was saying the only
way to get new ideas, new products, innovation is to be willing to fail.
Now, fortunately, at least for most organizations, that failure happens behind closed doors,
where you don't have the customers in there watching you fail.
You're doing it in the laboratory, you're doing it in the innovation teams.
It's this sort of recognition of context, the fact that one context is absolutely one where it's realistic to say,
let's try to get Six Sigma perfection over here. But another context is one in which you have to say,
if we're not failing over here, we're not doing our job.
Yeah, and I think a great example of that is what we ended up doing at Lowe's. When you think of the complexity of having,
at the time I was with them, over 1,800 stores and a huge supply chain network,
when you're introducing new changes to that environment, it's very easy to potentially have errors, not only in the systems that you're introducing,
but more importantly in how your 300,000 employees are utilizing those systems
to serve the customer.
After a whole bunch of failures
at trying to do big-bang implementations
out to the store system,
we ended up doing this very methodical way of rollout
where we'd start with a single store, then we'd go to
a group of stores in a district, then we would go to several districts, all looking at what
errors we might have and whether we were conflicting with anything. And once we put that methodology
in place, the rollout rate and our overall success rate went up drastically because we
were able to correct errors that we found in the very early stages and then roll it out for much bigger success. You can take that example and implement it.
It's brilliant. And instead of a rollout, which kind of implies there's a carpet on it,
ready to go smoothly from here into the future,
I like to call it a cycle out.
You do it here.
You discover some kinks, some things that need to be worked out a little better.
And then you go a little further to some more stores.
You learn some more.
And so you plan-do-check-act all the way forward until it's done.
This cycle out is much more of a mindset of learning as we go.
Not in a slow-down or casual way, but a really disciplined, rigorous way of learning from our own experiences to keep getting better with this new thing,
that we're not supposed to know already
how it works perfectly at scale.
That would be crazy in a way.
It's not something we've done before,
so we need to do this cycle out process.
One of the fundamental concepts that you have in the book is that of good failures, or,
as you call it in the title of your book, the right kind of wrong.
Can you help listeners to better understand this concept of good failures?
Absolutely.
So a good failure is one that happens in new territory, meaning you couldn't just look
up the answer on the internet. It's genuinely new territory.
Now, that could be new to the world, if you're a scientist,
or new to you: you've never picked up the game of golf before, right? Or a blind date, let's say, right?
It's just, it's novel and there's no way to find out the answer without at least being willing to try.
So: new territory, in pursuit of a goal, and
hypothesis-driven, which is a fancy way of saying you've got good
reason to believe it might work. You also know that it might not,
right? But you've done your homework. And then fourth, for the
failure to be good, I think it should be no bigger than it has to
be to get the new knowledge you're trying to get.
You don't bet your life savings on an uncertain stock. You don't agree to spend a week with a blind
date. You agree to have a coffee, right? This is obvious in a way. It's mitigating the risk.
Like your Lowe's story: those little failures that happened in that cycle out were good failures.
New territory, right? In pursuit of an important goal, with good hypotheses, and no bigger than it
has to be. You don't want to risk a nationwide failure of some new service.
And I think from that, it's good to then articulate the difference between a good failure and a bad failure, and maybe a good way of doing this would be the book's framing: one is the right kind of wrong and the others are not.
The other two kinds of failure are basic
failures and complex failures. Now a basic failure, as the term implies, is one that
has a single cause, usually human error. In other words, you make a mistake in
familiar territory and that leads to a failure. A complex failure is one that's multi-causal.
It's usually a number of things lined up in just the wrong way
at the wrong time to produce a failure.
But any one of the factors separately
would not have led to failure on its own.
It's not a big enough error or a big enough deviation
to have caused the failure.
But boy, when they all come
together, like supply chain breakdowns, for example, during the pandemic would be a good illustration
of complex failures, just a couple of labor shortages here and some parts shortages there,
and then pretty soon you've got the perfect storm of no chips available to anyone.
Basic failures run the gamut
from putting the milk in the cupboard
rather than the refrigerator, where it spoils,
to a few Citibank employees failing
to check the right box in a computerized system,
which led them to transfer $800 million,
essentially the full principal of a loan
rather than the interest, which they were trying to transfer,
leading to that rather massive $800 million loss
to the company.
Thank you for going through that.
And one of the things that I thought was fascinating
in your book is I've always loved flight and space travel
and other things like that.
And so it was interesting how you discussed
crew resource management.
And when I was at the Naval Academy,
we had this unique experience of getting different lectures
that would come in to talk to us about leadership
and decision-making in high-stress situations.
And one of these happened to be the amazing flight crew.
I can't remember all the specifics,
but they were either on a DC-10 or a Lockheed L-1011
back in the day, and you and I might remember this,
because as they were bringing this aircraft in,
it ended up landing, but it did this cartwheel in the air
before it came down.
And it was amazing that this even was a success
and anyone survived at all.
And I think it was about 80% of those on the flight who survived,
because when they took crews in the simulator,
not another single crew, I think it was United Airlines,
was able to land the plane.
And what we learned from their discussion
is that this crew, it wasn't the first time
that they had flown together.
They had flown together several times in the past.
But what happened in this flight is they lost all hydraulics.
So the only way that they could control the plane,
and this was 30 years ago, so I'm doing this from memory,
was that they had to use the engines to control
the plane.
And so it meant that in the cockpit, there were three people.
And so they were all working in tandem to make sure that they were doing the best that
they could to control the plane and its altitude as a team.
And they said it was that teamwork.
And it was from instinctively working together so much
and going through checklists that they were able
to even have a chance of bringing this thing down
and landing it as successfully as they did.
In your research looking at the airline industry,
what did they learn about the difference between crews
who had worked together versus those who might not have worked together, who were coming out
of maybe time off before going into a flight? Well, it is such an extraordinary
story. Is that the Sioux City crash? Yes, that's exactly it. And I've actually heard
the captain, I forget his last name, speak about it. And it is, it's so moving, and he's so,
and I think this is relevant to our listeners.
It's he's so humble, self-deprecating,
and generous with his credit and description of that story.
And it really is a truly remarkable feat
that they were able to save so many lives on that day.
And it was indeed, it was the epitome of effective teamwork.
And I'll describe why I think that was such effective teamwork.
And then tell you the results of that wonderful study that was done at NASA by Clay Foushee.
So the effective teamwork I think that they modeled that day
started with an absolutely explicit
admission that we don't know.
We don't actually have a playbook here.
We don't have a solution for how do you land a large craft like this without hydraulics?
So it's a kind of, we're putting it right out there in the open that we don't know.
And then it's, that means we're listening intently, all ideas welcome. And not all of them will be
implemented, not all of them will be good, but keep them coming. And then also that ability to coordinate and compensate for each
other's strengths and weaknesses in real time is the marker of a great team. We've all watched basketball teams on the court or other sports, and you can just tell the difference with a
fluid team that knows each other and knows who to pass to and all
the rest. That's what we think. It gets us excited to even think
about that. So now the study that was done at NASA in the simulator
is pretty straightforward, but they were surprised by the outcome, because it's back
in the 80s. So here's the study.
They have a set of cockpit crews who are well-rested.
They're coming right to work with no flights recently behind them
and they've never flown together before.
And they're put in the simulator and given
all sorts of challenging situations.
Meanwhile, we have some other cockpit crews that have just come off of several shifts
of flying together real flights.
So they're in the so-called fatigued condition,
but they've had experience working together.
Now what happened was, indeed,
the purpose of the study had been to show
that these fatigued pilots would make more mistakes.
And indeed they did, but as teams, they made fewer mistakes
because they caught and corrected
each other's errors, so that the fatigued teams actually outperformed the well-rested
teams, simply because they were better teams. They were better able to coordinate and collaborate
with each other to land the plane, the simulation flight, effectively.
No one expected that, right?
They were looking at fatigue,
and then instead what they discovered was teamwork.
And so then they thought, well, maybe we need to help cockpit crews,
and now indeed the whole crew, with flight attendants
and everybody, be better at teamwork and be better at speaking up
and better at asking for help,
because that's gonna be where real excellence comes from.
Yeah, and you could see that play out in various different dimensions, but the one I would think of
is imagine a SWAT unit or a special forces team that's going in on a target, and having a team that
has worked together, rehearsed together, knows the intricacies of the role that each member is going to play
when they attempt that mission, compared to a bunch of people who come together who might be very
skilled, but the first time they do that exercise is when they have to do it in real time.
And just looking at those examples, you could see how, just like with this flight crew, one
would make such a big difference compared to the other.
So there are two different terms that we hear a lot. One is
fail fast and break things, which you hear a lot in the startup world, and then the other is failure
is not an option, which I definitely heard a lot when I was in the military because let's face it,
if you fail in combat, people are going to get killed. Those seem like very conflicting mindsets.
How can we strike a balance between these approaches for better outcomes?
They are certainly conflicting statements, right?
They have obvious contradictions, but the same mindset actually can encompass them both
if we incorporate the role of context.
I think when you say failure is not an option, what you really mean is we are going to do our very best
with what we have to produce success.
And that's, we're gonna use best practices,
we're gonna use our skills, we're gonna help each other,
we're gonna coordinate, we're gonna be a great team,
because that's what we truly need
in this execution moment.
And when we say fail fast, fail often,
we should be, I hope,
referring to context in which there's no known solution yet.
And the faster we get to some kind of viable solution,
the better off we all are.
This is appropriate for an experimental context, right?
It's appropriate for a new context.
And it's almost laughable to imagine
that you'd want to fail fast, fail often
in the operating room.
Of course you wouldn't, right?
Once we're in the operating room,
we wanna succeed brilliantly.
But we wanna make sure we've trained our surgeons
and their teams in a simulator
where it is possible to fail fast
so that they can learn quickly what works
and how to recuperate quickly.
These two mindsets are compatible
as long as we bring in the role of context,
as long as we're clear about the goal.
And the one thing I would add is, if you aren't clear
and compelling about the goal and what you're trying to do,
and you use the phrase failure is not an option,
then without that additional contextual information that's so valuable,
you are at risk of essentially saying to people, I don't want to hear about it.
And that is the worst thing to do. So what you want to say is, we're going to succeed.
I know we can do it. And if you see anything along the way that doesn't look quite right,
The faster you speak up about it, the better off we are. I mean, thank you very much for that. And I'm going to just go back
to the audience and remind them that you talked about intelligent failures, which is something that
you deep dive into in chapter two of your book, and then you went into basic failures, which
is the subject of chapter three, and then chapter four is your study of complex failures.
I wanted to talk for a little bit about basic failures.
How does a checklist allow us to cope with the basic failure in a more routine way?
I like the word allow, because you're right.
It's an allowing thing.
It's not a guarantee,
right? A checklist, which of course is a list of things that it's really important not to forget
in some procedure, whether it's aviation or surgical or just packing for your weekend trip, right?
Having a checklist helps us fallible human beings make sure we don't miss something that we shouldn't have missed, right? So we don't get a
preventable basic failure. But checklists have to be used with intent. If you so-called do it in
your sleep, you are at risk for not getting the benefit that checklist allows you to gain.
And I do tell a story of a flight, one of these aviation examples, because they're so compelling and so important. There was Air Florida Flight 90 back in 1982 that unfortunately crashed into the freezing cold Potomac River in Washington, DC, and killed almost everybody aboard. And the pilots went through that checklist
that we all know well, right?
The takeoff checklist. And they were Air Florida.
So they were sort of almost doing it in their sleep
when the pilot said, anti-ice off.
The copilot said yes, and off they went.
But it was a freezing cold, sort of blizzardy, icy day
in DC that January day.
And so the correct answer was anti-ice on, right?
You wanted it on, not off, but they were so familiar with it being off.
So in other words, they were using the checklist, but really just going
through the motions, not having their brains fully engaged
with the meaning of each of those important items on that checklist. Okay, now we want to explore complex failures, and I thought a way we could do this is I'm
going to give you another example and again this is going to be one since we're on
this aviation theme and NASA theme that comes from there. I have the
distinct honor of having a very good friend of mine who was the Chief Astronaut
at NASA.
And he's one of the most experienced spacewalkers that NASA has ever produced.
But when you think about spacewalks or what NASA calls EVAs, you don't think of them being
inherently dangerous, but they were actually something that NASA and
many of the observers around them feared because when you think about constructing something
like the ISS and all the spacewalks that you have to do, there are a million things that
could go wrong. So it's amazing that for so many years, they had done thousands of hours
of spacewalks without any major issues until my friend Chris happened
to be on the ISS. And this was back in July of 2013. And on July 9th, he and an Italian astronaut,
I'm going to butcher his name, but I think it's Luca Parmitano, went out to do an EVA, and
everything went flawlessly. But when they got back to the ISS, they noticed
that the cap that Luca was wearing was wet. And they came to the conclusion that this
was just sweat, and it was his first spacewalk, so he was just nervous, etc. Well, a week
later, on July 16th, they go back out to do some additional maintenance, etc. Everything's
going extremely well. In fact, they're ahead of schedule, 45 minutes into it, when Luca reaches
into a crevice to do some work. And he notices that the moisture isn't just on his cap, but it's
now in his helmet. And then Chris comes around to do a visual observation and sees a blob because they're in space.
So it's a blob of liquid that's accumulating in his helmet.
And then all of a sudden NASA recognizes
that the issue here is that Luca could drown.
And so they immediately order an abort
but on his way back to the hatch two things happen.
One, they go from orbital day to orbital night,
so all of a sudden it's dark. And then two, there are all these precautions that they have to go
through because they have to avoid any sharp objects. And one of these happened to be an antenna
that's sticking out. And they can't puncture the suit, or he could depressurize and die. Luca has to do this maneuver where he inverts himself, and when he does, this blob ends up
completely covering his eyes, his nostrils, and half his mouth.
So now he's barely able to breathe and the blob doesn't go away when he comes back.
It just stays there.
So now he's completely blinded and at this point, Chris has to help him get back into the ISS, which he does.
They then get back in the hatch, and Chris has to calmly keep him from overreacting
as the water is accumulating even more, but they had to wait through 30 minutes of decompression
before they could take off the helmet.
That is a long story.
It turns out to have a successful outcome.
And I think part of that was because of not only his astronaut training,
but Chris was a Navy SEAL.
And I think it was his training to be able to adapt
in complex situations.
But the important thing here is this complex failure
could have been avoided if they were able to catch the small problem
of water being on the cap before they did that second space walk.
I illustrate that because oftentimes,
we don't look for these small problems.
And why is it so important that we do
before they spiral out of control like that example?
Such a full and rich story.
And it also just fills me with appreciation
because you have been a very good and close reader of my book.
You're referring in that story to so many concepts in the complex
failure chapter, and one of which is what I call either ambiguous threats or the small signals
that something might be wrong,
that we easily dismiss, easy to cognitively dismiss them.
We just assume we have this overarching assumption
that all is well and in complex systems,
it's a very bad assumption to have.
We have it, but then we have to override it.
And so you're absolutely right.
This complex and almost tragic failure was preceded
by little signals, little hello, I'm here signals, that were ignored not because they're
bad or lazy people, but because it's natural and normal as humans to assume them away. And what led
to that breakdown was a variety of things, none of them on their own bad enough to cause the failure,
but the accumulation of water inside the suit,
the sharp objects that led to a need
to alter their course and their behavior,
and it sounds like a handful of other things
that come together essentially in just the wrong way
to let a failure through.
Now, people who work in complex organizations
or high risk organizations are trained
as your Navy SEAL colleague was to be highly aware,
to be not thinking, oh yeah, we won't fail,
but to be thinking, where might we fail?
To be ever vigilant and ever heedful of those small signals.
Now they missed the small signal the first time,
but fortunately he was able to respond
magnificently in action and prevent the far worse failure.
But complex failures are on the rise
because we live and work in increasingly complex environments. And so we have to train
ourselves to be willing to raise concerns, to not worry about people thinking you're Chicken Little
when you say, oh, that doesn't look good, and hearing, oh, don't worry about that, don't be a wimp. But actually
praise and reward the people who are willing to speak up early so that we can
head these complex failures off at the pass.
The complexity of our systems is a given,
but our ability to prevent most complex failures
is much greater when we're more open and more attentive
to those small signals.
And by the way, when you said at the very beginning
of that story, you said, we think of spacewalks as so routine.
It's partly because NASA has trained us to think that.
They put forward an image of themselves doing these things as if they can do them
easily and do them in their sleep, nonchalantly.
These are extraordinarily challenging, risky, skillful activities that they do and they should take
credit for it. Yeah, I think what we fail to realize is the two years of constant planning that
they've done in a pool environment to get ready for that one event just to make sure that they try
to prevent everything that they can from going wrong. Exactly right. As I was telling you before we got on the show, this podcast is really about intentional
behavior change.
It's about exploring our self-awareness, which is something that you explore in chapter
five of the book.
Why does self-awareness play such a crucial role in the science of failure?
By the way, I think that might be my favorite chapter,
or it's the one that I need to listen to the most myself.
Self-awareness plays a crucial role in the science of failing well,
because we are up against some hurdles as humans.
They're cognitive and emotional. We have this predisposition to think we know.
We think we see reality, and we have to keep training ourselves.
Here's a cognitive habit to break, right?
Keep training yourself to remember,
yes, I see a partial view of reality
and I am almost certainly missing something
relevant, important, useful.
But maybe my friend John here is seeing something
that I don't see.
It's a habit of mind to force curiosity where knowing is more habitual.
So I'm just likely to be confident.
Yeah, I know what's going on versus, I wonder what's going on.
It's just to keep reinvigorating that sense of curiosity
to embrace what Carol Dweck would call a growth mindset, the mindset that understands that the more
challenges you take on, the more able you become, rather than our tendency to play it safe or
play not to lose, as it were, rather than to really go for it and to stretch and go after the things
that you're hoping to do and achieve. Well, I remember when I was an executive and when I was in the military, I was trained
really in servant leadership, but as my career progressed, we ended up starting to take a number
of courses related to situational leadership. And that was, how do you use empathy and EQ
in the way that you're leading and in the situations that you're involved in? In your book, you talk about situational awareness, which is very similar. It is. What is the role of humility,
curiosity, and honesty in adopting a fail-well mindset? It's just mission critical. The
curiosity, the humility. Humility is that sort of reminder to yourself that you don't know everything, that there's much more to learn, that there are challenges that lie ahead.
Curiosity is the drive, that thirst to learn more, rather than resting on your sense of,
I know, I see, I get it.
That's something, it's a natural human thing to have,
but we tend to have it socialized out of us.
And empathy, I think, is mission critical as well.
It's that reminder of how would I want someone to respond to me if I were in their shoes.
That doesn't mean letting them off the hook or going easy on people. It's about just being
caring and compassionate with the feedback that you give or the response you have to bad news or
the response you have to a request for help. And I believe all of those attributes are really helpful in
sizing up the context. As I talked about before, context really can vary. If you're
flying an airplane full of passengers, that's one context. That is a context in which, of course,
you want to be enormously safe and risk averse. If you're in a laboratory, that's another context, where you want to be taking risks and
experimenting with wild ideas. So the discipline of situational awareness is really the discipline
of saying what's at stake here, whether reputationally, financially, human safety, and how much is known, how uncertain is it?
And if it's really risky and we're in an unknown situation, as your NASA story embodies
so perfectly, you are proceeding cautiously. The experiments you take are tiny ones. What
happens if you go upside down? Okay, well, that sort of worked, and that didn't. You're absolutely careful, cautious. You're not going wild. If you're in a laboratory or simulation, go wild. Learn
as much as you can. Try to see where the failures are going to happen so that you can then prevent
them when it really matters. Thank you for sharing that. And I have done a large number of interviews on the show regarding
the importance of systems change. And they have primarily been through the lens of overcoming
things like climate change or water scarcity. And I remember for the last decade of my career,
I was a partner in a private equity firm. And one of the things that we looked at the most was the
megatrends that are impacting society, because we were trying to look at futuristic things that are happening, such as climate change,
such as water scarcity, food scarcity, etc. And I think it's important for the listeners to realize
that today we are governed by extremely complex systems, and we see them play out not just in politics or in organizations,
but they play out in nature.
They play out in our families.
How does this ability to see and appreciate systems help prevent a lot of failures that
could come our way?
It's such an important issue.
And so hard for us as human beings. Our brains have not evolved to be
spontaneously good at sizing up systems. We're really quite drawn to parts. We look at the parts, not the wholes, and
the behavior of most systems is created by the ways the parts interact; that's where the causal power
lies. And so to me, for the science of failing
well, the most important thing is to overcome the natural cognitive tendencies, to overcome
our natural tendency to think first of ourselves and our needs rather than to think of the
collective, because a lot of times the things that would be good for me or that I want
will directly harm the team or the unit.
The other thing I have to overcome is that I'm drawn toward now.
I think about what do I want now, what will work now? And I discount the future.
Systems thinking is about recognizing that the future is coming at you, and being able to think a few steps ahead is
really mission critical in almost every field: to think about, if we do this now,
what will the impact of this be on our success later, next year, 10 years from now, etc. So that ability to go from me
to we, and from now to later. And then finally, to appreciate feedback loops, because we tend to
think of X causes Y and on we go when in fact X causes Y, but then Y turns around to have
an impact on X. And so we have vicious cycles, a term we're all familiar with, or virtuous cycles,
but we don't spontaneously think that way. So we have to pause and step back and say, okay,
how is the system behaving and how might my actions either exacerbate or ameliorate the behavior
in question? Well, thank you for that, because I think it leads into your last chapter,
chapter eight. And one of the things that I think is so important for people to understand is I believe
we as humans were created to constantly learn. And one of the ways that we are programmed to learn
is by learning from the failures. And I think any of us who are parents and my kids are now grown,
but there's only so many times that you can tell them
not to do something.
And they still do it because they have to learn
from their own mistakes.
And luckily, most of those are just errors
and they're not catastrophic failures.
But how do we use this reality of the fact
that we are human and that we are going to fail to craft a fulfilling life?
I titled that chapter thriving as a fallible human being because I want to call attention to the fact
of our fallibility. That's a given. So get over it. Let's live with it and let's try to live with it
joyfully and productively. If we're fallible, if that's a given, what are the things we can do and get good at
to still have full, meaningful thriving lives?
And one is, of course, to be okay
with the failures that will happen along the way.
Just know that's a given, that's not shameful,
that's not something that shouldn't have happened,
that's something that's part of a full and meaningful life.
Another is to master the
art of apology because you're going to need it now and then to really be willing to repair
those relationships that might get harmed along the way through some of the failures that you
contribute to in your lives. Another is to be willing to persist, right, to pick yourself up and just keep on going.
And all of these have to do with learning.
You are right.
I could have titled that chapter just embracing the learning mindset and the learning orientation
because that's what it's really about.
The people that we look at, the stories that are told in the book and the people that I look at to try to understand
what it means to be a great student of and user of failure
to succeed wildly are all people who were curious,
who were driven, who weren't afraid of the failures
they experienced, who learned as much as they could from them.
So yes, it's really about learning to thrive.
Okay, my last question would be for a reader of the book
or a listener of this podcast,
what are some actionable steps that they could take today
to start cultivating a fail-well mindset
in their personal or professional life?
The most important first step is just to get clear about the
different kinds of failure.
Maybe start using the terminology.
Because I think when we say things like failure is not
an option or fail fast, fail often, it's superficial and
confusing to people.
Whereas if you say, yep, let's avoid all the basic failures
we possibly can, that's going to require us to speak up,
et cetera.
So get clear about the terminology.
I think the terminology literally gets you halfway
toward a healthy mindset and healthy practices
around failure.
I do believe that doing whatever you can
to contribute to an environment of psychological safety
is another practical thing you can do.
The best way to do that is to early and often
acknowledge uncertainty,
acknowledge novelty, let people know that you know that we're doing things we've never done before
quite this exact way. You can't stand in the same river twice. So you're acknowledging
reality, and reality is that things could go wrong. So you're making it okay, making it safe to speak up about them. And I think finally, just master the pause. That's a habit to develop to
overcome our reactiveness. Bad things happen. We react. We feel bad. We feel ashamed. We feel angry.
Just take a deep breath, size it up, and have a more thoughtful, learning-oriented response.
Great, Amy. Well, for the listener who wants to learn more about you, your books and your research, where is the best place for them to go?
I guess the simplest answer is amycedmondson.com.
Amy, thank you so much for being here today. I so enjoyed reading your book, and it was a great discussion. Thank you so much for having me, and it was an honor and a pleasure to talk with
you. Your mastery of the material, in seemingly such a short time, was impressive.
I thoroughly enjoyed that interview with Dr. Amy Edmondson, and I wanted to thank
Amy and Atria Books for the honor and privilege of having her appear on today's
show. Links to all things Amy will be in the show notes of passionstruck.com.
Please use our website links if you purchase any of the books from the guests that we feature here on the show.
Advertiser deals and discount codes are in one convenient place at passionstruck.com slash deals.
Videos are on YouTube at both John R. Miles and PassionStruck Clips.
You can now purchase my brand new book PassionStruck, which is all about the science of living an intentional life on Amazon.
Links will be in the show notes.
As I mentioned at the beginning,
we are now on syndicated radio
on the Brushwood Media Network.
You can find me at John R. Miles
on all the social platforms.
You can also find me on LinkedIn,
where you can sign up for my LinkedIn newsletter,
and you can sign up for our weekly newsletter
at passionstruck.com.
You're about to hear a preview
of a special passionstruck podcast interview
that I did with my friend, Harvard professor and number one New York Times bestselling author Arthur Brooks.
In this interview, we explore Arthur's new book, Build the Life that you want, which he
co-authored with none other than Oprah Winfrey.
In this interview, we invite you to commence a journey towards greater happiness, no
matter how challenging your circumstances.
Drawing on cutting edge science and years of helping people translate ideas into action.
Arthur with the help of Oprah shows you how to improve your life right now instead of
waiting for the outside world to change.
Emotions are not nice-to-haves or things we try to avoid. That's the wrong way of
seeing it.
Emotions are basically a machine language where your brain is taking
outside stimuli and then turning it into signals delivered to your conscious brain so you
know how to react. Emotions are a universal language. I don't care if you're born in Papua
New Guinea or Canada, you speak the same emotions. The reason is because humans see the
same things and they need to translate what's going on in their senses
and turn that into a language so you know how to react.
The problem is that not all those emotions are very pleasant.
And so the result of it is that if you're reactive, if you're just going to react on the basis of those emotions,
your life is going to feel like it's out of control.
Remember that we rise by lifting others. Share the show with those that you love.
And if you found today's episode useful with Dr. Amy Edmondson
on failure, then please share it with somebody
who could use the advice that we gave here on today's show.
And until next time, do your best to apply what you hear on the show
so that you can live what you listen.
Now go out there and live your life passion struck.