Something You Should Know - SYSK Choice: What Is The Truth? & How Systems Fail
Episode Date: September 5, 2020. We have a lot of devices – cellphones, tablets, laptops – and they all need to be charged up constantly. How much does that cost? And how much does it cost to run a television, light bulb or a Tesla every year? Listen to discover the answers. http://www.forbes.com/pictures/ekhf45ellkj/ipad-1-50-per-year/ It seems as if the truth has taken a beating in recent years. Your truth may not be my truth and then, of course, there is alternative truth. Huh? It’s time we take a closer look at what the truth is and isn’t. So joining me is Hector MacDonald, a strategic communications consultant who has advised the leaders of some of the world’s top corporations as well as the British government. Hector is the author of a new book called TRUTH: How the Many Sides to Every Story Shape Our Reality (https://amzn.to/2pVUYs6) and I think you will find what he has to say very enlightening. Everyone has been worried about their breath on occasion. We all know what a huge turn-off bad breath can be. So I will let you in on some proven strategies to fight bad breath when you aren’t able to brush your teeth. I’ll also tell you a few myths about bad breath that may surprise you. https://www.huffingtonpost.com/2011/12/03/cure-bad-breath_n_1126196.html You’ve heard of Murphy’s Law… Anything that can go wrong will go wrong. But why is that so? Why do things go wrong? From your morning routine to get the kids off to school (which in my house OFTEN goes wrong), to how you do your job, to cooking Thanksgiving dinner, to disastrous space shuttle launches – things can and do go wrong. Listen to Chris Clearfield, co-author of the book Meltdown: Why Our Systems Fail and What We Can Do About It (https://amzn.to/2pZgPy3) as he delves into the science of failure. You’ll discover how failure works and more importantly how you can learn from failure to prevent it from happening again. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Bumble knows it's hard to start conversations.
Hey.
No, too basic.
Hi there.
Still no.
What about hello, handsome?
Who knew you could give yourself the ick?
That's why Bumble is changing how you start conversations.
You can now make the first move or not.
With opening moves, you simply choose a question to be automatically sent to your matches.
Then sit back and let your matches start the chat.
Download Bumble and try it for yourself.
Today on Something You Should Know,
ever wonder when you charge your cell phone at your friend's house,
what does that cost them on their electric bill?
Then, with fake news and alternative facts,
how do you tell if the truth is really the truth?
We should respect people's statements as true if they are true, but we should also notice what
they're not talking about, what they're leaving out, what they're selectively highlighting at
the expense of other perhaps just as relevant information that might go against their agenda.
Then, ever worry your breath is not as minty fresh as it should be? I'll have some bad breath first aid.
And why do things go wrong? Why do systems fail? And how do you prevent things from going wrong?
The more pieces we have in the system, the more moving parts, whether you're talking about the
space shuttle or our cars or our companies or our personal lives, the more moving parts we have in
these systems, the more likely we are to have these failures.
All this today on Something You Should Know.
People who listen to Something You Should Know
are curious about the world,
looking to hear new ideas and perspectives.
So I want to tell you about a podcast
that is full of new ideas and perspectives,
and one I've started listening to called Intelligence Squared.
It's the podcast where great minds meet.
Listen in for some great talks on science, tech, politics, creativity, wellness, and a lot more.
A couple of recent examples, Mustafa Suleiman, the CEO of Microsoft AI,
discussing the future of technology.
That's pretty cool.
And writer, podcaster, and filmmaker John Ronson, discussing the rise of conspiracies and culture wars.
Intelligence Squared is the kind of podcast that gets you thinking a little more openly about the important conversations going on today.
Being curious, you're probably just the type of person
Intelligence Squared is meant for.
Check out Intelligence Squared wherever you get your podcasts.
Something you should know.
Fascinating intel.
The world's top experts.
And practical advice you can use in your life.
Today, Something You Should Know with Mike Carruthers.
Have you ever been over at someone's house and said,
do you mind if I charge up my cell phone?
And then wondered, I wonder what that does to their electric bill.
I mean, what does it cost them to let you charge up your phone?
Well, Forbes.com did the research on that
and what it costs in electricity to charge up or use other devices.
Here's what they found.
For your iPad, if you fully drain and charge your iPad every other day, it will use about 12 kilowatt-hours of electricity per year, and that means it'll cost you about $1.50 all year.
If you've ever felt guilty about charging your phone up at your friend's house, you
can stop feeling guilty about it.
Typically, your smartphone will use about 25 cents worth of electricity per year, and
a laptop will use about $8 per year.
If you watch an average of five hours of TV on a plasma TV, that'll run about
$45 a year in electricity. An LCD TV will be about $20 a year for that same five hours of television
per night. A 60-watt incandescent light bulb, if you run it for 10 hours a day, will cost about $26 a year in electricity.
An equivalent LED bulb, if you run it for 10 hours a day for one year, will cost about
$4.40 a year.
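All of these figures follow from the same simple arithmetic: annual cost is wattage times hours of use per day times 365, divided by 1,000 to get kilowatt-hours, times the price per kilowatt-hour. Here is a minimal sketch of that calculation, assuming an electricity rate of about 12 cents per kilowatt-hour, which is roughly what the Forbes figures imply; actual rates vary by region.

```python
def annual_cost(watts, hours_per_day, rate_per_kwh=0.12):
    """Estimate a device's yearly electricity cost.

    rate_per_kwh is an assumed average residential rate;
    actual rates vary by region and utility.
    """
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

# 60-watt incandescent bulb, 10 hours a day -> about $26 a year
print(round(annual_cost(60, 10), 2))

# Equivalent LED bulb (roughly 10 watts), same usage -> about $4.40 a year
print(round(annual_cost(10, 10), 2))
```

Plug in any device's wattage and daily usage and the same formula reproduces the rest of the numbers in this segment.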
Clothes washers and dryers, now there's a big variation in this category, but on average,
they cost the typical family about $300 per year in electricity.
A Tesla, which is not an inexpensive car, will at least save you money in fuel.
A Tesla will use about $450 a year in electricity, while a car with a gas engine costs you typically about $2,200 a year in fuel.
And a microwave oven, if you run it for 15 minutes on high, will cost you about
four cents. And that is something you should know. I know my mama brought me up to tell the truth, and the truth was the truth, the facts.
But today there are alternative facts, fake news. Truth has become something to argue about.
But how can that be? How can the truth be the truth and also something to argue about?
This is really interesting, and Hector MacDonald has studied this very carefully.
Hector is a strategic communications consultant.
He has advised the leaders of some of the world's top corporations,
as well as the British government.
And he is the author of a new book called Truth,
How the Many Sides to Every Story Shape Our Reality.
Welcome, Hector.
Thanks, Mike. It's great to be here.
So it does seem that lately, anyway, that the truth has taken a bit of a beating,
that there are more versions of the truth, maybe, than before.
Is that true, or is this becoming more of an issue because the conversations about the truth are just getting nastier?
Well, I agree, it has taken a beating.
And unfortunately, what we're seeing recently is a real disregard for and blatant indifference to the truth,
to the extent that lies and falsehoods are being propagated on a scale never seen before.
However, what I think that masks is the longer-term trend,
which is that we've always been capable of misrepresenting the truth
without outright lies, without needing to use falsehoods. Clever communicators, but also all
of us in our different ways find it convenient at times to use different versions of the truth
to forward our agendas, to make ourselves look good, to win arguments, whatever it may be.
So this is not a new story.
It's just that everything is being blinded a little bit at the moment by, you know,
the extraordinary levels of falsehoods and alternative facts that are flying around.
And so what are we to do with this all?
How do we make sense of this when, you know, we used to think anyway in a simpler,
maybe in a simpler time, that there was the truth, and then there was
other stuff.
But now there isn't just the truth.
That's right.
And I think we have to start by getting a grounding in what's going on.
So I took a sort of almost scientific approach to this, to try and understand if we could
categorize truth into different forms, and see that it's more than just factual truth,
which is what perhaps most listeners will be familiar with.
But actually, many of the statements we make are not factual at all.
They're value judgments, things like, you know,
it's wrong to kill is a truth that most of us would subscribe to,
but it's not a fact as such.
It's a morality statement.
We make statements which are based on what I call artificial truths,
things like the definitions that we give to words or the social constructs we create,
the companies and government organizations that we create.
These things are very malleable because they're artificial, because we've created them.
And finally, I also look at what I call unknown truths,
which is things like predictions
and beliefs, whether that's religious beliefs or ideologies. And here we're dealing with things
that really mean a lot to us, and we act on the basis of these things as if they're true. But of
course, we can never establish as a clear fact something that is an ideological belief or perhaps
a religious belief, but that doesn't stop us from taking them as gospel truth.
So talk about some of the, you know, the greatest moments in truth's history, if you will,
and some examples of what you're talking about here to get people, you know, to whet their appetite.
Absolutely. So perhaps you're thinking particularly of where truth has been misused or abused.
And I think that there are some real classics.
One from the U.S., a very interesting one,
when George W. Bush was making the case for war against Iraq, back in 2002 originally.
He gave a very important speech
that was broadcast across the nation in which he talked at some length about both Al-Qaeda,
which of course had recently committed the atrocity of 9-11, and Iraq.
He interwove them in the same speech, talked about them in the same breath, but didn't actually say
they were working together, merely implied it by continuously associating them in every sentence.
Now, that's a very clever bit of rhetoric to give the whole nation, the world, the impression that
Iraq and al-Qaeda were collaborating on some terrible weapon of mass destruction to attack America.
But, of course, he never actually said that in his speech.
I think if you look at some of the interesting things that have been done with language,
also in the States, by people like Frank Luntz, who is the well-known Republican pollster,
he very cleverly changed the way people named things that were politically
sensitive, like, for example, drilling for oil, which he rebranded as energy exploration, which sounds
kind of braver and more patriotic and perhaps a bit cleaner, too. You know, he was the one who
pointed out that global warming might sound scary, but climate change, that sounds kind of manageable.
So let's talk about climate change rather than global warming. Again, another infamous one from the annals of truth
is Bill Clinton in his claim that he had not had sexual relations with that woman,
which we all remember very well. By his definition of sex, it's perhaps true, possibly true that he
hadn't had sexual relations with that woman.
But his definition of sex was not one that you or most of your listeners would subscribe to.
So the definition that appeared in legal testimony in the original court case that sparked the whole thing off was very specific and excluded all kinds of acts that most normal people would consider part of sex.
So those are just some of the kind of the classic stories that we're all very familiar with.
So how do we typically misunderstand? How do we typically
fall for stuff that, if there's a simple explanation for this, where someone will say
something as fact,
and we believe it, even though, you know, on closer inspection, maybe it's not.
Well, if it's a true statement that they're putting forward, and that's what my book is all about,
you know, truth is about truth, not lies.
If it's a true statement they're putting forward, then of course we should believe it,
but we should understand what it means and the limits of that statement.
So if someone makes a statement about, for example, in the book I use the example of driverless cars,
which is likely to become a big political issue in the next couple of years.
If someone wants to advocate for driverless cars, they will talk to you about things like the fact that fewer people will die on the roads. They're much safer than cars driven by humans. They're also much more environmentally
friendly. But they probably won't talk to you about things like the huge cyber vulnerability
of driverless cars, the potential that, you know, we will have driverless cars in our driveways.
They could be hacked by a foreign power and used against us.
They wouldn't talk about the many, many millions of jobs that would be lost by truck drivers and taxi drivers
in America and across the world when driverless cars take over.
And, you know, once the Uber fleet is entirely made up of driverless cars,
A lot of people are going to lose their jobs. So, you know, we should respect people's statements as true if they are
true, but we should also notice what they're not talking about, what they're leaving out,
what they're selectively highlighting, what they're focusing on, and, you know, at the expense
of other perhaps just as relevant information that might, you know, go against their agenda.
Well, but they may not know.
I mean, I remember the conversation about driverless cars,
and I read a thing, and it had never even occurred to me
that one of the unintended consequences of driverless cars
will be that cities, governments, will lose so much money
because there won't be tickets.
And it's not that the person necessarily was omitting that to kind of hide it.
It's just, who would have thought?
That's right.
And there won't be parking revenues for municipal authorities.
So, you know, if your driverless car can just go home or go out of the town once it's taking
you to work, you don't need to park it and pay for parking anymore.
So that's a good thing because it means that, you know,
we can turn all those parking lots into amenities, real parks,
you know, all kinds of things.
But it's bad in the sense that it means that city governments,
and many of them, as you know, in the States, as in Britain,
are pretty close to bankruptcy at the moment.
You know, they're going to have even less revenue coming in
to pay for essential social services. So, but going back to your earlier point, you said they
may not know. That may be true for you or me, but you can be sure that, for example, a Washington
lobbyist who's been employed by the auto industry to lobby on behalf of, you know, getting permission
for private citizens to buy driverless
cars, they will know all the facts. They just won't use the ones that are inconvenient to their
arguments. We're talking about the truth today, what the truth is and what it isn't. And my guest
is Hector MacDonald. His book is Truth: How the Many Sides to Every Story Shape Our Reality.
Hi, this is Rob Benedict.
And I am Richard Speight Jr.
We were both on a little show you might know called Supernatural.
It had a pretty good run, 15 seasons, 327 episodes.
And though we have seen, of course, every episode many times,
we figured, hey, now that we're wrapped, let's watch it all again.
And we can't do that alone. So we're inviting the cast and crew that made the show along for the ride. We've got writers, producers, composers, directors, and we'll of course have
some actors on as well, including some certain guys that played some certain pretty iconic
brothers. It was kind of a little bit of a left field choice in the best way possible.
The note from Kripke was, he's great, we love him, but we're looking for like a really intelligent
Duchovny type. With 15 seasons to explore, it's going to be the road trip of several lifetimes.
So please join us and subscribe to Supernatural then and now. Since I host a podcast,
it's pretty common for me to be asked to recommend a podcast.
And I tell people, if you like something you should know,
you're going to like The Jordan Harbinger Show.
Every episode is a conversation with a fascinating guest.
Of course, a lot of podcasts are conversations with guests,
but Jordan does it better than most.
Recently,
he had a fascinating conversation with a British woman who was recruited and radicalized by ISIS and went to prison for three years. She now works to raise awareness on this issue. It's a great
conversation. And he spoke with Dr. Sarah Hill about how taking birth control not only prevents
pregnancy, it can influence a woman's partner preferences, career choices,
and overall behavior due to the hormonal changes it causes.
Apple named The Jordan Harbinger Show one of the best podcasts a few years back,
and in a nutshell, the show is aimed at making you a better, more informed critical thinker.
Check out The Jordan Harbinger Show.
There's so much for you in this podcast.
The Jordan Harbinger Show on Apple Podcasts, Spotify, or wherever you get your podcasts.
So, Hector, tell the story about Coca-Cola.
Well, it's not exactly about Coca-Cola, but it's about their brand, Fanta.
I came across this story when I was having a look at corporate history
that Coca-Cola put out in 2011,
which was called something like 125 Years of Happiness.
It was a very beautifully presented corporate brochure,
but it didn't mention Fanta being invented, which seemed strange.
And I realized that the reason for this was that Fanta was invented in 1940 in wartime Germany.
In other words, it was invented in a country that was under the control of the Nazis.
Now, it wasn't invented by the Nazis.
It was created by the German branch of Coca-Cola
because they were unable to, during the wartime blockade,
source the essential materials they needed to make Coca-Cola.
They couldn't get the magic formula, basically.
So they came up with a new product,
and it was a very impressive story of innovation
and making do in tough times.
But it's not something that Coca-Cola wants you to know about
because it doesn't want either the Coca-Cola or the Fanta brand
to be in any way associated with Nazi Germany,
for understandable reasons.
And I would do the same thing if I was in their position.
Well, sure. Yeah, who wouldn't? I mean, why would you highlight, put a spotlight on that? I mean, that would be, that wouldn't be good marketing. So that's a perfectly reasonable
historical omission to make. But then when you start to look at some of the other historical
omissions that people make, they become less reasonable. And I would perhaps cite the example
that you may have seen in the book of the Texas Education Board
putting out guidelines in 2015 for what should go into the history curricula for public schools.
And leaving out the Ku Klux Klan, leaving out Jim Crow laws,
really downplaying slavery, the role of slavery in the Civil War, to the extent
that, you know, they're giving, by those omissions, quite a distorted picture of what many people
would consider to be, you know, a really important, informative part of American history, and
particularly where it pertains to racial, you know, inequality and all the concerns
of race that have emerged in recent years.
But you're not saying, are you, that if you're going to make a case
that you need to tell everybody everything about it
or otherwise you're not telling the truth?
I'm certainly not saying that.
I'm saying that you are telling the truth,
but you're not telling the whole truth.
Now, that may be the right thing to do in certain circumstances,
but as in the case of the Texas history curriculum,
that's a situation where I think, you know, public officials with a responsibility to
the whole population need to be a little bit more objective and fair in their portrayal
of the history of their country and state.
Now, you know, I'm a foreigner, so what do I know?
But you tell me whether it's fair to portray the Civil War
as primarily about states' rights,
with slavery just as a side issue.
That seems a distortion of the facts, at the least.
Sure.
But, you know, throughout history,
the whole purpose of debate is to present your side,
which by definition is going to omit the other side,
because that's the other guy's job, is to tell me his side. That's quite right. And that's,
you know, that's fair game, I would say, in debating societies and in, you know, in political
debating chambers. But is it necessarily fair if you're dealing with members of the public who
aren't trained in debating, aren't kind of ready to sniff out every possible, you know, point of weakness in your argument? Because the truth is that most of us
don't go through life as if we're in a debating chamber. We read a story in a newspaper, we hear
a report on the news, we see some quote on Facebook, we see some tweet, and we take that as
the truth. We don't look to see, ah, but what's the other side of the story?
And I guess part of my reason for writing the book is that I think we should, unfortunately.
We increasingly need to because we cannot rely any longer on the old gatekeepers
like the New York Times, the Washington Post, or, for us, the BBC, to do it for us.
I mean, I still get most of my news from the BBC,
but I'm very conscious that most of the world gets their news from Facebook,
which means they get their news from whatever random sources their friends happen to have decided to click on and follow or share,
whether it be Breitbart, whether it be some random blog,
whether it be a bit of Russian disinformation, whatever it may be. So I think because of that, because of the move to social
media news and the increasing fragmentation of the media landscape, we unfortunately each have
much more of a responsibility to try and get a handle on what is true and ways in which we might
be being misled by the messages that we hear.
Yeah. Well, so how do you do that? How do you become persuasive?
How do you convince people of your argument and still be ethical according to your standards?
Well, so I think that what you're asking is how the communicators put forward their argument
while not misrepresenting too badly the overall truth.
And I think that you have to be, each person needs to make their own ethical call on that.
When I'm working with businesses, for example, I will highlight issues that might be ethical concerns.
Say, look, this is our main message that we want to get across. But if we only say that and don't mention this other issue, then, you know, some of your staff might feel hard done by
if they later find out that we didn't mention it. So, you know, would it be okay if we include this
as a, you know, for your information, you know, bullet point? And I think that's, you know,
you have to be transparent with the people that you're working with, with your political associates, with your shareholders, with whoever it is who's involved, as to the agenda you're pursuing and the reasons why you're making messaging choices.
But ultimately, it comes down to personal ethical choices.
Right, and those are decisions you have to make for yourself.
And I can't think of a real example right now, but you could have an organization, for example,
that does wonderful things for the world,
but the founder's great-grandfather's uncle was a Nazi,
and you could make that an issue.
You could message that.
Yes, exactly.
Well, is that really necessary to bring up?
It has nothing to do with anything,
but an opponent could say,
aha, gotcha.
Well, exactly.
And we are becoming increasingly like that.
So, you know, the small part I hope I can play
in stimulating that conversation
and perhaps getting some kind of movement going
is to lay out what some of these practices are.
And that's exactly what I've done in Truth,
is to give
countless examples and stories of how truth is used for good and for ill by, you know,
well-meaning people and, you know, total villains.
But a person's truth is still always going to be tainted by their beliefs. I mean,
you can be as honest as the day is long about what you're saying and believe it to be true,
but your beliefs still enter into it.
It's impossible, unless you're doing math and saying two and two is four.
There are gray areas, and your beliefs fill in the blanks.
Well, that's true to a certain extent, but I think you can try and be responsible about these things.
So, for example, I've written about two very controversial subjects in truth. One of them is Brexit,
which is very controversial over here, and the other is climate change, which is controversial
everywhere. And I would challenge you to tell me which side I stand on both those arguments. I
don't think I make it clear in the book. So, you know, I've written at length about some of the
arguments around them without, I hope, giving away where I stand.
So I think it is possible to try and be objective
and present the arguments from both sides and think through the arguments
because actually that's what makes us better as a society
if more of us can try and see these different issues from multiple points of view.
It also is more likely to produce solutions, by the way,
because the more that we can see issues from different points of view,
the more likely we are to combine different ideas into potential solutions,
work with each other to come up with creative, imaginative ideas
that no one side in the debate would otherwise have come up with.
Well, the truth is certainly not as simple as my mama told me,
but maybe discussions like this can help everybody get a better handle
on what the truth is and isn't.
My guest has been Hector MacDonald.
He is a strategic communications consultant, and the book is
Truth, How the Many Sides to Every Story Shape Our Reality.
There's a link to his book in the show notes, and I appreciate you being here.
Thanks, Hector.
My pleasure.
Thanks very much for your time.
Hey, everyone.
Join me, Megan Rinks.
And me, Melissa Demonts, for Don't Blame Me, But Am I Wrong?
Each week, we deliver four fun-filled shows.
In Don't Blame Me, we tackle our listeners' dilemmas with hilariously honest advice. Then we
have But Am I Wrong, which is for the listeners that didn't take our advice. Plus, we share our
hot takes on current events. Then tune in to see you next Tuesday for our listener poll results
from But Am I Wrong. And finally, wrap up your week with Fisting Friday, where we catch up and
talk all things pop culture. Listen to Don't Blame Me, But Am I Wrong on Apple Podcasts, Spotify,
or wherever you get your podcasts.
New episodes every Monday, Tuesday, Thursday, and Friday.
Do you love Disney?
Then you are going to love our hit podcast,
Disney Countdown.
I'm Megan, the Magical Millennial.
And I'm the Dapper Danielle.
On every episode of our fun and family-friendly show,
we count down our top 10 lists of all things Disney.
There is nothing we don't cover.
We are famous for rabbit holes, Disney-themed games,
and fun facts you didn't know you needed,
but you definitely need in your life.
So if you're looking for a healthy dose of Disney magic,
check out Disney Countdown wherever you get your podcasts.
When you think about it, you have a system for everything you do,
whether it's your system for getting the kids up and off to school in the morning,
or the system you have for doing your job or managing your business,
or cooking Thanksgiving dinner.
NASA has systems that enable rocket ships to blast off into outer space. There are
systems that allow a car company
to put a car together so it doesn't
fall apart. Everything has a system
to it. Big or small,
simple or complicated.
And sometimes those systems fail.
You know Murphy's Law.
Whatever can go wrong, will go
wrong. But why do things go wrong? Why do systems fail?
How can you prevent failure or learn from it when it does happen?
Here to discuss that is Chris Clearfield.
He has studied systems and why they fail, and he's co-author of a new book called
Meltdown, Why Our Systems Fail and What We Can Do About It.
Hi, Chris.
Yeah, thanks so much for having me.
I think a good example to launch this discussion is the one that you give about Three Mile Island,
the nuclear power plant in Pennsylvania.
And in 1979, something went wrong with one of their reactors.
And despite all the safety systems in place, there was a meltdown, and it was the
most significant nuclear accident in U.S. history. So start there, if you don't mind.
After Three Mile Island, you know, there was an official investigation, and what the official
investigation determined was that the operators were at fault, that they had made mistakes about
how they responded to the problems in the plant and that that led to the meltdown.
But there was a sociologist who looked at the accident.
And what he realized is that the only way that the operators were – well, let me take a step back.
What he realized basically was that the logic of the accident couldn't be understood until you had a panel of, you know, engineers looking at it for nine months. And so there was no way that the operators themselves could have understood what
was going on, let alone responded in the correct way. And for this sociologist, whose name was
Charles Perrow, this was kind of a terrifying conclusion. You know, there were no huge
failures, there were no huge external shocks, and yet this series of small failures came together and led to this
big meltdown. But isn't it part of building any system to make mistakes and then learn from the
mistakes? And people make mistakes. They're the thing in the system that screws things up,
but every system requires people. When people make mistakes, the most important thing is that we have built an organization
which enables them to talk about those mistakes and that we don't turn around and blame them
for those mistakes.
Because in a complex system, from your armchair, your conference room, you can't just write
down all the things that are going to go wrong, right?
There's too many things we don't understand. There's this potential for these unexpected
connections in the system. And so what we have to do is we have to learn from the system as it's
running. And in order to learn from it, we need people to talk about the problems that they see.
We need people to talk about the mistakes that they make. You know, there's a story in the book
where there is a sailor on an aircraft carrier who drops a tool on the deck during a big exercise, and he can't find it, so he reports it. They
have to call off the exercise, send planes, you know, to divert to other places, and they conduct
this extensive search. They eventually find the tool. The next day, there's a big ceremony held
for this sailor to celebrate his bravery in coming forward and saying,
I've messed up, I've lost this tool.
You know, that's the kind of thing that we really need to see if we're going to start to get a handle on
these big systems where we can't think through all the failures ahead of time.
What are some of the big system failures that if you mentioned them, I might know?
The Target expansion into Canada is a good one, right?
So, you know, Target, the big American retail company, wanted to open stores in Canada.
They tried to open about 130 stores in a very short time frame.
They declared bankruptcy and lost a couple of billion dollars. The BP Deepwater Horizon oil spill is another good example
where, you know, really a series of small mistakes
led to this massive consequence, you know,
billions of dollars for the company
and obviously loss of life and untold environmental damage.
And I think another thing that's interesting,
you know, we were researching how effective teams manage crises, so how SWAT teams and emergency room doctors, you know, respond to unexpected events.
And I have a five-year-old, and what I started to see was that our morning routine, you know, my trying to get him off to preschool every day, that actually looked a lot like a crisis.
That actually looked a lot like what these researchers were seeing when they studied these really effective teams.
We weren't so effective, but we were able to take some of the lessons that we were really
writing in the book and transform our morning routines. It's much, much better now. So,
you know, that's kind of two big failures and I think one crisis at home that many people
might recognize.
So just because I don't remember that, what did Target do wrong that caused them to fail in Canada?
What Target saw was that they had to set up a whole new supply chain for Canada.
And that supply chain was very, very complex.
And it didn't have a lot of slack in it. They were very much trying to move goods from their warehouses to the shelves in the stores just in time, just when they needed it.
And so in that case, you had a couple of issues with the supply chain software. You had people
who had entered data incorrectly into the software. So a case of paper towels was recorded
as one paper towel and not 24 paper towels.
And these kind of small errors, there were a lot of them,
but they really combined to mean that the warehouses were overflowing,
the supply chain had basically broken down, and store shelves were empty.
So it was a series of blunders rather than some big thing that went wrong.
Exactly.
That's exactly right.
And is that the typical way systems fail?
I mean, I remember when the Challenger blew up that, you know, the problem was the O-rings and that everyone pretty much agreed that there was a problem that caused the Challenger
explosion.
Is that more typical or is it more typical that it's a lot of little things
that combine and snowball into disaster? It's a great question. You're right in the sense
that the O-ring is what caused the problem in the Challenger.
There's actually a lot of great research looking at the Challenger accident. And even though there's this
one proximate cause that we
can look at and we can put our fingers on, you know, there had been O-ring failures for several
missions beforehand. And it was actually something that NASA was looking at pretty seriously. And so
you had this whole culture at NASA that was kind of perpetuating these small errors that in many ways could have been caught and fixed
before Challenger was launched and tragically lost.
Well, I guess a part of any system is to take into account Murphy's Law,
that if you have enough parts of a system, something's going to go wrong,
and so part of the system is to catch those things, right?
Yes, that's exactly the right way to put it. The more pieces we have in the system,
the more moving parts, whether you're talking about the space shuttle or our cars or our
companies or our personal lives, the more moving parts we have in these systems, the more likely
we are to have these failures. And so the way we need to shift our perspective a little bit is by thinking about how we can learn from our systems, how we can
encourage people to speak up, and how we can catch these sort of small failures so that they don't
spiral out of control into these big ones. There's one story we talk about in the book about a nurse who almost gives the wrong
medication.
She has two patients with similar last names in the same room, and they're taking similar
sounding medications.
And she almost mixes them up.
But she catches her error.
She doesn't just fix that problem.
She doesn't just catch her own mistake.
She talks about it with her colleagues.
And so now all the nurses know to be aware of this. But then they go even a step further. They separate the patients. So now
they're not in the same room. So the confusion is less likely to happen. And then the hospital
actually builds a system to flag up when patients with similar last names are in the same room and
to prevent that from happening. So I think that's a great example of really seeing that learning
going from the nurse recognizing the problem all the way up to making the system better.
But in that case, and in so many other cases, it does seem that the mistakes have to happen first and someone pays a price.
That it's impossible or somehow not easy to anticipate what might go wrong.
We have to let it go wrong first and then go,
oh, well, I guess we need to fix that.
Well, yeah, you're onto something there.
But in this case, it actually wasn't even a mistake.
It was an almost mistake.
You know, the nurse didn't give the wrong medication,
but she used that and the whole hospital used that
as information about the
system. And I think that is, you know, that is what we see successful organizations doing. We
see them treating these small issues, not as one-offs and saying, oh, well, you know, glad
that happened. Our system worked well, but we see them treating these small issues as ways to learn about the bigger system. But usually, it's
what really motivates change is disaster. Because, you know, when there's a near miss,
it's never quite as big a deal as when two planes collide. So the two planes colliding
seems to generate more change than the almost colliding.
Yeah, I love that perspective, and I think it's a great example.
It turns out in aviation, that's, broadly speaking, not the case.
So aviation is one of the real success stories where we see over the last four decades,
even though airplanes have gotten more complex,
even though the whole aviation system has become more interconnected and there's less slack in it, aviation safety
has improved tremendously. And it's not because of technology. It's because of the way that aviation
approaches these questions. So I think you're right. In most industries, people don't learn
from near misses. I think in aviation, they tended to kind of focus on this stuff really obsessively.
And that's part of why commercial flying is so much safer today than it's ever been before.
But there's another element, too, which is as people in the world, as people who are making decisions, who are running companies or even thinking about things in our personal lives, one of the things we can do is we can learn from other people's mistakes.
We can learn from other people's failures.
So this is even a level removed from the near-miss element of things.
It's like, well, what happened to other industries, and how can we learn to manage the kind of
failures that we see emerging in lots of different places?
Well, I remember talking to somebody who made the point that, you know, one of the reasons aviation does so well versus, say, mistakes in hospitals is if a doctor, you know, cuts off the
wrong leg or whatever, you know, he's okay. But if the pilot screws up, he's dead. So when you've
got that much skin in the game, things get better. Yeah, it's a really interesting thought. The counter example to that
would be aviation four decades ago, the pilots were still up front, and they were still, you know,
the first ones to arrive at the crash site, as the saying goes. But they didn't have all of the tools
that they needed to make aviation safer. And I think what we've seen in that industry is not only
this focus on near misses,
but we've also seen these relatively small interventions that just help flight crews
communicate and share information that they're concerned about much, much more effectively. And
that's a really amazing thing. I mean, the bigger lesson for that, I think, is that we can learn
as an organization, we can train people to speak up and to listen to
these voices of concern, and we can use that to make our systems much, much safer.
I want to try to bring this down into a little more of a personal thing, and you use the example,
which I like because I had the same example, of the morning routine. And we have a morning routine
with my boys getting up, and often it's crisis and it's, you know, come on, we're going to be late.
And, you know, it occurred to me, you know, if we just started this five minutes earlier, all of this would go away.
And yet people don't think that way often.
They just think he's just got to hurry up.
But if you give him more time, so I guess what I'm asking is, what are the
takeaways here? I mean, I would imagine that in general, systems that are simpler are better, yes?
Yes and no. I mean, it turns out the antidote to these kind of problems isn't necessarily
simplicity, but it's transparency. And with your example with the morning routine, I mean,
I think there's
two lessons, adding more time, getting up five minutes earlier, starting five minutes earlier,
that's kind of creating more slack in your system, right? So these small problems like,
oh, I can't find my jacket, you know, now you have more time to absorb them. And that's broadly
speaking a good thing. But the other thing that we learned when we studied these crises is that
your idea, start five minutes earlier,
that's a great idea, but we can't necessarily predict the outcome. You know, that may just mean
that your boys move slower in the morning. And so I think the real key and what really successful
organizations do when dealing with these kinds of complex systems, which the morning routine is,
weirdly, is they try something and then they see how it goes, they circle back,
and then they try something else. And so what we started doing in our family is every weekend
having a five-minute meeting that's like, okay, how did stuff go last week? What worked? And what
should we try differently next week? And so you may find that this five-minute buffer works,
and that's great. Then you incorporate it. You may find it doesn't work. Then you have this opportunity in your family meeting to try something new and to figure out
something else. And that's just what emergency room doctors, pilots, and SWAT teams, that's
exactly how they approach these kind of things. And it was kind of fascinating for me that that's
something we could use in our day-to-day. But those guys keep training and training.
If you're trying to get your morning routine down,
good enough is good enough. And I would imagine at some point you stop examining
how we can shave off three quarters of a second on the morning routine. But that three quarters
of a second may mean something to the SWAT team. It doesn't mean much to get to school.
Yeah, you're absolutely right. But, you know, these kids, when they're young,
they're different every week, right? So something that works one week may not work the next week.
And I think that's one of the things too, whether you're talking about competition in the workplace,
you know, between companies, or you're talking about the family routine, every day is a different
day. Every week is a different week. So, you know, we've seen this where some of the solutions that we've tried,
they work for two weeks, three weeks, and then they stop working.
And so we're back to the drawing board,
not because we're trying to save an extra minute,
but because what we've got doesn't work.
And we have to be more adaptive, both in our morning routines
and more broadly in these big systems that we have to take care of.
Last question, and that is: when you have lots of systems with lots of parts,
doesn't randomness play a role that things will go wrong because that's just the way the universe works?
Things happen.
Yes, exactly. And we make an analogy to chaos theory, which I think is in many ways a very good description of what we're seeing. These systems get so complex that you can't specify all the failures ahead of time. We have these systems, we have all these connections, and we should expect some base level of, you know, random failures. What's going to go wrong with your car? You can't say. But if you have a car for long enough, something's going to go wrong
because there are so many things in that car that can go wrong.
Chances are something's going to go wrong.
You can't predict what it is, but you hopefully, you know,
when the brakes go or the transmission goes, you're prepared.
You've got the money to get it fixed, because something will go wrong.
And that is what's so interesting about this.
My guest has been Chris Clearfield.
His book is Meltdown, Why Our Systems Fail and What We Can Do About It.
You'll find a link to his book at Amazon in the show notes.
Thanks, Chris. Appreciate your time.
Yeah, thank you, Mike. This was a great conversation. I appreciate it.
It's estimated that about 65% of all Americans have bad breath,
and I bet about 100% of all Americans have worried about it at some time or another.
There's a lot about the problem of bad breath that you probably don't know.
Bad breath is, first of all, all in your mouth.
There's a common myth that the stomach causes bad breath, but there actually isn't a constant airflow
between your stomach and your mouth.
A stuffy nose can cause bad breath.
When a cold prevents you from breathing through your nose,
you're forced to inhale and exhale through your mouth.
This dries out the tissues and reduces the flow of saliva,
which is your mouth's built-in cleanser.
The less saliva, the more bacteria, the more the bad breath.
Mouthwash is a problem.
Mouthwash with alcohol often promises to kill 100% of the germs,
but what they don't tell you is those germs repopulate in less than an hour,
causing what's called rebound bad breath. Some alcohol-free mouth rinses can be beneficial,
and the results can last longer. Eating cheese or other dairy products can help neutralize acidity,
and that will cut down on the bad breath. Bad breath is a side effect of many drugs,
such as anti-anxiety drugs, antidepressants,
and even allergy medicines like antihistamines
can also produce a dry mouth and hence bad breath.
Chewing gum with xylitol is good.
Xylitol is a sugar substitute found in many gums and dental products,
and it helps to keep bacteria at bay and help with saliva flow.
And that is something you should know.
We have great advertisers on this program.
I hope you will support them.
By supporting them, you support this podcast,
and I am sure you'll be happy with anything you buy from any of them.
I'm Mike Carruthers.
Thanks for listening today to Something You Should Know.