Passion Struck with John R. Miles - Alex Edmans on Critical Thinking in a Post-truth World EP 463
Episode Date: June 4, 2024

Order a copy of my book, "Passion Struck: Twelve Powerful Principles to Unlock Your Purpose and Ignite Your Most Intentional Life," today! This book, a 2024 must-read chosen by the Next Big Idea Club, has garnered multiple accolades, including the Business Minds Best Book Award, the Eric Hoffer Award, and the Non-Fiction Book Awards Gold Medal. Don't miss out on the opportunity to transform your life with these powerful principles!

In this episode of Passion Struck, host John R. Miles interviews Professor Alex Edmans on the importance of critical thinking in a post-truth world. Edmans discusses biases such as confirmation bias and black-and-white thinking that affect how people interpret information. He highlights examples like the Brexit referendum and the narrative fallacy to illustrate how misinformation can influence decision-making. The episode emphasizes the need to scrutinize information, disentangle correlation from causation, and consider different perspectives to make informed decisions in a world filled with misinformation.

Full show notes and resources can be found here: https://passionstruck.com/alex-edmans-critical-thinking-a-post-truth-world/

In this episode, you will learn:
- Post-truth world: Misinformation is prevalent, and people tend to believe what aligns with their biases and identity rather than facts.
- Data interpretation: Data can be manipulated to support a particular narrative, leading to misleading conclusions.
- Narrative fallacy: Creating a cause-effect explanation where none exists, often seen in successful books and talks that present a simplified story to explain success.
- Causation vs. correlation: Understanding the difference is crucial to avoid misinterpreting data and drawing incorrect conclusions.
- Context matters: Cultural context influences how information is perceived and spread, highlighting the importance of understanding different perspectives.
- Critical thinking: Applying critical thinking skills to scrutinize information and consider alternative perspectives.

All things Alex Edmans: https://alexedmans.com/

Sponsors
Brought to you by Claritin: fast and powerful relief is just a quick trip away. Ask for Claritin-D at your local pharmacy counter. You don't even need a prescription! Go to Claritin.com right now for a discount so you can Live Claritin Clear.

For information about advertisers and promo codes, go to: https://passionstruck.com/deals/

Catch More of Passion Struck
- Can't miss my episode with Marianne Lewis and Wendy Smith on Applying Both/And Thinking to Solve Your Toughest Problems
- My solo episode on Does the Concept of Free Will Really Exist
- Listen to my interview with Dr. Dolly Chugh on A More Just Future and How Biases Impact Our Lives
- Watch my episode with Arthur Smith on the Mind-blowing Art of Intentional Storytelling
- Can't miss my episode with Katy Milkman on Creating Lasting Behavior Change for Good

Like this show? Please leave us a review here -- even one sentence helps! Consider including your Twitter or Instagram handle so we can thank you personally!
Transcript
Coming up next on Passion Struck. Why is it that, when you see such strong evidence, people might not respond to it in the way that they should? It is because of
these biases and these biases are reinforced by the fact that sometimes climate change is a matter
of identity and politics rather than science. So one great documentary on climate change was
An Inconvenient Truth and that was laden with facts and figures and evidence. But because it
was about Al Gore, this made it seem like a Democrat versus Republican issue. So even if
you're a Republican who is able to understand data and science, and you're generally rational,
now your identity feels threatened because you think, well, climate change is something that people like them believe and people like us, we should resist.
Welcome to Passion Struck. Hi, I'm your host, John R. Miles. And on the show, we decipher
the secrets, tips and guidance of the world's most inspiring people and turn their wisdom
into practical advice for you and those around you. Our mission is to help you unlock the power of intentionality
so that you can become the best version of yourself.
If you're new to the show, I offer advice
and answer listener questions on Fridays.
We have long form interviews the rest of the week
with guests ranging from astronauts to authors, CEOs,
creators, innovators, scientists, military leaders, visionaries,
and athletes.
Now, let's go out there and become passion struck.
Hello everyone, and welcome back to episode 463 of Passion Struck, consistently ranked
the number one alternative health podcast and a heartfelt thank you to each and every
one of you who return to the show every week, eager to listen, learn, and discover new ways
to live better, to be better, and most importantly, to make a meaningful impact in the world. If you're new to the show, thank you so much for being here. Or if you simply want to introduce this to a friend or a family member, we so appreciate it when you do that. We have episode starter packs,
which are collections of our fans' favorite episodes that we organize in convenient playlists
that give any new listener a great way to get acclimated to everything we do here on the show.
Either go to passionstruck.com slash starter packs
or Spotify to get started.
I'm thrilled to share an incredible milestone
we've just achieved together.
We've officially crossed 40 million downloads.
This isn't just a number,
it's a testament to the movement we're building.
The conversations we're sparking and the change we're inspiring across the globe.
In case you missed my interviews from last week,
I had enlightening conversations with Dr. Terry Wahls and Brian Evergreen. Dr. Terry Wahls shared her revolutionary approach to health and wellness, detailing how she defied conventional medical wisdom with the Wahls Protocol. This diet-based
treatment transformed her life with multiple sclerosis and offers profound
insights into how dietary choices can impact chronic diseases and overall
health. Brian Evergreen takes us on a journey through his latest work, Autonomous Transformation: Creating a More Human Future in the Era of Artificial Intelligence.
He reveals the critical imperative facing today's leaders, the need to pivot from
outdated mechanistic approaches to a new era of human-centered social systems empowered
by the latest advances in AI.
And if you liked those previous episodes or today's,
we would so appreciate you giving them
a five-star rating and review.
They go such a long way in strengthening
the PassionStruck community, where
we can help more people to create an intentional life.
And I know we and our guests love to hear your feedback.
Today, we have the distinct pleasure
of speaking with Professor Alex Edmans, a true luminary
in the world of finance and economics,
currently enlightening minds at the London Business School.
Alex's journey has taken him from the halls of MIT, where he earned his PhD as a Fulbright
scholar, to the trading floors of Morgan Stanley, and on to the esteemed faculty of Wharton and the World Economic Forum in Davos.
Alex isn't just a scholar, he's a sought-after speaker whose TED talks, including What to Trust in a Post-Truth World, have amassed 3 million views, challenging and reshaping our understanding of business's social responsibilities and the power of a pie-growing mindset. Today, we delve into his compelling new book, May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases and What We Can Do About It. In this critical work,
Alex dismantles the minefields of misinformation that bombard our daily lives, from fabricated
tales that tug at our heartstrings to flawed studies that skew public policies and personal beliefs.
With vivid examples and rigorous analysis, he unveils the biases that lead us astray
and arms us with strategies not just to survive, but to thrive in a world awash with misinformation.
Join us as we explore the essence of critical thinking, the importance of challenging the
sources we trust,
and how to empower ourselves
through informed skepticism and personal accountability.
Get ready to rethink how you perceive the world
and your role in it.
Thank you for choosing PassionStruck
and choosing me to be your host and guide
on your journey to creating an intentional life.
Now, let that journey begin.
I am absolutely thrilled and honored today to have Alex Edmans on Passion Struck.
Welcome Alex.
Thanks so much, John.
It's great to be here.
Alex, you and I were introduced by Katy Milkman, our mutual friend, and I'm so interested to
understand how you got involved with the Behavior Change for Good initiative that she and Angela co-lead.
So I was a professor at Wharton in the start of my career. I started in 2007 and I worked
in finance and Katie joined the operations management department a couple of years later.
Now often as an academic you have your head down, you just focus on only what you're doing,
but Katie and I wanted to build some camaraderie within the junior faculty.
So both she and I and another professor called Cassie Mogilner organized some junior-faculty-wide events in order to encourage interdisciplinary interaction. And so that's how I knew of Katy's
work. Her work is behavioral and some of my work on finance is also behavioral, suggesting how
psychological factors, nudges can affect decisions.
Yeah, that is fascinating. And you have a best-selling book before the one we're discussing today, but today we're going to be going through your brand new book, May Contain Lies, which launches the week that this episode comes out.
And I was hoping you could give us a brief overview of what prompted you to write May
Contain Lies and what you hope readers will take away from it.
So my day job is as a professor of finance and what professors do is they produce research and that research will get disseminated.
Now, for many professors, their main goal is for this to be disseminated through academic journals and to be read by other academics. But I really like research that impacts
the real world. So how can this affect how investors allocate their money, how executives
run companies. But when I started to do this I realized that how practitioners would respond to
research would often be driven by whether they liked its findings rather than whether the research was accurate. So if it was a piece of research that accorded with their worldview, they would say this is the world's best paper, and even I wouldn't call my own papers the world's best papers. And if there was a piece of research which contradicted their view of the world, they would say, well, that's just academic research that
has no bearing on the real world. So I thought then what are we doing as an academic profession?
We are supposed to be producing and disseminating information, but if the way in which the information
actually has impact is based on playing on people's existing beliefs, then some great
research will never have an impact and some flimsy research will.
So this is why I wanted to highlight the biases
that cause readers or practitioners
to fall for misinformation and to separate out
what is good research from what is flimsy.
Yeah, it's so interesting because my career was spent
primarily in management consulting for companies
like Booz and Andersen Consulting,
and then in Fortune 50 Enterprises.
And I can't tell you how many projects I've seen
where they are using research that's only slanted
in the way that drives the outcome
that they're hoping to achieve from the project,
and they don't share any research
that shows something completely different.
My whole point of bringing that up is, it's something we do on a regular basis in all
areas of our lives.
It's so fascinating.
Absolutely.
And this is not because people intentionally set out to mislead.
They're not bad people, but they're human and humans have their biases.
They believe what their particular view of the world is.
And so it's human that we will latch onto something which accords with our worldview.
And if something doesn't,
we instinctively think that it's wrong.
And we might think, well,
we're avoiding spreading misinformation
by not publishing stuff that is wrong.
So we will tune it out and only present stuff
which accords with our worldview
because we think that's genuinely right.
So interesting.
Well, Alex, you start out the book with a bang by discussing your experience testifying in front of the UK House of Commons Select Committee on Business.
And you were put into a really difficult situation.
How did you end up discovering outdated and misleading evidence was being presented by another witness?
Yes, so let's describe what it means to be summoned to a select committee. So there's an inquiry into corporate governance, which is the way that companies are run.
And I got there early because I was nervous. I wanted to swot up on any question that the committee might ask me.
And so I sat in on the earlier session, but my ears pricked up because the witness in that earlier session mentioned some research which sounded noteworthy.
So it claimed that the lower the gap between CEO pay and worker pay, the better the company performance.
Now, this was music to my ears because much of my work is on sustainability, responsible business,
and it's responsible for businesses to pay their employees fairly and I want to claim that responsible businesses outperform
so this seemed to be a study which could be another weapon in my arsenal. So I
wanted to look it up and so I went to the witnesses written submission which
you have to submit before being called to testify and I saw the reference to
the paper. I looked the paper up myself and the
paper claimed completely the opposite result. It said the lower the gap between
CEO and worker pay, the worse the performance. I thought, yeah, I'm nervous because I'm
about to testify myself, but I'm not so nervous that I'm going to be misreading this paper.
It was there, clear as day; it had completely the opposite result of what the witness claimed.
And so I dug a little deeper, and I figured out what had happened: the witness was quoting a half-finished paper.
The finished version had actually come out and after going through peer review and correcting its mistakes,
it came up with completely the opposite result.
So I notified the clerk to the select committee about this after my own session.
And he seemed appalled.
He said, I should submit some
supplementary evidence highlighting that the initial evidence got overturned.
I did. The committee published it.
Yet their final report on the inquiry referred to the debunked study as if it were gospel.
And so this taught me two things. Number one, it taught me that even sources that we think are reliable, like a government-commissioned study, may be incorrect
because they're undertaken by humans and humans have biases. And number two, you
can almost always find research to support whatever position you want to support, even a half-finished study when the finished version shows the opposite. So we like to bandy around these phrases like research shows that, studies find that, evidence proves that, but you can always find studies to support whatever you want to support.
So the fact that a study shows something doesn't necessarily mean that it's true.
Yeah, so when you're faced with situations like that where you've got studies on both
sides of an issue, how do you present the right side of the issue, or in your opinion
do you need to present both sides and then let the recipients draw
their own conclusions?
So there's a couple of things that you can do.
Firstly, what you can try to do is scrutinize the rigor of the study and not just be swayed
by the conclusions.
And so this might be thinking, well, is this correlation or is this causation?
So we might find a link between CEO pay and company performance,
but is it CEO pay that drives company performance or is it the opposite direction? If a company's
already performing poorly then maybe it can't pay the CEO as much or maybe there's a third factor
that causes both. Maybe there are certain industries in which CEO pay
tends to be higher and those industries also tend to be better performing so
that could be causing the outcome. And so you might think well isn't this tricky
to try to think about alternative explanations but it's not tricky. You
don't need a PhD or even a degree in statistics to do this. All the time when
I share studies on
LinkedIn or somebody else shares a study on LinkedIn that people don't like the sound
of, then there's no shortage of comments as to why it might be flimsy. It could be that it's correlation but there's no causation. But we suddenly switch off our critical thinking faculties when we see a study that we do like. So I think the first thing that we can do ourselves is just apply a sanity check: are there alternative explanations for the same particular result?
But the second thing that we do is something that you were mentioning John is try not to be swayed
by one particular study because as I mentioned there are studies that can show everything. There
was a study published in a reputable journal showing that vaccination causes autism.
Rather than just stopping and finding the study that gives the result that we want,
let's try to look more generally at other studies.
Is there indeed a credible other side?
Sometimes it could be as simple as googling for the opposite of what we'd want to be true.
Maybe I want an excuse this evening to drink lots of red wine.
And if I googled red wine is good for your health, I'm sure I'd find some studies claiming that.
If instead I googled why red wine is bad for your health, let me see whether that throws up any high quality studies.
Now, I want to dive into the health area even in more detail, given we're an alternative
health podcast here in a second, but I thought to really make this apparent for the listeners,
I think it's important to go to the 2016 Brexit referendum that you cited in the book, because
I think this is a great example of how misinformation not only influenced public opinion, but overall
decision making on something that had major societal implications. Can you discuss this
example?
Absolutely. So in 2016, the UK had a referendum on whether to leave the European Union. And
there were two campaigns: one was Remain and the other was Brexit, to leave. And one of the big pieces of information which may have affected people's vote was
the side of a bus. So the Brexit supporters, the Brexit campaign, had taken out advertisements
on the side of buses in the UK where it says EU membership cost the UK £350 million per week, let's fund the
National Health Service instead. Really powerful message, right? Health matters to everybody.
I'm sure this resonates with the listeners of this podcast and this is why we thought
well if we left the EU we would be able to have a better health service. But that number
was completely wrong. So number one,
the actual figure was £250 million, but then there's a huge amount of rebates that the UK gets back from the EU. So that £350 million per week is actually £120 million, and that is a third of the original size. Now you might still say £120 million per
week is a lot, but compared to other things the government spends its money on, it's not.
And there's lots of benefits, such as free trade and the free movement of people. Now, people didn't question this, right?
So in the past, we knew what the reliable sources of information were. We had to go down to the library and get out the encyclopedia, or you'd go to a doctor for medical advice.
You'd never think that the side of the bus is a reasonable
source of information, but because this was something that
played on people's biases, because people wanted it to be
true, their confirmation bias was in action, and they paraded
this 350 million pounds number, even though it had not been
vetted and it had come from an unreliable source.
Man, to me, it's just so fascinating. And I look
at what happened there. And today, what's happening around the world with some of the conflicts we're
seeing and how much misinformation there is on both sides, depending on what story the other side wants to convey to its worldwide audience
about what is going on. It's so interesting how this impacts things on a
regional and global basis.
And if people want it to be true, they won't question it. So both your incentives to spread misinformation are high, but also your ability to do so is high, because people won't question it.
Yeah, absolutely. Alex, I think it's important for people to understand biases because they
play a very important role in the beginning of the book. And I'm going to go through each
one separately. Can you discuss what confirmation bias is in case a listener is unfamiliar and
how that ends up distorting our perception of information?
Certainly. So confirmation bias is the idea that we have a view of the world and so any piece of evidence that confirms that view, we will accept it uncritically.
And the basis of this is even neurological. When we see something we like, then this activates
the striatum, that's the part of the brain that releases dopamine. It just feels good
to see something we like. And the flip side to this is that if we see something that challenges
our worldview, then we immediately want to dismiss it. And again, neurologically, when
we see something we dislike, this triggers the amygdala
that is the part of the brain that activates a fight-or-flight response. We respond to information we don't like as if we're being attacked by a tiger. So let's make this concrete. Let's say
you're somebody who strongly believes that climate change is a hoax. So if you see some new study come around
which says climate change is a hoax
which is perpetrated by certain policymakers,
you will accept this, you'll tweet this from the rooftops
without scrutinizing what it actually says.
And then if you saw a study which found the opposite
that climate change is manmade,
you might not even read it to begin with.
Or even if you did read it,
you'd now read it with a critical eye,
trying to tear it apart,
trying to look at any possible alternative explanation
for the conclusions which were drawn.
So that's about how confirmation bias leads us
to responding to information in a biased manner.
But it goes even further than that.
It will also affect what information we search for
to begin with.
For example, if you tend to be more right-wing, you will only watch Fox News. If you tend to be more left-wing, you might only watch MSNBC.
And so we only get certain parts of information
to begin with, we are living in an echo chamber.
It's so interesting.
I just want to talk about climate change for a second.
I'm not sure if you know who David Attenborough is.
He's from Britain and he's been studying the effects on the
world for 93 years and Netflix just did this incredible
documentary profiling him talking about the changes that he has seen throughout his lifetime.
And it's just amazing to me because 93 years ago there were like 2.8 billion people on the earth, and when it all started only 30% of the world was inhabited, populated areas, while 70% of the world was still wild. And you look at where we are today, and we now have more than double that amount of people, and now we're occupying almost 70% of the world.
And you start seeing how this imbalance starts impacting everything.
And it's so interesting to me how,
when you can see this through his eyes and through the lens of this very well-made documentary, how much
things have changed, how people still won't believe
that it's changing and how there's such a different song
that's being sung when there is so much empirical science that
shows the changes are
happening. To me, it's just baffling.
And it's also baffling to me. So David Attenborough always comes close to the top of lists of the most revered Britons of all time. Maybe Winston Churchill is number one, but he is not far behind, because of all of his work using evidence to show the importance of climate change and its impact on the environment.
So why is it that, when you see such strong evidence, people might not respond to it in the way that they should?
It is because of these biases, and these biases are reinforced by the fact that sometimes climate change is a matter of identity and politics rather than science. So one great documentary on climate
change was An Inconvenient Truth and that was laden with facts and figures and evidence.
But because it was about Al Gore, this made it seem like a Democrat-versus-Republican issue. So even if you're a Republican who is able to understand data and science, and you're generally rational,
now your identity feels threatened because you think, well, climate change is something that people like them believe and people like us, we should resist.
And similarly, some other messages we'll come across, there was a message ridiculing Ted Cruz for being a climate change denier and saying 97% of scientists agree that climate change is man-made.
But because you poke fun at Ted Cruz, then this might lead Republican supporters to stand up for Ted Cruz and stand up for the underdog.
And this now seems to be something where even if you're backed into a corner because the scientific evidence is pointing in one direction,
because this is not a debate about evidence but
ideology and whose side you are on then people might tune out the evidence because their identity
is something important to them. So then the best way to try to ensure greater climate literacy and climate knowledge is to disentangle the message, the evidence, from the ideology.
Perhaps sometimes resist the temptation to
label the other side as uninformed or deniers or going against science, instead
just present what the science is and what the evidence is without linking it
to a particular political affiliation. Sometimes it might involve Republicans
highlighting the importance of climate change. It might also mean the
importance of highlighting not just the causes but the solutions. So if the
solution to climate change is taxation and regulation, Republicans might be
unwilling to accept climate change is real because they don't want the
solution, but if instead the solution is innovation, ingenuity, capturing carbon in strong and deep geological formations, or launching some solar reflectors into the atmosphere to reflect the sun's rays, those are things which will be in accordance with Republican values.
And that might cut through more than just presenting the facts.
And that Alex is a great introduction into the other bias that you bring up in the book.
Last year I had the honor of interviewing Marianne Lewis and Wendy Smith, who you might
know.
Marianne is a professor at the University of Cincinnati.
Wendy is at University of Delaware.
Their book was a finalist for the Next Big Idea Club's Book of the Year.
And congratulations to you for your book being nominated as a must read for the May edition
of the Next Big Idea Club.
But their book covers both/and thinking.
And it really goes into the ramifications of what happens
in most of Western society,
which has been taught to think about either or thinking,
or as you bring up in the book, black and white thinking.
And I think what you just expressed about Republicans
and if it was just explained in the terms that you use,
taxation, et cetera,
that's never gonna be accepted by them.
But if you start thinking a bit
that it's both that and these other things,
it makes it such a more palatable discussion.
So can you perhaps think of another example of this
where people get stuck in this black and white thinking
and how both-and thinking would change their complete
rationale for how they're thinking about a topic?
Absolutely.
So let me explain what black and white thinking
is to begin with.
So this is the view that something is either always bad
or always good. So this contrasts with
confirmation bias because confirmation bias is that you have a given viewpoint
and you look for stuff that supports that viewpoint and you might think well
that applies to many things, gun control, immigration, abortion, there's all
pre-existing viewpoints. But there's many things about which we might not have a
pre-existing viewpoint. So let's say food intake.
So protein, most people's view is that protein is good, it builds muscle.
Fat, most people's views on fat is it's bad, it makes you fat, that's why it's called fat.
But with carbohydrates, that's a bit more neutral.
So you might think, well, people don't have pre-existing views.
But what black and white thinking means is that people think that carbs are either gonna be always good or always bad,
there's no middle ground, so even though you don't know which side you're on, you
know that it can only be on one side. And so an example in which black and white
thinking was exploited was the Atkins diet. So Robert Atkins came up with a
diet which argued that we should have as few
carbs as possible. It was really extreme, minimised carbs. Not something nuanced, not something in
between where carbs are fine as long as they're 30 to 50 percent of your daily calories, he just
demonised carbs. So why was his book so successful? Why is it still the best-selling weight loss
book in history, even though it's been debunked by many scientists? Because it plays on black
and white thinking, our view that something has to be either always good or always bad,
and also because it's really simple to implement. We only need to look at the nutrition label
of food and look at the carbs line to figure out whether to eat something or not, rather than thinking about is it complex carbs or
simple carbs, maybe there's certain amounts of carbs that we should be
eating and certain amounts that we should not. Notice that had Robert Atkins come up with completely the opposite conclusion and had a carb-only diet, maybe that would have also
gone viral. Why? It's easy to implement. It plays on black and white thinking by suggesting
we should have as much of something as possible. And indeed, sometimes now the protein diets
might be playing into that role of black and white thinking, more is always better. We
don't allow for the possibility of diminishing returns or nuances.
Yeah, I think that's a great example. And I wanted to hit on this term that you use in the book called
post-truth world, where misinformation is prevalent across various aspects of our lives, which is the world that we're living in right now. And staying on this health theme, one of the post-truth world realities to me is the way
that Western medicine has been treating us now for decades, based on the symptoms that
you can think of as the leaves of a tree instead of looking at us holistically.
And to me, it's so fascinating now as you get into personalized medicine or functional
medicine, how we're finding that oftentimes the best way to approach what's going on in
your life has nothing to do with pharmacology and writing a prescription.
It has everything to do with behavior change and the lifestyle that you're living.
And yet people don't want to accept that
because that's not what Western medicine
is really pumping out from an education standpoint.
In fact, very few doctors get much background in diet
and things like that in what they're being trained to do.
So I think this is another one where the biases
are really harming people, because there's such a different and intentional way that we could go about making ourselves healthier, yet it's not what's being prescribed by the majority of the doctors that we see.
Well, people want to have easy solutions, so behaviour change is difficult. We have to unlearn behaviors that have been with
us for maybe 40 or 50 years. Whereas the idea that you can take a pill, that's easy. Or if the
change in behavior is something that might cause a little bit of pain but can be gamified, then
that's something which is relatively easy to do because there's a clear target to aim for.
So the idea of cutting out carbs, that's a bit like gamification, try to have
as few carbs as possible.
And so even if it might be challenging at the start, because you really like
rice or bread, it's something where there's a clear target.
So if you contrast that with certain behavior changes, for example, to reduce
blood pressure by not getting stressed in particular situations
and by managing your emotions, that's something much harder to achieve.
That's something much harder to measure.
There's not the gamification element to it compared to cutting out carbs or
taking a couple of pills every day.
So Alex, you've given talks and you've participated in forums all over the world.
You've got some great TED talks that have over 3 million views that I'd like to point
out.
But how do you think that different cultural contexts affect the perception and spread
of misinformation?
I think they certainly do have a huge effect.
Why?
Because the cultural context will mean that there is a pre-existing view within a particular
country or within a certain organization and therefore a message might resonate or not
resonate depending on whether it agrees with that particular viewpoint.
One of my main fields is sustainability or responsible finance, responsible business
as I've alluded to previously.
This is something which is really good for the planet,
it's really good for the people.
And if I go to some companies,
I will typically highlight that message.
But in contrast, if I go to investment banks
or private equity or maybe law firms,
if I was to give that message,
that might just come across as wishful thinking,
as going around in a circle and singing kumbaya
rather than
having a commercial knowledge of business. So when I speak to those
organizations I will slightly emphasize different things in the message, how
there's a commercial imperative, how evidence suggests, and this is high
quality evidence, that companies that are more sustainable will do better in the
long term. So this is not just a way of saving the dolphins, this is a way of making your company commercially and
financially successful. Notice this is not a case of chopping and changing the
message and saying different things to different people which cannot all be
true at the same time. That's inconsistent, and you'll quickly be found out for doing this, in particular since one way I communicate is often in written form, through newspaper articles, and people can easily find out if you're contradicting yourself. It's instead to look at the same picture from different perspectives, emphasizing
different things about a message. Even if I have 30 minutes to give a talk, I can't
give everything, so I might highlight the commercial imperative more for
certain audiences and the social and economic imperative for others. But it's really important
to understand the context in which you're speaking. Otherwise your message may not cut through,
no matter how rigorous it is in terms of the evidence basis.
Now, it's really important. I've traveled myself to over 40 countries around the world. And
it's so interesting, depending on what part of the world you're in, how
people perceive what you're saying.
I have found in Asia in particular, a lot of times I would go out there and do a
briefing to my staff who were there, and they'd be nodding at me, and I was perceiving it as them understanding what I was saying and that they were going to comply with it, which I found was not exactly true. What I found they were doing when they were nodding was they
were acknowledging what I was saying. But in that culture, I found that they didn't feel
comfortable contradicting me or challenging some of the things I'm saying. So they acknowledged what I was saying, but it didn't mean that they were
actually going to carry through with what I was wanting them to do, which
was a lesson I had to learn.
Absolutely.
Yes.
Just to understand this cultural context and how things have meaning.
So as I mentioned, we can't take evidence in isolation.
Evidence has a cultural identity to it.
We can't take even simple gestures in isolation. We need to understand what it means and
it'll be different in different contexts. So one big accusation is that we have a WEIRD view of the world. So what does WEIRD mean? It stands for Western, Educated, Industrialized, Rich and Democratic, and that's where a lot of studies are conducted, and so our view of the world might be skewed by what WEIRD people will do, how WEIRD people act, and what does a nod or a head shake mean from a WEIRD person?
Yeah, absolutely.
I wanted to go through a couple of the areas of your book that I found fascinating.
You have this one section that's called Choosing Your Words and Data Carefully,
and you write this.
There's a different shortcut
if a statement is a direct quote.
You can simply search for it
without having to trudge through the whole report.
And you give this example,
thousands of articles claim
that the former General Electric CEO, Jack Welch,
declared that shareholder value
is the dumbest idea in the world.
You do a Google search and it quickly tells you that what he actually said is, "On the face of it, shareholder value is the dumbest idea in the world. Shareholder value is a result, not a strategy," which has a completely different meaning.
Yet, although that reference wasn't technically false, it's a lie
because they selectively pulled out that information and I guarantee you this happens all the time.
Absolutely. And so why did I come up with this example? Actually, this ties into the general
theme of the whole book. So the book is called May Contain Lies and you might think well the word
lie that sounds pretty inflammatory.
It's a pretty provocative title.
Did I choose this for clickbait?
No, so the word lie,
I want the reader to think about lie more broadly.
So normally the word lie is reserved
for an outright falsehood.
So this is why to call somebody a liar,
that is a big step.
But to me, lie is simply the opposite of truth.
Somebody can lie to you by saying something
which is completely true, but ignoring the broader context.
I think this is really important
because when you mentioned the post-truth world earlier,
John, often people think, well, the solution
to a post-truth world is to check the facts.
But even if the facts are 100% accurate they
could still be misleading without the context. So Jack Welch absolutely did say shareholder value is the dumbest idea in the world, and why did this go viral?
Because there's so many people who are concerned with capitalism that this was
catnip to them. Here was a big capitalist who ran GE who now is
turning his back on capitalism. But the full quote as you say, on the face of it
shareholder value is the dumbest idea in the world. He says shareholder value is a
result, not a strategy. So what he means by this is, when companies are taking
decisions every day they don't sit there in their office and think, well, how do I maximize shareholder value?
They might think, how do I grow? How do I maximize market share? How do I inspire my employees? How do I deliver great products to my customers?
And if they do that, they end up creating shareholder value. So shareholder value is a good result. And so it is fair to say this company has
created shareholder value. So we should laud this company and applaud this company. Even if shareholder
value was not the day to day driver of the decisions, it's something that we can look at after the fact
to gauge whether a company has done a good job. But all of these nuances, these are swept under
the carpet if we like the message: capitalism must be overturned, and one of the world's greatest capitalists is now turning his back on shareholder value.
No, I think it's a great example. And I wanted to highlight the other one
that you bring up right after it, because one of the guests I've always wanted to have on this show
is Matthew Walker. And anyone who is wanting to learn anything about how to sleep better looks
at Matthew as one of the top probably two or three experts in the world.
And it's incredible.
You bring up his bestselling 2017 book, Why We Sleep, and it presents this bar chart showing
how more sleep is associated with fewer injuries in teenagers. But what he ends up doing is he removed the bar showing that five hours
sleep leads to fewer injuries than six or seven because it doesn't fit his message.
And I just couldn't believe it because I read that book and I never even questioned that data at all.
And this is not your fault, because you're human, and very few humans would have questioned this data. Why? The book plays
into confirmation bias and black and white thinking. Why does the book Why We
Sleep play into confirmation bias? Because everybody wants an excuse to stay
in bed a little bit longer. You think the eager beavers who wake up at 5 a.m. in
the morning or who burn the midnight oil, they will get their comeuppance later, and this book suggests it. And also the book gave the black and white
impression that more sleep is always better, so six is better than five, seven is better
than six and so on and so forth. And because it's by somebody with academic credentials,
he puts on the front page, Dr. Matthew Walker, you believe him.
And so he presents this graph showing that the more you sleep, increasing your sleep from six to seven to eight to nine hours, the fewer injuries you get if you're an adolescent. But as you mentioned,
there is a bar which shows that if you sleep five hours you actually do even better than if you had six hours. So this contradicts his story. He chose to cut it
out, which is misleading. It's technically not a lie, because what he presented was absolutely true, but if you're a witness in a criminal trial you swear to tell not just the truth but the whole truth, and here he didn't tell the whole truth because he cut out a really important bar. And what is really surprising is that he didn't need to do that: even if he had kept that bar, most of the chart supported his idea that more sleep is better. Six, seven, eight, nine, that all led to fewer and fewer injuries. Yes, five went in the other direction, but mostly it was in his favor.
But because of black and white thinking, people like to see things as always
good and always bad, they often can't handle nuances.
He didn't want to present the full picture, which was a nuanced picture.
Man, fascinating to me.
And I now want to jump to chapter six of your book, which is titled
data is not evidence, and is about causation.
And the thing I want to talk about here is smoking cessation.
I'm a person who was in information technology, and in some of these Fortune
50 companies, I was in charge of all big data.
And to me, it is so interesting how, when you start interpreting data,
how you can make it look however you want it to be interpreted.
But you talk about this topic called reverse causation.
How does that complicate the interpretation of data?
And maybe let's use that smoking cessation one as a way to explain it.
This again highlights the importance of going beyond just the truth.
So even if something is 100% accurate,
it can still be misleading,
but this is in a quite different context
to what we discussed just a few moments ago.
Here what you have is large scale data,
which shows that if you are a smoker and you stop smoking,
your chance of death in the next couple of years
is actually higher rather than lower. So this seems crazy. How can it be that stopping smoking
leads to more death? And if you are for the pro-smoking lobby, you might say, well, this is
an argument for why we shouldn't tax or regulate smoking, actually stopping smoking does not improve the health outcomes. But this is the concern that correlation is not causation.
Is it that stopping smoking causes you to have a shorter life? Or is it that a shorter lifespan
causes you to stop smoking? So when is it that smokers finally quit the habit? It's when their doctors tell them
look you're in a really bad health state, your likelihood of lung cancer or any of these other
really serious illnesses is really high unless you stop, then that will finally get the person
to stop. But they would have probably died anyway and perhaps died even faster had they not stopped smoking. So what this suggests is it's the likelihood of imminent mortality causing
you to stop smoking rather than stopping smoking causing you to die soon. Now any
listener will know that correlation is not causation so you might think well
how do people fall for something like this? Well, again, when our biases are at play, we will accept the explanation being paraded,
no questions asked.
And this is true also in my field of sustainability.
So I would love to believe that sustainable companies always perform better and it's
sustainability that drives their success.
But sometimes it could be in the opposite direction. Sometimes if a company is already performing well
then it can start to invest in sustainability.
It costs a lot of money to pay your workers more, to cut your carbon emissions, and so that's why the highest-quality studies will try to disentangle correlation from causation. But you often have some flimsy studies that don't do that; they just rely on their one explanation and they hope that people will accept it. And so this is linked to the title of the chapter you mentioned, John, which was data is not evidence. So what does that mean? Data is just a
collection of facts. High sustainability companies perform better. Well what is evidence?
Let's go back to a criminal trial. Evidence is something that supports one
conclusion and doesn't support other conclusions. So evidence in a criminal
trial has to point to one suspect and not other suspects. Now just the data
showing that more sustainable companies perform better, that is not evidence
because it could be that sustainability drives performance or the alternative suspect could
be that performance drives sustainability.
But just like a prosecutor or a police officer who's honed in on their one preferred suspect
and blinds themselves to the possibility of alternatives, this can
also be the problem when we look at data when we have a preconceived viewpoint.
Yeah, Alex, I have to say I loved the titles that you used for each one of these chapters.
I found just as I was looking at the table of contents that made me want to read more.
And that is definitely the case with that last chapter and the next two.
So in chapter seven, you lay out a fact is not
data. And as I was reading this chapter, I got to this point on the narrative fallacy. And it was so
intriguing to me because I've read a lot of Simon Sinek's books and Malcolm Gladwell's books.
And I never thought of it this way that they're using a tried and tested theme
that's actually behind hundreds,
if not thousands of smash hits,
where they take a single big idea
to make themselves as memorable as possible,
and then they find as many examples as they can
to illustrate the idea.
However, they typically only draw ideas
that support their hypothesis.
So I was hoping you might be able to explain this narrative fallacy in a little bit more
detail and go into maybe more about Isaacson, Sinek and Gladwell.
Certainly.
Let's use Simon Sinek as an example.
So he's clearly been extremely successful, both with his book Start With Why and a few others,
and his TED Talk, How Great Leaders Inspire Action, which is, I believe, the third most viewed TED
talk of all time. So what he wants to do is to look at what drives success. And so he takes a
highly successful company, Apple, the first company to reach $1 trillion, and claims that
Apple was successful because it started with why. It had this idea of: everything we do, we believe in challenging the status quo. They were daring, they were innovative, they had a why. Why do we exist? It is exploration, to break boundaries, and this led to success. But as we know, correlation is not causation. Even if Apple did start with a why,
how do we know that caused its success?
There could be so many other things
that caused Apple's success.
Maybe it was Steve Jobs just was unusually talented.
Maybe he had a great network of contacts.
Maybe it was all of those other factors.
But the narrative fallacy is the idea that you
weave an explanation, a cause-effect explanation, when none exists and you dress this up into a nice
story. And so the idea that Steve Jobs was somebody who was inspired, perhaps because of his upbringing,
he grew up in this household with a craftsman father who taught him about the importance of
design, that inspired him about being different,
about breaking boundaries, and he set up Apple.
Why is that the preferred explanation
that Simon Sinek and Walter Isaacson wrote about?
Because that is empowering.
So if indeed what led to Apple's success
was Steve Jobs' unique network of contacts,
that is not a great message because if you don't have a Rolodex of contacts, that is not a great message.
Because if you don't have a Rolodex of contacts,
you can't be successful.
A book which tries to give you the secrets to success,
and if those secrets are unattainable,
you're not gonna buy the book
because you don't have that network of contacts,
you can't put it into action.
But if the secret to success is to start with why,
that is empowering.
Anybody can come up with a why if you just brainstorm hard enough or just have big enough
blue sky thinking sessions.
And so this gives you the message that anybody can succeed if you were just to read the book
and come up with your why.
And notice that Sinek doesn't just give the example of Apple.
He claims that Wikipedia beat Microsoft Encarta to become the world's fountain of knowledge because it started with why. And a why is what led the Wright brothers to beat Samuel Pierpont Langley to launch the first powered flight.
But he never considered any of these alternative suspects. He is the police
officer who believes that this person is guilty and hones the explanation only on that person
ignoring everything else. There could be tons of other reasons for the success of those organizations
but if you've narrowed your focus to one particular one and you parade that explanation,
that is what might sell books, if indeed readers are also willing to believe that a why led to success, because
that's empowering.
And for the listeners, I just wanted to say I wish we could go through more of this book
because in every single chapter, Alex brings fantastic details and stories like this to
life, which is why I wanted him to share a couple of the ones that I found most enduring.
I didn't want to have this discussion without talking about some of the
solutions and you go through thinking smarter as individuals, as companies,
and then as a society.
And I wanted to start out by thinking smarter as individuals. And something that you know very well, coming from the background that you have, is the peer review process. Why is this peer review process so important?
So peer review, this might seem like an arcane
scholarly ritual, which doesn't matter to the person on the street, but it does matter.
It has real implications. So it's just like any sort of kitemark. So how can we sleep safely at night? If our locks bear a
kitemark showing it's secure, then we know it's something that we can trust. How do we know that
something is certified organic? Again, there will be a particular kitemark. And that is the same for
research. So if a paper is published in a top peer reviewed scientific journal this shows that all the leading scientists have
reviewed the study, have looked at the methodology to make sure that it is
accurate. This contrasts with studies produced by many companies, and these are companies, many of which I respect, the likes of McKinsey or BlackRock. They will churn out studies and post
them on the website, but it hasn't been vetted by anybody and so it could be quite inaccurate
and misleading. As we explained earlier, a study did a complete 180 on its conclusions
after it went through peer review and corrected its mistakes. That was the study on pay gaps and company performance.
But it's important also to be realistic. So peer review is not perfect. So right now we
see this controversy with the work of Francesca Gino at Harvard Business School, who's had
some papers retracted, which were published, but now are seen to be potentially fraudulent.
But we need to recognize that perfect should not be the
enemy of good. Yes, it's not the case that you can attach complete gospel to one study because it's
been published in a peer-reviewed journal. This is why, as I mentioned earlier, we should not put too
much weight on one particular study, but it does increase our confidence. So it's certainly something
that as a reader we should be looking at, has the study been peer reviewed, and if it has we're going to put
more weight, not absolute weight, but more weight on it than if it has not.
Thank you for sharing that, and I want to go into the next chapter, creating
organizations that think smarter, and it's interesting a couple months ago I
was watching a documentary on John F. Kennedy, and I was so intrigued,
especially by his actions around Cuba and how we learn from them.
And if he hadn't, what the repercussions could have been.
For those who aren't familiar, when he inherited the presidency, pretty quickly afterwards he got involved in the Bay of Pigs, which went disastrously wrong, primarily because he listened to groupthink from his advisors.
And so when a U-2 plane flew over Cuba and found that there were actually nuclear missiles there, he didn't want to repeat that same mistake again. And so perhaps with that
as a lead-in, Alex, I'll let you explain the rest of the story and why this is so
important. Certainly. So this chapter is on the importance of cognitive diversity.
So diversity is a very common word nowadays, but often people equate
cognitive diversity with demographic diversity. So what John F. Kennedy did is, after missiles
were spotted in Cuba, he set up this executive committee of the National Security Council to
deal with the situation. Now because it was the 1960s, unfortunately it had only white members,
but despite a lack of demographic diversity, there was a big diversity of thinking.
Now, some of the members of the committee
were from the military.
And obviously their initial response was,
let's bomb the missile sites, let's invade Cuba.
They wanted to look at military action.
But the cognitive diversity was there were people
on the committee who saw the downsides
of using hard power as well as the advantages and one of them
was Kennedy himself. Unlike his predecessor Eisenhower, who was a really decorated general, he was not from the same military background, and he had other colleagues who also saw this as not necessarily being the first solution, more a last resort. And so rather than immediately going in and anchoring
on this view of bombing, of invading, instead he said, well, let's call a timeout, let's not anchor
on one particular solution, let's explore all the different alternatives that we have to respond.
They came up with six, they debated then whittled them down to just two. One was the invasion and the other was the blockade.
And then he divided the executive committee into two teams.
Each of them was tasked with focusing on and justifying one of the solutions and then
writing up a paper to explain the benefits of this.
And then they shared the paper with the other team, who would critique those papers, and they would respond to this afterwards to make the case even more strongly and to address
some of the concerns. And this is something which just doesn't often happen in corporate life, right?
It is that we latch onto a particular solution. Anybody who raises a critique is just not a true
believer. You're un-American or you're not bought into the mission of the company. Whereas here, this was something where people actually wanted to debate and they wanted dissent, because if you were to disagree and highlight flaws in a particular proposal, this was not because you did not agree with the objective; you just thought that there were ways of actually achieving it in a different manner.
Yeah, I think that's a great example. It makes me think of the way Jeff Bezos ran Amazon: if he had a critical new initiative he wanted done, he would have it examined by multiple small teams, none of whom knew the others were working on it. Each team would then come back to an executive group and present how they would attack the initiative, and the best approach was chosen from among them. I think that's part of why Amazon ended up with so many successful initiatives: they looked at the problem from different sides. I'm not sure of your thoughts on that, but that's the way I think about it.
Absolutely, and this is what it truly means to have diversity of thought. So yes, the focus on
demographic diversity has some advantages, but often people think that if I have an organization
where there's a mix of people, males and females and whites and non-whites, that is enough. It is
far from enough, and it reduces cognitive diversity to some very simple metrics, such as race and gender. Instead, what matters is not so much hiring a mix of people but making sure that they have the space to come up with different ideas. Often as a leader you think, I know best, that's why I'm the leader; here's my idea, you junior people go and execute it. But
instead what Jeff Bezos was suggesting and what some of the truly great leaders do is,
well, here might be the objective.
I don't know the best way to achieve the objective.
Let me give you time and space to come up
with different ways, starting from a blank sheet of paper, and then look at, out of all the different suggestions, which one we think is most effective. We're going to allow a thousand flowers to bloom first and then choose the one we're going to go with.
Yeah, I think it's a great
example of a type of leadership that I think more companies need to practice, which is eyes on,
but hands off. Meaning, in the example you just gave, Jeff Bezos is absolutely looking for a result, but he doesn't know how to get it. So he's hands off, giving people the autonomy to think about it and to come back with an uninfluenced way to go about it.
So, well, I want to end the interview on this.
The last solution is to create societies that think smarter.
And I want to ask you, Alex, looking forward,
what are your greatest hopes for how society
will handle misinformation?
I think it's to encourage people to be more discerning
and more critical.
Now, critical doesn't mean that you always have to be negative; it means that you're exercising your critical thinking faculties.
So just like if somebody was to sell us a used car,
we would think, well, are there incentives to misrepresent the reliability of this car? And it should be the same with data. If somebody's presenting data to us, is it because they're trying to sell something, or because it conforms with a particular viewpoint? So we might want to apply the same healthy skepticism to something we are chomping at the bit to believe as to something we're wanting to dismiss. And notice this is quite easy to teach, so there's
not a single equation in the book. As I mentioned, you don't need a degree in statistics to be
discerning about misinformation, but just to recognise the alternative suspects and alternative
explanations. And I give a couple of potential ways to teach this even in primary school. So just like we often have logic problems
in primary school, we can have statistical literacy logic problems. We have whodunit
murder mysteries where there's alternative suspects and you need to think outside the box
because often it's not the most obvious person who is actually the murderer or the culprit.
And similarly, when we see sets of data, it's not necessarily the most obvious interpretation that's correct; maybe it's reverse causality or other factors that are driving the correlation.
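To make that last point concrete, here is a minimal sketch in Python, not taken from Edmans's book; the scenario, variable names, and numbers are invented purely for illustration. It shows how a lurking third factor can produce a strong correlation between two outcomes that have no causal link, and how controlling for that factor makes the correlation largely disappear.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical confounder: hot weather drives both ice cream sales
# and drowning incidents; neither variable causes the other.
temperature = rng.normal(25, 5, n)
ice_cream_sales = 2.0 * temperature + rng.normal(0, 3, n)
drownings = 0.5 * temperature + rng.normal(0, 3, n)

# The raw correlation looks strongly positive (around 0.6 with these
# made-up numbers), inviting a bogus causal story.
print(np.corrcoef(ice_cream_sales, drownings)[0, 1])

def residuals(y, x):
    """Remove the linear effect of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# After stripping out the confounder, the correlation is close to zero.
print(np.corrcoef(residuals(ice_cream_sales, temperature),
                  residuals(drownings, temperature))[0, 1])
```

This is only a toy "alternative suspect" exercise of the kind described above: the data are simulated, so we know the confounder; with real data, identifying it is the hard part.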
Well, Alex, I so enjoyed having you on today.
If a listener wants to learn more about you, your books, the work that you're
doing, where's the best place for them to go?
Well, I've really enjoyed the conversation too, John.
So my website is alexedmans.com.
I'm also on X and LinkedIn under A. Edmans.
The latest book is called May Contain Lies.
And so all of these things are ways in which
to find out more about my work.
And just for the audience, if you're watching this,
here's a copy of the book; I love the cover design.
Thank you so much and congratulations on its release.
Thank you so much, John. What an incredible interview that was with Alex Edmans, and I wanted to thank Alex, Katy Milkman, and the University of California Press for the honor of him appearing on today's show. Links to all things Alex will be in the show notes at passionstruck.com.
Please use our website links if you purchase any of the books from the guests that we feature here on the show
Videos are on YouTube at both John R. Miles and Passion Struck Clips. Please
go subscribe and join over a quarter million other subscribers. Advertiser deals and discount
codes are in one convenient place at passionstruck.com slash deals. Please consider supporting those
who support the show. I'm at John R. Miles on Twitter, TikTok, Instagram, Facebook, and
you can also find me on LinkedIn. And if you want to expand your courage muscles, then consider joining our weekly Passion Struck
Challenge, which you can do by joining our ever-growing newsletter community of over
25,000 subscribers.
And you can sign up for the newsletter at passionstruck.com and it's titled Live Intentionally.
Do you want to find out where you stand on your journey to becoming Passion Struck?
Then consider taking the Passion Struck Quiz, which you can find on passionstruck.com.
It consists of 20 questions and it'll take you about 10 minutes. And it's based on the
principles of my new book, Passion Struck. You're about to hear a preview of the Passion Struck podcast interview that I did with Angela Foster, a leading voice in health optimization and biohacking. A former attorney turned health and performance coach,
Angela has transformed her life
and now helps others do it
through her podcast, High Performance Health.
In this episode, we dive deep into her insights
into achieving peak physical and mental performance,
exploring the latest in biohacking, nutrition,
as well as lifestyle strategies.
So when I look at health optimization
and we're trying to help someone optimize their health
for high performance and longevity,
I have a framework to make it easy to remember called SHIFT.
So how do I shift into optimal health?
And that stands for optimizing your sleep,
your hormones, gathering the insights,
which is a combination of lab and wearable data.
And then the F is how do I fuel my body?
And for that, there's an acronym FLO
because it's not just food.
It's not just the food we eat.
It's food like oxygen, so how we breathe,
and water, hydration.
And then the final piece of shift is the T, how do I train?
But I don't just mean physical activity.
How do I train my body and mind?
Remember that we rise by lifting others.
So share this show with those that you love and care about.
And if you found today's episode with Alex Edmans useful,
then definitely share it with those
who could use his inspiration.
In the meantime, do your best to apply what you hear on the show
so that you can live what you listen.
Until next time, go out there and become passion struck.