Good Life Project - Julia Galef | How to Stop Deluding Ourselves
Episode Date: August 23, 2021

We tend to think we're smart, rational beings, making good choices based on clear information. In truth, we're anything but. We are pretty much walking, talking bundles of delusion and bias, much of it utterly hidden from our consciousness, by no one other than our subconscious. How do we get past this, how do we learn to see more clearly, not just what's going on around us, but also within us? To help answer this question, today I'm sitting down with Julia Galef, author, podcaster, and speaker with a passion for good reasoning, and host of Rationally Speaking, a biweekly podcast featuring interviews with scientists and other thinkers, about everything from "Should the United States have open borders?" to "Has scientific progress slowed down?" to "What have you changed your mind about?" She's also the author of an eye-opening new book, The Scout Mindset, which is a deep dive into the learnable skill of looking at things honestly and objectively: why that's so valuable, why it doesn't come naturally to humans, and how we can get better at it.

You can find Julia at: Website | Twitter | Rationally Speaking podcast

If you LOVED this episode, you'll also love the conversations we had with Susan David about the role of emotions in how we think, feel and live.

My new book is available for pre-order: Order Sparked: Discover Your Unique Imprint for Work that Makes You Come Alive and get your book bonuses!

-------------

Have you discovered your Sparketype yet? Take the Sparketype Assessment™ now. IT'S FREE (https://sparketype.com/) and takes about 7 minutes to complete. At a minimum, it'll open your eyes in a big way. It also just might change your life.

If you enjoyed the show, please share it with a friend. Thank you to our super cool brand partners. If you like the show, please support them; they help make the podcast possible.

Hosted on Acast. See acast.com/privacy for more information.
Transcript
So we tend to think that we're smart, rational beings making great choices based on clear
information, but in truth, we're anything but.
We are pretty much walking, talking bundles of delusion and bias, much of it completely
hidden from our consciousness by no one other than our subconscious.
So how do we get past this? How do we learn to see more clearly, not just what's going on around us,
but also within us? Well, to help answer this question today, I am sitting down with Julia
Galef, author, podcaster, and speaker with a passion for good reasoning and the host of
Rationally Speaking, a bi-weekly podcast featuring interviews with scientists and other thinkers about everything from should the US have open borders, to has scientific progress slowed down, to what have you changed your mind about? She's also the author of an eye-opening new book, The Scout Mindset, which is a deep dive into the learnable skill of looking at things honestly
and objectively, why that's so valuable, why it doesn't come naturally to us humans, and how we
can get better at it. And before we dive in today, a quick note, millions of folks right now are
seriously reconsidering the way they work and wondering if there's a better way, something
that will make them feel more
alive, something that feels truly meaningful and energizing.
If that is you, I've got a new book called Sparked.
I've been working on it for a long time.
It has tons of insights that will help you see yourself more clearly and understand what
truly makes you come alive and what empties you out.
So you can really make better choices,
more informed choices, less delusional choices, and choices that are likely to get you closer
to that feeling that you want. And there are some super cool immediate bonuses when you pre-order
now. So check out the link in the show notes, grab your copy of Sparked from your favorite bookseller
today. Okay, on to today's conversation.
I'm Jonathan Fields, and this is Good Life Project.
Mayday, mayday, we've been compromised.
The pilot's a hitman.
I knew you were gonna be fun.
On January 24th.
Tell me how to fly this thing.
Mark Wahlberg.
You know what the difference between me and you is?
You're going to die.
Don't shoot him, we need him.
Y'all need a pilot?
Flight risk.
The Apple Watch Series 10 is here.
It has the biggest display ever.
It's also the thinnest Apple Watch ever,
making it even more comfortable on your wrist,
whether you're running, swimming, or sleeping. And it's the fastest-charging Apple Watch, getting you eight hours of charge in just 15 minutes. The Apple Watch Series 10, available for the first time in glossy
jet black aluminum. Compared to previous generations, iPhone XS or later required,
charge time and actual results will vary. So you step into podcasting, I guess, at the end of 2009, start producing in 2010; early 2010 is when you actually start airing stuff. At that point,
according to my nerdiest friends, podcasting is this weird little thing. Most people think that
it's going to be gone
really, really, really soon, but for some reason you were drawn to it.
It was actually a joint venture in, uh, yeah, late 2009 was when we started planning it. It was the
initial brainchild of my then co-host Massimo Pigliucci, who's a philosopher of science,
who's based at CUNY in New York.
And I was a freelance journalist at the time, just getting interested in and excited about
rationality and reasoning and philosophy of science. And Massimo and I had met at a conference
and we had a bunch of very interesting and stimulating kind of half jam sessions, half
sparring matches. And he came to me a few weeks later and was like,
I have a proposal for you. I think we should start. Well, originally he was thinking of a
radio show. And then, you know, when these podcasts came on the scene, he was like, oh,
podcast, that's what we should do. And so I can't claim credit for the original idea of founding
Rationally Speaking; that belongs to Massimo and our producer, Benny Pollack, out of whose apartment in Greenwich Village we recorded for the first few years of the podcast.
And he produced and edited the whole thing.
So I've been hosting it myself for the last few years.
But the first few years, it was a co-hosting venture with me and Massimo.
Yeah.
I mean, I feel like I've seen so much transition in this space and so much evolution.
And then in the last couple of years, explosion.
Last time I checked, I think there were something like 4 million shows out there.
But-
It's quicker to count my friends who don't have podcasts of their own than it is to count
the ones who do.
It's amazing.
And I've often wondered, what is it about this medium that is so powerful?
It's really interesting to me that you said originally you were kind of thinking about
radio and Massimo was thinking about radio. Massimo and Benny, yeah.
Yeah. That was actually, my early interest was actually radio, public radio.
Oh, yeah?
And we started the podcast in no small part as a way to potentially prove the concept had legs
and then turn around and see if we could actually market it to like one of the
three big distributors for public radio. And then in the middle of that whole thing, you know,
podcasting itself just took on its own life. And we're like, maybe this actually is the thing.
I like that kind of tail wagging the dog type thing.
Yeah, it's kind of cool. So you spend so much of your energy exploring questions that fascinate me as well. And the first ever guest that we actually had way back when we were filming was actually Dan Ariely.
Oh, wow. Way to start off with a bang. He's spent his career researching irrationality, like why we do the weird, strange things that we do. And he ties
it back to this profound and horribly traumatic incident when he was 18 or 19 years old, and he
was burned on the vast majority of his body and spent three years in hospital. And that his
observations in hospital really triggered this fierce curiosity about why we do the things we do.
I'm curious for you, when you sort of trace back your interest, was it sort
of more of a gradual evolution, or can you identify something that would be akin to an
inciting incident that says, Ooh, this is something. I don't think I have an inciting
incident. I, to be honest, it may not be the most interesting answer, but I just, I can't remember
a time when I wasn't interested in how to think about things.
And, you know, what is clear thinking and metacognition essentially, like thinking about what is it our brains are doing and trying to understand how they work and where they tend to
go wrong and how to compensate for that. I mean, I can, I can point to things about my childhood,
about my parents that are, you know, plausibly shaped how I grew up and
what I found interesting and important. Like they were both very analytical, intellectually curious
people. They're both retired economists. So they would do things that I often recommend to other
people who are, you know, becoming parents as like, oh, you should do this with your kids. It
was really great growing up like this. Like, for example, my dad did this kind of Socratic thing with me where I'd ask him a
question and he would sort of turn it back on me and ask, well, what do you think? And I'd propose
an idea and he'd say, well, you know, yeah, what about such and such? How would that work? And I'd
go, yeah, I guess that doesn't really work. Well, what about such and such? And we'd end up sort of
figuring out the phenomenon together. And they were both
also really good at one of kind of the key pillars of what I call scout mindset, which is basically
trying to be intellectually honest and objective and truth-seeking. And one of those pillars,
I would say, is being actually interested in the times when you might be wrong and actively
trying to look for those times and being happy
to acknowledge it when you discover that you were wrong. And my parents were good role models in
this sense. And they would, when we would get into an argument because I thought some rule was unfair,
for example, you know, when I was seven or eight years old and they disagreed with me every now and
then they would come back after the fact and say, you know what, Julia, we thought about it and we
discussed it and we decided, actually, you are right.
This was an unfair rule or, you know, we did go like we didn't follow through on our word or something.
So we'll change that rule.
And I remember feeling in those instances very appreciative that they would actually listen to me and consider my point seriously.
But also admiring that like this is a cool thing to do. And I want to do this
too. And I admire people who do this and I want to seek them out and hang out with them and read
them and follow them online. Well, I didn't think that when I was seven, but I think that was a
consequence of this early experience. Yeah. It's so powerful to actually have that experience as
a kid, because I think so many times, as a parent now, like, you know, on the one hand, you want to be open and, you know, like absolutely
own the fact that you don't know everything. And the vast majority of the time as a parent,
you don't know anything actually. And at the same time, like you sometimes struggle with,
but I'm supposed to be the authority figure. I'm supposed to be the provider of answers. I'm
supposed to be the one who creates a certain sense of safety and security. And I think sometimes there's a tension there when you're
trying to sort of play those dual roles of fostering intellectual curiosity. And at the
same time, creating the feeling that the world is a safe place when in fact,
maybe it's not, and maybe that's actually not the best thing to be doing.
Yeah. This is just me theorizing here, but I think that there's an important difference
from the perspective of the child or whoever the subordinate is. There's an important difference
between saying, don't question me, I'm right, and if you don't agree, then too bad, versus saying,
well, there might be better ways to do it, or this might be
unfair, but I'm sorry, you got to do it my way for now, just because we're crunched for time,
or mommy's really tired, or like, to be able to acknowledge, you know, I might not be right,
but unfortunately, we got to do it my way for now, just because that's how things work.
That to me feels very different from insisting that you're right
just to make the other person do what you say. There's more of a concern for the truth in the
latter and an acknowledgement of what the other person is perceiving, which is that, you know,
while you're being inconsistent or you're being unfair, I think it is important to be able to
acknowledge that when that is actually the case, even if you're going to follow it up with, sorry, we got to do this anyway. That's an important distinction, I think.
Yeah, no, that resonates a lot with me. You brought up this idea of the scout mindset
and the soldier mindset. I want to work our way into that. I think the bigger question is
there is so many of us, I think, move into adulthood
believing that we're rational beings, that we weigh the evidence, that we sort of like
gather whatever's in front of us.
And, you know, we kind of figure out like where does the quote truth lie, you know?
And let me make good decisions based on like all the data.
And yet we end up so often deceiving ourselves. You quote a line from
Richard Feynman, whose work I've loved for years about, I can't remember the exact language. I'm
sure you'll remember the exact language actually. Maybe the most important thing is not to fool yourself and you are the easiest person to fool? Right. So my curiosity is why we fool ourselves. Because I think we'd like to tell ourselves,
we may present a facade or we may present somebody like some vision or something to others,
but at least we're honest with ourselves. But in fact, that's just not true for so much of the time.
Yeah.
Well, so I'll start off by saying that there are a lot of things that we get wrong about
the world.
And it's not necessarily that we're fooling ourselves.
It's just that reality is very complicated and messy.
And we have imperfect information and limited time and processing power.
And we kind of have to do the best we can with our,
you know, limited data points and limited time and so on. And so a lot of the things that we're
wrong about is not always about fooling ourselves. It's just, it's hard to figure out what's going
on in the world. But then there's also this really large category of things that we're wrong about
where we could have gotten it right, but we kind of
on some level didn't want to get it right because the, you know, the false belief was more
comforting or more convenient in some way, or, you know, more validating in whatever way.
And so that's the category of errors that I've been especially interested in in the last few years.
These kind of, sometimes I call them unforced errors in the same sense as like if you're playing a tennis match and you lose points that you could have scored.
We could actually be getting those things right if we were more intellectually honest and truth-seeking.
And so, yes, the big question is, you know, why don't we?
Why are we fooling ourselves? So this category includes things like beliefs about ourselves and our strengths and weaknesses.
We'll often fool ourselves about, you know, of course I'm a better driver than average.
Like almost everyone thinks they're a better driver than average.
Or that thing that went wrong at work wasn't my fault.
It was someone else's fault.
Or like I couldn't have been expected to know that.
Or there's all of these kind of comforting stories that we sort of reinforce and we look for support for and we convince ourselves
of because to some degree it makes us feel better about ourselves and our lives. And also to some
extent, we feel like it can help us convince other people. If I'm starting a business, I might have a
motive to convince myself that things are going really well.
And, you know, there's a huge market for digital beanie babies or whatever my company is going to sell.
And that's partly a sort of external facing purpose of making it easier for
me to sort of talk earnestly and sincerely to potential investors or employees about,
you know, how great my company's future is going to be. There's a sense that if the more I believe
something, the better I am going to be able to convince other people of that thing. And so that
is also a large component of why we might be incentivized to believe things that maybe we don't really have good reason to believe. And there are problems with that. There's definitely a downside to that approach to deciding what to believe, which I can certainly talk about. But I think that is a large part of us drinking our own Kool-Aid.
Yeah.
That it's not just that we're actually trying to convince ourselves.
We may actually do it because it allows us to be more publicly behind something.
And we know that that something matters or will get us something that we really desperately
want.
I mean, you gave that example, you're pitching an investor for money.
And all the mythology around that is you've got to be delusionally
optimistic. You've got to believe your numbers and you have to be stunningly passionate about this.
That is a common belief that many people in Silicon Valley insisted to me was true. And I
think the truth is not quite that simple, but that is a commonly believed thing. Yeah.
Yeah. And in fact, when you go there, on the one hand, I think, I look at
scenarios like that. I'm like, okay, so that is the mythology. And as you've shared, a lot of
what's underlying that mythology is not true. Then I remember seeing, I wish I had the site
for it. I remember a couple of years back seeing a study that looked at a whole bunch of established
business people who were startup founders.
And they had succeeded.
They'd gone through the pivots.
They'd done everything.
And now they were in a good place with big, strong businesses.
And they asked a question, which was some variation of, if you had known how evil this
process would be before you had started, would you still have done it?
And the vast majority said no. So I wonder what
other purpose this delusion is serving for us and what things get birthed that never would be birthed
if we actually were more rational from the get-go. Yeah. So there were kind of two intended
purposes that I was sort of gesturing at there, like purposes of delusional over-optimism.
One was the sort of being able to convince other people that your prospects are really good and your company is going to become the next Google or whatever.
Then the other purpose, which you're pointing at now, is the kind of motivational, like, psyching yourself up to take on a huge, daunting project and stick with it, you know, when things are hard.
And that is a very important thing. It's a very important thing to be able to do,
both for your own fulfillment and just for the world. We need people to take on
big daunting projects. And a lot of people think that the only way to do that is to just
not know about or try not to think about or deny how hard it's going to be or your risk of failure.
And so, yes, there are a lot of people, many of whom I'm sure you read about or they were quoted, who think, well, if I knew that it was going to be hard or if I knew that there were going to be a lot of risks and bumpy places, then I just couldn't do it.
And I think a lot of people genuinely feel that way.
And maybe that's how their motivational systems currently work.
But that's not the only way to psych yourself up to do something hard. There were plenty of people who were very successful, like founders of, you know, multi-billion dollar companies, who said, no, no,
even from the beginning, I knew it was going to be hard. And probably, I thought probably I was
going to fail just because most companies fail. And you ask them or, you know, the interviewer
would ask them, well, then why did you do it? And they would say something like, well, you know,
I thought the expected value was really positive, which means, in other words, the upside if I did succeed was huge and really exciting.
And the downside if I failed was tolerable.
Like, if my company fails, okay, I can try again and start another company and I will have learned a lot.
And it will have been an exciting experience and very character building. And so, you know, they were able, because they had that kind of
sense of the, like the upsides outweighing the downsides in the sense that they could cope if
they failed, they were able to go into this whole, their whole endeavor with kind of clear eyes.
And they were able to be motivated while at the same time having kind of a realistic picture of
how hard and risky it actually was. And so I can't prove that everyone could manage
to do that, but clearly a lot of successful founders were able to do that. And I think
it's kind of a better solution if you can swing it, because then you get the motivation and the
ability to see realistically, which is really valuable for reasons that I could talk about.
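The trade-off described here, a huge upside weighed against a tolerable downside, is just an expected-value calculation. A minimal sketch in Python; every number below is made up for illustration, not taken from the conversation:

```python
# Expected value of a risky venture: the probability-weighted
# average of the outcomes. All numbers here are illustrative.

p_success = 0.2        # rough chance the venture succeeds
value_success = 50.0   # payoff if it works (arbitrary units)
value_failure = -5.0   # cost of failing: time, money, some bruises

expected_value = p_success * value_success + (1 - p_success) * value_failure

print(f"Expected value: {expected_value:+.1f}")  # prints: Expected value: +6.0
```

Even with an 80% chance of failure, the bet is positive on average, which is the clear-eyed version of motivation being pointed at here.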
But ideally, I think you should be striving to have both the emotional comfort and
motivation, et cetera, and also be able to see as accurately as you can. Yeah. And I don't disagree
with that. I wonder though, whether you would have a similar set of responses outside of the
world of tech startups, because to a certain extent, that is this weird domain where failure has been normalized. I feel like
in a way where in almost every other part of entrepreneurship, it hasn't been. It's sort of
like old school and brick and mortar in service-based businesses and more incremental
growth businesses or in businesses where you're primarily bootstrapping a business and not
raising VC. I feel like there's this weird bubble in the world of especially venture-backed
tech startups where failure is so normalized.
It's almost like, I mean, I know certain people that won't fund a startup team unless
they've been through a series of failures already because they want to know they've
been steeled against it because it's so expected in that domain.
Whereas if you looked outside of that, I wonder if you would still have like that same openness in a lot
of other sort of like approaches to business building or, you know, whether it's building a
business, whether it's starting a nonprofit or a foundation, whether it's, you know, like launching
a career in a field, which is perceived as being a really low probability of success, whether it's
in the arts or the performing arts. Like, I wonder if it would be consistent across those.
I think there's a number of factors going on. One is, I think, the one that you're pointing at,
that failure is somewhat more normalized in the tech world. A factor that goes against that,
or on the other side of the scale, is that the chance of success in tech is really quite low. Like if you're defining success
as becoming a billionaire or something like that, which I think that's kind of, that's the outcome
that gets a lot of people really excited to become a tech founder. So maybe it's not low compared to
becoming a Hollywood star, but it is low compared to most other endeavors that people start. And so
I think that's part of where the delusional optimism idea comes in,
is that because there really are only a few Mark Zuckerbergs or whatever, it feels like, well,
I have to fool myself into thinking that that's definitely going to be me if I just work hard
enough. Because if I was going to be realistic, then I would just be depressed. So that's a
factor on the other side. Another factor that sort of
might indicate that it's easier for tech founders to think this way is that it helps to be kind of
comfortable with probabilistic thinking, to be able to think in terms of like, well, you know,
I can accept that this, my success in this particular endeavor, this particular project
is not guaranteed. It's, you know, maybe it's like a 20% or 30% chance or something like that. And I can be okay
with that because the, you know, the probability of success times the value of success is high
enough compared to the probability times value of failure that the expected value is still really
positive and it's worth doing. Or a different way to think about it is if I were to do
10 of these projects over the course of my life, then even if each one only had a 20% chance of
success, overall, my chance of having one big hit over the course of my life is actually quite good,
even though it's still not guaranteed. And so it's kind of a portfolio thinking,
the way like an investor might think, well, I don't know for sure that this stock is going to
go up, but I've invested in a bunch of stocks and my chances of the overall portfolio going up over time is much better than the chance of any one particular stock making me rich.
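The portfolio framing has simple arithmetic behind it: the chance of at least one hit is one minus the chance that every attempt fails. A quick sketch, with illustrative numbers:

```python
# Portfolio thinking: independent attempts compound your chances.
# The per-project odds and number of attempts are illustrative.

p = 0.20   # chance any single project succeeds
n = 10     # projects attempted over a lifetime

# P(at least one success) = 1 - P(all n attempts fail)
p_at_least_one = 1 - (1 - p) ** n

print(f"Chance of at least one success: {p_at_least_one:.0%}")  # about 89%
```

So ten 20% shots give you roughly an 89% chance of at least one success, assuming the attempts are independent, which real projects only approximate.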
So that kind of thinking, I've learned from experience, doesn't come naturally to everyone.
For a lot of people, the idea of sort of thinking in probabilities is like weird and unnatural.
And they just want to know like, well, is it going to succeed or isn't it?
Like, do I believe this or do I not believe this?
Kind of like a binary thing.
And that kind of thinking is harder to, it's less compatible with this kind of sanguine approach to looking at risk realistically. Yeah. I guess what that brings up for me also is that expected return,
when you can identify what is the probability of success or failure and what are the stakes,
like what is the expected return if we fail or succeed? It's great in a formula and it's great
in a laboratory and it's great in a controlled circumstance. But going back to Daniel Ellsberg and the Ellsberg paradox,
in the real world, very often you just don't have that data to be able to calculate.
So even if you think that-
You never have the data that tells you your probability of success. It's always subjective.
Yeah.
Right. So it's sort of like, even if you're trying to be rational about it, if you're trying to
figure out, okay, so how close can I get to actually making a good decision based on these estimates? You know, so often garbage in, garbage out, the data that you put
into it is such a wild guess that it's hard to get close to the point where you're not deluding
yourself. It's just that you're deluding yourself by putting fake numbers into a formula that you
feel more comfortable saying that there's a rational basis to make a decision on.
Yeah.
So this is another really important point that I'm so glad we're touching on.
I think a lot of people think about probabilities in terms of, well, you would need data.
You'd need a ton of data to know what the right probability is.
You would need to roll the die 100 times to know if it's a fair die, or you'd need data on all of the companies out there and whether they succeeded or failed to calculate the probability of whatever.
In the real world, you almost never have all the data you would need to know what the right probability is.
It's not even really a coherent concept, the idea of there being a correct probability to put on your company's chance of success or
failure. Instead, what I mean and what many people mean when they talk about a probability
is not some objective right answer that's built into the universe. It's just your kind of honest
best guess. It's an expression of how confident you feel you can be in the truth or in what's
going to happen. So this definition of probability
is called a Bayesian approach to probability, or sometimes subjective Bayesianism. It's a way of thinking about probability not as an objective fact. It's just like the odds at which
you would bet on something essentially. And it's always going to be just sort of a rough estimate.
So for example, one person I read a lot about
in researching my book on the scout mindset was Jeff Bezos, who, you know, when he was starting
Amazon, he was in this category that I was referring to of people who thought, well,
the likeliest outcome is that this will fail. But even if it does, I will still be glad I tried.
And so I'm still excited to give this my best effort. And when he was initially deciding to leave his job, he had this
like cushy job on Wall Street in the 90s when he was deciding to leave and start the company that
would become Amazon. And he asked himself, like, what do I think is the probability that this
company will succeed? And his best guess was 30%. And, you know, here's how he came up with that.
He estimated what percent of companies like tech startups at that time succeed.
And he was like, it seems about 10%.
And then he said, well, you know, I think I am smarter.
And I think my idea is better than the average tech company.
But, you know, I still have to kind of adjust upward from the baseline.
And so again, thinking very roughly, he was like, maybe 30%. I'll go up from 10 to 30. And that's clearly not a scientific
process. How could you possibly like estimate your odds scientifically? You couldn't. But it's still,
I would argue, it's still better doing these rough best guesses, these kind of off-the-cuff estimates, than to refuse to put a probability on it, refuse to even think about odds at all, and just kind of do whatever you are moved to do in the moment. I think a lot of people feel like the fact that we can't pin down the odds with precision
is an excuse to do whatever they have the urge to do, whether that's jump headlong into the risk
without thinking about it, or whether it's avoid the risk entirely without thinking about it. And neither of those
are usually the best option. So I feel like, and I think the evidence bears out that doing your
best to estimate odds is better than not trying at all.
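The Bezos back-of-the-envelope reasoning recounted above is a base rate nudged by a judgment call. A sketch of that arithmetic; the adjustment factor is purely an illustration of the idea, not a recommended number:

```python
# Base-rate-plus-adjustment estimate: start from the success rate of
# comparable startups, then nudge it for case-specific factors.

base_rate = 0.10    # rough fraction of comparable startups that succeed
adjustment = 3.0    # "smarter team, better idea": a subjective judgment call

best_guess = min(base_rate * adjustment, 1.0)  # a probability can't exceed 1

print(f"Best-guess chance of success: {best_guess:.0%}")  # prints: 30%
```

The point is not precision; it's that an explicit rough guess can be examined and revised, where a vague feeling can't.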
You know, and then there are the things that we're aware of, you know, which I think is a lot of what we're talking about.
But then, and you talk about this and you write about it, there's also the whole universe of things that matter that we're not aware of.
The unknown unknowns, you mean?
Yeah.
And there's a level of bias within all of us. You know, there are things that we see and there are things that we don't see. And we all have these blind spots
wrapped around really critical things that if we were aware of them would change the way we would
make decisions and take actions. But I feel like so much of the way that we move through the world
is based on things that we're not aware of that are happening in our
brains, tendencies and preferences and biases that lead us in so many different directions.
Yeah.
And I've always been curious, you know, how much do these play into the way that we move through
the world, the decisions and outcomes that we experience? And also, how can we make ourselves
more aware of these things? You started out a conversation talking
about, you used the word meta.
Oh, metacognition.
Yeah.
Yeah. And I'm fascinated whether it's cognition, awareness, but to me, the word meta is about
zooming the lens out and trying to actually look down into yourself or into a situation and ask a
question like, how can I see more clearly? What's really happening
here? And I'm fascinated at how you might try and do that in the context of our own biases.
Yeah. This kind of self-awareness or awareness of your own thinking or your own biases is one of the
trickiest things and also one of the most important things to actually seeing things
more accurately over the long run. And I have thought a lot about, you know, how do we become
more self-aware of our own biases? And one category of technique that I think is really valuable
is a thought experiment. So a simple example of a thought experiment that people will probably
already be familiar with is suppose a politician from your party, like the political party you vote for or prefer, does something controversial.
They're getting, you know, attacked in the news or by the other side.
And you look at it and you think like, oh, come on, like, stop making it such a big deal out of this.
You know, this isn't really that important or worth making a fuss about. Okay. Thought experiment.
Imagine that a politician from the other party had done the exact same thing. What would your
reaction be then? And in many cases, the result of this thought experiment, if you're actually
doing it honestly, is, oh, I would be furious and I would be calling for his head and saying,
you know, this is not only a, you know, resign worthy offense, but also it's an indictment of the whole party and how corrupt and, you know, craven that whole party is.
And so the general category of thought experiments is just, you know, imagining, flipping things around, imagining that things were different.
Imagining, for example, that it wasn't you in the situation that you're reasoning about,
but instead someone else.
How would you react if someone else was in this tough situation?
Would you say, oh, you don't need to worry about it?
Or would you say, no, no, you need to take precautions now to deal with this situation?
There's lots of variations on this.
But simply imagining that things were different and then noticing how you would react in that
counterfactual can be really revealing, can be really kind of highlight the bias that
was working behind the scenes in your initial judgment.
Does that make sense?
Yeah, no, it does.
I think it's a cool exercise.
It's funny that one of the things that popped into my head in this context is what a lot
of people would consider to be the farthest thing from a scientific process of inquiry, but, but I've actually found really illuminating and helpful.
And it comes from somebody who,
who actually had on the podcast a couple of years back, Byron Katie,
who described an experience of enlightenment that was spontaneous and
profoundly changed the rest of her life.
But from that came this process that she calls the work, which is super simple, but it's basically when, and she applies it largely
in the context of analyzing your own thoughts. So if you're spinning about something, or if you're
assuming something is happening in a dynamic between you and someone else, and it's causing
a lot of suffering, she calls it the work and she'll say, first, what is the circumstance?
And then you'll ask the question, is it true?
And then you'll ask simple questions like, what is the evidence that shows that it's true?
And what is the evidence that shows that it's not true?
And this came out of sort of like a spiritual process of inquiry, but it's actually a really rational, super basic way of looking at any experience to
try and deconstruct what's really happening here.
Oh, yeah.
I have no objections to that, to asking yourself those questions.
I think those can be great questions.
Yeah.
You know, is this true?
It's weird that that question, is this true, can work because you would think maybe that
the answer would be, well, if you, you know, had the
thought, you thought it was true. And then you ask yourself, is this true? Well, the answer is
yes, of course, I think it's true. But sometimes, especially if you're, you know, if you have some
practice being self reflective, or introspective, just asking yourself, do I actually believe this
thought that came into my head? Sometimes the answer is, well, maybe not. And that can be an
interesting moment in itself.
I guess I don't really know what... Do you happen to remember what she meant by enlightenment,
like the spiritual experience that produced her focus on these questions?
I'm going to bastardize if I try and use her language, but it was what a lot of people talk
about in the context of major spiritual awakenings, which is essentially a separation between you
and a capital E ego state that you no longer associate with that particular identity
on an ego basis. And you don't attach to anything or anyone, any feeling, any thought, any emotion.
There's effectively simply presence. That's what's been described to me by a lot of
different people in the context of what they would call a spiritual awakening or spiritual
enlightenment. And what's so interesting for me is that the way she describes it and what came out
of it is a process. A lot of times I've heard this and there's a lot of what follows that is a deep well of dogma and dharma and teachings and paths. But the process
that came out of it for her was very simple and very rational and very fact-based. And I just
thought that was really interesting to me. Yeah. Part of why I was curious about that is the
category of spirituality or spiritual experiences or spiritual awakenings or things in
that category. I like if you'd asked me 10 years ago or 15 years ago, I would have written all of
that off as well. People are, you know, people are confused or they're fooling themselves or
something. And I now, I mean, I'm, I'm still a scientific materialist. I don't think
there is a supernatural or that that makes sense. I'm a very kind of, yeah, classic materialist view
of the world. But I have come to think that a lot of the times when people are using spiritual
language or mystical language to refer to something, when I actually dig into it, I think,
oh, this is completely like real and legitimate. Like I would have used different language to describe it. I might've talked about, you know, metacognition or, or,
or about introspection or about emotion or something. I might not have used spiritual
language, but I don't think there's anything illegitimate about what this person is talking
about. So this is one of the things that, for me, has been instructive: a lesson in not writing people's experiences off too quickly and maybe being more open to the possibility that people have different languages with which they describe the world.
That can be a common cause of apparent disagreement or like times when I think someone is just wrong.
Yeah.
Interestingly enough, the older I get, I'm sort of coming to the same place.
Oh, yeah?
Well, which place did you start from?
Did you start from the more scientific materialist place or the more mystic?
Yeah, no, I wouldn't have used that language.
But yeah, I was raised around New York City, just outside New York City.
Spent 30 years in the city.
Classic New York City skeptic.
It's like, oh, prove it.
Before I buy anything, I don't want to just know what the
outcome is. I want to know how you can show me that it's real and that it's true. And walk me
through the process. Show me the logic, show me the rationale behind it. And if I can't understand
that, I can't get there. And if there's a lot of metaphysical language wrapped around it,
if there are missing steps, I really struggle with it. And I think two things have happened.
One is I've started to say, okay, so I can actually fill in some of the gaps in the process,
regardless of what the language is or regardless of the way the phenomenon was shared with me.
It's funny, I look at Buddhism and positive psychology now, and I look at a lot
of the academic peer-reviewed published research that's come out of the world of social science
and positive psychology. And I'm like, oh, this is actually just scientific validation for things
that the Buddhists have been teaching and practicing for thousands of years now. We're
using different language and a different process to get at it. But fundamentally, Buddhism is steeped in the scientific process. It's just different language. So I'm kind of fascinated at how these two worlds come together also. The second thing, and maybe you're not there, but I am, is that there are also just phenomena that happen where I have
zero explanation for it. There's nothing rational or logical, but I see it repeated so many times
that I'm like, there's something happening here. And maybe someday I'll be able to deconstruct it
and maybe not, but I've seen it or experienced enough that for now, I'm just going to accept
that there's something real about it, even though I can't explain it.
It's a weird place to be.
We might disagree.
If we dug into that more, we might disagree about the right explanations or the most plausible explanations for the data.
But broad strokes, I don't disagree with the idea that there can be phenomena that we don't
yet have good explanations for.
Yeah, yeah.
We're on the same page there.
Let's talk more about these things called the
soldier and the scout mindset, because it's sort of like we've been talking broadly about these
different concepts, but you've identified sort of like these two states of mind or states of being
and really argue also that one is maybe a more constructive, productive, and fulfilling way
to live at the end of the day. Yeah. So just to flesh out this metaphor, I talk a lot about soldier mindset in contrast to scout
mindset. And these are my metaphors. Soldier mindset refers to the state of mind, often
unconscious. You don't necessarily know that you're in soldier mindset, but we all are at
various times. It's a state of mind where when you're reasoning or reacting to some article you're reading or some argument you're hearing, you're not trying to figure out if it's true sincerely. You're trying to defend your pre-existing beliefs or defend something that you want to believe against any evidence that might threaten it. So it's a very, the term in cognitive science
that's closest to what I mean by soldier mindset is directionally motivated reasoning,
where essentially there is a destination that your brain wants to get to. So it's not like,
you know, reasoning to try to follow the evidence to whatever the truth happens to be. It's like,
I know the answer that I want to get, and I'm going to find a way to get there. And we do that by, well, my favorite kind of concise description of how soldier mindset works comes from a psychologist named Tom Gilovich.
And he says that when we're evaluating something that we want to believe, we look at it through the lens of, can I believe this?
So we reach for any justification to accept that evidence or argument.
Whereas if we're looking at something we don't want to believe, we look at it through the lens of, must I believe this?
So we're looking for any reason to reject it or dismiss it.
So it's this kind of asymmetry in the standards of evidence that we use or who we consider trustworthy sources
or how long we spend looking for evidence. And by applying those standards asymmetrically,
we can kind of, you know, usually get the answer that we wanted to get all along.
So that's, I call that soldier mindset just because it so often feels like, when you kind
of look at how we're behaving, like we're soldiers on a battlefield defending these fortresses of
our beliefs against any evidence that might threaten them. So that's soldier mindset.
And there's been a lot written about what I call soldier mindset under various names,
motivated reasoning, confirmation bias has a lot of overlap with it. Irrationality often includes
a large component of soldier mindset. But I have become increasingly interested over the last 10 years in the alternative to soldier mindset, the sort of more truth-seeking thinking that's directed at actually figuring out what is true.
Regardless of what I want to be true or what I previously have always believed, I just want to know what's true.
Because this is also a mode, you know, even though humans, yes, are often irrational or engaged in motivated reasoning,
we don't always. Sometimes, you know, people can be remarkably truth-seeking and honest,
and it just takes my breath away. And so I felt like there needed to be more focus on
the times when we actually succeed at seeking out the truth and reasoning objectively,
in spite of our, you know, preexisting biases and incentives to deceive ourselves.
And that we needed to be asking, well, you know, when people get this right, how do they do it?
And how can we get it right more often?
And so I wanted more sort of focus on this alternative to soldier mindset.
And it didn't even have a name really. So I call it scout mindset because
a scout, unlike a soldier, the scout's motivation or their role is not to attack or defend.
It's to go out, see what's really out there as clearly as possible and make as accurate a map
of the situation or of an issue as possible, including all the things that are still unknown
and that could change if you get a different perspective on the situation.
That's the Scout's goal.
And so I wanted to give that a name so that we could start to recognize and appreciate
it more and learn how to shift from soldier towards scout.
I love that distinction.
When I first read that from you, I had this really interesting association with it, which
is in a very, very past life, I was a lawyer.
And I started my career at the SEC in New York.
And in New York, that's the enforcement division of the SEC.
And our charge was to investigate the financial markets, to look at insider trading and market
manipulation and to find out, did something bad happen here?
And if so, what are we going to do about it?
So our mandate was basically to investigate,
to find out the truth. And because what we used to investigate had such profound and wide-reaching
financial implications, even the existence of an investigation was under the cover of secrecy.
Nothing would ever go public until we made a determination that something wrong had happened. And then it would go up the chain. You know, you convince your local supervisor and then the regional director and then like the people in Washington and then the five commissioners, you have to convince everyone along the chain that something wrong had happened. And at that point, it would go public and you would get subpoena power to start bringing in people.
And now it was public.
That sounds so interesting.
It is. And you still didn't necessarily say something really bad has happened here,
but you had to convince everybody that there's enough potential evidence that something bad
went on here that we're going to go public and we're going to subpoena people and this is going to be
a thing. And what I found really interesting is, and I'm sure this
is not exclusive to that agency. I'm sure it happens in any big bureaucratic agencies where
you've got to keep convincing stakeholders along the way to get behind you. That once you get their
approval at a certain level, and especially in this context, once it's public, that it's really, really easy to switch from fact-finding to proving the case.
Yeah.
And without even realizing that's what's going on. And you're like, okay, so now I've told so
many people there's something here that now if there isn't, I don't want to have to go back
and say there isn't. So even like the big mandate is let's
just find out the truth. And if nothing happened here, boom, we move on. Like no harm, no foul.
If something happened, we'll do something about it. But there's a shift in psychology. You become
personally vested in proving something happened rather than just saying, let's figure out what
happened here. And I found it even in myself, you know, and I was like, wow.
And that's very, that's very self-aware of you too. I think a lot of people
don't ever quite have that realization that they...
I think it really bothered me.
They don't notice the pressures that are impacting their thinking.
Yeah. And I think because, you know, we, I think we all, when you have personal stakes on the line
for something like that, you know, very often it's just, it's so easy to tip over to that side of,
of like, no, let me sort
of like decide what I want the outcome to be.
And then I'm just going to see what I need to see to get there and sort of like ignore
what doesn't fit that argument as an outlier or something that isn't really worth
considering.
And I've wondered since then over the years, like how often I'm doing that, not just in
the context of my job, but in my relationships and my health and my life and all these different things. And I've caught myself
so many times and, you know, like really deluding myself. And it's fascinating when you start
to do that. I have so many questions for you. Well, so as you're describing the experience
being in, what was your, it was the title investigator? Yeah. So I was an enforcement attorney. I was a lawyer for the division of enforcement. Yeah.
So as you were describing what that was like, I was thinking to myself like, well,
how would I deal with that? How would I, you know, you know, try to get myself to be more
truth seeking despite the pressures to not be so. And so I have like several ideas about how I would,
you know, deal with that. But I'm curious if you found strategies to fight those pressures, or if you felt like you never really did, or how you dealt with it.
For me, it was noticing that, oh, there's an outcome happening here as we actually
legitimately find facts. That's not what I expected. And that maybe, you know, it's going
to require a little bit of backpedaling and, and you know, like I had to take a stand and say like,
this is not going to be fun or easy. And maybe we don't get the stats, you know, or the numbers,
but like right is right at the end of the day. And I think part of that for me is I just,
I'm constantly asking myself in all parts
of life, is it right action?
And I think that was one of the early moments for me where I kind of had to make a decision
about who I wanted to be.
And it's funny that this is literally almost 30 years ago now.
And I still revisit those few moments where I was just sitting with the discomfort and deciding who I wanted to be, effectively, at the end of the day.
When I talk to people who I really respect for embodying scout mindset in difficult situations, they often say something like, you know, well, like the way that I psyched myself up to actually, you know, try to
disprove my own theory that I had put my reputation on the line for or whatever. The way that I would
do that is to think about how, you know, even if it turned out that I was wrong and I had to like walk it back
in front of my whole company or the media or whatever, I would think about how proud I could
be of myself and, and that how much I would be embodying the kind of person that I want to be.
And that, like that pride, it doesn't take all the sting out of the experience, but it's a serious counterweight.
And it can be enough to get you to do the thing that you don't really want to do.
Yeah. And that was definitely a lot of my experience.
And it does require having this as a value that you actually care about and want to live up to.
It does, right?
Not everyone does, but-
Right. And I think it requires a sense that it's, you know, like it's part of an identity-level value.
Yeah, exactly.
You know, where it's not just a belief, but it's actually like, you know, it's a belief
on the level of, but this is who I am, like in the fiber of my being.
And who I want to be.
Yeah, exactly.
It's an aspirational identity.
Yeah, for sure.
And yeah, but it is interesting moments, you know. And I think it's super cool that you've actually had the ability to sit down and talk with so many people about what was going on in their head during those moments.
That's been really cool. One thing that a number of people mentioned,
I was kind of surprised at how repeated it was, was how people said,
you know, it's especially hard in situations where I can't be confident that the people around me, like my coworkers or my audience, are going to respect me for changing my mind or reporting an unpopular belief or whatever.
I can't be confident that they're going to respect me for it.
Maybe no one will.
But people would tell me.
I have in my mind this kind of ideal audience of people whose opinion I
would actually care about. Maybe they're people I know or people I have known in the past, or maybe
they're just hypothetical. I imagine if in the future, more reasonable and intellectually honest
people looked back at my situation, they would really admire me for doing this. And, you know, maybe I'll never get to, you know, know that for sure or experience that. But
I have in my mind this, you know, even just hypothetical group of people who I actually
want to respect me. And I want to do the thing that would make those people respect me, even if
those people aren't the people who are around me right now in real life. It was interesting how many people said some version of that same kind of motivational story.
Yeah, that really resonates with me too.
And when I think about who those people are in my life, it's my family.
And especially it's my daughter.
It's like if she knew the ins and outs of that particular moment or any number of moments,
would she be proud of her dad? You know, and that, that matters so, so much to me, you know,
and also I think as a parent, you know, I'm always trying to figure out how,
how do I model behavior? You know, how do I model making hard choices? How do I model doing
something simply because you feel it gets you closer to the truth and
there's something in you that feels like it's the right thing to do?
Yeah.
Even when it's hard and even when you're going to get judged for it and even when you
may pay a price for it.
How do you do that?
And if I can't do that, how can I expect anyone else around me who may look to me to
understand how they might explore being in the world,
How can I expect them to make different decisions?
Yeah.
But it's so fascinating to me that you've seen a similar exploration in different people.
Yeah, because you often need something to get past the very immediate reward punishment trade-off, which when it comes to truth-seeking
or scout mindset is often not in favor of the truth. Like if you're just concerned with like,
how am I going to feel in the moment? Then often the, you know, self-deception or other deception,
you know, wins on that score because you, you know, you get to feel good about yourself right
away or you like get to avoid blame or self-recrimination right away. And there are
consequences down the line in many cases because you're, you know, you don't develop this trustworthy
reputation or you don't, you know, start to notice the mistakes that you're making and be able to
fix them because you're so focused on feeling good in the moment by convincing yourself you're not to blame. So there are
consequences, but they're not immediate consequences. And unfortunately, human psychology,
including mine, is such that the immediate consequences feel so much more important than
the future consequences. And so I think of this kind of focus on your identity or who you would like to be and the satisfaction of knowing that you're living up to that.
That's kind of a way to get around this immediate cost-benefit calculus.
It's a way to feel good in the moment when you're doing something that won't pay off until the future, if at all.
I love that. I wonder also if there's sort of like, if there's a larger context that
influences us in how we both individually and societally define success, you know,
and is it winning at the thing that we set our minds to like accomplishing? Or is it just,
is it learning? Is it growing? Like, is it, you know,
like success by some predefined metric that like everybody agrees this is success? Or is it, have
I learned? Have I grown? Like, have I, have I evolved and allowed things to unfold? It's,
years ago, Eric Ries came out with a book on lean startups. And one of the things that he said is in the world of startups, you know,
where you're just constantly iterating fast, fast, fast prototype, get feedback, iterate, iterate.
You know, he said that the key to making this work is to let go of the end of the process,
having to be a successful company or a successful business model. You know, if you buy into the idea
that a startup is basically just a bunch of people in search of a viable business model. And like,
you make the central metric for measuring everything learning. Like, have I learned?
Have I grown? Like, do I know more now than I knew before, regardless of what it's telling me,
regardless of whether it's telling me it's possible or not. And exalting that as the primary metric for success allows you to make much better decisions that, at the end of the day, will likely either get you closer to success, though it may look totally different than you thought it would.
Or it can convince you really quickly it's not worth doing, which is actually a really good thing because then you can go figure out something else that might be worth doing.
Right.
So I wonder if there's a bigger context here, which is really on a larger scale what we tell ourselves success really looks like in the context of business and life.
Yeah.
That is also something that was mentioned by a number of the successful and very scout-like entrepreneurs that I interviewed for
my book when I asked them, how are you able to take a cold, hard, realistic look at your
probability of success and still feel motivated? They tended to talk in terms of the long term.
Obviously, they wanted their current company to succeed, of course. But in terms of what
made it feel to them
like this is worth doing, it was not necessarily the success of that one company. It was just
a sort of generalized, hard to pin down success in the long run. And as you say,
a particular startup failing is not that... You can get a lot of useful stuff out of that that
can help increase your probability of success in the long run.
And that kind of long horizon perspective is not so easy for a lot of people to take,
but it's so helpful.
To generalize this even more, just to truth-seeking in general, there's a real tension between
wanting to be right in a particular argument or like on a particular issue versus wanting to
be right as much as possible over the long run. And the more you can let go of your desire to
be right in that particular argument, the more able you are to notice like, oh, I was missing
such and such, or, oh, I like totally discounted that person's opinion, or, oh, I like made this
mistake when I was reasoning. And the more you're able to notice those things,
the more right you will be over time, but you have to let go of the desire to be right
in the particular situation that's right in front of you.
So I feel like this is just a generalization
of the principle that you're talking about
with respect to kind of career success or life success.
Yeah, I think that makes a lot of sense.
And what lands also is that, you know,
like part of this is just fundamentally,
how do we let go of the compulsion to be right or to be seen as the person who was right? I imagine
there's a big social context there. Right. There's a difference between being right and having been
right or looking right. Yeah. Because I think a lot of us pin advancing in our lives and our careers on having
been seen as the person who was right. So there's just a lot tied up in that. There's a lot of ego
tied up. There's a lot of assumptions about how we'll be able to live, what our lives will or
won't be like based on the perception, what we think will or won't happen if people
perceive us as being more consistently right or not. So this is another interesting thing that
came up when I was researching scout mindset and like why we aren't always in it. So it's true that
to some extent, like being right about things does make you look good. And if you're, you know,
the guy in your field who tends to be right, then that does, that gets you respect and, you know, probably promotions and prestige and money and things like that.
Social status, to some extent that is true.
And yet at the same time, I have found again and again that people tend to, they tend to overestimate how bad it would be socially if they admitted they
were wrong about something. And that another thing that came up again and again in my interviews
was people said, you know, when I first started, you know, as a manager, a leader in whatever
respect, when I first started saying, you know, oh, I was, you know, looks like I was wrong about
that projection or, you know, I think we need to pivot.
The strategy is not working out as well as I thought it would or something.
When I first, the first few times I did that, it felt like I was going to be ruined. This was just going to be the end of the world. And in the vast majority of times, people were appreciative and
it turned out fine. And the consequences weren't nearly as catastrophic as it kind of emotionally
felt like they were going to be.
And so over time, it got a lot easier to say that you were wrong about something.
But it's interesting that it's so universal that we feel like it's going to be really, really bad if we ever admit we were wrong. And I think that's just a bias.
I think it's just like a systematic overestimation that we have of the bad consequences in this particular way. And that this is one of, one of the set of reasons that we're not in scout mindset as often as we
should be for our own, you know, effectiveness and our own happiness: that we overestimate the
social downsides to saying that we were wrong about something. Which is not to say there are no downsides, but we overestimate them.
Yeah. I wonder also, as you're sharing that, whether we actually exacerbate the social downsides unwittingly, because
especially if we're in a leadership role and we hold ourselves to the standard of being
professionally right or as right as we possibly can all the time, anyone who we're leading
is going to model us and they're going to hold themselves to that same standard.
So if you literally build an organization or a team where you're all working
with that same frame, and then you decide at the top, okay, something happens,
I'm going to break from the norm and actually own this and be vulnerable. You've built an
organization and a culture up until that point which doesn't tolerate that. So now you've effectively got to topple
all of this, and you're causing a lot more pain. The long-term ramifications
are probably going to be really good, but you've built this level of
scaffolding around the culture of rightness, so that when and if you finally decide to
step out of it, it's not just you who's owning it. You effectively
have to dismantle all this stuff, which will probably cause a lot more near-term pain,
but hopefully in the name of longer-term integrity and openness.
Yeah. There are these unfortunate sticky equilibria that we get into, where
even if we would be better off at a totally different point, any incremental
move away from your current equilibrium makes you worse off. And those are really, really tricky.
Colloquially, they're sometimes called local maxima, and those traps are hard to get out of.
Yeah. We are complicated beings. Complicated, irrational, but...
I have no argument with that. But I do love the idea of the scout mindset, where it's really:
can we actually switch out of this mode of, let me identify an outcome and then defend it, and switch
into the mode of, let's just try and find the facts?
Like what is true here?
Like what is not necessarily like my truth, but what is like as close as we can come to
the truth?
Yeah.
What is my honest best guess about the truth here?
Like if I could strip away my own baggage or my own motivations or like what would be
flattering or convenient for me, what would I think was most
likely to be true? And obviously, you're not always going to be right about those guesses,
and you can't always know 100% certainty. But Scout Mindset is about trying to take your honest
best guess about what's true. Yeah. This may be an impossible
question, but I'm going to ask it anyway. If you could, if you could give
somebody a single question or a single prompt to offer themselves on a regular basis, to keep
trying to get as close to that place as they can whenever they're in consideration or decision
making mode, is there any one that really stands out to you as being super useful or really powerful and effective?
I think the most universally and frequently useful question like that is: if I was wrong about this,
how bad would that be, and what would I do about it? This can be applied in emotionally fraught life decisions. Suppose you're at a job and you're starting to worry, oh, maybe I made a mistake.
I shouldn't have started working here.
Oh, but it would be so hard to leave and find a new job.
And then I'd feel so bad about myself, because I would
have wasted all that time getting this job. Anyway, there are a lot of situations where we
have a strong temptation to dismiss a potential concern, or to convince ourselves that
the potential risks are not actually there, because we feel it would be so
unpleasant if it was true that we don't want to even consider it. And so in situations like that,
asking yourself: okay, suppose it was true. Suppose I made a mistake taking this job. How bad would that be?
And what would I do about it? Oftentimes, I would say the vast majority of the time,
when you actually think about, okay, how bad would it be? You relax a little bit because it's no
longer just this horrifying thing that you're not allowing yourself to think about. You're just thinking through like, okay, what would I do? All right. Well, it would probably
take me a couple of months to find a new job. Or suppose you're worried you might
have made a mistake and you don't want to admit it to your team, and so you're trying to deny that
you made the mistake. Asking yourself, how bad would that be, and what would I do about it?
Just coming up with a simple plan: all right, here's how I would tell my team
that it was a mistake.
Here's the wording I would use.
And how bad would it be?
Well, I don't know.
I have a lot of credit built up with them.
Probably wouldn't be that bad.
Probably a lot of them would respect me for just admitting it. It can be fairly relaxing, in a weird
sense, to just have a clear picture of, all right, here's what I would do if that
were true. And then even if it's not appealing, it at least feels tolerable, like something you
could tolerate. And that is so valuable for allowing you to think clearly and honestly about
it because it no longer feels like, well, I have to deny that because it would be the end of the
world if it were true. Now you feel like, okay, it wouldn't be
great, but I could handle it. And that gives you the affordance to think honestly about whether or
not it is true. And I think this applies more generally, not just to tough life
decisions, but to cases where you might be wrong in an argument. If I'm having an
argument with someone on Twitter, to free myself up to think honestly about whether they might be right, I have to ask myself, suppose they
were right. How bad would that be? And what would I do? Well, I guess that wouldn't be so bad,
even though it felt like it would be terrible a minute ago, because I've been wrong before and
it wasn't the end of the world. And I guess here's like how I would say it on Twitter if I turned out
to be wrong. Okay. Yeah. I know how to say that. Cool. Okay. So I know how I would handle it if I
was wrong. And it's just very freeing because now you feel like you're allowed to get either answer,
the answer where you're right or the answer where you're wrong, and either one would be okay.
That's kind of the place you want to be in when you're trying to think honestly about what's true.
Yeah. I love that. And I imagine the more you allow yourself to do that,
you start to habituate to it, where you realize, oh, I've done this before. I can do
it again. It's really not so bad. And you probably step out of that
level of perpetual defensiveness, because you realize you don't
have to be there to defend anything or to protect anything anymore. No matter how it comes out,
it's going to be okay. Yeah. It's never completely easy, but it gets easier.
Yeah. Right. There are like new grooves worn in my brain.
Right. Exactly. Exactly. Yeah. This feels like a good place for us to come full circle in our
conversation as well. And I always, I always come around and wrap up with the same closing question, which is in this context of Good Life Project,
if I offer up the phrase to live a good life, what comes up for you?
Oh, the first answer that came to mind was a really nerdy and unromantic answer,
which was: to live a good life is to maximize your utility, which is a very
abstract, decision-theoretical way to define a good life, and probably not what you're interested in.
But I guess I would say that to live a good life is to feel like you did the best you could
with the hand you were dealt, and to not feel like you're
emotionally railing against the things that were out of your control, but instead be able to
feel like, I made the most of what I had, of the opportunities and the
positive things available to me. That's kind of my goal, anyway: to be able to feel like I'm playing my hand as well as
I can. Love it. Thank you so much. Yeah, my pleasure. This was really fun. If you loved this episode, you'll also love the conversation we had with Susan David about the role of emotions in how we think and feel and live. You'll find a link to Susan's episode in the
show notes. And even if you don't listen now, be sure to just click and download it so it's
ready to play when you're on the go. And of course, if you haven't already done so,
go ahead and follow Good Life Project on your favorite listening app or platform.
And if you really appreciate the work we've been doing here at Good Life Project,
then go check out my new book, Sparked. It'll reveal some super eye-opening things about you and your very favorite subject, you. Then show you how to tap those
insights to reimagine and reinvent work as a source of meaning and purpose and joy. You'll
find a link in the show notes below, or you can just find it at your favorite bookseller now.
Thanks so much. See you next time.