Afford Anything - The Scout Mindset, with Julia Galef
Episode Date: June 1, 2021. #319: Julia Galef is an acclaimed expert on rational decision making. She's hosted the Rationally Speaking podcast for the last decade, and she's passionate about good reasoning. Her book, The Scout Mindset, highlights the importance of looking at situations objectively and honestly. This is something a lot of people struggle with -- humans are often irrational -- but Julia argues that this is a skill that we can develop with self-awareness. In this interview, she shares the difference between what she calls a soldier mindset versus a scout mindset. She explains why we often default to the soldier mindset of defending ideas we desperately want to believe, and details several thought exercises that we can use to instead train our brains to scout for the truth. Good decision making and ensuring you look for high-quality sources of information can help when weighing trade-offs, and it can also save you from making costly investment mistakes. Julia and I also discuss specific examples of when having a scout mindset can prevent you from risk of ruin. For more information, visit the show notes at https://affordanything.com/episode319
Transcript
You can afford anything.
You just can't afford everything.
Every choice that you make is a trade-off against something else.
So how do you make better choices?
How do you make better decisions?
That's what we're going to talk about in today's episode.
Now, why is that important?
Because the choices that you make impact your money, your time, your attention, your energy.
The choices that you make impact any limited resource in your life.
Saying yes to something implicitly means saying no to all other opportunities.
And so we want to make rational decisions, but oftentimes we get carried away and make emotional
decisions, and we don't necessarily know that we're doing it. I'm not saying emotions are wrong or bad.
Of course, emotions are information, but understanding how to think clearly through a problem,
understanding how to think in terms of expected value, how to recognize our cognitive biases,
how to forge mental models that can help us avoid expensive mistakes.
That's how we make clear decisions about how we spend those limited resources,
our money, our time, our energy, our attention.
And that allocation of resources, thinking through that allocation clearly,
that's what this podcast is here to explore and to facilitate.
My name is Paula Pant.
I am the host of the Afford Anything podcast.
And today, Julia Galef joins me to discuss rational decision-making.
Julia is the host of the Rationally Speaking podcast
and an acclaimed expert on reasoning.
Her book, The Scout Mindset,
highlights the importance of looking at situations
objectively and honestly.
Now, of course, we've had many behavioral economists on this show.
We've had many financial psychologists on this show.
And so we know that humans are irrational.
That's our nature.
And so thinking rationally is not about denying
our nature, but rather developing greater self-awareness. In this upcoming interview, Julia describes
the difference between what she refers to as a soldier mindset versus a scout mindset. Now,
a soldier mindset, in her words, is where you are defending your ideas. You're protecting,
you're defending, you're supporting, bolstering, buttressing your ideas when those ideas
become challenged, destroyed, undermined, knocked down, shot down. And so this idea of defending
the things that you want to believe, that is what she refers to as a soldier mindset.
And in this upcoming interview, she contrasts that with a scout mindset where you've got
a map and you're mapping, you're scouting, you're learning, you're approaching an idea
from a place of curiosity. You're exploring. Now, in our upcoming interview, she details
several thought exercises that we can use to train our brains to scout for new knowledge and
new information. And this is particularly relevant when you're an investor, because as investors,
we get invested in what we're investing in. We can become attached to the assets that we're
already holding. And we seek out sources that confirm our biases, sources that tell us that our
favorite stock, our favorite asset class, whether it's Tesla or Bitcoin or real estate or gold, is a winner.
Whatever it is, we tend to search for information that confirms or reinforces our preexisting beliefs.
And if we want to improve our skill set as investors, a scout mindset can be incredibly helpful.
And so that's what we're going to talk about in this upcoming interview.
Here is Julia Galef on The Scout Mindset.
Hey, Julia.
Hi, Paula.
How are you?
I'm doing great. How are you? It's good to be here. Oh, thank you. I'm fantastic. Thank you for coming on the show.
My pleasure. Julia, you wrote about the scout mindset. What is scout mindset? And how does it differ from what you refer to as a soldier mindset?
To start with soldier mindset, that's my metaphor for this incredibly common, very universal mode of thinking that we're often in, in which we're unconsciously trying to defend our preexisting beliefs,
or defend things that we want to believe, against any evidence that might threaten them.
People might have heard of this under different names like rationalizing or wishful thinking
or motivated reasoning is the closest concept from the cognitive science literature.
And I call it soldier mindset just because of the language that we use to talk about reasoning
is so militaristic. This is something I didn't even notice until recently, but the more you pay
attention, the more you notice, wow, all of our words to describe reasoning or evaluating
are like we're talking about a battle. You know, we reach for supporting evidence or evidence to fortify
our beliefs as if we're building up a fortress to make it stronger against invaders. We shoot down
opposing arguments. We like poke holes in other people's logic. So I call it soldier mindset.
We're defending a belief, right? Right. Exactly. It's, yeah, we're trying to build up our beliefs so that
they can't be attacked or invaded by the opposing forces of logic and reason. And so scout mindset is
my metaphor for the alternative to that, because the scout's role is not to attack or defend. It's to
just go out and see what's really there and form as accurate a map as possible of a situation or a topic.
And that means recognizing that your map of what's going on, like your beliefs about the world or
yourself or other people, they're never going to be perfectly accurate. They're always going to be
incomplete. And the goal is not to defend some preexisting view, but to revise your views over time,
revise your map as you learn more information about what's really true. So scout mindset is basically
the motivation to see what's really there and not what you wish was there to be intellectually
honest and curious about what's really true and, you know, objective and fair minded and things like that.
So my book is about how and why to be more of a scout and less of a soldier in the way you think.
Right. I'm thinking about in the world of finance, if you are predisposed to wanting to believe that a certain investment is going to take off.
But the reality is you're just biased in favor of that investment, that specific asset class.
You might just surround yourself with confirmation bias that says, hey, this asset class is great because you like it, even though there's evidence.
to the contrary. Right, exactly. And the way that we usually go about defending our beliefs is this
kind of subtle, unconscious process in which we apply a different standard to evidence, depending on
whether it tells us what we want to hear or not. So my favorite definition of motivated reasoning,
i.e. soldier mindset, comes from a cognitive scientist named Tom Gilovich. The way he put it is
when we're evaluating a claim or some evidence that we want to believe, we're looking at it
through the lens of can I believe this? And so we'll reach for any reason we can find to accept it.
Whereas if we're looking at evidence we don't want to accept, we instead evaluate it through the
lens of must I believe this? And if we can find any reason to reject it, you know, any potential flaw
or exception or, you know, we object to something about the source, then we reject it. And so, you know,
it feels like we're being kind of skeptical and critical, but we're actually, the problem is we're
applying this very asymmetric standard of what we consider trustworthy or plausible or reasonable,
depending on our motivations. So, yeah, there are a lot of examples like the one that you brought up.
In fact, there's this book by a mathematician named John Allen Paulos. I believe it's called
A Mathematician Plays the Stock Market, but I could be wrong about the title. And he describes losing
a lot of money on, I think it was, Enron stock,
because, you know, in retrospect, he now realizes he was applying this very, very different standard of evidence, or of rigor to evidence, in favor of Enron versus against Enron.
So, you know, whenever there was a news article kind of making it sound like Enron was maybe a little sketchy or not a great investment, he would get skeptical and he would keep searching until he found some article that told him something different and then he'd go great, okay.
Because, sorry, he had invested a lot of money in Enron.
He bought a lot of Enron stock. And so even as things started to look kind of dicey, he still just
found himself working really hard to find reasons to discount the negative news reports because
he was really hopeful that his investment would pay off. And he lost, I think he was unwilling
to disclose exactly how much money, but it was quite a lot of money on that stock. And yeah,
he's a very smart person. Clearly, he's an award-winning mathematician. Soldier mindset is not
something that only stupid people do. It's just a universal feature of the human brain.
really all we can do is just get better at noticing it, you know, get more self-aware and develop
the mental habits that help us correct for it. Right. And you recently had the creator of Ethereum
on your podcast and you asked the question. Oh, Vitalik. Yeah. Yeah. He's so great. And one of the questions
that you asked that I thought was fascinating was in retrospect, not to engage in like the resulting
fallacy, but in retrospect, why did so many people in the thinker community miss this for the last
10 years? Is this something that we should have seen based on the evidence that was presented to us
10 years ago? Or given that there was a large range of possible outcomes, was it reasonable
even at that time to discount it up until now, up until further proof was warranted?
Yeah. I mean, I'm still very interested in this question. And I don't feel like I have a definitive answer to it, which is one reason I was so curious to get
Vitalik's thoughts, because, you know, it's very easy to look back and beat yourself up for having
not invested in crypto, you know, back in 2011 or 2012 or something like that, if you didn't.
But as you put it, it's easy also to fall prey to the resulting fallacy, where
you judge your decisions based on the outcomes, and not based on the information you had
and what was reasonable to predict at the time, beforehand.
And so like many questions, I think this is a little ambiguous.
And you kind of have to do your best to think back at the time and think, okay, was it reasonable
and rational for me to ignore crypto at that time?
And I don't think the answer is really clear one way or the other.
But at the very least, what Vitalik suggested is there is actually a heuristic you could use
going forward. Now, whether or not that heuristic should have been clearly true to you back in
2012 is less clear. But going forward, there's a heuristic that seems to him to yield positive
expected results, which is: there's a small set of things that kind of smart people
in the tech world get excited about as potentially promising. And not all of those end up panning out.
Like, I don't know, virtual reality hasn't really panned out yet. And that's something people
are excited about. But it's a small enough set that if you invest a little bit of money or,
you know, at least a little bit of time and attention into all of them, enough of them end up
paying out like crypto that your overall portfolio of time and attention or money is going to be
very well spent. And so his heuristic, his suggested heuristic was just take seriously the small
set of ideas that smart people in the tech world get excited about and understand that most of them
may not pay out, but some of them will, and the payout will be large enough that
overall it's worth it. And that goes to a concept you sort of alluded to called expected value,
which is... yeah. Would you describe that? Sure. So I bring this up in the context of a
misconception that I hear a lot about scout mindset, which is that
people believe you can't really afford to be honest with yourself or be objective about
your probability of success in some endeavor, like starting a business.
or investing in some risky new company or currency.
You can't really afford to be realistic with yourself if it's low probability because then
you'll just get discouraged and demotivated.
And instead, you should just try to believe with all your might that my new company
will definitely be a success or this bet will definitely pay off.
And that's what you really need to do to give yourself the motivation to work hard and
maybe succeed in the end.
So that's the common view.
But actually, there are a bunch of examples of very successful entrepreneurs and investors
who recognized realistically that their odds of success were pretty low.
Like Elon Musk, for example, when he was starting Tesla and also starting SpaceX,
both times he estimated, I think my probability of success here is about 10%.
And then Jeff Bezos, when he was starting the company that would end up becoming Amazon,
his estimate of his odds of success was about 30%,
which he got that number by essentially looking at the base rate of success of tech companies in the 90s,
which seemed to him to be about 10%.
And he was like, well, I think I'm smarter than average.
And I think my idea is really good.
But I still have to adjust upward from the base rate.
So maybe I'm, maybe I've got about a 30% chance of success, not 10%.
But still, those numbers are much lower than the way that most entrepreneurs talk about their chances of success.
And yet clearly, you know, Elon Musk and Jeff Bezos are both
extremely motivated, driven people who have both been quite successful. So these are examples of how
you don't actually have to convince yourself that you have a 100% chance of success in order to be
motivated and driven and so on. So my explanation for how people like Musk and Bezos were able to
motivate themselves despite recognizing that they were doing very risky things is that they think
in terms of expected value. What that means is that essentially the expected value of
Some endeavor is the probability of success times the value of success, plus the probability of failure
times the value of failure.
So what that ends up meaning is that some endeavor can be very worth doing, even if it's more
likely to fail than to succeed, as long as the value of success is high enough and the
kind of value of failure or the badness of failure isn't all that bad.
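To make that arithmetic concrete, here is a minimal sketch in Python. The probabilities and dollar values are purely hypothetical, chosen only to show how a likely-to-fail bet can still have positive expected value:

```python
def expected_value(p_success, value_success, value_failure):
    """Expected value = P(success) * value(success) + P(failure) * value(failure)."""
    return p_success * value_success + (1 - p_success) * value_failure

# Hypothetical venture: 10% chance of a large payoff, 90% chance of a
# tolerable loss. The bet can be worth taking even though failure is likely.
ev = expected_value(p_success=0.10, value_success=10_000_000, value_failure=-200_000)
print(ev)  # 820000.0 -> positive expected value despite a 90% chance of failure
```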
And so if you look at the way that Bezos and Musk both talked about their decision to start these risky companies a priori, they explicitly thought about, okay, how good is success and how bad is failure.
And Elon Musk, for example, was like, well, if I succeed at Tesla, that's amazing.
I will have helped humanity move away from fossil fuels and towards a more sustainable way to run our societies.
And that's fantastic.
And how bad is failure?
Well, you know, I won't be personally ruined if Tesla doesn't succeed. And also, I will probably make a little bit of progress towards that goal, even if the company
doesn't take off. So I can at least convince people that electric cars don't have to be these dorky,
you know, golf carts, which was kind of the way people thought about electric cars before Tesla.
So he was like, at least I can help change society's perception of electric cars. And that will be
valuable even if I fail. And so he kind of looked at those two possible scenarios and decided, well,
even though failure is more likely than success, this is still overall a worthwhile
bet to take because the expected value is positive. I think that's kind of the missing piece
that a lot of people don't yet have that allows you to look at your odds realistically and still
be motivated and driven and ultimately often successful is you have to think in terms of expected
value instead of being motivated just by the certainty of success. So in other words, some people
argue that you should not adopt a scout mindset. You should not face the brutal truth because it is
advantageous to be self-delusional. Exactly. But the reality is, if you view things from the
perspective of expected value: what is the range of possible outcomes, what is the magnitude of those
outcomes, and can I survive that risk of ruin? If you view things through that framework, then you can
maintain that same motivation, but without the self-delusion. That's beautifully and concisely put. I
appreciate that; much more concise than me. An important thing to note here is just why it's
so valuable to be able to look at your odds realistically. Because I think a lot of
people don't really recognize this. They're just like, well, it's much better to be confident in your
success than unconfident. So let's just try to be as confident as possible. But having a realistic
picture of the odds of whatever you're doing is what helps you make good decisions as your
project develops: whether to pivot, how much money to invest, whether you need to
improve your business plan before you launch. Those are all tricky decisions that you have to
make. And it's hard to make those decisions well if you don't allow yourself to think as realistically
and honestly as possible about the odds. And so I think it's really a huge boon to be able to think
realistically and make those decisions as well as you can while still retaining your motivation and
your drive. And so that's what I think this kind of scout mindset allows you to do. And I think
Musk and Bezos are good examples of that. In your book,
you talk about the concept of motivated reasoning and some of the
functions that people argue it serves. You break it down into six categories: comfort, self-esteem,
morale, persuasion, image, and belonging. Can you briefly touch on each of those categories
and, to sort of continue this conversation of giving up self-delusion, why it's okay
to give up those categories? I mean, comfort, self-esteem, morale:
who wouldn't want that?
Absolutely.
I really wanted to do justice to the reasons why we so often default to soldier mindset.
We use it for all of these different purposes to give us comfort, i.e., to make us feel better
about things that are stressing us out or making us feel insecure.
We use it to defend our egos.
So we might unconsciously reach for defenses of beliefs like, well, I may not be rich,
but that's because rich people are all unethical.
And so the fact that I'm not rich is actually a good thing about me or rich people are all unhappy.
And so I don't want to be rich anyway.
There are narratives that we over the course of our lives build up and try to affirm to
ourselves that make us feel better about ourselves and our lives.
So that's ego.
And then morale is what I was just talking about with respect to the motivation to do hard
and risky things.
We'll often unconsciously try to convince ourselves that success is guaranteed or that we can't
possibly fail because we're using that to try to motivate ourselves to take risky bets.
So those three purposes of soldier mindset I put in the category of feeling good.
And then the next three are more about looking good to other people.
So one of those three is image.
And so sometimes we unconsciously try to defend beliefs that we think will make us look good
to other people.
Like I might be motivated to believe that deep learning
is going to change the world because the smart people around me believe that, and I want to look smart, too.
Virtue signaling.
Yeah, so that's like one big category of this image purpose of soldier mindset.
But, you know, people also might want to believe things that make them seem, I don't know, edgy or nonconformist.
Like some people are really motivated to reject popular beliefs because they want to seem, you know,
they don't want to seem like just one of the crowd.
And so these motives are kind of working in the background, even if we're not really
aware of it and they affect which beliefs we're kind of trying to accept versus trying to reject.
And then another type of looking good motive is belonging, where you're kind of motivated to
believe things that the people in your tribe or your peer group believe because you want to
fit in, essentially. And then the last one that I haven't talked about is persuasion, where
we're motivated to believe things that we want to convince other people of. So, you know,
if you're a startup founder, you have kind of a strong incentive to believe that, you know,
your company is going to take off and be really successful, not just for the morale purpose,
but also because you want to make that case to investors and you want to sound really earnest and
convincing when you tell them about your company's prospects. And so people kind of feel intuitively
like, the more I can get myself to believe this, the easier it will be to convince other people.
And so these six purposes, morale, comfort, ego, image, persuasion, and belonging,
these are absolutely important parts of being happy and just being human.
And I'm absolutely not arguing that we shouldn't care about them or we should give them up.
Instead, I'm arguing that actually soldier mindset, even though we tend to reach for soldier
mindset in order to get these things, soldier mindset is often, or I would say usually
not the best way to get them.
That instead, you can have comfort and self-esteem and persuasiveness and so on while still
viewing things as realistically as possible.
and that ultimately that's a better approach.
So a lot of the book is just about how do you get all these things that you want without deceiving yourself?
Yeah, don't use faulty logic to justify your emotions, essentially.
Yeah, or like you might think that you need to use faulty logic to justify your emotions,
but you don't actually.
You can feel good about yourself and justify your emotions without falsehood.
On a previous podcast episode, I shared a quote that I recently heard. Someone said,
an analyst is only as good as their biases.
That is a good quote.
It is, yeah, I loved it. As soon as I heard it, I was like, I'm permanently filing that one in
my brain. So how do we begin to notice our biases? Are there any tests or thought experiments that we can
use in order to become more self-aware? Yeah. So I think thought experiments are really valuable,
and I use them a lot myself, because as I was describing, when you're in soldier mindset,
it's really often invisible to you because maybe you're being more skeptical of some claim because
it's telling you something you don't want to hear. But it's hard to notice that because you don't have
anything to compare it to. In order to notice that you're being asymmetric, you would have to
compare that claim to the same claim, but coming from the other side, or telling you something
you want to hear instead. So a thought experiment is what allows you to notice that you're applying
a different standard to different claims unjustifiably. For example, one type of thought experiment
I do a lot is what I call the outsider test. Suppose I'm running a team at a company and I'm
kind of dealing with an employee who's been a problem for a while, and I'm debating with myself
whether I need to fire them or not. And I really don't want to fire them because that sounds really
unpleasant, and I just, I hate to think of doing that. So I'm coming up with all these reasons why,
no, I should really just wait a few more months or, oh, he's really not that bad. In a situation
like that, I would step back and do an outsider test and imagine someone else in that exact same
situation as I'm in, who has that same employee with those problems. And I would ask myself,
what do I think that person should do in my situation? And very often, I think the outcome of this
test is, oh, in that situation, I think they should fire the person. There's really not
much hope at this point that they're going to improve, and they're costing the company a lot of time
and a lot of strife. And it's just very striking oftentimes how different my judgment is when all
I change about the scenario is whether it's me who's in the scenario.
Right.
That's one type of thought experiment.
You can also try varying the source of a claim.
And so if a friend of yours gives you advice, would you judge that differently than if, say,
your colleague who you don't like gives you advice?
Because very often we discount advice just based on not liking the source.
And if you just imagine, well, what if my friend said the same thing, you notice,
oh, in that case, I would see it's actually quite reasonable advice.
So I just like to play with different features of the scenario that might be affecting my
motivations and notice if that changes my judgment.
Right, right.
You share the example in your book of Intel, how the leaders, the chief executives of Intel
back in the 90s, survived because they were able to apply that outsider test to their company
and ask themselves at a time when the market was pivoting, they were able to ask themselves,
if the board fired us and replaced us, what would the new people do?
Right, exactly.
Yeah, this was a story that Andy Grove tells in his memoir, where he and his co-founder, Gordon Moore, were really struggling
because the company at this point, in the mid-80s, I guess, had been overtaken by Japan in the memory chip
market. They weren't sure what to do, and the company's sales just kept sliding.
And they were just really resistant to the idea of giving up the memory chip market, because
that was kind of Intel's identity up until that point. They were sitting in Andy Grove's office,
And Andy described looking out the window at, I guess, a ferris wheel in the distance.
And that gave him the idea to ask, hey, Gordon, suppose that the company fired us and brought in some new CEO or directors.
And suppose the new CEO walked into this office, what do you think he would do about this situation?
And Gordon said, oh, he would get us out of the memory chip business.
And Andy said, well, okay, then why couldn't you and I just walk out of this office and then walk back in and do the same
thing? And they kind of realized that that was the outsider test that they just performed. And
it made them realize that they were, they didn't actually have a good reason to continue in the
memory chip business. They were just kind of doing it because it was, that's what they were used
to. But it didn't actually make sense going forward. And doing the outsider test allowed them to
realize that. So that's why they switched into the microprocessor market and the company recovered and
has been successful ever since. That kind of test can really apply to anything, whether it's a
decision that you have to make that you're kind of resistant to or you could do the outsider test
about how reasonable your behavior was. Sometimes if I think that I've, I don't know,
asked a stupid question in a meeting or something and I'm feeling self-conscious about it.
Sometimes I imagine, well, suppose someone else asked that question, how would I judge them?
How big of a deal would it be? And I noticed, I actually wouldn't, it wouldn't make much of an
impression on me. I wouldn't really think it was that stupid, or even
if it was stupid, I wouldn't think it mattered very much. And so doing that kind of thought
experiment can help me notice that I'm applying a different standard to myself than I apply to other
people. Right. So you've talked about the outsider test. You've also talked about skepticism about
the source. You know, would you value this differently if it came from a different source?
Right. What are some of the other thought experiments that you can use in order to notice your own
biases? Well, another one that I use a lot is the selective skeptic test.
So I actually used this when I was writing the book because I found this study that contradicted my thesis in the book.
It said that soldier mindset makes you successful.
So I read that headline and my eyes kind of narrowed and I was like, let's see, let's check up their methodology and see if it holds up.
And I read the methodology section and indeed it was kind of a poorly done study.
I found some kind of glaring flaws in the design.
And then I asked myself, well, suppose I had found the same study with the same methodology,
but the conclusion was the reverse.
And it actually supported my thesis and said that scout mindset makes you successful.
What would my reaction have been to that study?
And I realized, oh, in that case, I would have said, great, let's put the study in the book.
And so that kind of made me realize I was being selectively skeptical.
And that I needed to up my game and just be a little bit more critical of evidence that told me what I wanted to hear.
So I went back and went through some of the studies that I had saved to put in the book,
and I kind of looked at their methodology sections with the same critical eye that I was applying to disconfirming evidence
and realized that a lot of them actually weren't really that strong, and I shouldn't lean on them in the book.
And I couldn't actually defend them.
And so I had to rewrite big sections of the book before publication.
But, you know, I think this is the kind of thing that, like, to go back to the example of John Allen Paulos,
who stuck with his Enron investment long
after it was reasonable and ended up losing, I think, probably millions of dollars, although he
didn't specify the amount. If he had applied the selective skeptic test, like, for example, in many
cases, he would read an article talking about how Enron was a bad investment, and he would come up
with some reason to reject it, like, well, you can't trust this columnist because, whatever,
they have some bias, so I can ignore them. If he did the selective skeptic thought experiment
and asked himself, well, suppose this columnist had told me that Enron was actually a great investment.
How would I have reacted if he'd given the opposite advice?
I think he would have noticed, like, oh, then I would have accepted it.
Like, if the same columnist advised me to stick to Enron, I would accept it.
And if he told me not to stick with Enron, I would find this excuse to ignore him.
And so doing that thought experiment can just help you notice that you're applying a different standard,
depending on what someone tells you.
So you never know.
I think a little bit of judicious application of that thought experiment might have saved him millions of dollars.
Right. Exactly. It sounds like a lot of these go back to noticing whether or not there's a double standard. And we often apply these double standards very unconsciously.
Yeah, that's exactly right. Yep. They're all just specific applications of that general principle.
You also talk in the book about evaluating your level of certainty in the ideas that you have or the ideas that you form. Once you've reached a conclusion,
how can you assess, both qualitatively and quantitatively, your relative level of certainty in it?
Yeah. So I think this is a really important facet of scout mindset, because to go back to the metaphor,
the scout has this map of the terrain or of the situation. But part of forming a map is recognizing
that it's not reality. Your judgments about what's going on are not objectively true. They're just your
best guess and they can be, you know, they can be wrong, they can be mistaken, you could be missing
information, you know, everything that we, all the judgments we form about the world are filtered
through our own biases and just imperfect perception. And so you always want to have at least
some amount of uncertainty, some question marks on your map, about what you're actually perceiving.
You know, that river may look like it's frozen, but you probably want to get at least one or two
more different looks from different angles before you're too confident that the river is actually
frozen and you walk across it. So the question is, how can you tell how confident you should be
in your different beliefs? And people often tend to just default to certainty.
You know, they'll say, well, definitely the Democrats are going to win or definitely this is a
good investment. This company is going to take off. Or definitely, I made the right choice,
etc. But actually, you need to have varying degrees of confidence. And one way to sort of develop
the ability to calibrate your confidence appropriately is just to make predictions and check them.
In my book, I have a sample of 40 trivia questions, just things like: which of these two countries
has a higher population? Or, true or false: bats are blind. And you have to answer,
give your best guess, and then say how confident you are that you're right, you know, 60%, 80%, 90%. And then
after you've made a bunch of predictions with corresponding levels of confidence, you go through and find out
the actual answers, like which ones did you actually get right or wrong? And the goal is to be well
calibrated, which means that when you say you're 60% confident in something, you actually are right
about 60% of the time. And when you say you're 90% confident in something, you're actually right about
90% of the time. That's the ideal that you're kind of shooting for. And most of us don't really get
close to that ideal when we first start attempting this skill. Usually we're overconfident and we're
right much less often than we think we're going to be right. But it doesn't actually take that
much practice to get well calibrated. You know, you can do this on trivia questions or you can do it on
something more specific to, you know, your career or your goals. You can make predictions about
which companies are going to take off or you can make predictions about how well an employee you
hire is going to turn out. But the important part is to put down a level of confidence along with
that prediction and also to go back and check later to see whether you actually got it right or wrong.
Because it's all too easy to just forget about the things that you got wrong and that just reinforces
our overconfidence in the long run. Let's talk about the opposite of overconfidence. What if a person
is stuck in analysis paralysis? They are so aware of uncertainty that they can't take any action or
make any decision so they get mired in the status quo. Yeah, I think this is a common thing that people
are concerned about with the idea of recognizing uncertainty. And it is a failure mode that I've
seen some people fall into. So it's not a ridiculous thing to be concerned about. But here,
I would say the important skill that you need to have to be able to recognize uncertainty and still
act is just to be able to act on your best guess while kind of keeping in the back of your mind,
this could change. And if I get new information, I can reassess. But for now, I'm going to act
on my best guess. And this is especially important to be able to do for decisions that aren't
incredibly high stakes or decisions that are reversible. Because a lot of the decisions we make on a
day-to-day basis are things where you can just change what you're doing later on if it turns out
that you were wrong. And it's more important to just make a decision now than to deliberate for
ages until you, you know, are certain you have the right decision. So to go back to the map
metaphor, you want to kind of retain the uncertainty in your map and recognize, you know,
this business plan that I've come up with could be wrong. And I want to make sure that I'm
open to any evidence that it might be wrong. But for now, I'm just going to go with it.
And at some point, six months from now, or at some point, if I learned something new about the market
that I didn't know or I get some feedback from my beta testers that surprises me, then I can
stop and reassess, but for now I'm just going to act on this assumption. So the important thing,
the aspect of scout mindset that I want to emphasize, is just remembering in the back of your mind
that you could be wrong, and you want to be open to evidence about that. But it doesn't mean you have to be
thinking about how you could be wrong 24-7. Right. That reminds me of another expression:
strong opinions, weakly held. Yeah, that is exactly what I think that expression was meant
to capture, yeah, that you should act on your best guess and really investigate it strongly or
immerse yourself in it, but also be willing to give it up and switch to a different best guess
if the evidence suggests that you should. I forget who coined this phrase or what the context was,
but I think it was meant to be a corrective to this kind of analysis paralysis where people felt
like they could never act on any assumptions unless they could be certain that they were right.
Let's talk about practical, real-life applications of scout mindset.
What are some of the ways that this has helped you in various aspects of your life?
Well, you know, one thing that I often struggle with is I get kind of dejected or I get in a funk where I feel like something I'm working on is hopeless or pointless or it's, you know, going to fail.
And I don't want to try to lie to myself and say, oh, it's definitely going to work out.
But I do want to prompt myself to just be more objective and not just give in to whatever my, you know, funk is at that particular moment.
And so one thing that I try to do is to try to make a prediction about how likely it is that I'm still going to feel this way about my project a week from now.
And often that shifts me out of the mode of just looking for reasons to confirm my current feeling that things are hopeless.
And it instead forces me to give my honest best guess about, okay, how likely is it that I'm going to still feel this way in a week?
And, you know, with what level of confidence would I make this prediction?
Like, suppose I had to bet on whether I'm still going to feel like this project is hopeless next week.
Would I be willing to bet $1,000 on that and lose $1,000 if I was wrong?
Often making it concrete like that forces me to realize, like, oh, I actually would not be willing to take that bet.
because in the past I have often felt this way and turned out to be wrong or, you know,
things have turned out much better than I felt they would or I've, my mood has changed.
And so no, I wouldn't be willing to stake $1,000 on my bad mood continuing indefinitely.
So imagining that bet kind of forces me to be objective with myself and not just wallow in my,
in my bad mood.
So I find that kind of a helpful and a more honest way to
snap out of my bad mood than just lying to myself. And then another technique that I find myself
using a lot is, you know, we've been talking a lot about improving self-awareness, like
helping yourself notice that you're applying a different standard to evidence, depending on whether
you want to believe it or not. And that is a really important part of becoming more of a scout.
But another whole category of technique that I think is really valuable is developing emotional
tools to make yourself willing to consider possibilities that you don't really want to accept.
And so the advice that I tend to give people about that is when you're thinking about something
that you don't really want to accept, like did I make a mistake or was I wrong in that argument
or do I need to do this difficult thing like fire an employee or reassess my business plan,
whatever it is that you're kind of resistant to. Before you try to figure out whether or not that's
true, you should instead take a step back and ask yourself, okay, suppose it was true.
What would I do about that, you know, or how bad would that be?
And first just kind of come to terms with the possibility before you try to evaluate whether
or not it's true.
An example of this from my personal life a little while ago was I was worried that I had
done something inconsiderate to a friend.
And I was debating with myself about whether I needed to apologize to her.
And I was kind of trying to convince myself that, no, no, I don't need
to apologize because, well, sometimes I told myself, oh, she probably didn't even notice.
And then other times I told myself, well, she's probably already forgotten about it anyway,
which were kind of internally inconsistent excuses, which is like a sign that I was rationalizing
to myself. But anyway, I was struggling with this. I took a step back and asked myself,
okay, suppose I had to apologize. How bad would that be? And like, how would I do it?
And I thought for, I don't know, like less than a minute and realized, okay, I guess I would say
such and such. I put together a quick draft of an apology that I would feel good about giving her.
And I imagined her reaction. And I realized, oh, you know, it wouldn't really be that bad. I think she would
be appreciative and it wouldn't be, she wouldn't be angry. I don't know. I guess it wouldn't be that bad if I
had to apologize. And so then I was able to go back to my original question of should I apologize.
And then it was much easier to think about. And the answer was much clearer. I was like,
yeah, I should apologize. But only once you've realized that you could deal
with the bad outcome if it were true, only then are you kind of able to accept that it might be true.
And so, you know, the generalization of this example is just come up with a plan for what you
would do if you had to fire that employee. Come up with a plan for what you would do
if it turned out that your investment went south or, you know, whatever the potential bad thing
is, come up with a plan for it first so you feel like you could handle it and then think about,
okay, is this actually true? So you don't feel nearly as tempted to rationalize with yourself.
Right, right. I often tell people a variant of that when they're thinking about buying a rental
property, and there's a range of possible variables: what it could rent for, how much
repairs will cost, what percentage of occupancy it will have. Yeah.
And so I say, you know what? Calculate that full range. Look at the worst-case scenario.
Could you live with it? Not do you like it, but could you tolerate it?
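To make Paula's range calculation concrete, here is a minimal sketch in Python. Every number is hypothetical, just to show the mechanics of laying out best, expected, and worst cases for one property:

```python
def annual_cash_flow(monthly_rent, occupancy, repairs, fixed_costs):
    """One year of rental cash flow under one set of assumptions."""
    return monthly_rent * 12 * occupancy - repairs - fixed_costs

# Hypothetical ranges: vary rent, occupancy, and repair costs across the
# plausible range, then look hard at the worst case.
scenarios = {
    "best": dict(monthly_rent=1500, occupancy=0.95, repairs=1000, fixed_costs=9000),
    "expected": dict(monthly_rent=1350, occupancy=0.90, repairs=2500, fixed_costs=9000),
    "worst": dict(monthly_rent=1200, occupancy=0.75, repairs=6000, fixed_costs=9000),
}
for name, assumptions in scenarios.items():
    print(f"{name}: {annual_cash_flow(**assumptions):,.0f}")
# The worst case here is -4,200 per year. The question is not whether you
# like that number, but whether you could tolerate it if it happened.
```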
Exactly. That's exactly what I'm saying. You don't have to feel like,
oh, the bad outcome is going to be totally fine and great, because that's not actually realistic.
There are bad things that happen. The goal is just to feel like you could tolerate it,
that yeah, I could get through this. Because I think you really kind of need to have that level
of security in order to be able to think honestly about what's true. Right. Often, by the way,
to give kind of a new wrinkle to this, often the way that I reach that kind of level of acceptance,
is just by looking for a silver lining to the bad thing if it were true.
So, for example, I sometimes get into disagreements on Twitter or, you know, in person I might
get into disagreements with my family about politics or whatever.
Sometimes there's a moment when I start to worry, shoot, am I actually wrong about this?
Maybe they have a point or maybe I overstated my case or I don't know.
Maybe the article I was citing actually isn't as solid as I thought it was.
and it's very tempting in those moments to push that thought out of my mind and just focus instead
on finding reasons why I'm actually right, or reasons to shoot down what they're saying, because that's
more comfortable. So instead, I try to remember to focus on the silver lining of being wrong.
And I think an important silver lining that people often don't notice is that if you tell someone
that you were wrong, what you're doing is investing in your future credibility because you're
proving that you're the kind of person who doesn't just stick
to their guns just for the sake of it, and that if you are wrong, you're willing to say so.
And so that might be uncomfortable now, but for the future, you've built up this track record
of being willing to change your mind. And so if you want to convince someone of something in the
future, it's easier to do that now that you've shown that you're not just going to say things
just to justify your own past opinions. So it doesn't make it easy to say I was wrong, but it
makes it a little bit easier. It makes it at least tolerable to say that I was wrong. And often
that's enough to make me willing to acknowledge it. Right, right, exactly. Plus, the more that
you build that habit, the habit of admitting that you were wrong and course correcting, the more that
you're able to then apply that habit to the way that you manage your investment portfolio, which
means that you don't get mired in sunk cost fallacy. Exactly. Yeah. Now, that's another very important
silver lining, is that you're building this very valuable,
generally valuable skill of being able to notice you were wrong and course correct. And so even if that's a little bit
painful in the moment in this particular case, you're, you know, you're getting stronger. It's like
the pain of exercising is tolerable because it reminds you that you're getting stronger and this is
going to be easier in the future. So I think you just need to apply that same framework to noticing you
were wrong or that you might have made a mistake. Right. And also, if you approach every thought with:
there's a range of possible outcomes, I'm going to hold this one, but I'm also going to recognize that there is
uncertainty surrounding it, then it also becomes easier, because you've adopted the idea loosely.
Yeah, exactly. This is, I think, a nice positive side effect of scout mindset that people often don't
appreciate, which is that it does make it easier to change your mind and recognize contradictory or, you know,
disconfirming evidence if you think in terms of probabilities and not in terms of certainty.
Because, I mean, think about it. If you see things in black and white and things are either right
or they're wrong, and I'm definitely right and those people are definitely wrong, then any evidence
that contradicts your certainty is a threat that you have to defeat or you have to find some way
to shoot it down or else you have to flip from black to white and that's very painful and a huge
deal. If instead you always have some range of uncertainty, and you're like, well, you know, I'm 80%
confident that this choice that I made from my business was a good idea. I'm, you know, 75% confident
that this was a good investment. Then the stakes are much lower in any given situation. Like if you
read an article suggesting that, oh, actually this market is much more volatile than I'd realized or, you
know, read some study that contradicts a political view I had on immigration, then you don't
have to flip from certain yes to certain no. You just
have to adjust your confidence level a little bit upward or downward. It's rare for one piece of
evidence to be definitive, but it should usually make you a little less or a little more
confident. And so maybe you go from 80% sure that you made the right choice for your business to
only 65% sure or something like that. But either way, it's a, it's a smaller adjustment. And the stakes,
the emotional stakes are lower than if you think only in terms of certainties in black and white.
So I think that's kind of a nice upside to scout mindset, that it just makes it easier to think about
disconfirming evidence.
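One way to make that kind of incremental adjustment precise is a Bayesian update, which the conversation gestures at without naming. Here is a minimal sketch in Python; the likelihood ratio is chosen purely for illustration:

```python
def bayes_update(prior, likelihood_ratio):
    """Update a probability given evidence.

    likelihood_ratio = P(evidence | belief true) / P(evidence | belief false).
    A ratio below 1 means the evidence is disconfirming.
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical: start 80% confident a business choice was right, then read
# an article you'd be twice as likely to see if the choice were wrong.
print(bayes_update(0.80, 0.5))  # ~0.67: a nudge down from 80%, not a flip to certain no
```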
So I guess if I could summarize scout mindset, the promise of it is that there is a high likelihood
that rejecting all-or-nothing thinking in favor of probabilistic thinking is advantageous.
Yes, I mean, that's a great summary.
It's not the entirety of the benefits of scout mindset, but it's a very important part of it.
You're so good at concisely summarizing what I spend five minutes trying to say.
Thank you. Thank you.
Well, we are coming to the end of our time.
Are there any final takeaways that you would like to leave this audience with?
Yeah.
I think maybe the most important thing to remember that we haven't really touched on yet is that
when you notice yourself in soldier mindset, like when you notice that you're getting
defensive or that you're kind of reflexively arguing with someone instead of thinking about whether
they might be right, or you notice that you're rationalizing and trying to resist some unpleasant
or inconvenient truth or that you're applying a different standard of skepticism, depending on
whether you want to accept the claim. When you notice yourself in soldier mindset, you should
feel good about noticing, which might sound paradoxical, like it's bad to be in soldier mindset.
Why should you feel good about that? But the point is,
soldier mindset is so common and so universal that if you don't notice yourself doing it,
that's not actually a good sign. Which is more likely: that you're just someone who,
unlike the rest of humanity, is never in soldier mindset, or that you're just not very self-aware
and you're doing it a lot without even noticing? I think the latter is a lot more likely.
And so when you notice, you should feel good. You pat yourself on the back for your increased self-awareness.
And I suspect a nice side effect of that policy is that you're more likely
to notice, because you feel good when you notice instead of beating yourself up about it.
So that can kind of help increase the self-awareness too. It's like a self-fulfilling cycle.
And I guess one other takeaway to try to push yourself more towards the scout end of the
spectrum is just to be incremental about it. No one is a perfect scout and you're not going to
transform your entire way of thinking overnight. And so don't feel bad for falling short of perfection.
But just try to build one or two habits at a time. Maybe start with the habit
of doing that kind of mental exercise that we talked about, of asking yourself,
what would I do if this bad thing were true before you try to evaluate whether it is true?
Or maybe start with the habit of thinking about, well, how confident am I?
You know, am I 60%, 90%, etc?
Or you could start with the habit of doing the outsider test on some of your personal problems.
But just, you know, start with one or two and build those habits and every now and then introduce a new one.
Over time, incrementally, you can move yourself more from the soldier end of the spectrum to the scout end of the spectrum.
But it's a process.
Well, thank you, Julia.
Where can people find you if they would like to engage more with your work?
Thank you.
This has been wonderful.
So my website is juliagalef.com.
And my book is The Scout Mindset.
There's a page on my website about it.
And you can also just find it on Amazon or wherever you buy books.
Oh, you could also follow me on Twitter.
I'm just Julia Galef.
I talk a lot about these concepts on Twitter and I often try to explore or think out loud about
things that I'm uncertain about or have changed my mind about.
So it would be great if you joined me and gave me your input on that.
Thank you so much, Julia.
What are some of the key takeaways that we got from this conversation?
Here are seven.
Number one, think in terms of expected value.
Many people think that they need to be optimistic about their odds of success, when in fact,
it could be wiser to think in terms of expected value, like Elon Musk and Jeff Bezos both did.
You don't need to delude yourself in order to keep your motivation high.
Essentially, the expected value of some endeavor is the probability of success times the value of success,
plus the probability of failure times the value of failure.
So what that ends up meaning is that some endeavor can be very worth doing,
even if it's more likely to fail than to succeed,
as long as the value of success is high enough
and the kind of value of failure
or the badness of failure isn't all that bad.
If you can walk away from a failed experiment
with even a little more than you started with,
like Elon Musk reasoned,
then going ahead could be worth it
even if it doesn't work out the way you wanted it to.
And that thought can be motivating on its own,
especially if it aligns with your bigger why.
And so thinking in terms of expected value, and not buying into the idea that you need to
delude yourself in order to stay motivated, that's key takeaway number one. Key takeaway number two,
run an outsider test. To be able to assess whether or not you're applying a double standard,
you can run thought experiments. So let's review the thought experiments that Julia mentioned.
When you feel personally torn about something, you can gain perspective on the situation by running
what Julia calls the outsider test. This is when you swap yourself with someone else and think about
what advice you would give to that person, someone in the exact same situation that you're in.
That can cut through the noise in your own head. It's just very striking oftentimes how different
my judgment is when all I change about the scenario is whether it's me who's in the scenario.
And so running an outsider test in order to assess whether or not you're imposing a double
standard on yourself, that is key takeaway number two. Key takeaway number three, vary the source of
the claim. So in a similar vein, this is another type of test to tell whether or not you're applying
a double standard. What if you swap the person who's giving you advice or insight or information?
What if you swap that person with someone else? Would you judge the insight, the observation,
the information, the advice differently if it were coming from a different source?
Now, of course, not all sources are created equal.
Some sources are more credible than others, but are you emotionally conflating your assessment
of the source with your assessment of the insight, the observation, the information,
the advice that they're giving?
There is a distinction between rationally assessing a source's credibility and emotionally conflating the source with the substance of what they're saying.
If a friend of yours gives you advice, would you judge it differently than if, say, a colleague
who you don't like gave you the same advice? Because very often
we discount advice just based on not liking the source.
And so that is key takeaway number three.
Vary the source of the claim.
Key takeaway number four, check on whether you're being selectively skeptical.
This really homes in on confirmation bias.
If you find an article or a study that supports a position, a belief, or an idea that you
strongly hold, you're likely to give it more weight.
That's confirmation bias.
But if you find an article or study that disproves or that pokes holes in your position,
your belief, or your idea, you're likely to be more skeptical of it because you don't want
to believe it.
And this is one of the more subtle ways that we fool ourselves.
Julia herself even fell prey to it when she was writing her book.
I found this study that contradicted my thesis in the book.
It said that soldier mindset makes you successful.
I was like, let's check out their methodology and see if it holds up. And I read the methodology section, and indeed, it was kind of a poorly done study. I asked myself, well, suppose I had found the same study with the same methodology, but the conclusion was the reverse, and it actually supported my thesis and said that scout mindset makes you successful. What would my reaction have been to that study?
If you do this thought experiment and find yourself admitting that you would have accepted the study if its conclusion had supported your stance, then that's a red flag.
So for example, let's say that you wanted to start an online platform and you came across some research that said 80% of new online platforms flop within a year.
If you find yourself immediately disagreeing and trying to tear the research apart, flip the script and ask: if this research had shown that 80% of new online platforms experience
wild success within a year, would I be as critical of the methodology?
Would I be as skeptical?
If not, that's a cue to check your judgment, to check your internal biases.
And so that is key takeaway number four.
Key takeaway number five.
Evaluate your level of certainty in specific ideas and beliefs.
Now, all of this discussion eventually leads to the question,
how confident should you be in the beliefs that you hold?
Often, we default to overconfidence.
Julia says that you can calibrate your certainty by making predictions, noting your level of certainty,
and then keeping track of how correct you were.
It doesn't actually take that much practice to get well calibrated.
You know, you can do this on trivia questions or you can do it on something more specific to, you know,
your career or your goals.
You can make predictions about which companies are going to take off, or you can make predictions
about how well an employee you hire is going to turn out.
But the important part is to put down a level of confidence along with that
prediction, and also to go back and check later to see whether you actually got it right or wrong.
Eventually, you'll hone this skill, and when you say you're, say, 70% sure, you'll actually be
right about 70% of the time. This is a useful exercise for understanding that anything we think
we believe is actually a belief about a likelihood. So, for example, I think that the stocks in
my portfolio are likely to perform well in the next decade.
But really, what I'm saying is that there is a range of possibilities, and I think that there's a high probability within that range that my prediction is correct.
But I might be wrong.
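If you want to try the calibration exercise Julia describes, here is a minimal sketch of one way to track it in Python; the prediction-log format and the numbers are my own illustration, not anything prescribed in the episode.

```python
# Minimal calibration-tracking sketch (hypothetical data, not from the episode).
# Record (stated confidence, whether it came true), then check each bucket:
# you're well calibrated when your hit rate matches your stated confidence.
from collections import defaultdict

predictions = [
    (0.9, True), (0.9, True), (0.9, False),   # calls you were "90% sure" about
    (0.6, True), (0.6, False), (0.6, False),  # calls you were "60% sure" about
]

buckets = defaultdict(list)
for confidence, came_true in predictions:
    buckets[confidence].append(came_true)

for confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"said {confidence:.0%} sure -> right {hit_rate:.0%} of the time")
```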
And that leads to the next key takeaway, key takeaway number six, look at the range of possible outcomes.
In order to get out of analysis paralysis, in order to move forward with some type of decision,
looking at the range of potential outcomes can be incredibly helpful. Because once you look at a range of
reasonable potential outcomes, you can create a worst-case, mid-case, and best-case scenario.
Now, could you live with or tolerate the worst case? How badly would it affect you?
Sometimes worst-case scenarios aren't the end of the world, but we build them up in our mind so
much that we freeze on taking action. So looking at the range of possible outcomes can help us
accept that the most likely outcome will fall somewhere within that range,
from worst case through mid case to best case.
And as long as we can live with that worst case, we can feel more confident about going ahead.
I often use this exercise with the students in my course, my rental property investing
course, when they're thinking through whether or not to buy a particular property.
And I say, hey, let's calculate a range of potential cap rates on this property.
Here is the worst-case, mid-case, and best-case cap rate.
Now, looking at that range of outcomes, how do you feel about those returns?
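As a rough illustration of that exercise (the property numbers here are invented, not from my course), a cap rate is annual net operating income divided by purchase price, so a worst/mid/best sketch might look like this:

```python
# Worst-case / mid-case / best-case cap-rate sketch with invented numbers.
# Cap rate = annual net operating income (NOI) / purchase price.
purchase_price = 200_000

noi_scenarios = {
    "worst case": 8_000,   # assumed NOI if vacancies and repairs run high
    "mid case":   12_000,  # assumed NOI under typical conditions
    "best case":  16_000,  # assumed NOI if everything goes smoothly
}

for name, noi in noi_scenarios.items():
    cap_rate = noi / purchase_price
    print(f"{name}: {cap_rate:.1%} cap rate")  # 4.0%, 6.0%, 8.0%
# The question to ask: could I live with the worst case?
```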
Julia gave a different example.
Julia gave the example of worrying about whether or not she was inconsiderate to a friend
and how she made peace with apologizing.
I was debating with myself about whether I needed to apologize to her.
I took a step back and asked myself, okay, suppose I had to apologize.
How bad would that be?
And, like, how would I do it?
And I kind of drafted a quick draft of an apology that I would feel good about giving her.
And I imagined her reaction.
And I realized, oh, you know, it wouldn't really be that bad.
I think she would be appreciative and she wouldn't be angry.
This thought exercise helped Julia make the decision to apologize to her friend because it helped her accept that she needed to apologize in the first place.
It also saved her a lot of mental anguish.
By taking action, she no longer had to agonize over whether the apology was necessary or
whether she truly hurt her friend. She looked at the range of likely outcomes, realized that
no harm could come of apologizing, and perhaps only good could come of it. And so she realized
that that was what she should do. So that's an example of how this thought exercise can be used
not just in investing, but also in daily social and family situations, as well as workplace
situations. And so that is key takeaway number six. Look at the range of possible outcomes.
And key takeaway number seven.
Keep the silver lining in mind.
Julia makes an interesting point about the silver lining in being wrong.
If you tell someone that you were wrong, what you're doing is investing in your future credibility,
because you're proving that you're the kind of person who doesn't just stick to their guns just for the sake of it,
and that if you are wrong, you're willing to say so.
That's a powerful way to reframe being wrong.
Oftentimes we feel shame or embarrassment about being wrong, about making an error, but by modeling
that it's okay to admit when you're wrong, you may increase the chances of others doing it.
And if nothing else, showing that you're a critical thinker and that you can pivot when
appropriate evidence is supplied, that alone should help you gain respect, at least self-respect.
And finally, as Julia said, associating a positive feeling with being wrong increases the chances
that you'll become more self-aware and notice when you're wrong more often.
Some of the best thinkers in my anecdotal observation
are the ones who can freely admit when their thinking has pivoted
or has become more nuanced or more sophisticated.
Because if you are constantly learning,
then your thinking will become more nuanced over time.
It will become more layered and more deep over time
as your level of knowledge increases.
And so those are seven key takeaways from this conversation with Julia Galef.
Thank you so much for tuning in.
If you enjoyed today's episode, please go to affordanything.com slash show notes.
Sign up to get all of the show notes, the synopses of every episode, delivered directly to you.
It's completely free, and it's a great way to get notes from every episode that we produce.
Again, that's affordanything.com slash show notes.
If you enjoyed today's episode, please do three things.
Number one, share it with a friend or a family member.
That's the single most important thing that you can do
to spread the message of making wise decisions
about your money, your time, your energy, your attention.
Number two, make sure that you're following us
in whatever app you're using to listen to this podcast.
Just go into that app and hit the follow button.
And number three, while you're in that app, please leave us a review.
If you want to chat about today's episode with members of our community,
head to affordanything.com slash community.
We have, by the way, a thriving book club in that community.
Right now, they are reading The Psychology of Money by Morgan Housel.
He was a former guest on this show.
They've just got tons of great books going on, great discussion.
In addition to the book club, we've got villages of people inside of our community
who gather based on shared geography, like people who live in the Pacific Northwest or in the
southeast or in the northeast.
We've got people who gather based on age group: 20s, 30s, 40s, 50s.
We've got people who gather based on profession.
We've got people who talk about shared interests, whether that's real estate or stock
investing or life after FI.
So whatever village it is that you're looking for, you can find it inside of our community.
Again, that's affordanything.com slash community.
Thanks again for tuning in.
My name is Paula Pant.
This is the Afford Anything Podcast.
Don't forget to subscribe to our show notes.
affordanything.com slash show notes,
and I will catch you in the next episode.
