The Knowledge Project with Shane Parrish - #112 Adam Grant: Rethinking Your Position
Episode Date: June 1, 2021. Celebrated organizational psychologist and author Adam Grant provides compelling insight into why we should spend time not just thinking, but rethinking. In this episode we cover how to change our own views, how to change the views of others, hiring processes, psychological safety, tribes and group identity, feigned knowledge, binary bias, and so much more. Grant is a Professor of Psychology at The Wharton School of the University of Pennsylvania and the author of five books, including his most recent release, the New York Times bestseller Think Again. He also serves as the host of WorkLife, a TED original podcast. -- Want even more? Members get early access, hand-edited transcripts, member-only episodes, and so much more. Learn more here: https://fs.blog/membership/ Every Sunday our Brain Food newsletter shares timeless insights and ideas that you can use at work and home. Add it to your inbox: https://fs.blog/newsletter/ Follow Shane on Twitter at: https://twitter.com/ShaneAParrish
Transcript
Yeah, you're entitled to your own opinion if you keep your opinion to yourself.
If you decide to say it out loud, then I think you have a responsibility to be open to changing your mind in the face of better logic or stronger data.
And so I think if you're willing to voice an opinion, you should also be willing to change that opinion.
It goes right back to something we talked about earlier, which is to say, okay, when would I change my mind?
If you can't answer that question, you are no longer thinking like a scientist, you have gone into preacher or prosecutor mode.
Welcome to the Knowledge Project podcast. I'm your host, Shane Parrish.
This podcast is packed with timeless ideas and practical insights to help you get the most out of life and business.
If you're listening to this, you're not currently a supporting member.
If you'd like special member only episodes, access before everyone else, transcripts, and other member
only content, you can join at fs.blog slash podcast. Check out the show notes for a link.
Today I'm speaking with Adam Grant. Adam has spent the past 15 years researching and teaching
evidence-based management, helping the likes of Google, Pixar, and the NBA re-examine how they can design
meaningful jobs, build creative teams, and shape collaborative cultures. Adam's latest book, Think Again,
is his best yet. We're going to talk a lot about thinking, or more specifically rethinking,
attempting to answer the questions of how we update our own views, how we change the views of
others, and how we create a community. This podcast is an invitation to let go of the knowledge
and opinions that are no longer serving you well and to tie your identity to flexibility
and not consistency. It's time to listen and learn.
Tell us in your own words how you came to write about rethinking.
This is such a hard question to answer because everything I want to say, I'm tempted to rethink.
So we're going to have to question a lot of what I say here.
I think I really started thinking deliberately about it when I just had experience after experience of going into a new organization.
And usually I'd either give a keynote speech or the CEO or founder would reach out for some advising.
And I'd start to walk through the evidence on whatever their question was.
And more often than not, I'd get answers like, well, that won't work around here,
or that's not how we've always done things.
And at some point, I just started saying, hmm, BlackBerry, Blockbuster, Kodak, Sears,
should I keep going?
It was both surprising and mildly annoying to me that the very people who had called me
because they thought I could help them rethink their vision, their culture,
their strategy, were closed to the best evidence I could find. And I got really curious about
why that was. And then I saw a lot of the same thing with my students, where they'd come in,
completely locked into being investment bankers. I've had enough students regret that path
that I've spotted some of the warning signs that somebody might not find Nirvana going down
that road. And I'd try to encourage them to reconsider. And I got the same resistance. And any time I
see a 21-year-old and a 61-year-old grapple with the same exact problem, I think there's
something really important and interesting to explore. So here we are. Yeah, definitely. It's easy to
see that other people should rethink things. Like we're looking at them going like, oh,
you know, that's Blockbuster, they should have rethought that, or they should be rethinking
that. It's really hard for us to rethink ourselves. Why is that? There are probably multiple
reasons for it. But I think two of the reasons why people are really hesitant to rethink things are, one, it makes the world feel much more unpredictable. You know, if my views aren't fixed, then who am I? And how do I navigate a really confusing and often turbulent world? And two, it makes me feel like I am not an expert, right? And a lot of us take pride in our
knowledge. You know, when I think about power, there's a classic French and Raven framework
where they said, look, you know, there's expert power, there's what's called referent power, which is basically being liked and respected, and then there's coercive, reward, and legitimate power. And most of the bases of power that people have in life come from a
position that they happen to hold, right? So my ability to reward you or punish you, my ability
to get you to listen to me because I have a role of authority is not something I can carry
with me. And so the knowledge I have is one of the few things that I get to hold on to, and the idea that that might be fragile, it not only questions my identity,
it also, I guess, questions my status and my standing in the world, which is something pretty
uncomfortable to do. Let's double-click on that identity concept, because I think that
you sort of said our views are almost tied to our identity, and that gets in the way sometimes.
This is something I'm always puzzled by, and I feel like I see it in every field. There are
professionals in almost every field who not only are interested in particular ways of being
or particular practices, but they actually define themselves by living those practices.
You know this already, but when I was writing the book, I started thinking through, you know,
how terrible would the world be if some professions had not rethought some of their convictions?
So imagine, for example, that you went to a doctor whose identity was to be a professional
lobotomist. That would be extremely dangerous. And yet there was a time when a lot of physicians
defined themselves by that method, by that set of tools.
We've seen the same thing with, you know, with police officers, right,
who identified themselves as the kinds of people who would stop and frisk
because you never know where a criminal could be.
And we know from the evidence that that just had horrendous effects,
particularly when it comes to, you know, disproportionately arresting
and prosecuting people of color, particularly black people here in the U.S.
You know, we've had teachers and parents who identified with practices that were just highly
ineffective and maybe even harmful. And I think that it's dangerous, right? I think that for me,
an identity is not about what you believe. It's about what you value. And so I want to have a set of
principles. For me, my highest values are generosity, excellence, integrity, and freedom. And I am
completely flexible on the best ways to live those values. And so you might come tomorrow and tell me, you know what? The randomized controlled experiments that you do, the longitudinal studies you do, there's a fatal
flaw in them, and there's a better way to be helpful and excellent at your job. And I would be
skeptical because I believe in science, but I would be open to hearing the idea. How did you get to
that point? Like, how do we convince ourselves to attach our identity to values and not beliefs?
Like, that's a tricky path, isn't it? Do you think so? I don't know. It seems like it is. Otherwise, we'd just be rethinking all the time. I mean, isn't that part of the fun of
being human, though? To me, rethinking is code for learning, isn't it? Well, it is, but we don't update
our views very often. That's part of the issue, our own and others, right? It's hard to see it
when we don't do it. It's really easy to see when others don't do it. What is the process by which
we update our views? That's a great question. The way that I've landed at thinking about this
is to say, look, when you have a belief, you have two options.
One is you can subject it to a rethinking cycle.
The other is you can fall victim to an overconfidence cycle.
So an overconfidence cycle is something we've all both committed
and witnessed probably too many times.
But the basic idea is we start by being proud of something that we think we know.
And that leads us then to feel a lot of conviction.
That kind of launches us into confirmation bias, where we look for information that confirms our expectations, as well as desirability
bias, where we look for information that basically reinforces what we want to be true.
And then we see what we expected to see and what we wanted to see, and we get validated,
and that only makes us prouder of what we know and less open to rethinking.
The rethinking cycle is very much the opposite.
It starts for me with intellectual humility, which is about knowing what you don't know.
You know, no matter how much of an expert you are in a given field or a given topic,
you have a long list of things that you're clueless about.
And being aware of what your ignorance is leads you to doubt your convictions.
It makes you curious about what you don't know.
And that opens your mind to new discoveries.
And then every time you learn something new, it's not this sign that, oh, now I'm an expert.
It's this sense that, well, there's so much more to learn, right?
And I've made a tiny, tiny dot of progress in, you know, a whole universe of knowledge.
And I can't wait to see what I learn next.
And so I think one of the things we need to do is we need to give ourselves permission to enter rethinking cycles.
And there are a lot of ways to do that we could talk about.
But Shane, I'm going to ask you about this because a couple years ago, you wrote a post about how we should have more second thoughts.
And I had literally started writing about that.
I think it must have come out around the time that I was writing the Think Again book proposal,
and I had proposed a tentative title for this book as Second Thoughts.
I was like, this is amazing.
You're on the exact same wavelength as me, and this is what you do for a living, right?
You rethink things.
You also ask the Farnam Street community and your whole audience here at the Knowledge Project
to rethink a lot of their convictions.
So where do you start your rethinking cycles?
And how do you know when it's time to enter one?
I think, like, I've just summed this up as, like, outcome over ego.
And so I usually try to wrap my sense of identity or ego in the outcome.
And that's something I learned when I was working for the intelligence agency, right?
Like, it wasn't about me having the best idea.
It was like, who's got the best idea?
because that's going to get the best outcome.
And then you sort of grow up in an environment where that becomes,
I would say the norm by and large.
It's hard in a knowledge environment, though, right?
Because you have so much of your worth.
You want to contribute to something.
I think there's a biological need to contribute to something larger than us.
And if you're not mechanically making something, if there's nothing tangible to what you're producing, then you effectively are a knowledge worker in one way or another.
And then you're paid for your judgment.
So if your judgment isn't right, what is it?
And then what you do is you force your way, right?
Like you don't intentionally sabotage other people, but you only look for confirming evidence.
You're not open to changing your mind because your sense of identity is tied to being right,
because that's how you contribute to the organization.
It's interesting, but not at all surprising to me that you really learn this in the intelligence community
because the way you're describing your process of rethinking is exactly what I learned from studying superforecasters, right, which is they will often come in to making
a judgment and say, okay, the only way to have a better shot at being right is to recognize all
the places where I'm wrong. And I love this practice in particular that came from one of the
superforecasters in the book, Jean-Pierre Beugoms, who, when he forms a tentative opinion,
will actually make a list of the conditions under which he would change his mind. And I've actually
started doing this over the past few months because I don't want to get locked into something that was maybe sort of a soothing belief, but ultimately one that's not going to serve me well.
I want to come to something you said about BlackBerry and Blockbuster. I was thinking,
you know, like how do you balance this notion of, okay, we don't know everything? There's a lot of
uncertainty in what we're doing. I'm open to rethinking it, but I also, I need to take action
and I need to do something. And then you have an escalation of commitment: the more action you take, the harder and harder it becomes to rethink. You have these costs building up, you have other sorts of escalations. How do you balance those two things between being open
and also affording yourself the choices that you need to make to exist in an organization and
seize opportunity? I don't know that there's a way to get the best of both worlds in every
situation. I do think, though, that you can create conditions that at least increase the
probability that you end up both open and decisive at the same time, which is a sort of strange
combination. So for me, that's really about changing the way that we reward people. So in too many
organizations, people are basically counted as successful if they get a good result and as failures if they get a bad result. And the problem is it often takes years to find out what the results were. It's very
easy for people to persist with a failing project for a long time and convince themselves and everyone
else around them that they're on the right path. What a lot of the research on this suggests is
that we want to shift to process accountability, not just outcome accountability. And ask
people to really think seriously about, okay, how would I know that this is a thorough and thoughtful
decision process as opposed to one that's driven, you know, purely by whim or intuition and
ends up being much more shallow. So I drew this little two by two that I've found helpful where
I cross the quality of the outcome with the quality of the process. And I think we need to
stop rewarding good outcomes with bad processes because that's just luck. That's kind of a boneheaded
decision that happened to turn out well. And we need to
start either celebrating or at least normalizing good processes with bad outcomes. Because if you have
a very thorough process, let's say, for example, you're going to launch a new product, or you're
going to even try to reinvent your culture a little bit, or you're trying to figure out, you know,
should we hire somebody or not? In all of those decisions, the common ingredient is you don't
know what the outcome is going to be a year, two years, five years down the road. What you do
know, though, is that there are more systematic, more rigorous ways of evaluating the decision now.
And so if you can score yourself on a set of benchmarks around, okay, was my process thorough,
then even if the outcome wasn't good, you could say, well, that was an experiment worth running
because that's part of how you become a learning organization.
And I'm constantly shocked by how few people actually think this way.
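To make that two-by-two concrete, here is a minimal sketch in Python; the quadrant labels and the boolean inputs are illustrative shorthand, not terms from the conversation.

```python
# Minimal sketch of the outcome-quality vs. process-quality two-by-two.
# Quadrant labels are illustrative shorthand, not terms used in the conversation.

def classify_decision(good_process: bool, good_outcome: bool) -> str:
    """Judge a decision by how it was made, not only by how it turned out."""
    if good_process and good_outcome:
        return "earned success: reward it and repeat the process"
    if good_process and not good_outcome:
        return "experiment worth running: normalize it, keep the process"
    if not good_process and good_outcome:
        return "luck: don't reward it, fix the process"
    return "avoidable failure: rethink the process"

# Example: a thorough launch decision that still missed its targets.
print(classify_decision(good_process=True, good_outcome=False))
```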
How do you judge a process?
Like, how do you walk through judging that and knowing that the outcome won't happen for years,
but also knowing that you need to update the process as you go along to get better and better and incorporate new knowledge?
So let's take a specific kind of decision.
So let's do a hiring decision, since that's easy to work with.
So, Shane, if you and I are going to make a hiring decision together and we've got...
I'm delegating to you for sure.
No, you shouldn't.
What you want is for me to weigh in on how to design the process and you want to be the one
that implements it.
Okay.
What I would do is I would start by saying, okay, the most basic mistake that people make in these kinds of decisions is they don't consider criteria before looking at candidates.
Right. So they interview their three people and they start to compare them as opposed to saying, no, I should have an independent standard for the skills and values that I'm trying to select on. And let's identify those really clearly. Let's not just do those from my opinion. Let's try to build some wisdom from a crowd here. And of course, not all crowds are equally wise. So let's go to people who are knowledgeable about the key dimensions of our culture, the key challenges of the job. And then once we built out the criteria we're looking for, the next step is to
say, okay, how do we rigorously and comprehensively assess people's standing on those criteria?
And, you know, in a lot of cases, there's one interviewer. We know that it's better to go up to
three or four empirically. In too many cases also, it's each interviewer's job to make an
overall assessment of the candidate, which makes it too easy to decide you like someone, and then
confirmation bias and desirability bias are basically driving the process. So what we do instead
is we break this down and we say, okay, Shane, I'm looking for somebody who's a giver and not a taker. Your job when you meet this candidate is to solely assess them on that
dimension and come back with your behavioral data and whether they fall more on the selfish
or the generous end of that spectrum. And then we have someone else assessing their
intellectual humility and curiosity. We have somebody else who's, you know, maybe gauging
their levels of integrity. And so nobody has a conviction about whether the overall candidate
is good or not. They're building the pieces of the puzzle to say, okay, does this person
meet our criteria? And then after that's done, we would then come together and say, okay, now
let's make an overall judgment having pooled all of our knowledge, right? That's a thorough
process, and it's very different from how most organizations hire. Why is it so different from
what most people do, just because it's time-consuming and... I think it's less about the time
and more about the gospel of intuition. Too many hiring managers are afraid that if they
essentially delegate, I mean, this is a more algorithmic approach to decision-making, right? If I delegate my knowledge and my experience
to what feels more like a formula, then maybe I'm out of a job, and maybe also, you know, my
superior intuition, my gut feeling about a candidate is going to get ignored, and that's what I've, you know, hung my hat on for a lot of my career. I also think it's boring, right? So
managers love having the freedom and flexibility to go wherever the interview takes them. And, you know, the idea of being much more structured in your interview process of saying,
okay, let's get a well-defined work sample.
Let's, you know, figure out if somebody says they're a good salesperson,
let's actually ask them to sell us something.
And let's compare all the candidates on the same selling task.
It kind of reduces the variety that I get in my job.
And I think those are a couple of the reasons why it's uncommon.
But I think to your point, yeah, it's expensive, right?
So it ultimately will require a bigger investment of time.
It probably requires more people involved, too.
That time has an opportunity cost.
And so maybe we feel like we're giving something up.
I don't know about you.
If I'm going to hire someone and commit to working with them,
I cannot invest enough time up front to decide that that's a good choice.
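A rough sketch of the structured approach described above, with criteria fixed before anyone meets a candidate, each interviewer rating a single dimension, and ratings pooled only at the end, might look like the following; the dimension names and example scores are hypothetical.

```python
# Hedged sketch of pooling structured interview ratings: criteria are set up front,
# each interviewer scores only one dimension, and nobody forms an overall verdict alone.
# Dimension names and example scores are hypothetical, not from any real process.

from statistics import mean

CRITERIA = ["giver_vs_taker", "intellectual_humility", "integrity"]

def pool_ratings(ratings: dict[str, list[float]]) -> dict[str, float]:
    """Average the ratings collected for each predefined criterion."""
    return {c: round(mean(ratings[c]), 2) for c in CRITERIA if ratings.get(c)}

# Example: three interviewers, each assessing the candidate on a single dimension.
candidate_ratings = {
    "giver_vs_taker": [4.0],          # behavioral evidence of generosity vs. selfishness
    "intellectual_humility": [3.5],
    "integrity": [4.5],
}
print(pool_ratings(candidate_ratings))  # only now is the candidate compared against the criteria
```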
It's interesting to me to listen to you say that because what comes to mind is like
there are organizations that invest incredible amounts of energy, time, money into this.
There are sports organizations, all sports, you know, before they draft somebody.
It's like, what is the person's character?
How well do they recognize the plays?
How well do they... is it intuition versus professionalism on their part, right?
And they have these ways of evaluating that. Special forces, the same thing.
They're investing a lot in sort of determining these recruits.
And why do you think they're so variable?
Like, it's hit or miss, right?
Like they haven't cracked that code, if you will.
I think it's hit or miss for a few reasons.
One is we don't have all the criteria that we need. I've worked with a whole bunch of professional sports teams over the past few years on this
exact problem, and they're at best assessing on, you know, maybe seven or eight attributes
when there might be 200 that are going to drive people's future performance, right? So I think
that's the first problem. The second problem is the measures are extremely noisy. So it's one
thing to say, okay, you know, if I want to draft somebody to play for the Toronto Raptors, right, I can figure out how tall they are. I can figure out how high they can jump. But when it comes to, you know, quickness in diving for a loose ball,
I can't measure that as precisely as I would like. And then I'm also trying to come up with a
score for grit and generosity and humility. Good luck with that, right. They're very, very
intangible factors to measure. And then how you would weigh them would vary based on the
individual candidate, too, I would imagine, and team character and a whole bunch of other
things. Exactly. The aggregation problem is huge. So I actually had this question posed by a sports team last year that was hiring a head coach. And they had used some of my
assessments. And the question was, okay, we have two finalists, do we go with the coach who scored higher in intelligence or the coach who was more of a giver? I don't know what the right
answer to that is. Was this in the NBA? I'm not at liberty to say. But the coach, they ended up
choosing the coach with the higher intelligence score and firing that coach, I think, at the end of the season or shortly thereafter.
And I don't know what the right answer is there, right?
I think there's, you know, there's probably a threshold.
I would not want a coach who's extremely selfish.
I also wouldn't want a coach who's not, you know, reasonably intelligent.
But then when you get into the range of, well, anybody with these attributes could succeed, I don't know how to trade those off.
And then to your other point, well, how are they going to gel with the culture of the team
and with the players involved, right?
Those are all open questions.
And so this is a very messy problem.
Well, let's come out of this problem a little bit, and I'm going to hire you as an advisor.
And I want to ask, how do I encourage my organization, the culture within the organization, so that people feel psychological safety, I guess? That's the core requirement to rethink as an individual: you feel it's not threatening
to your identity.
It's not going to have an impact on your job, your career, nobody's going to hold it over your
head.
How do we build psychological safety within an organization?
So Constantinos Coutifaris and I just finished some studies on this exact topic.
And we started from the premise of saying, look, if you're a leader and you want to build psychological safety,
you want to give people the freedom to take risks and to know they won't be punished if they rethink something or they voice a problem that needs attention,
then what most leaders think they should do is ask for feedback because then the door is opened.
And we did find that CEOs who seek feedback more often had higher psychological safety in their top management teams. But we found that when we went and encouraged managers to go and ask for feedback, it didn't have a lasting effect on psychological safety. And it seems like a couple of things broke down in our follow-up analyses. The first one was sometimes leaders and managers would ask for feedback, and then they didn't like what they heard and they got defensive, which immediately says, nope, guess the door is closed.
The second problem was even when they were open to ideas, sometimes the feedback was irrelevant
or it addressed areas that were outside their span of control.
And so they said, okay, this is not a priority for me or I can't do anything about it.
And that led them to stop asking and it led their employees also to stop giving and to stop speaking up
because it seemed like an exercise in futility.
So even if you took the fear away, that doesn't mean that I can have an impact if I raise an idea
or I challenge my leader to rethink something.
So we got curious about alternative approaches that might have a more lasting effect on psychological safety.
And the one we tried out that worked effectively was instead of just asking for feedback, we actually had leaders criticize themselves out loud.
In some cases, managers would bring in their performance review and they'd say to their team, hey, here's what my boss told me I need to work on.
And I would love your input on whether I'm making progress in these areas.
And not only did CEOs who did that naturally have higher psychological safety in their top management teams,
but when we randomly assigned managers to kind of criticize themselves as opposed to asking for criticism, just inviting them to do that once increased psychological safety in their teams for at least a year,
which is a staggering effect.
And there are a couple of things that happen that are really different from what happens when you just seek feedback.
So one thing that happens when you criticize yourself is you show you can take it.
And it makes people immediately less fearful about challenging you.
The second thing that happened, which I think is in some ways even more interesting,
is it created mutuality.
There's now a dialogue that's going on where I've said,
you know what, Shane, here are all the places where I just, I stink.
And I really need your help in getting better.
And you now not only have the freedom to tell me how I can improve,
but you feel like you can be more vulnerable with me.
There's basically a normalization of vulnerability that happens,
where, you know, once I say, hey, I'm a work in progress,
everybody on my team is more comfortable acknowledging that too.
And it makes it easier, too, for the team to hold me accountable.
So let's say, for example, I have a tendency to talk too much in meetings.
And I come into my team and one day instead of saying, hey, could you give me some feedback on our meetings?
I say, you know, I've realized I have a tendency to not shut up when I should.
I would love you all to help me with this.
Then two meetings later, I'm rambling.
And that gives permission to my team to say,
hey, you know how you said you wanted to talk less?
It hasn't happened yet, right?
So then what we saw is a lot of the teams worked together to create practices for keeping the door open. They'd do a check-in in the first five minutes of every meeting: is there anything anyone can do to improve?
They'd hold a monthly vulnerability meeting in some cases where people would just talk about their development areas and their progress and where they were struggling.
And I think that we could all be more open in criticizing ourselves.
And I'm not saying, you know, that every leader should stand up in front of the thousands of people who work below them and talk about all the things they're bad at.
But I think with the core people that you work with, odds are they know what your weaknesses and shortcomings are anyway.
And if you can own up to them, people are much more likely to help you rethink your ways of fixing those flaws.
And I think, ironically, knowing that people might be a little bit on guard or a little bit suspicious actually led leaders to be more authentic.
And to come in and say, you know what?
The whole reason that I'm going to talk about areas for growth is I want to get better.
And so the more honest I can be about that, the more likely I am to grow.
And so I think it's the responsibility of people in power to open that door and keep it open.
And this step of proving it, saying, here is the stuff people told me I am terrible at.
Here's how I benefited from hearing those things in the past.
That's one of the best ways that you can show that you really mean it.
I think when I ask this question about psychological safety, I actually presume to know what some of those variables are, but maybe you can help.
What are the variables that go into feeling safe psychologically, either at home in a relationship with your spouse or at work?
Are they the same? Are they different?
So in the early research on psychological safety by Amy Edmondson and Bill Kahn and some of their colleagues, the two foundations were really trust and respect.
And I think a lot of people, as Amy has pointed out, get psychological safety wrong.
They think it's about being nice to everyone or being tolerant of everything or having no standards or not holding people accountable.
No, it's none of those things.
Psychological safety is knowing that other people are going to treat you with respect and trusting that if you go out on a limb and say something uncomfortable or challenge a deep-seated belief that you are not going to be punished for that.
You know, the basic ingredients, right? Trust and respect. Those matter in any relationship, right? Whether it's romantic or professional. I mean, one of the most basic mistakes people make on this is they forget that one of the most effective ways to earn trust is to show trust. And I think that's why I want leaders to begin with vulnerability. When you ask somebody for feedback, you are saying, hey, I respect you, right? Shane, I want your input. I want you to tell me what I should rethink in my book about rethinking.
But if I don't criticize myself out loud, then you can't really trust that I'm going to listen to you and that I'm not going to bite your head off or get offended in some way.
I think I'd get a much more honest answer from you if instead of saying, hey, tell me what I should rethink.
I came in and I said, you know what?
I think one of the biggest mistakes I made in writing this book was I really understated the importance of preaching and prosecuting relative to being in scientist mode, which maybe is something we'll talk about. Maybe we won't. But once I say that, like,
hey, you know what, I know this book is not perfect. I poured a lot of energy and time into it,
but I really want to find out how I can evolve my thinking, and you're a great person to help me do
that. Let's talk about the preachers, prosecutors, politicians, and scientists. Yeah, so credit
to Phil Tetlock for bringing this framework onto my radar. Phil wrote this amazing paper almost 20 years
ago now, where he said, look, you know, a lot of research on decision making and judgment
assumes that people are thinking like hyper-rational economists or scientists. And we're not, actually.
We're much more social creatures than that. And as I read this paper, it suddenly dawned on me that
this is a perfect metaphor for me as an organizational psychologist, because we spend an
inordinate amount of time thinking and talking like professions we have never held, like occupations we were never trained in. So think about how much time you spend in your life preaching, right? You've already found the truth and your job is to proselytize it. Prosecuting: you find somebody who you think is wrong and your job is to prove it and win your case or come out ahead in the argument. And politicking, where you think, okay, I've got a base of people who I'm trying to curry favor with,
and so I've got to campaign for their approval and support.
What I started realizing, when I was actually about halfway through writing Think Again, was that it needed an organizing framework, and so much of what I was trying to encourage people to do
was about getting out of the mode of preaching, prosecuting, and politicking,
and into the mode of thinking more like a scientist.
And part of the reason that I wanted to do that is,
I think that the danger of preaching and prosecuting
is that you don't change your mind.
You're right.
Everyone else is wrong.
And so you might be very motivated to get other people to rethink,
but your views are frozen.
They're set in stone.
And politicking is interesting because when we're being political,
we're actually more flexible, right?
We might even flip-flop,
but we're doing it at the wrong times
and we're doing it for the wrong reasons
because we're just doing it to appease our tribe as opposed to doing it to find the truth. And so I think we could all get better at thinking
more like scientists to say, you know what, your views, they're actually just theories, right? You could
kind of make them into hypotheses. And then you could run little experiments in your life to figure
out whether they're true or false. And that should leave you not only more mentally flexible,
but also more likely to change your mind at the right times for the right reasons.
If it's not a law of nature, like effectively, it's just a theory. Exactly. Can you
say that again and repeat it to approximately 8 billion people? Yeah, I wish. In that case,
would you want your identity sort of tied up with your profession a little bit? Because
scientists are known to sort of think and change their mind and look for evidence.
Ooh, that is such an interesting reframing of my stance on identity. I think you might be right.
I see myself as a social scientist, right? And thinking about myself, seeing myself as someone who likes to think
and talk scientifically and who was trained to do that, what that means to me is I value
truth.
I'm more interested in getting the answer right than I am in being right.
You know, that means lots of my opinions are still flexible, right?
I have a set of tools.
So I really like experiments.
I really like doing, you know, carefully constructed longitudinal studies.
And I think those tools have been rigorously tested over centuries, right, as being the most valid and probably independently verifiable,
or at least most difficult to falsify,
techniques for reaching the truth or at least getting closer to it.
And I think as an identity, scientist is helpful
because it reminds me how much we don't know
and how hard it is to arrive at the truth
or an approximation of it.
I want to preface my next sort of question with,
I don't want to talk about politics.
I don't want to talk about sort of liberal Democrat,
Republican, conservative, anything to do with that.
What I want to hit at is it seems that most of our leaders,
we elect them for being strong-minded, clear-sighted.
You know, often charismatic. Why are we drawn to these people
if we know that actually maybe the best elected official
would be the one that gets up and says,
I don't know how to fix this.
I would just hire the best people and listen to them,
but we would never elect that person.
And why do you think that is?
Why are we drawn to this?
I don't know.
I think we have elected that person.
I don't think it happens that often.
But I think that in the U.S., Franklin Delano Roosevelt, that was literally his campaign with the New Deal.
It was a whole campaign to say, you know what, we're going to run a bunch of trial and error experiments and learn from what works.
I think it's hard for that person to get elected, though, because as we face crises and we grapple with uncertainty, we're drawn to people who we feel like are going to figure it out
and get it fixed. And so if somebody hedges too much, if somebody shows too much humility,
I think we mistake that as a sign of ignorance, right? And it's the basic trap that you've
railed against for years, Shane, which is we should stop confusing confidence for competence.
Just because somebody is sure of an opinion does not mean they actually know what they're talking
about. And in fact, anybody who is familiar with the Dunning-Kruger effect will know that
the more sure people are of their opinions, the more hesitant we should be to listen to them,
right? But there's something very intoxicating about following someone who believes they've
already found the way. And I think it, you know, it gives us a sense of coherence. It can give
us a sense of purpose. It's easy to put our trust in people who, you know, who have a clear
vision. Of course, you know, in the long run, those are the people that I worry most about
because they're the ones who are most likely to get too attached to that vision and stick to it long past its time.
And I think it removes uncertainty too.
We would almost rather be wrong and certain than uncertain and land in the correct spot, because it wreaks havoc on us.
At some level, you're right, it's a way of letting other people do your thinking for you.
And this is why political parties have always been such a mystery to me.
when, you know, when people ask me what my politics are, I think for myself, I try to form
independent opinions based on the information that I encounter. And the idea of identifying myself
as a Republican or a Democrat or a liberal or conservative, that's ridiculous to me because it
means I've outsourced my thinking to some group of people that I don't think are thinking
very scientifically. Let's talk about that without using politics, sort of, but like tribes.
Like we fit into these groups. We want a sense of belonging. It's a very human thing to want to fit in with a group, be part of a group, have status within that group and a hierarchy. There's a biological sort of hierarchy need that we have. Even if we're lowest on the totem pole, we sort of like want to know where we stand in this pecking order. And then we assume group identities and group positions. And those are really hard. Talk to me about that. Well, I think the idea that comes to mind right away here is these twin desires that human beings
are constantly grappling with.
We want to fit in, we also want to stand out.
We want belonging.
We also want status.
There's a theory that I love; Marilynn Brewer calls it optimal distinctiveness.
And she says, look, there's a way to fit in and stand out at the same time.
It's by joining unique groups.
Because then you are part of something, and you're not only part of something.
You're part of something that has a clear identity because there are very few other groups like it.
But you also stand out because of the very way that that group has differentiated itself from others.
And so if you can join a group that gives you that sense of optimal distinctiveness,
if you can join a kind of an unusual group or a group with clear, well-defined boundaries,
then you're able to satisfy those motivations simultaneously.
That explains the rise of a lot of movements and a lot of groups,
where people will say, okay, I want to belong somewhere where I also feel like,
you know, I'm not like everyone else. You know, it goes back to, that's how people gain a sense of predictability. It's how they
have a sense of control in their lives. It's how they avoid feeling excluded. And frankly,
the other function it serves is existential. One of the most robust findings in psychology over the
past three decades is that belonging to a unique group actually serves a terror management function
that it helps you avoid threats to your own mortality. Or at least it makes you worry less
about what might happen to you in the future and whether you have a legacy. It connects you to something
larger and more lasting than yourself. And so it's easy to see why people are drawn to these groups.
It's also a little bit scary. So we've talked about sort of the individual, but how do we change
the views of other people, the people we work with, our partners? This is probably the part of the conversation everybody wants, because we're right. How do I change somebody else's views? How do I convince
them that they're wrong and I'm right? Well, it helps to let go of that sense that you're right and say,
look, if this is a one-sided exchange and you're supposed to change your mind, but I get to
stand still, we're probably not going to make a whole lot of progress here.
Unless you are a very good prosecutor or the other person is very happy to be preached to.
One of the more interesting things that psychologists have done when looking at disagreement and
debate is, what if we thought about it as a dance?
There's something about dancing that says, look, we're both going to move.
I think a good argument or a good debate or a good disagreement,
is one where neither of us has really choreographed all the steps we're going to take.
And sometimes we step forward and sometimes we step back.
Other times we sidestep.
But ultimately, we're actually trying to get in rhythm or in sync,
which is a very different goal from trying to change somebody else's mind.
I think, yeah, that's a much better way to frame it.
So what are the steps that we take for this dance then,
not only for sharing our opinion, but also receiving their opinion?
So one of my favorite ways to think about that is a classic study by Neil Rackham of expert
negotiators, where he compared them both pre-negotiation and actually in the live negotiations
to average negotiators to look at what the two groups did differently. And the experts, they spent far more time planning for and then talking about common ground. That was the first takeaway. So a lot of people, when they go to have an argument or win a debate, they think
their job is basically to find the differences quickly so that then they can fix them, right?
That's not at all where I want to start. I want to start by saying, Shane, let's identify
areas that we agree on, which gets us in synchrony, right? And also says, hey, you know what,
this is somebody who shares some of my values or maybe some of my views. That makes the conversation
non-defensive and collaborative right from the start. Then a second difference that jumped out
pretty clearly is the experts asked a lot more questions than the average negotiators. And they
weren't leading questions. They were questions motivated by genuine curiosity. Let's talk for a
second about how versus why questions. So a lot of times when you discover that somebody has a
different view from you, you naturally will say, well, why do you think that? And the problem is
you're setting the other person up for confirmation bias. You're giving them an invitation to make
a list of compelling reasons that they themselves generated for why they're going to cling to
their preexisting convictions, right? So you're doing some of their work for them. What tend to be
more effective are how questions, where instead of asking why do you believe this, you ask how would you implement this, or how would that idea work if we were to come to some agreement on it. And that cultivates intellectual humility, right? The term for it in psychology is the illusion of explanatory depth, and the idea is that people think they understand things much more than they actually do. And if you ask them to explain how, right, one version of that is just, how does this work?
Another version of that question is, how would you explain that to an expert or, you know, how would you implement that in the real world?
They suddenly realized, gosh, I don't really know what I'm talking about.
And this has recently been demonstrated for policy questions where, you know, if you've got somebody who disagrees with you on, let's say, climate change or on tax laws, right?
Instead of asking them why they believe what they believe, if you were to just ask them, well, how would you implement that tax law and what are all the effects it would have?
or how would you, you know, how would you address this climate problem that I know there are a range of complicated solutions to?
As they try to answer that, they realize how little they know, they become less polarized, they're much more open to hearing alternative views, and the hope is that you are too.
One of your pet peeves is feigned knowledge. Talk to me about how you deal with this, because you must see it all the time, not only as a professor, but also posturing in organizations.
It eats at the core of my soul when someone claims to know something that they don't.
And I have not always dealt with this well.
So the story that comes to mind, I was called by an investment bank some years ago.
And they asked me to figure out how to motivate and retain their junior analysts and associates.
So I did two months of research.
I had experiments.
I had longitudinal survey data, interviews, observations.
I had lots of outside research as well as internal data, and I came back with 26 evidence-based
recommendations. And I was presenting them. I think it was actually my first ever video conference
I'd done. This was at least seven, maybe eight years ago. And I'm presenting to the co-heads
of investment banking. They're on multiple continents. And I think I was on about recommendation
five or six. And one of the co-heads interrupts me and he says, well, why don't we just pay them more?
And if there is one recommendation that was not on my list of 26,
it was to solve the problem with money
because they had already thrown a lot of money at the problem.
These people were already well paid,
by which I mean overpaid.
And if money were going to attract them and retain them,
it already would have.
And I'm embarrassed to say that my response at the time was
I've never seen a group of smart people act so dumb.
That did not, obviously, accomplish a whole lot for me, right? Although they did tell me they got a kick out of it later because people
don't normally talk to them that way. But of the mental modes other than scientist, the one that I spend the most time in is prosecutor mode. And I've been accused of being a logic bully from
time to time. And I decided after that that I wanted to be in scientist mode if somebody
had feigned knowledge, right? If somebody claimed to know something that they didn't. And what would a scientist do? A scientist would be just riveted. Like, who is this person? How could they
possibly believe these things? And what is it that would possibly change their mind? And so I wrote
a little script for how I want to respond whenever somebody challenges my evidence or claims to know
something that I think is false, which is just to get really curious and say, what evidence would
change your mind? And I don't always remember to ask the question. I don't always get curious enough
to really want to know, but in the situations where I've pulled it off, it has completely changed
the tone of the conversation. And often what will happen is the person will map out the kind
of study they would find convincing. And now we're on my turf, right, because I have a mental
library of studies that I can then cite, and they're helping me figure out which kinds of data
they would find compelling. It also, I find it really helpful because it refocuses the conversation
in the realm of evidence, as opposed to just opinion. Right. And so when I ask them to tell me what a well-designed study is going to look like, we can then agree on what
valid methods are. And once I tell them what the findings are using their chosen methods,
it's a lot harder for them to just have a knee-jerk objection. So I decided after that
experience that you can lead a horse to water, but you can't make it think. The hope is I
learned something from that conversation. And now we're arguing to learn as opposed to arguing to
win. Talk to me about this logic bully thing. Well, I had a student some years ago now,
her name is Jamie, and she was trying to make a big career decision about whether to do an MBA and
if so, what school she should go to. I think she had been accepted at two schools. I just said,
Jamie, look, I'm not saying you shouldn't do an MBA, but let me give you all the reasons why I think
it might be a waste of time and money. You already have an undergrad business degree. You don't need
an MBA for any job that you want, and okay, maybe there's some firm that will tell you you need it to get
promoted, in which case I would say, you probably shouldn't work at that firm because it's not
like having an MD or a JD, right? There's not a codified body of knowledge that you need to run a
business. And I just wanted her to think through carefully. Was this a good use of time and money?
And if you could take two years and a quarter million dollars, is there a better investment
of that, given your goals and your life circumstances? And so I didn't have a stake in the
outcome, but I argued the other side as I often do. I guess she activated my prosecutor mode.
And she just, she came back
and she said, you're a logic bully.
I was like,
a what?
And she said a logic bully.
And she went on to tell me that I had overwhelmed
her with rational arguments.
And she didn't agree, but she couldn't fight back.
And my first reaction, Shane, was, yes.
Because I thought that was my job as a social scientist, right?
I want to come with airtight logic and rigorous data and make the most compelling case I can. But what I was depriving her of was the opportunity to own her own choice and to reason through this for herself. And so I've tried to get out of logic bully mode. And that same conversation I've had with a number of students now. And instead of listing all the reasons, I would say, all right, Jamie, can you tell me the pros and cons of doing an MBA versus not? Why are you excited about it? What are the risks? And then I go the extra step and say, and why are you coming to me? Are you here because you want my advice? Are you here because you're looking for my validation of a decision you've already made?
Or are you here because you want me to challenge your thought process?
And once I know what her goals are, it's a lot easier for me to invite her to rethink some of her assumptions
without insulting her or causing her decision process to go haywire,
or without making such strong arguments that she thinks I'm trying to give her an answer
when I'm really just trying to test her thought process.
I love that term logic bully.
And, you know, you think initially you're like, oh, this is so helpful. And then at the end, you're kind of like, oh, that wasn't what I thought it was going to be. I'm curious, how do you institute rethinking or how do you help kids, not only students, but your own kids rethink? Like, what are parents supposed to do at a young age for elementary school kids or high school kids and university students in terms of opening their mind to different possibilities and what sort of things can we do?
One of my favorite things that I learned while I was writing the book came from
Wisconsin's middle school teacher of the year, Erin McCarthy. What Erin does with her students is she gives them a section of a history textbook, and she sends them out to rewrite it.
And what they do is they look at primary sources, they interview people, and they realize
how much information is missing from the way that we've narrated past events. And what that does
is it allows them to think a little bit more
when they encounter new information, like fact checkers.
Where instead of, you know, I read something in the news
or I heard it on TV, it must be true,
to say, well, what are the sources of that information?
And how do we really know?
And I think we should all be lucky enough to have a project where we get to rewrite
a section of a textbook.
I thought that was a brilliant assignment.
A variation on that that we do occasionally at dinner
is Allison and I with our kids will have a myth-busting discussion.
And I think originally it came about because our kids had learned interesting things at school
that surprised us.
Like one day, one of our daughters came home and said, we were just doing an Egypt unit.
This was in elementary school.
And I found out that King Tut probably did not die in a chariot accident.
I was like, oh, that's so cool.
That's not what I learned.
What else are you learning in school that is different from what we thought was true at the time?
And so it became sort of an occasional tradition for us to say, okay, who's going to bring a myth
or a fun, surprising fact to the table? And what I want to do in these, whenever we have these
conversations is, I want our kids to experience the joy of being wrong, to say it is such a delight
to discover that something you thought was true was actually false, because now you know you've
learned something. I like that a lot. I'm still grappling with Pluto not being a planet,
so I'm a little sad about this. You and me both, Shane.
I think those kinds of moments, right, if you observe your own emotional reaction in them,
why was I so upset when I found out that Pluto was not a planet?
It was just declassifying the name of an object.
The object hasn't changed, right?
It's still floating out there.
It's still somewhere in the same position we thought it was.
It doesn't affect my life in any material way.
Why does this bother me so much?
It's because I like to have a set of beliefs about the world that I can count on.
And, you know, when it comes to my understanding of the solar system, it's like, I have a tower made out of Jenga blocks, and somebody just pulled out the wrong block, and now all of a sudden the whole tower is coming crashing down.
And I want a much more solid and sturdy foundation for the things I believe.
What else do you do with your kids?
One of the other things that we realized a couple years ago we weren't doing enough of, when it comes to teaching values, was talking about how we had sometimes failed to live our values.
So, you know, Allison would often roll her eyes when I would talk to our kids about being givers, not takers.
It's one thing to, you know, to make that case.
It's another thing to say, you know what, here's the time when I was not kind to somebody in my class who was being bullied.
And I regret that, right?
That was me failing to be a giver.
When I tell those stories, right, when I share mistakes I've made, when I talk about embarrassing decisions I've made,
what I'm trying to do is I'm trying to signal to our kids that it's okay to be wrong
and it's okay to rethink the choices we've made. That's, to me, one of the functions of regret, right? I think of most negative emotions as teachable moments, and the whole point of experiencing them is you're supposed to learn something from what you did wrong so you can make it right in the future. And so much of regret is saying, okay, I did something that led to an undesirable outcome or that violated one of my values.
And so how do I do this differently moving forward?
Instead of trying to deny these emotions,
let's actually listen to them and figure out what the lesson is in them.
You mentioned rewriting the history section.
It was one of your favorite things that you learned.
What's another favorite thing you learned when you were writing the book?
I mean, one of the things I just never thought about before was binary bias.
So I knew I wanted to write a chapter about having charged conversations and how people could talk about the most divisive issues
that many of us have just been shying away from
because it feels hopeless.
And I really came out rethinking my view
that what we need to do is better understand the other side,
which has been so much the political narrative.
I think actually that's part of the problem,
not the solution,
because there is no charged issue
that's ever simple enough to have only two sides.
And the research by Peter Coleman and his colleagues on this
really opened my eyes to the fact that we want to complexify, to turn two categories into a spectrum. Anytime you run into a binary, like, you know, liberals and conservatives, you should picture a whole spectrum of beliefs there and say, well, you know, there are relatively conservative and relatively liberal members of each of those groups. And if you break down the
multiple issues, very few people agree with their party on all 16 or 17 of the major issues
equally. And so if I can see the nuance there and I can start to get people to think about,
huh, where in this very nuanced spectrum do I fall, then I see more shades of gray and less
black and white in my beliefs. And so what I've been doing a lot now is catching myself in
binary bias, right? I'm so glad in retrospect that Give and Take had three categories rather
than two, because if there were only givers and takers, I would have missed the matchers,
right, the people trading favors who actually are the most common at work. And I think that
there are so many examples of this. I mean, it's relevant to any part of life, but it's something that runs across almost every project I've done in psychology, right?
The best leaders are not the introverts or the extroverts on average.
They're the ambiverts who fall somewhere in the middle of that spectrum
and are comfortable flexing and talking and listening.
The most creative people are not the precrastinators like me who dive right in,
or the procrastinators who wait till the last minute.
They're the people who are quick to start but slow to finish somewhere in the middle of that spectrum.
And I think any spectrum that you draw, you can almost always find an advantage for being somewhere in the gray. And I think we should all do that more often.
That's flexibility and adaptability right there, right? Bingo.
One of the stories from the book that I liked was Daniel Kahneman teaching you about how to
respond to a surprise. Can you tell us that story? I went to give a speech at a conference,
and I didn't know a lot about who was going to be in the audience. And I get up on stage and
sitting in the audience is Danny Kahneman, Nobel Prize winning psychologist, one of the giants of
our field. Afterward, I ran into him. And he said something like, that was wonderful. I was
wrong. And those two things don't normally go hand in hand, right? Either a talk was wonderful because
you were right, or it was bad because you were wrong. My first reaction was to say,
holy cow, Danny Kahneman liked something I said. My second reaction was to say, this is so strange
that he thought it was wonderful to have been wrong.
What's behind this?
And so I ended up following up with him, and I asked him why. His face literally lit up when he talked about how exhilarating it was to be wrong.
And he said, it's the only way I know I've learned something.
If I find out I was wrong,
it means I am now less wrong than I was before.
And he talked very eloquently about the value of detaching, like we talked about earlier, your opinions and your ideas from your identity, to say, look, every idea I have is just a hypothesis. It might be true, might be false. And if I have a vested interest in it being true, then I'm not going to discover as much as if I really want to find out whether it's true.
I think Bezos had something along these lines too, right, where he's like,
people who don't change their mind are wrong. So if you want to be right, you have to be
somebody who changes your mind a lot. Yeah, I think that it's been one of Amazon's persistent competitive
advantages. Love them or hate them, right? They are extremely adaptable. And I've thought about this
in terms of another two-by-two. I met Jeff Bezos a few years ago, and I asked him how he goes about
making hard decisions. And he said, okay, basically two questions. One is, is this decision reversible?
And the other is, how consequential is it? How high are the stakes? And he's willing to act very quickly
anytime a decision is reversible because he can change his mind tomorrow or low stakes because it doesn't
really matter. The decisions he puts off until the last possible minute, maybe even procrastinates
on, are the ones that are both irreversible and consequential because those are the ones that are not
just gambles. They are not just experiments. Those are real commitments. And I think this is as
relevant to personal decisions as it is to running a company, right, to say, okay, before I go
into a major decision, if it is both irreversible and highly consequential, then I want to spend a lot
of time rethinking my own views up front. Whereas if it's pretty easy to reverse and undo, or it doesn't really matter, I'm going to go forward and make the decision knowing I'll have the time and the opportunity later to second-guess it.
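To make that two-by-two concrete, here is a minimal sketch of the triage rule Adam describes: act quickly when a decision is reversible or low stakes, and slow down to rethink only when it is both irreversible and consequential. This is not from the conversation; the function name and return strings are illustrative.

```python
# Minimal sketch of the reversible/consequential two-by-two described above.
# The function name and return strings are illustrative, not Amazon's actual process.

def triage_decision(reversible: bool, consequential: bool) -> str:
    """Suggest a pace for a decision based on the two questions Bezos asks."""
    if reversible or not consequential:
        # Reversible (you can change your mind tomorrow) or low stakes (it barely matters):
        # act quickly and treat it as an experiment.
        return "act quickly"
    # Irreversible AND consequential: a real commitment, so rethink it up front.
    return "slow down and rethink"

# Example usage
print(triage_decision(reversible=True, consequential=True))    # act quickly
print(triage_decision(reversible=False, consequential=True))   # slow down and rethink
```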
But it's not enough to just sort of convince yourself you have the opportunity. You have to be of the mindset that that's almost what you're looking to do. Did you learn anything else from Jeff or Amazon about how they make decisions? Because I think they're one of the best large organizations we've ever seen
collectively making decisions. Yeah, I think one of the other smart things they do is they really
take this idea of process accountability seriously. So if you ever go to a senior leadership meeting
at Amazon, you'll see this sort of awkward experience of people sitting silently reading a memo
for 15, 20, 30 minutes.
And the reason they do that is they want everybody thinking, with careful, focused attention, on questions like: what are the alternatives around this big decision we have to make?
And is the problem framed the right way?
What's the long-term impact of this decision?
It is so rare. We all live in a world where we're too busy and have too many distractions coming our way constantly.
How unusual is it to sit down with a group of thoughtful people and read and reflect on a common
document and then say, okay, what assumptions should we be rethinking here? And I would be thrilled
to see more leaders adopt that practice. I think it's brilliant because it doesn't assume people
have read it before they come in. It gives people time to read it and discuss it. One of the things
I noticed when I was sort of running meetings like that is that people would come in and they
would have read sort of like the first couple of paragraphs because they didn't have time, which is not really their fault in some ways. And then they would signal that they'd read the document by rephrasing some of those first paragraphs. And everybody would just do the same thing. And I'm like, you're all just saying the same thing. Like, we want to assume this is shared knowledge, but then you can't actually assume that people have read it. So I like the approach of sort of distilling it down
to something digestible, giving people time to get on the same page, allowing the time in the
meeting. Most meetings don't need to be as long as they are anyway. And then everybody's
talking from a common base. Yeah, I think it's much more likely, if you adopt that practice, that people are actually learning. It's also, I can't think of a better
time to do it than during a pandemic where we're all stretched for time. And now, if I know at least the first 10 or 15 minutes of the meeting are going to be processing time, that's one thing I've taken off my to-do list, right? I can actually do it during the meeting as opposed to trying to
squeeze it in the night before. And I like how it's not PowerPoint and it's actually like thinking
instead of just point form. We could use more of that.
There is a part of me that wonders whether, you know, especially again for a high-stakes,
irreversible decision, do you want to send that out in advance?
Encourage people to read it at least a first or second time and then really digest it again
when you come together.
Because, and this goes back to that second-thoughts post that you wrote, I'm not sure that our first thoughts are always our most insightful observations, right?
They're usually the easiest ones to think of.
And I worry a lot that we spend too much time listening to people who think fast and shallow, and not enough time hearing people who think slow and deep.
And I think we want to become the people who think slow and deep, because that's where most
of our good rethinking happens.
I want to end with something people say a lot in organizations, because I think you'll have a good response to it. People say, I'm entitled to my opinion.
And, you know, that's their way of ending the conversation and saying, you know, you can't change my mind on something.
How do you respond to that?
Yeah, you're entitled to your own opinion if you keep your opinion to yourself. If you decide to say it out loud, then I think you have a responsibility to be open to changing your mind in the face of better logic or stronger data. And so I think if you're willing to voice an opinion, you should also be willing to change that opinion. I think that's, of course, easier said than done, but it goes right back to something we talked about earlier, which is to say, okay, when would I change my mind? And if you can't answer that question, you are no longer thinking like a scientist; you've gone into preacher or prosecutor mode.
That's a perfect place to end this conversation. Thank you so much, Adam.
No, we can't end it. You haven't told me what I should rethink. I'm going to push for this.
Hey, one more thing before we say goodbye. The Knowledge Project is produced by the team at Farnam Street. I want to make this the best podcast you listen to, and I'd
love to get your feedback. If you have comments, ideas for future shows, or topics, or just
feedback in general, you can email me at shane at fs.blog or follow me on Twitter at Shane A. Parrish.
You can learn more about the show and find past episodes at fs.blog slash podcast.
If you want a transcript of this episode, go to fs.blog slash tribe and join our learning
community. If you found this episode valuable, share it online with the hashtag the knowledge
project or leave a review.
Until the next episode.