Modern Wisdom - #980 - David Pinsof - This is Your Brain on Bullsh*t
Episode Date: August 14, 2025
David Pinsof is a research scientist at UCLA, co-creator of Cards Against Humanity, and an author. Everything is bullshit. Your opinions, your arguments, even your thoughts. Most of it's manufactured, borrowed, or absorbed without question. So if all that's fake, what's real? And if we can't trust our own minds, or anyone else's, what can we trust? Expect to learn how we can use incentives more efficiently and how to look at incentives more accurately, if overthinking and worrying is complete bullshit, why we have opinions and if our preferences are just even more bullshit, why arguing is bullshit, why most arguments are actually pseudo-arguments, why so much advice is mostly bullshit and why we take it and why we give it, and much more…
Sponsors:
See me on tour in America: https://chriswilliamson.live
See discounts for all the products I use and recommend: https://chriswillx.com/deals
Get the brand new Whoop 5.0 and your first month for free at https://join.whoop.com/modernwisdom
Get the best bloodwork analysis in America at https://functionhealth.com/modernwisdom
Sign up for a one-dollar-per-month trial period from Shopify at https://shopify.com/modernwisdom
Timestamps:
(0:00) Is Happiness Bulls**t?
(7:48) Incentives are Key to Human Behaviour
(12:33) Why Do We Have Opinions?
(19:36) Exposing the Status Game
(35:08) Are Opinions a Way to Test Loyalty?
(40:50) How Does Arguing Relate to Opinions?
(46:44) What's the Difference Between an Argument and a Pseudo-Argument?
(52:43) What is a Deepity?
(01:01:14) The Differences Between Vague Bulls**t and Deep Bulls**t
(01:08:18) Find Out More About David
Extra Stuff:
Get my free reading list of 100 books to read before you die: https://chriswillx.com/books
Try my productivity energy drink Neutonic: https://neutonic.com/modernwisdom
Episodes You Might Enjoy:
#577 - David Goggins - This Is How To Master Your Life: https://tinyurl.com/43hv6y59
#712 - Dr Jordan Peterson - How To Destroy Your Negative Beliefs: https://tinyurl.com/2rtz7avf
#700 - Dr Andrew Huberman - The Secret Tools To Hack Your Brain: https://tinyurl.com/3ccn5vkp
Get In Touch:
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
YouTube: https://www.youtube.com/modernwisdompodcast
Email: https://chriswillx.com/contact
Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
A desire for happiness is not what is driving our behavior.
It is a terrible way to predict our behavior.
It is a naive way of thinking about human psychology that will lead you into a morass of confusion, contradiction, and infinite regress.
Why?
Well, I wrote a very long post on this called Happiness is Bullshit.
And then I wrote a sequel to that post called Happiness Really Is Bullshit.
No, I'm just joking.
It was Happiness Is Bullshit, Revisited.
People really get hung up on this.
I think one of the biggest confusions we have about how the mind works is that we have this really misguided idea that what we're pursuing in life is inside of our heads.
That is a really weird and implausible idea from an evolutionary perspective; the idea that we would be animals driven to seek stuff inside of our heads makes no sense.
It makes way more sense that we would be driven to seek stuff out there in the world, you know, like food, sex, status, praise, inclusion in groups, all this stuff that would have correlated with biological fitness in ancestral environments,
these are the sorts of things that would make sense for a primate like us to want. It makes no sense
for us to want something inside of our heads. Now, the common response to this argument is that,
oh, well, happiness sort of motivates us to go out and get what we want. It's sort of like the
carrot that's dangling in front of us. And we need happiness to motivate us to get out, to go out and
get the stuff in the world, right? This view also makes no sense because as soon as you posit that
we need happiness to motivate us, well, there's an awkward follow-up question, which is, how does
evolution get us to want happiness? If you need happiness to get us to want stuff, then how
does evolution get us to want happiness? Does it have to give us happiness when we get happiness?
And then happiness when we get happiness when we get happiness? Like I said, we have entered an
infinite regress. This whole way of thinking that we need some internal goody to motivate us is
just wildly implausible. It contradicts a wealth of research in the social and behavioral sciences
and neuroscience. We don't need happiness to be motivated. Our nervous system is directly wired up
to our hearts, to our physiology, to our lungs, to our muscles. It can just motivate us to go
out and get the stuff directly. We don't need happiness to motivate us any more than a thermostat
needs to feel happy when it gets your home to the right temperature. We have thermostats inside
our bodies that motivate us to shiver when we're cold and sweat when we're hot and seek
blankets when we're cold and to seek shade when we're hot. Those inner thermostats don't need
happiness any more than the thermostats in our home need happiness. So there are many
reasons why viewing human behavior as a pursuit of happiness is misguided. I go through lots of
them in my post. But yeah, I think this is just a wildly confused way of thinking about human
psychology, about the mind. I think it is way more insightful to think about humanity as
striving for real things in the world that would have correlated with biological fitness in ancestral environments.
Do you completely discard subjective internal states of well-being and our relationship to them, then?
No, I think those exist, but they're not what we think they are.
So what we think they are is just a carrot dangling in front of us that motivates us.
What they actually are are mechanisms.
Happiness is a mechanism that evolved by natural selection to serve a very specific function.
And as we discussed, that function cannot be to motivate us because motivation doesn't need
happiness, right?
It serves a different function.
And what is that function? Well, it's to recalibrate our expectations and motivations when something
turns out to be better than we expected it to be. The sex is better than you thought it would be. The
ice cream is better than you thought it would be. You tried cooking a Spanish paella and you thought
it was going to suck and it ends up being amazing. When mistakes like that get made, your brain has to
do a lot of recalibrating. If I cooked this amazing paella, I need to update my expectations about how good
my cooking ability is, about my ability to cook Spanish cuisine, about the quality of paella,
and it needs to reorient my motivations so that I'm more motivated to cook Spanish cuisine in the
future. My brain has a lot of work to do when I'm wrong about reality in a positive way.
And all that work that my brain is doing, I think, is what we call happiness.
But if you understand happiness as a mechanism for recalibrating your brain in the wake of a
prediction error, well, then it makes no sense to say that we want happiness. In fact, it makes more
sense to say that we're chasing happiness away, because the more you get the thing you want,
the lower those prediction errors become, the more expected the good thing becomes, and the less
happy you feel when you get it. So many of us are familiar with this idea of habituation. You get a new
car, and it's awesome the first few times you drive it, but then eventually you get used to it and it no longer makes you happy. You get a new girlfriend or boyfriend, they're amazing at first, and then eventually, you know, things get a little more
expected, a little more boring. This happens pretty much across the board with regard to any good
things in our life. The more we get exposed to them, the lower those prediction errors become and the
less happy those things make us. But we often commit to girlfriends or boyfriends. We often get
married. We often drive a car for a very long time if we purchase a car. So just because the thing no longer
makes us as happy as it used to doesn't mean that we no longer want it.
We can still want things even if they don't make us happy.
And so I think we need to separate motivation from happiness and realize that we are often
motivated to get lots of things that don't really make us happy.
And in fact, the more we pursue those things, the less happy those things ultimately make us,
which suggests that we're not pursuing happiness itself.
We're pursuing those things out there in the world.
Does this suggest that intervening into our expectations is a more direct route to improving subjective well-being than trying to chase happiness?
Yes. If we actually did want happiness, then the way to get it would be to make our expectations about reality more wrong.
So one way to do that is by doing drugs, to sort of scramble your brain and make your expectations about reality more wrong.
And in some cases, drugs can offer you a kind of euphoria
because everything is just so wildly surprising.
But when you think about what drug addiction is,
it becomes even more clear that it's not the high
that the drug addict is chasing,
but the drug itself.
Because what happens when you're addicted to something
is the more you take the drug,
the less happy the drug makes you.
Your high becomes less intense as you become more tolerant to the drug. But an addict will often crave the drug really intensely despite the fact that it no longer makes them feel good. And that's what happens as an
addiction progresses. The drug makes you feel less and less good and you end up wanting the
drug more and more, which suggests that the addict wants the drug itself, not the high. So yes,
you're right. If we actually wanted happiness, then the best way to get it would be to make our
expectations wrong. But I don't think we actually do want happiness, and I don't think people are
going to be motivated to go out and screw up their expectations because that's ultimately not what
they want. It's also going to make you presumably less effective at getting, at being successful,
because part of being successful is being able to accurately predict what's going to happen,
which means that you need to have a good model of the world, which means that your expectations
aren't actually diverged from, diverged from all that frequently. So if, how much,
happiness isn't a particularly good way to work out what's driving our behavior. Are incentives a better North Star?
Yes, absolutely. I have a post called Incentives Are Everything, where I argue that basically incentives are everything.
We usually think of incentives in terms of monetary incentives, like getting a salary increase or getting a fine.
We sometimes think about them in terms of legal penalties. Occasionally, people talk about social incentives like praise or esteem.
I like to think of incentives in the most broad way possible.
I think incentives are anything that we as human primates evolved to want and seek out in the world.
So incentives include status, belonging to a cohesive group, sex, food, you name it.
Comfort, homeostasis, anything we evolved to want is an incentive and can be used to incentivize us.
And what an incentive structure is, is where those incentives are situated across time and space.
And so I think a helpful way of thinking about human behavior is in terms of the incentive structures that we inhabit and what we do to get the various incentives in our environment.
I think that is a way, a more insightful way of looking at human behavior and human culture than this idea that we're pursuing happiness or inner states or inner well-being.
Hmm. Yeah, just follow the incentives. Where do incentives come from, then? Are we a blank slate with regard to them? Do we have any ability to interject into them?
So this is a great question. I think we need to make a distinction between what we want as an end and what we want as a means to an end. So for example, I don't think we want money as an end in itself. We want money because we can use money to buy all sorts of things we want, like food, sex, comfort, housing, whatever. But if all currency collapsed tomorrow and nobody
was accepting money in exchange for goods and services, we would stop wanting money, right? So our desire
for money is conditional on its ability to get us what we want, which means that we want money
as a means to an end and not as an end in itself. So what we want as a means to an end can be
shaped by our environments, by our culture, by the opportunities available to us where we grow up.
But what we want as an end, I don't think can be changed. I think that comes from our evolutionary
history. That comes from biology. And I see very limited ability for us to change or intervene on
the things that we deeply want as animals. I don't think I'm ever going to get rid of my desire for
food or oxygen. This is just non-negotiable. So I think our deepest desires ultimately come
from evolution and are non-negotiable. Would you be able to talk about proximate incentives and
ultimate incentives kind of in this way? You could think of it in those terms. So in evolutionary
psychology, people make a distinction between the proximate level of analysis and the ultimate
level of analysis. What the proximate level of analysis means is just you're thinking about
how the system works in nuts and bolts terms. So how does happiness work? What does it take as
input? What does it produce as output? How does it process information? These are all proximate
level questions. When you go to the ultimate level of analysis, you are thinking about the
evolutionary history of that mechanism and why it exists at all. What purpose is it serving? How did it
increase biological fitness in ancestral environments to the point where it became a reliably developing part of the human phenotype? So ultimate questions are about function and
proximate questions are about structure. So your question is, can we think about incentives
in terms of proximate and ultimate terms? Yeah, I think we can. We can think about the function
of our motivational systems, why they evolved, why we want certain things in the world. And then we can
think about the proximate details of our motivations, you know, what our constraints are and
limitations are in terms of what we can perceive, what we can learn about, what we can remember,
what the available opportunities and constraints are in our environment. These are more
proximate level questions.
Are our opinions just broadcasts of the reasons for our incentives, then? Is that us just getting a megaphone in front of our face and saying, these are my incentives and this is why you should listen to them? Why do we have opinions at all?
Hmm. Great question. So I think opinions are really puzzling. As far as I'm aware, nobody has provided a convincing definition of what an opinion is. And so I have a post called Opinions Are Bullshit where I try to really drill down to what an opinion is. I mean, it's not as obvious as you think it is. People might think, oh, it's just my preference. You know, I like spicy food, or I like cilantro, or I don't like cilantro. But that can't be an opinion,
because that's not how we talk about our opinions.
If I like cilantro, I can just say I like cilantro.
I wouldn't say, like, it's my strongly held opinion that I like cilantro.
That sounds weird.
It's like one level removed from what it is.
Your liking or non-liking of cilantro is an axiomatic fact to you.
It's not an opinion that you hold about yourself in regards to cilantro.
Exactly.
We already know how to talk about our preferences.
We say we like this or that.
We don't care for this or that.
We never say it's my strongly held opinion that I like this. That just doesn't make sense, right? So opinions cannot just be preferences. They have to be something more than that. And then you might say, well, well, maybe it's your point of view, your perspective on something. Maybe, you know, some people see the glass half full. Some people see it half empty. Maybe it's my opinion that I see the glass half full and it's your opinion that you see it half empty. Well, that can't be right either because your perspective is just another kind of preference, right? I,
prefer to see the glass as half full. You prefer to see it half empty. I prefer to focus on the bigger
picture. You prefer to focus on the details. These are ultimately preferences. So that gets us back to
the question of how we differentiate opinions from preferences. So perspectives cannot be opinions.
So what the hell are these things? You might think, oh, maybe they're beliefs. They're just
what we think is true. Well, no, they can't be beliefs because if you believe something and you're
right, well, then that's just a fact, right? I believe that Paris is the capital of France. That's true. That's correct. That's a fact. But if I believe that, you know, Dallas is the capital of France, then that's a mistake. That's not an opinion. I'm just wrong about reality. So we have facts. We have mistakes. We have preferences. We have perspectives. None of these things can be opinions. So what the fuck are opinions?
Please, go on.
I really hope you're not asking me, because all of my ideas have just been shot down.
Yeah, what do you think? What do you think, Chris?
I don't know. I mean, look, before I read your very castigating post, if I'd known that I was speaking to you, I would have said something like: kind of a campaign, kind of a justification, a justification for a goal that we're trying to pursue.
Hmm. Okay. I like that. That's actually pretty close to my definition of an opinion.
So you're very insightful, Chris. You're very close to the way I'm thinking about this.
So I think of opinions as almost preferences. They include preferences. What an opinion is, is a preference plus a set of judgments you make about the people who share your preferences and about the people who don't share your preferences.
So let's say I like McDonald's.
That's a preference.
But now let's say that I have all sorts of positive judgments about the people who like McDonald's.
Maybe I think they're authentic, real, sincere, gritty people who know what tastes good in their mouth and they don't care about virtue signaling and anti-capitalism.
They're just real, authentic, cool people who like McDonald's and they're honest, they're blunt.
And let's say I have a bunch of negative opinions about the people who shit on McDonald's.
Maybe I think they're just, you know, whiny virtue signalers.
They're just dishonest.
They know that it tastes amazing, but they're just pretending that it doesn't so that they can, you know, look good.
So if I have all of those judgments about the people who do and don't like McDonald's, well, then all of a sudden my preference gets transformed into an opinion, right?
So an opinion is a preference plus all of those social judgments I make about the people who have or don't have that preference.
So your angle is correct in that ultimately what I'm trying to do when I share my opinion is I'm trying to make the people who share my preferences look superior to the people who don't.
I'm trying to make the people who like McDonald's gain status over the people who don't.
And ultimately what I'm trying to do is I'm trying to change social norms such that people are praised or at least get seen in a positive light when they eat at McDonald's.
and people get condemned or dissed or seen in a negative light when they don't eat McDonald's.
And what that is is just a social norm.
That's another way of saying that there is a social norm to eat McDonald's.
So what I think we do, what I think we're doing when we share opinions is we are fighting over social norms.
They are battles over what social norms are going to prevail in our culture.
And of course, we all have a self-interested stake in trying to shape social norms in our favor in ways that benefit us,
in ways that inflate our status and lower the status of our rivals.
So I see the space of opinion sharing and the space of opinion criticism as a battleground in the fight over social norms.
Yeah, it's kind of the vanguard of social norms and you're trying to campaign for your particular side.
Presumably that means that most people's opinions are either obviously self-serving or, at a second, third, fourth order effect, self-serving in some way that comes back around to make them look good, it just takes a little bit more time for people to realize. I have to assume that there's an injection of, like, self-pedestalization, and you are going to capture a lot of the upside from this opinion being held.
Yes, exactly. I think opinions are ultimately self-interested
status-seeking tactics, but there's an interesting paradoxical element there, in that we cannot
reveal that our opinions are self-interested status-seeking tactics, because revealing that
would lower our status and be against our interests, because there is a weird thing about
human psychology where being seen as a status seeker actually lowers your status. So in order
for us to seek status, we have to do it covertly. We have to cover it up. We have to make it
seem like we're not actually pursuing status. We're pursuing some other high-minded thing. Maybe
we're pursuing happiness, or maybe we're pursuing authenticity or self-actualization, truth,
or making the world a better place. These are the sacred values that we use to cover up
and explain our status seeking. This episode is brought to you by Whoop. Your body is constantly
sending you signals, but without real data, it's easy to overtrain, under-recover, and miss your best performance, which is where Whoop's brand new 5.0 comes in. It is the newest version of the wearable I've trusted for like 2,000 nights now, giving you everything you need: 24-7 tracking of your heart rate, your sleep, your recovery, your workouts, and more, all translated into clear, personalized, simple data. And now it's 7% smaller. It's got 14 days of battery life, Healthspan to track your pace of aging, and hormonal insights for women who want smarter support during their cycle and pregnancy, and all of that stuff. Basically, it is everything that was awesome about Whoop, plus tons of new tools to help you optimize your health and performance. Right now, you can get the brand new Whoop 5.0 by going to the link in the description below or heading to join.whoop.com/modernwisdom. That's join.whoop.com/modernwisdom.
Can you give a couple of examples of how opinions masquerade as the pursuit of something higher but are just thinly veiled status seeking?
Sure. So here's an example. It is currently a social norm in many intellectually well-educated and literate cultures to praise Shakespeare.
So Shakespeare is seen as a genius, a brilliant and prophetic writer, prescient.
And if you're in a highly literate subculture, you might say, oh, yes, I love Shakespeare.
You might quote Hamlet or Macbeth.
You might see these as revealing deep and profound truths about the human condition that more recent works of fiction cannot reveal.
What's happening is that if I have read Shakespeare,
and can quote Shakespeare, I get status. And if I haven't read any Shakespeare and don't know shit about Shakespeare, I lose status in that subculture. What that means is that liking Shakespeare or pretending to like Shakespeare is a social norm. Now, who does that social norm benefit? It benefits the people who have read Shakespeare, who can read Shakespeare, who are smarter, who are more well educated, who have been taught Shakespeare, who have taken more time to learn
all the different, you know, words that have changed as English has evolved.
So it increases the status of a very select group of people while lowering the status of a very
select group of people. And so that is one way in which people will try to shape social norms
in their favor. If I haven't read any Shakespeare, maybe I'm not smart enough to understand it.
Maybe I'm too lazy to look up all the words. Maybe, you know, I didn't get a college degree.
Maybe I didn't go to grad school. Maybe I think, you know, fiction is a waste of time, whatever.
Well, then it's in my interest to try to shit on Shakespeare and say it's overrated.
Oh, why are people pretending that Shakespeare is so deep?
He's obviously, you know, he was fine, but there are plenty of more interesting and better writers.
I can learn these lessons by watching TV or by watching a movie.
What's the big deal?
That's me trying to play the opinion game and try to change social norms in my favor so that my status isn't lowered for not liking Shakespeare.
And so everyone who has any kind of preference, wherever that preference comes from, is a stakeholder in the opinion game.
They're each trying to shape the social norms so that people who have their preferences get status and people who don't have their preferences lose status.
But they have to cover up the fact that they're doing that because if it came out that they were doing that, they would fail to win the opinion game and their status would be lowered.
So we have to somehow fight over social norms while concealing the fact that we're fighting over social norms.
We have to pursue our own self-interest and social status while concealing the fact that we're pursuing our self-interest and social status.
And this, I think, is why we all sort of know deep down that opinions are bullshit when someone is sharing their opinion with us.
We can sort of tell deep down that they're trying to boost their status and look superior.
But we cannot call them out on that because if we did, we would look mean and we would look like we were trying to gain status over them.
And then that would lower our status.
It's this really complicated...
It's interesting that even a pushback against the status game or the opinion game breaks the rules so much that, even if somebody on the other side is playing it, as long as they're playing it in a sufficiently plausibly deniable way, you don't get to break the...
Are you familiar with improv?
I did this last time.
There's an idea from improv called Don't Punk the Game.
It's the one thing that you're never allowed to do.
David, Chris, you're on a ship.
You're cleaning the deck.
And I'm there going, oh, God, it's so cold out here, isn't it?
And you go, I don't know what you're talking about.
It's like, ah, no, you fucking punked the game.
You could pretend that it was too hot.
You could have said, oh, yeah, it really is very cold.
I'm glad I've got my thick coat on, or whatever, right, in improv.
But you can't punk the game.
And punking the game is we're playing a game of tennis,
hitting the ball back and forth, and you hit the ball sideways.
Or you hit it straight up in the air.
No, no, no, no.
Like, we're playing this sort of relatively linear game.
I never thought of the fact that the status game is this sort of concealed thing, or that opinions are a way to sort of covertly conceal the fact that we're playing for status, putting it back and forth. But even in the act of criticizing, unless the person opposite you has done something that makes it obvious that they're doing it, so that the case for the prosecution on your side has a good case to be able to sort of justify it, you don't get to say, dude, you're only saying this because you believe in that. Otherwise it sounds like you are disregarding the higher-order, refined search for truth and beauty and art and fulfillment and whatever. Oh no, they're playing a game of social norms, you're playing a game of personal status. And then you look like the one who is actually the very shallow, vapid seeker of credibility.
Exactly. If pointing out someone's status motive lowers their status and increases your status,
well, then that itself can be seen as a status tactic, right? And it often is. And not only that,
it can be very threatening to the person that you're calling out because you're threatening
to lower their status. You're threatening to make their status game collapse. So what happens
when we all realize that a status game is a status game? Well, the players no longer gain status for playing it
because being seen as a status seeker
lowers your status. And those at the top
of the hierarchy, their status gets
lowered because they're seen as the ickiest, most vainglorious, most selfish status
seekers of all. And so the hierarchy
kind of gets inverted a little bit when
the status game is called out as a status game.
So everybody who is playing that game
has a vested interest in keeping it stable
and protecting it against attacks.
So if you try to attack a person's status
game and call out their
sacred values as hollow and
full of shit, obviously they're going to get extremely threatened and angry and they're going to
try to silence you. Right?
Yeah. It's just a weird thing, remembering our last conversation about why everything is bullshit; it's very meta. We don't know ourselves. If I ask you, why is this thing something that you hold, why do you behave in this sort of a way? We're not transparent to ourselves. Also, given that we're not transparent to ourselves, other people definitely, fucking certainly, aren't transparent to us either. So we don't know us and we don't know others, and given that we don't know us, we can't even use our own theory of mind to infer somebody else's theory of mind, because the best way to convince other people that you genuinely believe a thing and aren't doing it deceptively is to be able to deceive yourself so that you believe the lie, which is potentially self-serving. I'm still trying to find at what point in this sphere
there is a vector where someone can try and inject 'I know what's going on' into this. There's no point at all in this structure where people can go, yeah, I actually understand. That being said, if you understand the meta-game, which is that most opinions are campaigns for a movement in social norms in a direction that in some way is probably going to benefit you or your cohort, or derogate the other person or their cohort...
Yep.
That as just a general rule overall to pattern match what most people are doing seems to be
one particular way to do it.
But outside of that, I can't actually see a way to better diagnose why you think
the things you think or why other people think the things that they think.
Yeah.
No, it can be pretty depressing and disorienting for sure.
I think we should be troubled about, you know, most of our beliefs being bullshit. I think we should have more humility because most of what we think
probably is wrong. But if there is one way to have true beliefs in our head or to be
confident that we know what's really going on, it's by creating a set of social incentives
that promote truth. So the scientific method and the status games surrounding academia and
scientific research, you know, winning the Nobel Prize if you make an important discovery and
it's replicated, those are really powerful status incentives that drive scientists. And when
those status incentives are perverse, you can get things like the replication crisis, where you get
status for pumping out low-quality publications that don't replicate. Well, guess what happens
when those are the incentives? Well, you're going to get low-quality publications that don't
replicate. So the incentive structure of science and academia has to be just right for promoting truth.
I was going to say... when did the replication crisis really, really ramp up?
Like 2016, 2018, something like that?
Yeah, around then, yeah.
Okay.
Now, the big meta is, well, you need to make sure that things can replicate.
So whereas previously it was pumping out lots of low-quality, weirdly powered, p-hacked, like, bullshit studies, now where you get most of your status from is being the person who proves that this low-powered study from 30 years ago, oh, Zimbardo, actually it was this, that didn't quite work and it wasn't pre-registered and all the rest of it. So we've gone from sort of the pioneer sphere to the critique sphere now. So is that, do you just see this as the barstool got turned upside down and everything's inverted and people are now playing a new game because that's kind of the new meta?
Absolutely. That is very well put. I think that's exactly what happens.
I think that's a good thing, though. It's good that the status game got turned upside down and now you get status for, you know, failing to replicate others' work and balancing the scales a little bit. But what's interesting, and this is bringing it back to your point about how calling out the game can often be disadvantageous, is that when people were originally starting to call bullshit on these bad studies and publish failed replication attempts, they were actually heavily attacked and criticized; their character was attacked. They were called replication bullies, the replication police, methodological terrorists. I know this because I was in grad school.
Oh, methodological terrorists. I fucking love that. Let's go. It's beautiful. It's a perfect
example of this dynamic where people who have entrenched interests in stabilizing a status game
will attack those who are trying to make it collapse and invert. And this is exactly what
happened. The people who were lashing out against the people failing to replicate their work were the people who were high status in the field, who had lots of publications, whose resumes were padded with shitty studies.
These were the people who were most vociferous
and most petulant against the people trying to improve science.
In other news, this episode is brought to you by Function.
Did you know that your annual physical only screens for around 20 biomarkers,
which leaves a ton of gaps when it comes to understanding your health,
which is why I partnered with Function.
They run lab tests twice a year that monitor over 100 biomarkers.
They even screen for 50 types of cancer at stage one. And then they've got a team of expert physicians that take the data, put it into a simple dashboard,
and give you actionable recommendations to improve your health and lifespan.
They track everything from your heart health to your hormone levels and your thyroid function.
Getting your blood work drawn and analyzed like this would usually cost thousands, but with Function, it is only $499. And for the first thousand Modern Wisdom listeners, you get $100 off, making it only $399. So right now you can get the exact same blood panels that I get and save $100 by going to the link in the description below or heading to functionhealth.com/modernwisdom. That's functionhealth.com/modernwisdom.
So is it a case... this is probably a good example to bring home what we talked about so far. You have this
existing frontier. You can think of it like an army or battalion or whatever. And they're
real powerful and they're kind of capturing a lot of territory. And then there's this, you know,
incumbent force that comes from the other side. And they try and hold the line aggressively.
You can see this as two different social norms coming in. One: psychology, science through psychology, should be breaking new ground, finding new
things.
This is exciting.
The other side is we should move more slowly.
We should ensure that things actually are accurate before we inculcate them as part of the
sort of foundational understanding of how human behavior works.
And then there's this, there's a point in the middle, which is where there is a fight for
power. There is a struggle between the two and which one is going to become the dominant social
norm and then presumably if the incumbent side, the new one, backs down, then you get to continue
with this until the weight of evidence flips and everybody sort of moves over to the other side.
I mean, we saw this with COVID: was it natural origins, zoonotic, or was it a lab leak? And then it's this fight, fight, fight, back and forth, who's going to win? And it's still up for debate now. So is that kind of the way that it works? It's this sort of memetic battle between two armies fighting over social norms, and then eventually the scales tip and they go back one way. This, to me, kind of explains why we see these big swings, these big sort of polarity swings. I mean, I'll let you go on that, and I've got a real spicy opinion... anyway, we're 30 minutes in, it's only the cool people left, so I can start talking about this. But is that an accurate way to kind of diagnose what's going on?
Yes, absolutely.
You could think of it as a status game, which gets attacked and exposed as a status
game, which inverts it, turns it into an anti-status game, where you get status
for pretending that you don't care about status.
And so what happens to a status game when it's collapsed is that you often gain status
for doing the opposite of what was done previously.
So if the status game gets called out where I have to slick back my hair and wear
a nice, crisp black and white suit, if that gets called out, if I'm just being snooty and stuffy and, you know, pretentious and pompous, I'm working this, you know, an annoying corporate job, like all this status game gets called out for what it is, well, all of a sudden, I get status for doing the opposite. I get status for growing my hair out long and wild and wearing long, colorful, flowing outfits and being casual and avoiding the black and white suits and avoiding the corporate world. And so all of a sudden, you get an opposite status game rising from the ashes. And then eventually that,
status game will get called out and you'll get another inversion and a new status game that gets in. So you get this kind of cyclical dynamism where status games are constantly collapsing and reemerging in antithetical forms. And I think this is one of the major engines of cultural variation. So if you look across cultures, you see that what people get status for doing is wildly different. Status symbols vary wildly across time and space. It used to be that we got status for wearing powdered wigs and for dueling.
Eventually, you know, that got called out for being stuffy and self-important.
Dueling was just a macho pissing contest.
The status game collapsed.
And now we get different things.
We get status for doing different things.
Right now we get status for, you know, let's say having educational credentials.
If you go to, you know, an Ivy League school, if you get an advanced degree, you get lots of status for that.
Maybe that'll collapse one day and you'll get status for shitting on the educational system.
It used to be that you got a lot of status for having quote unquote woke beliefs.
I think that status game is starting to collapse and invert.
So I think this is one of the main drivers of cultural variation in status symbols.
If status games are constantly collapsing and reemerging in antithetical forms,
then this explains why cultures are so different and why status games are so different,
because they're constantly evolving.
Or is there a way to look at opinions as a method to sort of test
allies, like a loyalty test or a fealty test in some secret war of social norms?
Yeah, I buy that.
So I think, you know, one function of sharing opinions might be to test people's loyalty,
to see if they share your preferences, if they have the same interests, if they're playing
the same status game.
If they're going to benefit from the social norm that you're advancing in addition to you,
well, then that makes them a good ally.
You have common interests.
And the same way that, you know, nations who are economically interdependent are more likely to team up with each other, people who are socially interdependent, who have common interests in particular status games, you know, those people make better allies and they're going to feel closer to each other if they share the same opinions. They echo each other's opinions. And oftentimes, because these games get so complicated, we pretend to hold opinions that we don't actually hold so as not to alienate potential social allies. You might express an opinion and I might pretend to agree with it so that you feel closer to me.
we might form an alliance that I might benefit from, even if, you know, we don't actually have common interest, but I'm trying to make you think that we do.
And so these social games can get very, very, very complicated, very, very fast.
And it makes you appreciate why the human brain is so fucking big.
So our brains are about three times the size of a chimpanzee brain.
Hominid brains expanded quite rapidly throughout human evolutionary history.
The common sense view is that we use our brains to make tools, to be smart and to outsmart flora and fauna.
I think that is wrong. I think the main reason why our brains are so big is to play these complicated social games. This is an increasingly popular perspective in evolutionary social science. It's called the social brain theory. And the idea is that the human brain evolved for politicking, rule following, covert rule breaking, hypocrisy, propaganda, social strategizing, status seeking, covert status seeking. These are very complicated strategies to pursue. You need a big brain to pursue them.
Computationally difficult.
Yes, very computationally difficult.
I think that social games were the major selection pressure leading to human brain expansion.
There's cool empirical evidence for this.
So the best predictor of primate brain size is not whether or not the primate uses tools, but how big the primate's group is.
And so I think there's a lot of evidence for this.
We used to think that reasoning was sort of for solving problems and for thinking logically and making better decisions.
An increasingly prominent view, pioneered by the psychologist Hugo Mercier, is that reasoning is actually a social tool.
It's not for solitary rationality.
It's for winning debates, for persuading other people, for rationalizing what you did and for justifying what you did so that you can look good to other people.
And that explains all sorts of biases in our reasoning processes that would otherwise seem puzzling from the perspective of individual rationality,
but are actually quite strategic and functional from a social perspective.
like social consistency bias?
Sure. So parroting the preferences of other people to fit in with them, that might be one thing that this can explain. Another thing is confirmation bias or motivated reasoning.
You know, we are biased in favor of confirming what we already believe and we look for reasons that support what we believe and we ignore or dismiss any reasons or evidence that challenge our beliefs.
And so this makes very little sense if reasoning is about being more rational.
If you're truly trying to get to the truth and be a rational person, you should consider
the evidence both for and against a belief.
But if you're trying to convince someone and win a debate, you don't want to bring up any
evidence or reasons that could show that you're wrong.
That would be disastrous for your social goal.
So it actually makes a lot of sense if the goal of reasoning is persuasion and winning debates
and justifying and rationalizing.
It makes no sense if it's about individual truth-seeking.
This episode is brought to you by Shopify. Look, you're not going into business to learn how to code or build a website or do back-end inventory management. Shopify takes all of that off your hands and allows you to focus on the job that you came here to do, which is designing and selling an awesome product. Shopify powers 10% of all e-commerce companies in the US. They are the driving force behind Gymshark and Skims and Alo and Neutonic. And that is why I've partnered with them, because when it comes to converting browsers into buyers, they're best in class. Their checkout is 36% better on average compared with other leading commerce platforms. And with Shop Pay, you can boost your conversions by up to 50%. You can upgrade your business and get the same checkout that we use at Neutonic with Shopify by going to the link in the description below and signing up for a $1 per month trial period, or by heading to shopify.com/modernwisdom, all lowercase. That's shopify.com/modernwisdom to upgrade your selling today.
What do you reckon is the likelihood that consciousness, i.e. a felt sense of us being an us, the phenomenological texture of being an I with a sense of I, how much of that, do you reckon, is just a byproduct of me needing to be able to work out what David thinks? And that means that in order to have theory of mind for him, I need to have theory of mind for myself. How likely do you think that is? I think it's quite likely. I don't know that it
explains all of consciousness, but I think it explains a big part of it, and particularly the sense
of self and the sense of identity and self-consciousness. I think all of those fall out quite naturally from the social games we play. So we're having a conversation right now
and I see a picture of myself on my computer screen. Well, why do I need that? Well, I need to monitor
how I look to other people. That's what the camera is telling me. It's telling me how I'm coming
off to other people so that I can adjust myself. I can maybe, you know, fix my hair. If there's
something wrong with my hair, I can adjust my posture. If I'm making scowly faces, I can adjust
my facial expression to look more friendly. That is the function of my webcam spot on the computer screen right now. I think we have something very similar inside of our
brains. Just as I need to monitor how other people think of me and how I appear to other people,
I need to do that all the time, basically 24-7 in my everyday social interactions. I need to
have a model of how you see me so that I can adjust how you see me in a more positive way and
win you over. And I think that is largely what our sense of self is about. It's a kind of
selfie cam that is installed into our brains.
Yeah, that's very, very good.
So how does arguing relate to opinions?
Because it seems like social norms come into contact on some kind of a battleground,
presumably the battleground of that in the most direct form is an
argument. You have your opinion, your position. I have my opinion, my position, and we'll
joust it out until one emerges remotely victorious. How does arguing relate to opinions?
I think it's related very thoroughly to opinions. One of our chief weapons in the
fight for social norms is to create good arguments. What's interesting, though, is that a good
argument is different from a socially effective argument.
So if I make you look uncool or awkward or stupid or low status, that is going to reduce the likelihood that people agree with you and share your opinions, even if your opinions are in fact correct.
And so a lot of our arguing is not actually designed to persuade anyone or to get at the truth.
What it's designed to do is make the other person look worse than you so that people are more likely to agree with you than the other person.
So if you look at presidential debates and the post-debate analysis, it is very clear that these debates are not about the contents of public policy.
When people analyze the debates, they're like, oh, he looked really great in this moment.
He was really presidential.
Or it's like, oh, yeah, he looked really confident.
This was a great one-liner.
They're not talking about whether their beliefs are correct or whether their policy preferences are actually going to promote the common good.
They're talking about which one looked better, right?
And so what I think, what presidential debates are are competitions to be quippier and more confident and more likable and more attractive to the American people, particularly your constituents, perhaps the people who are on the fence.
They're not really about policy or about getting to the truth of what policies are actually going to help the nation.
I think that's true of much of our debates as well beyond presidential debates.
They are competitions to be more likable than the other person a lot of the time.
And if I disagree with you, I am implicitly challenging your status because I am saying basically that you're wrong and I'm right.
And what could explain the fact that you're wrong and I'm right?
Well, you must be dumber than me.
You must be less knowledgeable than me.
You must not be privy to the same kinds of information that I'm privy to.
You must be less socially connected.
Whatever reason you come up with, it's going to make you look bad and it's going to make me look better than you.
So merely by disagreeing with someone, you are implicitly threatening their status.
So I think a lot of what goes on in arguing is not persuasion and truth-seeking, but status
competition.
There are other dark functions of arguing, which I talk about in my post, Arguing Is Bullshit.
I think a lot of the times when we argue with someone, we're not trying to persuade them,
but we're trying to intimidate and silence them.
We're trying to make them feel bad about expressing their opinions so that they're less likely
to express them in the future, and they're more afraid to express them in the future.
So if you think about the tendency of people to call each other Hitler or compare each other to Hitler during Internet debates, that is very hard to explain from the perspective of persuasion.
When was the last time you heard someone say, wow, you're right, I'm just like Hitler. You've totally persuaded me.
Yeah.
Yeah, that's right.
So if you're trying to persuade someone, calling them Hitler is a very bad idea.
But if you're trying to make them feel bad and silence them, then calling them Hitler is a great idea because inside the privacy of their own minds, they're going to be thinking, oh, shit, if I express my opinions, people are going to call me Hitler.
That's terrible.
I better shut the fuck up and not express my opinions.
And so what happens after someone gets called Hitler during a debate?
The person who is called Hitler is silenced and the competition for social norms is tilted in favor of the accuser if the accusation is blocked.
And so what we're trying to do in some cases when we argue is literally just intimidate and silence the person we're talking to and make them feel bad so that our coalition, our tribe, can gain power at their expense.
It's the same thing that happens, you know, in totalitarian regimes.
You know, for example, if we're in the Soviet Union and we all despise Stalin, it is very bad news for Stalin if we all gain common knowledge of the fact that we hate Stalin, right?
And so it is in Stalin's interests to make sure that none of us know how many anti-Stalinists there are in our midst, because if we knew that, and if we knew that the anti-Stalinists knew that we knew, then we could rise up to overthrow Stalin.
And so what keeps Stalin in power is constant uncertainty, not knowing if anyone else is actually an anti-Stalinist, not knowing that they know that I know, this kind of coordination is what helps people rise up to overthrow a regime.
and every totalitarian regime that I'm aware of in existence has tried to silence, intimidate, and break
down this kind of coordination so that nobody knows how many people are opposed to the regime,
how many people will come up to support them, and they're living in fear.
And it may be the case that, you know, 90% of the population hates the regime, but nobody
knows that 90% of the population hates the regime, and so they stay silent out of fear.
And so what a tribe does to gain power is to silence the opposition, prevent them from coordinating.
And I think that is one of the other functions of arguing beyond mere persuasion.
And that's why I think arguing can get so ugly and insulting at times.
Because you're optimizing for intimidation and a lot of the time that's quite militant.
It's very unforgiving.
It can be quite mean.
Yes.
Mm-hmm.
Yeah.
You win not only if the person changes their mind, but if they shut up.
And it's often easier to make a person shut up.
And so often we opt for the latter over the former because it's easier.
What's the difference between an argument and a pseudo argument?
A pseudo-argument is when a person who is doing something ugly, like, say, competing for status, trying to silence you or intimidate you, covers up their ugly motives with the veil of persuasion.
So it might be that I'm just trying to dis you and silence you or look good or whatever, but I'm going to pretend that I'm actually trying to persuade you. I'm trying to, you know, change hearts and minds. And so I might, you know, put on a performance of giving reasons and citing evidence and having a logical argument. But beneath the surface, I'm just trying to make you feel bad and put you down and raise my status at your expense, right? That is a pseudo-argument. It is masquerading as a real argument, a real attempt to persuade. But secretly, it is something much uglier
and darker. And I think much of, if not most of our arguments are actually pseudo-arguments.
What are some of the common signs that you're in a pseudo-argument? How can people diagnose whether it's the Matrix or the real world?
Yeah. So I have a list of warning signs in my post, Arguing Is Bullshit. I'll see how many I can remember off the top of my head. So one of them is whether the person is
actually listening to what you're saying and understands
what you're saying. If they do not understand what you're saying, they're not listening to you, and they
caricature your view and interpret what you're saying in the worst possible light, that is a very
good sign that you're in a pseudo argument. If a person is actually trying to collaborate with you to
get to the truth, they have an incentive to listen to you and get what you're saying right.
If they are not trying to get at the truth and they're just trying to make you look bad,
they have an incentive to interpret what you're saying in the worst possible light and not listen
to you. And they'd rather talk than listen a lot of the times.
Often in pseudo-arguments, the arguers don't know what they're arguing about because they don't bother to define their terms.
I see this a lot in debates over quote-unquote socialism.
One person might define socialism as Sweden.
The other person might define it as the Soviet Union.
And then they angrily talk past each other and fail to persuade each other of anything.
This makes little sense if the goal of arguing is persuasion.
It makes a lot of sense if the goal of arguing is intergroup competition.
So that's another sign.
Hmm, what else?
If the person interrupts you a lot; you experience that a lot in pseudo-arguments. Let's see here, what else?
If they dodge your questions, so if they refuse to engage with what you're asking them, or if they fail to point out anything that they agree with in what you're saying, that's a sign.
Usually when someone cares about getting at the truth, they might say, oh, that's a good point. Oh, I agree with that, but I might push back on this. If they're never
saying anything they agree with you on, that's a bad sign. What else? I think that's about
as much as I can come up with off the top of my head.
As you're talking about the challenges of opinions and arguing, the main question I've got in my mind is: is there any hope of ever scaling genuine good-faith debate? Does that even exist? How hard is that for anybody to arrive at?
I think it does exist. It is rare. It is a rare, precious flower. But it does exist. I see it a lot, weirdly, in the comment section of my blog. For whatever reason, I feel like it attracts, you know, kind of good faith discourse. I don't know what it is about my writing. Maybe I scare away bullshitters. I'm not sure. So that's one place to find good faith debate. What's interesting is that I think we're more than capable of good faith debate when it comes to more mundane, practical matters. So when it comes to deciding, you know, which restaurant we should go to for
dinner, you know, to accommodate everyone's preferences, you know, if someone's a vegetarian or whatever,
like, which restaurant's going to be closer, if they're going to have tables available, when we're
having debates about that, we're actually perfectly rational. We listen to other people's opinions.
We take into account other information. We're totally willing to change our mind.
To say, oh, that's a good point. Yeah, they don't have vegetarian options there. We should go to
this place instead. We're actually, we're paragons of rationality when it comes to these mundane
practical matters. If we're driving to the restaurant and someone says, oh, actually, the highway will be backed up, why don't we take this other route? We'll say, oh, yeah, that's a good idea, I'll take this other route. We'll listen and we'll change our minds. And so there are many domains of human life where we're fully capable of having a good faith argument, exchanging reasons, exchanging information and updating our models of the world. But as soon as you bring in status and tribalism, all that goes out the window, and we turn into apparatchiks. So I think what a lot of sort of autistic-adjacent people do is they bring that mundane practical rationality of
deciding which restaurant to go to for dinner or what route to take to get there. They bring
that kind of rationality into politics where it doesn't belong. And then they get frustrated
that nobody is sharing their way of thinking about politics. Like, oh, no, this is the best
restaurant. This really is the best route to get there. Everyone's like, what are you
talking about? We don't care about that. Right. So what they're trying to do is they're trying to play
the good faith debate game, when everyone else is actually playing the intergroup dominance
game, which involves concealing the fact that it's an intergroup dominance game, obviously,
we've been through that. And people who are bad at socializing and bad at picking up on those
cues think that it's just the reasonable argument game. And then they get frustrated when no one
shares their focus about the evidence and the arguments and so forth. I think this is a big mistake
that I see a lot of people make. I myself have made it a lot of times. I think it's important to be
realistic about what politics brings out of us and be realistic about the dark things that politics
often brings out of us. What is a deepity? This is a word that I didn't even
know existed before I read some of your stuff. It's a wonderful word. It's a great and helpful
concept. It's a great way to detect bullshit in the world and avoid falling prey to it. So the word deepity was originally coined by the philosopher Daniel Dennett, and his example of a deepity was love is just a word.
And what makes it a deepity is that it has two interpretations.
One interpretation is absolutely mind-blowing, earth-shattering, world-changing. It boggles the mind. In fact, it's extremely implausible. It's almost certainly wrong, right?
And that interpretation is the emotion of love, you know, our heartfelt
feelings of desire, lust, romantic commitment, all of our courtship rituals, all of our songs about
love, all that is just four letters. It's just a puff of air from our mouths. It's not actually
an emotion. It's just a word. That is almost certainly wrong. But boy, if it were true,
that would change everything, right? And then there's another way you can interpret love as just a
word, which is that the word love is just a word, which is obviously true. Obviously,
the word love is just a word. Every word is just a word, right? It's not conveying any information.
And so what makes it a deepity is sort of toggling back and forth between these two interpretations of love is just a word.
So you go to the mind-boggling interpretation. That makes no sense at all. And then you go to the more plausible interpretation like, oh, that makes sense.
And when you go back and forth between those interpretations, you create the illusion of insight, the illusion of resolving confusion and getting to something true.
It's sort of like going in and out of a hot tub on a cold night, right?
A hot tub is more pleasurable when you get out and experience a little bit of cold. And then when you're in the hot tub, it feels pleasurable to get out of the hot tub for a little bit, right?
So when you toggle between confusion and understanding by saying something vague with multiple interpretations,
you can create the kind of pleasure of getting in and out of a hot tub,
but you get the pleasure of being confused and then resolving the confusion, right?
You had an insight.
There was a hot take: you only live once is bullshit. It's usually used as an excuse for status seeking and self-gratification, splurging on a vacation or pivoting to a more competitive career. There's no logical connection between life is short and hedonism and risk-taking being good. Is that an example of a deepity?
Yes, I think so. So there are multiple ways to interpret the sentence, you only live once. One interpretation is obvious, right? Like, of course we only live once. Of course, we're all going to die, right? And that's obviously correct.
And then another interpretation is, well, therefore, we should take all these risks and travel the world and splurge on a vacation to Italy so we can see Rome before we die and see the redwoods and, you know, have an affair or whatever.
And that interpretation makes no sense.
Just because we're going to die doesn't mean that all those things are good.
There is no logical connection between the former and the latter.
Maybe it's good to travel the world.
I don't know.
maybe it's good to, you know, have sex with someone. I don't know. But it has nothing to do with whether or not we're going to die, right? Obviously, we're going to die. And you need an independent argument for why this thing is good, right? It could be that maybe it's the opposite. Maybe because we're going to die, you shouldn't have an affair. Maybe because we're going to die, you know, there are only so many hours in the day and you shouldn't go to Rome. You should instead do your work or do something that's going to, you know, make a contribution to society, right? Like there's just no connection between what the statement means and the implications that people take it to mean. So I think that's one example of a deepity.
Another example of a deepity is everything happens for a reason, right? You hear this one a lot. And there are two ways to interpret it. One is that everything has a reason from the perspective of a conscious supernatural being, that something happened because some supernatural being or essence or force wanted it to happen or intended it to happen. That's the really implausible, earth-shattering, mind-boggling interpretation. The other interpretation is just that things have causes.
Everything does happen for a reason, in the sense that things have causes. This is just the basic scientific worldview. And you can toggle back and forth between those two interpretations and delude yourself into thinking that you're experiencing some deep insight when you're not. It's just a deepity. Other examples of deepities: what we think, we become. This is from the Buddha. You can interpret it as meaning that if I think I'm Abraham Lincoln, then I will magically sprout a beard and a top hat from my head and become
Abraham Lincoln. Obviously, that's not true. The more mundane interpretation is just that thoughts have effects. When you think stuff, it's going to affect your behavior in some way. That's
why we have thoughts in the first place. That's why we have brains in the first place because they
affect our behavior, right? It's a pretty boring interpretation. Another one, the future
influences the present as much as the past. This is from Friedrich Nietzsche. The mind-boggling
interpretation is things that happen in the future can retroactively change the past, right? The
boring interpretation is that we sometimes think about the future. And that can change the
present. Explain to me the emotional payoff when we flip-flop between something that's bold and something that's banal. Like, why?
Yeah. Well, part of it is that, I mean,
we've been talking a lot about status games and status competition. One of the things that we get
status for is by presenting people with bold, earth-shattering ideas and changing the way they think. That
gets us a lot of status. We do that for a living and we get status for it. But it's actually a
really hard thing to do because real genuine novel insights and earth-shattering ideas are hard to
come by. And because the conventional wisdom is often correct, things that depart from the
conventional wisdom are often wrong and implausible. So there's this trade-off between finding
ideas that are new and special and interesting and provocative, and ideas that are plausible and logically consistent and correct. The thing is, the plausible and logical
and correct ideas, people already mostly believe them or will be quickly convinced of them
and you won't gain much status for presenting them, right? And so what deepities do is they
kind of allow us to have our cake and eat it. We can present an idea that seems, you know,
provocative and earth-shattering. And then when people question it and say, no, you're crazy, what are you
talking about, you can pivot to the other interpretation and say, no, actually, it makes perfect
sense.
It's kind of like an intellectual motte-and-bailey.
Yes, absolutely.
So the motte-and-bailey is a very similar idea.
I forget who came up with this term, but it's when you're arguing.
It's another bullshit arguing tactic, by the way.
It's when you put forward an extreme version of your view, like, say, we should abolish the
police, right?
And then when people say, no, you're crazy, what are you talking about?
You say, oh, no, I just mean we need, you know, like more mental health services and, like, we just need to, like, change the way that police operate. And then they walk back to a much more moderate position. And then as soon as the person leaves, they go back to saying, let's abolish the police. Right. You see this a lot in political argumentation. And that is a kind of deepity. You can interpret abolish the police in one of two ways. You can interpret it the extreme way, which is we should have no police. Or you can interpret it as we should, you know, reform the police. And that's more moderate. And people will often pivot back and forth strategically to sort of gain status
in political debates. And I see people doing that, that toggling, not just in politics, but across the board with spiritual insights, supernatural ideas. They will often put forward the more provocative and implausible supernatural interpretation of a statement and then pivot back to the more straightforward interpretation when they're challenged or when they're attacked. Everything happens for a reason. God wanted this to happen. What are you talking about? Oh, just things have causes. Obviously, you know, things happen because, you know, there was some cause, right? That's all I'm saying.
Deepities kind of feel like they're essentially brain hacks that manufacture an aha without an insight. I wonder how much of the appeal is sort of fueled by the status signaling of sounding profound without also risking status. It's this very sort of safe way to do it.
Yes, exactly. It's a low-risk way of getting status from having a provocative, counterintuitive idea.
Well, what's the difference between vague bullshit and deep bullshit, then?
Hmm.
Very good question.
They're not the same thing.
I see one as the parent of the other.
So you can think of vague bullshit as the broader...
Necessary, but not sufficient.
Yes.
So I think vague bullshit is the umbrella term.
It's the broader category.
And vague bullshit is bullshit that just is hard to interpret.
It has multiple interpretations.
And some vague bullshit is deep bullshit, but not all vague bullshit.
So vague bullshit is something that has multiple interpretations.
Deep bullshit is when one interpretation is earth-shattering and the other interpretation is boring and mundane.
But the broader concept is just stuff that has multiple interpretations.
And there's lots of vague bullshit out there like in astrology, quantum healing, continental philosophy, postmodernism, psychoanalysis.
All this stuff is jargon-laden, verbose, impenetrable, hard to wrap your head around.
You have no idea what the fuck you're reading when you read it.
And I think that is all by design.
I think often the function of vague bullshit is to create uncertainty about what the speaker intends, while covertly signaling group membership to a select few insiders who understand what the bullshitter is getting at.
So an example of this is there is no limit to the fullness of emptiness.
Do you have any idea what the fuck that means, Chris?
There is a type of satisfaction that you can derive from deep peace
which a busy and chaotic life cannot afford you.
That is very close to the intended meaning,
but you are very familiar with this kind of stuff.
Probably most people when they hear that will be very confused.
I've had to do some real dissection there to try and get at it.
You had to do a lot of work.
You had to do a lot of work, right?
So that's a real quote.
It's from Osho or the Bhagwan,
from the Rajneesh cult in that Netflix docu-series,
Wild Wild Country. Have you seen that, Chris?
Sick. I should have been there, but no.
Yeah.
Fantastic. One of my favorite Netflix docu-series.
So, yeah, so he's a cult leader.
He had this quote, there's no limit to the fullness of emptiness.
Basically, what he's saying, what he means by emptiness
is a kind of mindfulness that you would achieve in meditation.
So you are emptying your mind of cravings and mental chatter,
and that kind of emptiness makes you feel great.
He's basically just saying mindfulness is good.
It's another kind of deepity.
You know, mindfulness is good is the mundane interpretation.
There's no limit to the fullness of emptiness.
How could fullness be emptiness?
That makes no sense, right?
So what he's doing is he's saying mindfulness is good
to the select few people who know what the fuck he's talking about,
who know that emptiness means mindfulness,
and those people feel warm and included
because they understand what he's getting at,
and everyone else feels alienated and confused.
And that's a wonderful way of covertly probing group membership.
He is saying something that will attract people who are like-minded
and sycophantic toward him while alienating and excluding everyone else.
And so that creates a sense of loyalty and shared community
when everyone spouts vague bullshit and secretly gets what everyone means
by the vague bullshit, while everyone else has no idea what they're getting at.
Does that mean there's kind of an accelerator pedal that you need to press carefully?
Because if it's too obvious, then it doesn't have the aha, boldness sort of insight.
But if it's too obfuscatory and too complex, then it's not sufficiently accessible to actually be
attractive to people. So you have a kind of tension when it comes to designing your deepity. Is that a fair way to look at it?
Yes, I do think that's a fair way to look at it. But I think there could be another function of vague bullshit where it literally is meaningless. It literally is impenetrable. It has no meaning at all. But you might be trying to test your audience, to see if they are loyal enough to you to hallucinate meaning, and to see if they want to flatter
you enough that they're willing to interpret what you're saying as deep and profound, even
though they have no idea what the hell you just said, right? And so even if it's totally
meaningless, you can still get some meaningful social information from it by gauging how people
react to it and how eager people are to defend you for it and defend your reputation.
No, no, he actually, he meant something that's really deep by that. Trust me. That reaction is
very valuable to a cult leader because they can assess who is going to be loyal to them and who
is not, right?
A card-carrying member.
Yeah, exactly.
Yeah.
So it can be a kind of loyalty test as well.
I think there are lots of social functions of vague bullshit.
I write about several possibilities in the post.
But yeah, I think it also has to do with the fact that we are linguistic animals.
That's a pretty incredible trait of Homo sapiens, that we communicate.
We form shared models of reality.
And that means that if I'm good at language, if I'm good at transmitting ideas from my head into yours, that makes me a really good social partner.
That makes me a good ally, a good mate, a good collaborator.
You're going to have an easier time smoothly coordinating with me than with other people.
If we're vibing, that suggests that we're good at communicating with each other and we know what each other's expectations are and we know what each other means by what they're saying.
And so part of what it means to be good at language is good at interpreting it.
good at reading between the lines. If you say something unclear, or if you stutter, or if you get cut off, or if I don't quite hear everything you said, and I still manage to get at your intended meaning, then that's a really good sign from your perspective. That means that I'm either really good at language or I know you really well, both of which are good signs in terms of you liking me, right? So a lot of vagueness is just the joy of finding meaning in chaos, right?
I think the joy of finding meaning in chaos has to do with the joy of practicing our linguistic ability, because language depends on extracting meaning from seeming chaos.
And so we like to practice that ability and show off our ability.
If I can extract some meaning from this really weird impenetrable text, boy, that makes me look good, right?
That makes me look like I'm really good at language.
I'm really clued in to the writer.
If they're a high status person, that helps me a lot, that I'm really in sync with them.
So I think vague bullshit really has all these intertwined social functions that bring people together, that help us gain status, that test loyalty, that allow us to show off and practice our hermeneutic or interpretive talents.
There's really a lot going on there.
What a wonderful way to make us all feel a little bit less virtuous than we should be.
David Pinsof, ladies and gentlemen. David, you're great.
I love your blog.
I love all the stuff you do, and I enjoy your podcast.
Tell people where they can go and check it all out.
Sure. So I recently started a podcast with Dave Petruski, who is a professor at the University of California, Santa Barbara, and it's all about evolutionary psychology. As far as I'm aware, we are the only explicitly evolutionary-psychology-focused podcast, one that is just all about evolutionary psychology. We bring in evolutionary psychologists who are practicing, who are professors, who are doing research right now, and we interview them. We also talk with each other. It has more of a nerdy and academic vibe than my blog. But I
think it's doing an important service in getting good, solid, rigorous research out into the
public. So you can check that out. It's called Evolutionary Psychology (The Podcast); the "The Podcast" part is in parentheses. And you can also check out my blog. It's called Everything is Bullshit. You can find it at everythingisbullshit.org. And you can find me on Twitter at David Pinsof.
Feel free to reach out to me directly. I love talking to people. Yeah. David, you're great,
man. Until next time, good luck with the podcast. Lots of fun. Thanks so much, Chris.
If you're wanting to read more, you probably want some good books to read that are going to be easy and enjoyable and not bore you and make you feel despondent at the fact that you can only get through half a page without bowing out.
And that is why I made the Modern Wisdom Reading List, a list of 100 of the best books, the most interesting, impactful and entertaining that I've ever found.
Fiction and nonfiction, and there's real life stories, and there's a description about why I like it, and there's links to go and buy it.
And it's completely free.
You can get it right now by going to chriswillx.com slash books.
That's chriswillx.com slash books.