Modern Wisdom - #385 - Gurwinder Bhogal - 15 Mental Models To Understand Psychology
Episode Date: October 16, 2021

Gurwinder Bhogal is a programmer and a writer. I got tagged in a monstrous thread of Gurwinder's on Twitter exploring human nature, cognitive biases, mental models, status games, crowd behaviour and social media. It's one of the best things I've read this year, so I just had to bring him on. Expect to learn how saying ridiculous things can be a test of loyalty, why people can be too stupid to know that they're stupid, why million-to-one odds happen 8 times a day in New York City, why The Bullshit Principle is actually a thing, why everyone is seeing racism everywhere and much more...

Sponsors: Get 20% discount on the highest quality CBD Products from Pure Sport at https://puresportcbd.com/modernwisdom (use code: MW20) Get perfect teeth 70% cheaper than other invisible aligners from DW Aligners at http://dwaligners.co.uk/modernwisdom

Extra Stuff: Follow Gurwinder on Twitter - https://twitter.com/G_S_Bhogal Gurwinder's MegaThread 1: https://twitter.com/G_S_Bhogal/status/1225561131122597896 Gurwinder's MegaThread 2: https://twitter.com/G_S_Bhogal/status/1438972527838117895 Get my free Reading List of 100 books to read before you die → https://chriswillx.com/books/ To support me on Patreon (thank you): https://www.patreon.com/modernwisdom

Get in touch. Join the discussion with me and other like minded listeners in the episode comments on the MW YouTube Channel or message me... Instagram: https://www.instagram.com/chriswillx Twitter: https://www.twitter.com/chriswillx YouTube: https://www.youtube.com/ModernWisdomPodcast Email: https://chriswillx.com/contact/

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Hello everyone, welcome back to the show.
My guest today is Gurwinder Bhogal.
He's a programmer and a writer.
I got tagged in a monstrous thread of Gurwinder's
on Twitter exploring human nature and cognitive biases
and mental models and status games
and crowd behavior and stuff.
And it's one of the best things that I've read this year.
So I just had to bring him on to talk about it.
Today, expect to learn how saying ridiculous things
can be a test of loyalty,
why people can be too stupid to know that they're stupid,
why million to one odds happen eight times a day
in New York City, why the bullshit principle
is actually a thing, why everyone is seeing racism
everywhere and much more.
This is a thing about the internet, man,
like it's awful, apart from a few times when it's absolutely amazing, just finding this
guy out of nowhere, bringing him on the show and having this unbelievable hour and a half conversation,
talking about all of the stuff that I'm fascinated by. Thank you to whoever it was that tagged me on
Twitter. This leads me quite nicely into the Modern Wisdom Locals community launching this Monday, the 18th of October.
It's precisely why I want to create this.
It's a place where everyone that listens to the show
can get together and talk about episodes,
submit ideas for guests,
things that they've seen, articles that they want me to comment on
or react to on YouTube or on the podcast,
suggestions for guests, I can do private live streams.
We can talk about all of the stuff that we can't on the normal internet.
I'm really, really excited.
It also means that you can support the show.
If you love it, you can actually help us grow by giving us your support,
but you can also access it for free.
This will be live on Monday.
Check out the Instagram, check out my Instagram.
I'll have the links on there.
Or if you're on the mailing list,
or if you listen to the intro to Monday's episode
with Mr. Daniel Sloss.
But now, please give it up for Gurwinder Bhogal.
Tell me your background, how did you come to write long Twitter threads that hundreds of thousands of people see?
So my original background is in tech. I was working on search algorithms and things like that, basically tasked with ensuring that people get directed to the right information. But I started losing interest in that when I realized that the main problems with the internet were not actually caused by algorithms. They're actually caused by people, because algorithms are basically just a reflection of human behavior.
So once that sort of epiphany came to me, I decided that it would actually be more productive
for me to actually understand the core of the problems with the internet.
And when I say the problems with the internet, I mean things like misinformation and polarization and things like that. So I decided to sort
of move away from tech and sort of explore human psychology a little bit more. So I basically
started freelance writing and, you know, trying to understand psychology and how it integrates with the digital age and how it's caused so many problems
and things like that.
So yeah, I've been gradually trying to build a following on Twitter, and it's been working quite well so far. Hopefully I've got enough people interested that I can actually start to really explore this topic properly, as a full-time job. That's my hope.
I think so, man. The couple of tweet threads that I've seen from you, that I got linked by some listeners, are monsters. Like 50,000 likes in total on a couple of them, and 40 tweets long. So I got sent this by one of the people that listens to the show and
I just fell in love with it. So I want to go through, I'm going to harass you today
and ask for some insights into some of the concepts that you came up with
and we'll see how many we get through today.
So the first one, first tweet, the law of very large numbers.
Given a wide enough data set, any pattern can be observed.
Million-to-one odds happen eight times a day in New York City (population 8 million).
The world hasn't become crazier, we're just seeing more of everything. What's that mean?
So that's basically the story of Twitter; that sort of explains all of Twitter. The whole thing about news is that news is only news if it's surprising, if it's interesting; if it's not interesting, it's not news. So people only share things that are surprising. And as a result of that, what happens is that if you've got a feed, a Twitter feed, and people are just sharing things that they find unusual, it gives you a distorted perspective of the world because you're not
seeing reality, you're seeing the exception to reality, you're seeing what's surprising,
and the cumulative effect of this is that it can really
sort of send you bonkers, it can send you crazy,
because you just get a completely distorted view of reality.
And this is something that occurs
regardless of what your beliefs are.
It's a universal experience.
So if you're on the left, you're just gonna constantly see
things that would, you know, be sort of surprising
and sort of interesting and outrageous to the left.
So you'll see, you know, racism
and you'll see a lot of instances of corporate greed,
bigotry, you know, transphobia, all that kind of stuff.
So that will lead you to believe
that the world is more bigoted and more greedy and just more corrupt than it actually is,
because you're just seeing these sort of cherry-picked examples of the worst of humanity.
And the same goes if you're on the right and, you know, you're anti-woke and whatever, if you're following, sort of, you know, Libs of TikTok.
Great account.
Yeah, yeah.
It's a hilarious account, and it's very entertaining.
But if that's all you're following,
if it's just those kinds of anti-woke accounts,
then you're gonna, basically, all you're gonna see
is just, you know, Hollywood celebrities acting crazy
and you're gonna see these academics, you know,
with this kind of critical race theory craziness.
You're just going to see that just over and over again.
And even if you're not conscious of it, it's going to basically fool your brain into thinking that this is the norm, that this madness, this extremism, is the norm basically.
So what that does is that it basically creates this sense of threat and it makes you feel
that the whole world is hostile and this is what drives people further to the extremes
because they think, oh my god, they're basically a woke apocalypse or oh my god, racism is
everywhere.
So this basically causes people to double down on their beliefs and to say, oh my God, we need to do something about this. And it basically makes people feel guilty.
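(A quick aside on the arithmetic in the tweet itself: the "eight times a day" figure is just an expected-count calculation. A minimal sketch, assuming "million-to-one" means a one-in-a-million chance per person per day and that the events are independent; the numbers are the ones quoted in the tweet, nothing more precise than that.)

```python
# Back-of-the-envelope check of the "law of very large numbers" tweet.
# Assumption: "million-to-one" means a one-in-a-million chance per person per day.

population = 8_000_000    # roughly New York City
p_event = 1 / 1_000_000   # chance of a given "freak" coincidence, per person per day

expected_per_day = population * p_event
print(expected_per_day)   # 8.0 -> about eight "million-to-one" events every single day
```

The point isn't the precise numbers; it's that events which are vanishingly rare for any one person are routine in aggregate, which is exactly what a feed of surprising things then shows you.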
Have you ever read 10 Reasons Not to Get Famous by Tim Ferriss?
I haven't.
It's just an article online.
It's about half an hour long.
And fuck man, like it's so interesting.
And in it, one of the things that he highlights is that most people are shooting for fame.
They want to try and get as big of an audience as possible.
But what happens when you overshoot fame
and you get 150 million or a 300 million person audience?
And I think he said, 1% or around about 1%
of the population are psychopaths.
And he was like, okay, in an audience of 150 million people,
that's a lot of psychopaths.
So his point was that when you start to spread the net wide enough,
the exceptions start to be able to band together to the point where you have so many
outliers that they dominate your experience.
And I guess this is kind of what you're seeing.
You know, for all that I find Libs of TikTok kind of funny and sort of ridiculous, it does irritate me in a way, because I know that it's a misrepresentation of what most well-meaning people on the left must think. And it irritates me because I still end up outraged at what these people are actually saying. So yeah, it's a difficult one, man. And it's just that limbic hijack
race to the bottom of the brain stem that you immediately respond to.
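(Chris's "overshooting fame" point is the same expected-count arithmetic at a different scale. A small sketch, treating the roughly-1% figure he quotes as an assumed rate rather than a precise statistic:)

```python
# At a fixed ~1% base rate (an assumed figure from the conversation), the
# absolute number of hostile outliers grows linearly with audience size.

base_rate = 0.01  # assumed fraction of "psychopaths" / hostile outliers

for audience in (10_000, 1_000_000, 150_000_000, 300_000_000):
    outliers = int(audience * base_rate)
    print(f"{audience:>11,} people -> ~{outliers:,} outliers")
```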
Yeah, absolutely. And I mean, further to what you were saying about the psychopath thing, you know, how it sort of increases the number of psychopaths. Well, the problem is that there's something called negativity bias. I don't know if I've included this in one of my megathreads. I don't think I have.
Bonus round. Yeah, bonus round. Yeah. So negativity bias is basically the human tendency to remember and to sort of give more
focus and attention to negative information rather than positive information.
And so if you've got an audience like Tim Ferriss, for instance, I think he's got 1.5
million followers, probably more than that.
If you've got a huge
number of followers, you can sort of read through your notifications and you'll see a lot of
compliments after you tweet something, you know, and you see, oh, you know, this is nice, this is nice,
you know, this person loves what I'm writing, that's great. And then, suddenly, you'll get that one
negative comment and your mind will sort of focus on that negative comment. Even if you've got like,
you know, 99% positive comments, that 1% is going to remain in your mind more than the 99%, because, you know, that negativity is something that you have to react to, and this
goes back to our evolutionary history, you know, it obviously is more important for you to be
able to react to negative stimuli than to positive stimuli, because negative stimuli constitute an existential threat.
So you have to react to them a lot more quickly and a lot harder than you would to
a positive stimulus.
And so as a result of that, people, even if there's only a small number of psychopaths,
even if there's a small number of rude people on Twitter, if you've got a massive following
and you see that, those small numbers, those are going to be inflated
and you're going to get this kind of distorted perspective
and it's going to bring you down and it's going to make you depressed.
And this is another problem with Twitter is that
it kind of, it takes a lot more positivity
than negativity to affect you. And because of this disparity, you're always going to feel depressed if you
spend too long on Twitter because you're just going to remember the negative
things. You're going to remember the negative news stories rather than the positive
ones. You're going to remember the negative comments that you get from people
rather than the positive ones. And again, the cumulative effect of this is that
it makes you feel depressed. And it just kind of brings you down and it just,
it's not
healthy for your brain because your brain's not designed for this kind of information.
And so it just brings you down. That's probably one reason why we have such a depression epidemic, and an anxiety epidemic as well: the importance of validation, and how that can be so easily obliterated by just one negative comment.
Right, next one. The Peter principle: people in a hierarchy such as business or government will be promoted until they suck at their jobs, at which point they will remain where they are. As a result, the world is filled with people who suck at their jobs.
Right, yeah. Well, this one was kind of more of a humorous one.
I mean, I think it's probably true in a lot of instances.
It's not true in every instance, because some people might choose to remain where they
are.
They might love that job, and they might just want to do it, despite the fact that they
could do better jobs.
You know, I know a few people personally who have actually declined promotions because they like where they are currently, and they want to do that rather than a more managerial role or whatever.
But I think it is true as a general rule. There's this whole hierarchical game where people are always trying to impress the people above them, and as long as they can do that they will be promoted and they will rise in the hierarchy. But once they can't, the investment in that person is too big to let them go. You can't just fire them, because you've trained them, you've built up a relationship with them. So it makes sense to just keep them where they are, to prevent them from being able to do any more damage, by containing what their responsibilities are, essentially.
You see this in sales organizations, right? Someone comes in, they're a good salesperson,
they become a good head salesperson or senior salesperson, and then eventually because humans
in a meritocracy within an organization, they naturally want to progress and be promoted
because they have this desire for growth,
you end up eventually losing a good salesperson
and gaining a shitty manager
because this person was never built to be a manager.
They were a mint salesperson.
They don't want to be told that they can't be promoted
because they have this natural desire for growth.
You can't not promote them
because then they're going to leave and go somewhere else
and be a good salesperson until they maybe do get to have a shitty manager job position
within that. But by promoting them, you lose the person that was good at sales and you gain
the person that can't manage. Yeah, absolutely. I think there's this sort of faulty sort of
sense of success that people have, which is that if you're successful at one thing, you
can be successful at anything. And so somebody who's been successful,
let's say, who's basically done really well
in engineering or something.
Somebody would think, oh, this guy, because he's so successful in engineering, he must make a great manager. And so they'll promote them to a sort of managerial role. But the thing is, that person has spent their entire life doing engineering, and that's why they're so good at engineering. So to just assume that because they're successful as an engineer they're going to make a great manager, I think that's a bit of a fallacy.
And that is surprisingly common. You know, in places that I've worked, I've found that people would be promoted purely based on numbers, their statistics as it were, rather than on whether they actually have the skills to succeed in a very different environment to the one that they've worked in. And as a result of that, I think there are a lot of people who just aren't really qualified for what they do. You see it all the time, you just see incompetence all the time, you know, and it's the rule rather than the exception. And I think that, yeah, the Peter Principle does have a large part to play in that.
I don't think it's the only factor, but I think it's, I do think it's, it's certainly
one of the factors.
Yeah.
Next one.
The Golden Hammer.
When someone, usually an intellectual who has gained a cultish following for popularizing
a concept becomes so drunk with power, he thinks he can apply that concept to everything.
Every mention of this concept should be accompanied by a picture of Nassim Talib.
I think I was a bit mean-spirited when I wrote that.
If I was to write that mega thread again today, I probably wouldn't go so hard on Taleb. I was thinking of Taleb when I wrote that because I basically had been following him on Twitter, and I just noticed that pretty much everything that he came across, every sort of idea that he was talking about, he would link back to one of the ideas he had written about before. So it would either be something about anti-fragility or the Lindy effect or black swans. And yeah, these are quite fundamental principles and they do have quite far-reaching implications, but it was pretty obvious to me that what he was doing
was he was essentially
selling his book by just sort of
making his explanation as wide as possible. So, you know, he was basically saying that, you know, if you want to understand A, read my book, if you want to understand B, read my book, if you want to understand C,
read my book. He was basically linking everything to his book. And I think that this is something that obviously is not just limited to Taleb. He was just the example that I thought of at the time. It could be anybody; I've found this is actually something that's quite common on Twitter.
When you see somebody who's just written a book and they've got a new concept, you know,
they will try to explain everything in terms of that concept.
And I think half of it is because they actually have spent
so long on this idea that it's kind of become an obsession.
And confirmation bias means that if they can see any sort of way that they can make their concept the explanation, they'll do that. That's half of it. I think the other half is that
it's probably a conscious decision to just sell that book. I think.
I think another bit of it's the in-group out-group dynamic as well, that by highlighting this is
the language of our people.
If you understand what I'm talking about, you're one of the initiates, if you don't understand
what I'm talking about, you're one of the heretics.
Absolutely, yeah.
This is something known as shibboleths. And I think this is quite common on social media as well. People form lingo, they form these concepts, these words, which signal to others of the same tribe that they are a member of the in-group; they're called shibboleths.
And they are actually a very, very sort of central part of tribal life, to have this kind
of code.
You know, everybody's got it. Like people on the right, for instance, use the word 'based' a lot, and then you've got people on the left who've got their whole academic sort of stuff about, you know, white fragility and all that kind of stuff.
So everybody's got their own sort of code sort of system, which they use to sort of reinforce
their in-group status and, you know, sort of, you know, reinforce these relationships they
have with other tribe members.
So yeah, I think that could be a part of it as well.
Right. Brandolini's law, aka the bullshit asymmetry principle: it takes a lot more energy to
refute bullshit than to produce it, hence the world is full of unrefuted bullshit.
Yeah. I think one of the reasons why social media is just so full of shit to put it bluntly, is because it doesn't
really take much time or effort to post something that's wrong. And if you think
about the kinds of people who don't think very much about what they post, they're
going to be able to post at a much faster rate than people who do think about what they post.
So because of that rule, most of Twitter,
mostly a Twitter timeline,
most of your Facebook sort of, you know,
timeline is going to be composed of people
who haven't thought through what they're actually saying
because obviously they can post at a much faster rate
and at a much greater rate than people
who think very carefully before they post. And as a result of that, it
creates the impression that people are actually stupider than they actually are. Like, I'm not
being funny, but when I first joined Twitter in 2014, one of the first thoughts that I
had was, my God, there's a lot of stupid people on this planet. You know, that was actually
one of my thoughts.
You know, I just couldn't believe it.
I'd never been exposed to so much stupidity
when I just looked at Twitter for the first time
and saw all of these idiotic comments.
I just couldn't believe it.
And part of that was because I'd fallen under the illusion of Brandolini's law.
I'd basically been given this false impression of people
by the overrepresentation of people
who don't think before they post.
There are a lot of very smart people out there, but they are very anxious and they tend to
really, really think very, very hard before they post.
And as a result, they don't post very often.
And so you don't see their tweets very often.
You see the stupid tweets a lot more often than you see the intelligent tweets. Have you had a look at the statistics around the percentage of people
who contribute the highest volume of social media posts on Twitter? Have you seen this?
It's like the Pareto principle on steroids. It's like 2% of Twitter users account for 90%
of the content that gets posted.
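(A toy model of the asymmetry being described here, with invented posting rates rather than real Twitter data: even when careless, quick-fire posters are a small minority, their higher posting rate lets them supply most of what a timeline shows you.)

```python
# Invented rates: a small minority posts without much thought, but far more often.
total_users = 5_000
careless_users = int(total_users * 0.02)   # the prolific 2%
careful_users = total_users - careless_users

careless_rate = 30.0   # posts per user per day (assumed)
careful_rate = 0.5     # posts per user per day (assumed)

careless_posts = careless_users * careless_rate
careful_posts = careful_users * careful_rate

share = careless_posts / (careless_posts + careful_posts)
print(f"Share of the timeline coming from the careless 2%: {share:.0%}")
# With these made-up rates the 2% already produce a majority of posts;
# widen the rate gap further and you approach the "2% post 90%" figure quoted above.
```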
That wouldn't surprise me.
I can believe that, I can believe that very easily.
When I first joined Twitter, I just followed anyone and everyone really.
There were just so many of the tweets that were just like, oh, I just made a bacon sandwich, it tasted very good, it's lovely. Just stuff like that. There are a small number of people who post stuff like that all the time and they're responsible
for just filling up Twitter with junk.
That's why you have to be very discriminating on Twitter; you have to be very careful about who you follow, and I think it's a good idea to block as well. I used to be against blocking because I thought it was unfair on people, because you're kind of dismissing them out of hand, but I've come to the conclusion that there are too many people on this earth for you to have any time for nonsense. So I would advise people to just block. If somebody's posts have very low information density, and when I say information density, what I mean is, in a single tweet, how much information are they actually giving you?
If they're not giving you very much information,
or if it's irrelevant information,
it's a good idea just to block them or mute them.
Probably mute them if they haven't been rude to you.
I usually only block people who are rude to me,
but I mute people quite often because I feel that,
what they're doing is they're muscling out other people
on your timeline who have more thoughtful thoughts.
So you know, it's very important, I think, to do this, because it's the difference between Twitter being a hellsite and Twitter being like a digital Disneyland, basically. That's the difference. If you're just very careful about how you curate your information, follow the right accounts, block and mute a lot, deliberately, then you'll find that Brandolini's law is not so influential on your life anymore, because the people you see will be thinking a lot more carefully before they post, and that will sort of offset the effects of the law. The best thing I ever did was limit myself to
99 people that I follow. That's it. It's the best thing that I ever did, and Twitter is such a beautiful place for me now.
I see articles I'm interested in.
I very rarely see retweets of stupid stuff.
And I've got in the back of my mind, I've got some sort of little limbic clicker of how
many strikes someone's on.
And then if it, if within the space of a month,
someone annoys me too many times,
it's like, all right, well, there we go.
There's a slot that's opened up. I don't know who I had to kick off in order to start following you, but whoever it was, unlucky.
Right, next one, the Tocqueville paradox.
Yeah.
Is that right, did I pronounce that right?
Yeah, Tocqueville paradox.
The Tocqueville paradox: as the living standards in a society rise,
the people's expectations of the society rise with it.
The rise in expectations eventually surpasses
the rise in the living standards,
inevitably resulting in disaffection
and sometimes in populist uprisings.
Yeah.
So this is a very important one, I think,
because this is a very powerful counter argument
against the people who see racism everywhere, who see, you know, bigotry and all that kind of stuff. You know, there's this whole concept on the left of how systemic racism has got worse, and basically, it's almost as if, if you were to believe what is posted,
you know, in the New York Times and stuff like that,
it would seem like there's an epidemic of racism
and misogyny and transphobia and, you know,
all these other phobias and isms.
But what's actually happened is that our conception
of these things has actually widened.
So, yeah, that's another one, concept creep.
That was a miracle.
Yeah.
So, it links in with the Tocqueville paradox quite well,
quite easily, because they're both
referring to the same general principle,
but in slightly different ways, so I'll probably just do both of them together.
So, concept creep is basically when your definition
of a certain concept expands when that
thing becomes less common.
So for instance, misogyny is an interesting one.
So the original sort of concept of misogyny was that it was sort of like a violent hatred
of all women basically.
It was this belief that women were inferior
and that, you know, they should give over their sexual and reproductive rights to men.
Female racism, almost.
Yeah, yeah, kind of, yeah.
It was bigotry directed at females.
So that was the original sort of concept of misogyny,
but as the world became more enlightened
and, you know, women's rights were put
on parity with men's and there was more equality, that old sense of the term no longer had
much relevance in the West.
I mean, it still has a lot of relevance in the East because there's still a lot of that
kind of misogyny in the East, but in the West there's not so much of it anymore.
So the Oxford dictionary actually changed the definition of misogyny: it went from sort of hatred of women to it could be dislike of individual women, or it could be dislike of particular groups of women. It didn't have to be, you know, just a sort of blinding hatred of all women. And gradually that word has been sort of appropriated. Now it's been reimagined again by the woke feminist movement, to the point where it can also basically mean hatred of a particular woman. You see this very often. I mean, Hillary Clinton weaponized this new definition of misogyny very, very cleverly in the election.
When she was criticized by people, she would basically say that it was because she was a woman.
She was seeking to be the first female president
and they didn't want the first female president
because they were misogynists and this kind of thing.
So it was politicized. And how this links in with the Tocqueville paradox is that
when you expand the definition of a term that thing becomes more common again. So if misogyny in
the traditional sense is decreasing and then you increase the scope of the word misogyny,
suddenly the instances of misogyny are going to increase again. So it's going to seem like misogyny is getting worse.
When, in fact, it's not actually getting worse,
the definition of it is expanding.
And this applies not just to misogyny or to words like racism.
I mean, racism is another one.
Racism originally was discriminating against people
on the basis of the color of their skin or hating people
because of the color of their skin.
Now there's all these different types of racism. Now, we've got systemic racism,
institutional, all that kind of stuff. They've expanded the definition of racism to include
things like microaggressions and all this sort of stuff. The numbers of absolute instances
of racism have actually decreased, but because racism now means so much more than it used to,
you can easily find more examples of racism now
than you would have been in the past.
And this also links in with the law of very large numbers: if you've got a wide enough data set, you can find any number of instances of racism, you can cherry-pick them and make it look like it's worse.
So this is a paradox because as the world gets better,
because of the change in these ideas, it can make it look like the world is actually getting worse.
And a lot of people do believe the world is getting worse.
And this is not just something that's confined to the left.
You've got the neo-reactionaries who are convinced that the world is heading to hell.
And they believe that the world is actually going to just fall apart.
So this kind of apocalyptic sort of belief system is something that's very common, and I think it is largely a result of the Tocqueville paradox, because the definitions of problems are widening. Poverty is another one. In absolute terms, poverty has been dwindling, it's been being reduced. I mean, a person living today, the poorest person in the West today, is better off than most nobles who were living in medieval times, because they've got access to iPhones,
even if they're poor, pretty much everybody's got a phone or a laptop and got access to
the internet. So in absolute terms, people are richer than
they were in those days. People have access to a wider range of foods now than they used
to have. So in absolute terms, we've gotten much, much better. But there's this thing called
relative poverty, which is how the UN measures poverty. It doesn't measure it in absolute terms. It measures it in relative terms. So relative terms is basically: how much poorer is the poorest person compared to the richest person? So yes, if you measure things like that, if you measure poverty by relative poverty, then...
That's wealth inequality. I'm not convinced that poverty fits that definition, but that's concept
creep again, right? Yeah, absolutely, exactly. So that's essentially what this sort of relative
poverty is. Relative poverty is wealth inequality. It's basically the gap between the richest and the poorest. And it's not absolute poverty, because there used to be absolute poverty, which was,
you know, if you're below a certain income per year,
regardless of how rich the richest person is, then you are in absolute poverty. But now it's done by different sort of criteria, because the world has just got so much better that you can't really measure things in absolute poverty anymore. So now we're thinking more about wealth inequality,
and so the Tocqueville paradox, basically, to sum it all up: it makes the world look like it's getting worse when in fact it's actually getting better. And that's a big cause of problems.
It's an interesting one to think about, that. Rightly, as society develops, to kind of steelman the side of the Tocquevillians, why you would say that that would
be a good idea is we now have more access to technology,
we have more wealth, we have more ability to actually deploy things to improve well-being
and happiness and fulfillment and flourishing and blah blah blah.
Therefore, the level of dexterity and resolution with which we should be looking for problems
has to become increasingly fine as well. There's no point saying, oh well, we've got rid of dysentery and fucking managed to get the M out of the measles, mumps, and rubella vaccine, therefore we've fixed all the problems. You'd say, no, no, there are more problems, continue going.
But you're right, it's not necessarily
about this linear progression.
It's about the way that the rules of the game continue to be played, that redefine it. It's impossible to compare the progress of today with the problems of yesterday when the rule set that was used yesterday and the one used today are now different.
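(A toy illustration of the absolute-versus-relative poverty point from a moment ago, using invented incomes: the same shift can show absolute poverty falling while relative poverty rises, depending on which definition you measure by.)

```python
# Invented incomes. Everyone in "now" is richer than their counterpart in "then",
# yet relative poverty can rise while absolute poverty falls.

then_incomes = [5, 8, 10, 12, 40]       # arbitrary units
now_incomes = [12, 14, 25, 30, 400]     # everyone better off, but more unequal

ABSOLUTE_LINE = 10                      # fixed income threshold (assumed)

def absolute_poverty_rate(incomes, line=ABSOLUTE_LINE):
    return sum(i < line for i in incomes) / len(incomes)

def relative_poverty_rate(incomes, fraction=0.6):
    # below 60% of median income, a common relative definition
    median = sorted(incomes)[len(incomes) // 2]
    return sum(i < fraction * median for i in incomes) / len(incomes)

for label, incomes in (("then", then_incomes), ("now", now_incomes)):
    print(label, absolute_poverty_rate(incomes), relative_poverty_rate(incomes))
# then: absolute 0.4, relative 0.2 -- now: absolute 0.0, relative 0.4
```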
Yeah, absolutely.
I completely agree.
I think, yeah, absolutely.
And I believe that wealth inequality
is generally a bad thing.
And I do want to see the people at the bottom lifted up.
Absolutely.
And I don't think that we should be content with where we are.
I think we should always be looking to improve the situation for people, no matter how well
we have improved it in the past.
But the true problem with the Tocqueville paradox is not that it stops
us trying to make the world a better place.
The problem is that it creates
this sense of pessimism which can manifest in dangerous ideologies. So, you know, you can have this sort of leftist view; for instance, you know, the Black Lives Matter riots last year, they caused a lot of damage. And, you know, I can understand why people are angry. I mean, George Floyd should not have been killed the way he was; he shouldn't have been killed at all, but it was particularly egregious the way he died.
And, you know, so I can understand that it made people angry, but if people had actually looked at the facts, they would see that violence by the cops against members of the black community has actually decreased massively. And Coleman Hughes, who is a great writer on race and stuff, he wrote about this thing called the racism treadmill, which is that no matter how good race relations become, there will always be people who will say that it's not enough and that more needs to be done.
And it's not that they're saying things just need to improve, because that would be fine; of course race relations need to improve. What they're saying is that race relations haven't moved at all, basically. They will always say that race relations are still as bad as they were in the 1950s or in the 1900s, when it's self-evidently not true. There have been huge civil rights movements, which have given black people the right to go to school with white people. All these things have happened in the past 100 years. It's self-evidently false. Yet these people still believe that race relations either haven't moved at all since those times or that they've actually got worse.
And this is, you know, largely due to the Tocqueville paradox. People just can't see the advances, because they've been fooled by language and they've been fooled by ideas, and they've been fooled by a lot of the ideology pushed by the New York Times and these kinds of publications, which are always pushing this sort of narrative that the world is getting more racist.
The racism treadmill, I like that. I haven't spent much time watching Coleman Hughes' stuff. I've got John McWhorter on in a couple of weeks, and he's got a new book out called Woke Racism, which I haven't read yet, but he'll be interesting to speak to.
Right, next one.
This is one of my favorites,
because I adore the blog post that this came from,
the Toxoplasma of Rage,
the ideas that spread most are not those everyone agrees with, but those that divide people most.
Because people see them as causes to attack or defend
in order to signal their commitment to a tribe,
and this is Scott Alexander
from what used to be Slate Star Codex
and is now Astral Codex Ten.
Yep, absolutely.
Yeah, he's a great writer.
So yeah, this is a very important one also.
Because there's this sort of naive view
that some people have, which is that people
just want to know what's true.
And if you just only give people the truth,
then everybody will be enlightened.
And the world will be sort of rosy
and everything will be happily ever after kind of thing.
But that's not how the human brain works.
The human brain is not actually that interested in truth. The human brain is interested in creating a sort of image of reality that is in line with evolution, basically. If you look at humans, we have been around for about 200,000 years, and approximately 180,000 of those 200,000 years, so approximately 90% of human evolution, has been spent in hunter-gatherer lifestyles in tribal societies. And as a result of that,
we have tribal mentalities. So, we tend to play status games within tribes and we tend to have these kind of
internecine struggles, and we also are sort of very hostile to people who are not of our tribe.
And when it comes to information, we often try to use that information in ways that will
benefit our tribe, which will benefit our status within that tribe.
And so this is one of the core things behind our epistemology. If you've got a set of information and you give that information to people, they're not going to process it as in, is this true or is this false? They're going to process it largely as, does this hurt or does this help my tribe?
And this really explains a lot of the sort of polarisation that we're seeing
on the internet and social media specifically because
essentially what the culture war is, is a sort of relic, a vestige of our tribal struggles. We have essentially been re-primitivized by technology, because we've basically reverted back to this kind of caveman tribal ideology because of the way that Twitter and Facebook and stuff are basically arranged.
So we form these communities online,
which are essentially tribes.
And we do not look at things as true or false.
We look at them as how is this gonna benefit my tribe?
How is it not gonna benefit my tribe?
And from that, from that simple fact,
you have polarization, you have misinformation, you have all the big
problems that we are facing today as a result of this one thing. People will tend to share
information. I've seen this happen so often that it's just, it doesn't even register anymore,
where people will know something is not completely accurate, but they'll post it anyway because
it either demeans the enemy tribe or it
makes their own tribe look good. And I mean, there's plenty of words for it. There's one thing called
nut picking. Nut picking is when you take the most extreme examples of the opposing tribe,
and then you use that to demonize the entire enemy tribe. So if you're on the left and you want to sort of demonize the right, what you would do is you would go to someone like Stefan Molyneux or a Richard Spencer sort of character, someone who's unpopular with most people, someone who's very, very fringe. And so, you know, then they would say, oh, look, this person
is, this person's a racist and he's a member
of this tribe, so they're all racists, you know. And likewise, if you're on the right,
you'll get the most woke person that you can find, you know, somebody who basically thinks
that math is racist or something like that. And then you'd say, oh, look, look at this
person, you know, they're so woke, this is what the woke believe, they believe that math is racist.
And so, you know, it's, the people who post these things, they know that they're just,
they know what they're doing, they know that this is not a representative example, you know,
it's just, they're taking an extreme and they're using that to sort of demonize the entire
tribe.
And straw-manning is basically the same sort of concept, you know, where you take
what somebody said
and then you just interpret it in the worst possible way.
And people often do this dishonestly.
They don't do it, sort of unwittingly.
A lot of people do this dishonestly.
So, you know, somebody says, you know,
oh, if they have concerns about immigration for instance,
you know, then you say, oh, so what you're saying is that you hate all immigrants, so you're a racist, you don't like brown people, basically. So they'll take the worst interpretation of what somebody has said and then they'll use that to demonize a tribe. So this basically sums up pretty much 99% of the culture war, basically.
There's an article by Erik Torenberg that I read a couple of weeks ago, and he talks about status games
and some of the idea pathogens
that we're sort of seeing at the moment.
And one of the things that he highlighted
was that some of the crazy ideas
that people share from their own tribe,
the ones that they know to be false,
they're almost commitment devices.
So they're seen as commitment devices
by their own side.
Look, we all kind of know that this thing is a bit mental, but you need to
posit your position alongside us as a show of faith. And if you don't, it's a
canary in the coal mine that you might not be someone that is aligned with our
interests. And that's really, really fucking interesting, that this is some sort of like hazing, initiation,
type, fealty ceremony to check whether or not everyone's on side.
An absurd ideological belief is actually a form of tribal signaling.
It signifies that one's ideology is more important to them than truth, than sanity, than reason itself.
To one's allies, this is an oath of sort of unwavering loyalty.
To one's enemies, it is a threat display basically.
So it's not always about what's true.
It's often about how does this make me look to my tribal sort of compatriots
and to my enemies. And I think that really does explain a lot of the culture war. People are not saying what they think is true, they're saying what is going to sort of favor them with their tribe.
They're saying what they think is effective. Yeah. All right, next one. Bulverism: instead of assessing what a
debate opponent has said on its own merits, we assume they're wrong and then try to retroactively
justify our assumption, usually by appealing to the person's character or motives, explains 99%
of Twitter debates.
Yeah, and this also links in with what we were just talking about. People are not configured for truth, they're configured for these tribal games.
So if you're debating somebody who's of a different tribe, it makes sense to just assume they're wrong.
Rather than have to do the hard work of actually analyzing, are they really, you know, telling the truth? It's just much easier and much better just to assume that they're wrong and then to work your way back from there.
And I mean, I think everybody's guilty of this.
I'm guilty of it myself, you know, sometimes if I'm arguing with somebody that I just know
that I'm going to disagree with, I will just, I won't really pay too much attention to
what they're saying.
I'll just look at the keywords that they're using and make a value judgment from there.
Tell you what's the perfect example of this that you see from the right.
the right. As soon as someone criticizes them, they'll go onto their profile and if they've
got their pronouns in their bio, they'll just reply with 'pronouns in bio, case closed.' You know, that is precisely dealing with the person, not the argument.
Yeah, and I think a lot of it has to do with the format of Twitter as well.
I think you can't really have a decent debate on Twitter.
It's just not possible.
I've made this comparison where I've said that trying to have a debate on Twitter
is like trying to have a sword fight in a phone booth.
It's just not possible because you don't have the space.
Unwieldy.
Yeah, it's unwieldy. You don't have space to really
explore ideas. And so people will tend to just take shortcuts because they don't want to argue.
I mean, it's a laborious thing to actually get into a Twitter argument with somebody. I don't do it.
Because it's just a waste of time, most of the time, because
you're not going to change their beliefs, they're not going to probably change yours.
Yeah. And it's just going to change their beliefs, they're not going to probably change yours. Yeah.
And it's just going to devolve into insults, and just heat rather than light.
And you know, what's the point in it when you could be spending that time doing something
more productive?
One of my favorite Twitter followers, Adam tweeted this earlier on, Twitter is too short
for specifics.
Generalizations have to suffice.
Midwits can't abide generalizations.
They'll point out every exception and demand you write a full thread. Then they refuse to read the threads. Don't tweet for midwits.
They are not your audience. Yeah, it's true. I mean, I have to deal with this all the
time. Every time I tweet, I have to sort of make a compromise between accuracy and pithiness.
And the thing is, I like concision. I like to say witty one-liners on Twitter. But the thing is, if you say witty one-liners you have to omit a lot of context. And the lack of context is where the people are going to be coming up in your mentions, and they're going to be accusing you of, you know, lacking context, of generalizing.
The best reply to one of those pithy little statements is, well, like not everyone.
And you're like, well, yeah, obviously not everyone.
I wasn't trying to be like exhaustive with this little aphorism that I've come up with
that rhymes, so I think it's cute.
Like just leave me, allow it.
Yeah, exactly.
Yeah. Yeah, I mean, Twitter is really just, it's for very, very brief general maxims about the
world.
It's not really for anything beyond that.
So, aphorism circle jerk, isn't it?
All right, next one.
This is one of my favorites.
I really want to hear your thoughts on this.
Goodhart's law.
When a measure becomes a goal, it ceases to be a measure.
For example, British colonists tried to control snakes in India,
they measured progress by the number of snakes that were killed,
offering money for snake corpses,
people responded by breeding snakes and killing them.
Yeah. Yeah, this is a very interesting one.
So when you have a system, and you try to optimize that system by reference to a single metric, what normally happens is that that metric will be gamed.
History has shown that human beings will always game a system if they can do so.
It's sort of like part of the human make up to try to do that.
We always try to find loopholes. We always try to, you know,
try to get sort of underneath the sort of defense and try and get to the other side. We try to,
we try to always find our way around things. And basically, this is why it's a bad idea to try to use metrics for anything, because they will always be gamed; you can always find a way to manipulate a metric. For instance, and this again goes back to what we were
talking about with poverty and relative poverty. If you measure things in terms of poverty and absolute
poverty, people will change the sort of definition of it or they will manipulate it in such a way as to make it look like something
it's not. So for instance, if you are measuring absolute poverty and you are measuring it via salary
or by a yearly income, which is how it was measured, what people will do is they'll misreport
their salaries because they want their area, they want themselves, to seem like they're in greater poverty than they actually are, in order to get more help from people.
This is a bit of a vague example.
This is one that you can find in any...
Dude, an example that I really love using is email capture.
So let's say that you're a content creator
that wants to start building up your email list
and your goal, the outcome that you're looking for is emails. How many emails can I capture?
So what you say is, this ebook contains a winning lottery combination formula that's guaranteed to make you
£1 million blah blah blah and you post it everywhere and you get tons and tons of emails but when people open the PDF it's just blank, there's nothing in it. It's like,
okay, so you've gamed the outcome. The outcome was get emails, but really what was the
distillation of the outcome, the outcome was get access to people who genuinely want
to hear what I have to say in a good faith way that makes them continue to want to hear
what I have to say. So by optimizing
for that particular outcome, you've actually missed off the thing that you were there to
get.
Yeah, absolutely. And I mean, it works in so many different ways. I mean, again, you see
this on Twitter. You know, if you try to measure a person's credibility via the number of
followers that they've got, for instance, you know, what happens?
Kanye West has crushed it, yeah, exactly.
Yeah, people game the system. I mean, what will happen is that people will use underhand
tactics to get followers. So what you actually get is you get the most untrustworthy people
having the most followers. So it's actually, it's an inversion of the original sort of
concept, you know, this happens, I mean, with everything, you know, with everything. It's basically the rule rather than the exception.
People will always try to game a system and laws are created in such a way as to try to
minimise this. That's why you have all these extra additions added to laws. Whenever a loophole
is found, a new law will be created to basically close that loophole.
And it's a constant game between people trying to create the rules for a system
and people trying to game the system. That is the essence of civilized society.
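(A toy model of the snake-bounty example from the tweet, with invented numbers: the measured metric, corpses paid for, keeps climbing, while the thing it was supposed to track ends up worse once people start breeding snakes for the bounty.)

```python
# Invented numbers, modelled loosely on the colonial snake-bounty story.
wild_snakes = 10_000
corpses_rewarded = 0
farmed_per_month = 500          # snakes bred purely to claim the bounty (assumed)

for month in range(12):
    genuinely_caught = int(wild_snakes * 0.05)    # real wild catches
    wild_snakes -= genuinely_caught
    corpses_rewarded += genuinely_caught + farmed_per_month

# When the bounty is scrapped, the now-worthless breeding stock gets released.
wild_snakes += 12 * farmed_per_month

print(f"Corpses paid for (the metric):    {corpses_rewarded}")
print(f"Wild snakes remaining (the goal): {wild_snakes}")   # ends above the starting 10,000
```

The point of the sketch is just the divergence between the two printed numbers: the metric looks great, while the goal it was standing in for ends up worse than when the scheme started.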
The Messiah Effect: most people don't believe in ideals, but in people who believe in ideals, hence why successful religions tend to have human prophets or messiahs,
and why when a demagogue changes his beliefs, the beliefs of his followers often change accordingly.
Yeah, so this one, I wasn't really sure about including, because this is completely my own invention. That's what basically separates this one from the others. But because I included one of my own in the first thread, I thought, okay, I'll include one of my own in this thread. It's still a nascent one, which is why the Messiah Effect, in fact, is not a great name; it's just a spur-of-the-moment name. But it's something that I do feel is generally true. I don't think it's true in every instance, again, but it's something that I feel is a general truth, and this is from my observations of people. I think, generally, if you look at people during election time, what they will normally do is express their sentiments towards a certain person, a certain politician, rather than towards an ideology. If you look at Donald Trump,
for instance, if you ask a Trump supporter what they like about Trump's policies, most
of them wouldn't really be able to give you a very good overview of his policies. They
would generally just say that I like him because I think he tells it like it is,
he doesn't care about what the establishment thinks.
He just basically does what he wants to do,
he's independent and all this kind of stuff.
So they would generally fire off the qualities of the person rather than the policies that he espouses.
And I think that one of the reasons for this
is that policies are quite hard to understand.
And they take a lot of sort of time to really get to grips with. You have, in order to understand
Trump's policies, you need to understand how economics works, you need to understand how the
political system works, you need to understand how business and corporations and things work. So there are a lot of things you have to understand in order to just understand the policies.
Whereas if you look at Trump himself,
he's quite easy to read.
All you've got to do is just watch him on TV.
And you can see aspects of his personality
very, very quickly.
You don't need to have any other knowledge.
So it's a shortcut, basically.
It's a shortcut.
You trust people rather than ideas. And it makes it much easier on our brain because
then we can just, we can just delegate all the responsibilities to that person rather
than have to think for ourselves.
It's like there is distillation of what they represent.
Yeah, absolutely.
Yeah, yeah, yeah.
We tend to think of human beings in terms of archetypes, I think, and when we see somebody
we think of what they represent as a symbol.
We see people as symbols, in a sense.
And for a lot of people, Trump symbolized this kind of anti-establishment
sort of independence, basically.
He was essentially like a wrecking ball that was going to, you know, just tear apart the sort of polite society that governed before him. And then obviously to people who don't like Trump, he's just this, you know, crude buffoon who is just an idiot, basically, doesn't really know anything, and he's a pathological liar. And for them, that was sufficient.
Do you know another reason why I think this might be the case? And perhaps it's a weakness or a vulnerability of the 21st century.
I don't know many people that genuinely love a thing.
I know people that love people, but I don't know many people that are actually really passionate.
So if you ask someone, dude, what do you really care about?
What are you really passionate about? What do you love in life? And...
So you'd say that family wouldn't count...
Correct. Yeah, they don't tend to say a thing.
So when you see someone who is outright caring... this is from Eliezer Yudkowsky, actually. He highlighted that most people take the piss out of the rationalist movement, not because of the rationalist movement itself, but because not many people love anything as much as rationalists love the rationalist movement. He says it's just an outlier effect to see people that
care a lot about a thing. And when you finally do, I think the presumption from midwits is
they know something. No one would be this bought into any idea if there wasn't tons and tons and tons of virtue behind it, because, I think I'm smart and I'm not bought into anything as much as this guy. So I'm going to put my colors onto this person's flagpole. I'm going to hold on to those coattails, because that's the person that's going to carry
us forward.
Yeah, absolutely.
Yeah, we generally do have more of an affinity to people than to ideas
or to things. And I think part of it is also the same reason why we like watching Hollywood
actors play parts and stuff, is that we get the archetypes from these people. You know,
we basically see the way that a certain person behaves. And if we like the way that they behave, we try to emulate them.
We see them as a model for our own behavior.
And through that, you know, we sort of, it's like a way that we've kind of tried
to improve ourselves by emulating other people.
And if we see somebody that we like, we think, oh, I want to be more like this person,
you know, you can't really emulate an idea.
You can't emulate a thing, but you can emulate a person. And because we are sort of mimetic human beings, you know, we
learn by copying others, it's only natural that we're going to sort of migrate
and sort of, you know, navigate towards people rather than towards ideas.
Reactance theory: when someone is restricted from expressing a point of view, or pressured to adopt a different point of view, they usually react by believing their original point of view even more. And then you also
wrote about why the best cure for fake news is more fake news as well, which is related to this, I think. Yeah, reactance theory is one of the key arguments, in my view, against censorship. Because what history has shown, and by history,
I mean just the past couple of years, what that has shown is that if you try to stop people from
believing something, they're going to dig their heels in harder because Dostoyevsky actually made
this point a long, long time ago, where he basically said that what people want is not truth. What
people want is they want to exercise their free will. They want to, they want to basically
exercise their free will. And if that means going against reason, they will choose freedom
over reason. So, and he said this in a book called Notes from the Underground. Pretty good book.
And basically, this is the idea behind Reactance Theory,
which is that if you tell people they can't do something,
they're gonna feel like they're under threat.
They're gonna feel that their freedom is being
sort of reined in, and they want to exercise that freedom
in order to sort of feel that they're not,
in order to sort of assuage this feeling of claustrophobia.
And the way that they do that is by believing what they believe even harder
and reacting even more against, you know, what is trying to sort of stifle them.
It's almost like if you look at the sort of the story of the Garden of Eden,
when, you know, God tells Adam and Eve that they can't
eat from the apple.
And that makes them even more curious about the apple.
And they're like, okay, he told us not to eat the apple.
Why doesn't he want us to eat it?
Now, I really want to eat the apple.
So they become even more sort of adamant to eat the apple.
So this is a very fundamental idea, and it's one that you see all the time,
if you look at the way that censorship works now,
I mean, most of the censorship that occurs
is usually targeting people who are pushing
what are regarded as conspiracy theories,
things like QAnon and the anti-vaxxers
and those kinds of people.
And what these people generally do when they see their posts
being deleted from Twitter is they're not gonna say,
oh, okay, they deleted it.
So that means it's wrong.
So I'm not gonna believe that anymore.
They're gonna do the exact opposite.
They're gonna say, okay, these guys don't want truth
to come out.
So they're basically, they're censoring us because they are afraid of what we have to say.
So that means that we're right. That's basically that validates what we're trying to say.
Because if what we were saying wasn't dangerous, if it wasn't true, they wouldn't bother,
they would just point out why we were wrong. They wouldn't bother trying to censor. So
it basically, it's counterproductive. And you see it with what happened with Parley.
You know, Parler, the social media app, I think it's pronounced "parlay". Yeah, it's spelled
Parler, but I think it's pronounced "parlay". At least that's what I heard earlier. Yeah,
I mean, I only knew this, I only found this out yesterday, in fact. It's called fucking nothing now,
because God knows where it is.
Well, it's back online, it's actually back online again now.
Yeah, it's on a different server.
But what happened, so I'll just sort of,
quickly, briefly sort of delineate what happened.
So after the January 6th Capitol riot,
there was a lot of pressure on the tech giants by the
Biden administration to essentially rein in these kinds of QAnon conspiracy theories.
And basically, what Amazon did is Amazon figured that one of the websites using its platform, its web server, this social
media site called Parler, was basically key to organizing the riots.
So they essentially deleted the website from their platform.
And in their view, this was a good thing because this would ensure that people wouldn't be
able to coordinate any further sort of riots. But it actually had no effect, because everybody just went to alternate
platforms, platforms such as GAB, and they basically just did what they were doing before. But now
they were extra angry because now they thought this is actually a confirmation of
everything that we have been saying, that they have actually, you know, tried to silence us, tried to silence us en masse.
And so it's clear that there's some kind of conspiracy. And this is the thing about conspiracy theories: if you try
to conspire against the conspiracy theory, it becomes more evidence for the conspiracy.
And that's what essentially is happening
with these kinds of, with censorship.
It appears to the conspiracy theorists
that censorship is a conspiracy,
that it's part of the conspiracy essentially.
In that sense, conspiracy theories are antifragile.
You cannot defeat a conspiracy theory by conspiracy,
because that just becomes part of the conspiracy that they believe in.
So it sounds very obvious,
but this is something that a lot of people who work for these fact-checking organizations
and these Silicon Valley sort of censors,
a lot of these people don't really realize this.
And this is actually one of the problems
that I see with this, or partially a counterpoint to it,
is that taking out individual speakers
that hold keystone positions can be effective.
So I know that Alex Jones has a particularly large audience,
but his audience isn't going to grow.
Like the only way that InfoWars.com now increases in size
is by word of mouth.
He can't advertise, he's not got access on social media.
You know, Donald Trump now when you see him on the internet,
you're like, oh, look, it's Donald Trump.
As opposed to, oh, look, it's Donald Trump.
Like, it's a surprise rather than something that's obvious.
And the example that I use for this is Milo Yiannopoulos.
Like, Milo got totally unpersoned from the internet.
And where the fuck has he gone?
Now, he may be an easy example because he's just gone mental.
But in some situations, it might work.
Yeah, I think Milo was a very peculiar case,
because he didn't just get cancelled by the establishment. He actually got cancelled by his own peers. He got cancelled
by the right. Because obviously, you know, he did the unthinkable, you know, he basically
tried to create an excuse for pedophilia. And so obviously, I don't think anybody could
recover from that, no matter how far to the right you are, no matter how pro-free speech
you are, you know, you're not going to get a return from that.
Richard Spencer is not coming to save you there, no.
Luckily, exactly.
So I think that was a slightly different.
But I understand what you're saying.
I do think that it can work sometimes.
And I do think your example of Alex Jones
is a particularly important one.
Because I mean, I've got a lot of criticism
for writing a Quillette article in which,
although I reiterated that I'm basically pro-free speech, I said
that it was actually, that Alex Jones was a threat to free speech, because, and the reason
for that, is because what he was doing is he was essentially, he was accusing people of
being child molesters with no evidence, And this guy has got millions of followers,
and a lot of those followers have got guns, and they don't take kindly to hearing about pedophiles
in their vicinity. And this obviously manifested with Comet Ping Pong being shot up, you know,
luckily nobody was killed. But, you know, when you've got a guy who's just basically
creating these sort of conspiracy theories and accusing random innocent people of being pedophiles,
that's actually pretty bad
because that creates violence in the real world.
And that's not good because that intimidates people.
It prevents people from wanting to speak out
against Alex Jones because if they say something
against Alex Jones, they might be sort of, you know,
assumed to be pedophiles too.
And then people come after them. So I felt that Alex Jones, although
he always talks about freedom of speech and all this stuff, I felt that his actions
in that respect were actually detrimental to freedom of speech. But now the thing is,
yes, Alex Jones was, he was taken off Twitter and it had a very small effect. Not a huge one because he's still going viral.
He went viral very recently on YouTube, despite being banned from YouTube.
He went viral on Twitter, despite being banned from Twitter.
So it doesn't actually have a massive effect.
It does have a small effect.
But the thing is, is that this is all going to come to an end very soon.
It's no longer going to be possible for people to be cancelled by the establishment
within the next couple of years.
And the reason?
Oh, decentralization.
Bingo, yeah, you got it.
Yeah, that's exactly what's happening.
So we've got Web 3 coming along.
And because that works on the blockchain,
primarily on Ethereum, there's no regulatory body, there's no sort of centralized node in that network.
So there's no middleman basically, nobody has to get their information going through a router in order to get it to somebody else.
It goes directly from one peer to another. And because of that, the old systems are not gonna work
very well at sort of regulating
what people can and can't say.
You know, you've got, for instance,
in the web two system, you've got things like Patreon.
Patreon can take your livelihood away from you
if it thinks that you've said something that it doesn't like
because in order to get payments,
you have to go through Patreon. You can't just get payments directly from your users,
at least you couldn't until Web 3.
So that was one way that they would leverage their power against people in order to
stifle free speech.
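[To make the "payments go directly from one peer to another" point concrete, here's a minimal sketch of a direct on-chain transfer using the web3.py library. The RPC endpoint, addresses, and private key below are placeholders, not anything from the conversation, and attribute names differ slightly between web3.py versions.]

```python
# Minimal sketch: a direct peer-to-peer ETH transfer with no platform in between.
# Assumes web3.py is installed; the RPC URL, addresses, and key are placeholders.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://YOUR_ETHEREUM_RPC_ENDPOINT"))  # any Ethereum node or provider

sender = "0xYourAddress"          # placeholder
recipient = "0xCreatorAddress"    # placeholder: the person you're paying, directly
private_key = "0xYourPrivateKey"  # placeholder: never hard-code a real key

tx = {
    "to": recipient,
    "value": w3.to_wei(0.01, "ether"),            # amount to send
    "nonce": w3.eth.get_transaction_count(sender),
    "gas": 21_000,                                # standard gas for a plain ETH transfer
    "gasPrice": w3.eth.gas_price,
    "chainId": w3.eth.chain_id,
}

signed = w3.eth.account.sign_transaction(tx, private_key)
tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)  # .rawTransaction in older web3.py
print("sent:", tx_hash.hex())
```

[Nothing in that flow asks a platform for permission, which is the property being described here.]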
Another way would be like Facebook: they could just completely delete your account
because your entire online personality
is dependent on Facebook's platform.
So you can't really,
if you want to continue to have business ties,
if you want to continue to have relationships online,
you have to do what Facebook tells you to.
But this is all going to come to an end.
For most people, not for everybody, but it's going to come
to an end for a large proportion of people with Web 3. Because with Web 3, you're going
to have all of this payments processing, all this kind of stuff, it's all going to be
done on the blockchain. And so because it's trustless, because it's permissionless, there's no single person
or entity or organization that has the leverage to stop you from saying
what you want. And that's a great thing. But the thing is, not everybody is going
to be able to use Web3. Because it's going to require, at least for the foreseeable future,
it's going to require a little bit of knowledge of blockchains and things like that. So I'm a bit worried actually with
the way that things are currently going with misinformation and censorship because there's
actually, there's going to be the older generations like the boomers and stuff who are going
to continue to use Web 2 and then there's going to be the younger, more tech-savvy people who are going to use Web3. And so the people who are using Web2, they're going to still be
restrained by these systems. And as a result of that, there's going to be a kind of disparity
between these two classes. I mean, one of the reasons, you know, there are plenty of ethical arguments
against censorship, and I'm sure your view is a public hit them already, so I won't go into them.
But there are also functional arguments against censorship.
And there are three, there are in fact three functional arguments against censorship.
The first one is that censorship doesn't really work because fact checkers are not very good at their jobs. Pretty
much all the fact checkers that work in Silicon Valley, they have
to be accredited by something called the International Fact Checking
Network, which runs out of the Poynter Institute in Florida. And this is a very liberal
organization, like most organizations of its kind. It leans very heavily liberal, in fact. It works very
closely with the Southern Poverty Law Center, which you might know has a tendency to just call
everybody far right. I mean, it was sued by Maajid Nawaz for calling him far right, and
it was forced to pay him $3.5 million and apologise publicly for that.
So, I mean, these guys just have a tendency to dismiss anything that doesn't agree with them as far right.
And these are the people who compiled a database for the International Fact Checking Network of Fake News websites,
for the fact-checkers to use.
This database had to be taken down after it was revealed that it consisted pretty much
exclusively of conservative news websites and it was written by a podcast producer for the
Southern Poverty Law Center. So it's a pretty messed-up system. All of these people,
basically, are the guys who work at Silicon Valley. Silicon Valley's fact checkers are overwhelmingly ultra liberal;
they regard anything to the right as fake news, and they don't
fact-check the left very often. So it's got a very strong ideological slant
and because of that, you see them getting things wrong, like,
for instance, the lab leak hypothesis. They completely got that wrong.
They thought it was a conspiracy theory.
It's very clear that there's more evidence now
for the lab leak hypothesis than there is
for a natural origins hypothesis.
And they got this wrong because they have this liberal mindset.
They assume that Donald Trump was racist
and anybody who believed in the lab leak hypothesis was racist,
which makes no sense because the alternative explanation,
which is that it came from Chinese dietary habits, is more racist. That's a lot more racist
than believing it came from a lab. And yet they somehow managed to, because Trump said it,
it was, by virtue of that, racist. So that's the first problem with the censorship: you
can't actually determine what's fake news or not, because it's very, very heavily ideological.
The second problem with it is that it interferes with our natural adaptive processes.
So you can't get rid of all the misinformation in the world.
You can only get rid of a small proportion of it and some of the time.
But what you're doing is you're making people rely on you to tell them what is true and what is false.
When you choose what people can and can't see, and when you put these kind of nutritional labels on posts to say,
oh, this has been fact-checked and proven to be false, what you're doing is you're not allowing people to,
if you're not exposing people to lies, then you're not giving them the experience that they need in order to work out what's
true and what's false for themselves. You're basically spoon feeding them what's true
and what's false. And so you're making people reliant on your system to tell
them what is true and what is false. And that is a bad thing because if you do that over
a long period of time, eventually people are going to grow dependent on that. They're
going to grow dependent on organizations like Facebook to tell them what is true and what is false and that's very, very dangerous. People should
always try to work out for themselves what is true and what is false. So that's the second
problem with censorship. And then obviously the third problem is that Web 3 is going to
create this division. It's going to split the Web in two basically, where you're going
to have the old Web 2 users who are being spoon-fed,
they're going to be completely reliant on these centralized structures to tell them what's true
and then you're going to have the web three people who are going to be learning for themselves what's true and what's false
and this is going to i mean this might be a bit of a you know crazy thing to say but
you might know the story The Time Machine by H.G. Wells,
and you've got the Morlocks and the Eloi. So basically, in this story, this guy goes into the far
future, and in the far future, human beings have divided into two different
subspecies. You've got the Eloi, who basically live lives of luxury
and they don't have to worry about anything.
They're basically pampered and they have everything done for them.
As a result of that, they grow very, very stupid.
They never have to use their brains, so their brains atrophy.
They never have to use their bodies, so their bodies atrophy.
So they become very weak and very stupid and very naive and sort of, you know, they
believe the best in everything.
And then underground, you have the Morlocks, who do all the work, they do all the toil,
all the industrial toil, they live lives of hardship.
And as a result of that, their brains become overdeveloped and their
muscles overdeveloped.
They basically become the opposite of the Eloi.
And so, you know, in the end, I mean, it's sort of implied that the Eloi will be destroyed by the Morlocks.
I don't think it would go quite so far, but essentially, what would happen if we retain this kind of centralized structure on top of the decentralized structure?
What's going to happen is people will be divided into two and you'll have these kind of people who are using web 2 who will be kind of
Eloi, and then you'll have people who are using Web 3 who become kind of more like Morlocks. So the people who are using Web 2 will be
spoon-fed by these
organizations, you know, they will essentially be brainwashed, told what's true and what's false.
And then you'll have people who are sort of using stuff themselves, you know,
finding out things for themselves, and they will have an advantage over the people who are using
Web 2. So it's a bit of a dangerous situation when you try to regulate what people can
and can't say, because in the long term, it's going to create disparities. You're going
to create people who are completely reliant on others to tell them what's true and what's false.
You're creating an entire class of people who are essentially sheep.
You get a Matthew principle with this as well, right? You're going to get the people who
have the ability to understand technology and utilize the way that Web 3 works and can fact-check
more effectively themselves... you are going to end up with a
bifurcated culture. If Web 3 takes off in the way that some people think it might
do, you are going to end up with one group of people speaking one type of
language, having a group of commonly held cultural assumptions, and then
another group of people who have the old ones, the ones that have just been left behind.
Yeah, absolutely. I mean, and you know, it sounds kind of far fetched, but the thing is,
it's already happening, you know, it's already happening, it's not something that we have to wait
and see. If you look at, for instance, the way that a sort of schism has occurred between liberals and conservatives: the liberals are not Eloi, you know, they're
not Eloi, and the conservatives are not Morlocks. But the thing is that the liberals tend to
get their information from the establishment because the establishment is liberal.
And then the conservatives are obviously getting their
information from alternative media because the alternative media is now mainly conservative
or at least, you know, right leaning might be libertarian.
So there's this kind of, there's two separate sources
of information now in the world.
And these two groups are gradually diverging
as they sort of become trapped in their own echo chambers.
And we saw it again, going back to the lab leak hypothesis.
You had the sort of the liberal establishment, which
was firmly adamant that the lab leak hypothesis
was a racist conspiracy theory.
And then you had the sort of right wing news media
and the right wing sort of information environment,
which believed that, you know,
it was not just a sort of leak from a lab, many of them believed it was a bioweapon,
and you know, there was all these other crazy theories coming out of that situation. So,
it's already beginning, and I fear that, if Web 3 is not sort of homogenous, if it doesn't sort of draw everyone in
and become truly sort of egalitarian in its functionality,
then it's gonna sort of exacerbate these divisions
because you're gonna have these big monolithic structures
in Silicon Valley regulating Web 2.
Obviously they will, they will integrate Web 3 functionality,
but the very nature of Web 3
means that Facebook is going to have very limited power in a fully Web 3 environment. So they're going
to try to retain their sort of power by limiting functionality. And people who are accustomed to
using these big monolithic structures won't want to leave them, so they'll retain, you know,
they'll retain, they'll retain a large audience,
even though there will be some people who will migrate
to mesh networks and the like.
But yeah, it's a very precarious situation.
And censorship is not the way to sort of try
to get people to believe truth because in the end,
all it's going to do is
make people rely on others to tell them what's true, which in the long term is going to
make them completely, you know, delusional. Speaking of that, next one: Hitchens's razor. What can be asserted
without evidence can be dismissed without evidence. If you make a claim, it's up to you to prove it, not me to disprove it.
Yeah. So, this actually has a precursor.
It was, I mean, it's famously attributed to Hitchens,
Christopher Hitchens. A lot of people love Christopher Hitchens because he's a great speaker and he's done a lot of great YouTube videos and that's how most people get to know him.
But this was actually originally thought of by a guy called Bertrand Russell, who's a
great philosopher who was around sort of the first half of the 20th century.
And he had a concept called Russell's teapot, which was basically that imagine there's a
teapot which is sort of orbiting the moon right now.
You can't prove that there is no teapot there.
And so because you can't prove that there's no teapot there, if the onus is on you to
disprove it, then you have to believe in pretty much everything.
You have to believe in Santa Claus because you can't disprove Santa Claus.
You have to believe in Zeus, you have to believe in Poseidon, you have to believe in Shiva, you know, you have to believe in all these imaginary beings
because you can't disprove their existence. So the only way that you can actually get through life
is by assuming that what has been asserted without evidence can be dismissed without evidence,
because if you don't do that, then you have to believe everything, essentially. So it's basically a sort of, it's a system
which ensures that your mind is occupied by the minimum amount of bullshit. Basically,
it's a bullshit-filtering heuristic. That's the best way to think of it.
So few people use this, though, because a lot of the time we have these assertions that
occur online and people posit them as if you should accept the fact that they're there.
And you're like, hang on a sec.
Here's a good example.
I had a conversation with a buddy who did philosophy at Oxford Uni and we were talking about ethics.
And I'd never looked into ethics, not proper philosophy,
uni type of ethics.
And he said, you can have a conversation
with somebody debating about ethics
if you both agree on the meta-ethical framework
that underpins it. But if you do not agree on the meta-ethics,
then the conversation about ethics just falls apart
because it's not grounded in anything
that you can both agree on.
It's like, look, are we playing rugby or are we playing football? Because we can debate about,
or we can play the game of rugby or football, but if I kick it and you pick it up,
this is no longer the same game that we're playing. And yeah, this is the same with the Hitchens
Razor thing here. People post stuff online, presuming that you're supposed to, and then you say,
well, hang on a second, like, that's not right. And they go, everyone believes that this is right.
What do you mean?
What do you mean?
Yeah, absolutely.
Yeah, I think this is actually the essence
of disagreement now on Twitter really,
is that people don't actually disagree on facts
or anything like that.
They're actually disagreeing on frameworks
by which they come to truth.
And so, yeah, this is why Hitchens's razor
is so important, because it's a short
pithy thing that everybody should learn because, you know, all it's doing, if you are expected
to disprove what somebody else is saying, all this is doing is wasting people's time because
it means that I could come out with literally any claim, you know, there's no filter, there's
no discrimination.
It's just like, I could just come out with anything.
And then it's just going to waste people's time
when they have to disprove it.
So it's a time saving device.
And it's worth it, for one, just for me, because I just,
you know, I think of the strength of the argument.
If it's strong enough, I will engage with it.
If it's not, if it doesn't actually have strong evidence
behind it.
It's not worth my time.
The other thing it does is it limits our own hubris,
about believing our own bullshit,
right?
You say, look, hang on a second.
If I want to put something forward, I actually need to be able to back this up.
And if I can't, then maybe I need to shut my pie hole.
Alright, final couple, a couple of the ones that really intrigued me. Focusing
illusion: nothing is ever as important as what you're thinking about while you're thinking about it.
For example, worrying about...
It's a lot like the eyes.
So the eyes can only focus on one thing at a certain time.
When you look at something, when you look at anything, you kind of have to focus on something
or you have to just focus on nothing at all.
You can't focus on everything at the same time.
And the mind is exactly the same.
If you think about something, you have to think about, you have to focus on that thing.
You can't, when you imagine a system,
you have to focus on a certain aspect of that system.
You can't imagine the entire system at once,
because your mind just doesn't work like that.
It works by picking out details
and focusing on those details.
And the problem is, is that when you focus
on certain aspects of a system or a structure,
your mind inflates the importance of that.
And so basically, you know, if you spend too long thinking about something,
then that thing grows in stature, and it becomes something that eventually can become an obsession
because you think it's so important.
And I think this kind of can be seen in a way on Twitter.
If you look at certain people who spend their lives
with a single cause, single-issue candidates
of political parties or even just ordinary people
who just have this one issue that they really, really are adamant about.
What happens is that the more that they focus on it, the more important it gets and so they spend
even more time on it and so it becomes gradually just snowballs into this kind of obsession until
they just become more and more extreme. And you can see it with someone like Tommy Robinson, for instance. You know, he started off as a relatively well-rounded fellow,
you know. And then gradually, you know, he had some valid concerns about Islamism in
Bury Park. And he decided to sort of devote his life to fighting Islamic extremism.
But then what happened was that because he was spending so long thinking about it, it
began to seem like the entire world to him.
And so it sort of set off this threat response in his brain where he began to sort of feel
this kind of feeling that it was encompassing him.
It was basically something that he had to defeat. And this, this conviction of his got
stronger and stronger the more he thought about this problem. And so eventually,
you know, when he was on Twitter, you know, he was just posting never-ending,
just constantly spewing out all this anti-Islam stuff, you know, and it just
gradually, what it did is it just kind of made him, it was like a kind of
self-reinforcing...
He became his own caricature, right?
Yeah, that's exactly what happened.
He became a caricature of himself, and the reason that happened was essentially because of the focusing illusion, because he was focusing on
Islam at the expense of all the surrounding context, you know anything else
It was just literally just Islam, Islam, Islam, Islam,
and he just became obsessed. And it happens with everyone, it's not just, you
know, with people like Tommy Robinson. This is something that happens with pretty much everybody.
I mean, it's happened to me, you know, sometimes if I thought about a certain thing, it kind of
links in with the golden hammer in a sense because if somebody's written a book about a certain
concept,
they're going to be thinking about that concept all the time, and that concept is going to seem
more important to them than it actually is. And so you're going to get people like Taleb
who are going to see things in the world and they're going to think, oh, okay, this is
explained by antifragility, or this is explained by the Lindy effect, or black swans, or all these
other concepts that he has. It's because the mind is incapable
of seeing a system as a whole. It tends to just see the constituent parts, and it tends
to focus on those particular parts. And as a result of that, you can become blinded
essentially by what you focus on.
Right, last one, the Dunning-Kruger effect. Awareness of the limitations of cognition, thinking, requires proficiency
in metacognition, thinking about thinking.
In other words, being stupid makes you too stupid to realize how stupid you are.
Yeah.
Well, I mean, this one's quite self-explanatory, but I think I'll try to expound upon it.
So I think really this is kind of the main obstacle to people educating themselves
and becoming more intelligent because you can only remedy something if you see the problem,
if you actually know what the problem is, but the problem with human minds is that they're incapable
of really comprehending themselves. Nobody really understands their own limitations.
You can't, because you can't step outside yourself.
We are sort of locked inside our skulls.
And because of that, we have a very limited perspective.
We can see the outside world relatively well,
but we can't really see ourselves.
We can't see our own flaws very well.
We can't see the limitations of our thinking very well.
And as a result of that, we often, there's a sort of correlation between how
unintelligent you are and how ignorant of your own unintelligence you are. So,
you know, people sort of who are very ignorant are obviously going to be ignorant of themselves,
and so they're going to be ignorant of their own stupidity and their own ignorance.
And so it's very hard to remedy that, you know, because what exactly are you supposed
to do?
If somebody can't recognize their own ignorance, then you can't really remedy it.
It's a bit of a sticky situation that people are in, and
so I mean, one of the ways that you can try to rectify it is by educating people about
the Dunning-Kruger effect, by actually telling people, look, everybody is ignorant to a certain
extent, and the degree to which you're ignorant limits how aware of your ignorance you are. How do you think the Dunning-Kruger effect relates to the midwit meme?
I think it probably does explain it to a certain extent. I think when people are kind of,
you know, when they are not really aware of their own flaws, they will just tend to just
not bother trying to
seek anything out of their own comfort zone. They won't really, they won't try to find...
If you think that what you believe is right, then
you're not going to try to correct it. You get what I'm saying. So these people are just
going to carry on believing what they already believe, they're just going to just sort of,
you know, they're not going to bother trying to remedy it, because they're not even aware that there's a problem.
So this, I think, is the crucial sort of element of midwits, and not just of midwits but of everybody,
really. I think everybody has this problem. This is not, you know, this is not something
that anybody is immune to and we're all, we're all guilty of it. And, you know, I myself am ignorant
about my own ignorance. There's a lot of things that I don't know about myself that I'm
ignorant of. So it's something that we all have to contend with, you know, this is,
and this is the problem, but I think, I think that this is something that we can only really
remedy by making people aware of the Dunning-Kruger effect itself and also by making people
aware of cognitive bias. Because when you're aware of the way that your brain is working, then
you can sort of, you can take steps to remedy it. And I'll give you an example of how you
do this. Let's say that you are an employer and you are going to hire one of two people, person A and person B. And let's
say person A and person B have got the exact same qualifications and they have pretty much
they're pretty much exactly the same in terms of what they work history is and all that
kind of stuff. One of them is physically attractive and one of them is not. So who would
you hire in that situation? A lot of people will probably say,
oh, I would hire the physically attractive person, but that's actually wrong. You should actually
hire the physically unattractive person because the physically unattractive person has managed to get
where they have gotten despite having the ugliness, the unattractiveness. So you're basically second-guessing your own bias. You know,
you're basically saying, okay, so there is actually a bias at work here, and that person
has overcome that bias, so they must be more competent than the attractive person. So you
can do it through these little sort of thought experiments like that. You know, you can try
to teach people to be aware of their own biases, which can go some way to remedy the
Dunning-Kruger effect, but you'll never be able to completely be aware of every problem that you have in your thinking,
because it just doesn't work that way.
Mate, we'll just hold our hands up and allow ourselves to be carried away by a tidal wave of nihilism.
Look, dude, we made it.
Thank you so much for coming on today.
What should people do if they enjoyed this and want to check out more of your stuff?
Where should they go? Yeah, so the best way to keep in touch with me is on Twitter. So you can follow me at
G underscore S underscore
Bhogal, that's B-H-O-G-A-L.
And I've also got a Substack on the horizon, which I'm going to be opening soon. I'm going to be exploring a lot of these concepts in more detail in essay form, which is, you know, it's only a couple of weeks
away now, so that's something that I'll be doing soon as well. And, yeah, hopefully,
it'll help to increase the understanding of these concepts.
Sick, man. Well, just write some more threads and then we can do some more episodes. That's
what I want.
Yeah, more threads will be coming as well. Yeah, this is something I'm going to start doing,
but semi-regularly. So, yeah, I think I'll probably try and get another thread out before the end of the year
and then we'll see what goes from there.
Good man, catch you later, Gurwinder.
Cheers, thanks Chris.