Decoding the Gurus - Special Episode: Interview with Thi Nguyen, a Gurometer's Guru
Episode Date: January 29, 2021

Today we talk to C. Thi Nguyen, a philosophy professor at the University of Utah. He has some excellent insights into the kind of discourse that *feels* like it gives us insight, that wonderful 'aha' moment. Basically, what happens when unscrupulous actors aim to optimise that feeling, putting aside concerns as to whether or not it's the real thing.

Thi has previously studied 'moral outrage porn', which is a bit like food porn, but for your emotions. You might say "X-porn" is any material that gives us the facsimile of the thing, without having to put in the hard yards of actually doing the thing. You might also say there's such a thing as 'insight porn', and maybe that's what Gurus deliver! Matt and Chris feel like they got some real insights (touch wood) from their chat with Thi and they hope you do too!

--------------------------------

More from Thi Nguyen. Thi writes about many things, including echo chambers, epistemic bubbles and the seductive feeling of clarity:
- A free preview of the first chapter of Thi's Games book
- Moral outrage porn
- The op-ed version of moral outrage porn
- The echo chambers paper
- How Twitter Gamifies Communication
- Recommendation from Thi: The best book on Echo Chambers
- Another recommendation: A great recent book arguing for the large-scale misinformation thesis (and it's open access)
- Thi's appearance on the ETV pod discussing Cheap Talk
Transcript
Hello and welcome to Decoding the Gurus, the podcast where two academics listen to content
from gurus across the online world and we try to understand what they're talking about.
And in this special interview slash chatette episode, we have someone here to help us understand some of the reasons why gurus are so appealing.
So welcome, Thi Nguyen.
Hello.
Okay, so Thi is a professor of philosophy at the University of Utah, and he's done some
very interesting work on how the online infosphere
affects people's thinking, including the phenomenon of outrage porn, which he can tell us about,
as well as epistemic bubbles and echo chambers and other interesting things. So it's great to
have you on, Thi. Thank you very much. So to get us rolling, we might start off briefly
with your stuff on moral outrage porn, because on DTG
where we're really interested in essentially fake things
masquerading as the real thing, right?
Well stated, Matt.
Very well explained.
I bet.
So, yeah, so outrage porn is kind of a similar thing,
so maybe tell us a quick bit about that.
Okay, so this paper, and I will be honest, the
paper started as a drunk Facebook conversation on someone else's thread between me and Bekka
Williams, who turned into my co-author for this.
And it was just, like, two in the morning.
I was like, you know what?
No one's given a good definition of food porn because we talk about it all the time.
And philosophers like to define stuff.
Of course, there's like this huge amount of work
on like sexual pornography,
but there's this new use, right?
And I think we all know it.
Food porn, poverty porn, ruin porn, like closet porn.
My wife says, to calm herself down, she goes to look at this site called Things Organized Neatly, which is just, obviously,
organization porn.
I'm adding it to my bookmarks.
It is, it is strangely sexual.
And one of the things you can see is when you look at these porn sites, like food porn and organization porn, it's obviously porn-like. So we were trying to figure out what it was, and Bekka had this incredible suggestion. There's this old paper from Michael Rea where he says what sexual pornography is, is you exchange sexual images outside of the context of a relationship,
not for furthering a relationship.
He was really interested in the fact that, you know,
people could exchange naked and erotic pictures as part of a healthy
relationship, but porn was something else.
So he thought it was like this
weird thing that existed outside of the normal goals of, like, intimacy and
connection and a romantic relationship. We were like, hell yeah, we can generalize that definition.
So here's our definition of porn.
X-porn, for any X.
Sorry, I'm a philosopher.
I have to use variables.
X-porn.
X-porn is representations of X used for immediate gratification
while avoiding the costs and consequences of entanglements with the
real thing. So food porn, pictures of food make you feel all hungry and good and whatever,
salivating, but without having to buy food or make it or deal with the nutritional consequences or go
out. Like real estate porn: cool pictures of real estate without having to pay for it or care for it.
And one of the suggestions we made was,
okay, here's a new kind you can identify,
moral outrage porn, right?
Moral outrage porn is representations
of morally outrageous situations, engaged with for instant gratification,
for the pleasure of moral outrage
rather than for
actual moral action. And I want to be super clear here. A lot of people read this stuff of ours and
they immediately try to adapt it to this, like, crappy end. I don't believe in it at all. The
crappy end is: oh, this means moral outrage is bad, let's all be civil and nice to each other.
Fuck that. That's not what we meant.
What we meant was, when you say that, if you think that sexual pornography is bad,
you don't think sex is bad. We don't think that moral outrage is bad. Like moral outrage is incredibly important. It's motivating. Aimed at the real thing, aimed at actual morally
problematic situations, it's one of the most crucial emotions
we have. It's because real moral outrage is so important that the pornified version,
which simplifies moral outrage for the sake of pleasure, is so devastating. And one of the
worries is, as with all other kinds of porn, it gets shorn from the responsibilities of doing it in a nuanced and careful way
when you're just, like, optimizing it for pleasure. So, I mean, if you want to be really moral,
you have to pay attention to nuance, you have to pay attention to people's feelings. But if you're
just in it for the pleasure of outrage, then you want to do something else. You want to tune it for
max pleasure. And tuning it for max pleasure involves, like, making it simple,
making it easy to access, making it un-nuanced,
making it uncomplicated, right, making it into, like, moral candy.
Yeah.
Yeah, that seems like very much an idea for the modern age
where so much of people's lives is conducted online
and in this virtual,
not unreal kind of sense and is often performative to some degree.
So when I heard about this idea, I immediately thought of so many instances: let's see, moral grandstanding,
public shaming, and that kind of online activism
which becomes a kind of slacktivism.
So I'm extending from your idea here.
I know it's not exactly what you were talking about, but it feels like those are also things
that can sometimes be a facsimile of the real thing, which is quite time consuming and difficult
and frustrating, but done just really for the pleasure of it.
I mean, you might think there's a slight difference.
So moral grandstanding is like using expressions of morality for status,
and moral porn is using expressions of morality for pleasure.
So they have slightly different purposes,
but they share a similar structure,
which is you're not supposed to use morality for pleasure or status,
right? You're supposed to use it to be good. This is what I think is happening with many
expressions of morality. And I wish I could go back in time and change one thing. A lot of people
read this stuff and they immediately are like, oh, this only applies to expressions of outrage.
But if I have expressions of civility and calm and connection, that shit is just as pornifiable, right? If you
express centrist sentiments of, like, let's all get together, let's be civil and kind to each other,
enough with this moral outrage, and you do it to feel smug towards, say, the radical left,
it's just as pornified.
I know this, because whenever this concept came up and people were debating it
online, they kept conflating things. When they hear you say that, they think you're saying
civility doesn't matter, like we don't need to be nice to people, and they avoid
the fact that the qualifier "porn" is there, right? That's the whole point: you're not denigrating civility
and being respectful to people, you're denigrating the indulgence of it outside of its purpose.
Well, let's say when it's done in a performative way. Chris, you know, we both know of instances
of how it's done in a very elaborate way, where it's a way of
showing, oh, look how much credit I'm giving. Look how open-minded I'm being.
Yeah. I think you want to distinguish between the performative and the hedonistic, right? And I
think they may go together. So I think the moral grandstanding stuff may be more performative.
You're doing it more to look like you're moral than to
actually be moral. And the hedonistic is you're engaging in it for your own pleasure. And
often, my suspicion is, a lot of the outputters of moral outrage porn would be described as
performative. And a lot of the receivers, the audience, are engaged in it
for hedonistic reasons, and these
may be parasitic with, I mean, maybe symbiotic with, each other.
Yeah, that makes a lot of sense to me.
Yeah, fascinating. So one thing that came up when you were describing that is the concept of a
superstimulus: that in the modern environment we have things that activate our moral senses or desires,
but they're kind of super attractors for it. And you know, there's lots of examples
from evolutionary biology of that applying to, like, other animals. And so I was wondering,
is it inherent that for something to be porn, there's a superstimulus aspect to it,
or does that not need to be there for the concept?
So I don't think it's necessary,
but it's a common result
of a certain functional relationship.
So you can use anything as porn, right?
Like, so Michael Rea made this point. Like, people can
exchange intimate sexual pictures as part of a relationship, and then someone else grabs it
and uses it as porn. Right. And I think the same thing is true of moral outrage porn,
which is, I mean, the situation is really complicated. Like, you might think that
someone could sincerely tweet a genuine expression of moral outrage from a really
morally difficult situation, and other people could use or retweet it as porn. Right.
But the thing that I'm really worried about is: when you use something for porn, you're trying to use it for pleasure.
So if people start consciously producing it, then they'll want to optimize the pleasurable aspect.
So I keep, in everything I've been talking about, I keep running this background analogy with like the industrial production of sugary and salty snacks,
right? Like, so we evolved to want sugar and salt for a fine reason: in the environment of
evolutionary adaptedness, they were moderately correlated with nutrition. And so we get pleasure
from those things. And the moment someone can profit off of giving us pleasure, then they're not going to target
the original function, right?
They're not going to target nutrition.
They're going to target the thing that gives us pleasure.
So if there's any wiggle room, right?
I mean, you should expect companies that make money off of selling food to exaggerate the super stimulus part that gives us pleasure,
right? And I suspect the same thing, right? If you're peddling moral outrage porn,
and especially if that gives you power, and you have an audience of people
who have started to get used to and want and crave pleasurable moral stimulation,
then you'll have reason to exaggerate whatever parts of it are pleasurable.
Like whatever will give people the sensations of confidence or smugness or any of that.
Now, the point where your work really became super interesting and relevant for me was when you moved into looking at this,
the sort of stuff you cover in your manuscript, The Seductions of Clarity. Yeah, which I think
is, correct me if I'm wrong, but it seems like a cognitive parallel to the outrage porn, in that
it's, well, I'll let you describe it, but in my fuzzy understanding, it seems you're talking about how actors can focus on giving the feeling and impression of insight, that aha kind of feeling, but which actually can be a substitute for the real thing.
Exactly right.
So, I mean, you all should help me, because I shoveled a bunch of papers at you and you read them. For people in the audience, these two did this enormously, like, insane thing of reading this whole pile of my papers for no reason.
And they're all related to each other in ways that are really hard for me to say. And I'm actually currently trying to write a book for not just academics, but for everyone about it.
And I'm having a little bit of trouble saying what that center is.
I can say it in technical philosophical language.
I'll do that later.
It's gross.
But no, this is exactly right.
So there's a separation between the signal and the actual content.
So here's the idea of the seductions of clarity.
So I mean something specific by clarity.
I mean the feeling we get,
right? So Alison Gopnik, a psychologist who studies this stuff, has this paper called Explanation as Orgasm. She's trying to talk about that cognitive
moment of like epiphany, like, aha, like I get it. And everything falls into place and it feels
good, right? So what I was thinking, what I'm thinking is, look, we're limited beings. We can't do
everything. We need to know what to pay attention to and what not to pay attention to. And so
my claim, which I've tried to integrate with the psychological literature, and you can tell me if the integration
was good, but I think I have some backing
from the cognitive sciences and psychology.
My claim is that we use the feeling of clarity,
the sensation of understanding
as a guide to when to stop thinking.
When you get that aha moment, right?
That feels right.
Then you're like, oh, I get it.
So you stop thinking about it
and you start thinking about something else, right? So we use the feeling of understanding as a heuristic for terminating
inquiries. So if that's true, then it would be really valuable for anybody who wanted to
manipulate our beliefs to game that feeling. So if you could fake the feeling of understanding,
then you could get control of people's attention, what they paid attention to, right? You can,
I think in the paper, I have this analogy of, like, look: stage magicians, what
they actually train in is to make the hand that's actually doing the work look boring and the hand that's
not doing the work really interesting, to send people's attention away. Like, a signal of boringness is an invisibility cloak. And so if you can
manipulate people's feeling that they understand, then you can cloak things behind a similar, like,
cognitive invisibility cloak. So there's actually a really useful description of
understanding drawn from the philosophy of science. They ask, like, so what is it to
understand something? Like to really understand, to like the real thing, not the feeling. And they
say, look, understanding is not just knowing separate facts. It's having all the facts
cohere together in a usable way. When you understand something, you have a model that
connects things and makes them coherent. That model is usable to generate new explanations
and actions, and it's easily communicable. So in this paper, what I was saying was, look,
so if you want to game this, if you want to fake this, then you want to fake the feeling
of coherence. You want to fake the feeling of usability, and you want to fake the feeling
of being able to communicate things easily. How do
you do that? Conspiracy theories is one way, and then actually in the paper, I think, bureaucracies
is another. I think you two are far more interested in the conspiracy theories, but I think
it's just as applicable to bureaucracies; bureaucracies are there in those intricate webs they weave in
everyone's lives as well. But the thing that those points make me think about is,
you know, a lot of the people that we look at, the gurus,
they actually do create these extremely elaborate,
interlocked series of narratives and theoretical frameworks
about how the world works, the reasons that they are not,
or they are disparaged by people,
and also broader,
often civilization sweeping narratives about how the issue of trans bathroom access will
relate to the downfall of Western civilization.
Right.
And what it strikes me as, you know, when, for example, Jordan Peterson's devoted fans
are saying to people when
they criticize him, that you're not understanding him in context, and you haven't looked at the
lectures where he connects these ideas and gives a more nuanced understanding. You know,
it's sometimes presented as that's them being disingenuous, right? They'll never be satisfied,
which may be the case. But I think part of it is more related to the point that you're making.
They are woven in these dense networks of symbols and connections and narratives, and when people come in and poke
holes in it, it doesn't really work, because they have a whole elaborate network, and taking out
one part of it just feels like that's barely making a dent.
Well, I mean, the thing that it reminds me of is when you go into the conspiratorial communities,
you so often hear them say, you have to go do the research.
If you don't get this, then go do the research.
That's like a slogan.
And I think you're right, Chris, they really do mean that
in the sense that this thing might seem silly on the surface,
but when you've done all of the reading that we have
and you've accumulated this vast, complex infrastructure,
then it makes sense.
So I have a question, though, that relates to that,
and then I'll shut up.
So I have a question though that relates to that and then I'll shut up. So there is an aspect of that where that's a reasonable thing to do.
You know, when you have genuine expertise in a topic and somebody comes and says, well, I think this, you know, I have this opinion on immigration.
But have you read anything about immigration or, you know, the policies, the statistics?
And genuine people say, you know, you need to do
the research. So a question is: how do you distinguish those requests? Yeah, I mean, a lot of the work
I'm doing is fighting this view that says people in echo chambers, people in the alt-right, are, like,
unthinking or intellectually
lazy. It's like, it seems to me like the opposite. They're like hyper-intellectual.
In fact, sometimes it's almost about being like too attracted to the pleasures of intellectual
power. And what I mean is something like, so when I was an undergraduate, I had an English
professor, Richard Marius, and
he said something that I've always remembered. We were reading Thomas Pynchon's
The Crying of Lot 49, which is all about real epiphanies and fake epiphanies, something you
definitely read as an undergraduate. And he said, well, he had this lovely Southern accent, he said, well, I've always thought that the pleasure of mystery novels was like the pleasure of religion. Everything that seems disconnected stands revealed as having some kind of perfect order.
The pleasures of these networks you're talking about remind me of some of the vast fantasy novels I read. Like, I'm reading Brandon Sanderson right now, and the thing about
Brandon Sanderson is there's all these, like, cool hints and things, and in the end there is an
order, like everything makes sense, and that's so pleasurable, right? So one thing that I think
is going on is, if you compare what it's like to be a real scientist. By the way, I just read this
marvelous popular book on the philosophy of science from Michael Strevens,
called The Knowledge Machine, and one of the things it talks about is, look, what
you hope for is total coherence, right?
That's the long-term goal.
But as long as you're getting hit with all this other evidence that doesn't fit, you have to, like, be in the uncomfortable position of saying, like, we don't know yet.
No theory we have works perfectly.
It would be nicer if we had a theory that fit perfectly.
But, right, we're still waiting. I think that takes a certain,
I don't know, something. Whereas with this stuff, you're like, no, no, we've got it.
And the, one of the things I was trying to talk about in this paper is that
it seems to me that a lot of these theories are made to be easily applicable. So they constantly
give you the sensation of intellectual power and understanding, because if you can generate explanations for anything easily, right, then you feel like, it's not that you're unintellectual,
you're getting confirmation of your own intellectual powers.
Yes. Yeah, no, I think that's true, and it gels with what I know about the literature on this. So just one random
example is that it's actually people who are more open to experience,
more intellectually curious, who tend to succumb to conspiracy theories or various other belief
systems. And so the fact that they're so elaborate and Byzantine in their complexity is part of the
appeal because it is like saccharine. It's intellectually pleasurable in that sense.
But, you know, I've never really got my head around this apparent contradiction,
which seems to be that on one hand, there's one simple explanation for everything.
As you said, this huge amount of explanatory power.
Who did it?
Well, it's the New World Order and the Illuminati or whatever, or the Jews or God,
you know, depending on what your theory is.
All of them.
Or all of them, if you're Alex Jones.
Yeah.
So from one point of view, it's extremely simple because everything comes back to the
one thing.
But on the other hand, it's also extremely complicated and elaborate.
But I think the key thing that you said is that it has huge explanatory power.
Like any new thing that comes along
can be explained quite easily.
Yeah, that's a really interesting observation.
I mean, I haven't spent as much time as you two
looking at the particulars of the current gurus,
but my guess would be something like
the relationship between a single core idea
with a really complex application
is something that would both
give you pleasure, because you tie it back to the single central idea, but also the sensation
of power. Like, as long as the idea is complicated, yeah, but within your grasp, then you get to have
the feeling of intellectual power.
Yeah. Yeah, I think it's interesting to compare it to the kind of knowledge that is real but unsatisfying,
which is the kind of stuff that I feel like I have, right,
in psychology, right, which is it's a mess.
You know, there's a bunch of different explanations
and theories for various things.
None of it fits together very well.
So it's a very unsatisfying state of affairs,
and I think that's often the case for real knowledge.
The ratio of hard work you have to do to the satisfaction you get from having it is...
Yeah.
The thing that makes me think about is that, you know, the kind of guru
people that we look at, I think they do exactly what you're talking about,
where they expand their explanatory ideas
across all these fields that they're not experts in to showcase their ability,
how insightful their worldview is.
But the opposite of that is that as academics become more specialized and
more proficient in a particular area, they tend to become less willing to venture
grand opinions about fields that they don't know about.
So it's a kind of inverse relationship.
And of course, there's plenty of mainstream academics who do venture grand opinions.
But I think in general, there's that knowledge amongst academics that becoming
highly proficient in a certain field makes you like this uber nerd about
a topic that you recognize is extremely niche. And that seems the opposite dynamic.
I think you're right. The sort of graduate student in a field is generally far more certain of
themselves than a professor. So, I think this is super interesting, and this is
weird, because this connects to another part of my research that seems totally
unconnected, but I think is weirdly connected.
So if you look at the history of philosophy, so the history of philosophy in the modern
era has this fetish for intellectual autonomy, right?
For like, think it through yourself.
You can think independently
and you can understand everything. One of my own life-changing experiences was reading this book
from a philosopher, Elijah Milgram, called The Great Endarkenment. And he has this suggestion
that the great enlightenment undermined itself. It said, think for yourself. And that created
the sciences. And the sciences were so vast and so enormous that it's now impossible for us to think for ourselves, right?
We have to trust experts. So I
was asking my wife about this. She's a chemist, and I'm like, so how far away in chemistry do you have
to go before things are basically incomprehensible to you? And she was like, I'm an organic chemist.
It's not just inorganic chemistry I can't understand.
It's, like, any sub-sub-specialty right next door.
I have no fucking clue what's going on.
So this is really painful. Yeah.
And I think one of the things that happens with a lot of the figures you're
talking about is they actually offer this fantasy
of being able to be back in the time when you could understand it all. Yes. And weirdly, that's
built into, like, intellectual life. If you listen to most philosophers and scientists,
sometimes they're like, it's so important to think for yourself. And then we're sitting here being like, but hold on a second. I don't know the math behind climate change science.
Yeah. Oh, look, that is such a... I love that observation. I think that is fantastic. I mean,
it's so true about the absolute necessity of specialization these days. Like, I've
published in the journal Vaccine, in Human Immunotherapeutics,
and in various public health and epidemiological journals, right?
I don't have any hot takes on COVID or how the vaccine,
the various things, are going, right?
Because the expertise is so narrow these days,
just because we're covering such a broad range
of technologies and forms of knowledge.
And that's a horribly unsatisfying state of affairs, as you said.
And the gurus offer this polymath ability to draw it all together,
which is so satisfying.
I'm published in a philosophy journal.
That's clearly an error on someone's part. But so, one more thing I want to say. I think at the end of The Seductions of Clarity paper, I'm trying to figure out what we can do
about this, and the thing that I end up saying is something like: at some point, I think a lot of us
have this... I mean, go back to the food analogy.
I spent a lot of time just like stuffing my face with like the crappiest chips.
And at some point you're like, okay, this stuff is not good.
Someone has engineered this stuff to be addictive, and you get it.
You evolve this sense of, like, no,
that's a little too fucking yummy.
That's a little too salty and savory and fried. Maybe indulge once in a while, but I need to be suspicious, because that shit was engineered.
And I think there's something similar, where actual intellectual life, when you're exposed
to the complexity and difficulty, is painful and humiliating. It's fucking awful. And you almost have to, it's like,
it's like learning a taste for kale. You have to, like, cultivate it in yourself.
That makes you morally better.
No, it means you're not fucked.
Yeah. Yeah. I think that's excellent. I've actually said something similar, which is that,
you know, you have to become a little bit suspicious when it feels too appealing, when it sounds, oh, that sounds right, that's
got to be true. And you know, we all have that feeling a lot of the time. In this conversation
right now, a lot of the time, someone will say something, and, oh yeah, that sounds
completely right. We should always be suspicious of that feeling. You know, with a lot of the gurus, when we listen to their content
and we don't break it down, we just let it wash over us, we often say
that it feels really satisfying in the moment.
And you can kind of, you know, follow along the connections where they're going.
Jordan Peterson is great at it, giving these extended metaphors and analogies
and layering them on top
and connecting them to grand narratives. And it feels satisfying in the moment. But then when you
take time, like what we do in the podcast, and you stop and say, okay, so what was the argument
made here? And what's the evidence for what they're claiming? And it very quickly becomes
apparent that it's a lot of emptiness. You know, you can spend 15 minutes
describing this idea, which actually would only take two sentences to explain. So
I think your analogy to junk food, that it's super satisfying and we enjoy it
in the moment, but afterwards we look at what we've done and feel kind of ashamed of ourselves, really applies.
Yeah.
My take on that, and I sign off on that totally,
but another thing I've thought about that phenomenon, Chris,
about how that feeling of it washing over you, and it does,
it kind of feels right, you know, like the analogies are evocative,
the connections kind of seem all right, but then when you stop and think about it
And what we're doing, Chris, is we analyze it, like, we're adopting an analytic frame of mind, where we
actually go, okay, hang on, does that follow from that, does that actually make sense, or is that a
contradiction with this other thing, and so on. But I think that gurus largely rely on intuitive information processing.
So it's that, as long as you, you know,
sit back and just let it wash over you,
that's kind of that intuitive feeling of it feeling right.
And I think that, so those are ideas
from cognitive psychology, which I think are helpful here.
That's not the way they editorialize it though.
Especially the people we look at tend to invoke,
you know, that they are practicing real science and doing it in an analytical, scientific, rationalist way. But I think we all know that people who claim the mantle of rationalism and science, and not to be tribalistic, tend to often fall into all of the traps that they claim to avoid. So, yeah.
Let me ask you something.
I want to ask you about something
because this is something I've been puzzling over.
So when I wrote this thing about echo chambers
and how they manipulate trust,
I had this basic image where echo chambers are structures
where you're told to distrust everyone on the outside.
And I kind of said, and you kind of trust everyone on the inside. And I was thinking about like spending a lot of time on the online
echo chambers that I've looked at. By the way, to anyone listening, I just have to say, I think it's
really important to distinguish between filter bubbles and echo chambers. So people have been
confusing these two notions. A filter bubble
is when you don't hear the other side, and an echo chamber is when you don't trust the other side.
The original research into echo chambers was about trust structures, and lately people have
bundled them all together. And there are all these disproofs, like, oh, echo chambers don't exist. But they're all talking about filter bubbles. They're all talking about whether you hear people on the other side. Climate change deniers know all the arguments on the other side.
They just don't trust them.
Anyway.
You've made that point in a bunch of talks and it's a great point
because I think the notion
that people aren't hearing the other side is just wrong.
It's often clearly wrong
because they spend all day often talking
about the other side.
Right.
Obsessively discrediting the other side.
Yeah.
Well, I want to pick you
up on that echo chambers stuff because it really closely ties to one of the key features of our
gurus. One we actually described as anti-establishmentarianism. But we have another feature where we talk about their cultish in-group, out-group dynamics. And if you actually sort of put those two things together, right, for a lot of gurus, the out-group is everyone else.
It can be another political side, but for a lot of the people we follow, political stuff is not their main game.
Their out-group is the institutions, the experts,
the establishment.
You know, academics like us are definitely in the out-group.
And so the establishment is presented by the gurus
as being hopelessly corrupted by incentives, by groupthink,
by ideologies, et cetera.
And therefore you really can't trust them.
And they spend an awful lot of their time undermining, you know, all other sources of knowledge, while building themselves, and sometimes their friends, up. So that really strikes me as a very, now I'll probably get the two terms mixed up, echo-chambery thing to do. Did I get it wrong?
No, you got it right. So I had this image of echo chambers as ones where everyone on the inside trusted each other and distrusted everyone in the outside world.
And then Joshua DiPaolo, another philosopher, a friend of mine, wrote a critical paper about it,
where he points out, look, there's another option that also counts as an echo chamber, which is when the leaders
make the people inside the echo chamber distrust themselves too, not just the outside world,
but also distrust themselves and create this total vacuum where the only person you trust is the leader. So now it seems to me like there are two
echo chamber structures you could have. One is everyone in the echo chamber trusts everyone
else in the echo chamber and each other, but especially the leaders. The other one is everyone
in the echo chamber distrusts everyone on the outside and themselves. They've been taught to
think that they themselves are dumb.
They only trust the leader. So Josh pointed out that the second structure is actually characteristic of a lot of older religious cults. And I haven't seen that
structure a lot in the new online world. I tend to see this like leaders pumping up the confidence
of the followers.
And I'm not sure, and I wanted to ask you
because you two follow this stuff better than me.
My sense is that it has something to do
with the structure of online recruiting,
that giving people a sense of pleasure and confidence
is a better way to snag people online.
And it's harder to, I think the self-hate methodology was the methodology that cults used when they could isolate people and take them out to a compound, right? But now you need something else. You need, like, a sugary bait to put on the internet. So my sense is that the methodology of cult building has slightly changed to this more hedonic, satisfying, pump-up-the-ego-of-your-audience thing in the online world.
Yeah, I think you're largely right about that, that it is more carrot than stick.
There's certainly an awful lot of flattery being done of followers by gurus. On the other hand, you do see some tricks,
the thing that we called the emperor's new clothes maneuver, where they will say things like,
now, look, I think this is going to be too complicated for a lot of you people to understand,
but I respect your intelligence. And I think some of you might be able to get it. And then
they proceed to say something quite absurd and outrageous so nobody wants to be the person
that doesn't get it, right? And there is, you know, we interviewed someone who had seen a fair bit of that follower management that was done.
There were certainly in-groups and out-groups within the Discord
where the people who were more loyal, more on board with the message,
were definitely treated more favourably than the people
who asked irritating questions and challenged things
and thought for themselves too much. So that's my take. How about you, Chris?
Yeah, so I'm echoing probably some of the things that you're saying, but basically I'd agree, T, that there's a lot of flattery along the lines of: you're the guys that can look at these topics with nuance and complexity, and you're interested in scientific approaches. But there's also this element, and it depends on the guru, but some of them really go along the lines of indulging in almost Star Trek-style technobabble about specific mathematical or scientific topics, where the way that they illustrate the concepts feels like it's to illustrate their own intelligence, yes. But the other thing it does is highlight to the followers that they're invited into the club, to be a part and to watch, and maybe they'll get to that level. But there's such a vast difference between where you are and where the guru is that if you are going to chastise them for not knowing things, you really better do your homework. And in their internal communities, like the Discords and the Patreon groups and these kinds of things, I think there is a greater chance for the old-style dynamics, kind of the things Matt is talking about, with community management, threatening to withdraw access, or negging people. So I think both dynamics are probably at play, but it just depends what part of the network you're looking at, or how deep you're into the gurus.
Right, this makes me think of something. So there's this paper draft, I haven't even sent it to you, but it's about different kinds of epistemic traps. One of them I want to call something like a deference trap; the inquiry trap is one where you do a lot of intellectual inquiry, but it
subtly sends it down the wrong channels and like redirects your trust settings in various ways.
And the inquiry trap works because it gives people the sensation of being intellectually
autonomous and gives people like the feeling of power. But in order for it to be a trap,
it has to end up in the same place.
So if you're building something like this, you can't, I mean, it's like you want to give people the feeling of intellectual autonomy and power,
but you don't want to actually give them
real intellectual autonomy
or they'll leave and not follow you.
So you have to build this weird,
I mean, this feels again like a magician's sleight of hand.
And listening to the two of you talk about this push and pull, I mean, I need to listen to this stuff, because I'm really interested in hearing the experiences of people in the current kind of online thing, the more cultish thing. And the thing you're describing sounds like the technique you'd need to fake giving people intellectual autonomy and then subtly not give it to them.
Yeah, Matt and I have often commented on this. I've described it as people acting as if words are magic.
And what I mean by that is when somebody editorializes that I'm not advocating a conspiracy theory or I'm not going to just pat myself on the back.
And then they do that.
They proceed for 30 or 40 minutes to outline a conspiracy theory that's sweeping and makes
all these large disparaging
claims about entire fields. The fact that they added in the disclaimers just works. It really works: if you point out that, well, that guy was advancing conspiracy theories, people come back and say, no, did you not hear? They said that they are not doing that. And they added disclaimers at the end saying they're not entirely sure, maybe they got some things wrong. But it feels to me that that's not epistemic humility. It's not genuine, it's a kind of covering-your-ass tactic. You know, if you had real epistemic humility, that's the word I'm looking for, you wouldn't spend the hour on the theory and only the two minutes on disclaimers.
It would be the inverse structure.
So, yeah.
My take on what you said, T, was that perhaps that problem of managing where the independent inquiry ends up is not such a big problem for the gurus, or for these communities, a lot of the time. So if you take something like the COVID conspiracies, you know, everyone wants to blame China, right? So they will be naturally drawn there. Or around climate change, they naturally just want to deny
that it's happening, right?
So I feel like the kind of gurus who exploit these things,
these issues, are pandering to a pre-existing prejudice
that is widely held and, I guess, helping them along with the very complex intellectual rationalization for what they wanted to believe in the first place. Does that make sense?
Yeah. I mean, one thing that unites a lot of this stuff is all of these tricks and tactics, like the moral outrage porn stuff, the fake clarity stuff. It's all like dirty tactics you would use if you don't actually care about the truth.
So it's all like playing up the symptoms of the truth, and then making maneuvers that don't actually require loving the truth or giving a crap about the truth, and only fronting as much as is useful.
Yeah, well, I mean, people have talked about Trump in exactly the same way, for having a complete disregard. So this is what they call bullshitting, right? It's not caring about the truth, and not deliberately wanting to deceive people either.
It's just completely having no regard for it whatsoever.
So people have described that of Trump, described it as a superpower, because it gives you suddenly so many more degrees of freedom with which to optimize your persuasive tactics. I think, is that a fair summary of what you were saying, or not?
Yeah, I mean, the analogy that comes to mind is, again, it's easy to make things delicious if you don't care if they're nutritious. That's totally easy. It's really, really hard otherwise.
The balance of nutrition and like goodness is tough
and requires other sacrifices.
Let me actually float a weird theory
that just came to mind about this.
Like there's really interesting balance
between trying to give your followers
the feeling of freedom and not.
From the other side of my brain. So I don't think we've talked about this yet, but the other half of my philosophy of life is about understanding games and the philosophy of games. And I find
they've started to collide. And I think there's a really interesting similarity.
Like normal life is like full of these incredibly conflicting plural values.
And they're hard to apply.
And then a game, you know exactly what you're doing.
You know exactly where you stand.
All the values like fit into one economic point system.
And things are clear. It's like this relief. In particular, and this relates to some of the conspiracy theory stuff, I think in our actual lives trying to get things done is very rarely pleasurable, because problems are either so vast and overwhelming that our abilities don't fit, or so boring, these easy things we have to do over and over again until we want to shoot ourselves. But games are like engineered environments, made so that the process of thinking or doing whatever just fits your ability. It's a world of practical struggle where the struggles are engineered to feel good.
There's this article that everyone's been sending me about game design, and it says QAnon is like a game. And this seems exactly right.
Like, it seems like what you're doing is creating this game-like puzzle experience of,
I mean, the thing about games is,
unlike, say, science,
the puzzles are hard,
but they're built for people to solve.
And you can do that
because you have a lot of free play in the game
to redesign the environment and the abilities.
And I kind of think that a lot of,
if you're out there to build
pleasurable candy intellectual belief systems,
you're going to make them hard,
but within human capacity.
So the weird connection was something like: you know what else game designers are really good at? They're really good at giving you the feeling of freedom while steering your action down a pre-channelized path. Game designers are masters of, oh, you feel free, but you're going to end up at this next cutscene anyway. And I just wonder, maybe the analogy goes to that next level too, like being able to create a choice environment. I mean, this is like the nudges stuff. When I started researching this stuff, everyone was talking about how games are good because they're like fiction, or they tell stories, like movies. And I'm like, no, they're more like cities or governments. They are these choice spaces full of nudges to get people to go in certain ways. And I feel like intellectually, a lot of these conspiracy-ish theories have the same thing: you're free, but hey, somehow we've constructed it so you end up in this place.
Yeah.
Well, look, I think one angle of it too, though, is just the richness and complexity of the space of the game, or the conspiracy theory. So there's lots and lots of space for people to do their own research, to come up with their own little insights and elaborations, and to make their own connections. And it's challenging, but not too challenging, in the same way you're talking about with games. And when you were talking, I was reminded of the work I've done on complementary and alternative medicines. So this field of alternative health treatments, ranging from homeopathy to energy therapies to kinesiology, there are just so many. And it has some similar things to what you described.
Like it's an extraordinarily rich and interesting landscape for an interested person to explore.
I think, like a conspiracy theory, it taps into some fundamental anxieties
and stuff that people have, perhaps even existential ones,
about health in the case of alternative medicines.
And with conspiracy theories, there's often an underlying kind of thing that they tap that's quite different. So you have scientific medicine, which is boring and difficult and technical, and doesn't have any of these satisfying properties. And then you have this sort of alternate version, which any interested person can quite easily feel like they're making a lot of progress in mastering and understanding.
And yeah, anyway, I just felt like it was an interesting parallel.
I'm sure you're used to this, T, but because I have some history and interest in games,
I really liked your discussions and your work on gamification.
But I think like many people who have played games,
I'm also inevitably thinking,
well, what about that counterexample that doesn't really exactly fit?
So in a contrarian way, I was thinking about Minecraft,
where part of the appeal is that the goals are,
although there is a game there, a survival game,
the reality is that most people play it in an open-ended way. So I'd be interested to hear your thoughts about that, as a nitpicky thing. But the other thing is, when you pointed out that you're allowed to play but there are actual hidden constraints, so it seems like you have endless opportunities but really you don't, that also fits the Minecraft analogy, where you can do incredible things, you can rebuild the Star Trek Enterprise if you want and walk around all the nacelles, but you're still ultimately trapped in voxels. So I think the metaphor sits really nicely, but I'd be interested, given that there are popular games which seem to have rather open-ended reward systems now.
I mean, so one thing I can say is
the thing that I'm analyzing is constructed systems
in which you have a clear goal and clear rules
that constrain how you get to that goal.
And not all things that are called games
in our natural language are like that.
So I think chess is like that, right?
Another thing to think is you have to be really careful.
Now I'm just putting on my philosophy of games hat.
You have to be really careful to distinguish
the software environment from the game.
And you can play different games
with the same software environment.
So you can play Super Mario Brothers or you can speed run Super Mario Brothers,
which is a different game with a different goal played on the same software.
You can play World of Warcraft for experience points and gold,
or you can go to socialize in the environment.
Then I think you're doing something slightly different.
So one thing I think is there are some things that people call video games,
but really they're more like toys.
They're more open-ended.
They're like structures for play.
And I think a lot of the times a lot of modern games are made,
a lot of modern video games are made so you can engage within it,
engage with it with this clear goal,
or you could just play around with the environment as a toy.
And that's like different activities supported by the same thing.
So I think like you have to be careful there.
It's almost as if you've thought about this topic.
But there is a point related to that,
that you've talked about the gamification of Twitter
or other social networks, right?
Harvesting likes and retweets.
And I think nobody is immune
from the reward dynamics that are in play there.
But one thing that struck me about that
when I was listening to you talk about that
is that when I look at the case of James Lindsay, who's a super stimulus in the guru sphere at the minute, because he's burning brightly, right?
As somebody who, whatever you thought of him, he once seemed to be on the kind of legitimate side of things to some extent, right?
That he might be obnoxious and whatnot, but he has some legitimate
arguments that he makes and he's taking things seriously. And in recent months, famously since Trump retweeted him, he's much more leaned into complete right-wing partisanship, retweeting people from InfoWars, endorsing voter ballot conspiracies, and doing things
that, you know, if you are a secular,
rational atheist concerned about science, you don't promote coronavirus conspiracies, which he
does. So when I've looked at that, one of the things that keeps coming up when people discussing
that is, you know, to what extent does he believe the things he's doing? And to what extent is he
just engaging in harvesting followers, or, you know, playing to a certain audience. And I'd be really interested to hear your view on that. From my perspective, it looks like it's a little bit of, you know, column A and column B, the academic's eternal answer: that he is intentionally doing things to garner controversy and that pander to an audience, but at the same time he seems genuinely to have bought into a whole ecosystem of ideas that are not his own, that are about, you know, the Great Replacement or the Great Reset and about Soros's influence, ideas that predate him and kind of co-opt his agenda to some extent. That's a lot of things, but I'd be really interested to hear your opinion on any of that.
I can give you some ideas, and then you can apply them to Lindsay, because I honestly can't stomach following him like you two have. So I have no actual evidence about him himself, more than, like, flybys on Twitter.
So T's already proved himself far more emotionally aware and stable and healthy than the two of us, Chris.
Well, I will just add,
because the guru account only follows the people that we cover in the show.
So it's only got like nine or 10 people.
And basically, the reason I see his tweets, like he's blocked me long ago, is because
our Decoding the Guru's account is basically his Twitter feed because he tweets so prolifically.
But after your appearance, you and ContraPoints will dilute the stream, so that'll be nice at least. But sorry to interrupt you there, go ahead.
Yeah, sorry. So, okay, let me step back for a bit and vomit some stuff on gamification, and then we can try to think about how it connects to the situation. So there's a standard thought out there that games are good, so gamification is good.
This is what, like, Jane McGonigal, the gamification booster, says.
I think actually if you understand why games work,
you'll understand why gamification is terrible.
And the reason is that games offer you this wonderful value clarity of a simple, artificially clear goal, but they do so in a secluded environment, where you pursue it away from the rest of the world and where the goal isn't connected to the rest of the world. When you gamify ordinary activity,
to get that pleasure, you have to simplify the goals in real life, right? Like in some sense,
it doesn't matter who I kill in like Dota or whatever, right? But what I say on Twitter matters.
So the worry is that when you gamify an activity,
so I worry about like Fitbit and Twitter.
When you gamify Twitter, to make that exciting,
you have to change what you care about
from like whatever rich and natural goals you have
to like just whatever the points measure, right?
It's only thrilling to watch the points come up
if you care about them.
So this is part of this larger phenomenon.
What I'm actually trying to write about right now
is this larger phenomenon I'm calling value capture,
which is when you have rich and subtle values,
you get put in an environment with really simple,
often quantified versions of them,
and then they drift into you and they start to take over.
And I mean things like, for academics, like citation rates, right?
Or like the status.
There's a really interesting example in what the U.S. News & World Report law school rankings do.
Like it seems like everyone in that system gets value captured
and they just start caring about moving up this clear ranking. And there's this weird sense in which it seems like,
I haven't quite figured out how it works exactly, but it does look like you have this promise of
pleasure. If you align your values with whatever this thing, whatever this point system is pounding
out, then suddenly you get these huge bursts of pleasure.
And so, this seems reminiscent in a way of what you're talking about, about Lindsay. But again,
I don't know. What I would imagine is that suddenly you get this huge burst of points
for doing a certain kind of action, right? And then if you continue that, you actually get more points. And, I mean, I'm not a psychologist, I don't know how incentives change belief systems. But I've definitely been on Twitter for a while and I can feel the pull. To me it feels like, sometimes I'm on Twitter and some tweets do really well. I try some tweets that are really about what I care about, and they're, like, you know, whatever. And then you say some zippy, peppy thing and it explodes, and you can start feeling your brain reorient around saying things like that. And I try to pull back. I can recognize it, because, background, one of the reasons I write about games and game addiction is I've lost years of my life to certain games, like Civilization II, III and IV, which I'm never allowed to touch again. Good games, I know they're good, I can't touch them. But I can feel that. And the way games can take over your brain, like, I'm a climber, and sometimes you climb well and you're in climbing mind, and you just look around and the world is suddenly, how would I climb that? How would I go up that? And I feel like, when the Twitter thing gets its hooks in you, I walk around the world and I'm like, would that be a fun tweet? Would that be a good tweet? It's almost like the thing that I do that's called believing things that are true is a little bit disengaged, and the filter I'm looking at the world through is not, is that true, but, will it make a good tweet? And that creeps me out. Whenever that happens, I make myself delete Twitter for a while, because I can feel it, and it fucks me up.
Yeah, look, I'll jump in now. First of all, about Twitter, I mean, one thing I noticed
is that it's always the worst tweets,
the tweets that I'm actually a little bit ashamed of
because they're cheap, that always do the best.
And when I noticed that, it was a good reminder
never to, you know, pay attention
to that particular scoreboard.
But, I mean, I think, you know, if I understand your point correctly,
you're basically saying that the rewiring is going on in one's brain
and, you know, value system such that there isn't really a dichotomy
between, oh, are you bona fide about this stuff
or are you just chasing the thing?
The brain's rewired so that's kind of,
those two things have become conflated.
And when you mention the incentivisation of academics
towards citations and the various metrics they apply to us,
I mean, you guys probably know the same kinds of characters that I did, a couple of famous figures who ended up publishing, and I know one of them personally, like more than a hundred papers a year. And,
you know, just these crazy citation metrics, most of it is self-plagiarized and just regurgitating
the same thing. And I can tell you that in his mind, he has definitely conflated his original goal of being scientifically influential in a genuine way with those metrics.
I mean, look, I will do more autobiography here than you probably expected.
But like, at some point I was like, I was super depressed in philosophy.
And the reason was I realized, I mean, it's the exact same thing.
I was just looking at ideas and being like, well, is that publishable?
Will that get in a good journal?
Right.
Is that the kind of thing you get published in this fancy journal?
And again, like thinking about me then, it's not like I was saying things I thought were
false in order to advance professionally.
It's like something had slipped in my brain, and I was just looking at ideas, and the criterion in my head for a good idea was the kind of thing that would get published.
And I was like really depressed for a long time because I was writing things I thought were boring.
I actually almost quit the profession and then had this moment where I was like,
I can't fucking do this anymore. And one of the interesting things is I ended up writing a blog
post on this internal philosophy thing about how I tried to throw all this stuff out of my head
and write about things I cared about. And I got this flood of emails from people, all private, I won't mention the names, all like, oh my god, yeah, I've forgotten why I got into philosophy, why am I writing about this boring stuff? And I'm like, these are philosophers, right? If anyone's supposed to be fucking resistant to this shit, it's the lovers of wisdom. Why the fuck are you in philosophy if you don't care about ideas? But somehow, and this will sound weird, even the philosophers, even the heady world of the mind, even the people who are supposed to be the best at this, are completely vulnerable to having their belief criteria shift toward institutional metrics and measures. So, I mean, I think of myself as fairly intellectually rigorous and careful, and this shit will subvert me in, like, a second.
Yeah.
Right. I feel like I have to be constantly vigilant. And lately I've been trying to figure out how much we should assign responsibility to people for it. Like, sometimes I think it's just what happens when the entire system pervasively hits you with these points.
Yeah.
Right.
Like, it's so hard not to remind yourself.
My other field is addiction. And so I'm definitely on board with the idea that you cannot necessarily blame the individual's vulnerability for that kind of dopamine reward delivery.
But, Chris, sorry.
No, I was just going to say that, you know, the points that you are making, and I think we've discussed this also, Matt, on the podcast and offline, relate to the fact that there are plenty of genuine criticisms to be made about institutions and academia and incentive structures. There's validity to them.
And that's part of the reason I find it so annoying when what we called anti-establishmentarianism
is, like, you know, a kind of hollow version of that, where they don't actually address things like the citation metrics.
That's not a big focus.
It's mentioned, but just in passing.
And they act as if, if you do not agree with them in their critique of, you know,
the establishment, that you are a defender of the status quo
and the mainstream establishment.
And that might be valid on some occasions,
but it's chucked around so often. And in my case, it usually doesn't bother me that much, because one of the things I get presented as is an advocate for wokeism, and it's so far from an actual accurate hit on me that it just feels like they're attacking an image that doesn't exist. But the other point you made, about pursuing what you're interested in, and how that often is at odds, or seems at odds, with institutional metrics and with social metrics like Twitter's.
Yeah. Oh, yes. Yes.
Both. But, you know, when I saw your talks, the ones that partly made us interested in interviewing you, it was clear you had a passion for the topic and were talking about it in an academic way, but out of genuine interest. And that's, to me, the most gripping thing. It's not really related to the guru point, it's more just to echo your personal story: pursuing the things that are interesting and which you feel are important or have insight, I think that's really important. It's not related to conspiracies or anything, but, yeah, just saying, follow your dreams, kids.
Follow your dreams. You too can have a podcast.
So, okay, this led to a question I wanted to ask. So, you know, we talked about those social media gamified incentives.
And so it sort of raises the question of to what degree our gurus
are actually, you know, you think of a guru as sort
of leading the flock, but to what degree are they driven
by their audiences, you know, that desire to build an audience and keep an audience?
Are they victims?
Yeah, that's really interesting.
So I realize I haven't thought about things from that angle at all.
Mostly I've been thinking, I've been trying to adopt this,
like, in some sense, super simplistic model just to help me think.
And that super simplistic model is like, imagine you were out to manipulate people and get them to believe this system you wanted.
How would you build the system?
And from that angle, I was thinking about gamification as a useful tool for a manipulator.
Because if the manipulator is trying to bait people into joining the system for pleasure, gamified systems, the gamification of Twitter really, offer a lot of pleasure for being part of a large unanimous group.
And so it's a good reinforcement mechanism for getting people to be in a group, because, you know, you get a ton of likes if you say something that people in the group agree with.
But now you have me thinking this other thing, where there's
another possibility, right, where the leaders and the followers evolve together.
You can imagine them both chasing pleasure, and the pleasure comes from, for the
followers, having a pleasurable system, and for the leaders, having people's uptake. And so you might think that you could kind of
wander into one of the guru positions without being a conscious manipulator. I mean, I think Steve
Bannon, for example, is just a purely conscious manipulator; he's making systems like that.
But you can imagine other systems where someone
just starts saying things, and people start responding, and they get serotonin-hacked
into saying more things, right? And so they co-evolve. Does
that seem right of some of the gurus you're looking at?
Yeah, I mean, that's my opinion. I would
definitely describe it as sort of a co-evolving thing for most of them. I think there are
certainly a few of them that have some strong ideological prior that they're looking to
convince people of, and there are some who are political partisans; that's
how they started, like Scott Adams, for instance. So they already have a neat audience to sort of talk to.
But I think a lot of them are much more flexible,
in that they're kind of bullshitters.
It's a bit like Trump's policies, you know.
I don't feel like they come
to the table with a strong desire to convince people
of something, but rather they interact with, and co-evolve with, their followers.
Yes.
Okay.
Analogy.
Let's go back to the junk food analogy.
The Frito-Lay company doesn't have to be out to make you unhealthy or control you.
They just have to respond to profits.
So they don't have to be aggressively trying to game the gap between nutrition and pleasure. All they know is, if they do this thing, then they get more money. And
so, functionally, that creates a gaming of the gap between nutrition and pleasure, but they can
just be responding to incentives. So you might think that someone is just like, oh my god, Twitter allows incredibly fast meme evolution. They just say shit, and
then some of it takes, because it gives people pleasure, and they get this immediate response,
and they're like, I should say shit more like that. So without aggressively gaming the system, they're
pushed to become the kind of people that emit pleasurable, catchy, sticky ideas.
Yeah.
I totally agree with that framing.
And one of the reasons I agree with it so strongly is because
one of these features we've identified again and again is narcissism.
So we're all subject to the pull of attention and praise, right? But narcissists are really subject to it. Like, they're almost victims to it. And, you know, I don't think it's a coincidence that with the large majority of the people that we cover, the narcissism is so strong.
The self-aggrandisement is so strong.
And I feel like that makes them particularly vulnerable, or particularly incentivised, just like the company you described.
Where the company is just 100% focused
on maximising profits,
these guys are 100% focused on maximising attention.
So someone like yourself might look at a viral tweet and go, oh, I'm not very proud of that, and put it aside.
But a narcissist wouldn't be able to do that.
That is so interesting.
I mean, this is why talking across fields, like, I don't know.
We should pat ourselves on the back for, you know,
being willing to engage with difficult ideas.
Yeah. It feels so good. So, okay.
So there's a standard view,
I don't know if I believe it, that companies are psychopaths that just aim
at profit, because that's the only thing they respond to.
So you might think, I mean, exactly what you said,
that the more you only respond to praise, the more you will spend all your energies optimizing your thought patterns to get praise.
And the structure of Twitter makes it really good for
harvesting praise.
And obviously not just Twitter,
but YouTube and stuff like that.
All the modern platforms, where you have a channel
and the likes go up,
and most of the time
your followers like you. So
it's almost like,
if you could create an environment to optimally evolve the ultimate emitter of viral ideas, if you wanted
to select for narcissism, filter out all the people who aren't narcissists,
and then build up the narcissism amongst those people, then we have that now. It's awful.
I think I have
a related question, Thi, and I want to get to it before your time runs out.
So these ideas, your talks: I really like them, obviously Matt really likes them, and I know
that a lot of people that have been exposed to them, especially on the left of centre or the far left of centre, you know, the left wing in general, tend to find them appealing.
Right? Because it hits a lot of buttons. For one,
it's criticising social networks for their incentives, and the ways they're damaging society and our brains.
And two, it points out a lot of the features within right-wing conspiracy communities
or right-wing gurus, or even the IDW so-called centrists,
the dynamics that are at play there.
But one thing I wanted to put to you
and get your input on is, you know, me and Matt are just
as guilty of this, that most of our examples are taken from the right side of the spectrum.
And I don't think there's an equivalence here.
I think there's an issue that you're sampling from a biased pool because there's a bigger
amount of it on the right.
But I want to ask: do you have any suggestions about how people avoid essentially taking the points that you're making and viewing them as things which my opponents on the right do, but which us on the left are, you know, generally immune to or less prone to acting on?
And do you think that's true?
Is there an imbalance or is it just our self-serving biases in play?
Right.
So obviously I'm fairly left, and I think there are echo chambers and moral outrage porn and seductive clarity on both sides.
I think it's quite asymmetric, but of course the other side will say, oh no, you're all whatever.
But I don't think it's totally asymmetric. And I mean, when I
write this stuff, I always am hoping to write it so that the experience of
someone reading it is: I see it on the other side, but wait, what about me? And I always try
to put little hooks in at the end, because I think it's easy to get someone to take it on board and
then turn it around. But the thing I'm talking about is, I have become really cautious
of versions that look like this on the left, and I think I see
a decent number. And again, it has that feel: here's a really nice theory that explains everything,
and it makes the other side totally evil. But you also have to be careful. I mean,
one of the dangers here is that some of this stuff, I think, starts to look like versions of a conspiracy theory.
And one thing that you always have to remember, and here's a way in which I'm kind of opposed to a lot of other people that think about conspiracy theories in the academy:
a lot of people want to say that all conspiracy theories are bad. What I want to say is, no, no, conspiracy theories are sometimes good,
and they are good when there's actually a real conspiracy. You know, here's a situation in which
you should believe in a conspiracy and believe that the mainstream media is corrupted:
if you're in Nazi Germany and you're looking around, thinking, there's a conspiracy that's sweeping the world and it's corrupting the media,
you're actually right.
So one of the things that we have to be careful about is, you can't just say: look, no conspiracy theories.
So the thing is, a lot of the people on the left, their beliefs about the functioning of capitalism look a lot like a conspiracy theory.
And now we have to do the hard work of not saying, well, dismiss all conspiracy theories. We
actually have to figure out which ones are legitimate and which are not.
Okay, so one
thing that struck me when you were talking about that, and it's an issue that comes up for us, is that some of the
gurus we look at, like the Weinsteins, are always front and centre of my mind,
because they're kind of excellent at this. Eric Weinstein
talks about responsible conspiracy theorizing, and Brett Weinstein talks about
conspiracy hypothesizing, not theorizing. It's just a hypothesis.
And both of them make the point that you just did,
where they indicate that there was Watergate,
there are the dirty tricks of the CIA and attempted blackmails,
and there are conspiracies in the real world.
So one thing I'm curious to get your feedback or opinion on
is how do we avoid basically saying,
you know, on the right, they have conspiracies
about the postmodern neo-Marxists overtaking academia,
and that's obvious nonsense,
but the left has things that look similar,
about capitalism, and institutionalized racism could be presented that way as well.
How do we avoid it just being that we say, well, the conspiracy theories on the right are obviously crazy, but the ones on the left, well, they're in the category of, you know, reasonable ones? Right. I mean, I'll do you one better. So let me give you something that I think
has the shape of a conspiracy theory
that I probably believe,
and that I think there's pretty good evidence for.
So if you read the book Dark Money,
this is a journalistic investigation
of the Koch brothers
and how they've been spending money
to influence politics for the last, you know, 20 years.
So they've been funding various libertarian think tanks.
They're funding scientific ventures that support the progress of big oil, stuff like that.
And it's a story about a long-term informational manipulation for a purpose by a particular elite cabal, this time the Koch brothers.
So I've read this thing.
I've checked it up.
Seems reasonable to me.
I have high credence in it.
And it's very explanatory of a lot of weird features that you see.
So, I mean, here's the difficult line to walk.
When I talk about this 'clarity is seductive' thing, people always say, and I think it's a good thing to say: oh, well, that was very clear, that
made sense of everything, so should I be suspicious of it? And maybe the right answer is
yes, be suspicious. But it would be too easy if all conspiracy theories were false. Those stupid people,
they believe in conspiracy theories: that's too easy, right? That
is exactly the earmark of the thing we're worried about, because we know for a fact that some
conspiracy theories are true. So now we get into a much more complicated space. First of all, we
know that some conspiracy theories are true. Second of all, remember, I have this worry about
clarity being seductive by imitating the joys of understanding. But the other thing is,
real understanding also makes things coherent and is joyful. I mean, you're a scientist.
When you see a theory... it's not like we should be suspicious of unifying, coherent
experiences that feel pleasurable. It's that there's a cartoon manipulative version closely imitating a real thing, which is:
oh my god, some theories do make sense of the world, and that feels good.
And so now we're in this incredibly difficult space where we have to carefully separate
real conspiracy theories from fake ones, and genuinely pleasurable unifying experiences
from cartoon ones with the pleasure slightly amped up.
And that is incredibly hard.
Sometimes I worry that, for many of us, we don't quite have enough information to do it.
That's my paranoid worry: that sometimes it's just a matter of luck which institutions you ended up connected with.
But I mean, it's super hard.
And again, I'm not quite sure how to do it. One of the things we talked about before is that there is the signal that certain things are just a little too easy, that they've just been made for pleasure.
But I'm almost worried that a sufficiently clever group could fake that by making it
fairly difficult, but not too difficult.
Like, again, you talk about the labyrinthine quality of QAnon, right?
So, like, this is hard.
Yeah.
Look, I'm going to be a bit of a philosopher here and define terms a little bit, because in psychology
we like to focus on conspiratorial ideation
rather than conspiracy theories. So
if conspiracy theories are the content, then the ideation is the sort of mental
processes. So the problem with conspiracy theories, as we've talked about, is that
there's heaps of them around and they're completely true, because of the way they're defined:
any group of powerful actors acting secretly in their own interests,
and maybe not in everyone else's best interest, right?
So that is mundane.
Yeah.
That happens all the time, everywhere.
So it's far too broad a thing.
So really, when we talk about conspiracy theories,
we're using a bit of lazy language here.
Really what we're talking about is conspiratorial ideation, which we define specifically to be
having unwarranted, paranoid, and overly elaborate models of this sort. So I think that's just a helpful
way to put it. Because it's almost like the opposite of good science, isn't it? If you think of all the things like Occam's razor
and working from an evidence base and so on, conspiratorial ideation is almost doing
the opposite: having a large, intricate kind of theory with lots of tenuous connections,
maybe some internal contradictions, which a theory shouldn't have, and being basically
motivated by prejudices or biases.
Anyway.
Yeah, it sounds like both of you are hitting the point that,
you know, it's not paranoia when they're really out to get you.
But I think the point you made, Thi, about there being versions of conspiracies which are accurate.
And, for example, the flip side of the Koch funding
is people focusing on George Soros, right?
And now, again, I'm not saying there's an exact equivalence
to draw here, but there's a version of it where, yes,
there are funders who support specific kinds of causes,
and some can be more nefarious than others,
but they are funding things,
often doing it fairly openly. So the question is when it's hidden, through shell
companies and all that kind of thing. But your point that it's hard to thread the needle is
really important. Because take, for example, the current issue about the origins of the coronavirus.
Now, this is a topic that's super popular amongst the gurus we look at,
to highlight their heterodox credentials,
that they're willing to consider the possibility
that it's a lab leak.
But they don't just consider it a possibility.
Some of them set the possibility at over 90%.
And what I find is they frame it as if
nobody's allowed to talk about that hypothesis,
that it's not even
on the cards, that it's verboten. But when you actually listen to experts discuss it, they do
leave space for that possibility. What they do is put it in with the probabilities, and
they say: we can't rule this out completely, but it's very unlikely from the current evidence, and they give the reasons and
go through them. But getting to that nuanced place where you're saying that what the other person is doing is
conspiracy theorizing, even though there is still a possibility that it's true... It's the reasoning
approach which is going wrong, which is echoing what Matt said, rather than the outcome.
Say there was some massive Chinese government cover-up, and all of the virologist community had not anticipated the level of
duplicity that was involved. It wouldn't mean that the reasoning was wrong. It would just be
that there was a grand conspiracy, which was extremely unlikely. And yeah, it seems
difficult.
So, thinking about this and thinking about the ideation thing, I'm a little worried
that the psychological approach that Matt is talking about makes things a little too easy
and helps itself to a certain thing. I mean, think about something like
'paranoid'. Could I use this in self-reflection? Could I be like: look, am I doing the real thing, or am I involved in ideation? Well, it depends if I'm paranoid. But again, the problem is, a belief in a conspiracy theory is paranoid if it's false. But if you believe it, you believe it's true, right? If
you did the reasoning, then you're not going to think that you're unwarranted. So, in the background, I think there's a slightly different picture about what's happening with conspiracy
theories. So, my worry is that the psychological conspiratorial-ideation account may be
right about some people, but I'm worried that it's sometimes too individualistic an account, mostly focused on trying to find a reasoning error in a person.
And my worry is, again, if someone sets the evidence in the right way... I mean, remember,
the big background picture here is that we learn who to trust from other people, right?
So everyone has to trust their parents and teachers about parts of the world. And if your parents and teachers tell you most of these things are false, that it's only Fox News or whatever that tells the truth, then my worry is that it's not an ideation problem. It's a large-scale, sustained misinformation
problem. And if you make it pleasurable, it's simultaneously a little extra sticky, right?
So my worry is that what we should be looking for is reasonable procedures in tainted informational environments,
which is a different story from a purely psychological process.
And I think that story might be true of some people.
Yeah, look, I completely agree with that.
And in that psychological frame I gave, I don't want to overemphasise,
you know, the biases and fallacies or the emotional motivations of the people.
Those are useful explanatory factors, but I definitely agree with you that they're not
a necessary component at all. But I probably would stick to my guns slightly in describing
the process, the way in which the claims are evaluated, as enacting bad scientific investigation principles, I suppose. However,
this connects back to what you mentioned earlier on, which, again, I wanted to follow up on
because I really think it's important: how important having a good trust network is, and
that we all necessarily rely on authoritative sources of information.
Like, my opinions about climate change do not derive
from a close inspection of the raw data.
There is far, far too much of it.
We've talked about specialisation and so on.
So I'm simply agreeing with you that, in practical terms,
in terms of how we, or just people as consumers of information
and havers of opinions, probably the most important thing is figuring out the correct trust
allocations to have. And a lot of the people who are victims of conspiracy theories, or adhere to
them... there's nothing wrong with them psychologically. Just as you said, they've simply, often through
no fault of their own, allocated their trust to the wrong sources. I mean, sometimes I
can tell a story that says, no, that procedure is totally
reasonable. Other times I think: what it often seems like is, once you accept
the belief system, it's self-sustaining.
But what about the moment of acceptance?
And sometimes I think what you might find is not wild irrationality, but a moment of weakness, where you're like, oh yeah, it's nice to believe that person.
And then once you start doing that, rational procedures are self-sustaining in continuing that belief.
So it's just a little slip.
And I don't even know how to assign responsibility.
And I mean, there's also all this other stuff, where I worry that
if you have these situations where you reason a little bit more loosely,
just a little bit, and you get enormous amounts of emotional comfort,
that's really hard to resist.
Yeah, no, my gut feeling on this, and this is just pure opinion now, not being all
professorial and stuff, is that two things help with
that: trying to be dispassionate, just cultivating that a little
bit, and that helps, I think; and being willing to revisit one's assumptions.
Those two little good habits can maybe help us pull back from the brink after we've
accepted that first premise and then started going down a rabbit hole. And I think we all have.
I think everyone has gone down some little rabbit hole at some point in their lives.
And you need to be able to walk it back, and I feel
like those two things can help.
Here's a worry. I'm going to continue to play the pessimist about
rationality here. I'm not totally sure about this, but here's my worry. A lot of the time,
the systems that are so sticky and catchy get some of their catchiness by simulating a particular
experience of rethinking your assumptions. And that looks like: oh, what, your assumption was that
CNN is trustworthy? Rethink that. And that's why, in some ways,
I think the party line in a lot of these worlds is: oh, you sheep, you just trust CNN, you haven't even thought about it.
We've thought about it.
We've stepped back.
We've worried.
And I mean, so I don't think you can say,
oh, you're never rethinking your assumptions, right?
And that's the pleasure.
No, no, I agree.
There is no magic bullet that solves that problem.
It's very hard to reason your way out of a place of
delusion, I think. But that point about scepticism and cynicism
being rewarding: I was strongly interested in Buddhism when I was a teenager, in the slightly exotic,
'oh, it's a philosophy, not a religion' kind of way. And then I went to university and started studying actual Buddhist history and
cultures and found out, oh dear, my illusion was shattered. And that was unpleasant, right? But
then there's a pleasure that comes from it, where you're like, oh, actually, now I get to find out
the reality, and it's complex and it's messy, and the history is actually interesting. But there's
a pleasure in that: oh, I saw through the facade. And
when I see people talking exactly like you said, Thi, about CNN or the WHO or
institutions, there's the same feeling, that they've seen through things
that others haven't. And it's really hard to explain: well, I get the pleasure,
and I also get that you're right.
You're right to be cynical, but you've taken it too far.
And that feels like a position that's hard to communicate in a way that doesn't end up sounding like special pleading.
But it's probably exactly what you said at the start of this conversation: you know, the reality is complex, unsatisfying, sometimes
a bit contradictory. But that's what you have to deal with if you want to grapple
with reality. I was actually thinking about this on a walk earlier today, and it went in a
weird place, so let me see what you think about this. Sometimes I think: okay, is
there an internal hint that I'm caught in something like this, right? And one thought you
might have is, I said before, one of the pleasures of a game is that it's made to fit
your mind, and you feel capable of doing the stuff in the game. And the worry I had was:
if you think you have a total picture of the world,
how could you think that the world is something that would just fit easily in
your mind?
So maybe a little sign is that it's too easy.
You think you have a final answer. But then again,
once again, the worry is: what did it feel like to be Darwin? Oh my god,
I understand so much now.
Yeah, there was a real Galileo,
even though there are a lot of people
thinking they are Galileo.
Well, yeah, I still think it's
a good rule of thumb. I quite like that.
When it seems to fit
like a hand in a glove, and
the mist seems to fall from your eyes
and everything now makes sense,
that should make you very, very suspicious and cautious.
Yeah.
Yeah.
Cool.
So is there anything else we wanted to cover,
any points, before we wrap this up?
Anything you feel we've badly misrepresented?
Yeah.
This is awesome.
This has been an incredible time.
If I spend some time and see more of the phenomenon,
and I've been thinking about this more,
I'd love to talk again and figure out more stuff.
I do think that the philosopher's way of thinking
and the psychologist's way of thinking
and the anthropologist's way of thinking
are usefully different, intersecting things.
Yeah, totally agree. This has been really...
you know, I feel, after we just did the Douglas Murray episode, where Douglas Murray and Eric
Weinstein slapped themselves on the back for four hours, I'm in danger of falling into that territory. But I will say, just for me,
this has been an extremely enjoyable conversation.
And Thi, I really genuinely love the work you're doing.
So, yeah, keep it up.
And I'm sure our paths will cross again before too long.
Absolutely. I think there are
so many intersections between the stuff we're doing with the gurus and the stuff you're investigating in your academic work. And as we said offline, we are hoping, planning, to eventually write something academic-y on this ourselves. And yeah, it'll be great to be working in the same field. So I just want to say thanks very much for coming on.
We will post links
to those excellent lectures
you've given
that are available on YouTube
and also a link to your interview
on Embrace the Void.
And so some good stuff there.
And if there's any other cool things
you want to share with our audience,
we can probably find a space for it.
Thanks so much.
This is awesome.
Yeah, thanks so much.
I could continue on endlessly.
So sorry.
Maybe we should sometime.
That would be awesome.
That would be great.
Excellent.
Bye.
Thanks, mate.
See you.
Bye-bye. Thank you.