Cognitive Dissonance - Episode 646: Discussing Jonathan Haidt with Aaron Rabinowitz
Episode Date: September 12, 2022. Thanks to Aaron Rabinowitz for joining us - check him out at...
Transcript
Be advised that this show is not for children, the faint of heart, or the easily offended.
The explicit tag is there for a reason. Recording live from Glory Hole Studios in Chicago and beyond, this is Cognitive Dissonance.
every episode we blast anyone who gets in our way. We bring critical thinking, skepticism, and irreverence to any topic that makes the news, makes it big, or makes us mad.
It's skeptical, it's political, and there is no welcome mat. This is episode 646 of Cognitive Dissonance. And we are joined this episode once again by longtime favorite
Aaron Robbie Rabinowitz. I believe he prefers Robbie. That's like, I think that's what he
told me in the beginning. He is the co-host of, or host, I don't know. He's on Embrace
the Void, Philosophers in Space. He's a monthly columnist for the UK Skeptic.
Not to be confused with the US version of Skeptic,
which is gross and nobody should read that.
I don't think anyone does, though.
No, I don't think so.
I don't think so.
That's pretty much nothing. Well, Jonathan Haidt, maybe.
Jonathan Haidt, maybe.
He may.
We'll find out.
Probably.
Probably does.
I'm sure he considers it a respectable source of information.
Welcome again to the show, Aaron.
Thanks for coming on.
Hey, excited to diss some more cogs.
All right, so we wanted to have you on
because Tom and I had read this article called
After Babel: Why the Past 10 Years of American Life Have Been Uniquely Stupid.
It was written by Jonathan Haidt,
who we had no idea
really who he was. So we had read, Tom and I had both read a book. I read it first, The Righteous Mind.
And then I said, Tom, you should read this book. It's an interesting book. Interesting way to think
about how we decide what's moral and what's not moral. It's an interesting bit of moral psychology.
May or may not be true, but it sometimes very much feels like it's true.
But I'm not smart enough to know whether it's true or not.
And then, so we listened to it together.
We talked about it.
We both seemed to like it quite a bit.
Then I came across this article in the Atlantic,
sent it to Tom.
Tom liked it quite a bit as well.
Loved it.
Because it really does resonate with both of us
with how social media has really face-fucked the entire earth. And so where we're at right now is, you know, Tom and I are both, you know, we're both
sort of on the same train. I think Tom's a little farther down the track than I am, but not much
farther. And we thought, you know, what's a good counterpoint? Who could we bring in here? Who's
maybe more high on the internet, more high on social media, somebody who does have critical
thoughts about it, but doesn't really feel as, as doom and gloom as Tom. And so we thought, why don't we bring Aaron in and see
what he thinks. And as soon as we brought him up, you're like, oh, that fucking guy. So we had
these extended back-and-forths. We're chatting on Messenger, guys. And
we're having these back-and-forths where we find out this author, who we
knew nothing about.
We didn't know anything about him.
And that's a topic that actually
we will discuss as part of this.
And I guess he's a complete shit.
Yeah.
And so I know that we want to talk about that.
Oh, he's a partial shit.
He's a partial shit.
He may be shit adjacent.
Yeah.
And I know that that's going to be an interesting...
He has been shit-adjacent, even, in my opinion.
Yeah.
So we want to start out today by talking a little bit about the book, The Righteous Mind.
Now, if anybody who's listening hasn't listened or read The Righteous Mind, the concept is pretty simple.
The idea is that we very much pay attention to our social circles to sort of figure out where our morality is, and
his idea is really that most of our morality is virtue signaling. We're doing this for the rest of the
group, to sort of say, we are moral. And, you know, Tom and I actually, we were talking about
this earlier, we seriously bonded and really met
over reading philosophy together later, when we were both in college. And we read, funny enough,
moral philosophy: the Grounding for the Metaphysics of Morals. And that is a book, too,
that really talks about this, as a way to say the only real, true moral action you can do is if it's
not virtue signaling, if you're doing something that's completely outside of
what is advantageous for you,
right?
Then that is the only way to do a true moral thing.
So it's funny.
We think it goes all the way back.
Go ahead.
No,
if you got a critique of that,
go ahead.
Well,
to be fair,
there are Kantian emotivists.
There are Kantian virtue signalers who think that actually ethics is both what Kant says and also the virtue signaling.
But we should be careful about using terms like virtue signaling because it can mean a pejorative thing or a neutral thing, right?
Like there's a pro-social behavior that everybody likes, which is signaling your virtue by doing good things, right? And then
there's this like potentially bad thing where people like over-signal their virtues in some
ways or signal virtues they don't actually have or something like that, pay lip service in this
kind of way, right? And we want to be careful because I think people slide back and forth between those two concepts too easily.
And it gives the impression that humans are worse ethical reasoners than I think we actually are. So I guess his thesis is that a lot of moral behavior revolves around your position
within a certain in-group and that it validates your position as part of an in-group,
often politically and frequently socially. Yes. And let me front-load the concern, to help explain why I'm putting a pin in this: there's a large debate, a very broad debate across multiple spheres, about whether human beings are fundamentally, either individually or as groups, reasonable or rational entities.
Okay. Right. So there's a bunch of different theories. One camp is going to say that human beings, either individually or as groups, are irrational or not reasonable; they're not acting via the kind of reasoning that they like to think they're acting via. Or there's going to be the other camp, who are going to say, actually, they are reasonable, contra, I think, conventional wisdom these days, which would say that we're not reasonable creatures. But I think there is good...
There was essentially like a movement sort of critiquing the idea that humans were rational.
And then there's been sort of a shift back, I think, towards...
Well, wait a minute.
I think we overcorrected too much towards the idea that like we're nothing but just cognitive biases all the way down kind of positions. And I think Haidt's view, his moral intuitionism
slides a little bit too much
towards sort of seeing us
as bad moral reasoners.
I'm also a moral intuitionist,
but I think we tend to be more reliable
and partly reliable
because we are social reasoners.
So, right, being a social reasoner
doesn't necessarily make you
an unreliable reasoner. That's an important note, right? Like, we all interact with our social
circles when we try to understand ethics, but that makes us better at understanding ethics,
not worse, I think. Can you explain to me what you mean by social reasoners?
Yeah. So, the idea would be you have these two kind of models of how human beings reason. You
have the individual reasoner, like the Cartesian, go sit in your office and sit in a big chair next to a fire and cogitate on what is like the true nature of the universe by yourself.
And then there's the like go out in the world, interact with other people, have arguments with them, have conversations with them, be pressured by their social cues, et cetera.
And that is the better environment
in which to gain real understanding.
So social epistemologists are ones who will argue
that that is either our natural way
or our evolved way or the best way.
There are lots of different ways you could frame this,
but something along those lines.
Essentially, yes. And that's Haidt's view of social moral epistemology, but he's saying, that's fine. It's
a good thing. It actually has driven us towards better moral views or something like that.
So one other thing that I want to just, at the outset, one other question I wanted to ask you
specifically kind of to help me understand or frame this conversation we're going to have is, I don't know that I fully understand, although I have, I guess, an intuitive grasp.
But what do you think the difference is between moral psychology, which is, I think, the directionality that Haidt purports to come from, and moral philosophy, which is what we touched on briefly with respect
to Kant. Really, really great question. I think it's a really important question. Thank you. Thank
you very much. I thought so too. Yeah. Great question. I don't ask any great questions.
He's never said that to me, Tom. No, it's fine. Go ahead. No, it's fine. Go. No, it's cool. I know,
I know. I know it's, I know it's cliche to respond. I won't, I won't say it that way ever
again. I literally was like thinking,
you know,
Oh,
we should,
we should really make this distinction.
And then you ask that question.
So like,
I have to give you credit.
Absolutely.
You do have to give me credit.
If you didn't,
I was going to actually tell myself,
I was going to be like,
you know,
and I want to point out that that was a great question.
I just want everyone to know.
All right.
All right.
All right.
Yes.
You've made fun of me. We love you, buddy. You're great, and I love you. It's a really important question because
I think it's important to understand that Haidt is a moral psychologist, and when he does moral
psychology, he's pretty decent. And when he does moral philosophy, he's bad, in my opinion, on multiple levels.
So one way to understand this distinction would be the descriptive versus prescriptive
distinction, right?
Moral psychology is just trying to describe how humans think about morality.
It's trying to look at us and say, what are we really doing when we're doing moral reasoning?
Are we just
rationalizing after the fact? Are we actually motivating our behavior by figuring out what we
think is the right thing? Those sorts of questions, right? So his moral foundations theory that we're
going to talk about is moral psychology in the sense that it is just describing all of the
different ways that human beings think about moral foundations.
Moral philosophy is in theory, well, so I'm saying all this and then I'm going to say, but, right, obviously, because I'm a philosopher, so I have to contradict everything I say, right?
Moral philosophy is in theory more prescriptive, right? It is telling you what you should believe about things, right? What you
ought to do. And then, of course, the but is there are people in both camps who will say,
actually, we should be doing the other thing. And the reality is it's a giant, messy clusterfuck
because what we ought to do is impacted by our moral psychology, and our moral psychology can be
shaped by what we ought to do. So you have a hideous feedback loop between all of this shit.
So it makes it a giant mess. So there's no stay-in-your-lane solution to any of this.
There's no, like, I'm just going to do psychology and you can just do philosophy. We have to do them
together. I just think that in particular, for various reasons, Haidt is not especially good at
making the jump from descriptive analysis to prescriptions about what we ought to do or what
actually is good or even how moral philosophy works. I think his account of pluralism is wrong.
Like it's just a bad account of pluralism. So yeah. I want to get back on track just a second because there's another big part of the book that's really important.
And that part is really sort of, it's leaning toward what you sort of alluded to earlier,
is that moral intuition is what leads us and our rational mind makes up excuses for why we believe the things we do after the fact. He uses a metaphor
of an elephant deciding where it wants to go. That elephant is the moral intuition and the
after-the-fact post hoc reasoning is the rider, what he calls the rider.
That's our rational mind trying to decide why we think the way we do. Why are we revulsed by this particular
example? Why do we think it's wrong? Why do we think it's right? And, and his, the whole,
the whole argument of the whole book is that everyone is being dragged around by this intuition,
their whole life, this moral intuition. And then we spend our time sort of as our own
politicians, as our own press secretaries, trying to tell everyone why we think and why we believe
these are the right and important things to either be revulsed by or to think are moral,
et cetera, et cetera, et cetera. I would add to that the book spends a lot of time on a concept
that I spent a lot of time thinking about afterwards, which is that disgust itself is a primary driver for understanding our moral
response. Good point. And he spends a lot of time on it. He
includes a lot of research on that, which, as a non-research guy, I found initially very compelling and very interesting.
And also, and I would throw this in the garbage can,
but say it out loud, it felt intuitively accurate.
Yeah, okay.
So lots of things there, right?
First, we keep using the word intuition
and I've discovered that some folks don't even,
that is not a common term for a lot of people. So I think we should back up and say like, intuition is this kind of
pre-reflective thinking that individuals do. And so like at every level of this, there can be
degrees of kind of pessimism about our ability to be good moral reasoners, right? Like, overall, he's making
an argument that I think is leaning very pessimistically in the sense that, you know,
we're the tiny little rider, and our intuitions are this giant elephant. And his view of intuitions
is fairly pessimistic in the sense that he thinks this is emotional, social-driven thinking,
and that makes it unreliable.
But that's a questionable theory.
I think you could argue that intuitions can be reliable,
that moral intuitions are a useful source of understanding,
and that there's a weaker version of this
that just is more about persuasion,
which is to say you have to placate
the elephant before you can talk to the rider.
So this is
a question of how much is the rider just
post hoc rationalization?
Or is the rider actually
making the calls, but only
when the elephant is happy?
So that's one way to read this. And I think
that's a plausible reading, that
you're a better moral reasoner when you're not, I think, being sort of short-circuited in certain ways. But it's also true that sometimes your fast moral reasoning can be better than your slow moral reasoning.
So I think that there's a lot of actual complexities to how we should approach this intuitionism, even if it is the
case that in general, we are these kinds of social reasoners. One thing too, that I think is
interesting is that it doesn't account for you changing your mind about moral reasoning. It
seems to think like, like you're driven by this intuition and that intuition feels sort of, it's,
it feels very stable. It doesn't feel like a thing that
can move or can change very much. And when he's talking about it, it feels like you just,
you make these decisions. But I used to, I used to be pro-life. Like, I used to be a
conservative. Like these are things that I very much believed in that I don't believe in anymore
because I thought about them. And I think he, he doesn't give enough credit to those things. Cause I think those are moral positions,
like, you know, like, and so I think he doesn't give enough credit to that.
I would add to that, that one of the things that I did think about when I was thinking about the,
the revulsion, the disgust concept, it did ring intuitively true, but it rings intuitively true
to me on, on that sort of initial gut level. And I thought like,
all right, well, like the heuristics that we develop, we develop, I think from a variety
of different like input methods, some of which are thoughtful, some of which are disgust oriented,
some of which are like maybe innately sort of things that we are tuned or
attuned to. But like those like heuristics, I think are, are more broadly complicated than I
felt like was, was credited in the book. And so to your point, Aaron, about like,
are those gut intuitive, intuitive heuristics, are they accurate or inaccurate in terms of like leading
us toward like morally like right answers? I think like that, that answer is also complicated by like,
what are the inputs that create those heuristics in the first place? And how do you, how do you
like pull that shit apart? I mean, like, I don't know from, from your comment before, like, I don't know that there's like a reasonable distinction in terms of practicality between
moral psychology and moral philosophy, because I don't, I cannot possibly imagine probably because
I'm not smart enough, how you could separate them in a disciplinary way that still had rigor.
I think what you want to do is separate,
making sure it's clear when you're saying,
here's how I think people reason,
and here's how I think they should reason, right?
You have to study both in conjunction,
but like in your work,
you want to make sure that it's clear
when you're doing one versus the other.
And I think this disgust thing
is really important to focus on
as like a way to critique.
When I said that Haidt, I think, is doing what he wants to do poorly, I think that he doesn't take pluralism seriously enough and his moral foundations theory kind of suffers for it.
So should we maybe unpack, I guess, those things for a second?
Yeah, sure.
Go ahead.
Explain the disgust thing?
Yeah, so moral foundations theory,
which is the stuff that I teach when I teach...
So I teach intro to ethics, I teach philosophy,
but I teach this in the class because as you say,
it's unavoidable.
You can't not have it as part of your system.
If for no other reason than you need to know
how to persuade other people,
if you're going to try to make moral progress in the world.
And I do think he's on to part of a persuasion point,
if not a further moral point.
So moral foundations theory is basically the idea
that human moral reasoning involves individuals
combining several moral foundations,
which are themselves in tension,
and the different ways that they balance those
tensions, the different ways they combine them, produces the different kinds of societies and
behaviors and personalities that we see in the world. So the examples that are most commonly used
are things like care versus harm, or fairness versus cheating. But the basic idea is, you know,
you value caring, you value fairness, loyalty, authority, sanctity or purity, and liberty.
Those are the ones that often get used.
And this gets applied to politics by Haidt, where he argues that the difference between liberals and conservatives...
And he gets a little essentialist about it, going back to your point about how do you change minds.
He kind of suggests at some points
that you're just born a liberal
and you're likely to just end up a liberal
in that kind of way.
Or you're born a conservative. You're born with different
amounts of fear reactions and those
drive what
moral foundations you prioritize.
Yeah, he does say that.
And there is a little bit to that.
I don't think he's totally wrong.
I don't think it's quite as strong
as he might think it is.
Sure.
But I do think there is some decent evidence
that there is a consistent difference
between liberals and conservatives
in terms of which of the foundations
they prioritize specifically.
Yeah, the Republicans,
like conservatives, prioritize all of them equally,
supposedly, and liberals or leftists prioritize care and fairness and deprioritize loyalty,
authority, and sanctity. And quasi-liberty; like, liberty is the third for
liberals. He says it's like down their list. He says they'll let it go
if they think that care is better.
So, very often, they'll let it go.
They're very, like, his view is
they're very hierarchically placed
in the liberal mindset. But in the conservative
mindset, he's just like, no, they're all the same.
They're all the same, yeah. No, that's why
I'm not crazy about the last part
of this book.
Yeah, and I don't think you should be because I think it's wrong on several,
several fronts.
So like,
here's a question for you going back to your disgust thing.
Now that we've,
we've laid that out.
I think it's a really good question to ask is purity or disgust actually an
independent moral foundation in some way that is not sufficiently covered by
the other moral foundations?
Or is it at best sort of an evolutionary inroad for us to gain some insights about what is good
for us to do or not good for us to do, but actually should be discarded in favor of care
or fairness, right? What do y'all think? I don't know, because, like, I think I'm struggling right now
with the framing of that question, I guess, because... Oh yeah, let me frame it as a hypothetical,
right? So this comes up, for example, in the issue of gay rights or gay marriage or something. Or
gays. Just the gays. Okay. Like, a lot of... Yeah, it's gays, excuse me. A lot of the conservative reactions
to homosexuality are disgust-driven. Like, we all know this. Yeah, yeah. Not remotely a question.
Right, right. If you're given a moral choice where you have to decide between, and this often
is what happens, is people have to decide between foundations, right? All of these foundations might
have some value. Maybe. I don't think they all do.
But I'm a moral foundations theorist in the sense that I do think there are a bunch of irreducible,
competing moral things that we have to try to balance, like freedom versus safety,
all those kinds of basic trade-offs, right? But I don't think there's anything actually of value in the idea of purity or disgust such that if we have a situation where some folks are like, hey, I'd like equal rights, and other folks are like, yeah, but you're really icky.
I think we should tell the icky people they're fucking wrong.
Like, they're just morally objectively wrong.
And I think Haidt's bad at telling people they're morally objectively wrong. Yeah. And this, to me, the reason I struggled with the question the first way it was phrased, and
struggle with it now is that this does sort of, to me, it like the question that arises from this is
how often, easily, or practically does moral philosophy trump moral psychology? So if,
if the gut reaction, and we know it's true, if the gut reaction for
some people is to be repulsed by something, and we know that that creates for some people a
post hoc moral justification for that disgust, I think it's easy to say, yes, and that is wrong.
But then that also, to me, is structurally valueless, because it doesn't produce a result where people move from one position to another unless you can overcome the repulsion. The philosophy is the easier question. We know the psychology, we know the philosophy, but then squaring that circle toward action is where I get hung up.
I also want to say too, what I would do is, and what I think about is you use these foundational
principles, these pillars to try to move the elephant.
So you say, yeah, you're disgusted by that, but you have to consider
someone's liberty. When you think about this, they are, there's two consensual adults. So you have to
now lean on this other pillar to take away your, I don't even know what that would be, sanctity,
I guess. I don't know what you would put that under. I feel awful just talking about it, but
you know what I mean? Like, yeah, like whatever these, these awful people think,
you know, whatever pillar they're leaning on, you have to try to entice them to another pillar.
You have to say, okay, that's fine, but you need to come over here to liberty, because liberty
matters for everyone. Right. Right. Yeah. Yeah. And that's, but I seized on something you
said, like, what these awful people think. And I think Haidt, and to some degree,
I'm hard pressed to disagree.
Like, Haidt's whole thing is like, this is not thoughtful.
And so when a lot of,
when some or a significant amount of our
like moral positioning comes from a place
that is pre-thoughtful,
how strong is that positioning,
especially for people who are not like us,
like sitting and listening. Yeah, I know, we're trying to have a conversation.
Right, right. Like, for guys like the three of us, where it's like, yeah,
you know what, Cecil and I are actually going to become friends over reading like,
you know, Kant moral philosophy. Like that's not most people. Like that's not,
most people are not spending their time deeply considering and reconsidering their moral position.
They're living a moral positioning based on a set of intuitions that are driven in part by things like disgust.
And that to me, like it's wrong, but I'm like,
I remember like in college, one of my favorite stories,
like in college, I wrote this long paper about this book and I'm like, guys, and you could read this as
like this instead of this like feminist way, you could read it, you could read this in this
capitalistic way. And I wrote this long paper. And at the end of it, my professor just wrote,
so what? Because I forgot to make a fucking point about it. I just said it could be done.
And I think about that all the time with respect to this like moral philosophy
is like, so what?
If we don't do anything,
all we're doing is pointing at it
and saying that's wrong and we know it,
but like, how do we move the rider
or how do we move the elephant, you know?
Yeah, and Aaron, you can use that so what thing
on your papers too from now on.
It was the most devastating criticism.
It's devastating and it'll hurt people's feelings.
I'll never forget it.
It'll hurt people, so that's's vicious. It'll hurt people.
So that's good.
I'll never forget that.
That is the only comment.
The only comment on this long thing.
I'm like, oh, you can read this fucking book as like a fucking Marxist.
And then she just wrote, so what?
And that was the whole thing.
And I was like, fuck.
Because she was right.
There's a story, I think, about Anscombe, that she would draw a line in the paper
where she stopped reading. Like, that was her note. That's a fucking hammer drop.
Hold on, hold on. I had a professor, I had a critical thinking professor, the meanest
motherfucker in the world. He was super fucking mean. He would take excerpts from everyone's papers and then he
would critique them in one big paper and he would mercilessly critique the excerpt. And then he
would hand that out to everybody. So he would publicly shame you in his class. Like,
one time I misspelled principle, this is in the days before spellcheck, guys. So I misspelled
principle in the sense that I used principle with an E
instead of with an A
or whatever I messed up.
And he,
he went through
and like was like
telling me how it,
you need to understand
the principle of the principle
of the principle.
He's trying to be all super clever
and he makes you feel like shit
in front of everybody.
Anyway,
so I just had to say that.
Fuck that guy.
That's so brutal.
Fuck you Ken.
Eat a dick.
All right,
go ahead.
It's important that you needed to get that out.
Thanks for letting me do it.
I appreciate it.
Okay, so this is where I really do think we have to reiterate,
you have to keep different conversations somewhat separate,
even though they influence each other.
So there's a conversation about what is the moral truth?
There's a conversation about how do we persuade people, right? And they both matter. So you're asking, like,
what does the truth matter, right? Well, I think it matters because we want to do the right thing,
first of all, like we actually do care about the moral truth and it matters for persuasion because
I think the right way to understand the elephant metaphor is you have to placate the emotions,
but you also have to make an argument to the rider
once the emotions are placated, right?
And so I think you still want to have a good moral argument
for why we should actually side with pro-gay people
instead of the anti-gay people.
Because I think otherwise, in this worldview,
and this is where I think
Haidt goes wrong meta-ethically. He claims to be a pluralist, by which he means,
I think, a cultural relativist, actually. I think he thinks that, to some degree,
almost all of the cultures have to be acceptable in his view, even though he sort of says that there might be ones that are left out.
If you read the book, his conversion experience is going to a patriarchal culture and being like,
well, everyone seems really happy in this incredibly patriarchal culture. Maybe there's
something to patriarchal culture. And that's how he applies this moral foundations theory. He sort of goes in the direction of, all of the foundations have core things that we need, and here are examples from conservatives and libertarians. And that's why part three matters: that's where he lays out what he thinks are the examples of where conservative and libertarian moral palates are better than lefty moral palates. They're more broad. They're more broad. Well, no, he says better,
actually. He says, and not just better for persuasion, I think he really thinks that there is a
shortcoming to the liberal palate, that it doesn't take seriously enough those other foundations. And
I think Joshua Greene, in Moral Tribes, has the better of him on this,
where he basically says, actually, I think the left has a more refined moral palate,
which is to say, leftists correctly recognize that in a conflict between the moral foundations,
care and fairness should win out over authority and sanctity and these other things, right?
Liberty, it's more debatable, right?
But the ones that are the straight-up conservative values,
I think they don't actually have value in themselves.
They only at best have instrumental value towards the purposes of care and fairness.
And from a persuasion perspective,
it's probably not the case that you're going to do better
as a lefty trying to persuade a conservative by trying to appeal to purity or something, right?
You all appealed to liberty in your example because liberty is a shared value. You know,
there's, I think, good evidence that you still benefit the most from sincerely appealing
to values that everyone does share, like
care or fairness or something. Like, you can understand where a person is coming from with
the other moral foundations, but you don't have to try to rewrite your argument to solely
appeal to those foundations. So, yeah, I think he's not a good pluralist, because a good pluralist has to be able to say
one side is more right than the other
about certain things.
If I could just throw out a few examples from that part three: just like, he claims effectively that conservatives understand the concept of moral capital better. Especially, he cites Sowell, who I think is actually not a good, reliable source of information. But, like, this is an example where I think he, you know, is overly credulous towards conservative and libertarian ideas.
I think, you know, I would put Haidt closer to a libertarian sometimes.
Yeah, sure. Absolutely. I would too.
A hundred percent.
Right. And I think he's ridiculous in claiming that libertarianism is not closer to conservatism.
Oh, that's absurd.
Especially in America.
That's absurd too.
That's absurd.
So he says things like, he critiques people who claim that, like, markets aren't good, essentially. He basically says that working markets are the best way to bring health care at the lowest price. He says that we should be effectively, like, social Darwinists about free markets, which is going to be contradictory to his conclusions in the article he wrote, which we're going to talk about.
Absolutely. Absolutely.
In this article.
100%, right?
He thinks conservatives are right to retain nationalism and religion as a form of social cohesion, as far as I can tell. And he effectively sort of soft-pedals the idea of ethno-nationalism because too much
diversity erodes solidarity.
Like he seems sympathetic to all of those positions.
And I think it's because he's not good at actually balancing the moral foundations on
the philosophical side.
Yeah, I actually feel like that is the most helpful critique of how I felt.
Because I was absolutely like, I'm reading the first two-thirds of this book, and the first two-thirds are primarily moral psychology. And I was, I was down. I was like, okay, this makes a lot of sense. You know, there's a lot of evidence cited in here. There's a lot of studies cited in here.
And then I think he does lose himself completely when he then says, all right, what do we do with all this?
Which is the movement then, as I understand it,
from moral psychology to moral philosophy.
He's good at the descriptive, terrible at the prescriptive.
The worst part about the whole book, in my opinion, and then we're going to shift off the book because we've got to move on to this other piece, but the worst part about the book is that he kind of leads you on throughout the whole thing, being like, this is how you can talk to the other side. And there's none of that. There's literally none of that in the book.
There is not a moment that I could glean from that to say,
here's how you change someone's mind.
It never happens.
He kind of feels like he alludes to it throughout,
but he never actually gets there.
That's why, like Tom, I only like the first couple parts of the book, because it introduced me, because I had never heard these ideas. You know, I have a degree in philosophy as well, and I did study some ethics, but I went to a school that was a lot more classical philosophy. It was a continental philosophy school. And so we studied, like, ancient works and medieval works. And so we weren't ever studying modern ethics.
And so I never really caught any of this stuff.
So this was really interesting to me,
an interesting way to look at it.
And so it helped me understand
this moral foundations philosophy,
which I had never really heard of.
So I'm glad you're spending some time explaining this to us,
but we wanna talk about Jonathan Haidt as a person. You had
talked about him being part of this
intellectual dark web.
I had never heard of it. Tom had never heard
of it. In fact, we actually
reached out to this guy
and his PR people. I'm repeatedly
jealous of y'all's minds. I want to live
in a world where you've never heard of it.
So jelly.
Yes, go on.
But we reached out to his publicist to see if he could come on the show.
And he gave us an "after my next book" sort of thing, maybe I can come on. It was, like, you know, a "yeah, whatever, kid" kind of thing.
But we had reached out to him multiple times to see if he could come on to talk about the Righteous Mind book.
And we never really got anywhere with it.
And then when we brought this up to you, you're like, oh no, he's part of this fucking crazy
intellectual dark web. And so I
want to hear about the intellectual dark web,
because he's probably never ever coming on
our show. And so we're really
interested to hear about this. Tell us what that
is for other people that might have
the same virginal minds as
Cecil and I.
Yeah, and for the extended
version of this, I did a really good episode with Chris Kavanagh
on Embrace the Void.
He's the guy from Decoding the Gurus.
He does a lot of like...
We're now described as the critical sphere,
I guess is the way they're calling us.
The people who are critical of the IDW
and the sense makers and all of these things.
So IDW stands for Intellectual Dark Web,
which is a silly name that was probably made up by Eric Weinstein, who's one of the people in this group. I think he was the one who coined it, and it was popularized by a New York Times piece by Bari Weiss, because, you know, all the people are so canceled that they can't get any, you know, any publicity except for, you know, the biggest things possible.
I hate that the only one that will ever publish
my essays is the goddamn New York Times.
It's so fucking frustrating. I can't get anything
written. Like Breitbart's like,
no. Let me get something in the Post once.
No, ass. Jesus.
Times, times, times.
Doesn't really fit our New York Times.
Yeah, no.
So the IDW is basically the anti-woke sphere, I think, is the way to think about this. It's a collection of individuals, gurus, one might say, who are joined in their dislike of liberal leftism gone too far, social justice gone too far, and a sort of common belief that mainstream media and society is all captured by wokeness, and that's bad for various reasons.
This was popularized by Bari Weiss.
In the original article, it included Sam Harris.
I think Jordan Peterson was in there, Joe Rogan.
I think Haidt is in there, the Weinsteins.
God, I'm yawning so much. The names, I'm just like, oh my God, I'm going to yawn every time you say one of these names.
So boring.
Oh God.
It's a collection of fucking dipshits.
Right.
Here's the thing. So, like, all of them, like, have this anti-woke bias. Many of them have various forms of narcissistic personality disorder, I think.
This sphere, I would also expand to include folks like James Lindsay before he completely flamed out.
Anyone who's in that heterodox sphere. "Heterodox" is another term for this, in reference to Jonathan Haidt's Heterodox Academy.
What about that guy who wrote that atheist book, about converting people to atheism?
Oh, Boghossian.
Boghossian?
He's part of that too?
Yep.
Is James Lindsay the guy that does the sword katas?
James Lindsay does axe and sword katas on Twitter.
He used to.
He's canceled now.
And promotes white Christian nationalism.
He got his Twitter canceled.
So he's not on Twitter anymore.
Thankfully.
Aaron, I know that he's a white Christian nationalist,
and I know that that's actually the most important
thing about him, but the most
important thing about him for me
are his cringy axe katas.
It's so bad.
If I was in a tomahawk fight with him,
I would be worried for sure.
I can't tell you how many times I've gotten in an axe fight and I've been like, where's James Lindsay right now?
Right. He's just leaning up against his car, smoking a cigarette. There's a lady getting mugged. He's like, you don't want to do that, punk.
Takes a cigarette and
pulls his axes out. As someone with
cringy martial arts
videos on the internet myself, I
wouldn't necessarily throw stones at that house.
I 100% am a
cringy martial artist myself,
but I don't post them
as a celebration of 100,000
subscribers. Anyway, continue on with your
conversation. No, no. You're
totally fine. So yeah, this loose collection, who all, as a result of being anti-woke, I would argue, even though many of them see themselves as liberal, or classical liberal, or the left left behind, blah, blah, blah. They're all, yeah, Dave Rubin is part of this group too.
The fuck is post-tribal? What the, like, you're just, I love you, Aaron, but you're saying a fucking shit ton of, like, what, I'm... Garbage words. These are garbage.
Thank you.
These are a bunch of silo descriptors, that's what they are. It's like, there's these, like, weird fucking silo descriptors, and it is for sure insane at some point.
So think of the IDW as the silo of people who believe falsely that they are
not in a silo.
So it's, well... It's just, it's okay.
Okay. So, so we, we had no idea who this was.
Yeah. And we recognize that, you know, it's problematic, this idea. And actually, to be honest, even in this article that we're going to be talking about, this After Babel article, he does, in a lot of pejorative ways, talk about wokeness.
Right.
So he's not, he doesn't feel like that's a thing that's a boon or a benefit to anyone.
Way more than he talks about, like, MAGA-ness or conservative anything. I feel like this article, I want to lay the ground, sort of set the ground rules for this article. Before I do, is there anything you want to add about the intellectual dark web before I move on?
I don't want to take away from you if you had a special point or anything
you want to... Well, I think what's valuable
to know about the intellectual dark web is that
the vast majority of them have
either laundered or spiraled directly
into conspiracism of various sorts
because
one of the things that held them together was this
rejection of orthodoxy
or something, but they were like,
they go way too far in the other direction and like continue like yes.
And everything.
Um,
And Haidt has a bit of that. To his credit, I will say, him and, like, Steven Pinker have slid the least far in that direction. I do think that, like, Haidt has lots of terrible views, like the ones I just talked about, but he hasn't promoted or laundered, like, white replacement conspiracism the way that, like, Michael Shermer or other people have.
But he's actually in this article, like, he literally is saying that's a huge negative. Those people are pretty terrible people.
Like, I feel like, yeah, I want to give credit, like, I want to give credit where it's due, is that he is not, and we'll talk about the After Babel article, he is deeply critical of the right as well as the left. And he does go to pains. He spends, it's interesting, because I agree, he spends more time, like in terms of column space, talking about some of the darts that the left shoots into their own head. But he also says none of that is as bad as the darts that the right is shooting.
So there is more column space dedicated to the left,
but there is an explicit statement that it is worse on the right than on the
left.
Let me, let me lay the ground...
I just want to put that out.
Let me lay the ground rules. Let me, let me talk about the article itself, and then we'll go into the article and talk about it. I'm just going to lay out exactly what he talks about in the article.
So he says he uses Babel as a metaphor for red and blue America unable to communicate.
He basically blames most of this miscommunication on social media, on the emergence of social media.
He points to a time, the inflection point he thinks is sometime around the 2010, 2011 era, when there was the Arab Spring and there was also the global Occupy movement. Those things were the start of this happening. He says there's two forces that bind successful democracies: social capital and strong institutions. Oh, pardon me, I guess three: shared stories as well. And social media, he claims, has weakened all three: social capital, strong institutions, and shared stories. He thinks that sharing and like buttons and virality and public shaming are the main problems with social media. And that those are the things that are causing the most
discord in those spheres. He also says that confirmation bias is something that tends to get promoted way more
on the internet.
And it is something that people will fall into way more commonly.
He says that the, that social media essentially gave a dart gun to millions of people.
And those darts do damage to, uh, the intellectuals on their side most of the time.
And it's the extremes that are shooting the darts to sort of pull these intellectuals and the people
who have voices to those extremes. And then he says that we're in for something even worse as
we move forward because AI and the flooding of the information
ecosystem, it can be much easier to do in the future. And then he finally, at the end of the
article, has some steps in the right direction. He says, harden democratic institutions, reform
social media, and prepare the next generation by letting them go outside and play. Now, I want to
say this article is 45 to 50 minutes long if you want to listen to it on Audm.
I'm going to link it in the description in the show notes.
It's an Atlantic article.
It'll take you less than an hour to read.
Tom and I, I've read this article three times.
Same, yeah.
And I think that there's a lot of good stuff in here,
especially if you are a little bit leery of social media.
And I think there is some good things in here
that he points to.
But we wanted to talk to you about it.
Yeah, you're here for a reason right now.
Aaron, because both Tom and I,
we were very much into this article,
thought it was very good.
I do agree that the both sides-ism is a problem
because there's no way on one hand to say
one side sometimes does some
questionable things at universities by disinviting professors and another side tried to siege the
capital you know what i mean so i feel like there's definitely a huge discrepancy between
those two things and he seems to be bringing them up in opposition to each other. And one side always seems worse.
Like one side is vaccine misinformation.
The other one's like, be a little cautious around COVID.
And I'm like, okay, but I get it.
But at the same time, man, these are not so, but Tom is right.
He explicitly very often in the article says the right is worse at this and the right is doing more damage. So, yeah. Here's what I'll say
about the, like, I guess my experience of it isn't exactly just a both-sides-ism problem. I think it's more, his narrative starts too late for me, for starters. I think he locates the problem in 2010 and makes it largely about media, or largely about the internet.
But, like, if his, if the metaphor is supposed to be about the balkanizing of our understanding of reality, right, the, like, splitting up of, we're in two realities, two societies or whatever now, or something like that, or a million different societies, whatever, it goes back way farther. And I think it's important that we talk about that. It goes back to, you know, the social justice, the 60s, like, civil rights era, where you start to have a pulling apart over, you know, that, or, like, anti-scientism and Christian conservatism leading to a pulling apart of basic facts about reality. And then you have, like, the 80s, you have the 90s, you have, you know, like, specifically a move in the conservative movement to, like, not give a shit about facts, essentially, right?
Fox News was doing this
long before the internet
had gotten into anyone's minds
in any significant degree.
So what I would argue is
it's true that social media is an accelerant, right?
It takes what is there and makes it happen faster, right?
The virality stuff makes it faster,
makes it easier to find the people who agree with you
and find community and get reinforced,
all those sorts of things.
But the balkanizing of reality was there to begin with.
It wasn't caused by the internet.
And I think there's a good piece of evidence for this,
which is the internet hasn't face-fucked
the entire world equally, right?
Like some cultures are doing a little better than others
on the like misinformation side of things.
Not like perfect, everyone's struggling,
but like America has a unique problem of Balkanization
because we are a deeply morally and like socially fractured
society that hasn't dealt with our history well. And like all of that was being actively weaponized
by the Republican party for decades before the internet and the internet just like threw a bunch
of gasoline on that fire. So I don't think he includes not enough of any of that in the article personally. So, all right.
Do you then, so the title of the article is, what was the title of it?
After Babel, Why the Last 10 Years Have Been Uniquely Stupid.
I'm close if I'm not perfect.
Do you take issue then with the premise that the last 10 years have been uniquely stupid? Is that, so, to me, to me, as I just read your part, yeah,
so. I think we would say, I mean, like, uniquely, do we mean, like, categorically or degrees,
right? Like, it's been extremely stupid in the past few years. I think his argument is degrees.
I think, to be fair to his argument, his argument, because he, he cites all the way back to Newt Gingrich, which, to your point, isn't far enough, but he doesn't actually, I think his point is a
matter of degrees. Like, like the accelerant, I think part of his point is the accelerant matters.
Like if my house is on fire, but I can confine it to one room, that's bad.
And there's a fire and someone started it and it could get out of control at any minute.
But like if somebody is spritzing the air with gasoline, that is a uniquely problematic behavior,
which I think is what he's pointing at.
So I think it is a matter of degrees,
which is where his argument really starts at.
Not a categorical distinction.
I'm fine to...
Do you not read it that way organically, though?
No, I think the word uniquely
conveys something different.
And I think the fact that he says that it started 10 years ago and that I don't think he talks enough about the history of post-truth or of these deep moral and epistemic disagreements.
So that was just like, yeah, I don't think that it reads that way to me. I'm willing to entertain that argument, though, because I do
think that is closer to something plausible, at least, that you do have, just like television,
likely accelerated things from the previous model of muckraking. The modern age of screwing people over is much faster, right? It's much more effective. You know, cancel culture is, like, the culture part is debatable, but the cancel part is quite real.
Like you can do huge damage to a person online
very quickly, very easily over almost nothing
a lot of the time.
Like that isn't always the case for what happens,
but it does absolutely happen.
And that like also, you know,
you have all of the filter bubble problems that
are true about social media.
You have
all of the algorithm
problems that accelerate
certain kinds of content because they
get more clicks and they're more popular.
I think we talked about this stuff
in our previous chats. I'm
sympathetic to those
parts of the critique, I think,
but then it becomes like his solutions are a little bit wacky, you know, like they get very silly. And it's like, I think it's pretty low-hanging fruit at this point to say that social media is an accelerant. And if that's all the article is bringing to the table, then, like, it doesn't seem particularly insightful, because that's something that we've already talked about previously. I do think it's worth talking about where he goes with that theory, because, like, I think it again reinforces that, like, he's not a very consistent internal moral reasoner to some extent. Um, yeah. So what do y'all think?
Before I... I think that's an
interesting perspective. What I think is, you know, where he goes with it is what drew me to the article, because I recognize, I think a lot of people recognize, I don't think it's a controversial statement to say that social media fucks a lot of things up. I don't think that's a controversial statement to be like, you know, social media has done some really bad things.
And it has made us certainly, it certainly feels like it's made us more polarized.
It's made us where there's very little chance to meet in the middle often.
And one of the things that he talks about and he spends a lot of time on is talking about how small the far right and left groups are,
yet they seem to have the loudest messages when it comes to online spaces. And they seem to be
the ones that are pulling people in directions when a lot of people might not have as radical
ideas. But the social media sort of shames them into thinking they need to have these radical ideas.
And I feel like I'm a little sympathetic to that idea.
I think that there are... that I get called a centrist all the time on this show. People will send messages to me and be like, you're a centrist. I'm like, I don't feel like I'm a centrist. I don't feel like I am, from all of my voting in the past. I get called a fascist, and so, you know, that's the thing is, like, there is a loud group of people on the internet who will go out of their way to tell you things that are farther away from where I am. But to them, to them, anybody to the right of them is a far-right fascist. You know what I mean? And so, like, I feel like there's something to that.
Yeah.
I actually,
I think I wrote about this recently in that,
in the piece I did about Shermer and the mainstreaming of
conspiracism by folks in the IDW kind of sphere,
which is that I think it's really particularly bad because it
causes what I think is the following series of events for a lot of people. Some not radical
person sees what they think is a not radical person making what they think is a not radical
argument. They then think that argument is plausible. They then go to other people and
they say, what do you think of this argument? And they get a really powerful response, right? Like a really overwhelming response. And instead of that,
like actually, you know, helping them understand why the argument wasn't particularly good or why
it's actually like anti-Semitic conspiracism with like a thin veneer of normalcy on top of it,
right? Like they think, oh, this actually is forbidden truth. People actually are resisting this.
Oh, wow.
And they get pulled in that direction, right?
So it does create the beginning of what I think becomes a feedback loop of combative behavior leading towards increased entrenchment.
If your thesis is wokeness is extreme, and then you go to talk to some people and they react really strongly, then you're confirmed in your thesis, especially if you don't know that, like... And, like, the thing is, their strong reaction might also be justified. Let's use our moral foundations here, right? Like, if you're a person like me who spends a lot of time on the internet, like, dealing with this stuff, right, it can get sort of frustrating or tiring to have another person come along and be like, but why, why isn't it true that we're just slowly trying to replace all the white people? Like, you know, you feel the effect of that a little bit.
You know, I'm, I'm drawn back to something we talked about briefly just a few moments ago, because I think it just made me stop and think. You know, I sort of, like, made a joke, like, these are all silo descriptors.
And I was just thinking as we were talking here that,
you know, one of the things that is happening,
I think is that we are describing,
we are constantly like creating this kind of like
semantic cartography of the information ecosystem.
And that labeling that is taking place, like encourages bad faith
arguments and encourages us to make more connections and hear each other a lot less.
And I think that that is something which is like, to your point, like, let's say I were to go online
and I hear an argument and it is a bad faith argument, but I don't know that because
I'm just some guy first hearing about Jonathan Haidt. And I don't have any idea who he is.
And you know what? Tom is actually bad at ever looking at who wrote something.
I just read stuff and I never look at who wrote it. I literally never do that, ever. And actually, it's a rule that I follow myself.
So, like, in the IDW, all that high decoupling, you're a, you're a brilliant high decoupler, because you don't attach the ideas to the people.
And it's a rule that I've followed now for 25, 22 years. I won't, I won't look at who wrote something. So, but that also means that I could very easily, if I was online, I could very easily say, hey, I just read this and it was really interesting, and then I could become labeled. All right?
Yeah.
Oh, is that bad? Am I right? So, okay, it's about a struggle. It's my struggle. Has anyone heard of this guy? It's crazy.
Yeah.
But I, I think that there's something... I do feel like I'm struggling frequently.
Yeah, but I, I do. I feel like there's something to that.
I feel like that labeling,
that reactionary labeling,
that bad faith siloing,
that it's a problem.
It is not a good.
It is not making us smarter.
And I really reacted strongly
to Haidt's article in that respect
because that is making us worse thinkers.
I just can't see that it doesn't.
I also, I think it might be the case in a more sort of subtle kind of way, merely being in a situation where we all feel more compelled to justify our beliefs to other random people online.
Like, think about it before this,
you know,
before the internet,
right?
You didn't have to spend much of any of your time justifying your beliefs to
like random strangers or things like that.
But now that you do,
everybody kind of feels like they have to become amateur philosophers or
experts or whatever in like the things they believe and they have to defend
them.
And like,
they don't make like just bad faith arguments.
They just make bad arguments.
Like,
like we all make bad arguments.
And if you haven't done a lot of it,
like you go out there and you make some bad arguments and some people freak
out at you and you freak out back at them.
And like,
if you're famous,
if you're Jordan Peterson,
you make those bad arguments and then 10 million people agree with them for
some reason.
Or if you're just a young person. Like, one thing that I was very sympathetic to in this article is, like, I think about my intellectual, like, kind of growing up, and Cecil and I did a lot of it together. And we would get together, we'd drive somewhere, we did this in the cars a lot, we did this at, like, a party. Like, we'd get together and we'd have these, like, what we felt like were really deep and meaningful conversations, but acknowledging that we were bad at having those conversations at first.
And that you only get better by having them over and over again with a variety of different people and different inputs and then learning how to have conversations that have rigor and not just curiosity.
But I do feel like one of the things that Haidt points out in this article is that our inability to hear across these sort of ideological lines, our inability to literally,
the Babel metaphor, literally understand the language that is used across these ideological
lines makes it less possible for us to grow in the ways that I remember having the opportunity to grow intellectually.
And I was so sympathetic to that argument because the space that we use to have those conversations,
when Cecil and I had them, we had the luxury of having them in person. And so I could fuck up and
say something terrible and not know it was terrible. And the only one who heard it was Cecil.
But now everything is public. And it's in perpetuity too, right? You can go back and
look at my bad arguments on a message board from years ago, you know? So I think there's something
there. There's also a flip side to this though, right? If we are social moral epistemologists,
right? We need to be talking to people in order to get better at ethics.
I actually think some of that needs to be
within a community that has positive,
healthy morals already.
And the internet provides that
for a lot of people who are trapped in places
that don't have that.
That's a great point.
Yeah.
You know, so like there are a lot of people
being pulled out of bad epistemic environments
by the internet as well as people
being pulled into them,
you know, in these various... It's a lot of epistemic luck about where you end up
on the internet in that kind of way. I do think that there are structural features of the internet,
some of which he highlights, you know, the virality stuff, the like button, the public shaming
that are bad. But this gets us to, like, you know,
his conclusions about what to do about it
don't really make a lot of sense.
Really?
They only get more actionable.
So I do like his conclusion on how to change politics, which is to turn it into ranked-choice voting. And we have seen that pop up in a couple of places in the United States. And I think it's a worthwhile endeavor. I mean, Sarah Palin just lost because of ranked-choice voting. And there were more votes for the Republican initially. There were more structural votes for the Republican initially.
I mean, this is the thing. It's sort of like there's a safe version, a mild version, a weak version of his conclusions, a weak version of his argument, where, yeah, the internet has some problems, right?
And, like, yes, it would be better to have better voting, but as a solution to the Babel problem, it doesn't seem like it even touches it. Like, it staves off some of the symptoms a little bit, but it doesn't address the issue of the fact that one of the political parties no longer believes in fair elections where they lose.
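The Palin result mentioned above falls out of how instant-runoff tabulation, the usual form of ranked-choice voting, redistributes ballots. Here is a minimal sketch; the function name, ballot format, and majority rule are illustrative assumptions, not anything from Haidt's article:

```python
# Minimal sketch of instant-runoff (ranked-choice) voting. Each ballot is a
# list of candidates in preference order. Every round, ballots count toward
# their highest-ranked surviving candidate; if no one has a majority, the
# weakest candidate is eliminated and those ballots transfer.
from collections import Counter

def instant_runoff(ballots):
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot for its highest-ranked surviving candidate.
        tallies = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in remaining:
                    tallies[choice] += 1
                    break
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > sum(tallies.values()):  # outright majority wins
            return leader
        # Otherwise eliminate the weakest candidate and redistribute.
        remaining.discard(min(tallies, key=tallies.get))

# The first-choice leader ("A") can still lose once "B" is eliminated
# and B's voters transfer to "C":
ballots = [["A"], ["A"], ["B", "C"], ["C", "B"], ["C", "B"]]
print(instant_runoff(ballots))  # C
```

This is the dynamic described here: a candidate who leads on initial votes can still lose once eliminated candidates' ballots flow to a broadly acceptable second choice.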
You're right.
It won't change that,
but it will take their voice away.
So if you start with a ranked-choice system and you start slowly pulling out the rug underneath the Matt Gaetzes and the Marjorie Taylor Greenes, and they can't get traction where they're at and they can't win where they're at, then suddenly their voice is gone, or at least a large portion of their voice, and
the authority of the U.S. government stamp that they wear. So those things, when you start pulling
those things away, that suddenly takes credibility away from their voice. And I think that that's a
positive thing. I don't think it fixes the Babel problem, though, because those are just the most
comically extreme version of the problem. And maybe they get pulled out, right?
But there is no one left in the Republican Party
who doesn't believe in the big lie.
They kicked out Liz Cheney.
They kicked her out specifically
for not believing in a conspiracy theory.
So it's not like Gaetz is being replaced
by a reasonable individual
who suddenly is going to come to the table.
They're just replaced by the Ron DeSantises of the world,
the slightly better Taylor Greene,
the one who can say the Jew laser part a little quieter.
I'm not convinced.
What I'm saying is I don't think he's acknowledging
the problem is as big as he actually
has correctly identified it as being.
When he gets to the policy stuff, it's like, what you really need is you need to get rid
of the filibuster.
You need to overhaul the Senate.
You need to pack the Supreme Court.
Those are the things that would actually harden our democratic institutions, because I don't
think they're getting hardened at all by...
And also, how are you getting this magic redistricting in nonpartisan ways when our system is already controlled by the minority white Christian nationalist party? Similar with the social media reforms, right? Making it harder to go viral. Great in theory. What does that look like actually in practice?
And how does a libertarian who thinks that the free market, as we said earlier, is the way to get the good things in the world?
You're going to have to be paternalistic on this one.
You're going to have to shut down some choice.
Yeah, but I think he very explicitly does lay out some measures that in my mind make a lot of sense as far as revising social media. So to reduce virality, he says anything that you try to share that's been shared more than a handful of times, you can't just click the share button.
You have to actually stop, copy, paste, perform a slightly more cumbersome action.
Isn't that just internet safetyism? This is the other problem, is that one of his solutions is anti-safetyism towards children, but like what you're describing is like an annoying neoliberal solution where it's like, I have to click two buttons to tell this person to go fuck themselves.
Yeah, but it's means reduction.
Yeah, but that's important, right? Because, like, we've seen in so many other cases where means reduction has a very demonstrable and immediate impact on reducing unwanted behaviors. Because fundamentally, people are fucking lazy, and, like, if something is easy, they do it more often; if something is a little more difficult, we do it less. Like, most people are sitting on the shitter hitting share. If I have to sit on the shitter and I can't just comfortably hit share, you know, I'm not going to spend the 25 seconds to do it. I think it, I think in my mind...
I'm not saying it can't have any effect.
Yeah.
I'm not saying it can't have any benefit.
Yeah.
So you combine that with his,
everybody who gets online,
like has to go through a mechanism
to make sure you're not a bot.
I think that that would certainly...
Yeah.
Which, like, what magical, you know,
like, technology is that?
Is that doable?
I don't think that's remotely even like anything anyone can...
You don't think that's technologically feasible to reduce?
My understanding is that social media...
Not eliminate, but reduce.
My understanding is that social media orgs already
do not want a bunch of bots on their sites
because it's not good for ad revenue, for example, or whatever.
There's various reasons why
I do think that
Twitter is already trying to, to some extent,
combat the kind of bot
problem. I'm for
combating it more, but I don't think
it actually fixes
the problem because you can just have
farmer accounts
where they have people who actually are real human beings who sign up for real accounts and then spread
misinformation.
Yeah.
So like...
I guess to me, like, I'll stop you there because like, I do, I hear what you're saying, but
I think that that runs into the all or nothing issue.
I think that the solution set for most problems is not this 100% fixes it or it's not
worth it. It's a matter of narrowing the funnel, knowing that some amount of shit's going to get
through. I agree with that fully. But if you can cut it by 60%, you've cut the problem 60%.
That's a big deal. Look at the 12 people online that were responsible for something like 75 or 80% of the vaccine misinformation.
You don't have to solve a problem that's this big.
You can solve a problem.
Will it fix the whole thing?
No.
But when Trump got kicked off Twitter, the amount of disinformation.
One guy got kicked off Twitter.
The amount of disinformation.
So the funnel is important.
For sure, it is important.
And I would be in favor of
most of the things that I think he
argues for. Like, get rid of the
like button if you want to be, you know, like,
do these sorts of things, for sure.
I just also think that
A, there's a reason that
those things are being resisted, and a large part of it
is free market capitalism. So, like,
his beloved free market is the reason he's not getting what he wants. And B, I just, I want a little bit more honesty from people like Haidt, who have made a career of, you know, accusing people of coddling other people's minds, to acknowledge that what we are now saying is, you know, we need to coddle people's minds a little bit. Yeah, we're going to do it really passively.
We're not going to actively
suppress any particular information.
But y'all are just a little bit too fucking
dumb to be getting this much misinformation
in your diets.
That's pretty
paternalistic, pretty safetyist
in my opinion.
And is that bad?
Well, no, it's not bad.
It's just frustrating, it's just,
it's just frustrating.
It's a hypocritical stance,
but it's a hypocritical stance for him though.
Right.
Because he's sort of come out against this for a lot.
I want to ask both of you what you think of the final internet piece.
And that's tied to saying who you are and tied to being who you are.
Do you think that cuts down on
rape threats, death threats? If even if it's just a third party, they don't,
nobody knows who you are technically, but your name is tied to an account. Do either of you
think that would slow down the massive vitriol, death threats, horrible things that people do?
Yes. At a cost, is my answer.
What does that mean?
Like, what I mean is, absolutely, right? So if you look at the way that antifa, for example, will dox Nazis and get them fired and shit, like, there is evidence that it works, that, like, outing people, you know, for their terrible views will impact the degree to which they want to promote those terrible views. The cost of making it so that no one can be anonymous online...
I didn't really say that, though. I think he says that you are tied to an arbiter, somebody out there that knows for sure you're a real person, and your name is tied to it. But that doesn't necessarily mean that that information would be public, that that information would be available. But I know that it's me, and I know I'm typing these things out, and I know that someone out there in the world knows it's me, even if the public doesn't.
So, so my concern there would be that keeping that information would be difficult
and that like,
there are a lot of good reasons for people to want to be anonymous online for,
you know,
safety reasons,
not getting outed to their families,
all those kinds of reasons.
So like,
I think it would probably be difficult.
You certainly need a lot more bureaucracy to adjudicate legitimate and illegitimate claims of need for anonymity, for example, online,
if you were going to try to implement a requirement like that. I do think if we take
all of these things and make them like, here's proactive things that social media orgs could do if they wanted.
Some of these things are helpful.
I do think changing the way
likes and retweets work could be helpful.
But I
do get worried
about the requirement of
not anonymity. It would absolutely
clean up a lot of trollish behavior.
It would also put a lot
of people back in closets.
That's a good point.
Tom, do you have anything else?
Yeah, I guess I see both.
And I don't want to do the both sides of this piece
because I do have a view on this.
But 100%, we lose no matter what.
I think that that's the truth.
We are losing now.
We have a problem now where there is a very imperfect anonymity, right?
Otherwise, doxing wouldn't be a thing.
We already have imperfect anonymity online.
I think increasing the imperfection of online anonymity might, because again, it's not an all or nothing. It's not perfect anonymity or no anonymity. I think that perfect anonymity is a fucking cancer. I think that is a generally
cancerous thing. And I mean that word intentionally. I think it behaves like a
cancer in our system. It erodes and diseases us socially. So I have a strong view on that.
I see the need for some amount of anonymity, absolutely, for people that are in abusive
relationships, et cetera, right? They need to have some ability to get online, but I am okay with
a less perfect and lesser amount of online anonymity in order to sort of reduce the spread of the cancer.
To rein it in a little bit.
Yeah, it's too much.
It's too strong.
So the other thing he talks about on online is not allowing companies, this is interesting
because it goes against his libertarian, very libertarian stance, not allowing companies
to cash in on kids.
So allowing kids, and this is also leading towards the next generation. I know that
you have some serious problems, Aaron, about the next generation talk that he gives. He thinks the
kids need to get out and play. That's one of his big things. I think he's trying to sell a book
when he does that, to be perfectly frank. But he's saying kids need to get out and play and
they need to have unstructured play that allows them to go out and interact with each other
in ways that is unsupervised by adults because that teaches kids how to be kids and how to be
part of society. But then he also says that these social media companies shouldn't be making money
off 13 year olds. We should at least wait till 16. I personally think maybe 18, whatever,
but they're selling data on little kids at this point. And that's a way to slow that down because he's saying that kids these days, he's saying the kids
these days are actually becoming more and more depressed. He didn't cite any real data on this,
but he did say that it seems like the data is leaning towards children being more depressed because of social
media. There is some data, at least I think I read that Facebook has its own data that shows that,
that, uh, that, that Instagram is causing some body issues with, with young people. So there is,
there is data out there. There is data out there, but I, I, that's a lot. That's a big question,
but I wanted to throw it out to you because i know you have problems with this section yeah i think that it's sort of it is this weird
mix of his like old timey get off my lawn kind of stuff but also this like very paternalistic
like we have to be extremely protective of children in the extremely harmful digital
information world and much less protective
of them in the, I would still argue, fairly harmful real world. I'm for more free range
a little bit. But I also think that part of what we call safetyism or helicopter parenting,
I don't think he really does a good job analyzing the causes. He tends to associate it, I think, too much with wokeness and not enough with,
A, just the increased understanding of human developmental psychology from the past several decades,
where we've learned that they're not tiny adults, and they do need structured development in various kinds of ways,
or they can be horribly damaged by various things.
But also, when it comes to unstructured play, I think the root cause is toxic meritocracy. I think that the reason children don't get to go outside and play unstructured enough is because they're
too busy going from soccer practice to martial arts to band, because they have to do all of
this stuff in order to go to college, to do the next thing, to do the next thing.
I mean, there's only one, as far as I know.
There is a proper datologist in the room here who can say maybe a little bit more about that from direct experience.
But from talking to my students, they all feel like they constantly have to portray overworked, meritocratic burnout, essentially.
And they learn that from a young age.
And I think that's why the unstructured play has gone away.
But I don't think Haidt is willing to be critical about meritocracy
because I think he views it as a liberal value.
And I don't think he sees how much of the current situation
is being driven by that meritocracy trap stuff.
I think also the safetyism stuff, like if you're having fewer children
and it's vastly more expensive to raise a child,
and there are all of these challenges that they're going to face
just to get a decent job, even with an undergraduate education,
of course parents are being extremely focused on raising them and trying to give them every opportunity and stuff like that.
So yeah, I think it would be helpful here if we could be a little bit more honest about what psychological factors are driving parents and address those problems rather than saying it's just the woke.
He does this weird thing where he jumps from the woke trying to protect you from ideas
to the woke literally trying to protect children
from going outside and running around with each other.
I don't know any woke people who give a shit about unstructured play.
And I think that's a false jump.
I just think that the woke are more concerned
than he used to be about being harmed by bad ideas.
Though now that he's arguing in favor of making social media worse,
or less profitable, let's argue,
but less effective at what it was designed to do,
I think it's because he has to acknowledge that the woke were right,
that ideas can oftentimes be more harmful than physical things.
Yeah, I got a different feeling from that end part.
I didn't feel like he was laying that at the feet of the woke.
I really did not get that feeling from the article that he – and I also feel like –
Maybe I'm bringing the larger atheism stuff that he's done.
And I may not be familiar with it.
Because I didn't get that feeling.
And I did get the sense that – and I agree with you that – and I got a sense actually, and maybe I could be mistaken, that Haidt might agree with you that the specific and purposeful differentiation between soccer practice and unstructured free play is specifically speaking to there's too much of this soccer practice meritocracy shit. And I think that's why he specifically is saying, let's pull kids
out of the, because he's not recommending more soccer practice, right? Even though that soccer
practice might accomplish many of the goals that he lays out otherwise in the article and in the
Righteous Mind. He's specifically calling out unstructured free play as a benefit. So I don't know that I necessarily read his comments in the same way in the article. I read them as being consistent with a critique on that sort of, um, tyranny of meritocracy that I agree with you about in general, like, as being socially problematic. I would love to hear if he was.
Yeah. Yeah. I mean, my sense is just broadly, because the IDW, and like, to go back to our,
our larger sort of context thing,
right?
Like he's coming from a group of people who have been incredibly defensive of
meritocracy because it's been critiqued by the social justice folks for being
not real.
Yeah.
And I didn't have that context.
Yeah.
Yeah.
Yeah.
So that's, that's what, that's part of where I was coming from on those sorts of things. And I just, um, it just, you know, it would be nice to have an acknowledgement of, like, I beat a lot of people over the head
for a long time for coddling people's minds.
And now I do wish that we would coddle
people's minds just a little bit more. Just a little bit
of mind coddling online, I think, would be a good thing.
What do you think?
Let's finish this out here real quick.
What do you think about
us coming into this blind,
us not knowing him?
Do you think that's,
what do you think about the person versus their ideas?
I know this is a deep, deep question
and I don't want to like make you answer it quickly,
but what do you think about
not having the context of the person
when you read something like this?
Do you think that we approach this in a way
that might be in some ways
a little lazy and a little lackluster?
No, I wouldn't use those terms.
I think there's a genuine debate to be had
about what's the right way to approach information.
You know, context versus not
context, right? I think...
And the problem is you can't
do both, right? Ideally, what
I would love for you to do is read it
totally contextless and then get a bunch of context and then read it again, but be in no way influenced by your
first read through, which is like psychologically impossible. The jury will disregard what they've
just heard. Exactly. Right. Use a transporter, copy yourself. Do a-blind. No.
It is a fair debate, right?
Because there can be ways in which knowing that information can color your reading, and that can be bad.
And there can be ways in which not knowing that information can color your reading and be bad.
I don't have a good answer to, like, what is... There's no one right way, I think, right?
I think you do the best you can to do it both ways, essentially, right?
You want to try to read the arguments in isolation,
but you also want to understand, you know,
how a particular individual's,
this one argument is part of their larger project
about white genocide or something, you know,
like, that is full context as hell.
You want to know.
Like you shouldn't go blind into reading,
you know, Douglas Murray's, you know,
Death of the West or something like that.
I don't think that's a great
approach to what are polemics.
If you're reading an actual polemic,
you should know what the ideology
that it's coming from is.
Obviously, Haidt's going to claim that
Righteous Mind is not a polemic.
I think it's fine to just read Righteous Mind
as is, though I think you should listen to this
episode first to understand where he goes
astray in the, you know, um, the prescriptive side of things.
Yeah.
Yeah.
Right.
And the conclusions and things,
um,
I had a, you know, what I always think about when we talk about this issue,
when I was studying theater,
cause I'm a theater major as well as a philosophy major.
Wow.
I had a teacher.
You're doubly unemployable.
Deeply, deeply unemployable.
On the menu, I will have everything that does not equal job after it.
What do you have?
English lit.
I think you get all of your...
The top of my employment list is cult,
and I have at least achieved on that front.
Amazing.
Credit where credit's due.
I did get the skills I needed.
Yeah.
So I had this teacher who,
whenever we would read plays,
he was like,
you need to know the name,
like who wrote it,
when they were born,
what the culture was like that they were in,
what were they writing in response to?
Because like,
you don't understand the entire piece of work until you have all of that
context.
So at some point you desperately need that context, whether you should have it on the first reading or not, that's an interesting question.
But you absolutely need to get that context. And you need to be as open-minded as you can
to reassessing your view of something once you have that context. Because lots of things can...
This is the problem of laundering conspiracism and laundered ideas.
The laundered version of it can seem
really plausible and, like,
fine, but, like, you have to understand
that it's coming from this other place
in many cases.
Yeah, that's my take on that.
It's funny because I think there's a sort
of well-understood
or well-established
agreement that when you read
something that is older,
that that historical context is necessary.
And I don't know that we all necessarily agree
that when you read modern work,
that a modern context may also be,
but even as I say that, I think to myself,
I don't know that I understand
what a modern contextualization really means the same way that I understand a historical contextualization because you don't have distance.
There's really no way to have any kind of intellectual or historical distance from the modern in the same way that we pretend we have, at least for the historical.
I think there's a lot of pretending going on with the historical. Well, yeah.
So I think the textbook example that I
would point to, you know, people to understand
why context matters, is
the episode with
Ezra Klein and Sam Harris
talking about Charles Murray.
So, for folks who are not familiar,
Ezra Klein, a progressive
podcaster, Sam Harris, a
left... Intellectual dark web guy, yeah.
Intellectual dark webby, but like,
again, one of the ones who hasn't spiraled
as far as Lindsay and stuff like that.
And they were arguing about an episode
that Harris had done with Charles Murray
where he had him on to talk about
The Bell Curve, which is his book
where he argues that like,
differences in outcomes are partially
to some extent the result of
IQ differences between
races, essentially. That's one of the
arguments that became very controversial from that book.
Subtitle, yikes!
Right.
And Harris had him on
and called the episode Forbidden Knowledge
and like prefaced it as like
this is uncontroversial truth,
but it's treated as controversial by the woke.
And then basically like softballed Charles Murray
through an interview about this stuff.
And Ezra Klein criticized him.
And so Harris went on Ezra's show,
I think is where it ended up.
And basically Klein was like,
Charles Murray has a lifetime project
of libertarian deconstruction of the
social safety net via the
argument that racial IQ
differences are the large reason
for differences
in outcomes, and therefore it's not
worth it to try to close
the racial gaps, essentially.
Okay? And Harris
was like, I don't think any of that matters.
Basically.
You know?
That's a lot.
It's a lot to take in right there.
Right.
It's a big difference between,
it's also a big difference in question
between you as an individual in private,
right?
Reading Charles Murray or Douglas Murray
or any of the racist Murrays.
Any of the Murrays.
There's two racist Murrays and you have to keep them separate.
One of them's old and racist and one of them's young and more British and racist.
It's one thing to read that stuff on your own.
But if it's like you and me and Sam Harris, where we have some amount of audience, more so than others,
that you have a deep moral obligation, I think,
to not uncritically platform that shit,
to not just have that on your show.
And not just uncritically,
but treat it as sacred, conventional,
hidden wisdom or taboo wisdom or something like that.
It's extremely inappropriate behavior.
So yeah, I think if you are people like us,
you absolutely need to get that context
before you are willing to have somebody on
so that you can provide substantial pushback.
And you have an obligation
to provide substantial pushback.
Aaron, we're coming to the end here.
We just spent a long time
talking about some really deep topics.
I want to end today by asking you a few questions
and I want you to tell me
if they're real or not.
Okay?
So I want you to want to do this.
You do this on your show
where you ask people questions
and decide whether or not they're real.
So I'm going to do this to you here real quick.
So now you have to answer
whether this is real or not real.
Now, what's this segment called on your show?
It's called the Enlightening Round.
Enlightening Round.
So we're stealing this directly
from Embrace the Void.
So here we go.
We have actually for the moment retired the real or not real.
We've moved on to the trolley problem version of the enlightening round.
So you are free.
I'm free to steal this.
Perfect.
Oh,
that's great.
We needed a break from the schtick for a little bit.
Let's continue here then.
This is perfect.
Yeah,
this is exactly what,
if you want a joke,
that's going to die,
it's going to come on this show.
So here we go.
All right. We will beat it. Yeah.
We will beat a dead horse.
We will fucking murder a dead horse.
All right.
So, so starting out,
real or not real,
Jonathan Haidt.
Not real.
Michael Shermer.
Not real.
Ben Shapiro's wife's dry vagina.
Real.
The intellectual dark web.
Yeah, I'll go with real.
White genocide.
Not real.
Fucking shit. Hard not real on that one.
Science fiction.
Um, yeah, real. Let's go with real on that. That last one blew my brain out.
Last one, unstructured play.
Not real.
How do you feel? How does it feel? Does it feel good?
I feel great.
Good.
How could you have genuinely unstructured play? Have you ever watched? Because they'll immediately invent a game. You do this and I do that.
You run over there and then I'll chase you.
It's very true.
Very true.
Aaron, I know this was a long conversation,
longer than we had planned.
Thank you so much for joining us today.
It was a lot of fun.
This is what we call half of a philosopher.
This is like half a philosopher.
This has been great.
It's like a warmup.
This was fun.
This was fun.
Thank you. We appreciate you coming on.
not too long ago there was a great earthquake in the C.D. Prefecture.
Several towns and villages were devastated. Following the earthquake, a large fault was
discovered near the epicenter on the north slope of Omnigary Mountain. The fault was deep
and several kilometers long. This was the sign of the beginning of
this strange incident.
When I saw it, I knew I had to go.
It was the same with me. It's a real mystery, isn't it? It's a wonder of nature. It's got the entire country. No, the whole world transfixed.
Huh?
Oh, wait.
It must be the others who've come here.
What?
What are you doing?
You look like you're looking for something.
I know I saw it somewhere.
There are so many holes.
So, what's so special
about this hole?
It was in my shape.
It was identical.
It was based on me.
What are you saying?
This hole is yours?
I'm not joking.
Oh, so you too.
I came here to look for my hole.
It's unbelievable.
Most of us came here for the exact same reason.
We're all looking for our holes.
And I finally found mine.
This is it.
You look like you don't believe me.
I'll prove it's my hole.
Hey, hey, come back!
Oh my god.
He went in the hole.
He's gone.
What are we gonna do?
Hey, come with me.
What is it?
I found my hole.
What about that guy?
Just come with me.
Look. Look over there.
It's just gaping at me.
It's my hole.
I mean, it kind of looks like your hole.
This is my hole.
It was made for me to enter.
It's been waiting for me to enter all that time.
It's saying, come into me.
Okay, okay. If you're scared, just watch me. I'll fill this hole up.
There, see?
Hey, what are you doing?
I'm going to have to enter that hole.
This is my hole.
It was made for me.
What are you doing?
No, come back. Come back.
Come back!
Damn.
Why did you do it?
Why?
When you could have just gone to adamandeve.com and used code GLORY,
you would have gotten 50% off almost any one item plus three free gifts and six
free spicy movies, and free shipping, all when you use code GLORY. Why? Why? Why?
huh
what's that?
What is this?
This is my hole.
My hole. so i want to thank aaron rabinowitz for joining us today we're recording a little early this week uh and we're not we didn't live stream this last thursday and uh and we're
not recording on thursday like you normally would and so you're not hearing a lot of uh stories that
happen during the week uh because we we actually are recording pretty much the day the other show
released but we want to thank aaron rabinowitz for joining us great guy uh really great guy and
a really smart guy we're glad he could take the time to talk about Jonathan hate and his article
and book, and talk about all the implications of his activity with the intellectual dark web, which we didn't know about. We had no idea. Happy that he had an opportunity to explain some of that stuff to us. And if you want to check out his stuff, you can check out Philosophers in Space, or you could check out Embrace the Void. Those are his two podcasts.
We'll have links in the show notes.
We'll also link to him on the Skeptic UK online magazine
and he is a columnist for them as well.
Very smart guy.
He's gonna be at QED this year.
You can catch him at QED.
He's gonna be giving a conference,
not a keynote, pardon me.
He's going to be giving a panel discussion
on conspiracy theories.
And so it's gonna be really great.
Marsh and he and two other people
who I don't remember exactly who's gonna be on there,
but I know for sure Marsh and he are on the same panel
and it should be really great.
Pro-conspiracy, weirdly enough.
Yeah, I know.
I think him and Marsh are actually gonna box
after it's over.
So, but check it out and check out his podcast.
He's a really smart guy and he's
really fun to talk to. That's gonna wrap it up for this week. We're gonna leave you like we always do, with the Skeptic's Creed.

Credulity is not a virtue. It's fortune cookie cutter, mommy issue, hypno-Babylon bullshit, couched in scientician, double bubble, toil and trouble, pseudo-quasi-alternative, acupunctuating, pressurized, stereogram, pyramidal, free energy, healing water, downward spiral, brain dead, pan sales pitch, late night info-docutainment. Leo, Pisces, cancer cures, detox, reflex, foot massage, death in towers, tarot cards, psychic crystal balls, Bigfoot, Yeti, aliens, churches, mosques and synagogues, temples, dragons, giant worms, Atlantis, dolphins, truthers, birthers, witches, wizards, vaccine nuts, shaman healers, evangelists, conspiracy double-speak, stigmata, nonsense. Expose your sides. Thrust your hands. Bloody. Evidential. Conclusive. Doubt even this.
The opinions and information provided on this podcast are intended for entertainment purposes only.
All opinions are solely that of Glory Hole Studios, LLC.
Cognitive dissonance makes no representations as to accuracy, completeness, currentness, suitability, or validity of any information
and will not be liable for any errors, damages, or butthurt arising from consumption.
All information is provided on an as-is basis.
No refunds.
Produced in association with the local dairy council and viewers like you.