Making Sense with Sam Harris - #368 — Freedom & Censorship
Episode Date: May 21, 2024. Sam Harris speaks with Greg Lukianoff about free speech and cancel culture. They discuss the origins of political correctness, free speech and its boundaries, the bedrock principle of the First Amendment, technology and the marketplace of ideas, epistemic anarchy, social media and cancellation, comparisons to McCarthyism, self-censorship by professors, cancellation from the Left and Right, justified cancellations, the Hunter Biden laptop story, how to deal with Trump in the media, the state of higher education in America, and other topics. If the Making Sense podcast logo in your player is BLACK, you can SUBSCRIBE to gain access to all full-length episodes at samharris.org/subscribe. Learning how to train your mind is the single greatest investment you can make in life. That’s why Sam Harris created the Waking Up app. From rational mindfulness practice to lessons on some of life’s most important topics, join Sam as he demystifies the practice of meditation and explores the theory behind it.
Transcript
Welcome to the Making Sense Podcast.
This is Sam Harris.
Just a note to say that if you're hearing this, you're not currently on our subscriber feed,
and will only be hearing the first part of this conversation.
In order to access full episodes of the Making Sense Podcast,
you'll need to subscribe at samharris.org.
There you'll also find our scholarship
program, where we offer free accounts to anyone who can't afford one. We don't run ads on the
podcast, and therefore it's made possible entirely through the support of our subscribers.
So if you enjoy what we're doing here, please consider becoming one.
Today I'm speaking with Greg Lukianoff.
Greg is the president of the Foundation for Individual Rights and Expression.
The acronym is FIRE.
He earned his undergraduate degree from American University and his law degree from Stanford.
And he worked for the ACLU of Northern California
and other organizations before joining
FIRE in 2001. And he's one of America's most passionate defenders of free speech. He has
written about the issue in the New York Times, the Wall Street Journal, the Washington Post.
He has produced documentaries on the subject. He also wrote, along with Jonathan Haidt,
The Coddling of the American Mind.
And most recently, he wrote, along with Ricky Schlott, The Canceling of the American Mind.
Cancel culture undermines trust, destroys institutions, and threatens us all. But there
is a solution. And that is the topic of today's conversation. Greg and I discuss the origins of
political correctness,
free speech and its boundaries, the bedrock principle of the First Amendment,
technology and the marketplace of ideas, epistemic anarchy, social media and cancel culture,
comparisons to McCarthyism, self-censorship by professors, cancellations from the left and the right and how they differ,
justified cancellations, the Hunter Biden laptop story, how to deal with Trump in the media,
the deplorable state of higher education in America, and other topics.
And now I bring you Greg Lukianoff.
I am here with Greg Lukianoff. Greg, thanks for joining me.
Thanks for having me.
So we've got a lot to talk about. The world is on fire and you happen to run an organization called FIRE, which addresses some of these problems. I will have introduced you properly
in my housekeeping, but what is FIRE and how do you come to be running it?
FIRE is the Foundation for Individual Rights and Expression, and we're actually celebrating
our 25th anniversary this year. We were founded back in 1999 by Harvey Silverglate, who is a
liberal-leaning libertarian, and Alan Charles Kors, who's a more conservative-leaning
libertarian, who is one of the world's foremost experts on the Enlightenment, especially Voltaire.
And they founded it because even back then, they realized that, and Alan was a professor at Penn,
that students were increasingly getting in trouble for what they said, not just for what they did. And they wrote a book called The Shadow University,
trying to blow the lid on what was going on, that came out in 1998. And Harvey always likes to say
that he thought that would solve the problem. But instead, they got emails and letters from people
all over the country asking, particularly professors, asking for help in their free speech and academic freedom cases. So in order to deal with the
onslaught of interest, they founded FIRE in 1999. And I joined as the first legal director in 2001.
And I'm the weird law student who actually went to law school specifically to do First Amendment law.
I took every class that Stanford offered on First Amendment. And then when I ran out,
I did six credits on censorship during the Tudor dynasty. And I externed, a weird word for internship, at the ACLU of Northern California. So when Harvey went to Kathleen Sullivan, who was
then the dean of the
law school, to ask for who they would recommend to be the first legal director of FIRE, it remains
the greatest compliment I ever received is she recommended me by name.
But what years were you at Stanford?
1997 to 2000.
Right. Actually, we overlapped a little bit there, but I was going back to finish my undergraduate.
But I remember the first time I was there, there were some intimations of the coming censorship apocalypse.
I remember that the great books approach to, I forget what it was called in the freshman years,
the freshman literature requirement was a great books seminar. And, you know, this was much
maligned as just teaching the products of dead white men. And there had been a march, I think it
was, might have been a year before I got there as a freshman, but, you know, Jesse Jackson had led a
500-person march on the Stanford campus that got a lot of coverage. And there was certainly a movement afoot on that
particular campus at that point to reset everything with respect to how we talk about ideas. I don't
quite remember what was happening in 1999, but I can only imagine the pot just continued to boil
from there. Yeah, it was interesting that, you know, I started there in 97, and nobody really
talked about the fact that just two years earlier, Stanford Law had lost a court case defending a
speech code that it had developed with Professor Tom Grey, who was a professor who was there when I
was there, to limit nominally racist and sexist speech, but really going after offensive speech to a vast
degree. And even though it was a private school, the legislature passed a special law, it's now
called the Leonard Law, that actually made sure that nonsectarian schools in the state of California
actually had to abide by First Amendment standards. So they actually lost in court in 95.
And this was like a dirty little secret by 97 because it was actually kind of a sweet spot to go to school, because what I missed was what I call in my latest book the first great age of political correctness, which is essentially 85 to 95, which is when speech codes came into vogue. I remember actually the chant made famous by Jesse Jackson.
It was, hey, hey, ho, ho, Western Civ has got to go.
Right, that was it, yeah, yeah.
And this was a lot of what Harvey and Alan were writing about in The Shadow University.
But by, say, 95, 96, political correctness had become even a joke on the left.
So there was a sense that the speech codes were defeated.
The professorate kind of fell out
of love with enlightened censorship. The students, you know, were more the kids of the boomers who
were actually quite good on free speech. And so there was a sense that this all kind of died off.
What I learned in working with Jonathan Haidt when I was doing the research for The Coddling of the
American Mind was that, interestingly, even though this kind of fell off of the public
radar, because there was a sense that political correctness was like this fever that had broken
sometime in the mid-nineties, the viewpoint diversity of
professor hirings plummeted in the late nineties. It got much, much worse, even as people had taken
their eye off the ball. And when I started at FIRE in 2001,
I was pretty shocked at how easy it was already to get in trouble for what you said on campus. And I also say, when I got to Stanford, it was a little bit of a culture shock to meet so many
wealthy kids who were pretty Victorian, I would almost describe it, in their ideas of what
acceptable speech constituted. It seemed like it
was very easy already to get in trouble for saying something in an unapproved way, but it was a
cultural thing. It wasn't that people were actually literally getting punished for it. It was more
that it just seemed kind of, forgive the expression, uptight.
Well, I should say much of what we're going to talk about you cover in great depth in your most recent book, The Canceling of the American Mind, which you wrote with Ricky Schlott. And there's just a ton of material in there and many case studies, most of which we are not going to get to cover, though we may cover some.
Let's cover all of them. Yeah, feel free to introduce any that you want, but there's just so much detail there
where it's just amazing.
It's certainly amazing for anyone who thought that cancel culture wasn't a thing.
I mean, anyone who can be found having said that in the last 10 years really needs to
read your book.
Thank you.
But let's start with the concept of free speech and its boundaries.
How do you think about free speech and what precisely do we want to defend here?
I have a pretty expansive view of what the term free speech means.
And I think of it as the big Boolean sphere that includes the First Amendment, but is much, much larger than that.
And I definitely, you know, I think of it as
not just being about the limits of the law of what you're allowed to say, but also the cultural
tolerance for the acceptance of people with any kind of opinion. I sometimes explain there's no
such thing as a free speech absolutist, or at least I haven't met a serious person who is,
because we all believe that some speech is and
should be unprotected. But I am an opinion absolutist. And what I mean by that is that
if you're merely expressing your opinion, even if it's repugnant, that is protected. It has to be
something more than the mere expression of opinion. It has to be something like discriminatory
harassment, which is a pattern of severe, persistent, and pervasive behavior
that targets someone on the basis of a protected category that a reasonable person would also find
offensive. And by the way, we actually have seen genuine discriminatory harassment on campus since
October 7th in a number of cases. So it's not an unmeetable standard. Intimidation, also known as
true threats, which is when you
would make a reasonable person fear that they're in danger of bodily harm or death.
That is not protected nor should it be. Defamation is not protected, which is basically
making a factual allegation that someone, for example, is guilty of an odious crime.
The example I always gave when I was at the student newspaper was, you know, groundlessly asserting that you know that someone is a pedophile, which is a classic example,
because it could ruin someone's life if they believed you. So nobody really believes that
there's no way to use words in a way that isn't protected. I mean, there's also, of course,
you use words to commit extortion, you use words that are incidental parts of existing crimes, you use words to commit conspiracies, for example.
And under that, the law, I think, is sensibly set up in the United States to have what we call the bedrock principle, which comes from a case called Texas v. Johnson: the idea, running through the entire history of the First Amendment and of free speech decisions, that you cannot ban speech simply because it's offensive. And I always make
the point that this is a great rule for a genuinely multicultural society because what people find
offensive, and my dad's Russian, my mother's British. I grew up in a neighborhood with a lot
of kids from Vietnam and Puerto Rico and Korea and from Peru, all people who had very different ideas of like what offensive speech
looked like. And it was such a perfect example for why free speech had to be the rule because
we couldn't, you know, my British mom thought all sorts of things were offensive. My Russian
father thought practically nothing was, but Danny Nguyen's parents had different ideas of what was
offensive and same thing with Nelson Bolido's. And so I think that the bedrock principle is a very sensible rule for a free,
multicultural, diverse society, but it is one of the major things that makes America so very
different from Europe, for example. But let's linger on that point because I'm often guilty of just reflexively seeing everything through an American lens. And then I'm reminded that even our closest cousins over in the UK or elsewhere in the English speaking world don't enjoy the same kinds of protections that we do.
How is it different in Europe and Canada and Australia and New Zealand and elsewhere? And I think you and I are both going to agree that we have it
closer to right in America than exists anywhere else, at least that I'm aware of.
What do we have that other nations don't around free speech?
I was sort of laughing right there because I violate a major convention
among constitutional lawyers when I travel to Europe. It's generally considered, and it's
partially because a lot of other constitutional lawyers are highly sympathetic to greater
limitations to free speech. So it's partially also just agreement. But even when they're not,
there's a tendency to go over there and be very deprecating of our own system and very complimentary towards the way things work in Europe.
And I am not that lawyer.
I go over and I'm rude about it.
I'm sorry.
I actually think I'm quite polite about it, but I violate the rule of saying you guys have it all right and we have it all wrong because I'm like, I don't think we do.
Particularly since I attend a lot of these hate speech conferences and watch someone with a German accent explain
why this cartoon landed someone in jail, but this cartoon didn't. And you're like, this doesn't make
any sense. These are both arguably equally offensive and you're just being arbitrary about
it. So you're not really persuading me that this is the way to go. And that's partially because they don't have a bedrock principle that you
can't ban speech simply because it's offensive. But I think that comes from an interesting place.
And I think it's essentially, oh God, I hate to find myself thinking about the Treaty of Westphalia
in 1648, the nation-state model. Essentially, I think that in my mother's Britain, there is an idea that there's sort of a modal Brit, and that the nation can be kind of almost analogized to a single person who has, just like any person, a list of things that are offensive and a list of things that aren't, a John Bull kind of character.
And I think that that's something that you'll see in sort of like the national character of Germany and France, and that there is an idea that there
is like a modal version of that person. And if you think of your country that way, it can make
sense that you can call balls and strikes in terms of what's offensive and what's not, because you
think that there's some way that most people in your country would. I think this has gotten much harder to argue as all of these countries have become much more
diverse, much more multicultural. And I think it was also a little bit asinine to argue in the
first place, particularly in a lot of these places, because of profound class differences
within these same cultures. And so I think that that's one of the reasons why they can be more
comfortable thinking that not having a bedrock principle makes sense. But I can see that the strain on the system is
really happening in places like Scotland and Ireland and Germany, for that matter,
in terms of how do you enforce this fairly? And the answer is, it's incredibly difficult.
Meanwhile, in the United States, we've always had an understanding that Boston is not like Richmond, is not like Georgia,
is not like, and then later California, not like Texas. So we've never really thought of ourselves,
despite the image of Uncle Sam, we've always been a country where we know full well that different
parts of us are simply not going to agree. And I think, frankly, we've been at the genuinely multicultural,
genuinely multi-ethnic society thing longer than a lot of the countries in Europe. And so I think
we actually did get this one right. And I think that what you see going on currently in Europe,
particularly around issues like immigration, I mean, I think about the polling that you presented on Bill Maher's show about some conservative Muslim attitudes about a variety of issues.
That struck me as something that is absolutely protected in the United States, but I could imagine that even some factual assertions could be treated in some countries in Europe as if they're actual offenses now.
Yeah, yeah. Well, the modal Brit or the modal German or the modal Frenchman is increasingly going to be guilty of Islamophobia in that context. So let's return to the American
view of things. I think we both agree that as well-intended as they might be,
prescriptions against offensive speech in other countries like the UK or Germany,
Holocaust denial laws, for instance, in places like Germany, and I think Austria has it,
I'm not sure where else. They can literally land you in prison for denying the Holocaust.
As much as I am a student of the Holocaust and am alert to the, even to the forward-looking prospect of a future Holocaust, I think laws of that sort are just wrong-headed. I mean, they don't accomplish what you would hope they would accomplish, which is to get everyone to understand that, you know, in this case, the Holocaust really did happen and it was horrific.
And it's just the wrong algorithm to be running socially so as to purify people's thinking and to get people to converge on a fact. I think we're better off with it being legal to more or less talk about anything, leaving aside those cases where speech isn't quite just speech, where it is, I think,
as you said, a provocation to imminent lawless action or obvious harassment, where it's more,
it's a kind of behavior. You could be defrauding someone with
speech, et cetera. But if you're just expressing opinions, and even extraordinarily odious ones,
and even in principle, dangerous ones, dangerous if acted upon, I think we want the, generally
speaking, in the public sphere, which is to say, you know, the sphere that
government coercion casts a shadow over, I think we want just the best idea to win in collision
with all other ideas. Is that how you, do you agree with what I've said on that?
Well, the marketplace of ideas theory of freedom of speech, I think it's a great idea.
And I think it's particularly relevant in higher education where it is supposed to be
a clashing of ideas and the use of disconfirmation to chip ever closer to the truth by chipping
away at falsity.
But I have a little bit of a more idiosyncratic theory on freedom of speech that unsurprisingly
is fairly expansive. And I
just call it simply the pure informational theory of free speech, which is that there is, if the
goal of human knowledge is to know the world as it is, then you can't really know the world as it is
without knowing what people actually think. And if you create a situation in which people are
too terrified to say what they actually think for fear of punishment, you are depriving yourself of really important knowledge
about the world. And I extend this also to conspiracy theories. So like, let's take a real
one. The Protocols of the Elders of Zion. The idea that its existence in history wasn't a relevant and important
thing to know about is just delusional. I think it's very
important that people understand that there was this massive conspiracy theory in the early part
of the 20th century that was based on very shoddy writing that basically was saying that Jews
controlled the world. Is it true? No. Is it worth knowing? Of course it is. And so when it comes to
conspiracy theories, when I'm being a little more lighthearted about it, the way I explain it is, listen, lizard people who live under the
Denver airport do not control the world. But knowing that your girlfriend or uncle believe
that lizard people living under the Denver airport control the world is very important
information to have. And so knowing what people really think and why is a key part to understanding
the world. And when it comes to conspiracy theories, I mean, here's one thing that I just
have to say over and over again. If you're battling someone who believes that there's a conspiracy to
shut them up, do absolutely nothing that looks like a conspiracy to shut them up because they're
just going to take that as like,
well, I must be onto something. And then the last part of like why you have to let these bad ideas
out is something that I call Mill's Trident, from On Liberty, which is that
in a truth seeking discussion, which is actually not most discussions, but in a discussion to try
to get towards truth, there's only three possibilities. You're either completely wrong, you're partially wrong or
partially right, or you're completely right. And now that middle category is the one that's going
to be the most common.
Yes, well populated.
Yeah, exactly. But in the case that you're completely wrong, of course, free speech is
useful because people can point out how
and why you're wrong. In the middle one, there's really no other way to get at what's the wrong
parts of your opinion without usually a somewhat long process of debate and discussion. But even
under that final one, if you're completely right, that obviously the Holocaust really happened,
but you never actually get tested on that. People tend to hold their beliefs like they hold prejudices. They know that something's true or they believe that something didn't happen, but if you're never actually challenged on
whether or not it happened, you may not even end up being aware of how overwhelming the evidence
that it absolutely completely happened is. And so I think that a lot of these anti-conspiracy laws
backfire in multiple ways. And one thing that is, I think, interesting is when you look at
the prevalence of Holocaust denial, but
also of things like antisemitism in Europe versus the United States.
Europe passed a lot of laws that, particularly places like France, laws that were designed
to go after antisemitic speech.
There was also a push, of course, on college campuses to do the same thing, both in the
90s.
But they actually got passed in France, and they only got passed on campuses in the United States. And for the most part, they were defeated in court.
But when you look at sort of like the rates of anti-Semitism in Europe, they're much,
much higher than they are in the United States, although unfortunately, things have been getting
worse in the past couple of years. And I think that's at least in part from underestimating what can happen
if you don't actually create a situation in which people are actually finding out what people really
think and why. I think we should introduce a few more distinctions here because I think as much as
we seem to agree, we may disagree on a few points and it'll be interesting to have you defrag my hard drive if it needs it.
So there's obviously a distinction between laws and norms, and there's a distinction between
public and private spaces or platforms. And a failure to notice these distinctions,
in my view, does confound a lot of the discussion around censorship and misinformation
and big tech and what should have happened on Twitter during COVID, et cetera, and some of
which you cover in your book. So as I just said, I fully agree with you that nobody is a free speech
absolutist. That's just a cartoon of a cartoon. And especially when you move into private space and private platforms and
businesses that have brands to be maintained, right? Your meta or Instagram or your Twitter,
now X, right? You're a business that has to make money at some point, presumably.
And then the question is, what sort of information do you want on your platform? You're in a position to make decisions, curatorial, moderating decisions.
You have to make those decisions unless you're going to become the next 4chan or 8chan or some other digital sewer that nobody or a few of us want to spend much time in, and certainly no one wants to advertise in, right?
And yet any effort to clean up the sewage is routinely described, mostly right of center now,
as censorship, right? And this has always just struck me as pure confusion, and especially in a political context where you're seeing the
algorithmic amplification of obvious misinformation and disinformation, some of which obviously is
put there maliciously by Russian troll farms, right? Although that's not the bulk of it,
that certainly happens. But much of it is just generated by maniacs, whether it's Alex Jones or someone
you've never heard of. And the way it's different than ordinary speech is, once again, it is
algorithmically amplified based on the dynamics of these hallucination machines that we have built.
And it's just a much more complicated picture. Again, there are distinctions between laws and
norms, public and private,
normal speech versus algorithmically boosted speech, speech whose terrible effects you can see play out hour by hour. And in certain cases, many of us worry that the misinformation component
of this, if left unchecked, could render us just ungovernable politically, right? I mean, people are so
confused. When you look at what's happening on college campuses now, I look at it as just a
manufactured hysteria based on almost pure misinformation. There's a lot of stupidity,
you know, greasing the wheels there and moral blindness, but there's a tremendous amount of misinformation. And so the happy talk
I uttered a few minutes ago about how the best idea should win in collision with all other ideas,
we have created private and public and semi-private, semi-public contexts, whether you
think of each of these things as publishers or platforms, depending on how you squint your eyes, you can make a case for either.
We've created a situation where we appear to be driving ourselves fairly crazy, and
it's worth worrying about this.
And so I'm just wondering, I mean, take any side of that grotesque object I just put in
view that you want, but how does your commitment to free speech in a very,
you know, deep way help you navigate the mess I just made?
Yeah, no, this is, and I wish I could, I could have a short answer to this, but I don't think
I really can, because I think that we have to take a big step back to where we are in 2024.
And I talk about this a lot within my organization.
Probably people are sick and tired of hearing about the printing press.
But I mentioned that I did censorship during the Tudor dynasty; that was all directed at the printing press as
being this infernal device that spread misinformation that led to all of these social
ills. And I always want to be clear here. It did. It really did in a lot of ways because,
and this is a point that I sometimes, you know, take a somewhat different approach than my
much beloved friend, John Haidt, is that I'm, you know,
I'm more of a civil libertarian, but also I come from, you know, a little bit of a different sort
of academic background. And I definitely, you know, tend to see solutions as more bottom up
than top down, because I look at what happened in England around the printing press, starting with
Henry, and then also going all the way through Elizabeth,
actually all the way through the Glorious Revolution and a little bit beyond, about the printing press. And essentially what happened there was you suddenly had millions of additional
people in a faster-paced global discussion about what is true and what is not. And in the short run, it was devastating. It meant
an increase in the witch trials. It meant religious unrest. It meant political unrest.
It meant genuine bloodshed. And in some ways, this disorder led to the biggest war in European
history before World War I, the Thirty Years' War, which was just an
absolute calamity. So Henry's and Elizabeth's and Mary and Edward's, you know, concern about
the printing press, you know, honestly should not be so easily dismissed because it really was
a device that led to all this destruction. But that's going to happen anytime that you have this
many additional people
brought into a conversation. Over the long term, it actually turns out that this many additional
eyeballs and minds on problems was a tremendous boon for the production of thought and for human
progress because it allowed for things like the scientific revolution, it allowed for
decentralized disconfirmation of information, which is one of the great things
that we're still benefiting from to this day.
Social media has added billions of people to an instantaneous and fast-paced global
conversation.
So there's literally no way to avoid that being an anarchical, epistemically anarchical period. So I feel
like we're unavoidably in a crazy period right now. And I get sometimes more concerned about
ways to fix the problem if they're heavy-handed, top-down solutions. But another thing that happens
when you have this many new minds and voices in a conversation is you realize that the old authorities probably
didn't necessarily deserve to be authorities in many ways.
Because with this many minds and eyes on a problem, you realize the thinness of what
some of the ancient voices were actually telling us, particularly about things like science.
And so the old authority gets destroyed, and there's always a period before a new authority is able to establish itself as being someone who can say, I will be straight with you, I will actually be a reliable source of expertise and
information. But one of the reasons why trust is in the title of
The Canceling of the American Mind is because I think cancel culture, and the calamities we're seeing on campus, has been
a well-deserved disaster for the credibility of the academic class and for the expert class
overall. Because nobody really, after a professor loses their job, for example, for saying biological
sex is real, which really happened, nobody's going to listen to you again when you try to
claim actually it exists on a spectrum because they realize you can't be objective in an environment where your
job would depend on giving the quote unquote right answer on that. So I think that right now we're
seeing a much bigger disruption because nobody knows who to trust. And a lot of the voices that
we were supposed to be able to trust, whether it's mainstream media, whether it's academia, whether it's the expert class, have shown themselves to not be that
trustworthy. And unfortunately, I actually agree with that to a large degree. I actually think that
the expert class has really shot itself in the foot in a lot of ways. But we haven't yet reached
the stage where we know which experts are actually reliable. And unfortunately, a lot of those same
experts that we would hope would be are the same people who created sort of the misinformation on
campus. So there's a big problem there. And I've been thinking a lot about how to create social
media specifically for truth identification. And I think it's possible. I talked on
Lex Fridman's podcast about the idea of being able to even create a truth-seeking stream within
X. Now, the rules have to be different in order to create that. But if you do it top-down,
people aren't going to trust it, but it'll also largely be based on the expertise of existing
authority who've proven themselves not that trustworthy. So it has to be a process. It has to be relatively
transparent, but we're not quite doing it yet. I think that some of the authorities that you should
have some faith in have raised their hands, for example, on Substack. I think the New York Times
is trying to improve its trustworthiness to the larger public, at least to some degree.
But I think we're unavoidably in a messy period where we haven't figured out a way to make
our systems for chipping away at falsity, for establishing what is true by getting rid
of what isn't.
I think we need to figure out a way to make that whole knowledge system more reliable.
Yeah, I take your point about the printing press, and I have a printed copy of the Malleus Maleficarum somewhere in the house.
Oh, wow. So not an old one, I mean a new one. But it's just, I understand the chaos wrought of
suddenly being able to manufacture crazy documents. Has it helped you find some witches?
Yes.
I'm still looking.
But it's a good read.
But I worry about drawing a false analogy here or being misled by an analogy that doesn't quite capture the dynamics of what we're suffering here. Because it's not just a matter of more people talking, right?
Because it's, you know, if you're going to talk about just literacy and its consequences, well, we almost got maxed out before the internet, right? I mean, we were already talking to each other about everything under the sun. And what's happened now is we're talking much faster and the
usual gatekeepers no longer really can man the gates because we go around them and above them and below them on social media.
And it's introduced some fairly perverse incentives that have undermined the business
model of journalism in particular. It's made academia brittle in the face of public opinion
online, perhaps quite unnecessarily, but nonetheless, they respond to the various
mobbings we have seen on Twitter in ways that have proven totally dysfunctional.
Firing a professor for claiming that there are only two biological sexes is the perfect
example.
We've created a machinery that is not just more voices, it is a, again, it's algorithmic
amplification that preferentially spreads misinformation over the truth. It biases us
toward outrage. I mean, all of these effects have been, you know, at this point, endlessly described
by me and others. And I'm not saying there are no benefits to, you know,
every piece of this machine. I mean, it's not that social media has been bad for us across the board,
but it does have this effect of, I mean, now we have, you know, people in Congress who are
there, it seems, just to talk to their social media followers, right? I mean,
they're not there to govern, you know, somebody like Marjorie Taylor Greene, I mean, it's just, she's building a lunatic brand on social
media from Congress. That seems to be the project. And I worry that it's a different situation.
And like, you know, for instance, like one thing you said before we hit this topic, which you seem to take as a fairly ironclad heuristic, which is in the face of a conspiracy theorist who claims that he's being deprived of speech or a community that claims as much, by no means do anything to deprive such people of speech because then you're just, you know, then you have, you know, you've,
it's like the, I guess the David Koresh principle, you know, that he holed himself up in a bunker
claiming the government was going to come and take his guns and the government came to try to take
his guns and all hell broke loose. But given the dynamics of the situation, I think a very strong
case could be made for canceling certain people, that is, depriving them of speech on these private platforms.
I think the defenestration of Alex Jones has always made a lot of sense, given what he was doing on Twitter.
Yeah, and I was actually kind of shocked that he was invited back to Twitter because, well, I mean, it's also kind of personal.
I grew up in Danbury, Connecticut,
and Newtown is kind of a suburb of Danbury.
And one of the guys who used to come to parties at my house
lost a little girl at Sandy Hook.
And so I knew from friends and family, you know,
that some of the Alex Jones insanity
was leading to, you know, people showing up and targeting parents who lost their kids, and leading to a situation
in which they were actually being targeted. But here's the thing. That was defamation. That's something where I actually thought that him being
devastated in court was completely appropriate. And I think once you have a finding like that
against you, it would have been perfectly justifiable because that's unprotected speech,
that he not be invited back to X. But Greg, I'm sure there's a version of it where, let's say, he hadn't named the parents
specifically, but he's still talking about it in such a way on, again, this media platform that
facilitates this kind of mob-like convergence on untruth. He's propagating his lies and his
quote, opinions in a way that doesn't
meet the standard of defamation.
No one could sue him over, no one would have the standing to sue him over it.
It's a little bit like, you know, mini Holocaust denial, which you and I think should be legal.
And I think, to be clear, I think everything that I'm aware of, defamation aside, most
of what Alex Jones has said, I think he should be free, legally free to say.
But I think any platform should be free to say, we want nothing to do with this.
And we certainly don't want to help this maniac ruin the lives of people who have already suffered the worst thing on earth.
So, yeah, he's gone, right?
We've effectively buried him in a digital
hole out in the desert. I think that's totally fair game. And if I was running one of those
platforms, I would have kicked him off in the first five minutes of my tenure there. And many people
beyond him who are participating in the same kind of destruction of our information wilderness, in many cases
consciously and maliciously, but in many other cases, they're just insane, right? So the way I
tend to think about this is if you shrink the scale of it, if you just imagine, if I own a
restaurant and I find out that the bartender is talking to customers about how the Holocaust
never happened or, you know, the Jews deserved
it on October 7th, right? Well, you know, I'm totally within my rights to fire that bartender,
right? It's just, he's destroying the brand of the restaurant, right? He doesn't get to do that
as my employee. And so it would be with a patron of the restaurant. If somebody is actually just
making it an unpleasant place to be, I should be able to kick them out, right? So how does that change when it scales
to the size of a, quote, platform like X or Meta?
Or does it change in your view?
Yeah, I mean, the primary concern
with misinformation and disinformation
as a rationale for kicking people,
well, certainly, like, you know,
as a rationale for punishing people in a legal sense. The podcast is available to everyone through our scholarship program. So if you can't afford a subscription, please request a free account on the website.
The Making Sense podcast is ad-free and relies entirely on listener support.
And you can subscribe now at SamHarris.org.