Unlocking Us with Brené Brown - Brené with Ben Wizner on Free Speech, Misinformation, and the Case for Nuance
Episode Date: February 9, 2022. I was looking for some certainty around the tough issues of censorship and misinformation — legal definitions, rules, and clear lines — so I called Ben Wizner, a lawyer with the ACLU and the director of its Speech, Privacy, and Technology Project. I'd hoped we'd have a Free Speech 101–type conversation, with tidy resolutions and a clear path forward. But what I suspected, and Ben confirmed, is that the law gives us very few answers to the hardest questions that we have. So in this episode, the two of us grapple with issues of balance and boundaries, unpacking the harms that speech can cause and the harms that censorship can cause. I'm glad that we had over an hour to talk, because as tempting as it is to approach issues like this with firm certainty or with 140 characters, it's much more important to unpack the nuances and unlock the opportunity for growth and learning.
Transcript
Hi everyone, I'm Brené Brown and this is Unlocking Us.
I'm glad to be back.
It has been, as many of you know, a very, very tough couple of weeks.
What has it been, two weeks or three weeks, Barrett?
Two weeks.
It's been long and it's been hard, and the world is not built, I'm learning, for pausing and learning more and researching. I've learned that.
What else have I learned? I've learned that bots that have cartoon avatars that are eagles are
really mean AF. I can just tell y'all that right now. I've learned that this community is incredible.
And even when we don't agree, which there are a ton of you that didn't agree
with me taking a pause as I dug into
what was happening at Spotify to better understand,
you asked me tough questions, held me accountable,
but did so with respect and curiosity.
And I really appreciate that.
I'm tired.
I'm, I don't know, getting back up on my feet a little bit. You know, the worst thing
about this whole shit show is I would do it again the exact same way. Like I didn't know enough
to tell you even why I was pausing because I didn't know what I was going to have to go look at. And I wasn't sure what I was even allowed to say, to be honest with you. And it's been hard,
but I'm glad to be back. I'm coming back, interestingly, with Ben Wizner. He has been
an attorney at the ACLU since 2001. And since 2012, he has been the director of its Speech, Privacy, and Technology Project.
In two decades at the ACLU, he has litigated numerous cases involving freedom of expression, surveillance practices, government watch lists, targeted killing, and torture.
Since 2013, he has been a lead attorney for NSA whistleblower, Edward Snowden. He went to
Harvard and then he went to NYU for law school. I have to say that I have a very strange relationship
with the ACLU. I've supported them financially before. I recently did a pro bono leadership
event for them and no one pisses me
off like them. Actually, this is like the craziest thing to say. The analogy is with church, which I
know that seems weird to be using the analogy of church with the ACLU. But I go to church
for the reminder of who I want to be. I want to find the face of God in everyone that I look at. I want to love people.
I don't want to move through the world full of lovelessness and hate and rage. And so I go to
church sometimes just to, you know, pass the peace and sing and break bread with people I really can't stand. And the ACLU is the same way. I support them because a lot of times I
agree with what they're doing. And sometimes, you know, who they defend, it just makes me crazy.
But then I think about the importance of free speech, and I'm about as anti-censorship as
you can get. So I thought this would be a good conversation. I will tell you that I was looking for a lot of certainty in this conversation. I was hoping for some rules and some
clarity and legal definitions. And that went south about what? 30 seconds into it. I'm like,
okay, let's do free speech 101. He's like, no, we're not going to do that. We're going to grapple
with really hard things that the law can't answer. I'm like, shit. So this is Ben Wizner on free speech, misinformation, and the case for nuance.
So let me start by saying welcome to Unlocking Us to Ben Wizner. Hi.
Hi, what a pleasure.
Thank you for being here.
I'm looking forward to it, I hope.
You know what? You know what I think this is going to be? I think this is going to be probably for you like you were teaching Free Speech 101.
I actually think it won't be.
Oh, you don't?
Because most of the things that we're going to be talking about today are not about the law. And the law is going to give us very few answers to the hardest questions that we have. So this is
going to be the two of us just grappling with issues that we've both thought about with no
right or wrong answers, probably. Oh, holy shit. I was really looking for some certainty, but okay.
So before we get started, I have to tell everybody that you are with the ACLU. And I have supported the ACLU over the past,
financially, and then most recently,
y'all invited me to do a fireside leadership talk.
And I did that pro bono.
So I want to be really upfront about that.
I also want to tell you that I appreciate the work you do.
And God, I hate y'all so much that I could just spit sometimes.
You know, they say about the ACLU that if you agree with us 80% of the time, you should be a member.
And if you agree with us 60% of the time, you should be a board member.
So there's no one in the organization who agrees with everything that we do.
It's an ongoing argument, and it has been for 100 years.
So tell us a little bit about the ACLU and tell us about what you do there.
Well, the ACLU has been around a lot longer than I have.
Although, when I was writing a quick bio for your show and saw that I've been there now for 20 years, not at all what I expected when I showed up there in August of 2001. That is
now 20% of the history of this organization. But the ACLU has been the country's constitutional
and civil rights watchdog now for a century. We have offices in every single state, and we work
on a broad range of civil rights, civil liberties, and human rights issues, from free speech and freedom to protest, to the rights of immigrants, to women's rights (that project was started by the late Justice Ruth Bader Ginsburg), to racial justice, to criminal law reform.
You can imagine with all of those issues under a single umbrella, there are sometimes going to be pressures and tensions that
arise even within the ACLU. So I showed up here as a baby lawyer 20 years ago to be just a
generalist, to kind of work on everything. And then just a few weeks later were the 9-11 attacks.
And so the first half of my career was really shaped by the country's response to 9-11 and what we believe to be rights-abusing responses to that, including secret detention and torture and excess surveillance, targeted killing.
And then after doing that work primarily for a little over a decade, I took over the leadership of our free speech and privacy team at the ACLU.
So sort of working on both some of the oldest issues that the ACLU works on and some of the newest.
Because as you know, and as we'll talk about today, technology has had a huge effect on how we think about rights and liberties and speech and privacy.
So that's what I've been up to.
You also, are you the chief lawyer for Snowden?
I have been since the summer of 2013,
been his main lawyer and the one who coordinates
the other lawyers in different countries on his team.
That's been a sort of wild and unexpected life
and career experience.
And, you know, apropos of our conversation today,
I have to admit that the only time in my life
I've ever heard a Joe Rogan podcast was when Edward Snowden went on to hawk his memoir.
So I may have some expertise on the issues, but I don't have specific expertise on the man.
Okay, so let me jump in.
Now I know what you do.
Let me ask you something.
We have a lot of college students that listen. And I want to ask, was there just a time in your young kind of baby lawyer life where you said,
I should really stir things up a bit and be a lawyer for the ACLU? Like, how did that happen,
actually? You know, I think I was always the kind of kid who people would say after a loud
objection or an argument, why don't you channel that into something like being a lawyer? I kind of had two directions and there was a push-pull.
I'm very passionate about reading and literature and there was a side of me that thought maybe I should be a humanities professor.
And then at the same time, there's the very strong activist side of me that wants to be out there stirring shit up, as you put it.
And college was the place that clarified it.
The process for me
of writing a senior thesis showed me that I would be a real fraud as an academic, that I would cut
every corner I possibly could just to get out of it, because I did not want to spend my days and
nights in a cubicle surrounded by books. I want to spend my evenings on a couch with a book,
but my day I want to spend surrounded by people. Just, I work better in a social
environment. And so I decided to make activism my life and, you know, literature my hobby rather
than the other way around. Yeah. Vocation. And yeah, I get that. All right. Let's just start.
Can we just start where I am? I'm really struggling. I am. This is just a confession: I was really looking for some certainty here.
Really, if y'all can't see Ben right now, but he's laughing at me. Where is the legal line?
Where does the First Amendment protection end? Well, so no rights are absolute. So let's start with that proposition. The right to speak is not absolute. The right to own a gun is not absolute. Even if the Constitution really does protect an individual
right to bear arms, which I don't believe it does, it doesn't protect your right to have a
surface-to-air heat-seeking missile that can take down a commercial airliner. The right to free speech doesn't mean that nothing you say can be proscribed or punished, right? So for example, we're not allowed to engage in true threats,
the kind of speech that would put someone else
in real apprehension of physical danger.
We're not allowed to incite violence.
We can't stand in front of a crowd of rowdy people
in a leadership position
and urge them on to imminent violence.
We're not allowed to harass.
If I called you up and said, let's go out for coffee, that would be free speech. But if I did
it 50 times in the next hour after you had told me politely, no, that could be harassment. So there
are lines that we are not permitted to cross, even though there is free speech. Now, on the topic
that you want to talk about,
misinformation, disinformation, it's actually a lot more complicated. You know, the Supreme
Court has held that, for the most part, the First Amendment protects the right not only to
say things that are false, but to do so deliberately. It actually protects the right
to lie. Congress passed a law called the Stolen Valor Act,
which was aimed at trying to punish these people
who pretended that they were military veterans and used that.
The Supreme Court struck that law down.
People were saying that they were war heroes.
Yeah, that's right.
Who didn't actually serve.
Or they weren't war heroes, either way.
Exactly right.
And it seemed like about a decade ago,
we were hearing these stories all the time
of people who either in politics or in business
had invented identities as war heroes or veterans
that were false.
And you can imagine how aggrieved and offended
actual veterans were that people would do this.
And Congress passed a law aimed at criminalizing this conduct. And the Supreme Court said, no,
the Supreme Court struck that law down and said, we're not going to have the government as an
arbiter of truth or falsity. And that, that doesn't mean that every lie is protected. Fraud,
for example, can be criminalized if I'm lying to
you in order to get you to purchase a defective product from me. But there has to be that kind
of concrete harm attached to the lie before it stops being constitutionally protected.
And that's why I said to you in the beginning of this conversation, law is not going to help us
very much here. Now, that would be the case even
if we were talking about government intervention, and we're not, right? We're talking about a private
company and one of their business partners. But even if we were talking about government
regulation, the Supreme Court has been really protective of even false speech in most circumstances because, remember,
every time you have a rule saying, on this side, it's okay, and on that side, it's not okay,
someone has to be the decider. And this was really brought home to me in the early months
of the Trump administration. Because if you remember, at that time,
we weren't using the term misinformation.
It was fake news.
That was what was on everybody's lips.
What are we going to do about fake news?
What are we going to do about fake news?
And I think what they meant is,
what can the government do about fake news?
How can we stop this fake news?
And then before you could even blink,
the person who had co-opted the term
fake news was Trump. And Trump was using the term fake news to describe any accurate news story that
cast any aspersions on him or his administration. And that should have been a reminder to everybody
that supposing Congress had passed a fake news law in 2017, who would have been enforcing that law? The Trump administration
would have been enforcing that law. Someone has to actually decide what fits into that category
and what doesn't fit into that category. And that's a really huge power to give to governments.
We can't expect governments to apply it neutrally, fairly, scientifically. We know that they're going
to apply it politically, parochially for their own benefit.
And that really is why the Supreme Court has been very, very, very, very cautious about letting Congress put limits on any kind of speech, even false speech.
Okay, this is going to be a big question for me because who's the arbiter of truth?
Like who gets to decide?
That's the scary thing for me. So can I play something back to make sure I'm saying
what you said correctly or what I'm thinking about it correctly?
Sure.
That the government, the Supreme Court has been very cautious in allowing the government to be
the arbiter of what's truthful and what's not. Is that a fair statement?
That's a true statement.
Okay. So let me just give you an example. Inciting violence, you said that that was one of the
exceptions? Is that the right word maybe? Or considerations?
I think you can say a limitation on the right to free speech, right? So in almost every circumstance, we want to hold people responsible for their own conduct, not other people's conduct.
But in this narrow category of incitement, when my words directly lead to someone else's violence, then you might be able to hold both of us liable.
And here you might think about the Hutu radio broadcasters in Rwanda who really were using the airwaves to incite a genocide.
Or closer to home, and these were much closer calls, but when Trump was at his rallies telling his very riled up supporters to kick out protesters,
that is almost the paradigmatic example of a situation where you might hold the speaker
liable. Now, in most of those instances, he was careful to say, don't use any violence,
whether that was with a wink or a nod, who's to say. But those are really narrow situations because
it has to be the intent of the speaker that the words lead to violence and the violence needs to
be imminent. We're not going to say that,
you know, something that you put on Twitter today that arguably has an effect, you know,
days or weeks later fits into this very, very narrow exception. It does not. It shouldn't because of the imminence requirement, right? Again, the principle here is that people should
be responsible for their own conduct, not other people's conduct. The incitement doctrine is an exception to that principle,
where you can hold a speaker responsible for other people's conduct,
but we need that to be a very, very narrow exception.
Otherwise, all kinds of speech can be forbidden.
So here's what I'm struggling with.
I agree with everything you just said.
It pisses me off, but I agree with it. I agree that
that limitation should be very narrow and very imminent. But I personally struggle with not just
the belief, but I think we have historical data that language is a precursor to dehumanization
and dehumanization leads to violence. Like, are you saying I have to go too far? Like there's
not enough direct connection? Legally, yes, but I don't think that's an adequate answer to your
question because obviously words matter. Obviously words have power. Obviously they have
meaning in the world. They have impact in the world. Otherwise we wouldn't be so concerned
about this. We wouldn't need so much protection for free speech. But we have so many examples
through history of ideas, arguments, concepts that were prohibited one day and then became accepted consensus almost mainstream in a later
era. So we only really need free speech protection for ideas that are deeply unpopular to someone,
if not most people. Other words and ideas don't need protection at all. If you're saying things
that most people agree with, you don't need a constitutional right. We need a constitutional
right in the Bill of Rights for ideas, concepts, things that you would
say that majorities of people might be tempted to prohibit or disapprove of. So these are the
tensions, right? We have to acknowledge on the one hand that words, propaganda, disparagement,
discrimination, these things are real, that the words can cause harm in the world.
And then we have to be humble about our ability as a society to be able to make these distinctions correct at any time because majorities have been so wrong for so long about these things.
And we have to kind of give a wide berth for disagreement and even
offense when we're talking about legal interventions. Okay. I'm learning so much,
man. I'm so grateful for this conversation. I'm frustrated by it, but I'm really grateful for it.
Okay. So let's talk about, this really confuses me, and I don't know, maybe you'll have the answers and maybe you don't.
Let's talk about this argument by technology companies trying to tease out whether their platforms are publishers,
and then based on that answer, what their
responsibilities are to the public. Can you help me understand that?
I can, yeah. So let's start with something that's a little bit more straightforward.
Let's start with a traditional news publisher like the New York Times.
You may or may not know that the New York Times is in court this week. They've been sued by Sarah Palin for
defamation, and that case has gone to trial because the Times wrote an op-ed, an editorial piece,
that seemed to suggest that Sarah Palin's political speech had led, incited effectively,
a shooting of a politician. They corrected it the next day. There were some, at least,
opportunities for misunderstanding of that, but she has sued them for defamation.
The New York Times is responsible for every word it publishes. The New York Times exercises full
editorial control over everything that shows up in the newspaper. Contrast that with something like Facebook. Facebook has two and a half billion users who are
able to upload their words, ideas, thoughts, pictures instantly without any kind of review
in between. If I write a letter to the New York Times, someone looks at it, maybe asks me to edit
it. Ultimately, they publish it. But when they publish it, they've published it after an exercise of their own editorial control.
Facebook is exactly the opposite. You know, it may have rules that they want me to follow, but they have been architected, as has most of the, you know, modern for-profit internet, to allow us to speak first and then have the rules be applied later, if at all. This was by design. Congress,
in a 1990s law called the Communications Decency Act, which it has amended more than once,
essentially understood that there was no way to treat Facebook or, say, Yelp or any kind of
enthusiast site the same as a publisher in terms of liability,
if we wanted to allow people to be able to have
those kinds of interactions online.
Imagine if Yelp and its reviews were subject to defamation law
for everything that any of the millions of reviewers wrote there.
You could have a system that had that set of rules,
but you wouldn't have Yelp.
You can have one or the other. Either we're going to have these kinds of social networks like Twitter,
where our tweets go up without being reviewed by a legal team at Twitter, or you won't have
social media at all. And I'm sure some people would be happy with that as an outcome. But if
you want to have Twitter, Facebook, Instagram operate roughly the way they operate right now, you cannot make the tech companies liable in the same way that we can make the New
York Times liable for its editorial decisions. That doesn't mean that there isn't gray area in
between. There aren't things that you can do, you know, for example, and Europe is experimenting
with this as well, right? If once tech companies are aware that comments
have been posted or material has been posted that violates their terms of service or law,
they could be required to take it down. And this is how our copyright regime works. It's why you
see on YouTube, if you look, lots of things that used to be there aren't there anymore.
Someone posted it, but then YouTube found out that it was illegal copyright content and they
took it down.
So again, there's some in between here, but the idea that we should just apply the same rules
that we've applied to publishers, to all of these social network tech platforms, doesn't make any
sense if you believe that we should be able to have social media the way that we have right now,
which is largely without curation or
editing, or we'll get into the loaded word censorship from a major corporation. And then,
of course, there are platforms that are somewhere in between. We'll get to Spotify, I'm sure,
because we're on that platform and that's in the news lately. But that's the basic distinction that
law recognizes.
So can I ask you some detailed questions?
I'm going to have to really close my eyes as I think about this because it's so complicated.
So Sarah Palin, the thing that you talked about in court this week with the New York Times, that's falling into one of those very narrow limitations, right?
Correct, which is for libel or defamation.
If I say something about you,
a factual statement, knowing it to be false,
in some instances, you can sue me for having harmed your reputation.
That is an exception to free speech.
It's much harder for a public figure
like Sarah Palin to do that. She has to show
that the New York Times acted with what's called actual malice. And actual malice means they either
knew or had reckless disregard for the truth of whether what they were saying was true or false.
So that's a hard thing to prove at trial. She's going to lose. It's important to make predictions that can be verified or not.
I'm putting myself out.
I'm being brave, as you would say.
She's going to lose her trial.
But yes, this is one of those narrow exceptions for defamation.
But imagine if you applied the defamation regime,
as I said a minute ago, to Twitter
for all the defamation that takes place on that platform
on an hourly basis, right?
You couldn't have Twitter.
Okay.
So do we use the terms editorial control and curation synonymously?
Are they just supportive of each other? You know, like censorship, those are not terms of art.
Those terms don't have legal meaning.
Got it.
So you can use them however you like,
as long as your listeners more or less understand you.
Okay, so if you're going to use terms like curation
or editorial control,
you should define those terms when you're talking about them
just to be in a decent argument or discourse.
Yeah.
Okay, so here's one question. While I was watching the fistfight in my Twitter feed,
getting canceled, and then, you know, I'm canceling you because you're canceling someone
who canceled someone who's canceled, you know, like all that bullshit. I saw a lot of people
saying, I don't know why you're accusing her of censorship. Only the government can censor people.
Is that true?
Again, people can use that term how they choose. It might be clarifying to use the term private censorship, but even censorship when the government does it is not a legal term of art.
Censorship is not a legal word. So I understand what they're saying. You know, what they're saying is the
government can be legally constrained from censorship activities, whereas a private actor
like Spotify has its own First Amendment right to decide who it wants to associate with. That's true.
But if Spotify said, for example, you know what, this whole debate about critical race theory is just too
divisive and it's too controversial. And we've decided we don't want to have any content about
that on our platform, whether pro or con, it's tearing this country apart. I would regard that
as censorship. If they did the same thing for other controversial issues, that doesn't mean
that they've done anything illegal. They've done something legal. In the same way that if a private school decides, you know what,
we don't want to have any books in our library that have any sexual content. So we're getting
rid of The Bluest Eye by Toni Morrison. We're taking it off our shelves. I would regard that
as an act of censorship, even though legally as a private school, they're not constrained by any law. They can decide what
books are on their shelf without a court having any say in the matter. So once again, censorship
is a word that you ought to define if you're using it. I think when people say private actors can't
censor, what they're really saying is the law protects private actors in the way that the law
would not protect a government action. And that's absolutely true.
Can you say that again? Can you say that again? That seems really important.
Right. So let's put it this way. If Spotify says, we have decided to define misinformation
in this way and to ban it from our platform.
The law protects Spotify's decision to do that.
Whereas if Congress passed a law saying,
we are defining misinformation in this way
and Spotify can't have it on its platform,
that would be a First Amendment violation.
Got it.
Right?
So that I think is what, you know,
being charitable people are saying when they say it can't be censorship if it's a private entity.
Again, I regard, you know, the decision by Amazon or Walmart not to sell a certain book, right?
That's not illegal, but I would regard it as censorship if it was driven by ideology. And maybe the easiest way to clarify this is just to distinguish between government censorship and private censorship.
Okay, that's so helpful for me.
I have this question, and it's going to be in the vein of the dehumanizing language thing.
You said fraud is a limitation on free speech if you're using your speech to defraud?
I would put it another way, which is...
Okay, how would you put it? I would say
so governments, state governments,
local governments, the federal government
can pass criminal laws
prohibiting fraud
and have those survive
First Amendment challenge because
they tie the false speech to economic harm. So I'm defrauding
you out of your money. If it's just aimed at confusing you or giving you false information,
that's going to be considered protected speech. But if it is about actually giving you false information
so that you'll give me your money,
then that can be regulated
without running afoul of the First Amendment.
So what if I say things that are untrue
in order to gain a specific audience
which raises how much money I make?
Do you understand what I'm saying?
I do, but I don't think that you can
cut it that cleanly. I think that would be protected speech. We all speak to gain audience.
Gaining audience often brings material benefits to us. And so that wouldn't fall into the
fraud exception to free speech in the same way that if you were selling a COVID pill
that you knew was actually just a placebo,
but you put a label on it and you sell it as an elixir
without any government approval and people give their money,
that can be regulated in a way that you're just,
shall we say, lying for aggrandizement and to grow your audience, that would be protected speech.
I mean, that sounds a lot like politics to me.
Yeah, yeah, yeah. That's why I'm like testing the limits here. It's really interesting because all
these kind of limitations or exceptions are so tight. This is making sense to me. And I actually agree with it. It scares me because
I don't know who's going to be the arbiter. So it scares me to have, do you know what I mean?
I do. But I think that we want to leave as much room as possible for political speech.
Political speech can be really offensive.
It can be really demeaning.
It can be very, very challenging.
But we will always come up against this problem
of, as you just said, who is the arbiter?
And when you think about, you know,
hateful speech is another category
where other countries have come up
with different systems than we have.
We don't have a concept called hate speech in American law.
Hate speech is, hateful speech is protected under American law.
And when you think about who these arbiters are going to be,
in almost every case, it's not going to be the Attorney General of the United States.
It's going to be the principal of a school in a rural district.
It's going to be a sheriff somewhere.
It's going to be a university president under pressure. One person's hate group will be KKK.
Another's is going to be BLM, and another's is going to be BDS, depending on where you stand politically. BDS, of course, is the Boycott, Divestment, and Sanctions movement aimed at boycotting Israel, and 25 states have passed laws aimed at restricting those protests. One person's hate group is Black Lives Matter and another's is the Ku Klux Klan, right? There is no consensus
among the political leadership of our country about what is hateful. Many politicians have
labeled the ACLU a hate group and have said it's one of the most dangerous organizations in the United States.
And that's why, you know, we want the law to have a very, very soft touch in these areas and really
kind of police things at certain extremes. But we don't want to have regular intrusions
because history shows that these powers will not be used in the ways that you might expect.
Yeah.
So what are your thoughts about Spotify pulling down all of Joe Rogan's episodes where he uses the N-word?
Look, I think these are really hard calls.
First of all, I should say, this is not a legal question.
I've said this before, and I just want to repeat it. Spotify has its own First Amendment right, a constitutionally protected right, to decide with whom it wants to associate and to decide what content it wants to publish. When we're talking about who the speaker is here in terms of regulation, Spotify is actually the speaker.
No, you lost me.
In terms of government regulation,
if the government tried to tell Spotify to take these down,
in that case, Spotify is the speaker.
And Spotify says, you can't actually interfere with our free speech.
We've decided we want to publish this.
Using the N-word is constitutionally protected.
So there's no illegality there.
This is a question of Spotify's values
and what Spotify wants to be associated with. In this instance, it seems like Rogan has apologized.
I don't know whether he objected. He may even have supported having some of those old episodes
taken down. So that one might not be as difficult as some of the other harder questions like,
you know, should he still have a platform?
Should they still be hosting him? But I will also say that one of the reasons why we need to be
humble and cautious here, well, a few reasons. First of all, stepping way back, my life until
I went to college is unrecorded. And I thank my lucky stars for that all the time. My parents
didn't even own a camera, right? There's school photos of me once a year. And none of the stupidest
shit that I said when I was 15 years old is preserved anywhere, or 16 or 17, or even 21.
And it's only remembered, if at all, by the people who were immediately around me. And we're now living in a technologically merciless world where everything is forever.
That's why Snowden's memoir is called Permanent Record.
That, you know, we are now surrounded by ourselves,
and it makes it so much harder to try on personas, to try out ideas,
to just experiment with different versions of ourselves without having consequences. So I always get a little nervous when people start digging up something
from seven years ago, nine years ago, 11 years ago, and using it to form a conclusion, not about
what someone said, but about who they are, as if that's a static thing. And you see it even
happening to 15 and 16 year olds who are having their college acceptances revoked
on the argument that we don't want a racist here.
Not someone who said something racist at 15, a racist.
And in public interest law, we like to say
people are a lot more than the worst thing
they've ever done or said.
It's why we're against capital punishment.
It's why we want mercy for people who are in prison, even who have committed serious
crimes.
And we don't extend that mercy to people for speech crimes against the current norms.
And so I think we need to be really cautious, given how those norms change, about taking
today's set of understandings, even if it's a consensus we agree with, and applying it uncritically to people's past statements and using that to form conclusions
about their identities. So that, I mean, and I say all this with a lot of caution,
it seems like this is a week where every day we're going to find out something different that
Joe Rogan said that is going to be appalling and offensive. And the point of my speech just now was
not to defend any of that and not to really
say anything about him. I don't know him or what he stands for or whether he's changed. I try to
take apologies at face value if they're sincere. But to go back to your question, taking down these
old episodes, leaving him with his current platform to show whether his remorse is sincere,
doesn't strike me as an unprincipled stance for Spotify to be taking. It strikes me as one of many possible principled stances they
could take. Let me ask you this question. What is your perspective? This is such a hard topic for me
because just to be honest with you, because I'm married to a physician whose life seems threatened by it. But what is your perspective on COVID misinformation?
It's a big and hard topic. I'll start with me and then go out to the idea. I've had COVID twice.
I've been very lucky. I had the alpha version in March of 2020 when the ambulances were going by
all day and night and there were no tests available.
I had the Omicron version just last month. Both of them were mild enough. So I feel fortunate in
that regard. I'm glad. Look, I am dismayed that our vaccination rates are not high enough to allow us to return to a much more normal life.
And the reason why we can't return to a much more normal life is if our hospital rooms are so full
of COVID patients that they can't treat other patients, then that's just not a way for a society
to be able to function. And so I wish that the vaccination rates were considerably higher, given what we now know to be their really shockingly effective rates at keeping people from serious illness and death.
I completely understand, even if I don't relate to, the specific distrust of institutions that's driving some of the hesitation here.
Our mantra, trust the science, trust the experts,
and all of this is a lot less persuasive
when you look at the last 20 years of American life.
God dang, that's true.
When you look at the experts who brought us the war in Iraq and torture prisons, who brought us the
housing bubble and financial crisis and deregulation, who have brought us some of the
worst income inequality that you see anywhere in the world. And then even during COVID, of course,
the medical consensus has shifted. And I think one of the problems was that we sort of set up
science on one side
as being the thing that's right and then everything else. Whereas, as you know, because you're married
to a physician, science is a series of questions and hypotheses that have to be tested. And so if
you say, you know, this is the truth. The truth is you don't need a mask unless you are working
in an emergency room. Now, at the time, the reason for that message was
we are worried that we don't have enough
of this PPE equipment for people on the front lines,
and we're lying for the public's own good.
Well, that always backfires
when you try that kind of messaging,
and it really undermines trust.
And so when the voice of authority
turns out to have been incorrect,
it doesn't just change our view about that particular question, but it undermines the
authority of the institution. We have now seen that this is an institution that is willing to be
misleading for our best interests as they understand our best interests. And that's
really problematic. Look, there was another pretty controversial public health misstep when the same public health experts who had been saying, don't go outside at all, when the George Floyd protests started, said, you know what?
Go out there and protest because racism is also a public health problem.
Now, I was out there protesting, so this wasn't a problem for me.
But you can see how you erode your authority when you say how you should behave depends on why
you're doing it. And then I would also say people have good reason to be suspicious of big pharma.
Big pharma has brought us the opioid crisis. And so, you know, the idea that we're all just going
to trust this voice of reason, it's unrealistic. It ignores reality. And what we really need to be thinking
about is not how we can bully people with reason, but how we can persuade. How do you reestablish,
or how do you establish trust and authority in these kinds of situations? Look, I hope that
there are not millions of people listening to the Joe Rogan podcast, hearing whoever
he has on there, and taking everything that they hear at face value. Surely some are. The
numbers are too big. If he has 11 million listeners, some of them are probably listening to that.
I hope people realize that he is, you know, the host of Fear Factor. He's not someone who
brings any expertise to this.
All he's doing is asking questions.
And I think that the response to this,
the response to this mini scandal,
has been fairly positive in the sense that
Spotify is going to be clearer about what their rules are.
They are going to attach better information
to worse information.
But I have to say, and this is why I wish that we had
someone in this conversation who was really an expert on misinformation and public messaging.
How do you regain the trust of large numbers of people who have lost faith in the wise authority
of these institutions? I don't think it's by kicking people off platforms. I don't
think that's actually going to make the situation better. I think Joe Rogan, if he leaves Spotify,
will bring most of his audience with him wherever he goes.
Oh, for sure. 100%.
And I don't think that the message that they will take from this is we should trust him less,
or we should trust his guests less, or we should trust science more.
And that's why I want us to be focused on how you can ameliorate the misinformation,
not how we can punish the speaker or reveal our own purity.
How do we actually understand why someone like this draws an audience like that?
And how do we find a way to communicate with that audience in a way that is not just finger-wagging and scolding?
It's true because there's some data that points to the fact that silencing debates about vaccines actually increases vaccine hesitancy
and that rigorous debate increases vaccine compliance.
So for me, as someone who believes
in the efficacy and safety of vaccines,
I would not want to see the debate go away.
I think that's right.
And I think tone matters a lot too.
You know, I remember when I was a young lawyer at the ACLU
and I had been quoted in the newspaper about something.
I don't remember what. And someone from our communications department came into my office and she said,
nice quote in the newspaper, but you started your sentence with the word obviously.
And when an ACLU lawyer starts a sentence with the word obviously,
30% of the people tune out
because they're used to being lectured by someone like you,
a snooty, elitist ACLU lawyer.
So don't say the exact same thing.
Take out obviously.
And it was really one of these light bulb moments
where I was like, wow, I'm really condescending,
and that's going to affect my ability to communicate here, even though I think I'm just as right.
But how I say that, how I approach it, and how I understand how I appear, how I show up in a quote in the newspaper as an ACLU lawyer, is really important to getting the message across in a way that works.
I want to read something.
Before I read this, I want to ask you about this.
So I was reading in Lawfare.
I don't know what that is, but it's L-A-W-F-A-R-E.
Do you know what that is?
Yeah, it's like a blog for people in the broad world of national security law.
Okay. So it's interesting. It's talking about warnings that work and warnings that don't.
We're pulling away from a little bit of your, let's dig into the deeper question,
because I do want to talk about this a little bit.
So the title of the article, again, from Lawfare, Warnings at Work, Combating Misinformation Without Deplatforming.
I'm actually, to be honest with you, not a fan of deplatforming.
That feels dangerous to me and a slippery slope into something that there are a fuckload of people that would love to see me deplatformed right now.
So I'm not sure that I'm a fan of it just because, you know.
But what was interesting about this article is they draw a comparison to early internet days when there were security warnings about, you know, you had web browser warnings and, you know, malware and that none of it really worked,
but they're calling these contextual warnings,
which is like what we see now on,
I think, Facebook.
We see that on Twitter,
now on Spotify, contextual warnings.
But they did a study
where interstitial warnings,
do you know what these are?
I don't think so.
They come up and you have to click that you understand.
I see.
Like we're using cookies.
Yeah.
And so while the contextual warnings, I'm reading this verbatim
because I don't want to screw it up for obvious reasons,
really didn't slow anyone down or make anyone think twice.
These interstitial warnings, this is the quote,
dramatically changed what users chose to read. Users overwhelmingly noticed the warnings, considered the warnings, and then either
declined to read the flagged content or sought out alternative information to verify it. That's
interesting to me. Do you think that's interesting or no? Very much so. And I think we need a lot
more research of this kind, of exactly this kind.
Me too. What works? I think there is a lot of that going on right now, just because, you know, as we said before, starting after Brexit and the Trump election in 2016, there have been a lot more claims made about the power of fake news or misinformation or disinformation, but there hadn't been enough concrete research about both what effect it has and what the best ways are to ameliorate it. How
do you get around this? And it's a really hard problem because, you know, we talk about this in
a lot of contexts, but when opinions ossify into identities, then it's not just a question of what
you believe, but it's a question of who you are. Your identity.
Yeah, I think what's been so surprising for me is that this has showed up in the context of the pandemic,
that people now have identities on different sides of the pandemic response.
I mean, masking is obviously a good example.
And we saw it in some of our bluer communities, people wearing masks outdoors,
even after the CDC said, well, maybe that's not so necessary, because it had really been a way of showing I care about others and I'm not Trump.
And, you know, much more.
It's the new NPR bag.
Right.
Much more so, of course, on the other side where people are now using vaccines for identity purpose.
And you have, you know, Fox News hosts who refuse to answer the question, have you been vaccinated? Because they
don't want to admit that they have, because it's the wrong answer in the identity that they're
trying to forge here. So that has been somewhat surprising. And the question is, do these kinds
of interventions that you're describing have a way of getting around that kind of identity defense?
And on that, I'm really not an expert at all.
Yeah, I just think it's interesting.
I think one of the hard problems that they talk about here,
I've read three or four peer-reviewed articles.
This is not a peer-reviewed article.
It's in the process of being submitted as one.
But the three or four that I've read
around what kind of interventions work around misinformation
all said that part of the problem is the very small amount of data public platforms are willing to release.
So let's talk about something where we have a bigger problem.
Well, I mean, we don't have a bigger problem, but we have a root problem around the power these platforms have.
Like, what would you say about that?
Like, is this, are these antitrust issues?
Are these, I mean.
I think so.
I think, yes, I endorse that comment 100%.
What we've seen is that the most important places for free speech in our society are now platforms that are privately owned by Silicon Valley oligarchs.
And you'd be making a huge mistake if you thought that those oligarchs share your political views and values. They do when it works for them,
and they will turn on a dime if the political environment requires them to do that.
So these are institutions that are not answerable to the public. They're answerable to their
shareholders and bottom line. They are corporations, and they are the ones now that are making the godlike
decisions about who gets to speak and how. Now, as you say, that wouldn't be as big of a problem
if there were a much more diverse ecosystem of those kinds of platforms. But when you have
really dominant ones, when you have billions of people around the world on the same
platform, when you have huge percentages of the public getting their news or information from
a handful of essentially advertising companies, that is a major threat to free expression that
our constitutional law has nothing to say about because these are private entities, right?
Facebook gets to decide if they
want to have nudity on their site. The government doesn't get to object to Facebook's policy on
that. Twitter has made a different decision. And it may be that antitrust is the only lever
that can be applied effectively. We have not had, in the last two generations, much of a tradition of robust antitrust enforcement, and we may not even have the right legal framework for it. The same thing is happening in Europe, by the way. They call it competition law. The Europeans are even more aggrieved by this because it's American companies that are dominating this global infrastructure. So I do think that the government could do a lot more to try to
improve the playing field, the market by which these kinds of companies are allowed to become
the behemoths that they are right now. It's very hard when I hear you saying this, Ben, because when I read how these founders and CEOs talk about their mission and their vision.
They don't talk about necessarily market cap or they don't talk about those issues.
They talk about owning audio globally.
Yeah.
I thought you were going to say something different, which is that they talk about not being
evil, or they talk about the wonderful...
I don't think they say that as much as, I mean, just from what I've heard, what I hear when they're talking, at least
to Wall Street or when they're talking to investors, what they're talking about is global
domination as the business goal. And what I hear you saying, and again, I'll play this back and
you can correct me because I've been off a couple of times, at least during this conversation,
is that, I mean, on a scale from one to 10, 10 being really confident and one being this shit will never happen. How confident are you
that we could put together, that we will put together a framework for antitrust for competition
in the next 10 years that will loosen up this clot? Five. I do think that.
I love it. But what if I said you're not allowed to say five,
like I'm a researcher.
You don't get a five.
You go one through four, six through 10.
Okay.
Look, I think that there is a global recognition
that these corporations have gotten too powerful.
I think part of the same conversation, though,
needs to be not just their dominance of our communications platforms,
but how these massive corporations have contributed to income inequality.
Relative to the size of these corporations,
they're not employing a lot of people.
And if you look at something like
Amazon, they've wiped out main streets across the country. And the law has a hard time dealing
with that because for you as an individual, Amazon is great. You click a button and they
deliver it to your door for less than you would have paid at the old store in the downtown. For us as a society,
it can be very damaging because all of a sudden we now have way fewer middle-class families who
have professions across that supply chain that is now owned by one company. So these are really,
really hard issues, because efficiency, which is only going to accelerate as we move into more automation with AI, has made more
and more middle-class jobs obsolete. And now this is where it's really going to go dark
in this conversation. But we don't have really too many persuasive examples of having democracy
without a middle class, without a strong middle class. And when you have some people who
are very rich, and most people who really aren't, and are losing in the system and feel themselves
losing in the system, that's when people become much more susceptible to populism, demagoguery,
and particularly the right-wing versions of that. And so then you start seeing Trump's election in 2016
not as just the product of some meddling Russians and Facebook,
but as part of a trend that we're seeing across Western societies
of right-wing populists coming to power
and the old traditional more centrist parties shrinking.
So we're going to have to get
a handle on that problem. How are we going to share the benefits of all of the great efficiency
that technology is bringing us and not just have a smaller and smaller number of winners
and everybody else being on the outside? So I think that went quite a bit beyond where
we meant to go here.
But I think that is the bigger challenge.
No, it's exactly where I wanted to go.
I believe there's a profound danger in disconnecting these things.
Or tapping out when it gets too dark or too complex or too related in hard ways.
I just think that's, I think you can't separate what's happening, I don't think. I just don't think you
can. And I'm, you know, as a social worker, class, working conditions, as a former union steward,
you know, these are things that, unless they're put in front of us over and over, can so slip away in every conversation, but
I'm so glad you brought it up. It's really important. And I think where I got chills
was if you look back at history, which I love, when you said this, I was like, is that true?
Stay focused on what he's saying, but is that true? Very few examples of strong democracies that are not held together by a strong middle class, or even a strong working class. I mean, yeah, I think this is a really important conversation. I'm so grateful you connected that for us. Yeah, and it's why when people talk about Facebook
as a threat to democracy,
the last thing we should be worried about
is their content moderation policies.
The first thing we should be worried about
is how these companies are contributing
to the hoarding of resources of a few
and the hollowing out of the middle of the country
in a way that's going to have
really long-term corrosive consequences,
unless we get a handle on it somehow.
And maybe the tax code is the way that you get a handle on it.
Maybe antitrust is part of the answer.
But as I said before, automation, AI,
is only going to accelerate these kinds of trends.
And to me, even though it's not what I work on, you know, economic justice, that's the
urgent problem that's at the center of all of these issues.
All right.
I'm going to close by asking you a question about you.
And I've changed them up a little bit, but I'm going to go to our rapid-fire questions.
I'm kind of scared of one of them because I know you'll just say what it is.
But let's start.
Fill in the blank for me.
Vulnerability is?
Difficult.
Do you want to say more?
Yeah.
I mean, the great Buddhist line, there's no self to defend, is pretty hard to live up to. And it's particularly
hard for me. So I have a lot of defenses. Yeah. Me too. Okay. Number two, you're called to be
very brave, but your fear is real. What do you do? I hold it. Yeah. Okay. Something that people
often get wrong about you? They always think that I'm kidding and can't tell when I'm actually being
earnest. Really? Yeah. That hasn't come through as much in this conversation,
but I would say I sit on that razor's edge
between irony and earnestness a lot.
And so when I'm not wisecracking
and when I'm trying to say something sincere,
people think that I'm putting them on.
Do you have any strategies for when you have to defend
an individual or a group whose behavior or speech you find just offensive or even devastating?
It's very rare, I want to say that.
People's image of us is doing this all the time, defending the people who want to burn down our office. These are very, very rare instances, but I think my strategy is more intellectual
than emotional in those situations. And it is just to imagine a world where the shoe is on
the other foot, the wrong people have the power, and I'm the one being censored.
Powerful. Okay, we're taking a hard turn. Are you ready?
I am. Last TV show that you binged and loved.
This is going to be a little, it's going to make me sound more elitist than I am because I
try to read books and watch movies rather than TV shows. But let's just say I rewatched
Deadwood, the HBO Western. And I wish there were more shows
like that one. Just brilliant. Favorite movie of all time. You know that you have to have like
five or 10 favorite movies of all time. You can have two, max. I can have two. Well,
let me do it this way. I'm going to do it this way. The best lawyer movie of all time, and it's not even close, is My Cousin Vinny.
Everything else is distant.
The movie that I've seen the most times is The Big Lebowski, probably followed by Dazed and Confused.
But I'm still a sucker for Godfather 1 and Godfather 2.
I just learned so much about you in like 60 seconds.
Good questions.
Like so much.
Yeah.
A concert that you'll never forget.
There's so many.
All right.
I'll, you know, I will be brave and admit it.
13 years old, Billy Joel at Madison Square Garden.
Oh, come on. Yeah, that's a classic. I don't even
think you're the first person that said that actually on our podcast. Favorite meal of all
time? Sally's Pizza in New Haven, Connecticut. That would be my last meal. If they ever tried
to execute me, that'll be my last meal. What do you get on your pizza? Oh, I mean,
the main rule is not too many things.
You know, maximum of three things,
two would be even better.
But you know, you have to have four or five pizzas on the table, so.
Got it.
So you're sampling it around.
Yeah.
Okay, tell us what's on your nightstand.
I'm so curious about this one.
Right now, well, so I read very, very,
very little nonfiction
because I feel like my work and life is nonfiction.
And so I'm reading right now a novel by Heinrich Boll, who was a post-war German
writer, most famous for a novel called The Clown. This one is called The Safety Net,
and it's really fascinating. It's pretty much about the way in which the security that surrounds this important person to protect
his life, in fact, completely constrains his freedom and that of his family and corrodes
his relationships and all of that. And it's a pretty great literary representation of being
trapped in your shelter. Oh, yeah. You put this thing up to protect yourself, but actually you're
the one who's caught inside.
You're the prisoner.
Right.
I mean, which is what I think that we do to our society sometimes in the face of certain kinds of threats.
It's certainly how I would describe our reaction to 9-11 was that we have to take all of these
harsh actions against a real but pretty distant threat.
And, you know, do we really need to be scaring people on TV every night with color-coded terror alerts? Is that actually useful? Who is it helping? So this is a really
terrific novel. Wow. Okay. Give me a snapshot of an ordinary moment in your life that really
brings you joy. I love it when I can give my senior rescue dog a really good bone from a restaurant that I shouldn't give her because it's a cooked bone from something that I ate.
But, you know, she was a street dog for seven years, so she can handle it.
And it was just that moment of, you know, just pure gratitude and love as she goes to chew on a lamb shank.
Okay. I had to add this one.
What keeps you up at night? I mean, I'm going to talk about one night in particular.
Okay. Interesting.
I had a very fortunate pandemic. I became a first-time parent at the ripe young age of 49 on Halloween in 2020.
Congratulations!
She was actually due on Election Day, and the day that this baby girl came home from the hospital was Election Day.
And we went to sleep that night not knowing what the outcome of that election was.
And this goes beyond who the president was going to be.
But I just had this crystallized thought.
What is this world going to be in 20 years, 40 years, 60 years, 80 years? And how can you launch a new life into this right now when it seems at this moment, at this dark moment
in my night, that it may be that we're coming to the end of this peaceful, safe, secure, free
world that I've lived, you know, 50 years in. And we're about to enter a world of no democracy,
climate catastrophe, violence. And let's just say I felt much better
in the morning, but there was just this, these things coming together at the same time, this,
you know, are we about to turn this dark corner and here's this new life that, that made that a
really, really long night for me. That is, yeah, that is a, you were right in the middle of life right there man
like yeah yeah oh my god i love that congratulations on your daughter thank you
i wish i could see his face he's got a big smile
All right, we asked you for five songs for your mini mixtape. Let me tell you what you picked:
Freedom Blues by Little Richard, Talking Loud and Saying Nothing by James Brown, Perfect Day by Lou Reed, As by Stevie Wonder, and Happiness Is a Warm Gun by the Beatles.
In one sentence, what does this mini mixtape say about you, Ben Wizner?
It says sometimes I'm too clever for my own good.
I was trying to choose things that would be both sincerely related to my taste,
but also topical for our conversation.
So that's how this came into being.
Yeah, you nailed it.
Thank you so much for your time.
And I really, you were right.
There was a lot more grappling than there was certainty.
I think that's right.
And I just want to say, when you say we need to find a way to balance the harms
that speech can cause and the harms that censorship can cause, that actually is precisely what we're
trying to grapple with here. And that anybody who approaches an issue like this with firm
certainty is probably going to be angry at both of us for the conversation that we just had,
and we're going to have to live with that.
Yeah, I think misinformation, threat to democracy,
censorship, threat to democracy,
an unwillingness to grapple with nuanced, difficult things,
an equal threat to democracy.
I mean, it's just, I want answers because I love certainty,
but I don't have them.
And I think we just, as a culture, are not willing to pause and ask questions before we launch.
So much pain, you know, and I could have probably done a lot of things better, but I'm really trying to understand.
And that is not easy.
No, especially when we're operating in a dopamine casino like Twitter that really runs on outrage.
So you need to be able to step back from that. I'm glad that we had over an hour to talk about
this and I didn't have to fuse it into some evil corporate ontology of 140 characters, right?
No. Yeah, it just doesn't work.
It doesn't work.
Thank you so much, Ben.
Thank you so much.
All right, y'all.
That conversation kind of, I don't know, it surprised me, I think, is the word that I'm looking for.
Like, I was surprised how complex these issues are. But you know what really surprised me the most is that
the center will hold if we don't allow ourselves to have these knee-jerk reactions,
race to ideology, you know, scream and yell at people when we don't understand. Like,
I think the whole system works if we're willing to have nuanced, complex conversations and engage
in like real critical thinking, but we're just not willing to do that. It's so interesting. I paused my podcast and then I went up a couple of days later, like on February 2nd, I think,
and said, here's why. And the most vitriol I've received, for sure, is off Twitter.
But what was interesting is that less than 10% of the people who read the position I wrote on the website
came from Twitter. So people are responding without even reading what I'm thinking, or,
you know, they're just, it's just, again, a lot of it is bots. But I think the system is so beautifully designed, the architecture of it, but it requires education
and critical thought and thinking and debate in a way that we just don't do anymore.
Myself included sometimes.
I mean, look, while I stand for free speech all the time and have
a long history of doing it, we all have little inner censors within us that are like, hey,
you shut up. You know, you on the other hand, you talk, but you over there, you shut up.
Maybe the biggest threat to the system, in addition to
censorship and misinformation, is just the unwillingness to engage thoughtfully.
We'll be back next week with Unlocking Us and Dare to Lead. And guess who's in the studio? Hi, Laura.
Hi. Hi. Is next week Valentine's Day? Next week is Valentine's Day. Yeah. So we have
Dare to Lead coming out on Monday, and we have a little special Valentine's gift, an extra
episode of Unlocking Us for you as a Valentine's Day present from us.
And it's an excerpt from the new audio book of Atlas of the Heart. I just got it recorded
and it was so fun because they let me ad lib. So I read the book, but then they let me like
describe things and what I've learned since the book came out. And so it's kind of fun.
And I'm excited about it. So, and then Wednesday, we're coming back with a pod. So
Monday is Dan Pink. Dare to Lead, Dan Pink. We're talking about regret. Je ne regrette rien.
Except for the shit show last week, but I don't regret it. It just sucked. And then that
Monday, our Valentine's Day present, the bonus Unlocking Us. And then on Wednesday,
it is, oh my God, is it Jason? Perfect for Valentine's Week. Jason
Reynolds. Oh my God, this interview is so fun. Every episode of Unlocking Us and Dare to Lead
has episode pages on brenebrown.com. You can visit those episode pages for resources, downloads, and transcripts.
Takes us about five days to get the transcripts up.
I'm just happy to be back.
I know it's been really hard.
I know some of you are still grappling
with a lot of issues.
Hashtag, I am too.
But I'm going to make the best podcast I can
because I believe in the
conversations and I believe in this community. So thank you and stay awkward, brave, and kind.
I'll see you next week.
Unlocking Us is produced by Brene Brown Education and Research Group.
The music is by Carrie Rodriguez and Gina Chavez. Get new episodes as soon as
they're published by following Unlocking Us on your favorite podcast app. We are part of the
Vox Media Podcast Network. Discover more award-winning shows at podcast.voxmedia.com.