Today, Explained - Never tweet
Episode Date: August 17, 2018
Twitter temporarily banned Alex Jones and Infowars this week. Why'd it take so long?
Transcript
Johnny Harris, you recently completed your second season of the Borders videos here at Vox.
You can find them at youtube.com slash Vox.
But also maybe, you know, the greatest spokesperson there ever was.
For getquip.com slash explained, Quip electric toothbrushes.
We talked about your wife, Izzy.
We talked about your youngest, Oliver.
I think today we're on oldest son.
Henry, a five-year-old.
A five-year-old.
Yes.
Henry's always been more amenable to teeth brushing,
and he's just not as contrarian as Oliver.
Okay.
Where's the tension?
We need tension in each one of these vignettes.
Hey, y'all.
I'm Sam Sanders, not Sean Rameswaram, I know.
Usually I host my own show. It's called It's Been a Minute.
It's from NPR. Check it out.
But today I am hosting Sean's show while he is out on a little vacay.
All right, let's begin.
Today we're going to talk Twitter, specifically about the CEO of Twitter, Jack Dorsey.
For the longest time, Jack Dorsey has been someone who is seen, but not heard.
He wasn't the kind of CEO who was all over TV or in big magazine profiles.
To be honest, even though I would always see his tweets, and he tweets a fair amount, I
really had no idea what his voice sounds like.
Jack Dorsey was kind of the Tom from MySpace of this current tech generation.
Always there, but quiet.
A little over a week ago, that changed.
All of a sudden, Jack Dorsey spoke.
Thank you, Sean, first for the opportunity to talk with your listeners
and also painting a picture of the complexities that we're facing.
And then he spoke some more.
We're looking at our current policies and evolving our policies under the changing circumstances that we see around us.
He's still talking.
Jack Dorsey going on somewhat of an apology tour, if you will.
Because he has to.
On the right and the left right now, it seems everyone is saying Twitter has to clean up its act.
When it comes to speech on the platform, when it comes to who gets to be on the platform, when it comes to who's even real on the platform.
Yeah, it sort of feels like a very critical turning point for the whole of the tech industry.
Tony Romm covers tech policy for The Washington Post.
Now, we've known for a long time that social media platforms have a lot of good on them, people who are connecting over legitimate things like politics, and they have a lot of bad,
like people who post racist content and people who spread disinformation, including what Russia
did during the 2016 presidential election. But InfoWars has been a longtime challenge for a lot
of these tech companies, including Facebook and Google and Twitter. Alex Jones on InfoWars has
done everything from attack the victims of the Sandy Hook shooting to defame Muslims and women
and other minority groups. And so these companies have been under pressure for a long time to do
something about it. At times, Facebook and others have said that Jones hadn't violated their
policies, but that pressure ultimately led those companies to change course just about a week ago.
We saw Spotify and Apple and Facebook start to take steps that led to Jones being kicked
off of each of those platforms.
Twitter was the last to do so.
But as of Wednesday, Twitter essentially had put InfoWars in timeout, which means that
Alex Jones and the InfoWars accounts aren't able to tweet for the next seven days.
Timeout sounds like a slap on the wrist and not a real long-term fix.
Am I wrong in thinking that?
That's what a lot of people say.
They would like to see Twitter kick Alex Jones and InfoWars off the platform entirely.
But this really speaks to the heart of the problem here, which is that Twitter and some
of these other companies don't want to be the arbiters of speech.
They don't want to be the ones deciding what's right and what's wrong. And they truly believe that better content,
things that aren't as inflammatory, things that aren't as offensive, will ultimately prevail in
the marketplace of ideas. So with Twitter, ultimately, the company found that Alex Jones
and Infowars generally might be posting repugnant things, but it wasn't their place to take it down
unless they did specific things, which was,
you know, for instance, threatening violence. And that's exactly what happened here. The reason why Twitter put Alex Jones and Infowars in timeout was because of a video that they recorded on Periscope.
We're under attack and you know that you pointed out mainstream media is the enemy,
but now it's time to act on the enemy.
Where essentially Alex Jones was threatening violence against the mainstream media and the
left. People need to have their battle rifles and everything ready at their bedsides.
And you got to be ready.
That was what Twitter said was too far.
And that's why these folks are in timeout.
So, I mean, to hear you say this, one, Alex Jones gets a timeout for seeming to incite violence against the mainstream media.
But our commander in chief, Donald Trump, has done so before too.
He's not timeouted.
Yeah, the president is not timeouted.
And this is actually something that we raised
when we sat down with Twitter CEO, Jack Dorsey,
just earlier Wednesday.
One of the big issues that Twitter has faced
is the fact that the president has attacked his opponents.
President Trump tweeting out this video over the weekend,
showing him in an old WWE clip.
But in this version, you can see he is body slamming CNN.
He, just a few days ago, called Omarosa, who recently penned a tell-all book about her time at the White House.
He called her a dog.
But when you ask Dorsey about it, what he says is that there's essentially an exception for public figures.
You don't make a distinction, however, you know, over policy, talking about taxes or tariffs versus calling a person a dog.
We make an understanding of who is actually saying that and whether they are a global leader, whether they are a public figure, and whether this is something that should be reported on and should be talked about. That the benefits of knowing how our leaders think,
what they think about issues and what they think about the people around them,
outweigh the need to police exactly what he's saying.
So you talked to Dorsey this week. He seems to have been making the rounds. He was even on Sean Hannity's show recently. I really appreciate you coming on because I'm sure this is probably
the last thing you want to do is come on my radio program. So to see him talking to the press more this week and last week,
and to see him putting someone like Alex Jones in timeout,
is there a sea change at Twitter?
Yeah, there's definitely a sea change.
People don't get why Twitter does what it does.
Like most human beings on the street don't know why
one tweet is allowed on Twitter and another is not.
So when you add to that the fact that there's regulatory pressure as well, it's put Dorsey
in this place where he has to talk to people.
He's got to be more clear about how Twitter makes the decisions that it makes, about why
it allows some content and why it bans others.
And he has to do it in a way that reassures folks and keeps them using the site.
Because after all, if people stop using Twitter, there's no product left.
There's no company left. And that's also partly why Dorsey has spent some time
courting conservatives in particular. It's sort of become this like cause du jour on the right
to claim that Twitter is biased against conservatives and conservative leaning news
and views. There isn't a whole lot of data about that, but at least anecdotally, they say that
they've been targeted. Someone like Dorsey
has to be out there on the road talking to these folks, doing these interviews, reassuring the
listeners of Hannity that no, in fact, Twitter isn't trying to stamp out conservative voices.
We do not shadow ban according to political ideology or viewpoint or content, period.
That's why you're seeing him doing more of the stuff that he wasn't doing in the past. Yeah. So you had a face to face with him. I've read, you know, stories out there that
seem to indicate that the folks at Twitter don't really have a handle on what their strategy should
be or what their policy should be. From your conversation with Jack Dorsey, does he know
actually what the policy and strategy going forward should be?
Well, on one hand,
Twitter knows what the problem is.
They know that there's a lot of content
on the platform
that probably doesn't belong.
They know they have a problem
with harassment.
They know they have a problem
with incendiary content.
And they saw during the 2016 election
what happens when misinformation spreads
and creates all kinds of political
and social unrest.
They're still kind of grappling with the best solutions there, something that somehow threads this narrow needle where they combat the things they don't like, but at the same time doesn't put Twitter in this position where it's the arbiter of what's true and what's false and the arbiter of what's news and what's, you know, quote-unquote fake news.
They don't want to be a traditional media company.
So they're really just throwing spaghetti
against the wall at this point,
thinking about what kinds of decisions
they have to make on the back end
with the way the technology works
and the way you see the tweets that you do
that might ensure that the conversation
is a little bit healthier.
And so one of the things that's come up
are these accounts that, you know,
maybe they're parody accounts,
maybe they're just people
who are kind of goofing off,
who will tweet something
that's just patently false,
but will widely reverberate.
We saw this this week
with a fake account for Peter Strzok, the FBI agent who had said some things, you know, in opposition to President Trump and was fired for it.
There was a fake Twitter account for Strzok
that had a tweet calling the president a madman that had more than 56,000 retweets. This is an
account that isn't the person that it says it is. It might be labeled a parody. People didn't seem
to care. It reverberated extremely widely on the platform, and Twitter does not have a solution for
it. And so the questions that they're asking now are, okay, what can we do to give people the context clues
they need to make better decisions
about what they read and what they share?
Senator Mark Warner, Democrat of Virginia,
floated a memo, a brief earlier this month
that seemed to indicate that he would be for declaring companies like
Facebook and Twitter utilities, which would allow Congress to regulate them more.
Could that ever happen?
I don't really know what that means, though.
Like, you can regulate a company without calling it a utility, to be clear.
I think the question generally is, will Congress regulate tech at all in any capacity, utility or otherwise?
And even on the bare-bones stuff, like things you would think are super low-hanging fruit, Democrats and Republicans can't really get any momentum.
Consider political ads.
There's nobody on Capitol Hill who believes that there isn't a bit of a problem when it comes to foreign agents trying to purchase online advertisements, which is illegal.
Foreign nationals cannot purchase political ads that way.
But that's precisely what happened during the 2016 election.
We saw Russian agents purchase ads on Facebook and Twitter, which reached millions of users online,
and in doing so, spread their messages in an attempt to cause social and political unrest.
So there's a proposal, it's a bipartisan proposal that would just require more disclosure of those ads. And even on that narrow question, Congress hasn't
even held a hearing on a bill that could be eventually brought to the floor and become law.
Like we're like multiple steps away from anything ever hitting a federal rule book on something
that everybody seems to agree is a serious problem. So when you think even further ahead to
things like regulating tech companies as utilities
or holding them accountable for the content that appears on their platforms, it just seems
like an impossible sell.
It seems like lawmakers just aren't as plugged into this as perhaps they should be.
After the break, if we were going to regulate social media, how would we even do it?
And could it ever make all of us happy?
I'm Sam Sanders. This is Today Explained.
So, Johnny, at the top of the show, you mentioned that we're going to talk about Henry's toothbrushing experience with the Quip electric toothbrush.
Henry was diagnosed with autism and actually has sensory deprivation, meaning some kids with autism have
sensory overload, meaning bright lights or sounds are too much. For him, it's the opposite. He needs
deep tissue massage. Anyway, all that is to say a vibrating toothbrush. First time he's ever had
that. He loves it. It's actually like sensory like therapy for him. Wow. Yeah. He like uses it and we
like see him like brushing his teeth, but he's not really brushing his teeth.
He's kind of just like letting it sit on his gums and like feel it.
And it's actually like really regulating for him.
Huh.
Yeah.
Amazing.
Yeah.
So that's it.
It's a whole different experience.
It's a whole different experience because we've never had electric toothbrushes in our house.
Yeah.
And so for Henry, the tension is not really tension.
It's actually a really positive thing for him.
Should Twitter police tweets on its platform?
That's an open question.
Now, whether they have to legally, that is not an open question.
They do not.
All because of one law, a law that forms the bedrock of the modern Internet.
It's Section 230 of the Communications Decency Act.
Section 230 says that Internet media companies can't be held responsible for what their users say on their platforms.
That means that pretty much no matter what disgusting things or illegal things or hateful things people post,
Twitter and Facebook and the rest of the gang,
they are totally off the hook. How could that possibly be a good thing?
The argument is we wouldn't have the internet that we have today without Section 230.
Danielle Citron is a professor at the University of Maryland School of Law.
She teaches and writes about information privacy, free expression,
and civil rights. And she says Section 230 is actually crucial to everything we know and love
online. Because if there's a risk that what users say and do creates potential liability for the platform, then it's going to allow people to complain. And then in turn, sites will worry about publisher and distributor liability and take it down. They won't even look to the merits. Just to protect themselves, they will overly take down valuable content, newsworthy content. And sites that don't have the resources
to police content their users are posting, they might just shut down entirely because they're
afraid to run afoul of the law. So it's true that Section 230 has been incredibly important for free expression,
but there are also costs to free speech.
It's true that, you know, in the face of threats and defamation and privacy invasions, people go offline.
It's not that Section 230 isn't a great thing, but because it's been so broadly interpreted, it means that even revenge
pornographers who encourage abuse, even sites like thedirty.com that say, hey, post all the
smut and gossip and nude photos you want, they're not even just not filtering, they're encouraging
it, and they get the immunity as well.
So we're in a moment now where the status quo, the Section 230 status quo, is not working for liberals or conservatives. In that kind of environment, is there any kind of new formula, new type of regulation that would work and make both sides happy?
Okay, so I have, I think, a third way. We preserve the immunity, but ensure that sites engage in
reasonable practices in the face of known and clear illegality. It's not going to satisfy
liberals entirely because hate speech is completely
protected, but it might satisfy those people who are worried about problems like cyber-stalking
and revenge porn, because if, in the face of known illegality, you don't engage in reasonable
practices, well, then you've given up your immunity, right? The other thing that I'm
thinking about is should we treat content platforms, so like the Twitters, the Facebooks,
the YouTubes, you know, at the top content layer of the internet, like public utilities?
What if we said, content service providers, you can't discriminate against people, not only because of who they are, you know, because they're a member of a protected group, but also because of their political views? And we would make clear that by political views we're not talking about true threats. We're not talking about cyber stalking. We're talking about truly ideological perspectives. So it's something at least some of
us are writing and thinking about. There is a division in thought about whether these companies are private companies or the new de facto public
square. And it seems that is the underlying tension in how we think about what these companies should do and how they should treat content: are these companies really totally private, with the right to do whatever they want? Or have they become such utilities that they have a higher obligation and are de facto public?
Right.
That binary is exactly the public conversation.
They're either totally private.
They're not First Amendment actors.
They can do whatever they want.
Or it's the virtual town square.
And it's actually really neither.
Right.
It's something in the middle.
Because, you know, unlike when I walk into a diner and I start saying things that
are outrageous and the diner says, listen, Danielle, you have to leave because we don't
like offensive speech.
You know, Twitter and Facebook and YouTube, they're not just any old diner, right?
They're not just any old private business.
Could we regulate them to recognize how important they are to speech
and expression? We could, right? Much in the way that we treat public utilities. Like at some point,
the telephone, which was AT&T, totally private company, it became really clear they were really
important to how we were communicating in the 20th century. And lawmakers said, it's so important,
we need to set limits and requirements of non-discrimination.
So when I hear you tell me that companies like Twitter aren't quite only private and can't just yell First Amendment, First Amendment, hate speech, hate speech, hate speech,
it seems as if our laws and our constitution and our conversation about these companies haven't kept up with the weird space these companies live in right now.
Has the law kept up with where a place like Twitter is right now?
It's struggling. Law is
a blunt instrument and it's slow. It's like the tortoise. And it's true that even the Supreme
Court in a case from last term, Packingham versus North Carolina, had this like rosy view of, and
I'm going to say the internet, but think of me saying it with air quotes, the internet. You know,
the court said, it's the virtual town
square. It's where we can all go be a town crier. And then it says, this is the Supreme Court saying
this in 2017, someday there may be mischief and illegal activity on the internet. And when that
happens, we'll revisit this. I swear to gosh, I thought to myself, and I'm reading these sentences, oh my golly, like, this is not 1996.
So we've laid out all of the problems and the challenges in fixing this situation.
We need to reconceptualize how we think of companies like Twitter.
We need the law to catch up with a company like Twitter.
And we need to find some kind of solution that works for both the left and the right. Knowing that all of those problems exist, if you were a betting woman, in two years, do you think we have this figured out?
That's a lot of pressure.
I would say I am cautiously optimistic that we will get some part of this figured out. Because if you had told
me 10 years ago that we would ever even have a conversation about Section 230 and that lawmakers
would be open to revising it, I would have told you you're out of your mind, Sam. I'm a little
Pollyannish, I know. But having seen the change in the last two years, I'm shocked, like blow me over, that even Mark Zuckerberg in his testimony said, time is now, we're responsible, something should be done.
So let's do it.
That was Professor Danielle Citron of the University of Maryland School of Law.
If you want to hear more about regulating speech on the Internet,
I want to refer you to an episode of my show from June 5th.
In that episode of It's Been a Minute, I chat with Nadine Strossen.
She's the former president of the ACLU,
and she tells me why she thinks the only way to beat hate speech is through even more free speech.
I'm Sam Sanders, happily filling in for my friend Sean Rameswaram while he is on vacation.
This is Today Explained.
Hey, guys, hold on. It's Sean, from vacation. I just found out that it's Sam's birthday
and he offered to guest host the show
even though it was his birthday
and then didn't tell any of us that it was his birthday.
So don't forget to do something.
One, two, three.
Happy birthday to you.
Happy birthday to you.
Happy birthday to you. Happy birthday, dear Sam.
Happy birthday, Sanders.
Happy birthday to you.
And many more.
Yeah.
There are all sorts of bonus surprises that come in that package.
The package that you get at getquip.com slash explained.
Yes.
And in some of those pouches come little boxes that have a giant thing of toothpaste.
And then it comes with these little travel toothpaste tubes, which I just love.
It even says on it like one week, one week of toothbrushing.
Oh, really?
It tells you like how many days?
Assuming twice a day brushing, probably.
So it probably lasts a little longer for me.
No comment.
No comment.