Front Burner - Trump gets deplatformed
Episode Date: January 11, 2021
U.S. President Donald Trump was permanently banned from Twitter after the platform cited “the risk of further incitement of violence” following the insurrection at the U.S. Capitol last week. Facebook previously banned him for the remainder of his time in office, and many other tech companies have followed suit. Today on Front Burner, Julia Angwin joins host Jayme Poisson for a conversation about Trump’s ban from multiple social media platforms and what consequences that might have. Angwin is editor-in-chief of The Markup, an American non-profit that takes on data-driven investigations about the ethics and impact of technology.
Transcript
In the Dragon's Den, a simple pitch can lead to a life-changing connection.
Watch new episodes of Dragon's Den free on CBC Gem. Brought to you in part by National Angel
Capital Organization, empowering Canada's entrepreneurs through angel investment and
industry connections. This is a CBC Podcast.
So on Friday night, Twitter played whack-a-mole with Donald Trump.
After previously slapping a 12-hour pause on his account,
Trump popped his head up, released a video message, and tweeted this. The 75 million great American patriots who voted for me,
America first and make America great again,
will have a giant voice long into
the future. They will not be disrespected or treated unfairly in any way, shape or form.
Lots of exclamation marks there. Trump also tweeted that he would skip Joe Biden's inauguration.
Twitter brought down the hammer, a permanent ban because of the quote,
risk of further incitement of violence following the riot on Capitol Hill.
Then out of another hole, he popped up again, this time on his campaign's Twitter account.
We will not be silenced, he wrote.
Twitter banned that account too.
And Trump, he had one last burrow to emerge from.
He commandeered a staffer's account.
The profile picture and display name changed to Donald J. Trump,
and Twitter wielded the mallet one last time.
That account got suspended too.
Over the weekend, a bunch of other tech companies followed suit.
Today, Julia Angwin is with me.
She's the editor-in-chief of The Markup, a nonprofit that takes on data-driven investigations
about the ethics and impact of technology.
We'll talk about Trump's ban on social media and what consequences that might have.
I'm Jayme Poisson, and this is Front Burner.
Hi, Julia. Thank you so much for making the time to speak with me today.
Hi, it's great to be here.
President Trump has now been banned from so many social media platforms, from Twitter.
88 million people following nothing.
Everything he's ever tweeted or retweeted, removed.
Facebook for at least the next two weeks.
His Twitch channel, Snapchat.
YouTube has removed some of his videos.
And also, now you can't buy Donald Trump merchandise using Shopify,
which is a Canadian e-commerce company. So if Trump were to try to use the internet today, what could he do?
Yeah, that's a good question, because every moment I hear of another platform banning him. And so
I feel like it's a video game where you keep trying to find the door
out, and you can't find it.
He's actually sort of what they call deplatformed.
Like almost every major service has kicked him off.
Right, right.
I want to start by asking you about Facebook and Twitter's decision to do this because they seem so significant to me.
For years, both Facebook and Twitter have defended having Trump on their
platforms. And let's talk about Twitter first. Obviously, it was the insurrection in Washington,
but what other factors do you think contributed to that decision?
Well, I don't think you can overstate the fact that Twitter employees
banded together, not all of them, but I think several hundred wrote a letter to management saying we need to remove Trump.
One telling NBC News, a lot of us are so happy and so proud to work for a company that did the right thing.
You know, one of the things about these big tech companies is that they aren't really scared of regulation.
They operate around the world and a lot of governments have tried to
crack down on them. And oftentimes they just thumb their nose at governments. But the employees have
a lot of power because these engineers are really expensive. They can work anywhere. And so we've
seen this with Google, when their employees got together and said, you know, we
don't want to work for the Defense Department.
This after a letter was signed by over 3,000 Google workers to the CEO, Sundar Pichai, saying in part, quote, we believe Google should not be in the business of war.
And it feels to me like Twitter was maybe pushed over the edge to make this decision by its own employees.
The other thing I was wondering about, and I'd love to get your thoughts, is the timing.
So Trump has two weeks left. And I also can't help thinking about the fact that the Democrats
now control both the House and the Senate, thanks to their wins in Georgia. And do you think that
this is playing a role in these decisions as well? I mean, of course, you can't really prove that,
but it does seem really convenient, right? Like, all of a sudden, there's a new administration in town. And so, you know, you could imagine they don't want to be antagonistic to that new administration. I mean, traditionally, when you look at the behavior of these platforms around the world, they do tend to be deferential to the government in power.
And so we see decisions like that all the time. If you think about Thailand, for instance,
I think Google is very deferential to the monarchy there, which doesn't allow sort of
pictures of the king that look bad. Google has agreed to help remove content insulting
Thailand's monarchy. Thailand's deputy prime minister said more than 100 items
insulting the monarchy have been found on the U.S. internet firm's services.
And you know, things like that would never be allowed here in the U.S. And so there is a tradition with
these companies of being deferential to power. And so although I have no evidence that this is
related to that, it doesn't seem surprising.
I want to talk about Facebook now. It's a little bit of a different conversation there. Definitely a different company. They've decided to ban him
for at least two weeks, essentially until he's out of office. According to Mark Zuckerberg,
in a statement, he said the shocking events of the last 24 hours clearly demonstrate that
President Donald Trump intends to use his remaining time in office to undermine the peaceful and lawful transition of power to his elected successor, Joe Biden.
But it wasn't that long ago, just back in May, that Facebook declined to remove Trump's post suggesting anti-black racism protesters in Minneapolis could be shot, which I'm sure you could argue
is an incitement of violence.
Zuckerberg defending his decision not to flag this or pull it down, saying, quote,
I disagree strongly with how the president spoke about this, but I believe people should
be able to see this for themselves because ultimately accountability for those in positions
of power can only happen when their speech is scrutinized.
But Twitter went all the way here.
Why do you think Facebook hasn't gone as far as Twitter?
I feel like with Twitter, they're kind of a live-and-let-live situation.
They don't have anywhere near the huge moderation presence that Facebook does, you know, with
tens of thousands of people around the world looking at content. But the thing about Facebook is they tend to be really concerned about
universality. Like they don't want to set a precedent for the future. And so they often
leave themselves like an optionality. And so in this case, I feel like it's like they wanted to
just sort of make the decision but not make it completely. And that's a position that I think
some people think is really unrealistic, but they've held on to it. And the thing that's
weird about that position is that it doesn't really make any sense to have a global standard
for like, what is hate speech, for instance, because hate speech is really context specific.
And so there are a lot of people who have said that hanging on to a global standard
is actually the problem here.
Hmm.
And so they would like more regional approaches?
Yeah. So like, you know, Europe has forced Facebook into a regional approach because they have rules banning, for instance, Nazi content.
So after years and years of fighting, they finally came up with a European hate speech
standard that they hold Facebook to. And Facebook, and the other platforms as well,
is actually accountable to remove things that violate European laws but aren't illegal here, right? So,
for instance, Holocaust denial, which, you know, until about three weeks ago, Facebook was fine with. But that's illegal in Europe. So they
agreed, under huge pressure, to do that just regionally there, but they're very upset about it.
And they didn't want to have to do that everywhere. Because imagine you have to put a whole team
that really understands that specific area and what those issues are. And that's money,
every employee that they have to add to look at content is hurting their profit margin.
This argument that Facebook wants across-the-board standards,
this is the same argument that Russian dissident and opposition politician
Alexei Navalny made over the weekend. He argued that emotions and personal political preferences
were at play here as other accounts belonging to other controversial leaders like Russian
President Vladimir Putin have not been banned. And I imagine that some people will argue that
this argument is essentially whataboutism: the violence at the Capitol building has such frightening implications for American democracy that this should have been a no-brainer for Zuckerberg, and he should have gone even further.
I mean, it's not necessarily a no-brainer, because there's a pretty legitimate argument that the public needs to know what the president is thinking. And so there are a
lot of smart people who have said, you know, it was right to give him a platform until he got to
the point of inciting violence. Now, when was that point? Like you said, there have been many times
that he's called for violence. So given that this one actually resulted in violence, or maybe more violence than the previous ones, was this the right threshold?
But it is difficult because there's a way in which these platforms serve as our public square.
And that's the problem.
The problem is we have our public square policed by private platforms and they can make decisions wherever they want.
Right. And I want to get to that with you in a few minutes as well, just the incredible amount of power these companies have.
But first, can we talk about where some of these people will go now? So Twitter is also banning
people this week who have promoted QAnon conspiracy theories. They've also banned some high profile
supporters of the president, like former National Security Advisor Michael Flynn, one of Trump's lawyers, Sidney
Powell. And lots of people are angry about the platforms banning Trump, and they say that they're
leaving of their own volition. A lot of people are saying that they're going to places like Parler.
And for those who might not have heard of Parler, it advertises itself as a free speech social network.
And can you talk to me a little bit about who uses it and how?
So Parler was set up basically because even before this,
a lot of people felt like their voices were being stifled on other social media.
And so they described themselves as sort of radical free speech.
Start a conversation. Share your opinions. Free speech never felt so good. Check out your feed. Follow your friends. And vote. Voting is the equivalent of liking. Agreeing. Standing with.
Which maybe is true, but it's also true that it was a place where they could express
views that were really, you know, not considered okay, right? Calling for violence, organizing to
kill Nancy Pelosi, talking about white supremacy,
talking about being Nazis. You know, these are things that the other platforms didn't want
happening on their platform. So Parler sprung up to meet that need. And so that's essentially,
I mean, there's other things that happened on Parler, but that's sort of the reason it came
into being. The social network gained attention this week for posts that planned and incited violence against U.S. lawmakers contributing to the riots on Capitol Hill. Apple
says those clashes were coordinated on Parler's site and in a statement on Saturday announced
the app's suspension from its store. You know, I was listening to an interview with the CEO of
Parler, and he certainly suggested that he wouldn't deplatform Trump, right? He doesn't think the decision to deplatform conspiracy theorist Alex Jones was the right one.
It's Parler, it's Twitter, it's Facebook, it's Google, it's Telegram, WhatsApp, whatever it
might be. You can't stop people and change their opinions by force, by censoring them. They'll just
go somewhere else and do it. So as long as it is legal, it's allowed.
So I suppose my question is: is there a scenario in which the alternative is worse here, that people will gravitate to these sites where there's little to no moderation?
Yeah, it might be worse, because it's kind of a cesspool, right, of conspiracy theories without any sort of moderation. But it's also not clear that the moderation
really worked, right? So even when Facebook was moderating some of these groups that were spreading
lies about the election, it was clear from the comments on the pages that even when Facebook
put a little flag saying like, this isn't actually true, that it didn't make any difference to the
people chatting; they were still believing it. There's evidence that, you know, putting these little flags doesn't
make a huge difference to people.
But what about the fact that Parler is just happy, or right now certainly its CEO is happy, to let a lot more people stay on Parler, even if people are organizing something?
People are upset.
They feel disenfranchised. They need their leaders to stop provoking this partisan hate.
They need to come together and have a discussion on a place like Parler.
Yeah. I mean, he definitely has a totally different approach, right? And that is something
that he's entitled to have. It's a private entity, and they can have whatever kind of chat they want.
Now, they have to probably assume that, like,
for things like violence,
the FBI is infiltrating those chats and monitoring them.
But, yeah, they can make that decision.
But they're also subject to the fact that, like, you know,
Amazon is about to kick them off.
In a post on Parler, CEO John Matze said Amazon will be shutting off its servers effective midnight.
The CEO says that could take a few days to find another host.
Which is another private company that just doesn't want to deal with them.
Right. And do you think that this will make a difference there?
Because, you know, I believe another one of these sites, Gab, was banned by Amazon Web Services, AWS, in the past.
And these companies just find some server to host them in some other country or something.
They do and they don't, right?
If you look at the history with the Daily Stormer, the Nazi website: after they were banned by all the big server platforms, they went overseas,
and now they basically don't exist anymore, right?
So it works in the short term, right?
Whether that works in the long term,
because maybe it builds this feeling of censorship
and that creates an even more potent, you know,
form of this type of thing,
I think we don't actually know yet.
Right, right, the unforeseen consequences.
Watching this sort of whack-a-mole play out on Friday on Twitter,
where like Trump was popping up on multiple different feeds and Twitter just silencing each feed,
I really couldn't help thinking to myself how indicative this was of the incredible amount of power these companies hold, the power that just a few billionaires have over public discourse.
And now we have this example where the president has essentially been kneecapped, where basically all he can do is online shopping and sending text messages.
What questions do you think people should be asking themselves about the power of these companies today? Yeah, I think this is a really good
illustration of something that I have been studying as a reporter for years, which is the
power of these big platform companies. They really police political speech and speech generally
worldwide, because we do so much of our communications using them. And I think it's a really important question to ask:
what are the conditions under which we're willing to let this happen? Because back when we had
television as our main media outlet, it was regulated, and they had sort of obligations to
show how much political advertising was happening.
And they actually had to disclose who bought it and not charge egregious rates.
And there's all sorts of things that these tech platforms are not held to.
And so I think it's just worth thinking about how, in the past, when we've seen powerful groups like this, at least in the U.S., we have really given this one
industry a free pass, where they really are not regulated. And it's a question that has
already been growing in Congress: what to do about that. And I don't know
that I know what the answer is, but it does seem like there has to be some obligations that they
have to the public, other than just the
obligations they have to their shareholders, because right now, they're really only operating
under the obligations to the shareholders. Right. And sort of speaking of this unchecked
power, you know, I know that these companies have tried to take some steps on their own.
They've tried to make some promises. And, you know, I imagine it's really easy to measure
and see Trump being deplatformed. But they've also made all these promises around more insidious
things that contribute to like a lot of misinformation floating around. And I know
your team has been working on a really interesting project around this. And I wonder if you can tell
me a little bit about what you found and how, you know, your findings there fit into the conversation that you and I are having right now.
Yeah. Yeah. So we at The Markup have been thinking about this exact question that we're discussing,
which is the sort of unaccountable power that the tech platforms have to regulate speech.
And thinking, how could we as journalists be an independent watchdog on that power? And we came to the
conclusion that we would build something. So we ended up doing a project called Citizen Browser,
where we spent the past year building an app. And then we paid more than 1,000 people across the U.S.
to install that app on their computers. And it allows us to see what is being promoted to them
in their Facebook feeds.
And we've just launched this project. We had our first story the day of the Georgia elections.
And we looked at what was happening up until the Georgia runoffs, and saw that in the weeks leading up to them, Facebook had changed the dial and allowed political advertising.
And what that meant was that the political news in the panelists' feeds
went from being mostly mainstream media to being about a third partisan content.
And so you could really see the impact of them flipping the dial.
And so I think this project is really interesting
because I think it's the first time we have a real window
to see what happens when they move the levers behind the scenes.
Right.
And when we're talking about regulation here, like so many of the conversations around regulation center around exactly what you're talking about here.
This opaqueness of how these algorithms actually work and the effect that it ultimately has on democracy.
Right.
Exactly.
And the thing that's weird about the tech industry is that we had to go to such extreme lengths to try to break through that black box, right? And we barely have,
right? We have 1,000 panelists; they have 2.7 billion users. So even the sample that we
have is tiny; it's only a very small peek behind the curtain. But if we were covering aviation,
right, we would go to the Federal Aviation Administration for records about safety
and inspections. And so we wouldn't have to do this kind of reporting that's so intense,
but there is no FAA for the tech industry. There's no entity out there collecting independent data about what they're doing.
And so we had to build our own tool, at great cost, in order to find out what they're doing.
Before we go today, you know, this has been this unprecedented week in the news,
and it felt like these four years of Trump's term really led up to what we saw unfold on Capitol Hill on Wednesday.
And with hindsight, what do you think the history books will end up saying about the way that social media giants handled that?
I think what we've learned here is that it's very much arbitrary. It's very much at the tech
company's discretion. And so we see these sort of wild swings, right, where I think it was not that
long ago that Mark Zuckerberg gave a huge speech in Washington, D.C. about free speech over everything.
And it was really directed at this question about Trump and his supporters spreading misinformation on their platform.
Some people believe that giving more people a voice is driving division rather than bringing people together. More people across the spectrum believe that achieving the political outcomes that they think matter is more important than
every person having a voice and being heard. And I think that that's dangerous.
And then, you know, one riot later, they changed their mind. And so I feel like the
lesson here is that they have a lot of power,
they use it at their discretion, and there are no checks and balances on it. And I think what
we'll look back and wonder at is how we let that happen for so long without any sort of
real oversight and accountability. All right, Julia, thank you so much for this conversation.
Really appreciate it. Thank you.
Okay, so before we let you go today, some news developments from Washington.
Late Sunday, Democratic House Speaker Nancy Pelosi told lawmakers that they will vote this week on a resolution calling on Vice President Mike Pence and the cabinet to invoke the 25th Amendment and remove President Donald Trump from office.
Pelosi added that after that, they, quote, will proceed with bringing impeachment legislation to the floor.
We'll be following all of this really closely, but that is all for today. Thanks so much for listening to Front Burner, and we'll talk to you tomorrow.