Radiolab - Post No Evil Redux
Episode Date: June 19, 2020
Today we revisit our story on Facebook and its rulebook, looking at what's changed in the past two years and exploring how these rules will impact the 2020 Presidential Election. Back in 2008 Facebook began writing a document. It was a constitution of sorts, laying out what could and what couldn't be posted on the site. Back then, the rules were simple, outlawing nudity and gore. Today, they're anything but. How do you define hate speech? Where's the line between a joke and an attack? How much butt is too much butt? Facebook has answered these questions. And from these answers they've written a rulebook that all 2.2 billion of us are expected to follow. Today, we explore that rulebook. We dive into its details and untangle its logic. All the while wondering what does this mean for the future of free speech? This episode was reported by Simon Adler with help from Tracie Hunte and was produced by Simon Adler with help from Bethel Habte. Special thanks to Sarah Roberts, Jeffrey Rosen, Carolyn Glanville, Ruchika Budhraja, Brian Dogan, Ellen Silver, James Mitchell, Guy Rosen, Mike Masnick, and our voice actor Michael Chernus. Support Radiolab today at Radiolab.org/donate.
Transcript
Okay, so in the layer cake of mayhem that we find ourselves in that metaphor makes no sense.
I want to update a story that we played a few years ago that is fascinating,
but definitely shifting and changing as we speak, and worth tracking, and yet hard to track with all the things that are happening right now.
This is a story that aired, I believe two years ago, but still very timely.
As you hear, a bit later in the episode we'll come back around and update it, bring it
up to the present.
Oh, wait, you're listening.
Okay.
All right.
Okay.
All right.
You're listening.
To Radiolab.
Radiolab.
From WNYC.
WNYC?
Yeah.
Hey, I'm Jad Abumrad.
I'm Robert Krulwich.
Radiolab.
And today we have a story about what we can say and what we f***ing can't.
And by the way, there's going to be a smattering of curse words here that we're not going
to bleep, which I think makes sense given the content of the story. And also there's some graphic scenes that, uh, if you've got kids with you, you may want
to sit this one out.
Yeah.
Anyway, the story comes to us from producer Simon Adler.
So let's start.
Can we start in 2008?
Sure.
How about with a song?
Yes, please.
Okay.
Rise up.
Rise up.
Demand Facebook cease their oppressive ways. Rise up. Rise up.
So December 27th, a sunny Saturday morning, this group of young to middle-aged women gathered in downtown Palo Alto. [singing, indistinct]
They're wearing these colorful hats and are singing and swaying directly in front of the glass-doored headquarters of Facebook.
Facebook.
[singing, indistinct]
Yes.
What's up?
It was a humble gathering of a few dozen women and babies.
That right there. Are you the organizer of this? That right there is one of the organizers of the gathering.
I'm Stephanie Muir.
And what do you call the event?
It's a Facebook nurse-in.
Nursing as in like breastfeeding.
The intent was really just to be visible and be peaceful and make a quiet point.
What point were they trying to make?
Well, Stephanie and this group of mothers, you know, they were on Facebook as many people
were and they'd have photos taken of themselves occasionally breastfeeding their babies.
They wanted to share with their friends what was going on so they would upload those photos
to Facebook.
And these pictures would get taken down, and they would receive a warning from Facebook for uploading
pornographic content, and people were really getting their backs up over this.
They wanted Facebook to stop taking their photos down, to say that, well, nudity is not allowed, but breastfeeding is exempt, period.
And put back our pictures right now! Now, what Stephanie couldn't have known at the time was that this small peaceful protest
would turn out to be...
This morning a face-off on Facebook.
One of the opening shots...
Facebook triggered a hornet's nest.
And what would become a loud...
Fuck you, Facebook. Fuck you, Facebook.
Fuck you, Facebook.
Fuck you.
And global battle.
Embattled Facebook CEO.
Facebook today playing defense.
And now I'm not talking about all the things
you've recently heard about, Russian interference,
and election meddling, or data breaches.
But rather something that I think is deeper
than both of those.
Free speech.
What we can say and what we can't say.
[news clip, indistinct]
What we can see and what we can't see.
[news clip, indistinct]
On the internet.
Thank you, Mr. Chairman.
Mr. Zuckerberg, I've got to ask you, do you subjectively prioritize or censor speech?
Congresswoman, we don't think about what we're doing as censoring speech.
But what really grabbed me was discovering that underneath all of this is an actual rule
book, a text document that dictates what I can say on Facebook, what you can say on Facebook,
and what all 2.2 billion of us can say on Facebook.
For everyone in the entire globe.
For everyone who's on Facebook.
One set of rules that all 2.2 billion of us
are expected to follow.
Is it an actual document?
It's a digital document, but yes,
it's about 50 pages if you print it off.
And in bullet points and if then statements, it spells out sort of a first amendment for
the globe, which made me wonder, like, what are these rules?
How were they written?
And can you even have one rule book?
Right, exactly.
And so I dove into this rule book and dug up some stories that really put it to the test.
Hmm, okay.
Yeah, I'm interested.
Now, how many stories are we gonna go through?
Three-ish.
Three-ish.
Okay.
All right, cool.
Particularly interested in the ish, but let's go ahead with the first one.
Well, so, uh, let's start back, uh, on that morning in 2008.
The morning that you could argue started it all.
Rise up, rise up.
Because in the building right behind those protesting mothers, there was a group of Facebook
employees sitting in a conference room trying to figure out what to do.
Um, cool.
So I was able to get in touch with a couple of former Facebook employees, one of whom was
actually in that room at that moment.
And now neither of these two were comfortable
being identified, but they did give us permission
to quote them extensively.
How's that?
Well, that'll take work for you.
Sound great.
Just so we have it, let's get to it.
So what you're going to hear here is an actor
we brought in to read quotes taken directly
from interviews that we did with these two
different former Facebook employees.
All right, ready.
So at the time when I joined them, there was a small group, 12 of us, mostly recent
college grads, who were sort of called the site integrity team.
Again, keep in mind, this was in the late 2000s.
Seismic changes this week in the internet hierarchy.
This was like the deep dark past.
MySpace.com is now the most visited website in the US.
Facebook had somewhere in the neighborhood of 10 million users.
We were smaller than MySpace.
The vast majority of them college kids.
And so in those early days, those 12 people, they would sit around in a sort of conference
like room with a big, long table, each of them in front of their own computer.
And things would come up onto their screen flag to Facebook.
And flag meaning, like, I, a user, saw something
that I thought was wrong.
Exactly, like reporting a piece of content
that you think violates the community standards.
This is Kate Klonic, she's a professor of law
at St. John's and she spent a lot of time
studying this very thing.
And she says in those early days,
what would happen is a user would flag a piece of content
and then that content along with an alert
would get sent to one of those people sitting in that room.
It would just pop up on their screen.
Most of what you were seeing was either
naked people blown off heads or things that there was
no clear reason why someone had reported
because it was like a photo of a golden retriever
and people are just annoying.
And every time something popped up onto the screen,
the person sitting at that computer
would have to make a decision,
whether to leave that thing up or take it down.
And at the time, if you didn't know what to do,
you would turn to your pod leader,
who was, you know, somebody who'd been around
nine months longer than you and asked,
what do I do with this?
And they would either have seen it before
and explained it to you, or you both would know and you'd google some things.
It really was just kind of an ad hoc approach. Was there any sort of written standard or any
common standard? Well, kind of. They had a set of community standards. But at the end of the day,
they were just kind of, that was one page long and it was not very specific.
Sorry, the guidelines were really one page long?
They were one page long.
And basically all this page said was, nudity is bad, so is Hitler.
And if it makes you feel bad, take it down.
And so when one of the people sitting in that room would have a breast feeding picture,
pop up on the screen in front of them, they'd be like, I can see a female breast,
so I guess that's nudity, and they would take it down.
Until...
Rise up!
Rise up!
Fight for the rights to have breastfeeding, though.
Anyway.
Now, a dozen or so people in front of their offices
on a Saturday, it probably wasn't causing Facebook
too much heartache.
But...
I thought, you know, hey, we have an opportunity here with, you know, over 10,000 members in
our group.
According to Stephanie Muir, those protesters were just a tiny fraction of a much larger
online group who had organized ironically enough through Facebook.
So to coincide with the live protest, I just, you know, typed up a little blurb encouraging
our members that were in the group to do a virtual
nurse-in.
A virtual nurse-in.
Right.
What we did, they posted a message asking their members to, for one day, change their
profile avatar to an image of breastfeeding and then change their status to the title of
our group.
Hey Facebook, breastfeeding is not obscene.
And it caught on.
The social networking website is under fire
for its policy on photos of women breastfeeding
their children.
12,000 members participated and the media requests
started pouring in.
The Facebook group called, Hey Facebook, breastfeeding
is not obscene.
I did hundreds of interviews for print,
Chicago Tribune, Miami Herald, Time Magazine, New York Times, Washington Post.
The Internet is an interesting phenomenon.
Dr. Phil. It was a media storm, and eventually, perhaps as a result of our group and our efforts, Facebook was forced to get much more specific about their rules.
So for example, by then nudity was already not allowed on the site.
But they had no definition for nudity.
They just said no nudity.
And so the site integrity team, those 12 people at the time, they realized they had to
start spelling out exactly what they meant.
Precisely.
All of these people at Facebook were in charge of trying to define nudity.
So I mean, yeah, the first cut at it was visible male and female genitalia, and then visible
female breasts. And then the question is, well, okay, how much of a breast needs to be
showing before it's nude? And the thing that we landed on was if you could see essentially
the nipple and areola, then that's nudity.
And it would have to be taken down.
Which theoretically at least would appease these protestors, because now when a picture
would pop up of a mother breastfeeding, as long as the child was blocking the view of
the nipple and the areola, they could say cool, no problem.
Then you start getting pictures that are women with just their babies on their chest,
with their breasts bare.
Like, for example, maybe a baby was sleeping on the chest of a bare breasted woman
and not actively breastfeeding.
Okay, now what?
Like, is this actually breastfeeding?
No, it's actually not breastfeeding.
The woman is just holding the baby and she has her top off.
No, but she was clearly just breastfeeding the baby.
Well, like I was before.
Well, I would say it's sort of like kicking a soccer ball.
Like a photo of someone who has just kicked the soccer ball,
you can tell the ball is in the air,
but there's no contact between the foot
and the ball in that moment potentially.
So although it is a photo of someone kicking a soccer ball,
they are not in fact kicking the soccer ball in that photo.
That's a good answer.
And this became the procedure or the protocol
or the approach for all these things was,
we have to base it purely on what we can see in the image.
And so, they didn't allow that to stay up under the rules
because it could be too easily exploited
for other types of content, like nudity or pornography.
We got to the only way you could objectively say
that the baby and the mother were engaged in breastfeeding is if the baby's lips were touching the woman's nipple. So they
included what you could call like an attachment clause, but as soon as they got that rule in place,
like you would see, you know, a 25 year old woman and a teenage looking boy, right? And like,
what the hell is going on there? Oh yeah, it gets really weird if you like start entering
into like child age.
I wasn't even gonna bring that up
because it's kind of gross.
It's like breastfeeding porn.
Is that a thing?
Are there sites?
Like, apparently.
And so this team they realized
they needed to have a nudity rule
that allowed for breastfeeding,
but also had some kind of an age cap.
So, so then we were saying,
okay, once you've progressed past infancy, then we believe
that it's inappropriate. But then pictures would start popping up on their screen and they'd
be like, wait, is that an infant? Like, where's the line between infant and toddler? And so
the thing that we landed on was, if it looked like the child could walk on his or her own,
then too old. Big enough to walk? Too big to breastfeed. Oh, that could be a two month.
Yeah, that's like a year old in some cases.
Yeah.
And like the World Health Organization recommends breastfeeding
until you know, like 18 months or two years,
which meant there were a lot of photos still being
taken down.
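To make the shape of this rulebook concrete, here is a rough sketch in Python of the kind of if-then logic being described at this point in the story. The field names, the function, and the structure are invented for illustration; Facebook's actual standards are a prose document, not code.

```python
from dataclasses import dataclass

@dataclass
class FlaggedImage:
    # Only what a reviewer can literally see in the image; the episode
    # stresses that decisions were based purely on what is visible.
    nipple_or_areola_visible: bool
    lips_touching_nipple: bool   # the "attachment clause"
    child_could_walk: bool       # stand-in for "past infancy"

def keep_up(img: FlaggedImage) -> bool:
    """Sketch of the early nudity rule with the breastfeeding exemption."""
    if not img.nipple_or_areola_visible:
        return True   # no visible nudity, nothing to remove
    # Visible nudity: only the breastfeeding exemption can save the photo.
    is_breastfeeding = img.lips_touching_nipple and not img.child_could_walk
    return is_breastfeeding

# Bare-chested mother with a sleeping, non-nursing baby: comes down.
print(keep_up(FlaggedImage(True, False, False)))   # False
# Nursing infant, mouth attached, too young to walk: stays up.
print(keep_up(FlaggedImage(True, True, False)))    # True
```

The point of the sketch is just that each amendment described here, like the attachment clause or the walking test, shows up as one more condition a reviewer has to check against what is literally visible in the image.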
Within, you know, days, we're continuing to hear reports
from people that their photographs were still being
targeted.
But Facebook did offer a statement saying,
you know, that's where we're going to draw the line.
Facebook isn't budging on its policy.
And keep in mind through this whole episode.
Is this perhaps the next big thing?
The Facebook.com.
The company was growing really, really fast.
It seems like almost everyone is on it.
And there just got to be a lot more content. When we first launched we were hoping
for you know maybe 400 500 people and now we're at a hundred thousand so who
knows where we're going now. Thousands more people are joining Facebook every
day. 60 million users so far with a projection of 200 million by the end of the
year and now more people on Facebook than the entire US population. Not just within the United States, but also it was growing rapidly more international.
You know, you were getting folks in India.
It's huge in India and Turkey.
Facebook.
Facebook is going to Iran.
Facebook.
It's getting big throughout the EU.
Korea's joined the Facebook.
So they have more and more content coming in from all these different places in all these different languages.
How are we going to keep everybody on the same page?
And so once they saw that this was the operational method for dealing with this,
creating this, like, nesting set of exceptions and rules, these clear things that had to be there or had to not be there in order to keep content up or take it down, that, I think, became their procedure.
And so this small team at Facebook got a little bigger and bigger jumped up to 60 people
and then 100 and they set out to create rules and definitions for everything.
Can we go through some of the sort of the ridiculous examples?
Yes, please.
Let's go over here.
Okay.
So, gore. Gore.
You mean violence kind of gore?
Yes.
So the gore standard was, headline:
we don't allow graphic violence and gore.
And then the shorthand definition they used was
no insides on the outside.
No guts, no blood pouring out of something.
Blood was a separate issue.
There was an excessive blood rule.
They had to come up with rules about bodily fluids.
Semen, for example, would be allowed
in like a clinical setting,
but like, what does a clinical setting mean?
And you know, does that mean if someone is in a lab coat?
One of my favorite examples is like,
how do you define art?
Because as these people are moderating,
they would see images of naked people
that were paintings or sculptures come up.
And so what they decided to do is say art with nakedness can stay up.
Like, it stays up if it is made out of wood, made out of metal, made out of stone.
Really?
Yeah, because how else do you define art?
You have to just be like, is this what you can see with your eyeballs?
And so from then on, as they run into problems, those rules just constantly get updated.
With your constant amendments.
Yeah, constant amendments.
New problem, new rule.
Another new problem, updated rule.
In fact, at this point, they're amending these rules up to 20 times a month.
Wow.
Really?
Yeah, take for example those rules about breastfeeding. In 2013, they removed
the attachment clause, so the baby no longer needed to have its mouth physically touching the
nipple of the woman. And in fact, one nipple and/or areola could be visible in the photo.
But not two. Only one. Then 2014, they make it so that both nipples
or both areolae may be present in the photo.
So this is what happens in American law all the time.
It's this very thing.
Yes.
Yeah.
You know, it sounds a lot like common law.
So common law is the system dating back to early England
where individual judges would make a ruling,
which would sort of be a law, but then that law would be amended or evolved by other judges.
So the body of law was sort of constantly fleshed out in face of new facts.
Literally every time this team at Facebook would come up with a rule that they thought
was airtight, kerplop, something would show up that they weren't prepared for, that the rule hadn't
accounted for.
As soon as you think, yeah, this is good.
Like, the next day, something shows up to show you, yeah, you didn't think about this.
For example, sometime around 2011, this content moderator is going through a queue of things.
Accept, reject, accept, escalate, accept.
And she comes upon this image.
Oh my God.
The photo itself was a teenage girl,
African by dress and skin,
breastfeeding a goat, a baby goat.
And the moderator throws her hands up and says,
what the fuck is this?
And we Googled breastfeeding goats
and found that this was a thing.
It turns out it's a survival practice.
According to what they found, this is a tradition in Kenya that goes back centuries that in
a drought, a known way to help your herd get through the drought is to, if you have a woman
who's lactating, to have her nurse the kid, the baby goat, along with her human kid.
And so there's nothing sexual about it.
It's just good animal husbandry.
Good. And theoretically, if we go point-by-point
through this list, it's an infant,
it sort of could walk, so maybe there's an issue there.
But there's physical contact between the mouth
and the nipple.
But, but obviously,
breastfeeding as we intended anyway meant human infants.
And so in that moment, what they decide to do
is remove the photo.
And there was an amendment, an asterisk
under the rule, stating animals are not babies.
We added that so in any future cases,
people would know what to do.
What? What they removed, they discovered
was culturally appropriate and a thing that people
do, and they decided to remove the photo?
Yeah.
That outraged individual is our editor, Soren Wheeler.
Why?
Why didn't we make an exception?
Because, because when a problem grows large enough, you have to change the rules.
If not, we don't.
This was not one of those cases.
The juice wasn't worth the squeeze.
And like, if they were to allow this picture, then they'd have to make some rule
about when it was okay to breastfeed an animal
and when it wasn't okay.
This is a utilitarian document.
It's not about being right 100% of the time.
It's about being able to execute effectively.
In other words, we're not trying to be perfect here,
and we're not even necessarily trying
to be 100% just or fair, we're just trying to make something that works.
One, two, three, four, five, six, seven, eight.
And when you step back and look at what Facebook has become, like from 2008 to now, in just
10 years. Simon, I've just arrived at the
Accenture Tower here in Manila.
I don't know how many floors it is. One, two...
The idea of a single set of rules that works
that can be applied fairly?
It's just a crazy, crazy concept.
15, 16, 17.
Because they've gone from something like 70 million users
to 2.2 billion.
It's hard to take count, I would say it's about 30 floors.
And they've gone from 12 folks sitting in a room deciding what to take down or leave up
to somewhere around 16,000 people.
So there's a floor in this building where Facebook supposedly outsources content moderation.
And so around 2010 they decided to start outsourcing some of this work, to places like Manila, where
you just heard reporter Aurora Almendral, as well as...
I mean, I would guess that there are thousands of people in this building.
Dublin, where we sent reporter Garrett Stack.
Oh, I can see in where they got their delicious Facebook treats cooked.
Everybody's beavering away.
And we sent them there to try to talk to some of these people who, for a living, sit at a computer and collectively click through around a million flagged bits of content that pop up onto their screens every day.
Wow. I'm just curious, like, what's that like?
Well
Questions.
We found out pretty quickly.
Who do you work for?
None of these folks were willing to talk to us about what they do.
So there's a lot of running away from me happening.
Hey, lad, sorry to bother you.
You guys work in Facebook?
No, I don't.
Sorry, sorry.
You happen to work in Facebook, by any chance?
No, I don't.
I sorry to bother you.
Do you work inside?
No, sorry.
What?
Do you work in Facebook?
No.
I mean, like, you just came out of there.
I know you're lying.
In fact, most people wouldn't even admit they work for the company.
Like, what's, is there something wrong about being in there?
Is it, like, an NDA that they signed?
Well, yeah. So, so when I finally did find someone willing,
uh, willing to talk to me.
Do you want to be named or do you not, or do you not want to be named?
I'd rather not.
That's totally fine.
You know, I'm still in the industry. I don't want to lose my job over this shit, you know.
To explain that he and all the other moderators like him
were forced to sign these non-disclosure agreements,
stating they weren't allowed to admit
that they worked for Facebook.
They're not allowed to talk about the work they do.
My contract prohibited me from talking
about what content moderation was.
Why?
Several reasons.
One is that up until recently, Facebook wanted to keep secret what these rules were so
that they couldn't be gamed.
At the same time, it creates a sort of separation between these workers and the company
which, if you're Facebook, you might want.
I knew I signed up to monitor graphic images.
Just given the nature of the job.
But, you know, I didn't really, you know, you don't really know the impact that that's
gonna have on you until you go through it.
So this guy I talked to, he got his first contract doing this work several years back, and for
the duration of it, about a year, he'd show up to his desk every morning, put on his headphones, ignore, delete, delete, delete, just like this, like this, like this,
4,500, 5,000 cases every day.
And it was just image and decision, image decision.
With 5,000 a day you just said?
Yeah, it was like, it was a lot of cases.
Yeah, he said basically he'd have to go through an image
or some other piece of content every three or four seconds.
Wow, all day long?
All day, eight hours a day.
Well, if I can ask, what kind of things did you see? I don't know if this is even, I don't even know if this is, like, radio-worthy.
It's too, I think it's too X-rated.
You know, clicking through, he came across unspeakable things.
From heads exploding, to, you know, people being squashed by a tank, to, you know, people
in cages being drowned to like a 13 year old girl having sex with an eight year old boy.
And it's not just once.
It's over and over and over and over.
When did you, did this like keep you up at night? Did you, did this?
Absolutely. Absolutely. 100%. It kept me up at night.
He'd catch himself thinking about these videos and photos when he was trying to relax.
He had to start avoiding things. It was too real. I saw that. It's classic PTSD.
A different moderator I spoke to described it as seeing the worst side of humanity.
You see all of the stuff that you and I don't have to see
because they are going around playing cleanup.
What a job.
And it's worth noting that more and more of this work is being done in an automated fashion,
particularly with content like gore or terrorist propaganda.
They're getting better.
You can automate that?
Yeah, they, through computer vision, they're able to detect hallmarks of a terrorist video
or of a gory image. And with terrorist propaganda,
they now take down 99% of it before anyone flags it on Facebook.
But moving on to our second story here,
there is a type of content that they are having
an incredibly hard time, not just automating,
but even getting their rules straight on.
And that's surrounding hate speech.
Oh, good.
Some more laughs coming up.
Well, there will be laughter.
Oh, really?
There will be comedians.
There will be jokes.
Oh, comedians.
Hey.
All right.
So we take a break and then come right back.
No, I think we're going to keep going.
Okay.
Testing 12345.
Testing 12345.
I'm Simon Adler. So a couple months back. I think it's okay. We sent our pair of interns,
Carter Hodge and Liza Yeager, to this cramped, narrow little comedy club.
The kind of place with like,
Super expensive hotel.
I know.
$15 smashed rosemary cocktails.
So, I know.
What kind of it?
We did not need to get a mug in the bin.
High top tables.
The AC is dripping on me.
But still, kind of a dive.
And we sent them there to check out someone else who'd found a fault line in Facebook's
rulebook.
It's exciting.
We're going to keep it going right along.
The next comic to come to the stage, please give it up for Marcia Belsky!
Thank you.
I get so mad. I feel like when I first moved to the city, I was such a carefree brat.
You know, I was young and I had these older friends, which I thought was like very cool.
And then you just realized that they're alcoholics.
You know?
She's got dark curly hair, was raised in Oklahoma.
I was raised Jewish, so when you raised Jewish,
you read about Anne Frank a lot.
A lot a lot.
When you read about Anne Frank, this will get funny.
She...
How did you decide to become a comedian?
It was kind of the only thing that ever clicked with me.
And especially political comedy. I used to watch the Daily Show every day.
And back in 2016, she started this political running bit that I think can be called sort of
absurdist feminist comedy.
Now a lot of people think that I'm like an angry feminist, which is weird.
This guy called me a militant feminist the other day.
And I'm like, OK, just because I am training a militia
of women in the woods.
At first, I just had this running bit online on Facebook
and Twitter.
She was tweeting and posting jokes.
You know, like we have all the Buffalo wild wings
surrounded, you know, things like that.
Eventually took this bit on stage, even wrote some songs.
All older white men should die, but not my dad. No, no, not my dad.
No, no, not my dad.
Anyhow, so about a year into this running bit,
Marshall was bored at work one day
and logs onto Facebook.
But instead of seeing her normal news feed,
there was this message that pops up.
It says you posted something
that discriminated along the lines of race, gender,
or ethnicity group.
And so we removed that post.
And so I'm like, what could I possibly have posted?
I really, I thought it was like a glitch.
But then she clicked continue and there highlighted
was the violating post.
It was a photo of hers.
Well, what is the picture?
Can you describe it?
The photo is me as what can only be described as the cherub.
Cute little seven-year-old with big curly hair
and she's wearing this blue floral dress,
her teeth are all messed up.
And into the photo, Marcia had edited in a speech bubble that just says, kill all men.
And so it's funny, you know, because I hate, I hate, it's funny, you know, it just made
whatever.
So, um, I thought it was ridiculous.
She searched through her library of photos and found that kill-all-men image.
And I posted it again.
Immediately after? Like, yeah. And it got removed again.
And this time, there were consequences.
I got banned for three days after that.
Then after several other bans.
Shoot forward, this is months later.
A friend of hers had posted an article and underneath it in the comments section,
there were guys posting just really nasty stuff.
So I commented underneath those comments, Men are Scum, which was very quickly removed.
When, how long did you get banned for at this time?
Thirty days.
Wow.
Yeah.
I was dumbfounded.
So there's a rule somewhere that if I type men are scum, you take it down.
Yes.
I'm like, what could it be?
And so, Marcia called on her, quote, militia of women.
Exactly.
To find out, like, is this just me?
Female comedians who were sort of like mad on my behalf started experimenting, posting Men are Scum.
To see how quickly it would get removed,
and if it would be removed every time, and it was.
So, they started trying other words.
Well, yeah.
To find out where the line was.
My friend put Men are Duscum, that got removed.
Men are the worst.
Removed and banned.
This one girl put Men are septic fluid.
Banned.
But.
We're only at the middle of the saga.
It doesn't end there.
Because now she's really like what the hell is going on.
Is this sexism?
So I just start doing the most bare minimum amount of investigating.
She's googling around trying to figure out what these policies are, and pretty quick,
she comes across this leaked Facebook document.
So this is when I lose my mind.
This is when Mark Zuckerberg becomes my sworn nemesis for the rest of my life.
Because what she'd found was a document Facebook used to train their moderators.
And inside of it, in a section detailing who Facebook protected from hate speech,
there was a multiple choice question that said,
who do we protect?
White men or black children?
And the correct answer was white men, not black children.
Not even kidding.
White men were protected.
White men were protected.
Black children are not.
That's not a good look.
It's racist.
Something's going on here.
There is absolutely some sort of unaddressed,
biased or systematic issue at Facebook.
Hey.
Hello.
How are you?
I'm doing well.
Thank you so much for being on the video.
Yeah, no.
Good to see you.
So not long after sitting down with Marcia,
Facebook invited me to come out to their offices
in California and sit down with them.
I'm gonna eat one cookie and then...
Oh, they're little, I think I get two.
Typer.
Could I just get your name and your title and...
I'm Monika Bickert and I lead the policies for Facebook.
Monika Bickert is in charge of all of Facebook's rules,
including their policies on hate speech
And so I asked her like why would there be a rule that protects white men, but not black children
We have, we have made our hate speech policies, let me, let me rephrase that. Our hate speech policies have become more detailed over time.
But our main policy is you can't attack a person or group of people based on a protected
characteristic, a characteristic like race, religion, or gender.
So this takes a couple of beats to explain, but the gist of it is that Facebook borrowed
this idea of protected classes straight from US anti-discrimination law.
These are the laws that make it so that you can't not hire someone,
say, based on, like, their religion, their ethnicity, their race.
And so on Facebook, you can't attack someone based on one of these characteristics.
Meaning you can't say men are trash, nor could you say women are trash.
Because essentially you're attacking all men for being men.
Oh, is it the all?
Can I say Bob is trash?
Yeah, you can say Bob is trash,
because as my story's explained to me.
The distinction is that in the first instance,
you're attacking a category.
In the second instance, you're attacking a person,
but it's not clear that you're attacking that person
because they are a member of a protected category.
Oh, so Bob might be trash for reasons that don't have anything to do with him being a man.
Yeah.
He just might be annoying.
Right.
Okay, so that explains why you take down men are scum, but why would you leave up black
children are scum?
Why would that not get taken down?
So traditionally, we allowed speech once there was some other word in it that made it about something other than a protected characteristic.
In Facebook jargon, these are referred to as a non-protected modifier.
Which just means literally nothing to you.
Give us an example of this.
So traditionally, if you said, I don't like [this religion] cab drivers, cab driver would be the non-protected modifier, because employment is not a
protected category.
And so what the rule stated was, when you add this non-protected
modifier to a protected category, in this case, the cab driver's religion,
we would allow it because we can't assume that you're hating this person because of his
religion.
You actually just may not like cab drivers.
So in the case of black children, children is modifying the protected category of black.
And so children trumps black.
Age is a non-protected category. Okay. So, children becomes a non-protected modifier, and their childness trumps their blackness.
You can say whatever you want about black children.
Whereas in the case of white men, you've got gender and race, both protected, so you can't attack them.
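As a rough illustration of the modifier logic being described here, here is a minimal sketch in Python. The category names and the function are invented for this example and are not Facebook's terms or code; it just encodes the old rule that a single non-protected attribute made an attack allowable.

```python
# Hypothetical illustration of the old "non-protected modifier" rule,
# using made-up category names, not Facebook's actual terms or code.
PROTECTED = {"race", "religion", "gender", "sexual orientation",
             "national origin", "ethnicity"}

def removed_under_old_rule(target_attributes: set[str]) -> bool:
    """An attack came down only if every attribute describing the targeted
    group was a protected characteristic; one non-protected modifier
    (age, occupation, etc.) made the post allowable."""
    return all(attr in PROTECTED for attr in target_attributes)

print(removed_under_old_rule({"gender"}))                  # "men are scum": removed
print(removed_under_old_rule({"race", "gender"}))          # "white men ...": removed
print(removed_under_old_rule({"race", "age"}))             # "black children ...": stays up
print(removed_under_old_rule({"religion", "occupation"}))  # "[religion] cab drivers": stays up
```

Under this sketch, the "children" and "cab drivers" modifiers are what tip the decision, which is the asymmetry the episode goes on to discuss.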
That's just a bizarre rule.
I would think you would go the other direction.
The protected class would outweigh the modifier.
Well, they made this decision, as they explained to me,
because their default was to allow speech.
They were really trying to incorporate or nod to the American free speech tradition.
And so there's a whole lot of stuff out there that none of us would defend as a valuable
speech, but didn't rise to the level of stuff that we'd say, this is so bad we're going
to take it down. And in this case, their concern was, we're all members of like, you know, at
least half a dozen protected categories. Like we all have gender, we all have sexual orientation. And if you, if the rule is that any time a protected class is mentioned, it could be hate speech. What you
are doing at that point is opening up just about every comment that's ever made about
anyone on Facebook to potentially be hate speech. Then you're not left with anything, right?
No matter where we draw this line,
there are going to be some outcomes that we don't like.
There are always going to be casualties.
That's why we continue to change the policies.
And in fact, since Marcia's debacle,
they've actually updated this rule.
So now, black children are protected
from what they consider the worst forms of hate speech.
Now, our reviewers take how severe the attack is into consideration.
But despite this, there are still plenty of people...
That is flawed, because you are a social network.
...including Marcia, who think this still just isn't good enough.
There are not systematic efforts to eliminate white men in the way that there are other groups. That's why you have protected groups.
She thinks white men and heterosexuals should not be protected.
Protect the groups who are actually victims of hate speech.
Makes sense.
Yeah, because in sort of hate speech, or thinking about hate speech, there's this idea of privilege, or of historically disadvantaged groups, and that those historically disadvantaged groups should have more protection because of being historically disadvantaged.
And the challenge with that that was presented to me was,
okay, in the 1940s,
you had Japanese soldiers killing millions of Chinese during World War II.
At that same time, you had Japanese-American citizens, being put into internment camps.
And so we had to ask ourselves a question, like,
are the Japanese a historically advantaged
or disadvantaged group?
Japanese Americans pretty easy to make a case
that they were disadvantaged,
but in China, it's a totally different story.
And this happened at the exact same moment.
So you've got two different places,
two different cultural stories.
And when you have a website like Facebook, this transnational community, they realized or
they decided that ideas of privilege are so geographically bound that there is no way
to effectively weigh and consider who is privileged above who and decided therefore that we are not going to allow historical advantage or historical privilege into the equation at all.
And I think it's very important to keep in mind here.
These moderators only have, like, four or five seconds
to make a decision. In those four seconds, is there
enough time to figure out where in the world someone is, particularly given IP addresses
can easily be masked? Go back where you came from. Is it enough time to figure out a person's
ethnicity? White children are better than black children. On top of that, we often don't know an individual's race.
Straight people suck.
Other categories are even less clear, like sexual orientation.
And they just realized it would be next
to impossible to get anybody to be able to run
these calculations effectively.
When we were building that framework,
we did a lot of tests, and we saw sometimes that it was just too hard for our reviewers
to implement a more detailed policy consistently.
They just couldn't do it accurately.
So, we want the policies to be sufficiently detailed,
to take into account all different types of scenarios,
but simple enough that we can apply them consistently and accurately around
the world. And the reality is, anytime that the policies become more complicated, we see
dips in our consistency.
What Facebook's trying to do is take the first amendment, this high-minded lofty legal
concept, and convert it into an engineering manual that can be executed every four seconds
for any piece of content from anywhere on the globe.
And when you've got to move that fast, sometimes justice loses.
That's the tension here.
And I just want to make sure I emphasize that these policies, they're not going to please
everybody.
They often don't please everybody that's working on the policy team at Facebook, but if
we want to have one line that we enforce consistently, then it means we have to have some
pretty objective black and white rules.
[indistinct crosstalk, laughter] Oh, yeah, yeah, yeah.
But when we come back, those rules, they get toppled.
This is Danny from Denver, Colorado.
Radio Lab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding
of science and technology in the modern world.
More information about Sloan at www.Sloan.org.
Jad. Robert. Radiolab.
Back to Simon Adler.
Facebook. Free speech.
So as we just heard before the break, Facebook is trying to do two competing things
at once. They're trying to make rules that are just, but at the same time can be reliably
executed by thousands of people spread across the globe in ways that are fair and consistent.
And I would argue that this balancing act was put to the test April 15th 2013.
Hey Carlos, we have some breaking news. Otherwise, I wouldn't cut you off so abruptly.
Monday, April the 15th, 2013, just before three in the afternoon.
Two pressure-cooker bombs ripped through the crowd near the finish line of the Boston Marathon. As the dust begins to settle... Oh my god. Oh my god.
People are springing into action.
This one man in a cowboy hat sees a spectator who's been injured,
picks him up, throws him in a wheelchair.
And as they're pushing him through the sort of ashy cloud,
there's a photographer there and he snaps this photo.
And the photo shows the runner in the cowboy hat
and these two other people pushing this man
who his face is ashen from all of the debris,
his hair is sort of standing on end
and you can tell that actually the force of the blast
and then the particles that got in there
are actually holding it in this sort of wedge shape.
And one of his legs is completely blown off.
And the second one is blown off below the knee other than the femur bone sticking out and then sort of skin and muscle and tendons.
It's horrific.
Meanwhile, from the CBS Bay Area studio, on the other side of the country, I remember snippets
of the day.
Facebook employees were clustering around several desks, staring at the computer screens,
watching the news break.
I have memories of watching some of the coverage.
Chilling new images just released of the Boston bombings.
I remember seeing the photo published online,
and it wasn't long after that.
Someone had posted it on Facebook.
From the folks I spoke to, the order of events here are a little fuzzy,
but pretty quickly, this photo is going viral.
And we realized we're going to have to deal with it.
This image is spreading like wildfire across their platform.
It appears to be way outside the rules they'd written,
but it's in this totally new context.
So they got their team together and sat down in a conference room.
I don't know.
There was probably eight or ten people
thinking about, like, should we allow it?
Or should they take it down?
According to their rules.
Yeah.
So, if you recall, the no insides on the outside's definition that we had in place, meaning
you can't see like people's organs or that sort of thing, and if you can, then we wouldn't
allow it.
And in this photo, you could see, you could definitely see bone.
And so, by the rules, the photo should obviously come down.
Yep.
However, half the room says, no.
The other people are saying, this is newsworthy.
Essentially, this photo is being posted everywhere else.
It's important.
We need to suspend the rules.
We need to make an exception, which immediately receives pushback.
Well, I was saying that what we've prided ourselves on
was not making those calls, and there are no exceptions.
There's either mistakes or improvements.
We made the guidelines for moments like this,
to which the other side shoots back.
Oh my God, are you kidding me?
Like the Boston Globe is publishing this all over the place,
and we're taking it down, like, are you fucking kidding me? Damn the guidelines, let's have common sense here, let's be humans, we know that this is important.
And yeah, they're kind of, they're right, but the reality is like if you say, well we allowed it because it's newsworthy,
how, how do you answer any of the questions about any of the rest of the stuff?
In other words, this is a Pandora's box. And in fact, for reasons that aren't totally clear, team consistency, team follow-the-rules, eventually wins the day. They decide to take the photo down.
But before they can pull the lever, word starts making its way up the chain internally within Facebook.
According to my sources, an executive under Zuckerberg sent down an order.
We were essentially told,
make the exception.
I don't care what your guidelines say,
I don't care what your reason is,
the photo stands.
You're not taking this down.
Yes, yes, that's what happened.
This decision means that Facebook has just become a publisher. Maybe they don't think they have, but they've made a news judgment. And
just willy-nilly, they've become CBS, ABC, New York Times, Herald Tribune, Atlantic Monthly,
and all these other things. All at once, they just become a news organization. Yeah, and this brings up a legal question
that's at the center of this conversation about free speech.
Like is Facebook a sort of collective scrapbook for us all
or is it a public square where you should be able
to say whatever you want?
Or yeah, is it now a news organization?
[clip, indistinct]
Let me get, I'm sorry to interrupt, but let me get the one final question that
kind of relates to what you're talking about in terms of what exactly Facebook is.
And this question has been popping up a lot recently.
In fact, it even came up this past April when Zuckerberg was testifying in front of Congress.
I think about 140 million Americans get their news from Facebook.
So which are you?
Are you a tech company?
Are you the world's largest publisher?
Senator, this is a, I view us as a tech company because the primary thing that we do is build
technology and product.
You said you're responsible for your content, which makes it kind of a publisher, right?
Well, I agree that we're responsible for the content
But I don't think that that's incompatible with fundamentally at our core
Being a technology company where the main thing that we do is have engineers and build products
Basically Zuckerberg and others at the company are arguing no. They're not a news organization
Why what would be the downside of that?
Well, Facebook currently sits on this little idyllic legal island where they can't be held
liable for much of anything.
They're subjected to few regulations.
However, were they to be seen in the eyes of the court as a media organization, that could
change.
But setting that aside, what really strikes me about all of this, is here you have a company that really, up until this point,
has been crafting a set of rules that are both as objective as possible and can be executed as
consistently as possible. And they've been willing to sacrifice rather large ideas in the name of this.
Take, for example, privilege, which we talked about; they decided it was too
geographically bound to allow for one consistent rule. But if you ask me, there's nothing more
subjective or geographically bound than what people find interesting or important, what people
find newsworthy. And I'll give you a great example of this that happened just six months
after the Boston Marathon bombing, when this video starts being circulated out of Northern
Mexico. And it's a video of a woman being grabbed and forced onto her knees in front of
a camera, and then a man with his face covered grabs her head, pulls her head back and slices
her head off right in front of the camera.
And this video starts being spread.
I can't count how many times, like, just reading my Twitter feed, I've been like, ah, you know...
One person who came across this video, or at least dozens of others like it, was Shannon Young.
My name is Shannon Young.
I am a freelance radio reporter.
I've been living here in Mexico for many years now.
And her beat is covering the drug war. And, doing so, years back she noticed this strange phenomenon.
It first caught my attention in early 2010. She'd be checking social media. You know, you're scrolling through your feed and
you know, you'd see all this news. People say, ah, there was this three-hour gun battle and
intense fighting all weekend long.
Folks were posting about clashes between drug cartels, government forces, but then when
Shannon would watch the news that night, she'd see reports on the economy and soccer
results, but the media wasn't covering it.
There'd be no mention of these attacks.
Nothing to do with the violence. And so she and other journalists tried to get to the bottom of this.
Reporters in Mexico City would contact the state authorities, and, you know, public information officers, and they'd be like, shootings?
Bombings? What are you talking about? Nothing's going on. We have no reports of anything. These are just internet rumors.
The government even coined a term for these sorts of posts. The famous phrase at the time was collective psychosis.
These people are crazy.
Because you know, they didn't want the situation
to seem out of control.
But then a video was posted.
Yeah.
It opens, looking out the windshield of a car on a sunny day.
The landscape is dry, dusty,
and the video itself is shaky, clearly shot on a phone.
And then the woman taping starts talking.
And this woman, she just narrates as they drive along this highway. She pans the phone from the passenger window to the windshield,
focusing in on these two silver destroyed pickup trucks.
And she's saying, look at these cars over here.
They're shot up.
Oh, look here. Look here.
This 18-wheeler is, you know, totally abandoned. It got shot up.
At one point, she sticks the phone out the window to show all of the bullet casings littering the ground.
And she just, you know, turned the official denial on its head. The government was saying there's no violence, and here were cars riddled with bullets.
It was impossible to dismiss.
And from then on, you had more and more citizens,
citizen journalists uploading anonymously,
video of the violence.
The violence.
The violence.
These low-fi, shaky shots of shootouts, dismemberments, beheadings, I mean bodies hanging,
dangling off of overpasses, to prove to the world that this was really happening.
It's a cry for help.
Yeah. Which brings us back to that beheading video we mentioned a bit earlier.
Yeah, that video of the beheading, a lot of people are uploading it, condemning the violence
of the drug cartels.
And when it started showing up on Facebook, much like with the Boston Marathon bombing
photo, this team of people, they sat down in a room, looked at the policy, weighed the
arguments.
And my argument was, it was okay by the rules during the Boston bombing.
Why isn't it okay now?
Particularly given that it could help.
Leaving this up means we warn hundreds of thousands of people
of the brutality of these cartels.
And so, we kept it up.
However,
it's fucking wrong!
It's wrong!
I think it's utterly irresponsible
and in fact quite despicable of them to put...
When people found out...
I'm talking, I have little neighbor kids
that don't need to see shit like that.
Backlash.
Is there really any justification for allowing these videos to...
People as powerful as David Cameron weighed in on this decision.
Today, the Prime Minister strongly criticized the move, saying we have to protect children
from this stuff.
David Cameron tweeted, it's irresponsible of Facebook to post beheading videos.
People were really upset because of what it was showing.
And so, according to my sources, some of the folks involved in making this decision to
leave it up were once again taken into an executive's office.
And so we went up and there was a lot of internal pressure to remove it, and I'd go to my boss
and say, hey, look, this is the decision we made.
I recognize this is controversial.
I wanna let you know why we made these decisions.
And they made their case.
There are valid and important human rights reasons
why you would want this to be out there
to show the kind of savagery
and she vehemently disagreed with that.
They took another approach arguing
that if we take this down,
you're deciding to punish people
who are trying to raise awareness.
Again, she wasn't budging.
And just didn't get... didn't get past that.
And ultimately, I was overruled and we removed it.
Just because there was pressure to do so.
The same people that six months prior told them to leave it up because it was newsworthy
said, take the video down.
Facebook this week reversed the decision and banned a video posted to the site of a woman
being beheaded. In a statement, Facebook said, quote, when we review...
If you want the one from Boston in, you probably should have the one from Mexico in.
Right. It was a mistake. Yeah. I think it was a mistake. Because I felt like, like, why do we have these rules in place in the first place?
And it's not the only reason, but decisions like that are the thing that precipitated me leaving.
Leaving? Yeah, not too long after that incident, a few members of the team decided to quit.
Okay, we're gonna break in here and fast forward to the present. In our original broadcast of this story,
Simon finished with one final story about a content moderator in the Philippines
who, for personal and religious reasons, would ignore the rules entirely and just take down whatever she saw fit,
which added just one more layer of difficulty to the whole problem.
But...
I think I can hit record on my end.
Okay, I think it's working now.
Okay.
In just the past few weeks, all of these questions about newsworthiness and what Facebook
should or shouldn't allow on their platform, all of it has gotten even stranger and harder
in ways that are definitely going to be having an impact on the 2020 presidential election.
Okay, well, so we're going,
we gotta start with President Trump.
President Trump has threatened to rain in social media companies,
claiming they're interfering with free speech.
This has sort of been all over the news.
President Tweeting, quote,
Twitter is completely stifling free speech,
but it's also been in the shadow of larger news events going on.
Yeah, I'd say I know I've seen Facebook headlines, but I honestly have not been able to
absorb what's happening.
So, long and short of it is, May 26th, President Trump pens two tweets, we're talking Twitter,
not Facebook here initially, falsely claiming that mail-in ballots will lead to voter fraud
and a, quote, rigged election.
So I mean, just to be clear, like, there's no, like, the evidence for voter fraud is almost
non-existent.
Yes.
Everyone who's looked at this says this is not an issue.
And so, in response to these tweets, Twitter decided to do something.
They labeled these two tweets with an exclamation point and a bit of text that read, get the
facts about mail-in ballots.
The most mild fact-check one can imagine.
Yes, but it's also a gargantuan step forward
based on anything that they've done
with Donald Trump at least up until that point.
I mean, it's the first time that Twitter or Facebook
have fact-checked the president of the United States,
which unsurprisingly...
President Trump hit back at Twitter today.
He was not happy about it.
The president was infuriated.
They try to silence views that they disagree with by selectively applying a fact check. Fact check, F-A-C-T, fact check.
And he actually went so far as to draft and sign an executive order threatening to
regulate or shut down social media companies that engage in this sort of fact checking.
So you've got the left saying Twitter is not doing enough.
The right is upset with Twitter for censoring conservative
voices, Twitter is then in this position of like
what the heck do we do next.
And into that uncertainty, Trump tweets the now
infamous line.
When the looting starts, the shooting starts.
When the looting starts, the shooting starts.
This pops up at 11:53 p.m., and according to the reporting, all of the Twitter execs got onto a virtual hangout,
and just like many of the cases we just talked about at Facebook, they're like, what do we do about this?
First thing they had to consider was that this tweet did break their rules.
Just like Facebook, Twitter's got its own set of rules
defining what does and doesn't constitute
glorifying violence.
And according to that, this did glorify violence.
But then at the same time,
this is the president of the United States.
And so after going back and forth,
what they decide is that we're not gonna take it down
because we don't want to censor the president.
But what we're gonna do is we're going to shield
that information and neuter this tweet's ability to spread.
How do you do both of those things?
So the first thing they do is,
if you're scrolling down your Twitter feed
and you come across this tweet,
instead of seeing the text of that tweet,
all you see is: this tweet has violated the Twitter rules about glorifying violence.
And you then have to actually click on the tweet to view it.
So there's one click that they've put between you
and the information itself.
Okay, okay.
Then, let's say you want to retweet that.
You want to help this information spread.
You want to put it in front of all of the people that follow you.
You click retweet, and instead of just being able to retweet, another box pops up that says, comment on this tweet.
At that point, you then have to write your own tweet
about Donald Trump's tweet in question.
And when you finally are able to tweet this,
the way it now shows up for others is your comment.
And then again, that text, this tweet, glorifies violence.
Interesting.
So now for someone else to get to it, through your feed,
they have to click and go through that same rigmarole.
So that's what Twitter decided to do.
But here's where you see a real fork in the road, because simultaneously...
We have a different policy, I think, than Twitter on this.
Trump posted the exact same thing on Facebook.
And as Mark Zuckerberg explained during an interview on Fox News: You know, I just believe strongly that Facebook shouldn't be the arbiter of truth of everything
that people say online. Our policy has been for several years that we are not going to
infringe upon political speech. I think in general private companies probably shouldn't be,
or especially these platform companies shouldn't be in the position of doing that.
And we've been pretty clear on our policy.
And so, we're not going to do anything.
We're going to let this post stand.
So you've got same post, two different companies coming to two very different decisions.
And so to make sense of this difference, I made another call.
I think that I understood, I mean, I get both decisions, but I just would kind of be happy for some consistency.
And the way she tells it, Facebook's decision
to leave this post up really goes all the way back
to that newsworthiness argument
about the Boston marathon bombing.
Yeah, it's a little bit more complicated than that,
but because of newsworthiness, when a public figure or someone of influence in some way posts on Facebook...
Like say, Donald Trump or Michael Jordan.
Basically, because they are a public figure,
it's inherently newsworthy.
It's kind of a circular definition.
Essentially, Facebook is saying, yes,
we have rules about hate speech and violence, but
anything a famous person says is newsworthy enough that they don't have to abide by those rules.
So does that mean that they can just say whatever the hell they want?
Pretty much.
This was one of the reasons why Donald Trump's statements...
Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States.
About a Muslim ban, which would have come down as hate speech if anyone else had said it,
were kept up by the platform, because they justified it as, well, he is, like, a candidate running for office, a presidential candidate, and this is newsworthy.
And this is where you feel the rub.
Because on the one hand, there's plenty of evidence that people in power using charged
language can lead to violence. But on the other, I think there's an argument to be made
that it's important for us to know what our leaders are saying and thinking, particularly when it's threatening or dangerous.
And that argument has basically exploded within Facebook's own ranks.
Facebook employees attacking Mark Zuckerberg for a separate campaign.
Zuckerberg defending his decision not to.
There's been an employee walkout.
Others have actually resigned.
A number of senior Facebook executives publicly sharing their outrage.
And then sort of a who's who of former Facebook employees, folks who were there at the beginning, penned an open letter to Zuckerberg urging him to reverse his decision.
Yeah, I agree that the decision is dangerous. It's that simple.
And so we actually reached back out to one of our sources from the original piece, a former Facebook employee, to get his take on this.
Just like before, we're using an actor to conceal his identity.
But he says, when you look at the reasoning, the rationale behind this public figure
and newsworthiness exception.
Yeah, in addition to it being dangerous and wrong, it was just ridiculously badly done.
I mean, if you're gonna come up with a protection for a reason, to do it more effectively, the idea that you want to protect political speech, I have sympathy with. But you would not do what they're doing.
Basically, he's saying they're being incredibly selective in what political speech they're choosing to privilege.
You just, you can't stand there and say you're all in favor of expression while banning political cartoons for hate speech.
And the idea that political freedom requires that the people with the most power in society have more relaxed rules is, it's obvious nonsense, right?
The spirit of the First Amendment is meant to protect us, the citizens, from the powerful, not the other way around.
So, I don't know. It's repugnant to me. It's like an inversion of everything it ought to be.
The best parts of the internet amplify voices that do not traditionally have amplification on their own.
But what we're seeing in these types of rules is that they double down on amplifying and bringing the power structures that already exist in society to the internet, giving people who are already powerful, and who are already newsworthy themselves, even more power to speak.
And that's what makes this moment so confusing.
Because on the one hand, you have videos coming out of Black Lives Matter protests that
are shining a light on issues of policing and systemic racism.
And we need to see those videos.
Unfiltered.
But then at the same time, right alongside that, you have posts from the leader of the free
world encouraging violence. And there they are, right next to each other, side by side.
Now lastly I think it's important to keep in mind that none of this is static, that
just like all their other rules and policies, this public figure carve-out has been and is
again being tweaked.
I mean, well, as I was tracking these very words I'm saying to you now, I received a notification that Facebook just removed posts from Trump's reelection campaign, because those posts featured a symbol used by Nazi Germany.
And not long ago, Facebook actually went even further.
In light of COVID, they actually halted this exception. They said, listen, if you're saying something that is false about the
pandemic or about COVID-19, we are going to remove what you said no matter what. Wow. People were
really happy with that decision. And I think that that was right. It was like pre-Boston Marathon days, and maybe where we should have been always.
This episode was reported by Simon Adler, with help from Tracie Hunte, and produced by Simon with help from Bethel Habte.
Big thanks to Sarah Roberts, whose research into commercial content moderation got us going, big time, and we thank her very, very much for that.
Thanks also to Jeffrey Rosen, who helped us in our thinking about what Facebook is.
To Michael Chernus, whose voice we used to mask other people's voices.
To Carolyn Glanville,
Ruchika Budhraja,
Brian Dogan, Ellen Silver, James Mitchell, and Guy Rosen.
Of course, to all the content moderators who took the time to talk to us.
And... Do you want to sign off? Yeah, I guess you should, huh? You should. Ready? You want to go first?
Yeah. I'm Jad Abumrad. I'm Robert Krulwich. Thanks for listening.
This is Lys from Perth, Australia
Radiolab is created by Jad Abumrad with Robert Krulwich, and produced by Soren Wheeler. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer. Our staff includes Simon Adler, Becca Bressler, Rachael Cusick, David Gebel, Bethel Habte, Tracie Hunte, Matt Kielty, Annie McEwen, Latif Nasser, Sarah Qari, Arianne Wack, Pat Walters, and Molly Webster. With help from Shima Oliaee, W. Harry Fortuna, Sarah Sandbach, Melissa O'Donnell, Tad Davis, and Russell Gragg. Our fact-checker is Michelle Harris.