Radiolab - Post No Evil
Episode Date: August 17, 2018

Back in 2008 Facebook began writing a document. It was a constitution of sorts, laying out what could and what couldn’t be posted on the site. Back then, the rules were simple, outlawing nudity and ...gore. Today, they’re anything but. How do you define hate speech? Where’s the line between a joke and an attack? How much butt is too much butt? Facebook has answered these questions. And from these answers they’ve written a rulebook that all 2.2 billion of us are expected to follow. Today, we explore that rulebook. We dive into its details and untangle its logic. All the while wondering what does this mean for the future of free speech? This episode was reported by Simon Adler with help from Tracie Hunte and was produced by Simon Adler with help from Bethel Habte. Special thanks to Sarah Roberts, Jeffrey Rosen, Carolyn Glanville, Ruchika Budhraja, Brian Dogan, Ellen Silver, James Mitchell, Guy Rosen, and our voice actor Michael Chernus. Support Radiolab today at Radiolab.org/donate.
Transcript
Okay, so should I just improvise something?
Sure.
Hey, this is Chad.
Before we launch into this week's podcast, I want to make you aware of, well, our friends down the hall, Brooke, my friend, our friends at On the Media.
Brooke Gladstone is here with me.
Have a new show coming out that I think I might be in.
I'm not sure I'm in it, but I'm certainly excited about it.
So what is it?
You are going to be in it.
It's an episode, although you don't want to say show because on Radio Lab, that generally
means you're launching a whole new enterprise.
Yeah, this is an episode on Twitch,
which some people know as if it were part of their family.
And other people are like, what?
Yeah, is that something you need medication for?
So it depends where you're situated.
But what we are going to do is examine how it came to be
and how it points to the future of where our culture,
is going. And for the people who don't know, what is Twitch?
Most people, if they've ever heard of it, they know it's about watching and commenting
in real time on people playing video games.
No. Trap? Oh my God, he put a trap there? And you guys, from what I hear, profile
one of these Twitch superstars. The main character in that story is Ninja, who makes something
like half a million dollars a month. What? But... On Twitch?
Yes.
Through contributors who say, say this out loud, or point to me, mention my name.
He is online. Let's do, you know, let's do no stuff.
And at the same time, he's playing the game.
He's even giving advice to young boys with girlfriend problems.
Don't act like, you know, don't make her seem as like her wall.
You just need to be like, babe, like, we need to talk.
And be like, don't worry, everything's fine.
You know, reassure her.
But then you just need to be like, I really feel like I'm trying, like,
putting in more effort in this relationship than, like, than I feel like you are.
And I want to, and then just be like, I just want to make sure that you're just as invested
in this as I am.
I think it's something new.
Oh, I want to hear that.
Well, thank you.
Okay.
Thank you for letting me in the top of your show.
Absolutely.
So, Twitch, on On the Media, at onthemedia.org.
Check it out.
I'm in there.
It's going to be amazing.
But not just because of me.
Because of you, Brooke.
Thanks for dropping in.
And Bob.
And Bob.
Bye.
Bye.
Wait, you're listening.
Okay.
All right.
Okay.
All right.
You're listening to Radio Lab.
Radio Lab.
From WNYC?
Yeah.
Hey, I'm Jad Abumrad.
I'm Robert Krulwich.
It's Radio Lab.
And today we have a story about what we can say.
And what we fucking can't.
And by the way,
There's going to be a smattering of curse words here that we're not going to bleep, which will I think make sense, given the content of the story.
And also, there's some graphic scenes, so if you've got kids with you, you may want to sit this one out.
Anyway, the story comes to us from producer Simon Adler.
So let's start. Can we start in 2008?
Sure.
How about with a song?
Yes, please.
So December 27th, a sunny Saturday morning, this group of young to middle-aged women gathered in downtown Palo Alto.
They're wearing these colorful hats and are singing and swaying directly in front of the glass door at headquarters of Facebook.
Yes.
It was a humble gathering of a few dozen women and babies.
That right there.
Are you the organizer?
Is one of the organizers of the gathering.
I'm Stephanie Muir.
And what are you calling the event?
It's the Facebook nurse-in.
Nurse-in, as in, like, breastfeeding.
The intent was really just
to be visible and be peaceful and make a quiet point.
What point were they trying to make?
Well, so Stephanie and this group of mothers,
you know, they were on Facebook, as many people were,
and they'd have photos taken of themselves,
occasionally breastfeeding their babies.
They wanted to share with their friends what was going on,
so they would upload those photos to Facebook.
And these pictures would get taken down,
and they would receive a warning from Facebook for...
uploading pornographic content.
And people were really getting their backs up over this.
They wanted Facebook to stop taking their photos down,
to say that, well, nudity is not allowed,
but breastfeeding is exempt, period.
Now, what Stephanie couldn't have known at the time
was that this small, peaceful protest
would turn out to be...
This morning, a face-off on Facebook.
One of the opening shots...
Facebook triggered a hornet's nest.
in what would become a loud...
Fuck you Facebook.
Fuck you Facebook.
Raucous...
Fuck you, Facebook.
Fuck you.
And global battle.
Embattled Facebook CEO...
Facebook today playing defense.
And now I'm not talking about all the things you've recently heard about,
Russian interference and election meddling or data breaches.
But rather something that I think is deeper than both of those.
Free speech.
Facebook has been accused of facilitating violence against Rohingya Muslims.
What we can say and what we can't say.
We're gonna get it.
Facebook banned this iconic photograph.
What we can see and what we can't see.
They'd let Mueller rape kids in front of people.
On the internet.
My account is people.
And fuck you.
You're a fucking piece of shit.
Thank you, Mr. Chairman.
Mr. Zuckerberg, I gotta ask you,
do you subjectively prioritize or censor speech?
Congresswoman, we don't think
about what we're doing is censoring speech.
But there are types of...
What really grabbed me was
discovering that
underneath all of this
is an actual rule book,
a text document
that dictates what I can say on Facebook,
what you can say on Facebook,
and what all 2.2 billion
of us can say on Facebook.
For everyone in the entire globe.
For everyone.
There's one set of rules
that all 2.2 billion
of us are expected to follow.
Is it a literal...
Is this an actual document?
It's a digital document, but yes, it's about 50 pages if you print it off.
And in bullet points and if-then statements, it spells out sort of a First Amendment for the
globe, which made me wonder, like, what are these rules?
How were they written?
And can you even have one rulebook?
Right, exactly.
And so I dove into this rulebook and dug up some stories that really put it to the
test.
Hmm. Okay.
How many stories we're going to hear?
We are going to hear. Three-ish?
Three-ish.
Okay.
Okay. All right, cool.
I'm particularly interested in the ish, but let's go ahead with the first one.
Well, so let's start back on that morning in 2008.
The morning that, you could argue, started it all.
Rise up.
Because in the building right behind those protesting mothers, there was a group of Facebook
employees sitting in a conference room trying to figure out what to do.
Cool.
So if I, so I'm just going to, so I should just just.
So I was able to get in touch with a couple of former Facebook employees, one of whom was actually in that room at that moment.
And now neither of these two were comfortable being identified, but they did give us permission to quote them extensively.
How's that? Will that work for you?
Sounded great.
Cool.
Just so we have it.
So what you're going to hear here is an actor we brought in to read quotes taken directly from interviews that we did with these two different former Facebook employees.
All right, ready.
So, at the time when I joined them, there was a small group, 12 of us, mostly recent college
grads, who were sort of called the site integrity team.
Again, keep in mind, this was in the early 2000s.
Seismic changes this week in the internet hierarchy.
This was like the deep, dark past.
MySpace.com is now the most visited website in the U.S.
Facebook had somewhere in the neighborhood of 10 million users.
We were smaller than MySpace.
The vast majority of them college kids.
And so in those early days, those 12 people, they would sit around in a sort of conference-like room with a big, long table, each of them in front of their own computer.
And things would come up onto their screen, flagged to Facebook.
Flagged, meaning, like, I, a user, saw something that I thought was wrong.
Exactly. Like a reporting piece of content that you think violates the community standards.
This is Kate Klonick. She's a professor of law at St. John's, and she spent a lot of time studying this very thing.
And she says in those early days what would happen is a user would flag a piece of content
and then that content along with an alert would get sent to one of those people sitting in that room.
It would just pop up on their screen.
Most of what you were seeing was either naked people, blown off heads, or things that there was no clear reason why someone had reported
because it was like a photo of a golden retriever and people are just annoying.
And every time something popped up onto the screen, the person sitting at that computer,
would have to make a decision, whether to leave that thing up or take it down.
And at the time, if you didn't know what to do...
You would turn to your pod leader, who was, you know, somebody who had been around nine
months longer than you and asked, what do I do with this?
And they would either have seen it before and explain it to you, or you both wouldn't know
and you'd Google some things.
It really was just kind of an ad hoc approach.
Was there any sort of written standard or any common standard?
Well, kind of.
They had a set of community standards,
but at the end of the day, it was just kind of one page long, and it was not very specific.
Sorry, the guidelines were really one page long?
They were one page long.
And basically all this page said was, nudity is bad, so is Hitler.
And if it makes you feel bad, take it down.
And so when one of the people sitting in that room would have a breastfeeding picture pop up on the screen in front of them, they'd be like, I can see a female breast, so I guess that's nudity, and they would take it down.
Until...
Rise!
Rise up.
Rise up.
Fight for the rights to have breastfeeding.
Anyway.
Now, a dozen or so people in front of their offices on a Saturday.
It probably wasn't causing Facebook too much heartache.
But I thought, you know, hey, we have an opportunity here with, you know, over 10,000 members in our group.
According to Stephanie Muir, those protesters were just a tiny fraction of a much larger online group who had organized, ironically enough, through Facebook.
So to coincide with the live protest, I just
typed up a little blurb, encouraging our members that were in the group to do a virtual
nurse-in. A virtual nurse-in? Right. What we did... They posted a message asking their members
to, for one day, change their profile avatar to an image of breastfeeding and then change their
status to the title of our group: Hey Facebook, breastfeeding is not obscene. And it caught on.
The social networking website is under fire for its policy on photos of women breastfeeding their children.
Big time.
12,000 members participated, and the media requests started pouring in.
The Facebook group called, Hey, Facebook, Breastfeeding is Not Obscene.
I did hundreds of interviews for print.
Chicago Tribune, Miami Herald, Time Magazine, New York Times, Washington Post.
You know, the internet is an interesting phenomenon.
Dr. Phil, it was a media storm.
and eventually perhaps as a result of our group and our efforts, Facebook,
was forced to get much more specific about their rules.
So, for example, by then, nudity was already not allowed on the site.
But they had no definition for nudity.
They just said no nudity.
And so the site integrity team, those 12 people at the time,
they realized they had to start spelling out exactly what they meant.
Precisely.
All of these people at Facebook were in charge of trying to define it.
So I mean, yeah, the first cut out it was visible male and female genitalia, and then visible
female breasts.
And then the question is, well, okay, how much of a breast needs to be showing before it's nude?
And the thing that we landed on was if you could see essentially the nipple and areola,
then that's nudity.
And it would have to be taken down, which theoretically at least would appease these
protesters because, you know, now when a picture would pop up of a mother breastfeeding, as long as
The child was blocking the view of the nipple and the areola.
They could say, cool, no problem.
Then you start getting pictures that are women with just their babies on their chest,
with their breasts bare, like, for example, maybe a baby was sleeping on the chest of a bare-breasted woman and not actively breastfeeding.
Okay, now what?
Like, is this actually breastfeeding?
No, it's actually not breastfeeding.
The woman is just holding the baby and she has her top off.
No, but she was clearly just breastfeeding the baby.
Well, like, I would say...
Well, I would say...
Well, I would say it's sort of like kicking a soccer ball.
Like a photo of someone who has just kicked the soccer ball.
You can tell the ball is in the air, but there is no contact between the foot and the ball in that moment, potentially.
So although it is a photo of someone kicking a soccer ball, they are not, in fact, kicking the soccer ball in that photo.
That's a good example.
And this became the procedure or the protocol or the approach for all these things was we have to base it purely on what we can see in the image.
And so we didn't allow that to stay
up under the rules because it could be too easily exploited for other types of content,
like nudity or pornography.
We got to: the only way you could objectively say that the baby and the mother were engaged
in breastfeeding is if the baby's lips were touching the woman's nipple.
So they included what you could call like an attachment clause.
But as soon as they got that rule in place, like you would see, you know, a 25-year-old woman
and a teenage-looking boy, right?
And like, what the hell is going on there?
Oh, yeah.
It gets really weird if you start entering into, like, child age.
I wasn't even going to bring that up because it's kind of gross.
It's like breastfeeding porn.
Is that a thing?
Are there sites?
Like, do they have that?
Apparently.
And so this team, they realized they needed to have a nudity rule that allowed for breastfeeding,
but also had some kind of an age cap.
So, so, so then we were saying, okay, once you've progressed past infancy, then we believe
that it's inappropriate.
But then pictures would start popping up on their screen and they'd be like, wait, is that an infant?
Like, where's the line between infant and toddler?
And so the thing that we landed on was,
if it looked like the child could walk on his or her own, then too old.
Big enough to walk?
Too big to breastfeed.
Oh, that could be 18 months.
Yeah, that's like a year old in some cases.
Yeah, and like the World Health Organization recommends breastfeeding
until, you know, like 18 months or two years,
which meant there were a lot of photos still being taken down.
Within, you know, days, we're continuing to hear reports from people that their
photographs were still being targeted.
But...
Facebook did offer a statement saying...
You know, that's where we're going to draw the line.
Facebook isn't budging on its policy.
And keep in mind, through this whole episode...
Is this perhaps the next big thing?
The facebook.com...
The company was growing really, really fast.
It seems like almost everyone is on it.
And there just got to be a lot more content.
When we first launched, we were hoping for, you know, maybe 400, 500 people.
And now we're at 100,000.
So who knows where we're going now.
Thousands more people are joining Facebook every day.
60 million users so far with a projection of 200 million by the end of the year.
There are now more people on Facebook than the entire U.S. population.
Not just within the United States, but also it was growing rapidly more international.
You know, you were getting...
In India is going to...
Stuff from India and Turkey.
Facebook.
Facebook is getting big throughout the EU.
Korea's joined the Facebook.
So they have...
more and more content coming in from all these different places in all these different languages.
How are we going to keep everybody on the same page?
And so once they saw that this was the operational method for dealing with this,
creating this like nesting set of exceptions and rules and these clear things that had to be there
or had to not be there in order to keep content up or take it down, that I think became their procedure.
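To make that procedure concrete, here is a minimal sketch in Python of the kind of nested, judge-only-by-what-you-can-see rule described above, using the breastfeeding example from this era of the rulebook. The Photo fields and the function are hypothetical stand-ins; the actual rules were a written document applied by human reviewers, not code.

from dataclasses import dataclass

@dataclass
class Photo:
    # Every field describes only what is visible in the image itself.
    visible_nipple_or_areola: bool   # the working definition of nudity
    lips_touching_nipple: bool       # the "attachment clause"
    child_looks_able_to_walk: bool   # the proxy used for an age cap

def keep_or_remove(photo: Photo) -> str:
    # No visible nipple or areola: not nudity under the definition, so it stays up.
    if not photo.visible_nipple_or_areola:
        return "keep"
    # Nudity, but the breastfeeding exception applies: attachment plus an infant.
    if photo.lips_touching_nipple and not photo.child_looks_able_to_walk:
        return "keep"
    # Nudity with no exception that fits.
    return "remove"

# The sleeping-baby-on-a-bare-chest case from above: nudity, no attachment, so removed.
print(keep_or_remove(Photo(True, False, False)))  # "remove"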
And so this small team at Facebook got a little bigger and bigger,
jumped up to 60 people and then 100,
and they set out to create rules and definitions for everything.
Can we go through some of sort of the ridiculous examples?
Yes, sweet.
That's what we're here for.
Okay, so gore, gore.
You mean violence kind of gore?
Yes.
So the gore standard was headline,
we don't allow graphic violence and gore.
And then the shorthand definition they used was no insides on the outside.
No guts, no blood pouring out of something.
Blood was a separate issue.
There was an excessive blood
rule. They had to come up with rules about bodily fluids.
Semen, for example, would be allowed in, like, a clinical setting, but, like, what does a
clinical setting mean? And, you know, does that mean if someone is in a lab coat?
Hmm.
One of my favorite examples is, like, how do you define art?
Because as these people are moderating, they would see images of naked people that were
paintings or sculptures come up. And so what they decided to do is say, art with nakedness
can stay up.
Like, it stays up if it is made out of wood, made out of metal, made out of stone.
Really?
Yeah.
Because how else do you define art?
You have to just be like, is this what you can see with your eyeballs?
And so from then on, as they run into problems,
those rules just constantly get updated.
Constant amendments.
Yeah, constant amendments.
New problem, new rule.
Another new problem, updated rule.
In fact, at this point, they're amending these rules up to
20 times a month.
Wow.
Really?
Yeah.
Take, for example, those rules about breastfeeding.
In 2013, they removed the attachment clause.
So the baby no longer needed to have its mouth physically touching the nipple of the woman.
Oh.
And, in fact, one nipple and/or areola could be visible in the photo.
But not two.
Only one.
Then, 2014, they make it so that both nipples or both areolae
may be present in the photo.
So this is what happens in American law all the time, this very thing.
Yes.
Yeah.
You know, it sounds a lot like common law.
So common law is this system dating back to early England,
where individual judges would make a ruling,
which would sort of be a law,
but then that law would be amended or evolved by other judges.
So the body of law was sort of constantly...
Fleshed out in the face of new facts.
Literally every time this team at Facebook
would come up with a rule that they thought was
airtight,
kerplop,
something would show up
that they weren't prepared for,
that the rule hadn't accounted for.
As soon as you think,
yeah, this is good,
like the next day something shows up to show you,
yeah, you didn't think about this.
For example, sometime around 2011,
this content moderator is going through a queue of things.
Accept, reject,
accept, escalate,
accept.
And she comes upon this image.
Oh, my God.
What?
The photo itself was a teenage girl, African by dress and skin, breastfeeding a goat, a baby goat.
And the moderator throws her hands up and says,
What the fuck is this?
And we googled breastfeeding goats and found that this was a thing.
It turns out it's a survival practice.
According to what they found, this is a tradition in Kenya that goes back centuries that in a drought,
a known way to help your herd get through the drought is to, uh,
if you have a woman who's lactating,
to have her nurse the kid, the baby goat,
along with her human kid.
And so there's nothing sexual about it.
It's just good farming.
Good business, wouldn't you say?
And theoretically, if we go point by point through this list,
it's an infant.
It sort of could walk, so maybe there's an issue there.
But there's physical contact between the mouth and the nipple.
But.
But.
Obviously.
Breastfeeding, as we intended anyway, meant human.
infants. And so in that moment, what they decide to do is remove the photo. And there was an
amendment, an asterisk, under the rules stating, animals are not babies. We added that so in any
future cases, people would know what to do. But they removed it? They discover it was culturally
appropriate and a thing that people do, and then they decide to remove the photo? Yeah. That outraged
individual is our editor, Soren Wheeler. Why? Why didn't we make an exception? Because...
when a problem grows large enough, you have to change the rules. If not, we don't. This was
not one of those cases. The juice wasn't worth the squeeze.
And like if they were to allow this picture, then they'd have to make some rule about when
it was okay to breastfeed an animal and when it wasn't okay.
This is a utilitarian document. It's not about being right 100% of the time. It's about
being able to execute effectively.
In other words, we're not trying to be perfect here, and we're not even necessarily trying
to be 100% just or fair. We're just trying to make something that works.
1, 2, 3, 4, 5, 6, 7, 8.
And when you step back and look at what Facebook has become,
like from 2008 to now, in just 10 years.
Simon, I've just arrived at the Accenture Tower here in Manila.
I don't know how many floors it is.
One, two.
The idea of a single set of rules that works that can be applied fairly.
It's just a crazy, crazy concept.
Because they've gone from something like,
70 million users to 2.2 billion.
It's hard to count, but I would say it's about 30 floors.
And they've gone from 12 folks sitting in a room deciding what to take down or leave up to somewhere around 16,000 people.
So there's a floor in this building where Facebook supposedly outsources content moderation.
And so around 2010, they decided to start outsourcing some of this work to places like Manila,
where you just heard reporter Aurora Almendral as well as...
I mean, I would guess that there are thousands of people in this building.
Dublin, where we sent reporter Gareth Stack.
Oh, I can see in where they get their delicious Facebook treats cooked.
Everybody's beavering away.
And we sent them there to try to talk to some of these people,
who, for a living, sit at a computer and collectively click through around a million flagged bits of content
that pop up onto their screen every day.
Wow.
I'm just curious, like, what's that like?
Well...
Hello, Paul.
Um, can I ask you some questions?
Sorry, yeah.
I'm sorry. I just know what I'm gonna be blank on.
We found out pretty quickly.
Who do you work for?
None of these folks are willing to talk to us about what they do.
So there's a lot of running away from me happening.
Hey, lads, sorry to bother you.
Do you guys work in Facebook?
I know, sorry.
You happen to work in Facebook by any chance?
No, I don't.
Sorry to bother you.
Do you work inside?
No, sorry.
Do you work in Facebook?
No.
I mean, like, you just came out of there.
I know you're lying.
In fact, most people wouldn't even admit they work for the company.
Like, what's, is there something wrong about being in the...
they have like an NDA that they signed?
Well, yeah.
So when I finally did find someone willing to talk to me,
do you want to be named or do you not want to be named?
I'd rather not.
That's totally fine.
You know, I'm still in the industry.
I don't want to lose my job over this shit, you know?
He explained that he and all the other moderators like him
were forced to sign these non-disclosure agreements,
stating they weren't allowed to admit that they work for Facebook.
They're not allowed to talk about the work they do.
My contract prohibited me from talking about what content moderation was.
Why?
Several reasons.
One is that up until recently, Facebook wanted to keep secret what these rules were so that they couldn't be gamed.
At the same time, it creates a sort of separation between these workers and the company, which, if you're Facebook, you might want.
You know, I knew I signed up to monitor graphic images.
Just given the nature of the job.
But, you know, I didn't really, you know, you don't really know the impact that.
that's going to have on you until you go through it.
So this guy I talked to, he got his first contract doing this work several years back,
and for the duration of it, about a year, he'd show up to his desk every morning,
put on his headphones.
Click, click, click, click, click, click, click.
Ignore.
Next, next, next, delete.
Case by case by case by case.
Four to five thousand cases every day.
It was just image and decision, image decision.
Wait, 5,000 a day you just said?
Yeah, it was like, it was a lot of cases.
Yeah, he said basically he'd have to go through an image or some other
piece of content every three or four seconds. Wow, all day long? All day, eight hours a day.
Well, if I can ask, what kind of things did you see?
I don't know if this is even, I don't even know if this is, like, radio-worthy. It's too...
I think this is too exaggerated. Clicking through, he came across unspeakable things.
Heads exploding, to, you know, people being squashed by a tank, to, you know, people in cages being
drowned, to, like, a 13-year-old girl having sex with an 8-year-old boy.
And it's not just once.
It's over and over and over and over.
Well, and did you, did this, like, keep you up at night?
Did you do this?
Absolutely.
Absolutely.
100%.
It kept me up at night.
He'd catch himself thinking about these videos and photos when he was trying to relax.
He had to start avoiding things.
There were specific, like, movies that I couldn't watch.
There was one, I think it was the Quentin Tarantino one, you know; the wife wanted to kind of see it.
I was like, okay.
I turned it on.
It was like, heads were exploding.
I was like, nope, nope, I have to walk away.
And I just, I had to, I, it was too real.
I saw that.
It's classic PTSD.
A different moderator I spoke to described it as seeing the worst side of humanity.
You see all of the stuff that you,
you and I don't have to see because they're going around playing cleanup.
Yeah.
What a job.
Wow.
Yeah.
And it's worth noting that more and more this work is being done in an automated fashion,
particularly with content like gore or terrorist propaganda.
They're getting better.
You can automate that?
Yeah.
Through computer vision, they're able to detect hallmarks of a terrorist video or of a gory image.
And with terrorist propaganda,
they now take down 99% of it before anyone flags it on Facebook.
Wow.
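As a rough illustration of what proactive, model-driven removal can look like, here is a small sketch. The thresholds, the scoring function, and the routing labels are assumptions made up for illustration; they are not Facebook's actual system or numbers.

AUTO_REMOVE_THRESHOLD = 0.98   # assumed: act automatically only on very confident matches
HUMAN_REVIEW_THRESHOLD = 0.60  # assumed: uncertain cases still go to a person

def route(content, score_propaganda):
    # score_propaganda stands in for a computer-vision or other classifier that
    # returns a confidence between 0 and 1 that the content violates the rules.
    score = score_propaganda(content)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "remove_before_anyone_flags_it"
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "queue_for_human_review"
    return "leave_up_unless_flagged"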
But moving on to our second story here,
there is a type of content that they are having an incredibly hard time,
not just automating, but even getting their rules straight on.
And that's surrounding hate speech.
Oh, good. Some more laughs coming up.
Well, there will be laughter.
Oh, really?
There will be comedians. There will be jokes.
Hey.
All right.
Okay.
So we take a break and then come right back?
No, I think we're going to keep going.
Okay.
Testing one, two, three, four, five. Testing one, two, three, four five. I'm Simon Adler.
So a couple months back.
I think it's working.
We sent our pair of interns.
On the left, 60 feet.
Carter Hodge.
Here we go.
At the standing room.
And Liza Yeager.
Do you have the tickets for tonight?
I think we're on the guest list.
Okay.
To this cramped, narrow little comedy club.
The kind of place with like...
It's all super expensive.
I know.
$15 smashed rosemary cocktails.
What's so on?
None of it.
We do not need to get a drink.
It's fine.
High top tables.
The AC is dripping on me.
But it's still kind of a dive.
It feels good, yeah.
And we sent them there
to check out someone else
who'd found a fault line.
in Facebook's rulebook.
We're going to keep it moving
right along.
The next comedian come to a stage.
Please give it up for Marsha Belski.
I get so mad.
I feel like when I first moved to the city I was such a carefree brat.
I was such a carefree brat.
You know, I was young,
and I had these older friends,
which I thought was very cool.
And then you just realize that they're alcoholics.
She's got dark curly hair,
was raised in Oklahoma.
And I think I was raised Jewish.
So when you're raised Jewish, you read about Anne Frank a lot, you know, a lot, a lot.
And when you read about Anne Frank, like, this will get funny.
How did you decide to become a comedian?
You know, it was kind of the only thing that ever clicked with me.
And especially political comedy.
You know, I used to watch the daily show every day.
And back in 2016, she started this political running bit that I think can be called sort of,
absurdist feminist comedy.
Now a lot of people think that I'm like an angry feminist,
which is weird.
This guy called me a militant feminist the other day,
and I'm like, okay, just because I am training a militia of women
in the woods.
At first, I just had this running bit online on Facebook and Twitter.
She was tweeting and posting jokes.
You know, like we have all the Buffalo Wild Wings surrounded,
you know, things like that.
Eventually took this bit on stage, even wrote some songs.
Anyhow, so about a year into this running bit,
Marcia was bored at work one day and logs onto Facebook.
But instead of seeing her normal news feed,
there was this message that pops up.
It says you posted something that discriminated along the lines of race, gender, or ethnicity group.
And so we,
removed that post. And so I'm like, what could I possibly have posted? I really, I thought it was like a
glitch. But then she clicked continue and there highlighted was the violating post. It was a photo of hers.
What is the picture? Can you describe it? The photo is me as what can only be described as a cherub.
Cute little seven-year-old with big curly hair and she's wearing this blue floral dress. Her teeth are
all messed up. And into the photo, Marcia had edited in a speech bubble that just
says kill all men. And so it's funny, you know, because I hate, I hate, it's funny, you know,
trust me, whatever. So, um, I thought it was ridiculous. So she searched through her library of
photos and found that kill all men image. And I post it again. Immediately after, like, yeah,
and it got removed again. And this time, there were consequences. I got banned for three days after
that. Then after several other bans, shoot forward, this is months later. A friend of hers had posted
an article and underneath it in the comments section there were guys posting just really nasty stuff.
So I commented underneath those comments, men are scum, which was very quickly removed.
How long did you get banned for this time?
30 days.
Wow!
Yeah, I was dumbfounded.
So there's a rule somewhere that if I type men are scum, you take it down.
Yes.
I'm like, what could it be?
And so Marcia called on her, quote, militia of women.
Exactly.
To find out, like, is this just me?
Female comedians who were sort of, like, mad on my behalf, started experimenting, posting men are scum.
To see how quickly it would get removed and if it would be removed every time.
And it was.
So they started trying other words.
Bluff, yeah.
To find out where the line was.
My friend put Menar Duskum.
That got removed.
Men are the worst.
Removed and banned.
This one girl put men are septic fluid.
Banned.
But...
We're only at the middle of the saga.
It doesn't end there.
Now she's really like what the hell is going on.
Is this sexism?
So I just start doing the most bare minimum amount of investigating.
She's Googling around, trying to figure out what these policies are.
And pretty quick, she comes across this leaked Facebook document.
So this is when I lose my mind.
This is when Mark Zuckerberg becomes my sworn nemesis for the rest of my life.
Because what she'd found was a document Facebook used to train their moderators.
And inside of it in a section detailing who Facebook protected from hate speech,
there was a multiple choice question that said,
who do we protect?
White men or black children?
And the correct answer was white men, not black children.
Not even kidding.
White men were protected.
Black children were not.
That's not a good look.
It's racist.
Something's going on here.
There is absolutely some sort of unaddressed bias or systematic issue at Facebook.
Hi.
Hello.
How are you?
I'm doing well.
Thank you so much for being willing to do this.
So not long after sitting down with Marcia,
Facebook invited me to come out to their offices in California and sit down with them.
I'm going to eat one cookie.
Oh, they're little.
I think I get two.
Typing.
Could I just get your name and your title?
I'm Monika Bickert, and I lead the policies for Facebook.
Monika Bickert is in charge of all of Facebook's rules,
including their policies on hate speech.
And so I asked her, like,
why would there be a rule that protects white men,
but not black children?
We have made our hate speech policies...
Let me rephrase:
our hate speech policies have become more detailed over time.
But our main policy is you can't attack a person or group of people based on a protected characteristic,
a characteristic like race, religion, or gender.
So this takes a couple beats to explain.
But the gist of it is that Facebook borrowed this idea of protected classes straight from U.S. anti-discrimination law.
These are the laws that make it so that you can't not hire someone, say, based on, like, their religion, their ethnicity, their...
race. And so on
Facebook, you can't attack
someone based on one of these characteristics.
Meaning, you can't say
men are trash, nor could you say
women are trash, because
essentially you're attacking all men
for being men.
Oh, is it the all? Can I say Bob
is trash? Yeah, you can say Bob
is trash, because as my sources
explained to me, the distinction is that
in the first instance, you're attacking
a category. In the second
instance, you're attacking a person, but it's not
clear that you're attacking that person because they are a member of a protected category.
Oh, so Bob might be trash for reasons that have nothing to do with him being a man.
Yeah.
He just might be annoying.
Right.
Okay, so that explains why you take down men are scum, but why would you leave up black children are scum?
Why would that not get taken down?
So traditionally, we allowed speech once there was some other word in it that made it about
something other than a protected characteristic.
In Facebook jargon, these are referred to as a non-protected modifier.
Which just means literally nothing to me.
Give us an example of this.
So traditionally, if you said, I don't like this religion cab drivers.
Cab driver would be the non-protected modifier because employment is not a protected category.
And so what the rule stated was when you add this non-protected
modifier to a protected category,
in this case, the cab driver's religion,
We would allow it because we can't assume that you're hating this person because of his religion.
You actually just may not like cab drivers.
So in the case of black children, children is modifying the protected category of black.
And so children trumps black?
Age is a non-protected category.
Okay.
And so children becomes a non-protected modifier,
and their childness trumps their blackness.
You can say whatever you want about black children.
Whereas in the case of white men,
you've got gender and race, both protected.
You can't attack them.
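As a minimal sketch of the rule as described here (the earlier version, before severity was factored in), the logic can be written out roughly like this. The category sets and the idea of pre-parsed target attributes are hypothetical simplifications; in practice this was a written policy applied by human reviewers in a few seconds, not code.

# Protected characteristics, borrowed from U.S. anti-discrimination law.
PROTECTED = {"race", "religion", "gender", "sexual orientation", "ethnicity", "national origin"}
# Non-protected modifiers, e.g. age ("children") or occupation ("cab drivers").
NON_PROTECTED = {"age", "occupation"}

def attack_is_hate_speech(target_attributes):
    # Under the rule described above, an attack counted as hate speech only if
    # every attribute of the target was protected; one non-protected modifier
    # made the whole target unprotected.
    if not target_attributes:
        return False
    return all(attr in PROTECTED for attr in target_attributes)

print(attack_is_hate_speech({"gender"}))          # "men are scum" -> True, removed
print(attack_is_hate_speech({"race", "gender"}))  # "white men are scum" -> True, removed
print(attack_is_hate_speech({"race", "age"}))     # "black children are scum" -> False, allowed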
That's just a bizarre rule.
I would think you would go the other direction,
that the protected class would outweigh the modifier.
Well, they made this decision, as they explained to me, because their default was to allow speech.
They were really trying to incorporate or nod to the American free speech tradition.
And so there's a whole lot of stuff out there that none of us would defend as a valuable speech,
but didn't rise to the level of stuff that we'd say, this is so bad we're going to take it down.
And in this case, their concern was we're all members of like, you know, at least half a dozen protected categories.
Like, we all have gender.
We all have sexual orientation.
If the rule is that any time a protected class is mentioned, it could be hate speech.
What you are doing at that point is opening up just about every comment that's ever made about anyone on Facebook to potentially be hate speech.
Then you're not left with anything, right?
No matter where we draw this line, there are going to be some outcomes that we don't like.
There are always going to be casualties.
That's why we continue to change the policies.
And in fact, since Marsha's debacle, they've actually updated this rule.
So now black children are protected from what they consider the worst forms of hate speech.
Now our reviewers take how severe the attack is into consideration.
But despite this, there are still plenty of people.
That is flawed because you are a social network.
Including Marcia, who thinks this still just isn't good enough.
There are not systematic efforts to eliminate white
men in the way that there are against other groups.
That's why you have protected groups.
She thinks white men and heterosexuals should not be protected.
Protect the groups who are actually victims of hate speech.
Makes sense?
Well, yeah, because in sort of hate speech or thinking about hate speech,
there's this idea of privileged or of historically disadvantaged groups
and that those historically disadvantaged groups should have more protection because of being
historically disadvantaged.
And the challenge with that
that was presented to me was, okay,
in the 1940s...
Thousands of new Japanese reinforcements...
you had
Japanese soldiers
killing millions of Chinese civilians during World War II.
At that same time, you had
Japanese American citizens.
More than 100,000 persons of Japanese ancestry.
All of them were
being put into internment camps.
And so we had to ask ourselves a question like,
are the Japanese a historically advantaged or disadvantaged group?
Huh.
Japanese Americans, pretty easy to make a case that they were disadvantaged,
but in China, it's a totally different story.
And this happened at the exact same moment.
So you've got two different places, two different cultural stories.
And when you have a website like Facebook,
this transnational community,
they realized or they decided that ideas of,
privilege are so geographically bound that there is no way to effectively weigh and consider
who is privileged above who and decided, therefore, that we are not going to allow historical
advantage or historical privilege into the equation at all.
And I think it's very important to keep in mind here.
I hate Americans.
These moderators only have like four or five seconds.
Republicans are scum.
to make a decision.
I'm an Indian and I hate this.
In those four seconds, is there enough time to figure out where in the world someone is?
Particularly given IP addresses can easily be masked.
Go back where you came from.
Is it enough time to figure out a person's ethnicity?
White children are better than black children.
On top of that, we often don't know an individual's race.
Straight people suck.
Other categories are even less clear, like sexual orientation.
And they just realized it would be next to impossible to get anybody to be able to run these calculations effectively.
When we were building that framework, we did a lot of tests, and we saw sometimes that it was just too hard for our reviewers to implement a more detailed policy consistently.
They just couldn't do it accurately.
So we want the policies to be sufficiently detailed to take into account all different types of scenarios.
but simple enough that we can apply them consistently and accurately around the world.
And the reality is any time that the policies become more complicated, we see dips in our consistency.
What Facebook's trying to do is take the First Amendment, this high-minded, lofty legal concept,
and convert it into an engineering manual that can be executed every four seconds for any piece of content from
anywhere on the globe.
And when you've got to move that fast, sometimes justice loses.
That's the tension here.
And I just want to make sure I emphasize that these policies,
they're not going to please everybody.
They often don't please everybody that's working on the policy team at Facebook.
But if we want to have one line that we enforce consistently,
then it means we have to have some pretty objective black and white.
Rules.
Very funny.
Very funny.
But when we come back, those rules?
They get toppled.
This is Danny from Denver, Colorado.
Radio Lab is supported in part by the Alfred P.
Sloan Foundation, enhancing public understanding of science and technology in the modern world.
More information about Sloan at www.sloan.org.
Chad, Robert.
Radio Lab.
Back to Simon Adler.
Facebook.
Free speech.
So, as we just heard before the break, Facebook is trying to do two competing things at
once.
They're trying to make rules that are just, but at the same time can be reliably executed by
thousands of people spread across the globe in ways that are fair and consistent.
And I would argue that this balancing act was put to the test April 15, 2013.
Hey, Carlos, I'm so sorry.
We have some breaking news.
Otherwise, I wouldn't cut you off so abruptly.
Monday the 15th, 2013, just before three in the afternoon.
Two pressure cooker bombs rip through the crowd near the finish line of the Boston Marathon.
And as sort of the dust begins to settle,
oh my God, oh my God.
People are springing into action.
This one man in a cowboy hat
sees this spectator who's been injured,
picks him up, throws him in a wheelchair,
and as they're pushing him through this sort of ashy cloud,
there's a photographer there, and he snaps this photo.
And the photo shows the runner in the cowboy hat
and these two other people pushing this man
whose face is ashen from all of the debris.
His hair is sort of standing on end
and you can tell that actually the force of the blast
and then the particles that got in there
are actually holding it in this sort of wedge shape.
And one of his legs is completely blown off.
And the second one is blown off below the knee
other than the femur bone sticking out
and then sort of skin and muscle and tendons.
It's horrific.
Meanwhile,
from the CBS Bay Area Studio.
On the other side of the country,
KPIX 5 News.
I remember snippets of the day.
Oh, my.
Facebook employees were clustering around several desks,
staring at the computer screens,
watching the news break.
And this has occurred just in the last half hour or so.
I have memories of watching some of the coverage.
Chilling new images just released of the Boston bombing.
I remember seeing the photo published online,
and it wasn't long after that.
Someone had posted it on Facebook.
From the folks I spoke to, the order of events here are a little fuzzy, but pretty quickly, this photo is going viral.
And we realized we're going to have to deal with it.
This image is spreading like wildfire across their platform.
It appears to be way outside the rules they'd written, but it's in this totally new context.
So they got their team together, sat down in a conference room.
I don't know. There was probably eight or ten people thinking about, like,
Should we allow it?
Or should they take it down?
According to their rules.
Yeah.
So if you recall the no insides on the outsides definition that we had in place, meaning you can't see like people's organs or that sort of thing.
And if you can, then we wouldn't allow it.
And in this, in this photo you could see, you could definitely see bone.
And so by the rules, the photo should obviously come down.
Yep.
However, half the room says no.
The other people are saying this is newsworthy.
This is newsworthy.
Essentially, this photo is being posted everywhere else.
It's important.
We need to suspend the rules.
We need to make an exception, which immediately receives pushback.
Well, I was saying that what we've prided ourselves on was not making those calls.
And there are no exceptions.
There's either mistakes or improvements.
We made the guidelines for moments like this.
To which the other side shoots back.
Oh, my God, are you kidding me?
Like, the Boston Globe is publishing this all
over the place and we're taking it down. Like, are you fucking kidding me?
Damn the guidelines. Let's have common sense here. Let's be humans. We know that this is important.
And yeah, they're kind of, they're right. But the reality is like if you say, well, we allowed it because
it's newsworthy, how do you answer any of the questions about any of the rest of the stuff?
In other words, this is a Pandora's box. And in fact, for reasons that aren't totally clear,
team consistency, team follow the rules, eventually wins the day.
They decide to take the photo down.
But before they can pull the lever, word starts making its way up the chain.
And internally within Facebook...
According to my sources, an executive under Zuckerberg sent down an order.
We were essentially told, make the exception.
Huh.
I don't care what your guidelines say.
I don't care what your reason is.
The photo stands.
You're not taking this down.
Yes. Yes, that's what happened.
This decision means that Facebook has just become a publisher.
They may not think they have, but they've made a news judgment.
And just willy-nilly they've become CBS, ABC, New York Times,
Herald Tribune, Atlantic Monthly, and all these other things all at once,
they just become a news organization.
Yeah, and this brings up a legal question that's at the center of this conversation
about free speech.
Like, is Facebook a sort of collective scrapbook for us all?
Or is it a public square where you should be able to say whatever you want?
Or, yeah, is it now a news organization?
Let me get, I'm sorry to interrupt, but let me get to one final question that kind of relates
to what you're talking about in terms of what exactly Facebook is.
And this question has been popping up a lot recently.
In fact, it even came up this past April when Zuckerberg was testifying in front of Congress.
I think about 140 million Americans get their news from Facebook.
So which are you?
Are you a tech company?
Are you the world's largest publisher?
Senator, this is a – I view us as a tech company because the primary thing that we do is build technology and products.
But you said you're responsible for your content, which makes that kind of a publisher, right?
Well, I agree that we're responsible for the content, but I don't think that that's incompatible with fundamentally at our core.
being a technology company, where the main thing that we do is have engineers and build products.
Basically, Zuckerberg and others at the company are arguing, no, they're not a news organization.
Why? What would be the downside of that?
Well, Facebook currently sits on this little idyllic legal island where they can't be held liable for much of anything.
They're subjected to few regulations. However, were they to be seen in the eyes of the court as a media organization, that could change.
But setting that aside, what really strikes me about all of this is here you have a company that really, up until this point, has been crafting a set of rules that are both as objective as possible and can be executed as consistently as possible.
And they've been willing to sacrifice rather large ideas in the name of this.
For example, privilege, which we talked about, they decided was too geographically bound to allow for
one consistent rule. But if you ask me, there's nothing more subjective or geographically bound
than what people find interesting or important, what people find newsworthy.
And I'll give you a great example of this that happened just six months after the Boston Marathon
bombing, when this video starts being circulated out of northern Mexico. And it's a video of a woman
being grabbed and forced onto her knees in front of a camera,
and then a man with his face covered, grabs her head,
pulls her head back, and slices her head off right in front of the camera.
And this video starts being spread.
I can't count how many times, like, just reading my Twitter feed, I've been like, ah!
You know, like...
One person who came across this video, or at least dozens of others like it, was Shannon Young.
My name is Shannon Young.
I am a freelance radio reporter.
I've been living here in Mexico for many years.
Her beat is covering the drug war, and doing so years back, she noticed this strange phenomenon.
It first caught my attention in early 2010.
She'd be checking social media.
You know, you're scrolling through your feed, and, you know, you'd see all this news.
People saying, ah, there was this three-hour gun battle and intense fighting all weekend long.
Folks were posting about clashes between drug cartels and government forces.
But then when Shannon would watch the news that night,
She'd see reports on the economy and soccer results, but...
The media wasn't covering it.
There'd be no mention of these attacks.
Nothing to do with the violence.
And so she and other journalists tried to get to the bottom of this.
Reporters in Mexico City would contact the state authorities and, you know, public information officer, and they'd be like...
shootings, bombings? What are you talking about?
Nothing's going on.
We have no reports of anything.
These are just internet rumors.
The government even coined a term for these sorts of posts.
The famous phrase at the time was,
collective psychosis. These people are crazy.
Because, you know, they didn't want the situation to seem out of control.
But then, a video is posted.
It opens, looking out the windshield of a car on a sunny day, the landscape is dry, dusty,
and the video itself is shaky, clearly shot on a phone.
And then the woman taping starts talking.
And this woman, she just narrates as they drive along this
highway. She pans the phone from the passenger window to the windshield,
focusing in on these two silver destroyed pickup trucks. And she's saying, look at these cars over here,
they're, you know, shot up. Oh, ooh, ooh, look here, you know, this 18-wheeler is, you know,
totally abandoned. It got shot up. At one point, she sticks the phone out the window to
show all of the bullet casings, littering the ground.
And she just, you know, turned the official denial on its head.
The government was saying there's no violence.
Here were cars riddled with bullets.
It was impossible to dismiss.
And from then on, you had more and more citizens.
Citizen journalists uploading anonymously video of the violence.
These low-fi, shaky shots of,
Shootouts, dismemberments, beheadings, I mean, bodies hanging, dangling off of overpasses,
to prove to the world that this was really happening to say, we're not crazy.
It's a cry for help.
Yeah.
Which brings us back to that beheading video we mentioned a bit earlier.
Yeah, that video of the beheading, a lot of people are uploading it condemning the violence of the drug cartels.
And when it started showing up on Facebook, much like with the Boston Marathon bombing photo,
this team of people, they sat down in a room, looked at the policy, weighed the arguments.
And my argument was, it was okay by the rules during the Boston bombing?
Why isn't it okay now?
Particularly given that it could help.
Leaving this up means we warn hundreds of thousands of people of the brutality of these cartels.
And so, we kept it up.
However...
It's fucking wrong. It's wrong.
I think it's utterly irresponsible,
and in fact quite despicable of them to put...
When people found out...
I have little neighbor kids that don't need to see shit like that.
Backlash.
Is there really any justification for allowing these videos to be...
People as powerful as David Cameron weighed in on this decision.
Today, the Prime Minister strongly criticized the move.
Saying we have to protect children from this stuff.
David Cameron tweeted,
it's irresponsible of Facebook to post beheading videos.
Yeah.
People were really upset because of what it was showing.
And so, according to my sources,
some of the folks involved in making this decision to leave it up
were once again taken into an executive's office.
And so we went up and there was a lot of internal pressure to remove it.
And I'd go to my boss and say, hey, look, this is the decision we made.
I recognize this is controversial.
I want to let you know why we made these decisions.
And they made their case.
There are valid and important human rights reasons
why you would want this to be out there to show the kind of savagery.
And she vehemently disagreed with that.
They took another approach, arguing that if we take this down...
You're deciding to punish people who were trying to raise awareness.
Again, she wasn't budging.
And just didn't get... didn't get past that.
And ultimately, I was overruled, and we removed it.
Just because there was pressure to do so.
The same people that six months prior told them to leave it up because it was newsworthy, said,
Take the video down.
Facebook this week reversed a decision and banned a video.
posted to the site of a woman being beheaded.
In a statement, Facebook said, quote, when we review...
If you want the one from Boston in, you probably should have the one from Mexico in.
Right.
It was a mistake.
Yeah.
I think it was a mistake.
Because I felt like, like, why do we have these rules in place in the first place?
And it's not the only reason, but decisions like that are the thing that precipitated me leaving.
Leaving?
Yeah, not too long after that incident,
a few members of the team decided to quit.
And what I think this story shows
is that Facebook has become
too many different things at the same time.
So Facebook is now sort of a playground.
It's also an R-rated movie theater.
And now it's the front page of a newspaper.
That's all those things at the same time.
It's all those things at the same time.
And what we, the users, are demanding of them
is that they create a set of policies that are just.
And the reality is justice means a very different thing
in each one of these settings.
Justice would mean that the person in Mexico
gets told the truth in Mexico by Facebook
and the little boy in England
doesn't have to look at something gory and horrible in England.
But you can't put them together because they clash.
Exactly.
So how do you solve that?
I don't know.
I think it's important to keep in mind that even if you have the perfect set of policies,
that somehow managed to be just in different settings and that can be consistently enforced,
the people at the end of the day making these decisions, they're still people.
They're still human beings.
Is this working or no?
They can hear you.
Yeah.
Great.
Okay, okay. At long last, we've figured it out, huh?
Yeah, clearly.
I spoke to one woman who did this work for Facebook.
I want to be anonymous.
I don't want them to even know that I'm doing it
because they might file charges against me.
We'll call her Marie.
She's from the Philippines, where she grew up on a coffee farm.
Yeah, that's my father's crop.
And I didn't know that coffee was only for adults.
She said many afternoons, while
she was growing up, she and her mother would sit together, like, outside, sipping their coffee,
and tuning into their shortwave radio.
This is the Voice of America, Washington, D.C.
And they'd sit there.
Listening to the Voice of America.
Silent.
I'm going to ask that we all bow our heads in prayer.
She said one of her favorite things to catch on Voice of America were Billy Graham's sermons.
Billy Graham, one of the great evangelists.
Our father, we thank thee for this love of God
that reaches around the world and engulfs all of mankind.
But then,
Fast forward 50 years to 2010,
and Marie is consuming a very different sort of American media.
The videos were the ones that affected me.
There were times when I felt really bad that I am a Christian
and then I look into these things.
She became a content moderator back in 2010.
and was actually one of the first people in the Philippines doing this work.
I usually had the night shift in the early morning or at dawn from 2 a.m. to 4 a.m.
She worked from home, and despite it being dark out, she'd put blankets up over the windows,
so no one could see in at what she was looking at.
She'd lock the door to keep her kids out.
I have to drive them away.
Or I would tell them that it's an adult thing. They cannot watch.
And she and the other moderators on her team who lived throughout the Philippines,
they were trained on the guidelines on this rulebook.
There were policies that we have to adhere to.
But some of us were just clicking pass, pass, pass, even if it's not really a pass, just to finish.
Just to get through the content fast enough.
And in some cases, she thinks...
A number of the moderators are doing it as a form of retaliation for the low rate.
People were pissed at the low pay.
If I can ask, how much were you making an hour doing this?
As far as I remember, we were paid like $2.50 per hour.
Marie wouldn't say whether or not this low wage led her to just let things through.
But she did say...
Based on my conservative background, there are things that I cannot look objectively at.
So I reject many of the things that I think are not acceptable.
Really?
Of course.
She said whether something was outside the rules or not,
if her gut told her to, she just took it down.
Whenever it affects me a lot, I would click the button of, like, it's a violation.
Because if it's going to disturb the young audience, then it should not be there.
So, like, if there's a nude person.
Whether it was a breastfeeding photo or an anatomy video or a piece of art.
I would consider it as pornography.
And then click right away.
It's a violation.
You took the law into your own hand.
You went vigilante.
Yeah.
Or something.
So, yeah, I have to protect kids.
from those evil side of humankind.
Where does that leave you?
Does it leave you feeling that this is just, at the end,
this is just undoable?
I think they will inevitably fail,
but they have to try.
And I think we should all be rooting for them.
This episode was reported by Simon Adler,
with help from Tracie Hunte,
and produced by Simon with help from Bethel Habte.
Big thanks to Sarah Roberts,
whose research into commercial content moderation got us going big time,
and we thank her very, very much for that.
Thanks also to Jeffrey Rosen,
who helped us in our thinking about what Facebook is.
To Michael Chernus, whose voice we used to mask other people's voices.
To Carolyn Glanville, Ruchika Budhraja,
Brian Dugan, Ellen Silver, James Mitchell, and Guy Rosen.
Of course, to all the content moderators
who took the time to talk to us.
and...
Do you want to sign off?
Yeah, I guess we should, huh?
Ready? You want to go first?
Yeah, I'm Jad Abumrad.
I'm Robert Krulwich.
Thanks for listening.
The message press two.
Message one.
Kate Klonick from Brooklyn, New York.
Radiolab was created by Jad Abumrad and is produced by Soren Wheeler.
Dylan Keefe is our director of sound design.
Maria Matasar-Padilla is our managing director.
Our staff includes Simon Adler, Maggie Bartolomeo,
Becca Bressler, Rachel Cusick, David Gebel, Bethel Habte,
Tracie Hunte, Matt Kielty, Robert Krulwich, Annie McEwan,
Latif Nasser, Malissa O'Donnell,
Arianne Wack, Pat Walters, and Molly Webster.
With help from Shima Oliaee, Carter Hodge, and Liza Yeager.
Our fact checker is Michelle Harris.
End of message.
