Today, Explained - Friends without benefits
Episode Date: March 1, 2019
Facebook moderators watch suicides, decapitations, and drone attacks so you don't have to, and they're paid just a fraction of what the average Facebook employee makes. The Verge's Casey Newton... got a rare glimpse inside "the Trauma Floor."
Transcript
LinkedIn. Do you have one? A lot of people do. And when you use LinkedIn Jobs to hire someone, your matches aren't just based on a resume.
They're based on skills and background and interests and activities and passions.
Post a job at LinkedIn.com slash explained and you'll get $50 off your first job post. That is LinkedIn.com slash explained.
There are four shifts at the Cognizant facility in Phoenix.
The early shift starts at 7 a.m.
Miguel might get in at 7 a.m.
And after he badges through security,
he needs to put all of his personal items in a locker.
You're not allowed to bring a phone or a pencil
onto what Cognizant calls the
production floor because they don't want you to be tempted to write down the personal information
of any of the people whose content you might be reviewing. So he gets to his desk, which is not
permanently assigned to him. There are about 300 people who are going to work here today.
And because of high turnover and other reasons, it's easier to just make people
kind of find a new desk when they get in. And when he's ready to start looking at content,
he clicks a button labeled resume reviewing inside a custom piece of software called the
single review tool. And from there, almost anything could happen. He could look at something
totally benign. He could look at a decapitation.
He could look at child exploitation.
He could look at any number of really, really disturbing things.
And he has no advance warning of what he's going to see
before it pops up on his screen.
Once he sees it, though, his job is to decide as quickly as he can
whether it violates any of Facebook's policies,
which are voluminous and ever-changing. For this task, Miguel will be paid $15 an hour,
which comes out to about $28,800 a year. And that compares to median compensation
at Facebook, which is $240,000 a year if you include salary, stock, and bonuses.
Casey Newton reports on Facebook for The Verge, and for the past few months, he's been talking to Facebook moderators.
These are the people who screen the social network for offensive content
from offices around the world.
Miguel's story you just heard is true, but Miguel isn't his real name.
Casey spoke with several people on the condition of anonymity,
and they all described
what it's like working on the front line, protecting the planet from the most horrifying
content you can imagine. So at Cognizant, your time is managed down to the second. When Miguel
wants to go to the bathroom, he needs to click a Chrome extension. He gets two 15-minute breaks a day.
He gets one 30-minute lunch.
And then he gets nine minutes a day of something called wellness time.
And what can he do on that wellness time?
The idea behind wellness time was if you saw a really disturbing video
and you felt like you needed to stand up from your desk,
take a walk around the block, maybe go see a counselor,
that is the time that you can do that.
They've also tried to add activities for employees over time.
So there's like yoga, there's some meditation.
But one of the things I found was that the sites are so crammed and the bathrooms are
so few that often there would be long lines for the bathrooms on employees' breaks.
So, you know, after they went to the bathroom, they might only have a couple of minutes to
go to their locker, check their phone before they went back to work. So, some people actually
started using their wellness time to go to the bathroom. And then when Facebook found out about
that, they sent down a directive that said, you are not allowed to use your wellness time to go
to the bathroom. I talked to some other employees who had observed
that Muslim employees at the site were praying during their wellness time, and they were ordered
to stop praying during their wellness time because for some reason that was never adequately
explained to me, that was not considered a valid use of their wellness time.
So it just sounds like it's a really high-pressure environment.
Yeah, because in addition to the micromanaging of your time and the really grim stuff that comes through your queue, you're held to this very high standard of accuracy.
So in my piece, I compare it to a high-stakes video game where you start with a perfect score of 100, and then every time you make a mistake,
your score goes down by maybe a point.
And if you fall much below 95,
you're going to start getting warnings from your boss.
And if you fall much below that,
you're probably going to be fired eventually.
Like Facebook takes accuracy really seriously,
but it does have a lot of really negative effects
on these folks in this very high-pressure environment.
What are the accuracy standards? What are the rules to this whole thing?
So as one of the employees said to me, accuracy is judged by agreement. So if I'm the moderator and I see something and I say, yeah, that can stay, that's going to be added to this giant
bucket of decisions that I made during the week. And then a person above me who is a quality assurance worker, also known as a QA,
they're going to review a subset of my decisions.
So they might review 50 or 60 of my decisions that I make during the week.
And from those, Facebook is going to audit a subset of the QA's decisions.
And so from that series of decisions, an accuracy score is going
to be generated. But what that means in practice, of course, is that the majority of decisions are
never reviewed by Facebook. They're only reviewed by the contractor. And we're just going to sort
of take the contractor's word that, well, if the reviewer and the QA agree, then it was accurate,
sort of no matter what actually happened.
So I guess you've given us a pretty good sense of what life is like for someone like Miguel.
What's life like for a QA?
So one of the really bonkers things about this story to me is that QAs work on the same sites as the content reviewers, right? Because if you're a content reviewer and the QA judges you for being
wrong, all of a sudden your job is at risk. And if that person works at your office and you see
them in the hallways, you're probably going to want to go up and say something to them about it.
Now, officially that's prohibited, but according to the moderators I spoke with,
it was a regular occurrence. And it wasn't always in a friendly like, hey, man, can you do me a solid kind of thing.
So I talked to one guy named Randy.
He is a QA.
He would go to his car at the end of the night and there would be content reviewers waiting for him.
They would threaten to beat him up. And it freaked him out. And so he actually started
bringing a gun to work to protect himself, both from his fellow employees and from any former
employees who may have gotten fired and returned to the workplace, which is something that actually
happened while he was working there.
Wow. Casey, I mean, I think for a lot of people,
this line of work is sort of unfathomable. While you were reporting the story, what was the most sort of surprising or unexpected
upshot that came from people doing this kind of stuff all day?
The thing I was not expecting, although it makes perfect sense if you think about it
for a second, is that the moderators come to believe the conspiracy content that they're
reviewing.
One person I talked to told me that he found that he all of a sudden had a lot of questions
about the Holocaust.
Another person I spoke with said that he no longer believed that 9-11 was a terrorist
attack.
The same person told me that he's now convinced that multiple shooters were responsible for
the Las Vegas massacre, which, you know, the FBI has said that a lone gunman was responsible for that.
And one of the people I spoke with was so frustrated and would say, guys, like, we're here to get this stuff off of Facebook.
And yet here they are, and they're all being persuaded and radicalized by it.
What kind of toll does looking at all of this stuff like beheadings and drone strikes take on people mentally?
So it can leave you with post-traumatic stress disorder symptoms.
Some of the folks I spoke with said they had been diagnosed with PTSD.
It might be more common for them to be diagnosed with something called secondary traumatic stress disorder, which is a very similar set of symptoms
that people get when they witness other people's trauma.
So it's not uncommon, for example,
for psychologists or social workers
to experience these kinds of effects, right?
If you're talking to people all day long
who have suffered from traumas,
you yourself, as an empathetic being,
are going to start to experience similar things.
One of the people I spoke with was a girl who I call Chloe in the piece.
And she had applied for this job right out of school.
She didn't have any other immediate prospects.
And because the job
paid $15 an hour, that's $4 more than Arizona minimum wage, it seemed like maybe the best that
she could do. So she took the job and was about three and a half weeks into training when she had
to do this particularly grim exercise. And the way the exercise works is you're in a room with all of your fellow trainees.
There's a video screen on the wall. You walk up to a laptop and you hit play on some piece of
content. And then the screen is going to show you something. And then it's your job to explain to
everyone around you if it violates the community standards. And if so, why, right? Kind of like a pop quiz type of feel.
And so Chloe walks up and she hits play,
and she immediately starts to see a man who is being stabbed dozens of times.
He's screaming. He's begging for his life.
And effectively, he's being murdered, right?
Chloe knows that it violates the rules to show someone
being murdered on Facebook, so she explains that. But as she does, she hears her voice shaking.
She goes back to her seat. She finds that she can't concentrate. She has an overwhelming urge
to sob. She leaves the room. She starts crying so hard that she has trouble breathing. And keep in mind that there are supervisors around her.
No one has come out of the room to check on her.
No one is trying to comfort her.
No one is trying to get her resources.
And Chloe is actually left on her own to go back into the room
when she feels like she's pulled herself together.
The first thing she sees when she goes back into the room
is that a drone is shooting people from the air.
And so she watches these bodies go limp while she's in the room.
And at that point, she needs to leave.
So she goes into the break room.
She's sobbing.
She goes into the bathroom.
She's sobbing.
Finally, a supervisor finds her, kind of gives her this weak hug,
and says, yeah, you know, it's tough out there.
Why don't you go see the counselor?
There are counselors on these sites.
They're not there all the time.
They are there during large stretches of the day.
And you can go see one if you're having a problem.
So Chloe goes to see the counselor.
And her takeaway from that meeting,
after he kind of gives her some, you know,
suggestions for how to cope with what is the first panic attack
that she's ever had in her life, her takeaway is what this man is really trying to do
is to get me back on that production floor. Like his job is to get me back in operational shape
so that I can go and review content, right? Like that is his primary concern. Now, what's
interesting about that is many people actually are fired during training or leave shortly thereafter.
And, you know, Chloe is someone who is still struggling with anxiety and panic attacks
many months after she's left Cognizant.
If she had left during training, she would have had this sort of long-term adverse mental health problem
and absolutely no resources from either Cognizant or Facebook for, you know,
this thing that never would have happened to her if she hadn't started the job.
You mentioned that Chloe left. What made her leave?
It was a handful of related things that led her to leave.
She was very concerned about the threat that she might come to believe conspiracy theories.
And she was very concerned about her deteriorating mental health. So she still struggles with anxiety. She still struggles with panic attacks.
One of the things that Chloe told me was that, you know, in the aftermath of leaving the job,
she had gone to watch the movie Mother, which includes a violent stabbing spree.
As she watched it, she started to think about that first video
that she had seen in training,
and she felt another panic attack coming on,
and so she had to stand up and leave the theater.
She's still kind of struggling to get on her feet.
I think she's doing okay,
but I think she would probably tell you
that she would have been better off had she never taken this job.
After the break, Casey gets a surprise invitation from Facebook.
I'm Sean Rameswaram. This is Today, Explained.
LinkedIn Jobs uses knowledge of hard skills,
but also soft skills to match you with the people who fit your role best.
I'm more of a soft skills kind of guy myself.
People come to LinkedIn every day to learn and to advance their careers. So LinkedIn understands what they're interested in and what they're looking for.
Matching on LinkedIn Jobs lets you quickly get a group
of the most relevant, qualified candidates for your role.
And that way, you, future employer,
can focus on the candidates you want to spend time talking to
and make a quality hire you're excited about.
Customers rate LinkedIn Jobs number one in delivering quality hires.
You can post a job today at linkedin.com slash explained and get $50 off your first job post.
That is linkedin.com slash explained.
Casey, you published this story earlier this week.
What's been the response from Facebook?
So a few days before my story published, I reached out to Facebook and asked if we could talk.
And they invited me to the facility itself so I could see it with my own eyes, which I have to say was a surprise.
I was sort of expecting a canned statement.
And instead they said, meet us in Phoenix and we will introduce you to the folks who work here.
What was it like? Tell me about your trip.
Well, awkward, I guess, would be one major descriptor I would put on it.
They knew everything that I was about to report, you know, because I have a no surprises policy.
I like companies to know, you know, what I'm about to say to them so they can address anything with me up front.
And so I was very interested to see what it was going to be like.
You know, by that point, I talked to so many moderators that I felt like I could have drawn a sketch of the facility myself.
But, you know, still there were some surprises.
One of the things I say in the piece is that it was way more colorful than I expected it to be.
Like literally colorful.
Yes.
Like I didn't think it was going to be dark and dingy, but I also didn't expect there to be neon wall charts that said, hey, Wednesday's crazy hair day.
And on Wednesdays we wear pink. It was kind of a mix between like a summer camp and a senior center
is how I would describe the activity list.
Did any of the crazy hair day and neon colors convince you that like
maybe this job wasn't as bad as you thought or you had written about?
Look, I think people have a range of different experiences at this job.
One of the things that I did while I was there was meet with five people who volunteered to speak with me, or at least I was told that they volunteered.
And they sat in a room with me and their manager.
No one from Facebook was present.
And I asked them a bunch of questions about the job.
And some of them did say, we're very proud of the work that we do, which was something I heard from basically everyone.
And we feel like we're making a valuable contribution to the world and we're happy to be
here. And, you know, the gross stuff that we see, it doesn't affect me as much. So, you know, I don't
want to give the impression that no one can do this job. But, you know, I just keep thinking of
what Randy told me, which was, he said, I don't think it's possible to do this job
and not come away with some sort of lasting
mental health effects.
And on that point, like,
that seems more true than not to me.
You heard from people like Chloe about, like, PTSD.
Did you ask them about that when you visited?
Yeah, I did.
And one of the counselors, who I call Logan in the piece, kind of redirected the question and said,
well, actually, let me tell you about a phenomenon called post-traumatic growth.
And he described post-traumatic growth as a phenomenon whereby after something bad happens to you,
you bounce back stronger than you were before. And the example
that he gave me was that of Malala, the teenage women's education activist who got shot in the
head by the Taliban. And he said, you know, that was clearly a very traumatic experience for her,
but look at her. She has a Nobel Peace Prize.
The thing that really struck me reading your piece is like,
these people are protecting like all of us.
They're like the guardians of our Facebook galaxy.
And yet they're paid not very well.
And they're given these short breaks and their time is so structured.
It's like,
I mean, Facebook is one of like the richest companies in the world. Couldn't they just at
least pay these contractors more?
Literally, yes. Sean, literally, yes. They made $6 billion in
profits last quarter. That is money they're not spending on anything else. So they could take
that money and they could pay these people more. You know, the more that I reported on this story, the more this work came to feel to me like other work that first responders do, police officers, firefighters, social workers.
And something that those jobs tend to have in common is that we recognize them as a society-level task that is so important that we all pay for them collectively with our taxes,
right? Like that is how important we see those jobs. And once your platform has scaled up to
more than 2 billion people, which is how big Facebook and Instagram and the various other
properties that Facebook owns are together, you have created a society. And yet, who is policing
it? It is not, you know, all of us chipping in to make sure that
these people are safe and supported and, you know, have pensions. It's people who are being paid $15
an hour.
This isn't just Facebook that operates that way, right?
No. And it's really important
to say Facebook did not invent this model of content moderation, even though it was new to me. This is the same model
that is used by Twitter. It's used by Google, which also means YouTube. And it's been used by
other tech firms in the past. So it's a model that has a long history. I would say, though,
that in the past, these platforms were smaller, in some cases, radically smaller.
And while a small size doesn't solve all of these problems,
I do think it does make them somewhat more manageable.
So one of the things that I'm arguing
is that this call center model
of policing the biggest platforms in the world
is really straining under the weight
and has arguably broken in a lot of places.
What do you think these companies can do to actually make this a better system, to make this function better?
I think there's two things that a lot of the moderators I spoke with would
like to see. One is they deserve more pay, and so I hope that they get it. Two is they can just take away some of the crazy
restrictions on people's personal freedom. I don't think that they should have to log in a
time tracker every time they go to the bathroom. I think they should be able to go to the bathroom
for long periods of time and tell no one about it the same way that most of us who work in offices
do, right? I don't think anybody should be telling them how to manage this extra wellness time that they're getting. You know, I learned just today that at at least some
sites, the workers are judged partially on what is dystopianly called an occupancy score, which
is the amount of time during the day that their mouse is moving, right? So 5% of the score of one of the people I spoke with
is how much his mouse was moving during the day.
These firms could stop doing that overnight
and it would have no effect on fricking anything
except that these people wouldn't feel like
they worked in this hellish panopticon.
It is amazing to think of that, right?
Because like the rest of Facebook's employees
work in this, like, Frank Gehry-designed mecca
in Silicon Valley and they have like in-house chefs and all of these amazing benefits and
just free shit as far as the eye can see.
You know, doctors on site, free dry cleaning.
You know, there's this kind of fake main street on Facebook's campus where all of the restaurants
are free and there's
a pop-up shop that changes seasonally. So, oh, hey, it's Valentine's Day and you forgot to buy
something for your sweetheart. Good news. Here's a dozen roses. Just waltz off the site with them.
These are real things. And that's what it's like to be a Facebook employee. And that's in addition
to the $240,000 a year that more than 50% of you are taking home. So the money can pay for those
things. Let's talk about what this really is, Sean. This is about how we value people, right?
One of the reasons these people are compensated in the way that they are is because Mark Zuckerberg
believes that this work will one day be done by machines, right? They are building machine
learning systems that can identify this stuff before it ever gets uploaded so that someday Facebook won't have to rely on a moderator to get there.
And arguably, this is a good thing, right?
Like there's a lot of this work that we probably do want to be automated so that we can spare
human beings from having to see it.
But once you've decided that people are doing work that will one day be done by machines,
you are going to radically devalue
that work, right? Because you were saying that all you're doing now, this is just temporary.
And so, yes, you can have some crumbs along the way, but like this job is not a lifetime job.
It's not a career. And so, you know, do it for as long as you want, but don't get too comfortable. I think that there is another world in which
tech companies said, look, we hope that one day AI can do some portion of this work. But,
you know, because of the empathy we feel as human beings
and because, frankly, it's good for our brands,
we're going to try to take extraordinary care
of the people who are doing the jobs
that we ourselves would never want to do, right?
Like, that is a choice they could make.
And so I hope that that's some of the discussion
that we have in the months ahead.
Casey Newton reports on Facebook and democracy for The Verge.
You can find "The Trauma Floor," his piece on Facebook moderators, at theverge.com.