Today, Explained - Facebook’s PTSD payout

Episode Date: May 14, 2020

Facebook has agreed to pay a $52 million settlement to its content moderators. Reporting by The Verge's Casey Newton was a game changer. Transcript at vox.com/todayexplained. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 The all-new FanDuel Sportsbook and Casino is bringing you more action than ever. Want more ways to follow your faves? Check out our new player prop tracking with real-time notifications. Or how about more ways to customize your casino page with our new favorite and recently played games tabs? And to top it all off, quick and secure withdrawals. Get more everything with FanDuel Sportsbook and Casino. Gambling problem? Call 1-866-531-2600.
Starting point is 00:00:23 Visit connexontario.ca. There are four shifts at the Cognizant facility in Phoenix. The early shift starts at 7 a.m. Miguel might get in at 7 a.m. And after he badges through security, he needs to put all of his personal items in a locker. You're not allowed to bring a phone or a pencil onto what Cognizant calls the production floor because they don't want you to be tempted to write down the personal information of any of the people whose content you might be reviewing. So he gets to his desk, which is not permanently assigned to him. There are about 300 people who are going to work here today.
Starting point is 00:01:06 And because of high turnover and other reasons, it's easier to just make people kind of find a new desk when they get in. And when he's ready to start looking at content, he clicks a button labeled Resume Reviewing inside a custom piece of software called the Single Review Tool. And from there, almost anything could happen. He could look at something totally benign. He could look at a decapitation.
Starting point is 00:01:32 He could look at child exploitation. He could look at any number of really, really disturbing things. And he has no advance warning of what he's going to see before it pops up on his screen. Once he sees it, up on his screen. Once he sees it, though, his job is to decide as quickly as he can whether it violates any of Facebook's policies, which are voluminous and ever-changing. For this task, Miguel will be paid $15 an hour, which comes out to an annual salary of $28,800 a year. And that compares to median compensation at Facebook, which is $240,000 a year if you include salary, stock, and bonuses. You might be experiencing deja vu right now, but it's not your imagination. Casey Newton from The Verge told us about Miguel and Facebook content moderators at the Cognizant facility in Phoenix in March of last year.
Starting point is 00:02:41 His reporting on the trauma content moderators endure was a game changer for the tech industry. It was picked up and shared all around the world. It led to this company Cognizant leaving the content moderation business altogether last year. And this week, we got the biggest news yet. Facebook decided to settle a class action lawsuit filed by current and former content moderators. Facebook is paying up. We called up Casey to explain how big a deal this all is. But before we get to that, we wanted to remind you exactly what this content moderation business is like. So here's the reminder from our episode last year. It was called Friends Without Benefits. Casey started off with life at Cognizant. So at Cognizant, your time is managed down to the second. When Miguel wants to go to the bathroom,
Starting point is 00:03:37 he needs to click a Chrome extension. He gets two 15-minute breaks a day. He gets one 30-minute lunch, and then he gets nine minutes a day of something called wellness time. And what can you do on that wellness time? The idea behind wellness time was if you saw a really disturbing video and you felt like you needed to stand up from your desk, take a walk around the block, maybe go see a counselor, that is the time that you can do that. They have also tried to add activities for employees over time. So there's like yoga, there's meditation. But one of the things I found was that the sites are so crammed and the bathrooms are so few that often there would be long lines for the bathrooms
Starting point is 00:04:19 on employees' breaks. So after they went to the bathroom, they might only have a couple of minutes to go to their locker, check their phone before they went back to work. So some people actually started using their wellness time to go to the bathroom. And then when Facebook found out about that, they sent down a directive that said you are not allowed to use your wellness time to go to the bathroom. What kind of toll does looking at all of this stuff like beheadings and drone strikes take on people mentally? So it can leave you with either post-traumatic stress disorder symptoms. Some of the folks I spoke with said they had been diagnosed with PTSD. It might be more common for them to be diagnosed with something called secondary traumatic stress disorder, which is a very similar set of symptoms that people get when they witness other
Starting point is 00:05:07 people's trauma. So it's not uncommon, for example, for psychologists or social workers to experience these kinds of effects, right? If you're talking to people all day long who have suffered from traumas, you yourself, as an empathetic being, are going to start to experience similar things. One of the people I spoke with was a girl who I call Chloe in the piece, and she had applied for this job right out of school. She didn't have any other immediate prospects, and because the job paid $15 an hour, that's $4 more than Arizona minimum wage. It seemed like maybe the best that she could do. So she took the job and was about three and a half weeks into training when she had to do this particularly grim exercise. And the way the exercise works is you're in a room with all of your fellow trainees. There's a video screen on
Starting point is 00:06:05 the wall. You walk up to a laptop and you hit play on some piece of content. And then the screen is going to show you something. And then it's your job to explain to everyone around you if it violates the community standards. And if so, why? Right. Kind of like a pop quiz type of feel. And so Chloe walks up and she hits play and she immediately starts to see a man who is being stabbed dozens of times. He's screaming, he's begging for his life. And effectively he's being murdered, right? Chloe knows that it violates the rules to show someone being murdered on Facebook. So she explains that. But as she does, she hears her voice shaking. She goes back to her seat. She finds that she can't concentrate. She has an overwhelming urge to sob. She leaves the room.
Starting point is 00:06:57 She starts crying so hard that she has trouble breathing. And keep in mind that, you know, there are supervisors around her. No one has come out of the room to check on her. No one is trying to comfort her. No one is trying to get her resources. And Chloe actually leaves it to herself to go back into the room when she feels like she's put herself together. The first thing she sees when she goes back into the room is that a drone is shooting people from the air. And so she watches these bodies go limp while she's in the room. And at that point, she needs to leave. So she goes into the break room.
Starting point is 00:07:29 She's sobbing. She goes into the bathroom. She's sobbing. Finally, a supervisor finds her, kind of gives her this weak hug, and says, yeah, you know, it's tough out there. Why don't you go see the counselor? There are counselors on these sites. They're not there all the time.
Starting point is 00:07:47 They are there during large stretches of the day. And you can go see one if you're having a problem. So Chloe goes to see the counselor. And her takeaway from that meeting, after he kind of gives her some suggestions for how to cope with what is the first panic attack that she's ever had in her life, her takeaway is,
Starting point is 00:08:04 what this man is really trying to do is to get me back on that production floor. to cope with what is the first panic attack that she's ever had in her life, her takeaway is what this man is really trying to do is to get me back on that production floor. Like his job is to get me back in operational shape so that I can go and review content, right? Like that is his primary concern. Now, what's interesting about that is many people actually are fired during training or leave shortly thereafter. And, you know, Chloe is someone who is still struggling with anxiety and panic attacks many months after she's left Cognizant. If she had left during training, she would have had this sort of long-term adverse mental health problem and absolutely no resources from either Cognizant or Facebook for, you know, this thing that never
Starting point is 00:08:42 would have happened to her if she hadn't started the job. The thing that really struck me reading your piece is like, these people are protecting like all of us. They're like the guardians of our Facebook galaxy. And yet they're paid not very well. And they're given these short breaks and their time is so structured. It's like, I mean, Facebook is one of like the richest companies in the world. Couldn't they just at least pay these contractors more? Literally, yes. Sean, literally, yes. They made $6 billion in profits last quarter. That is money they're not spending on anything else. So they could take that money and they could pay these people more. You know, the more that I reported on this story, the more this work came to feel to me like other work that first responders do, police officers, firefighters, social workers.
Starting point is 00:09:33 And something that those jobs tend to have in common is that we recognize them as a society level task that is so important that we all pay for them collectively with our taxes, right? Like that is how important we see those jobs. And once your platform has scaled up to more than 2 billion people, which is how big Facebook and Instagram and the various other properties that Facebook owns are together, you have created a society. And yet, who is policing it?
Starting point is 00:10:02 It is not, you know, all of us chipping in to make sure that these people are safe and supported and, you know, have pensions. It's people who are being paid $15 an hour. You mentioned that Chloe left. What made her leave? She was very concerned about her deteriorating mental health. So she still struggles with anxiety. She still struggles with panic attacks. One of the things that Chloe told me was that, you know,
Starting point is 00:10:31 in the aftermath of leaving the job, she had gone to watch the movie Mother, which includes a violent stabbing spree. As she watched it, she started to think about that first video that she had seen in training, and she felt another panic attack coming on. And so she had to stand up and leave the theater. She's still kind of struggling to get on her feet.
Starting point is 00:10:58 I think she's doing okay, but I think she would probably tell you that she would have been better off had she never taken this job. Casey and the news of Facebook's settlement after the break. Thank you. and spend management software designed to help you save time and put money back in your pocket. Ramp says they give finance teams unprecedented control and insight into company spend. With Ramp, you're able to issue cards to every employee with limits and restrictions and automate expense reporting so you can stop wasting time at the end of every month. And now you can get $250 when you join Ramp. You can go to ramp.com slash explained, ramp.com slash explained, R-A-M-P dot com slash explained. Cards issued by Sutton Bank. Member FDIC. Terms and conditions apply.
Starting point is 00:12:36 Bet MGM, authorized gaming partner of the NBA, has your back all season long. From tip-off to the final buzzer, you're always taken care of with a sportsbook born in Vegas. That's a feeling you can only get with BetMGM. And no matter your team, your favorite player, or your style, there's something every NBA fan will love about BetMGM. Download the app today and discover why BetMGM is your basketball home for the season. Raise your game to the next level this year with BetMGM is your basketball home for the season. Raise your game to the next level this year with BetMGM, a sportsbook worth a slam dunk,
Starting point is 00:13:08 and authorized gaming partner of the NBA. BetMGM.com for terms and conditions. Must be 19 years of age or older to wager. Ontario only. Please play responsibly. If you have any questions or concerns about your gambling or someone close to you, please contact Connex Ontario at 1-866-531-2600 to speak to an advisor free of charge.
Starting point is 00:13:30 BetMGM operates pursuant to an operating agreement with iGaming Ontario. Casey Newton, we spoke to you around this time last year about your reporting for The Verge on Facebook content moderators. What we didn't talk about really was that there was this lawsuit pending against Facebook about this very issue. Can you tell us a little bit about the lawsuit? Yeah. So in the fall of 2018, just before I started my reporting on this issue, a moderator named Selena Scola, who had worked through a third-party vendor for Facebook in California, filed a lawsuit against the company in San Mateo Superior Court, alleging that Facebook had created an unsafe workspace for her. And over the next year and a half, she was joined by a handful of other moderators in Arizona, Texas, and Florida,
Starting point is 00:14:26 alleging that they had had similar experiences. So is this a so-called class action lawsuit? That's right. And the idea is that their claims would eventually enable all 11,250 or so people who had done this work in North America for Facebook through a variety of third party companies to join the case and get some sort of relief. So how does Facebook initially respond to this class action lawsuit? They denied the claims and said they were all baseless and pointed to the things that they do to try to take care of the moderators while they're on the job.
Starting point is 00:15:04 Facebook executives have maintained that the working conditions described by dozens of contractors do not accurately reflect the daily lives of the majority of its workers. And when does that storyline shift? It sort of shifted this week when Facebook and the plaintiffs reached a preliminary settlement. And while Facebook continues to deny the allegations, they have agreed to pay out $52 million as a part of a settlement with this class of former workers. So do the math for us. How much money can former and current Facebook moderators hope to see? Everyone will get at least $1,000. The idea is that people will take that money and go get a mental health checkup.
Starting point is 00:15:47 And for those people who either are newly diagnosed with a mental health condition or already have a mental health condition and can get a doctor's note, they'll be able to get two additional kinds of relief. One is they'll be eligible for between $1,500 and $6,000 of money to support their ongoing mental health care. And then two is they'll be eligible for up to $50,000 in damages, depending on the severity
Starting point is 00:16:15 of the injuries they suffered. The big asterisk on all of that is that people will be eligible for less money the more people apply. So once the money's gone, it's gone. And how many people actually get just depends on how many people ask for it. So this isn't meant to be a life-changing amount of money. Like you experience this life trauma, here's a bunch of money to take care of you. It's meant to be, here's enough money to go see a doctor. Yeah, but it's also symbolic, right? Because up until now, Facebook has denied that people are working in an unsafe environment. Some of the reports, I think, are a little overdramatic from digging into them and understanding what's going on. It's
Starting point is 00:17:01 not that most people are just kind of looking at just terrible things all day long. And now the company has been forced to acknowledge that in fact, a significant percentage of workers may actually be experiencing mental health issues as a direct result of doing the job. And now they're going to be paid out for it. And one of the things that that does is it sends a big message to anyone else who might do this job that, hey, this is a risky line of work, which I think is something that workers have a right to know. And it also puts pressure on all the other companies who are employing moderators to try to improve those working conditions as best they can, because otherwise they're going
Starting point is 00:17:41 to be paying out these kinds of settlements too. And by other companies, you're talking about YouTube or its parent Google or Twitter, all these other platforms where we have content moderation going on? That's right. Content moderation is a feature of pretty much any website that allows anyone to upload images or text. And so while Facebook might employ more moderators than anyone else, there are a number of companies who are employing thousands of folks to do this work around the world. And so this really does feel like a landmark settlement that is going to have a lot of implications for this industry going forward. Does that make Facebook something of like a trailblazer here to be the first to pay out and say, we realize that this sucks? Yeah, sort of. But they were also the first to be sued, right? So they sort of were dragged into this kicking and screaming.
Starting point is 00:18:34 Yeah. Does this open up Facebook and Google or Twitter or anyone else to further lawsuits like this, further class action lawsuits that maybe could even have greater payouts for moderators? Yes. So there is a very similar lawsuit that is unfolding in Europe right now against Facebook. It was filed in Dublin by a former moderator there named Chris Gray. I'm just really angry. I mean, it's a very demanding job. And we always felt like we didn't get any respect, that we weren't important. And now there's more and more people complaining that they've been hurt, they've been damaged by the work, including myself. And the only way to get this solved is to force Facebook to accept responsibility. I've already heard from those plaintiffs today, and they tell me that they are going to be looking for a much larger number. But you can certainly imagine that Facebook is
Starting point is 00:19:30 going to make a similar deal with those moderators. And then I think the big question is, what happens with Google and YouTube? What happens with Twitter? And what happens with some of these platforms that maybe we're not thinking about yet, like a Pinterest or a smaller social network? Again, I just think that we're going to about yet, like a Pinterest or a smaller social network. Again, I just think that we're going to start seeing a lot more of this. Has anything actually changed for the people who work in these positions? Yeah. Since I started my reporting on this, Facebook has rolled out a number of changes to the job itself. A lot of them have to do with what they call the tooling, which is basically
Starting point is 00:20:04 the software that moderators use to find new posts and determine whether they should stay up or come down. white by default, or mute the audio on videos by default, or blur faces in graphic imagery, all of which have been shown to reduce the psychic impact of doing this work over a long period of time. And then the other set of changes that Facebook agreed to is to provide extra counseling, particularly for people who are seeing graphic content on a daily basis. They'll be able to go see a counselor if they're having an emergency within one business day. They'll also get access to weekly one-on-one coaching with a counselor. So they're hoping that they can manage this with a combination of better software and more psychologists. I think the thing that struck so many of us who read your reporting on this in the first place was that you personalized this work because we just look at our machines and see
Starting point is 00:21:14 the things we want to see and go about our days, but there are people behind what we're seeing who are keeping us safe. I wonder, have you spoken to those people that you sort of shed a light on in your original reporting? People like Miguel, people like Chloe, since your original reporting, or even since this settlement? Yeah, I heard from both of them yesterday. Both of them were really excited about what had happened. And I also called up a guy named Sean Spiegel, who I interviewed for my story about the moderators in Tampa last June. Sean is just one of the moderators whose story has never really left my mind. He was an animal lover who had to watch constant videos of animal abuse and wound up getting diagnosed with PTSD. He's a school teacher now. He's doing better. He's received treatment from licensed mental
Starting point is 00:22:11 health care professionals who have worked with him. And he had no involvement in the lawsuit whatsoever, but I called him up to tell him what was in the settlement. And he just stopped me and he said, hey, I just want to say, I don't care if I get any money out of this. When I went to work at Facebook, they told me I was going to be making the world a better place when I was doing this work, and I felt like I wasn't, and that's what I wanted to tell you. Why was it that Sean didn't feel like his work was making the world a better place? I mean, even with the horrific nature of this line of work, did he not feel like his content moderation was helping matters at all? Is that what it was?
Starting point is 00:22:51 What Sean really struggled with was that he would see something terrible and he would remove it from Facebook and then the next day he would see it again and the next day he would see it again. And so it was this Sisyphean task where no matter how many times he removed a terrible thing from the internet, it came back. And he felt like Facebook wasn't doing enough to prevent those terrible images from being re-uploaded and re-shared. And so over time, the entire enterprise just came to feel futile to him. I just never cease to be amazed by the decency and the civic-mindedness of the folks who do this work. So many of them really do want to make a better internet for you and me. They want to make the world a better place. And all they were asking for was working conditions
Starting point is 00:23:43 that felt safe and took care of their mental health. And so we're still a long way away from that. But the hope is that this settlement is going to be a meaningful step forward. Casey, thank you. Thank you guys. Casey Newton reports on Facebook and Democracy for the Verge. You can find all of his reporting on content moderators, including the original piece we featured, The Trauma Floor, at TheVerge.com. I'm Sean Ramos-Furham. This is Today Explained.
