Factually! with Adam Conover - A.I. Doesn’t Run the Internet; Exploited Humans Do, with Sarah T. Roberts
Episode Date: December 4, 2019
Technology expert and UCLA professor of information studies Sarah T. Roberts joins Adam to discuss the oversold fantasy of artificial intelligence, the real humans who labor behind the scenes to moderate your social media feeds, and the psychological effects the work takes on them. This episode is sponsored by Acuity (www.acuityscheduling.com/factually), KiwiCo (www.kiwico.com/FACTUALLY), and Parcast - Natural Disasters (www.parcast.com/NATURALDISASTERS). Learn more about your ad choices. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down a little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburi Gakko smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You've got to get one of these for yourself. And get this: for the month of March,
Bokksu has a limited edition cherry blossom box, and 12-month subscribers get a free kimono-style
robe. And get this: while you're wearing your new duds, you'll be learning fascinating things
about your tasty snacks.
You can also rest assured that you have helped to support small family run businesses in
Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight
to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com.
I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything.
Hello, welcome to Factually. I'm Adam Conover.
And, you know, if you listen to politicians, tech titans, and crazy uncles around the country,
the robots are going to take over.
Supposedly, a revolution in artificial intelligence is transforming our economy,
making human workers unnecessary, right?
I mean, robots don't need health care or minimum wage or job security.
AI doesn't need regular bathroom breaks or even bathrooms at all. And hey, best of all, AI would never betray you, you titan of industry, and
form one of those pesky unions. Hey, you can straight up program them to never go on strike.
Wow, the perfect employee. Humans are screwed, right? Well, the truth is this story of encroaching AI supremacy has been a bit oversold.
Let's say radically oversold. The fact is it is extremely difficult to develop AI that can perform
even basic tasks a human can. And, you know, if you want a worker that doesn't get healthcare
or bathroom breaks, you know, you can just not give those to a human worker.
So as a result, it's often cheaper and easier to just pay exploited human workers peanuts than it is to design, deploy, and debug a multi-billion dollar technical solution.
And it likely will be for a long time.
Take Uber, for example.
You know, they've long teased this future where they replace human drivers with super self-driving cars.
And Wall Street is so convinced that that's the future of the company that their self-driving car unit is valued at over $7 billion.
But look, even if you ignore the fact that Uber's executives have been charged with stealing their self-driving technology from Google and the fact that one of their AI cars killed a person in 2018
because it was too stupid to realize it needed to hit the brakes, the idea of Uber replacing
all of their drivers with a self-driving fleet is still pretty far-fetched. You know, a while back,
I was actually chatting with someone who works at Uber, and they told me, kind of on the DL,
so I'm kind of blowing them up a little bit. Apologies. They told me that the demand for rides is far too high for them to ever handle it entirely with automated vehicles. Think
about it. Uber provides 15 million rides a day. So which is cheaper? Building and maintaining a
fleet of millions of cutting edge high tech autonomous vehicles or paying human drivers
less than minimum wage to supply and drive
their own Toyota Corollas. I think I know which one I'd pick, and I think that they are picking
it right now. They're going to go with the humans. Or let's take another example. AI boosters have
long promised that artificial intelligence will soon do things like translate text from one
language to another, or transcribe audio, or identify objects in photos
and videos. But right now, humans are still much, much better at those things, and humans are being
paid pennies to do them on Amazon's Mechanical Turk website. The Mechanical Turk is a site that
tech companies can use to farm out repetitive tasks to an army of actual humans who are being paid literally
$4 to $5 an hour.
You know, from a bird's eye view, it kind of looks like AI, except that the actual work
is being done by people being paid poverty wages, which makes Mechanical Turk a bit of
an on-the-nose name, since the original Mechanical Turk was a chess-playing robot that wowed
audiences who didn't realize it was actually being controlled by a very cramped human chess player hidden inside. Well, the truth is, we are living in
a mechanical Turk economy. Underneath the shiny veneer of our sleek, futuristic technosphere,
there is a massive underclass of workers who are performing so many of the tasks we assume
are being done by algorithms,
or that we just don't think about at all.
For instance, you might not realize that every time you log into Facebook and Twitter,
there is a gigantic, poorly paid army of laborers making sure that your feed isn't clogged with spam and porn.
Without them, the internet would be a nightmare of busted links and offensive content,
or at least more of a nightmare of busted links and offensive content.
But who are these folks? Where do they come from? And how do big tech companies keep them hidden out of sight?
Well, our guest today has done pioneering work on the people who keep our social media feeds clean and the effects of that work on their lives and livelihood.
Sarah T. Roberts is a professor of information studies at UCLA and the author of Behind the
Screen: Content Moderation in the Shadows of Social Media.
Please welcome Sarah T. Roberts.
Sarah, thank you so much for being here.
Thanks for having me.
This is great.
So tell me about your work.
What is it that you do?
So, I'm a professor at UCLA, and my purview there is to be a researcher of the labor of the internet. For a lot of people,
that might sound like an oxymoron, because what labor really is involved in a thing that's all
computers? I've been looking at my particular area of this issue for about a decade. And I can tell you that the internet, in addition to computers, is made up of a lot of human activity, which we don't see.
And the particular facet of that that I look at are people who sit in between us as users and the platform as sort of a distributor and review all the stuff that we upload to those platforms that we
think we're just going to kind of seamlessly share with the world. These are people called
content moderators and the people who do it for pay and as a job or as a profession, I call
commercial content moderators. And I can tell you that if you haven't heard of this job, that is by design. It's because it's a pretty miserable job
by and large. It means that human beings are sort of in a position to be the mediators of
oftentimes some things that at best are maybe boring or rote and at worst can be really nasty
and upsetting material. Well, I want to hear what kind of material
you're talking about. But yeah, first of all, this is not a job that I conceived of existing
before encountering your work. We sort of have this idea, well, when someone posts something
on Facebook, there's an algorithm that sorts it and determines whether or not you see it.
People are clicking a flag button maybe, and that's kicking it up in the algorithm's hierarchy or whatever.
But for the most part, we believe, and these companies present themselves as being just dumb pipes.
Hey, people post and we show it to you.
The end, free speech, do what you will.
Nothing else going on here.
But that's not the reality.
Look over there.
Don't look over here. I think, you know, there are facets to what you just
described that have truth in this ecosystem. But it's not as simple as the industry would
have us believe. Like I said, I've been looking at it for like a decade. And in that time,
there's been quite an evolution in terms of what is possible with technology. If we were having this conversation in 2010,
I'd tell you pretty unequivocally that computers would be largely useless in this kind of analysis,
like, is this good or bad? But in that time, the biggest players in the industry, the Facebooks,
the Googles, those who have the resources, meaning
money, computation, and employees to put on this problem have really accelerated the capabilities
of what you're talking about, which is like computational decision making. That having been
said, how do you train the algorithm to know where the line is with something like hate speech,
which we as a society cannot come to terms of agreement on, right?
Even people can't agree on what counts.
Yeah, that's right.
So, part of the issue that is kind of baked into this scenario is that on the front end
of what they're trying to maybe solve for,
there might be certain social issues that are fairly intractable.
And it's like just ramping up computation on that isn't going to, believe it or not, bring world peace.
You're going to need people to do it at the end of the day,
just like I was talking about in the intro,
where there's jobs where like, hey, you just need a lot of people to do them.
And you can get people cheaper than you can computation in a lot of cases if you're willing to drive it down that much.
I mean, there's really an economic dimension here, too, where there's a calculus done.
What is the return on investment of putting the entire apparatus that was supposed to be designing cool new functionality onto resolving what constitutes a picture of breastfeeding and what's somebody showing sexy skin, right? Yeah.
So the companies themselves have sort of –
That's not what the engineers who are like, oh, wow, I work at Google
and I'm going to work on the coolest new shit. That's not what they are interested in.
This is probably not the glamour side of the industry. And it's especially not that for the
people who are doing the human side of it. So, like, on the one hand, we have this,
you know, this is what I think you're getting at in this program. There's this, like,
aspirational quality that industry has sort of promulgated for the, I don't know, at least the
past five years that, hey, the algorithm, first of all, is a better decision maker. It does better
than humans, or at least it does it at scale or more efficiently or something.
And therefore, we're always just on the cusp, right?
We're always just almost there.
And maybe somebody is going to get hit by the self-driving car on the way to that.
Maybe that's part of, you know, maybe that's part of like the collateral that.
I mean, people, it's literally framed that way.
I was tweeting about the woman who was killed by the Uber self-driving car. And, you know,
we now have data from the Federal Highway Administration, or NHTSA perhaps, um, you know, the federal agency that
reviewed it, that said that, you know, according to the data, it could not recognize
that it was a pedestrian with a bicycle, didn't know what she was, wasn't programmed to recognize that a pedestrian could be in that place on that road. And I had people, I was just tweeting about this,
you know, that like, you know, the real danger of AI is that we're going to all trust that it's
superhumanly powerful when really it's just got the same biases and blind spots as the dumb ass
humans who designed it. And I had people in my mentions saying,
oh, well, AI is so much more safe than a human driver
that we should accept this one death
and move forward with that program
because think of all the deaths that'll happen
if we don't do that.
Therefore, this is collateral damage
that's acceptable to us.
That like, you know, because it's,
because according to this person's hypothesis,
that's literally just comes from Elon Musk
tweeting about this, right?
Like, where is the evidence that self-driving cars
are a hundred percent safer than human drivers?
What studies have been done on this?
This is just blind trust of technology.
But according to that person, that means that, oh, we should,
any criticism of artificial intelligence is by definition going to kill people. It's a ludicrous
assertion, but that is like the utopian halo that's been given to AI for so long. I mean, boom, right? I feel like, first of all,
I'll just, I'm going to slip into your mentions and try to reorient. I mean, I was fighting my
own battle this week around a similar issue, which had to do, you're probably aware of this too,
of the tech exec who applied for the new Apple credit card, and his wife, he and his wife share all financials,
have for decades.
She, in fact, has a better credit score,
and he got a higher credit limit by many, many factors.
You know, I want to say 10 times more,
but it could have been 100,
and somebody fact-checked me on that.
But the point is, you know,
where's the evidence of improvement, and by whose metric is the thing an improvement?
And what, you know, we're running into a situation where those measures of improvement are set up by the same small coterie of people who envision the problem space also, right?
You know, I think of someone like Elon Musk, who lives in LA, and we've got rampant homelessness, we've got a traffic gridlock
that almost had me not getting here. And what's the solution? Going to Mars and like,
making private tunnels under the city in an earthquake zone. Like, who vetted this?
But you know, one of the things that what your comment and your kind of interaction with your Twitter followers, God help you.
Oh, this was one out of, this is one out of a hundred replies I got to this.
You know, my Twitter followers are smart people.
I live on Twitter and, you know, we just, it's just what it is.
It's just what it is.
But, you know, one of the things that I pointed out in that commentary that I think is also relevant here is that, you know, we're willing, we, quote unquote, are willing to put some of the most precarious folks on the line in this experimentation.
Right. Like, who doesn't fit in the algorithm?
Well, in the case of Uber, it might be people with different body shapes, right? Or when
we do facial recognition, it's people with certain skin tones that don't match or who disproportionately
match in a database, right? So, we've got like these scholars out there, many of whom are my
colleagues who are pointing out, you know, and have been like sending up these signal flares
about these warnings. And I mean,
my hat's off to the gentleman who raised the issue about his credit score, because he's a white dude,
a tech millionaire, long-term married in a, you know, heterosexual marriage to a woman who said,
hey, I'm getting screwed too. And by doing that, you know, he kind of brought everyone else along.
But I think that calculus is really
disturbing. Like who's disposable along the way to get to this utopia that is always just out of
reach. Right, right. Yeah. Like the, what is this promise that we are blindly swallowing of the
self-drive? What is the self-driving car future actually, right?
Like what is it actually going to be?
If he is allowed to build exactly the thing
that he wants to build, what will the world look like?
Because that has not actually been described to us.
We've gotten the 3D rendered, you know,
planning ahead version.
We've got the little mini movie that they've made.
But we know just from watching what's happened with his own projects that like the gap between that and reality is enormous.
So what is the actual future that we will get if we allow that to continue untrammeled?
And it's not something that we actually have been presented with.
We haven't actually dived into what that is.
And so what are we sacrificing those lives for?
Yeah, I mean, I think one of the, you know, in all of the confusion and obfuscation around the
issues that you raise, you know, one thing that seems to have surfaced for me is that whatever
that future looks like, it's not going to be one that is envisioned to be equally
distributed throughout society. So when we're thinking about some of his other projects,
pet projects, and those of peers. Dunking on Elon Musk is like such a recurring topic on this
podcast. Yeah, I'm in the right place. Let's talk about your actual work because, man, I could
go off on the big picture issues here.
This is a wonderful conversation.
Yeah, I'm happy to chat about that anytime.
But let's talk about – no, we'll get back to it.
But let's talk about these specific content moderators.
Like for a site like Facebook, Facebook employs these folks?
And how many people are we talking and what is it that they're doing?
So, you know, it's difficult to nail down numbers, but I think in the tens of thousands
is a fair assessment.
Tens of thousands of people.
That's tens of thousands for Facebook specifically.
And, you know, when we think about some of the big players in the industry, so
that's not counting Instagram, which is another property that Facebook owns.
That's not counting something like YouTube.
That's not counting people doing content moderation activity for other properties for Google.
So you can see that, you know, if we take in the tens of thousands for one of these major
platforms and we kind of multiply it up, you see that we get at some pretty significant numbers. So, Facebook definitely has an entire ecosystem. And kind of speaking earlier about
the ability of firms to build their own in-house algorithmic or AI-informed tools,
they are also players in that space. So, what they have is something of a hybrid ecosystem where algorithms, machine learning,
informed tools, other kinds of computational tools are definitely used. But humans are also
a fundamental and mission critical part of that ecosystem, whether it's like being generalists
on the front end where they're on a live production floor, and they're looking at content
as it gets flagged and put into a queue, and they're making a decision that has kind of an instantaneous
effect.
Or more and more that they might be an employee looking at a curated data set and making decisions
about that data set to go to inform that machine learning tool that will then go to ideally
at the very least support or maybe even replace some of
that human activity. But at the end of the day, there's a certain extent to which it doesn't
matter, right? Like whether they're doing all of the moderation manually in front of their laptop,
or if they're at the end point of some complex algorithm that's giving them shit to look at at
the end of their, you know, at their laptop. They're still sitting there. We still have tens of thousands of people spending hundreds of thousands of person hours
a day, right? Looking at this content. You got it. So where are these people? Who are they? And
what are, what is their day to day? What does it look like? One of my earliest findings in the
process of doing my research was that there wasn't a one-size-fits-all solution; the industry was kind of bootstrapping, having this patchwork mechanism to meet their needs.
And so they were doing a number of things.
One of the things they were doing was kind of creating teams within their HQs.
This would be, in the case of the companies we're talking about, in Silicon Valley.
So they were hiring folks,
usually through contracting companies though, right?
So they weren't direct.
Oh, of course they're not direct employees.
They were not direct employees.
Keep the company lean, right, for Wall Street.
But they're bringing in these folks as contractors,
probably limited term.
But these people were coming to and from
the on-site facility and
working there, maybe down the hall from engineering or some other place in the building.
So there was a group of people like that. There were people doing this work sort of with third
party boutique specialist firms. That was another crew of people. And sometimes those firms were
providing social media management across the board,
including creating content or seeding content for companies that didn't have the desire.
Then we had folks working in what might come to somebody's imagination if they think about this,
which is the call center environment. Call centers are not just in Southeast Asia. They're
not just in Manila, but they certainly are there. That's, in fact, like the call center capital of the world and particularly for the American market. But I was finding evidence of these call centers in places like West Ames, Iowa, which is like pretty much next door to a cornfield or maybe a soybean field.
Facebook has a site in Austin, Texas. That's another big site.
There are places all over the United States, weirdly, who see their competition as places like India.
So, this company I discovered in Iowa, it was called Caleris back in the early 2010s.
It's changed hands and names many times.
But its tagline was "Outsource to Iowa, not India."
And I was just like, as a researcher, that was a goldmine.
And if you're listening, students take screenshots because they were going to change that stuff.
Now, how should we feel about that?
Because obviously, you know, many American workers have negative feelings about outsourcing, right?
But if it's, and so, okay, great.
Hey, an American call center, a union-made call center, right?
Perhaps, I'm sure it's not a unionized call center,
but you know, you've got that,
maybe there's that buy American halo around it,
except that then when you say they're competing with India,
but then wait, how low wage can these jobs be?
Well, you know, back in the early 2010s, when I found out about this, this Iowa based site through a New York Times article, they were reporting wages around like $9.75.
You know, so pretty low, no benefits.
Of course, that's many times greater than what a comparable employee in a place like the Philippines would earn.
But what that company, Caleris, was effectively doing was selling its value add to its business-to-business customers as being exactly what you're getting at, which is the American-ness.
And not just any American-ness, not those fancy highfalutin East Coasters.
Iowa, down home, Mike Huckabee.
I mean, people, folks cannot see me, but these are people who could be my cousins. I'm from
Wisconsin. You know, we're corn fed. We're hearty, salt of the earth. That's what they were selling.
I mean, the xenophobia was present. Like, don't outsource it to these unknown others in other
parts of the world. Keep it in Iowa.
I mean, I think the people in that call center
would have rather been on their family farms,
but they lost those in the 80s.
Yep, and Rust Belt where factories are collapsing
and et cetera.
And it almost looks like you've got these companies
saying like, oh, hold on a second.
There's a lot of exploited Americans without jobs
who will be willing to work for pennies.
That's right.
And the exploitation goes up the chain in the sense that there's a stratified system
wherever you find this work, even when the conditions are markedly better.
So, for example, the group of people I talked to in Silicon Valley,
yeah, their gig was probably nicer in material terms than the people in Iowa.
They weren't making much money at all, however, compared to their peers. And these were people who were required by the company they worked for to have a four-year university degree from places like Berkeley, USC, private liberal arts colleges.
And these poor saps, you know, had graduated with journalism degrees, econ, history. So,
you know, they made some bad life choices, strapped with student debt. And so, they were
ripe for this work too. And they even brought an even higher level maybe of cultural knowledge or
context. It's really funny that it's like, no matter where, whether you're in the Philippines
or you're in Iowa or you're in San Francisco Bay area, there are folks trying to push your wages
down, trying to, like, you know... there's a way to get fucked no matter where you live,
to put it bluntly. Because, yeah, I know people who have those jobs who, you know, are highly
educated, highly trained, but are making low enough money they can't possibly pay back their
loans. They can't live in those cities. They have, you know, two hour commutes, that kind of thing.
I'll go one step further, which is that it's the very fact that they're strapped with those
loans that makes them open to such a shit job in the first place.
Right.
So, the folks that I talked to who were working at a company that I called Megatech,
which obviously is a pseudonym, you know, they said things to me like,
yeah, it beats being a barista, which is what I was doing.
Although that, you know, later on that kind of, it's sort of a Faustian bargain. Like,
does it actually beat being a barista? Well, maybe marginally, maybe economically,
slightly there's an edge, or maybe like you get to sit at a desk all day and say you work at
Megatech and that has cachet. What's the actual cost on you in terms of the psychological impact?
I mean, it sucks slinging coffee,
but you're not exposed to child abuse most of the time.
Right, okay, so let's talk about that.
Let's talk about, yeah, Starbucks,
say maybe not the best,
but they try to make their baristas kind of happy there.
And they're definitely not making them look at images,
horrible images that people are posting on Facebook all day long. So what does that work
like that these folks are doing, these contractors are doing and what effect does that have on them?
So that's, you know, that's something that has also experienced an evolution.
In the early days, it really was just like an open pipeline in the sense that, you know,
as users, we might see something that we find disturbing. That doesn't happen that often.
But when you aggregate that over the millions and millions of people who are on the platform
at a given moment, now the numbers start to get great that people are reporting like,
you know, dangerous, abusive, maybe pornographic,
violent content, all of that would just kind of go into a queue. And, you know, we wouldn't
ever kind of know the end result or really think about it again. But there were people on the other
side sort of receiving that. Having to watch all of it.
Yeah. And in those early days, again, it wasn't really very sophisticated in terms of any triaging or how that stuff would necessarily get sorted.
And so it was the case, for example, that there might be something as a worker that you really have a hard time with.
I'll give an example.
In my case, it would be animal abuse.
That's just a thing that I have no capacity to deal with.
There would have been no way for me to kind of self-select out of the animal abuse reports
in the early days.
So, there are improvements around that.
But by and large, for these generalists who are sort of the kind of lowest level grunt
workers, they are sitting in front of a screen and they are reacting.
They're in a reactive mode to stuff that's already been flagged.
A lot of that could be false positives.
You know, somebody's pissed at their friend
and they start reporting all their posts as being,
you know, they're gaming it or whatever.
This stuff actually does need to be filtered.
Yeah, it needs to be filtered.
We need to be able to tell the difference
between good and bad.
Right, but, you know, intermingled in that
are things that are legitimately frightening.
Things that no one should have to see, certainly not over and over again. And so, you know, full disclosure, I'm not trained in psychology.
I'm not a psychologist, but I am a human.
And, you know, I have empathy and I could understand what people were reporting to me.
And in part because I've been on the internet for almost 30 years, I've rogue-clicked on stuff I thought was one thing and saw it and
it was something else. It's messed me up. So, you know, these folks would tell me,
you know, I can handle this job. I'm okay with it. I've come to terms with it. I can do it. Other
people can't do it. I can do it. And I take that on face value. And then like, you know, 20 minutes
later, the same guy would say to me, you know, I've really been avoiding
hanging out with my friends since I have this job because like whenever I'm in a social situation,
we just invariably talk about work and I don't want to talk about work. I don't even want to
tell people what I do. Not to mention they're all under nondisclosure agreements, by the way.
So, not only are they under the NDA, but they didn't want to
freak their friends out with the stuff they were seeing. It's not fun to be like, yeah, I saw a
kid get the shit beat out of them on, you know. But I can imagine. I mean, that's bad socially,
right? And being, I mean, we've done work on my show about how, you know, loneliness is a health
condition, right? That, you know, if something is impacting you that socially, that can actually hurt your health.
But, you know, hey, there's a lot of unpleasant jobs. You know, I'm sure, you know, garbage men
come home smelling like shit. You know, there is that part of it. And, you know, there are some
jobs that we all accept need to be done that way. And those are unfortunate jobs. And hopefully
people get them that are able to handle them. But at the same time, like I can imagine the psychological effect of this.
Like when I think of, you know, the one or two or three things that I've seen on the Internet that truly upset me, that, you know, videos where I accidentally saw someone being killed or, you know, some really.
Oh, I don't even want to go into it.
I'm thinking about it right now.
Yeah.
It's like upsetting to think about.
And like you click away, like, oh my God.
And then it, and then it like flashes back to you.
I have a couple clips like that.
If you're seeing hundreds of those a day that can't help but have a psychological toll on you.
And you're seeing ones that no one else is even seeing.
Like you're, you're not, you're not clicking on a YouTube link. You know, you know, at least when you're
watching, if you watch like a beheading video or something, which is, I think the example,
probably a lot of people can come to mind about like, you know, those get spread around and a
million people click on them. They're on 4chan or whatever. But when you're just watching like,
Hey, someone uploaded to YouTube, a video of them abusing an animal, abusing a child,
anything like that. Like, my God, that must really take a toll.
I mean, I think it does. And I think, you know, your point is really accurate in terms of, look,
we, for better or for worse, our society is set up where some people have shit jobs, right? I would say it's for worse, but we know that's a fact of life. There are also people who,
as a kind of a precondition of the work that they do, know they're going to encounter horrible
stuff. So, we might think of, you know, a homicide detective or something like that.
But in the cases that we might think of as analogs, you know, at least
the people are visible and at least they're understood in their social role. The issue for
so many of the commercial content moderators that I talk to is not only are they taking this on and
really taking it on the chin in so many ways on behalf of the rest of us. But the way that the ecosystem has been set up
is to really render their work invisible.
The way to be a good content moderator
is to leave no trace.
And this links us directly back
to the front end of our conversation
where we were talking about,
because the kind of better mode for the companies
or from their logic and perspective
is for people, if they think about it at all, to think a computer did it, not a human.
So, these folks have to operate leaving no trace, not acknowledging or being acknowledged
for the work that they do on behalf of others.
And really, you know, by the terms of the NDA, but also just by basic sociality, keeping it to themselves. And
human beings are not designed to do that. I mean, that's going to come out of the seams
somewhere. I talked to a woman who used to do this work for a little company that was called
MySpace, which is like really taking us back, which happened to be located in LA too. It was over on the West side. And she told me- Of course it was.
Yeah, right. Over by the Boring Company tunnel. And she told me, you know, she's gone into
bookkeeping. Okay. She's a bookkeeper. She doesn't do any work having to do really with computers
per se. And she said, look, my thing is this.
I don't really even want to work in an office setting because I don't know if the person who's in the cube next to me maybe did this work at some point.
And will later on have, you know, kind of crack or have it have a flashback or have some ill effect of the work that he or she did 10 years ago.
Yeah, this is from her experience.
Yes, from her experience of doing this work. Yes, yes.
And, you know, we talked quite a bit.
Her name is Roz.
She's fantastic.
And I write about her in the book.
You know, she had so many interesting insights.
One of the things she said to me too,
she said, look, after I quit that job,
for three years, I wouldn't,
when I met people, I would refuse to shake their hand.
And she kind of looked at me knowingly.
And I was like, I'm not totally sure what she means.
I said, you know, tell me more. And she said, well, look, I've seen videos of
what people do and people are nasty. So, you know, that's the way that the effects might manifest
themselves, just changing your orientation to the world. I mean, on the one hand, it's,
there's the threat of having damage done where you become so sensitive and you're so kind of harmed and unable
to deal with the content. But I often think about what's another potential outcome here? Well,
the potential outcome is being totally desensitized where you're no longer effective
because things sort of roll off your back. I don't want legions of people like that walking around.
And I think that's what she was getting at too.
Right.
Well, there's a deeper hypocrisy here
on the part of these tech companies
that I want to dive into,
but I have to take a quick break.
Sure.
So after this ad break, come right back
and you're going to hear me rant about something
that makes me very angry.
All right, we'll be right back.
I don't know anything.
All right, we're back with Sarah Roberts.
Okay, here's what really steams me,
is that you talked about how these tech companies want to present this image of, you know,
oh, it's being done by a computer, right?
And they don't want us to see all of this, you know, this perpetual underclass of workers who they actually have
moderating the content. I actually think it goes deeper than that, because what these companies
are currently telling us is actually they don't moderate at all, right? When people are making
the request of there is, hey, there's hate speech on your platform. Hey, there are false political ads on the platform.
There is propaganda on the platform. There is stuff that is harmful to people on the platform,
right? Or simply that, hey, the way you've organized your platform is causing bad behavior,
right? That like you're throwing a party. The example I always use is you're throwing a party,
you're buying the keg. It's your fault that shit gets broken on this platform, right? You created
Twitter. So the way people behave on Twitter, if it's bad, it's somewhat
your responsibility. We're all trying to tell this to these companies and these companies
are all built on this very understandable early 2000s internet ethos of we are dumb pipes.
We don't own anything. Hey, don't sue Google for, you know, delivering you the
pirated content, sue the site that's uploading it
and sue the person who uploaded it
and sue the person who's trying to get it.
Don't sue the pipes, right?
That's the principle they've been operating on for years.
We now realize that that is not sustainable, right?
It's not supportable.
These are media companies.
These are massive media companies
that are among the biggest companies in the world.
They're owned by,
they're run by the richest people in the world. This is where people are getting their news. This is where we're all communicating with each other. And again,
they're throwing the fucking party. So they bear some responsibility for it. And they're saying,
no, we don't, we don't moderate anything. No, we just, people just post stuff and that's all they
do. Except that they are moderating it. They're moderating it in this silent way that they are then hiding from us and pushing the burden onto this underclass. And if they weren't doing that, we wouldn't use the sites to begin with because they'd be full of spam. They'd be full of pornography. They'd be full of abusive content that people don't want to see. So they are sanitizing it. They are moderating it is what you're telling me, but they're refusing to own that and they're refusing to do the kind of moderation
that their actual users are asking for.
That's my rant.
What do you think of it?
I'm mad.
I mean, I would just cordially like to invite you
for happy hour.
And I think we have a lot to talk about.
This is one of the most awesome experiences.
I feel so validated and seen
as a researcher right now. It gets kind of dry and boring on the university campus, but this is
what's up. I mean, this is what's up. And I'm going to, I'll see that and
raise you. Okay. Wonderful. Yeah. Here's the logic. Like what you're saying is, you know,
what the companies are telling us doesn't match. What is the deal?
The deal is this. The firms came to all of us and said, we want to take your beautiful human
self-expression. We're going to call it something called content. That's weird, but okay. Everybody
go with that. And we want that. And we want you to be able to share that and circulate it and
connect with each other. P.S., we're going to go over here in the corner and make some deals with some advertisers. And then we're going to take all that stuff that
you think you're making as the clients of our platform. But see, actually, our clients are
the advertisers. Oh, yeah. So, you keep making all that stuff because that helps us connect
with our actual customers, the advertisers. Now, internet freedom, information
should be free, blah, blah, blah, cyberspace. Like this takes us back to when I first got on
the internet in the early 1990s, where that already was kind of an ethos that didn't seem
to really properly fit depending on who you were. Like, okay, well, information should be free, but
my friends know that I'm gay on this platform, and then I'm getting a lot of abuse from it. So, I would like that to not be free-flowing
information. You see what I'm saying? So, like, already that was like an ill fit. But the way
that they got us all to buy in was by telling us that each and every one of us would have the
ability to freely share information as we saw fit. See, that isn't a good selling point to
advertisers, right? In the same way that broadcast television doesn't tell its advertisers, we're
going to have a show called Turn on the Camera and Open Up the Studio Door in Downtown LA and
See What Happens. Because advertisers don't want to get matched up with that. So even though the firms were telling the whole world,
and in essence, to a certain extent,
it was true that they wanted people to upload whatever
because they needed a critical mass of that stuff early on,
they had to have a mechanism to deal with their clients,
their customers, their relationships, their advertisers.
And so this is why I think if I had to tell you a principal
insight from my book, beyond all the experiences of the workers-
And what's the name of your book?
Oh, yeah. It's called Behind the Screen, Content Moderation in the Shadows of Social Media. It's
out on Yale Press. Thank you.
Fantastic. And what is this insight?
So the insight that, or an important takeaway here is that the content moderation ecosystem
that you and I are talking about, that the whole world is talking about now exists in the way it does because it is a brand management mechanism for platforms.
Are there knock-on effects of keeping you and me from seeing horrific shit?
Yeah, that's definitely an upside of them doing that brand management practice.
But as we know, it kind of seems like in some cases, some of the most disturbing, provocative,
maybe divisive content is actually the stuff that goes viral.
Yep.
So here's the thing, right?
That straddling of that line and managing that line of where it goes from viral and
provocative to a turnoff and a problem for the advertisers, that's kind of what this
whole apparatus was set up to deal with in the first place.
While still giving the illusion to users that there was no intervention, because once the
users knew there was intervention, they were going to want accountability for those decisions. And platforms operating at the scale that they are or want to
be at, they aren't interested in spending the entirety of their time and effort and money
on a losing, you know, a non-revenue generating part of the apparatus, which is the cleanup
and sort of the explaining of that
process. Yeah. They just want it to happen and they don't want people to complain about it.
Yeah. And so, there's kind of two better outcomes there. One is let's get people to not even think
about it. And two, let's have people, if they do think about it, think it's just computers.
And we know people are always, yeah, they're already socialized to for some reason think
that if a computer makes a decision, that must be the right one.
I mean, has anyone seen WarGames?
You know what I mean?
If a computer goes rogue, it's going to start a nuclear war.
Well, that's, but that's just how, you know, it's the example of people, you know, driving, following their GPS directions into a lake, which is a real example from James Bridle's wonderful book, New Dark Age.
But like, you know, when we've all been, you know, in a car with someone who's like,
I can't believe the GPS is telling me to go this way.
Yeah, you don't have to follow what the GPS says.
But yeah, I mean, what you're really alluding to is what I realized,
you know, YouTube being the biggest example,
when, you know, it was sold to us as: hey, everyone gets to upload, this is a new medium, everyone gets to upload and share, and that's what it's all about, free expression, etc. Now I realize, after, you know, YouTube being out for, I think it's 13 years, something like that: here's what it is. It's NBC, but they don't pay for the content. That's it, right? As an entertainer, that's how I see it now. Unlike NBC. NBC is taking eyeballs from people watching and
selling them to advertisers, right? The difference is NBC pays everybody. You see on screen, they pay
all the writers and stuff like that, right? YouTube gets it all for free. They run ads on it. They don't pay anybody. Oh, except if you do really, really, really well, then you can
qualify for a program where maybe you get some of a small amount of money, very small amount,
depending on the exact terms that they sort of don't really share with you. They're not really
transparent about them. And don't forget that that's just
like 0.1% of the people on YouTube. Most people never, never get that far. And so that's the
business that they're in and all of their decision-making is being driven by that business
model. Right. So why, yeah, why would they do, why would they make any decision other than, oh, will this make advertisers happier or sadder? That's it for them.
I mean, you know, as just a thought experiment, people could Google the FCC broadcast television rules, right? For example, for Americans, rules that govern what you're
allowed to do and not do if you're transmitting TV over the airwaves. There's a whole host of
things you cannot do. In fact, I think George Carlin made a pretty serious career out of like
the dirty words you couldn't say. I mean, let's take it a step further back in time, which is that the way these companies sort of positioned themselves, again, socially and legally and culturally was by reiterating, promising, swearing to God that they were tech companies.
And yet all the evidence is to the contrary.
The evidence is that they are media companies, just like you said.
I had the opportunity to be at the Sundance Film Festival in January of 2018 with a film on this topic of content moderation called The Cleaners.
And I was wandering around the village.
There are all these storefronts there.
Companies can rent them out for the duration of the festival.
It's a big to-do.
And I'm walking around this village, and who's rented out these like primo spots in this little Utah town? Dropbox, Amazon, you know, Amazon, the
movie company because they actually have studios. Apple, who's just gotten into the game and was in it in other ways. YouTube and Google. So, all of the companies that were being
represented in this prime real estate, if you asked a layperson or the average person on the
street, would say it was a tech company. And that's who was populating Sundance. So, there
are these like really strategic reasons to call themselves tech companies instead of media companies,
to issue all these rules, to position yourself in a certain way to avoid the unions, you know,
the actors' unions, all of these reasons. And they've been very adept. And it goes back to also
the, you know, what you were saying earlier, the 1996 Communications Decency Act that gave these
tech companies of the era this special designation of being immune from responsibility for the content that passed over their pipes, as you've been saying. They're empty tubes, as one senator once said. That's 1996.
You know, in 1996, I used to call up a business in the city that I lived in so that my modem in my house would connect to his bank of modems, and they'd make a lot of funny noises at each
other, and I'd move on to the internet.
And all that dude did was set up a rack of modems.
So modem to modem.
And it was called an Internet Service Provider. That's who the internet intermediaries were thought of as in 1996.
Or they were universities.
Correct.
Or the government in many cases.
Correct. And what's happened is this incredible privatization, this really, you know, fencing off of these spaces, where the firms have developed as they have to their own benefit solely. And yet they're still kind of trading on all of these other metaphors that we had been led to use to understand what these new entities were, when in so many ways
they have so many features of many industries and entities we already know. I mean, on the one hand, I was surprised; on the other hand, I just kind of shook my head, like, of course, when I found this out. But you know, YouTube has, for its creators, quotey fingers to listeners, for its creators of a certain caliber,
you get invited, in all of these metro areas around the world, to a special studio.
Oh, I've been there. Yeah, right?
Yeah, yeah, the YouTube Creator Studio.
Yeah, and you get access to this high-level production.
Maybe someone will help you storyboard it out.
That sounds like organic user-generated content to me, right?
Yeah, yeah.
No, the first time I went there, 'cause CollegeHumor started using that space when I
worked there. And I was like, that's actually the first time I clocked it. 'Cause I was like,
oh, hold on a second. This is a studio. They got green screen. They got, they got editing bays.
They have all this stuff. And oh, we don't have to, we don't have to pay to use it because we're
going to upload the thing on YouTube. But wait, here's the other catch. We're not being paid. Nobody's being paid where we get to
use it for free. But we are putting in labor. We're putting in our talent. We're buying our own props. We're hauling them there. You know, all we're getting is, basically, a college film studio, you know, with like slightly better equipment. But no one's getting paid. Why did YouTube make that?
Oh, because they're trying to increase the level of the content on the platform so they
can sell it to more advertisers.
They're a media company, right?
And like YouTube is, you know, you mentioned, say, Apple, you know, making TV shows.
That is, you know, for my purposes, that's Apple saying, oh, we're going to become a
TV network.
That's the old model.
I'm much more fine with that.
You know, they hire union crews, right, right? But YouTube is competing for the
same eyeballs. And it is, you know, YouTube is competing with Netflix. YouTube is competing
with NBC. And they're winning because YouTube is free for the user. Or at least it's not free
for the user because you have to have an internet connection. So it's like having cable TV.
But, you know, at the end of the day, that's who they're competing with.
And they're not they're not paying anybody.
But then also, like, so in a sense, in a really real sense, we, you know, we used to have NBC, ABC, CBS, Fox.
Right.
Those companies we knew if something happened on the media that they were putting out
that was unconscionable, right?
Everybody could get mad at them
and we could make them stop, right?
It's your fault that you put that over the airwaves, right?
YouTube is now one of those.
We got Netflix, Amazon, YouTube is one of them, right?
YouTube is that big of a media company as NBC,
yet they claim no responsibility for what is on the service.
So when people were saying,
I actually talked about this on Bill Maher of all places,
because they kicked off Alex Jones off of YouTube, right?
And Bill Maher was doing the thing of, you know,
the, oh, what about free speech?
Da, da, da, da, da, da.
And yeah, oh, very much.
I won't mention what motion you just made.
But now look, and honestly,
he was saying that for everyone on the panel
to jump down his throat.
And so I appreciate having the opportunity to do so.
But what I said was like,
look, they weren't just hosting it.
They were selling ads on it.
They were making money off of it.
People were watching it
and YouTube was profiting off of it.
Therefore, they have a responsibility for it existing on their platform. And so it's like fine for them
to decide we shouldn't have that. In fact, I would encourage them to have more of that, to like sell,
to not sell Under Armour ads on white supremacy YouTube channels, to take another specific
example. But they, very occasionally, as in the case of Alex Jones,
there can be enough pressure for them to actually take those actions.
But for the most part,
they're like,
no,
no,
no laissez faire.
We don't do any of this.
Even though once again,
just do your research.
They fucking are doing it already.
Of course.
There are already,
how many people are watching YouTube videos to see,
to make sure that people aren't uploading child pornography, animal abuse?
Like, you don't see porn, child pornography, animal abuse.
You don't see it on YouTube, right?
Like, somehow YouTube is free of that stuff.
Yeah.
So they are moderating it.
They are discriminating about content.
Yeah.
They are making a choice, and they could make a different choice.
That's right.
And they shouldn't claim that they are choiceless and that they do nothing.
That's right. And I think, you know, for any young entrepreneurial type listening,
there is a huge market for a different kind of social media that puts its values out front,
that doesn't try to BS the whole world about its neutrality
while making decisions that absolutely not only favor its bottom line, but I think also
are trapped in sort of a, you know, twisted Randian, libertarian pipe dream about the nature of free expression that is just demonstrably false in their own ecosystems. Like, I really do think the C-suite people at these firms are some of the biggest drinkers of their own Kool-Aid around this, right? Like, Jack Dorsey's like the weirdest man on the planet, and he just tweeted like two days ago, and I retweeted it. It said, um, hey fam, or whatever, you know, hey everybody, does anybody know who, like, the best people to follow about Nigeria are?
And I like retweeted it like, oh, like, is that really,
like what's going on in Nigeria politically?
Or I mean, I don't think he's just going on vacation.
You know, like, is that really the market research
you're doing?
Like, what the hell?
Yeah, but he lives in that space of 1996 Burning Man.
Right.
Blogger.com.
Well, that was a couple years later.
But, you know, just like, hey, we're all just posting stuff on the internet and anybody can make their post.
And I made a way for people to post.
And that made a lot of sense to me in 2000.
Right.
And like, you know, the possibly about to be criminally indicted president of the United States who seems to issue policy edicts via Twitter.
Yeah, that's totally normal.
Yeah. Right?
I mean, like, you know, that's just, again, you can prove that these things, these like prescriptive kinds of declarations about what the platforms are versus how they're
used. It's just like, there's just a total disconnect. Of course, there's a certain,
you know, there's a certain blissful ignorance in that, right? And not having to take responsibility
for the thing that you created is, I guess, like a cool spot to sit in as money gets printed. I
mean, let's just be honest.
Well, let's talk a little bit more about these content moderators or other people,
you know, more of the invisible underclass of the internet, right? Like, we talked about Facebook.
I imagine something similar is going on on YouTube where people are watching hours and hours of content. Where else? Like, what are other places where we see this dynamic that the average person doesn't expect?
Yeah, I would bring up the whole digital piecework ecosystem of platforms like Amazon's Mechanical Turk.
Right.
Where you can really, I mean, this is sort of like the most cynically atomized kind of work you can think of where people offer services and like anonymous,
unnamed people solicit work, but it might just be literally for, like, you know, analyze one image
or do one task. And you get paid like one penny. Pennies. Yeah. Right. In pennies. I mean, it's
just really deeply cynical.
And then there's no accountability for anyone.
Of course, it's just like the right to work states.
Well, the worker can quit any time too.
I mean, okay.
It almost seems like Mechanical Turk.
And it's the kind of thing you could almost do for fun,
where it's like, oh, hey, here's a little, oh, is that a...
A lot of the work is sort of like when you fill out a CAPTCHA
and it's like, is that a traffic light?
And you're like, yeah, it's a traffic light.
And then you move on, right?
Like it's a lot of stuff like that, or it's surveys or things like that.
So people who, the kind of person who, when, you know, they get a phone call,
it's like, we're conducting a survey.
And they say, okay, I'll answer the survey.
Those are the folks doing Mechanical Turk.
It seems to me, I understand anecdotally,
like a lot of stay-at-home moms who are like, you know, have an hour or two to kill or something
like that. I mean, I think that is a very positive spin on it. Well, I was going to spin it negatively
after that, but I'm sure there's an even darker version. I mean, I think, you know, you have to
think about it. It's a platform that's global in its reach. And of course, there are people who like
to, you know, deliver pizzas in their spare time. Right? And there are people who like to,
you know, XYZ, do medical transcription in your spare time from your home. But often,
there are other mitigating circumstances for people to take up that work. Of course,
there's a class of workers who want that freedom. They don't want to go to an office. They don't want to clock in. For sure,
that's real. But I mean, I guess I would argue that actually these phenomena have a lot more to do
with a fundamental destabilizing of the labor class as a political force among other things than any kind of freedom
That's true. And of course, let me conjure a different person who might be doing this work: someone who's disabled, unable to really leave their home much, needs some kind of income, and, you know, doesn't live in New York or San Francisco, so they're going to have trouble finding, like, actually good work-from-home tech work. Basically, this is at-home Uber driving for even less money.
That's right.
But what I was going to say is, even if it is, hey, I'm a stay-at-home mom and I got a free hour a day while I'm watching the soaps, you know, to click on some stuff.
A company that's building its entire business model on, like, monetizing millions of those people, and paying them far less than a minimum wage, is still kind of a messed-up enterprise.
Right. Because it's like, you're making use of millions of person-hours, right, that you're not paying a fair wage for. And that's bizarre.
Not only are the employees or are the workers,
because they're typically not considered employees,
not getting a fair wage,
but there's another dimension here too,
which is like the social cost and the short shrift that municipalities, states, the federal government
gets in these deals where taxes
aren't being paid on behalf of employees. Social Security is not being paid into on behalf of the
work that they do. It's totally precarious work. You know, even this is like infecting other sectors
that were at one time more stable.
So, you know, retail sales, for example,
where you maybe could know
that you were going to have a certain schedule or set hours.
Now that's like down to algorithmic determination
of the labor vis-a-vis sales
and people don't know until the day before
that they're supposed to go in.
So, you know, this is like an infection across our social fabric. And I guess, you know,
it depends on where you sit. Like one person's efficiency is another person's nightmare, right?
And it's a thing I'm becoming more cognizant of as I use the internet. We are used to, I imagine a lot of
folks listening to this, occasionally rolling your knowledge of the horrible way that something is
made into your purchase of it, right? So you go to Forever 21 and you're like imagining, okay,
I know there's someone overseas, they're being paid very little. And then I know that this is
being shipped on a container ship, which is fuel inefficient and that it's going to be thrown away
and that there's part of this horrible fashion economy. Or if you didn't know that, now you know it, and maybe you'll think
about it next time. Right. But hey, sometimes you got to buy it anyway, because you
really need a thing right now, you know? And so we're aware of that. But like this one specifically
that behind the internet, there are all of these invisible laborers is a lot harder to see. We're not as used to thinking
about it. Like one that just happened to me recently was for my recent live show I've been
touring called Mind Parasites. It's an hour long one person show. And I had video of it. And I was
like, I want to take a look at the script, not the script I've written for myself. I want to look at
what I did on stage and I want to edit that. So I said, Oh, I'll take the video. I'll send it to a transcription
service. Never done that before. There's one called Rev. I just Googled it. Yeah. Okay, great.
So, you know, so I've disappointed you already.
It's okay. We all are just trying to live in the world.
Yeah.
That's all we're trying to do.
Well, I had never used this. I'd never used one of these services before. So there's an
online transcription service. Oh, this one
seems like a pretty good, seems like it'll do the thing. I uploaded it. It was a dollar a minute.
So I paid $50, $60 for my one-hour thing. I got it within a day. It was a very good transcription.
I wasn't like putting it on the web. It was for my own purposes. So punctuation errors were fine.
And then I just, yeah. And then I literally just read an article two days ago about how they just, without warning, dropped the rates they were paying their transcribers from, you probably know better than me, but something like 50 cents a minute to 30 cents a minute.
Not a minute of time spent, a minute of audio transcribed, which takes about five minutes.
That's right.
So they cut their workers' wages by like a third without really telling them, without even sending an email, just sort of like doing it.
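[As a back-of-the-envelope aside: the numbers mentioned here can be sanity-checked. This is only a sketch using the approximate figures cited in the conversation, a rate of roughly 50 cents, later 30 cents, per audio minute, and about five minutes of labor per minute of audio; it is not based on Rev's actual published terms.]

```python
# Convert a per-audio-minute transcription rate into an effective hourly wage,
# using the approximate figures cited in the conversation (assumptions, not
# Rev's published terms).

def effective_hourly_wage(rate_per_audio_minute, labor_minutes_per_audio_minute=5):
    """Effective hourly wage when each audio minute takes several minutes of labor."""
    audio_minutes_per_hour = 60 / labor_minutes_per_audio_minute
    return rate_per_audio_minute * audio_minutes_per_hour

before = effective_hourly_wage(0.50)  # roughly $6.00/hour
after = effective_hourly_wage(0.30)   # roughly $3.60/hour
print(round(before, 2), round(after, 2))
```

On those assumed numbers, the cut takes an already sub-minimum-wage rate down to a few dollars an hour, while the customer-facing price stays at a dollar per audio minute.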
And, and so that made me realize, oh yeah, when I did that, there was a person who had
to do that.
They weren't someone who was being treated in a professional manner.
You know, that's not to say every service doesn't.
I use a massage service called Soothe.
Sends a masseuse to my house.
All those people are trained masseuses.
I use it like once or twice a year.
Okay, you made a face like, oh, I'm fancy.
No, I was like, that's cool.
Yeah, yeah.
It's very nice.
You and Elon.
Yeah, yeah, exactly.
No, it's like, you know, it's a luxury I allow myself, right? It's fine.
And it's better than going to a massage parlor in person. I don't want to drive to a massage parlor.
So instead, you just hit the button on the app, and it sends a trained masseuse. But guess what? You pay an actual masseuse price, and they're a trained masseuse. They're not some fucking gig worker who's like, hopefully they can just press on people for money. Right. So I'm like, this person is being treated the polar opposite of how Rev treats people, right? And I did not realize that when I was using it.
And there was nothing to indicate to me, while I was using it, there was no warning.
There was nobody really talking about it at the time, at the moment that I use it, or at least I hadn't seen any articles or any tweets about it.
And so I was like accidentally using this, uh, like massively exploited underclass of
labor without realizing it.
Right. And then you were like horrified, I'm sure, right?
Just like I am every time I go on YouTube or on Facebook with these content moderators.
I mean, so to give an example for my research, I went to the Philippines in 2015 and spent some
time in Manila talking to people who are working in call center environments there.
And one of the things that they mentioned to me was that, you know, some of them had been working there for a few months.
Some had been as long as two years.
And what kept happening to them was that whereas when they first started, they had like maybe like 32 seconds to make a decision about a piece of content.
Suddenly, their manager would come and say, you only have now 13 to 15 seconds to make a decision.
And if we don't meet that new metric, this contract is going to up and go to India. It's always India that's invoked. This contract is going to get pulled and go to India. Because
if you think about it, just like kind of the math you did on the Rev example, asking workers to only take 15 seconds where they once had 30 is asking
them to double productivity. Or another way to look at it is asking them to have a wage that is
halved. So, you know, with the story of Rev, or the story of the folks I just described in the
call center, there is a common
thread. And the common thread is that the power differential between the labor force and management
is beyond belief. There is no ability for workers in these current constitutions of work relationships
to really effectively push back.
Like how did, how did factories not do this shit to people all the time?
Because people like my grandpa who worked on a factory line for 45 years,
built a union that said, we have a contract.
I just had Steven Greenhouse in here a couple of weeks ago telling us about
the history of the labor movement and how, you know, a couple hundred women died in a factory fire.
Sure did.
And they got together and started protesting, and over the decades and decades of the labor movement they fought for protections that stopped that from happening, and pensions and health insurance and all those sorts of things.
And now we are back to square one again, seemingly.
Well, let me just neatly tie it up in a bow because we don't get to a place in 2019
where it's like the entire nation has at best forgotten about its labor history,
but more commonly holds organized labor in total disdain by accident. There's been a 40-year political campaign to demonize labor
and organized labor as a political entity, starting with Ronald Reagan crushing the
air traffic controllers when he took office in 1981. Guess who's running the companies
that we're talking about and what their politics are towards something like labor unions.
Yeah.
They are anti-labor because, you know, everybody ought to be free. We're not going to organize up in these weird collectives and like make everybody sign a contract. That's not free, right? Yeah.
Freedom comes at a real cost for some people on the wrong side of that equation. And I think the result is that we're
seeing suppressed wages, we're seeing workers in precarity, we're seeing an overnight thing where
somebody's income can be halved with no recourse other than this smart young woman who went on
Twitter and was like, hey, I want to tell you about some bullshit going down at Rev.
Yeah.
And she went rogue and posted it and everybody picked it up.
And then the media started picking it up.
So in terms of like resistance, I want to call out our colleagues in journalism.
I want to call out people in labor organizing, like the Tech Workers Coalition, who are trying
to bring together an industry.
Call out in a good way.
Yeah.
Like raise them up as like places, because this has been a fairly grim conversation,
let's be honest, right?
A fun grim, which is where I live.
Yeah, me too.
Like, you know, kind of depressing laughs.
But I was going to ask you
where your places of optimism were.
And so I want to hear about this.
Well, I think, I mean, you know,
the reason I do the stuff that I do, which is,
you know, has weirdly put me in a world that I think about horrible stuff all the time, too,
even though I don't have to see it, is that, you know, the whole point of that is to make things
better, right? And we can't make things better if people live in ignorance about the way things really are.
I don't blame any listener of your program
for saying to herself, I never thought about this
or I didn't know about this
and I use Facebook all the time
or I use Twitter all the time.
That's not on us per se.
We were led to never think about it.
So the work that I do gives me hope
because I know that I'm making
the conversation more complex, right? And it's only through the complexity of the conversation
with real facts and information that we can actually, as a collective, as a society,
decide upon what we're willing to accept and what we're not willing to accept. But if we're living in like blissful ignorance
or buying a BS line that we're being served,
why would we think to resist at all?
So I have hope around that.
I have, you know, I am a, just again, full disclosure,
I'm a Gen Xer, I'm of a certain age, you know, if you will.
I feel like a lot of the people coming up, the young people are
not taking it anymore. They're like not having it. Unfortunately, I was born-
Because they're the ones doing the jobs.
Yeah, they're doing the jobs or this is what awaits them.
Yeah.
Crushing debt plus shit work equals no.
Yeah.
Right? That's a simple equation for everyone.
Right.
So that's where that's going to come from.
Unfortunately, I learned recently that the year I was born in was like one of the two lowest
birth years in the entire 20th century. So, when I feel lonely or like misunderstood,
that explains it. Yeah. Gen X is like forgotten.
But, you know, there's a lot of like organic and very real outrage that is being funneled into a lot of resistance and also not just resistance, but new ways of seeing.
So, I think a lot of the young people out there are not just going to take on what's being spoon fed to them.
They're just not going to do it.
And that's, I mean,
shit, I'm praying that that's the case, you know?
Well, amen. And it is, it's one of those issues where once people are aware of this problem and of these people whose work we are all unknowingly exploiting, right?
That, that is going to change the conversation. And, you know, I mean,
the, you know, the anti-sweatshop movement, right, in the 90s, half worked, right? All
those companies, you know, changed their practices just enough to have plausible
deniability. But let's say things got a bit better, right?
They did. And I think, you know, social movements work, but they are sustained and they're long term and the gains take time.
But I do think of the anti-sweatshop movement as one really appropriate kind of example.
Because for no other reason, I think like the textile industry is really problematic in many of the same ways.
You know, a factory collapses in Bangladesh and this industry is fractured so delightfully well that H&M or whomever can really reasonably, again, quote unquote, reasonably say, gosh, we didn't know our stuff was made there.
Plausible deniability. Yeah.
And we have to say that's actually not, that's not going to work.
Well, I thank you so much for coming on the show to tell us about it and for doing this work and for spreading the word that will hopefully result in some change. I'm really grateful to, you know,
be talking to your audience and it's awesome. Awesome. Thanks so much for being here.
Thank you.
Really appreciate it.
Well, thank you once again to Sarah T. Roberts for coming on the show. Her book, once again, is Behind the Screen, Content Moderation in the Shadows of Social Media. I hope you check it out,
and I hope you enjoyed that interview as much as I did. That is it for us this week on Factually.
I want to thank our producer, Dana Wickens, our engineer, Ryan Connor, our researcher,
Sam Roudman,
and Andrew WK for our theme song,
I Don't Know Anything.
You can follow me on Twitter at Adam Conover.
You can sign up for my
super secret mailing list
with tour dates and fun facts
at adamconover.net.
And see you next week
on Factually.
Thanks so much for listening.
That was a HateGum Podcast.