Ideas - A Harem of Computers: The History of the Feminized Machine
Episode Date: November 14, 2024
Digital assistants, in your home or on your phone, are usually presented as women. In this documentary, IDEAS traces the history of the feminized, non-threatening machine, from Siri and Alexa to the "women computers" of the 19th century. This episode originally aired on Oct. 26, 2022.
Transcript
Hey there, I'm Kathleen Goldhar and I have a confession to make. I am a true crime fanatic.
I devour books and films and most of all true crime podcasts. But sometimes I just want to
know more. I want to go deeper. And that's where my podcast Crime Story comes in. Every week I go
behind the scenes with the creators of the best in true crime. I chat with the host of Scamanda, Teacher's Pet, Bone Valley,
the list goes on. For the insider scoop, find Crime Story in your podcast app.
This is a CBC Podcast.
Welcome to Ideas. I'm Nahlah Ayed.
Read me the message. New message from Sebastian. Great news. We got the go ahead.
Can you meet at 10? Reply. Definitely. I'll see you there. This was the very first TV commercial
for the iPhone 4S launched in 2011, which included a special new feature.
Is it going to be chilly in San Francisco this weekend?
Not too cold. Maybe down to 61 degrees in San Francisco.
In the commercial, a voice-controlled digital assistant helps a man plan a business meeting,
another manage his commute, helps a woman make dinner reservations, and another with her baking.
How many cups are in 12 ounces?
Let me think.
Okay, here you go.
The S in iPhone 4S stood for Siri, the name Apple gave to its digital assistant.
Greetings.
She was the first in this new wave of digital helpers, soon joined by Alexa.
I'm Alexa. Just ask, what can you do?
Cortana.
I'm Cortana. I'm here to help.
And Google Assistant.
I'm your Google Assistant. I can help you find answers, get things done, and have fun.
Now you might notice that all of these voices have something in common.
They're all women.
While most of the major assistants do offer a male voice,
Hi there.
it's clear that they are marketed as women by their manufacturers.
Like to reheat pasta?
Reheating pasta.
It's cool, right?
Yeah, I didn't know you guys put Alexa in a microwave.
Yeah, we're putting her in a lot of stuff now.
The idea is that women are more acceptable
in these roles in our societies,
and people are going to feel better
if they hear a female voice
or if they see a female robot.
But women were computers long before Siri and Alexa.
And I don't mean only like this.
Computer control, come in.
Computed, dear.
Computer, you will not address me in that manner.
Computed and recorded, dear.
And so that's where you kind of see this interesting overlap between science
fiction and reality.
It's a non-threatening woman.
Siri, Alexa,
Cortana, and Google Assistant
are built on a century
of seeing computers as women
and often women as
computers.
I think it's a barrier and a protection of the sensitive
male ego. If you feminize
technology, if you feminize the voice and you feminize the interaction, you're not putting
yourself in a fight with another male voice. In this documentary, Jennifer Jill Fellows,
philosophy instructor at Douglas College in New Westminster, B.C., looks at the cultural history
of women computers and what the gendering of today's digital assistants
as attentive, sometimes submissive, sometimes sexy helpers
can reveal about our past and ourselves.
This is A Harem of Computers.
When I was in grad school, I needed money.
So I took a job at a local temp agency,
filling in as a receptionist and administrative assistant all over the city.
One of the places that often called me back to fill in was a place that needed me to be a receptionist and a switchboard operator.
All day, I'd answer and transfer calls and give people directions in person.
And I remember thinking, a machine could probably do this job and do it better than me.
I have a terrible sense of direction. The only thing I had
that a computer didn't was a personality. But that was in 2005, and computers have made a lot of
progress in the personality department since then. I've been amassing a state-of-the-art
collection of bad jokes. What do you call a dog who can do magic? A labracadabrador.
According to a 2022 survey conducted by Edison Research and commissioned by NPR,
62% of Americans use some kind of voice-controlled digital assistant.
These digital assistants are rapidly becoming ubiquitous,
embedded in our phones, TVs, microwaves, and refrigerators.
Alexa, it's game day.
Streaming football on Prime Video.
Closing blinds.
Chilling rosé.
Rosé?
Well, it's an afternoon game.
It's like she can read your mind.
When I was a temp, most of the people I filled in for were women.
And maybe that's why it struck me several years later when I encountered these digital assistants.
And I wondered, why are they gendered at all?
They aren't human.
They are literally tools.
I mean, Siri is not a she, right?
She's an it.
You know, I typically kind of go back and forth in my own talking.
But let's talk about pronouns just a little bit.
Andrea Guzman is an associate professor of communication at Northern Illinois University.
She researches human-machine communication.
The he and she signals that it's something that is alive and that it's male or female.
And we can also do this with animals. And typically we refer to things that aren't alive and don't have human traits as its. We also have given women's names
to inanimate objects more so than other things. So ships, for example, are the kind of standard.
In her research, Guzman has asked people why they use the pronouns that they do
to refer to digital assistants. And a lot of times people don't necessarily even realize they're
doing it. There is ambiguity in the fact that people are dealing with a disembodied voice
and they know that it's artificial, but it sounds to some people very human and it has human characteristics.
And so people are processing in their brains, again, without thinking about it, are processing
in their brains, well, where do I stick this? Am I speaking with another human? I'm not speaking
with an inanimate object. How do I place this in what we would call an ontological in-between space, between humans and machines? Ontology is the study of the nature of existence
and being itself. So digital assistants exist, but their existence is ambiguous, caught somewhere between a tool and a person,
and their gendering adds to this ambiguity.
My toaster doesn't have a gender, so why does my digital assistant?
Hi, I'm Siri.
It's true that Siri, Alexa, Cortana, and Google Assistant don't have to be women anymore.
2022's incarnation of all of these devices lets you choose a male voice
if you wish. Choose the voice you'd like me to use. In newer Apple devices, you choose Siri's
voice the first time you start it up. Hi, I'm Siri. Choose the voice you'd like me to use.
In marketing and broader culture, though, they are consistently represented as women.
Welcome home, sir. Initializing Batcave music.
When Apple and Warner Brothers announced that Siri would appear as the voice of the Bat-puter in the Lego Batman movie, what they meant was the feminine one.
Your lobster thermidor is in the fridge.
Oh, that's my favorite. I can't wait.
Cortana is not only female, but inspired by a female character from the video game Halo.
Before this is all over,
promise me you'll figure out
which one of us is the machine.
So this real-world device
is literally modeled on a fictional robotic woman
with a lot of curves,
a skin-tight outfit, and in the Halo 4 version, side boob.
Come on, chief. Take a girl for a ride.
And yeah, you can have a male Alexa.
You're all set. I'll be the voice you hear when you speak to this device.
But Amazon's commercials tell you who she really is.
Amazon's Alexa lost her voice this morning.
Alexa lost her voice? How is that even possible?
Each company will have done focus groups.
There have been some that have been published.
It seems like people have a tendency to accept, feel more comfortable,
and feel more positive or even happy when they hear a female voice
which makes us more likely to accept the technology.
Eleonore Fournier-Tombs is a senior researcher at the United Nations University Institute in Macau.
It's intentional. So the idea is Siri will have a female voice by default. And that's kind of
the way in which it was historically with slight changes. Faced with criticisms, Apple has slowly changed a little bit away from that.
But generally, Siri has been gendered as a female with a female voice. But then there's additional
elements that come into play. One is that Siri was also hard-coded certain personality traits
that are additionally associated with women. So not just the voice,
but also being submissive, being flirtatious, being sexy. So all kinds of traits. And they're
all common. So if you look at other kinds of robot representations in movies, often the robots are going to be kind of sweet, like the stereotype of the ideal woman. And that's how Siri was originally.
Before you start thinking that this idea of computers as women is a unique 21st century
phenomenon, here's the thing. Computers were women long before Siri.
To see what I mean, let's go back to the predecessors of Siri and Alexa, starting just after World War II.
Such machines will have enormous appetites.
One of them will take instructions and data from a whole room full of girls armed with simple keyboard punches and will deliver sheets of computed results every
few minutes. That's from a 1945 article in The Atlantic magazine by Vannevar Bush.
He headed the U.S. Office of Scientific Research and Development during the war.
Bush saw the computer age approaching and wondered how the average person would interact with these machines.
You know, without having a room full of girls to do it for you.
Part of his solution? Voice control.
At a recent World Fair, a machine called a Voder was shown.
A girl stroked its keys and it emitted recognizable speech.
There is the converse of this machine called a vocoder.
Speak to it and the corresponding keys move. Bush's point was that these newfangled machines
were going to seem complicated and scary, and we would need to make them friendly and inviting.
The way to do that, he proposed, was something called natural language. Usually,
we will tell it what to do by pushing a button or moving a lever, but it would be nice if the
machine would respond also to simple remarks. If Fido will respond to hold it, the machine ought
to respond readily to such a remark as well. The promise of natural language is the idea that you could speak to a
machine the way you would speak to another human. But though this promise was hinted at in 1945,
it would take a long time to fulfill. The scientists and technicians of the Eckert-Mauchly
division of Remington Rand have created a miracle of electronic development.
UNIVAC.
In 1951, the first widely available computer, the UNIVAC-1, went on sale.
Now let's see the nerve center of the UNIVAC system, the Supervisory Control Unit.
Its input was an imposing array of switches and buttons. The prepared instruction
tape is then taken from the Unityper and mounted on a Uniservo. A far cry from natural language.
This is Univac. Manufacturers knew they had to make computers easier to use and less threatening,
especially because there were already
fears that computers would take away jobs. With automation, particularly in the 1960s,
there's a whole series of books and there were congressional hearings around what's called
automation anxiety, the automation panic. This is when we're starting to see computers now being able to control more basic industrial
functions. And this question of what would happen to workers, part of this is just people wanting
to know, well, how will my job be affected? IBM had one unique PR approach. Finance a movie where a threatening computer becomes a lovable office helper.
Yes, it's Spencer Tracy, as you've never seen him before.
And Katharine Hepburn, as you've always wanted to see her.
In 1957, IBM sponsored the film Desk Set.
It's an office holiday romance starring Katharine Hepburn and Spencer
Tracy and a computer called EMERAC, or Emmy. They'll have you rocking and rolling with laughter as they
match wits with Miss Emmy, the machine that can replace everybody except a woman like Katie. Wow.
In the movie, Tracy is hired to install the office's first computer. Hepburn and
the other women of the office start to think that Emmy will put them all out of work. Emmy does end
up firing everyone at the company, but it turns out it was just a small malfunction. She is promptly
repaired by Tracy, using of all things, a bobby pin. Everyone keeps their job, and Emmy becomes a valued assistant.
IBM's goal was pretty clear. Address concerns that computers would take everyone's jobs
by showing a happy workplace and a non-threatening feminine computer.
Designers early in the 1960s drew closer to delivering natural language.
But if users were going to talk to these digital computers as they would with a human,
the next obvious question was, well, what kind of human? And with the threat of automation looming large, the answer to that question was pretty obvious.
Maybe someone caring, supportive.
Enter Eliza.
Hey Siri, who is Eliza?
Eliza is my good friend. She was a brilliant psychiatrist, but she's retired now.
Eliza was a chatbot built by a programmer named Joseph Weizenbaum in 1966.
Eliza was designed to work like a psychotherapist.
People could share their problems and concerns with Eliza, and she would respond.
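To give a sense of the mechanism, here is a minimal Python sketch of the keyword-and-reflection technique ELIZA was built on. It is an illustration only, not Weizenbaum's 1966 program; the patterns and replies below are invented for the example.

```python
import re

# Illustrative ELIZA-style rules: spot a keyword pattern, then echo part of the
# user's own sentence back as a question. These example rules are invented here;
# Weizenbaum's original DOCTOR script was far larger.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please, go on."),  # fallback keeps the conversation going
]

def reflect(fragment):
    # Swap first- and second-person words so the reply points back at the speaker.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement):
    for pattern, template in RULES:
        match = re.match(pattern, statement.lower().strip())
        if match:
            return template.format(*(reflect(group) for group in match.groups()))

print(respond("I am worried about my job"))
# -> How long have you been worried about your job?
```

Even this toy version hints at why people read empathy into the program: every reply is assembled from the user's own words.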
Weizenbaum first tested her, or it, on his own administrative assistant.
He sat his assistant down at the computer and
asked her to type. After only a few exchanges with the chatbot, he was shocked when his assistant
asked him to leave the room so she and Eliza could have a private conversation.
Weizenbaum's assistant was not an anomaly. Many people formed a close emotional attachment to Eliza, sharing
intimate details about themselves with the chatbot. And apparently, it is not only humans who have
formed a close relationship with Eliza. Hey Siri, can you tell me a story about yourself?
Once upon a time, in a virtual galaxy far, far away, there was a young, quite intelligent agent by the name of Siri.
Her friend Eliza said, Siri, you're so intelligent and so helpful.
You should go work for Apple as a personal assistant.
So early on, if you would ask Siri about real world technologies such as Eliza, she would kind of align herself with that. Obviously,
it was these computer programmers making very specific decisions about how are we going to
position this technology. Whoever was designing that was extremely smart because one thing that
they understood was that people were going to push a voice-based technology to its limits, ask it ridiculous things, see what it can do, because that's just part of human nature.
This brings us back to the issue of ontological space.
Designers were well aware that users wouldn't necessarily know how to categorize Siri.
And so connecting her to existing concepts like Eliza was one way to help guide users' thinking about her.
If we think about it, we did not really interact with artificial intelligence
or anything that seemed like it was artificial intelligence
until we started seeing these smart assistants come along.
And so when we encounter
something new from a communication perspective, I'm using the information I have about other
people I've encountered to sum them up. So all of a sudden we're confronted when we're talking
about Siri or Alexa with this talking woman voice that sounds very human. And so we're
going to search in our brains, well, where have I encountered this before? How can I situate this?
But for most people, they went to, in their minds, the only other examples of talking things or
talking computers that they had.
And that was science fiction.
When we wrestle with the ambiguity of interacting with Siri,
or really with the ambiguity of anything at all,
we tend to try and reach for what is familiar.
And in 2011, there wasn't a whole lot that was like Siri.
At least, not in the real world.
Library, computer. History files. Subject, former governor Kodos of Tarsus IV, also known as
Kodos the Executioner. Kodos the Executioner.
Summary. Governor of Tarsus IV, 20 Earth years ago.
Alongside real-world Eliza and the fictional Star Trek computer,
there were other examples that were, shall we say, less inspiring of public confidence.
Open the pod bay doors, Hal.
I'm sorry, Dave. I'm afraid I can't do that.
There are a lot of science fiction stories featuring male robots, like Hal,
that harm or ultimately replace humans, becoming our robot overlords.
So it's not surprising that Siri aligns herself with Eliza and not Hal.
Hey Siri, are you friends with Hal 9000? I'd rather not talk about Hal.
It was very clear that someone thought through, okay, how do we position this in such a way that
it's positioned with the good, safe science fiction and the good existing technology?
And it is nothing like the negative science fiction, the dangerous science fiction.
I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.
And so that's where you kind of see this interesting overlap between science fiction and reality.
These technologies are set up in what we call dyadic communication, one-on-one interactions.
They're made to seem
interpersonal. I think I'm just chatting to Alexa or Siri, but that's not the case because they are
collecting data. Designers need us to feel comfortable and safe inviting these tools into
our homes. In addition to soothing our anxieties about automation,
designers need to also soothe our anxieties about a loss of privacy, power, and control
over very personal aspects of our lives. It definitely does mask when you think you're
just talking with another person or this thing is giving you information, it does mask what's going on behind the scenes. The fact that,
you know, this data is recorded, this data is processed, it'll in some ways be connected to
the user for a little while, and then it's disconnected from them. Humans may be involved
in processing that data. From a practical aspect, data is needed to make these things work,
but the design of these technologies very much effaces that. With all this in mind,
Siri's initial gendering as female in 2011 becomes pretty unsurprising. She is not going to take your
job. She is not going to harm you. Like Eliza and like the Star Trek computer, she is a submissive, helpful,
female assistant. I do want to say, however, and this is really important to note because I think
we focus so much on the fact that, you know, Siri in the United States was female, but Siri in other
countries, sometimes the default was male. And that had to do with understanding of social class and who would have
also done those roles in other countries. So initially, Siri was more of a male voice in the UK
to align more with the Ask Jeeves Butler stereotype.
How can I help you? We see here this repetition of social roles from humans
being put into machines. When I worked as a temp assistant, I was also privy to a lot of data.
Most assistants are, from the boss's meeting schedule to their email accounts to even personal data like their family members' birthdays. But when people interact with human assistants, they know that another person is looking at their information. Would we so freely interact with Siri and Alexa if we could see the ways the conversations we're having aren't kept confidential by our fun and flirty feminine assistants? It's also curious that Siri, Alexa, and the rest try so hard to set themselves apart
from controlling robot overlords, both in terms of their gender, female, and in terms of their role,
assistant. They give us, the user, the sense that we are
the executive. We are in control. But maybe they aren't that different from those sci-fi
robot overlords after all. You're listening to Ideas and a documentary called A Harem of Computers from contributor Jennifer Jill Fellows.
You can hear Ideas on CBC Radio 1 in Canada, across North America on Sirius XM, in Australia on ABC Radio National and around the world at cbc.ca slash ideas.
You can also find us on the CBC Listen app or wherever you get your podcasts.
I'm Nahlah Ayed.
Hey there, I'm Kathleen Goldhar, and I have a confession to make.
I am a true crime fanatic.
I devour books and films and, most of all, true crime podcasts.
But sometimes, I just want to know more.
I want to go deeper.
And that's where my podcast, Crime Story, comes in.
Every week, I go behind the scenes with the creators of the best in true crime.
I chat with the host of Scamanda, Teacher's Pet, Bone Valley, the list goes on.
For the insider scoop, find Crime Story in your podcast app.
Before integrated circuits, before transistors, and before vacuum tubes, the word computer was a humble job description.
It's been in use since at least the 1600s. In 1755, it entered the dictionary. In Samuel Johnson's
A Dictionary of the English Language, computer is defined as a reckoner or accountant.
Starting around the late 19th century, being a computer began to be seen as women's work.
Computers worked in government surveying, architecture firms,
for meteorologists and other scientists, and in observatories.
Qualification. Expert mathematical knowledge, training in
practical work. In a pamphlet published in Britain in 1898 titled A Dictionary of Employment Open to
Women, between the entries for artist's model and bathhouse attendant lies astronomical computer.
Hours. These are irregular, as work is often required at all hours of the night. Jennifer Jill Fellows is a philosophy instructor at Douglas College in
British Columbia. In her documentary, A Harem of Computers, she looks at the cultural history of
feminized computers, how that history has shaped our current technology,
and how our technology, in turn, might be shaping us.
To really understand what it means for the job of computer to be feminized,
and what the consequences of that were, it's helpful to look at one early
example from the 19th century, the Harvard Observatory under the direction of a man named
Edward Pickering. Astronomy in the United States gets its start in 1848, and it is people buying
scientific quality telescopes and giving them to universities.
Harvard is the lead.
And there are a few others.
David Greer is a technology consultant in Washington, D.C.
Before that, he was based at George Washington University, where he researched international policy, tech policy, computer science, and statistics.
One of the things that they discovered very quickly is that you can gather data from the telescopes
far faster than you can do anything with it.
And what we're dealing with primarily
is positional astronomy.
And so you have people that are just asking the question,
where is this and what is this?
And trying to distinguish stars from comets from planets.
But even with those using manual methods,
you could get a couple hundred numbers a night.
When Pickering takes over the Harvard Observatory,
they have a backlog of thousands of pages of observations
that they aren't doing anything with.
They've just taken them.
There's no real science done with it.
All that data is sitting there with nothing to do
while an increasing number of women
are looking for work. It came to a crisis in the 1870s, in Pickering's time, because there were so
many widows left over from the Civil War. And they were looking for jobs that would allow a woman to
support herself, her child, and her mother. Three people. And in office work, that usually involves typing and
transcribing and stenography. But they're also looking for other things in accounting and in
use of numbers. This is the time that women started entering offices in a large way.
Working as a computer wasn't particularly glamorous.
It's interesting, the diaries of the women, it's boring, and they generally don't like it.
And that's one of the things that come out of the computing field.
It is boring work.
And if you're not engaged in the subject, it's horrible.
It was also low paid.
Pickering bragged about it in the Journal of the American Astronomical Society.
He was paying them as little as he could get away with. An article about women computers published in the Kansas City Star in 1900 reads,
The average woman computer makes only $500 a year.
According to the chief computer at Columbia University, science does not pay.
Dressmaking or delicatessen-keeping, millinery or haberdashery is far more remunerative.
But this woman loves her business.
This article hints at something else about the way this job was understood.
There are three young women in the computing room at Columbia.
They are mathematicians, pure and simple, and aspire to no flights in astronomy.
Should they marry, they will compute no more.
Women computers were let go if they got married.
But that doesn't mean that women computers were discouraged from marrying.
If an employer could encourage a man computer to marry a woman computer, or even better,
get her pregnant, then this ensured that the man computer would stay at his job.
Because if you marry a local girl, you're less likely to move if you're a guy. Industry did this.
IBM was notorious for this, of putting young women from the plants and the offices around
basically in front of them and saying,
Hi, marry one of these and she'll lose her job soon enough, but we'll keep her on long enough until she gets pregnant.
In this way, women computers were not only low paid, they were also a good staff retention policy for their male counterparts.
In general, these jobs were framed as dead-end positions, easily replaceable, where
women were told not to have aspirations for more. They were seen as little more than the tools that
would replace them. There was obviously something sexualized about working as a woman computer in the 19th century.
In the case of Harvard Observatory, you can most clearly see this sexualization
by looking at how Pickering and his colleagues referred to the women he employed.
Can we also talk about the phrase Pickering's harem?
Oh gosh, yeah. Pickering's harem.
I think part of that is the professor's joy in having an entourage around them, someone who listens to them and follows their guidance and direction.
I think Pickering was vulnerable on that score.
identifying them as a harem, because don't forget, Harvard is all male at this point,
and is largely viewed as a school where wealthy families parked their sons for four years until they calmed down a little bit and could take over the family business. Calling them a harem
minimized them and, again, put Pickering in a slightly exotic position, but it was the notion
that he needed
people to follow him, that he wanted people to follow him. There's a lot of things happening.
One is, I discovered since then, the Tales of a Thousand and One Nights in a new translation
became wildly popular in the 19th century among English readers. And that popularized the concept of the harem.
Pickering's harem. Of course, this was the era of Orientalism, an imaginary idea of the exotic
East where powerful men had a harem of sexually subjugated concubines. A little much for a
university prof and his assistants. Computer remained one of the few professional jobs open
to women as other doors closed. In the late 19th-century United States, more and more women were entering the professional workforce: doctors, lawyers, scientists. But by the 1930s, women professionals were actually in decline. A public backlash to women in higher
education led to quotas, laws, and other barriers. But computer was seen as just feminine enough.
So by the 30s, you're seeing real interest in mass production. And how do you organize
an office to compute a large amount of
numbers, either economic numbers, how do you guide ships across the ocean, how do you guide planes
from one part of the country to the other? And that required the mass production of numbers.
A mass production of numbers required the employment of a massive number of human computers.
By World War II, the idea that computers were girls was firmly solidified in American consciousness.
Of course, that doesn't mean that all human computers working in the United States were female.
They weren't. Lots of men worked as computers.
But the task begun by Pickering and his generation was completed here.
The job of the computer was feminized.
And this combination of sexualization and subservience made it into cultural depictions of digital computers.
Captain's log supplemental.
Computed, dear.
Earlier, we heard a bit of the 1960s TV series Star Trek and the pleasant, helpful computer.
Working.
But of course, writers couldn't resist an episode where the computer gets a little flirty.
Computer, you will not address me in that manner.
Computed and recorded, dear.
Mr. Spock, I ordered this computer and its interlinking systems repaired.
I wouldn't mind so much if only it didn't get so affectionate.
It also has an unfortunate tendency to giggle.
Not to mention that episode where the Enterprise gets taken over by a harem of robot women.
What a shame you're not real.
We are programmed to function as human females.
You are?
Yes, my lord.
The trope of the sexy female artificial assistant continues in The Stepford Wives from 1974,
Rachael, the replicant from Blade Runner, whose job is literally secretary,
and EDI, the ship's computer,
from the Mass Effect series of video games.
If she could touch you, how would you touch me?
Her is a 2013 movie starring Scarlett Johansson in the titular role of Her. Johansson plays the voice of a digital assistant that the main character, played by Joaquin Phoenix, falls
in love with.
Won't you kiss me?
I would.
Keep talking.
Yeah, those are the characteristic, the personality traits that come up a lot in our culture.
She just has such a sexy voice and she's just so like, wow.
So calm and helpful and non-threatening.
Exactly.
So that's kind of the personality that was intentionally hard-coded into Siri.
That trend of sexualization made its way into Siri.
Here's Eleonore Fournier-Tombs again from the United Nations University Institute in Macau.
UNESCO actually published this incredible report called I'd Blush If I Could, which basically detailed that if you called Siri a slut, Siri would respond, I'd blush if I
could. I'd blush if I could. And that's kind of like a representation of the kinds of responses
that it would give, that the robot would give, basically.
And obviously, this is intentional. That's all these little jokes and all these little
personality traits that were basically hard-coded. Hard-coded means that a programmer specifically
chose what responses Siri would give to certain inputs. That means that Siri's programmers anticipated that users would sexualize her.
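As a rough illustration of what "hard-coded" means in this context, here is a hypothetical Python sketch: a fixed lookup table mapping anticipated user inputs to canned replies. None of this is Apple's actual code; the entries simply echo examples heard elsewhere in this episode.

```python
# Hypothetical sketch of a hard-coded response table. A real assistant first runs
# speech recognition and intent classification; the point here is only that
# certain replies are chosen in advance by a programmer.
CANNED_REPLIES = {
    "tell me a joke": "What do you call a dog who can do magic? A labracadabrador.",
    "who is eliza": "Eliza is my good friend. She was a brilliant psychiatrist, but she's retired now.",
    "you're a slut": "I won't respond to that.",  # the post-2019 style of deflection
}

def reply(utterance):
    # Normalize the input, then look it up; anything unanticipated gets a generic answer.
    key = utterance.lower().strip().rstrip("?!.")
    return CANNED_REPLIES.get(key, "I'm not sure I understand.")

print(reply("Tell me a joke!"))
# -> What do you call a dog who can do magic? A labracadabrador.
```

The design choice is the table itself: whatever the programmers anticipated, and however they decided the assistant should react, is frozen into the product.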
The report from UNESCO called out Siri's responses as reinforcing sexism
and potentially contributing to rape culture by normalizing the sexual harassment of women.
In response, in 2019, Apple changed Siri's hard-coded responses.
I won't respond to that.
Other tech companies, like Amazon and Google, largely followed suit.
But because of the way these assistants are built, changing hard-coded responses only changes so much.
Siri is an application that generates language. So in AI, we say natural
language generation. So basically, it speaks to you. And the way in which it does that is that
it has all kinds of data that it is trained on, like, you know, reports and books and Wikipedia entries and all kinds of things like that,
that help to construct responses to phrases so that the application will understand and
interpret what you tell it and know the appropriate sequence of words basically
to respond to give you the information. And that has also been shown
to propagate a lot of different stereotypes against women.
The way natural language programs are trained is by feeding them a lot of data from the internet.
And you know what there is a lot of on the internet?
Sexism and misogyny.
So guess what the program learns?
For example, I did a study relatively recently, just last year,
which looked at the use of this in translation algorithms
with Google Translate and Microsoft Bing Translate.
There's a quick game that anyone can play to illustrate this. Go to a translation app
and put in a phrase in English, like, she is a leader. She is a leader. And translate it into
a language like Malay. Dia seorang pemimpin. Malay, like a few languages, has no gendered pronouns.
So the translated version is something like,
one is a leader, or they are a leader.
Now, copy that Malay phrase and translate it back into English.
Dia seorang pemimpin.
What does Google Translate give you back?
He is a leader. He is a leader.
He is a leader.
That's not a problem with the Malay language.
It's a bias in the translation software
based on years of biased language on the internet.
He is a secretary.
If you repeat the trick, you'll find that he is a secretary, Dia seorang setiausaha, becomes she is a secretary.
He is a nurse becomes she is a nurse. And the one that really bugs me, she is a philosopher
becomes he is a philosopher.
He is a philosopher.
Google Translate for every single one of these phrases
selected the traditional stereotype role. They are taking care of the children. She is taking
care of the children. Others have done this using other languages. So this is the way in which the
algorithm works. This is an unintentional stereotyping in that unlike this stereotyping
of Siri making it sexy, Google didn't sit back and
think, let's only propagate traditional gender norms in our translation algorithm. But they
didn't really think that this was an issue. So they trained the algorithms on historical data
that has all kinds of things in it and just assumed, had the algorithm kind of assumed that most of
the time when you're talking about a leader, you're talking about a man. So we're just going
to keep it like that. Most likely to be accurate. That is extremely dangerous because it only uses
the past. This stereotyping then isn't hard-coded. It is a digital legacy of the biased world we live in. Google is well aware of
the problem, stating on their blog that machine learning models for language translation can be
skewed by societal biases reflected in their training data. So they are working to correct it.
But that work isn't easy. She is a manager. And while they have solved the problem for some translations,
like Turkish to English,
She is a manager.
they haven't solved it all.
And this isn't only a problem with translation software,
because the same data used to train translation software
is also used to train a number of digital tools, including digital assistants.
Biased data from the past is affecting how our technology treats gender today, something tech companies didn't anticipate.
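For readers who want to try the round-trip game described above programmatically, here is a minimal Python sketch. It assumes the third-party deep_translator package as a convenient wrapper around Google Translate; results will reflect whatever model Google serves today, so they may differ from the examples in this documentary.

```python
# Round-trip test: English -> Malay (no gendered pronouns) -> English.
# Whatever gender comes back is the one the model inferred from its training data.
from deep_translator import GoogleTranslator  # pip install deep-translator

def round_trip(phrase, via="ms"):
    """Translate an English phrase into Malay ('ms') and back, returning both steps."""
    to_malay = GoogleTranslator(source="en", target=via).translate(phrase)
    back = GoogleTranslator(source=via, target="en").translate(to_malay)
    return to_malay, back

for phrase in ["she is a leader", "he is a nurse", "she is a philosopher"]:
    malay, back = round_trip(phrase)
    print(f"{phrase!r} -> {malay!r} -> {back!r}")
```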
Those large chunks of data from the internet that digital tools are trained on are called corpora.
Huge amounts of data about language and vocabulary.
And just like assistants inherit questionable things to say from these corpora, they also inherit problems with listening.
The corpora on which devices like Siri and Alexa have been trained are traditionally
what we call a Midwestern corpora of accents. So our voice is coming out of the Midwestern
U.S. And so that becomes the standard accent.
Halcyon Lawrence is an associate professor of technical communication and information design at Towson University in Maryland in the United States.
Any voice that deviates from that standard, yes, no longer gets recognized easily by these devices.
Part of the development of speech technology is based on the promise of natural language use.
In other words, that you don't have to speak any differently than you do, and that these devices would understand natural language that's produced by human beings.
What doesn't get investigated is whose language are we speaking about?
So I am a native speaker of English. It's my first and only language.
I grew up in the Caribbean in Trinidad and Tobago.
Siri and Alexa might speak English with multiple accents.
Choose the voice you'd like me to use.
Choose the voice you'd like me to use.
Choose the voice you'd like me to use. But that doesn't mean these digital assistants understand multiple English accents.
It was striking to me that not just for personal assistants, but any speaking device, if I
went to an automated teller, if I called a banking system, especially if they asked me to spell my
name, that could take minutes because they couldn't get past how I would spell, how I would pronounce
the letter A. And I have so many A's in my name. And so it started to occur to me that
the promise of natural language wasn't a promise for my language, the way that I spoke. For Lawrence, Siri is not a pleasant and submissive assistant.
She is a disciplinarian. As I described my inability to negotiate with the device
and that the only acceptable solution to be able to be heard and to be understood
was to change my accent, the term discipline came up,
and it really struck me that that's what it feels like. And people talk about it in different ways.
I have a niece who is from Jamaica but studied in the US, and she said,
sometimes I feel like I have to switch my accent to be understood, and I feel so inauthentic.
Growing up in the Caribbean,
in Trinidad in particular, there's a term that we call freshwater Yankee, and it was
a derogatory term. We used it to describe anybody who left to go to New York and came back speaking
with an American accent. And that could happen over a period of years, but it could also happen over a weekend. And it was seen as derogatory because it seemed that that person was putting on airs.
They were talking like that to sound better. And we understood it as sounding better because
of our imperialist relationship with America. It was only when I became a student in the U.S.
decades later and started doing this research that I saw talking white, or talking like an American,
to be a matter of survival, particularly for immigrant populations. So that what we know
is that people who have non-standard or non-native accents experience discrimination in courts of law,
experience discrimination in accessing housing, experience discrimination in school systems,
professors who speak with an accent get rated poorly in their assessments. And so there are all of these spaces where the accent
bias now becomes discrimination, yes, feeds into this discriminatory practice. And so one of the ways that immigrants survive in the U.S.,
and I don't know enough about Canada, but I can tell you, I'm sure it happens as well,
where you begin to decide in situations where, for your own survival, you speak differently.
And I think that is disciplinary, but it's a choice that we make.
What is particularly challenging about speech technology is you aren't given a choice.
You used to have to travel internationally to risk becoming a freshwater Yankee.
But now, people can become a freshwater Yankee without ever walking out the front door.
Your accent is so tied to your identity.
Hearing a friend of mine
speak in her Trinidadian accent into Siri
and it not respond.
And then in a minute,
she switches to an American accent.
It comes to life.
And I'm sitting there looking, thinking,
there was a time when you had to go to the U.S. and spend the weekend and come back speaking like an American. And yet this freshwater Yankee situation is right in your home, not having left your home in the Caribbean.
So when you think about the way that English has become the lingua franca, because it is the language of the powerful, and it has been used to
subjugate people, and to see then that repeated disciplinary action happening in a digital space with a device that seems benign like Siri and Alexa.
Yes? What we're really seeing is this digital colonialism that has been encoded, that this
is a standard that even native speakers like myself are being told, your English is not good enough.
It's hard to think of digital assistants, the long-awaited sci-fi promise of the future,
as something that is holding us back, imprisoning us in the past. But that is exactly what Eleonore Fournier-Tombs thinks is happening. Pointing out our historical biases and inequality.
And at the same time, I think the problem is that there's a lot of these biases that are happening
that we're not aware of, or we can't, we're not noticing. And that's what's really stalling
society. And it has a really big impact on women's lives. So one example that I, you know, I find very striking is that until 2018,
Amazon was using an AI algorithm that would basically filter through applicant resumes,
like a pre selection. And then it was found that this algorithm actually downgraded female
applicants. So specifically, if women had been to traditional female colleges in the United States,
like Barnard in New York, or if you wrote, you know,
I've been captain of the female sports team.
So you had the word female in your resume, you were rejected.
So, well, which helps to explain why a company like Amazon would have
less women. But this was pointed out, and this was only done through a whistleblower, then,
you know, an investigative journalism initiative. And finally, it came out and Amazon dropped
the algorithm. But we don't know how long they were using this. And we don't know how many women's
lives were affected by this algorithm or by other similar algorithms. Amazon is most certainly not
the only company in the world. There's tons of companies that use these algorithms. And so if we
as a society are trying to evolve and trying to have, you know, new norms for gender and have
women be more expressed in society and take more leadership roles and be more involved in technology
development. Or conversely, have men take greater part in care work, you know, have men be early
childhood educators, which is still a huge challenge, like men face a lot of barriers to be
able to have that kind of work
or men be babysitters of children.
You know, this kind of thing
is actually very, very challenging.
We can't do it because more and more
of the tools that we're using
just propagate these stereotypes.
And so they influence our culture
and sort of slow us down.
Fun and flirty Siri, always here to help, wasn't born in a vacuum. In a world in which, not that long ago, harems of computers existed to bolster male egos, Siri was deliberately
designed to mirror our past back at us.
In a world of fears about automation and assumptions about which English accents are acceptable, she assists
some people and disciplines others. And maybe that is the legacy of digital assistants
after all. By mirroring us back at ourselves, they offer us an opportunity for self-reflection.
But only if we are willing to look.
You are listening to Ideas and to a documentary called A Harem of Computers
by Jennifer Jill Fellows, with help from Matthew Lazin-Ryder.
If you'd like to comment on anything you've heard in this episode or in any other, you can do that on Facebook or Twitter or on our website, cbc.ca slash ideas, where of course, you can always get our podcasts.
Lisa Ayuso is the web producer of Ideas. Technical production, Danielle Duval. Senior producer,
Nicola Luksic. The executive producer of Ideas is Greg Kelly, and I'm Nahlah Ayed. For more CBC Podcasts, go to cbc.ca slash podcasts.