Tech Won't Save Us - How Britain Killed its Computing Industry w/ Mar Hicks
Episode Date: March 18, 2021

Paris Marx is joined by Mar Hicks to discuss why we need to know the history of tech and how the British history of sexism and colonialism in computing has lessons for the present-day US tech industry.

Mar Hicks is the co-editor of “Your Computer Is on Fire,” along with Thomas S. Mullaney, Benjamin Peters, and Kavita Philip. They are also the author of “Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing” and an Associate Professor of the History of Technology at Illinois Tech. Follow Mar on Twitter at @histoftech.

Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.

Find out more about Harbinger Media Network at harbingermedianetwork.com.

Also mentioned in this episode: Mar wrote about the story of COBOL computer systems in the early months of the pandemic and how Britain killed its tech industry. Google fired top AI ethicists Timnit Gebru and Margaret Mitchell after their research was critical of the company’s practices. Diversity recruiter April Christina Curley was also fired in September 2020.

Support the show
Transcript
I always balk at the idea of terming something a revolution when it's not about power changing hands.
Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx, and this week my guest is Mar Hicks.
Mar is the co-editor of Your Computer is on Fire, a new collection of essays that digs into the roots of the problems that we face with technology today. They're also the author of Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, and an associate professor of the history of technology at Illinois Tech.
This is a really fantastic episode where I talk to Mar about why understanding the history of
technology is so important when we think about the problems that we're facing today and how many of
these issues have occurred in the past
and have been dealt with in the past, in good ways and bad, and we can learn from those examples.
In particular, we look at how the UK, when it was originally building its computing industry, not only pushed women out of the industry but tried to use the technologies it was developing
to maintain its colonial hold on the world as it was losing its grasp on its empire.
Those examples hold lessons for what we see happening in the United States today,
where the tech industry is dealing with issues of sexism and racism that have been built into it over the course of many decades,
and where the United States has used its technology to increase its
influence in the world by having countries and peoples everywhere adopt it and use it.
But now it's facing a challenge from China, and the fact that its technology can be seen
as a form of imperialism is being shown to everybody. I think you're really going to
like this conversation. Tech Won't Save Us is part of the Harbinger Media Network,
a group of left-wing podcasts that are made in Canada. And if you want to find out more about the other shows in the
network, you can go to harbingermedianetwork.com. If you like the show, please leave a five-star
review on Apple Podcasts and make sure to share it on social media or with any friends or colleagues
you think would find it interesting. Every episode of Tech Won't Save Us is provided free to everyone
because of supporters like you. So if you want to support the work that I put into making this show every week,
you can join people like Neil King and Sophie Uch by going to patreon.com slash tech won't
save us and becoming a supporter. Thanks so much and enjoy the conversation.
Mar, welcome to Tech Won't Save Us.
Thanks so much for having me.
You are the co-editor of this new collection that has so many great critical perspectives
on technology, the kind of perspectives that we really need right now.
So I'm excited to kind of dig into some aspects of it with you.
You know, obviously, it's a really comprehensive collection, so we won't get to all of it.
And I'll recommend that the listeners pick up the book.
You know, when the idea came to you and your co-editors about this collection, what was
the incentive behind putting it together? What did you want to accomplish by bringing these voices
together and publishing them in one volume? One of the main things that we wanted to accomplish
was first to have a conversation with each other, because we were all people who worked in history
of computing, history of technology, media studies, information studies, African American studies, women's studies, all of these
various fields who had similar interests. But in a lot of ways, we found that we weren't necessarily
the people who had the most kind of intellectual friends, I guess, in the rooms that we were in. We sort of had to build our own room out of people who had
similar interests who were interested in talking about these issues. This started happening back
in around 2015. So back then, this was not the gigantic, you know, very public techlash that's
going on right now. It was a little bit more like you had to find people who were interested in
talking about these things.
And it grew from there.
By the time we got to the point of writing the volume, what many of us were interested
in, and you can see this very clearly in the introduction I wrote and in the conclusion
and introduction written by my three other co-editors, we wanted to pitch the book towards
our undergrad students, many of whom are going
into STEM fields and are thinking, how do I navigate this ethical morass, you know,
the set of really, really complex questions and really unnerving ethical questions that
I'm going to need to navigate as an engineer, as a computer scientist out in the working
world, and also have it be
accessible to people who maybe are already out there. Maybe they didn't go to college for STEM,
but they're interested in it. People who might just want to pick up this book or a book and
have an entry point that isn't too abstruse, but also doesn't just paint everything with such a broad brush that, you know,
it's like the view from 50,000 feet. We're trying to give detailed case studies that really show,
okay, here's how we got here, here's different facets of the problem,
and here's maybe how we get out. It's a very historical volume.
Yeah, and I think it's really successful in doing that. Like, you know, I think one of the common criticisms of a lot of the kind of university or college programs that people who end up working in tech go through is that they don't have enough of this kind of critical perspective on technology and its impact on the world, or even the history that we should understand to really know how we got here today, right? So many of these things are excluded.
And I do feel like the book does a really good job of giving us those insights that
we might not otherwise see unless we're really kind of looking for them, right?
As you say, there's a bit more of this kind of techlash, this critical thinking about
tech going on today, but it still is, I think, a minority view in most of the kind of
reporting and writing about technology. Yeah, I think it's definitely gaining a critical
mass now. Maybe that's not the best term to use, but it's certainly the case, I agree with you,
that we have a ways to go before we get to a point where everybody's on the same page about not just acknowledging that harms are occurring, but accepting that it's something that we can't trust the tech industry
to just fix or undo. It's not just a matter of fixing bugs or self-policing. And so that's
one of the things that I'm really gratified to see happening now. I think we're getting more and more to that point. And just a few years ago, just last year, the idea of regulating the tech industry was still
far off. It was still kind of a thing that maybe some people would really like to do,
and a lot of others were really opposed to. And now that's getting closer and closer to being a
reality. Yeah, and hopefully one that is achieved. Obviously, as you said, you are a
historian of technology, among other things. And I feel like history is so important when we consider
these questions of technology, because it's a topic that a lot of the modern tech industry likes
to ignore or, you know, not talk about, because they like to make up their own narratives, even if the historical
record disputes them, right? But you write in one of your contributions that each time a corporation
gets too big, it seems like a new problem, even though these trends have been repeated
time and time again, if we actually look at the history of what has gone on here.
We have these kinds of realizations, these issues come to the fore, and they seem new
and novel. But what we often find is that there are people in these institutions who have been
trying to get us to pay attention to them for a long time, but have been ignored, right? Because
they do not have the power in these organizations. So what do you see as being the importance of
learning about the history of these industries? And what does learning about the history of computing tell us about what kind of effects computers have actually had on society and on the people within these industries?
As you point out, history is so important because one of the things
it does is it removes plausible deniability. If you know history, then no, you don't get to say
we didn't foresee these effects that this technology is going to have. It's impossible
to say that. And it's also impossible to simply claim that, oh, these are bugs that we're just
going to fix, because that has, at certain points, been used as a very disingenuous excuse for letting technologies out into the world that are, you know, really just in the beta test phase, and kind of unleashing them and just allowing them to get out there and get market share. And whatever happens, happens. The venture capitalists don't care. In fact, they're fully behind that model of getting market share because it means they're going to get more return on their investment, even if it does harm. And I think history also takes us beyond just the present problem, to the way that sometimes when we talk about the problems
we're having with high tech and computing in particular right now, this connects up with
all sorts of different industries and histories of other industries where corporations have gotten
too big and they've become really, really damaging to our economy, to our environment, to our democracy even. And then they've reached a
stage of power where they seem pretty much unkillable, but then they get regulated and
they don't have that power anymore. And it oftentimes happens very quickly. So we see these
unkillable giants of Silicon Valley now, and it seems almost like we can't get ourselves out of
this problem or this set of problems that we're in, but that is not the case. And it's in many
ways, I think, very reassuring to look at the history of what happened with the automobile
industry, the pesticide industry, the oil industry, even what happened with AT&T and IBM and Microsoft in history that's more recent and more connected to computing. That really shows us not only what is possible, but the playbook that we have to follow as
workers, as citizens, as people organizing our workplaces or trying to pressure our elected
officials to regulate things so that they won't hurt consumers or they won't hurt our
democratic process. So I
think that's my pitch for history as something that is not just useful, but also can make us
feel a little bit better about not being so hopeless in moments where things seem quite
messed up, seem like everything's falling apart. Absolutely. You know, I think you make so many
good points there. And I would just add,
I'm in the middle of writing this book about transportation and technology and kind of the
proposals that are being put forward for the future of transportation. And it's been fascinating to me
to go back and read about the history of automobility and even the history of tech and
computing industries, how that's kind of arisen over the course of the 20th century. And just to see how so many of the issues that we treat as kind of novel to today have such
a long history. And you can see going back a century, right? So many of these problems are
not new at all. Yeah, exactly. So in those lessons from, you know, for instance, the field of automobiles, and the way that automobile infrastructure was built and in many ways was actively used to get rid of other ways of moving about cities, that is in some ways, I think, a great comparator, because
so much of what we're facing up to now has to do with recognizing that computing
and high tech, it's infrastructure and we have to look at it as infrastructure. It's not simply
consumer products. It's not for convenience. It is fundamental infrastructure and we have to start
regulating it, treating it as such and not allowing, you know, one or a handful of companies
to say, well, this is not about your
rights. It's about our profits. Yeah, I couldn't agree more. You know, there's another piece when
you were discussing the history in, you know, the piece that you wrote that really stood out to me
as showing a contrast with the narrative that is usually brought up about technology from Silicon
Valley. And you quoted Joseph Weizenbaum,
who was an early AI researcher, who said, and I quote, the computer has from the beginning been
a fundamentally conservative force. Superficially, it looks as if things have been revolutionized by
the computer, but only superficially. You know, that goes against this idea that Silicon Valley
puts out that it's this kind of progressive thing that is changing the world in this really positive way. I was hoping that you could talk a little bit more
about that and why Weizenbaum is positioning the computer as this conservative force.
Yeah, thanks for bringing that up. I really like that quote and Weizenbaum's work as he's working
through AI research. He's also working through the ethics surrounding what he's
doing. So Weizenbaum was the guy who invented the ELIZA chatbot, you know, the chatbot that's a sort of Rogerian-style psychotherapist or something like that. And he did it as almost a programming
exercise. But when he started to see people interacting with it in a way that was
legitimately therapeutic for them, and the people treating the chatbot almost as if it were a person
they were talking to, he started to get squeamish about the fact that he had created this thing that
people were interacting with as though it were not a machine, as though it were a person. And
it led him to reflect on a lot of the other things
that he had done in his career, many of which are pretty dry and, you know, administrative or
bureaucratic, like he helped computerize banking systems. And later, he said, you know, when you're
attacking a really difficult technical problem, you just do it because it's fun to have this
challenge and to solve this challenge. But then later on, he started to realize, well, this actually allowed institutions that maybe ought to
have been changed to continue to trundle along in sort of the same broken ways they had been doing, get even more entrenched, and then become more powerful and to scale and to
centralize in ways that wouldn't have been possible without
electronic computers. And so that's what he meant by saying computers were a fundamentally
conservative force because they were conserving older ways of doing things instead of forcing
a new way of doing things. Not to say that forcing a new way of doing things is necessarily good.
We've often seen how that can be really disruptive in a negative sense. But he completely disagreed with this idea that there was a computer revolution
because it wasn't revolutionizing, it wasn't transforming or changing most of the industries
into which it was going. And I always balk at the idea of terming something a revolution when it's
not about power changing hands.
Revolution is a political term. And if there were a computing revolution, then what that really
should mean is that the computerization of the United States or other nations changed who was
in control. And it really didn't. If anything, it just amped up the billionaire class. It just
consolidated power in the hands of the people who already have
it. And that's one of the reasons that computerization has been so successful. Honestly,
it's because it has been an enormously powerful force for consolidating power in the grasp of the
people who already hold it. You also said something in your question about the neutrality of technology versus
technology being seen as an inherently good and progressive force.
I always like to point out, and I know this is something you've thought about a lot from listening to your other shows, that there's this really kind of bizarre paradox that too often Silicon Valley corporations or startups get to just throw out there: they get to say, you know, we're making the world a better place, and then the minute something
bad happens, they say, that's not us, our technology is neutral. Well, which is it? Does
it tend to do something that is good or negative? Or is it neutral? Because it can't be all three
of those things at once. There is a person,
an ancestor in my field, Mel Kranzberg, who had a great line where he said,
technology is neither good nor bad, nor is it neutral. So that's a more complicated theoretical
way we can start getting into it. But I would just say on a very basic level, you can't say
this is inherently progressive and then also say it's inherently neutral. Those two things conflict. And once that's pointed out to people, I think, you know, in my classes, at least, a lot of my students, once they really grasp or grapple with this fiction of technological neutrality, then they start getting a much more clear view of the sorts of things technology is actually
tending to do. Not that it's always going to do those things, but there are affordances and there
are certain things it's going to tend to do depending on how that particular system was
designed. Absolutely. You know, everything that you are talking about there makes me think about
kind of the ideology of Silicon Valley and how that
evolved over the course of many decades, like in particular, the Californian ideology and the
bringing together of these kind of free market ideas with this kind of technological determinism
and the countercultural ideas so that politics is kind of reduced to this force that's not as
important because now we have the market and technology, right?
And what that ultimately serves to do is to benefit these often white affluent men who are already in control of this industry at this point, and we'll get to how it gets that way,
but they are able to benefit. And we see even today, it's still like that, where they are the
ones who are really controlling this industry for the most part. Yeah, that's right. And it tends to be the case that whoever is in control,
they build the world that they want, they mirror themselves in the technology. So it not only is
controlled by a certain group, but it mirrors their interests. And it means that other people's
interests and even needs, you know, needs that might be fundamental to somebody's life, liberty, and pursuit of happiness get written out of the equation, get wholesale ignored.
Or in some cases, and this is one of the things that I really hate to see, they get told, oh, that's something we'll fix later.
That's a side issue.
What's happening to you is just a bug in the system.
It will get worked out.
And of course, it never does. And even if it did,
that's a really, really threadbare excuse to say, we have designed this system that is not for certain people. And so you're just going to have to suffer for a bit.
I want us to go a bit further back in history now to talk about your main contribution to this
collection, which is obviously based on your earlier work that you published in your first book, Programmed Inequality. And that's looking at kind of the history of the British
computing industry and the role of women and gender in its kind of evolution. So you talked
about how in the 1940s, the UK was a leader in computing, but by the 1970s, it had lost that
kind of edge, that kind of leadership role. And how a key part of that was how it dismissed and denigrated the skills and the roles of women who were key to that earlier success.
So can you talk a bit about what was going on in that period, how computing came to be seen as masculine and male-dominated, and how that's tied in to the gendered labor change? And that story had something to do with the way that labor can really affect the fortunes of an entire industry.
The reason this gendered labor change occurred, this gendered labor flip from feminized to more male
identified work, was that computing work didn't change or become harder or more technically
difficult somehow. It stayed pretty much the same throughout this period, and maybe got a little
easier as more high-level programming languages and other tools to make coding easier came into the frame.
But what did change was the perception of the work. And as the work gets perceived as more
important and becomes professionalized, all of a sudden it's no longer seen as suitable for
low-level workers like these technical women who aren't really supposed to be in charge of
anything. They're just kind of drones in the system doing the coding, doing the technical work.
As it becomes seen as more important, and a person who has technical expertise has to be somebody
who, for instance, manages people as well as machines, then there's this huge push to get
women out of the field and to get men who might not necessarily have the technical skills.
They're oftentimes trained by the very women they're replacing.
But what they do have is they have the assumption that they will be good managers, that they will be people who know how to make decisions that will be good for the country or an industry or a corporation.
And so they are more trustworthy and they're more
aligned with management. So that is why this gendered labor flip occurs. And to me, that's
very interesting because of two things. The first is that it doesn't happen because of women somehow
not being good enough to do the work. That's not at all what's happening. In fact, like I said,
many of these women are training their male replacements. And the other thing that was
interesting is that as I started to see what was happening with this gendered labor change,
I saw how it exacerbated a programmer labor shortage. And this programmer labor shortage caused the UK government in particular to make
some very strange decisions, or maybe not strange, maybe they seemed sort of like logical decisions,
not the best way to solve the problem, but decisions about trying to consolidate their
computer hardware industry. And this came back to bite them really badly. It essentially caused the implosion of a once successful computing industry in the UK. And it was all a butterfly effect from the labor crisis that the government and industry had engineered themselves by trying to get women out and to get a different caliber of management aspirant young men into computing jobs. And one example that I
really like to give, and I talk about this in the chapter in Your Computer's on Fire, is the example
of Steve Shirley, Stephanie Shirley, her nickname was Steve. And she ends up hitting the glass
ceiling and not being able to advance in government or industry as a technical worker. So she does what a lot of women at that time did,
left when she got married, had a child. But instead of just leaving the workforce,
she creates her own software startup. And her idea is to try to give her employees a family
friendly, flexible working environment, because what she's going to try to do is hire
all the other women like her who are getting pushed out of their programming jobs, and she's
going to give them a working-from-home, flexible work situation in a very, you know, she said she
had a feminist business model as she was doing this, and then they're going to try to get contracts
to do programming, so work from the outside in.
And she ends up doing a lot of really important projects for UK government and industry. Her team programs the black box for the Concorde. And this is all as a result of the fact that they
had gotten fired from their straight jobs, and now there's nobody to do the work. So,
you know, companies and the government have to go and hire this ragtag group of programmers
working out of their homes while they're minding their children and so forth.
And it just shows the absurdity of this kind of attempt to force talented, qualified people
out of their jobs.
But it happens nonetheless, because it's not about meritocracy. It's about consolidating power, like so much in computing and many other industries. It's about keeping certain power relationships in place and
strengthening them. Yeah, I thought that was such a fascinating story, like piece of history that
you described there, and especially the story of Stephanie Shirley. And obviously, I think that
what she accomplished by setting this up was brilliant. But it also made me think about, like, because of the discrimination, the sexism that was
in the British public service, you lost these skills that could have been kind of in-house,
like within the government. And instead, they have to go out and kind of outsource this work,
because they're not able to have it in-house
because they don't recognize the skills of these women that could be doing that work, right?
And so today, I think that we recognize that the degree of outsourcing that we have is a problem.
And we see tech as something that the public sector can't even really do, something that has to be left to the private sector.
And so I don't know how this relates, so I wanted to ask you: the people who had the skills in the early days of this industry, and for a number of decades after, were forced out, and then these governments kind of had to outsource these contracts to get that work done. Do you think that contributed at all to these ideas that tech is something that has to be in the private sector and not the public sector? Or
do you think that's completely separate? I think there is a relationship there. I think that in a
lot of ways, this idea that tech is better left to the private sector is the result of, what's the
best way to put this? I don't know, a level of laziness or disinvestment in technology
on the part of many governments. Obviously, when it comes to military technology or needing
technology for the military, there's a huge amount of investment. But when it comes to
simple technologies of administration, bureaucracy, technologies that are, for instance,
used to pay out unemployment claims, there isn't. There was just a story in this past year about how the unemployment systems in several states were breaking down under the load. Initially, those states said it was because they were using an old programming language called COBOL, and that, oh, we just need to upgrade the systems.
But then it turned out that wasn't the case. What had happened was the COBOL part was working, but the Java part wasn't working. And they didn't have people on staff around to fix
it because of all of these austerity measures that both the federal and the state governments
have been putting into place over the course of the past decade, at least, where the services that
all of us pay for as citizens out of our taxes,
we are given short shrift. There is this austerity logic that is inflicted on not just citizens,
but on all of the services that they might need from the government. And so there aren't,
for instance, programmers around to fix a system when it goes down because they have been fired,
and they're just going to bring
outsourced people in to fix it when there is an emergency. And that really isn't as efficient,
but it is a way of cutting budgets. And so I think there's definitely some connection there.
The other thing that I'd just point out is that when we're talking about that example with Steve
Shirley, you know, she was able to hire maybe a few hundred women who were
getting forced out of their technical jobs. But there were thousands and thousands of women getting
forced out of these jobs. And for the most part, that talent was simply lost. It was flushed away.
Now, these women may have gone on to do other very important things like raise families or work in
other fields, but their technical
expertise was lost. And even more importantly, their technical leadership was lost as well.
So it's one thing to give a person or a labor force, a group of people an opportunity to use
their skills, but it's another thing to give them an opportunity to rise above or, you know, have a career progression that puts them into a position of power eventually, into a position of calling shots about maybe how systems should work or how things should be done, contributing not just to one workplace, but to the way that, you know, an entire industry maybe sets priorities,
the way that a nation state, a government sets priorities about whose needs are going to be met.
And so that's the thing that I think I really want to just highlight here, that it's not just about
letting people do the work if they have the skills, it's about not shutting them out of
positions of power where they actually get a say and
they get to make the world a better place through the work that they do.
I've been thinking a lot about this sort of waste of talent situation that went on in
the past in the UK, because right now, you know, we're seeing similar stuff happening.
I'm sure you've been looking at what's happening with Google and its AI ethicists, for instance.
Absolutely. And, you know, I think that is an essential point. And
it just provides even more examples of how this idea of meritocracy is completely a myth that
is just made up to kind of justify the structure as it exists, right? One thing I found particularly
interesting about the example that you gave of what happened in the UK is that it seems like it's this example of an industry that was kind of de-unionized, or, you know, where workers were not considered as important, as a result of the introduction of various technologies.
Can you talk about that aspect of it, that change that happens to make it something that's
acceptable for men to go into? Because they didn't want to be associated with it when, you know, it was just this kind of computing job where you were considered a computer, as many women were at the time.
You're 100% right.
Usually the introduction of further automation means that a field feminizes instead of becoming more male-identified.
And usually it means other stuff.
Like you pointed out, it means that unions oftentimes get broken or lose power when automation comes into the picture.
And computing is an interesting case because it goes the opposite way in certain respects.
So it doesn't result in more feminization, but it does certainly break the power of unions, because as it professionalizes, as it becomes more and more associated with a certain class and kind of caliber of mostly white men, in this situation in the UK, it becomes an area where it's very hard
to advocate for the need for unions because the idea is that, oh, no, these are professional
workers whose interests are more aligned with management rather than with the working class or
with people who would need a union. And the thing about, I think, what's going on there and with
this process of, you know, okay, how does a feminized field become male-identified? It isn't
easy and it doesn't happen gradually. It requires a hard shift. So when the UK government, for
instance, tries to change the makeup of the people in these jobs, they have to explicitly say,
we are not going to recruit the people who are already in these machine grade classes for this
higher computing work. Even if they're already doing computing work, we're not even going to
let them apply for these jobs because they're trying to force a wholesale shift. And the men,
the young men who they're trying to recruit, oftentimes have very little interest in this
work because they think, oh, it's kind of mechanical. That gives it a little bit of
a working class flavor. They don't really like that. And it's feminized and it hasn't been
unfeminized yet. So they think they're just going to get stuck in a pretty low grade type of
position. And it isn't an easy switch. That's one of the things that causes this programmer labor
crisis because there aren't young men flooding in to take these jobs early on. And a lot of the
young men who get trained on the job to do
computing work stay in it for three, six, nine months, and then they just bop out to do
something else because they can get another good job that puts them on the management track where
they don't have to mess around with computers. And to a lot of folks, that's, you know, that's
preferable. That's such a fascinating piece of history. I had one more question about that before we move on to something else. You know, obviously,
you're describing how in this period from about, you know, the 1940s to the 1970s, the UK goes
through this period of, you know, professionalizing the computing industry, as a result, kind of
partially privatizing it and pushing women out of the industry in favor of men. I was wondering if
you could describe what is going on in the United States at the same time. Because, you know,
obviously, you're talking about how the UK was a leader and loses that leadership role. And we know
that the United States obviously kind of takes up that role, as I understand it, at least. So can
you describe what's happening in the United States and why the United States is able to
kind of persist, even though it has, as I understand it, similar sexism in those computing
industries as well? I'm so glad you asked that, because that's a question that I get all the time.
And it's a really important question. People say, well, okay, fair enough. Maybe this is what
happened in the UK. But if this was really caused by sexism and sexist labor practices,
then the US had all the same sexist and racist labor practices. Why don't we see the same
problems occurring in the US in this period with sexist labor practices hurting the progress of
computing? And there's a twofold answer to that. And the first part of that answer is,
if you think about the relative size of these
nations at this point in time, and you think about how strong economically the US is coming out of
World War II and how relatively weak the UK is, and you look at how many people are in the field
of computing early on, it's a very small number, relatively speaking. So a small number of people
can make a big difference. If you have a labor
force and a population and a tax base that's just vastly bigger, like in the United States,
you have so much more slop in the system to protect you from failing in the same way. So
that's part of what happens. But if you have, for instance, read Margot Lee Shetterly's Hidden Figures, you know the US space program depended on Black women doing technical work, and it wasn't because they were somehow, you know, just wonderfully equitable.
It was because they needed this labor to try to win this technological proxy war so we wouldn't get nuked from space.
That was what they were afraid of at
this point in time. And so you do see examples of the ways in which labor discrimination hurt
high-tech projects in the United States. And if you spin that out a little bit further,
you take it out of the immediate post-war period where the U.S. is so relatively rich and powerful,
and you look at what's happening today and how we're competing
against other nations that have much bigger labor forces and are investing in technology in different
ways, you see the ways in which we are losing because, for one thing, we are not adequately
leveraging our workforce. And I mentioned what Google's doing with its AI ethicists now. I'll
just return to that for a quick second because a lot of times when I talk about my work, people say, oh, isn't that wild?
How did the British make such a set of stupid errors and shoot themselves in the foot this way?
Well, look at what's going on now. Not one, but two, possibly more than that at this point,
brilliant AI ethicists and brilliant Black women, one of whom
was a recruiter for Google, one of whom, Timnit Gebru, was a lead AI ethicist, they're getting
fired for doing their jobs essentially too well. And this goes back, I think, to what you were
saying about Weizenbaum thinking about computing as a conservative force. It's not about
changing things. It's about keeping things the same. And if certain people are, you know, in the
game to change things, even if that's to improve things, they're not going to be shown the red
carpet or welcomed in. They might, in fact, be just barely tolerated and then potentially pushed
out or fired. And that's something that I think we're
going to see hurting our technological companies, our technological endeavors in general. I think
we've already seen it hurt them, and yet it continues to happen. So it's not like the
British were so stupid or doing things that we don't see happening in our own backyard right now.
Yeah, I think that's an essential connection to make, you know, between what's happening today,
what's happened then, and what Weizenbaum was saying. So I'm really happy that you brought
that all together. Now, obviously, there is another piece of that article that I think is
also really relevant to today. And you describe how the British developed these computing technologies not just to be used within the United Kingdom to kind of rationalize and manage production and the various aspects of the state, but also pushed them out to the colonies as Britain was kind of losing its grasp on its
empire. And so you point out that the British state saw computing as a means to maintain that
colonial power that was slipping at the time. So can you tell us a little bit more about that period of history and what
was happening there and the goals that it saw in using these computing technologies in that way?
That's one of the really interesting and just almost heartbreaking parts about the way that
the UK shoots themselves in the foot. I shouldn't say
heartbreaking, it's just ironic, because they view computing technology at this point in time
as their last best way of maintaining any kind of global political relevance. Now, part of this is
quite sinister. Part of it is that they think computing will be a way of getting a backdoor into the governments and the banking systems of other nations. That's one of the reasons that the UK government is absolutely opposed to allowing, for instance, IBM computers or American products to run their government or to run their banking systems because they see it as an issue of national security. But they flip it around and they try to use that same sort of technological influence
or that sort of technological backdoor into nations that have newly gained their independence.
So in India, in Pakistan, they try to get British computers into as many places as possible
in government and industry, especially the banking industry, to try to get a sort of neo-colonial foothold using this new technology. So they can't dominate
by guns and boats anymore. Well, maybe they can dominate economically by computing systems.
And so it's not just about gaining market share. There is a neo-imperial project behind what's going on with ensuring that British
computers specifically run not only Britain, but run other places in the world that Britain wants
to maintain some level of influence and control over. And the other thing that's going on here,
too, with the sort of neo-colonial aspect of computing technology is that when we talk about the UK doing this,
and we see the UK empire slipping away in a material sense, and the British kind of very
forthrightly saying this is going to be our only way to maintain power on the world stage,
I think we need to make the comparison with what's going on in the United States. The United States isn't losing
a material empire in the same way, but it's doing so many of the same things in terms of trying to
use technology as a colonizing force in a lot of ways. And so that hasn't stopped, you know,
that's still going on. The book that, you know, we're talking about today at the beginning of the
show, Your Computer is on Fire, it has this wonderful chapter by Dr. Halcyon Lawrence of Towson University, and it's called Siri Disciplines.
And it's all about accent bias and the way that even though, you know, half or more of the people
who speak English in the world don't speak it with an American, a British or an Australian accent,
those are the accents that speech recognition technologies are built to work with and recognize. And if you don't speak English with one of those accents, then you have to code-switch to, for instance, maybe get bank account information or whatever. This is part and parcel of technological colonization.
And maybe it doesn't seem so serious at first, but look at all of the ways in which this is
connected back up with structures of economic power and dominance, with structures of political
power and dominance. And then also just with, as she puts it, the violence of being forced to not
be able to speak in your own voice, being forced to speak in another
person's voice simply to be heard at this point by machines.
Yeah. And, you know, I think we see
in so many instances how there are so many of these biases built into the technologies because
of the people who are developing them and, you know, the parts of the world where they are
developed, but also envisioned to be used, right? As you're saying, UK, Australia, these parts of the world that are the Western
developed nations, and they are the ones that these technologies are being developed for.
But obviously, as you were describing what was happening in the UK and what it was trying to do
with its technology as a colonizing force, it's impossible not to think of how the United States has seen its
technology as well, and especially how it sees it right now as China is kind of rising to rival it,
not only as an economic force, but also as a technological force in the world.
And I think it's really startling then to think about how back in, I guess it would have been around the 60s when the UK is doing these sorts of things and is saying, okay, we can't have US technologies here.
And we also want to spread British technologies because that will give us certain forms of power.
I think it's kind of startling how over the past couple of decades, I think there's just kind of been this narrative that we shouldn't question the kind of
global domination of American technologies. And that is just something that is normal,
and that we should accept. And like, why would we question this? And I feel like now we're
entering this period where that is becoming an open question. And I find that really interesting.
I agree entirely. It's very frightening in a lot of ways how there are legitimate conversations
going on, not just in Silicon Valley, but in policy circles about how it is somehow a moral
imperative to have American technology not just running the US, but running other nations or
sort of dominating the internet, the sort of shared technological space that we have online,
and that if that doesn't happen, somehow the world is going to be in peril, not just that it's going
to be against US interests, but that oftentimes people say, oh, it's going to mean that, you know,
democracy ends, or it's going to mean that things get worse on a global scale, which I think in some
ways is disingenuous PR. But I also think that it's
something that has been a very, very effective way of getting funding for a lot of technologies,
either from the government or from private investors. And that as we see the self-styled
AI wars ramp up between, for instance, the United States and China, that we're going to see more and more of this rhetoric. We're going to see more and more of the idea that somehow these technologies are,
again, like in the Cold War with space technology, a technological proxy war,
and that losing that technological proxy war doesn't just mean losing market share or getting
a black eye somehow technologically, it may mean something
like getting into a hot war. It may mean something much more materially injurious or disastrous for
the nation that's on the losing side of that. And I think that all of those things, as you point out,
they have to be part of this conversation about technological, quote unquote, progress, because
there's no such thing as simple technological
progress where the rising tide lifts all boats. There are always winners and losers. The future
is always distributed unevenly. We can see it right now with our diverse population in the
United States, how technologies that are great for certain populations are actively harming
other populations right in the same
communities a lot of times. And when we expand that and we look at things globally and we look
at the maybe same or similar technologies being developed simultaneously in other nations,
it just becomes a more and more difficult question where there aren't any easy answers.
And it really does involve, you know, a lot of careful historical thinking about,
okay, how did we deal with a lot of these issues in the past? We don't have to reinvent the wheel every time; there are some answers that we can draw on past experiences for. It's not like that's going
to make it that much easier, but at least we won't have to, you know, flounder around for a decade or
more before we even figure out what path to try
to take. Yeah. And that completely relates to what you were saying at the beginning, right?
About how knowing this history is important and that so many of these problems are not new. And
if we look back, we can see that similar things have happened before and that might contain lessons
for how we deal with things in the present. So there were two quotes that really stood out to me that I think suggest some way for how we
might think about technologies as we move forward. And the first of those is you wrote,
for decades, the corporations that built the digital economy have tried to make us think
that we could consume without end, speak without consequence, and outsource our responsibilities
for ethical systems to someone at
a higher pay grade. We now see how wrong that was. And I think we've been talking a lot about that
in this interview to kind of illustrate these points, right? And then in the second quote,
you write, supporting older, more stable technologies that enhance our society,
like the postal system, traditional news media, and citizen-funded public health,
is as important
as rejecting newer technologies that threaten to disrupt and divide. And, you know, I think that
also stands out with what you were talking about, about the COBOL systems that we have been ignoring
for so long. So I was wondering how we should be thinking about technology into the future,
kind of reflecting on those two quotes and what they kind of tell us about technology?
Yeah, that's the $64,000 question, right?
Well, there aren't any easy answers for all situations, but I know that everybody wants
some amount of, you know, actionable, specific answer to this question.
So I will try to say a couple of things that I think are important for all of us to keep in mind as we approach, you know, exactly what you're talking about, which
is a situation where we're having to confront the fact that we are responsible for fixing these
problems. We didn't start these fires, we didn't create these problems, but we are going to have
to fix them, and our tax dollars are going to have to fix them. Think about it, you know, the way that there are Superfund sites.
Well, you know, corporations polluted sites in the United States and now they're toxic
and now those corporations are gone and we have to clean up the mess with our tax dollars.
A very similar thing is happening with high tech.
We have to clean up these messes.
And it's not necessarily going to be the sort of thing that we want to do or should have to do, but it's not something we can trust Silicon Valley to do, to self-correct. The free market isn't going to self-correct in this way, because it's too profitable to leave these systems broken. It's just too profitable to have systems that are broken in certain ways. So I've been really gratified to see how much labor organizing has been taking hold in high tech, and how more and more workers who are white-collar, more middle-class, the sort of people who might go into middle management themselves in a few years, are realizing that they really would benefit from
being organized and getting together with their fellow workers to try to, for instance, put
pressure on the companies they work for to say, no, you can't do this. This isn't ethical. Or you
can't fire that person for being a whistleblower or saying that, you know, your AI system should
be more ethical. So workers are building power. They're organizing to try to
combat what's going on and not be divided and conquered because as an individual, you have very
little power, right? That's why a corporation doesn't want a union. It's because you don't
have any power if you are not also a group going up against this structure of corporate power. And the other thing is that on a political level,
consumers, workers, citizens are starting to call for regulation. They're starting to call
for an end to the kind of free pass that so many high-tech startups, which in many cases then became very, very powerful titans of industry, have had for a long time and continue to have, to the point where their power is such that they've almost become pseudo-governmental.
You know, they have such enormous power that they make decisions about, for instance,
whether political figures should be speaking in public or not. And having those decisions made by a corporation with its own specific interests, that's a really scary place to be for a democracy that wants to continue as a democracy.
And so I think that those are the two things that I'm really glad to see starting to happen.
And I think there's a lot of resources for folks out there who might want to get involved in either policy
initiatives, policy change initiatives, or in labor organizing. You know, you can kind of go
through the mode of grassroots labor organizing, grassroots political organizing. You can try to
effect change in both spheres at once. And it's going to be the coming together of these two
trajectories that is going to get us
there. Because in general, you know, if you look at the ways that positive change gets made in
these situations in the past, where corporations and industries sort of become too powerful for
their own good, this is the way that the system gets corrected. And it's not easy, but there is a path there to follow. So in terms of concrete
answers about what folks can do, that's one thing that I would say. It gives me some hope, or reason to be cautiously optimistic, seeing what's happening in those spheres.
Absolutely. Those things give me hope as well. And especially seeing how more and more people are paying attention to the power of tech and how it can utilize its power and why that power needs to be challenged. Mar, this conversation has been super enlightening, just in the same way that the book is. And obviously, again, I'll highly recommend that everyone go pick up the book and read the great insights that are in it. And I thank you so much for taking the time to chat and sharing your insights with us.
Thanks so much for having me. It was a really great conversation.
Mar Hicks is the co-editor of Your Computer is on Fire, a new collection from MIT Press,
and you can find more information on how to get it in the show notes. You can follow Mar on Twitter at @histoftech. You can also follow me at @parismarx, and you can follow the show at @techwontsaveus. Tech Won't Save Us is
part of the Harbinger Media Network, and you can find out more about that at harbingermedianetwork.com.
And if you want to support the work that I put into making the show every week,
you can go to patreon.com slash tech won't save us and become a supporter. Thanks for listening.