Tech Won't Save Us - How to See Tech Like a Luddite w/ Jathan Sadowski

Episode Date: January 16, 2025

Paris Marx is joined by Jathan Sadowski to discuss the relationship between technology and capitalism, and what lessons can be taken from the Luddites to properly assess and understand these systems. Jathan Sadowski is the author of The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism. He's also the co-host of This Machine Kills and a Senior Lecturer in the Faculty of Information Technology at Monash University. Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon. The podcast is made in partnership with The Nation. Production is by Eric Wickham. Also mentioned in this episode: Jathan wrote about AI and the Tinkerbell Effect in Futurism.

Transcript
Starting point is 00:00:00 The thing that really drew me to the Luddites was understanding and seeing how they really embodied that form of like socio-technical knowledge and did so in a way that followed its conclusions to their radical end. Hello and welcome to Tech Won't Save Us, made in partnership with The Nation magazine. I'm your host, Paris Marx, and this week my guest is Jathan Sadowski. Jathan has been on the show before, but it's been a little while. But he has a new book out called The Mechanic and the Luddite, so I knew I had to have him back on the show so we could discuss it. Jathan, of course, is also a co-host of This Machine Kills and a senior lecturer in the Faculty of Information Technology at Monash University. Jathan's book is great for many
Starting point is 00:00:58 reasons because it digs into how so many different technologies and forms of technology affect our lives today. But one of the things that I've really wanted to focus on in this interview, and it's kind of in the title of the book itself, is the way that Jathan frames how we should be thinking about technology, the approach that we should be taking to assessing how technologies affect our lives, how they actually work, and so that we can develop this critical, one might say, Luddite perspective that we need to have in order to be able to grapple with the serious ways that all of these things being pushed out from Silicon Valley and the wider tech industry are affecting and in many ways degrading how we live. And so while there
Starting point is 00:01:41 were many things I could have picked out of Jathan's book for us to discuss in this interview to dig in with, because he does so much great work on technology in the tech industry and actually looking into those wider impacts, I thought it would be a good idea to explore that aspect of the book, maybe to help people to develop better skills at assessing the technologies that they run into in their lives, to start thinking through these important questions about how technologies are built, about how technologies work, about the broader social repercussions that come of rolling, you know, so many of these systems out into our society, and how there are certain ways of approaching them and looking at them that will really help in, you know, developing a proper understanding of what is going on. And so we do really focus on these two ideas of the mechanic and the Luddite as different
Starting point is 00:02:29 forms of thinking about technology, of approaching technology. And hopefully you find those helpful in the way that you think about these issues too. As you'll hear in the conversation, I found the idea of the mechanic to be particularly thought-provoking. And it made me think about how there are certain technical skills and certain ways of understanding technology that, you know, I don't know if I would say that they're lost, but it feels like that there simply isn't as much of in society today. And, you know, I've been considering whether that
Starting point is 00:02:57 is a bad thing and whether, you know, we should have more of that technical understanding, not meaning, you know, that everyone needs to learn to code, but in a different sense, and how the ways that these systems are developed by major tech companies today are designed to ensure that we don't really have those skills. So that's all to say that I really enjoyed Jathan's book. I found it really thought-provoking, and I think that you will as well if you choose to pick it up. But either way, I think that you'll enjoy our conversation and our exploration into this particular aspect of what Jathan has been writing about. And if you do, make sure to leave a five-star review on your podcast platform of choice. You can also share the show on social media or I can keep having these critical in-depth conversations. You can join supporters like Peter from Berlin, Philip from London,
Starting point is 00:03:48 or Rolf in Bern, Switzerland by going to patreon.com slash techwontsaveus where you can become a supporter as well. Thanks so much and enjoy this week's conversation. Jathan, welcome back to Tech Won't Save Us. Always a joy to be here. I'm always excited to chat with you, whether it's on Tech Won't Save Us. I know you haven't been on for a while or on your show, TMK, This Machine Kills, which I'm sure many of our listeners will be familiar with. But of course, you're on the show now because you have a new book that is just coming out
Starting point is 00:04:15 that everyone should be reading. And so I wanted to start by asking, you know, there's a ton of stuff in this book that is really interesting because it's not a super long book, but I feel like it's dense, not in the sense of like linguistically dense, like it's easy to get through. The language is very compelling, but like there's a lot of important details and information in here for how we understand capitalism and how we understand, you know, the way that technology is shaping our world and how it's being fueled by this economic system and, you know, the particular power relations in it. So to start, I wanted to ask,
Starting point is 00:04:46 what makes technological capitalism distinct from how we usually understand the capitalist system? And why is that important to understand? Yeah. And I appreciate you saying that, because I really did want this book to be this kind of one-stop shop for understanding this deep relationship between technology and capitalism and do so in a way that like gives us the foundation needed to understand this really big, complex thing that is also really dynamic. It is continuing to develop in new and interesting and terrifying and unexpected ways. And so it's like it's not enough to just have the one statement, right? Like, I think a lot of work on technological capitalism is about providing people with
Starting point is 00:05:32 conclusions, right? It's like, this is why bias is wrong. And this is what it is, right? Or this is why these forms of technology are bad, but these other ones are good, right? It's about providing those kinds of conclusions that it's like you can read it and walk away, maybe not knowing the analysis, but knowing the endpoint of an analysis. But I wanted to flip that, right? Instead of the kind of like give a person a fish model for a book, I wanted to teach you how to fish. I love that. I wanted to equip people with a framework for like, this is how I think about technology and capitalism and about its relationship. And it is about understanding it as this like really deeply material thing, right? You ask what makes technological capitalism
Starting point is 00:06:18 distinct from, you know, maybe other eras of capitalism. I think for me, part of that is about the role that the tech sector, as we know it, thinking about Silicon Valley, thinking about the information technology sector as this really dominant and major part of the economy, right? This really dominant and major form and method for how capital to do what capital does, right? Like, you know, accumulate wealth, circulate capital, whether it's money or commodities, right? To extract value from labor and from resources from the world. Capitalism now depends on new forms of technology to intensify those kinds of processes that are really crucial to capitalism as a system, which is also why I'm not trying to argue that this is not capitalism, that this is feudalism or it's something different. Because to me, when I analyze what's going on, when I see the political economy, when I see the social structures, it looks like a lot of the same.
Starting point is 00:07:29 But it's just doing it in different ways. And that different way is that the tech sector and technological systems are really the main vector for how these forms of power and wealth accumulation now operate. And so I think we really can't understand capitalism ever, really, over the last 250 years, 300 years of its existence as a formulated thing without looking at technology. But I think now, especially, technology is the primary kind of organ of capital today. Yeah, I think really well said. And I think a lot of listeners of this show will certainly be recognizing that, right? The power that technology and that these tech firms in particular play in not just the economic
Starting point is 00:08:17 system, but the society that we live in. Now, you mentioned, you know, the way that we analyze this, right? And the way that we understand what technology is doing, what these tech firms are doing, how this broader technological capitalism works. And in the book, you lay out the different approaches through idealism and materialist analysis, right? And I feel like some of those terms, maybe if you're more academic, and you've read this kind of stuff, they will be very familiar concepts to you. But maybe, you know, if these are not things that you have interacted with before, you know, you're not doing all this, you know, kind of in-depth reading on this sort of stuff, maybe those will be newer terms. So
Starting point is 00:08:53 what are these two approaches and why is it important to understand this when we're thinking about, you know, the impacts that these technologies and that these firms have in our lives? Yeah, I think this distinction between idealism and materialism is really important. And it is a kind of core foundation for what I build most of the book on is kind of repudiating idealist ways of understanding technology and instead giving us materialist ways. And so what does that mean? Right. Because I think when we say idealist or idealistic, we think about it in terms of like being optimistic or even utopian about technology. Right. Like I'm an idealistic person. That's not quite what I mean here with idealism as like more of a kind of philosophical term. It's not totally distinct from that. But what idealism really means here is an approach to understanding the movement of society, the things that change or structure or shape society, that underpin society. Idealism is an approach to understanding the true engine of society as prioritizing like ideas, desires, visions, vibes, right? It's this kind of
Starting point is 00:10:07 sense that the mind is the prime mover of society. If you think it, they will come, you know, the kind of fill the dreams model of how you change society, of how you do things in society. And so, you know, an idealist version of that might be like, if you believe it, the world will change. So it's this real kind of idea that like, if you put forward into the world thoughts or dreams or visions, then they will materialize. And so it kind of puts the idealist, the mental capacity, I think about it as like living in our mind palace, right? Like it puts the mind palace before the real material world. It's saying that the real material world flows from our mind palace.
Starting point is 00:10:56 We all want a mind palace though, right? Like that sounds really cool, but I have a question on that. So, you know, you're talking about this idealist analysis, right? How we're thinking about the mind, how we're thinking about ideas, you know, in contrast to materialist analysis, which you're going to lay out for us in just a second. the things that we often hear or that we maybe rely on is this notion that if enough people start to see the tech industry in a different way, if they start to believe that maybe we analyze it in a different way, then this can potentially shift the way that we approach technology. Maybe we can start to challenge their power in a way that didn't exist in the past. Is this still an idealist thing? Am I misunderstanding it to a certain degree? How would you respond to that? Yeah, it's certainly not to say that things like our framing
Starting point is 00:11:50 or our discourses or understanding don't matter. They really do matter. But I think they matter because of where those framings come from in a material sense and how those framings can lead to or limit other kinds of forms of material action, right? And so I push back against idealism in the beginning of the book because I think so much of our understanding, our discourse, our analysis of technology is really based in this idealist mode, right? And so it's like, you can find idealism in the pitches for disruption by startups, right? Where it's like, it's these kind of like visions of things that the technology may never actually work in that way. That world may never actually come to bear, right? But if you exist in an idealist mode, then you kind of take those pitches at face value and
Starting point is 00:12:43 you imbue them with a lot of power. It's like, I guess, how you talk about, you know, the techno-optimist manifesto, you know, I've written about it as something that is really based on faith, right? And it feels like even more than ever, right? There has always been this kind of, I feel like, faith-based aspect to the tech industry and what they're talking about. But it feels like in this moment, more than ever, in order to push back on this increasing skepticism or criticism of what the tech industry is doing. It feels like they're doubling down on this even further and saying like you need to believe in this future that we are realizing or you're not just the enemy but like we'll never realize it, right? Don't ask questions about why we're not achieving things. You just
Starting point is 00:13:21 need to believe that AI is going to transform the world, that all this other amazing stuff is going to happen if we just say it and let Silicon Valley do its thing. That's right. That's right. It's very, you know, it's very much this kind of like Tinkerbell effect. We can think of it too, right? Where it's like Tinkerbell only exists if we believe hard enough and clap loud enough for Tinkerbell, right? If our belief starts to waver, then Tinkerbell starts to fade away. She becomes more and more transparent. And that to me is like, that is the exact kind of same mode that venture capitalists, that startup entrepreneurs, that corporate executives, they really exist in this mode of needing us to believe in them and needing us to clap for them and imbue
Starting point is 00:14:06 them with our psychic energy, right? And then that will be enough to sustain them. So this is like the basis of so much of the hype economy for technologies like AI or blockchain or the metaverse or whatever it might be, right? This is all also based in this real idealist way of understanding how technology comes about and how it impacts the world. Now, I think there's a lot of really obvious problems with that on the supply side, right, around like building a whole tech sector that is based on this kind of like hype economy and these kind of idealist notions that visions and vibes are all that we need. But it also leads, I think, to a really quite shocking amount of
Starting point is 00:14:52 terrible coverage of technology that is focused purely on the kind of idealist nature of it, not just the optimistic, but that kind of hype and the desires of something without any real material basis to those desires, or even necessarily an interest in how these things actually work. What can they actually do? And I think it also leads as well to bad critical analysis of technology, because if we get trapped in this idealist framing of technology, then our criticisms also get trapped in it, right? And so then the problems with AI become discerning what's hype and what's not hype, right? And I think there's a lot of really great work pointing out that the hype economy creates a lot of bullshit and just a lot of vaporware,
Starting point is 00:15:46 a lot of nothingness. But so much of that criticism never breaks out of trying to attack things for being hype or discern what's hype and what's not hype and actually get down to the like that material layer of like, OK, what is the political economy structuring these technologies? What are the interests that are motivating why some things are built and why other things are not built? Why is it that we can have this multi-trillion dollar tech sector that on one hand seems to be built on sandcastles in the sky, right? Just like a lot of hype, a lot of visions, a lot of vibes. Well, on the other hand, actually controlling immense amounts of wealth and influence in society, how can we reconcile these things if we only ever pay attention to that idealist
Starting point is 00:16:41 side? And to me, that's where that demands and moving past idealism and going towards the materialist. And I feel like, you know, just having that idealist view of it, right, just understanding technology through that lens is part of what leads us, you know, right now to be having this discussion where, oh, you know, is AGI going to arrive? What are the effects of AGI going to be? How is it going to transform our lives? Are, you know, is AGI going to arrive? What are the effects of AGI going to be? How is it going to transform our lives? Are, you know, the machines going to start wiping out humans rather than looking much deeper and understanding how these AI systems actually work? Because that
Starting point is 00:17:17 would reveal a lot more about, on the one hand, how all that discourse about AGI is like kind of bullshit and shouldn't be taken seriously in the first place. But also there's a whole load of other things that we should be concerned about that don't get in the discourse public discussions are shaped and influenced and, you know, misdirected away from what it feels like we should actually be talking about. Yeah, that's exactly right. We never end up asking if our framing of technology is completely focused and shaped by stories about technology, then we never move beyond the stories, right? And we never actually do what I think materialist analysis is really good at and needs to do, which is set about shining the interrogative
Starting point is 00:18:14 light on a system, right? You put it in the interrogation room and you shine the big, bright halogen flashbulb in its face and you ask technologies, you ask systems, you ask these people really straightforward questions like, what do you actually do? How do you work? Who do you work for? You know, it's like those are the kinds of questions that I think a materialist analysis demands of a technology, of an institution, of a system of people. And as simple as those questions are, I think they are not asked nearly enough and nearly forcefully enough, right? To be like, I think that question, who do you work for? Is like one of the most important questions we can ask of any technology. And not just in the
Starting point is 00:18:58 sense of like, what does your marketing copy say? Or what does your blog post say? Or your manifesto say? But it's like, in a real material way, like, who do you work for? What do you do in society? Yeah, obviously, you're preaching to the choir here. I completely agree. But as the title of the book tells us, you approach this materialist analysis in the book from two different lenses, right, which I think are really important and they're obviously distinct from one another. But I think that when we, you know, as people who are critical of these technologies, who are trying to understand them better are approaching this, I think they give us two really good lenses to do that because I think we probably lean a bit more toward one than the
Starting point is 00:19:39 other currently, which is why I'm, you know, kind of so fascinated by the way that you lay this out in the book. And so, of course, the two different categories are the mechanic on one end and the Luddite on the other. And I wanted to go through each of those one by one and actually kind of pull them apart and dig into why they are important. And I want to start with the mechanic and how you frame this. So what is this approach to materialist analysis, right? What does it mean to be a mechanic in approaching technology and understanding it? Because I'm sure it's probably not just, you know, being able to go in a garage and tinker with a car or something. How should we understand this? And why is it important?
Starting point is 00:20:19 Yeah, absolutely. So these are my two metaphorical role models for how to do materialist analysis. And I think the conjunction here is not an either or, but it's a both and, right? These are two sides of the same coin for how we really need to approach the critical study of technological capitalism. And so to give listeners a very quick kind of log line for each, the mechanic and the Luddite, and then absolutely let's dive into them individually and pick them apart. But to me, in short, the mechanic knows how a machine is put together, how its parts function, and what work it does, right? Whereas the Luddite knows why the machine was built, whose purposes it serves, and when it should be seized, right?
Starting point is 00:21:07 And seized here in both senses of stopped or taken, right? Destroyed or expropriated. You can seize the machine by throwing a wrench in its gears, but you can also seize the machine by taking it away from the people who are using it for noxious reasons, as the original Luddites and as our friend Brian Merchant is off to say, right? I think it's important that you laid out those two log lines, right? Because I think that helps us to have this distinction in our minds before we actually dig into each one separately. And I feel like at least when I was reading it, maybe I'm just thinking about myself, I probably lean more toward the Luddite here, right? You know, I focus more on the critical analysis of the technology, the social relations, the economic relations, all that kind of stuff. You know,
Starting point is 00:21:52 I feel like I don't have that degree of understanding of the technical underpinnings and the mechanics of how some of these things work. And maybe that's because these systems are increasingly so opaque and so large and so difficult to comprehend. But I felt like that was a really important and novel thing is bringing in that kind of mechanic frame as well, because I do feel like that's something that we need to have a bit more often and not just in the sense of like, oh, everyone should learn to code, but like in a much deeper sense, like be able to understand what's going on here. So, yeah, how would you explain the mechanic more broadly and why is it important for us
Starting point is 00:22:24 to have this perspective on it? I feel like when we talk about technology, it feels like digital technology, it feels like computers, but technology is this word that refers to so much more. And in the book, you talk about how not just you, but so many people in your family, like, you know, have this experience of being able to work with machines and with technology in a way that it feels like people these days, or at least people in our roles have less and less often. And so I found that really fascinating. I think that's right, because I do start this section on the mechanic with a bit of personal history, because to me, what is really important here with the mechanic is, yeah, it is not two things, importantly, it's not on one
Starting point is 00:23:03 hand, I'm not making an argument of everybody needs to learn how to code, right? Because I think that is very often something that we see from people who do have technical knowledge, which is to completely disregard or dismiss the thoughts of anybody who does not also have technical knowledge as being irrelevant to speaking about technical matters, right? And I think that creates this real kind of like technocratic gatekeeping of if you don't know how to code, for example, then you have no right to say anything about algorithms, right? And you see this is a moving goalpost as well, because then it's like, I do know Python actually, and thus can I now say things about algorithms? Well, actually,
Starting point is 00:23:53 unless you know how to work in TensorFlow, then you can't say anything about machine learning, right? And so then it's like, it's this moving goalpost because the whole point there is a bad faith way to dismiss anybody who says something that you as a technician or a coder or an engineer might not agree with, right? And so that is not the argument here is that like, we all need to learn how to code. I think learning how to code is perfectly fine and it's useful knowledge and increasingly useful now, but I think it's useful in ways that is a sufficient but not necessary kind of form of knowledge, right? The other thing that I'm not doing is I'm really trying not to make a romantic argument about the mechanic, right? And this comes
Starting point is 00:24:38 out of things like Zen and the art of motorcycle maintenance, right? And a lot of that tends to be really based in extremely conservative and really masculine ideas of like the good old days, right? When like the peak of human civilization was like the 1950s, right? Where it's like you could change your own oil and, you know, all the men had mechanical knowledge and, you know, that kind of thing, right? Like that is also something I'm completely uninterested in doing in large part also because it speaks to these like very historical social segregations and divides between different forms of mechanical knowledge, right? And so it is also a moving goalpost. It is also ignorant of its own history
Starting point is 00:25:29 because I think about the work by the historian Leo Marx, who has outlined the development of this concept of technology over its relatively short history, right? Like Leo Marx talks about how technology as this hazardous concept, as he called it, really only came about in the late 1800s as you start seeing a divide between the mechanical arts, right? And so this is what we tend to think of as like men with greasy hands tinkering at work benches. Right.
Starting point is 00:26:06 But it also has a lot of these like class and racial in addition to the gender ideas wrapped up in it that like it's the blue collar worker. It's increasingly associated with being lower skilled forms of work, work for poor people or people of color. Right. And then it's differentiated over time by the idea of technology, right? Which is like the higher social and intellectual plane of like book learning, scientific research, right? It's associated with these ideas of like
Starting point is 00:26:37 men in sterile rooms and white lab coats, you know, building the future. So there's a lot of baggage that we have to kind of like unpack here because there's so much social baggage associated with ideas of technology and ideas of mechanical and technical knowledge. So to be able to ask these kinds of questions that we need to ask of technology, how do you work? Who do you work for? What do you do in society? Those are questions that get to the heart of the power of technology. That to me is the real purpose here is it gives us a foundation for then asking the really deeper questions around like the social and material power that technology has and the power it has in our lives, but also the power
Starting point is 00:27:26 that it is used by other people, right? As a way of materializing their own power, their own interests, their own values. And so it's that two sides of the socio-technical knowledge, right? And I think a lot of when we talk about like socio-technical analysis, it is focused on like social analysis of technical things. But I really want to keep those two things paired together with that hyphen. The socio-technical analysis should be the combination of social analysis and technical analysis. And I think the mechanic to me represents the technical, the Luddite represents the social, and where we have to exist is in the hyphen between the socio and the technical. I needed to do a bit of scene setting there because
Starting point is 00:28:11 there's so much social baggage associated with the mechanic as a figure, with technical knowledge. And I think so much of that is not by accident, but it is by design. I actually think that a lot of these technologies, a lot of these systems that we see as immensely complex, so complex as to ward away anybody from thinking that they can understand how a machine learning system works or how a kind of deterministic algorithm does its thing or whatever it might be. I think a lot of that is complex by design, right? It's not inherently complex in the sense of like people can't grasp it. The complexity there is a lot of warding people away from it. It's forbidden knowledge, right? Or it's secret knowledge that you can't get to unless you are part of the priesthood or whatever it might be,
Starting point is 00:29:05 or unless you have, you know, passed the right test and entered the right guilds or whatever. Unless you've learned to code. Unless you've learned to code. And I think the kind of old idea of a guild is actually a lot more appropriate because part of the guild was as a way to create social protections around specific skills and forms of knowledge so that you can then, for whatever reason, for good reasons, but also for bad reasons, have control over who has access to those skills, who has access to that knowledge. And that might be so that you can be in a better negotiating point with employers or with lords or with whoever it might be to say,
Starting point is 00:29:48 well, you need the skill of a carpenter, but all carpenters are part of the carpentry guild. And we have very hierarchical and formalized systems of what that means to acquire that skill, to sell that skill, and so on, right? But I think part of the guild system as well is to ward away other people from saying, you can't possibly understand how two pieces of wood are joined together. It's far too complex for your mind. And we see that same kind of thing happening with forms of engineering, forms of programming and so on. So for me, the mechanic as well, what I do like about the idea of the mechanic is that it does have in it a bit of a hobbyist idea that you can be a professional mechanic and work in like an
Starting point is 00:30:40 auto shop, for example, but you can also be a mechanic who just like tinkers on the weekends, right? Acquires these kinds of skills and knowledge through the act of doing it, through the act of being mechanically curious. And I think that is something that has been really robbed from us is the mechanical curiosity, the idea that we should be curious and interested in how the things around us work, and that that curiosity and interest can actually be fulfilled without having to enter a guild or enter into, you know, the rarefied halls of academia or a corporation or whatever it might be. One of the reasons why I didn't call it The Engineer and the Luddite is because The Engineer already emerges from, as a profession, a highly formalized set of skills
Starting point is 00:31:33 and knowledge, which is itself also, as we know from the work by people like David Noble, who's a historian of engineering and technology, and he has a book, America by Design, where he talks about the origins of the engineer as a profession come out of industry needing to create certain kinds of knowledge and skills and people who hold those knowledge and skills who can then contribute to the motivations of industry. And so the engineer is deeply and integrally related to capital from its origins up to today. And so that's one reason why I also was like, I don't want the engineer in the Luddite, I want the mechanic in the Luddite. And it feels like the engineer also comes in to like, remake these systems in such a way that the mechanic or the workers who are more involved in this are having their power over the system, their understanding of how it all works, you know, degraded or
Starting point is 00:32:29 reduced so that capital has more power. And I feel like that's kind of the story of what has happened more broadly, you know, through society where it feels like, yes, this mechanical curiosity or this mechanical knowledge feels like it has declined, but it feels like alongside that it does feel like things have become more difficult, right? If you think about, I don't know, repairing an appliance, it's not just mechanical parts now. You need to know how this computer system is going to work as well. And that requires a whole different form of knowledge than, I guess, tinkering with a more traditional way that it was put together. And I wonder, just picking up on something that you were saying there, to what degree is it that things have become more complex? And so that makes it more difficult for us to have this mechanical knowledge to understand how it works properly. idea has been seeded, thinking about idealism again, maybe, from the tech industry, from,
Starting point is 00:33:26 you know, more powerful people, that everything is more complex. So you shouldn't try to understand it anyway, because you just won't get it. I wonder, you know, both sides of the coin, I guess it's a bit of both, but I wonder how you see it. Yeah, no, I think it's definitely column A, column B. I think that there's certainly things have become more complex. There is very much a lot of unnecessary complexification of things as well. Which a lot of your work deals with, of course. Yes, absolutely. And part of that is the drive to innovation, right? The more Rube Goldberg-like you can make a system, the more innovative it is, right? Because you've added more gidgets and gadgets and woo-hahs to it and stuff. But also part of that is to make it so complex as
Starting point is 00:34:12 to shut people out from being able to do things to it. And so you talk about like repair and, you know, I think here about the kind of work and writing that people like Cory Doctorow and Jason Kobler do around like the right to repair and that increasingly that right to repair is taken away from people and not just because systems have become more complex, but because a lot more gates have been put up in front of people's ability to repair. And a lot of those gates are not just mechanical in an analog sense, but also digital, right? These kind of digital gates that are put up in front of being able to repair a toaster because it has a bunch of IoT networking sensors and devices, and it needs an internet connection, and it needs regular firmware updates from the manufacturer and also that it can like send you a text
Starting point is 00:35:07 message when your toast is at the exact right level of browning that you want, according to the computer vision system inside. You know, it's like all of that kind of. And I think that like we're recording this like right as like CES is wrapping up. And you think about like the consumer electronics show in Vegas every year is a hundred percent like a showcase of the unnecessary complexification of everything. And it doesn't improve their functioning in a material sense, right? In a sense of like, do the things work better because it has these functioning. What it does is it maybe improves their functioning
Starting point is 00:35:45 in like a financial sense, right? It gives the business, the manufacturer, more power, more control remotely over a thing. It gives more sources of revenue through data or through subscriptions or whatever it might be. So this starts getting to like that mechanic and Luddite combination here of like, we can ask questions of how it works, but then we also have to ask those questions of who it works for. Things are becoming complex and unnecessarily so. There is also this remove from being able to even sate a mechanical curiosity if you were interested in doing so. And this is really wrapped up to what I think is a massive de-skilling where a really easy way to de-skill people is to
Starting point is 00:36:32 prevent them from getting the skill in the first place. And so we see this a lot with like technical knowledge and mechanical skills as you prevent people from getting that skill in the first place and you do it maybe intentionally around like, and institutionally through these guild systems, through the moving goalposts, whatever. But you can also do it kind of unintentionally through making our interactions with technology happen in these ways that are marketed as like convenient,
Starting point is 00:37:03 making it more simplistic, more convenient, giving you the user more power because you don't have to go into a terminal or something to like run an application, right? It's like the app is just right there on your phone. Another way to de-skill people though, beyond preventing them from getting the skill in the first place, is to restructure their job such that they stop using the skills that they have. And then if you don't use it, you lose it. Right. And I think this is happening in really major ways. So it's just it's the mechanic is not just about like people who
Starting point is 00:37:39 don't have mechanical knowledge or technical knowledge gaining it, the mechanic is also about people who do have mechanical knowledge retaining it and using it in more critical ways. And so if we think about something like ChatGPT, you know, ChatGPT is a chatbot interface on the front end of an immensely complex machine learning system, a large language model, this huge neural network with like, you know, a trillion plus parameters and so on. Right. But we interface with this immensely complex and really impressive system through a chat interface that makes it as simple, convenient as possible. I mean, we do it through natural language, right? You don't even need to know even like a language like Python, which is written in like natural language,
Starting point is 00:38:30 but with an unnatural syntax, you don't even need that. You can just write in normal speak and ChatGPT will do stuff for you, right? But that means that it gives you an a technical understanding of how the system works. You don't know how the system works. You have no ability to understand how the system works because your interface with it is through this chat bot. But the same thing is happening in really technical and really important industries. So like I do a lot of work on insurance and I went last year to the InsurTech Connect, which is the largest insurance technology expo and conference in the world. It happens in Vegas every year. It's like tens of thousands of people from the insurance industry and the tech sector
Starting point is 00:39:19 with hundreds of startups. I'm a real freak that I find these things actually to be very thrilling. And even when I go, people who are there, who are there because they work in the industry are like, why are you so excited about this? But I was there and one of the premier startups that had landed a big funding round, had a large valuation, had one of the largest kind of kiosk areas in the expo room. It's this company called Root Automation, and their premier product that they built themselves, right? And trained on like actuarial models and insurance documents and contracts and stuff. So it was this like chat GPT-like system that had gone to actuary school. And so it had all of this industry specific knowledge for insurance, but the way they were marketing it is, is that this becomes a natural language front end
Starting point is 00:40:26 for changing really complex, like underwriting models or policies or business decisions, such that like, if you're an underwriter and you work for an insurance company and you decide somewhere a decision is made, we really don't want to insure people in this demographic anymore. You know, like women over the age of 65 who live in Ohio, we don't really want to insure those kinds of people anymore for whatever reason. So instead of having to go and, you know, tinker around and change a model and change a bunch of policies and do a lot of really laborious but technical work, you instead go to the InsureGPT chat window and you say,
Starting point is 00:41:11 hi, InsureGPT, we want to drop all customers and not underwrite any new policies for women over 65 who live in Ohio. And InsureGPT says, cool, right away. I'll change all the models. I'll change all the policies. I'll do everything on the backend. You know, don't worry about it. It's all good, right? They were marketing this as like, you know, this is going to save so much money. It's going to save so much time.
Starting point is 00:41:36 It means that you can have people with less training doing these kinds of, you know, really technical jobs around underwriting and contracting and so on. You just need a subscription to ensure GPT. So this is not only is this a form of de-skilling, right? You're actively saying that people with like really technical knowledge in a really important industry that have really material impacts on people's everyday lives, not only do you need to have those skills, but also you don't even need to worry about how the models actually work, what's actually
Starting point is 00:42:12 happening in the back end, right? And so this is a real theme that we see in the tech sector right now, which is a taking away of even the ability to access and exercise forms of mechanical or technical knowledge, right? To not only put it all in a black box, which is what it's doing, but to say that even the people who might have that knowledge no longer have the ability because of business decisions to exercise that knowledge, right? So you start losing the skill for how an underwriting model is actually built. You know, how are the generalized linear regressions actually calculated? You know, you start losing that skill. You start losing the
Starting point is 00:42:59 ability to understand and ask those kinds of questions of what do you actually do? Who do you actually work for? What purposes do you serve? If materialist analysis is about asking those questions of systems and the mechanic is a necessary component of that, I think we can see a real reaction from the tech sector against even letting people have the ability to ask those questions in the first place. Man, that is so fascinating. Like, I feel like that example you're giving about, you know, insure GPT or whatever it was is, you know, exactly the kind of thing that we're often being warned about, you know, when it comes to the real impacts of generative AI and these other AI tools, right? It's not the AGI kind of scenario. It's the very real implementations in various different industries to change the way that they work and in ways that are very unaccountable, very opaque,
Starting point is 00:43:57 you know, ways that we don't totally understand and where you're giving the AI the ability to, you know, make these decisions. Yeah, they've been set up in such a way by people who want those decisions to be made, but you don't actually fully know what's going on behind the scenes and whether the things are being done properly. And of course, there's warnings about how this can play out in public services, but we often don't pay as much attention in, you know, things that are private like insurance. But I did want to pivot there from this talk about the mechanic to make sure that we talk a bit about the Luddite as well. You know, I think that people who listen to this podcast will be very familiar with the story of the Luddites at this point,
Starting point is 00:44:34 right? They'll know how these were actually workers who did understand what was going on, who did fight to try to get parliament to act. They understood these technologies. They weren't just like mindless people who were going to smash up technologies they didn't like. Like, you know, these were people who had a good understanding of what was going on, such that you say the Luddites did not just have this understanding, they were mechanics, and they had this kind of social ability to interrogate these technologies in the way that you're saying that we should be able to do. So what do you see as the way that this kind of Luddite perspective, maybe you can explain it to us in a bit more detail, but how does that complement the approach of the mechanic and why is it important to have that
Starting point is 00:45:13 approach to understanding these technologies as well if we're going to be able to properly kind of push back against them, understand how they're working in society, and ultimately, you know, as the goal is, is to try to ensure that they don't have these harmful impacts that we're increasingly seeing. Absolutely. Like, to me, the thing that has always drew me to the Luddite a very long time ago, but I'm like, so happy that we're in the age of the Luddite. I think it's becoming increasingly hip to be a Luddite, you know, and I hope that like the work that you're doing, that I'm doing, that Brian is doing, right, that like the work that we're doing, that I'm doing, that Brian is doing, right, that like the work that we're doing to podcast about it, to write about it, to really flog and sing the praises of the Luddite any way that we can.
Starting point is 00:45:54 To create a vibe shift, one might say. That's right. That's right. An idealist change there. And, you know, directly feeding into the material action of the hammer. The cover of my book is also in a very cheeky style of the hammer and sickle. It is a hammer and wrench, because for me, that is the emblem of the mechanic and the Luddite. The thing that really drew me to the Luddites was understanding and seeing how they really embodied that form of like socio-technical knowledge and did so in a way that followed its conclusions to their radical end. I
Starting point is 00:46:42 mean, the subtitle of my book is a ruthless criticism of technology and capitalism. And I think this is also the thing that the Luddite really gives us, right? We need to have that mechanical knowledge, that technical knowledge combined with the social analysis. But it's not just that. If we have one of those things and not the others, then we have a problem. Because I think if the Luddites were just merely mechanics, then we see the problem when people have technical knowledge without the socio hyphenated to it. That gives us people who are interested in technology for its own sake, who have a myopic view of technology without any concern about its social context or what happens after the fact, right? And I see this a lot
Starting point is 00:47:33 because I work in a faculty of information technology. And so this is constantly testing my own mechanical knowledge because I am constantly interacting with people who build AI, who work in data science, who work in, you know, HCI and these kinds of technical fields. But I'm also constantly challenging because I see this kind of not my department ethos, right? That like, if you have technical knowledge and that's all you have, then it's really easy to say everything else beyond that is not your department, right? But the thing that really drew me to the Luddites was this merging of the social analysis with the technical knowledge. And even more than that, because I think there's increasingly more people who are doing that kind of like social analysis of technology to varying
Starting point is 00:48:21 degrees, but we need to push even further than that. And the Luddites did, to then following that analysis to its ruthlessly critical ends, right? To say, if the result of our social analysis of these technologies is that they are noxious to the commonality, that they are harming our livelihoods, that they are destroying our communities, that they are enriching a small handful of factory owners or landlords or whoever it might be. If our conclusions reach that end, then we must do everything that we can in our power to change, to challenge, to confront those technologies, right? Whether that means to protest against them, to put forth petitions for change, to try to enter into
Starting point is 00:49:16 negotiations with the people who have power and capital and say, hey, surely we can reach a consensus point where both of our interests can be safeguarded here, which is all things that the Luddites did before all those things failed. All those avenues and tactics were disregarded, dismissed, ignored. And the only options left were to either lay down and give up, say, well, I tried my best, or to grab the hammer and start swinging, right? The story of the Luddites has been mistold for so long now, since the origins of the Luddites, 200 years ago. But it is not a story of indiscriminately breaking machines, or it is not a story of ignorant technophobia, of being scared or mystified by these alien
Starting point is 00:50:07 machines that appeared one day in our workplace. And we, you know, like cave people, we picked up rocks and, you know, struck against them. And it is not a story of being anti-innovation or anti-progress, right? As if like the Luddites were merely reactionaries who wanted to stop the gears of civilization from advancing forward. These are all the framings that we have and still have for the Luddites, right? The kind of indiscriminate, ignorant, anti-innovation framing. But that's not the case at all when we actually know the story of the Luddites and understand that they were following to the radical ends the conclusions of a socio-technical
Starting point is 00:50:54 analysis, right? And so for me, if we only have one or the other, if we only have the mechanic, then we don't have the social analysis and the context needed to understand what's the point and purpose of technology, right? What does it actually do? If we only have the Luddite in this like kind of caricature of it, then we just have a kind of a technophobic fear, right? A kind of a primitivism. But if we have both the mechanic and the Luddite, then that gives us, I think, something really powerful. It gives us a powerful foundation for materialist analysis that is not afraid to be ruthlessly critical of these systems, technology and capitalism, and follow that ruthless criticism to whatever radical ends it may have, and to not shirk from confronting the powers that be. The idea of ruthless criticism here for me comes from a philosophy of analysis and action that Karl Marx outlined in some of his work. For me, Marx is the granddaddy of materialist analysis, right? Like the kind of
Starting point is 00:52:05 philosophy and the style of materialism comes from Marx. And, you know, he noted that materialism is rooted in what he called the ruthless criticism of the existing order. And he noted that ruthless here means that, quote, it will shrink neither from its own discoveries nor from conflict with the powers that be. And to me, the mechanic and the Luddite represent the socio-technical knowledge that must be the foundation of our ruthless criticism that then has a confidence in its own rightfulness, its own righteousness to not shirk from confronting the powers that be, nor shirk from following its ruthless criticism to its radical ends. That to me is like what's desperately needed for an understanding of technological capitalism. Yeah, I obviously completely agree. That will not be a
Starting point is 00:53:00 surprise, but I think it's so important, right? Because, you know, even those of us who understand the history of the Luddites, who identify as Luddites, sometimes we'll, you know, go to like the meme of, okay, we need to pull out the hammer and smash the technology. But when we think about the Luddite analysis, it's actually so much deeper than that, right? And of course, there's that recognition in it that they took to using the hammer to smashing the technology after trying so many other steps in order to make sure that they were not going to be used in this kind of harmful way, right? And at the end, they found that this was not going to be possible. So their only alternative, right, the only remedy available to them that was left, especially at this time, was to take the hammer
Starting point is 00:53:40 and destroy it. And so I think that when we're talking about a lot of analysis today, as your book lays out so well, you know, we need to have this knowledge so that we can properly push back against these things so that we have many different avenues through which we can try to restrict this power. And, you know, maybe if they don't all work out, we'll have to resort to the hammer as well. But there are many other different routes that we can take to try to use this knowledge, to try to use this materialist analysis, to try to get some form of action to build a better world, to try to rein in the harms, you know, all these different things that we might try to do. And I just wanted to pick up on one other point of the book. There's so much more we could talk about before we close off this conversation, but I feel
Starting point is 00:54:18 like what you're talking about there, you know, when we think about the future and what does a better future look like, I found this one point in the book really compelling to me because you said, even if we achieve some sort of utopia, right? We arrive at this point where we've built this society that is so much better than the one that we live in today. This ability to understand technology and to assess technology that, your talk of the mechanic and the Luddite lays out for us will still be essential, regardless of what kind of world we live in, because even if we have a better world, that needs to be maintained. So I wonder if you could pick up on that piece of it as we close off our conversation. Yeah, absolutely. Because this is also, I think, a really important feature of a
Starting point is 00:55:01 materialist analysis, is it is not trapped by these ideas that history happens as a series of static events, right? Which a lot of, when we think about idealism in its utopian flavor, it is very much thinking about like this utopia as this place, right? This like static place that if only we can reach it, then we can reach like social equilibrium where everything is perfect and nothing needs to change ever again, right? And so that to me, it's not only a wrong kind of analysis of like history and social movement, it's a dangerous one, right? Because it misunderstands that materialism is also based in an analysis and an understanding of dynamism, right? That things happen,
Starting point is 00:55:51 not as static events, but as dynamic relations, right? That society is constantly making and remaking itself according to the changing needs of complex people, right? The changing desires of new generations. You know, idealism sees technology as immensely complex, too complex to understand, but sees people as so simple as to not matter. Whereas for me, materialism understands technology as the result of human action and that people and societies are really complex. And we have to account for that changing complexity in not only our analysis of how things work and what they do, but what we want things to be. I really counterpose this to the futurist, right? The futurist to me is one of the ultimate archetypes
Starting point is 00:56:46 of the idealist way of understanding technology, because it is about kind of predicting these events, these places that are going to happen in some future, right? It's very deterministic in this sense, and it's very much based on vibes and visions and desires and these kinds of things. For me, the idea that we should keep in mind is not utopia as a static place, but what the sociologist Eric Olin Wright calls real utopias, which are multiple and plural, and they're found in these dynamic mechanisms of social coordination. They understand that one, utopia is already here, right? Utopias already exist. They're just unevenly distributed or utopias already exist. But to a lot of people, they feel like dystopias, because the utopia for some is a dystopia for others, right? The utopia for a Marc Andreessen
Starting point is 00:57:46 or a Jeff Bezos or an Elon Musk is almost by definition a dystopia for the vast majority of other people because their utopia requires the blood, sweat, and soil of other people, right? And so understanding utopias as multiple, as plural, as dynamic, it's one reason why the end of my book kind of rejects setting out a series of solutions to the problems that I've identified, as if these solutions could come together like stepping stones leading us from the bad now to the good future, right? That to me feels almost anti-materialist in a lot of ways. It goes against the very purpose of the book, which is to understand that there is no final analysis, right? That there is no static thing that we can do, that there is no simple steps. It's like an infomercial for one weird trick to undermine capitalism and reclaim technology. The bourgeoisie hate it. Call now. But it's like, that doesn't exist because what we need instead is an understanding of how society actually exists
Starting point is 00:59:01 as dynamic relationships that require a kind of active maintenance against the entropy of decay and disorder, right? That the idea of the end of history is a thought experiment turned into a mass delusion, right? It's a way of making us think that there's one weird trick that we need to reach the good place, that if only we could find the silver bullet or if we just waited long enough, right? The idea of like the end of history. And I think even a lot of Marx's work in this kind of historical materialism does have this like kind of step-by-step progression of society that is like, if we just wait long enough, then history will end and then it's not my problem anymore. But that's just not the case. We know that doesn't happen. And if that
Starting point is 00:59:52 were even possible, it would never be desirable because a stable equilibrium is also one where the decisions and actions from some past are locked in place forever, right? And I think a lot of people like Marc Andrees and Elon Musk, right? These people we talk about who put forward their own techno-utopian manifestos, they would love it if their decisions and if their actions were locked in place forever. But for everyone else, that sounds like an absolute nightmare. And we should never want to create a society that is built on achieving an equilibrium that locks in place decisions made by people hundreds of years in the past. We should want a society that embraces the intrinsic dynamism of social reality and understand that not as a problem to overcome,
Starting point is 01:00:48 but as something to embrace and to work with, to flow with, rather than to react against. Dude, why can't I just have my one weird trick? This is what I'm looking for in my books. My book would sell a lot more if it did end with a one weird trick. The one weird trick to take down technological capitalism. You know, I really do think that the book is so important, not just to laying out why it is important to view technology and to develop an ability to see technology through this light, right? So you can properly understand the effects that it has in our society. But the book also goes through so many different ways that there are these systems set up that we all depend on that, you know,
Starting point is 01:01:30 constrain our lives in very important ways. And it allows us to, by you doing that analysis, allows us to see how we can cultivate that in many other areas as we're looking at, you know, the effects that technological capitalism has in our lives. And I just wanted to leave listeners with one quote from the book before we go, which is, quote, being a Luddite is not a scarlet letter we wear with shame. It is a badge we should display with pride. And I think that speaks to all listeners of this show, as well as your own show, Jathan. It's always great to talk to you. Thanks so much for coming on the show. Thanks so much, Ferris. Always a joy.
Starting point is 01:02:10 Chetan Sadowski is the author of The Mechanic and the Lunite, a co-host of This Machine Kills and a senior lecturer at Monash University. Tech Won't Save Us is made in partnership with The Nation magazine and is hosted by me, Paris Marks. Production is by Eric Wickham. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters by going to patreon.com slash tech won't save us and making a pledge of your own. Thanks for listening and make sure to come back next week. Thank you.
