librarypunk - 159 - Why We Fear AI feat. Hagen Blix
Episode Date: February 26, 2026. This week we’re joined by Hagen to talk about their new book on AI and labor/power/knowledge/eugenics. We also discuss some library Reddit developments. Media mentioned: https://juandavidcampolargo....substack.com/p/naperville-library-spy https://www.juandavidcampolargo.com/projects/naperlibspy https://www.commonnotions.org/buy/why-we-fear-ai Transcript: https://pastecode.io/s/gesekv6i Join the Discord: https://discord.gg/qWPTurTnkT
Transcript
All right, let's go.
I'm Justin. I quit my job today, and my pronouns are he and they.
I'm Sadie. I work IT at a public library, and my pronouns are they, them.
I'm Jay. I'm a cataloging librarian. My pronouns are he, him.
And we have a guest. Would you like to introduce yourself?
Hi, I'm Hagen Blix. My pronouns are he or they. I'm a linguist and cognitive scientist,
and I just co-wrote a book about the political economy of AI and AI fears together with
my friend, Ingeborg Glimmer.
Welcome. It's a very quiet cheer, but I can't make it any louder.
Welcome. Yeah, this is the, we had to reschedule you twice. I feel really bad, but I got caught in a
blizzard, and then I got sick. And so we're finally talking about this book, which you were
very generous to send to us. And I enjoyed it. I thought it was very easy to understand.
I thought it was good for general audiences. I thought it said really sad things. I like when
people talk about theory clearly, because, I feel like, I love the guys at Acid Horizon,
but they talk in, like, not even paragraphs, but, like, subchapters, and use so much
philosophical jargon that I have no idea what's going on half the time. So I enjoy it when someone
can actually talk about philosophy plainly, because that's what we try and do here, right?
Yeah, that's the idea. I mean, that's, you know, you don't just theorize just for the hell
of it. I mean, that can be fun, but the stuff is there to help people make sense out of their
own lives and to figure out how we can all liberate ourselves, right? It's a collective
enterprise.
So I'm going to warm us up a little bit, because something was posted on Reddit today in r/Libraries. So this is, this guy's, this guy isn't even really an enemy of the pod. And I can't
use the corn thing because the automatic voice detector will, will screw up the transcript.
I agree. If I use, if I use corn scatting. I've got to use something that's not. Or you don't have
to use anything.
I do.
There's this guy called Juan David,
and I ended up looking into him
because I just got curious about him,
but he made this thing called
the Naperville Library Spy,
which is a great non-menacing title,
but it's like an unfiltered look
at what books are being checked out
across the Naperville Public Library.
And as far as I can tell,
it's just, like, a vibe-coded copy
of the BiblioCommons discovery
layer. And as far as I can tell, it looks like it is now broken, because I know someone in one of
the library Discords called the library and was like, hey, this guy's scraping this shit out of
your website. So it looks broken, because you can see that there's, like, this pulse image, and
it's now flatlined. So I think either he shut it down because people yelled at him, or someone else
blocked his access, but...
It's interesting in theory.
It's a strange idea.
I don't think it's like an actual privacy problem.
I don't either.
I mean, there definitely are concerns I would have in terms of, like, if someone
did try and scrape a bunch of public library stuff to see, like, books for challenges.
Yeah.
I can understand how this would be a pain in the ass.
But this guy just makes slop.
And you can go to his website, because obviously,
it's like Juan David Campo Largo
and you can just like slash projects
and so I just went to his projects
and he went to UIUC
Oh no we have an alma mater in common
He made a talk show for them
And the thing is like it's a good little
student talk show they interview like the professors
And talk about stuff and make YouTube videos
It's not bad
But everything else he seems to make is just like
He made this thing that it's clearly
It's called Picomasala
how I built a restaurant empire and then gave it away for free.
And it's like all clearly written by AI.
It's like all the arts done by AI.
It's for this like restaurant someone approached him to make, like branding for and then just
didn't want to work with him anymore, probably because, you know.
He says, it's detailed.
Honestly, maybe too detailed.
But I wanted to write down everything needed to make this a standout business that people
actually care about, remember and talk about.
And then at the bottom it says, two friends of mine,
made a podcast episode about this master plan.
You can listen to it here.
And it's so clearly the Notebook LM podcast voices that he just uploaded all this shit
that he made into Notebook LM and hit the make-a-podcast button.
No, I've been joking about how we had this moment where people were like,
oh, the LLMs have finally passed the Turing test.
And now we're all so deeply familiar with the voice of slop that it has just unpassed
the Turing test again.
I really wish that the voice tech that they're using for Notebook LM was, like, available, because it's great.
It's really good for Notebook LM.
I would love it to read websites to me in a natural voice.
Instead, the only thing you can do with it is upload books to it and ask it to give you a dipshit, like, podcast about it.
And the insane thing about Notebook LM, the amount of compute that goes into stuff just boggles
my mind, because, like, I threw an audiobook in there and it transcribed it very quickly and made, like,
a full podcast summary of it. And I'm like, that's too much compute. Like, I know how
compute-intensive this stuff is. I've tried to run stuff on my computer using Llama, and, like,
the stuff that this is doing is, like, burning electricity and GPUs. It's very, it's too
much, man. It's too much. To be making Picomasala, one bite, two hearts. And a machine that spies on the
local public library for no reason. Because libraries are famously very secretive about information.
You need to put spies in there to figure out what kind of information is hidden in the library.
Like, I am so shocked that Project Hail Mary is one of the top titles, considering the movie's coming out.
that dipshit book,
The Let Them Theory. It's just all
a bunch of mass market
like
fucking Patterson, whatever the hell
his name is, and then like self-help books.
That's all it's gonna be.
James Patterson, that's who it is.
James Patterson. Yeah.
The bane of library workers everywhere.
Yeah. It's like
nothing good.
He also, I think, made a version
of UIUC's catalog as well.
So he just scraped their catalog and made his own
catalog. Oh no. Oh, it's so, that's such a huge, I work, that's such a big library. They also,
no, their course catalog. Oh, I was like, what? Yeah, no, yeah, they're, they're, it's, it looks okay,
I guess. I mean, it's an interesting idea. Like, you could just take your library collection and
dump this data and let people make little websites out of it, but like, why is he scraping his own
university's catalog to make his own university catalog.
Like, I don't...
Is it that bad?
Is it that hard to use?
This library spy thing really shows how
useless the Library of Congress genre/form terms are, though.
The top genre novels.
Wow.
So useful.
I'm glad we assign that to stuff.
Graphic novels is a little more useful.
The top subjects is friendship,
juvenile fiction
and then humorous stories
and then schools
juvenile fiction
picture books juvenile fiction
dogs
juvenile fiction
it's like all
juvenile fiction
except for man woman
relationships fiction
where I'm guessing
that's the romance books
and like the
Colleen Hoover and stuff
wait were you seeing this
on the library spy page
on the radar tab
it tells you the top titles
as well as the top formats
the top subjects
and the top genres
Oh.
Yeah.
A board book.
Wow.
Wow.
Sorry, I just looked at this dude's project page for the library spy, and it's, he couldn't
believe libraries were a thing, which, fair.
He apparently is not American.
American libraries are kind of unique, but he also couldn't believe books could be
interesting.
Yeah.
Wow.
And then the thing that is interesting about books is clearly the metadata, not the books.
Not the books, but just, okay.
He's such a hustle core grind mindset, bone maxing kind of dude.
This is like every dude I've ever encountered in the like PKM space is this dude.
Uh-huh.
Yeah.
See, could PKM have saved him?
Could personal knowledge management have given him something else to do with his time?
It would have made him worse.
And well, anyway, I think he just needs a hobby that's
not vibe coding all day.
I mean, reading is right there.
He could just do some of that without vibe coding about it.
Now that he knows that libraries are a thing, it should be pretty accessible as a hobby.
Profitable, affordable housing.
Yeah, anyway, this guy's, he's just the type of guy and it's very funny to me, but I've thought about him too much now.
Okay.
New type of guy just dropped.
Spies on the public library.
It sounds like dudes you would tell me about when you were in college, Sadie, and you're like, this guy fucking sat next to me again and kept telling me about Bitcoin for an hour.
Oh God, that guy.
Forgot about that.
Anyway, so, Hagen, you wrote a book called Why We Fear AI, and let me get the subtitle: On the Interpretation of
Nightmares. Fears about AI tell us more about capitalism today than the technology of the future. And I like how you talk about, like, our imaginations quite a bit. But, to make this easy on me and to keep us on theme for the podcast, why should library workers read this book? Like, what do you think they'll come away with from it?
And I think a significant part of the book is kind of about, what are the politics of knowledge? What do we want from an
emancipatory, liberatory politics of knowledge? Which, you know, to me,
libraries are a very central kind of theme of a very, yeah, like,
utopian idea of knowledge that's accessible and useful to all made in a way that,
yeah, is accessible to people that they can do their own thing with.
And I think what we see in AI is a, it's kind of in some ways an alternative project
for how to structure knowledge, how to make knowledge accessible or inaccessible.
And in my mind, like one of the things,
that we go into a lot in the book, especially in the second half, is talk about AI as kind of a special form of privatizing knowledge.
They've scraped everything on the internet, whether it's, you know, message boards or all the books that they downloaded from Library Genesis or wherever.
Right.
But they're trying to make this thing into a private kind of thing that you have to subscribe to or get served ads to use, which, you know, a lovely fact about the library:
they don't put a bunch of ads in your face.
Right, and there's a sense in which that is a use of knowledge
when it's in the machine that is, in my mind,
used to devalue people's skills.
I think that's ultimately the economic purpose of most AI things
is not to increase productivity or make knowledge available
or whatever. It is to replace people who are coders
with people who have a six-week training course
in vibe coding.
It is to transform people who are logo designers into people who have to fix AI slop.
So the point is to make people into appendages of these kinds of weird new knowledge machines
where they don't have to be paid well, where they can be transformed into gig workers, etc.
And I think that's a very radically different vision about how knowledge should work
than the one that I think something like a public library system presents.
So I think that contrast is very useful for making clear what kind of the,
role of knowledge, of books, of all these kinds of things is in a society and what kind of
society corresponds to one vision of knowledge versus the other vision of knowledge.
Yeah, I like the comparison you started to make later in the book of, like, knowledge as
this structure in the pyramid of power, right? So knowledge for the professional worker both
justifies their position in the hierarchy, even though
they understand that, like, intelligence is not, like, a real thing. They know their boss isn't
necessarily more intelligent than they are, but they are supposed to think that they are more
intelligent than other people. And that's why they're allowed to be in that position. It's like a
double mind. But knowledge you talk about as this thing that is essentially like proprietary.
Like, you are allowed to know this much about your job. And the dream is, the more knowledge we can
keep away, the more we can
sort of lower the power
hierarchy, flatten the pyramid
of power so that there's really
only the people who own all of the
useful knowledge and everyone else who
has to access it without really
understanding and having that mastery themselves.
So knowledge, you use it as sort of
a way of measuring power.
I think this would be a useful way to talk
to students about it because I try to explain
to them like most of this stuff
in our databases is not openly
available. You will not have access
to this much information for the rest of your life.
Right?
This is like very expensive information that we have in our databases.
And then also if you could connect that to like the kind of knowledge that it takes
to be a professional in a job and be valued for that,
like knowledge of the whole process that you work on.
And if there was a way to take that away from people so that there was no professional
managerial class, there was only the lower working class and the owning class.
I guess is kind of like the, when you talk about
flattening the pyramid, that's what it sounds like the ultimate dream is. The CEO who owns and
controls all the knowledge, has the machines that control the knowledge. No one else gets to
have it. Right. Yeah. I think that that really is the dream of capital, you know,
can we centralize all this knowledge? I think one of the, one of the metaphors we play around
with in the book is to say the intelligence of artificial intelligence should be the
intelligence that we gather, you know, the intelligence of intelligence agencies, which is exactly
about, can we, what kind of knowledge can we put on a need-to-know basis, right? And, yeah, we've seen,
if you look at the history of scientific management and Taylorism onto all kinds of managerial
schools today, there is a sense in which there's always been an attempt to say, can we
centralize certain kinds of knowledge among management so that management can tease out the steps
in a labor process and say, oh, here's a part that can actually be done by someone who only needs a two-
week training course. And we're not going to have an engineer do that. We're going to, like, carve out
the niche for the people who do need knowledge ever smaller and more specific, so that, yeah,
whenever you give special knowledge to a manager, you do it so that you can deprive other people
of the bargaining power that they can derive from their knowledge, right? That's I think the core thing there
where we're talking about power, power derived from knowledge can be the power of bosses and managers or it can be
If workers together have knowledge of the labor processes that can't be easily replaced,
then they have the ability to go on strike, do a work stoppage, and challenge management,
challenge the owners of companies, and that way derive power.
Or if you're a very well-trained professional in a niche where labor is in short supply,
where you have specific skills that are in high demand in the market,
then you can maybe derive individual bargaining power, right?
So there's these different things in what we call the pyramid, the kind of hierarchy of power
where knowledge kind of functions differently in this way in the economy and then in the political
power that people derive from their economic position, too, right?
It's not an accident that if you're a billionaire, you can call whoever you want in the Senate
or Congress. And if you're you and me, you cannot.
I think that's a really good thing to highlight with like what is the problem that we have
with AI, right?
Because, like, there's a lot of existential
problems, of course, but
even material problems, like,
the environmental issues, right?
Theoretically, they could come up with a way
for it to not be as
bad, like, it's going to be difficult,
but there's probably theoretically, technically,
a way to do that.
So it's like, that can't be the only bargaining
chip that we have to fight against
this. So, like, the thing I've, and
like, even the whole, like, oh, it steals things.
It's a plagiarism machine. I'm like,
Yeah, but do you want more robust intellectual property law?
Because that's how you get bad.
The problem with that even isn't plagiarism or free, you know,
I almost said free use, fair use or anything.
Like, fair use is good.
We want a digital commons, right?
Like the problem, again, is like how this exists under capitalism
and the way that it affects labor.
And, you know, with plagiarism and fair use and all that,
it's because that's like a livelihood that's being hurt
because that is how someone is existing within capitalism.
So I think this way you're framing it of like power relations
and who's controlling power and everything is like really insightful.
And I guess I didn't structure my questions very well.
I make fun of myself.
But, like, why, I guess I wanted people to get an idea of, like,
the thrust of the book before going back and saying, like, why, why as, like, why write this book as
like someone who's trained as a linguist and as a cognitive scientist? Like, what got your
interest personally into like writing a book that's, you told me like a lot of the footnotes and
stuff had to get cut, right? Like it was, it was very much like made for a more general audience. So
like, what, what was the path there? Yeah, I think I kind of got interested in.
language models a couple years before
all the big ChatGPT splash,
etc. That was around the time
of BERT, you know. Linguists were kind of
occasionally taking note of this and where
I got interested as a linguist. I was like,
oh, here's suddenly a machine that can,
you know, back in those days, it was like, it can kind of
produce something that sounds like English.
You read it and you're like, this didn't make any sense.
I have no idea what I read, but it did sound like
English. And it was like that, as a linguist,
I was like, oh, I'm studying, you know, what kind
of knowledge is involved in knowing
a language. So I was like, oh, what's in the
machine? And so we, you know, I joined some projects. We looked at, like, what kind of grammatical
properties do these machines actually represent well or not well. So, so initially for me,
it was a purely kind of technical curiosity. But I've also been politically active on the
left for many, many years. And so, a little while into that curiosity, I and my friend Ingeborg,
who is a machine learning researcher who also got interested in these kinds of objects, were
talking about, well, this looks like it's kind of the industrialization of language production.
What does that mean from the perspective of the political economy of knowledge production, language
production, these kinds of things? And so we got to thinking about those political questions.
And that was kind of just us talking, trying to make sense out of the situation that we were
finding ourselves in. But then ChatGPT happened. And together with ChatGPT, there was this
explosion of stories that were always in this duality, you know, the promises and perils,
the, whatever, it was always the same kind of binary. And there was this like, oh, my God,
the machines are going to, maybe they're going to destroy us. Maybe the Terminator is going to
be real. Maybe the Matrix is about to happen. Right. So there was this very bizarre sense of
the stories that used to belong to the realm of science fiction are now making it into, you know,
into very serious normal liberal newspapers,
like the New York Times or whatever.
Time magazine had published a piece
where Eliezer Yudkowsky said we should nuke data centers
if the machines get too smart,
and risk World War III over the risk of having a language model
break out of containment or whatever.
So there was a lot of strange things.
And I think what we saw from like the left
was either not a particular interest in that or just a desire to ridicule these stories,
you know, and I understand that.
But we had the feeling that there's something more interesting happening, that there's
something to be said about, why do these stories resonate?
What is it about these stories that makes people interested, you know, other than, well,
you know, stories about big explosions make people click on links.
But, you know, so we went in there and we kind of figured, oh, there's something to,
let's say a story about, you know, the Matrix coming
true. I think there are things in the real world that get reflected in these stories that make
these stories resonate, right? I think there is a very real sense in which even the billionaire
class has a sense that they can't stop capitalism from causing climate change, right? They're
just riding it out. They're hoping that their bunkers will hold when the time comes, right? Even for them,
they're already in a situation where they experience this as there's an unstoppable technological thing.
That's just happening, right? The oil wells aren't conscious. The
oil wells haven't decided to come for us and make the planet uninhabitable for humanity.
But there's a real sense in which there is an out-of-control technology sense, right?
That's even for the ruling class.
But then there's also, you know, workers that get more and more surveillance, more automated control put on top of them.
One of the stories that we're talking about in the book is how Amazon is using video classification software in warehouses, right?
So there's a sense in which personal relations of domination, like a manager telling you what to do,
gets supplanted by a machine-mediated power relation, right?
The manager doesn't even tell you what to do anymore.
At Amazon, sometimes the algorithm just fires you.
The algorithm surveils you, checks whether you're fast enough,
whether you make too many mistakes when stowing in the warehouses,
and then the algorithm, at the end of the day, decides to, you know, fire 5% or whatever of the workforce.
So there is a person whose real life is already under the control of machines.
So there's all these ways in which there's real things in the world that can be, I think,
accurately characterized as people experience their life as dominated by machines in a way that is real.
But where the domination has to be understood, if we want to change something about that situation,
if we are like, this is a bad thing, the world shouldn't be like that.
Then we have to understand how that structure of that domination is related
to capitalism. And so we're like, we have to give an interpretation of these stories,
or we have to give people tools to make an interpretation of their own situation and their
own feeling of why do these stories maybe resonate or not, that help them, yeah, make sense
of the larger political structure, help them make sense out of what is it that we need to change
if we want that shit to go away. So we decided to, yeah, look into it and try to write up
both how we think one can find something useful for interpreting the actual world in these
stories and also just how we think the industrialized production of language will actually change
how the world works. Yeah. And why focus on fear as a framework? Like, at the beginning of the book,
I wasn't quite clear on it. And then towards the end, I was very, you know, it made sense to me.
Like, okay, fear is this framework of understanding, like, why these stories hit. But, like, why
fear, as opposed to, I don't know, some other emotion that, like, the Matrix or, like, Terminator brings up in us?
Yeah, I mean, I think that was primarily kind of driven by the sense that that's the, I mean, you know, there's the weird booster euphoria of people who are just like, oh, it's just going to bring about the best possible world.
You know, in the words of Sam Altman, it's like so unimaginably good that you can't even talk about it.
So I think there's that feeling, but I think that's just, it's just harder to say something interesting about that.
You know, like, I think a lot of that is just kind of boringly delusional.
But I think there's many ways in which people's worries, people's fears, people's anxieties are reflective of a kind of tension that we're experiencing in this moment where we're seeing this thing starting to unfold.
But we also feel like it's certainly not come to its conclusion, right?
It's a moment that we're living through that produces a lot of anxiety.
And I think when there's anxiety, you know, people give, there's always people who want to
profit from anxiety.
I mean, there's always very deeply reactionary or fascist movements who are like, we can
solve your anxiety.
We tell you who's at fault.
And, you know, they run with that.
So I think fear can both be a thing that can motivate people to get together to produce
solidarity or it can be a thing that can produce, yeah, but fascist.
and reactionary impulses.
And so giving people tools to make sense out of anxieties
seemed very important to me.
And then I think there was this thing that Ingeborg and I just experienced
because of our own class position.
I mean, I have a PhD in linguistics.
I feel like in the last five years,
literally everyone who hasn't gotten an academic job
has started working for an AI company.
So we wanted to also write something that kind of
addresses the anxieties that we have experienced in our own circle of friends, and that is very often
people from the professional class and we wanted to get them to look at some of these things that
we talk about especially in the later part of the book, where people want to cling to their sense
that their position in the social hierarchy is justified, but sometimes clinging to that properly
leads you to certain kinds of necessary delusions. And I'm like, I think precisely because
these are machines that are about flattening the social hierarchy. They are certainly attacks on
the privilege of many white-collar workers. And again, I think there's that sense where, you know,
a middle class that's under attack tends to radicalize. And it tends to radicalize either
to the left or the right. And I'm like, I hope they radicalize towards the left. So I think
the book was an attempt of, yeah, of trying to engage with that. And on the question of the book's
title, do you have, like, a one-sentence answer you
give when people ask you, why do we fear it?
I think we fear it because it is, because it's a weapon of class war from
above.
I think that's my one-sentence answer.
I like how a lot of it was, a lot of the book was focused on explaining, kind of, to
professionals, because this is, like, an automation that's coming at them, and saying, speaking
directly to them, saying, like, why are you buying into this fear?
And it's because, and you situate, you know, this labor hierarchy, right?
With various theoretical ways of, like, trying to talk about it, like labor aristocracy, professional managerial class.
But I think at least explaining to people, you're here and justifying your position here.
That's why you're feeling anxiety.
It's sort of like all the radicalizing things that people say like happen when you get older.
Like, you know, there are certain life milestones that are like designed by society to make you reactionary, I feel.
Like, buying a house is like designed to drive you insane because all the money you will ever have in your life is in this one thing.
And then suddenly like your neighbor stops cutting their grass and you want to like kill them because you're like, I'm losing.
I can feel myself losing money because my neighborhood isn't pristine the way I want it to be.
Yeah.
Or just, oh, the rent is going down in my neighborhood.
But the value of a house or an apartment is proportional to the average rent in the neighborhood,
if you want to sell it again.
So same thing.
Suddenly you're like, no, I want the rents to go up because I have a 30-year mortgage, right?
Yeah, absolutely great.
I think that was actually very planned.
I think, you know, after World War II, that was a very active thing with the GI Bill,
et cetera, was to get people, yeah, to turn more people than before into small property
owners so that they could be like, oh, the damn socialists, they're against private property,
and I own an apartment or a house or whatever.
Yeah, I mean, that's why pensions went away. I mean, literally, it's just to make you invested in the stock market. Like, when I worked for the Texas state government, right, as a public employee, my retirement, if I, like, if you made a voluntary contribution, like, gave $100 into your retirement account, like a, I forget what type of 40-something account, that wasn't an actual savings account with money. What it was was shares of a retirement account, which was tied to oil
production. So all of my retirement money was in oil. And so if the price of oil went down,
my retirement tanked. So, like, I lost money right in my retirement, based on how much I put in, and
it, like, went down in cost. So I put in, like, a hundred dollars on every paycheck, and what was in there was
less than I put in, because, like, the oil prices were dropping. And that was how I found out that
all of that money wasn't a savings account. It was oil shares. Wow. So then, who in Texas,
who works for the state government is going to say, yeah, let's move to solar power. Right.
No one. Yeah. Yep. Like you said, it's out of their, out of their control. Even the capitalists are
like, well, if I don't do it, someone else will, which is a very reactionary mindset, I think, as well.
Like, you know, it's, I mean, when I was reading that, of the person who said, you know,
if I didn't make this AI, someone else would.
It just reminded me of that like settler in Palestine who was like, if I don't steal your house,
someone else will.
Oh, yeah, I remember that video.
It's extremely, you know, the guy who, like, is from New York and is like, hey, if I don't steal your house, someone else will. I'm, I'm indigenous to this land.
Hey, over here.
It's just, like, some New Yorker going and stealing someone's house, and their family's been there for, like, 500 years.
Hold on a second. I actually need to find the exact line in the book that I highlighted. It was the first thing I highlighted in it. Oh, actually, the very first thing that I highlighted is the picture of Darth Vader, the master of the invisible hand, as someone who's into autoerotic asphyxiation, in a footnote, which was just, like, excellent imagery there. Thank you for that. The second thing that I highlighted was the radically unthinkable contradiction:
the will without a willer, and how that was, I had never put that anxiety into those exact words,
but realized that that was, that's the anxiety of capitalism, right? Is you can't stop it. And
when that becomes a technology, it becomes even more opaque as a force. So, like, yeah, just
saying, nobody's going to, nobody's going to be like, oh yeah, let's switch to solar power,
because everybody's shares are
retirement shares are in oil.
It's like, it just made me think of like
the part where you're just like, yeah,
who actually can stop this?
CEOs can't.
They have their own reasons for it.
So it just, it's a very interesting angle in it.
I thank you for that.
I'm going to be churning that over for a while.
Yeah, I think it's,
you write in such a way that is useful for me
because I tend to, because I'm not naturally like a very good speaker,
I tend to like memorize short phrases
and like pithy statements.
Like a lot of my politics just comes from like folk punk lyrics because like I can memorize like,
okay, I can throw that out in a conversation.
So I tend to do that a lot.
It's like as a person who had an undiagnosed anxiety disorder, it's like, oh, I learned how to speak by just mimicking people.
So, yeah, I actually am a philosophical zombie, but no one can tell.
Can you explain to my husband what a Chinese room is?
because we had a really long argument.
I had never heard of it before.
And he starts spouting, like,
okay, imagine this Chinese room.
And I was like, but why is it a room?
Why is it Chinese?
Like, I was so confused.
I don't know why it's Chinese.
I suspect the answer has something to do with racism, but.
See, I'm not crazy, Justin. I just, I'd never heard of this, like, computer philosophy quandary before.
Yeah. Well, it's common. Everyone knows it. Everyone knows about the Chinese room. Everyone thinks it's a
very useful thing to talk about. I was reading Blindsight, and the aliens in it are highly
intelligent but not conscious. They don't have consciousness. So when they're trying to communicate with them,
the linguist on board is like, oh, they don't understand language. They can just speak it to
us perfectly, but they don't actually think. And so she starts, like, insulting them and stuff.
And then they insult her back. And that sounds like a fun book to have written. How old is the book?
It's from the early 2000s, I think. That is fun. Yeah, it's, yeah, 2006, it's about humanity's first
contact with aliens. So, like, humanity gets, like, scanned by these satellites. And also there's
vampires. I don't know why he felt the need to throw vampires in there, but humans, like,
reintroduced vampires, which are, like, more intelligent than humans.
And the AIs are incomprehensible.
Are they philosophy vampires or normal vampires?
They're like prehistoric humans that fed on other humans so they're more intelligent
than us.
So humans can't understand the vampires because they can always outsmart you.
But humans also can't understand their own AIs.
And then so these humans, with a vampire captain and an AI on board, go to look
into this, like, issue with these aliens.
and then the aliens themselves
are completely different type of intelligence
and so the whole book is about intelligence
and it's really interesting
but I really am like
why did you throw vampires in there
and they're not like
made a deal with the devil vampires
they're like no back in ancient times
these were like an offshoot of humanity
that predated on humans
and then for some reason humans
like remade them
but gave them like a crucifix glitch
so that they could control them
so that they have basically
grand mal seizures if they
see certain geometry.
That sounds like a wild story.
I love that.
Yeah, I've been joking that, you know, there's all these people who are like, well,
aren't humans just next word predictors?
Isn't that just what I do?
And I've been joking about how the one thing that we've built with these language models
seems to be a P-zombie detector, a philosophical zombie detector, right?
The idea is that maybe there are some people who don't have a rich internal life.
Maybe it's the people who are like, well, I'm just a next-word predictor.
I also just predict the next word and then say it.
Well, that's an interesting point, though, because, like, people are pushed to act like computers.
People are pushed and coerced to act like machines.
And so some of the concern that I think people have is that this is moving into the personal realm.
And so people's personal lives will start to be dominated by, like, machines.
Do you already see this with, like, grindset kind of guys, or, like, my whole life
is just about optimization?
Right.
I'm going to 10x this, and I'm like,
which can be fun.
I hate the phrase 10x.
I took a human sexuality class in the 2000s,
and it showed us, like, a 24/7 total power exchange,
and she would talk about how she would, like, map out
her grocery run at the store to be, like, perfectly efficient
so as not to waste time.
It's like, yeah, cool.
I can understand how you want to live your life
around efficiency for like a sex reason.
But, like, to just do it just because, like, that's rational?
It's like, why would someone go to school?
Why would someone do art when you could have something else to do it for you?
It's like, well, then, yeah, what are you?
Like, what's left of you if this logic of capital goes into your leisure time?
Yeah, yeah, yeah, that's very true.
Because, you know, being rational can be useful for achieving a goal.
I mean, being rational is about how do I achieve a goal that I have set myself.
But being rational doesn't produce goals in itself.
what you think is valuable is not in itself a question that grounds down in rationality.
But once you're in capitalism, you can be like, well, I know what's rational.
Rational is when the numbers go up.
And the goals are whatever steps on the way to making the numbers go up are.
And then you're like, well, when I'm dead, I hope the numbers have been really high or something.
Right.
It's a thing that once you think about it in the memento mori or whatever way,
everyone should obviously realize how
absurd it is, but precisely because money is also power, there's this weird way in which it
kind of short-circuits our ability to, yeah, to think about what does it actually mean for
me personally or for you or for anyone to live a good life, right? It kind of produces a shortcut
there, which is this bizarre thing when we come back to anxiety too. I think it's this kind of thing,
it's a shortcut that you seek because, you know, I mean, the fact that whether you're an existentialist
or whatever, like the fact that we have to kind of figure out what the meaning of our own life is
and that it's not just there to discover and that it's not neat and that it's, and the fact that we die
is all pretty anxiety-inducing, right? So if you don't have to think about it, if you can find
a shortcut for not thinking about it, that this alleviates the anxiety, but then you're actually
running in this hamster wheel of capitalism, which continually produces anxiety as well.
So, yeah, there's no easy solution. Like, you try and, like, solve it with, like, Calvinism, but then people
are really worried about, well, what if I still end up going to hell? Like, you've solved the problem
of the fear of death, but then you spend your whole life anxious about whether you're the right kind of person to
not have the bad afterlife. So you never really solve the anxieties. And it's
strange seeing people kind of make their own spiritual, you know, beliefs out of
the machine. Like, you know, the guy who's trying to force himself into living
forever and getting their blood boys and stuff like that, as a way of just not thinking about the fact
they're going to die. Yeah. And also, I think it's interesting, too, you know, the fear of
AI overthrowing us is also the one that the capitalist has: the fear of all these machines you've
created out of people rising up against you as well. I think it's at the core of that anxiety.
Yeah, right. If you have read so many science fiction stories, it's so clear that the anxiety is about class and
what if the people who are below me in the social hierarchy rise up, whether that's from the professional
experience or from the experience of the global north or from the experience of the capitalist class,
right? There are so many intersecting ways in which there's all these sub-hierarchies of power,
but it's very clear that so many of these sci-fi stories are about the uprising of those who are at the
bottom of the hierarchy, right? Like, even in The Matrix, if you look at the backstory from
The Animatrix or whatever, it's the cleaner and construction robots that start the uprising,
right? It is exactly the kind of jobs that are first imagined as fully automated, which is
exactly what you said, right? We're treating people as if they were already machines, as if
they were just, you know, things that I give a couple dollars and then my apartment gets cleaned or
whatever, and I don't have to have a human relation to them.
I don't have to treat them as a person.
I can treat them as some kind of consumption good.
But it's exactly those people that in the science fiction stories are first imagined as robots
and then imagined as leading an uprising.
And that's, yeah, sure, that's definitely also one of those fears.
And I think that that's maybe the most obvious one that tells you why we need to interpret
these AI nightmare stories ourselves, right?
because if there's politics that follow from the interpretation of the AI anxiety,
I don't want it to be the politics that follow from the AI anxiety
that's actually about the anxiety of what if poor people come together
and form a political movement and overthrow the capitalists and institute a different system,
right?
I want the politics, I want the interpretation of anxiety to be one about what is capital doing
to us with these machines.
How are we getting dehumanized?
and controlled by the system.
Because I have to deal with, you know, working in academia and hearing academics try and deal with the reality of AI.
And of course, none of these people have any sort of like Marxist grounding, even though they should.
You know, like our university president is a history PhD.
And it's like, I know you know this stuff.
I know you had to read Marx once in your life.
Hopefully.
I mean, who knows?
and getting a history degree in the US, maybe not.
But I know you have to know some historical materialism, right?
I know you have to know a little bit of this.
When people talk about human in the loop and ethical AI use,
how does that hit you after writing a book like this?
Well, I know a lot of people who work in those spaces,
and I don't know.
I mean, usually it hits me with, like,
they're very bad fixes on a very deeply broken thing.
I mean, they're, like, identifying a very concrete problem
and then are like, can we do something about this problem without changing anything about the structure?
So I think, you know, I think it often comes from a place of good intentions,
but it comes from a place where people see exactly the structure in which they
already have agency, and are like, what can I do in this small world where I am?
I think we've seen over the last year especially how fragile that is to begin with,
because, you know, the leadership of big tech companies has turned much more fascist in many realms.
And a lot of these spaces are getting much more curtailed, even the smallest reformist kind of stuff.
Like maybe we can make the image models not be useful for just undressing people in images, right?
Which are not radical demands.
Those are demands of, like, hey, if we want to have a kind of functioning society of whatever kind, even if you love capitalism, maybe we shouldn't have that.
Even those spaces are getting curtailed.
And so I think, yeah, I think the people should be thinking about how can they use their individual
power to increase the collective power of people who want more democratic control,
who want more egalitarian outcomes, right?
I think there's a tendency, once you're in a certain position in the hierarchy and academics
are, you know, the professional managerial structures more broadly are people who have individual
agency.
So that's how they see the world.
I think that comes, that's a quite natural thing.
You know, you don't really have to be much of a historical materialist to think that way: depending on your position in the workplace, either you get to have certain abilities to make decisions, you get to have a certain freedom, or you don't.
You know, there are jobs where you design the structure of your day yourself.
You make yourself a to-do list every morning and you decide how you go about accomplishing your projects.
You structure them. And there are jobs where you don't.
There are jobs like working in an Amazon warehouse, where every
single step that you take is measured and surveilled and determined by a rationality that isn't one
that you chose.
And these will produce different ways of thinking about, well, where's the problem?
Clearly, the Amazon worker will not solve the problem of this is a shitty situation and we should
have a say in how this place is run by just being like, well, I'll just decide to do it differently.
Right.
So I think there's almost a sense in which that kind of class position misleads
people from the professional strata into very small, very reformist projects.
And to me, there's a hope of being like, look, if you can see this thing as an attack
on the stratum of the working class that you're in, then you should be able to connect
your own personal struggles and the larger political struggles around like, you know,
a more just world, more clearly to the sense that we need to produce a collective agency
of some kind, right?
We need to be able to organize ourselves into collective
units that can challenge capital, whether that's political formations of some kind, whether
that is unions.
I'm not going to make, you know, I have my own ideas.
I don't have perfect solutions.
I certainly don't feel like I'm in a position where I can tell people what the recipe is
for that.
But I think we can think about this from that theoretical perspective and be like, there's
boundary conditions.
There are certain things that your political activity will have to fulfill.
And one of them is certainly that you have
structures that can grow, where people can get involved, where people can experience their
own collective agency in democratic decision-making and in challenging the people who have
power simply by virtue of being rich. Yeah, it's something, I feel like when people talk about
having AI-ready workers, it's like having an assembly-line-ready worker. Like, taking a
class on the assembly line is, like, an absurd prospect, because the whole point of the assembly line
is to deskill you, right? So why would you need to learn about AI? You know, I also think it's
interesting that when you talk about different levels of class domination by machines, like
telling an Amazon warehouse worker that they need to have AI literacy because their job is
dictated by AI, it's like, yeah, why would you say, just have some AI literacy about
this so that you aren't misusing the AI? Well, you misusing the AI is not the problem,
in the same way that me misusing the AI is not the problem. It's, you know, whoever decides
to hook it up to our power grid's problem, or whoever decides to hook it up to our nuclear
capabilities' problem, or whoever's, you know, hooking them up to drones. It's like, you know,
telling me to be AI-ready for the future is, like, that's not, I always found it strange,
in the same way that when those scare stories came out,
I didn't really see them as like the way this book approaches them,
which is like understanding where people's fear is coming from.
I just saw it as all marketing.
I just thought they're just saying AI is scary and it's Skynet
because they want people to believe it's more powerful than it really is.
But now I think, no, they genuinely do have like this fear.
Right.
Like this real existential kind of fear of like this thing is,
is potentially going to,
you can watch them kind of freak themselves out
because one, they're very isolated and insular billionaires,
but also like you can see that it's fears about a lot of things.
Yeah, I think, you know, in a general sense,
I think when one encounters propaganda of some kind,
it's a mistake to think that the propaganda works
because of its content only, right?
You have to ask not just why do people want to send this message.
And I think you're right.
I mean, I think for whatever we're saying about the anxieties of the ruling classes and the professionals,
I do think there's also an aspect of this just being propaganda and advertisement in there.
But the question is, why do these things resonate with people?
Right.
It's not just why do people want to send this message?
It's like, why do people not just immediately tune out?
I mean, there's lots of messages that don't reach people, messages that the rich would love them to hear,
but that nobody wants to read about.
So what is it about the tension between the sender and the receiver of the message?
And is there a potential for politics in the tension and the maybe even contradictions between the two, right?
Is there a moment where we can, I think, you know, I think as a Marxist, I think the only way to deal with history is by figuring out how the existing contradictions work and figuring out, is there something about the way that things are fucked up right now that produces a potential
for making something better?
You have to kind of, yeah,
the only way forward is through in a sense, right?
So, yeah, so that's what we kind of try to do with that,
to be like, is there?
Yeah, what about these anxieties
can maybe be made into a rational source for solidarity, essentially?
And I do think the stuff about the centerlessness,
I think it's really, I'm so glad that you like that,
that made me very happy.
I mean, it's from Mark Fisher originally,
but we're connecting it to some,
some language model stuff.
But I think that's just so crucial for understanding
something about the actual anxiety of the ruling class
and the professionals,
but also about why I would say I'm an anti-capitalist
and the problem is capitalism, right?
Because the problem is not specific capitalists.
And I think that's important to make clear,
that it's not that, you know,
the problem is not that the 15 guys in Silicon Valley
who are at the top happen to be all assholes.
It's that we're in a system where if you are a certain kind of asshole with a certain amount of money at your disposal, you can become more rich and it's precisely the wealth that gives you the power to structure decisions.
It is that capitalism is not just a system for distributing resources and dividing labor.
It is a system for producing decision power.
And it is a system for producing decision power based on people whose decisions in the past
have been guided by making more profit, and who have been reasonably good at it, whatever
the costs, right, whatever the externalities, whatever the harms to other people, whether that meant
crushing your workers. If it's been good for profit, that means you now have more decision power
in your hands than you did before, right? That's why it's not just an economic
system, but a system of political economy, right? Who gets to make these kinds of decisions? How did they
get to be there? And replacing individuals doesn't help, because it's a system that selects for particular kinds
of individuals that make particular kinds of decisions. And if somebody decides, I don't want to be like
that anymore, they just get kicked out of the system and somebody else steps into that role.
That's why we're like, on the one hand, that if I don't do it, someone else will, is like a really
pathetic way of thinking about the world. And at the same time, there is a certain truth to it under capitalism,
That's why the political target has to be to figure out how we change the system so that we can make decisions to fulfill human needs, and not run a system that puts people in power who put profit above all human needs whenever they conflict.
Yeah, I think that's, to bring it back to Mark Fisher,
there's this "there is no alternative" kind of thought that's dominating the AI discourse.
It's like, one, this is going to happen.
you are going to be automated.
Capitalism is inescapable.
And the exact way out of that, I think, is people imagining a future that is an alternative.
I think in some ways, maybe the acceleration of it all in the last decade has made it easier for people to actually imagine a post-capitalist future.
That isn't just nuclear annihilation.
I think finally people are starting to say, like, this is, all this is fake, isn't it?
Like, COVID really helped break some people's brains, in bad ways and good ways, but also, I think, the second Trump administration and all of this AI slop being forced, you know, into people's brains all day is sort of making people go, maybe we should, like, do something different, because this is just not working.
And I feel like that sense of the ability to imagine is coming back to some people.
I can't tell if that's my own little bubble
because it's the kind of people I like
to surround myself with but I'm starting to see
people, I think, talk about
other ways the world could be. Like,
you know, this AI is, like, this
story we tell ourselves,
and people are also telling counter-stories of,
what if we didn't have to deal with this crap?
So the refusal of it
is starting to generate some
thinking that is actually useful,
in terms of, like, maybe we should have
some rules about this.
Maybe we should send some billionaires to jail.
Maybe we should actually do something differently.
And that also activates people on the right and towards fascism.
But once you start breaking down that consensus view, I think the things start getting interesting.
Yeah, I agree.
I think there's almost a sense to me in which there's a lot of critique out there that
does want to break with the sense that there's no alternative to AI.
but doesn't really want to go to the
there's-no-alternative-to-capitalism thing.
They want to, like, go in between those.
Yes, there's an alternative to AI,
but the alternative is, like, more regulation,
and this and that particular AI use
should maybe be ruled out or whatever.
But I think, you know, I'm almost sometimes like,
we should be like, yes, you're right.
Under capitalism, there's no alternative
to this particular kind of way of making life shittier.
So fuck capitalism, right?
Let's have a different economic system.
Let's have a more democratic world where we figure out how we can make decisions differently,
not based on profit, but on other human values.
How do we bring those in?
I often feel like people imagine that we can build a political movement that can stop AI,
but then they stop there. Which, I think we should and we can.
But I think that political movement itself will be so big
that we can make much larger demands than like don't put a language model into my
education software or whatever, right?
We can be more imaginative.
Like, why not take this moment and be like, you know, up the game a bit?
You're like, now we want to talk about the rules of the whole fucking game.
Yeah, I think there's, like, this parallel between the resistance you see in rural
communities, because they're the ones targeted: resisting the building of these data centers
and resisting the building of ICE concentration camps.
That is the same resistance of, we don't have to deal with this.
And these are like conservative areas,
but they're like,
we don't want this pumping diesel into our atmosphere all the time,
and we don't want to have our economy based around big encampments
that haul in our neighbors.
And even if there's not a left-wing element to it,
there is a popular resistance that is amorphous and moldable
towards an anti-capitalist project, I think.
Yeah, I think so too. I think there's, you know, you don't build a leftist movement by just being like, well, I find, you know, the three most radical guys that I could find on campus, and then we sit at home and imagine how the world could be better. You have to build when people are engaged in concrete struggles; you have to build solidarity, support them in their struggle, and then, through that, bring them into the larger project. Right. So I agree. I think, you know, I think there's lots
of people who are, in this way or in that way, conservative, that can be brought into the
movement, and that will be difficult, and it will require certain boundaries. You know, there
are certain things on which you cannot compromise, but it will also require, like, patience
and solidarity. Yeah. Have you ever seen the movie Pride, about Lesbians and Gays Support the Miners? Watch that.
Yes, it's a great movie. It is great. I cry every time. Every time. Every time.
when those
the fucking union buses
show up
to lead Pride
at the end
and every time
like
I haven't seen this yet
I just haven't gotten around to it
I should make that a better priority
I know
I know
you got it
no no I used to be part of a political
organization that would
like every couple years
we would show that film
as like an organizing
event basically
you know
I would just have a showing
of the movie and be like,
stay after if you want to get politically involved in something kind of like this.
Weirdly,
that film is, like, one of the better depictions of what, like, organizing work looks like
day to day, and, like, solidarity work and all that.
Like like little practical things like no one goes out there by themselves,
right?
Like just like a little shit like that or like teaching people like what solidarity
looks like with people who you might not usually
consider yourself an ally with.
Like, it's just so good. It's so good.
Anyway.
Yeah, lesbians and gays against data centers,
lesbians and gays against concentration camps, you know.
Like, there's no reason it can't,
it can't work.
And, I mean, it is funny, when I'm walking around Boston,
I just see, like, the communists trying, like,
communists against the things you hate.
And it's like, I don't know if that's going to work.
But I don't know if Americans are ready for that.
But one day they will be because they won't have a choice.
Because Boston's a city that's actually a college town.
And so a lot of the explicitly socialist, of any flavor, stuff that happens is a bunch of college students.
Not to shit on college students,
but that's a lot of what it is here.
That's not true, you hate college students.
you tell me every day
This is also true. The students
who are on the B Line
who don't take off their fucking backpacks,
I hate, if you are listening
and this is you, I hate you personally.
it's
it's not solidarity
to have your backpack on the public transit system
this is true
this is true I mean you know
I've been living in New York for almost a decade now,
and I know New Yorkers are thought to be very rude,
but I think a lot of New Yorker rudeness is actually about,
like, you are not following the rules that enable everyone to get along,
and that's why we're shouting at you.
We're not shouting at you because we're mean.
We're shouting at you because you are behaving in ways that maybe you don't realize,
but they are very asocial.
You cannot just stop on the sidewalk in New York
because you thought something was interesting.
No, you're like, go into the little
space behind the lantern where you will be in nobody's way.
And somebody will shout at you if you don't.
But it's a public service that they're shouting at you.
It's the whole thing about people in the northeast.
We're kind but not nice, or, like, the other way around, where we're, like, we're kind of dicks,
but it's for, like, it's good, you know, like shouting directions at someone.
Yeah.
Yeah.
They'll dig you out.
They'll dig your car out of a snowy bank,
while calling you a dumbass.
Yeah.
Yeah.
Yeah.
Yeah.
Yeah. It was, like, one of those Boston roundup things.
It was, like, a guy was mad about the parking situation,
so he shoveled all the spots on his block.
Yeah.
Two rules.
He's so angry.
This guy helps me parallel park while calling me a fucking idiot the whole time,
but, like, explained it to me in a way
I will never forget.
Yeah.
Yeah.
It's great.
Right.
Well, that's, you know,
just we're talking about
solidarity is not a frictionless
kind of thing.
No.
But you have to, I mean, you know,
I think the problem with people in college
who want to organize is often that they
don't figure out what they can
materially bring to support people.
I mean, and often that is because college
students don't have much access to
resources, right? Often what's needed is
resources. But often what is needed is just
fucking time, right? Can you figure out
something, some way to make your time useful?
Maybe the thing is, yeah, shoveling someone's
car out.
This is literally
I organized with
some
Catalonian people
and they
told me about
this like
I forget what they call
it but like
if you're doing
organizing work
like and say someone
who's like
they're the person
responsible for like
the planning of something
the logistics
or something
they've got to like
do all that work
but like
because they've like
got kids
or they have a job
where they have like irregular hours or something,
it makes it hard for them to stay on top of the organizing work that they need to do.
So people who organize with them or in their community will either like do mutual aid
and like pay their wages for a chunk of time so that they can like take that time off work paid
in order to do the organizing work that they need to do.
Or it's like, I'm going to cook dinner for you and your family tonight.
So instead of you having to spend the time and energy to do that,
you can work on organizing and like that's a lot of what they do.
And I was like, oh, like, that's, it's something that seems so obvious.
But like, even if you don't have resources, can you like walk someone's dog for them, right?
Like, can you go over and do their dishes or like help them fold laundry?
Like, can you do that?
Or even watch their kids for a little bit, like put on a movie and sit with the kids.
Or like, is your organizing space friendly?
for people who have children.
What time are you meeting?
Where are you meeting?
Right. Yeah, a lot of that stuff
is not glamorous, but it
is actually the stuff of people's
daily lives.
I think when you're in college,
it's also kind of normal
that the stuff of daily life is
of less interest to you than
most people because you're the furthest
removed from what it means to
have a normal daily life.
We've built a very abstract and strange kind of situation where it's like, this is the time where you're supposed to, A, form your social and career networks, and B, acquire all these specific weird skills that may or may not be relevant to your life at all, but that somehow prove that you can be accredited with this paper that says you have a degree from this more or less prestigious institution, right?
But that's a very strange kind of thing. And we have also, like, structured it around most people living a materially relatively deprived
life at that point, right?
So we're like, yeah.
But it's true that these things of daily life,
I think not only are they actually what allow these things to work
and what makes them be perceived as solidarity,
they're also the difference between the kind of organizing
that also creates communities,
which I think are ultimately the really resilient political forces, right?
And that's another thing that's beautifully depicted in that movie,
I think, that people care about each other.
Like, it doesn't start from the abstract political level.
It starts from the interpersonal level, which then gets a political interpretation attached to it.
And I think that's the only way that we will be politically successful: if we can do that kind of organizing.
Yes.
And I also think you bring up a really good point about, like, the way that college is treated, at least in the
United States. I'm not sure about elsewhere, but, like, the sort of scourge of college students
using AI, right? And even before this, you saw in the sort of, you know, entrepreneur,
grifter mindset, like, internet space, like, oh, your college degree, you know, don't waste money
on college, read these books instead or take this course or do ever, the sort of like anti-intellectualism
that is still showing up within quote unquote the academy.
And it's like, you know, so often I see people almost defending the students,
because, you know, the types of assignments that are being asked for are not well designed,
or the workload is unreasonable.
Like, the purpose of college has changed so much
that, you know, you can't blame them
for using this tool to do assignments for them
when that work is not benefiting them in any way.
And I'm like, I understand what you're saying,
but then you're pointing to a problem that we can fix.
It feels like it's coming out of the same argument
of like, well, in high school,
why don't they teach people how to do their taxes?
Why do they teach them algebra?
Why do they teach them a bunch of useless shit?
As if the entire purpose of education is vocational.
I mean, this is like in Germany, I think.
Like they have that sort of split education system
where at a certain point you get like split off into going to college eventually
and then going into a trade.
And you go to like completely different high schools or something.
And it's like, I don't know how easy it is to swap between
those two, but it's like...
It's very difficult.
Is it? Okay.
Like, I remember I took German in college and we learned about that.
I'm like, that doesn't seem good.
No.
Like, but, yeah.
I always tell Americans, that's probably why college is free in Germany.
It's because for us, the teachers just select when you're like 10, whether you go to college or not.
So the class reproduction is actually quite similar between Germany and the U.S.
So if you measure, like, if you check how likely somebody with zero, one or two college degrees
between their parents is to themselves get a college degree.
It turns out the heritability of a college degree
is pretty similar in Germany and the US,
despite college being free.
I mean, I paid, I believe, 16 euros and 50 cents per semester
for my education.
I can't even get a meal for that.
That's crazy.
I didn't know if your name was German.
Of which, I think 16 euros actually went to the student union.
Yeah.
Yeah, with your name,
I wasn't sure if it was German. But yeah, like, for people in the United States, the purpose of education, I don't think, should be to prepare you for a job. This is actually a huge problem I have with library school. A lot of people are like, oh, library school doesn't teach you enough practical skills. And I'm like, no, what you're wanting is that we should have a vocational track for librarianship. But if there's going to be a master's degree for this, it should maybe be more academic.
Like, I don't like the conflation of education with the job market, because then
it's just a barrier to getting employment, with a huge price tag on it.
But because we treat it as this, oh, you have to have a bachelor's degree to do anything now,
it's like all of the assignments are meaningless.
There's just... the way that we do college is just not good.
So it's like, I almost can't blame students.
Like, I think it's bad
that they're using it, but...
Yeah, yeah, I mean, I agree.
I certainly also think that, you know, education should be about more than just vocational training.
But there is a sense in which, in our society, we may talk a nice game about what a liberal education or whatever is for.
But ultimately, you know, again, capitalism is the system of evaluating things, right?
Of putting a price tag onto everything.
And it structures all social relations that we find ourselves in, right?
Like we try to make community and meaning outside of that,
but it's difficult.
It keeps intruding into everything.
And I think, you know, students reacting to that can be depressing sometimes.
But I think, you know, again, I'm like, well, the only way is through.
So if the system produces that, then hoping that the individual students will just behave differently
is certainly not a way of changing the system, right?
So in a sense, I understand being angry about that or disappointed or depressed or whatever.
I mean, lots of friends of mine are teaching,
and I've heard horrible things from many of them about how the last couple of years have gone.
I taught over Zoom during COVID,
and I found that very depressing sometimes because a lot of the community-making thing
that happens in a classroom was just so much more difficult to do.
But, yeah, I think, right, again, we see there with AI:
AI is a tool for trying to devalue certain skills, to make them cheaper for
capitalists to buy on the market.
And the students, in a sense, are reacting to that cheapening.
That is, yeah, in the pure, like, neoclassical economic sense, they are behaving
as rational actors, right?
They're like: I'm supposed to produce a skill with a certain value.
The value has been lowered.
So I'm going to invest less time in the production of that skill set, right?
But yeah, I mean, one of the things that to me is really interesting, having thought about it and having tried to write about it, is this question of why capitalism simultaneously has this way of valuing education, of constantly professing it as a value,
and this deep anti-intellectualism that it simultaneously produces.
And to me, when I read Braverman, and when Ingeborg and I were talking this through a lot,
something really clicked about the fact that very, very often the point of technological
and intellectual advancement is precisely to cheapen some other kind of knowledge, right?
Once you realize that there's a dialectic or whatever you want to call it,
the fact that there's a devaluing of some knowledge and the upvaluing of other knowledge
are two sides of the same coin constantly, right?
Like, Braverman was a machine worker,
a metal worker, before he became a sociologist.
So he knew a lot about how metalworking had changed
between like 1940 and 1970 in the US.
And he's talking about the shift from like manual lathe operators
to numerically controlled machines to computer numerically controlled machines.
And he doesn't talk that much about it,
but there's that sense of like, yeah, of course,
the person who knows how to make a CNC machine, how to program a CNC machine,
who knows how G-code works or whatever.
The person who has that kind of technical knowledge gets paid better now
precisely because the machinists who were the highly skilled labor force of the 1940s
have now been deskilled.
So the upskilling of some and the deskilling of others,
those are not two totally independent movements;
they're very strongly conjoined.
That really gave me a new grasp on why there's this,
yeah, this constant conjunction of valuing education and anti-intellectualism.
And I think something that didn't really make it into the book, but that I think is really
crucial to think about, is that we see this in a very radicalized form historically in fascism,
right? Like, fascism has this absolute fetishization of technology, especially, of course,
military technology, but also heavy industry, while also being an extremely anti-intellectualist force.
So in a sense, I hadn't really, before we started working on this book,
thought about how much that's a radicalization of the very normal use of knowledge for deskilling workers,
of the use of knowledge as a weapon in class war, and how that kind of thing reappears once you have a fascist,
like, yeah, war between states and ideologies, right?
That we see this at different levels.
So once I think about the individual student who's like,
I don't want to do my homework,
I'm going to throw it into the AI thing,
in that whole context, all the way up to fascism,
I'm at least kind of humbled about wanting to tell the student that that's their personal fault.
For whatever it's worth, I also think, you know, like, yes, the point of education,
the point of any training, is struggle.
One can only make meaning and life out of friction.
I think this is another way in which capitalism is just very deeply anti-meaning.
Capitalism hates friction,
and capitalism hates particularities, right?
The point of money is that everything is abstracted
into the equality with everything else, right?
Like if this thing costs $50 and that thing costs $50,
then whatever the two things are,
they are in some ways equivalent, right?
That ability of capitalism to above all else abstract away from particularities
and that desire of capitalism to remove friction
because friction is cost,
I think is like why capitalism is such a nihilistic system
because anything that we as humans make meaning of
is particularities and friction, right?
That's how interpersonal relations grow
because you've made meaning out of a conflict, right?
Your best friends are not people that you never had an argument
or disagreement with.
It's people where, when you have a disagreement,
you've found really meaningful ways of engaging that disagreement,
and maybe either resolving it or making it into an ongoing source of meaning or whatever.
But friction is really important for meaning making.
Yeah.
Yeah, I think that's even part of the fear that we see with AI, particularly in creative endeavors,
because artistic endeavors are often so very, very personal and particular, and not very well valued in a capitalist sense.
So there's that fear, at least from my interpretation, following a lot of artists and writers on social media, that it's not necessarily that AI is going to take away our livelihood, even though there is that, but that it's going to take away the meaning of our effort. And effort is friction.
You can't grow at something unless you work at it, right? So, not an entirely cohesive thought here. But yeah, just like you
just said, it is all about friction. Students not wanting to do their homework are trying to avoid the
friction of doing it. People not wanting to do organizing are avoiding the friction of interpersonal
relationships and having to figure out how to grow them. So embracing the friction, I think,
would go a really long way towards soothing the fear enough that we can get a clear view on it
and figure out what to do about it. Does that make sense? That's why I really appreciate the fear
framework, I guess.
That's what I'm trying to say.
Yeah, that makes total sense.
I absolutely agree.
Yeah, totally.
And I think, yeah, that was kind of the arc that I also had in my mind, from, like, the
friction of doing homework, right?
That's a struggle.
It's difficult.
But the difficulty is kind of how you make a thing your own thing, right?
Like, that's why an idea from a book can turn into something
that you do something with after you've read the book. Before, it's just
someone else's thing, but it's the struggle that makes it your own. And yeah,
in art, there's so much, the struggle is part of the production of whatever the hell meaning is,
right? I'm not, I don't have a theory to advance, but I know that that friction and sublimation
and all these kinds of things are essential to that. So yes, absolutely agree. All right. Well,
we've gone an hour and a half, so I think we are good to wrap up. Are there any, like,
final questions before we go?
Okay, just making sure
I'm not, like, cutting us too short.
Okay, thanks for coming on.
We really appreciated it.
I really liked reading your book.
I'm so happy to hear that.
Thanks for having me.
That was a very fun conversation.
Is there anything you want to plug?
Places people can go, things we can add into the notes?
No, I don't think I have anything right now,
other than the book, you know, which I assume is going to be in there.
Link will be in there.
and I will also put it on social media so that people will go buy it.
Perfect.
All right.
Thank you so much.
Or get it from a library.
Request it at your library.
All right.
And good night.
