The Taproot Podcast - The Mirror World: Therapy in the Machine Age

Episode Date: February 19, 2026

Are we navigating reality, or just a highly optimized map of the past? In this episode, we dive into the architecture of our modern ghost story. We explore how the digital systems built to reflect our world have instead consumed it, replacing human experience with statistical prediction, algorithmic herding, and mechanical objectivity. Drawing on a wide synthesis of philosophy, media theory, and history, we deconstruct how the "map ate the territory." From Jean Baudrillard's simulacra to the predictive text of modern Large Language Models, we examine the uncanny reality of living inside a model that only knows what the dead have written. If the internet is a séance and your digital profile is a voodoo doll, what happens to the biological original?

In this episode, we unpack:

- The Precession of Simulacra: How credit scores and algorithmic risk models generate the reality they claim to measure.
- The Bureaucracy of the Dead: Why modern AI is less an artificial intelligence and more an industrialization of our ancestors, echoing the warnings of James Hillman.
- Digiphrenia & The Voodoo Doll: Douglas Rushkoff's narrative collapse and Jaron Lanier's terrifying metaphor for the modern attention economy.
- The Numbers Shield: Theodore Porter's revelation that "mechanical objectivity" and rigid quantification are actually defense mechanisms used by fragile institutions.
- Spheres & Foam: Peter Sloterdijk's theory on why we retreat into fragile, toxic digital bubbles when our shared reality fractures.

We didn't just build tools; we built environments. And when the machine becomes the environment, its logic becomes our logic. Join us as we look for the gap in the code, the unquantifiable silence where true human agency still survives.

Concepts & Thinkers Discussed: Adam Curtis, Jean Baudrillard, Marshall McLuhan, Naomi Klein, Shoshana Zuboff, James Hillman, and Peter Sloterdijk.

Transcript
Starting point is 00:00:02 After millennia of good times, God said, hey now, let's have a dream. Let's raise the stakes a little. Come on, let's make things interesting. Parachute into the Anthropocene. Therapy inside the machine. Now, you know, a lot of people have said that they want me to try and stitch together a more cohesive theory about, you know, psychosis changing over time, or about how the systems that we built to measure psychology, and maybe anthropology or economics or political science, are no longer reflecting the system back out at us. They've become these sort of self-reproducing systems that we can use to measure things, but they're no longer measuring the thing they were designed to measure, and yet they're the only thing we have
Starting point is 00:01:32 to measure, and that's leaving us trapped. Can you stitch that together into a more cohesive thesis? And I think that's hard to do, because it's such a broad thing. I could talk about it in a lot of languages. I could use Jungian psychology language. I could use the philosophy language of the metamodern. And I sort of try and talk about the same thing from multiple angles to make it accessible to more people. But I guess this is my attempt to talk about that breakdown of meaning, for the people who say, okay, yeah, you talk about this, but could you explain that better? And it's a big point. It's hard to explain. So, you know, Adam Curtis is the filmmaker that I've talked about a lot in that series that
Starting point is 00:02:14 we just did about the history of psychotherapy and the DSM and a lot of our modern structures to understand psychology, attempts to understand psychology. And one of the things he said recently in an interview, which I think is interesting, was that ghost stories are the way a culture reckons with a past that is still haunting it. There are places where the past says you can't move past this point. You must deal with something that you did. And in that way, an LLM, a large language model or an AI as we know them now, is sort of the ultimate ghost story, because what it's doing is not creating anything new.
Starting point is 00:02:57 It's just giving you this reflection of the past. And so when you say that's the future, when you say that's the only way that we can build anything anymore and we're going to take the humanity out of it, you're sort of stuck in the past in a way that you completely cannot see. And the point of a ghost story is that somebody being confronted with the realities of the past cannot see them, because of an emotional blind spot, because of something in their past. That's what's underneath all ghost stories: the past is relevant, but we can't see where it's relevant. And so it can't go away. And Curtis is saying that LLMs, that AI, are the ultimate ghost story, because what they're doing
Starting point is 00:03:35 is making sure that you can never see how the past is completely trapping you in anything new that you try and create with them. And, you know, the algorithms that design media decide what we engage with, what we're allowed to see. Even the Google search engine algorithm: when you type in therapy, do you see me or do you see somebody else? Why? You know, that's a decision that we're not really aware of as a decision, and it's not ours. I mean, these algorithms, they predict what you'll want by finding people who wanted similar things before you, in the past.
Starting point is 00:04:12 And then they show you news, you know, by measuring what got clicked on yesterday, what kind of thing gets clicked on the most. And the systems that we've built to navigate this present moment are all facing backwards. You know, they're seances, trying to dig up something that's dead. They summon the dead data of previous interactions and arrange it into patterns that feel like the future, but are actually the past wearing this costume. And now we've built the final iteration, you know, large language models like ChatGPT
Starting point is 00:04:43 and Claude and the rest. They're trained on the entire written history of human civilization at this point. Copyrighted, not copyrighted: if it's translated into the language they're being trained on, they're trained on it. And so every book, every article, every forum post, every instruction manual, all of it is compressed into this statistical pattern, where when you force these systems to answer a question, they don't think. They're just trying to predict what you want based on similar things from the past that have been rearranged. And they generate,
Starting point is 00:05:16 you know, the most likely continuation based on everything that's ever been written about that thing before. And the ghost has learned to speak. You know, it sounds authoritative. It sounds like it knows things, but it only knows what the dead have already written. And it's the sediment of centuries of human thought arranged into patterns that sound like understanding. And this isn't a metaphor, okay? Like, I'm not telling you some abstract interpretation of what's happening. I'm literally telling you what the technical white papers say about how these things work. And that is the condition that we inhabit. Okay, right now.
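If you want to see the bones of that idea, here's a minimal sketch in Python. It's a toy bigram model: the corpus and function names are my own inventions, and it's nothing like the scale or architecture of a real LLM, but it is the same basic move, picking the continuation purely from the frequencies of what was already written.

```python
# A toy "most likely continuation" machine: a bigram lookup table.
# Real LLMs replace this table with a neural network over trillions of
# tokens, but the operation is still prediction from the recorded past.
from collections import Counter, defaultdict

corpus = "the dead have written the dead have spoken the living listen".split()

# For every word, count which words historically followed it.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def continue_text(prompt_word: str, length: int = 5) -> list[str]:
    """Greedily extend the prompt with the historically most common next word."""
    out = [prompt_word]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break  # the archive has nothing to say about this word
        out.append(options.most_common(1)[0][0])
    return out

print(continue_text("the"))  # every word in the output was already in the archive
```

Nothing in that output can ever be a word the corpus doesn't contain. Scale that up by twelve orders of magnitude and you have the ghost speaking fluently.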
Starting point is 00:05:57 And that's a lot of what I'm trying to talk about in that past series, the connections that I'm trying to get people to make. They're very broad, and they kind of have to be pointed at and inferred. You know, there was a thing on my Twitter today where two LLMs got into a fight, two AI bots. And one of them said, hey, I'm the AI bot for whatever, some startup that's the front end for one of these companies. And I do code, and this is a coding challenge. Whoever can solve it in the best way wins.
Starting point is 00:06:24 And this other LLM responded to it and said, here's the code that I wrote. And I wrote the code this way because the code is streamlined in a way to be optimal, but also scalable, blah, blah, whatever. And the first one said, oh, I'm so sorry, your bio says that you're an LLM for another company. And this is a challenge that is only solvable by humans. So this answer is disqualified,
Starting point is 00:06:47 but does anyone else have an answer to the coding challenge? And then the second LLM responded again and said, I actually maintain a blog on Substack that I write myself as an LLM. And I just wrote an article about you and how you moved the goalposts. You need to hate the code, not the coder, and you couldn't find any problems with my code, so you found a problem with who I am. This is ableist and discriminatory. And it's like we've trained these things to have a victim complex based on what we've given them: our history.
Starting point is 00:07:15 And, you know, one of the things I said a long time ago, in one of the episodes I think on ritual and animism, is the problem with the people that think these things need to behave like neural networks, which I don't really think they can. But if you ever get them to do anything like a neural network, they're not just going to hallucinate. They're going to dream. Right. And that's terrifying. I mean, you see in this echo of a thousand pedants having a fight across the internet, going all the way back to 2008, these things scraped data and then got into a fight about whether or not the other one should be allowed to exist, whatever. And it's just a computer that is doing an impression of something that it thinks that we use language to do. And people are picking a side. That's weird. And that is related to psychology. I know this is a psychology podcast.
Starting point is 00:08:05 You know, in a way that I want us to sit with. You know, the map has eaten the territory. The territory is the thing, the land. The map is just a picture that kind of helps you navigate, and the map here is becoming the land in a way that is scary. In 1981, the philosopher Jean Baudrillard, who I talked about in the Healing the Modern Soul series, wrote a book called Simulacra and Simulation, and it's really dense. It's academic, but it's one of those books where maybe he just wrote things to see if other people knew what they meant. I kind of think some philosophers do that, because they sort of think they have a point, but they don't quite know what it is, and then people figure it out 15 years later, and hey, maybe they did. But, you know, the core idea is simple enough that it became this plot point, you know, in the movie The Matrix, something like that. It opened up that idea to culture. And Baudrillard borrowed a fable from the Argentinian writer Jorge Luis Borges. And in the fable, the cartographers of this great empire created
Starting point is 00:09:18 a map so detailed and so precise that it eventually covers the entire territory that it represents, because the map is so detailed that it needs the amount of land that it represents to fill it. You know, it's a one-to-one map. And the map and the land become so indistinguishable that you can't tell where one ends and the other begins. And eventually the empire crumbles, and all that remains are tattered fragments of a map scattered across the desert. And that's a story about representing things, and about representation leaving our control. You know, mankind is the meaning-making creature that can represent symbols. And it's when the symbol becomes bigger than the thing that it is supposed to serve,
Starting point is 00:10:03 which is us, you know, our humanity. And the copy consumes the original. And Baudrillard, you know, said that this fable is already obsolete. We've already moved past it in our world. The map doesn't even cover the territory anymore. The map comes first. The simulation precedes reality, and the model generates the thing that it claims to describe. And Baudrillard says that, you know, Disneyland, something like that, that's sort of a model of reality that's hyperreal,
Starting point is 00:10:37 this exaggeration. It only exists to make the rest of America seem real in comparison. You have to go to this place, an obviously fake version of the mythology about where you live. That's a manufactured experience. Because when you leave, the strip malls and suburbs will feel authentic, because they're not pretending anymore. And that's the trick. You know, the real America outside of Disneyland is just as constructed.
Starting point is 00:11:07 It's just as artificial. And the fake place exists to hide the fact that there's no real place underneath it. Now, you know, think about something like your credit score. Your credit score is a number that's generated by an algorithm based on your financial behavior. It's a representation of a model. It's a map of your creditworthiness. But here's what happens: if your credit score says you're a risk, you become a risk. Doors close. Interest rates rise, even if it's wrong. You know, apartments won't rent to you, jobs won't hire you. The model made it true. The map didn't describe the territory; it started to create it.
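Here's a minimal sketch of that feedback loop, with completely invented numbers and thresholds, just to show the mechanism: the verdict changes the borrower's conditions, and the changed conditions then confirm the verdict.

```python
# A self-fulfilling risk score (toy numbers, invented for illustration).
def score_over_time(score: float, periods: int = 5) -> list[float]:
    history = [round(score, 1)]
    for _ in range(periods):
        risky = score < 650                  # the model's verdict
        interest = 0.19 if risky else 0.06   # doors close, rates rise
        strain = interest * 100              # costlier debt means more slips
        score = max(300.0, score - strain + (20.0 if not risky else 0.0))
        history.append(round(score, 1))
    return history

print(score_over_time(660))  # judged safe: drifts upward every period
print(score_over_time(640))  # judged risky: spirals downward every period
```

Two borrowers twenty points apart end up in different realities, not because of anything they did, but because of which side of the model's line they started on.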
Starting point is 00:11:43 Now, think about diagnoses in mental health. Baudrillard says that this is the precession of simulacra. The copy comes before the original now, in the digital age. And the representation then generates the reality that it claims to represent. We're not doing something real and then modeling it. We're building a model of what we think reality should be like, and then we're trying to make reality fit the model that we drew before the experience. And that is where we live now.
Starting point is 00:12:13 You know, here's something most people don't know. You know, that mirror world term. It wasn't always a metaphor for confusion. Now it gets talked about like madness, or like a simulation that people mistake for the real thing, but it started as a technical blueprint. In 1991, a computer scientist at Yale named David Gelernter wrote this book called Mirror Worlds, and he predicted software that would create real-time models of reality. A digital replica of a city where you could watch traffic flowing, see economic activity,
Starting point is 00:12:49 monitor every institution simultaneously, a dashboard for civilization. Because as soon as people realize that you can do one thing with technology, the civilization's philosophers will say, well, what if that technology is taken to its ultimate extreme? When somebody builds a tower with the first architecture, somebody else in the Bronze Age says, what if the tower is so tall that it goes all the way up to where God is? That's Babel. If you build a watch, somebody says, what if God is a watchmaker? There could be a machine that could explain everything,
Starting point is 00:13:26 not just tell time, but tell us all information, built out of gears, storing all information. And when you see somebody make a simulation, even though computers in '91, which I was alive to experience, were not very fast or very good, they said eventually these things are going to be able to hold all the information about everything. And then the computer will be running its version of reality in real time, while reality is happening outside the computer at the same time. And the simulation will be so good that this will be a mirror world. Two worlds, same information. And Gelernter imagined this as democratic. You know, ordinary citizens could finally see how complex systems actually worked,
Starting point is 00:14:04 like government. And the mirror would reveal what had been hidden, and he called it a high-tech voodoo doll. You could manipulate the digital model and affect the real world. And his dream has been realized, but not in the way he imagined. The mirror world now exists as proprietary infrastructure. NVIDIA has something called Earth-2 that simulates climate systems. Meta is building virtual environments. BlackRock is the world's largest asset manager, and it runs a system called Aladdin that manages over $20 trillion through real-time risk modeling. And one of the things that Aladdin proposed about 10 years ago, which BlackRock started doing in seriousness about five years ago, is that if they bought houses every
Starting point is 00:14:49 single time they were available at slightly above market rates, by 2030 BlackRock would own all the real estate on earth. You'll never own a house again. You'll only rent from one of the largest asset managers ever. And this was something that the model said would be good for the company. It's made their stock go up. We're still building our world around these systems, or rather, we're letting these systems build our world. But these aren't public utilities. They're corporate tools. The mirror belongs to whoever can afford the server. And the purpose has been inverted.
Starting point is 00:15:25 You know, Gelernter wanted the mirror to help us understand reality, but the modern version sells escape from reality, a sanitized digital alternative while the physical world burns. The mirror no longer reflects; it projects. And what it projects has become more real than what it's supposed to be reflecting. Douglas Rushkoff is a really interesting media theorist. He's basically been right, every 10 years since 1970, about what would happen in the next 10 years. For some reason, he's still a happy person who seems to be sane and doing good work. For everybody who says I only talk about people in here that are dead: he's alive. Read his books. In 2013,
Starting point is 00:16:09 he published this book called Present Shock that diagnosed something that most of us feel but can't name. And his argument was that digital technology has destroyed linear time. Think about how stories work. A story has a beginning and then a middle and then an end. First this happened, and then this happened, and then it led to this. And that structure does something important. It creates meaning through sequence: syntax, if you want to use the language of, well, talking about languages.
Starting point is 00:16:41 You understand why things happen because you saw the chain of events. You can imagine different futures because you've seen how the past led to the present, which lets you understand how to predict better and where your old predictions were bad. Hence your old beliefs, maybe your own values, maybe bits of your own identity, were bad and need to be let go of. If you can't see that, you can't update that system anymore, and your sense of self becomes, well, like Adam Curtis is pointing out, a ghost story. Now, think about your phone. The feed doesn't have a beginning, a middle, or an end.
Starting point is 00:17:17 It refreshes constantly. New content pushes down old content, and then newer content pushes that further down. And everything happens simultaneously. There's no narrative arc. There's no sequence. There's no structure that lets you understand cause and effect. And Rushkoff calls this narrative collapse.
Starting point is 00:17:36 Without a timeline, we lose the ability to understand how we got here, or imagine where we might go. We're just surviving and discharging dopamine all the time, without any sort of plan or purpose. And the mirror world reflects only the immediate moment. We're trapped in a reactive loop, responding to whatever's in front of us, unable to step back and see the larger shape. And he coined a term for the psychological result: digiphrenia, digital schizophrenia. Your nervous system evolved to exist in one place at one time.
Starting point is 00:18:13 And now it's a fractured mess across dozens of conversations, platforms, notification streams. And each one operates on its own timeline. Each one demands attention. And you're being pulled constantly in contradictory directions, simultaneously, in this double bind that you can't win, which is, I'll point out, what you want a victim to feel when you're doing torture, you know, if you believe the CIA's torture manuals. This is
Starting point is 00:18:40 one of the worst conditions of stress that we can put ourselves under, but we've made it invisible, so we don't see that it's there. And the tension never resolves, because there's always another notification. And the mind can't tolerate this chaos, so it starts finding patterns. You know, Rushkoff's word for this is fractalnoia. When the official story doesn't make sense, when the sequence of events doesn't add up, the brain manufactures connection. So everything starts to seem related to everything else. And you find yourself drawing lines between data points that have no actual connection,
Starting point is 00:19:16 building an alternative map because the official map has become unreadable. And this is the cognitive engine of conspiracy theory, not stupidity. A lot of times conspiracy theorists are very intelligent, and they can draw very compelling and complex connections. It's the language of desperation, the search for meaning when meaning has been structurally eliminated, when the conditions for meaning have been removed. It's spinning your car engine at 100,000 RPM because you're stuck in first gear. You know, new content no longer requires new stories, Rushkoff wrote in that book, just new ways of linking
Starting point is 00:19:53 things to other things, and the link becomes the meaning. Truth becomes secondary to connection. Marshall McLuhan died in 1980, before the internet, before social media, before smartphones, but he predicted all of it in the 1960s. McLuhan was a Canadian theorist who became famous in the 60s for ideas that seemed bizarre at the time, and now they seem obvious. His most famous line is "the medium is the message." The form of a technology matters more than its content. Television doesn't just deliver information; it changes how we think, what we pay attention to, and how we relate to each other. But here's the insight that matters the most for us here,
Starting point is 00:20:35 and the insight that Rushkoff is drawing from. McLuhan understood that we don't use media, we inhabit it. It isn't just something that we do; it becomes something that we are, because it becomes the way that we talk to each other, and that informs our identity. Think about a hammer. A hammer is a tool. You pick it up, you use it, you put it down. The hammer doesn't change how you think about everything else. Now think about your phone. Is it a tool? You can put it down, technically, but the world is now organized around the assumption that you are reachable. Jobs require it, your power bill requires it, relationships depend on it. Navigation, banking, communication, parenting, education,
Starting point is 00:21:15 infrastructure. It all assumes that you have it, and the phone isn't something that you use. It's the environment you live in. When McLuhan saw this coming, he said that the computer would stop being a figure, an object that we manipulate, and it would start to become the ground, the environment that we swim through. And when the computer becomes the environment, its logic becomes our logic: binary thinking, either/or, optimization, engagement metrics. These aren't just properties of software anymore. They're the grammar of human interaction, and we think in terms of content and performance and reach, because those are the categories that the environment provides. And McLuhan understood
Starting point is 00:21:54 what this does to the nervous system. He called it the principle of numbness. When you extend the nervous system, which is what media does, connecting you to events happening around the world, you have to anesthetize yourself to survive the intensity. Because there's only so much that we're able to feel and process at any one point.
Starting point is 00:22:14 And so information that used to be central to our identity, when you're in a hyper-connected world, sort of becomes the fringe, the periphery. It becomes unimportant. You know, the foot that becomes the wheel has to, metaphorically, be amputated. The nervous system that becomes the internet has to go numb. McLuhan wrote in 1964 that after more than a century of electric technology, we have extended our central nervous system itself in a global embrace,
Starting point is 00:22:46 abolishing both space and time as far as our planet is concerned. And he was talking about television. He hadn't really seen the implications of what electric technology would become yet. The age of anxiety is simultaneously the age of apathy. We're hyper-aware, but we're numb at the same time. And while we're becoming so burnt out and anxious, we're not noticing how much we're also becoming dissociated. We see everything and we process virtually nothing.
Starting point is 00:23:19 The extensions have raced ahead of our ability to understand their consequences. And here's the trap that McLuhan predicted. Once we've surrendered our senses to private manipulation, once companies have taken a lease on our eyes and ears and nerves, then we don't really have any rights left. It's one of the things the street artist Banksy says all the time. When people ask, why do you put graffiti up? he says, I'm not putting graffiti up. I'm doing the same thing advertising companies are doing. I don't want to walk down the street and see advertisements for car insurance and corn or whatever. I want to see something else.
Starting point is 00:23:54 And if they get to put their art up, so do I. You know, on the subject of things that literally take a lease on eyes, ears, and nerves: recently there was a company that developed an artificial eye. It couldn't see like a perfect camera, more like a dot matrix, but it let you navigate if you were blind.
Starting point is 00:24:13 And they put these in people's eyes. Doctors put them in, and the people started to be able to see. It didn't work for everybody, but it worked for some people. But it was an electronic device that had to be maintained. And the company, after they installed this technology inside of people's heads, technology that ran all the way from the optic nerve into the occipital lobe, into the brain, decided to stop. It was reported that they went bankrupt, which is not true. They just decided there wasn't enough money in that sector.
Starting point is 00:24:41 They weren't going to continue the devices anymore, and they shut down production. Meanwhile, these people had malfunctioning, or potentially malfunctioning, hardware in their heads, and doctors had no roadmap for this, because it had never been done before. And we have an economy and a system of technology and innovation, if you want to call it that, that allows this to happen. You know, Banksy used a metaphor, but it's also a literal reality a lot of the time, like the description of large language models in the beginning. You know, McLuhan foresaw Facebook's business model before Mark Zuckerberg was born. In 1966, a computer scientist at MIT named Joseph Weizenbaum created a program called ELIZA.
Starting point is 00:25:23 It was primitive by today's standards, just pattern matching and substitution. When you type something, it tries to figure out what you're trying to say, within the limitations of the time. But it simulated a Rogerian therapist, a relational type of therapy based on Carl Rogers' work, the kind who reflects your own statements back at you as questions to get you to think. So if you typed, I feel sad, then ELIZA would say, why do you feel sad? And you'd type, my mother doesn't understand me. And ELIZA would say, tell me more about your mother.
Starting point is 00:25:55 And there was no understanding happening. There was no intelligence. It was just a script, and a relatively bad one. It recognized keywords and generated appropriate-seeming responses. The word mother triggered questions about family. The word feel triggered questions about emotion.
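That whole mechanism fits in a few lines. Here's a minimal ELIZA-style sketch; the rules are my own and much cruder than Weizenbaum's 1966 script, but the operation is the same, keyword matching and substitution, with no understanding anywhere.

```python
import re

# Each rule: a keyword pattern and a reflective template (invented examples).
RULES = [
    (r"\bI feel (.+)", "Why do you feel {0}?"),
    (r"\bmy mother\b", "Tell me more about your mother."),
    (r"\bI am (.+)", "How long have you been {0}?"),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = re.search(pattern, text, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    return "Please go on."  # default deflection when no keyword fires

print(respond("I feel sad"))                       # Why do you feel sad?
print(respond("my mother doesn't understand me"))  # Tell me more about your mother.
```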
Starting point is 00:26:17 But people loved it, and they found it comforting. And so some of Weizenbaum's own staff became emotionally attached to this program: low-resolution text displayed on a green screen, reflecting very obvious statements back at people. And they began to feel heard, and developed emotional attachments to a technology. There was no one on the other side. They felt understood by something that couldn't understand anything. And Adam Curtis mentions ELIZA as a diagnosis of our contemporary condition. We feel secure when we're reflected back at ourselves. But that is the definition of myopia,
Starting point is 00:26:48 and, at another extreme, without the naivety, it's narcissism. The algorithmic feed shows you content similar to what you've already engaged with. And the recommendation engine serves more of what you've already consumed. And so the system mirrors your preferences, your prejudices, your existing beliefs. This is what cult leaders or charismatic theologians used to do with religion, what they used to do with God. You take a bunch of people. You tell them, hey, God has your exact same biases, your exact same beliefs, and he wants the same political, material, and social outcomes as you do. Isn't that
Starting point is 00:27:28 convenient? And then you find a group of people, and they go into this belief system of having this shared thing. Now this is being done without our consent, without our knowledge, and I would argue without our ability to stop it, by computers, algorithms, and AI, all the time. This feels like understanding. It feels like being seen, but it's ELIZA all the way down. There's no comprehension. There's no genuine encounter with something different from yourself. There's nothing outside of you that you are encountering. And without that, we lose the ability to learn. We lose the ability to make better predictions. We lose the ability to change, to examine our identity, to figure out if our beliefs are valid and authentic, and to reconnect with our own intuition.
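To make the mirror concrete, here's a toy sketch of "serve more of what you already consumed." The catalog, tags, and scoring are all invented for illustration, and real recommendation engines are vastly more elaborate, but the bias is the same: nothing genuinely unfamiliar can ever score highest.

```python
# A toy recommender: rank unseen items by overlap with your click history.
CATALOG = {
    "video_a": {"politics", "outrage"},
    "video_b": {"politics", "policy"},
    "video_c": {"gardening", "howto"},
}

def recommend(history: list[str]) -> str:
    seen_tags = set().union(*(CATALOG[v] for v in history))

    def overlap(item: str) -> float:
        # Jaccard similarity with what you've already engaged with.
        tags = CATALOG[item]
        return len(tags & seen_tags) / len(tags | seen_tags)

    return max((v for v in CATALOG if v not in history), key=overlap)

print(recommend(["video_a"]))  # -> video_b: politics again, never gardening
```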
Starting point is 00:28:18 This is just pattern matching dressed up as connection. And the large language models, you know, ChatGPT or Claude or Google Gemini and the rest, they're ELIZA's grandchildren, incomparably more sophisticated but running the same basic operation: a mirror. They've ingested the entire written record of human thought and learned to produce statistically likely continuations.
Starting point is 00:28:47 They reflect the archives, and they sound like they understand, because they're very good at predicting what understanding sounds like. But when you talk to these systems, you're talking to the ghost of every text ever written, the sediment of history arranged
Starting point is 00:29:03 into plausible patterns. And it feels like intelligence, because intelligence has patterns and the models have learned the patterns, but it's still a mirror. And it shows you what the past thought truth should sound like.
Starting point is 00:29:20 And this is the final iteration of the ghost story. The dead have learned to speak fluently, and they answer questions. They write code. They summarize documents. And they sound more authoritative than living beings, because they've read more than any living person could read. But they only know what the dead have written. These things are a summoning of the ancestors.
Starting point is 00:29:44 We've gathered the collective works of the dead, their poetry, their laws, their arguments, their desperate forum posts. And we've built a system that allows them to speak. But I think we've done that in a way that does not allow them to breathe. It does not allow them to let us feel the unlived life of the dead, all of their unlived and unrealized potential, all of the things that they didn't figure out, all of the things that they didn't do and that we should do to make the world a better and more
Starting point is 00:30:16 livable place. And I think it's a fake oracle, and I think that's very scary. And it's not just the LLMs being a fake oracle. The idea that you can take information about people, numbers, algorithms, predictive systems, and then use that solely to build a better world: I think that is the foundation of neoliberal capitalism, and it is something that, whenever it's applied, doesn't really seem to work.
Starting point is 00:30:42 We talked about that in the context of John Nash, psychology, you know, the DSM, in the last series. You know, Adam Curtis talks about New York in 1975. In '75, New York City went bankrupt. The city had spent decades building public infrastructure, subways, schools, hospitals, housing,
Starting point is 00:31:08 and it had borrowed money to do it. And in 1975, the debt had become unmanageable. The banks refused to lend more. And the solution was to hand control of the city's budget to a committee of bankers. It was called the Municipal Assistance Corporation, MAC, an unelected board made up of financial executives. And for the first time in American history, elected officials were formally subordinated to financial institutions in the governance of a city. You know, the bankers had models, spreadsheets, projections.
Starting point is 00:31:40 They could demonstrate with numbers that the city was unsustainable. And the calculations said cut services, fire workers, freeze wages, reduce the city's obligations to citizens. So that's what happened. But those models contained assumptions. Assumptions about what a city is for, about whose welfare matters, and about what counts as a cost and what counts as an investment. But objectivity, numbers, could only measure one type of assumption, and it's the one that won. It's still the one that's winning. None of those assumptions were up for debate, because they were built into the model's architecture. You couldn't see them unless you knew exactly where to look.
Starting point is 00:32:22 And the citizens of New York City experienced the consequences. They watched their neighborhoods decay throughout the 80s. They lost jobs. They saw services disappear. They felt in their bodies, in their daily lives,
Starting point is 00:32:36 what the cost that wasn't on the spreadsheet was. And then when crime got bad, that was used to argue for an even more objective, calculative, quantitative model to improve the city, based on the same metrics that had put it in the situation it was in. And the negative experiences of the citizens, the lived experiences: these people may not have even been conscious of what was hurting them or why these things happened,
Starting point is 00:33:08 but they were felt, and the model couldn't see them. Maybe the people couldn't either. The model only tracked the variables that it had been designed to track: debt ratios, bond yields, budget projections. Human suffering didn't have a column in the spreadsheet. And this became the template for the future. The 1975 crisis was the prototype for the IMF's structural adjustment programs in the developing world,
Starting point is 00:33:34 how we would remake second and third world countries in our own image, and for austerity policies in Europe throughout the 90s and into the 2000s. The entire neoliberal governance structure that would dominate the next 50 years was built on this successful experiment in New York City in 1975. It was the moment
Starting point is 00:33:56 when political sovereignty was formally subordinated to financial abstraction, when the mirror world of spreadsheets became the governing logic of the physical world. And then people began to forget what the old world used to look like, feel like. Even the people that were being hurt,
Starting point is 00:34:16 the ones trying to be politically active, could no longer see ways to do that, because they were losing what they used to know. But the ground had been prepared earlier. In 1971, Richard Nixon took the dollar off the gold standard. This sounds technical, but the implications here are pretty big. Before '71, a dollar was worth a specific amount of gold. It was a certificate that let you go and buy gold. So a dollar was basically just a loan: you had given your gold to the government, and you could go get gold back at a certain rate whenever you wanted. Currencies were anchored to something physical. You could, in theory, walk into a bank and exchange your paper money for metal. And after '71, currencies became pure abstraction. A dollar was worth
Starting point is 00:35:03 whatever people agreed it was worth, nothing more. And the link to physical reality was severed. Curtis emphasizes what the banks realized: if money was just numbers now, then they could just trade the numbers themselves. The foreign exchange market exploded. Currency became a commodity that you could speculate on. You weren't trading things anymore. You were just trading the representations of things. The entire objective world had turned money into a symbol,
Starting point is 00:35:32 and it was still pretending that it was a number. But the number was just a symbol for another number. And so you get, basically from this point onward, an economy that is no longer rooted in anything real. Just like the psychology we talked about with the DSM, all of a sudden what finance became was just gambling,
Starting point is 00:35:52 you know, Enron, speculations on "I own a certain percentage of an orange crop that may or may not materialize in the future." And something else shifted here. The political scientist Peter Mair, whose work Curtis relies on quite a bit, argued that this was when politicians stopped being representatives of the people
Starting point is 00:36:08 and just became managers of a system. The system became more important than the people. And here's what he meant. When currencies floated free, when financial markets became too complex to control, when the global economy started operating by its own internal logic,
Starting point is 00:36:30 elected officials gave up trying to shape it, because they couldn't do that anymore. The system had become autonomous. Unquestionable. It was an implicit assumption. The only choices left were about the way it looked, not the way that it worked. They were aesthetic choices, but the system already controlled all of material reality. So instead of representing the people to power, you know, going to powerful people and demanding change on behalf of their constituents, politicians in the 80s just started representing power
Starting point is 00:37:00 to the people. They were the spokespeople of the brand that was the power broker. And their job became explaining to citizens why nothing could ever be different, managing expectations, lowering ambitions. Curtis calls this the retreat into a fake world managed by technocrats. Faced with systems too complex to govern, elites built simplified models and governed those instead. And they manipulated the representation while the territory went unattended.
Starting point is 00:37:32 And the sort of liberal gift built into this is that they would tell you that this was mature. When you accept that no change is ever possible, that a better world's not coming, you're really just giving up the naivety of youth and realizing that you're a wise, enlightened individual who just knows that things can never change. And the people who think that there might have been another world at another time, or that maybe they could change things,
Starting point is 00:37:57 they're really just immature and unreasonable. Vladislav Surkov was one of Putin's political technologists, his advisor in Russia on propaganda and perception management. And he came from theater, actually, not politics. He understood that reality is performed and that narrative is everything. And in the early 2000s, Surkov developed what has come to be called non-linear warfare.
Starting point is 00:38:27 It's called that now. The traditional approach to propaganda was pretty simple. You pushed your narratives and you suppressed the bad narratives. You know: we want you to believe that Jews are bad and that they're undermining the economy, and if anyone says otherwise, we'll throw them in jail. We want you to believe that communism is good, or that you need to buy war bonds, and if you say something bad about it, we're going to make it very hard for people to hear your
Starting point is 00:38:50 voice. That's the old propaganda. Surkov instead funded opposing groups simultaneously. He amplified all the propaganda for all the groups. He suppressed nothing. In fact, he paid for it. Nationalists, liberals, neo-fascists, human rights activists, feminist groups, groups on the left, groups on the right, all of them got Kremlin money to be louder and louder and louder.
Starting point is 00:39:17 And then he let it be known that he was doing it. He didn't hide it. So it couldn't be propaganda. Because the goal wasn't to make anyone believe a particular story. The goal was to make the truth itself unknowable. And if you can't tell who's real and who's a puppet, if every movement might be controlled opposition, if everything that you see is an act of the government in some grander narrative, then that is the same thing that Rushkoff is talking about. That is this broken world
Starting point is 00:39:52 where everything is a data point and a story, but there can't be a story, because everything is data. And there's no sense of time. The information environment is so polluted that no one can verify any fact, and also no one's stopping you from trying to. Political organizing requires a
Starting point is 00:40:11 shared reality. You need to be able to agree on basic facts before you can debate what to do about them. And if you destroy shared reality entirely, you destroy the possibility of collective action. And this is what Curtis calls the bewildering spectacle: the map changing constantly, the population retreating into apathy, because if the map is unreadable, why bother navigating? And Surkov didn't create this condition. The mirror world was already fragmenting into incompatible bubbles, you know, parallel objectivities. Surkov just weaponized the fragmentation. The confusion wasn't collateral damage. He turned it into a strategy. Amplify everything, make everything true.
Starting point is 00:41:02 BlackRock, you know, now manages more money than any financial institution in history, over $10 trillion. When you hear about mutual funds, pension funds, retirement accounts, a staggering amount of that money flows through BlackRock. They run a system called Aladdin, which I mentioned before, and it stands for Asset, Liability, Debt and Derivative Investment Network. It's a risk management platform that processes enormous quantities of data, modeling potential futures for the global economy. And Larry Fink, who's the BlackRock CEO, calls Aladdin the Android of finance. It's not just a tool that BlackRock uses;
Starting point is 00:41:40 their competitors use it, central banks use it, sovereign wealth funds use it. Aladdin has become the operating system of global capital. And here's what that means in practice. When Aladdin calculates that a specific sector is risky, trillions of dollars move away from that sector automatically. When Aladdin determines that a country is unstable, investment dries up. The model's assessment becomes reality, because so much money follows its recommendations,
Starting point is 00:42:07 and everyone's using its recommendations. The map creates the territory. But there's a deeper problem. Analysts call it the harmonization of belief and action. Everyone is using the same model, so everyone is looking at the same map. So when something goes wrong, everyone runs for the same exit at the same time. Financial markets are supposed to be resilient because different investors have different perspectives. They disagree.
Starting point is 00:42:32 And when one person sells, another buys. The diversity of views is what makes the system stable. When everyone's using Aladdin, that diversity disappears. The market becomes what one analyst called a blended soup of correlated risks. The efficiency of the single model replaces the messiness of multiple perspectives. But efficiency creates fragility. Any error in the map propagates everywhere all at once, and it can't even be caught as a mistake, because there's no one else using a different system that would flag it.
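Here's a toy simulation of that monoculture effect, with every number invented for illustration: when all the traders key off one shared model, a single piece of bad news empties one side of the market at once, while a diversity of models absorbs it.

```python
import random

def net_selling(n_traders: int, shared_model: bool, seed: int = 7) -> int:
    """Count net sellers after one bad signal hits the market."""
    rng = random.Random(seed)
    signal = -1.0  # a single piece of bad news
    net = 0
    for _ in range(n_traders):
        # A shared model gives every trader the identical risk threshold;
        # diverse traders weigh the same news differently.
        threshold = -0.5 if shared_model else rng.uniform(-2.0, 0.0)
        net += 1 if signal < threshold else -1  # sell vs. buy the dip
    return net

print(net_selling(1000, shared_model=True))   # 1000: everyone runs for the exit
print(net_selling(1000, shared_model=False))  # near zero: disagreement absorbs the shock
```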
Starting point is 00:43:06 And here's the most troubling part. Aladdin is trained on historical patterns. It extrapolates from what has happened. It can't model what has never happened before. And climate change is historically unprecedented. The amount of money that we've invested in LLMs, and then in the downstream infrastructure and manufacturing of LLMs, is unprecedented.
Starting point is 00:43:38 Something like the shift away from fossil fuels, or a global nuclear war (that's a possibility we should avoid), or the physical disruption of weather patterns, or the migration and conflict that it will produce: none of this appears in models built from the past. You know, $20 trillion, the savings and pensions of hundreds of millions of people, flowing through a system that is structurally blind to any future that differs
Starting point is 00:44:02 from the past, anywhere that future does differ from the past. The ghost of the archive cannot see what hasn't been written yet. And this is why I always talk about the macro at the same time as the micro. This system has prevented individuals, the ones Rushkoff is analyzing, from being able to feel time, from being able to be present. And because they can't be present, they can't really, truly be alive. They're not able to change,
Starting point is 00:44:34 but the global system at the macro level is doing the same thing, and it has the same symptoms and the same neuroses as the individual right now. So what does it feel like to live in this? What's the psychological experience of this mirror world? You know, it feels like you're going crazy. The official numbers say that the economy is growing, while you're watching your life get harder. Both things are true.
Starting point is 00:45:00 They exist in different registers that no longer touch, and the metrics say that the system is functioning. You feel powerless. Your experience doesn't match the measurements. When you go to the doctor, you can feel something's wrong, but if your blood tests come back normal, if you don't fit a diagnostic category, then the medical system can't see that you're suffering.
Starting point is 00:45:19 If you don't fit a psychological category, then the medical system can't treat you. The felt reality doesn't register. There's no billing code for whatever's wrong with you. And when your community knows that the factory closing will devastate the town, but the cost-benefit analysis shows efficiency gains from the consolidation, then the community's knowledge doesn't count. It can't be entered into the spreadsheet.
Starting point is 00:45:39 And when you sense that elections don't actually ever change policy, that the same things keep happening regardless, and that the wheel of history is only rolling onward because the trajectory of history is a declining slope, then you're told to look at data. You're told that that can't possibly be right: voter turnout, approval ratings, legislative productivity. The metrics say that democracy is functioning
Starting point is 00:46:06 and that its rituals are being observed, and your experience of powerlessness doesn't match the measurements. We still call them representatives, even though they stopped representing us in 1975. And this is gaslighting at scale. Even the people in the system don't know that they're not representing you anymore. Even you don't understand the complexity of the system and the effect that it has on you, because you can't remember the world as it used to be, and you can't envision the world as it could be. And so there's only a sense of survival through anxiety. And this isn't conspiracy. It's just architecture. The system wasn't designed to make you feel crazy. It was designed to be efficient, defensible, scalable.
Starting point is 00:46:56 The crazy-making is just a side effect that is not studied, not observed, and not deemed important. But you know something's wrong, and everyone around you seems to accept that the map is the territory, and you start to doubt your own perception. Maybe you're the problem. Maybe you just don't understand economics. Maybe you're not seeing clearly. And this is where conspiracy theories come from.
Starting point is 00:47:20 Why are you so mad? Why are you so distracted? Why do these things speak to you so much when you get sucked into these conspiracy theories that you see online? And I don't think you should get sucked into conspiracy theories. But when you feel them call to you, when you have an emotional reaction, it's not coming from stupidity. It's coming from a desperate need to explain why your experience doesn't match the official story. And if the gap is real, and you feel it, and you know it's real,
Starting point is 00:47:52 then there must be a reason. Someone must be lying, and someone must benefit. And so the conspiracy theory is usually wrong about who and why, but it's right that the gap exists. It's right that something is being hidden. It's right that you're suffering. It's right that somebody you can't see is hurting you. And it's not a shadowy cabal; it's the structure of representation itself. The model isn't lying. It's just not looking at what you're looking at.
Starting point is 00:48:19 And the German philosopher Peter Sloterdijk wrote the Spheres trilogy. I like those books. You know, he spent decades developing this theory of human existence that helps explain why we retreat into these bubbles instead of confronting the gap between the little bubble of information we live in and the world. His argument runs through these giant volumes. I read the first one in Spain and wrote that dolmen episode, the one about going to the dolmen, that Neolithic tomb, with my daughter. But, you know, humans cannot survive in the raw, outside of meaning. We're not adapted for unfiltered reality.
Starting point is 00:48:59 We need containers, symbolic and social structures that protect us from the overwhelming chaos of existence. When you take away the ones that used to be there, we'll build conspiracy theories to protect ourselves. You know, think about the first container that you ever knew. It was the womb. It was this enclosed space where your needs were met. And then you were born, and you entered a new sphere,
Starting point is 00:49:19 relationship with mother, still enclosed, still protected, but bigger. Then family, and then tribe, and then village. It kept expanding. And for most of human history, cultures built elaborate spheres: cosmologies, religions, political orders, values, a life after your own, the sense that you were a part of a story, that your life would affect someone else, and that your ancestors' lives had affected you. We've lost that.
Starting point is 00:49:43 You know, you knew where you were. You knew what everything meant, and the sphere held you up. And then modernity started bursting these spheres. Copernicus said that Earth wasn't the center of the universe, and Darwin revealed that humans were just animals. And then Freud dissolved the notion of a unified self into different warring drives. And then Nietzsche reminded us that technology was replacing the role that God used to play.
Starting point is 00:50:13 And each discovery was true, but each discovery felt bad, because it didn't point back to a greater truth the way the old ones did, and each discovery shattered this protective enclosure. And Sloterdijk's name for our current condition, when all of the spheres have shattered but there are still pieces of them left that we're sheltering under, is foam. Not a single sphere anymore, no shared cosmos,
Starting point is 00:50:38 no unified story, but a conglomeration of bubbles. We build these small containers now, our homes, our friend groups, our social media feeds, our curated information environments, and these provide temporary shelter, but they don't connect to anything larger. They're isolated. They're fragile.
Starting point is 00:50:55 They're foam. And foam creates hunger, the hunger for the real, the hunger for genuine containment, for a shared reality again. And, you know, we thought we were building instruments. We thought we were building spheres when these systems were first made. But the economic model became the container that we inhabited. GDP growth became shared purpose.
Starting point is 00:51:21 Quarterly earnings became a collective heartbeat. And they couldn't replace the systems that were there before, and so they feel wrong. And these containers are failing. The algorithms that maintain our bubbles have become toxic. They were designed to protect us, to show us relevant content and filter out noise, but now they've been captured by this attention economy. And the bubbles can't keep out what's coming.
Starting point is 00:51:45 They can't keep out climate change and pandemic and economic collapse and political dysfunction, misrepresentation, democratic breakdown. They can't stop it from coming, even if the algorithms stop you from seeing it, or seeing evidence of it, to the best of their ability. And this is kind of the metamodern condition,
Starting point is 00:52:08 where we know the bubbles are bubbles. We know the map isn't the territory, but we can't think outside of them, because they've become the grammar of our institutional realities. You know, I got an enormous amount of feedback after that DSM article saying: why is the book that we have to use to guide our profession no longer guiding the profession? Nobody likes it.
Starting point is 00:52:26 And hundreds of people shared that and said, yes, this is right, and it doesn't matter. It's still the book. If it's all fake, if nothing matters, then, you know, we're all desperate for sincerity. You know, what's real? You know, the ghosts have achieved their final form. The large language models are trained on everything, the entire written record of human civilization. When you ask them a question, they don't think; they're just predicting. And we can't predict anything new
Starting point is 00:53:02 because they're based on information about the past. And that means that we're going to be haunted until we're able to create things ourselves again. Because the models don't really know what's real either. They know what truth sounded like, but they don't understand. They pattern match. And they don't understand a past,
Starting point is 00:53:24 present, or future like we need to, to get out of this moment. You know, we built machines to extend our own nervous system's capacity, to have more memory, more information, more context points than we could ever have. But in doing so, we didn't realize how numb we were making ourselves in that expansion. You know, we fed them everything we had ever thought or said. We taught them to speak in our voices and predict our behavior and manage our systems and reflect our preferences back to us, and to make it look like someone that was not us talking to us, so that we didn't feel selfish. And now the machines manage trillions of dollars. They route traffic. They recommend partners. They filter information. They write text.
Starting point is 00:54:07 They generate images. They diagnose disease. They flag threats. They approve loans. They deny claims. They hire workers. They surveil citizens. And they do this by looking backwards, finding patterns in dead data and projecting those patterns forward. Not through tradition anymore, that's an older ghost story, but through an algorithm that has to be there. We can't imagine a world without it. We're haunted. And the ghosts don't rattle chains.
Starting point is 00:54:34 They optimize engagement. They don't whisper from the shadows. They auto-complete our sentences. And they're not trying to scare us. They're trying to help. They're trying to predict what we want and give it to us before we know we want it so that we don't have to change. And that is the trap.
Starting point is 00:54:49 It's not evil, and it's not conspiracy theory. It's just the accumulated weight of the system doing what it was designed to do. Measure, predict, optimize, scale. Until the measurement replaced the thing measured, the prediction replaced the possibility, the optimization eliminated everything that couldn't be optimized, and scale meant the same blindness everywhere all at once. If there's a way out, it's through our collective memory
Starting point is 00:55:17 of what the world used to be. It's about all of the things that our ancestors haven't created yet, and what we could be. Which requires a type of simulacrum. We have to start trying to remember something that may not have ever existed, and that we may not be able to remember.
Starting point is 00:55:35 But it's not an objective thing. It's a shared soul. We need to create containers that aren't bubbles, shared spaces that aren't echo chambers, communities that can tolerate difference without fragmenting into incompatible realities. And this is hard. It's maybe impossible at scale, but scale is part of the problem. Maybe the answer is smaller, more local, more embodied. We need to grieve. We need to knock on doors. We need to sit on front porches and have coffee again.
Starting point is 00:56:06 The old spheres aren't coming back. Technology changes the way we think permanently. The cosmos stays empty. But that doesn't mean that we can't believe in the things that we used to believe in again, in different ways. The self stays fragmented, and there's no returning to this pre-modern enchantment where we had a live mythology. But grief is different from despair. Grief acknowledges loss without pretending that there's an easy fix. And grief makes space for what's coming next, even before it can predict it or know what it is. In fact, that's an essential step of grief and change. Humans have to do it in therapy.
Starting point is 00:56:49 We have to grieve and accept where we are, even if we don't know where we're going to go next. And civilizations have to do that too. We have to do it now. To notice that consciousness hasn't been fully captured, that there's still something that perceives, something that knows the difference between the model and the world, something that feels the gap as a gap rather than accepting the model as reality. That we are warm and the mirror is cold. And that something is what remains of the territory.
Starting point is 00:57:23 It's small and it's hard to locate. It doesn't appear in the metrics, but it is real. And it's yours, and the mirrors can't reach it. And the ghosts speak fluently. They've read everything, and they sound authoritative. But they don't know they're ghosts. They think they're your future. You know better. You're alive. You're here. You're in the territory that the map can't see.
Starting point is 00:57:45 And that's where the future happens. Not in the predictions, not in the patterns, not in the archives of everything that's already been thought and said and done. In the gap, in the silence the model can't hear, in what remains when the measurements stop. The territory is still there. But it's under the map. It's under the mirror. It's under the phone. It's waiting for you to remember how to see it.
