Making Sense with Sam Harris - #146 — Digital Capitalism

Episode Date: January 16, 2019

Sam Harris speaks with Douglas Rushkoff about the state of the digital economy. If the Making Sense podcast logo in your player is BLACK, you can SUBSCRIBE to gain access to all full-length episodes at samharris.org/subscribe.

Transcript
Starting point is 00:00:00 To access full episodes of the Making Sense podcast, you'll need to subscribe at SamHarris.org. There you'll find our private RSS feed to add to your favorite podcatcher, along with other subscriber-only content. We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers. So if you enjoy what we're doing here, please consider becoming a subscriber. Today I am speaking with Douglas Rushkoff. Douglas has been named one of the world's 10 most influential intellectuals by MIT. He is an award-winning author, broadcaster, and documentarian who studies human autonomy in the digital age. He's often described as a media theorist. He's the host of the popular Team Human podcast, and he's written 20 books, including the bestsellers Present Shock and Program or Be Programmed. He's written regular columns for Medium, CNN, The Daily Beast,
Starting point is 00:01:21 and The Guardian, and he's made documentaries for PBS's show Frontline. And today we discuss his work and his most recent book, which is also titled Team Human. Anyway, it was great to talk to Douglas. We get into many of these issues, and he is certainly someone who has spent a lot of time thinking about where we're all headed online. So, now, without further delay, I bring you Douglas Rushkoff. I am here with Douglas Rushkoff. Douglas, thanks for coming on the podcast. Hey, thanks so much for having me. So you have a very interesting job description and background. How do you describe what you do and your intellectual journey? And I guess what brought me here was that originally I was a theater director, and I got fed up with that and was drawn to, you know, sort of the pre-open source
Starting point is 00:02:47 participatory narrative that we could rewrite the human story and print our own money and make our own meaning. And, you know, I started to write books about that. And I wrote a book called Cyberia about designer reality and one called Media Virus, which was celebrating this new stuff called viral media, which seemed like a good thing at the time. And then I watched as the internet really became kind of the poster child for a failing NASDAQ stock exchange. And all of these companies from Google to Facebook that said they would never be about advertising became the biggest advertising companies in the world. And these tools, which I thought were going to be the new gateway or gateway drug in some ways to a new kind of collective human imagination, ended up being
Starting point is 00:03:38 the opposite. So I've not really been writing books struggling against technology so much as asking people to retrieve the human and bring it forward and embed it in the digital infrastructure, rather than just surrendering all of this power and all of these algorithms to agendas that don't really have our best interests at heart. You're often described as a media theorist. Is that a label you happily wear, or does that kind of miss most of what you're up to? I mean, I happily wear it when people understand media theorist in the way I do. But to most people, I feel like the term media theorist sounds like some kind of a boring PBS show or something that's good for you. But when I think of someone like Walter Ong
Starting point is 00:04:30 or Marshall McLuhan or Lewis Mumford, then yeah, because I don't mind being a media theorist, because almost everything is media. It's almost hard to figure out something that's not media. So someone who thinks about it, sure, but I guess over time, I've become a bit more of a social activist or an economic thinker. It's kind of hard to just say I'm thinking about the content on television. I'm thinking more about the platforms and the political economy that's driving this media. Was McLuhan influential for you? You know, I guess I should be embarrassed to say.
Starting point is 00:05:08 I mean, I didn't really read McLuhan. He's famously unreadable. Yeah, maybe that's why. But I didn't read him until after people said, oh, your work is like McLuhan's. So I was, you know, three books in, really. It was after Media Virus. People started to say, this is what McLuhan was saying. And so then I went back and read him afterwards.
Starting point is 00:05:31 And yeah, he was crazy smart. But it's a bit like reading Torah or something, where everything he says, I could say, oh, it means this, or it means that. So while it's a terrific intellectual exercise, it becomes a bit like James Joyce, where you can almost argue about it more than make sense of it sometimes. I mean, honestly, part of why I'm excited to be talking with you is because there are certain ideas that I'm really unresolved about, and sort of certain understandings of the human story, if you will, that I'm still really challenged by. And in writing this book, I feel like on the one hand, I'm maybe accidentally or unintentionally telling a new myth, you know,
Starting point is 00:06:28 or, you know, that, oh, I'm sort of arguing in this book that humanity is a team sport. And that, you know, if you look at evolution or even read Darwin, there are just as many examples of cooperation and collaboration leading to, you know, species success as there are of competition. And that if we want to understand human beings as the most advanced species, we should think about the reasons for that: our language and collaboration and increasing brain size. So the Dunbar number got up to over 100 people with whom we could collaborate and coordinate. And then, of course, I argue that all the institutions and media and technologies that we came up with to enhance that collaboration, they tend to be used
Starting point is 00:07:11 against that. So instead of bringing people together, social media atomizes people into their separate silos. Or you can even go back and see how text abstracted people from the sort of tribal oral culture. And then you could even argue that language before that disconnected people from some essential grunts or something. But that becomes an almost Eden-like myth that I don't want to fall into, to say, oh, don't listen to the libertarian story, listen to this story instead. But then we're stuck in another story. And so what I'm really aching for, what I'm looking to do, is to give people reasons to celebrate humanity for its own sake and human values, and retrieve what I consider to be, and I hate even the word, but these essential human
Starting point is 00:08:00 values without falling into, or without requiring, some mythology or some story to justify it. You know, I'd rather justify it, you know, with science or with common sense or with some sort of an ethical template than, you know, with some other story. Right. My first reaction to some of those ideas is that basically everything we do, virtually everything, has a potential upside and downside. And the thing that empowers us, the lever that we can pull that moves a lot in our world or in our experience, also shifts some things that we don't want to see move. As you said, you could walk us all the way
Starting point is 00:08:46 back to the dawn of language, right? And obviously language is the source of virtually everything we do that makes us recognizably human, right? It is the main differentiator. And yet language, you know, under one construal, I mean, anyone who has taken psychedelics or spent a lot of time meditating or trying to learn to meditate, recognizes that this compulsive conceptualizing through language, this tiling over of experience that we do just as a matter of course once we learn to think linguistically, it is in most cases the limiting factor on our well-being in so many moments, because so much of the conversation we have with ourselves is a source of anxiety and despair, and yet we can't have civilization without our full linguistic competence,
Starting point is 00:09:42 and we certainly want to be able to use it on demand all the time. And basically, any other complex technology built on language, every form of media has had this upside and downside. So you briefly gestured at this now fairly famous notion that just the mere introduction of print and the widespread ability for people to read and write was bemoaned by many intellectuals of the time as a guaranteed way to lose our collective memory. The oral tradition would erode, each person's capacity to memorize things would disappear, and given the advantages of print and reading, that seems like a fairly fatuous concern, and yet it probably is also true, right? You can carry that forward into
Starting point is 00:10:32 the present with respect to the way markets and digital technology are changing us. Right. I mean, the one difference really between speech, text, radio, television, and today's digital technology is that the algorithms that we're building, the artificial intelligences that we're building, you know, continue on. They change themselves as they go. You know, if the words that we spoke mutated after they were out of our mouths, in order to affect people differently, it would be very different than if they were just coming from us. So I get concerned that people, and certainly the companies that are building these technologies, don't quite realize what they're setting in motion, that the values that they're embedding in these technologies end up, well, the technologies end
Starting point is 00:11:33 up doing what we tell them, but by any means that they see fit, they keep going. And we're not even privy to the techniques that they're using to elicit whatever response they want from us. So while I could certainly look at capitalism as a system that ended up seemingly kind of having its own agenda, and capitalism working on us and the defenseless CEO or the unconscious shareholder or the worker who's being exploited, that all of these people are kind of stuck in this system that they don't understand. But digital technology seems to make this reversal between the figure and the ground, or I guess McLuhan would say the medium and the message, but I really just think it's the subject and the object, that instead of
Starting point is 00:12:23 having these tools that we're putting out there to get things that we want or to help us in some way, we're giving these tools slot machine algorithms, or telling them to develop their own, and they're looking for exploits in the human psyche. So the exploits aren't things that we look at or notice while we're meditating and go, oh, that's interesting. This must have evolved from a human need to do something. And while on one level it's a neurosis, on the other level it's part of my human strength. And we could look at, how do I want to use this in my life? The algorithm just sees it as a whole. Oh, look, I can leverage that person's instinct for reciprocity. Or look, I can see this one trying to establish rapport, taking all of these painstakingly evolved social mechanisms and using them against us. You know, and that's where I can sort of feel that there's a kind of a good and an evil,
Starting point is 00:13:29 you know, and I never really go there in any of my prior work. I try to be kind of nonjudgmental. But now I'm really arguing that whenever one of these technologies or languages or algorithms is bringing us together, it's doing good. And when they're turning us against each other, they're doing bad. Just to have almost a simple litmus test for people to understand, you know, am I helping or hurting? Right. Well, so are there companies that are getting to scale in the digital economy that are actually doing it well, that are at all aligned with your more idealistic notions of what the internet could be doing to us
Starting point is 00:14:13 and for us? Well, I don't know that there are companies that are doing it. There are certainly organizations that are doing it, whether it's Mozilla, which invented the browser really, or archive.org, which is a great organization where there are tremendous film archives and text archives and the Gutenberg Project. The example everyone uses, Wikipedia, is at scale and doing a good job, but they're not companies as such. The only companies I'm really seeing doing that are cooperatives. And I've gotten inspired by the platform cooperative movement. And I mean, there are many companies that sort of model themselves on the famous Spanish Mondragon cooperative,
Starting point is 00:15:12 but basically where workers own the company. But that's not necessarily just a digital tradition. Associated Press is a co-op. Ace True Value Hardware is an employee-owned co-op. So I've seen things reach scale that way, but usually, or at least so far, they're not these traditional shareholder-owned companies. How would you compare something like Netflix to Facebook? I consider myself a reluctant and none-too-well-informed student of digital capitalism, essentially. I mean, having a podcast and other endeavors, I've had to turn more and more attention to this, but I feel quite late to begin analyzing all of this. But in sort of the front-facing, just consumer-eye view of these platforms, when I look at Netflix, clearly they're playing games with algorithms
Starting point is 00:16:06 and they're trying to figure out how to maximize my time on their platform. But my experience is I want them to have all the content they can have. I want them to promote content that I find interesting rather than boring or a haphazard connection between my interests and what they're promoting. So insofar as their algorithms begin to read my mind and anticipate what I will find interesting, and they do that better and better, and it becomes stickier and stickier, on some level I don't see the downside. I mean, I can curate the contents of my own consciousness enough to know that if I've spent 17 uninterrupted hours on Netflix, I've got a problem. So if every time I open that app, things just get better and
Starting point is 00:16:53 better, that's good. And the business model there is I have to pay a subscription fee and, you know, presumably they're not selling my data to anybody, and I'm not the product, right? Whereas with Facebook, everything is flipped, and again, they're trying to game my attention and keep me on site. In the case of Facebook, it's completely ineffectual, but they're doing that in order to sell my attention against ads, and we know more and more about the downside of those incentives and that business model. Do you see the distinction between these two companies this way, or is there something I'm missing? No, I definitely see it. Netflix versus Facebook is sort of the same thing to me as Apple
Starting point is 00:17:37 versus Google, where here's a company where if I've got the money, and that's kind of the sticking point, if I've got the money to pay for it, I can buy television and technology and email and all of these things that are treating me as the customer. And I'm paying for my privacy. I'm paying for my customization. I'm paying for it to understand me for my benefit and my enjoyment. Whereas on Facebook or Google, we understand that we're not the customer and that someone else is paying Facebook or Google to understand us
Starting point is 00:18:13 for their benefit and then not just understand us, but tweak us to their benefit. So if Facebook can determine with 80% accuracy that I'm going to go on a diet in the next six weeks, I'm going to start seeing advertising and updates and things to push me towards going on a diet. And they're not just doing that to sell the specific products, the specific diet products that are on their site, but to increase that 80% to 90%. They want to increase the likelihood that I will do the thing that they've predicted I do. So when I look at a platform like that, or when I look at the way YouTube makes suggestions of what videos I should watch,
Starting point is 00:18:59 and when I go down three, four videos in, I'm always at some really dangerously extreme version of whatever it was that I was initially interested in. I see these platforms turning me into a caricature of myself or trying to get me to behave more consistently with the statistical algorithm that's predicted my behavior. Whereas on Netflix, the extent to which they use algorithms to deliver up to me what I might like, I find that almost part of the entertainment. You know, I'm interested: when I finished Narcos: Mexico, they knew I finished it, and then the next morning I look in my inbox and they say, here's what we think you might like next, based on the last six things I watched, as well as how much I paused, how quickly I got through
Starting point is 00:19:50 them. I mean, they're using a wealth of information. I find it interesting and I almost enjoy, and maybe this is just sickness, but I enjoy using it as a mirror. In other words, what shows do I have to watch on Netflix to get it to suggest Roma for me? Because I want it to think that I'm that kind of person. Yeah, apparently you're not that kind of person. I guess not. I watch too much Game of Thrones kinds of things, and they don't realize that I have that side. But the downside with Netflix and their algorithms is not so much what they suggest, but sometimes I'm a little creeped out by the way they construct some of their shows.
Starting point is 00:20:36 So we know that House of Cards was partly derived through algorithms. They found out that, oh, people that like David Fincher also like political intrigue, also like Kevin Spacey. Oh, I didn't know that. That's interesting. Yeah. And they concocted it. And then I wondered why the show kind of went through me like Cheese Doodles or something. It's like Cheese Doodles is this sort of industrial-age styrofoam taste sensation that's constructed for me to keep eating compulsively, but it doesn't actually deliver any nutrition. And I kind of felt that way with those shows. But the biggest problem right now, and it shouldn't be seen as a problem, is you get what you pay for. And I do get concerned about
Starting point is 00:21:26 bifurcating society into these two classes of people, those of us who can afford to maintain our autonomy by paying for our technologies, and those, I suppose, who still need the remedial help of marketing on free platforms. Well, that really is the source of the tension I see, because again, I have a podcaster's-eye view of this, but as someone who's decided not to take ads and to just have listeners support the show, I now have a very clear view of these two business models. There's just the PBS NPR version, which is, you know, this thing is free, and if you want to support it, you can, and I know how that works. And, you know, I've just released a meditation app, which is a subscription-only
Starting point is 00:22:11 service through the App Store and through the Google Play Store. So that's all behind a paywall. And I see that on the podcast side, I have been engaged in this fairly heavy-handed effort to educate my audience to support this work if they want it to exist. You know, many more people engage with the podcast than have ever engaged with my books. I know. I listened to that little six-minute piece you did on why you want people to contribute. And it articulated the exact same thing I feel. You know, I did one of those TED Talks for free. More people have watched that TED Talk than have bought all the books I've ever written combined. Yeah, it's amazing. And you want that kind of reach, obviously, you want it because
Starting point is 00:23:01 the goal is as a writer or as a public intellectual or someone with any ideas that you want to spread, you want to reach as many people as can conceivably find these ideas valuable. And yet what's happened here is that your phrase, you get what you pay for, I think is true, and yet it's antithetical to everyone's expectations, even mine, frankly, online. We're expecting our information to be free. There are certain contexts where people understand that they're going to hit a paywall, and that's somehow fine. Netflix is the classic example. Here's a pretty clear case. Joe Rogan has a podcast. It's obviously free. It's supported by ads. Millions and millions of people listen to it. But then he often releases a comedy special on Netflix. I don't think there's anyone thinking that they should be able to watch his
Starting point is 00:23:57 special for free. Like I don't think he gets angry emails saying, what the hell? Why are you putting this behind Netflix's paywall? But if he put it on YouTube, if he put it online in some other context and put it behind a paywall, you know, like Vimeo on Demand or something, and he was charging $5 for people to see it, I think he would get a lot of grief over that. And it's just a very odd situation where, in certain contexts, in the widest possible context online, we have trained ourselves to expect things for free, and yet the only way free is possible is this increasingly insidious ad model that is gaming people's attention. And in certain
Starting point is 00:24:43 contexts, it's innocuous. I mean, I'm not against ads across the board, but in others, it just feels like this is the problem we want to figure out how to solve. And yet, you know, even you voice the concern. I mean, if we put everything behind a paywall, then we have a real problem of people not being able to get access to content that we really do want to spread as widely as possible. Right. I mean, I heard your conversation with Jaron Lanier about this, and it's interesting that he was sort of blaming the famous truncated Stewart Brand quote, you know, information wants to be free, which people always leave off, but information also
Starting point is 00:25:19 wants to be protected is the second half of that sentence. Apologies to Stewart. Yeah. But I don't think... I look back at the early internet, and the reason why everything was free is because the internet was a university-based system. We were using Gopher and Veronica, these early pre-visual, text-only internet search and retrieval systems, and you would download and share documents, but it was all university archives. It was free material. So because it was part of a non-profit academic world, because people actually signed an agreement before they went on the net saying, I promise I'm using this for research purposes. I'm not going to do anything commercial on here. I'm not going to advertise anything. You actually
Starting point is 00:26:09 signed an agreement. It set up expectations of a very different place. The internet really was intended at that point to become a commons. Then once we brought business on, businesses really leveraged and exploited that freeness, the sense that everybody wanted things to be free without ever really bringing forward that sort of academic ethos along with it. And it created a real mess. And then I remember the moment that I really thought it would change, and maybe it did, was when Steve Jobs did his iPad demo. And he was sitting in this big easy chair and showing a different posture. And the iPad worked differently. The iPad, you couldn't just kind of download files the way you did with your computer. Now you were going to go through an iTunes store to look at stuff.
Starting point is 00:27:06 And I feel like what he was trying to do, almost with the skill of a neurolinguistic programmer, he was trying to anchor this device in a different social contract between the user and the content maker. And to some extent, it worked, at least in the Apple universe. He said, look, it's going to be easier and better to buy something through iTunes than to go play around on Napster, just collecting music for the sake of it. Yeah. Well, part of it is once you move to digital and people understand that there's zero marginal cost in producing the content, right? And that their use of a file doesn't prevent anyone else from using that same MP3 or whatever it is.
Starting point is 00:27:54 At least psychologically, that seems to be one of the reasons why there's this expectation that free is the actual ethical norm. And they're okay with that, you know. And, you know, first they came for the musicians and I said nothing. And they came for the cab drivers and I said nothing, you know. And then once they come for me, you know, so the thing that people are upset about is not that they're ruining all these other people's jobs and taking all this stuff. The thing that they worry about is, well, now my privacy is being invaded. So now
Starting point is 00:28:25 I'm going to get, you know, now I'm going to get up in arms about what, you know, what's happening here. Or now my job is in danger. So now I'm going to get upset about that. Yeah. Well, to speak specifically of what it's like to be a writer: there was an article recently, I think it was an op-ed in The New York Times, might have been The Washington Post, in the last couple of weeks, talking about the economics of writing and how dismal they've become. And it's amazing. I mean, I've had some sense of this for some time, but to read these stats was fairly alarming. I mean, like the average professional writer who's making some portion of his or her living from writing is living below the poverty line. And you have to be a massive outlier, you know, not just an ordinary
Starting point is 00:29:15 bestseller to make a very good living from writing. And for the most part, professional writers have to have other jobs. I mean, most professional writers who turn out a book every year or two or three have professorships or they have something else that's paying the bills. And that's not an optimal world to live in, especially when you throw in journalism there, which is massively underfunded. And ironically, we're living to some degree in a recent heyday of journalism because of how awful Trump is. Still, there's a kind of winner-take-all effect there where you have The New York Times and The Atlantic doing well, and then everyone else still going slowly or quickly bankrupt.
Starting point is 00:29:59 How do you view journalism and the life of a writer at this point? It's harder. I'm lucky in that when I wrote my first books in the early 90s, it was still the end of the golden period for authors, where I would write a book that sold a lot less than my books do now, but my publisher would send me on airplanes on a book tour. I'd stay in the author's suite at hotels. They had these special suites that we would go to, with fireplaces and the books of all the people who had stayed in them before. And you'd get this person called a media escort who would take you to your events in the different towns, which was all paid for somehow, right? And then, you know, Viacom buys Simon and Schuster, which buys, you know, each of the little publishers, and all the slack went out of the wheel somehow. They started to use just much more accurate spreadsheets, and all the wiggle room that we had in publishing went away. It was an industry that somehow just kind of got by at about the
Starting point is 00:31:07 same size, I guess, for a few centuries. It just sort of worked. And we lost the ability to kind of fudge our way through that. And they started to demand better margins and more of a squeeze. And yeah, the power law dynamics of the internet then came into it. So it's better for a publisher to sell a Taylor Swift autobiography and sell half a million copies of that than 40,000 copies of a book that's going to actually change the way people think. And it's tricky. I decided to get my PhD late. I got my PhD when I was 50 or 49. And that was right after the 2007 crash, when all the publishers were asking for books that could help business people one way or another. Every book I wrote was supposed to have a business self-help aspect to it. So I got a university job so that I could write books, you know, like Program or Be Programmed and Present Shock or Throwing Rocks at the Google Bus, these ones, or this one, Team Human, which, you know, are books promoting humanism. And I don't have to worry as much about whether I sell, you know, 5,000 or 50,000 copies. But it's sad. I've done university lectures where college students... A common question I've gotten is, why should journalists get paid for what they do if I can blog as easily as they can? So they've almost lost all touch with the idea.
Starting point is 00:32:39 That's one of the more... I mean, there's so much in that. It's one of the more terrifying questions I've heard. Yeah, it's frightening. I mean, the way I answer it now is, well, you know, if governments and corporations can spend, you know, billions of dollars on propaganda, don't you want someone who has enough money to spend a couple of weeks deconstructing what you're being told? You know, it makes no sense. If I had to list my fears around what technology is doing to us, the erosion of the economics of journalism is one, and also just the distortion of their incentives.
Starting point is 00:33:13 I mean, the fact that even our best organs of journalism are part of the clickbait logic, and it's incentivizing all the wrong things. But what we want, we should be able to reverse engineer this. We know we want smart people to be well compensated for taking, you know, months to really fully explore and vet topics of great social importance. And so the question is, how do we get there? And the idea that someone could take months to write the best piece that has been written in a decade on the threat of nuclear war, say, right? And that that could just sink below the noise of Kim Kardashian's latest tweet, or in a similar vein, our president's latest tweet, and just disappear from a news cycle, and therefore earn comparatively
Starting point is 00:34:18 little ad revenue. And the net message of all of that is that those kinds of journalistic efforts don't pay for themselves and that we really don't need people to hold those jobs, because we... The Panama Papers. And, you know, we see some response to that. I mean, I don't like the alternative, the opposite alternative, where it started to feel like, to me anyway, coming up, that the journalists who got the biggest platforms tended to be people who were paid well enough by the system not to challenge neoliberalism. It's like, well, if you pay the journalists enough, then they're going to support the system as it is. They'll drink martinis and shut up. So now that it's getting harder and harder for journalists to make ends meet, I feel like there might be a little bit of a positive effect, at least in terms of their politics. And they're looking at it and saying, oh, you know, now I'm a gig worker. Now I'm in precarity. And there's something valuable to, you know, not being able to just, you know, graduate from an Ivy League school and get to write books for the rest of your life. But certain products get tens of millions of views and certain others just sink beneath the waves,
Starting point is 00:36:08 no matter how valid or valuable the content is, right? So if it's just a fact that only 8,000 people in the United States really want to read the next white paper about climate change, well, then you can't monetize that white paper because people just don't care about climate change. Now they should care about it, and we have to keep trying to figure out how to make them care about it, but if they don't care about it, given the glut of information, I mean given the fact that, again, you know, you can just binge watch Game of Thrones for the third time and you can't stop people from doing that, it's just a kind of fool's errand to figure out how to get them to take their medicine. On some level, what we're
Starting point is 00:36:54 saying is that if it can't be made interesting enough and titillating enough so as to actually survive competition with everything else that's interesting and titillating, well, then maybe it deserves to sink, even if that is selecting for the Kardashians of the world and burying important stories. You know, it's interesting. I make the same sort of argument in Team Human when I'm kind of defending the work of David Lynch against the more commercial kinds of movies and the way that Hollywood will use that argument. They'll say, well, look, people in the mall, they want to see the blockbuster with the
Starting point is 00:37:33 spoiler and the climactic ending. They don't want to see the thing that makes you think. They don't want the strange. But I think people do deep down want that. I do think people want to experience awe. And yet people are so worried about spoilers: they're on season two, episode one, and you're going to spoil something for them. Because these shows themselves are basically timelines of IP, of little IP bombs, the little spoilers that show up every three or four episodes. It's like that's the value of the thing. And you keep undoing them. It's, oh, Mr. Robot, he is his father. Or each one of them has these almost stock reversals. And you look at real art, I mean, what's the spoiler in that? The idea that the reveal is what they're paying tickets for doesn't really make sense when the whole thing has been contextualized that way.
Starting point is 00:39:13 In other words, I don't think that's what people really want so much as that's what we're being trained to want. Or there are shows that almost open up wonder or that almost make deep arguments, or books that make quasi-deep arguments, but then just solve everything at the end. There is a right answer. So in a show, I don't know if you saw that Westworld show on HBO, no, there is an answer. It's all these timelines that are all over the place. And then you find out, oh, these were different timelines. And this happened then, and this happened then, and this happened now. And it's just a kind of a postmodern pyrotechnic, but it gets you to that same place where there is
Starting point is 00:40:02 an answer. And every article we write is supposed to have, yes, therefore you should take echinacea or don't take echinacea or vote for Trump or don't vote for Trump. This need to give people the conclusion as if, well, I'm not going to pay you for an article or a book if you don't give me the answer by the end. I'm not going to watch your movie unless everything works out by the end. And that's, in some ways, the most dangerous aspect of this cultural collapse that we're in, that everything has to have an answer or every effort has to have a utility value that's recognizable, at least by the time you've done this thing, because you can't reduce all human activity, all writing, all product, all culture to its utilitarian value. And this is
Starting point is 00:40:54 where I get into that weird, mythic, religious place that I'm still uncomfortable speaking about out loud. But I just read or reread Horkheimer. He wrote this book, Eclipse of Reason, and he was talking about the difference between reason with a capital R, the real reasons, the essential human values for which we do things, versus the small-r utilitarian reasons we do things. And what I'm really trying to do is stress that human beings have capital-R reasons for things, that there is something more going on here than meets the eye. And I don't just mean some magical other dimension, but some essential value to human camaraderie, to establishing rapport, to being together, to looking in someone's eyes.
Starting point is 00:41:46 It's not, I mean, yes, the mirror neurons fire and the oxytocin goes through your bloodstream and your breathing rates sync up. And this is the stuff that you've studied, you know, and there's an evolutionary small-r reason to establish rapport. But is there another one? Is there a value? Is there a meaning to this? And that's the part I'm not willing to give up. And that's the big argument, I guess. The real thing I'm in now is the argument with the transhumanists or the post-humanists or the singularitarians who really believe that technology is our evolutionary successor and we should pass the torch to technology, because tech will not only write a more factual article and a more utilitarian program, but tech is more complicated. It's a more complex home
Starting point is 00:42:40 for information than the human mind, so we should pass the torch. And when I say, no, that human beings are special, and I start talking about awe and meditation and camaraderie and establishing rapport, I mean, the famous transhumanist, I'll leave him nameless, I was on a panel with him, and he said, oh, Rushkoff, you're just saying that because you're a human. As if it was hubris for me to argue for humanity. And that's when I decided, okay, I'll be on Team Human. I'll make a bet that there's something important here. I find it hard to make any of the arguments that we're making, whether they're against the market or against automation or against all of our stuff being for free or the collapse of quality or just giving in to consumerism. It seems that I have to hold up some sort of essential human value that
Starting point is 00:43:39 needs to be recognized rather than surrendered so readily. Well, I guess that there are two different kinds of surrender or blurring of the border between the human and what might be beyond. I guess you could talk about artificial intelligence replacing us or augmenting us. I know you touch all these topics. If you'd like to continue listening to this conversation, you'll need to subscribe at SamHarris.org. This podcast is ad-free and relies entirely on listener support. And you can subscribe now at SamHarris.org.
