Decoding the Gurus - Jaron Lanier: Fear of an Algorithmic Planet

Episode Date: May 30, 2022

The season of tech is upon us! Hold on to your hats and press those non-fungible tokens tightly to your bosom, because we are getting into the thick of it. Matt and Chris kick things off with an exploration of tech pioneer Jaron Lanier. Sometimes referred to as the father of virtual reality, Jaron is a promising candidate for gurudom given his foreboding warnings of a bleak algorithmic future and a penchant for opening tech lectures barefoot with impromptu performances on exotic instruments. In short, he's a groovy guy and Matt digs that! Chris, being the sour lemon that he is, takes a little while to adjust to Jaron's particular style. Some Weinsteinian warning signs might be flashing, but there really are eccentric and very clever people in the world... and Lanier might just be one of them? He certainly has a track record of critically commenting on techno-optimism and social media platforms for at least two decades. But some of his pronouncements seem a tad OTT and some recommendations a little hand-wavy... So how do the decoders square this particular dread-locked circle? Well, you're going to have to listen all the way to the end to find out. So, get yourself comfortable. Drink some coffee, pop some No-Doz, and strap yourself in. Smash the Duck!

Links
Was the Internet a Horrible Mistake? Jaron Lanier on Honestly with Bari Weiss
That VR boxing game Chris mentioned
Peterson being a smug judge on Twitter

Transcript
Starting point is 00:00:00 Hello and welcome to Decoding the Gurus, the podcast where an anthropologist and a psychologist listen to the greatest minds the world has to offer and we try to understand what they're talking about. I'm Professor Matt Brown and with me is Associate Professor Chris Kavanagh, the wings above my wind. Thank you for being here, Chris. Oh, you reversed it. I was trying to work out if I've heard that before, but yeah, it's a reversal. That's a good one. That's outside the box thinking you can double up. I just thought of that.
Starting point is 00:00:53 Yeah. I wonder why nobody has used that expression before. It's so poetic. Yeah. It's when you want to let someone know that you lift them up. When there was one set of footprints, Matt, I carried you. Christians will get it. Christians will understand that reference. If not, sorry. So have you been staring into your aquarium this morning? Have you been gazing
Starting point is 00:01:17 into the glassy orb? I have. It's mesmerizing. It's so good. It's my new thing. Now, instead of reading things online and getting upset, I gaze into my aquarium and I feel at peace and at one with everything. to record that you're not working you're not talking to your children or your wife you're just in front of the aquarium just staring there for like an hour or so watching the colorful fish dance around and if you were to do that i think i would respect you more these days it's true more often than not but yeah i like that It's a good domestic activity. I've mastered domesticity, I think. I'm pretty good at it. I'm a good homebody. In a similar way, I think if you imagine your enemies, the people in the culture war that are just partisan maniacs, I prefer to imagine them acting as Solomon-esque characters who, you know, sit at home staring at culture war orbs, just gazing into the mystic swirl. And then they go on the Twitter box and tweet out
Starting point is 00:02:36 their takes. Like, I wish they actually were proper villains. They had like evil crowns and stuff that they're wearing in private. And the sad truth is, is that they're probably trying to decide where to have brunch. You know, like even Saruman, Chris, at some point he's sitting out there in Orthanc, and he's not always staring into the orb. Sometimes he's thinking, what am I going to have for brunch? I mean, he has to, even evil wizards have brunch, Chris. Well, you could, you could ask the orb. That could be a question that you asked the seeing stone. You know, hey, pre-industrial society, where should we go? There's no, we don't have restaurants out there existing in Orfang.
Starting point is 00:03:15 Yeah. Ask the orb whether you should have the croissants or the fruit cup. It knows. Speaking of altered perceptions and things giving you a glimpse into another world, I should tell you, I should tell you this. Yeah, I used to be a lethal machine. My hands were registered weapons. I was a mid-tier, mid-tier is even being kind, but I used to do martial arts things. I did Thai boxing for a couple of years, then I did Brazilian jujitsu and judo. But I used to do martial arts things. I did Thai boxing for a couple of years and I did Brazilian jujitsu and judo. And I enjoyed that mainly because those kinds of classes
Starting point is 00:03:51 could force me to do exercise. The peer pressure and the thought of some muscly, sweaty man choking me out if I didn't train hard enough. You know, I need that kind of motivation. So when I tried to like run by myself or something, I just was like, oh, I don't like this. This is tiring. I just stop and get annoyed with it. You need to be motivated by fear, Chris. You need some muscly, sweaty man chasing you. Yeah.
Starting point is 00:04:13 Then you'd run. Yeah, there was a really horrible occasion. This probably won't go out by the hell of it. Anyway, I was training with someone. This was at a judo session. And there was an overweight guy who was a brown belt. And we were in a position where I was training with someone. This was at a judo session. And there was an overweight guy who was a brown belt. And we were in a position where I was on the ground and he was trying to muscle over the top.
Starting point is 00:04:30 His judo gi was lying open and his tummy was flapping around. He was very sweaty, Matt. And one large drop of sweat appeared on his chest area and dropped off. And I saw it drop and then I just felt it hit my mouth and the sensation of salt i couldn't avoid it i was just like oh that's great you've just given me another reason to never ever do this activity that's yeah so the good illustration of that is that why this is not a great activity we do doing in the middle of a pandemic. I haven't had this ability to be getting the exercise that I used to get from these classes.
Starting point is 00:05:10 And I've been trying various other things, but I'm just not very good at motivating by myself without having a goal. But recently, Matt, I have an Oculus Quest 2, which was intended for creating 360 videos of ritual events and getting people to experience them and lab as a stimulus and blah, blah, blah. But I'm using it now to do virtual boxing and beat saver. And it's working very well. I'm getting nice exercise and feeling better. And I just knocked someone out before we were on this call.
Starting point is 00:05:44 I beat a guy, an Irish virtual boxer. Chris, it sounds to me like you're using university equipment for personal purposes. Who do I notify about this? Who is the appropriate person? This is my personal research fund that that money came from. Second of all, I am really an asset of the university and keeping my mind and body in pristine shape is in a very direct way, it is an investment for the university. So I'm using the equipment in an innovative way, but one that does contribute to the production of research. So there, Matt, how was that? Yeah, nicely done. Nicely done. Yeah, well,
Starting point is 00:06:25 we have an Oculus Quest 2 and I would verse you, as my kids like to say. We probably can. That functionality must exist. That can be premium content for the patrons. Matt versus Chris, virtual boxing. We'll have to create a special tier just for this content. But look, regardless of who wins, I think I'd still be morally the victor. You might win in a tangible sense, but I'd win morally. I'm very out of shape, but I was glad I put him down, Matt, the virtual boxer. It's a sad word. But Matt, we've got serious business to do today. We've got Gurus Decode. We're back.
Starting point is 00:07:07 We're alone. There's no guests here. No interviews. No joint guest decoding. Just us alone. Just you and me rambling to one another with very long introductions. This is the way it was meant to be. This is vintage DTG.
Starting point is 00:07:21 That's right. We have done a lot of interviews recently, and that's just because we've had so many great people to talk to. That's right. But it will be good to get back into the decoding, keep it real. It used to be about the decoding. Yeah, it used to be. And we need to get back to that. It's good to just grind yourself.
Starting point is 00:07:35 We're starting our season of tech this week. It would be nice if we had a little tech theme now, so if we've managed one, it will play here. It's Decoding the Guru. Tech season. Tech season. Tech, tech, tech, tech, tech. Tech season. It's Decoding the Guru. Tech, tech, tech, tech, tech. Tech season.
Starting point is 00:08:09 And if not, that was just silence on the plate. Yeah, so. Yeah, yeah. So we got the tech theme and yeah, there's so many themes to get into. But we're going to move away from the, you know, the culture war, political agitators. Oh, Matt, sweet summer child, Matt. We're sick of them. You think the tech gurus aren't in the culture war?
Starting point is 00:08:30 How much are you in for a surprise? But especially seeing as we're going to look at people like Lex Fridman and Elon Musk, and these are not exactly people who are alien to the culture wars. exactly people who are alien to the culture wars so well as you say chris the culture war permeates and infuses everything and i should have known that but as you like to say we are mere servants to the discourse and we follow it where it leads yeah it's like the force it connects us it binds us together but um we have a a long-term patron i think from the very start uh a very generous patron whose galaxy brainness is is not in doubt and his name is eric oliver and he never asks anything for us matt he doesn't even ask shout outs, but he has a new podcast coming out called Nine Questions, which is related to course he teaches on how people should know themselves better. So
Starting point is 00:09:34 I want to recommend Eric's podcast that people should go check it out. And he's a political science professor at the University of Chicago. So, you know, a fellow academic, academic solidarity, Matt, and Patreon solidarity, solidarity. Um, yeah. Nine questions. Very good. I'm all for academic solidarity. Uh, yeah.
Starting point is 00:09:59 Good stuff. Happy to recommend it. Yeah. That's anyway, that was just a random note that i wanted to make um and and now map we'll move to our our decoding we're looking at a and quite an eccentric character with a large mat of red-haired dreadlocks an an unusual combination, and quite heavyset guy who is a tech entrepreneur, somebody from the first wave of the internet, involved with VR, involved with social media platforms and so on. Jaron Larnier. Is that right? Is that pronounced it right? Jaron?
Starting point is 00:10:42 I think so. Yep. Jaron? Yep. Close enough. Yeah. So let's get started, Jaron? I think so. Yep. Jaron? Yep. Close enough. Yeah. So let's get started, shall we? Let's go. Here we are, Matt. Classic DTG.
Starting point is 00:10:54 You, me, a guru. It's like it's 2021. So this is the first thing to mention is we're going to get his name wrong. And I'm very sorry. I have tried to practice it and it's not intentional. So I apologize. I think it's Sharon Larnier or Linear. Anyway, we all know who we're talking about.
Starting point is 00:11:13 When you practice your pronunciation, it just gets worse. So just do your best. Just power through. That's all we want. Right. Well, so Matt, these clips are taken from an episode of Honestly, Barry Weiss's podcast, where she interviewed him. And we haven't covered Barry Weiss, and we'll just mention her in passing when it's relevant. But I do listen to her podcast, and it is one of the ones I listen
Starting point is 00:11:39 to that regularly annoys me. So that might color things just slightly. Many people would be wondering why you listen to podcasts that regularly annoy you, but we should just let that go. Just let that go. Let's not dig into it. That's right. Let's not dwell on that. So anyway, this is how Barry introduces him on the episode. Jaron's appearance is striking. He has these piercing pale blue eyes. He wears a black t-shirt and long dreadlocks well past his butt. It's the same look he's had ever since he arrived in Silicon Valley back in the now legendary
Starting point is 00:12:14 garage era days of the 1980s. I think that's a pretty good guru introduction, right? Paints a picture with words and it continues. His house is the most unusual house I've ever seen. You do not need to apologize at all. It's painted in these pastel stripes and it looks partially like a shrine to the god of science with sculptures of atoms and electrons. This is a caffeine molecule. It's a molecule we care about a great deal here. And partially like a psychedelic playroom.
Starting point is 00:12:50 This sounds like your house, Mark. Pastel stripes all over the walls, instruments hanging. Well, it's true what Barry White says. Geraint Lanier is a very striking looking man. Looks pretty cool, actually. We heard a little snippet of his voice there. We'll be hearing a lot more of it, but I just want to bring this up front.
Starting point is 00:13:09 I want to center this, which is that he sounds an awful lot like Truman Capote as acted or voiced by Philip Seymour Hoffman in Cold Blood. I shared a clip of that with you, Chris. Did you listen to it and would you agree? I did listen to it and I would agree. I gather that we have recordings of what Truman Capote sounded like. So I trust Philip Seymour Hoffman to accurately portray that. So yeah, I can see you are right, Matt. There was definite parallels and, you know, they were both eccentric geniuses, the presentation.
Starting point is 00:13:43 I'm sure there is some sort of background or history to that type of accent. It actually makes me think of, like Americans will probably be able to tell us, this is why I'm bringing it up. Am I wrong? It could be like a southern kind of, a certain kind of southern accent. I know you're furrowing your brow, but. I mean, really? Because it sounds, I don't think there's an area where everybody sounds like this.
Starting point is 00:14:06 I don't think so. It could be, it could be. It's a legitimate question. Okay, well, I'll just say this. This isn't directed at you, Chris, because I know you're a Philistine and you wouldn't have read this. There's a very good book, won the Pulitzer Prize for literature called Confederacy of Dunces. Well, some people hate it, but I really like it. And the main character in that, whose name I've forgotten, in my mind, he sounds exactly like Truman Capote and Gerard Lannier. But that's something for the listeners to send us an email. Tell us what you think. Is that how you imagined that protagonist's voice? Let me know.
Starting point is 00:14:39 Interaction. Interactive guru decoding. This is great. So Matt, you mentioned about the distinctive laugh and we got to hear some twinkling instruments play. Let's hear a little bit more of that. Jaron has over a thousand rare instruments, including mouth flutes and ouds, all of which he plays hanging all over the walls. He's got lava lamps and toys, stuffed animals, bright colored tapestries,
Starting point is 00:15:04 and also a very nice espresso machine. This is the espresso machine. Yeah, I gotta say that, like, when we were talking about doing Jaron, I saw some of his appearances and found them slightly irritating because it felt a little bit like he's clearly eccentric, but some of it feels affected. My skeptical hackles are immediately raised if someone claims that they can play a thousand instruments in the exact same way as Jordan Peterson claims he read 200 books on
Starting point is 00:15:38 climate change, right? It's just a thousand instruments, Really? Really? I can play a thousand instruments too. Just like banging drums. You can't. I can. I can play the clarinet. I can play the clarinet. Can you? I didn't know that about you.
Starting point is 00:15:54 I mean, I learned the clarinet at school, but I don't think I can actually play it. I can make it make noises, which is what I think Jerome can do with a bunch of, at least 300 of his instruments. All right, look, we just heard a Clifford playing there. That sounded all right to me. That sounded like the Zen room in Rocky Horror Picture Show. You know, when he gets, when the guy gets. I think he can play instruments and he's a musician and
Starting point is 00:16:16 probably a talented musician. But I just, a thousand instruments, Matt, can musicians do that? Like, you know, it makes me think of Stomp. They're just walking around banging bin lids and changing toothpaste into a percussion instrument. Chris, they're probably all very similar instruments. Like, I bet there's like a thousand variations of instruments that are somewhere in between the mandolin, the sitar, and the guitar. And he can probably play all of them. I believe it.
Starting point is 00:16:41 A thousand. Maybe not a thousand. If there's a thousand different variations of a guitar then i buy it that's all right that's how you know there's only so many ways you can make would make noise the other thing that i was reminded of when we're getting that description of his house and all of the objet and the culture and art and so on. And also with the voice and the persona. It reminded me of Gale from Breaking Bad. Do you remember the assistant to... Oh, yeah, yeah, yeah.
Starting point is 00:17:10 The guy who had an interest in Thai as well. I think he was learning Thai, but he was a weird person, a weird character, a great character in that. But my point with that is that there are eccentric people. I've met eccentric people that are that different from... No, yeah, yeah. I agree.
Starting point is 00:17:30 Like, it's genuine eccentricity. I just think you can be genuinely eccentric and you can also play it up. You can ham it up a bit. You can ham it up a little bit. Eccentric people are hams. That's what they are as well. Truman Capote definitely hams it up. I'm sure that Matt, as well. Truman Capote definitely hands it up.
Starting point is 00:17:45 I'm sure that's part of being a flamboyant person. Yeah. You wouldn't know about this because you're Northern Irish, you guys are dour, and you get a kicking if you do anything nonconformist. We get teenage kicks all through the night, Matt. Uh-huh. So our musical traditions are legendary. All, like, four of them.
Starting point is 00:18:05 But yeah, so anyway, I don't mean it as a criticism. I just mean my skeptical hackles are reused whenever. I think it's partly Barry Weiss's delivery because the way she presents people is always just so fawning and so dripping with deep sincerity. It scrapes my bones. I understand as well. But I'll even give Barry a pass there too, because it's that journalistic-y, New York-y. This American life. Yeah.
Starting point is 00:18:33 Yeah. I know. I know. I mean, this house sounds interesting. It's a wizard's workshop. That's okay. But this is Barry identifying Gerona as having a Cassandra-like quality to him. I went over to have coffee with Geron, which was delicious, by the way,
Starting point is 00:18:50 because even though he had a big hand in creating what we know is the Internet today, he's also become something of a Jeremiah, shouting warnings about what the Internet has become. A Jeremiah, Matt? What the hell's that reference? Oh, my God. this could be Old Testament stuff. Oh, I thought it was just Cassandra. Jeremiah complex. We're going to have to adjust the garometer, but that is convenient. Yeah. I'm sure Jeremiah is the male analog to Cassandra. Yeah. Well, let's see. Is he a Cassandra? But what happened after that is that whole beat, which had been a desert before, got very populated and with excellent people.
Starting point is 00:19:32 So you have Shoshana Harris, Shoshana Zuboff, and a bunch of other just really excellent people writing on it. At this point, it's almost like the normative sort of thing that one reads. Like the Times has two columns this morning that might have been written by me a few years ago or something or longer. I mean, if anybody's counting in 92, I wrote a piece predicting that bots could throw elections and all this stuff. Anyway, so I've been on this thing for a long time. Okay, so does that scrape your bones or raise your hackles chris it's just i think there's this tendency that we see in the gurus to claim that they were ahead of the curve and that they originate these ideas or criticisms and geron is doing that here but i think it is quite possible
Starting point is 00:20:22 that he did do that because this does seem to have been his bid. And he was introducing there at the beginning how now skeptical, critical treatments of social media companies and the Internet world are a dime a dozen. It is no longer an unusual stance. It's the mainstream stance. It's a dominant perspective. And I think he's right about that. So yeah, I did note that he was kind of 80% Cassandra-ing it up in this clip, but maybe with justification. Yeah, I think that's fair. I think he's got healthy self-regard, which is not uncommon among people who are very clever and who have done relatively
Starting point is 00:21:06 significant things. As you said, he's got this background. He's been drenched in the Silicon Valley culture for decades and he's been up close and personal on it and he's been writing about it and thinking about it and critically commenting on it for a very long time. So yeah, he's tooting his own horn or playing his own sitar there a little bit, stroking his own sitar, shall we say. I like that. But that's all right. This is why I want to give him credit because if he's 80% Cassandra in it, he's 20% not because here's a qualification he made shortly after that. Is it satisfying to see people take up your arguments? Well, you know what it is, is it's freeing.
Starting point is 00:21:45 They're not exactly mine. I mean, they're different. Like Shoshana is a bit more anti-capitalist. So she talks about surveillance capitalism. And I'm not really. I view capitalism as a tool that has proven useful in cases and quite destructive in other cases. So, you know, I just like that because Barry wanted them to, or give him the opening to say that other people are taking up his ideas. And he was like, well, actually there's distinctions from what they're saying and what I said. And
Starting point is 00:22:14 that was like, that's nice to see. Yeah. Yeah. And he didn't portray the other person's ideas as being derivative or lesser or anything like that. He said, this is other people writing interesting things too. And we're a little bit same and a little bit different. And yeah, that's a good sign. Yeah. So Matt, you were highlighting about his critical commentary that he has on Silicon Valley, and there's quite an interesting part where he goes into some parallels he sees between the folks in Silicon Valley and the Nazi regime, which might sound hyperbolic, but let's see how he frames it. Maybe let's pick up on the Nazi analogy. Explain that analogy. So who are the, the second Hitler, it's like reductum on Hitler, someone says, like the second you bring up Nazis
Starting point is 00:23:04 or Hitler, you kind of say, I deliberately didn't bring up Hitler. I talked about the Nazi regime. And the reason why is I don't want to focus on one person's personality here. What I'm interested in instead is a society of probably no more than hundreds or maybe a few thousand people who figured out a way to exploit emerging technologies for propaganda. That's a different question than Hitler. One thing I appreciated there was like, when I initially heard it, I thought he might have
Starting point is 00:23:32 been, you know, making, I'm not about exploration, I'm about searching or seeking or that kind of Jordan Hall clarification. But when I listened back to it, no, he actually, again, he wanted to make a very specific point and he didn't want to make a simplistic comparison to Jeff Bezos or Mark Zuckerberg to Hitler. He wanted to talk about like the propaganda part of the regime. And there is an element of precision in the way that he talks, which is quite refreshing because he's one of these types who corrects people when they say something instead of, you know, like saying, well, yeah, that's roughly what I was saying. So I like that. Yeah, yeah, I could see that. And with that point,
Starting point is 00:24:17 he was drawing an analogy with a very specific thing, which is that the figures within that regime took advantage, as is well known, of emerging technologies, things like the radio, to broadcast propaganda. And that was the specific analogy he was making. So, yeah, that's a fair thing to do. I think he stands it a little bit more. So here, I don't want to say they're the same as Silicon Valley, but, you know, there's some, and in fact, some of them, you know, after the war, kind of were rehabilitated, because it sort of felt like they were, like Lenny, how do you say it? Lenny Riefenstahl. Riefenstahl, right. You know, was
Starting point is 00:24:58 somewhat rehabilitated, because there's a sense, well, here's, you know, the Nazi propagandists. Yes, exactly. There's a sense that maybe these people were more technicians who might have done propaganda for anything than real ideologues. And I suspect there's some truth to that, you know. He's implying that his colleagues in Silicon Valley are worthy to be living in the 1940s. They would be servicing the Nazi regime to make the trains run on time or whatever. So casual disparagement of your colleagues as potential Nazis seems, you know,
Starting point is 00:25:36 on brand for a contrarian. Yeah, but not terribly unreasonable, right? He makes a reasonable point. And he also has these expressions which are maybe slightly not politically correct, but I think quite evocative. Like our friend Eric Weinstein likes to coin new idioms. And I think Jaron has him beat a bit in how catchy they are. So here's one where he's describing the same kinds of people. Especially at that time, there was this hyper-libertarianism that was supposed to be guiding everything.
Starting point is 00:26:08 And I wanted to point out to people that sometimes, I used to call them ideology sluts, that you think you're adhering to this one ideology, but you're actually slipping into this other one because you're such a slut. And that had happened in Silicon Valley. And that we'd moved from this supposed free thinking thing into this technocratic way of enforcing the same thinking.
Starting point is 00:26:36 Yeah. So this is the theme, isn't it, of a lot of Gerard Lanier's work, which is that sort of wild west, open season, nerds building things in garages and building fun toys to see if they can. Gradually transformed into the kind of world we have today, we have these massive giants like Facebook and Google and Apple, which do create these closed ecosystems, which do monitor people's data and use it and do sort of direct and channel people's behavior online for their own interests. And what do you think, Chris? I mean, that's not a controversial point of view these days, is it? No, and I like this point about ideological slots, right, regardless of the particular metaphor,
Starting point is 00:27:26 slots, regardless of the particular metaphor. But the concept that a lot of the people in Silicon Valley, that they casually slip between ideologies and they maybe don't even notice what they're picking up, it feels very shallow and very, I don't know, I can't put a way to put it. I don't know, Matt, help me out. No, no, I know what you're saying and I can't articulate it very well either, but I think most people would resonate with that vibe, which is that the Silicon Valley, the same goes for the cryptocurrency, Bitcoin, bros, there's this entrepreneurial thing. You have these people that slip between these ideologies and they might one day be into this sort of data, wants to be free and be completely hardcore about one particular thing.
Starting point is 00:28:23 And then they have these aspects that we see with sense makers and so on with like mystic enlightenment but then it sort of slips into a kind of will to power and this superman type thing and yeah it's very labile and it's very odd and yeah most people find it a bit weird yeah there's there's the kind of Elon Musk as the godhead. Start off just, I appreciate somebody pushing private industries towards space travel or that kind of thing, and then end up worshipping at the feet of him as he reinvents tunnels. Yeah, the Ubermensch looms large, especially when you have your teal figures bopping along in the background. But probably one of the main planks of his criticism is around the role that algorithms play in the modern internet ecosystem.
Starting point is 00:29:19 And he's fairly unsparing in his condemnation of what they have wrought. So here's a clip of him introducing that. What that results in is people being directed rather than exploring, and that makes the world small. And I think that that is fundamental. And so when you talk to people who do this stuff at Google or Facebook, they'll say, well, it just means we need to make our algorithms better. But you can't. I mean, like you can't say we want to have a better form of constant incremental manipulation of every person. It's like the whole concept from the start is poison. Sorry, I just needed to put that pensive music in to give you pause for a while.
Starting point is 00:30:08 Carry on. Okay. So when we were talking about this before recording, I'll let you know that my general impressions of Jared Lanier were relatively good, but that's not to say I didn't disagree with points he was making. And I agree with the more general point about algorithms and the various commercial interests sort of directing human behavior, whether it's search behavior or Twitter behavior or whatever, in sort of subtle ways that might not have a big impact at an individual level, but at the bigger level can have an impact that we may or may not like. On the other hand, I'm a little bit more sympathetic to the algorithm than Lanier is, Chris, right? Because I remember in the early days of the internet,
Starting point is 00:30:51 there was a whole industry in things like search engine optimization. Hello Jeeves or Ask Jeeves, Ask Jeeves. Everybody wanted their page, their company page, whatever, to get to the first page of the search results in Google. So everyone would game the system, right? They would spam in keywords and just create paragraphs of texts in order to sort of spam. Search engine optimization. Yeah, that's right. Which was not good for the person who was wanting to do a search because people wanted
Starting point is 00:31:22 to sell them stuff basically, or self-promoters of various kinds kinds whether it was they just wanted to find something on wikipedia or something and there still are very good reasons for needing an intelligent algorithm that can try to figure out what kind of results are the ones that people are actually wanting to find as opposed to what the purveyors of content are wanting them to find. I mean, this isn't a guru comment. I just thought he is just going a bit hard and saying, well, you can't have any algorithm whatsoever. Like all algorithms are pernicious. I didn't quite buy it. So this gets to a point where he wants to argue for an alternative to algorithmic driven internet, right? And he links this in to data dignity, this concept he has, which I think we can talk about in a bit, but I'll play a clip
Starting point is 00:32:15 where he's juxtaposing the two perspectives he has, and then I'll offer my opinion on it. Yeah, the advertising model is the original sin. So then I would say is that within that, what would be the current search algorithm is basing on averaging out what other people have done. You know, it's based on this idea. It's a Maoist idea that the collective knows best. That was the original Google search algorithm. I would trash that. I would instead have a network of human publications about human publications. So I would have shoe reviewers who are paid, but not by the shoe companies,
Starting point is 00:32:51 but by subscribers through some sort of indirect means that would accumulate. I would have reviews of reviewers. I would have a world of humans who are recommending things, not algorithms. And I think that would be a better world. That would be a human-curated world. A human-curated world versus an algorithmic-curated world. So apart from that he endorsed the reviewing of reviews, which is, you know, good job.
Starting point is 00:33:18 We're on board with that. But to some extent, I struggle because I don't want to argue. I've had debates with this online with Philip Marklin, who is a researcher on Twitter, who lays a lot of blame for our current situation at the feet of the algorithm, like Sharon. And I think there are points to be made there, and valid points. There was a period where a lot of effort was placed on gaining engagement. The social media companies were just about driving engagement regardless of the content that they were promoting. We all dealt with the consequences of that. And I think
Starting point is 00:33:56 that is still there. It still happens. But I think the contrast with this human curated world that he proposes as an alternative, I don't see how that's any different than what we already have, because we already have reviewers like professional reviewers. We have non-paid reviewers who build up online followings from their reviews. We have people that are paid who just review tech products and have Patreons dedicated to it. And we have mega companies that release reviews. And in the same way, the algorithm being driven by what people are searching for and what they usually want to find when they locate those things, one seems like a little bit an extrapolation of the other. And I think there's
Starting point is 00:34:45 a danger of him leaning into a kind of utopian view of these decentralized communities versus the brutal algorithmic great flattener of Google. I don't know when there was a whole bunch of different search engines. It was annoying because it was Ask Jeeves and all those kind of things. And yet we still do have options for the way that we want to search the web. You can go on Reddit. You can go into the chans if that's your thing. The ways that you interact with the web, yes, they're filtered by mega corporations, but there's a lot of freedom and individual choice there if you want it. No, I take all those points. And I think a lot of the
Starting point is 00:35:32 time, if someone's searching for a recipe for how to make a Thai green curry, then it's kind of okay to use an algorithm to go, okay, a lot of people stopped searching for Thai green curry and found this one. And that was probably the recipe they liked. And as you say, those recipes are rated by people who use them and given stars and stuff. And when you think about what content gets viewed the most or proposed to you the most, in terms of you might like this, it's popular accounts. So people have voted with their feet, right or wrong. I get Brett Weinstein recommended to me regularly. So it is human-created to a large degree. I found his stuff genuinely thought-provoking. Even when I didn't quite understand or wasn't
Starting point is 00:36:18 fully on board, I kind of liked the sorts of things he was proposing because I found it thought-provoking. And I'll give you an example. When he was talking about this issue, which is not having these huge BMOF companies with their algorithms that actually drive everything. And in this model that we currently are in, most of the content on the internet is free. And the stuff that isn't free is getting driven to extinction, finding it very hard to survive. And because it's free, the only way for anyone who creates content or people that provide the platform for content to monetize it is via making the people the product, right? Whether it's through advertising, getting them into these sort of silos or whatever it is that that they do and what he's proposing is that the people that generate content or meta content in the sense of the people that are doing reviews or curating or doing all kinds of things that there is some mechanism for paying them such that instead of the consumer being monetized in terms of their attention and their eyeballs
Starting point is 00:37:25 and their exposure to advertising, rather people are flicking a few cents or a couple of dollars for something that they actually genuinely consider valuable. And you could operationalize something like that through these micro transactions or whatever, so that an independent journalist actually gets paid 10 cents or something like that when I click on the link to their article from Twitter. And that feels like a healthy step. Yeah, I can play a clip that relates to that point. So here's him talking about that, but also expressing his skepticism about AI. Do you think that there's some brain in a box that'll operate robots that take on human
Starting point is 00:38:04 jobs? Whereas in fact, if you look at these things, it's just as easy to interpret these devices as collaborations of people. A ton of people contribute example data and training data and desires and feedback to these algorithms to make them work. and feedback to these algorithms to make them work. Yeah. So the point I would agree with him there, Matt, is I think there is this mystique online commonly that everything is bots and everything is automated algorithms and the reality is that a lot of the things that people are complaining about or imagining involve humans in the loop. So for example, Amazon Mechanical Turk, which is this online platform for crowdsourcing micro tasks, the tasks that people post there are called HITs,
Starting point is 00:38:52 hits, right? Human intelligence tasks, because algorithms and AIs cannot do them well. And there's tons of things. And it can be as simple as getting categorizations correct on Amazon's product list when recommendations come up. Or it can be other things to do with content creation. And then the same thing with misinformation ecosystems. I'm not saying there aren't bot networks and that, but a lot of it involves state actors and involves people managing accounts. So I think acknowledging the role of humans in these ecosystems is crucial. And so if people were compensated for that as a new kind of labor, then the more algorithms and robots were functioning in society, the more compensation there'd be for people and the more
Starting point is 00:39:37 paths to livelihood and status and pride. So like every time I fill out a reCAPTCHA, I should get, and like I'm training Google AI. That's exactly right. I should get paid for it. Yeah, I think you should get a buck per reCAPTCHA session. No, seriously, that seems fair to me. I'm more skeptical about the notion that we should be getting paid for completing CAPTCHAs and stuff, because I know it was just a jokey example. But when people say, you know, when a service is offered for free,
Starting point is 00:40:06 that you are the product because your advertising data is being harvested, it is. But in large respect, it's being sold en masse at the user population level. Individual users, I think, are less. I'm not saying it never happens, but this notion that people have that your smart speakers listen to your conversations in order to tell you the product that you're likely to buy. It's not. More often than not,
Starting point is 00:40:37 it's just things working out through your social networks, the people that you're interacting with and tracing purchases in a way that is not based on this kind of perception that people have where their individual data is being monetized on an individual level. I think that is much more rarely the case. Yeah, I know what you mean. There's an element to that, which is a very popular idea.
Starting point is 00:41:06 And there's an element to it which is a little bit paranoid. And yes, your behavioral data is being harvested and contributing to a truly massive data set that is worth something to somebody. But it's totally worthless all by itself. And even aggregated, like I don't think the kind of income one would get, like the income I would get, the value to my particular shopping pattern, I don't think it would, I don't think it would buy me much. So I'm with you in the sense that I don't know if, if Lani's idea there is, would really change much, even if people were compensated, whatever
Starting point is 00:41:43 would be considered to be a fair amount. I suppose the aspect to it that I find interesting and thought-provoking, which is that we know that a lot of jobs are being automated. We know that more and more people are working in some sort of form of content creation, essentially entertaining other people online in one way or informing other people online in some way, shape or form. And it would be nice if there was an ecosystem that fostered that because I think in this sort of post-industrial transition that a large portion of humanity is going through, that can be made more healthy or less healthy so i find that idea interesting but i guess the devil's in the details yeah so let's see there's an example
Starting point is 00:42:33 he gives my about these ground keepers managing like kind of landscape gardening and he talks about you know what the difference about those people, the role in the future when there's more mechanization and the AIs are like taking over the manual labor jobs that they're doing, what their role would be in a kind of data dignity versus algorithm world. And again, I think it veers into a one-sided presentation, but in any case, it gives a good indication, I think, of the kind of thing you're highlighting. The difference isn't technological. They have exactly the same robots,
Starting point is 00:43:11 exactly the same algorithms. It's just a different ideology in one versus the other. And the data dignity feature is clearly the superior one. It clearly has more dignity, more beauty, more creativity, more respect for the future, more of a sense that we don't know everything, that future generations should accept our wisdom as absolute. It doesn't have that. It has an open-ended generosity towards the future, which is what we should have. Yeah. I think I played the one without so many details, just the waxing lyrical about that
Starting point is 00:43:42 of dignity. But, you know, it was essentially that the groundskeepers will be paid for their expertise to train the AIs to do the jobs and their expertise would be respected, not like replaced. that in line with everything being automated gradually and that automation requiring large data sets to teach the automation what to do, then there is a crucial role for people to play in building that. And once one thing is automated, I presume we'll probably try to automate the next thing. So there's nothing wrong with the idea of compensating people fairly for that. The other aspect to his ideas too is the idea of flattening out that exponential curve and that long tail of compensation and size of individuals in the information ecosystem. So at the moment, you see very few accounts, whether it's on YouTube or Twitter or anywhere else, with massive numbers of followers,
Starting point is 00:44:47 the distribution drops extremely sharply and you have this very long tail of much, much smaller accounts until you get to me and then you get to some other randos as well, right? Before you go on, Matt, let's hear him highlight that distinction. It follows a mathematical power law where there's just like a small number of winners and then this long tail that's incredibly desiccated the alternative to that is a bushy network where there's all kinds of little local situations which is what used to happen with local news and local music clubs yeah so high peak with a long tail versus a bushy network yeah so i generally like that idea because
Starting point is 00:45:21 everyone knows that the internet just intrinsically has a winner-take-all dynamic to it, right? Both commercially and in terms of popularity and everything else. with kind of a monoculture and you end up with just a few people having the stage and having all the rewards and everyone else kind of relegated to this role of hangers-on or simply an audience, a passive role. But if you tell all those people, you're not going to talk to each other. We're just going to talk to you. There's only a central authority here and you're going to all say, who's your favorite artist? What's going to happen is there's going to be a single reckoning for all of them. And then I'm going to say, well, since you chose this one, this is the one I'm going to show you more. And then what you'll start to do is you'll start to heighten that peak until it becomes more and more of what
Starting point is 00:46:19 we might call a zip curve until everything that's not right at the tippy top starts to disappear. So I have to admit, I kind of like the general idea of trying to diversify it a bit and flatten that distribution out such that there is a fatter tile. I think a good analogy is actually an ecological one. In the modern period, what we see is we see organisms of various kinds introduce species spreading all over the world. All of this diversity that was sort of there due to localization, like the particular species that lived in the particular area that I live in, for instance, get out-competed by weeds, essentially, that are brought in from elsewhere. And that phenomenon is happening all over the world. And that's a loss to the biological richness of the world. And I think that's a good analogy for the information ecosphere as well. If you could cultivate a little bit of locality, and it could be geographic locality, or it could be locality of interest, particular niche things or whatever, just subtle encouragement of that,
Starting point is 00:47:28 then it seems to me it would be a healthy thing, both for people who are creating content and for people who are consuming content and probably increasingly for everyone who's doing both. I know why you like this, Matt. It's because he lionizes the middle class. Does he? And so you end up with all these little peak illusions.
Starting point is 00:47:45 But if you have locality, if you have jazz clubs and local news, and if you have just local settings that are different, then you have a different mathematical result, which starts to resemble a bell curve, where you have a middle class in outcomes. And the middle class is where you get stability. It's where you can get nations and politics and things that aren't insane. There are problems with the middle class. My whole generation was rebelling against it,
Starting point is 00:48:09 you know. But the thing about the middle class is that weirdly— It's going to go on to say it's good. It provides the foundation for all the little interesting experiments on stability. Yeah. And I don't think people should get hung up on the middle class. Of course you wouldn't, Matt. It's because I know why think people should get hung up on the middle class. Of course you would, but it's because I know why this appealed to you. You heard the middle, your centrism antenna tweaked up into the air, rigid, and you were meat-tears grinder. Chris, this is the future that liberals want. We want bespoke coffee houses and custom- made knives made by men with big bushy beards
Starting point is 00:48:48 made local authentic and not some mass-produced stuff coming out of silicon valley that's not what this was this was part of his thing that raised my contrarian hackles because I have the same desire. I like these little quirky bars. I don't want an Irish chain pub commodifying my culture and selling it to the unwashed masses. But at the same time, I can't help but feel that part of that is like this fetishization of a bespoke and local experience. And it doesn't so much focus, you know, it's kind of like, oh, it's nice to visit there. And isn't it good that somebody is keeping this little gramophone store alive, but I'll go back to my house and listen to Spotify. And I don't know, I just had this feeling that there's a little bit of the fetishization of the exotic or the unusual. And I share it. I share that sentiment. So I had that concern.
Starting point is 00:49:57 And similarly, Matt, just one other point, you can address it at the same time. That point about a normal distribution and a bell curve, he gives this analogy about people searching out music and how if we don't do it algorithmically, that there will be a wider variety. But at the same time, he's talking about how there are peaks and troughs. Let's imagine you have a thousand people and you're going to tell those thousand people, we want you to form affinity groups and we want you to find other people who like similar music or whatever. You'll end up with all these clusters and then within them you'll say, oh, which music do you like the best? And you'll end up with a bunch of different answers from the different clusters.
Starting point is 00:50:46 And then if you look at all that, that'll start, as it gets bigger and bigger, it'll start to approach a bell curve, meaning that there'll be a kind of an average that comes out. And that's a mathematical feature of reality. If you make accurate measurements of some, a large number of accurate measurements of some phenomenon, you should get a bell curve of sort of an average coming out, which is where middle class would come from in an economy
Starting point is 00:51:09 in some hypothetically perfect market economy, which of course has never existed. But if you tell all those people, you're not going to talk to each other. We're just going to talk to you. There's only a central authority here. And you're going to all say, who's your favorite artist? What's going to happen is there's going to be a single reckoning for all of them. And then I'm going to say, well, since you chose this one, this is the one I'm going to show you more. And then what you'll start to do is you'll start to heighten that peak until it becomes more and more of what we might call a zip curve
Starting point is 00:51:38 until everything that's not right at the tippy top starts to disappear. Maybe this is the part that got to me was like pop music, Taylor Swift and all those kind of people. It's like people lamenting that they're popular and manufactured, but the fact is they are popular and people like their music. So is it, we might think it's better if people listen to avant-garde jazz, but is it objectively better? Okay. Yeah, now I hear what you're saying. Everyone likes McDonald's.
Starting point is 00:52:11 Why shouldn't there be McDonald's everywhere? I fucking hate that I'm arguing this. I get it. Why am I arguing this? I don't even agree with that, but that's the problem. You're just a contrary bastard, I know. Now, look, I mean, there's a sense in which I agree with you, and there's another you're just a contrary bastard i know now look i mean there's a sense in which i agree with you and there's another sense which i think he's wrong right
Starting point is 00:52:28 the recommendation algorithms and spotify do not push us all towards beyonce right i've not had beyonce recommended to me once because beyonce does not fit my listening patterns i do get avant-garde jazz listen to me because i am gerald lop just a bit slimmer with less hair. My current algorithm on Spotify is recommending me steampunk themed rock and steampunk themed rock. And I've got like a whole bunch of kids songs because of my children and then Ultraman albums, Ultraman female albums. And the other thing is Lo-Fi Beats and Retrowave Music. That, I'm not saying none of that is widely popular, but that feels like a fairly idiosyncratic collection, along with Celtic rock and shit like that.
Starting point is 00:53:20 Yeah, no, look, I'm with you. And you should never let your children use your accounts under your profile profile because otherwise you get like I got, which was those, the Winx princess fairy stories, volume seven recommended to you. My YouTube recommendations are freaking insane. So insane because of my children. But yeah. But look, here's the thing. I think we can all get on board with Blani because like everyone knows about the income distribution right and how it works right which is that the rich get richer and and you end up with this longer and longer tale of super rich people and not enough for everyone else so there has to be some mechanisms for kind of pushing that income to have progressive taxation move that income distribution a bit out. So, because that's just makes for a healthier society.
Starting point is 00:54:07 It's not saying it's got to be like strict communism where everyone gets exactly the same. It's just pushing it back to some degree of normality. I don't know if it has to be a strict normal curve. So I think it's a similar analogy for what he's arguing with this data dignity and so on, which is to just, here's that guy Destiny, some YouTube person or whatever. Like you have these mega accounts and okay, they're good,
Starting point is 00:54:32 they're popular, but are they really like a hundred times better than the next person? Like maybe give some of those other characters that are also popular, like nipping at his heels, just a little bit of a leg up. And because people in the long run might enjoy it. Yeah, look, it doesn't take much to get me to flip on this. Because when I just think about Logan Paul and the Impulsive podcast and the fact that anybody that is hosting that with him becomes a popular kind of celebrity. And they're such terrible terrible people
Starting point is 00:55:06 or joe rogan promotes brandon shelb the most untalented comedian slash mma guy but he's promoted by joe rogan so he can make a career out of it it doesn't feel right it doesn't feel like it should be like that but yeah it is so is. So I get it. I get it. Chris, Chris, I've just thought of a problem though, with everything we've just said, which is if we agree with Lania, right. And say, oh, you know, yes, these changes should be made. Isn't that a different kind of algorithm? Isn't that a hipster algorithm? Yeah, a hipster algorithm.
Starting point is 00:55:40 And who decides about that? Since we've got on the negative spiral, let me get the one of the arguments he made that really upset me. Emotionally, I was furious, flinging things around the room. What the hell do you mean? And let's hear what he said. Some of my targets seem relatively admirable now compared to other stuff. So, for instance, Wikipedia, by having only a single article for something. other stuff. So for instance, Wikipedia, by having only a single article for something,
Starting point is 00:56:10 like if you compare the encyclopedias on print that competed with each other because they actually made money from selling copies, you'd have an Encyclopedia Britannica and an Encyclopedia Americana. You wouldn't expect, in fact, you would be horrified if there were identical entries in both of them. They reflected a different perspective because there's no such thing as a view from nowhere. There's no such thing as a universal, absolute perspective in most human affairs. So let me just point out a couple of things there. First of all, if you had encyclopedias on the same topic that had completely different entries, no, that wouldn't be cause for celebration. It would be extremely confusing. And when you compare entries across encyclopedias, especially on popular topics, pretty much exactly the same, maybe a little bit of a different emphasis. Wikipedia,
Starting point is 00:56:59 on the other hand, has the ability for people to go in and look at the edits. I'm not saying that most people do that, but there's a little tab at the top that shows you the editorial process. So contrary to what he's framing it as, Wikipedia produced a single truth. Everyone knows that Wikipedia is edited by anyone and that it is subject to manipulation and its special interest can, you know, think. And, and Matt, Wikipedia is a fucking open source software, right? Like, so there are many different niche Wikipedias like Wikipedia on Star Wars or Conservopedia on science denialism, whatever you want. So his example of how there's been a flattening and a kind of single truth produced
Starting point is 00:57:49 is contradicted by the example that he gave. Yeah, I agree with you, I think, that he said there, really. Like, I think Wikipedia is pretty damn good. And as you say, it's totally transparent in the way that it's made. I'm not saying it's perfect, but it's huge. But if you take a reasonably popular topic, And as you say, it's totally transparent in the way that it's made. I'm not saying it's perfect, but it's huge. But if you take a reasonably popular topic, it's very rare that you strike stuff that is just plain bad. You know what I mean? Isn't that supported by donations from individuals?
Starting point is 00:58:16 Yeah, and it's still people contributing to it. I mean, maybe they should be compensated for it. That might be nice. And I also disagree with the very trend to go, oh, there is no view from nowhere every everything has its perspective and slant and whatever but you know i'm not sure that like a conservopedia and a wokopedia and the various different medias which have different lenses on which to portray i don't know whatever the french revolution say to refer at the deep cut back to an earlier episode, everybody. Like, I don't think that's good. If you look at the, like, I haven't looked at it recently,
Starting point is 00:58:51 but I'm pretty sure that if you looked at the French Revolution entry for Wikipedia, I think you'll find it's pretty damn good. So yeah, I'm not really on board with what I said. Yeah. You know, after condemning some of his hot o-ticks, I do want to give him credit that there are times where Barry Weiss in particular invites him to make hyperbolic denouncements of the internet in general, right? And he's quite resistant to do so, which I think is to his credit. He displayed a degree of nuance that I think deserves highlighting. I still believe in this idea of having this information thing between us, and I think it has potentially more benefits. And the benefits are real, even in social media as it exists.
Starting point is 00:59:34 It would be silly to say that everybody who finds someone else with some commonality, maybe rare illness, or everybody who enjoys a silly cat video or whatever, it would be silly to condemn all that. That stuff can often be either innocuous or wonderful, but it's the manipulative algorithms that are the problem. So I like that, acknowledging the positive aspects on the internet, like the ability for people to find community, or that lots of the internet is just innocuous.
Starting point is 01:00:04 Here you can see the alternative way that Barry is framing things. It feels like the internet should make everything feel more expansive, bigger, more elevated, more giant. But it feels like it's like exactly this metaphor of the escape room. It feels like it's shrunken us down like so what to what extent do you blame i guess the internet and social media for making us feel this trapped on edge feeling well you know um casting blame is a little difficult because we haven't had enough real world experiences with alternatives yeah so he definitely resists the lure there to demonize and make this kind of evil cohort of sinister villains that we can all hate on together.
Starting point is 01:00:51 He restrains himself from that. But I mean, I have to go back to some of his earlier arguments there. I mean, what Barry is mirroring back to him is sort of aligns what he said before, where something like Facebook, he says, is kind of against making communities or against allowing people to do their individual thing. But I just thought about, I rarely use Facebook, but I do use it. I subscribe to a local aquarium society, right? It's got like 50 members. Well, no, it's got more than that, but they're the people in my local town that have aquariums right and there's like dozens of these just this is just aquariums right and you can criticize facebook
Starting point is 01:01:31 for so many things and some of the communities are toxic and i don't know if you can really blame facebook for all the toxicity like it's frankly it's the people that are the members of them that are kind of the source of it. There's another local one, which is like a general one, which has all these randos complaining about each other's dogs and then politics and then threatening to call the police on each other. And it's horrible, right? It's terribly toxic. Then you've got the aquarium one. Everyone's very nice.
Starting point is 01:02:04 So I think these platforms, you could argue, they're all about creating communities and they're all about allowing that kind of diversity. And what people make of it is at least some degree up to them. I think us, the users, have to this kind of hyper-localism and ability to focus on niche communities and so on. I just kept thinking about, well, what about Reddit and that kind of things? Or he talked about paywalled content, right? about paywalled content, right? Like that if you have a minimum barrier to entry that you can make the ecosystem better. So this is him talking about that.
Starting point is 01:02:57 It should have a diversity of business models instead of just one. And it should emphasize people paying for stuff they want. Because I think you have to have a real stake in whatever you do for it to be real. And if everything's fake free, there's a kind of a casualness that contributes to the problem. So in other words, is it more likely that somebody is going to try to be a disruptive jerk at some book reading where everybody was admitted free or at some paid lecture where people bought tickets. Obviously, it's the free one because that person has less of a stake. Yeah. And, you know, he's right. A minimum barrier rate
Starting point is 01:03:36 to entry means that people are more interested in the topic and there's going to be like less disruption. But the same rationale is what led to Dave Rubin setting up Locals. And maybe he thinks what Dave Rubin is doing is groovy. That could well be the case. But I think the issue, like you hinted at, is what kind of communities you want to set up. You can set up a paywall for some pretty dark communities and only have committed members as a part of it, right? You're talking about the Decoding the Guru's Patreon now, aren't you? Yeah, you should see the shit that goes on in there.
Starting point is 01:04:16 It's terrifying, Matt. I am. So I'm just pointing out that I think I do agree that he has provocative ideas and there's stuff to be debated about all of these things. There is a bit of reinventing the wheel to some of the things that he suggests. And in a lot of cases, it is not reinventing the wheel, but it's incredibly unlikely to happen, right? The data dignity view that they're going to have brokers who are independently acting on your behalf to get the money that you're owed from all those companies, like that's a science fiction world maybe, but I just, I don't see that as a viable alternative.
Starting point is 01:04:57 Yeah. There's an aspect to the concrete suggestions that are made that are a little bit by in the sky with some hand waving in between. And the other thing too, is that many of the things he's suggesting are valid options. Like there is paywalled content out there. There are communities I'm sure where you have to have an identity, a verified identity in order to be a part of it. There is moderated ones. There are ones, there is 4chan, there is Twitter, but there's also other options. So Substack, for instance, are these gated communities where independent content creators are paid for their content. So in a way, the market kind of provides all of these options and then people will gravitate to the ones that they want. And
Starting point is 01:05:42 if they don't go to the ones that we think is best, if they want to eat at McDonald's instead of having the bespoke coffee, you can't really force them, can you? Yeah, and there's a part, it's kind of, I think this illustrates that point because there is a bit where Barry is asking him about what to do. And to start with, I thought there was a nice part where he took an anti-guru stance, where she invited him to explain to her what to do if she's using social media unhealthily. And I like this response to this. So listen to this. I'm really interested in picking
Starting point is 01:06:23 up on what you're saying about personhood and like the idea of sort of remaining three-dimensional So listen to this. And, you know, do you think that there is something, like if I spent less time with this, would I maintain my personhood more effectively? Well, you know, only you can answer that question. I really feel it's, if it's possible to notice when one's overstepping and talking about things that one doesn't know about, it's worth trying to not do that. I think it's very human to overstep that way, because we're always trying to understand the world and our place in it, and we naturally over-interpret our own interpretation. I feel really certain that I don't know what's best for you, especially with you right here. But you know... I like that. Like, decide for yourself, motherfucker. Because, you know, like,
Starting point is 01:07:22 if you want to get off twitter just go off it like i no 12 rules of life for you barry you won't give it to you but so you know this is just the nature of conversations and contradictions so he says that yeah he says that but then he goes on to suggest something and i i sometimes wonder if this whole thing about having to be on all the time is actually just false and that you would uh whatever it is you're doing whether it's selling books or getting subscribers on um what's that thing called sub stack sub stack right uh or whatever whatever it is you're doing it might be about the same if you just dropped all this stuff you know like you might not need it it might just be a. It's actually not giving you much.
Starting point is 01:08:08 It seems like it is, but maybe it isn't. I don't know. I mean, honestly, I don't know. Maybe I am special and unusual, but I just haven't ever felt some lack from not being on these things. I see, I feel like I'm about as successful as I would want to be. And I don't see, like, why do I need all that stuff? Yeah, I like that. It's a little bit of self-praise there. But I don't mind people who've got a healthy regard for themselves. And I like his style. I like the way he talks about it.
Starting point is 01:08:34 There's context there that has to be played because he was not being self-aggrandizing that because this was the previous point he made. But I just want to point out something. I've never been on social media. I've never had a Twitter account. I've never had one of the Facebook brands. I've never had a Google account. I've never had a TikTok account. I've never had a Snap account. And that's despite knowing some of these people and having had some kind of shared history in the companies of one
Starting point is 01:08:59 sort or another, especially Google. And with all of that, somehow or other, I'm able to write bestselling books and get booked as a speaker. I still have my A-list speaker status. I can still sell books and get reviewed in the good places and all that stuff. So why is that? I do have to say, and it's slightly off topic, but I think since the opportunity presents itself, Barry Weiss. So there's this thing about the intellectual dark web and the characters that inhabit it. And they're always concerned about the same sorts of things, Matt. And here's an example. I was at a dinner party recently where the argument was, is the world getting better or worse? And by every metric, of course, as Steven Pinker and others have pointed out, it's getting better. There's less poverty, like life expectancy is so much longer.
Starting point is 01:09:56 And so I found myself arguing for that position because intellectually I know it to be true, even though in my heart, I feel like things are getting worse. Or at least I feel scared about where things are going. And I wondered if we could start there. Sure. Who's right? Yeah. Is it getting better or is it getting worse? So the reason I played that clip is just I think the intellectual dark web people need to stop going to elite dinner parties.
Starting point is 01:10:26 This is my intervention for them and Douglas Murray and various others, because they always get triggered by these discussion topics that they have at these fucking dinner parties. I have never been to a dinner party where people have said, okay, everyone, ding, ding, ding. Listen, the topic for this evening, is society getting better or is it getting worse? Let's go, you know, give us your best arguments. Like, I don't know.
Starting point is 01:10:54 I guess this happens. I guess this happens in the elite literary world that these people inhabit, but it just seems so much of their issues revolve around the conversations that they have at those events. Yeah, I'm just seems so much of their issues revolve around the conversations that they have at those events. Yeah, I'm just complaining, Matt. If anyone listening to this podcast lives on the Upper West Side and would like to invite us to one of these parties, we would love to come. That's right. And here's another example, Matt.
Starting point is 01:11:19 I think that increasingly, it's hard to get through books that don't suit the intellectual orthodoxy. Yeah. And it's possible that the moment of the book has passed. I don't know. I'm going to try again. I have a couple more coming in. So just that, is it? God's side had the parasitic mind. The Impermanent was published, what, like last week? Douglas Murray was published what like last week douglas murray was published last month the heterodox market for books is not small and hard you cannot publish anything like how can they still revel in this like persecution status barry barry barry barry mean, we've had some small personal contact with the publishing industry and you get the very strong impression is that those motherfuckers will publish anything that sells. They desperately want to sell books. That's fair enough, right? Their first check on whether or not they publish a book is not, hey, is this lining up with the institutional orthodoxy? Look, no. That does not happen. I think I can grant them that there are going to be publishers who can get spooked, right? Could pull books.
Starting point is 01:12:30 There's all the stuff in young adult fiction that you see. But there's also a healthy market for Steven Pinker, Jonathan Haidt, the coddling of the American mind. These people do get published. Brett fucking Weinstein. Yeah. So I just find the constant persecution narrative tiring. And this topic, it gives Barry the chance to exercise that. But to his credit,
Starting point is 01:13:03 Joanne Lanier doesn't even seem to click on what Barry's on about. He says something relatively blunt and neutral. No, it isn't. And he goes on to explain that he thinks books are an important stalwart against the issues that he's identifying. So this was a take. I'm curious what you think about this one. The first thing to do is to try to diversify the economics of whatever you do, not with a greedy eye to finding every possible way to make a nickel, but with an eye towards not being beholden to any one stream. So I don't have anything against subscriptions.
Starting point is 01:13:39 Like I say, everything's horrible, so have some horrible subscriptions. But also maybe um i believe in books i think books um books have an interesting quality in that they take so long to write and there's such a pain in the butt to write that they require you to make a statement about who you are in a sense a book is a stand-in for a person it's not fleeting. It's not immediately reactive to whatever is there. It's saying like, no, this is, I am saying I've climbed the mountaintop and this is what I see. And it took me, it took me a whole year to climb this mountain.
Starting point is 01:14:16 And I think that's incredibly valuable. In other words, the inefficiency of the book process is its value. What do you think about that? Yeah, I think it's pretty good to me. I think there's a place for books too. People still read books. I think so too. There's just the hipster fetishization of books. It seeps in, right?
Starting point is 01:14:36 Like, how surprising that we would prefer physical media over fleeting digital stuff. Oh, yeah, yeah. I'm not on board with that. I much prefer my Kindle. I can't read a physical type of book anymore. Yeah. There is a topic that he brings up, which I think aligns with the Barry Weiss-ism perspective, but there might be quite a bit of validity to it. So let's see what you think. This is about digital Maoism. This is a phrase of yours that I've become obsessed with. And I kind of can't believe that you coined it in 2006 is this, this phrase digital Maoism. Explain to me what digital Maoism is. Do
Starting point is 01:15:13 you even remember coining that phrase? I do. That was an essay. Digital Maoism was an essay. And I, you know, I got a certain amount of criticism for it as red baiting or something. But I, you know, what's funny is when it was translated into mainland Chinese, the translator called me and said, what do you want to say here? And I explained it. He said, oh, okay, that sounds right. And they published it. Really? As digital Mao.
Starting point is 01:15:35 I don't know if they would today. Maybe they were proud of it. I mean, maybe they were hearing it in a different way. Well, also, you know, times change. I don't know what would happen today. So digital Maoism, Matt, what is digital Maoism? Anyway, the idea of digital Maoism was that the way engineers like to insert algorithms between people tends to create feedback loops that make people fit more and more into what
Starting point is 01:16:07 the algorithms expect, because the people see the information from the algorithms and are able to just do whatever they want to do if they conform to what the algorithms expect more and more, until you start to have this kind of uniformity and official single reality that is not unlike a cultural revolution feeling. And so I thought digital Maoism was an apt title for it. I guess Barry Weiss says, digital Maoism, cultural revolution, go on. But that's a lurid term, really. But underneath it seems like a reasonable point. So the idea is, is that people's behavior online, we're a little bit like ants. You know how ants, they lay down a pheromone trail. There's this process where ants
Starting point is 01:16:57 are exploring, trying to find the sweet, sweet sugar. Ants find the sugar, they come back. That's a good place to go. All the ants follow the trail and they stop exploring and focus on carting the sugar back to home. So in any kind of collective process like that, there are two things you want to optimize, which is exploitation and exploration. I take what he's saying to mean that the algorithms or just people's interaction with the algorithms is, I think, sinister. It's just a pretty good solution to your thing, whether it's a search or a music suggestion or whatever is found. It starts promoting that content to other people, just like an ant laying down a furman trial. Everyone else agrees that's pretty good. And pretty soon people stop exploring and just focus on exploiting, if you like, that one particular piece of content. I think what he's saying is
Starting point is 01:17:43 that hidden in that is something we might not notice. For any given person, the optimal thing for you to do is follow that pheromone trial or follow that recommendation because it is the best option for you at that point in time. But sort of collectively at a broader scale, it's like a local optima. We're forgoing the opportunity to do a bit more exploration and maybe find some more sweet, sweet sugar content out there on the internet. Yeah. So I found that stimulating. I don't know whether it's true or not. I'm not smart enough. I don't know enough about algorithms in the internet. Yeah. And I do have to give him credit as well. He does highlight points that it isn't just algorithms that do manipulating and recommending, right? Like, so he says this.
Starting point is 01:18:26 I mean, attempting some sort of absolutely pure lack of manipulation is senseless because that would end all communication. What we want is to end algorithmic, massive scale, sneaky, disingenuous, constant ambient manipulation. That the world is damaged damaged by that we should not have but you know a little manipulation like if somebody says hey i'm a book reviewer this is a great book you should read this and they throw in a few zingers who great like i don't like i'm not a purist about lack of manipulation i i think that we manipulate each other all the time i think uh we would be quite lonely if we absolutely refused to manipulate one another at all. How do we get people to fall in love with us? Yeah, yeah, yeah. I mean, like, let's face it. And
Starting point is 01:19:10 like, you wouldn't want to remove illusion from love, right? That would be horrible. Yeah, I like that because there are points where he's not a utopian. He's a realist about things. And I listened to some of his other long-form interview with Lex Ritman, and he had the same tendency to say at times, I'm just giving one angle to look at this issue, and I'm not saying there aren't others or that I know what's going to happen. And I think that level of humility is kind of appealing, recognizing that you just have a perspective and recognizing that even in the future that you imagine, we'll still be manipulating each other. People will be engaged in lies and all that kind of thing. There is no utopia at the end of the
Starting point is 01:19:56 rainbow. Yeah. I got the same vibe. He didn't give off the feeling of being a wild-eyed prophet coming down from the mountain, warning of dire things and why we all must absolutely do this one specific thing that he's absolutely certain is going to work. He came across to me like a little bit like our friend Liam Bright, who was saying, look, these are the problems with the journal article review system. Here's some out-of-the- box suggestions of how we might do things differently. A little bit of it is hand wavy and maybe we're not quite sure
Starting point is 01:20:30 how some of the nuts and bolts are going to work, but it's a good thing to be identifying problems and casting about for solutions. So how about this one though? So Matt, the digital Maoism you were on board with, what about this? I didn't like the term. The first year or two of most of these things is actually kind of charming, you know?
Starting point is 01:20:51 But the thing is, it's inexorably on a path to the manipulation machine, and the manipulation machine intrinsically makes everything dark and paranoid and creepy and exploitative and horrible and turns people on each other intrinsically and irrevocably. And so we're seeing that transition happen on TikTok now. And so, you know, one has to be subtle about this. It's hard to, it's hard to criticize somebody who's having fun dancing on TikTok. And yet it is part of this thing. And it reminds, I hate to say this, but it does remind me a little bit, you know, the cultural revolution as an engineered youth movement had them dancing, you know, and they'd go on these little dance trips, visiting villages,
Starting point is 01:21:37 doing their dances. And it's a little like TikTok, you know, it's a little engineered. And it's a little bit and it's a little bit. It's not totally, it's not exactly the same. There's nobody at a desk at TikTok engineering the dance. And yet there's a certain kind of... But isn't there to some extent? Well, okay. I mean, isn't like, isn't the Chinese Communist Party like scraping everyone's data on TikTok?
Starting point is 01:22:03 Yeah, you seem skeptical. Your face seems skeptical. I've seen that look. There's a loose analogy because the cultural revolution involves people dancing and TikTok involves people dancing and TikTok is headquartered in China. It feels too cute, the connection to me. And Barry Weiss says, isn't it really the Chinese Communist Party at the heart, like making everybody dance? I don't know. Is it? Is it? Or is it the medium of TikTok lends itself to those
Starting point is 01:22:35 short clips with music? Yeah. I didn't like the term digital Maoism, but my interpretation of it, which I described, was not so dramatic. But in that framing there, when they talk about the creation of this manipulation machine and so on, it does feel like a little bit hyperbolic to me. He takes it a little bit far, but I'm not sure. I don't think about this stuff. I haven't thought about it as much as he has. Yeah.
Starting point is 01:23:02 A bit later where he talks about Twitter and other social media and likens them to like Skinner behavioral experiments. And I thought that was quite interesting. What Skinner learned was that a slightly randomized noisy reward and punishment feedback system was actually more motivating and had more of a behavior mod effect than a perfect one. So a little bit of randomness is actually built into the system. So you don't know who the victim will be.
Starting point is 01:23:34 And that actually makes it more powerful because everybody's on edge all the time. It's unpredictable. This is him talking about pylons on Twitter. Yeah. And this is something actually you and I've talked about before too, which is there is an aspect in which things like Twitter are a game. Yeah. And it's an interesting game and an intriguing game because there are stakes at play. Well, that's
Starting point is 01:23:57 one bit that I don't entirely get though, because I agree with T. Nguyen, the philosopher that we've had on that's discussed about it. And the general sentiment here that one, that the fact that the reward is variable and unpredictable in certain points makes a thing more addictive and more interesting. and particularly Twitter, is a kind of playground for people to exercise their intellectual and social intelligence in order to gain status and increase their follower count. There's obvious gamification aspects of it, but the particular point they make about that part of it is the Russian roulette of who's going to be the main character of the internet being dunked on. I don't think that is really a feature that invites engagement. It's like I see that more as a byproduct of the things that make people want to do it, not the feature that keeps people interested to play. keeps people interested to play.
Starting point is 01:25:08 Like I know dunking on people is satisfying, but I mean that you're not thinking, oh, I just hope somebody falls down today so I can dunk on them. But God help it if it's me. Like that's not the appeal, is it? Well, look, here's one way to look at it. Firstly, one of the things that makes games appealing is that there's the possibility of losing as well as winning. One of the things that makes games appealing is that there's the possibility of losing as well as winning. And this has got to do with that random ratio reinforcement schedule, which is uncontroversial now to say that computer games and gambling and social media are kind of addictive or at least reinforcing because they are low effort activities that deliver those kinds of rewards.
Starting point is 01:25:48 Now, the interesting thing is the reward, the payoff schedule in social media. So in your typical gambling game, most of the times you play, you're going to lose a little bit. You lose a little bit, you lose a little bit, you lose a little bit, and then you win. You might win a little bit or you might win a lot, right? Every now and again, you win a lot. So the interesting thing, if you look at social media as just like a reinforcement delivery machine, then it's skewed the opposite, right? You might get these incremental gains, but it's like snakes and ladders, right? You might land on the snake and go and have a great big fall. So you see it in these Twitter
Starting point is 01:26:22 dynamics where people are like Icarus trying to fly as close to the sun as they can with the hottest, most outrageous takes that are just hot enough to generate all of this engagement. But if they go a little bit too far, then the tables could turn. And I think there's something to that in terms of it makes it an engaging game. Maybe I'm taking a too idiosyncratic approach or myopic in a certain way like because i'm really viewing it through how i interact with twitter and the possibility of getting cancelled is not something that makes me like more excited to be on twitter but you know it's horses for courses so i can see there are accounts that want to offer hot takes and fly close to the sun for that purpose. So, yeah, maybe this is just me.
Starting point is 01:27:09 And I don't think I'm trying to angle for the hottest possible take that won't get me into trouble. But you're right. There are people on Twitter who do. There are accounts, especially big follower accounts, who are in the hot take, like, game. Yeah. It's like anything sometimes you tell a joke and it falls flat sometimes you say something you think is whimsical and funny and actually it's just a little bit embarrassing and sometimes people think it's great and i think people find social interaction not just social media but just all social interaction engaging
Starting point is 01:27:42 for that reason you know i've been dunking on old barry i've been giving her a hard time but i'm gonna say she does me yeah i've got what you've got a complicated relationship with barry i know no i've just i'm gonna i'm gonna play a clip where she's saying something similar to what we've just been saying so who's the fool not let's listen to this i think the problem is that um judging other people has its own addictive potential. I think you can get drawn into a cycle where you get more and more that way. And then you turn into one of those people at the family Seder who's annoying. And I don't want to do that. I don't want to be that person.
Starting point is 01:28:24 Well, Twitter is basically the most addictive video game in the world. satyr who's annoying and i don't want to do that i don't want to be that person well twitter is twitter is basically the most addictive video game in the world because it uses real human beings that you get to you know it's not just nintendo where you get to like smash the duck or whatever you get to smash real people and to watch as like some of the most celebrated figures in the world spend their brainpower doing that is disturbing to me. And I felt the way that it even has changed me. Right. And it's often the parts of society that should be allies with one another, devouring each other.
Starting point is 01:28:59 Yeah. So all that seems sensible to me, although I will just add on my Nintendo smashing the duck. Smash the duck. Smash the duck. That whole game. Smash the duck. I think she's thinking of like the turtles. Yeah, yeah, the Koopas.
Starting point is 01:29:17 Do you remember the original Mario game? Yeah, I was thinking, did she mean duck hunt? But you weren't like smashing the duck, right? You were shooting it from a distance. Just, she's a gamer, like you say. So I didn't mean that as a dunk. It just, that stuck out to me. But the start of it, Matt, do I feel seen being judgy of other people, judging them
Starting point is 01:29:38 and making you a terrible person? So called out by Jaron there and legitimately so, yeah yeah i think so i think so that they're right it is a game and that there are aspects to it that are unhealthy but not everyone i think gets drawn not everyone turns into a james lindsey so i'm just wondering to what degree should we be worried i mean it's a real thing that they're referring to but is it algorithms or is it just people being people and there is yeah an element of that you and i have both repeatedly said to each other that the way people are online although you know sure they're a bit braver and so on but but if you're a dick online there is a high chance that you are a dick in person. I just think people's online and
Starting point is 01:30:25 offline personas is not as distinct as a lot of people like to imagine. Yeah. Yeah. This is this personal theory of mine, which is that even though an individual tweet or social media messages can be easily misinterpreted, and that's one of the difficulties with it, it's a narrow band of communication. In aggregate, you end up with a pretty good idea of a particular person's personality after having interacted with them for any given length of time. When I think of the accounts or the people behind the accounts that I've been mutuals with for months or even a couple of years. They don't really surprise me for good or for bad. Take Maria. Do you remember Maria? Maria Chong. And like, she's a really nice person and that comes through and she's interesting and smart and well-informed and well-read. And that comes
Starting point is 01:31:19 through on her Twitter account and she might have a nervous breakdown change overnight, but you do get a pretty good sense of who people are. Yeah. So I guess what I'm saying, the reason I mentioned that is that I attribute more of the variance and more of the culpability to the people involved. And me too, I want to blame the individuals. Like I'm responsible for what I do on Twitter. That's me, right? When I'm going to dunk and I know when I'm not for what I do on Twitter. That's me, right? When I'm going to dunk and I know when I'm not. And I'm not saying the things that the social media algorithm feeds us doesn't influence our behavior. But I think at root, you have to put the humans at the root of it. And for better or worse, that's where a lot of the bugs stop
Starting point is 01:32:05 or should stop. Yeah. Just to reiterate what you're saying, I mean, I'm delightful on Twitter and I take full credit for that. That's not the algorithm. That's true. That's true.
Starting point is 01:32:16 You're looking pensive. Just let that one go. The only reason I get hassle on Twitter is because there's no ability for people to hear accents in text. That's why if it was there, they wouldn't be so sensitive. They're just thin-skinned babies. But if I'm talking to them directly, they're fine because they can hear the tone.
Starting point is 01:32:35 It's their fault, not mine. Look, there is something that Jaron does, which was very familiar to me from other gurus. And, you know, in his case, actually, this might apply a bit more. But when they were talking about China and TikTok, he said this. I'm in a bit of an unusual position in that I've chosen to remain an insider in this world. And whether that decision is the right one or not is hard to know with certainty but anyway it's a decision i've made and very few have so i currently have this arrangement with microsoft where i'm encouraged to speak my mind and i do as you know but on the other hand there's some places where i have to kind of be careful and since we almost got tiktok at one
Starting point is 01:33:20 point i kind of shouldn't talk too much about it, but it just happens to be the thing of the moment. The lady doth protest too much, me thinks. Like he does have positions of influence, or at least he has real positions within Microsoft and so on. I did get a tendency towards the I know powerful people. And you could see it in the intro segment as well, when he's talking about his neighborhood. His house sits on a winding road next to the home of other legends of science and technology. A friend of mine who discovered dark energy and won a Nobel Prize is a few doors that way. And like Danny Kahneman is a couple of houses that way.
Starting point is 01:34:02 And the woman who just died next door was one of the founders of 20th century number theory and etc i mean there's just all the guy hi kitty and perhaps as you'd expect from someone who helped dream up the internet yeah okay yeah all right he's a bit pretentious but he would you know it's part of his colorful personality, Chris. Am I being too generous? I can be a little bit like that sometimes. Can you? Like, I don't think I've ever been in and said, oh, you know, this is Oxford University. Look over there.
Starting point is 01:34:34 On that street, this is where Richard Dawkins lived. But down the road, this was where the code was cracked. Well, I've never done that again. I always went to universities that were built out of besoplock and concrete this is northern ireland over there there was a man's and this road was has the most bombed hotel in europe the europa and the yeah i don't know i like the reason i objected that a little bit is because it contradicts, to some extent, the image against elitism. And it feels very much that he has contradictory elements to his character,
Starting point is 01:35:12 like we all do. But this part leans a little bit towards the Eric Weinsteinian referencing friends in high places, and it's not a nice thing. But it's a minor point. It's a minor point. I think it's a fair point,'s a minor point i think it's a fair point and you're right to note it it's it's not as bad as smash the duck i've just got this idea in my head of this game this game we have a baseball bat and you go around just smashing ducks and just this puff of feathers or i'm kind of imagining her just you know smashing the screen
Starting point is 01:35:42 as the ducks fly past on dot com but i'm not saying that because she's a girl. I'm saying that because Barry and the way she described it, that's all just to make that clear. Not saying girls can't be gamers. I didn't say that. Okay. Who's there? Who's there?
Starting point is 01:35:58 Who's been defensive? No. I didn't even. Who's protesting too much now Chris? Well, look, we've covered quite a lot of grind with old jaron right he's he's give us things to think about he's been interesting hasn't he i mean i can confess in wrapping up my opinion matt that i started out i really didn't like him, I find it annoying because I put him in the category of the kind of Luddite, the sky is falling preacher against social media.
Starting point is 01:36:33 And I find those people often holier than thy and to be substantially exaggerating the situation and basically unwilling to acknowledge any positive aspects of technology in their quest for their narrative. But he doesn't do that. When I talked to you about it, you influenced me, Matt, and said, you know, no, he's not really doing that. Like, listen, he adds caveats and he does. He has caveats. He acknowledges positive things. And his quirky personality is just a personality. And yes, he's like an eccentric internet guy with dreadlocks. But the world is better for having people like that now. Yeah, he does brag a bit, but he's American.
Starting point is 01:37:20 If that's the baseline, if Barry Weiss is the baseline, he's relatively modest. Yeah, like you said, well, what do you know? I sent it off your rough edges and pulled you back from your rush to judgment. I'm kill surprised. But yeah, look, I disagreed with a bunch of things that he said as I was listening to him. I mean, there was stuff that we didn't cover. For instance, he totally kind of poo-pooed that there was any kind of artificial intelligence or intelligence in AI at all, that it was all just extracting and averaging out the intelligence of
Starting point is 01:37:49 the people that helped create the data set. And so that gets a bit philosophical, but I thought he made too much of that. And we talked about a few of the other things that I disagreed with him about, but I was not assessing him like a scorecard in terms of how much we agreed or disagreed, because he knows an awful lot more about this stuff than me anyway, so that would be kind of pointless. I mean, what I can say is that it was interesting. It was thought-provoking. I didn't find myself bored and frustrated or progressively getting upset listening to him like I've done with many of our other gurus or like with Joe
Starting point is 01:38:25 Rogan just wanting to beat my head against the desk. I'll also say that one of the signs is I agree that like he's an expert in this area and we are really offering our relatively uninformed critiques. But I think justified in the sense that he's offering, you know, a particular narrative. He's not offering a very technical analysis of the situation. But on the topics that I do know well, I find he represented them accurately and he used things not in a egregious way. Yeah, he was using, you know, psychology studies that are relevant, not to show off that he knew the psychology study. That was the feeling that I got for a bunch of stuff. Yeah, exactly. Like, for example, when he compared the intermittent rewards that people are getting
Starting point is 01:39:17 on social media to the Skinner boxes, that's a totally apt comparison. He's not just dropping in some psychologizing for window dressing. So yeah, now he comes across as substantial, whether he's right or wrong, whether he's making too much of it or not. I think he's a substantial thinker. So there's one, two clips, Matt, just to finish with that are on brand for us. And he talked a little about the need that people have to explore the frontiers of knowledge. And this is what he was talking about there. And the thing about the UFO world is that people have a lot of fun with it. And it provides an answer to a human need. And people need to be able to have something to obsess over.
Starting point is 01:40:12 People need to be able to exercise their brains, thinking about something beyond the edges of what's official. They need to be able to have common quests. They need to be able to explore things that might not be true because otherwise truth calcifies. You know, like, I actually think this is legitimate. I'm not making fun of these people or looking down my nose at them. I actually think this is something we all need. And so there are some of these things that are popular that are genuinely harmless, so far as I can tell. Bigfoot is a great one.
Starting point is 01:40:43 So, like, all of that pretty much i think he's accurate like people do desire counter intuitive narratives they do like the world to be full of mystery and spirituality yeah and for there to be some sort of forbidden knowledge out there something to find out that isn't commonly known. And what he emphasises is that these pursuits, while they may be futile in a kind of material sense, satisfy a need. But he emphasised that they were what he saw as harmless ones, which I think was important, right? So he's distinguishing stuff like Bigfoot to the kinds of conspiracism that we tend to focus on. Like you generally won't find us getting upset because some people have a Bigfoot society
Starting point is 01:41:31 where they go around looking for Bill or the Loch Ness Monster. But COVID is a different matter. Yeah. And the nice thing, Matt, you teed it up nicely, is he acknowledges that. The point is that I think that being obsessed with something at the edge of thought is really important. And being able to do that with other people is really healthy and maybe even vital. The thing is, some of these things are damaging. An example is people who are trying to talk to dead relatives and get you to pay, get you to pay the money,
Starting point is 01:42:00 the seance type people. That's exploitative. People who are just making a lot of money, like extracting money for it. Like, you know, I lived in Marin, so I have really had my fill of astrology. Let me tell you, I've heard enough astrology for many lives. On the other hand, whatever, as long as somebody isn't like draining someone else's finances for it. Like, I have an example of somebody who's facing a difficult illness and has lost a lot of money to deny what negative consequences and how that can be exploited. And I think that's indicative of a lot of the nuance he has throughout the content that we looked at. And I do think he gets hyperbolic at times with his anti-algorithm stuff or his Wikipedia bashing. But fundamentally, I don't think he's one of the more toxic guru types we've looked at. He has self-awareness. He pulls back from being offered the chance to lecture. And yeah, he has moderate and
Starting point is 01:43:22 interesting take. So I ended up liking him despite myself. Yeah, good on you, Chris. I'm proud of you. I am proud of you. Yeah, I'm glad you played those clip laughs because yeah, as you said, it's a good example. He's making a good point. It's nuanced and he's thoughtful
Starting point is 01:43:37 and he's well-informed and- Groovy. So yeah, I'd recommend him. I have no idea whether his fears and concerns and the things he's warning about are totally overblown or a little bit overblown or maybe they're even worse than what he was saying, or with the little ants analogy and that issue of the trade-off between exploitation and exploration, which is a well-known thing in computer science and animal behavior. I'd never thought to apply that to human beings operating with the various recommendation or search algorithms. So that's just an example of the kind of connection that I personally made just by listening to him. So yeah, groovy guy, interesting thing,
Starting point is 01:44:30 some areas of agreement and disagreement. And the first in our opening of Tech Gurus. So not a bad guy to start with. Next up is Lex Fridman. Let's see if we're similarly positively disposed towards his content. We'll see. Could be. We'll see. Yep. We're keeping an open mind about it. We'll take the content for what it is. Thing is, just feel a lot of love towards him. Just think love is important. And that's what I think, Matt. Lovely man. Lovely man. So, you know, we got to that stage, Matt.
Starting point is 01:45:07 The nighttime is closing in. The moon is cresting in the sky. The birds are going into their little nests. That's right. The performers are exiting the stage. It's time to draw a discreet veil over our performance. We do so usually with a turn to reviews that we have received in order to incentivize people who have provided them.
Starting point is 01:45:33 Unfortunately, this week, I don't have very many negative ones because we've been too good. So I tried to look for some negative feedback, but couldn't find any. Wait, Chris, Chris, there was one that you shared on Twitter. Oh, well, yes, there was one. All right. We'll include that. So this is, yeah, I guess it is a negative review of the podcast from an anonymous person.
Starting point is 01:46:01 I will not dox despite the message that they leave. So, well, let's see. Let's hear it. I'll read it in their tone of voice. Ugh, just checked in on you after unsubscribing from your pod because of bias, like not outing Kendi as the fraud guru and grifter he is. And your bias got even worse. Stop telling people you're only slightly left. You're an anti-democratic, sociopathic, radical leftist terrorist now. The same level as the anti-fat pigs who vandalize
Starting point is 01:46:31 cities. But you're on an academic level. You're an epistemological terrorist. If the world ends, it will be because of people like you. Because of bullshitters. That's a harsh review. Tell us what you really think. Three stars. Yeah, he doesn't like us, Chris. Well, he doesn't like me. He doesn't like me. That was to you personally.
Starting point is 01:46:58 Yeah, I never get these. I never get messages like this. Funny that. Why? Why not? I don't know. You weren't nice about candy too i swear to god the candy episode jesus christ oh dear you know because we don't need to rehash it
Starting point is 01:47:16 there's multiple episodes that we did on that episode there's the episode there's the grometer episode go back you know if you think that we were really on board with everything Kendi said, you need to re-listen to the episode. We're very clear about the parts that we agree with and the parts that we don't agree with. And, yeah, just, you guys, go. Yeah, it's never enough for these people. Never enough,
Starting point is 01:47:38 you know. They want blood, Matt! They want blood! That's right. They want us to get Kendi and do something terrible to him. But, you know, we have the same, you know, we get a similar not in the same tone, it must be said, but from the left-hand side where, you know, these
Starting point is 01:47:54 milquetoast liberal neo capitalist something or others that are far too tepid, very problematic, not revolutionary enough. So, you problematic, not revolutionary enough. So, you know, I feel it all balances, they all balance each other out. That's a very centrist thing to say, isn't it?
Starting point is 01:48:13 It is a very centrist thing to say. Good job. I endorse it. So the two positive reviews, the two, because they're short, they're short. This is from Yuko Boz. Five out of five. Good job. Please rate and review this review.
Starting point is 01:48:28 Thank you. So this is someone giving the system. I don't know if I should reward this, but I've got to say, I rate your review two out of five because it was too short and it was too cheeky. So that's what you get. That's right. Yuko Boz. That's right.
Starting point is 01:48:44 That's pushing it. That's right. That's pushing it. That's right. Where was the, you gotta give us something, give us some material to work with. That, that review was just taking. There's two, there's two other reviews I have to mention. I kind of like this one, Matt, because I, I appreciate the qualifier. This is from Ashley Patience and it's enjoyable. Five stars.
Starting point is 01:49:07 She says, good content. Episodes are the perfect length for listening while you do other tasks. Matt and Chris are both quite likable and reasonable, making the content enjoyable, even when I disagree with their assessments of the content. So I like the qualifiers there, right? Like one, you know, this is the perfect, like, if you need to do other tasks that will, you know, occupy your attention.
Starting point is 01:49:32 And then secondly, the quite likeable, you know, quite reasonable, not very, just quite. No, no, I like that too. That's modded, I i'm sorry moderate tempered praise yeah then that's that's the best i'm done with that i'm done with that yeah yeah yeah but people are listening to our podcast doing other things i always imagine them at a desk yeah that's the way you're supposed to do it so So we need to highlight that a bit more. So the other thing, Matt, just a little thing at the end, we've ran really short this week. Adding extra things in is always good for people, but we did get a couple of people demanding that
Starting point is 01:50:15 we authored our take on Jordan Peterson's flouncing out of Twitter, where he made a comment about the swimsuit illustrated model not being attractive enough for his liking. He got crushed for that. He got upset and said, everybody's too mean and Twitter's such a bad thing and away he went. And I just want to say, like, I think I take on this as fairly straightforward. Jordan Peterson said something stupid
Starting point is 01:50:39 that's nothing surprising there. And then he spun it in the persecution narrative and flunked out. He'll be back. He cannot resist the lure of the Twitter gratification machine. But yeah, there's not much to analyze, is there? He just judged an attractive woman who, you know, slightly larger than his standard for swimsuit models, but very beautiful. And who gives a shit if Jordan Peterson got off on it or not?
Starting point is 01:51:07 I don't care. Yeah, I can't think of anything intelligent to say either. I mean, it's been dunked on to death. It's been dunked into the ground. I mean, I'm almost tempted to take a contrarian take and say, well, he's just giving his opinion. No, that's not true, actually, because I realized that what he did do is he said
Starting point is 01:51:23 it was an example of, what is it, authoritarian. That was the problem. He didn't just offer his opinion. He also linked it to this is them wanting him to think that she's attractive. Yeah, and I didn't quite get that. He's talking about the movement to say being fat is not unhealthy and all that kind of thing. Oh, I see. Oh yeah.
Starting point is 01:51:47 So he's in his mind, Sports Illustrated has been subject to pressure and has capitulated by putting on someone who is in his mind too fat to be attractive. And of course, it's vitally important to our civilization that the models on the cover of Sports Illustrated are exactly... Hugely significant for Western civilizations. Okay. And should people be interested in more content like this, they can download the podcast feed that they're listening to this on,
Starting point is 01:52:17 but they can also follow us on Twitter at guruspod. You can join our subreddit you can go on the discord you can join our patreon if you so choose you know there's tons of things you can do we'll put the gurometer episode on the patreon so there you go if you want more
Starting point is 01:52:38 details about the coding of Sharon there you go and now Matt the thing to say at the very end is just that we are on our way. You should, you know, do not forget the gin, note it, and countenance the disk. Pay attention to those things, Matt. I will. No, I'm going to try to keep my ideas
Starting point is 01:53:05 undistributed and my narrative ungated. Thank you. Bye. See ya. Bye. Thank you. We can stick this at the end as a little bonus for people who stay around. Yeah, that's good.
Starting point is 01:53:57 For those of you who are still listening to this, I know why you're still listening to it after the music's played. It's because you're already asleep. They just, like me, you know, they just leave the headphones in. They've been waiting and hoping that this will end and go to the next episode of whatever they have queued up. But we're just... Oh, that's interesting. That's interesting. I mean, look, there's got to be at least a few people who are currently sleeping now.
Starting point is 01:54:20 And they're just entering, like, REM stage four. Oh, that's a good thought. Whatever it is. And we're putting Jordan Peterson and swimsuit models into their mind. I've got an idea. Smash the duck. Smash the duck. Smash the duck.
Starting point is 01:54:33 Smash it. Bye. Good night. This was just a dream. This was just a dream. Bye.
