Decoding the Gurus - Interview with Renée DiResta: Online Ecosystems, Disinformation, & Censorship Debates

Episode Date: May 20, 2023

We are joined by Renée DiResta, a writer and researcher at the Stanford Internet Observatory. Renée has done a lot of interesting work on disinformation and influence campaigns, including leading an investigation into the Russian Internet Research Agency's multi-year effort to manipulate American society in the lead-up to the 2016 election. More recently she was dubbed by the writer/conspiracy theorist Michael Shellenberger the leader of 'The Censorship Industry'. In short, Renée stands accused of serving as an agent of the Distributed Idea Suppression Complex, defending the Gated Institutional Narrative. So, being good DISC soldiers ourselves, we had to follow our orders and host our exalted leader.

We discuss all of this with her, along with a range of other topics, including how important algorithms and bots are in disinformation networks, whether contemporary influence campaigns are really anything new, and how to address debates around censorship and free speech. We enjoyed the discussion a lot and are sure that you will too... or else...

Also covered in this episode: Eric Weinstein's suggestions for Twitter CEO, evidence of Lex Fridman's pilled brain, and a rather confusing review.

Links:
Renée's website
Shellenberger's Substack: Why Renee DiResta Leads The Censorship Industry
Renée's response to Shellenberger's claims
Making Sense Episode 310: Social Media & Public Trust (with Renée, Bari Weiss & Michael Shellenberger)
Chris' old article on Cambridge Analytica on Medium
David Pakman: Politics of Trump, Biden, Bernie, AOC, Socialism & Wokeism | Lex Fridman Podcast #375
Report: The Tactics & Tropes of the Internet Research Agency
Gurwinder: The Perils of Audience Capture

Transcript
Starting point is 00:00:00 Hello and welcome to Decoding the Gurus. It's a podcast where an anthropologist and a psychologist listen to the greatest minds the world has to offer and we try to understand what they're talking about. I'm Matt Browne. With me is Chris Kavanagh. It's mid-morning here. It's a beautiful day in Australia. How are things over there, Chris? All right. How's the traffic?
Starting point is 00:00:43 I feel you've slipped into like Morning Joe, you know. It's a beautiful day out there on the 557, traffic's backed up. But you know... Yeah, I think that's what it is. I've had too much coffee and now I'm in like radio mode. Yeah, I have to practice my mid-Atlantic accent. I'm sure I could do a pretty good one. I just need to hear a few more examples to get into the zone. Yeah, I think that's doable for you. I think that's within your range. I could see you as a mid-Atlantic radio DJ. Yeah, yeah, I could do it.
Starting point is 00:01:18 I could do it. Yeah, so we get into our new format to have a little bit of a look around the guru sphere, see what's been going on, keep tabs on some of the old favorites. A bit like one of the classic rock bands from the 1970s, we play some of the greatest hits as well as our new stuff. What have you got for us, Chris? Well, Elon Musk announced a new Twitter CEO and it's been enjoyable to observe because she appears to be a relatively, I stress relatively, normie advertising executive kind of type, right? So the kind of person that you would expect to get a CEO role for a tech company that wants to attract advertisers. And this led to much wailing and gnashing of teeth amongst Elon Musk's pilled fans because she has some involvement with the World Economic Forum Young Leaders Program or whatever. Like she took part in something to do with the World Economic Forum.
Starting point is 00:02:24 And as a result, they are all convinced that she's going to bring in the new world order and destroy Elon's palace of free speech. So it's been funny to see him get dogpiled by his brain minions. I know. And the beautiful part about it is that Elon Musk famously changed the algorithm on Twitter so as to boost the blue checks. So of course he removed the actual identity verification for being a blue check, so it's no longer its original purpose of verifying the identity of people. Rather, it indicates that you're prepared to pay 10 bucks a month, or 20 bucks, whatever it is, to Elon Musk. And so a lot of verified accounts, like
the New York Times or whatever, chose not to do that, so they're now unverified. The people who are verified were the Twitter fanboys like Catturd2 and the reams of Twitter trolls and weirdo people that like Elon. And the beautiful part is that now all of Elon Musk's tweets, which normally if you looked underneath them you'd see, I don't know, in scare quotes, respectable large accounts replying, instead all you see is an endless stream of conspiracy wackadoodles with their $10 blue checks ranting at Elon Musk for bowing to the globalist conspiracies. Yeah, so that's been fun to observe. Oh, Chris, Chris, Chris, before you say that, I have to say, I mean, you could almost say that Elon Musk was hoist by his own Catturd. Oh, wow. That was worth the wait.
Starting point is 00:04:16 Yeah. Well, it's interesting in these kind of things because there's always like a sycophant versus conspiracist battle where the people who've been elevated by Elon Musk's attention are divided on whether they should send for him or whether it's time to call out that the emperor has no clothes and isn't conspiratorial enough. And you see some people defending the pick and other ones like laying in deal on, but he'll never be able to satisfy them. And it is, I don't think this is going to be
Starting point is 00:04:51 anything that brings them down or stops them engaging with conspiracists or whatever, but it's just him getting a taste of the kind of community that he's encouraging. But in terms of the gurus, so of course they had a variety of takes on this, but I think perhaps the champion take was offered by Eric Weinstein, who responded by saying, I was hoping for Yeonmi Park, Melissa Chan, Barry Weiss, etc. for Twitter CEO. With so many great women in the fight against
Starting point is 00:05:28 fraud control, what are we doing here? I'm distracted by travel. Can someone catch me up as to what went into the new choice? I know nothing about her. Praying hands symbol. So So Yeonmi Park, the North Korean defector who has very many credible accusations of exaggerating and misrepresenting her tale of escaping North Korea and conditions there. Not to say that conditions in North Korea are good, but just that she does not seem to be a particularly reliable witness for the stuff there. And then Melissa Chen and Barry Weiss, right-wing commentators, one could say this is Eric Weinstein perhaps flexing his own simp muscle in a particular direction for this selection of accounts to suggest but actually i've seen a lot of people poking fun at this just saying this is an illustration of how credible people like eric weinstein are how seriously we should take them and i agree
Starting point is 00:06:39 that that is an illustration of how deep and thoughtful eric's mind is on these kind of topics like it just makes no sense to appoint an skp from north korea or all the other people he mentioned just you're just random sub stackers or conservative journalists as the ceo of a multi-billion dollar company that's not how the world works so it does illustrate how Eric's brain works I think it's primarily just simping frankly yeah and also that thing about you know oh I don't know anything about it can people catch me up like obviously he's caught wind that there's the WEF conspiracies and stuff soic cannot he can never just say that that's his concern or he wants to paddle in the conspiratorial pool so he has to just invite people oh i i know nothing you
Starting point is 00:07:36 know is there some controversy around her what what's the issue right like uh yeah he's he's manipulative but it's it's so transparent's so transparent. It's like being tricked by a 12-year-old. Yeah. It's so transparent, and it illustrates his worldview, which is that, like, him and the people that he knows, his friends, you know, the good people, this inner circle of people, are the people that ought to be running things it's just instead of the reality which is that you are like us and like most normal people
Starting point is 00:08:12 rando people on the internet that's all he is and all we are for that matter but just the self-aggrandizement so annoying i Well, speaking of people with uncharacteristically UFO mindsets, I'm going to also play some clips from a recent Lex Fridman episode because, you know, we covered Lex and he's cropped up on various occasions but i i feel like these clips illustrate some aspects of why lex is deserving of criticism and how his cultured pose of enlightened centristism is is just that something of a pose so he had on on the left wing journalist, David Pakman, who has, you know, like a YouTube channel, and he covers politics and he interviews people and whatnot. And they, you know, had a long ranging discussion, but it inevitably touched on various culture war topics
Starting point is 00:09:19 and aspects about the political divide. I just want to highlight some of the things that you can see about Lex's worldview that he doesn't explicitly acknowledge, but you can clearly see them in the way that he responds to things. So here's him talking about COVID and the issues around trust in science, and who's to blame for trust in science degrading during the COVID pandemic. One of the effects of all this that makes me truly sad is this division over the vaccines has created distrust in science. Yeah. And also what makes me sad is the scientific leaders,
Starting point is 00:10:00 Anthony Fauci being one of the representatives of that community, I would say completely dropped the ball in what way? they spoke with arrogance they spoke down to people they spoke in a way that a great scientist does not speak which is they spoke with certainty without humility
Starting point is 00:10:22 like they have all the wisdom and all of us are too dumb to understand it, but they're going to be the parent that tells us exactly what to do versus speaking to the immensity of the problem, the deep core of the problem being the uncertainty. We don't know what to do. The terrifying thing about the pandemic,
Starting point is 00:10:41 we don't know anything about it as it's happening. And so you have to make decisions. You have to take risks about, well, maybe you have to overreact in order to protect the populace. But it's in the face of uncertainty that you have to do that. Not empowered by science somehow. And the deep expertise that somebody like Anthony Fauci claims to have. So I'm really troubled by the distress in science that resulted from that. And you have to blame the leaders to the degree.
Starting point is 00:11:15 Leaders take responsibility, and I think Anthony Fauci was the scientific leader behind the American response to the pandemic, and I think he failed as a scientist, as a representative of science. Yeah, there you go. Yeah, so it just, you know, there's so many signs. There's Lex suggesting, you know, Fauci's so-called or self-proclaimed expertise.
Starting point is 00:11:41 Like the guy has a decades long, multiple decade long career in public health. He does have expertise, Lex. Whatever your assessment was of what he did or did not, his expertise is clear. And then secondly, that Lex's biggest takeaway is that the people who fostered mistrust in science is Fauci, right? Or who spoke with unwarranted certainty is Fauci. Like, no, Lex, it's your guests, your friends. You had Brett Weinstein on to discuss truth, science, and censorship, and the ivermectin.
Starting point is 00:12:20 Brett had an episode called How to Save the world in three easy steps he's so has such obvious transparent double standards and it's completely in line with the right wing talking points that it's it's all fight she's doing nothing to do with trump or the right wing ecosystem yeah i think very much like elon musk he's simply been pilled, right, Chris? And it's like underneath that weird philosophy he's got, it's again, very, very transparent. Like, you know, it just goes without saying, I mean, Fauci's actions were entirely in keeping with the actions of health authorities all over the world. authorities all over the world and it was in line with the science not that i'm not saying it was perfect but what certainly wasn't perfect was the contributions of people like brett weinstein that he thinks is some great you know source of wisdom and yeah it's quite telling those sort of populist right-wing pilled kind of views which is like they're talking down to us and they're not treating us with enough respect and so on.
Starting point is 00:13:26 It's galling. And it's also like the way that Fauci is characterized. If you actually see long form interviews with Fauci, there's always these like specific examples of little snippets played on right wing media. And when you look at the extended interview, there is acknowledgement of uncertainty and we're working from the evidence currently right and there are aspects where the Fauci or the CDC spoke too strongly about a particular thing but that's like from one interview or you know from a like a stance that
Starting point is 00:13:59 lasted for a week or two and then they come out out and discuss new evidence. And that's never addressed. Yeah. Like, I know what you're saying, which is like the narrative and the premises that people like Elon Musk and what's his name? What's his boy's name again? Lex. Man's name?
Starting point is 00:14:15 Lex Rydman. Lex Rydman. The premises they're working from are entirely imaginary. Like there's this narrative built up. It's taken as a given that the health authorities and governments rode roughshod over everything and you know too you know had too much hubris and all the all of these things and it's just it's not true it's a fiction they've they've invented or at least it's an exaggerated position that there was no uncertainty communicated that
Starting point is 00:14:42 scientific authorities were just completely arrogant dismissing any notion of like trade-offs or whatever. No, they weren't. Like, listen to the extended interviews and you hear them all discuss this from the start of the pandemic. Yeah, I heard it endlessly discussed in Australia from politicians and the various health people and whatever people were arguing about you know is this going too much or should we do less and it did it was always a compromise just like any society so uh like it reminds me of elon musk they live in a fantasy world like to give you a sense of how far elon musk has gone he's he's tweeting about georgeos, about how you shouldn't assume he has good intentions. They do not. Soros, that is, wants to erode the very fabric of civilization. Soros hates humanity.
Starting point is 00:15:33 That's Elon Musk. Yeah, who's a hero to Lex. I know. And if you want to also see where Lex gets this kind of rhetoric from, some of the sources that Lex is using, this is them discussing RFK Jr., noted anti-vaccine advocate for decades, RFK Jr. And David Pakman is explaining his thoughts about him. And listen to where Lex takes us. Smart guy, nice guy, has been doing anti-vaccine work that I don't find particularly inspiring. So it's not just anti-COVID vaccine, it's more broader than that? He's been in that space long before the COVID vaccines. Yeah.
Starting point is 00:16:16 Yeah. I don't find it super interesting. Well, he also wrote the book, The Real Anthony Fauci. Is that the name of the book? Did he write that? That's interesting. I didn't, I don't know. That's, I'm not sure? Did he write that? That's interesting. I didn't, I don't know.
Starting point is 00:16:26 That's, I'm not sure about that. I'm aware of that book. I didn't know he wrote it. I think I did, but it's been, it's been on my reading list to get, I've been trying to get a good balanced reading list about the COVID pandemic to understand what the hell happened.
Starting point is 00:16:42 And anytime I start to try to go into that place, it's just, I'm exhausted by it. Well, it's interesting to me that you wouldn't wait longer before delving into those books to have maybe a more clear hindsight. But I think this is a pretty good time. You don't think so? Just, you know, the notion that the way Lex will delve
Starting point is 00:17:04 into the pandemic and get a handle on it is by reading a book by a famous anti-vaccine advocate and one that he didn't even know was yeah i didn't even realize was a was anti-vax generally and not just diving. It's. And this is a book that Joe Rogan promoted as well as shaping his realization that it's all the pharmaceutical industry. The whole vaccines around COVID is all about profiting and it's not about producing vaccines that helps. That's the thing, like Lex is pilled
Starting point is 00:17:42 and he is relying on these bad sources and i'm not saying he's as far gone as some of the other people in that space but you can see why he would have that kind of view of fauci that he expressed in the first clip because this is the kind of material that lex considers worthwhile to get an informed view a defamatory portrait by an anti-vaccine advocate like oh it's frustrating but that's what our gurus do chris there's a reason why conspiracism is on our garometer and when you hear them talking about ge Soros or Anthony Fauci and all the lies and so on. And the evil plots that they're hatching to destroy the world and erode our will to live. Yeah, it makes sense.
Starting point is 00:18:34 Well, I've got one last clip which highlights the extent to which this kind of thinking infects Lex's worldview. So here's him talking a little bit about the 2020 election and you know genuine debates that are that are difficult to have about what occurred there but i mean i think the big statements are always going to be somewhat opinions like um was the elect was the 2020 election fair? I think any answer to that is an opinion. I disagree. If we define fair.
Starting point is 00:19:14 Well, yes. So then I don't think it's possible to define fair in a way that's not several paragraphs where each sentence now has facts, right? So what do you mean by fair? Is it who can show up to vote? What was the process of how easy it is to vote? Was there actual cheating going on?
Starting point is 00:19:36 What is the evidence of that cheating? You have to actually get to the actual details of a thing. High level, everything is just going to be an opinion it feels like okay and you can approximate that to be like it's a well-founded opinion well most of science is an opinion even physics is an opinion so like i think there's a threshold beyond which an opinion becomes like uh this is a pretty reliable thing to assume for now that this is true. I just can't believe, Chris, that people think of him as a smart person. I know.
Starting point is 00:20:15 It's kind of just more the topics that he picks and the way that he poses that, you know, and I don't mean like physically physically i mean just the way he intellectually poses what he's doing like the way he editorializes that gave people that impression but the notion that like well isn't everything like it's big lebowski yeah that's just your opinion man but also it's just your opinion whether the 2020 election was fair and whether there was widespread cheating and people showing up who weren't allowed to vote like no lex that's not just an opinion people have an opinions about that but there's underlying facts which contradict the opinion that says it wasn't fair and it was widespread voter fraud no there wasn't but chris it also got
Starting point is 00:21:06 apart from being just not good thinking i mean it goes to your point before which is the very selective charity and the selective concerns and he's pilled right like look at the topics about which he obfuscates you know it's anti-vax stuff and it's stolen election conspiracies. Yeah, there's just, there's not much going on there. But what is going on is not very pretty. No, so yeah, you know, election conspiracies, Fauci is evil. Just looking at a Lex episode and you can reliably see what gets them to exercise them which ones he wants
Starting point is 00:21:46 to extend it's a lot like joe rogan it's a lot like like on the surface it's posing as a you know just i'm curious about things and i want to figure them out but it always goes a particular way doesn't it yeah well well anyway it's it's just So, you know, Lex Friedman, I just want to put the flag in for people that treat him as an apolitical little wood nymph. Maybe not. Maybe he is also a pilled, not pilled partisan, but like, I don't know, a pilled pundit. That's the way to put it. He's a pilled pundit. So, yeah, it's all all frustrating it's all very frustrating well you've really annoyed me you've put me in a bad mood with
Starting point is 00:22:31 with your clips and things chris um sorry everybody let's think about so let's make some happy thoughts so you you were expecting something fun but you know that's not the way the gurus fear rolls i want jordan hall again jordan hall makes me happy jordan hall won't make you happy if you hear his political opinions no no no you heard him talk discuss covid vaccines i i think you'll hear very similar things no i just want to hear about him plucking a guitar that's that makes me happy yeah well so on that positive note, we'll turn to our interview with our guest for this week. It's an interview episode. We have Rene D'Aresta, a researcher of disinformation, misinformation, and online ecosystems, various other things.
Starting point is 00:23:21 So let's go have a chat with her. Yep. I don't want to overse oversell it chris but i think it's probably definitely the best interview that's ever happened on a podcast ever well far be it for me to to disagree with that it's amazing that you could even suggest that matt that's astonishing it may not be true but the mere fact that I could plausibly suggest it is revealing, I think. It's telling of something.
Starting point is 00:23:49 It's telling of something. We need to think about it. But yes, so here is Rene. So with us today, we have Rene DiResta, a researcher whose work I'm very fond of. I believe, Rene, you can correct me, you're currently the technical research manager at the Stanford Internet Observatory and have had a bunch of other positions in the preceding years. But people might know you because of your research on the Russian Internet Research Agency back in the 2016 election, but you've also done a lot of other work on online disinformation, debates around
Starting point is 00:24:36 censorship. Very recently, you were on the Sam Harris podcast with Barry Weiss and Michael Schellenberger to discuss the Twitter files and issues of censorship. And that spilled out into internet controversy, which we'll probably get into. But yeah, but thank you for joining. And I don't know if that's a good potted introduction, but if there's anything important I missed, let me know. No, that about covers it. It's great to be here. And so there's a ton of things that I think Matt and I both want to ask you about. And it might
Starting point is 00:25:17 make sense just chronologically speaking, and maybe also to talk about some of the broader forces that are in effect in the online ecosphere to talk about your older work on the the dollar ira in the 2016 election and the related concepts around disinformation and misinformation, propaganda, that kind of thing. Because I think one of the issues that I see online, a recurrent narrative, is that the Russiagate views that basically all of the claims made about Trump and collaboration with Russia were proven to be fairly insubstantial, that they were exaggerated by the media and liberal commentators in general. And yes, so how accurate is that opinion? And how does your work fit into all that? Sorry, it's an incredibly long winded question, but they'll get shorter.
Starting point is 00:26:20 No, no, let's dive right in. So first of all, I didn't do any work on the collusion question. And I think the most important part of my answer to this is going to be this very, very first part, which is to say that studying interference is not the same thing as studying collusion, nor was it then. Sissy is kind of in how people refer to it. And I was appointed to be what's called a technical advisor to the committee. TAG is the name of the program. And so I was appointed by both Senator Warner and Senator Burr. Despite what you may read about me being, you know, a tool of the Democrats on the committee, that was not true. I had a bipartisan appointment. So the committee received data sets from all of the major social media companies.
Starting point is 00:27:05 So from Facebook, from Twitter, and from Alphabet. And Alphabet turned over both YouTube and Google search data. And Facebook, now Veda, turned over both Facebook and Instagram data. So we had these kind of five data sets. And I was asked to lead a team, an independent team of researchers. And there was another team that was tasked with the same request. And we didn't know who was on the other team. And the reason they did this was because given the sensitive political dynamics around questions of things like collusion,
Starting point is 00:27:35 they wanted to make sure that no one team's work could be kind of immediately dismissed as partisanship. So in a way, we were kind of like blinded, right? So it's how they constructed this research project. So the data sets that I got were attributed by the platforms. They were not attributed by me. I did not say these are the Russians, you know? And so my work for what took nearly a year, actually, I don't want to, I know you guys are academics also, but not to get too much in the weeds. It was like literally three months of like cleaning data was the first part of that. How do you turn it into a taxonomy or to a schema by which in a database we might query across data sets for particular terms?
Starting point is 00:28:15 Some of the platforms gave us literally scanned PDFs. Some gave us folders with images where you just had to CSV that tied to a number. And so it was a very, very long process. But ultimately, the intent of the work was just to ask the question, how did Russian interference manifest? And so my work was highly descriptive, it did not even attempt to answer the question, did this swing the election, because we didn't have data that would give us the answer to that question. We couldn't see what people did when they engaged with the content. We could only see engagement data, which tells a very particular type of story, which is people saw this thing and they clicked like or share.
Starting point is 00:28:54 Were they predisposed to the content? Probably. Did that, you know, so was it a propaganda of persuasion? Probably not. Was it highly active? Yes, it was. And so there were these kind of different questions around, did it have an impact in the sense of swinging the election? My feeling has always been no, not what the Internet Research Agency did. So I'll pause there for a second in case you have a follow-up. But basically, I think the kind of
Starting point is 00:29:22 key takeaway was that as Mueller was over there doing his thing, as DOJ was indicting Prokosian and the IRA, we would see that data and we were interested in it because it provided a behind-the-scenes look at how this other content, this very demonstrable attributable content, had been kind of effectuated and paid for by the act. had been kind of effectuated and paid for by the act. Yeah. So back in the Brexit campaign, whenever there were the accusations made about Cambridge Analytica,
Starting point is 00:29:52 kind of hyped by Carol Cadwaldo and the Observer or the Guardian, whatever case, there was a kind of emphasis that this was the secret sauce that led to the election or the referendum being won. And I was very skeptical at the time about that and wrote an article about it, mainly because I was aware of how rudimentary the psychographic research is and the evidence
Starting point is 00:30:22 that you can sway people in such a directly targeted way and swing an election. So when it comes to the Russian disinformation campaigns, which you investigated, my feelings were somewhat similar in that whether it's the actual deciding factor in an election, it's very unlikely because there were many, many converging factors in the election of Trump. many, many converging factors in the election of Trump. But the fact that it was taking place seems to be beyond any doubt. And your research was, as you say, kind of descriptive of that process. But there were pains taken, like you outlined, to ensure that, you know, accusations of partisanship couldn't apply. but, but it doesn't seem to work. No, it was, it was my, I was, you know, it was so interesting. It was, I think the first thing
Starting point is 00:31:12 somebody went and did when the report came out, it was like, they went and dredged up that I had donated to Hillary Clinton in the primary in 2015. And so I became like a Clintonite operative and a democratic party operative and all of this was like absolute nonsense. But again, you know how it is. When you do work that's inherently political or that has high stakes politically, it's not a surprise when these sorts of smear campaigns happen and discrediting the person is far easier than engaging with the work. And so again, this is something that I've sort of internalized since 2018 at this point. Renee, I've got a kind of a basic question because I didn't follow this story in great detail at the time and I've kind of forgotten the details. So maybe some of the listeners have too.
Starting point is 00:31:56 And it's just like you looked at the actual attempts, I suppose, to influence the election, the kinds of posts that were made from Russian sources. I mean, what were the main themes? What was the content? Could you just describe it for me? Yeah. So first of all, it's actually really different on Twitter versus on Facebook and Instagram. And that's something that I think gets lost a lot. Particularly recently, in recent days, Josh Tucker's lab at NYU, which is excellent, has just put out a report saying that the Twitter engagements were even less impactful than people previously thought. And I thought that was very interesting work,
Starting point is 00:32:30 in part because my sense from looking at the Twitter data was that they were just engaging on whatever topic people were mad about at the time. So it was very much a gasoline on the fire kind of dynamic. You may remember in 2015, this was the time of Gamergate. So they were kind of in that mix. This was the time of kind of random mobs that would sort of attack and target people. So some of the accounts kind of behaved in that way, sort of ingratiating themselves with particular types of trolling communities. There was the Disneyland measles outbreak. They had a lot to say, like, you know, 900 or something tweets about vaccines during that period of time. But they never really
Starting point is 00:33:09 talked about vaccines again. They just didn't care. It was just this was the thing that was happening on Twitter. Whereas on Facebook and Instagram, you have an interesting dynamic whereby you can establish kind of a closer, more persistent community, more persistent ties. You can make a Facebook page. You can have a group kind of adjacent to that page. You can have back and forth with people in the comments. And so what you started to see in that area was much more where the persuasion kind of content that did exist was. And so, for example, some of the earliest persuasion stuff was actually trying to get people to support Rand Paul in the Republican primary. They were really kind of all in to Rand Paul early on.
Starting point is 00:33:50 And then when he just didn't perform very well, what they then had to do was take their Tea Party focused pages and really like work to pull them away from Ted Cruz and Marco Rubio. And Cruz in particular was kind of a darling of the Tea Party. So it's very interesting to watch them say things like, well, you know, my Texas brothers and sisters, you know, yeah, I liked Ted for a while, but now I think there's this other guy, I think he might be better for us. And so that's where you see the sort of subtle persuasion actually during the primary. But much of the rest of the content was much more around, this is my identity. This is our identity, right? They always spoke as if they were a member of the community. This is our identity. And here's why it is so great to be a member of our identity. And then they would position these identities oppositionally to
Starting point is 00:34:37 other identities. So if I am a veteran, then a Muslim refugee is taking my benefits, right? And so this question of like, who is America for becomes kind of a central driving force. And for each of these, you know, they made about, gosh, I think it was between 30 and 50 Facebook groups and about the same number on Instagram. I don't remember the number off the top of my head now. It's been a couple of years, about the same number on Instagram. I don't remember the number off the top of my head now. It's been a couple of years, but the overwhelming majority of them were targeting the black community.
Starting point is 00:35:09 So that question of race relations in America was very, very integral. So race and resources, and how should we think about who is America for? And that becomes the driving way by which they activate different groups around their identities. So Renee, related to that, I've heard it kind of
Starting point is 00:35:26 emphasized, and particularly in heterodox sources, that those efforts were focused on inflaming division on both sides, left and right. And that's sometimes characterized as saying, well, it wasn't about supporting a particular candidate, It was just about amplifying divisions. And you mentioned that there was a lot of groups created that were focused on, you know, activating identities within Black communities. But is it not also the case that there was a disproportionate emphasis towards Trump in the later stages of the campaign, or is that something different? No, it's very interesting how that happened. So you might remember, there were a couple of different Russian interference efforts in the 2016 election. So the one we're talking about now is the Internet Research Agency. Then there was, I don't remember if it was SVR or FSB,
Starting point is 00:36:20 but one of the other security services was trying to hack voting machines. And then you had the GRU, kind of the APT28 fancy bear, which goes and hacks the DNC and then the Clinton campaign. And it has a whole trove of documents that it drops, which becomes kind of the origins of Pizzagate, right? So that's a whole other thing that's also happening. But what happens with the IRA is they begin their operations late 2014, early 2015. So again, this is why they're already active when the Republican Party's primaries are kicking off. Trump hasn't even declared yet, and they are talking about Rand Paul and others. The campaign becomes, I would say, if anything, more opportunism in this social division kind of dynamic that they're carrying out. So the IRA has a clear dislike for Hillary Clinton.
Starting point is 00:37:14 So the only positive posts about Hillary Clinton are posts that they have the Muslim page make in support of Hillary Clinton, trying to get the Muslims to rally in support of Clinton in the real world so that they're very visible, because they think that this will be a divisive thing, having Muslims out there rallying in support of Hillary Clinton. So that's the only pro-Clinton content you see up here is kind of targeting that group, trying to get them to do something highly visible. All of the other groups are targeted with anti-Clinton content. And of course, this is not a surprise on the right that that exists. It's already there. But they try to depress support for Clinton among the left-leaning groups and then among the Black community as well. anti-Clinton. Then you have the GRU off doing its own thing. And that's where you get the first, you know, the first trove of documents drops the afternoon after the Pussygate tape comes out.
Starting point is 00:38:16 That's a very useful taxonomy. And I think also a nice illustration, which I got from your interview with Sam Harris, that, you know, you're a researcher of this and know the details very well, which isn't always the case, even if people are spending a lot of time on these topics. But on that basis, I think it would be interesting to ask you, Renee, that, you know, so if we're talking about efforts to promote particular candidates in an election or to just sow, you know, chaos and potential geopolitical rival, that's not new, right? It's been ongoing for decades, probably hundreds of years. The printing press being a notable example in history that allowed
Starting point is 00:38:57 that. So to what extent do you regard the kinds of things that you're looking at as old wine in new bottles or is it more that there are genuine new capacities and things which weren't active in in previous generations i'm like i'm so tempted to give you like the like completely geeky academic model that we've actually kind of constructed around just this but i want to you... You can do that. We're down for that. So we look not only at Russian influence operations, but I cannot tell you how many, I mean, dozens to hundreds at this point of IO, you know, kind of state actor takedowns that I've worked on over the last eight, five years now. And one of the things that we've been really interested in is this question of what is new, what is the same. And so you can think about propaganda in the olden days,
Starting point is 00:39:49 in the broadcast media environment, as having two forms, overt and covert, right? So this is really a spectrum, if we're being completely honest, there's that black, white, gray propaganda dynamic. And that's differentiated by its attributability, right? Do you know who is speaking? Is the state transparent about the fact that it owns an outlet that's called white propaganda? If it is not, if it is like kind of nebulous, that's maybe gray propaganda. And then black propaganda is the term for like active misattribution, like front media. And so, you know, if you've ever seen like the Americans, there are these, you know, these people who are not real outlets that are not real, you know, they're just not what they seem to be. This is very common in the Cold War in particular.
Starting point is 00:40:23 People who are not real, outlets that are not real, they're just not what they seem to be. This is very common in the Cold War in particular. So then you think about social media coming in, right? And so you've just layered on a new mass broadcast, mass distribution channel. And it's a little bit different in that people are the primary source of distribution on that channel. So it's not what some broadcaster decides to put out. It's what the public, what the people who see the content decide to share. So when we say the words like it went viral, what that really means is a whole lot of people decided to share it. It's not a passive thing at all. It's very, very active or engaged, in fact. So what we think about when the context of like what is new, you basically have social media. So you have this new channel, but then you also have the overt to covert spectrum.
Starting point is 00:41:03 So now you have kind of like a two by two, right? Overt to covert, broadcast to social. And so whenever we look at these things, what we are interested in is how does a well-resourced state actor use different quadrants, if you will, in tandem? So Russia in particular has really, really good covert social. You have other entities that try. China tries quite regularly. Iran's are semi-decent. But the covert social quadrant is where you see these like internet research agency, like these trolls, these people who they're not what they seem to be. They tend to be something
Starting point is 00:41:38 that they're not, but they engage as if they are a member of your community speaking to you. So it's less like propaganda in the broadcast sense where somebody is pushing out a narrative and trying to persuade you and more like an agent of influence model where like somebody is in your community speaking to you as if they are just like you. And so that's that difference, right? So you have the broadcast, the broadcast style propaganda, then you have this, right? So you have the broadcast, you know, the broadcast style propaganda. Then you have this new world in which you have both, oh, and the overt social quadrant would be something like the wolf warriors, the ministry accounts, the influencers, you know, that kind of thing.
Starting point is 00:42:16 So that's how we think about like the entire universe of propaganda and all of the things that you can do with it. Yeah, there's obviously nothing new about powerful state actors looking to, you know, exert soft power or, you know, into some kind of influence campaign. And certainly, you know, Western democracies are not above it either. So I'm kind of curious, I think Australia probably does stuff, but it's on a very small scale. So let me ask about America. I mean, what does America do? Does it do this kind of black, does it operate in that quadrant? what what does america do does it does it do this kind of black it does it operate in that quadrant it it does ish so so i can only speak about things that i've um you know either read about or in this
Starting point is 00:42:53 case i'll talk about something i've personally worked on um which is a takedown that we uh that we examined in i believe it was august of last year And we put out a report, we called it Operation Unheard Voice, because they were trying, but nobody was engaging with it, actually. And it was entities linked to SOCOM or CENTCOM, right? So USDOD. And they were running these fake personas. And what's interesting about it was really, for me, all of a sudden delving into under what authority could the government operate and how. So again, because as you note, governments have used the euphemism as public diplomacy, right? So there's propaganda, which is that stuff that those other people do, and then there's public diplomacy. And public diplomacy is, for example, U.S. military entities will run websites detailing their point of view in theaters that they're operating in. And as long as they put a disclosure saying this is a website operated by SOCOM or whatever combatant command is running it, that's actually fine. You can have that kind of propaganda or public diplomacy. You can have that exist, but you're supposed to have that
Starting point is 00:44:05 attribution. And so this collection of networks that both Twitter and Facebook took down had accounts that were pretending to be things that they were not. So they veered into this persona territory. Extremely awkwardly, some of the accounts were recycled. So they had at one point been attributable prior to them converting into persona accounts. So this is extremely JV and really one of these like, oh, not a good look. And so it was very interesting to see. I mean, sometimes the accounts would all tweet, the automated accounts would tweet in the same, like the first second of the minute, which is one sign of automation. And there were just, you know, these tells.
Starting point is 00:44:49 It was not great. So we put out this big report operation on Herd Voice and we did detail what we found. And they were in, of course, regions of, you know, where U.S. has a vested interest. So speaking to audiences in the Chinese sphere of influence, speaking to people in the Caucasus, speaking to people in the Middle East. And so that was, you know, some of the content was this, they also ran these kind of media organizations where they would use the more traditional style of propaganda, where they would write an article, and the article would include a quote from somebody that was saying the thing that they wanted to get across. Right. And so so that that kind of content. But it was very interesting to see.
Starting point is 00:45:29 Alan Nakashima from The Washington Post really picked up the report and kind of took it to the Pentagon and got a little bit more where there was a land and what ways our government should engage in that sir my personal opinion is that u.s government particularly not the combatant commands like they just shouldn't be running these kinds of accounts it's just i mean bad is even worse but but period it's not a good thing to be doing it feels like the return on investment is disproportionate to the harm if revealed so yeah and actually renee to pick up on the point that you mentioned there so there's a lot of accusations particularly maybe maybe more so in the twitter sphere surrounding the prevalence of bots and to what extent bot accounts are doing things. You tend to hear people like Elon Musk argue that they're
Starting point is 00:46:33 everywhere and they're overwhelming the site. In my experience, just in a research context, I find that researchers overestimate the prevalence of bots versus people that are just poorly paid and unmotivated to answer properly to surveys. So I don't have any skepticism about automated bot networks and their role in amplifying messages, but I'm curious to what extent you think that that's a very core component or alternatively, to what extent those concerns are exaggerated? Like how worried should average people be that they're engaging with bot accounts? And are they easy to detect or are they not very advanced and it looks exactly like a normal person? That's a great question. So I'll take the first part,
Starting point is 00:47:25 which is the Elon question first. So if you are a person with a lot of followers on Twitter, you do have what we call like the reply bots. Like even me, I don't have that many followers, but they kind of dredge in your replies. You tweet automatically within one second. There's the, have you heard about this new DeFi thing? Or there's a link to like to some shit spam on, on, on VK. And so these spam bots are, are there.
Starting point is 00:47:51 I actually don't understand how they're still there. I mean, the VK DeFi ones that I always get are like, you know, they're, they're formulaic, but the, uh, so, so that, you know, that, that is a real thing. And the prevalence of them is probably not small. And if you are a person like Elon, who has millions of followers, you probably see that kind of activity happening a lot more than the average person. But the other thing that you're talking about is the political bots, right? The manipulative bots. And so in that regard, they were a big thing up until around 2016. And so Twitter began to crack down on, particularly after some of the rest of the stuff was found, so maybe 2017, they started to crack down in part and to consider them
Starting point is 00:48:33 low quality accounts. So you can see if content is coming through an API, you can see if somebody is responding instantaneously, like there's ways that they can identify an automated account. like there's ways that they can identify an automated account. And so they started to deprecate the weight of those automated accounts in something like trending topics, which was really one of the main points. It wasn't that the automation was so good that they were going to trick a person into engaging with them. It's that what you often see, like I'll use China as an example, there'll be a network of like two, 300,000 accounts that will come down. And as researchers, we will see that only about 25,000 of them are actually producing any content. And of those, maybe 5,000, maybe sometimes even just a couple of hundred or a couple dozen are the real high value core of the network.
Starting point is 00:49:21 Like those are the accounts that are tweeting interesting things or real content. And then everything else, the entire cloud, the other 250,000 are all there as amplifiers. And so the amplifier bots are like, they're kind of their own thing. They don't tweet, they don't engage. They're just there to boost. And they also boost the overt social accounts, just using our rubric from before. So like the Chinese wolf warriors, they're out there, they're tweeting, they're doing their thing, selling the CCP message in true name. And they're really pugnacious. They're really actually quite fine sometimes. And you see these suspicious degrees of engagement and that comes through amplifier bots. But I think the area where bots get potentially more interesting
Starting point is 00:50:05 is now in the age of highly competent generative AI, right? And so that's where you actually can, you no longer have to have bots that are detectable by copy pasta, right? Or shitty low quality spam type content. You can have AIated content that's much more personalized, much more sophisticated, reads like a real person. And so bots just became interesting all over again. Yeah. You beat us to it. We're going to ask you about the influence of GPT on this. Both Chris and I have been a bit obsessed with GPT-4 and I've joked that I don't know why I'm on Twitter anymore because I have five better conversations with Jimmy Timberlake online. So it seems like there's at least the potential there
Starting point is 00:50:49 for there to be a much better generation of bots that can just be given a mission, which is to, you know, you are such and such, and then can be kind of let free to sort of live their own online existence in a way that's, you know, would probably be almost indetectable. Yeah, I mean, and that I think is, you've started to see a couple of researchers have, I'm also on Twitter a little bit less, but as Twitter has begun shutting down API access, interestingly, you have just started to see researchers who are still you know kind of holding on or have some residual access that are finding some of the tells of of of gpt chat plants like so the the um i'm a large language
Starting point is 00:51:34 model and such and such kind of excuses that they make sometimes or qualifiers so you can actually search twitter for those phrases and you'll see uh you'll see clusters of accounts that use them. So they're just not careful. They asked it to do something and then nobody's sitting there reviewing the outputs. And so you can actually see these little tells. So that will be any sophisticated actor worth its salt will not fuck up like that. No, no. It seems pretty easy to fix that. I mean, like another tell, of course, is if GPT-4 generates a little mini essay, then it'll tend to have a concluding paragraph, which is very nice. In conclusion, it'll run through all the things, which is probably what people should do, but we usually don't bother.
Starting point is 00:52:17 We used to see these funny things that were tells of somebody who, particularly in the Facebook content, I think I said it was like a little bit longer, a little more persuasive as a, you're not limited by the Twitter character limit. And there would be these arguments that were so funny to me because it's like a Texas secessionist page, but it's writing, I don't know if you've ever studied a foreign language. I have, I'm Russian, funny enough. I'm not good at it. But you would have to, you know, these exercises are like the three paragraph essay and that was how they wrote it would be like as a texas secessionist i don't
Starting point is 00:52:49 like hillary clinton you know but then it would be you know he was very like this very proper non-colloquial extremely formal english and it would end with like a moreover or furthermore and that for me was always the talent like I'm like, nobody in, no American, especially not some Texas secessionist, is out there being like, moreover, let me close out this thesis here in this Facebook post. So I always found that kind of amusing as just these little kind of ESL-ish tendencies
Starting point is 00:53:21 that would kind of like pop through. Like you learned English in a classroom and you're writing as if you would for a professor. And here is that formulation laid out in front of me, pretending to be a Texas secessionist. I dread to think of the atmosphere in 10 years where there's just various AI bots on whatever social media platform we're on, debating, bots on whatever social media platform we're on, debating, having trouble with each other. But it might end up a bit like the prisoner's dilemma, where you can create your strategy and send it off and see which one dominates the ecosystem in real life. But so, Renee, you mentioned about the API access to Twitter, and that's probably,
Starting point is 00:54:07 to some of our listeners, all words that don't mean anything. But so there's issues about what researchers who want to study disinformation or just, you know, the social dynamics on social media would really love to have is unfettered access to the metrics that the social media companies have, to what extent they collect them. And there's a bunch of different platforms which would offer different levels of access. So I realize it's a kind of complicated topic to address. But I'm particularly curious about, under Elon, whether publicly facing, his claims are that he's going to be much more transparent about things and that they want to make things clearer about, you know, he did release some details, at least about the algorithm platform how easy is it or how possible is it to get access to those kind of metrics and has that changed notably yeah so on um i don't remember if it was like 420 or you know or um or the last day of the month that they revoked our access, but they did revoke API access for free users.
Starting point is 00:55:28 And most academics who do social media research were participants in what was called the Twitter Moderation Research Consortium or had negotiated access to APIs. You just kind of go through this academic access process or you sign some forms, university does a whole thing. So there were, and there was also people who could pay for APIs, right? So firehose access, certain companies that were either building social listening tools or whatever other reason they had sentiment analysis tools or whatever would pay for the firehose.
Starting point is 00:56:04 And then in turn, kind of basically do research for a whole lot of different clients based on a stream. For us, it was much more, we just did access, sorry, we just did research internally among our own team. So there were two different programs though. There was the API access, and that was how anybody who wanted to study information, just kind of like coming through Twitter, either through historical data or through what's called real-time ingest,
Starting point is 00:56:29 where you're just kind of, maybe you want to collect all tweets with the name Beyonce in them, right? You could just kind of like set up a keyword and do that. So that's kind of one mechanism of research. The other mechanism though that Twitter offered that was very, very, in my opinion, really best offered that was very, very, in my opinion, really best in class was the Twitter Moderation Research Consortium, where when they did things like state actor takedowns, researchers who had applied to be part of that program received visibility into the hashed datasets of the tweets that were attributed to state actors or whatever else that came down. And so in some ways, like when we would write these reports, we were one of the early, early members of that program. And so Twitter would tell us like,
Starting point is 00:57:14 hey, there's going to be a takedown. It's going to be this India, for example. And then we would receive the data. And then we would write our own independent assessment of what was in the data. So Twitter would put out its statement. But this was a way for researchers with expertise in understanding state actor dynamics would be able to write a report conveying to the public what was actually in that data set. And I think it was really valuable because it's one thing for a platform to tell you what it's taken down and for you to just trust it. But this was sort of a second order. And sometimes we would go back and there's some of our reports. I remember one of them in particular, Russia-related one. We said, there's a cluster of accounts in here. We actually really don't understand why. We cannot make these make sense in the context of the
Starting point is 00:57:59 rest of the network. And because we don't have confidence in that attribution, we're not going to engage. We're not going to put out descriptions of that stuff and potentially create false information around it. So that sort of dynamic, I think, was really important because there was that back and forth interplay of we all want to have the best understanding of this system. And so it was very collaborative. And that doesn't exist anymore. The whole team got laid off. Yeah. I might move us towards some topics that I've heard you talk about recently that we think are really interesting. And that's around these questions around moderation or censorship,
Starting point is 00:58:36 this endless controversy that you seem to see on places like Twitter between people that are all, you know, free speech is great. You know, the free market of ideas, it's a level playing field. You know, the best ideas will naturally rise to the surface and there shouldn't be any of this terrible censorship stuff. And you said something recently, which I really liked, which was that you said there was no neutral in the recommendation algorithm. So, yeah, I just wonder if you could speak to that a little bit and just let us know where you – what's a more nuanced way to think about this instead of the terribly boring kind of thing about are you for free speech or against it? Well, I'm for it.
Starting point is 00:59:18 You should. Well, no, it's a really interesting question. I've been – this was the thing that I actually started off writing about, right? It wasn't state actor interference. It was actually this question of how do we think about networked information cascades and peer-to-peer information transmission? And if you think about the internet as an ecosystem for capturing attention, and that having attention can translate to real-world power, what you start to see are very significant incentives to manipulate that system
Starting point is 00:59:52 to try to get attention. And the bots conversation was a very, very big part of this when I first started writing in 2015. Hey, if you make it trend, you make it true, right? That was a little phrase I used in Wired once. And so everybody is trying to game that trending algorithm, because everybody wants their hashtag to get that attention, to kind of blow up their social movement or their cause or even their shoe brand or whatever. And so the game that was created was really, really interesting to me from the standpoint of, okay, well, how do you use different types of accounts in this system? How does this platform engage with that platform? What do you use one for versus another? And so I started really paying attention to that, again, way before it was state actors. I was kind of captivated first by the anti-vaccine movement,
Starting point is 01:00:40 why were they so good at this? And you guys, I know, study gurus, so you've probably got some opinions on that too. But structurally, I was just really interested in how they used it. And then ISIS was sort of the second thing that I became really captivated by, right? You have this organization with the six-word iconography, right? Like, you want to talk about old school, I mean, that's it. You know, you're growing a caliphate, you're putting out how grand your world is, and you're not hiding it in any way. It's not like Russia operating as things that they're not. It is absolutely transparent, overt propaganda. And I was fascinated by who engaged, what they chose to put out, and the sort of style that they were using. But one of the really interesting questions then was, why is ISIS on Twitter?
Starting point is 01:01:29 And Facebook had made the decision already by that point, by 2016, 2017. It was late 2015, actually. It was like October. It was right before the Bataclan attack. Facebook had already made the determination that these accounts were not going to be allowed on the platform. And it had what... I don't know if it was called dangerous orgs then, but that was also a way to construct around certain types of groups who were harmful, who should not be. But Twitter was having this question about one man's terrorist is another man's freedom fighter.
Starting point is 01:01:53 And if you take down ISIS, what comes next? And so I thought, well, OK, maybe we can draw lines around terrorist organizations. Is that a thing that we can all agree to? So I think it's interesting to ask those questions. How do you moderate? Recognizing, as you said, that there is no neutral. So anytime you see a feed, it's ranked. Even reverse chronological is weighting time
Starting point is 01:02:19 as the most important factor. It's probably the most neutral, in my opinion. But everything else that happens on your Twitter feed or your Facebook feed is ranked according to a recommender system, a curation system, deciding what you're going to want to see and engage with. And so again, there is no neutral. It just picks one of these posts. But I was really also struck by the fact that if they change that in any way, all hell would break loose, right? Because somebody would lose, right? Some group of pages, some style of content would lose as some other one would quote unquote win, right? Would sort of rise to the top, would get that attention, would get that engagement. And so it turned into
Starting point is 01:02:53 these battles. And then of course it became partisan, right? This question of: Facebook and Twitter and YouTube, to a large extent, created a moderation framework called Remove, Reduce, and Inform. Remove was it came down, right? It comes down, it's gone. Reduce was it gets throttled in distribution: there are not as many recommendations, it's not curated into a feed. That's how Reduce works.
Starting point is 01:03:20 And then Inform puts up an interstitial. Literally, the content gets labeled. Sometimes now, but not in the very beginning, the content being labeled will also trigger some sort of downranking, right? So there are ways in which these two things work together. But I thought, okay, so here we have remove, reduce, and inform. This is great.
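As a toy illustration of the two ideas in play here, that there is no neutral ranking and that Remove, Reduce, and Inform are different interventions on whatever ranking exists, here is a short Python sketch. The actions, weights, and field names are invented for illustration only; this is not any platform's actual system.

```python
# Toy model: every feed applies *some* scoring function, and Remove / Reduce /
# Inform are just different interventions on it. All values here are invented.
import time
from dataclasses import dataclass, field
from enum import Enum


class Action(Enum):
    NONE = "none"
    REMOVE = "remove"   # content comes down entirely
    REDUCE = "reduce"   # stays up, but throttled in distribution
    INFORM = "inform"   # stays up, with a contextual label attached


@dataclass
class Post:
    text: str
    created_at: float
    likes: int = 0
    action: Action = Action.NONE
    labels: list = field(default_factory=list)


def inform(post: Post, label_text: str, also_reduce: bool = False) -> None:
    """Inform: attach a label; optionally also downrank, as sometimes happens."""
    post.labels.append(label_text)
    if also_reduce:
        post.action = Action.REDUCE


def score(post: Post, engagement_weight: float = 0.0) -> float:
    """Reverse chronological is just the special case engagement_weight == 0:
    recency is the only factor. Any other weighting is already a design choice."""
    recency = -(time.time() - post.created_at)          # newer posts score higher
    base = recency + engagement_weight * post.likes
    if post.action is Action.REDUCE:
        base -= 1_000_000                               # pushed far down the feed
    return base


def rank_feed(posts: list[Post], engagement_weight: float = 0.0) -> list[Post]:
    visible = [p for p in posts if p.action is not Action.REMOVE]  # Remove: gone
    return sorted(visible, key=lambda p: score(p, engagement_weight), reverse=True)
```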
Starting point is 01:03:38 We can minimize the use of remove, right? Minimize censorship, maximize free expression. We can use reduce for particular types of time-bounded situations, right? Minimize censorship, maximize free expression. We can use reduce for particular types of time-bounded situations, right? Where maybe a rumor is going viral. Maybe it has the potential to have some kind of harm. Again, harm in the real immediate sense, not in a hand-wavy vague sense. Maybe you could throttle it in that moment while somebody tries to figure out what's going on. And then inform, I thought this is fantastic. This is counter-speech and this is contextualization, right? You're telling the audience, here's a disputed thing. It's staying
Starting point is 01:04:09 up. You can see it. We're not taking it down. But here are, like, other facts you might want to consider. So I started advocating much more for inform, particularly as COVID started in 2020, really leaning into inform as your best bet here. But now it's all called censorship. So now, you know, everything, everything is censorship. It's a very frustrating conversation, I think, particularly as somebody who has really, really tried to drill down on the answer to the question: what is the best possible design in a system with no neutral? And I really have tried to engage, particularly with critics, particularly with friends on opposite sides of
Starting point is 01:04:51 this, with the question: what do you want? Because you have to be able to answer that question. What are the values that you want to encode into the system, into this platform design, into this ranking or curation system, that you think is the correct thing to do? Because people who just bitch about censorship all day long, who just say that a label is censorship, have no positive vision for how you do handle things like: what happens when somebody's putting out content telling you that you should drink bleach? Which is a thing that really happens. I'm sure you've seen some of these communities, the MMS, autism, quote unquote, cure nonsense, right? What happens when somebody is putting that content out there? Does the platform have a responsibility? If so, what is it? If so, what is the design feature or algorithmic manifestation of that
Starting point is 01:05:37 value system, of that response to that kind of user-created content? So this is, I think, I'll stop with my soapbox rant up here. But really, I think that is the question. What do you want has to be the question that we ask when people are positioning themselves as defenders of free speech because it's not as simple as that anymore. No, no. Well, you're ranting to the choir there, so I enjoyed it.
Starting point is 01:06:47 Another thing very much related to this that you talked about recently is the motivations of bad actors, I guess, because within, say, those Facebook moms' groups, you have people who have discovered that they can gain a lot of clout, maybe make money, by playing on paranoias and neuroses. And there are people out there who are not agents for Russia or some other state. They don't necessarily have a coherent agenda, but they've certainly learned that you can get clout, you can become popular, and perhaps make money by provoking those fears through a whole spectrum of stuff. I guess you'd call it misinformation, but it can just be very highly tortured or twisted information as well. So, you know, how do you think about these actors and those dynamics within these social media ecosystems? That became really a central focus of the work that we did during COVID, right?
Starting point is 01:07:28 Which was exactly, so it was in recognition of the fact that, as you note, many of the most influential players in the COVID conversation were people who already had large audiences, but more importantly, audiences that trusted them, right? And so wellness influencers, for example, were a big piece of shaping conversations around vaccines. There were the
Starting point is 01:07:51 diehard vaccines-cause-autism kind of anti-vaccine people. But holding all of that aside, there is this phenomenon that you described, which is sometimes well-meaning people, sometimes people who are profiting, sometimes people who want clout, who are in there expressing their opinion, expressing their point of view. So what we tried to get at was this question of what narratives were sticky, right? And we only use public data. I don't know if I mentioned that at any point in our conversation, but whether we're looking at the Twitter API or Facebook's CrowdTangle, it only surfaces public data, so we don't get to see anything private. So within the context of the public data, you can see people who are looking for answers
Starting point is 01:08:35 to questions, right? And so there's a lot of back and forth as people are trying to make sense of the world. And so one thing we asked was, could we quantify groups of narratives, ways of understanding what people were concerned about? And could we surface that to public health or community counter-speakers, right? Rather than trying to fight the fight with fact checks and labels, because, as you note, sometimes things aren't really demonstrably false, right? Like, misinformation is actually a terrible, terrible word, in my opinion, for most of what is happening on the internet, particularly in those communities. And so these were, you know, people for whom
Starting point is 01:09:15 there's a grain of truth, maybe there's an exaggeration, but they just don't really know where, you know, where the best information is at the moment, where scientific consensus is or what have you at the moment. And so we said, can we enable physicians to understand what... Because physicians are still highly trusted. Can we enable physicians to understand what people are talking about so that they can put out counter content, so they can try to engage with these communities, that they can try to reach these communities? And it doesn't always work. It's a very kind of experimental thing because if the community doesn't trust the speaker, they're not really going to care about the message. So that's the part where the sort of counter speech breaks down.
Starting point is 01:09:54 And this is one of the kind of central problems, I think, with the idea that counter speech is going to win in the marketplace of ideas. I would like it to, but the problem is as we move into kind of increasingly divergent realities, if you don't trust the counter speaker, if you don't believe they have your best interest at heart, you're going to kind of dismiss it. So the question becomes, is there somehow like a member of that community who can get that information into that community? And that I think is still an area that's, you know, really worth kind of further examination, but it's really, really hard to do. It feels as well, Renee, like all the problems you're talking about, we're very familiar with. And I can imagine as well that like any efforts to do what you're discussing, right, like to get
Starting point is 01:10:41 experts to engage with a certain community, or community members to themselves highlight good information, would be seized upon as fake grassroots action from the institution to control the narrative. And I think to some extent there's an asymmetric warfare issue, whereby conspiracy narratives, and narratives that talk about, you know, an out-of-touch elite, populist views, are more sticky and more appealing than the notion that, on average, the expert consensus will be correct versus the lone renegade, right? That sounds like the kind of view
Starting point is 01:11:25 that a sheeple person would have who is insufficiently critical. So I'm kind of curious about that because we grapple with that issue too, whereby when we are critical about the various anti-vax messages or that kind of thing, a lot of the times people present it
Starting point is 01:11:44 that we uncritically ingest the information from any institutional source, which is not the case. I guess that I'd simply... Chris isn't an NPC. He's sure of it. Yeah. In any case, it's a bit of a bugbear because my stance, possibly the same, I think, as most reasonable people is be critical of all the information you consume, but weigh it in terms of the relevant expertise that is associated with whoever's delivering that. So I'm kind of curious about that issue, though. Like, as you just mentioned before, you were embroiled in a controversy recently
Starting point is 01:12:28 primarily because Michael Schellenberger framed you as the leader of the censorship industry, right? And, you know, there were releases of messages back and forth, and there were various Substack articles targeting you and that kind of thing, and accusations that you were involved with the CIA and that this had been uncovered, and so on. So I'm curious, in that respect, given that that's going to happen, how do you address the fact that, if you're involved in the kind of work that you are, you will almost inevitably become part of the conspiracy nexus of the institutions trying to silence free thinkers, especially if you're working on programs to push back against anti-vaccine disinformation or that kind of thing? I know you've
Starting point is 01:13:25 but it didn't really prevent you from becoming a villain in that ecosystem. So yeah, I'm curious about your thoughts. Oh, well, that was sort of funny. What people say to you in the DMs is very different, of course, than what they say in public and Twitter sometimes. So I don't think it was the entire heterodox community that kind of went down that particular
Starting point is 01:13:50 rabbit hole. But, you know, it's a really interesting question. So first of all, I think just to point to one particular thing, the quote unquote censorship industrial complex, that was not based on anything sourced to the Twitter files. That was based on a man named Mike Benz, who was a Trump appointee who'd been a speechwriter for Ben Carson. He spent a couple months at state, and then he started calling himself the Foundation for Freedom Online. So he created a website, and he started blogging. And he took our report on the 2020 election. And what we did during the 2020 election was we had
Starting point is 01:14:27 analysts, a bunch of students who would kind of pay attention to narratives specifically related to claims about voting, not Hunter Biden's laptop, completely out of scope, just claims about voting. And a lot of the claims about voting really became very linked, particularly by President Trump, to delegitimizing the election. So a lot of it was mail-in balloting is fraud, you know, this is fraud, that is fraud, all these allegations of fraud, completely unsubstantiated. And so we were in touch with state and local election officials, and so when we would see some of these narratives that would go viral, we communicated with them. Sometimes they sent them to us. Sometimes we pinged them and we said, hey, what's the story here? Or, hey, this is kind of breaking out. Maybe you should respond. There was no collusion to take down tweets.
Starting point is 01:15:14 But this man, Mike Benz, took our report, and there's a part where, at the end, we say: we just went, after the election was over, after January 6th and everything, and took the hashtags of very, very highly viral things, right? Things where there had been a whole lot of activity. And we pulled down all of the tweets from Twitter historical data about them, giving us about 22 million. Mike Benz either, let's say charitably, misread that part of the methodology and said that we had censored 22 million tweets, and then that we had censored every narrative that had these hashtags in it, and then, in the course of doing so, we had censored billions of tweets, silencing millions of voices, targeting conservatives, et cetera, et cetera. So Benz begins to write these blog posts
Starting point is 01:16:00 and he reframes himself as a censorship, I'm sorry, as a cybersecurity expert. I think Schellenberger calls him, like, the head of cybersecurity at the State Department on Rogan. It's complete nonsense. But that is where all of those claims come from. There is no evidence of any of that in the Twitter files. That's kind of the most remarkable thing about this entire situation. But based on that man's blog posts and his misreading of how we worked and what we looked at and what we did, Schellenberger really ate it up and he just kind of regurgitated it under oath to Congress. And I was very surprised by that, I have to say, because I'd been engaging with him one-on-one; he reached out in December. He wanted to know about content moderation. He didn't even know the basics, but he was already writing about
Starting point is 01:16:49 it. And so I just, you know, I talked to him, I engaged with him off the record for months. But again, when it came right down to it, rather than reaching out to me and saying, hey, I've got this guy who I connected with who's briefing me for my congressional testimony, he says you censored 22 million tweets; nobody ever asked me that question, right? So it was a really remarkable kind of construct. And I get it: he's got a Substack. He needs to make money. He needs to grow his audience. And the way that conspiracy theories work, you have to create a villain, because, I mean, I just explained it to you and it probably sounds very convoluted, very like, you know, red string on a whiteboard kind of meme, right? It takes a long time to debunk
Starting point is 01:17:31 point by point all of the crazy allegations in a conspiracy theory. For me, it's an order of magnitude more effort to refute the bullshit. Whereas for him, all he has to say is: she censored 22 million tweets, she's the head of the censorship industrial complex. You know, and people who are not critical, haven't read my work, don't know who I am, don't know that I've advocated against takedowns for the better part of five years now, are like, oh yeah, censor. And so that's how the bespoke reality works, unfortunately. And that's how... We're not surprised by it. I'm not surprised by it. It is, you know, it is a response... I mean, I think, again, the question comes back to what do you want. So do you want elections in
Starting point is 01:18:14 which viral lies spread and proliferate, and election officials are threatened, and people don't trust the outcome of the elections, because incentivized parties are lying and, you know, using their large follower counts to manipulate their followers with claims that an election was stolen or ballots were destroyed or mules are doing God knows what. Is that the kind of information environment you want? Or is it reasonable for platforms to label some of these claims? Is it reasonable for elected officials to counterspeak about these claims? I still maintain that the answer to that question is yes. And I still maintain that a label and counterspeech isn't censorship. And really, that's the most basic argument I can make about it. Well, just as an anecdotal experience that I think confirms the level that censorship has reached on this topic: for all the furore that surrounded Joe Rogan, and we ourselves covered in depth his episodes with Peter McCullough and Robert Malone,
Starting point is 01:19:18 and there was, like you say, a fire hose of information. So our episodes are like three hours long, and we barely cover one third of what they said that's obviously wrong in them. But the outcome of all those editorials, of all those articles, of all the hand-wringing, was that Spotify put up a little blue tab that says this contains information. But we get it.
Starting point is 01:19:43 Every single podcast that mentions COVID gets the exact same thing as Rogan. And, oh yes, the episodes where the N-word was mentioned were taken down, so there was that as well. But just to say, you know, think of the massive amount of Substacks and discussions and podcasts over Rogan being destroyed. And what was the outcome? A banner on his podcast. He hasn't stopped talking about COVID. He hasn't stopped promoting anti-vaccine stuff. But I didn't want to make you opine on Rogan. The thing that I did want to ask you about, Renee, is, so, you know, what you just outlined, and this is me saying it, not you, that when I listened to the conversation between you, Bari, and Michael Schellenberger, it sounded like I was listening to people who have not done basic research
Starting point is 01:20:46 on how moderation functions, or even just the kind of policies that were in place, or what the agencies are. And that's been my constant impression with the Twitter files: people have, you know, very interesting access. You could present interesting material just from the moderation discussions that have to go on, whatever your position on the actions that they end up taking. But instead it became this, you know, hyperbolic, politicized thing where everything was presented in the most sensationalized way. And I want to just ask you about the degree to which it's frustrating or productive. Like, you do engage, you know, with the heterodox sphere, and I think that's good. There are discussions on there that are not kind of hyperbolic and partisan. I think people have
Starting point is 01:21:39 genuine questions as well. But how do you deal with that fact that there's so much heat and fury around the topic and so much strong opinion, but coupled with a lack of research? That's a bit that I would find personally hard to deal with. And I have had my run-ins with Sam Harris on a similar thing where he talked about the Christchurch shooter manifesto in some depth on several podcasts. He never read the manifesto. This is a particular bugbear of mine because I'm publishing research about manifesto analysis. But yeah, so that like lack of research, it just really grates on my soul. And I'm sorry to like project it onto you, but I'm just curious because you deal with it much more directly. I think the only, you know, I don't, I really like,
Starting point is 01:22:27 first of all, I like arguing, but I also really like engaging with people that I have disagreements with. I've always felt like, you know, I had fun on the Sam podcast with Bari and Michael at the time, because he hadn't yet reduced me to a caricature and turned me into a comic book James Bond villain, right? I mean, and that was the thing, again: when somebody does that, it's because they can't engage with the merits of the argument and they're going to profit from harassing you. They're going to profit from this activity. And so that was a real disappointment. But with people who come to the conversation with a strong opinion that's rooted in some kind of values, I think, okay, well, you're approaching this with this kind of passion because you have that sense of values. Where's the common ground? What can we agree on, right? Where do we draw the line? Is there a bright line around inauthentic accounts? How do we feel about them? Is there a bright line around... I think with most people,
Starting point is 01:23:32 you'll get agreement on the immediate incitement to violence, right? But when we talk about harassment, right, people's experience of harassment on the internet, what do we mean when we say that? You can have these conversations. And if you can have those conversations, then you can get to a place about what kinds of things are foundational to a good moderation environment. With people who say all moderation is censorship, you're not really going to get very far, right? And Sam brought this up a couple of times too. He said, what, is it just like the let-it-rip school of everything? And Schellenberger is like, no, no, no.
Starting point is 01:24:02 Okay, well, then what is it? But there's certain things where I found, funny enough, what he says now about transparency was something that I said about transparency, which is, for example, government requests to a platform to take down content should appear in a database somewhere. I have in the past written about it in the context of being similar to the Lumen database that exists for copyright takedowns, for DMCA takedowns. And that just lets you see who is requesting what and what did the platform do. And Google actually does these. And it's fantastic. You can see the government of India requested XYZ
Starting point is 01:24:36 and you can go and look and it says like, Google denied the request and sometimes why, or Google acquiesced to the request and then why. And I was like, this is fantastic. Every platform should do this. This is great. Because I do think there are times when a government might have a justifiable reason to do such a thing, to make a request. And if so, then the platform should respond to whatever legal argument the government is making, but then the public should get to see that that has happened. So I think that, again, since my work has never been about advocating for mass takedowns, this argument that government takedowns should have that transparency requirement, I think, is a really good one. If private groups want to lobby private companies to take things down, should you put it up? Yes, you should.
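For readers who want a picture of what the Lumen-style transparency log being described might contain, here is a hypothetical sketch of a single record in Python. Every field name and value is invented for illustration; the point is only that the request, its claimed legal basis, and the platform's response would all be published.

```python
# Hypothetical sketch of one entry in a Lumen-style transparency log for
# government takedown requests. All names and values are illustrative only.
from dataclasses import dataclass
from datetime import date


@dataclass
class TakedownRequest:
    requesting_government: str   # which government made the request
    legal_basis: str             # statute or court order the request cites
    content_reference: str       # URL or platform content ID at issue
    date_received: date
    platform_action: str         # "complied", "denied", "partially complied"
    platform_rationale: str      # why the platform complied or declined


example_entry = TakedownRequest(
    requesting_government="Government of India",
    legal_basis="National law or court order cited in the request (hypothetical)",
    content_reference="https://example.com/status/1234567890",
    date_received=date(2023, 4, 1),
    platform_action="denied",
    platform_rationale="Request did not meet the platform's legal threshold.",
)
```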
Starting point is 01:25:25 You should have a general transparency rubric. But it's the government thing that's where you really get into impingement upon freedom of speech or censorship in the way that we in Western society have understood that term for the last few hundred years. Yeah. I think you mentioned the strong emotions or the strong sort of pre-existing values that feed into these kinds of discussions, which probably doesn't help them very much. And I'm going to cite you one last time from one of your previous talks. But, you know, actually you described yourself as centre-left, which is kind of another reason Chris and I like you, I think,
Starting point is 01:26:04 because that's the exact right point you should be on. But you also mentioned the kind of institutional, like, a different dimension. And you described it as being more institutionalist. Oh, yeah. The person you were talking to, you know, contrasted that with people who want to sort of burn it all down. But, you know... Yeah, I think so, yeah. And, you know, I think that's becoming more and more important, isn't it? Not so much about whether you're more left-wing or more right-wing,
Starting point is 01:26:41 but rather the degree to which you have some sort of faith that there should be some kind of institutions. And, you know, moderation or the various procedures that are happening online all sort of require the creation of some kind of institution. And there are other people, like the people you were debating with, or people who have a problem with you, who just seem to me to have this libertarian paranoia about any kind of influence on things. So, yeah, what's your take on all that stuff? Yeah, I think, no, it's a fascinating question. There's a book by Balaji Srinivasan called The Network State. I don't know if you guys have read it or encountered the idea.
Starting point is 01:27:22 But, you know, so I was in the Valley for a long time. Like, this is not the career I set out to have, right? I was in startups, I was in tech, I was in finance, you know, I had a whole other very normal, bland, non-controversial set of jobs before this. So I really did love the Valley for its kind of big imaginative ideas, and then also for its willingness to kind of fight about them and debate them. I found that really appealing. And there was always that kind of cult of the founder-visionary, though, which has some negative connotations, right? People who ask, can we, but not, should we, right? The should-we part doesn't always get asked, but the can-we part really does. And then it becomes a question of how do we solve this problem? And I always found that just a really great environment, a really great group of people to be in. So, Balaji's book: he gave this talk when I was in
Starting point is 01:28:22 tech called Exit Versus Voice, right? And it's, I think, kind of the central divide that you're talking about. So exit is you go off and you start your own thing. And the network state is a vision for what that might look like. What is a state that transcends geography, where people kind of opt into this community that exists in virtual space? Now, plenty of people are going to be like, that's all bullshit. But I think, again, Balaji is a person that you can really engage with around ideas. And he lays this book out.
Starting point is 01:28:53 And what's nice about it is it has a one-paragraph, a one-page, and then a couple-hundred-page version. And you can just choose your own adventure, how much you actually want to read. But there's that exit-versus-voice question: do you leave and go do your own thing, or are you a voice person? And I have always been a voice person, right? Even when that is incrementalism, right? I think of, you know, my rather pointless fights on Twitter about, like, bringing eighth grade algebra back to the San Francisco public school system. You want to talk about getting nailed for weird shit on Twitter. The amount of people who yelled at me about that, quote unquote, segregationist point of view was surreal. But it was a fight I was willing to step into because I believe in it. And as incrementalist as it sounds, I think it really is a material thing that children need. And as a mom, I was
Starting point is 01:29:38 willing to kind of go to the mat over that one. But then the bigger question is, society doesn't function without institutions. And so while I think many of them perform badly, while I think many of them need to be reformed, while I even agree with some of the center-right people about areas of excess, right, or areas in which liberal values have been, you know, deprecated a bit or abandoned in some cases, I do think that you have to have something there to hold society together. This is my, you know, institutionalist point of view. Whereas other people were like, well, it's all going to be virtual and we're going to have Web3 and so on and so forth. That was, I think, the context of my conversation with Antonio about
Starting point is 01:30:16 that. But it is this, you know, what I can't get through with the accelerationist is, okay, you burn it all down and then what rises in its place? Because the answer that they don't want to say is a new set of elites. It's not that there is a society with no elites and no institutions, it's that they think that the current ones suck, but accelerationism and burn it all down, something is going to rise in its place and they just believe that that something will be better. But that's the part that doesn't really get said out loud. Yeah. I mean, that's actually connected to what you were talking about before with the algorithm. There is no neutral. There's always a winner. There's someone on top. So you could demolish the mainstream media, all information and news is crowdsourced now and people have got complete freedom to choose what they want,
Starting point is 01:30:59 but you still end up with a Joe Rogan or some people, it's like, you become the new thing, right? Yeah, there was a book by Hugo Mercier, and I'm trying to remember, he had a really great quote. I mean, it was something about how gurus kind of exist to get in front of the parade, right? Like, it's the passion of the crowd that creates the guru, not necessarily the guru, like, deciding consciously to go be a thing. It happens a little bit more organically than that, because the crowd wants it. And so I do think about that a lot. What is the demand? Where is that demand directed? And what does that look like in the realm of media, in the realm of institutions? I don't know how many
Starting point is 01:31:40 libertarians would really go to the mat to say, you know, we shouldn't have meat inspectors anymore, right? You know, so, maybe they're all going to find me on Twitter and say exactly that. But it comes back to that question of what do you want, right? I think that's the one that I just kind of returned to over and over and over again in conversations with people, because it really reveals a lot around both what their preferences are, but also how well thought through the argument is if you carry it to its logical conclusion. We completely abdicate content moderation, and then what happens? And so again, the same thing: the CDC didn't perform well? Okay, it sucked, okay. You know, I have felt that way since 2015, actually, honestly. So what do we put in its place? Who runs it? Where does it come from? Who funds it? What is its mandate? You know, get specific. And that's the part where, a lot of times with the internet grievance brokers,
Starting point is 01:32:42 you know, it all grinds to a halt, because there's no answer. I've met Hugo Mercier when he was visiting Japan, and I like his book Not Born Yesterday as well, because it kind of emphasizes that people actually are persuaded by reason and do have beliefs about things. It's kind of an optimistic thing to hear in this day and age. But, um, I don't know if I agree with it; I think it's an interesting book. Yeah, there are parts of it I agree with and parts that I don't, but I think I like the aspect of it that's less emphasizing that people are completely adrift and just purely, you know, the victims of the algorithm. The algorithm, right? I hate that framing. I think that's an absolutely terrible framing, and I think it's really gone too far. You know, again, for all of the critiques I had of algorithms in
Starting point is 01:33:36 the, you know, 2015 to 2018 time frame, particularly because I feel like that was still sort of the Wild West, right, where, you know, I don't know, maybe pushing videos of beheadings is not a great idea, actually. I mean, you're like, might we not recommend certain things? Controversial take. You know, these are my hot takes. These are my censor opinions. You know, but now the abdication of agency is also a very, very big problem. Again, though, I have been really encouraged.
Starting point is 01:34:08 I don't know how much time you guys are spending on Fediverse sites, either Mastodon or Blue Sky or more like the federated protocol driven platforms that are emerging. I have really been interested in seeing what early communities develop, but also you watch people kind of pop over. And because it's an invite-only community right now, particularly Blue Sky, it is very much like people have invited their friends. And so it still has that everyone on best behavior. It's like everybody's a couple degrees of separation away from each other through this invite tree. And there's maybe 60,000 people on it, but people are so excited by it.
Starting point is 01:34:48 They're like, gosh, what an enjoyable place to be. There's no main character. There's no mass harassment, no roving mobs, like, waiting to find your worst take. And you're seeing people, I think, appreciate this value of a smaller network. And I'm really struck by that, because I think the other interesting piece of it is the Fediverse allows for, or federated models,
Starting point is 01:35:10 allow for the creation of smaller networks with community-centered, community-driven moderation so people can create the environments that they want. And I do think, this is why I like Reddit so much also. It has that same kind of dynamic. Yeah, there's some top level you know, some top level things where like, you know, essentially the,
Starting point is 01:35:27 you're like the 10 commandments, like these are the things you will not violate. And then after that, in your communities, you'll kind of decide the nitty gritty, the sort of nuance,
Starting point is 01:35:35 what words you can and can't say or whatever amongst yourselves. And so I think that that kind of splintering into smaller networks is going to happen. Yeah, exactly. That sort of might be the word that Blue Sky uses, but yeah. Yeah, I have a theory that Elon intentionally has become like the liberal Jesus
Starting point is 01:35:58 because he sacrificed himself to destroy everyone's addiction to Twitter. The only way he could do it was by making it a viral environment and inviting all the evil back to play with him. But I don't think he intentionally created that environment for himself. But he might be one of the sources that actually succeeds in making the federated, more diverse social media platforms a reality. But really, I did have a question: you know, you were talking about how in 2015, 2016, with the algorithm, there was a Wild West of sorts, right? Then you had various figures who became quite good at manipulating what's trending. You know,
Starting point is 01:36:51 you had the kind of alt-right Cernovich and Stefan Molyneux and Steve Bannon and so on. And I would draw a distinction, although there are overlaps in certain respects, with figures like Jordan Peterson or the people involved in the Twitter files, the IDW sphere. You know, there are overlaps, but in general those strike me as a different phenomenon. And I'm kind of interested that to some extent, you know, maybe to a large extent, a lot has changed in the algorithms: you won't see beheading videos, except maybe on Twitter now, and apart from that, you won't see them tend to trend as much. And there was a wave of, you know, Jordan Peterson and Elon Musk to a certain extent
Starting point is 01:37:34 and those kind of figures getting huge attention and having kind of what we're interested in on this, on the show, the ability for like secular guru types to get a large following, to dominate conversations about culture war topics. And I'm, I'm curious to what extent you think that is purely a phenomenon of social media, you know, that in reality, most people on the street don't know who Jordan Peterson is. Whereas with Cernovich and stuff, the argument is that they, you know, they helped to sway the election. So I guess I'm asking what role you think, you know, the ecosystem of gurus plays.
Starting point is 01:38:18 Are they like bit players, like a sideshow of interest, or are they very important nodes? Um, you know, it's a really interesting question. Have you ever read Damon Centola's book, How Behavior Spreads? He has this really, really, really interesting concept called complex contagion, right? And so in network contagion, think about it like a disease, right? Person A gets it, transmits it to persons B, C, and D, who then kind of go through the rest of the alphabet, right? So the thing cascades. But he has this argument, that I think is very plausible, about how certain nodes in a network, call them influencers in this case, because, like, people with large numbers of followers actually serve
Starting point is 01:38:57 like a gating function. Like, they decide what information is going to make it through from the periphery into the main part of the network. Because they have the capacity, given their follower count, that when they tweet something or post something, many, many, many more people will see it. And so it will kind of perpetuate the cascade. But there's this interesting theory that says that influencers are only going to really pick up things that are not too far beyond the pale. So in a way, they kind of gate it, because otherwise they're risking their reputation on picking up some random crank theory, whatever crank means in your particular, you know, bespoke reality or, you know, internet faction. But they have to kind of decide what
Starting point is 01:39:35 they're going to move through and push out to other people. And that is kind of what we see. So when we look at, for example, things that went viral in the 2020 election, and again, our extremely public report is online at eipartnership.net. But if you read it, there's a couple of graphs in there. And what they show is that a lot of this stuff comes from the bottom, right? It's people, like it's like a guy who takes a picture of ballots in a dumpster. And he genuinely wants to know what is happening with those ballots. That is a very real inquiry from a very real person. But he tweets it. And what he does is he tags in influencers that he trusts. And they are often kind of primed to believe that they are finding fraud. They have just found evidence of fraud. And so they tag in the influencer.
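Centola's simple-versus-complex contagion distinction, which underlies the gating role described for influencers here, can be illustrated with a small threshold-model simulation. This is a generic sketch using the networkx library with arbitrary graph and threshold values, not the analysis from the EIP report.

```python
# Sketch of simple vs. complex contagion (after Centola's threshold models).
# In simple contagion one exposure is enough to adopt; in complex contagion a
# node adopts only once a threshold fraction of its neighbours already has.
import networkx as nx


def spread(graph: nx.Graph, seeds: set, threshold: float) -> set:
    """threshold == 0 behaves like simple contagion; higher values are 'complex'."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node in graph.nodes:
            if node in adopted:
                continue
            neighbours = list(graph.neighbors(node))
            if not neighbours:
                continue
            exposed = sum(1 for n in neighbours if n in adopted) / len(neighbours)
            if exposed > 0 and exposed >= threshold:
                adopted.add(node)
                changed = True
    return adopted


if __name__ == "__main__":
    g = nx.watts_strogatz_graph(n=200, k=6, p=0.05, seed=1)   # arbitrary toy network
    seeds = {0, 1, 2}                                         # the initial posters
    print("simple contagion reach: ", len(spread(g, seeds, threshold=0.0)))
    print("complex contagion reach:", len(spread(g, seeds, threshold=0.34)))
```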
Starting point is 01:40:18 And then depending on whether the influencer picks it up, one or another of these allegations will actually kind of make it big, right? So the influencer becomes, in a sense, that person that pulls this thing up and really moves it through and really makes it explode. And you can see the kind of exponential curves that happen when one of these people decides to tweet it. And one other final thing I'll add on that is oftentimes they will use the language of conspiracy or something very, very vague. Rather than staking a firm belief, they'll just say like, big if true, or someone should look
Starting point is 01:40:49 into this. And so it's actually a really interesting kind of red flag, I think, that audiences need to learn to recognize, which is if the guy that you're following, if your guru or your influencer is tweeting big if true, someone should look into this, that's them basically saying they themselves don't have confidence in it. And so yet again, you have the combination of the influencer's capacity for creating a cascade, which is sort of the network analysis way of looking at these things, or looking at the substance and the rhetoric. Here's a scandalous, salacious thing that I have just tweeted, and you are going to be much more interested in it than if I just hit the retweet button. You brought up so many interesting things
Starting point is 01:41:29 there today that put words to stuff that Chris and I have seen. Chris even coined a term I love, which is, he calls them strategic disclaimers. I mean, that's what they are. Yeah. The classic one, of course, is Brett Weinstein saying that he was not going to entertain conspiracy theories; he was going to explore conspiracy hypotheses. I mean, you know, tomayto, tomahto, right? Yeah. Yeah.
Starting point is 01:41:53 I mean, you know, it's obviously like it depends on how you look at it. And we think about it a lot. Like to what degree are the gurus or the influencers, the people with huge follower counts, are they the ones that are the driving force in these networks? And to what degree are they, you know, tapping into the crowd and the crowd kind of makes them do what they do? And, you know, we've followed so many figures, but people like Russell Brand or James Lindsay or that fellow J.P. Sears, I mean, you name them, even Jordan Peterson,
Starting point is 01:42:25 they seem to, they are certainly following what the crowd wants from them and satisfying some sort of need there. But they do seem to be getting, almost all the people we've covered have gotten, more extreme, more edgelordy over time. So you said, and I'm sure it's true, that a guru, an influencer, wants to be controversial, wants to be exciting, have that frisson, but doesn't want to go over a line where it's completely crazy and damages their reputation. But it feels like, to us, we've seen them experimenting with that edge and finding out that they can actually go pretty far without
Starting point is 01:43:05 reputational damage. Have you ever seen the post on audience capture by, I know him as Gurwinder, and I'm going to be totally embarrassed if I've got that wrong. I feel like that's what his blog's name is, and that's how I know him on Twitter. Anyway, it's a really great articulation of the phenomenon of audience capture. He really just kind of walks through it, and I think one of the best treatments of that phenomenon is his Substack post on it. But, you know, I talk about it sometimes. I have a few friends who are YouTube creators with very, very, very large audiences.
Starting point is 01:43:41 And they're in the, you know, kind of science creator space. And it's very interesting to hear them talk about it, because they can see on their creator dashboards where people click in, where they drop off, what words brought them. They have so much micro-understanding of each piece of content. And so it's very easy to do what they kind of refer to as feeding the algorithm, right? Making content for the algorithm. And you talk to them about it sometimes and they do kind of wrestle with: are you staying true to yourself as a creator if you're doing that? Also, you have children to feed. So there are trade-offs here. Most of them, I think, recognize the audience capture phenomenon as, like, a thing
Starting point is 01:44:25 that I do not want to happen to me, in that sort of science creator community. But I think the political community in particular really feeds on itself and pushes itself to a more and more and more extreme take. That's why, you know, my maybe also controversial opinion is that when we think about what propaganda means today, and we think about the incentive structures that produce it, you know, we've always focused on the incentives that lead to propagandistic media in the broadcast sense. There is also propagandistic media in the social sense, and I've tried in my work, again, niche academic jargon maybe, to understand what that looks like. Because it is people who have a particular agenda, particularly an ideological agenda, who have that reach, who have that skill at rhetoric, and who have that distribution. And so it's very interesting to watch the way in which they engage a crowd and the feedback loop that comes about as a result of it.
Starting point is 01:45:20 And I think that in the political sphere in particular is where you really see that happen. We've also observed, and I think it's a phenomenon that, you know, especially people focusing on QAnon and leaderless cults, so to speak, have noted, right? That the ecosystems,
Starting point is 01:45:38 and maybe it's a function of social media, are particularly good at spinning out minor influencers and figures. So they're not like your Jack Posobiec or, you know; it actually is in a way grassroots, because it's individual people devoting huge amounts of time and building a relatively significant following, but not enough to support themselves. And yeah, I know that people have talked about QAnon and whatnot through the lens of an interactive game. But in our case, it seems like there are people that are successful in other aspects of life. They might even be, like, CEOs of companies, but they really get a kick out of being involved in these online social media ecosystems.
Starting point is 01:46:27 there's just something to attention ecosystems, which isn't, it isn't just about algorithms and stuff. It's just about people being social animals that thrive on positive feedback. So even the people that we see getting hammered, you know, for their conspiracy theories, like Brett Weinstein, every time he posts on Twitter, it's just thousands of comments saying you're an idiot, right?
Starting point is 01:46:50 But he also gets tons of feedback and praise from the kind of people that he admires and respects, and from people saying that you're a brave truth teller. So it does feel like maybe with those dynamics, it's not so much top-down: there's top-down stuff going on, there's also the influencer level, and you have the grassroots stuff, and people maybe fixate a bit too much on one level as the explanation when it's a melange of all of them. This is, um, Kate Starbird and I did a paper on this. And we talked about it as top-down versus bottom-up. It's actually in a lot of our election integrity work too.
Starting point is 01:47:32 That was where we really started looking at the bottom-up stuff. It's really important. And because of what you're describing, people get a sense of camaraderie out of it, right? They get a sense of mission. They have a sense of fulfillment. You're in the trenches with somebody. I mean, like I said, my little fights over algebra: I have a group chat with, like, eight moms that I've never met in person, but we really are, like, you know, the diehards: goddammit, there will be algebra
Starting point is 01:47:54 in SFUSD. And I'm not even there anymore, and I'm still in the group chat, you know, because it's like, this is a thing I want to exist in the world. Yeah, sorry, you have to tell me, I'm curious: what is this controversy about algebra? You know, in like 2015, San Francisco Unified eliminated eighth grade algebra. People are going to respond and they're going to say, no, they didn't eliminate it, they just, like, partitioned it out into other classes. But the problem is you're not allowed to start the high school math track until you're a
Starting point is 01:48:22 freshman in high school. So kids who are capable of doing it earlier used to have the ability to take it in eighth grade, as I did. And, you know, other, other, this was, if you wanted to get to like AP calculus, you had to start before, you know, to like get algebra done out of the way. And so a lot of kids really want to do that, but the district eliminated the capability for them to do that. And they did it under kind of arguments that it was an equity situation. The data didn't really bear that out. Moms have been accusing the district of lying about the stats and filing sunshine requests to get the data to do the analysis themselves. It is a whole thing.
Starting point is 01:48:57 And they tried to pass it at a California state level, leading to even more outrage that our garbage local laws were all of a sudden being considered for the 37 million people who live in the state of California. So it really, really heated up again. And SFUSD is now looking at bringing it back in 2023. So, small victory. But it was one of these things, though. Again, it sounds like, you know, all politics, the nastiest politics, are always local. But it was branded as the segregationist position, right? If you wanted quote-unquote tracking, then you were inherently a segregationist. And it was really kind of wild for the rhetoric to, again, go all the way up to boiling right there. There's no, hey, these people are just disagreeing about math. It was really like, you know, you had to have a massive fight in which you were
Starting point is 01:49:49 a segregationist, or you were, you know, for, like, quote unquote, woke math, which was kind of the other side of that argument. So again, it was one of these examples. It was very interesting to me, again, the same way that you can shut down a conversation by using epithets for people and make other people not want to participate. You know, again, that's where I think a lot of the conversation on social media, or the moderation conversation, or anything else is today. Like, there is some moderate, data-driven conceptualization of the world that we can achieve and have, where, you know, the overwhelming majority of people are happy. But that doesn't work if you get into your camps of, like, you know, these two polarized extremes. Yeah, I hope you avoided this online, Renee, but there was the era of the two-plus-two-equals-five debate. Oh yeah. Yeah, the mere thought of it sends me into a mental rage, because it's such a stupid thing to end up debating. And it feels like everyone, including
Starting point is 01:50:56 the folks on the liberal side who are arguing that they're just talking about the complex mathematics, that there are some formulas where it could be meaningful. And no, like, everybody is trolling everyone else, because in daily life everyone is aware that two apples plus two apples is four apples. But it's presented as if it's a big thing that's happening in school systems, that now kids don't even get taught algebra anymore, the basic arithmetic: two plus two equals four. And on the other hand, an unwillingness to suggest that two plus two equals five is completely... Everyone knows that's legitimate.
Starting point is 01:51:36 Mathematicians know it's legitimate. That is the sign of a reactionary mindset who can't handle nuance. So, like, that's peak Twitter. You know, the meme with the bell curve, and you've got the drooling guy here, and then Obi-Wan Kenobi, and then in the middle is the normie, you know. Yeah, yeah. I mean, Twitter is, um, it is, again... so far Blue Sky has been, you know, kind of a breath of fresh air on that front. I feel like I can just go there and people are being chatty and happy and interesting, and it hasn't quite
Starting point is 01:52:10 turned into, like, everybody's sharing the absolute worst news of the day yet. So in that regard it's been actually really nice, feeling like, you know, I actually don't know who the main character of Twitter is today or yesterday or the day before, because I just haven't paid attention. It's liberating. Could you pretty please invite us? Because I am sick of Twitter. I can't say that. I thought Matt Yglesias was supposed to be the main character on Twitter. People were posting cameras at him or something.
Starting point is 01:52:36 People were posting cameras at him or something. I think it's actually kind of harder to see stuff because there's no, there's like a what's hot, but that's individual. They came up with the name Skeet, which I think Jay, the CEO, hates. But the... So the What's Hot tab is not trends. There's no... Everybody is talking about this thing.
Starting point is 01:52:56 And so you can kind of... So clicking into Browse, that actually feels like you're going to see content you wouldn't normally see from the people that you're following. But it's also not, again, the absolute, you know, biggest pile of stuff or the stupidest trends or whatever. So I always really wondered, from a design standpoint, what would happen if you just killed trends on Twitter? Would that make the world, or make Twitter, just a more stable, less polarizing place? And I actually do think the answer is probably yes. So, Renée, I promise that we'll let you escape to the realm of slumber, or at least just out of this conversation, soon. I could literally ask you tons of questions, but there's one more,
Starting point is 01:53:39 at least that I have to indulge. So like, you know, you're talking about the new platforms and the federated kind of model, which is maybe becoming more popular. We'll see, right? You know, social media websites don't tend to last forever if MySpace is anything to go by. But I noticed that recently
Starting point is 01:54:01 I heard some interviews with some guys from Substack and saw them getting themselves into trouble. Oh yeah, that was... yes. Not great answers around the topic of hate speech, and it made me, one, slightly concerned, because it felt like, you know, Substack has had a lot of success, it's got big people on it, but there still feels like there's a lot of naivety around what's going to happen. And even just basic things, like, you know, you were talking about, what do you do if somebody promotes bleach drinking for children, which is a real thing, or what do you do if somebody is not directly telling their followers to target the parents of dead children,
Starting point is 01:54:47 but is strongly insinuating that they're potential actors who are pretending, right? And I was disheartened to see... like, it felt like Jack Dorsey from four years ago or something in the replies. I wondered if you had any opinions about Substack, or just whether we're doomed to endlessly have CEOs or influential people who seem to regard it as, you know, not going to be that much of a problem.
Starting point is 01:55:13 You know, we'll take it on a case-by-case basis. They're sort of an interesting one, right? Because they are... first of all, I like the platform. I found many new and interesting people on it. I have a bunch of folks that I saw in short form on Twitter that I followed on Substack, and I actually really,
Starting point is 01:55:31 really came to like their stuff, particularly the diverse political spectrum there. I think there are a couple of things that are not particularly great. I think that once the platform begins kind of recommending and cross-promoting other things, then... So this has always been one of the lines for me. The way that Aza Raskin and I... So I was an advisor to Center for Humane Tech. I think I still am an advisor. But when it was first founded, we used to talk pretty regularly. And we came up with this little phrase like freedom of speech, not freedom of reach. And one of the things that we were trying to argue with that was that there was a difference between platforms allowing somebody to be on there, right?
Starting point is 01:56:06 And platforms recommending, platforms boosting. Funny enough, this is now Elon's content moderation policy. That phrase is written at the top of Twitter's content moderation thing. It is literally in the recommender system. And I was like, the irony of me being tarred as this fucking censor while the man himself has just pulled my work to underpin his content moderation policy. But whatever. No matter. It's always interesting to see where your work winds up, right?
Starting point is 01:56:36 So we wrote this in 2018, trying to make this argument, because we felt that you could have an environment where something could exist on the platform, anti-vaccine groups or what have you, as long as the recommender system wasn't pushing it out. Because once the platform made a decision to push it out, it was a value judgment. It was saying: this is content that is in line with our values as a platform, that we think you, as an individual who did not search for it and did not indicate interest in it, nonetheless want to see. Here it is. And so we had argued for, like, is there some sort of "do not recommend"? Might it be derived from something like Google's Your Money or Your Life framework? And these are the sorts of things where these conversations have been going on for 10 years now. Your Money or Your Life started in 2012. So seeing new platforms not necessarily pay attention to that, or to that kind of learning, and think that it's all going to be, you know, that free
Starting point is 01:57:33 speech is the only consideration... I think once they start recommending things, once they start sort of promoting content by way of searchable leaderboards and things like that, there is an obligation to understand what you're doing and what you're hosting, right? And the "what you're doing" part in particular is where I think there's a need for a set of values underpinning a program of, we will or will not support this. That's where I think that question of, will we push it out or will it live somewhere else, comes in. There was one other thing about Substack that was interesting, and that was that
Starting point is 01:58:17 it, for all of the talk about it being distinct from social media, had this rubric early on that they called the fire emoji rubric. I'm trying to remember which tech newspaper covered this, but it was very interesting to me, because they were offering people advances, basically, to kind of quit their media jobs and come be Substack writers. And they evaluated them according to this rubric of, like, did they have high engagement on Twitter? Which is basically like saying, are you kind of a brawler on Twitter? So in a way, that same kind of incentive structure, that same sort of, you're going to do really great here if you're one of those people, does kind of wind up in the Substack DNA. And I think it's expanded beyond that now. I think the number one person on the politics leaderboard is, I think it's Heather Cox Richardson, who's a historian who writes a history Substack. But it is, you know, again,
Starting point is 01:59:10 I've been noodling around this question of, you know, what does propaganda look like in the age of self-generated content? I think it's, you know, it's an interesting space to watch. I also didn't think that that interview necessarily showed a full awareness of the Nazi bar problem, as Mike Masnick put it. Yeah, I mean, totally agree. Um, Renée, I think, apart from being concerning and worrying and all these bad things, it is just, from an academic point of view, absolutely fascinating that we're living in this world where you don't have propagandists on one side and consumers of propaganda on the other. We're all kind of involved in this.
Starting point is 01:59:49 Yeah, in the production of it, yep. And we're all, you know, even the best of us might be chasing clout from time to time. So I'm not mentioning any names there. I'm not one in this competition, but the other guys. Yeah, yeah. Those other people, yeah. But look, thanks so much for staying up late to talk to us.
Starting point is 02:00:08 No problem. It was a lot of fun. Yeah, really fascinating stuff. And we'll link to some of those talks that I was citing because I think people should listen to those as well. Chris, any final words from you? No, just seconding Rene. I have been and I'm a big fan of your your work and uh despite
Starting point is 02:00:30 your commitment to censorship and shutting down all of the free discussion on the internet, I overlooked that, because I think your work is very interesting and it broadly politically agrees with me, so that's good. I hope you win the algebra battle. Yeah, we will. Yeah, you can follow the Moms for Liberty model. They seem to be having great success. Yeah, the stormed school board meetings.
Starting point is 02:01:02 That's a whole topic for a whole other podcast, but I really do wish that, you know, center-left mom identity was a little more forceful at times. One thing that I think me and Matt greatly benefit from is we don't live in America. I'm in Japan. I'm from Northern Ireland. And Matt is in Australia, from Australia. And, like, we get the holiday from the culture war basically all the time in our daily life, so the little window that opens up to America on the computer is like, yes, it influences things, but it's also like, god, I'm really glad I don't live there. That's
Starting point is 02:01:41 that's the constant refrain that i that i get uh there's lots of grudge things about america as well they've got disneyland they've got buffalo oh sure there's lots of problems with japanese schooling as well but like you know i just i'm glad that i don't need to campaign because i've got no chance to change or influence anything here. Everything just has to be a fight here. I hope that we get past it. People are still wearing masks here. Actually, the problem has been that the government is trying to tell people they don't have to, but they still are. So that's the thing.
Starting point is 02:02:22 When I've seen all these debates online, you know, around mask mandates and stuff, it's just an interesting experience to be living in Japan, where masks were, you know, already a part of life and not controversial, and are pretty much still not controversial now. Anyway, so that's just an advertisement to go live in Japan to escape the culture war. And, um, yeah, but thank you for indulging us so much, and, yeah, I hope all your online interactions are peaceful from here on out. Thank you so much. Have a great night. And that was Renée, and that was our interview. And that was very good. Hopefully, people enjoyed that. I think they will, because Renée has a wealth of information. Yeah, I think the work that she and others are doing, which is to understand the role of
Starting point is 02:03:17 moderation, you know, in the broader sense, and to get it beyond this idea of censorship or free speech, that's kind of the wrong conversation. It's a silly, boring conversation. It's really very interesting, these issues of these connected networks and crowdsourced propaganda and just how to manage that. Because there's heaps of positive benefits to social media, YouTube, podcasts, and the rest. And, yes, it's just a matter about approaching it in a clever way. So luckily, the clever people at Stanford are going to sort it all out, Chris. So we don't need to worry about it.
Starting point is 02:03:52 They got it under control, I think. Yeah. Well, the other thing that I kind of appreciate is it's nice to see when somebody has a healthy sense of humor about these kinds of things despite being targeted by conspiracy cranks and whatnot. It's always good to see somebody still have a healthy attitude towards it. And it's notable that, like Anthony Fauci, Renée DiResta was also targeted on political grounds, you know, to the degree that her professional work interacts with that political sphere. And, uh, yeah, I think her personal experience is kind of similar, and you see what
Starting point is 02:04:31 happens in terms of the attacks on your reputation and so on when you are giving advice or presenting findings that politicians find inconvenient. Yeah, that's a shrewd analysis. I signed off on it, Matt, so take that back pat and put it in your pocket. Great, great. See, I was complaining the other day that, you know, you never tell me how brilliant I am. You know, Brett Weinstein's collaborators, Eric Weinstein's collaborators, they all get told they're brilliant all the time. You virtually never give me compliments. I sometimes wonder if this is kind of an abusive relationship. I mean, you know, you could just tell me I'm great sometimes. But, you know, it's good. Yeah, well, Matt, it's that time of the podcast where you don't get told you're great. You instead get told what you've done wrong, how you've mispronounced things, what your various flaws
Starting point is 02:05:31 are. And occasionally people mention me too in our review of reviews section. Now there's one, which is a passing point, but I noticed this too, Matt, and I feel you have to be pulled up on it. You have to be stopped. You refuse to correct your pronunciation of the Matrix. But on the last podcast, you said mic drop. Mic drop. Yeah.
Starting point is 02:06:01 In reference to a mic drop, right? Oh, yeah, yeah. Yeah, yeah, yeah. You know that's wrong, right? You know it's not dropping an Irishman, like a Mick drop. It's a microphone drop. It's spelt M-I-C. How would you pronounce that? Mic, mic, mic. Because I know it's short for microphone. So, yeah, you're not going to correct it? Are you going to keep saying that? Well, that's a Mick drop. Well, like Lex Fridman says, you know, it's hard to tell what the truth is. You know, it's like with the election. Um, who can say? It's just opinions. That's, like, your opinion, man. I mean, it could be
Starting point is 02:06:40 mick could be mike who can say yeah who can say? I leave it to the listeners to decide. I tried, dear listeners, but he can't be stopped. They're not the deciders. They don't get to decide. No, but the world has decided. No one else calls it Mick Drop, but that's fine. That's fine. Mick Drop.
Starting point is 02:07:00 So, Matt, we actually do have reviews. And given the critical nature of the negative review that I'm going to read, I'll just read this positive one, which is unusually straightforwardly positive. Its title is Simply Perfect, five stars. And it says: analytical and critical nonjudgmental analysis of the fallacious and conspiratorial rhetoric that is spreading all over the world. MN092 from America, thank you very much. A mellifluous review that I think is just spot on. It's exactly what we do. That's good. I don't get enough compliments from you, so it's good to get them from reviewers occasionally. Yeah, so this one, we've got a review which is like two out of five. Now, there's a lot in this review. I'll try
Starting point is 02:07:53 and read some of it and stop along the way, but, um, let's see where it goes. Okay, so: USA, USA, true left outside current calibration scheme, thoughtful emoji. Okay. It says: gentlemen, any good metrologist knows measurement equipment is calibrated against a reliable standard. To what standard are the Gurometer's readings being calibrated against? Paranoid Northern Irishman demonstrating neoliberal nuance porn and/or the comfortable passive aggression of an Aussie swim bro. Matt, I like that. So far, I'm on board.
Starting point is 02:08:30 I like the concept of a swim bro. I don't know what that is, but I aspire to be one. Why am I paranoid? Anyway, Matt knows enough math to understand the embedding, yet the podcast remains untested. What if Decoding the Gurus received an annual calibration from the American English working class left? Seems to me you casually dropped Aaron from Embrace the Void as the new token leftist.
Starting point is 02:08:59 Doesn't a Daniel Harper criticism remain unaddressed involving your protected son, I think Sam Harris, S.B.H., Sam, middle name B, Harris. Okay. And then, what about that Polite Conversations critique that never aired here? Drift remains unexamined; your opinion will skyrocket with neoliberals, the majority, and paint the working-class progressives who don't have time to write reviews like this until they are literally wasting PTO as somehow extreme. I don't know what that means, but, you know, let me continue.
Starting point is 02:09:58 You will bask in the comfort of high average ratings while slowly orbiting the giant black hole all of your gurus are doomed to spiral into: self-satisfaction. I demand traceability. Only a few generations ago, our bloodiest war was fought against half our countrymen over ownership of people. I would push back against the idea that American racial politics are somehow a punchline rather than a real struggle. For real though, American flag. I would say really good pod, except for the lack of calibrating your political center. I might change my rating if you decide to reflect on this at some point. There are plenty of American artists and left-leaning guru types who deserve to be within your calibration limits. And if I Don't Speak German qualifies as too extreme for your analysis, we are living in different worlds, Mia. That's confusing on multiple levels.
Starting point is 02:10:36 Test developer. This is the username. So there's, like I said, Matt, there's a lot there. There's a weird thing where Aaron seems to get cast as a, I don't know if he's cast as like a fake working class leftist. He was our like token leftist that we've now discarded.
Starting point is 02:10:53 Or he was the only example of a working-class progressive leftist voice. But I don't quite get Aaron as either of them, because, like, he's... you know, Aaron's fine, but he's a philosophy professor. That's not the typical blue-collar job that is usually associated with the working classes. Um, so yeah, look, unfortunately I don't think we've got time to properly unpack everything that's in there. I don't want to simplify the critique, but I think, broadly speaking, it's that we're not left-wing enough. Well, yeah, there's that. And just, like, a weird thing that we need to recalibrate our political compass to the American spectrum. But the American spectrum is famously less left-wing than in Europe or Australia. Like, the left-wing parties in America on average are less
Starting point is 02:11:55 left-wing than the left-wing parties in Europe and Australia. So it's just a very weird, weird accusation. I know. And something about which we should be decoding more left-wing figures. Yeah, that bit at the end was unexpected, because it seems that he wants us to cover I Don't Speak German, but the rest of the message sounds like he's saying we're avoiding their critiques, but he's saying we should cover them, which is unusual. I know, I agree, I was confused by that too. And, you know, the Polite Conversations bit is a reference to Eiynah's podcast that I was hosted on, and we discussed my approach to Sam Harris and the various issues that she took with it. And, like, why would that be on our feed? I posted it on the Patreon, but it's her podcast.
Starting point is 02:12:54 Like, I just take the audio from another podcast and stick it on our feed. It doesn't... it's just... uh, anyway. Look, they're still listening. Thank you for sticking with us despite the frustrations we cause. Um, that's okay. Um, we're doing our best. We're doing our best. Okay, well, I mean, I thank them for the feedback, but, come on, that's just a very confusing message. And I guess the argument about us seeking to suckle at the neoliberal center-left teat, like, how many times do I need to say this? I am a center-left moderate. I am. I'm not pretending to be something different. This is what I freaking am.
Starting point is 02:13:34 I'm not pretending to be something different. This is what I freaking am. So if you're like, yeah, look at you out there appealing to people on the center left. Like, yes, sorry. And it's perfectly fine for you to have different political opinions. I don't mind, but I'm not disguising my political position, right? Like, how do people not get this? I don't know.
Starting point is 02:13:59 It's a funny thought imagining you being, in truth, deep, really, like, left of Mao, but pretending to be a milquetoast, you know, center-left liberal for clicks and clout. And Sam Harris, again, that notion, that critique of Sam Harris is just all pulled punches. God damn. I don't know. Anyway, but as we said last time, the important thing is, you know, whatever your position is, that you download the Waking Up app. It's a meditation app. You can subscribe. It's got a reasonable cost. It's very helpful. Sam Harris also has a very nuanced podcast
Starting point is 02:14:40 where he covers various issues, called Making Sense, and, you know, I'm just saying, you know, our listeners would enjoy them. So, uh, and if you could reference, great value, great value, Decoding the Gurus. Remember, if there's any space, just "Decoding the Gurus sent you." Uh, that's all I wanted to say about that. Very good, very good.
Starting point is 02:15:12 There's many of them met. Many of them met. And they're all good people. And this isn't just me stalling while I pick up the thing that I need to hunt out. This is me just genuinely expressing my love for them. And here we are, Matt. Now that I've done that, I'm ready to say who they are. I have a bunch of conspiracy hypothesizers that I want to talk about. And I'm going to say thank you to Matthew Smith, Christian Doonan, Amanda Wimital, Matt Coofan,
Starting point is 02:15:49 Matteo Mazur-Goulet, Erland Hevetafeld, Hannah Moran, Amma Botang Mensah, Anders Bruna Pedersen, Charles Tettas, Richard, Will Thomas, Janice McKay, NDSR, Polly henderson charles tetas richard will thomas janice mckay ndsr polly darton and michael fields
Starting point is 02:16:10 i'm gonna thank them all matt for being uh conspiracy hypothesizers of the highest order thank you what about you everyone i'm gonna thank them as well. Surprise, surprise. Yeah, that's what I like to hear. And here's the clip. Any minute now when it loads back? Every great idea starts with a minority of one. We are not going to advance conspiracy theories. We will advance conspiracy hypotheses. Okay, yes.
Starting point is 02:16:43 Yes, we will. That's fine. That's true. You will do that, Mr. Brett. Now, for revolutionary geniuses, Matt, the slightly higher tier, we have
Starting point is 02:16:57 Steve Earle, Cameron Harskamp, Alicia Wilson. We also have Frederick Dumont, Andy Ball Lecter, Conspiracy Theory Impostulator. That's a good name. I like that.
Starting point is 02:17:26 Kerry, Scotty Parkett, Rymar, Lucy, Kate, and Carrie Gautason. All those guys. Excellent. Revolutionary thinkers. I'm going to thank them as well. I'm going to thank them as well. Me too.
Starting point is 02:17:38 Just for a change. 70 or 90 distinct paradigms simultaneously all the time. And the idea is not to try to collapse them down to a single master paradigm. I'm someone who's a true polymath. I'm all over the place. But my main claim to fame, if you'd like, in academia is that I founded the field of evolutionary consumption. Now, that's just a guess, and it could easily be wrong. But it also could not be wrong. The fact that it's even plausible is stunning. Stunning.
Starting point is 02:17:55 Now, that's just a guess, and it could easily be wrong. But it also could not be wrong. The fact that it's even plausible is stunning. Stunning. Stunning indeed indeed is it not matt yes it is a suitable award there yeah and now the last last one the the most generous the most intelligent the most beautiful the most insightful people that contribute to us, the people that can talk to us every month, every month, every month, every fucking month. Uh,
Starting point is 02:18:28 if they, if they so choose on the live hangouts, um, Otis Sanford, Diane Morrison, uh, Haley, Devin poor,
Starting point is 02:18:45 Chad Brock, that's a pretty strong name, Kristen Sweater, and Wooter. Oh, and Joey Jojo Jr. Shabadoo. That's it.
Starting point is 02:19:01 Real person, Matt. Well, you know what? I'm not going to thank these guys. I thanked everyone else. These guys, no thank you. Oh my God. Well, I'll double thank them. Make up for that.
Starting point is 02:19:14 I'll double thank them. So here's the clip and a triple thank you from me for your generous contributions. You're sitting on one of the great scientific stories that I've ever heard. And you're so polite. And hey, wait a minute. Am I an expert? I kind of am. Yeah. I don't trust people at all. So that's it, Matt. That's our lovely patrons. Done and dusted. Next up, Yudikowski, Chomsky, Huberman, Matthew McConaughey. All these people. They're all coming. Yep.
Starting point is 02:19:50 Can't wait. It's an exciting time to be a guru analyst. Looking forward to it, Chris. Have a good rest of your day. You too. Check it out, Matt. Get out of here. Go on, you little kids.
Starting point is 02:20:42 Drop your mic. I'm your mic i'm out i'm out Thank you.
