Pod Save America - Offline: Abbie Richards on Fighting Disinformation on TikTok

Episode Date: January 30, 2022

This week on Offline, Jon is joined by TikToker and disinformation researcher Abbie Richards. A leading voice on the platform, Abbie inoculates her viewers against trending disinformation and provides them with the tools to fight back. Jon asks her what that work entails, why this current moment has seen the rise of so many new conspiracies, and dives into her viral conspiracy theory classification chart. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.

Transcript
Starting point is 00:00:00 The Supreme Court has had a busy summer loosening gun restrictions in states, overturning Roe v. Wade, and severely threatening our Miranda rights. I'm Leah Litman, and each week on Strict Scrutiny, I'm joined by my co-hosts and fellow law professors, Melissa Murray and Kate Shaw, to break down the latest headlines and the biggest legal questions facing our country. It's more important than ever to understand the repercussions of these Supreme Court decisions and what we can do to fight back in the upcoming midterm elections. Listen to new episodes of Strict Scrutiny every Monday, wherever you get your podcasts.
Starting point is 00:00:29 I can't imagine the amount of hate and threats that come your way as someone who's taking on conspiracy theorists and extremists every day. What has that been like for you? How do you process that? Oh, no, it's great being a woman online. Of course. Famously, famously easy and wonderful, right?
Starting point is 00:00:48 10 out of 10 would do it again. Yeah, no, it's not something I was ever prepared for. When I started posting, you know, it's not something anyone's psyche is built for. Like your brain just is not meant to be processing that amount of like discussion about you in the first place. But especially when there's so much like hatred in that discussion, it feels very strange for your brain. So you're dealing with threats. You're seeing, you know, even your most popular posts sometimes get fewer views than, you know, some of the disinformation and conspiracy theories you're trying to debunk. You're seeing online extremism get even worse. How do you stay hopeful that this is a winnable fight? Who told you I was hopeful? I'm Jon Favreau. Welcome to Offline.
Starting point is 00:01:42 Hey, everyone. My guest this week is Abby Richards, a 25-year-old TikTok disinformation and extremism researcher. So Abby is our first TikToker. If you're not familiar with TikTok, it means you are old, or at the very least, it means that you seem old. But just so you know, TikTok is an incredibly addictive social media platform where you can create and share short videos on just about any topic that lasts anywhere from 15 seconds to three minutes. It has over 1 billion active users, more than double the number of users on Twitter, a platform that launched 10 years earlier. Abby's videos are smart, funny, and extremely popular. extremely popular. Her goal isn't just to debunk conspiracy theories, but, as she puts it, to inoculate more people against the spread of disinformation so that we can better identify it, understand why it's so easily believed, and maybe stop future conspiracy theories from infecting our
Starting point is 00:02:36 media, our politics, and our brains. So far, she's doing a pretty good job. In September of 2020, she created a chart that maps out various conspiracy theories on a scale from grounded to fully detached from reality. A chart that includes everything from Watergate to QAnon. The chart, in her words, broke the internet. It went viral not just on TikTok, but on Twitter, Instagram, basically everywhere. You can find it on conspiracychart.com. But I think it'd be even better for you to hear Abby herself
Starting point is 00:03:08 give a quick explainer in a recent post. Basically, it moves from grounded in reality up to detached from reality. The conspiracies at the bottom are things that actually happened, like Watergate or the Tuskegee experiment. Then we pass the speculation line and enter the we have questions category. These things are sus. What's good with UFOs? Do we live in a simulation? We don't know. Once we leave reality, we hit the unequivocally false but mostly harmless section. Was Avril Lavigne replaced by a woman named Melissa? No. And she'd probably like for you to stop saying that. Is
Starting point is 00:03:39 Stevie Wonder faking his blindness? Of course not. Stop making people prove their disabilities to you. It's weird. Once we pass reality denial, we're in quite dangerous territory. Denying basic science, medicine, history harms you and society at large. That's the kind of stuff that makes pandemics last longer. And finally, we hit the antisemitic point of no return. The end of every conspiracy theory rabbit hole is the Jews. Believe me, I've checked. And we'll do a deep dive on this, but anytime you hear about a secret group of evil masterminds controlling the world, just run. This will be fun. And by fun, I mean I will progressively get less sober. So yeah,
Starting point is 00:04:17 you should go check out the whole video. We talk all about the chart, why conspiracy theories spread, why certain people are more prone to believe them, what role social media has in spreading them, the most effective ways to inoculate people against them, and of course, what the fuck is going on with West Elm Caleb. I found the conversation both incredibly useful and entertaining. I hope you do too. As always, if you have questions, comments, or complaints about the show, feel free to email us at offline at crooked.com. Here's Abby Richards. Abby Richards, welcome to Offline.
Starting point is 00:04:55 Thank you so much for having me. Thanks for taking the time. I have been really looking forward to this. I very much enjoy your TikToks. Believe it or not, there may be some offline listeners who are unfamiliar with not just your work, but TikTok itself. Can you talk about what drew you to the platform in the first place and how you became a TikTok disinformation researcher? I mean, they say to lean into your obsessions, and that's really what I went for. Same, same. personalized TV social media platform where you open up the app and it immediately starts recommending you videos that it thinks you'll like. And everybody's algorithm is very personalized and different to them. So it's very recommendation heavy, very, very personalized. And it's all really video content. And I got into it because
Starting point is 00:06:09 during the start of the pandemic, March 2020, when we were all like locked inside, I downloaded it and I just found it so fascinating, just the communities it was forming, but also the discoverability and the conversations that were being had on there. And then I went viral for like kicking a water bottle. And I was like, this is such a weird platform. I am obsessed with it. Wait, so tell me about kicking the water bottle and why this was, why this went viral. I could not tell you. I literally, you're like, why did it go viral? I don't know. It was a six second video where I had a water bottle like up on one leg. And then I just like propped, like using that leg, kicked it up into the air.
Starting point is 00:06:54 And then when it was in the air, kicked it out. And I think it was right at the time that there was this like viral audio that went like, dun, dun, dun, dun, dun, dun, dun, dun, dun, beat. Oh, well, that's important. I don't think I mean, I guess it was at the time. I think the bar for going viral back then was a lot lower. But it was just like fascinating that this random video to me when I had like 13 followers suddenly got half a million views. And I was just like, why? Like, what is this? Where did you start getting interested in disinformation or when did you start getting interested in disinformation?
Starting point is 00:07:37 It happened gradually. And it happened from, I think, having a platform on TikTok and being a creator on TikTok and seeing that it wasn't being addressed. I originally grew a platform on TikTok as an environmental creator. I had a brief stint where I canceled golf. That was fun. And had this platform, but it was also summer of 2020. So I was just seeing this wild increase in conspiracies on the platform. And it really wasn't being moderated well at all. Seeing it in my comment section, seeing it in my recommended videos, and seeing creators that I know also struggling with it and struggling with harassment or struggling with censorship or perceived censorship.
Starting point is 00:08:29 It's kind of unclear sometimes on TikTok. And no one was really diving into it. And I found it really fascinating. the terms disinformation and misinformation, which I feel like these days are used by people to describe everything from extremist conspiracy theories to opinions they disagree with. I really go with intent. For the two, I think the biggest difference is disinformation. I have an idea of disinformation. It's false information that's spread with the intent of causing harm and spreading lies. Misinformation doesn't necessarily have that intent. And it is more of an umbrella term that I'll use a lot of the time for both. So you saw a lot of disinformation and misinformation
Starting point is 00:09:21 on the platform, especially in summer 2020. Eventually, you create this conspiracy theory chart that you post on TikTok, and it basically breaks the internet. Can you talk a little bit more about that and how that came to be? I did break the internet. Honestly, I was just having a conversation with somebody about conspiracies because I think everybody had been through this reckoning with just seeing more and more of it. These conspiracies just like all over the place. It was really after QAnon had really started dominating the Internet and we were like the lead up to the election um and i was having conversation with somebody about like if you had to believe in a conspiracy which conspiracy would you believe in and i was just like this is ridiculous that i i know some of these are hateful and i need ways to like categorize them to make it make sense in my own head um to be like
Starting point is 00:10:23 at what point you know what conspiracies are, at what point, you know, what conspiracies are true? At what point are we like genuinely asking questions about power? And then at what point are we just like being really anti-Semitic? Well, so the final category of conspiracies in your chart, and you alluded to this, these are the ones, the conspiracies that are most detached from reality and the most harmful, is separated by a line. And you titled the line, the anti-Semitic point of no return. Why do you think anti-Semitism connects so many of the most dangerous conspiracy theories? Well, because the general structure of the modern Western conspiracy theory is really rooted in, deeply rooted in anti-Semitism. It, in and of itself is really just an anti-Semitic construct.
Starting point is 00:11:06 This idea of there's this evil group secretly like plotting to control the world for profit and power is rooted in myths that have been around for centuries about Jewish people. And, you know, obviously there's like an endless amount of conspiracy theory debunking on the internet. What do you think it was about your chart that really broke through in a way that, you know, some other conspiracy debunking content has not? I get called a conspiracy debunker a lot of the time, and I'm really like not, conspiracy debunker a lot of the time and I really like not um because while it's helpful to some extent I I really do try and answer the questions of like why we think this way a little bit more than I just dive into like each conspiracy and give it honestly more time than a lot of the
Starting point is 00:12:01 time it deserves like I'm more interested in why people believe this and how we got here and answering those questions and thinking about the big picture rather than just like debunking single conspiracies forever and ever and ever and ever. What have you learned about why people are drawn to conspiracy theories? you learned about why people are drawn to conspiracy theories? They offer very simple, easy to understand answers to complex problems, especially during like a time of crisis. You know, for instance, I don't know, maybe like a global pandemic. They're really comforting because they oftentimes will simplify the world into like good and evil and just this duality of groups. And it's almost blaming this supernaturally powerful group for, or really anyone, for what you view as wrong in the world. really anyone for what you view as wrong in the world. So it's a super simple kind of comic book level, a story that is much easier than sitting and dealing with like the systemic
Starting point is 00:13:16 failures of how we cope with the pandemic. Yeah. Well, it's also, it seems like, you know, we live in an incredibly complex, messy world. messy human nature is complicated and you know people sort of seek easier simple explanations for why their world is so beyond seemingly beyond their control um which i imagine that conspiracy theories offer you know a relatively easy explanation for that i mean one thing i've noticed that seems to unite conspiracy theorists is a deep cynicism towards institutions. This idea that like all the elites and the powerful are lying to us and trying to screw us and we're onto them. What do you make of that? I get where it's coming from. I mean, they've been failed by quite a number of institutions
Starting point is 00:14:03 at this point. And I certainly get very fed up with institutional powers. Like I think that energy to some extent is really good. And that challenging of authority is certainly like never something that I would shame or bash because I actually think that that's worth praising. I just think that when we challenge authority it should be critical of the ways that authority generally actually abuse power so if you're taking all of that energy that is criticizing like institutional failures and then you're applying it to essentially a fairy tale you're not existing within political reality anymore. And instead,
Starting point is 00:14:46 you're taking all of that energy and just checking out. It's not helpful in making real change in the real world. Instead, it just kind of enables either that checking out or just this oversimplification that a lot of the times political leaders will take advantage of. Is there a certain kind of person or certain characteristics or conditions that make someone more susceptible to conspiracy theories? So yes, like there is research on this and there's a lot of different factors and it depends. And it's not to say that like any one person is completely immune to it because we all buy into false stuff all the time. Like that, that's part of the human condition. Conspiracy theories do like when you're younger, when your brain, like your brain just isn't fully developed yet. There is evidence that younger people are just more likely to believe conspiracy theories.
Starting point is 00:15:50 And then like having like a sense of powerlessness, sense of anxiety, uncertainty in the face of challenges, especially during like unprecedented times where you're facing some sort of existential threat. So a lot of it is kind of circumstantial. If you're somebody who's going through a major change in your life, if there is correlation between how much education you have and your belief in conspiracy theories, there's so many. And that's not to say that, like, anyone can't fall outside of any of those. In your experience,
Starting point is 00:16:31 are people on the right side of the political spectrum more prone to believing in wild conspiracies than people on the left? Seems to be the modern switch, doesn't it? I mean, I ask that partly, you know obviously i'm on on the left but i do think sometimes i worry when i think about like oh all these uh crazy right-wingers are buying into conspiracy theories and like we're you know i'm i'm educated and i'm i pay attention to the news and i'd never be able to fall for a conspiracy. But I'm like, I sort of wonder if it's something about human nature that makes us inherently susceptible to conspiracy theories.
Starting point is 00:17:10 Then, you know, I just I wonder, you know, whether people on the left are more susceptible than we think to conspiracy theories. And I just was wondering about your experience with that. My experience, and this is looking specificallyiktok with like the younger generation who does tend to lean more left um is that they'll still buy into a lot of the escapist ones and you know not necessarily the overtly kind of christian-based ones that are explicitly like very anti-Semitic and kind of rooted in these old anti-Semitic structures that we've seen for a while. But a lot of, they're absolutely buying into the more escapist, we live in a different reality kind of stuff. What are some examples of the escapist conspiracy theories?
Starting point is 00:18:06 Oh my God. There's so many. It's a lot of interdimensional kind of travel ones, a lot of alien ones, a lot of like the world ended and we live in a different reality kind of stuff. I'd say like a lot of celebrity sort of conspiracies as well. Then you start getting into like some history conspiracies, like they really all go viral on
Starting point is 00:18:34 TikTok. And a lot of the times they're so coded that unless you really understand where conspiracies get dangerous, like you can't even tell. I do think that on the right, obviously, we've seen like two of the biggest and most dangerous conspiracies have sort of intertwined over the last couple of years, which is conspiracies about the pandemic and about the vaccines. And then, of course, the big lie about the 2020 election. Like, why do you think something like 70 percent of Republicans came to embrace the big lie? And what role do you think social media platforms played in that? Oh, that's a big question. Just dropping that out there. If you have any thoughts. Abby, in 60 words or less, can you explain to me how we got here? Start from just before the polls closed in November.
Starting point is 00:19:43 Right. So when it comes to belief in conspiracies there's some research that it's correlated with populism and authoritarianism um it certainly is clear that like and throughout history we we've seen this as well leaders will take advantage of conspiracy theory beliefs for their own political gain. So I think that there is definitely a correlation that we saw there where there's a group of people who feel powerless, especially if they feel maybe like they've been losing power that they previously had. And that can shift into, you know, more easy belief in conspiracy theories. And I think politicians can take advantage of that quite easily.
Starting point is 00:20:37 It's easy to mobilize that fear. Social media isn't, you know, the be all end all, but it did not help. It's hard to moderate all of social media, but I think the tech platforms should be doing a much better job than they currently are. And it was just really easy for misinformation to be spreading on all social media platforms. Yeah. I want to dig into that in a second. But even before that, I mean, I saw that you one of your TikToks was talking about like it was just the sheer volume of misinformation and disinformation that a lot of Republican voters were seeing and hearing. And it was actually like the quantity and the amount of times they were seeing it in the in the period after the election that actually had an effect. And I hadn't thought about that before, because like, if you're watching Fox, if you're listening
Starting point is 00:21:29 to talk radio, and you're on a bunch of right wing websites, and all of your social media is following a bunch of right wing people, and all you see and hear for like weeks at a time are all these very detailed stories about how the election may have been stolen. You kind of think like, yeah, what other conclusion might you come to? Yeah, it's scary. But the reality is that your information shapes your reality. So what information you interact with every day, that's what shapes how you see the world. And part of like how we are wired is that the more you see something, the more you're going to believe it to be true. And no amount of like education can undo that. You can't be too smart to really like miss that.
Starting point is 00:22:14 Like it's just that if this continues to be presented to you as fact, your brain will eventually be like, this is fact. And if you're in a polluted information ecosystem and you're just constantly seeing lies, then eventually they start to form their own reality. the January 6th insurrection. And there's a lot of people who stormed the Capitol with college educations and were, you know, pretty well off middle-class white people who stormed the Capitol. And I think the prevailing narrative about the Trump voter as, you know, just some non-college educated white person who didn't quite, who wasn't as educated as anyone else. And that's why they believed in conspiracy theories is not, it's never been entirely accurate at all since so much of this movement has come from the middle class and upper class. Yeah. It's actually quite elitist and I can
Starting point is 00:23:14 understand why they look at us and like hate us for it. Cause that's a pretty elitist thing to be saying. And also who's to say that like having a college education even does make you smarter like maybe it just means you were privileged enough to go get one and you didn't need to like immediately enter the workforce so like i try not to really go into it with that view i don't think that people who believe conspiracy theories are stupid by any means um and i don't think that people who believe conspiracy theories are stupid by any means um and i don't think that people who don't believe conspiracy theories are any smarter than them like i i think it it really is a very emotional issue and a very you know it's rooted in a lot of societal problems uh
Starting point is 00:24:01 but i don't generally like to approach it as as a logical failure because you're never going to be able to reason somebody away from their belief. Like it's fundamentally like a very emotional belief and it's serving some sort of emotional and psychological purpose. And I do think, I mean, we mentioned the pandemic a couple of times. Obviously the pandemic was, you know, it's been a ripe era for misinformation because of the vaccines and information about the pandemic. But I also feel like isolation and alienation are probably conditions where conspiracy theories are bred.
Starting point is 00:24:41 And I wonder if you think that that has like a lot of people being home by themselves without a lot of social connection sort of leads people down these rabbit holes, or maybe I'm just guessing that, but. It's possible. Like I personally haven't seen the research yet, but I mean, anecdotally speaking, it does seem that way. And it does definitely seem like once somebody is down a rabbit hole, it's also an extremely isolating place to be. And if you become like estranged from your friends and family and you don't have people to turn to to support you, you'll often just go further down the rabbit hole into those communities that are offering support, which are
Starting point is 00:25:22 like other conspiracy believers. So it is kind of a very isolating experience in and of itself, and it does worsen it. So I think it's definitely possible that being isolated in a pandemic made it worse. It's just to pile on emotional issues we all have to work through. Yeah, right. I mean, so I do want to talk about the social media platform role in all of this, particularly, and start with TikTok, just because you know it best. You conducted a fascinating study with Media Matters about how engaging with transphobic content on TikTok causes the algorithm to quickly start recommending content that is also misogynistic, homophobic, anti-Semitic, racist, white supremacist? Why do you think that is?
Starting point is 00:26:11 Well, I think the TikTok algorithm is very good at recommending you videos that it thinks will keep you on the platform. And it does that by looking at what other people that maybe have liked similar things to you in the past have also engaged with. So I think there's just already existing relationships between something like transphobia and misogyny and transphobia and homophobia and homophobia, misogyny, transphobia and white supremacy. Those are all existing relationships that people kind of teach the algorithm and then the algorithm responds to that and will show you more of it because other people that liked the original content, the transphobic content that you liked,
Starting point is 00:26:59 but also enjoyed this homophobic content. So try that. enjoyed this homophobic content. So try that. Yeah, I had Alex Stamos on, Facebook's former chief security officer who left the company over a dispute about the spread of disinformation on its platform. You know, and he acknowledged that Facebook could be doing a much better job of stopping the spread of disinformation and so could all social media platforms. But he really pushed back on the idea that tweaking the algorithm could make a media platforms. But he really pushed back on the idea that tweaking the algorithm could make a big difference. And he argued that, you know, to some extent, social media platforms are just a reflection of society and human nature that are just showing us people and conspiracy theories that have always been there. What do you make of that?
Starting point is 00:27:40 I agree that it absolutely is a reflection of us. Whether that reflection deserves like unfettered, constant amplification and giving the worst of us, essentially, like the worst of not humans, but like our tendencies and capitalizing on those tendencies for profit to keep us engaged, I do think that we could do a better job of designing platforms that don't necessarily inherently capitalize on those. Because if you're really just looking at engagement-driven algorithms, you're going to kind of constantly run up against that problem. Have you thought about what TikTok could do? Like what that platform could do? I mean, is it something about, is it about having an algorithm as TikTok does that is so personalized that is the problem? Or like how do you, you know, in a perfect world, how would you change TikTok to avoid some of this? This is a tough question because like, I am not, I am not in, like, I don't do ethical tech. I don't necessarily have answers for like,
Starting point is 00:28:55 what the most ethical algorithm would look like. But I do still feel like, and believe with my full heart that TikTok has a responsibility to its users to just not be showing them false and radicalizing information. That seems, and especially just to not let it get millions and millions of views, which it regularly does. So we're not talking about small videos that exist in little corners, but false information, conspiracy theories that easily get millions and millions of views and people never like engage with critically. So in my perfect world, at a bare minimum, we would be reducing the speed at which those are able to spread and the distance that they can spread. So really like adding friction, at least like as a bare minimum, onto that misinformation.
Starting point is 00:29:50 And then I think that we should be doing more to platform the people who are already providing great information. And, you know, like if somebody is out there and they are a doctor or they are somebody who advocates for racial justice and their videos are getting 2000 views when somebody who's out there pushing conspiracy theories is getting 12 million, you know, maybe that is a comparison that we should be looking at trying to fix. I want to talk about the great work you're doing and specifically how you go about doing it. So once you identify a conspiracy theory or a viral piece of disinformation that you want to make a TikTok about, what you want to post a video about, what's your process? And what are the different challenges and tensions you're grappling with during that process? Oh, my number one issue is amplification. Like I don't want to amplify things that are dangerous and don't need to be amplified. And I struggle with that like every day. It is like walking this line of trying to talk about harmful things without causing further harm. Interesting. That to me is like the hardest part. It's like how much of the
Starting point is 00:31:15 bad information do you need to give people in order to prove to them that it's false? Mm hmm. Yeah. And I don't want to give them a reason to also go Googling it and trying to get down their own rabbit hole and finding it elsewhere. So I want to give a comprehensive enough explanation that they don't feel the need to necessarily go red pill themselves and, you know, find red pills that have been laid out for themselves, but also, you know, not one that essentially red pills them, which is something that I know journalists have struggled with for a while now. And it's a line that I walk very carefully,
Starting point is 00:31:54 and it's a reason why I never really post any video unless like five experts have reviewed my script. That's good that you have a whole bunch of experts checking it. I mean, that helps. Well, it came with time too. I think once I was one of the only people in this field using TikTok and getting my videos out there at this kind of unprecedented reach that TikTok has, people started stepping up to be like, all right, I'll look at your scripts. Are there certain elements of certain videos that you've found like work especially well?
Starting point is 00:32:36 Or I mean, one thing I know that, you know, you're very funny and your videos are very funny. Like, just do you think humor helps like sell them as well? Like what have you found actually works best in some of these videos? Definitely humor. I don't really like to post about horrible things unless I'm also joking about them to some extent, because then I'm not enjoying it. Like I want to have a fun time too. Yeah. And it just makes it a little bit easier, I think, to talk about really tough things if we're going to include like a little bit of like sassy humor as well. I always like to drink tea.
Starting point is 00:33:12 I mean, I'm drinking it right now. I notice that and then sometimes pour a little bourbon in the tea if it's a particularly difficult topic. If it's really dark. You know, it's very dark. Yeah. If it's like a dark, dark subject subject then we're putting bourbon in the tea but i do like the tea like i like holding something um gives me something to do with my hands otherwise i'm just kind of all over the place and it makes cutting really weird
Starting point is 00:33:38 but also it does like calm me down and it's created this ritual of like okay i'm going to film a video i'm gonna go make my tea and i really enjoyed that act um but yeah honestly like it's it's scripting it's oh it's so much scripting i imagine you put a lot of work into the scripting even the tea thing i've noticed as I've been watching them, there's something about someone who seems calm, you know, explaining conspiracy theories to you that makes it like, okay, everything's actually going to be okay. We don't have to believe this.
Starting point is 00:34:18 As opposed to someone who's like wild-eyed and yelling about something. Like, you can't believe this. It's very dangerous. Like, I wonder if there's something about the tone that helps people? I hope so. Like, I really hope that if you do believe in conspiracy theories and you see my video and you see that I'm not like yelling and confronting you or being super judgmental, that it does kind of create a space that, you know, allows people who are maybe on the edge and thinking about getting out or just have heard some misinformation, but don't want to
Starting point is 00:34:50 feel dumb for buying into it, which they shouldn't. It creates a space that's just like calming and they don't feel immediately affronted by it. At least that's what I'm going for. I don't know if it works. You know, we talked about how your strategy is not to just debunk, but to inoculate people against conspiracies. How do you do that? What works? What's the inoculation process like? I mean, it's basically like a vaccine. So the idea is that you provide somebody with like a small dose, tiny little dose, kind of like a vaccine, how you get introduced to what the virus might look like and then your immune system knows how to fight it off. It's very similar where you get introduced to either like what, you know,
Starting point is 00:35:38 a specific piece of misinformation might look like or a specific tactic that people who are spreading misinformation may use. And you get shown what that looks like. And now your brain kind of almost like your immune system has this response of knowing next time you're confronted with it, like, oh, this might be misinformation. That's interesting. I mean, one of the most common questions we always get from Pots of America listeners is, like, how do I talk to my family members who've been radicalized by right-wing disinformation? What's your advice for those of us who don't have a huge platform on TikTok? My first piece of advice is always to look after yourself. So, like, you should never feel obligated to have to go engage with things that make you feel unsafe or just really upset like you have to take care of
Starting point is 00:36:33 yourself first and foremost um and then my second piece of advice would be if you do have you know friends and family that are hardcore believers and really are not in a space to get out, it's helpful to like either encourage logging off, encourage them to like step out of those environments and maybe go engage with some activities that they've enjoyed in the past that aren't conspiracy related, like playing soccer. I don't know. Maybe you're, I don't know, just throwing it don't know just throwing it out there baking knitting any of those uh or whatever be creative with it but if if you do have somebody that really isn't going to engage with you I if if it's possible for you I always recommend that you let them know that you're still going to be there and that if they want to get out, that you are there for them and that like you love them, you care about them.
Starting point is 00:37:26 You want to see them, you know, in an emotionally healthy place and you want to support them, but you don't need to necessarily put up with all of their most wild and dangerous beliefs to support them. You know, you can be there and create some distance and be like, if you want to come talk to me, if you want my support, I am here, but I'm not going to tolerate these hateful beliefs. It does seem like coming to the conversation with some level of empathy is probably a little bit more effective than why do you believe this crazy thing that's bad?
Starting point is 00:38:07 Actually, asking why can be really, really helpful. Oh, interesting. I wouldn't go, why do you believe this crazy belief? You know, that might upset them. But if they're like, the world is run by a satanic cabal like the pandemic is faked, be like, why did they think the pandemic? Like just play dumb and keep asking why and keep like so exactly like who are these people? Like where do they convene? You know, like what are they doing? And really just play dumb and really poke at their belief with a stick.
Starting point is 00:38:42 and really poke at their belief with a stick. And you won't immediately see change, but you might be able to poke some holes until they eventually like can see that maybe it is just a belief. Yeah. So how does the microchip fit into the syringe that gives you the vaccine? Yes, yes.
Starting point is 00:39:00 Just show me where it goes in the needle. Okay, so can you explain to me why you're okay with phones and carrying that around all the time, but not this hypothetical microchip? So how do you feel about companies collecting and selling our data? I'm just asking. Is that something also on your mind? I saw that you spoke to a former QAnon believer who was de-radicalized. What did you learn from him? Ajit Arth is my friend. He's my buddy. We've been friends for a while, so I've learned a lot. Tell me the story. I was so fascinated by it. It's a short video, but I was like, God, I want to know more about that guy. Yeah, I mean, he was an early QAnon believer and then got out over time slowly.
Starting point is 00:39:52 It's really like his own story to tell. But yeah, he was deep in this place of conspiracy belief and kind of got himself out, as he says, kind of the same way he got himself in, just slowly poking holes in his own beliefs and then wound up doing a lot of work to try and help people get out or just help people who've lost family and friends to QAnon. So I know he's like a mod over at QAnon Casualties, the subreddit for people who've lost friends and family to QAnon. So I know he's like a mod over at QAnon casualties, the subreddit for people who've lost friends and family to QAnon. And yeah, he's been doing a lot of amazing work. And I got to know him like well over a year ago. So he's great. I mean, I think people who are sort of de-radicalized from these beliefs are most
Starting point is 00:40:42 fascinating to me because, you know, I'm in politics. I spend a lot of time thinking about, you know, marginal swing voters, people who go back and forth or people who don't vote, you know, and then trying to get them to vote. The folks who are like hardcore MAGA people, I'm like, I don't think it's worth spending a ton of time and energy trying to get their votes because it seems like their beliefs are so hardened. What is it that potentially gets people out of those beliefs or someone who's been in QAnon? You said that it's gradual, but is it gradual exposure to good information? Is it family and friends getting involved? What's the path out?
Starting point is 00:41:22 I don't think there is one path out. It would be amazing if there were. getting involved? Like what's the path out? I don't think there is one path out. Like it would be amazing if there were. The existing information that we have on de-radicalization isn't super solid. It doesn't seem like there is like kind of in the same way there isn't one way to get radicalized there isn't one way to get de-radicalized and you know there there's concerns that if you're de-radicalized from one belief you'll just go to like the extreme of another so it's not necessarily something that I advocate for or promote and I think that it's tough because it would be nice if we could just like expose people to high quality information and assume that that'll fix the problem. But if especially if we're talking about somebody who's like in a deep, you know, grown up in a deep red state, you know, conservative family and has this entire framework that would have to be deconstructed.
Starting point is 00:42:27 deconstructed, it's not necessarily the way I would go about it, but I would be prioritizing, you know, high quality education about systemic injustices and how we can make our systems better and getting involved in communities and prioritize that across all levels more than just de-radicalization. So would you say that your goal with the videos that you do is sort of like catching people before they fall down that rabbit hole? It's a catch-all. I think to some extent, it's catching people before they fall. To some extent, it's helping the people who are witnessing the fall understand what is going on. Yeah. Because it's so overwhelming if you've lost friends and family or you're just like watching.
Starting point is 00:43:12 Watching the world go nuts. You're watching the world go kind of insane. You want to understand it better. And I've gotten some people say that I've helped them, like my videos help them out of QAnon. I don't necessarily believe that. I think that if you want to escape that kind of belief system, you have to truly want to, to some degree. You have to approach my videos with an open mind and you have to be ready and willing to get there. I think it's not me.
Starting point is 00:43:41 It's them. It's their own power to get themselves out. get there. I think it's not me. It's them. It's their own power to get themselves out. But yeah, it's a lot of just explaining what's going on, helping people better understand it, and maybe hopefully preventing some of them from falling in. I can't imagine the amount of hate and threats that come your way as someone who's taking on conspiracy theorists and extremists every day. What has that been like for you? How do you process that? It's great being a woman online.
Starting point is 00:44:15 Of course. Famously, famously easy and wonderful, right? 10 out of 10. Would do it again. Yeah. No. Wonderful, right? 10 out of 10. Would do it again. Yeah, no, it's not something I was ever prepared for. When I started hosting, you know, with a water bottle kick, I certainly never thought I would get hate for that. And somehow that still resulted in some hate comments
Starting point is 00:44:42 because the internet's garbage sometimes. But in general, it's not something anyone's psyche is built for. Like your brain just is not meant to be processing that amount of like discussion about you in the first place. But especially when there's so much like hatred in that discussion, it feels very strange for your brain. And you're someone who spends so much time studying disinformation and conspiracies and what they do to people. And I imagine even as much as you know, these are just people, hateful people yelling at me, whatever,
Starting point is 00:45:19 it's still got to really affect you. It's probably impossible to escape that. It is. I also think that studying disinformation misinformation extremism it gives me more empathy than i had before so even like as overwhelming yeah well because you have to get into the mindset of somebody who would buy into those things and get into the mindset of somebody who would buy into those things and get into the mindset of somebody who would buy into some like you know buying into really anti-semitic or really really racist narratives really misogynistic really transphobic narratives you have to get into that headset not headset and it doesn't make it better it just you know it's it's coming from it's coming from insecurity
Starting point is 00:46:11 and it's coming from perceived powerlessness a lot of the times and that doesn't make it okay um but i think it just like makes me a little bit less like just hateful in general. I don't know if it's my background in politics or especially like speech writing, where again, you're trying to like persuade audiences. So you're trying to get in people's heads, even people that you don't agree with. But I've been thinking about this a lot, even around the pandemic and anti-vaxxers. Like when there's like yet another news story about someone who's unvaccinated who's an anti-vaxxer who died of covid like my first
Starting point is 00:46:51 i'm angry but my anger is like towards the people who spread that disinformation to that person you know more so than the person themselves because i feel like it's not like i want to excuse it and just say oh everyone's a victim and whatever but like you can see because of people's information environment and you know the propaganda that they're exposed to the conspiracies they're exposed to like why they would go down that path and why they would believe that and I just feel like it's it's more sad than anything else and it's enraging you know when it comes to like the people on tv or on the internet who are spreading that shit in the first place yeah and i get how if you're somebody especially right if you're like immunocompromised and you see that you might have a different
Starting point is 00:47:39 reaction as well like i get that just that anger and like I'm not here to invalidate that that anger makes sense um but at the same time I mean for me personally I it does make me very sad because I don't think that it's it just doesn't seem like a very amazing life to be sitting there and acting in a very selfish way and prioritizing your individual freedom over our collective well-being and prioritizing you know your right to not participate in a public health measure for everybody's benefit um yeah and put other people at risk like that to me i understand the understand the anger. To me, it just makes me sad that that doesn't seem like a great way to live. So you're dealing with threats. You're seeing, you know, even your most popular posts sometimes get fewer views than, you know, some
Starting point is 00:48:40 of the disinformation and conspiracy theories you're trying to debunk. You're seeing online extremism get even worse. How do you stay hopeful that this is a winnable fight? Who told you I was hopeful? Okay, then maybe you don't. That's the answer. Well, you're at it. You keep at it. You must feel like you're making progress or that progress is possible.
Starting point is 00:49:02 I mean, I want to contribute to trying to clean up the mess. And I think that there are lots of ways that you can do that. And this is mine. It just really depends on the day how hopeful I am feeling. But at the end of the day, I mean, I find it really interesting. And I think that it's important work and like people need to understand these issues. So I think that keeps me going more than like hope that we have some utopian light at the end of the tunnel. Just the slow kind of hopeful push of progress is more what I contribute to. We've talked about extremism, radicalization, disinformation, hate, racism. I want to switch gears with a very important question on an even weightier topic. What are your thoughts on West
Starting point is 00:49:57 Elm, Caleb? Oh, no. That is heavy. I feel like you're okay. I want to tell you what happened here. Friday, I see the news about West Elm Caleb, and I saw that you were quoted in one of the pieces about it. And I slacked it to our team, and I said, Hey, everyone, do we think I should ask Abby a question about West Elm Caleb? And also what is West Elm Caleb? And then I went down the rabbit hole of having to fucking learn all about
Starting point is 00:50:34 this guy and this controversy, but I thought you had some interesting, um, comments about it. So I would love, I'd love for you to share them. So TikTok seems to be like facilitating the creation of a meme in like a more rapid way than we've ever seen before. And TikTok, just because, you know,
Starting point is 00:50:54 if you saw a video on an emerging topic at 2 p.m. that day and you liked it or engaged with it or watched it to completion, you're then more likely to be fed a subsequent video that also is using the hashtag, also talking about those things that other people also engaged with. So it really allows for this pile on of almost obsessions with specific memes on TikTok. And one of our problems is that about every week, every two weeks, TikTok will choose a new human to turn into a meme. And that's kind of what happened with
Starting point is 00:51:34 West Elm Caleb. And it did result in like this mass harassment and doxing where it really didn't seem called for, nor is that how we generally handle justice in our society for good reason. But for those who aren't, you know, privy to the newest TikTok dramas, a handful of women in New York City realized that they'd all been on dates or dating some guy who was 6'4 and worked at West Elm, and his name was Caleb. And they accused him of manipulation and, like, love bombing them and then ghosting them. And then there was also an accusation of him sending some unsolicited nude pictures um interestingly that seems to be the thing that people are the least concerned with and they're very concerned with this like ghosting and uh and i would like to see them having more
Starting point is 00:52:40 conversations about maybe the unsolicited nudes and it makes sense that makes sense yeah but uh yeah they they all connected and realized that they were had all dated this guy and they started calling him west on caleb because he worked at west elm that was on all of his dating profiles and then sharing like his hinge messages and pictures of him and his full profile and everything they had said and doxing him, trying to get him fired from his job. And it just was kind of a shit show. And I say this as somebody who has experienced harassment on a massive scale and also isn't a fan of manipulative men is that you can dislike both.
Starting point is 00:53:26 Yeah, well, look, I think it's almost, it's not about like West Elm Caleb himself, right? You can think he's an asshole, that's fine. It is, if we're getting to, I mean, Twitter has this dynamic in a different way where I'm always on Twitter and it's like there's the Twitter villain of the day. And oftentimes that person is sort of a villain.
Starting point is 00:53:46 Sometimes that person has just tweeted something stupid or said something dumb. And if these algorithms are enabling sort of these pylons to individuals, then yeah, maybe sometimes the individuals deserve it or even most of the time. But what happens when they start just like, you know, the algorithm trains its eye on someone who didn't really do the thing that he's being he or she is being accused of? Or like, there's just a it just like you said, seems like a bad way to dispense justice. It's Yeah, it's, it's just, you know, a conversation that as a society, we need to have about whether or not we're comfortable with that regularly occurring.
Starting point is 00:54:27 And, you know, whether or not we're comfortable parsing apart people's lives for our own entertainment, because they're our villain of the day. And that's what is happening on TikTok. We've seen it happen with several people now. West on Caleb is literally just this week, so I'm willing to bet there will be a new one by like Monday. Of course. Yeah. So I think that it's, it's even if these people like aren't good and I don't know about,
Starting point is 00:54:56 I don't know what's on Caleb. I can't make any judgment calls on him. I don't know. He seems like he certainly has become like a stand in for just generally kind of shitty men. But even if they aren't that terrible like is that how how we want to handle that and you know does that actually create change because from all these conversations about men being manipulative. It doesn't actually seem like they're reaching men. It seems like this is more of a,
Starting point is 00:55:31 yeah, in a very female-oriented side of TikTok, they're having conversations about this, but I don't really see men engaging on manipulative behaviors. I don't know if it's necessarily even a learning opportunity. Yeah, well, you wonder, do individual pylons or individual shaming really help fix systemic issues i don't i don't know that there's a lot of evidence that they i mean we certainly know it's not an effective public health measure i don't
Starting point is 00:56:02 yeah right exactly like we see that all the time. Like public shaming did not get people to wear masks. It doesn't get people to wear condoms. I think that, you know, it's generally good to approach people with a bit of empathy if you want them to learn something that you're saying. Would you have been more likely to join the pylon if West Elm Caleb was an avid golfer? Yes. For people who don't know, you mentioned this earlier. You did go viral for trying to cancel golf. I will say that I'm not a golfer. I've never really been a golfer. I have a lot of friends and family who are golfers, but I never really thought about it. And that's a great example of,
Starting point is 00:56:47 I watched some of your anti-golf videos and I was like, yeah, she has a lot of good points about golf. Yeah, maybe it's sort of a waste of time. It's kind of a waste of space. Resources, space, yeah, a whole bunch of stuff. Literally, I know we'll call people like they're a waste of space,
Starting point is 00:57:04 but golf is literally wasting so much space. And the per acre land use per player, the acre land use per player is just insane. And the amount that each player is getting in water and fertilizer and pesticides and space and fresh air. fertilizer and pesticides and space and fresh air. It's just, and all of that is generally like privatized and hidden for wealthy golfers to access. Even though green space is really healthy for communities and everybody should have access to it. Yeah.
Starting point is 00:57:40 Yeah. Look at that. We're canceling golf. Final question that I ask every guest, what's your favorite way to unplug and how often do you get to do it oh i really like cooking okay i live with one of my best friends and basically every night i cook her dinner and then we sit down and we have a tv show project it It's very quarantine friendly. So we are working like all the way through Doctor Who. And that's been really fun.
Starting point is 00:58:11 But also like generally physical activity is my thing. I do a lot of handstands. Oh, handstands. I like going upside down. That's great. All right. That's a unique answer for this question. Favorite unplug, handstands.
Starting point is 00:58:28 I'll take it. Handstands. Go upside down. You got to get the blood flow up. It clears out all of the junk information that I've seen that day. Even better. That's even better. Abby Richards, thank you so much for joining Offline.
Starting point is 00:58:40 This was really fun. Thank you so much for having me. Offline is a Crooked Media production. It's written and hosted by me, Jon Favreau. It's produced by Andy Gardner-Bernstein and Austin Fisher. Andrew Chadwick is our audio editor. Kyle Seglin and Charlotte Landis sound engineered
Starting point is 00:59:05 the show. Jordan Katz and Kenny Siegel take care of our music. Thanks to Tanya Sominator, Michael Martinez, Ari Schwartz, Madison Hallman, and Sandy Gerard for production support. And to our digital team, Elijah Cohn, Nar Melkonian, and Amelia Montooth, who film and share
Starting point is 00:59:21 our episodes as videos every week.
