The Daily Zeitgeist - Weekly Zeitgeist 293 (Best of 9/25/23-9/29/23)

Episode Date: October 1, 2023

The weekly round-up of the best moments from DZ's season 306 (9/25/23-9/29/23). See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil: The 7M TikTok Cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me for I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church. Listen to Forgive Me for I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:00:30 I'm Cari Champion, and this is season four of Naked Sports. Up first, I explore the making of a rivalry: Caitlin Clark versus Angel Reese.
Starting point is 00:00:39 Every great player needs a foil. I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have
Starting point is 00:00:46 changed the way we consume women's sports. Listen to the making of a rivalry Caitlin Clark versus Angel Reese on the iHeartRadio app, Apple Podcast or wherever you get your podcast. Presented by Capital One, founding partner of iHeart Women's Sports. Hey, I'm Gianna Pardenti
Starting point is 00:01:02 and I'm Jermaine Jackson-Gadsden. We're the hosts of Let's Talk Offline from LinkedIn News and iHeart Podcasts. There's a lot to figure out when you're just starting your career. That's where we come in. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in people who do, like negotiation expert Maury Tahiripour. If you start thinking about negotiations as just a conversation,
Starting point is 00:01:22 then I think it sort of eases us a little bit. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, the internet, and welcome to this episode of the weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one non-stop infotainment laugh-stravaganza. So without further ado, here is the weekly zeitgeist. Miles, it is a full... We needed this for our system. We've been having all these damn experts on lately.
Starting point is 00:02:06 We needed... Intelligent guests. Thoughtful guests. Thoughtful, intelligent guests. We needed a pure chaos episode. We are thrilled to be joined in our third seat. Oh, yeah. By a comedian, a writer, an actor.
Starting point is 00:02:19 Stand-up albums, Blake albums, Stuffed Boy, Live from the Pandemic. All debuted number one on iTunes, Amazon. His album, 12 Years of Voicemails from Todd Glass to Blake Wexler, charted on Billboard. Please welcome the hilarious, the chaotic. He's riding a recumbent bike in short shorts, and his plumpers are on full display. It's Blake Wexler!
Starting point is 00:02:39 The B! This is Blake Wexler aka i have a special a stand-up comedy special but zeitgeist my peak i have two plumpers what the hell am i doing here i am a wexpert i'm a wexpert baby this is i heart media at its finest. Blake Wexler, I'm joined by Jack and Miles. Thank you so much for having me. Oh, my God. We're going to host our show now, too.
Starting point is 00:03:11 Yes, I have data. We're going to dive into the data. We're going to talk to the people on the ground, and we are going to say that they're wrong. And that is what we say up top. Dude, dive in. Dive in on that. We have data that we're diving into today. We're going to bring up the electoral map here. And I'm going to show you which groups are reporting as of yet.
Starting point is 00:03:31 Blake, how are you doing, man? This is the Electoral Community College. That's right. It's a two-year associates. I'm doing great, guys. Thank you so much for having me. Electoral Online College. Phoenix University doing great, guys. Thank you so much for having me. Electoral Online College. Yes.
Starting point is 00:03:46 Phoenix. Exactly. This phoenix will not rise from the ashes. It has burnt itself to death. But yeah, no, I'm so psyched to be here as usual. It's great to have you. We got started a little late. We're busy integrating the Damien Lillard trade into our into our beings
Starting point is 00:04:07 into our personhood didn't see that coming it's hard is he even gonna play for them didn't he he only wanted to go to miami right and now he's going to the miami of the north milwaukee yeah i have a body of water up there, right? One of those great lakes. It's like, I'll be in charge of that. Yeah. He's like, you're not going to Miami, but the team does have M and I in the first two letters of the city. Yeah.
Starting point is 00:04:33 So... You got to read it. You got to read the whole word. I mean, that feels like... Pay attention to the opening sound and the ending sound of the word city that he was asking to be traded to. Nothing in between.
Starting point is 00:04:45 Me. I think this trade is really fun. I know this isn't our NBA podcast, but I think it's fun. It brings Damian Lillard, puts him on a team that gives him a real chance at winning a title. It makes Giannis relevant again. Giannis is probably our most fun and lovable
Starting point is 00:05:02 celebrity basketball player at the moment. I'm here for it. I i think james harden's the most fun wow wow wow lovable one i think we can take our self-hating sixers fandom off the uh off the mic but my yeah miles suggested a good new new line for for us when people ask if we're sixers fans that we we only recognize one team of sixers yeah uh-huh miles you want to tell them what it is the january sixers yeah that's the only that's the only sixers i recognize and i'm not a fan of them and i don't even know about another another group carmen we do like to ask our guests yeah what is something from your search history
Starting point is 00:05:48 well i was gonna say your first topic is a good one but uh we'll go back to the no-fault divorce but i'm gonna tell you you know what and this is not even a plug or anything but um carrie washington's new memoir she's just doing interviews and it just came out. And here's the thing. I had to Google it because I'm reading everything about it. I haven't gotten the book yet. But she also, like me, found out that her biological father was not who she grew up with. Oh.
Starting point is 00:06:18 Oh, wow. And that was kept from her her whole life until she got booked on, you know, Henry Louis Gates Jr. Skip Jr.'s show on PBS. Oh, for real? Yeah. And then I'm not telling you anything that's not out there. Yeah, yeah, yeah. But then they contacted her and she was all excited. And her parents were excited until they said, well, you need to do a DNA test.
Starting point is 00:06:40 Oof. And then her parents were like, oh, excuse me, Carrie. Can I please talk to you will you take a second yeah you have something to tell you that is the premise of the show so yeah yeah so um and it's funny because she i mean this also happened to me as well which is what i write about but what she describes really really well which i so appreciate because it's not necessarily something people would think about is she, like me, grew up our whole lives knowing something was wrong. Something was off. Right.
Starting point is 00:07:12 Almost like feeling like something was wrong with our bodies. Like we didn't, things didn't match up and that, and it was really uncomfortable. It's very like a subconscious thing and it's lots of anxiety and perfectionism, all this sort of stuff just happens and you don't even know why right and then this all this stuff comes out and you're like oh my parents have been keeping the truth for me my whole life and you imagine living with them and then they know and every time they look at you so yeah all to say people please talk to your children tell them the truth please tell them the truth. Please tell them the truth. Right. But that was my last, that was my last Google search.
Starting point is 00:07:49 Yeah. Are you, like, how radical is the radical honesty with the kids? Are you like a no Santa Claus thing? Or are you, like, how far are we going? Are we saying, like, straight up, look, Santa Claus is a a thing your friends believe because their parents are liars right first of all my child figured it out before i even had to say she's like she's that kind of kid she's like she's my kid she was like so i know who you are i know you are santa so you can stop it but please keep writing it on my presence because it's cute right but i think
Starting point is 00:08:25 that you know like she she mentions as well carrie is like when the secret is another human being right you gotta talk right like you're creating basically i say to people it's like look and i get asked this a lot when i give talks it's like well, well, what is it? Is it entitlement? Like, what is it that makes you feeling it? Look, a secret is yours as long as it's yours. But when you create another human being, that is no longer yours. That is actually a person. So it's no longer a secret. It's a human. And we can say as parents, like, well, you're my kid. Listen, kids do not. Please, everyone get this through your head. Kids do not belong to you. They are not belongings. They are people.
Starting point is 00:09:10 So you must treat them as such. And they're separate humans. And you need to treat them that way. And part of that is being very honest about how they came about and what's going on in your life. and what's going on in your life. But I will tell you a funny thing is I don't, my daughter and I, we talk a lot, but one of my nieces said something about my first husband. I can't remember how old my daughter was, but she was like maybe 10. And she was like, you've been married before? Like, she was like, you haven't kept this from me? And it was like,
Starting point is 00:09:44 I was like, girl, I just didn't think about telling you. Like it was a long time ago. And so that's not a secret. It was just, I just didn't think about it, but it was, it was like, I was like, girl, I just didn't think about it. Tell you like it was a long time ago. And so that's not a secret. It was just I just didn't think about it. But it was it was hilarious. Right. Right. If I think if like to your point, like if a secret alters someone's total understanding of themselves and the world they live in, it's like, no, no, no. Then that's not a secret. That's a potential like psychic bomb that you have to diffuse as quickly as possible.
Starting point is 00:10:04 Yeah. Tremendous bomb. And it affects your whole life. It literally affected me from the time I was a kid. Like, I remember it. It's definitely there. And you can't subconsciously carrying a secret like that and then living with your secret. Right.
Starting point is 00:10:18 Like, what does that do to you and your relationship with that person? And my thing is like, look, you keep secrets and lies from people that you love. That creates such a big distance. That person can't be your real friend or your real like love or your real whatever, like where it's just a huge distance that separates you from people you love. So, right. Kibosh it. Don't do it. Kibosh.
Starting point is 00:10:42 Come clean. Have the courage to come clean dr mcnerney what is something you think is overrated i feel like i'm not gonna win myself any friends i'll be like how to like lose friends and stop influencing people but all the disney live action movies i've hit a stage where i'm like no more live actions like the little mermaid is beautiful i love the original cinderella but you know you know, I don't want to see more and more of, like, the same movies again. Like, it makes me a bit sad. And, like, what other stories could be told if it weren't, like, always making these live actions?
Starting point is 00:11:13 But, you know, maybe I'll be proven, like, totally wrong. And the next ones will be amazing. But I'm ready for something else. Just, yeah. Based on everyone's first, like, sort of reaction. Like, oh, you saw it how it was? And they're like, yeah, it was okay like yeah it was like no one was like it was so good i'm gonna see it nine times everyone just kind of i think they're they have to reconcile like their love for the original source material with
Starting point is 00:11:34 that and not fully be like i didn't like it they're like yeah i mean it's it's yeah it's interesting you know it's like their voice just goes up. Yeah. You know, it was like, I'm glad I spent my afternoon going to that. I think that's right. I I've never been more confident in a prediction about the future of a like subgenre of movies than in saying that they're not going to suddenly figure something out about the disney live action reskins of the animations like we we know what the original cartoons look like and what happens in them we know what is possible here like right i can't i can't imagine a version like what what they're gonna pull out where we're like oh no that is not did not see that one coming whoa that bear is talking and singing hold on oh they already did that that was the best one i think was jungle book and that was uh the first one they did and
Starting point is 00:12:39 ever since i feel like yeah been diminishing returns although I didn't see the little mermaid so I cannot speak to that one is you what's your favorite genre of film though Dr. Carey oh I mean okay so I really have a soft spot for like old school superhero movies but my like most recent most amazing film I feel was the latest uh into the spider verse film what was it spider man across the spider verse yeah and i just i love animation in that film it's so witty and the visuals are extraordinary and just great films so i wax oracle about how much i love these films at work and everyone has to deal with me being like go watch it so anyone listening go watch it favorite favorite spider character spider person in the universe? Oh, my goodness.
Starting point is 00:13:28 I can't remember the name of it, but the really dark noir run for the first one. He's from all those 1930s, kind of like dark mystery films, you know, in black and white. I kind of love that. Yeah, yeah. Was it just Spider-Man Noir?
Starting point is 00:13:43 Yeah. I think that's noir spider-man maybe i don't know or i'm sorry spider-man nor nor what is when you say old school superhero movies do you mean like christopher reeves superman or are you talking about like the original man the first iron man uh i think it's almost less specific films. I love almost films that have that really cheesy origin, coming-of-age story. It's really just about discovering themselves. It's such a simple narrative.
Starting point is 00:14:17 I feel like I should crave more complexity than that, and I should want it to be more nuanced stories. But sometimes it's just really satisfying when you watch this clean-cut narrative of this good guy defeating all these bad guys. When I want to relax, I really enjoy that. In my day job, I think about lots of complexity and nuance and what we do with the future of society.
Starting point is 00:14:37 Sometimes I find it very relaxing just to be like Star Wars. You're like, I love a Joseph Campbell type flick. Just give me that hero's journey in every shape and form. Still gonna be okay. Yeah, yeah, yeah. Nothing's gonna go wrong. Challenge your old self to become your new self. Yes!
Starting point is 00:14:54 But Doc Ock is an example of what we're facing with AI, like in the immediate future. I think we can all agree on that, right? Spider-Man 2. In a way, you're like, it's giving me the equivalent of like eight arms in a way. You're like, oh, boy. What is something, Dr. Higgins, that you think is underrated? Underrated? Asking people for help. And I know that I listen to the show daily and I know people have such good responses, but that was the only thing that came to mind for me in terms of underrated.
Starting point is 00:15:25 And so that's why I'm on the show. Come to my live show on October 11th. Yeah. Help fill the seats. Come. Come listen to us babble about absolutely nothing. But no, seriously, asking for help. And I think the older I'm getting, you know, I think there's this, you know, and again, this could be the social justice in me, but I think as, you know, a marginalized person, we're always expected to have the answers to everything.
Starting point is 00:15:50 We're always supposed to be the person like the world teaches you to be the smartest person in the room all the time. And I've just gotten to the point now where it's like, yes, I have a doctorate, but I'm still not the smartest person in the room. Like, wouldn't anything say like support me bitch like that is literally how i feel all the time now like i'm like if i need help i'm gonna ask i'm not that girl that's like i've got it together like i'm good like my life is great no my life is literally three seconds it's a string from falling apart so like please help me like if you can help me in any way whether it be like you know giving me you know a network or an email or a connect to something like i'm never below saying help me
Starting point is 00:16:32 right because this life is hard like just it's it's really really hard so oh yeah and i think there's also yeah there's people don't want to want to struggle in public that's a huge reason why people don't ask for help or act like they they don't like they got their shit all figured out but it is it is like the most liberating and sort of like life-affirming thing when you ask like you'll be surprised you ask for help the help that you will receive from people that are like in your circle and i think that's also one of the underrated aspects of asking for it is like you will also realize like damn i got people on my corner yeah i'd say this is probably the most underrated thing of my adult life this is this is a great underrated it's something i've had to do for like addiction issues i've had to do in like it's made my marriage better and just in in terms of it being good at good advice we did this thing like there's this
Starting point is 00:17:26 hack yeah one hack that ben franklin wants you to know but there's this thing that ben franklin talks about that he realizes this like counterintuitive thing that if you ask someone for a favor they and they do the favor for you, they will like you more. And it's a weird, because you're showing vulnerability and there's like something like subconsciously of them, like doing something for you, like their, their brain is like, well, they must be cool then. Cause I just did this thing for them. So it's like, it's all around, like just being willing to be vulnerable enough to ask for help opens this world of possibilities and increased, you know, just everything. It makes your life better and richer in so many ways, you know, decreases loneliness,
Starting point is 00:18:20 which is such a huge problem like right now. Yeah. Yeah. And all that being said, I can speak about it for a long time, and, like, I still don't do it enough, you know? Yeah. Neither do I. Because it's hard. It is hard. And I think, I was going to say real quick, I think a really important thing is, you know, masculinity is, as a, like, a thing, right? Whether you talk about toxic masculinity or masculinity as a whole, like, we're all three men and i think so being even even for me being non-binary there's still this presentation of i'm male presenting to people in some way and so i'm supposed to never be emotional enough to say i'm struggling and i actually had to like acknowledge that a couple of like days ago like two weeks ago i was literally sitting in my office going like here comes the ideations here comes the like no one loves you no one cares like
Starting point is 00:19:10 you're you're feeling very alone so i had to get my therapist back up and i had to say like hey i'm going i'm literally slipping into this mindset again of being lonely and feeling like the world doesn't need me here like i need help and she was like you know let me get you back in and you know let me get you back on the roster. And so I think that that's something that, like, I try to emulate in everything that I do that, like, just because you see, you know, the shows and you see me on the network and you see me on TV and you see me with, you know, celebrities. I mean, I went to dinner with a celeb friend of mine just this weekend. People are like, oh, that's so cool. I'm like, yeah, but we were sitting at the table literally almost crying with each other because we're all struggling like we're all going through
Starting point is 00:19:48 it so yeah it's just you know ask for help literally like people are not going to judge you for asking for help so yeah and that it's okay to be struggling i think that's the other hard part is like whenever i start getting like you know i start having i start ruminating on shit or whatever it's like man you don't fucking start having I start ruminating on shit or whatever. It's like, man, I feel like you get your shit together, whatever. It's like, no, that's not that's not how you get out of it. You get out of it by being like, OK, OK, that's where we're at right now. Let's try and find a way to sort of pivot to something that feels a little bit better
Starting point is 00:20:17 and then incrementally get out of there. But the whole like brute force of like, I don't need to feel like this. Damn it. It's yeah. Guess what? It only makes shit worse. brute force of like i don't need to feel like this damn it like it's yeah guess what it only makes shit worse yeah what if you just said no to sadness though ah if you just said no like to drugs yeah like i don't know that that seems well just like a no-brainer to me yeah
Starting point is 00:20:37 it's just gonna make me do more more sadness that's right more sadness all right let's take a quick break and we'll come back. And speaking of sadness, we'll talk about the Republican Party and how things are going for them. We'll be right back. I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the host of the new podcast,
Starting point is 00:21:11 Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades. Jessica and I will delve into the hidden truths between high control groups and interview dancers, church members, and others whose lives and careers have been impacted, just like mine.
Starting point is 00:21:32 Through powerful, in-depth interviews with former members and new, chilling firsthand accounts, the series will illuminate untold and extremely necessary perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Gianna Pradente. And I'm Jemay Jackson-Gadsden. We're the hosts of Let's Talk Offline, a new podcast from LinkedIn News and iHeart Podcasts.
Starting point is 00:22:07 When you're just starting out in your career, you have a lot of questions like, how do I speak up when I'm feeling overwhelmed? Or can I negotiate a higher salary if this is my first real job? Girl, yes. Each week, we answer your unfiltered work questions. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in experts who do, like resume specialist Morgan Saner. The only difference between the person who doesn't get the job and the person who gets the job is usually who applies.
Starting point is 00:22:37 Yeah, I think a lot about that quote. What is it, like you miss 100% of the shots you never take? Yeah, rejection is scary, but it's better than you rejecting yourself. Together, we'll share what it really takes to thrive in the early years of your career without sacrificing your sanity or sleep. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:23:01 How do you feel about biscuits? Hi, I'm Akilah Hughes, and I'm so excited about my new podcast, Rebel Spirit, where I head back to my hometown in Kentucky and try to convince my high school to change their racist mascot, the Rebels, into something everyone in the South loves, the biscuits. I was a lady rebel. Like, what does that even mean? The Boone County Rebels will stay the Boone County Rebels with the image of the biscuits. It's right here in black and white in print. A lion.
Starting point is 00:23:27 An individual that came to the school saying that God sent him to talk to me about the mascot switch. As a leader, you choose hills that you want to die on. Why would we want to be the losing team? I'd just take all the other stuff out of it. On the segregation academies, when civil rights said that we need to integrate public schools, these charter schools were exempt from that. Bigger than a flag or mascot. You have to be ready for serious backlash.
Starting point is 00:23:54 Listen to Rebel Spirit on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. We're back we're back and dr mcnerney as mentioned i am an idiot on this ai stuff i think i generally have like my version of ai up until last week i, like researching for our last expert episode was what I had read in, you know, mainstream articles that went viral and films like Hollywood films and then like messing around with open AI and or chat GPT. So I had this kind of disconnect in my mind where it was like, from an outsider's perspective, we have this C plus level, like copywriter thing with like in chat GPT, GPT-4, and then like the godfather of AI, who I'm just trusting people is the godfather of AI.
Starting point is 00:25:00 But that's what everyone uses that same phrase. They're like, the godfather of AI just quit Google and says we're all fucked in the next couple of years. And I think it's confusing to me because I don't know exactly. Like, I can't even, like, picture the way how he thinks we're fucked. And there was this letter that was like, we need to pause development on AI in the near future. And I guess I'm just curious to hear your perspective on that pause letter and what the kind of dangers of AI are in the near future. Yeah, because to Jack's point too,
Starting point is 00:25:40 also, we were talking with João Sadoc last week at NYU about it and like, at the end of it, we're like, okay, Sadoc last week at NYU about it. And at the end of it, we're like, okay, so it's not Skynet, right, from Terminator. And they're like, oh, great. But then we realized there's a raft of other things that come along with just not being the Terminator. So, yeah, I'm also from a similar perspective where I always assume Skynet. Yeah, I mean, I think you're totally not alone in the dominance of ideas like Skynet and Terminator because so much of our cultural framework for understanding what AI is comes
Starting point is 00:26:13 from a very narrow set of movies like the Terminator, like the Matrix, which always positions AI as something that's going to dominate us, it's going to take over the world, and it's going to control us. And it's important to, I think, highlight that that's definitely not the only ideas that we have about AI. We've got thousands of years of thinking about our relationship with intelligent machines. And there's a lot of different cultural traditions that have really different ways of thinking about our relationships with AI and intelligent machines that could be much more positive, much more harmonious. And so I do think our immediacy of jumping to this idea of Skynet is reflective of very much where we are right now, right?
Starting point is 00:26:51 I'm in the UK, you're in the US. These are countries that have a really long history of thinking about AI in very binary terms. So yes, I think it's important that we think about these long-term risks of AI. And you mentioned the pause later calling for a halt to generating more large language models like chat GPT until it had a bit more of a moment to think about some of the long-term consequences of these models. But I think it's really important not just to think of the long-term risks, but to think about which long-term risks we prioritize. Because
Starting point is 00:27:21 I think the Skynet Terminator fantasy eats up a lot of oxygen about how we talk about AI's risk, but there's a lot of different risks that AI poses. So another long-term risk that we don't talk about very much at all is the climate cost of AI, right? Because AI, it's hugely energy intensive. Data centers require a huge amount of water to function. And we have this massive problem of e-waste produced by a lot of electronic systems. But that long-term problem of climate crisis is much less exciting. It's really scary. It's really grounded in a lot of our experiences. And so it just doesn't seem to get as much airtime. So that's something that I think is really important, is changing the
Starting point is 00:28:00 conversation a bit to say, okay, it's sometimes interesting, sometimes scary to think about the Terminator option, but what are some of the other long-term options that could really shape our lives? Yeah. Like the degree to which the deck is being stacked towards the Terminator option was surprising to me. We dug in last week a little bit to the two stories I had always heard that are like kind of put into
Starting point is 00:28:27 the Terminator version of AI taking over a category. There's the AI that like killed a person in a military exercise that decided to like eliminate its controller. And then there's the AI that hired a task rabbit to pass the CAPTCHA test. And like in both cases, those are was the AI that hired a TaskRabbit to pass the CAPTCHA test. And like in both cases, those are like the AI that killed a person in the military exercise, like that was somebody claiming that. And then when they went back,
Starting point is 00:28:56 they were like, oh, I was just saying it could hypothetically do that. It was a thought experiment. Yeah, it was a thought experiment of what an AI could do in the right circumstances. And the TaskRabbit one was more similar to the self-driving car Elon Musk thing, where it was just there was a human prompting it to do that thing that seems creepy to us when we like start thinking about, oh, it's like scheming to get loose and get like overcome the things that we're using to keep it hemmed in.
Starting point is 00:29:28 So it does feel like there is an incentive structure set up for the people in charge of some of these major AI companies to get us to believe that shit like to think to only focus on the ai v humanity like ai gets loose of its control of our controls for it and takes over and starts like killing people version of it i'm just curious like what what are your thoughts on like why why are they incentivized to do that when it would seem like you, well, you don't want to make it, you don't want it to seem like the, this self-driving car will take over and start,
Starting point is 00:30:12 you know, kill your family. Yeah. Start killing your family. It's so powerful, but it seems like with AI, they're more willing to buy into that fantasy and have that fantasy projected to people who are not as
Starting point is 00:30:28 closely tied to the ins and outs of the industry. Yeah, I mean, I think that's such an important point because it's a weird thing about the AI industry, right? You would never have this kind of hype wave around something like broccoli where you say, oh, the broccoli, if you eat it, could kill you or
Starting point is 00:30:44 it could transform the world. And you wouldn't expect that to somehow get people to buy a lot more broccoli. You'd just be like, oh, I don't want to eat broccoli now. But if they were like, it's so fucking good and powerful, if you explode, like maybe, like maybe that's what it is. Even worse, you can then also eat, you know, what the broccoli would be doing. But I do think that we see this real cultivation of hype around AI and that a lot of firms explicitly use that to sell their products. It gets people interested in it because on the one hand, people are really scared about the long-term and short-term impacts of AI. On the other hand, they're also scared, though, of getting left behind. So you see more and more firms saying,
Starting point is 00:31:24 now I've got to buy the latest generative AI tool so that I look really tech savvy and I look tech forward and I look futuristic. And so it's part of the bigger hype cycle, I think, to draw a lot of attention towards their products, but also to make them seem like this really edgy, desirable thing. But I think what's also interesting about both the stories that you raised is when you looked under the hood, there was human labor involved, right? There were people who were really central to something that was attributed to an AI. And I think that's a really common story we see with a lot of the hype around AI is often the way we tell those stories erases the very real human labor that drives those products, whether it's the artists who originally made the images that trained a generative AI through to data labelers, all sorts of people who are really central to those processes. Right. And I know like in your episode of The Good
Starting point is 00:32:15 Robot, when you're discussing the pause letter, you know, I think the version that we see as like sort of the short-term threats, at least in the most immediate way, is like, for me, working in and around entertainment and people who work in advertising and seeing like an uptick in that sector, I go, okay, that's easy. Like I can see how a company immediately goes, yeah, it's a tool. And then suddenly it's like, and now you're on your ass because we'll just use the tool now, and we don't even need a person to prompt it, or we just need fewer people to operate it. So to me, I'm like, okay, that's an obvious sort of thing I can see like on the horizon. And you did talk about, well, there was
Starting point is 00:32:53 a lot of talk of these sort of long-term existential or quote unquote existential threats, that there were a lot of things in the short term that we're actually ignoring. What are those sort of things that we need to bring a little bit more awareness to? Like, I know you mentioned the climate. Um, and I look at it from my perspective, I see like the just massive job loss that could happen. Um, but what are sort of like the more short-term things that kind of maybe are less sexy or interesting to the people who just want to write about killer Terminators and things?
Starting point is 00:33:34 And that's often something people don't really want to hear about because they don't want to acknowledge these inequalities or because it takes away from the shiny newness of AI. It makes it very much like a lot of other technological revolutions that we've already seen. And that's super boring. Like you don't want to hear about how the wheel somehow brought about some kind of inequality. The wheel is racist and we all know it.
Starting point is 00:33:55 The big hot take from today. But yeah, I mean, something that I look at, for example, are like very mundane but important technologies, like technologies used in recruitment and hiring. So I look at AI-powered video interview tools and look at how that affects people's particular, you know, likelihood of being employed and how they go through the workforce. And yep, it's less exciting-seeming than the Terminator, but again, when you look under the hood and dig into them, you're like, oh wow, this could actually really compound inequalities that we see in the workforce
Starting point is 00:34:25 under the guise of the idea that these tools are going to make hiring more fair. And that's a massive problem. Right. So because like the idea with those hiring tools, like it will actually take away these sort of like demographic cues that someone might use to like, you know, they'll apply their own biases to. So in fact, it is the most equitable way to hire.
Starting point is 00:34:50 But is it because of just the kinds of people that are creating these sort of systems? Because they tend to be a bit one note, that's inherently where like sort of that begins to wobble a bit? It's a mixture. So of course, yes, the lack of diversity in the AI industry is like very stark. And also sadly, in the UK, an industry where, for example, women's representation is actually getting worse, not better. So that's a sad slap in the face for a lot of the progress narrative that we want to see.
Starting point is 00:35:13 But sometimes it's not even necessarily that the people creating these tools have bad intentions. Maybe not even that they're using faulty data sets or biased data sets. These are two of the really big problems that are flagged. But sometimes the underlying idea behind a product is just bogus. It's just a really bad concept. And yet somehow it gets brought to market again because of all
Starting point is 00:35:35 this hype around AI. So with the video interview tools that we look at, for example, they basically claim that they can discern a candidate's personality from their face and from their words. So how they talk, how they move, they can decide how extroverted you are or how open you are, how neurotic you are, how conscientious you are, all these different markers of personality. To which I would say, firstly, no, there's absolutely no way an AI can do that. This is just a very old kind of racial pseudoscience making its way back into the mainstream saying, okay, we can totally guess your personality from your face. It's like your friends looking at someone's profile picture on Tinder or whatever and being like,
Starting point is 00:36:16 they look like they'd be really fun at a party. It's about that level of accuracy. And then, second, is that even a good way of judging if someone's gonna be good for a job? Like, how extroverted do you want a person in a job to be? Maybe in your job that's really, really helpful. In my job, I don't know how helpful it is. So there's just kind of a lot of flaws at the very, you know, bottom of these products that we should be worried about. Just like a C-minus-level job hiring process. Like, that's what I feel like so many of the things, like, when you get down to them and see them in action, they're like not that good. Like, it does feel like the whole thing is being hyped to a large degree. And like, that's something I heard from somebody I know who, like, works in, like all of my friends
Starting point is 00:37:06 who work in finance or any of those things, my brain shuts off when he starts talking about what he does. But he was saying, he pays attention to the market, and he was saying there's a big thing propping up the stock market right now, and it's AI. And it really is like that. That's where so much of people's wealth is tied up, is in like the stock market. And it's just tied to like what you can get people excited about in a lot of cases. So really, from that perspective,
Starting point is 00:37:38 the incentive structure makes sense. Like, you want people talking about how your AI technology can do all these amazing things, because that literally makes you have more money than you would have if they knew the truth about your product. Without being like, yeah, how many seven-fingered Trump pictures can we create? And, right, be like, man, fucking dump millions into this. Yeah, I mean, that's kind of something that's really come out of the last few years, is how many firms just use the label of AI to get funding. I think there was a study a couple of years ago that said
Starting point is 00:38:15 40% of European AI startups didn't use AI. At which point you're like, well, what are you doing? What are you doing? Well, we could, though. This podcast is actually an AI podcast, because eventually it could use AI. And before we were recording, Miles was actually putting in, he asked an AI to, uh, pitch him an episode of Friends in which the cast, and, you know, the people, the friends on the show, deal with the fallout from the events
Starting point is 00:38:46 of 9-11, and it wouldn't do it. So we can't quite claim that we are an AI podcast yet. But it did do it when I said, do a, uh, pitch me an episode where Joey and Monica drive Uber. Oh, right. Yeah. And it did. So clearly, because you can see where these guardrails are. They're like, don't do 9-11 stuff, though. That's, no. Don't do that. So, yeah, I think there's two things we're talking about here. Like, from one perspective, like, yes, you could put it in the category of, like, well, yes, the wheel makes racism or colonialism more frictionless, is a word that gets used a lot.
Starting point is 00:39:28 But like literally in the case of the wheel, frictionless. But AI and like a lot of technology is designed to make groups of people and like our interactions and the things that make people money more frictionless. And that's something that you guys have talked about on recent episodes of Good Robot. There's this one example that really jumped out to me that I think was from your most recent episode, or at least the one that's up most recently right now as we're recording this, where you guys were talking about a company that asked a regulating body to make an exception to a law around like a high-risk use of AI. And the law said that people had to supervise the use of AI, like, just because it seemed dangerous. And the company appealed to the regulating body by saying, well, we just, like, that
Starting point is 00:40:21 would cost too much and we would never be able to like scale this and make a profit. And it feels to me like our answer as a civilization to that complaint needs to be, that's not our problem. Like, then you shouldn't be doing it. But instead, it seems like the answer too often, not just in AI, but just across the board, especially in the U.S., is like, okay, well, we have to make an exception so that they can make a profit around this technology, or else the technology won't get developed, because the only thing that drives technological progress is the profit motive. But that's, you know, as I think you guys talked about in that episode, that's never been the best way to develop technology. Like, it's been a good way sometimes to democratize existing technology. But, I don't know, I feel like that idea, of you have to make it profitable, of making it easy on these companies to keep trying different things for AI to become profitable, is baked in
Starting point is 00:41:25 at a cellular level at this point in how a lot of Western colonial civilizations operate. Yeah. I mean, I think too often a lot of the technologies that shape our daily lives are made by a very narrow set of people who ultimately aren't beholden to us. They're beholden to their shareholders or to their boss. So they don't really have our best interests at heart. For example, take this whole rebranding of Twitter to X by Musk. I remember waking up and finding my little Twitter bird replaced with this huge X and just being like, firstly, because it was part of Twitter's slow decline. But secondly, it made me feel pretty disappointed, or really aware of the fact that one guy could have such a huge impact on how literally millions of people use a social networking platform
Starting point is 00:42:15 that's actually super important to their daily lives and has played a huge role in activist movements and fostering different communities. And I think that's a story we see time and time again with some of these big tech companies, which is not only do they have their own profit motive at heart, they're not beholden in any way to the public and they're not being compelled by regulation to make good decisions that necessarily benefit the public. So I think a really important question going forward is how do we support kinds of technology development that are very much based in the communities that the technology is for? I think one really big part of that is recognizing that so many AI models, as you mentioned, they're designed to be scalable,
Starting point is 00:42:56 and that's how they make money, this idea that you can apply them globally and universally. And I think that's a big problem, partly because it often is really homogenizing. It can involve like exporting a model from the US usually out to the rest of the world, where it's probably not actually appropriate to use in those contexts. But also a lot of really exciting and good uses of technology, I think, come from these really localized, specific community-based levels. So sometimes I think it can be about thinking smaller rather than bigger. Yeah. Yeah, that was like another thing that struck me about, like, just all the warnings, and even in that pause letter, is sort of like the presumption that it's like, well, all you motherfuckers are gonna use this, so we gotta
Starting point is 00:43:35 talk about it, where it's like, I don't know, I don't even fucking know what it is. Like, a second ago I thought it was Skynet, and now, like, you have your company being like, yeah, we now have enterprise AI tools, like, welcome. You're like, but what am I, huh? Like, what? And I think that's the really interesting thing about this as like a sort of technological advancement, is before people even really understand what it is, there is, like, from the higher-ups, the powers that be, sort of going into it being like, well, this is it. Like, everyone's using it, but I'm still not sure how. And I guess that probably feeds into this whole model of generating as much, you know, excitement, market excitement about AI, by taking the angle of like, everything's different because everyone is going to be using AI. Most of y'all don't know what that is, but get ready. And I think that's
Starting point is 00:44:22 what also makes it very confusing for me, as, like, a layperson outside of the tech sphere, to just be like, wait, so are we all using it? And even now I really, I still can't see what that is and how that benefits me. And I think that's a big part of, I'm sure, your work too, or even like any ethicist, is to understand, like, well, who does it benefit? Like, first, we're making this because it benefits who, and how? Yeah, and I think, yeah, right now it benefits the companies that are making it. It sort of feels like that's the way it's being presented, or slightly being like, yeah, you guys are going to love this, but really it's, we're going to benefit from the adoption of this technology. Yeah, I mean, I think that's this crucial question, is this stepping back and saying, actually,
Starting point is 00:45:04 is this inevitable, and do we even want this in the first place? And I think that's what really frustrated me about the pause letter, and about a number of kind of big tech figures signing on to it, is that they're very much pushing this narrative of like, oh, this is, like, unstoppable and it's inevitable and it's happening, we've got to find ways to deal with it. And it's like, you're the people literally making these technologies in a lot of cases. So if you really think it's an existential risk to humanity, stop.
Starting point is 00:45:33 It honestly could even be that simple. But that's what makes me really then question their motives and sort of coming forward with a lot of this kind of very doom-and-gloom language. I think it's also interesting if you look at, for example, countries' national AI strategies. So if you look at, say, like, China and the UK and the US and these countries that are now thinking about what their national long-term AI strategy is going to be, they also very
Starting point is 00:45:57 much frame it around the idea that AI is completely inevitable, that this is going to be the transformative technology for imagining the future, for geopolitical dominance, for economic supremacy. And again, I think as an ethicist, what I really want people to do is step back and say, I think we're actually at a crossroads, so we can decide whether or not we think these technologies are good for us and whether they are sustainable, whether they are a useful long-term thing for our society, or actually whether the benefits of these technologies are going to be experienced by very few people and the costs are going to be borne by many. Right. We talked last week about the
Starting point is 00:46:36 scientific application that used deep learning to figure out the shapes of proteins, the structures of proteins, and that that could have some beneficial uses, will probably have some beneficial uses for, you know, how we understand disease and medicine and how we treat that. But there are ways to probably differentiate and think about these things. Like, it's not, you don't just have to be either Luddite or, like, AI pedal to the floor, you know, let's just get out of the way of the big companies, you know. But it is such a complicated technology that I think there's going to be inherent cloudiness around how people understand it, and also manufactured cloudiness, because it is in the overall system's benefit,
Starting point is 00:47:32 the overall system being like capitalism, it's in their benefit to generate like market excitement where there shouldn't be any, basically. Yeah, I mean, I think it's easy to generate this kind of nebulousness around AI because to some extent, we still don't really know what it is. It's still more of a concept than anything else
Starting point is 00:47:51 because the term AI is, like, stretched to describe so many applications. Like, I spent two years interviewing engineers and data scientists in a big tech firm, and they would sort of grumble, well, 15 or 20 years ago, we didn't even call this AI, and we were already doing it. It's just a decision tree. You know, again, it's kind of part of that branding.
Starting point is 00:48:09 But also we have these, again, thousands of years of stories and thinking about what an intelligent machine is. And that means we can get super invested and super cloudy very, very quickly. And yeah, I don't want people to feel bad for being scared or being cloudy about these technologies. It is dense and confusing, but at the same time, I do think that it is really important. There are certain things about these technologies that really excite me. But I'm really sympathetic to some of the kind of old-school Luddites, who weren't necessarily anti-technology, but were really against the kind of impacts that technology was having on their societies. So the way that new technologies, I think it would be things like spinning and weaving, were causing mass unemployment, and the kind of broader ramifications that was having for people in the UK socially. And that kind of has quite a scary parallel to today in terms of thinking about maybe what AI will bring about for the rest of us, who maybe aren't researchers in a lab, but who maybe might be replaced by some of these algorithms in terms of our work and our output. Yeah. Can you talk at all about open source, like, models? Because, you know, when we talk about this idea that corporations have all this
Starting point is 00:49:35 power and are incentivized to do whatever is going to make the most money, which in a lot of cases is going to be the thing that removes the friction from consumption decisions and, you know, just how people interact and do these things, which, as you guys talked about in your episode, like removing the friction, like friction can be really good sometimes. Sometimes your system needs friction to stop and correct itself and recognize when bad shit, when things are going wrong. But, you know, there's also a history, even in the U.S., where corporations are racing to get to a development and ultimately are beat by open source models of technological organizing around getting a specific solution. Do you have any hope for open source in the future of AI? Yeah, I think I'm really interested in community forms of development. And I think open source is
Starting point is 00:50:36 a really interesting example. I think we've seen other interesting examples around things like collective data labeling. And I think that these kinds of collective movements, on the one hand, seem like a really exciting community-based alternative to the concentration of power in a very, very narrow segment of tech companies. On the other hand, though, I think community work is really hard work. We had Dr. David Adelani on our podcast, who's a very important figure in Masakhane, which is a grassroots organization that aims to bring some of the over 4,000 African languages into natural language processing, or NLP, systems. And he talks a lot about how the work he does with Masakhane is so valuable and so important, but it's also really, really hard, because when you're working in that kind of collective, decentralized environment, it can be much slower, and, as you said, there can be a lot more friction in that process. But counter to this move-fast-break-things kind of culture, sometimes that friction can be really
Starting point is 00:51:35 productive, and it can help us slow down and think about, you know, the decisions that we're making very intentionally, rather than just kind of racing as fast as we can to make the next newest, shiniest product. I was, I'm also, like, in your work, too, you know, you talk about how, you know, like, looking at these technologies especially through a lens of, like, feminism and intersectionality and, you know, BIPOC communities and things like that. Like, broadly in science, there's, like, you know, there's an issue of, like, language hegemony in scientific research, where if things aren't written in English, a lot, sometimes studies just get fucking ignored, because, like, I don't speak Spanish or I can't read Chinese, therefore I don't know if this research is being done, and therefore it just doesn't exist, because the larger community is like, we all just think in English. Like, so how do you, like, you know, specifically, because, you know, when hearing the description of your work, help me understand, like, and listeners too, like, how we should be looking
Starting point is 00:52:33 at these things also from that perspective, too. Because I think right now, we're all caught up in, like, it's fucking Skynet. And it's not, you know, hold on, like, there are other subtleties that actually we should really think deeply about. Because, to your point, I feel like those are the dimensions of an emerging technology or trend or something that gets ignored. Because, to your point, it's like the thing where, of course, it's unequal, of course, it's racist or whatever. But what are those, like, what are those ways that people need to really be thinking about this technology? Yeah, I mean, I think English language hegemony is a really good example of this broader problem of the more subtle kinds of exclusions that get built into these technologies. Because I think we've all probably seen the cases of AI systems that have been really
Starting point is 00:53:17 horrifically and explicitly racist or really horrifically sexist, from, you know, Tay, the chatbot that started spouting horrific right-wing racist propaganda and had to get taken down, through to Amazon's hiring tool that systemically discriminated against female candidates. These are really, I think, overt depictions of the kinds of harms AI can do. But I think things like English language hegemony are also incredibly important for showing how existing kinds of exclusions and patterns of power get replicated in these tools. Because to an English language speaker, very crucially, they might use ChatGPT and think, this is great. This is what my whole world looks like
Starting point is 00:53:56 if they only speak English. Obviously, anyone who is not a native English speaker or who doesn't only speak English, it's going to be an incredibly different experience. And that's where I think we see the benefits of these tools being really unequally distributed. I think it's also important because there's such exclusions in which kinds of languages and forms of communication can get translated into these systems. So for example, I work with a linguist at the University of Newcastle, and she talks about the fact that there's so many languages, like signed languages and languages that don't have a written language, but they're never going to be translated into these tools and never going to benefit from them. You might think, okay, well, do these communities want those
Starting point is 00:54:34 languages translated into an AI tool? Maybe, maybe not. I'd argue, of course, it's up to them, but those communities are still going to experience the negative effects of AI, like the climate cost of these tools. And so I think it's just really important, like you said, to think about what kinds of hegemony are getting further entrenched by AI-powered technologies. All right, great. Let's take a quick break and we'll come back and finish up with a few questions. We'll be right back. few questions. We'll be right back. I'm Jess Casavetto, executive producer of the hit Netflix documentary series, Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the host of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades.
Starting point is 00:55:32 Jessica and I will delve into the hidden truths between high-control groups and interview dancers, church members, and others whose lives and careers have been impacted, just like mine. Through powerful, in-depth interviews with former members and new, chilling firsthand accounts, the series will illuminate untold and extremely necessary perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Gianna Pradente.
Starting point is 00:56:09 And I'm Jimei Jackson-Gadsden. We're the hosts of Let's Talk Offline, a new podcast from LinkedIn News and iHeart Podcasts. When you're just starting out in your career, you have a lot of questions. Like, how do I speak up when I'm feeling overwhelmed? Or, can I negotiate a higher salary if this is my first real job? Girl, yes. Each week, we answer your unfiltered work questions. Think of us as your work besties you can turn to for advice.
Starting point is 00:56:36 And if we don't know the answer, we bring in experts who do. Like resume specialist Morgan Saner. The only difference between the person who doesn't get the job and the person who gets the job is usually who applies. Yeah, I think a lot about that quote. What is it like you miss 100% of the shots you never take? Yeah, rejection is scary, but it's better than you rejecting yourself. Together, we'll share what it really takes to thrive in the early years of your career without sacrificing your sanity or sleep. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:57:09 It was December 2019 when the story blew up. In Green Bay, Wisconsin, former Packers star Kabeer Gbaja-Biamila caught up in a bizarre situation. KGB explaining what he believes led to the arrest of his friends at a children's Christmas play. A family man, former NFL player, devout Christian, now cut off from his family and connected to a strange arrest. I am going to share my journey of how I went from Christianity to now a Hebrew Israelite. I got swept up in Kabeer's journey, but this was only the beginning. In a story about faith and football, the search for meaning away from the gridiron,
Starting point is 00:57:55 and the consequences for everyone involved. You mix homesteading with guns and church, and then a little bit of the spice of conspiracy theories that we liked. Voila! You got Straitway. I felt like I was living in North Korea, but worse, if that's possible. Listen to Spiraled on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. And D.A.R.E. is somehow still a thing. Having been born in 1980, and, like, all my early memories are in a decade where it was just your brain on drugs, like just this weird, straightforward, just say no to drugs. I thought drugs were a thing people were going to make me do, like force into my bloodstream when I got to high school. Well, that's only in Seattle, Jack. If you drive through Seattle, that will happen to you. That will happen. They will force it actually through your car. Through your car window.
Starting point is 00:58:49 Well, you can't. You have to have your windows up. Come on. They're going to blow heroin into you. And you have to have the air recycling in your car air conditioning or else the heroin gets in. But, so I think we assumed
Starting point is 00:59:06 that the D.A.R.E. program went away, most of us, right? Until I saw some people outside of a Best Buy. Same, Best Buy, yeah. Did you see that? Wait, you saw D.A.R.E. people outside of a Best Buy also? Yeah, it was the Best Buy, yeah. D.A.R.E. people stand outside of the Best Buy. I don't know if they have like,
Starting point is 00:59:23 there's a contract where they work together. Yeah. But that's where I saw D.A.R.E. people too. I've seen D.A.R.E. people outside two Best Buys before, like trying to get you to sign something. And all I said was like, y'all are, what the fuck? I was like, nah,
Starting point is 00:59:37 man, I'm here. Where the hell did y'all come from? I'm here for a mouse pad. Now leave me the fuck alone. So, yeah. Yeah. I mean, there's obviously an epidemic of overdose deaths. Like the thing that Pee-wee Herman was saying in that PSA that we listened to up top is that was not true of crack.
Starting point is 00:59:58 Like crack didn't have like an epidemic of ODs, I don't think. But it did, like, that is happening with fentanyl. Like, fentanyl is causing a massive spike in overdoses, like that, and you don't know if there's too much. And, like, that does seem to be like the thing that they told us about drugs in the 80s has absolutely come true. It's called manifesting.
Starting point is 01:00:30 over this shit. Yes. So they've decided, okay, this is real. No more fucking around. Let's go back to that idea that was proven not to work and was just a way to get money,
Starting point is 01:00:43 to funnel money to cops, right? Essentially, yeah. So the D.A.R.E. program is being brought back to several school districts following a prolonged absence, this time with a specific focus on fentanyl. Doesn't seem like they've learned anything from the first time. Like, the things that people say were wrong with the D.A.R.E. program in the first place: first of all, that it was cops. Cops going into classrooms. Yep. Yeah, that part. Second of all, they were doing it too early. They were targeting kids who weren't really... They were trying to, like, get it in kids' brains before drugs were a real thing to them. That's why it didn't work for me. My drug was Power Rangers at eight, right? I have no idea what
Starting point is 01:01:33 a crack pipe is, you know. Yeah, totally. Or like, I knew it from a movie, but other than that, I was like... Yeah, I mean, I've seen New Jack City, but that's not... that's not how I live. So, yeah. Because I think I was in fifth grade. Yeah, same. I was probably in like fourth or fifth grade when they started pushing D.A.R.E. down my throat. And I just remember being like, this is, if anything, you're educating me on how dope drugs are. Yeah. That's all that was.
Starting point is 01:02:00 That was like my first real conversation about drugs where like the cop brought drugs into the classroom and we passed it around. You smell that pipe? That's the worst smell you're ever going to smell. I remember him saying that. I was like, that smells good. Yeah. Fine. It smells better than cigarette smoke.
Starting point is 01:02:16 It smells better than cigarette smoke, for sure, but go on. Yeah, no, 100%. Yeah. It was... I didn't realize it was created by the LAPD chief of police, Daryl Gates, whose thoughts on recreational drug users were that they should be taken out and shot. Oh, well, okay. It's a very humane approach to drug abuse. He was Duterte before Duterte. Yeah.
Starting point is 01:02:47 Duterte got his shit from the Johnny Appleseed of the DARE program. At its height, DARE was being taught at a whopping 75% of schools. By 1998, it was reportedly costing taxpayers $600 million a year. Yeah. That is loud. And the presentations were just full of racist dog whistles. Or just not dog whistles, just racism. Like pointing to the broken Black family as the source of the drug crisis.
Starting point is 01:03:21 Yeah. Going to a predominantly white school, have the officer be like, Miles, you know about that? Yeah, yeah. Does your parents or your family... Yeah, you're like, um, I have no idea what this is. I always see your mom picking you up, right? Yeah. Where's dad? You're like, man, get out of my life. But the thing that always bugs me about D.A.R.E. was, like, it's like the gum theory. I don't know if you all are familiar with what I call the gum theory. It may not be real, but I think about it. It's like, when you tell kids to not chew gum in school, they're gonna chew gum because you're telling them not to. Underground gum racket. Yeah, there was a whole, like... I was the girl that was bringing the gum to school to give to everybody. So,
Starting point is 01:03:59 like, D.A.R.E. was always, like... it was so dumb to me, because I'm going, you're telling kids not to, quote unquote, do something, and all you're doing is you're just piquing their interest. But yes, it was very much riddled in anti-Blackness. And that is something that I've always had an issue with when it comes to this program. Yeah. You're paying the least cool, least trustworthy people in your community to come into school and talk to the part of your community that most despises them. Right. And they put drugs on people. They put drugs on people. So how are you going to tell? Yeah, I don't know. I've always had issues with D.A.R.E. I've always had issues with the police.
Starting point is 01:04:38 But I will say very openly that I think D.A.R.E. is just a way to make Black people look worse than what the world already does. So, yeah. Just, yeah, reinforce, like, the police perspective, or what drug use even is. Yeah, it's Black people, mostly. And then also, like, getting kids to snitch on their parents and shit like that. It's like, where are the drugs? I remember them having a box that you put anonymous notes into, the DARE box, and then at the end of the class, the DARE officer would
Starting point is 01:05:12 read the questions. And this was just a thing that smartasses used to, like, be funny, make them read stupid shit. But, yeah, apparently it ended up, like, that DARE box resulted in children narking on their parents and being removed from their own homes by social services because
Starting point is 01:05:35 they revealed that their parents were taking drugs, aka smoking a little bit of pot. Wow. So really cool problem-solving there. Yeah. Creating fucking just horrible situations. But, like, to Dr. John's point, it's, like, verifiable that D.A.R.E. did fuck all to prevent kids from using drugs. Right? Yeah.
Starting point is 01:06:00 Yeah. Yeah. The stats are officially in, and it either had no effect or made children slightly more likely to use drugs and alcohol at an early age. I'm... I'm telling you it did, because I legitimately wanted to smoke PCP. He told us... yeah, he said... I'm not joking, I'm here with my fucking best friend. It's so funny, I know. But this is my fucking 10-, like, my 11-year-old brain, right? Officer... First of all, the first thing we always asked him at the top of every D.A.R.E. class was, can we hold your gun? Yeah. He never... he was like, no, there's no way. But we asked it every time. He's like, you guys have to stop asking. What, can you pull it out and show us the bullets? And he's like, bullets? He's like, we have to get going. He's like, what is that? And then, like, we had kids who, like, knew guns
Starting point is 01:06:49 and he would always get annoyed. But so he talked about this, like, so we're gonna talk about PCP, angel dust, shit like that. I got this exact same speech. Right? Crazy. He said he went to a call at a Jack in the Box where a guy had just beamed up, smoked some PCP, and was fighting the police there. And they needed backup. He said when he got there, this guy who smoked PCP lifted a fucking dumpster above his head. I remember that story. And threw it at a cop car. Yeah.
Starting point is 01:07:17 And me and my friends were all like, holy shit. We were like, yes. Are you for real? Superpowers. Yeah. We're like, how does it... how does it give you strength? He's like, I'm not sure, it's like the drugs. The drugs, dude. It's like... But then we're like, what is it, like, your muscles? Like, could you work... Like, I remember asking shit
Starting point is 01:07:35 like, if you worked out on PCP, would you keep that strength? And he was, again, so irritated by my curiosity around PCP-induced superhuman strength. But yeah, he just got mad. He's like, guys, this is not a cool thing. And... and it's funny that everybody, I guess... everybody heard this version of PCP gives you superhuman strength. Yeah. To the point where, like, when YouTube came out, I was... I was scouring YouTube for footage of somebody on PCP with that kind of strength. Never found it. And I was like, was this... was this cop lying to me?
Starting point is 01:08:08 And he was... Oh. Oh, you believe that shit? Yeah. And so as the stats are coming out, the D.A.R.E. program is doing their own research, you know,
Starting point is 01:08:20 and they're like, all right, well, we're going to get the DOJ to fund this study to see what the... like, what actually happened, what's actually going on here. And a month later, for the first time in memory, the DOJ refused to publish a study it had funded. Because it found that the D.A.R.E. program was useless and was taking money away from programs that actually worked. Behind the scenes, DARE went ballistic,
Starting point is 01:08:51 and they just kept saying this one thing. Like, every time they get attacked, the D.A.R.E. program is like, criticizing DARE is like kicking your mother or saying that apple pie doesn't taste good. And then a decade later, they were criticized, or, like, proven to be fraudulent again. And their response was, it's like kicking Santa Claus to me. We're as pure as the driven snow.
Starting point is 01:09:16 So like we're as pure as the Colombian cocaine we put in front of children. You're a Colombian flake. I just want to know where the money went. There was a lot of money that was being moved in the 90s around D.A.R.E. and I'm just wanting to know where it went. To police. To corrupt cops. I'm wondering, did it go into some
Starting point is 01:09:37 benevolent police fund, I wonder? Or it was just like, if you got that D.A.R.E. gig, they're like, oh shit, man, you're off the street beat? You're just doing fucking D.A.R.E. classes. Oh, that's right. They were... they were confiscating luxury cars, like Porsches and BMWs, and turning them into D.A.R.E. vehicles, like with a D.A.R.E. decal on it, that they got to drive around. They were like, yeah. Well, it becomes pretty apparent who's winning here, though.
Starting point is 01:10:06 You know what I'm saying? Right. And then a kid's like, wow, the police department buys you guys cars? Like, no, we can't afford that. Drug dealers get rich enough to buy cars like this. And then you're like, oh, okay.
Starting point is 01:10:16 That's the wave. Yeah. In reality, they were doing the same exact thing that drug dealers were doing. Right. 100%. 100%. They were selling you something. Yeah, it's ridiculous. But amazingly, D.A.R.E. still uses the DARE box.
Starting point is 01:10:33 Like, they still have the write-down questions for cops, folks. It's anonymous. Purely anonymous. Uh, psych! We're actually gonna investigate your parents if you say that you know what any of these things are. That smell familiar to you? Oh, yeah? And where to? Like, I could totally see being like, oh, yeah, I know what that smell is. That's weed. And then being like, whoa, yeah?
Starting point is 01:11:00 How do you know that? My parents? Yeah. You know, like, just fucking panicking and throwing your parents under the bus. Yeah. I don't know! I don't know! I was at a reggae festival recently, that's why I know. Next question. No questions. I need a lawyer, officer, for the questions, your honor. Yeah. All right, that's gonna do it for this week's Weekly Zeitgeist. Please like and review the show if you like the show. Means the world to Miles. He needs your validation, folks.
Starting point is 01:11:34 I hope you're having a great weekend, and I will talk to you Monday. Bye. We'll be right back. Outro Music I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the host of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church. Listen to Forgive Me For I Have Followed
Starting point is 01:12:54 on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I'm Keri Champion, and this is season four of Naked Sports. Up first, I explore the making of a rivalry. Kaitlyn Clark versus Angel Reese. Every great player needs a foil. I know I'll go down in history.
Starting point is 01:13:12 People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports. Listen to the making of a rivalry. Kaitlyn Clark versus Angel Reese. On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Capital One, founding partner of iHeart Women's Sports. Hey, I'm
Starting point is 01:13:31 Gianna Pradenti. And I'm Jermaine Jackson-Gadson. We're the hosts of Let's Talk Offline from LinkedIn News and iHeart Podcasts. There's a lot to figure out when you're just starting your career. That's where we come in. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in people who do, like negotiation expert Maury Tahiripour. If you start thinking about negotiations as just a conversation, then I think
Starting point is 01:13:53 it sort of eases us a little bit. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

There aren't comments yet for this episode. Click on any sentence in the transcript to leave a comment.