The Daily Zeitgeist - Weekly Zeitgeist 306 (Best of 1/22/24-1/26/24)

Episode Date: January 28, 2024

The weekly round-up of the best moments from DZ's season 322 (1/22/24-1/26/24).

Transcript
Starting point is 00:00:00 I'm Cari Champion, and this is Season 4 of Naked Sports. Up first, I explore the making of a rivalry. Caitlin Clark versus Angel Reese. Every great player needs a foil. I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports. Listen to the making of a rivalry.
Starting point is 00:00:20 Caitlin Clark versus Angel Reese. On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Elf Beauty, founding partner of iHeart Women's Sports. I'm Jess Casavetto, executive producer of the hit Netflix documentary series, Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church. Listen to Forgive Me For I Have Followed on the iHeartRadio app,
Starting point is 00:00:55 Apple Podcasts, or wherever you get your podcasts. Hey, I'm Gianna Pradenti. And I'm Jemay Jackson-Gadsden. We're the hosts of Let's Talk Offline from LinkedIn News and iHeart Podcasts. There's a lot to figure out when you're just starting your career. That's where we come in. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in people who do, like negotiation expert Mori Taheripour.
Starting point is 00:01:19 If you start thinking about negotiations as just a conversation, then I think it sort of eases us a little bit. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, the internet, and welcome to this episode of the Weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one nonstop infotainment laughstravaganza. Yeah. So without further ado, here is the Weekly Zeitgeist.
Starting point is 00:01:55 Anyways, Miles, we are thrilled to be joined in our third seat by a hilarious stand-up comedian, actor, musician with a 7.4 rated album on Pitchfork to his name. Booyah! That's right. You can listen to his podcast Cold Brew Got Me Like anywhere fine podcasts are sold or given away for free. His new book, The Advice King Anthology, available anywhere
Starting point is 00:02:19 fine books are sold, or given away for free at the library, you know. The poetry window is open, because it's Chris motherfucking Crofton. What's up? Good to see you guys. What's up, man? Man, I noticed something about you. You look like you have your fist over your chest, like you're doing half a Wakanda salute. Yeah, yeah, I broke my, uh, scapula Thursday. I got an email from, or no, I sent an email saying, hey, I would like to go back on the Daily Zeitgeist. And I got an email back that said, hey, we were thinking about it on Tuesday. And I said, sure.
Starting point is 00:02:55 Or Wednesday. Today's Wednesday, I guess, in the podcast world. But I don't want to ruin everything. I've ruined everything. So anyway, when I sent that email, I did not have a broken scapula. I had never even heard of a scapula. And then when I confirmed, yeah, sure, I'll do it Tuesday, I was on morphine. But I figured by Tuesday I'd be all right.
Starting point is 00:03:28 But I broke my scapula Thursday afternoon. Fuck, man. And I broke my rib. And so Nashville's been like this ice skating rink for a week. And now it just went away, because it finally got up to like 40 or 50 or whatever. Right. And so all the stuff melted. But for a week... I guess Nashville has like a couple snow plows. I mean... Yeah, you mean literally, like, two? They have a few, I guess, but it was not cool. It was like everybody was falling
Starting point is 00:03:57 down. My neighborhood was like a fucking circus. Like, the whole street was an ice sheet for a week, so people were trying to go to work and having to turn back. My neighbors... a guy, a drunk guy came down our street. Like, first of all, our street has got no sidewalks or anything. So, like, if you go off the street, you're in a lawn. Right. And so... this... my neighbors... like, I was asleep, because it was after I broke my scapula. So I was in bed. But I woke up and my roommate was like... that's right.
Starting point is 00:04:28 I said roommate. My roommate, your buddy. My buddy. Yeah. My wife. My mother. My beautiful wife was doing animation on the new Fast and Furious movie in the living room of my well-appointed Silver Lake pied-à-terre, however you pronounce that fucking word.
Starting point is 00:04:53 However you're animating it. Our new Fast and Furious movie? Whatever, she pays the bills. I was asleep. She pays the bills. I live right next to the Silver Lake Reservoir. Just gotten this note from Vin Diesel. I'm friends with Jimmy Kimmel.
Starting point is 00:05:07 It's unbelievable. So me and Jimmy Kimmel were taking a nap in the same bed. And my wife was animating Fast and Furious 12 or whatever it is. And no, okay, I'm going to go back to what's really happening. My roommate said that the next door neighbors... he's like, did you see what happened? And I was like, no, I was in there. And he said, oh my God, this car came down the street, lost control, and went into my neighbor's yard. Like, it slid on the ice and landed in their yard. Wow. And then it was a drunk person too, even
Starting point is 00:05:39 though it was the middle of the day. And, um, yeah, because I always think drinking is a nighttime thing, right? But not for this guy. And so then, while he was trying to get out of the yard, which he also couldn't get out of because it was ice too, and it's down in a gully... like, our street's on kind of a hill, and it's just a very... you know, this was not an ideal piece of land for houses when they put them in here. So this car, like, smashed into... the people who live there... he's trying to back up and stuff, and he smashed into their car. And then they eventually had to go out there and take his keys away from him.
Starting point is 00:06:09 And then they gave him a blanket because it was so cold, and they gave him some water. And then the police came, and I guess I didn't see the police, but he said the police were very mean to the guy. He didn't speak English. So, you know, anyway, Nashville, like, needs more snow plows on the double. So if anybody out there in Daily Zeitgeist world has an extra snow plow, you might want to call up city hall here. Bring it on down. How's your rib? How's your scapula?
Starting point is 00:06:35 That's what I'm concerned about. The thing is, I've broken so many fucking bones that it's really just embarrassing. Like, I was more embarrassed when I hit the ground than anything else, because I heard... or, I didn't hit the ground, I hit the steps. But I heard things crack, so I knew. You were like, it's something. Yeah, that's something. It's, you know... and it's just like... I haven't broken that much stuff. I mean, yeah, I have, but... I mean, I broke my hip in 2018, right? You know, roller skating, you know. And everybody thinks that's funny, but you know, it's not funny.
Starting point is 00:07:07 No, I'm only laughing because I felt awkward. It's funny. It's funny. That's LA's fault. That's because people in LA, grown people go roller skating because they're reliving their childhood that they didn't get to have in Wisconsin because their mean dentist father never talked to them or whatever. Right.
Starting point is 00:07:23 So they move out to LA and start roller skating. Oh, I'm going to roller skate and wear crazy clothes now. Even though I'm 40. Look at how high these socks are. Yeah, exactly. These crazy socks. I got them on the internet. The Yeti fuck film on there.
Starting point is 00:07:40 So I wrote, yeah, so I wrote, I'm doing a Yeti fuck film. We're pitching it to Adult Swim. It's going to be called Yeti Fuck Film, but we're going to censor it. We'll put stars instead of the fuck. You know, it's going to be F and then three stars. Yeah. And I know somebody at Adult Swim.
Starting point is 00:07:54 He's like a lower down guy. But I know the guy put the roof on Adult Swim. He replaced the roof and he talked to one of the guys there, so... Didn't they switch buildings, though? Yeah, but he's still got the guy's number. Yeah, well, he also... Yeah, he's just, like, in the loop. You know what I mean?
Starting point is 00:08:13 Yeah, yeah, yeah. He's got the guy's number. I don't know how it's going to work, but it's pretty much all set. Yeah, I hear that. Yeah, it's all set. So I fucking... I don't even know what I'm talking about anymore, but it was like...
Starting point is 00:08:25 My roommate said, the day it happened, my roommate's car got stuck in the middle of the road. And he came running in and he said, oh my God, oh my God, oh my God, the car's stuck in the middle of the road, and if someone comes over the hill... we live over a hill... he's like, somebody's gonna run into my car. And I was like, oh fuck. Yeah.
Starting point is 00:08:40 I love jobs. You know what I mean? Yeah. And I like helping. So I was like, oh shit. I was standing barefoot in the kitchen eating a cold quesadilla. And I put on my sneakers, and he said the stairs are slippery. But that was the only thing I hold against him, because the stairs were not slippery.
Starting point is 00:08:55 They were impassable, unusable. Each one was a solid fucking piece of ice. So I went outside and happily put my foot on the top step and immediately flew up in the air. And while I was in the air, I realized, oh my God, I am 54 years old and hovering over a set of concrete fucking steps, and I am fucked. And so one step broke my rib, and the other step broke my scapula, which is the piece on the back of your shoulder, the wing. And,
Starting point is 00:09:26 you know, I didn't even feel the shoulder thing because the rib hurts so much. So I, my only thing was, I actually thought I probably broke my back. So when I stood up, I was happy. Yeah.
Starting point is 00:09:36 You're like, it's a miracle. It was really lucky. Yeah. I hadn't spun. We are glad. Yeah. That sounds like it could have been worse. Yeah.
Starting point is 00:09:41 It felt like... the mundane way... I mean, you know, no one plans on getting paralyzed. I mean, that's what it felt like. It felt like, oh fuck, this is really dangerous. Like, when I was in the middle of slipping, I was like, oh my fucking God. I mean, it's just like... credit to you, man. I'm glad you're doing well. And also, I'm glad, you know, you listened to our email where we
Starting point is 00:10:04 said, get the fuck over it. You said you wanted to. We're doing this show. You said you wanted to be in the big time. Yeah, more on that later. You want to be in the big time? You're going to slip big time. This is Hollywood. Yeah.
Starting point is 00:10:14 I thought you were a professional. What the fuck is this? I will never talk to you again if you don't get on this Zoom. What do you mean something's broken? Like, I get it. I believe you. But, like, why are you telling me that? Yeah, what does that have to do with fucking anything, baby? Yeah, tell it to your personal assistant, you scumbag. Oh, man. So I have a GoFundMe, anyway, if anybody... I already made
Starting point is 00:10:39 the goal, though. And people from Daily Zeitgeist, and you guys, whoever's in charge of your social media, retweeted my GoFundMe. So I'm very grateful, as usual, to the Daily Zeitgeist community, because, you know, they've just been a huge part of my life in the last three, four years. Zeitgang. Yeah. And a part of theirs and a part of ours, man. Yeah. Amazing people. I canceled my health insurance because I feel like I can just, like, reach out to them.
Starting point is 00:11:03 Just wing it. Good idea. That's a good idea. Yeah. That's probably the best idea you've ever had. I'm sure your wife is an animator for one of the major movie series. Well, her work on the last five. Working on the New Avengers.
Starting point is 00:11:17 Yeah. She's a stunt coordinator for the New Avengers flick. Yeah. Yeah. I know how Hollywood works. Kevin, what's his name? Feige. Yeah. He gives us good health insurance. What is something from your search history that's revealing about who you are? Oh, gosh. I don't really know if I like what this reveals about me, but you know,
Starting point is 00:11:37 when you get like really obsessed with something and it's not even recent, it's not trending, like there's no reason why. But for all my search history for the last few days, just really wild deep dives on the 2019 film Cats, the awful one with all the CGI fur and all the superstars kind of crawling around on all fours. And I don't know why, but for some reason, a specter just came back to haunt me. And I was like, today I need to listen to an hour and a half YouTube video
Starting point is 00:12:05 on the making of Cats the movie 2019 and why it was a disaster. And so I'm not busy enough. I think I need more hobbies and I need more productive uses of my time. But I am now an encyclopedia on terrible CGI slash why you shouldn't try and make very weird Broadway musicals into films.
Starting point is 00:12:25 Yeah, I was immediately, like, trying to connect the dots. I was like, oh, well, I think I remember that they used technology to remove the cat assholes. Like, wasn't that one of the things where it was about to come out and they did a test screening, and everyone's like, their pink cat assholes are in our face the entire movie, are we going to just do that? And they went back and digitally removed them. But... yeah, were you into the movie when it came out? Were you anticipating it, or did you just kind of have it? So, I love... I'm a dancer, so, like, I love musicals, and, like, you know, I was one of the only people that was, like, unironically excited
Starting point is 00:13:06 at the thought of a Cats musical. So I realize that's for people who actually would have wanted to see this. And then the trailer came out, and it was just so frightening. What it was a really good example of is, in AI and robotics, you have the term the uncanny valley. This might be something you've heard of: this idea that the closer something looks to a human or to a living creature, while being different enough that you can tell it's not quite human, the creepier it is. And so this original robotics essay... Masahiro Mori, or
Starting point is 00:13:34 the person who, like, writes this essay about this, uses the example of, like, a mechanical hand that moves, and it's like, that's super creepy, like, no one wants to see that. And all I could think of is this, like, essay from this Japanese roboticist when I saw those cat-human things moving, and I was like, it's gonna be so bad. Like, no one wants to see this, whether the pink assholes are there or not. You know, in the end, they just end up being all smoothed out, which is also awful. Yeah. So, yes, it was a kind of terrifying disappointment. Because, yeah, there's kind of a Streisand effect, I feel like, where people are like, but where are the assholes?
Starting point is 00:14:11 You know what I mean? And then people are like, well, we actually did, like... I don't know, it might have been better with them. Have you, speaking of Cats and an interesting sort of obsession with it, have you listened to your fellow compatriots' podcast, Guy Montgomery and Tim Batt's podcast called My Week With Cats, where they keep watching Cats over and over
Starting point is 00:14:30 and talking about it? No, but this sounds exactly up my street. Yeah, yeah, yeah. We've had them on the show. Guy is one of our favorite guests and a fellow Kiwi. And yeah, like, it's such an absurd podcast. They just keep revisiting Cats over and over. Yeah. I mean, have you listened to the iconic Kiwi podcast Who Shat on the Floor
Starting point is 00:14:50 at My Wedding? That went kind of viral. I still haven't listened to it yet. But that's also on my listening list. Yeah, that's one that I've heard many times, people being like, have you... come on! And all the write-ups about it, too, are like, it's absolutely the most riveting thing that we've listened to this year, is what I feel like most people say about that. That's amazing. So how many viewings deep are you of 2019 Cats? Or is it just like you watched it once and then it's all YouTube explainers? Oh, yeah. I watched it once.
Starting point is 00:15:20 I don't know if I could do it again, to be honest. I did it once on a transatlantic flight. And that was, you know... myself, trapped in the metal tube, had this moment with Cats where I was like, oh, it is as bad as everyone said. And then now it's just extremely scathing movie reviews on YouTube and in print. So I'm single-handedly keeping the movie YouTube review economy alive at this point. What's something you think is overrated? Authenticity. What do you mean? I'm over it. What are you saying here? Yeah.
Starting point is 00:15:54 I'm over it. At first I was going to be specific and talk about, like, gray hairs and personal grooming. But then I realized, just in general, I'm over authenticity. Give us an example of how you are bucking the scourge of authenticity. Okay. Number one, please take note of my attire today, folks at home. I am in flannel, covered in eggs, banana, avocado, and apple. Yeah. I was going to say, wait, don't forget the apple.
Starting point is 00:16:29 But yeah, okay. Some of it from last week. Oh, yeah. Oh, yeah. Wow. I got snot stains on this sweatshirt I'm wearing right now. Now, I also have like all my grays peeking through, a little bit of sunburn on one side.
Starting point is 00:16:45 Who cares? Sunburn on one side? You don't want to know this. You've been breaking up the parenting by being a long-haul trucker. Yeah. Is that where the sunburn on one side is coming from? I'm just saying, I don't want to be fucking authentic. I don't want to be authentic.
Starting point is 00:17:01 I want you at home just imagining that I look fucking glamorous. Yeah. That my hair is, like, blown out. My makeup is contoured and perfect. My Snapchat filter is, like, really how I look. Mm-hmm. Mm-hmm. You know what I'm saying?
Starting point is 00:17:17 That I'm, like, a Kim Kardashian mom that's just, like, together all the time. All the time. Even though I'm always on my banana phone, just having long, like, three-hour-long phone calls. Like, mommy's busy. Mommy's busy. Anyway, Raffi, get back to the team. We all know it's not true, but isn't it better? You know, I've already been on the show talking about how important lying is, right? And this is really just part two of that. Just keep the lies going. You don't need to know how people are doing, really. Whenever people are like, how are you doing? Like,
Starting point is 00:17:56 it doesn't matter. Yeah. I'll tell you... look, I'll tell you if I'm doing bad. How about that? Otherwise, let's just presume I'm balling out of control, okay? I'm just out here just killing it. Just slaying. Yeah. You know, drinking from my Hydro Flask that should have been a Stanley. Okay. Yeah, we did clock that before we started rolling, and I was like, oh, you came with this Stanley. And then you turned it around and set the record straight. It is a Hydro Flask, but it looks functionally exactly the same as a Quencher. And you say you are not off the Stanley? You know, I screwed up.
Starting point is 00:18:32 I should have lied. Yeah. This is the one area where we have to disagree. Authenticity is important, damn it. If it's a Stanley Quencher or a Hydro Flask, it needs to be a Stanley, unfortunately, for us to be okay. Well, because we're Gen Z. We're Gen Z. We're Gen Z. That's why. Did you not know? I forgot. Like, it's very important. That's so fucking cringe. Like, you don't even
Starting point is 00:18:56 know you're Gen Z, Zara. Like, I'm so sorry. Oh my God. Did you try a Stanley, though? Like, for real, though? Did you try a Stanley? You said you tried a Stanley, right? I tried the Stanley. I didn't like it. Okay, so break it down. How... No, you must not have been using it right. Yeah, well, like...
Starting point is 00:19:12 The Stanley is narrower. Mm, okay. And I was extra conscious of how it might tip over. Oh, it didn't feel stable. It didn't feel stable. Okay. I care about my center of gravity. I also don't have as much collagen in any of my joints anymore. Just, kind of, things are wobbly. Yeah.
Starting point is 00:19:34 Yeah. I just thought you were a great dancer. That's true. It's narrow. So this might answer the question that I was asking yesterday, because I was like, okay, the Stanley being the size of a cup holder makes sense. Why do so many of my cups not fit into a cup holder? What are you thinking? And it's for stability in non-cup-holder
Starting point is 00:20:00 situations, it would seem. Girth matters. Right, exactly. I like a good, wide cup. Ooh. Hold on. All the way down to the base. Right. Girth matters all the way down. Exactly. Take it all the way down. You know what? This fits. This fits
Starting point is 00:20:16 in everything. It does? Yeah. It fits in my car. It fits in my vagina. I tried it. A lot of things fit in there. Still got it. Every night, just making sure. I could put the baby back in. I probably could.
Starting point is 00:20:35 I had a natural birth. We didn't talk about this last time. I don't think we did, no. I had a, I should say, vaginal birth that took 20 minutes. We did talk about it off mic, though, but I think we weren't prepared to be like, yeah, 20. Yo, shout out. 20 minutes. Shout out to a quick labor.
Starting point is 00:20:54 10 pushes. Good for you. Man, 10 pushes. My doctor asked me if I would like to do this professionally. Right. And going to the bathroom with this Stanley mug really quick. That's the audition.
Starting point is 00:21:09 It was easy. You sure you haven't done this before? That was, wow. I had no issues. Also, this fits in my vagina now. Hydro Flask and Stanley. Oh, man. There you go. Jack, you gonna get a Stanley, though?
Starting point is 00:21:25 You said you're adhering. I'm thinking about it. Okay. It's definitely under heavy consideration. You haven't gotten it yet. You're just waiting for Hydro Flask to be more popular. That's what... Our household has a Stanley in it.
Starting point is 00:21:38 I am just not permitted to use it or look at it. But, you know, we have one, we've invested in a Stanley, and now, you know, it just takes some time. Whether we are a two-Stanley household is just a thing that you kind of have to have a conversation about with your loved ones. That's for every family.
Starting point is 00:22:00 That's it for every family. Yeah, to this day. The Stanley also has a rim. Do you know how easy it is to wash this rimless Hydro Flask flat face? Oh yeah. That, that... like that. It's so easy to wash this.
Starting point is 00:22:13 I just hold it under the sink and just. Yeah. Yeah. So easy to wash. No crevices. Yeah. I have to, you have to like.
Starting point is 00:22:21 Yeah. We don't have time for that. Do detailing? Yeah. I don't want to have to do detailing.
Starting point is 00:22:26 That's what it sounds like when I wash dishes. Same for me. That's definitely my internal monologue. Yeah, that's annoying. Yeah, I open, like, an old bottle and I'm like, what happened here? And then I'm like, just clean it quickly. I do every household chore like Paul Rudd picking those things up in Wet Hot American Summer.
Starting point is 00:22:51 Yeah, putting his utensils away. Exasperated, zero energy with every movement. Right. Yeah, that's how I orgasm now. Yeah. That's how I orgasm now. Honey, was it good for you? Yeah.
Starting point is 00:23:08 Whatever. Give me the Stanley. Just hand me the Stanley. Give me 10 minutes. Hand me my phone. Give me the banana phone. Hand me my banana phone and my Hydro Flask. Let me get right real quick. Get right with the Lord.
Starting point is 00:23:27 Francesca, what's something you think is underrated, though? Okay, so I have to correct something that I said last time, which was underrated: not sleeping with your cat or your husband in bed. And I realize, you know, it caused some pangs of panic. Nobody knew how to respond. Put everyone on blast that, yes, sometimes, you know, we have to sleep in different beds. Everyone gets a good night's sleep... even I got a terrible night's sleep last night. So underrated is my loving, wonderful husband, the father of my daughter, Matt Lieb, who is great, who, even though we sleep in different beds, we have a very robust... We have a perfect sex life, to quote Alan Dershowitz.
Starting point is 00:24:08 And that is who you want to be quoting when you're talking about your sex life. No, no, but just to say, look, it doesn't mean that you're going to get a divorce. It doesn't mean you don't love each other. Everyone should feel secure out there. Don't you hear the security in my voice? Mm-hmm. Yeah. Why is Matt crying in the background, though? Yeah. Miles and Jack were very... they... you guys were both awkward when I said we were, like, not sleeping in the same bed. You guys were like... But I'm here
Starting point is 00:24:40 to say that it's okay, and that I love him no matter what. And one day we're going to get one of those king beds that are, like, actually separate. I don't know, Jack. This sounds like damage control to me, fam. It 100% is. It just cut out all the times I just kept saying, hmm, trouble in paradise? Yeah, yeah, yeah. This is... no, this is 100% damage control,
Starting point is 00:25:06 and it's directed at only Matt Lieb. Yeah. Oh, man. Yeah, I definitely need my own blanket. Yeah. You know, the more I get to that, that Scandinavian style, it's like, bro, you need your own blanket
Starting point is 00:25:20 because the fights that occur over the blanket tussling? Nah. Occur? Occur. Wow. Wow. Especially because you're taller. And so, like... my Hillary Clinton: the fights, as you just said, between me and Bill over the sheets! Oh my God. I'm surprised she didn't do a tweet like that. She's like, you know, despite what is a cringe in power time,
Starting point is 00:25:52 it's like, oh no, no, no, no, no. Hillary, Hillary, Hillary. Put the Barbie down. I think it's great. I've definitely taken some time to sleep in a separate room. I think it's wonderful for your sleep sometimes. It just depends, you know?
Starting point is 00:26:10 Yeah, it really does just depend. Well, I mean, also, like, I think for anyone who, like, actually reacted to that, like, being like, oh my fucking God, they sleep... like, it's the fucking death knell. That's a fucking death rattle for a relationship. It's like, come on, y'all. Like, no, it isn't. Well, no, it's not. And that's why I think it's... but it's okay to hear. Because here's the thing: if all the audience found out after, like... or if you guys found out, anyone found out after, like, I didn't say it, they'd
Starting point is 00:26:35 be like, oh, now you're... like, why are you keeping it a secret, kind of thing. That's why I'm, like, telling my friends, so that when... you know what I mean? So that they don't find out later and they're like, I didn't realize you guys have been sleeping in separate beds. It's not about that. It's like, no, no, this is like... Did you hear? I know. I'm getting very warm even just... It's like a pro-sleep move, you know?
Starting point is 00:26:55 Right, right, right. It's just about sleep. Yeah, of course. I mean, like, for me, I get into bed, like, the second I'm about to sleep. Like, I'm not one of those people who hangs out in bed and does a ton of shit. So it's not like that's really a venue where, like, my relationship with Her Majesty is, like, we're making, like, dreams and memories and shit like that. It's more just like, yeah, here's a place I sleep.
Starting point is 00:27:17 Exactly. The rest of the house. The problem is that I think also it's a little bit of a flex. It's like saying you have like seven kids or whatever, or like sleeping in separate beds or separate rooms. It's like, you guys have two rooms? Like who has the luxury to do that, you know? Yeah. That's called the timeout bedroom.
Starting point is 00:27:31 Yeah, exactly. But, you know, it's good. My cat's still pissed. She's been peeing all over the place. Luckily, Matt does not pee all over the place to protest. But the kitty does. And that is a flex also, that you have a husband who doesn't pee all over the place.
Starting point is 00:27:45 Isn't it, though? Yeah. Yeah. Congratulations. Wow. Because, yeah, Jack and I are, like, looking nervously at each other right now. Yeah. Yeah. No, my wife would say the same thing right now. I don't pee in the linen closet. Yep. Uh, and she's very proud of me for that. Yeah, we're going to Dave & Buster's later to celebrate. I've been about that. Would you like to go to Dave & Buster's, babe? She said I can play whatever I want. Yeah, I can play the Jurassic Park game. I can play Halo for, like, five hours in a row. It's fucking awesome.
Starting point is 00:28:18 In the arcade? Yeah, dude, Halo. That's the big game. You can play that there? Yeah, you can play Halo now, and, like, they've got, like, this giant, like, widescreen experience.
Starting point is 00:28:31 Oh, they do have that big shooter game. You're right. Yeah. You play that with your kids? I don't play it, but my seven-year-old will play it until I, like, drag him away, because he is suffering from malnutrition. Like, he will just play that shit.
Starting point is 00:28:48 It's like withering away. Yeah. Like, yeah. Oh, shit. Oh, man. I just imagine a seven-year-old with like a beard, a very long beard. Yeah, like wispy though.
Starting point is 00:28:57 So wispy, but so long. It's good, though, because I always know where he is, you know. Just follow the trail of the long beard. Yeah. All right. We're going to take a quick break. We're going to come back and we're going to check in with our good friends at Exxon,
Starting point is 00:29:11 who we can trust to solve this climate thing. NBD. We'll be right back. I'm Jess Casavetto, executive producer of the hit Netflix documentary series, Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the host of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films
Starting point is 00:29:41 and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades. Jessica and I will delve into the hidden truths behind high-control groups and interview dancers, church members, and others whose lives and careers have been impacted, just like mine. Through powerful, in-depth interviews with former members and new, chilling firsthand accounts,
Starting point is 00:30:01 the series will illuminate untold and extremely necessary perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I've been thinking about you. I want you back in my life. It's too late for that. I have a proposal for you. Come up here and document my project. All you need to do is record everything like you always do. One session, 24 hours. BPM 110, 120. She's terrified.
Starting point is 00:30:43 Should we wake her up? Absolutely not. What was that? You didn't figure it out? I think I need to hear you say it. That was live audio of a woman's nightmare. This machine is approved and everything? You're allowed to be doing this? We passed the review board a year ago.
Starting point is 00:31:01 We're not hurting people. There's nothing dangerous about what you're doing. They're just dreams. Dream Sequence is a new horror thriller from Blumhouse Television, iHeartRadio, and Realm. Listen to Dream Sequence on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. It was December 2019 when the story blew up. In Green Bay, Wisconsin, former Packers star Kabir Bajabiamila caught up in a bizarre situation. KGB explaining what he believes led to the arrest of his friends at a children's Christmas play. A family man, former NFL player, devout Christian, now cut off from his family and connected to a strange arrest.
Starting point is 00:31:45 I am going to share my journey of how I went from Christianity to now a Hebrew Israelite. I got swept up in Kabir's journey, but this was only the beginning. In a story about faith and football, the search for meaning away from the gridiron and the consequences for everyone involved. You mix homesteading with guns and church and a little bit of the spice of conspiracy theories that we liked. Voila! You got Straitway. I felt like I was living in North Korea, but worse, if that's possible. Listen to Spiraled on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:32:21 And we're back. We're back. And it's frenum, frenum, not frenulum, just for the record. We knew that.
Starting point is 00:32:32 And we knew that. That's right. And we knew that. And we knew that. And we knew that all along. And it was a test for y'all. Yes. And you passed. Completely knew that.
Starting point is 00:32:39 Well done. Well done. You passed. Congratulations. Yeah. As we all know, Elon Musk has repeatedly amplified anti-Semitic conspiracy theories on Twitter, allows hate speech to proliferate all over the face of his social media platform
Starting point is 00:32:54 with some Nazi-loving accounts even earning money through its ad revenue sharing program. Wow. So in order to save face, he instituted some serious new policies to kind of clamp down on... No, wait, I'm sorry. He did? No, that's incorrect. No, he traveled to Auschwitz for a nakedly self-serving
Starting point is 00:33:16 publicity stunt. That's what he did. My bad. I was shocked. He didn't change anything of substance about how he operates. You see, wouldn't it be nice if Elon wasn't so on brand? Right. Right. Would be.
Starting point is 00:33:31 But then he wouldn't be Elon Musk. No. So beginning on Monday of this week, he participated in a two-day conference themed around combating anti-Semitism hosted by the European Jewish Association. And predictably, it was a complete shit show. He got a private tour of Auschwitz along with Ben Shapiro,
Starting point is 00:33:52 who was also at the conference and later interviewed him and just lobbed him a bunch of softballs that didn't raise any of his history. That's even disrespect to a softball. Yeah, meatballs. They were like cotton candy dreams any of his history. That's even disrespect to a softball. Meatballs. They were like cotton candy dreams in the shape of a sphere, basically.
Starting point is 00:34:11 And he had a tennis racket to just, like, that, I mean, beyond that quote-unquote interview where they're like, because you would have thought any serious examination of anti-Semitism dealing with the person who's running Twitter,
Starting point is 00:34:23 you'd be like, and also in our next segment called This You, here's some of the posts that you have been putting up for the last couple years. Really would like to talk about that, but it was not the case. You've been putting up, retweeting, liking, retweeting with comment, being like, this is weird, huh? Interesting. About like anti-Semitic conspiracy theories.
Starting point is 00:34:44 Yeah. He also brought his three-year-old. So the, you know, the website for the Auschwitz Memorial says it is not recommended that children under 14 visit the memorial, which seems like one of those things that absolutely goes without saying, but I guess needs to be said. But you see it and you're like, yeah, no, of course, of course. He brought his three-year-old and was like giving him shoulder rides like it was fucking the circus. Oh, wow.
Starting point is 00:35:14 Is it the daughter? Wait, is she 12, or whatever that name is? That's... that's his son or daughter? That's the son, according to sources who I'm looking at. Okay, how do we... Do we have a... what are we... what are we calling? How do we pronounce that jumble? No, I think you just nailed it.
Starting point is 00:35:36 She nailed it so hard that I'm not even going to attempt to reproduce it. Yeah, Shea Shea. Okay, Shea Shea. Shea Shea. Yeah. But yeah, so after the tour, he sat down for an interview with Ben Shapiro, and Shapiro just didn't, in no way, mention his history of anti-Semitism.
Starting point is 00:35:56 The talk even opened with a video produced by the European Jewish Association that imagined what the Holocaust would have been like if social media, and specifically X, had existed at the time. And ended by posing the question,
Starting point is 00:36:15 if we had had X in 1939, how many lives could have been saved? If we had X in 1939? I mean, if they're talking about ecstasy, like Molly. Yeah. Yeah, rather than like meth. Yeah, they're mainly on meth. Yeah. They're mainly on
Starting point is 00:36:33 very rudimentary meth. And I feel like generally people are less likely to commit mass genocide when they're on MDMA rather than rudimentary meth. But also, like, Twitter is stoking Nazism now. Yeah. Or X, I should say. Oh, you noticed that? Okay. Shit. Oh, I guess we didn't really think about that. But did they answer the question, how many
Starting point is 00:37:01 lives would have been saved, for that absolutely unuseful question? Useless question? They implied... they implied an answer. They had, in one of the videos, like, fake tweets. They had an official account posting about Auschwitz's thriving inhabitants, only to have that claim debunked by X's community notes. That is so fucking grim. Are these doctored tweets from, like, at Auschwitz Camp Official? Like, what? This is such a grim thought experiment. Then I'm like, who does
Starting point is 00:37:35 this fucking benefit? I mean, it's really just, like, creating more cover for X, which is, legitimately... I mean, at this point: 4chan, 8chan, and now X, or whatever we want to call this thing. It's just, like, a terrible... it's a cesspit. But yeah, community notes would have come through and been like, yeah, actually, this is the site of untold horror.
Starting point is 00:37:58 Is that what they're saying? Because the community notes, right now I feel like half the community notes are just being like, this is from a dropship company and you can get this product for much cheaper at another outlet for this consumer good. I feel like that's the most community notes I see recently.
Starting point is 00:38:13 He literally claimed, this is a quote, if there had been social media, it would have been impossible for the Nazis to hide. What? Elon Musk said that. Like, try to imagine literally any other CEO making that argument about their product. Yeah.
Starting point is 00:38:33 Like, if the head of PepsiCo was just like, you know, I think Hitler would have had a hard time rising to power with the refreshing taste of Mountain Dew's Baja Blast. Okay, well, Jack, let's not. Let's not. Just saying. Let's, that may, but let's, let's, let's create a space where that is possible, though.
Starting point is 00:38:52 You know, maybe Baja Blast may. In a, in a Stanley Cup. A bad example, because that's probably true. Yeah, yeah, yeah. That's a bad example. Bad example. Bad example. Yeah, just the wild ass shit.
Starting point is 00:39:03 If it would have been impossible to hide... like, well, right now they're not even hiding on your website. They're, like, fucking out in full force. So I don't even understand that. Or is he trying to say that, like, X... also, because of, like, sort of, like, open-source intelligence people who, like, kind of begin to, like, identify who Nazis are... are you siding with those people who actually do try and drag Nazis out into the sunlight? Like, that's what's so wild. I'm like, what version of Twitter are you celebrating, exactly? This is wild to me also because, like, you know, everything that's going on right now in terms of, just, like... talking about genocide in Israel is anti-Semitic. Right.
Starting point is 00:39:46 But Elon Musk having this conversation where he just, like, waxes poetic while he stokes anti-Semitism on his platform... It's such a weird thing, because I know, like, revenue from it... Right. And I know that people, like, at the ADL were really upset with the leadership there, because on one hand, they've called out Elon Musk's anti-Semitism, and then on the other hand they're like, they're a great partner in the fight against anti-Semitism. And it's, like, a very... it's just, like, it's really, really inconsistent. And yeah, you begin to wonder, like... because, you know, there's, like, this whole thing, right, where the Israeli government is like, we've completely lost the digital battlefield in terms of, like, sentiment on social media. And, like, there's now, like, a real
Starting point is 00:40:37 concerted effort to really address that, because they're like, I don't know what happened, like, on the internet, like... So they may see, like, having the power of Twitter, harnessing the power of Twitter, or Elon's desire to try and, like, you know, whitewash his anti-Semitism away, as, like, a potent tool to begin, like, battling that messaging. Because you also have a lot of, like, these accounts that are, like... there's, like, one called, like, at Defund Israel Now or something like that, that apparently Elon Musk, like, actually is the one who's like, okay, I'm gonna give you that, like, silver, gold, blue, like, badge or whatever... that apparently he has to
Starting point is 00:41:15 have, like, a say in verifying. And that account is, like, basically doing all this stuff to be like, Hitler's talked about so, like, poorly, but meanwhile there's a real genocide happening that Jews are doing. You're like, what is this? Oh, wow. Oh, my gosh. So it's like, he's, like, playing every single angle. And I think cynical people are saying... not that accounts like that are cynical, but, like, the cynical read on promoting an account like that is to just tie any stance that is anti-apartheid or anti-genocide as being part and parcel of, like, full-blown Nazi stuff, like, right, full-blown anti-Semitism. Yeah. To completely tie those
Starting point is 00:41:57 two ideas together so they're, like, inextricable. So then, like, the shorthand for people is to be like, oh, you're saying that? That means you're, like, actually a Nazi. And they also wouldn't have been able to send tweets, because the Nazis revoked their right to free movement long before the death camps were built. And also, much of the world did know about what the Nazis were doing, but turned a blind eye, which I think is important to keep in mind at this time, because that feels like what we're going through right now: a lot of the world just kind of turning a blind eye and just being like, well, that's not really my problem, is it? Yeah. Well, it's weird, too, because social media has
Starting point is 00:42:51 allowed more people to sort of engage with the topic, while governments... for sure. That's why I feel like there's such a tension existing in many countries, where people are like, I'm sorry, what's our part in this as a nation? Can we do something about that? And they're like, oh, you saw that? Uh, we were just hoping to, like, let that pass until there's some other global controversy that can kind of keep this thing moving. But yeah, just to say, like, the tweets would have changed everything is just disingenuous, and just makes, like, an utter mockery of, like, everything that's happening. Because right now, I feel like, while a lot of media is unable to really contend with what's happening, especially in Gaza and the West Bank, now it's more like everyone's being like, did you see what Hillary Clinton had to say about Greta Gerwig and Margot Robbie, you know, getting snubbed? And, like... it's just, we're in a bizarre upside-down world. Yeah. But I mean, he did make kind of an airtight case that he can't be doing anything that would be confused with anti-Semitism, because
Starting point is 00:43:59 he said that he has Jewish friends. Oh, no. Oh, yeah. Then that... okay. No, actually: two-thirds of my friends are Jewish. I'm, like, Jewish by association. Yo. Problem.
Starting point is 00:44:17 He said... the full quote is... then he said, I'm aspirationally Jewish. Dude, you are out here, like, retweeting that kind of stuff, where you're sharing that fake-ass, not-real Voltaire quote about being like, you should be worried... It's to the effect of, you have to think about the people you are not allowed to criticize in a society.
Starting point is 00:44:38 And that's where you know where the power lies. But that really just comes from an anti-Semitic fucking creep. But they're like, that was actually Voltaire. Like, you're doing that kind of, as Voltaire once said, and then quoting a straight-up, like, 4chan... Yeah, you're like, uh, that was from a Reinhard Heydrich speech that he gave to the... Okay, whatever, sure. But despite his, uh, oddly specific fraction of Jewish friends and, uh, calculated photo ops, uh, X: still not solving its anti-Semitism problem. Several anti-Semitic posts on X, which have been identified as anti-Semitic, moderators have refused to delete, claiming that they do not violate the platform's rules.
Starting point is 00:45:18 So they've been reviewed and not deleted. There's been a spike in anti-Semitic posts in the country that Musk just visited, in Poland, because of an incident in December in which a far-right MP used a fire extinguisher to snuff out a menorah during Hanukkah, which was a major news story and inspired a slew of white supremacist memes. And they're just like, yeah, I mean, what are we going to do? You know... so they're just very selective in where they care about this. That's why it's so... this is so fucking dangerous, to play around with what is or is not hate speech. You know what I mean? Yeah.
Starting point is 00:45:58 Like, it's completely... Well, it's going to lose meaning, because it has almost... yeah, like, to the point where, like... it really blows my mind, because I do not see how beginning to weaponize anti-Semitism in, like, a very cynical way makes anyone safe. And it's just used to sort of, like, stop any kind of discussion or debate or dissent or whatever. But meanwhile, you have somebody who is so open about, like, what their, like, philosophical view is, supposedly being, like, the standard-bearer for the fight against it. It's just like, what on earth? Yeah. For me, I just... I only see this getting worse, like, to begin to just
Starting point is 00:46:45 fuck around like this constantly. But again, I think, like, there's clearly been this thing of, like, online, the sentiment, whether it's, like, young people or whatever it is to blame... people's just general disgust for what is happening, or them being completely taken aback by the violence that's happening against innocent people. And I don't know, I'm just like, well, this is the problem with, like, the FCC not being on top of this shit. Right. Right. And this is why I just want to bring us back again to the fact that authenticity sucks. We don't need to hear about these, like, hate-filled assholes and their anti-Semitic bullshit. Like, they don't get to have this much voice and presence and energy. Like, we are supposed to censor shit like that, you know? Or is it censure? You know, or both. Like, that's not okay, and you don't get to just,
Starting point is 00:47:47 like, walk around... And in fact, half the reason why we're having to deal with this all again and again and again is because X isn't on top of regulating it as well as other media platforms. Yeah. That's why I'm like... yeah, I remember in the beginning, it felt like the EU was really being like, yo, you need to fucking answer for the kinds of garbage that's on this website, because, like, we see that as, like, a threat to our, like, stability here. But I'm not sure, like, where that's headed. And yeah, I don't know. Again, I mean, I understand why the U.S. especially doesn't have a reckoning with hate speech: because it's so, such, part and parcel of the culture that there's no way, like, that we could begin to do that without having people come out of the woodwork and be like, it's all of our free speech. Not to say that people need to be... possibly... we can't even get gun control together.
Starting point is 00:48:40 Yeah, right. Exactly. They're like, man, we can't even control objects like that, that we could easily control. Yeah. Like, words? Ah, you got us, man. We can't even fucking... But there's no regulation. Yeah. And yeah, like, I don't know. I mean, the FCC obviously is dealing with stuff that's, like, broadcasted, so I don't know who, you know, obviously, right, is the real regulator there. But I mean, we thought that maybe the advertiser exodus would do something, but it just seems like now we're just watching it fall apart. It's like watching, like, a star collapse on itself. And then it's probably going to end in something really fucking gross.
Starting point is 00:49:18 What I'm saying, Miles, is, I don't know why they don't let me do it. Let you be the charge ass-kicker on Twitter? Yeah. I would be really good at it. You don't get to play here anymore. Yeah. And that is the banana phone's final word.
Starting point is 00:49:34 Click. Ring, ring. Yeah. All right. Let's take a quick break and we'll be right back. Hello, everyone. I am Lacey Lamar. And I'm Amber Ruffin, a better Lacey Lamar. Boo.
Starting point is 00:51:52 Okay, everybody, we have exciting news to share. We're back with season two of the Amber and Lacey, Lacey and Amber show on Will Ferrell's Big Money Players Network. You thought you had fun last season? Well, you were right. And you should tune in today for new fun segments like Sister Court and listening to Lacey's steamy DMs. We've got new and exciting guests like Michael Beach. That's my husband. Daphne Spring, Daniel Thrasher, Peppermint, Morgan J., and more. You gotta watch us. No, you mean you have to listen to us.
Starting point is 00:52:23 I mean, you can still watch us, but you gotta listen. Like, if you're watching us, you have to tell us. Like, if you're out the window, you have to say, hey, I'm watching you outside of the window. Just... you know what? Listen to the Amber and Lacey, Lacey and Amber Show on Will Ferrell's Big Money Players Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. We're back. And one of the things that your new book, Dr. McInerney, talks about is what is good technology, and is it possible? And I feel like I'm so used to this hyper-capitalism paradigm that I don't think we can have technology without having loss of jobs and free will. But I've seen this recent reappraisal of the Luddite movement,
Starting point is 00:53:16 which is just a phrase that I grew up using for, like, anybody who didn't want to use a computer, was slightly resistant to technological progress. And now people are, like, pointing out, no, they didn't just want to destroy all machines. They were focused on the ones that took jobs and led to wage losses. But we turned them into, like, old man screaming at cloud, because of our paradigm of, like, yeah, but that's counter-progress. That's unrealistic. So what is my closed-off capitalist mind missing out on when I think about the direction that technology
Starting point is 00:54:18 can take? What are the good things that aren't just basically AI being McKinsey? Yeah, no, no. I mean, first, I do love this reappraisal of the Luddite movement. I know Brian Merchant has a book out called Blood in the Machine, which is specifically trying to reframe the Luddites, this movement against automation in the UK, as the origins of the revolution against
Starting point is 00:54:18 big tech. And I do love this because I do think the Luddites have been unfairly maligned as these tech haters. But yeah, second, you know, so myself and my work wife, Dr. Eleanor Drage, co-edited this book called The Good Robot, which is the same as our podcast, on this like provocative question. And we mean it very much as like a suggestion or idea, not like an inevitability that technology, you know, maybe can be good.
Starting point is 00:54:42 In a lot of spaces, that definitely doesn't sound like a particularly radical idea, particularly in the, like, tech hype spaces we've discussed. But, like, for people like us who spend our day-to-day looking at these really awful effects of technology, like, either because they've been designed to be really awful, like, predictive policing tools, or because they're being exploited and used in, like, really harmful ways, like the way that technology is used to perpetrate gender-based violence. It can be really easy to be unable to see any kind of positive possibilities for a lot of these new technologies. But while I definitely, you know, think that there's a real place for just the total refusal of people like
Starting point is 00:55:19 the Luddites or the Neo-Luddite movement, which is kind of trying to bring back a lot of these ideas, we wanted to challenge ourselves and all the guests we have in our podcast to say, what would it mean for technology to be good? And so for us, that's feminist and pro-justice and informed by all these different kinds of ideas about equality and fairness. And also, what would that look like sort of grounded in our everyday lives? So for me, for example, a lot of thinking about good technology is trying to reclaim technologies that we might not think of as being very high-tech, often because they've been associated with women. So I knit and I have a pair of knitting needles on the table next to me. And knitting is often not understood as being
Starting point is 00:56:00 like a very high-tech practice. But in the 1980s, when people were trying to get more girls and women back into computer science, there was this idea that if you can knit, you can code. Because if you can read a knitting pattern, then you can use a coding program. And so sometimes now at computer science conferences, you'll see people put a knitting pattern up on the screen and they'll say, what coding language is this?
Starting point is 00:56:21 And then usually only one or two people, often one of the few female attendees, will say, oh, that's a knitting pattern, because they're the only ones who can understand that kind of code. So I think there's something really beautiful in reclaiming those particular kinds of technologies that have maybe been excluded from the way we talk about tech.
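To make that point concrete: a knitting row like K2, P2 really can be read as a tiny program, which is the whole "if you can knit, you can code" bet. A minimal sketch, assuming nothing beyond standard Python and a made-up two-stitch vocabulary (this illustration is ours, not an example from the book or the conferences):

```python
import re

def expand_row(pattern: str) -> list[str]:
    """Expand a knitting pattern row like 'K2, P2' into single stitches."""
    stitches = []
    for token in pattern.split(","):
        match = re.fullmatch(r"\s*([KP])(\d*)\s*", token)
        if match is None:
            raise ValueError(f"unrecognized stitch token: {token!r}")
        stitch, count = match.group(1), int(match.group(2) or 1)
        stitches.extend([stitch] * count)  # "K2" means knit two stitches
    return stitches

print(expand_row("K2, P2, K2"))  # ['K', 'K', 'P', 'P', 'K', 'K']
```

Reading the pattern is parsing, the repeat counts are loops, and a bad token is a syntax error, which is exactly the skill set a coding class teaches.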
Starting point is 00:56:38 from the way we talk about tech. On my, again, work wife who co-edited this book, well, she edited most of it actually, so who really did the heavy lifting on this book, Eleanor talks about the WISC as her example of a good technology. And she says she loves the way it looks. She likes how she can use it in all these different ways. And she says, she's sure there's ways that you can misuse this, but it's something that just makes her life better and is designed well in a very simple way. I unfortunately have already undermined this good technology for her because I then told her about when I was about 14, my school went to a trip at the technology museum in Auckland.
Starting point is 00:57:15 It's called Motet. If you're from New Zealand, everyone in Auckland's been to this museum because there's not that much to do in Auckland for a school trip. not that much to do in Auckland for a school trip. And then one of the girls in my class wound the hair of another girl into like an antique egg beater. Then they couldn't get her out. She got stuck. And so, you know, children can make all technologies bad. But apart from that, you know, I think like trying to find these like little examples of technology that aren't about the kind of like
Starting point is 00:57:42 big hype of AI, but maybe bring us back into the ways that we use technology to reshape our worlds and make things a bit better, is what I like to do with this question. When you think of it, you know, I think the one version is like, well, this generative AI, it democratizes certain things. And I think while on one hand it may allow people access to create things, it's who wields it that ultimately determines whether or not a technology is good or, you know, used in a positive or negative way. Like, for all the people that are preaching and proclaiming about how AI is opening the door to something new, what, as it relates to these large language models, is that more about a use case, or do we need to lean more into the regulations to make sure that AI isn't wielded by nefarious powers? Like, how do you look at that specific technology and think, okay, while there's definitely a lot of biased or weird uses of it, there's also another way to look at this and not just kind of lean into the Skynet version?
Starting point is 00:58:54 Yeah, I mean, I would take this back to who makes these models, who has control over them, and who can afford to. Because one of the big changes, I think, that's happened in the last few years is that language models have gone from much smaller models, that maybe one researcher with a reasonable budget could train themselves in a lab, to being these absolutely huge models that you need a massive amount of energy, a massive amount of data, and a massive amount of money to create. And so what that means is that companies with a first mover advantage, like Google, like OpenAI, are the ones who can afford to make these models. And I think increasingly, it's going to be harder and harder as the models get bigger for small firms to enter that market. So what we end
Starting point is 00:59:42 up with then is a monopoly. And I think we're starting to see some of the effects of that monopoly right now, when you have a few big tech firms kind of having a hand on most of the most powerful and effective models. And so I think, even though people say, oh, this is going to democratize AI because everyone can generate text with these models, it's like, yes, but, you know, very few people are profiting from it. And also, I think, very few people then have control over how long we're able to use those models for. One day, will they just all be turned off, or will they be shifted in a way? So I think there's still kind of a concentration of control.
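Her cost point survives a rough back-of-envelope check. A common approximation puts training compute at about 6 × parameters × training tokens; the sketch below plugs in GPT-3-scale numbers, and the throughput and price figures are illustrative assumptions, not figures from the episode:

```python
# Rough sketch of why frontier-scale training shuts out small labs.
# Uses the common ~6 * params * tokens estimate of training FLOPs;
# every number below is an illustrative assumption, not a quoted figure.
params = 175e9               # a GPT-3-scale model: 175B parameters
tokens = 300e9               # roughly 300B training tokens
train_flops = 6 * params * tokens

gpu_flops_per_sec = 150e12   # assumed sustained throughput per accelerator
gpu_cost_per_hour = 2.50     # assumed cloud price per GPU-hour, in USD

gpu_hours = train_flops / gpu_flops_per_sec / 3600
print(f"training compute: {train_flops:.2e} FLOPs")
print(f"roughly {gpu_hours:,.0f} GPU-hours, "
      f"about ${gpu_hours * gpu_cost_per_hour:,.0f} at list price")
```

Even with generous assumptions, that is a seven-figure bill for a single training run, before data, staff, and the runs that fail, which is the monopoly point in miniature.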
Starting point is 01:00:25 Yeah. And I think second, you mentioned kind of biases and weird stuff in the models. Super important. Large language models are trained on data scraped from the internet, and the internet can be, as we all know, not the best place, right, for all kinds of information. But it's also full of a lot of exclusions. So, for example, again, when I was working, talking to these data scientists and engineers at this big tech firm, we would ask things like, oh, well, where do you pull your training data from? And they'd say, Wikipedia, for example. And, you know, we'd say, oh, but Wikipedia is not a very equitable place. Women are really, really vastly underrepresented on Wikipedia pages, both in terms of who writes them and also in terms of who gets Wikipedia pages written about them. So the physicist Jess Wade here in the UK has had this long-running project where she just adds a woman to Wikipedia, like, every day. And she's done that, I think, now for years.
Starting point is 01:01:15 But it kind of just shows how inequitable that distribution is. If you're training a model on data from Wikipedia, implicitly, even though you might not be trying to do this in any way, you're also training that model to believe in a world where, say, women make up 20% of the population, not 50% plus. So there's a lot of biases and harms that come just from exclusion. Another example of this is, I have a good friend who is a linguist, and something she talks about is that communities that don't have written languages are already automatically just not going to be able to partake in whatever benefits might come from large language models, whether that's signed languages or languages that are only oral.
Starting point is 01:01:55 And so, you know, I think there's just a lot of different ways that, even beyond these kinds of immediate harms, of, like, the AI has produced something that we think is really offensive or gross, we could see the use of large language models maybe creating further inequities.
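The Wikipedia skew she describes is the kind of thing a pre-training audit can surface before a model absorbs it. A toy sketch, with invented records standing in for real corpus metadata:

```python
# Toy audit of who is represented in a biography corpus. The records
# here are invented; a real audit would stream metadata from the dump.
from collections import Counter

biographies = [
    {"title": "Ada Lovelace", "gender": "female"},
    {"title": "Alan Turing", "gender": "male"},
    {"title": "Claude Shannon", "gender": "male"},
    {"title": "Grace Hopper", "gender": "female"},
    {"title": "John von Neumann", "gender": "male"},
]

counts = Counter(record["gender"] for record in biographies)
total = sum(counts.values())
for gender, n in sorted(counts.items()):
    print(f"{gender}: {n}/{total} ({n / total:.0%})")
```

The counting itself is trivial; the hard parts, which is rather her point, are deciding what to count and doing it before training rather than after.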
Starting point is 01:02:16 Now, a lot of the AI promise, like the developments that are being promised. So even at Davos, like this past Davos, which I had to miss, unfortunately, and I hate to miss Davos because I learned so much there. But you've been to four Davai, haven't you? Davai, yeah. My fourth Davai was the best, man. We all did molly and just had a cuddle puddle. But Sam Altman, I don't know if he's, like,
Starting point is 01:02:39 consciously scaling back people's expectations, but he was like, soon, you might just be able to say, what are my most important emails today, and have AI summarize them. I was just like, alright.
Starting point is 01:02:56 Doesn't Outlook already offer, like, a shitty version of that already? It just feels like, the versions of AI that I'm hearing about, there's this older New Yorker article that was like, I'm not that worried about AI, I think, from the kill-us-all perspective. I think it's going to be like a little McKinsey
Starting point is 01:03:15 in everyone's pocket. Like, it's going to be this economic optimization tool that everybody has access to, and that's going to just make everything shitty and boring. So I don't know, I'm just curious for your thoughts on that, and if there are examples of just, like, functionality from AI that actually capture your imagination, where you're like, oh shit, that would be cool, that's a cool idea, something that would be fun and, you know, improve people's lives, even if it's just, like, make their video games better or whatever. Yeah, I mean, maybe
Starting point is 01:03:52 the email thing appeals to some people. Personally, I want fewer emails. I don't want a summary of my emails. I just want my inbox to quietly shut down between the hours of, like, 5 p.m. to 10 a.m. every day and just be like, I'm email free. And then the writer Ian Bogost, I think, has this idea of hyper-employment, which is, like, the technologies that say they're going to make our lives easier and more stress-free actually make our lives much busier, and we now waste a lot more time. So he talks about emails as a way of saying, like, oh, we're gonna have far fewer meetings, and we're gonna have to spend less time sending each other letters or whatever. Sorry, I'm from the post-internet generation. But then now we spend so much time answering emails. And I think AI feels a bit like this. Like, when people say AI is going
Starting point is 01:04:40 to save you so much time, I'm like, you are not a teacher and educator, because the amount of time we have wasted this year trying to figure out what to do with AI-generated essays? Like, absolutely not. Yeah. And so I feel like things like the AI email summarizer, I sense, could end up in a very similar pile. But, you know, kind of to come to the more positive side of your question, like, what makes me excited? I think a couple of things. One is anything where AI can genuinely scale up, in a way that is not too ecologically damaging or costly, a process that is already going well, where the statistics and the procedures in place are working for us, because AI is able to scale things. It's not necessarily able to do new things, always.
Starting point is 01:05:26 So if we know we have a sorting or categorization process that works, that's when I think AI, computer vision, these kinds of systems can be really useful. Where it doesn't work is when you're asking AI to do something that we don't actually have good processes in place to do. So, like, when a tool says, oh, I can tell a candidate's personality from their face? Like, no, you can't do that. That's just straight-up phrenology. Please don't do that. But also, secondly, if we had easy ways of telling if someone's going to be good for a job, humans would be able to do it already. This is much, much more complicated than you're making it out to be.
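That scale-what-already-works distinction maps neatly onto ordinary supervised learning. A sketch, assuming scikit-learn and a pile of support tickets that humans have already sorted; the tickets and labels below are invented, and the human labels stand in for the existing, working process:

```python
# Sketch of AI scaling a categorization process that already works:
# humans defined the categories and sorted the examples; the model only
# extends that existing process to new items. All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "refund not received",         # sorted by a human as: billing
    "card was double charged",     # billing
    "app crashes on launch",       # bug
    "login button does nothing",   # bug
]
labels = ["billing", "billing", "bug", "bug"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tickets, labels)
print(model.predict(["charged twice for one order"]))  # likely ['billing']
```

The contrast case she flags, reading personality off a face, fails precisely because there is no working human process underneath it to scale.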
Starting point is 01:05:55 This is much, much more complicated than you're making it out to be. A second kind of use that I think I find really exciting or makes me like, you know, really happy, I think, is particularly tools around trying to kind of like support particular communities as needs in a way that is really driven by that community. So, for example, in New Zealand, where I'm from, there's been a lot of effort put into different kinds of like AI powered tools and data sovereignty programs around Maori traditions, the Maori language or Te Reo Maori. And like, I think, you know, this is an example of where like that's been led by Maori people and is in a response to kind of the way that in colonial New Zealand, like Te Reo Maori was like very deliberately stamped out. And there's been a huge movement to try and kind of protect and revive the language.
Starting point is 01:06:45 And I think it's when you have projects like this that I get a bit more hopeful about the way that AI, machine learning, could be used to, you know, promote these pro-justice projects. But, you know, I think those projects always have to exist in a little bit of tension with big tech. And we've seen this with other organizations, for example, Masakhane, which is an amazing grassroots organization which aims to bring the, like, 4,000 different African languages into natural language processing models or large language models. You know, but I know that these kinds of groups often do struggle with this idea: do we commercialize, because then will we be brought into this hyper-capitalist world? Do we keep this to ourselves?
Starting point is 01:07:27 But yeah, I think it is important sometimes to step back and be like, there are really interesting community projects which are trying to use these techniques and these kinds of knowledge in ways that push back against the email-summarizing bot. Right. Is there a historical precedent? I was reading some articles about the competition between the United States and China, and how the U.S. is trying to freeze export of certain chips to China, because they think those chips would allow China to catch up with them. And it feels like there's inevitably going to be an argument where they're like, we need to just go pedal to the floor on AI
Starting point is 01:08:07 development, because this is the new Manhattan Project, trust us. You talk a lot about this alternate possibility of, like, what if we didn't use this technology for hyper-capitalism and militarism. Are there examples from history that you're aware of where technology has successfully been protected from those sorts of things? Are there any, even if very small, examples where people have been able to keep technology fenced off from that sort of thing? Yeah. No, I mean, this is something that really preoccupies me. I spend a lot of time mapping and tracking, with the AI Now Institute, this narrative of an AI arms race between the US and China, and how that story is super damaging, because it can cause this race to the bottom and lead to us
Starting point is 01:09:04 trying to develop AI faster and faster without necessarily trying to make it better or safer. And I think we've seen some positive movements when it comes to AI regulation recently from the US's commitment or declaration around AI through to the Bletchley Declaration in the UK and the EU AI Act. But at the same time, you know, I think that this sort of racing narrative like still lurks and is still sometimes used to try and push back against regulatory measures, particularly by people with investments in big tech. But yeah, at the same time, I also think this question of like, can we look to history to find ways to tell different stories about AI, maybe bring about different futures? It's something that really interests me. I think the comparison that is most often made between AI and another technology when it comes to regulation and governance is nuclear and specifically nuclear weapons. And like you mentioned, Manhattan Project, certainly kind of this sort of language of like an Oppenheimer moment when it comes to AI and this
Starting point is 01:10:05 kind of idea of being on the cusp of a new Cold War that will be AI-driven rather than nuclear-weapons-driven is a very common media narrative. But something I try to do with my own research is to look for and support different kinds of historical analogies that maybe offer less hawkish futures when it comes to international politics. So, for example, Maya Indira Ganesh, who's a fantastic researcher at the Leverhulme Centre, where I'm at, looks to histories of feminist cyber governance in the early 2000s as a way of saying, actually, we have a lot of precedents for thinking about the ethics of the internet; why aren't we bringing this into thinking about AI? Or Matthijs Maas, who's a legal researcher, looks at
Starting point is 01:10:50 histories of technological restraint. So, like, when did we choose not to make a technology, even though we could, because we thought that it actually wouldn't be good for the world and for societies? And so, I think, you know, making sure we have examples outside of nuclear, because while nuclear can still be useful in some ways, it's only one metaphor, and metaphors are inherently limited. They tell us something about the world, but they can't tell us everything. And I think having these alternative historical examples can be really useful for thinking a bit differently. Yeah. Yeah, so, yeah, it's just funny, because in the end, it always feels like anything with potential to create unforeseen levels of productivity or power, it's like, and then we made it a weapon, then we put it in the bomb, you know? And, yeah, we really do have to sort
Starting point is 01:11:36 of break out of that thinking. I mean, I wonder, precisely because our brains are filled with Skynet and Oppenheimer and the Manhattan Project, whether we're just in this really weird pattern where, whenever we look at something that has the potential to unlock new levels of something, we're inherently always going to be like, but how do our enemies kill us with it? And then we begin to lose the plot there. So yeah, I look to these other examples to try and, again, open my mind to looking at it as less of, and then how they make global domination with that. Yeah, I mean, I think it's either,
Starting point is 01:12:15 it's like, how do we kill someone with it? Or, as I think the history of how tech is represented in Hollywood would show us, like, how do we have sex with that? And this is a very classic trope in sci-fi, right? It's like, you get a dude in his basement who makes, like, a sex bot.
Starting point is 01:12:29 And I remember interviewing Jack Halberstam, who's this very famous feminist and queer theorist, with Eleanor on the podcast. And he was talking about the film Ex Machina. Have you seen this? It's from, like, 2014. It's a kind of an indie film,
Starting point is 01:12:43 but had quite a lot of prestige, I think, success. Yeah, in tech circles, yeah, for sure. Yeah, yeah. And so I remember we were talking about Ex Machina, and Jack Halberstam was saying it sort of shows the limits of the imagination, particularly, like, the tech bro imagination: he has all this expertise at his fingertips, all this data, and he basically just makes, like, sex robots, and that's all he can really think to do with it. And I think, you know, to some extent, we're still a little bit trapped in that imagination,
Starting point is 01:13:14 which is why I think like both like different projects to do with AI, but also different stories about AI are really crucial. Right. Yeah. We have to get out of the fuck or fear paradigm. Right. We have the technology. It's going to do one of the two, man. So,
Starting point is 01:13:31 yeah, we need new ways to look at it. I'm here to do two things. Fuck something or kill it because I'm scared of it. Yeah. Alright, that's going to do it for this week's weekly Zeitgeist. Please like and review the show if you like the show. It means the world to Miles.
Starting point is 01:13:52 I hope you're having a great weekend, and I will talk to you Monday. Bye.
