The Daily Zeitgeist - Weekly Zeitgeist 292 (Best of 9/18/23-9/22/23)

Episode Date: September 24, 2023

The weekly round-up of the best moments from DZ's season 305 (9/18/23-9/22/23).

Transcript
Starting point is 00:00:00 I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil: The 7M TikTok Cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me for I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church. Listen to Forgive Me for I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:00:30 I'm Cari Champion, and this is season four of Naked Sports. Up first, I explore the making of a rivalry: Caitlin Clark versus Angel Reese.
Starting point is 00:00:39 Every great player needs a foil. I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports.
Starting point is 00:00:48 Listen to The Making of a Rivalry: Caitlin Clark versus Angel Reese on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Capital One, founding partner of iHeart Women's Sports. Curious about queer sexuality, cruising, and expanding your horizons? Hit play on the sex-positive and deeply entertaining podcast, Sniffy's Cruising Confessions. Sniffy's will broaden minds and help you pursue your true goals. You can listen to Sniffy's Cruising Confessions, sponsored by Gilead, now on the iHeartRadio app or wherever you get your podcasts. New episodes every Thursday. Hello, the internet, and welcome to this episode
Starting point is 00:01:34 of the Weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one nonstop infotainment laugh-stravaganza. Yeah. So, without further ado, here is the Weekly Zeitgeist. Miles. Yes. We are thrilled, fortunate, blessed to be joined in our third seat by an assistant professor of technology, operations, and statistics at NYU,
Starting point is 00:02:08 where his research focuses on the intersection of machine learning and natural language processing, which of course means, you know, his interests include conversational agents, hierarchical models, deep learning, spectral clustering. You guys are all probably finishing this sentence with me. Yeah, of course. Spectral estimation of hidden Markov models. Yes, obviously. And of course, time series analysis. Please welcome Dr. João Sedoc. Hi, everyone.
Starting point is 00:02:41 Hi. Thank you. Yeah. Thank you, Jack and Miles, for having me. It's a pleasure to be here. Pleasure to have you. We're glad you're here because we got a lot of questions about a lot of stuff. Some mostly to do with what your area of expertise is.
Starting point is 00:02:58 Some things I just need a second opinion on. The career of Allen Iverson. Is it a bad joke if it's been used in, like, a cable commercial? I believe a DirecTV commercial is using that. They're like, and we have our very own AI that helps you select shows. And then Allen Iverson's sitting on the couch with the person. Wait, does that make it off limits? Yeah, yeah. That's a real commercial.
Starting point is 00:03:20 Wow. And I'm going to be running with that joke all day today, because I'm not any better than the comedy writers at DirecTV, I guess. I don't know what the fuck company it is. Yeah. But Dr. Sedoc, is that good? Dr. Sedoc, is that a good way to address you? Yeah, sure. I'm very easy about things. Okay. So, yeah. How about J Money? Doc? Jowl? Well, I've never been called that. I had some students in my class call me J Wow.
Starting point is 00:03:55 J Wow? Probably, yes. Yeah. Is it Jowl? So, the pronunciation, it's Brazilian Portuguese. So João, that's what I'm named after, but I go by João. Okay. So, yeah, very easygoing about it. Dr. J, even? Dr. J.
Starting point is 00:04:16 Oh, no, no. Rock the cradle, Dr. J. Amazing. Amazing. All right, well, yeah. I mean, the main thing we've brought you here today for is to ask the big questions around AI, such as: what is that? What is AI? And we'll follow from there. But, you know, questions that reveal that we're smart people, obviously. Hey, Molly, what's something from your search history that's revealing about who you are? Well, right now it's Crossroads by Bone Thugs- Bone, bone, bone, bone, bone, bone.
Starting point is 00:04:56 Yeah, you did point out that actually the first lyrics of the song are bone, bone, bone, bone, bone, bone, bone. Bone, bone, bone, bone. Bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone, bone. I was also just reading about remembering about the video that there's a part where there's like a newborn baby that's died. Yeah. And people love when newborn babies die in their music videos. And the reaper takes them all to heaven. Yeah.
Starting point is 00:05:22 Wait, isn't there one part where, like, someone touches the dude's eyes? And no, they touch the forehead, and the eyes go black. Oh, that's right. That's how you know that God's got... Yeah. That to me, I was like, I don't know if I want God to get me like that. That seems pretty... Look, I'll be real. I won't pretend to have looked up something cool. The most recent thing I searched for was the pop star Tate McRae, who has a viral TikTok song. So I was like, who is Tate McRae? Please help me. She has a viral TikTok song.
Starting point is 00:05:53 Her name is Tate McRae. Okay. And she's a Canadian pop star. And so the video is set in a hockey rink because it's about. Oh, hell yeah. She's getting back at her hockey player boyfriend who just cheated on her. Oh, no.
Starting point is 00:06:10 Wow, I've never seen someone look more like their name. Yeah, look more like a Canadian pop star named Tate McRae. Yeah, like I was like, you just showed me like, who's this person? I'm like, that's Tate McRae, Canadian pop star. She's BFF with Olivia Rodrigo.
Starting point is 00:06:26 Oh, okay. But she just had a song that went really viral on TikTok. And so I was like, who is this person? Okay. The song is called Greedy. And she's put out a video for it. The video looks like it is a Canadian spoof of Hit Me Baby One More Time. It does look like that. It's really funny.
Starting point is 00:06:46 Like, it's like, what if Hit Me Baby One More Time, but Canada? Yeah, kind of. It's like if the Grimes video wasn't fascist. It's like, I like when there's like a sports pop music video. Oh, like that first video she had at the Monster Truck Rally or whatever the fuck that was? Yeah, where she was like, I love arenas. Yo, wait, where did they shoot this video? Is she based in LA?
Starting point is 00:07:09 Who, Tate McRae? Yeah. Why, is it a hockey rink you recognize? Yeah, yeah. This is where I used to play hockey in Burbank, I think. Of course it is. They shoot everything there. Oh, it's Pickwick?
Starting point is 00:07:22 Yeah, yeah. The Pickwick ice skating place. I just went there for the first time. Dang. Yeah, there was a big hockey boom in Los Angeles because of the Kings. Yeah, yeah. You know this, but children of the Valley loved to go play hockey, in a climate where hockey has never been played. Yeah, it's true. It's true. It's so true. And I remember, honestly, I think I was one of the first Blasian players to ever play hockey, because this was, like, in the fucking 80s. I was getting on the ice, and then Kurt Russell and Goldie were like, you just improved the game a thousand percent.
Starting point is 00:08:00 It was... we were such a ragtag group. It was, like, us, like some Armenian kids, some Canadian kids whose parents were like... That's the plot from Mighty Ducks, by the way. Yeah, exactly. Exactly. The Blasian kid. It was just a ragtag group. This coach came back. He had just gotten a DUI.
Starting point is 00:08:15 One of them was a girl. That movie's crazy. We had two Mexican kids, Jose and Joel. Yo, the team photo looked fucking lit. I don't know what the fuck was going on. Bro, this is a movie. Yeah, it was. And it was 88.
Starting point is 00:08:28 It's literally the Mighty Ducks, but also it should be a movie about you and your ragtag group of valley friends. Going to New England and getting real culture shock. Desert hockey. Playing against the rich kids who have all the hockey gear. Yeah, right. Exactly. Right. Like Minnesota.
Starting point is 00:08:44 They're just, like, mean Canadians. People want to root against Canadians. This video is funny because, like, I've never seen anyone do the aesthetics of hockey in a pop music video before. Yeah. And it's, like, so Canadian, obviously. But there's a part where she's wearing, like, a goalie glove. It's like, she's dressed sexy, but she's wearing these, like, pieces of hockey gear. It's really funny. Yeah. It's like she's wearing, like, a crop top and basketball shorts, kind of.
Starting point is 00:09:13 Yeah. There's, like, this lone old, like, Gordie Howe hockey glove that she's, like, holding, and I'm like, yo, that shit is so old looking. Okay. Okay. Go ahead. It's like Jason Voorhees' hockey gear is a little outdated. Yeah, it's really funny. It looks like the Thanos glove or whatever, too.
Starting point is 00:09:31 It's just like, it's not sexy. And then she's driving a Zamboni at the end. She's driving a Zamboni. Jamie Loftus content. The same Pickwick. Jamie Loftus' influence is felt across much of the pop music landscape. Yeah, it's very Jamie Loftus coded of Tate McRae. It's kind of funny, like, the last shot being her driving a Zamboni, like, very...
Starting point is 00:09:54 like, in a very weird way. Yeah. And the quote was like, it's about being confident and sassy, taking control. It's pretty funny. It's funny. And it's a pretty good song. I heard it on TikTok a million times and was like, what is this? And now I know. You know, it's Tate McRae. Now our listeners know, and that is the sort of shit that they were not going to hear from us, I can tell you that much. So this is a great search history. You are doing... I'm bringing Pop Corner. Yeah. Well, it's just, like, usually the things... Okay, can I tell you the other thing I looked up? Oh, this is more the type of thing I normally look up. But I found out that... well, maybe... whatever, never mind. I'll tell
Starting point is 00:10:39 you guys later. Okay. I like that. All right. I don't want to make this part go too long. I'll bring it up in the jet fuel part. Okay. Uh-oh. The jet fuel doesn't melt. 9-11 was last week. What's going on? What's something you think is overrated? You know what I think is overrated is celebrity worship culture in America.
Starting point is 00:11:01 It is terrible. What is that? But we gave up on God, so we need new gods. Just live your own life, though, right? It's like, really? Yeah, I don't know. I don't get it. What aspect... I mean, what spurred this? What did you observe where you're like, we're doing this again, huh? We're bowing, prostrate at the feet of the Kardashians. It's just a tiny bit personal too, but we'll get into it later. But yeah, I mean, I don't want to get into specifics, because I don't want to name names. But for instance, it's like, I'm an artist,
Starting point is 00:11:30 Right. And I'm very much an independent artist. I do sometimes work on mainstream things, but a lot of the art that I create is very much independently done. So I have friends who are like, oh my God, I can't come see your show this weekend, I'm going to see Adele, and I paid a thousand dollars for my ticket. I love Adele, but I'm like, bitch,
Starting point is 00:11:49 Adele does not need your thousand dollars. I need you to pay $20 and bring five of your friends and come see my show. So it's like that kind of thing. And, like, right now I'm dealing with this again. I have a feature film coming out, and we just got told by the people we're working with on the marketing end, they're like, oh,
Starting point is 00:12:03 well, there's, like, two really big films coming out that weekend, guys. So it's like, I don't know if anyone's going to come see yours. And I'm like, great. We need more of this glitzy whatever. Yeah, yeah. Wait, does Americanish come out October 6th? Is that what
Starting point is 00:12:20 I read? Yeah, it's going to be October 6th. And we talked about the congestion of the calendar, the film calendar, because Taylor Swift's movie comes out the 13th, and that caused, like, a ripple effect, with every other studio being like, all right, that scary movie we had, we can't do it on Friday the 13th anymore, because that's when Taylor Swift is re-entering the atmosphere. The second scariest date: 10/6/23. So yeah, that's another thing I'm dealing with with my film right now. I lost theaters and screens. Taylor Swift is one of them, and then there's another really huge Bollywood film that's coming out. I'm like, goddamn it, that's also an
Starting point is 00:12:57 audience, South Asians, right? So yeah, great. Oh, that's just... God. So, you know, is it unique to America, would you say, the, like, celebrity worship? Is it more fervent in America than, uh, other places that you've been? We do it so weird in America too. Like, why are we obsessed with the Queen and, like, the, you know, the royalty in the UK? It's so weird. Like, people are having these parties to, like, watch marriages, and I'm like, what is happening? Yeah, yeah. It's, like, weird in America, but no, I think everybody does it everywhere,
Starting point is 00:13:28 right? The whole world. Yeah. Are people just getting so caught up because they share the same birthday as one of the Royals? I think that's really just weird behavior, for sure. You know, uh,
Starting point is 00:13:38 Miles has the same birthday as Harry. You don't got to bring it up. Harry has the same birthday as Miles. Thank you. Thank you so much. Thank you so much. But yeah, no, and I think it is everywhere too. Like, look, you know, I'm Japanese too, and look, in Japan we have what's called idol culture. Like, we literally call these celebrities fucking idols, you know? And it's just, you know, I think, everywhere, there's so much responsibility on these poor little people too. I don't know. Maybe some of them also just want
Starting point is 00:14:05 to make art, guys. They just want to make art. Just let them. You know what I mean? Yeah. I feel like they don't mind it though. They don't. It's like a trap. I think there's a trap where they want it and pursue it with the single-mindedness
Starting point is 00:14:21 of Captain Ahab, but then once they get it, they're like, oh, my life is over. I don't have like a life anymore. Right. Or like I can't go to the store or every person I interact with, I have to worry that they're trying to get something from me because of the status. But hey, it comes with a lot of money to rest your sad little head on.
Starting point is 00:14:41 Usually when you ask them, they're like, yeah, but I would not trade it for anything, obviously. No, no, no. Go back to being like you? No, no, no, no, no, no, no. What is something, Pallavi, that you think
Starting point is 00:14:54 is underrated? Okay. Like I said, I'm very bad at self-care. I just started using, I have two dogs and they shed a lot. I just started using a Roomba.
Starting point is 00:15:04 It's like, let the robots take over. You're doing it better. This place is so much cleaner than I could ever sweep it. Just take my job. I don't care. I don't care. You can do it, robots. New Roomba?
Starting point is 00:15:19 How long ago did you get the Roomba? My brother gave it to me. It's an old Roomba, but I just started using it. I'm so excited. I'm not going to say anything, because it will disappoint you, so I will just let you learn that on your own. Wait, what's disappointing about it? Just, there's a lot that it can't do that it promises it can. Just like my father. No. Um, it'll get tangled up. And like, if you have long hair at all, that shit, especially curly hair, will get tangled up in it.
Starting point is 00:15:49 It'll knot. You'll have to keep cutting through. There's just a lot of upkeep. And then sometimes it's like, I don't understand that there's a carpet step. I will keep knocking my little Roomba head against it. See, I got it for free and I had no expectations. And I took forever to try it again.
Starting point is 00:16:04 Oh, see, then you're gonna, it's awesome. I believe I want everyone to have that first week that you have the Roomba where you're like, everything is possible. Everything is possible. That's how I feel. Yeah. And sometimes it will start listening in
Starting point is 00:16:20 on your conversations. You'll be like, wait, why is the Roomba in the corner of the room while we're talking about our financial information? And why is it wet? Why is it wet? Why does it have those Boston dynamic dog legs? And why is it climbing up the walls?
Starting point is 00:16:37 Where'd it get that gun? Yeah, my experience with the Roomba... So I, like, I had the same question in my heart. The second you brought up the Roomba, I was like, oh no, she doesn't know yet. They don't know yet. Yeah, I saw the light leave both of our eyes, Jack. I saw that too. I thought you'd be so excited, and I was like, ah, this is so fun. And then... But every time someone talks about their Roomba, I'm like, what, is it a new purchase? Because I think they will solve it. Like, I think eventually it won't be bad.
Starting point is 00:17:06 Like, because, like, there are things it does that are pretty cool. Like, we've made artificial hearts and shit. Like, you're telling me you can't find a fucking Roomba solution? Like, I believe in us. I do. I love that we're talking about it like we did it,
Starting point is 00:17:22 like sports teams. Yeah, of course we did. We made artificial hearts. Like, let's get on it, guys. Let's go. Yeah, I mean, maybe the most advanced technology in robotics is not going to be purchasable at a Bed Bath & Beyond. Like, maybe I was looking for it in the wrong place. But how dare you say that about Bed Bath & Beyond. What the fuck do you think the Beyond is about? Exactly. That's what the Beyond was.
Starting point is 00:17:49 The singularity. I also feel like I don't care how far technology advances until we make a printer that works. You know what I mean? Like just print. Just print what I want you to print. And that I'll be excited about. And that I also don't have to be like,
Starting point is 00:18:03 oh, I now need to get, like, new ink or some shit. Toner, primer, highlighter, whatever the fuck the printer needs that my face also needs, you know? Like, I know, I spend that at Fenty. I don't spend it on things for you. I don't. Yeah. Zeitgang, for anybody who's ever worked at, like, you know, in the government or something like that: do the powers that be secretly have printers that work? Like, do they just have... I feel like there is a lot of, like, you know, designed obsolescence and, like, designed fuck-ups with printers, so that, like, the ink costs more than the printer itself, and, like, that's by design. I just think, man, Big Ink. I feel like they've been saying it. Somebody's probably figured out the printer, and they just, like, don't... It's not profitable for
Starting point is 00:18:58 them to sell a printer that is like, yeah, this lasts 10 years and just works. I will say, speaking of the artificial hearts and stuff, I have been in academic settings with professors who are Nobel Prize winners, and they too, who are creating these amazing technologies, can only be foiled by printers and PowerPoint presentations. That is the only thing that can defeat a professor successfully. That tracks. That tracks, honestly. Yeah.
Starting point is 00:19:29 All right. Let's take a quick break, and we'll come back and catch up on some news heading into the weekend. We'll be right back. I've been thinking about you. I want you back in my life. It's too late for that. I have a thinking about you. I want you back in my life. It's too late for that.
Starting point is 00:19:47 I have a proposal for you. Come up here and document my project. All you need to do is record everything like you always do. One session. 24 hours. BPM 110. 120. She's terrified.
Starting point is 00:20:03 Should we wake her up? Absolutely not. What was that? You didn't figure it out? 120, she's terrified. Should we wake her up? Absolutely not. What was that? You didn't figure it out? I think I need to hear you say it. That was live audio of a woman's nightmare. This machine is approved and everything? You're allowed to be doing this?
Starting point is 00:20:18 We passed the review board a year ago. We're not hurting people. There's nothing dangerous about what you're doing. They're just dreams. Dream Sequence is a new horror thriller from Blumhouse Television, iHeartRadio, and Realm. Listen to Dream Sequence on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Senora Sex Ed is not your mommy sex talk. This show is La Plática like you've never heard it before. We're breaking the stigma and silence around sex and sexuality in Latinx communities. This podcast is an intergenerational conversation between Latinas from Gen X to Gen Z. We're covering everything from body image to representation in film and television.
Starting point is 00:21:03 We even interview iconic Latinas like Puerto Rican actress Ana Ortiz. I felt in control of my own physical body and my own self. I was on birth control. I had sort of had my first sexual experience. If you're in your señora era or know someone who is, then this is the show for you. We're your host, Diosa and Mala, and you might recognize us from our flagship podcast, Locatora Radio. We're so excited for you to hear our brand new podcast, Senora Sex Ed. Listen to Senora Sex Ed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. It was December 2019 when the story blew up. In Green Bay, Wisconsin, former Packers star Kabir Bajabiamila caught up in a bizarre situation. KGB explaining what he believes led to the arrest of his friends at a children's Christmas play.
Starting point is 00:21:56 A family man, former NFL player, devout Christian, now cut off from his family and connected to a strange arrest. I am going to share my journey of how I went from Christianity to now a Hebrew Israelite. I got swept up in Kabir's journey, but this was only the beginning. In a story about faith and football, the search for meaning away from the gridiron and the consequences for everyone involved. You mix homesteading with guns and church and then a little bit of the spice of conspiracy theories that we liked. Voila! You got straight away.
Starting point is 00:22:31 I felt like I was living in North Korea, but worse, if that's possible. Listen to Spiraled on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. Shall we talk about, what do you want to talk about? You want to talk about private equity? This was,
Starting point is 00:22:51 this is interesting. I, I saw a headline that was, uh, allow me to pull up the exact headline. Cause I was like, what is this? Uh,
Starting point is 00:23:01 it said, "Can Private Equity Be... Nice?" And I was like, what the fuck is this? And it's an interview in Slate by Megan Greenwell, where she interviews one of the heads of KKR, which is this, like, private equity firm that just bought the publisher Simon & Schuster. And KKR is doing something that is kind of unheard of in the private equity world. They're offering employees of the companies they purchase an ownership stake. And you're like, wait, what do you mean? What the fuck is this?
Starting point is 00:23:33 They basically built in a deal where part of, like, the total equity of the company gets put aside for employees. And higher-earning employees have the option to buy, like, additional equity if they want. But everyone gets a piece for free, and the idea here is that when the company sells or goes public, everybody gets a little taste of something, not just the C-suite, which is, like, normal. And I feel like tech companies have been doing that with, like, stock, like allowing employees to have stock and stuff like that, for a while. But then when it comes time to actually distribute it, they always find a way to... These things are tied to 40-page contracts
Starting point is 00:24:13 that the employees aren't gonna have time to read or have their legal teams pore over. One of the examples is a trucking company. These truckers don't have fucking legal teams to pore over that stuff and make sure they're not getting fucked over. But their golden goose of an example is this company called CHI Overhead Doors, where the private equity firm bought them, everyone had their stake, and when they sold, an average of $175,000 was, like, distributed to the, like, rank-and-file
Starting point is 00:24:47 employees. Some longtime truck drivers, they say, made as much as $800,000. And, like, even in speaking to them, these truck drivers are like, yeah, it was cool. Like, I made the money. And also, when you have a stake in it, you actually begin to see the inefficiencies in your own business. He's like, I used to get 40 cents a mile no matter where I drove, and I didn't care, because more miles meant more money. But when I realized that that could affect the sale price of the company, then we started actually optimizing things. So it's, like, a very fucking double-edged sword here.
Starting point is 00:25:18 Because overall, it makes no sense given what private equity is about and, like, their bloodlust for just chopping down the workforce, you know, in the name of lowering costs. But the private equity people say it's win-win. They say workers get the chance for a big payout and a voice in company decisions. Not really. And investors get increased staff engagement and retention, which in turn creates higher profits. So the optimistic read on this is that it's an attempt at trying to make capitalism a little more worker-friendly. The cynical version is that this is just corporate... This is, like, a corporate whitewashing scheme. And it's meant for people to be like, oh, this is like,
Starting point is 00:25:54 this could actually benefit people. In the interview, the guy from the firm, he's using a lot of maybes and might-bes when it comes to this becoming, like, a huge payout for people. And it's all nice in, like, a hypothetical context. But the outcomes begin to differ based on what happens even, like, when the equity firm exits the company. Like, do they go private? Do they sell it to another firm? And then what happens there? Does that next firm even give a fuck? So I think that, you know, it seems like, in their one example of, like, the truck drivers making 800K, the ownership stake incentivized workers to just start cutting down on their own, rather than the private equity firm coming in and doing it. And I think the other thing is that it's huge for retention, which seems to be the huge cost when, like, a new ownership, you know, team comes in.
Starting point is 00:26:41 People just fucking... they're like, man, fuck this place, I'm out of here. But now it's like, what if we give you a piece? And now they don't have to cycle through thousands of employees and retain and train them. So to me it seems like a very elegant way of, like, cost savings while appearing to be doing the right thing, because not everyone's going to get paid out eight hundred thousand dollars like the one example they sort of parade around. Yeah. I mean, there are probably versions of capitalism that work, right? No, there are none. No, there's not. Like, there's probably, like, a handful of times that... Nope.
Starting point is 00:27:16 But, like, there are millions of opportunities for it to work. It's called accidental socialism, if anything. Right. Yeah, socialism works. Capitalism doesn't work, because you have to exploit somebody for it to exist, right? So somebody has to get fucked for it to function. There's no version of it existing without somebody getting fucked. And in this interview too, like, you know, very pointedly, this journalist is asking, like, why don't you just raise wages when you come in? How about that, to keep employees? Like, what... Yeah. Why doesn't
Starting point is 00:27:50 that work? And then they use this... I gotta... Like, you wouldn't be writing this article if we did that, probably, is what they're holding onto. Let me also... I know Megan Greenwell. Shout out Megan Greenwell. Yeah, is that who wrote this? Yeah, and she understands... Yeah, she knows. She's writing a book right now about how private equity fucks workers. So she didn't go into this interview being like, this is going to be great. There's a lot of really pointed questions. But yeah, Kim Stanley Robinson, like, talks about how profit is the only goal of any of this shit, and profit inherently is exploitative, because it's, like, getting more than your fair share. Like, that's the point
Starting point is 00:28:32 of profit. That's what profit means. So in this, they ask, like, why don't you just, like, raise the salary? Like, that's the easy way to go. And he says, like, look, you know, we manage all kinds of people's money, including teachers' retirement funds. We also manage wealthy people's money. So I don't want to cherry-pick, but I'm just giving you a flavor for why this is so tricky. If you're managing teachers' pension money and you want to just raise everyone's salary, that is on the backs of the teachers, which is not ethical, and it's not our money. And that's, like, the fucking rationale they use to just hide behind. To be like,
Starting point is 00:29:06 then the T... that's the T. It's the teachers. We're actually doing it for the teachers of America. Yeah. Yeah. Yeah. Like,
Starting point is 00:29:12 you come away from this interview still kind of being like, okay, this seems not like a win-win. It seems like a win, maybe-win, if the environmental factors are precisely correct. It's like, what, they're giving you a cut of a company that they're about to run into the ground. Yeah.
Starting point is 00:29:28 But they say, if you guys can figure out how to save money, even better, cause then we don't have to just start doing layoffs. Which, to me... this is what's happening in Hollywood right now too.
Starting point is 00:29:42 It's like the thing that was happening in publishing, where they run all the big companies, like your old-school publishers, like Simon & Schuster, into the ground. This is the company that just bought Simon & Schuster, right? Yeah. They're talking to the people who just bought them, I think it was, like,
Starting point is 00:29:59 from Paramount or something. They don't have a plan. Their plan is to extract profits and get the fuck out of there. It's all Enron all the way down. Oh, yeah. And it's wild, too, because when this guy says, like,
Starting point is 00:30:12 we're not just here purely for profits. I'm like, then you wouldn't be running a fucking private equity firm. Right. Like, so I love that, like, the words come out this way. But, yeah, it is a very,
Starting point is 00:30:23 it's an interesting, I think, scheme from private equity to try and, you know, save money in their own ways on, like, employee retention, while trying to incentivize people to, like, stick around as long as possible before the inevitable. Because then they say stuff like, we wouldn't just sell it to, like, another company that, like, doesn't have the same ethos. Like, really? If the fucking money is right, then what? I don't think that's the case. It's funny that, like, the global corporate media will, like, find these examples where, like, if you look at it from a certain angle, it looks like, hey, this thing's working, and it's really, like, helping out the workers. Yeah, anytime they're like, actually, check out this cool thing, it's like, mmm. Really? No, they're doing something different. They start making Marge Simpson
Starting point is 00:31:07 noises. My stomach starts making Marge Simpson noises. Alright, let's cheer ourselves up by talking about how it continues to just keep raining shit on Rudy Giuliani. We've been hearing
Starting point is 00:31:23 for a while that he's broke, can't pay his legal bills, and now his lawyers, people who are supposed to be representing him, are suing him for $1.36 million in unpaid legal fees. He racked up fees and expenses totaling more than $1.5 million, but he only managed to pay $214,000,
Starting point is 00:31:44 and his defense is just like, oh, it's, like, really expensive, man. What the fuck, it, like, cost so much? Are you serious, dude? No way. No way it's this high. Yeah, no, it is, by the way. So, you know, he himself is a lawyer. And while he was representing Trump, he reportedly charged $20,000 a day. Hmm. Okay. So, you know, he's... I like the way Rudy Giuliani says, it's a real shame when lawyers do things like this, and all I will say is that their bill is way in excess of anything approaching legitimate fees. Sure. Sure. The lawyer that he's
Starting point is 00:32:27 stiffed, who is now suing him, is one of his, like, oldest best friends. His friendship with Giuliani dates back to the 70s, when they were both prosecutors in the U.S. Attorney's Office in the Southern District of New York. Wow. Wonder how that friendship started. Like, where... Like, if this... His friend, like, Robert Costello. And I was like,
Starting point is 00:32:49 look at you now, Rudy. Look at you. Look at you. You got nothing. You got nothing. I mean, we've seen recently that it's been all hands on deck to try and get him
Starting point is 00:33:00 more money, because it felt like he was going to Trump and being like, I got bills, dude, and, like, you need to help me, because, like, I know a lot of shit. Uh, and then recently there was a fucking hundred-thousand-dollar-a-plate fundraising dinner for Rudy Giuliani's legal fees specifically. So yeah. I have a feeling he probably was able to pay a lot of that off. But anyway, who knows? Who knows?
Starting point is 00:33:25 He's too busy being. Yeah. I like when the scammers get scammed. Get. Yeah. Taken in for their scams. By their own. By their own.
Starting point is 00:33:32 The scammers scam each other. Yeah. Or you're just like, none of these people have any loyalty to anyone but themselves. Of course they're all going to, like, fuck each other over at the end and turn on each other. That to me seemed like the linchpin of the whole Trump thing: everyone he works with is, like, somebody who would sell him out in a minute. Yeah. And vice versa. Yeah. Like, they're all... That's the best part, when they all turn on each other at the end. Yeah. I'm waiting to see how
Starting point is 00:34:04 the dominoes fall, because every time we hear about the stuff happening in, like, Georgia or, like, Florida, there's more people who are like, uh, actually, they do not want to have the same defense team as Trump now, and realize that, like, they were being, like, nudged to do what was best for Trump, and not what would get them free, you know. You know how these people, like, sell themselves as, like, brilliant businessmen, and people are like, well, if he could make a billion dollars, surely he could run the country? Like, their skill, the thing that they all have in common that they're good at, is the same skill displayed, if you've ever, like, been out to dinner with a group in, like, high school, and everyone's suddenly like, nah, I don't have enough money to pay this. What are you talking about?
Starting point is 00:34:48 No, you know, like, arguing over the bill. Yeah, before a homecoming dance. That is their skill. They were just the most stubborn of the people who refused to pay for whatever they ordered. And they're like, no, you see, right here on the menu it says 12.99, and so I put in 12. And it's like, well, we all also had drinks. And no, no, no.
Starting point is 00:35:17 Yeah, we all got those big Coca-Colas, because we're in high school. Yeah, exactly. Through that, like, weird brown plastic cup that had the logo blasted on it. Got apps. You ordered the apps. You demanded we order the apps. You ordered the Mojo potatoes at Shakey's, where we're having our dinner. Right.
Starting point is 00:35:37 Well, I drove someone here, so, like, that costs gas money. Oh, man. They'll just argue until you're too exhausted to continue the argument. I remember, Molly, you'll appreciate this. Before our dance, we went to Michelli's, right there across from the Ghislaine Maxwell In-N-Out. Oh, you mean Miceli's? Yeah, yeah, yeah, yeah.
Starting point is 00:35:59 Well, that's how we say it. That's how we always said it in school. We're going to Michelli's, with a one C. That's why I would always be like, that's a one C. I say the accurate Italian. Ah, yeah. Perdonami.
Starting point is 00:36:10 But so we went, and I remember one guy, this one dude, straight up just fucking denied that they ordered the food that they did. And we were like, what the fuck? That's so funny. It was, like, wild. You know what, yeah, yeah, I
Starting point is 00:36:26 would have to respect that, because all my friends... Well, he's like, well, where is it then? Yeah. Now, shout out to my friends from this place, you know what they do now? We all, every single one of us, was just like, nah, uh-uh, I don't give a fuck, I'm not paying. Oh, just gaslighting when the bill comes? Ninth grade. Fuck you. Nah. Applebee's? Nah. Well, it wasn't me then. It was your fucking idea we came here, because you said your cousin was working to get a discount. Well, she's not, so we can't get the discount.
Starting point is 00:36:57 Jack, have you been to Miceli's? No. Oh my God, it's amazing. There's one in Hollywood too. It's like, the waiters also sing. Yeah. Oh, hell yeah. Yeah, yeah, yeah. It's like a piano bar, and your waiter will be like,
Starting point is 00:37:11 your waiter's like, here's your food. And then they, like, turn around and take a microphone and sing Memory from Cats. Yeah, yeah. And the food? Fantastically subpar. It's so mid. It's so mid. But I love it. But it's so good.
Starting point is 00:37:26 You're paying for the experience. You know when you walk in, you're like, this food is not tasting great, but it will taste like Italian food as we know it by the American definition. I love those, like, mid Italian restaurants from the 50s. Yeah. That's one of them. But it's also, like, they have, like, little... Usually named after somebody, right? Yeah, it's
Starting point is 00:37:47 like a fake balcony inside. Yeah, yeah. Oh, I love it. It's wild. It's, like, two inches deep. Oh, we should go. Yeah, I'm gonna take you guys there. Let's do it. Yeah, it's great, because it's, like, a karaoke thing, but you don't have to do karaoke. It's like, the waiters are just singing. It's so weird. I love it. I see, Miles, where you get your taste for the old country. Yeah, from Miss Ellie's, as we would always incorrectly say in my elementary school. And we just can't let that pronunciation go. That guy who was ducking the bill, he went on to have a successful chain of daycare centers. Oh, my God. Profit king, you know?
Starting point is 00:38:32 He knows how to find a profit. Profit daddy. He knows, man. Yeah. It's real. Somebody's got to get shorted. In that case, it was you. Yeah, exactly.
Starting point is 00:38:43 And the kids. That's all it is. All right. Let's take a quick break and come back and talk about, uh, something that we're all... every single one of us... If you're hearing my voice and you are not in the C-suite at Lockheed Martin, you got shorted on this shit. Uh, we'll be right back. I've been thinking about you. I want you back in my life. It's too late for that. I have a proposal for you. Come up here and document my project. All you need to do is record everything like you always do. One session, 24 hours. BPM 110, 120.
Starting point is 00:39:25 She's terrified. Should we wake her up? Absolutely not. What was that? You didn't figure it out? I think I need to hear you say it. That was live audio of a woman's nightmare. This machine is approved and everything?
Starting point is 00:39:41 You're allowed to be doing this? We passed the review board a year ago. We're not hurting people. There's nothing dangerous about what you're allowed to be doing this? We passed the review board a year ago. We're not hurting people. There's nothing dangerous about what you're doing. They're just dreams. Dream Sequence is a new horror thriller from Blumhouse Television, iHeartRadio, and Realm. Listen to Dream Sequence on the iHeartRadio app,
Starting point is 00:40:00 Apple Podcasts, or wherever you get your podcasts. When you think of Mexican culture, you think of avocado, mariachi, delicious cuisine, and of course, lucha libre. It doesn't get more Mexican than this. Lucha libre is known globally because it is much more than just a sport and much more than just entertainment. Lucha libre is a type of storytelling. It's a dance. It's tradition. It's culture.
Starting point is 00:40:25 This is Lucha Libre Behind the Mask, a 12-episode podcast in both English and Spanish about the history and cultural richness of Lucha Libre. And I'm your host, Santos Escobar, the emperor of Lucha Libre and a WWE superstar. Santos! Santos! Join me as we learn more about the history
Starting point is 00:40:44 behind this spectacular sport from its inception in the United States to how it became a global symbol of Mexican culture. We'll learn more about some of the most iconic heroes in the ring. This is Lucha Libre Behind the Mask. Listen to Lucha Libre Behind the Mask as part of My Cultura Podcast Network on the iHeartRadio app, Apple Podcasts, or wherever you stream podcasts. Señora Sex Ed is not your mommy sex talk. This show is la plática
Starting point is 00:41:09 like you've never heard it before. We're breaking the stigma and silence around sex and sexuality in Latinx communities. This podcast is an intergenerational conversation between Latinas
Starting point is 00:41:20 from Gen X to Gen Z. We're covering everything from body image to representation in film and television. We even interview iconic Latinas like Puerto Rican actress Ana Ortiz. I felt in control of my own physical body and my own self.
Starting point is 00:41:37 I was on birth control. I had sort of had my first sexual experience. If you're in your señora era or know someone who is, then this is the show for you. We're your hosts, Diosa and Mala, and you might recognize us from our flagship podcast, Locatora Radio. We're so excited for you to hear our brand new podcast, Senora Sex Ed. Listen to Senora Sex Ed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:42:04 And we're back. And Miles, I mean, we talk a lot about what we think this future is going to look like,
Starting point is 00:42:17 right? Yeah. I mean, I share a name with Miles Dyson. Ever heard of him? The guy started a little thing called Skynet. You don't think that keeps me up all night? Well, but in seriousness, I... Hey, he went out a hero, Miles.
Starting point is 00:42:31 Yeah, I did, I did. Look, it's all about altruism, you know, at the end. But like, I think for me personally, right, so much of my understanding of science is derived from film and television, because I'm American. And I think with AI, whenever I would think of it, I'd be like, oh my God, it's like, you don't know where this thing's gonna go, but the place I think it's gonna go is Skynet from Terminator. I just see the shot
Starting point is 00:42:59 from the sky of all the missiles being launched, like, just coming down. Yeah. I'm trying to have a nice day at the park with my kids. I'm watching through a chain link fence, and next thing I know... Some lady's over there shaking the chain link fence for some reason. Hey, that skeleton's rattling the kids over there. Get her out of here. First of all, it's a sort of two-fold question.
Starting point is 00:43:19 How far off base is the idea... how many jumps am I making in my brain to be like, ChatGPT, next stop Skynet? And then also, because of that, what are all of the ways that people like me are not considering what those actual, real, tangible effects will be? And I don't want to be, like, the most cynical version. But, you know, if we aren't careful, and we have very sort of individualistic or profit-minded sort of motivations to develop this kind of technology, what does that worst case sort of look like? Or not worst case, but what are the ways I'm actually not thinking of, because I'm too busy thinking about T-1000s? Right. Like, just to add on to that, when I think about the computer revolution, like, for decades it was these people in San Francisco talking about how the future was
Starting point is 00:44:12 going to be completely different and done on computers, or something called the internet. And we were just down here looking at a computer that was, like, green dots, like, not that good of a screen display. And by the time it, like, filtered down to us, and the world does look totally different, it's like, well, shit, turns out that was a big deal. So yeah, I feel like I want to be constantly, like, on our show, just asking the question: where is this actually taking us? Because in the past, I feel like it's been hard to predict, and when it did come, it wasn't exactly what the, you know, tech oracles said it was going to be. And it was, like, unexpected and weird and beautiful and banal in weird ways. And so I'm curious to hear your
Starting point is 00:45:06 thoughts on, like, once this technology reaches the level of consumers, of just your average, ordinary person who's, like, not teaching machine learning at NYU, what is it going to look like? But first, obviously, first: Skynet. Skynet, Skynet. But first Skynet, obviously. I think Skynet... Okay, so I think with the idea of worrying about some version of, you know, this technology going out of control, there's so many checks and balances in the ways in which we are thinking about this, and in how the technology is sort of moving forward, that Skynet, the idea of something emergent,
Starting point is 00:46:02 and then not only is it emergent, but it's going to take over and try to destroy civilization, right? And also send cybernetic organisms into the past to ensure that those people do not grow up to take arms against Skynet. Like John Connor and his... Yeah, yeah. So, I'm a firm believer in being able to send people to the future, but I still don't think that we're going to be able to send people into the past. Okay, so that's maybe one problem. Well, that's just, like, your opinion, man. All right. I'm like, Back to the Future is gonna happen, for me. And by the way, I'm the Doc Brown character. I'm gonna befriend a young child. Yeah, right, exactly.
Starting point is 00:46:52 Uh, how long have Marty and Doc been friends? Like, probably... it seemed like eight years. So, like, when Marty was, like... years old, they became best friends. Anyways, sorry about that. We keep getting sidetracked. So, well, I think, you know, what I'm thinking about, when we think about this kind of technology and how the technology itself could go wrong, is that I don't think that kind of integration is likely, right? I think the concept of, you know, the singularity, it's so far out, and I don't think that we have a good ability or understanding of, like, consciousness. And I also don't think we have a really good understanding of, like, okay, you know, why would these large language models start, you know, trying to harm us?
Starting point is 00:47:52 So there's all those steps and jumps and leaps, and all the intermediate pieces, that just seem so incredibly unlikely. Now, I know that a lot of the AGI people who worry, their worry is, oh, well, we got to worry about this tail risk, right? And a human-level extinction event is worth worrying about. And some people worrying about that is probably not a bad thing. But I just think it's very unlikely, for the reasons of all the chain of things that would need to happen. And we're not very good at robots. Also,
Starting point is 00:48:32 robots are, you know, actually much further away. But, um, that is one... Okay, we'll talk about it, but put a pin in that, because I want to come back to that. To bad robots. To bad robots. And how easy... I just want to brag about how easy it would be for me to beat one up. Both of those Boston Dynamics ones. Yeah, yeah. I got money on you in that fight. But the thing that I do want to point out is that, you know, the concept of starting to build some of these rules into the large language models, you know, to try to say, okay, well, you know, try to be benign, don't do harm. I think that's a really super good idea, right?
Starting point is 00:49:14 And I think that, you know, even if you don't believe in Skynet, actually believing in trying to incorporate responsibility into the large language models, I think that is something that's very important. Moving into some of the dangers, though. I think, actually, with large language models, people like OpenAI were actually worried about this in one of the first iterations of GPT: the ability for the large language models to create misinformation and disinformation. And, you know, I think that that's a really bad use of the technology, and very, very potentially harmful. And it already seems to be what it's being used for.
Starting point is 00:50:01 Like the ways that companies are trying to replace journalism with it, or, you know, just generate tons of clickbait articles that are, like, targeted at people. It feels like it's already trending in that direction in a lot of ways. Yeah, unfortunately, I think you're right. Like, there is this bad use of the technology, where it's like, oh, let's make this so that it could be as persuasive as possible. You know, there's obviously been a ton of research in marketing on, like, figuring out, okay, how do we, you know, position this product so that you're more likely to buy it, right? And the same thing can now be applied to language: of, like, okay, for Jack O'Brien, how am I going to make this tailored advertisement
Starting point is 00:50:55 just for him, to make him buy this particular product that I'm selling, or to make him not vote, right? Yeah, right. You know, those kinds of... I mean, I think that this is scary, and I think that this is in the now, right? Yeah. Which, you know, is something that in some sense
Starting point is 00:51:15 is going to be hard to like really stop people from using this technology in that direction without things like legislation. You know, there's just some components that need, you know, more legislation. And I think that's going to be hard to do, but probably necessary. Right. Yeah. Yeah.
Starting point is 00:51:40 I can even see, like, just even in politics, you can be like, okay, I need to actually figure out the best campaign plan for this very specific demographic that lives in this part of this state. And... for sure. And then just imagining what happens. But I guess, is that also part of the slippery slope? Is it, like, the reliance sort of gives way to, like, this thing where whatever this says is the solution to whatever problem we have, and, like, just kind of throwing our hands up and all just becoming totally reliant? I mean, I'd imagine that also seems like a place where we could easily sort of slip into a problem, where it's like, yeah, the chatbot may give us this answer. Well... or that people are starting to make, like, very homogenous decisions and choices. Right. Where you would make, you know, many different choices.
Starting point is 00:52:31 But because you already have a template that's been given to you by, you know, this AI, you're like, oh, okay, well, you know, I'm just going to follow, you know, this choice or this decision that it's going to make for me, and not, you know, do one of the thousand different alternatives, right? Right. And if you do that, I do that, Jack does that, right? Like, where all of a sudden we would have made vastly different choices, now we have this sort of weird, sort of, uh, centering effect, where all of us are actually making much less, you know, varied choices in our decision making. Right. Which has... I mean, it does possess real risk. I mean, imagine applying this to resume screening, right? Where you could imagine the same type of problem, right?
Starting point is 00:53:18 And those are... there's just so many scenarios that you can think of where, you know, we need to take care. Yeah, and this is a good time to think about that, right? And we're... yeah, we're turning our free will over to the care of algorithms, and, you know, the phones, like the Skinner boxes in our hands that we're carrying around. And, like, that feels like a thing that's already happening. AI might just make it a little bit more effective and, like, quicker to respond. But yeah, it feels like a lot of the concerns over AI that make the most sense to me are the ones that are already happening. And, yeah, just to kind of tip my
Starting point is 00:54:00 hand a little bit: in doing a lot of research on the kind of overall general intelligence concerns that we've been talking about, the ones where it, like, takes over and evades human control because it wants to, you know, wants to defeat humans... Yeah, I was surprised how full of shit those seem to be. Like, for instance, there's one story that got passed around a lot last year, where during a military exercise, an AI was being told not to take out human targets by its human operator. And so the AI made the decision to kill the operator, so that it could then just, like, go
Starting point is 00:54:47 rogue and start killing whatever it wanted to. And that story, like, got passed around a lot. I think we've even, like, referenced it on this show. And it's not true. First of all, I think when it first got passed around, people were like, it actually killed someone, man. And it was just a hypothetical exercise, first of all, just in the story as it was being passed around. And second of all, it was then debunked. Like, the person who said that it happened in the exercise later came out and was like, that didn't actually happen. But it seems like there is a real interest and real incentive on behalf of the people who are, like, set up to make money off of these AIs to, like, make them seem like they have this, like, godlike reasoning power.
Starting point is 00:55:47 There's this other story where, like, they were testing, I think it was GPT-4, during the alignment testing. Alignment testing is, like, trying to make sure that the AI's goals are aligned with humanity's, and, like, making humans more happy. And they, like, ran a test where the GPT-4 was basically like, I can't solve a CAPTCHA, but what I'm going to do is, I'm going to reach out to a TaskRabbit and hire a TaskRabbit to solve the CAPTCHA for me. And the GPT-4 made up a lie and said that it was visually impaired, and that's why it needed the TaskRabbit to do that. And again, it's, like, one of those stories that feels creepy and, like, gives you goosebumps.
Starting point is 00:56:45 that the human agency involved with the kind of clever task that uh we're worried about this thing having done we're just like taken out we're just edited out so that they could tell a story where it seemed like the GPT-4, which is like what powers chat GPT, that like it was doing something evil, right? And it's like, so it's, I get how these stories get out there and become like, because they're the version of AI that we've been preparing for because like in watching Terminator,
Starting point is 00:57:26 but it's surprising to me how much like Sam Altman, the head of open AI, like just leans into that shit and is like, he, like he, and in an interview with like the New Yorker, he was like, yeah,
Starting point is 00:57:41 I like keep a Sinai capsule on me and like all these weapons and like a gas mask from the Israeli military in case like an AI takeover happens. And it's like, what? You of all people should know that that is complete bullshit. But and it totally like takes the eye off the ball of like how the actual dangers that ai poses which is that it's going to just like flood the zone with shit like it's going to keep making the internet and phones like more and more unusable but also more and more like difficult to like tear ourselves away from like i feel like we've already lost the alignment battle in the sense that like we've already lost any way in which our technology that is supposed to like
Starting point is 00:58:33 serve us and make our lives better and enrich our happiness like are doing that like they they stopped doing that a long time ago like that's why i always like say that the more interesting question like the questions that were being asked always like say that the more interesting question like the questions that were being asked in philosophy classes in the early 2000s about like singularity and like suddenly this thing spins off and is we we don't realize we're no longer being served and like we're like 20 steps behind like that happened with capitalism a long time ago like that we're like capitalism is so far beyond it. And it's like, you know, we're no longer serving ourselves and serving fellow humanity. We're, like,
Starting point is 00:59:10 serving this idea of this market that is just, you know, repeatedly making decisions to make itself more efficient, take the friction out of, like, our consumption behavior. And, yeah, I think AI is going to definitely be a tool in that. But, like, that behavior. And yeah, I think AI is going to definitely be a tool in that. But like that is the ultimate battle that we're already losing. Yeah, I mean, I completely agree with you on the front of like your phones are,
Starting point is 00:59:40 I mean, the companies are trying to maximize their profits, which means, you know, possibly you being on your phone to the disservice of doing something else, like going outdoors and doing exercise, right? Right. Or, you know, doing something social. I do see, on the other hand, that maybe AI can have positive effects, right? Where, you know, take tutoring for an example, or public health, where we take the phone and take the technology and use it for benefit, like providing information about public health, or helping students who don't have the ability to have a tutor, have a tutor, right? Like, all of these kinds of things where, let's take advantage of the fact that, you know, the majority of the world has phones,
Starting point is 01:00:41 right? And try to use that technology for, you know, a good, positive societal benefit. But like any tool, it has uses that are both good and bad. And I do think, like, you know, there are going to be a lot of uses of this technology that are not necessarily in the best interest of humanity or individual people. Right. It's like, really, the real X factor is how the technology is being deployed and for what purpose.
Starting point is 01:01:14 Not that technology in and of itself is like this runaway train that we're trying to rein in. It's that, yeah, if you do phone scams, you might come up with better, you know, like both like voice models to get this rather than doing one call per hour. You can do a thousand calls in one hour running the same script and same scam on people. Or to your point about how do you persuade Jack O'Brien to not vote or to buy X product, then it's all going in that direction versus the things like how can we optimize like the ability of a person to learn if they're in an environment that typically isn't one where people have access to information or for the public health use and that's why like i think in the end
Starting point is 01:01:56 because we always see like we're like yeah this thing could be used for good and it's almost every example is like yeah and it just made two guys three billion dollars in two seconds right and that's what happened with that and now we all don't know if videos we see on instagram are real anymore thanks yeah yeah it's not yeah it's not inherently a bad technology it's that the system as is currently constituted favors scams like that's why we have a scam artist as like our most recent president before this one and probably could be future president again like because that is what our current system is designed for and like it is just scamming people for money that's why when when blockchain technology like for has its like infant baby steps it immediately becomes a thing that people use to scam one another because that is the software of our current society so, there are like tons of amazing possibilities with AI. I don't think we find them
Starting point is 01:03:08 in the United States, like applying the AI tools to our current software. I would love for there to be a version of the future where AI is so smart and efficient that it changes that paradigm somehow and changes it so that our software that our society runs on is no longer scams. You know? Right, right. Well, I mean, there is a future where, you know, AI starts to do a lot of tasks for us.
Starting point is 01:03:43 You know, I think that there's starting to be, you know, depending, it's mostly in Europe now, but also here it talks about like universal basic income. It's kind of, I mean, I don't think that that's going to be the case. I don't think that in the US
Starting point is 01:04:02 anybody has an appetite for that, maybe in 20, 30 years. But, you know, I do think that there are certain things about my job, right, as a professor, about, like, writing and certain other components, right, where here are some things where AI can help, right? And, you know, today I use it as a tool to make me more efficient. Like, you know, okay, I want to have an editor look at this. Okay, well, you know, it's not going to do as good a job as a real editor, but maybe as a, you know, bad copy editor. Yeah, that's good. You know, I mean, I've been writing papers for a while, so, you know, when the undergraduates write the paper, it does an even better job, right? Because they have more room for growth. Right.
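To make that copy-editing use concrete, here's a minimal sketch of what such a workflow can look like in code, assuming the OpenAI Python client; the model name, system prompt, and example sentence are illustrative assumptions, not a description of the professor's actual setup.

```python
# Minimal sketch: using a large language model as a rough copy editor.
# Assumes the OpenAI Python client (pip install openai) with an API key
# in the OPENAI_API_KEY environment variable; model choice is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

draft = "Their going to present there results at the conference tomorow."

response = client.chat.completions.create(
    model="gpt-4",  # any chat-capable model would work here
    messages=[
        {
            "role": "system",
            "content": "You are a copy editor. Fix grammar, spelling, and "
                       "awkward phrasing, and return only the corrected text.",
        },
        {"role": "user", "content": draft},
    ],
)

print(response.choices[0].message.content)
# e.g. "They're going to present their results at the conference tomorrow."
```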
Starting point is 01:05:05 And so I think that, on a micro level, right, these tools are now being used. I think that, you know, Russian spam bots
Starting point is 01:05:12 have been most likely using large language model technology before chat GPT. Right. Right. And so,
Starting point is 01:05:21 you know, I mean, there are some ways that I think we're going in a positive and good direction right yeah yeah now now it's like now that you say that i'm thinking i'm like we thought cambridge analytica was bad and it's like what happens now when we like turn it up to 1000 with this kind of thing where it's like yeah here are all these voter files now like really figure
Starting point is 01:05:41 out how to suppress a vote or get to sway people. And yeah, I again, it does feel like one of these things where in the right hands. Right. We can create a world where people don't have to toil because we're able to automate those things. off-ramp our culture of greed and wealth hoarding and concentrating all of our wealth into a way to say like well we actually need to spread this out that way everyone can have their needs met materially because our economy kind of runs on its own in certain sectors obviously other ones need real like human labor and things like that but yeah that's where you begin to see like okay that's the fork in the road can we make that do we do we take the right path or does it turn into you know just like some very half-assed version where like only a fraction of the people that need like universal basic income are receiving it well you know we read more articles about why we don't see van vat in hollywood anymore as Jack pointed out.
Starting point is 01:06:46 That terribly worded thing about Vince Vaughn that AI could not get right. Couldn't get his name right, so they called him Van Vought. So the sci-fi dystopia that seems like, not necessarily dystopia, but the sci-fi future that seems like it's most close at hand and like given you know doing a weekend's worth of research on like where where this is and like where all the top thinkers think we're headed the thing that make that seems the closest to reality to me is the movie her like her is kind of already here there's already these applications that use chat gpt and, you know, GPT-4 to, again, like, it really seems to me like, and I don't necessarily mean this, like, in a dismissive way. This is probably, you know, a good description of most jobs. Like, the thing that David Fincher, I was, like,
Starting point is 01:07:41 listening to a podcast where they talked about how David Fincher says words are only ever spoken in his movies by characters in order to lie. Because that's how humans actually use language, is just to find different ways to lie about who they are, what their approach is to things. I think these language models, the thing that they're really good at, like whether it be like up to this point where people are like, holy shit, this thing's alive, that's talking to me. Well, it's not. And it's like just doing
Starting point is 01:08:11 a really good approximation of that and like kind of fooling you a little bit, like take that to the ultimate extreme. And it's like, it makes you think that you're in a like loving relationship with a partner. And like, that's already how it's being used in some in some instances to great effect so like that that seems to be one way that i could
Starting point is 01:08:32 see that being becoming more and more a thing where people are like yeah i don't have like human relationships anymore i get like my emotional needs met by like her technology, essentially. Is that what, what would you say about that? And what, is there a different fictional future that, that you see close at hand? So I think her is great. I actually, I mean, it's really funny because, you know, I remember back in 2017, 2018, where I was like, oh, her is like a great sort of, this is where AI is going. And, you know, for those of you who haven't seen the movie, it's a funny movie where, you know, they have Scarlett Johansson as the voice of an AI agent, which is on the phone. And I think that this is actually a pretty accurate description of what we're going to have in the future, not necessarily the relationship part, but the fact that we'll
Starting point is 01:09:31 all have this personal assistant. And the personal assistant will see so many aspects of our lives, right? Our calendars, our, you know, meetings, our phone calls, everything. And so it'll be this assistant that we all have that's helping us to make things more productive. And as a function of the assistant, to be effective, the assistant will probably want to have some connection with you.
Starting point is 01:10:03 And that connection will likely be what allows you to trust it. And here's where the slippery slope comes in, you know, where you're like, oh, this understands me. And you start getting into a deeper relationship where, you know, a lot of the fulfillment of a one-on-one connection can come from your smart assistant. And I do think that there's a little bit of a danger here. Like, you know, again, I'm not a psychologist. I'm someone who studies AI and machine learning, but I do actually, you know, study how well machines can make sense of emotion and empathy.
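As a rough illustration of the kind of emotion-recognition task he's describing, here's a minimal sketch using the Hugging Face transformers library; the specific pretrained checkpoint is a publicly available one chosen for illustration, not the professor's own model.

```python
# Minimal sketch: having a machine label the emotion in a piece of text.
# Assumes the Hugging Face transformers library (pip install transformers);
# the checkpoint below is a public emotion classifier used for illustration.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

print(classifier("Nobody ever really listens to me at work."))
# Expected shape of the output (exact scores will vary):
# [{'label': 'sadness', 'score': 0.9}]
```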
Starting point is 01:10:50 And really, you know, GPT-4, which is the current state-of-the-art technology, is actually already really good at understanding. I use the phrase understanding in quotes, but really, knowing what you were feeling, right? How you're feeling in a given scenario. And you can imagine that feeling heard is one of the most important parts of a relationship, right? And if you're not feeling heard by somebody else, and you're feeling heard by your personal assistant, that could shift relationships to the AI, which could be fundamentally dangerous, because it's an AI, not a real person. Right. Yeah. There's Replika, that app, R-E-P-L-I-K-A, where they are, you know, designing these things explicitly to, like, fill in as, like, a romantic partner, at least with some people. And it's always hard to tell:
Starting point is 01:11:55 did they find like the three people who are using this to fill an actual hole in their lives or is it actually taking off as a technology but yeah the the personal assistant thing seems closer at hand than maybe people realized any other like kind of concrete changes that you think are coming to people's lives that they aren't aren't ready for or haven't haven't really thought about or seen in another sci-fi movie sci-fi movies are remarkably good at predicting yeah or maybe they have seen in a sci-fi movie no but i i think um you know another another thing that one potential that i think not enough people are talking about is how much better video games are going to become.
Starting point is 01:12:48 So people are already integrating GPT-4 into video games. And I think our video games are just going to be so much better. Because, you know, you have this ability to, like, interact now with a character AI that's not just, like, very boring and chatting with you, but can actually be truly entertaining and fully interactive. Interesting. So I think, you know, computer games are going to be much, much better, and possibly also more addictive as a function of being much better. Perfect. And then that will dull our appetite for revolution when our jobs are all taken by the... Yes, this is great. This is great.
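For a sense of what that integration can look like, here's a minimal sketch of an LLM-driven game character, again assuming the OpenAI Python client; the persona, prompts, and model name are illustrative assumptions, not any particular game's implementation.

```python
# Minimal sketch: wiring a large language model into a game NPC.
# The running message history is what keeps the character consistent
# across turns; persona, model, and dialogue here are illustrative.
from openai import OpenAI

client = OpenAI()

history = [{
    "role": "system",
    "content": "You are Mira, a sarcastic blacksmith in a fantasy village. "
               "Stay in character and keep replies under two sentences.",
}]

def npc_reply(player_line: str) -> str:
    history.append({"role": "user", "content": player_line})
    response = client.chat.completions.create(model="gpt-4", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})  # remember the turn
    return reply

print(npc_reply("Can you fix my sword by sundown?"))
```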
Starting point is 01:13:30 I love it. All right. I feel much better after having this conversation. Dude, Grand Theft Auto is way better. The kinds of conversations I have with people I would normally bludgeon on the street with my character. All right. That's going to do it for this week's weekly zeitgeist. Please like,
Starting point is 01:13:50 and review the show. If you like the show, uh, means the world to miles. He, he needs your validation folks. Uh, I hope you're having a great weekend and I will talk to you Monday.
Starting point is 01:14:03 Bye. Thank you.
Starting point is 01:15:16 Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I'm Keri Champion, and this is season four of Naked Sports. Apple Podcasts, or wherever you get your podcasts. Presented by Elf Beauty, founding partner of iHeart Women's Sports. Curious about queer sexuality, cruising, and expanding your horizons? Hit play on the sex-positive and deeply entertaining podcast, Sniffy's Cruising Confessions.
Starting point is 01:16:04 Join hosts Gabe Gonzalez and Chris Patterson Rosso as they explore queer sex, cruising, relationships, and culture in the new iHeart podcast, Sniffy's Cruising Confessions. Sniffy's Cruising Confessions will broaden minds and help you pursue your true goals. You can listen to Sniffy's Cruising Confessions, sponsored by Gilead, now on the iHeartRadio app or wherever you get your podcasts.
Starting point is 01:16:24 New episodes every Thursday.
