The Daily Zeitgeist - Are We Overrating A.I.? Which Movie A.I. Gets It Right? 09.26.23

Episode Date: September 26, 2023

In episode 1553, Jack and Miles are joined by A.I. Researcher and co-host of The Good Robot, Dr. Kerry McInerney, to discuss… Ways The Future Of AI Might Look Different Big And Small, The Pause Letter, Taking The Profit Motive Out? Alternatives to the Corporate Capitalism Stuff? There's A Bias In The Media To Make It Seem Like AI's Plotting Against Us, Are We Overrating AI? How Movies Shape How We Picture AI, What Should We Be Reading/Watching Instead? And more! PRE-ORDER The Good Robot: Why Technology Needs Feminism NOW! LISTEN: White Science feat. ZelooperZ by John FM. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 I'm Cari Champion, and this is Season 4 of Naked Sports. Up first, I explore the making of a rivalry: Caitlin Clark versus Angel Reese. Every great player needs a foil. I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports. Listen to The Making of a Rivalry:
Starting point is 00:00:20 Caitlin Clark versus Angel Reese, on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Capital One, founding partner of iHeart Women's Sports. I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil: The 7M TikTok Cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:01:00 Hey, I'm Gianna Pradenti. And I'm Jemay Jackson-Gadsden. We're the hosts of Let's Talk Offline from LinkedIn News and iHeart Podcasts. There's a lot to figure out when you're just starting your career. That's where we come in. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in people who do, like negotiation expert Mori Taheripour.
Starting point is 00:01:19 If you start thinking about negotiations as just a conversation, then I think it sort of eases us a little bit. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, the internet, and welcome to Season 306, Episode 2 of The Daily Zeitgeist! Yay! A production of iHeartRadio. This is a podcast where we take a deep dive into America's shared consciousness.
Starting point is 00:01:48 And it is Tuesday, September 26th, 2023. Ooh, you ready? You know what this is, right? You know what this is, right? For all my pancake lovers, it's your day. It's National Pancake Day. It's also National Shamu the Whale Day. Shout out Shamu.
Starting point is 00:02:06 It's also, for all my dumpling lovers out there, and for me specifically my gyoza lovers out there, National Dumpling Day. Okay, so, you know, celebrate however you see fit, depending on your cultural disposition. It's also National Johnny Appleseed Day, and I only bring that up because I remember in Christian school we were made to sing that as, like, a song before we went to lunch. Man, the amount of pro-Johnny Appleseed propaganda that I had to live with. Did you live with a lot of pro... Yeah, yeah, yeah. He's, like, John Chapman or something originally. He just really liked, really fucked with apples, that guy. There's, like, a... The children must learn. Yeah, I bet there's some dark-ass shit probably behind, you know, like, a story like that. It's gotta be a milkshake though.
Starting point is 00:02:50 Yeah, for sure. Very problematic. But the way, like the amount of information that I consumed, like I had one of those, my family had one of those book sets where it was important historical figures. And right next to Lincoln was Johnny Appleseed. And I was prepared for Johnny Appleseed to be one of the major figures in my life. Right. And education.
Starting point is 00:03:21 Did you know the song? No. I'm sure. Oh, the Lord is good to me. And so I thank the Lord for giving me the things I need. The sun and the rain and the apple seed. The Lord
Starting point is 00:03:34 is good to me. And then we could go eat lunch. But we had to sing that shit before. Wow. It's from the Disney movie apparently. So I didn't even get all of the Johnny Appleseed propaganda. I got the Christian capitalist Johnny Appleseed propaganda just mainlined into my little brain.
Starting point is 00:03:55 He wore a pot on his head, right? I mean, I think, who knows? Sure. Like backwards, kind of. Like Davy Crockett? Yeah. Anyways, this is all dumb shit that is probably specific to American audiences. Yeah.
Starting point is 00:04:11 But my name is Jack O'Brien, a.k.a. Start spreading the news. I'm eating today. I want to eat a part of it. Pecan, pecan pie. These chattering teeth are longing to chew. Bite through the nutty part of it. Pecan, pecan pie.
Starting point is 00:04:39 That is courtesy of Maxer1216 on the Discord. A little, just real solid, right down the middle meatball of a Weird Al parody in reference to our conversation about Pecan Pie. For some reason, he was lobbing that meatball up to you.
Starting point is 00:04:58 Yeah. I saw that in the Discord. I was tagged in that, but you took that. Oh, so you just do other people's AKAs. Okay. Interesting. Alright, well, I'm thrilled to be joined as always by my co-host, Mr. Miles Gray. Okay, so you did my AKA, and I heard you do this one on Friday, so allow me to do it one more time. Say, Miles and Jackie, have you seen it yet? Oh, it's just flying around. L-l-l-losing all my jets. Yeah. Oh, there's a Bree, and it's all spaced out.
Starting point is 00:05:29 Oh, Jetty, she's really neat. She's got a stealthy use, a bomb and shoot. So fancy it can't even be seen. Oh, l-l-l-losing all my jets, losing all my jets. Shout out to the Department of Defense for losing all of our multi-million-dollar killer fucking spacecrafts. Uh, and shout out to Johnny Davis and Blinky Heck. One more time? One more time. Yeah, maybe we can, like, layer those, because you went high. I feel like I want a little bit lower. Maybe we could get a little harmony. Yeah.
Starting point is 00:06:08 Anyways, Miles, we're thrilled to be joined in our third seat for today's expert episode by a truly brilliant guest who I felt especially stupid singing an AKA in front of. She's a research associate at the Leverhulme Centre for the Future of Intelligence, where she researches AI from the perspective of gender studies, critical race theory, and Asian diaspora studies. She's also a research fellow at the AI Now Institute, the co-editor of the upcoming volume The Good Robot: Feminist Voices on the Future of Technology, and the co-host of the wonderful Good Robot podcast.
Starting point is 00:06:48 Please welcome the brilliant, the talented Dr. Kerry McInerney! Dr. Kerry! Thanks so much for having me. Thank you so much for joining us. How different has this been so far from what you were expecting? Oh my gosh, I don't have a song. I was like, how do I say I definitely don't have a song?
Starting point is 00:07:07 And I also don't know who Johnny Appleseed is. I grew up in New Zealand, and so I didn't have a book of, like, historical figures. I had a book called Watch Out, These Creatures Bite and Sting, and it was, like, all the ways you could die being killed by, like, a jellyfish or a snake or, like, an octopus. It was mainly if you went to Australia, and so it
Starting point is 00:07:25 kind of, like, traumatized me for good from childhood. But it does mean I didn't have the great Appleseed song that you just sang. Yeah. Did you guys have apples, though? Because I was under the impression that the only reason I had apples was Johnny Appleseed. Yeah. And the Lord. And the Lord, obviously. Who provideth. Who hath provideth. It's just all Kiwis, right, down there? Pretty much Kiwis, kiwi fruit, no apples, no songs, just no music. It's just silent. Pardon my ignorance.
Starting point is 00:07:57 Is the Kiwi, is that like a native New Zealand fruit, I'm hoping? So the Kiwi is a native New Zealand bird, and then the kiwi fruit is what we call that fuzzy brown fruit. But here, I feel like they kind of mix up the two. They're both brown and fuzzy and sort of bulbous. Ignorance incarnate here, folks.
Starting point is 00:08:18 But if you do cut the bird in half, it looks exactly like the kiwi fruit. A lot of people don't know that. It's green with a little seed core. Yeah. All right. Dr. McInerney, we have you here today to talk to us about AI.
Starting point is 00:08:33 We talked about AI on last week's episode. We're thrilled to ask you our follow-up questions. But before we get to any of it, we do like to get to know our guests a little bit better and ask you: what is something from your search history? This is pretty embarrassing. I just got a Nintendo and I'm, like, finally entering my gamer era. I'm like, this is gonna be so fun, but my whole search history is just me Googling, like, very basic controls in Zelda: Breath of the Wild. It's like, how do I jump? Like, it's really awful. I went and
Starting point is 00:09:06 looked, and it's like, how do I defend a dog? Like, how do I climb a mountain? And so it's not going well. Like, I need to outsource my Zelda playing to, like, children, who will be much better than me. Wait, but is it... it's not forcing you to give up, right? You're just... you're just acclimating to the new environment, right? I mean, the kingdom is not going to be saved, whatever I'm meant to be doing. I'm just walking in circles now for an infinite amount of time. But I'm having fun, which I guess is the main point. Yeah, yeah, yeah.
Starting point is 00:09:34 Anyway, did you play games in childhood and you're kind of coming back? Or is this all new territory for you? I did, but I was also bad in childhood. This is the story of, oh, I used to be, like, super into games and I, like, fell out of touch, like how people were, like, amazing athletes as children and then, like, pick it up again. Like, I'm consistently awful at games, so the search history is not a surprise. Got it, got it. I mean, you were doing, you know, worthwhile things, like thinking about how to ethically deal with all this technology. Well, I was laughing my ass off playing Donkey Kong Country.
Starting point is 00:10:11 I'm just picturing you being like, ha ha, ha ha ha. You can slap the ground and bananas come out. It's a little hack people didn't tell you about. Our AI guest on last week's Expert episode was saying that video games are one of the ways that he thinks that the future of AI is going to impact our day-to-day lives. Is that something that you think about as you're out there kind of just walking in circles in Zelda Breath of the Wild? I mean, I feel like I perform so poorly. They probably think I am am an NPC or something. It's just like, that is not a person with free will.
Starting point is 00:10:49 That person is following a fox around for days. It just rotated 720 degrees over and over for no reason. Have they made games more fun for people who suck at video games? That's the innovation that I'm
Starting point is 00:11:07 looking for, because I was never very good at them, but I liked playing them. I like hearing about them. I think I'm ready to enter my gamer girl era and come back. But yeah, you should. I mean, look, Jack, they already know you as the Switch god because you've been on Nintendo so much the controllers basically fuse to your body. But there are a lot of new... like, the new Star Wars game, you could just set it to, like, man, I'm not trying to do all this fancy shit. I just want to beat people up
Starting point is 00:11:36 or just mash the keypad over and over and win like that. And you can do that. Because I think it is about being able to play at different skill levels rather than being like, oh, you don't know how to use the force while you're doing your melee attacks? Come on, though. Some of us just want to be kids. Unraveled is good for that, too, of a fun video game
Starting point is 00:11:58 that is not really requiring you to have these sort of highly developed video game skills, and it's an easy play. Nice. Dr. McInerney, what is something you think is overrated? I feel like I'm not going to win myself any friends. I'll be like,
Starting point is 00:12:14 how to, like, lose friends and stop influencing people. But all the Disney live-action movies. I've hit a stage where I'm like, no more live actions. I heard The Little Mermaid was beautiful. I love the original Cinderella. But, you know,
Starting point is 00:12:24 I don't want to see more and more of, like, the same movies again. Like, it makes me a bit sad, and, like, what other stories could be told if they weren't, like, always making these live actions? But, you know, maybe I'll be proven, like, totally wrong and the next ones will be amazing, but I'm ready for something else. Yeah, based on everyone's first, like, sort of reaction, like, oh, you saw it? How was it? It was, like, yeah, it was okay. No one was like, it was so good, I'm gonna see it nine times. Everyone just kind of... I think they have to reconcile, like, their love for the original source material with that and not fully be like, I didn't like it. They're like, yeah, I mean, it's... yeah, it's interesting, you know? It's like their voice just goes up. Yeah. You know, it was like, I'm glad I spent my afternoon going to that.
Starting point is 00:13:08 I think that's right. I've never been more confident in a prediction about the future of a, like, subgenre of movies than in saying that they're not going to suddenly figure something out about the Disney live-action reskins of the animations. Like, we know what the original cartoons look like and what happens in them. We know what is possible here. Right. I can't imagine a version... like, what are they going to pull out where we're like, oh, no, that is not... did not see that one coming? Whoa, that bear is talking and singing.
Starting point is 00:13:52 Hold on. Oh, they already did that. The best one, I think, was Jungle Book. And that was the first one they did. And ever since, I feel like, yeah, it's been diminishing returns. Although I didn't see The Little Mermaid, so I cannot speak to that one. What's your favorite genre of film though, Dr. Kerry? Ooh, I mean, okay.
Starting point is 00:14:14 So I really have a soft spot for, like, old-school superhero movies, but my, like, most recent, most amazing film, I feel, was the latest Into the Spider-Verse film. What was it? Spider-Man: Across the Spider-Verse. Yeah. And I just love the animation in that film. It's so witty and the visuals are extraordinary, and they're just great films. So I wax lyrical about how much I love these films at work and everyone has to deal with me being like, go watch it. So anyone listening, go watch it. It's amazing.
Starting point is 00:14:41 favorite spider character spider person in the universe oh my goodness i can't remember the name of it but the light really like the really like dark noir run for the first one he's like from like all those like 1930s kind of like dark mystery films you know in black and white i kind of love that yeah yeah was it was it just spider-man noir yeah i think that's noir spider-man maybe i don't know or i'm sorry spider-man nor nor what is when you say old school superhero movies do you mean like christopher reeves superman or are you talking about like the original man the first iron man uh i think it's almost like less like specific films i love like almost
Starting point is 00:15:27 films that have that like really cheesy like origin coming of age story like you know it's really just about like they're discovering themselves and it's such a simple narrative and like i feel like i should crave more complexity than that and i should want it to be more nuanced stories but sometimes it's just really satisfying and you watch this clean-cut narrative of this like good guy defeating all these bad guys when i want to relax i really enjoy that in my day job i think about lots of like complexity and nuance and like what we do with the future of society and so you know right sometimes i find it very relaxing just to be like star wars yeah you're like yeah i love a joseph camp type flick. Just give me that hero's journey in every shape and form.
Starting point is 00:16:06 Still going to be okay. Yeah, yeah, yeah. Nothing's going to go wrong. Challenge your old self to become your new self. Yes! But Doc Ock is an example of what we're facing with AI, like in the immediate future. I think we can all agree on that, right? Spider-Man 2.
Starting point is 00:16:22 In a way, you're like, it's giving me the equivalent of eight arms. In a way. You're like, oh, boy. What's something you think is underrated? Ooh, underrated. I mean, I'm a big cozy night in person. I want to be one of those fun party people who is out raving. But realistically, at 10 p.m., I'm like, the day is over.
Starting point is 00:16:56 I'm in bed. Like, I think a cozy night in, especially autumn is coming... like, that's just my happy place. Just me at home, my husband. Life is good. Yeah. Cozy vibes.
Starting point is 00:16:56 I like that. That's my whole thing. Especially, we don't get seasons here. So the second there's like, just the feeling of like a chill i'm like i i need to start nine fires and just be near them and that is a problem that miles has and he's working on his therapy with i guess legally it's called arson or something but what i say is i just want everyone to be cozy around the greater los angeles metropolitan area. Yeah, I cannot go within four miles of the Angeles National Forest, but hey, whatever.
Starting point is 00:17:29 They must see my light. Yeah, I love coziness. I love a... Are you cozy or are you summertime, Jack? If you had to pick between, would you rather be summertime Jack or cozy Jack? I do love a summer especially i'm in my uh swimming phase where i just i really like getting in body of water get run running into the ocean even when it's a little cold ocean and yeah so i i think i'm like in a summer phase but i yeah i am feeling the need to get cozy right now.
Starting point is 00:18:05 I'll always take cozy over summer. That's just me. I know. It's like you fetishize winter. I do. I do. It's problematic. That's why I would only date women from the Northeast.
Starting point is 00:18:22 Yeah. I feel like you're dating me for this exotic sort of life you think I live. What's it like with the snow? You're appropriating Minnesota, like Minnesota culture. Right. Yeah. Well, hey, look, we're all trying.
Starting point is 00:18:38 Hey, I heard that. You said, hey. All right. Let's take a quick break and we'll come right back and talk about ai we'll be right back i'm jess casavetto executive producer of the hit netflix documentary series dancing for the devil the 7m tiktok cult and'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the host of the new podcast, Forgive Me For I Have Followed. Together,
Starting point is 00:19:10 we'll be diving even deeper into the unbelievable stories behind 7M Films and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades. Jessica and I will delve into the hidden truths between high control groups and interview dancers, church members and others whose lives and careers have been impacted just like mine. Through powerful, in-depth interviews with former members and new chilling firsthand accounts, the series will illuminate untold and extremely necessary perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. This summer, the nation watched as the Republican nominee for president was the target of two assassination attempts, separated
Starting point is 00:20:03 by two months. These events were mirrored nearly 50 years ago when President Gerald Ford faced two attempts on his life in less than three weeks. President Gerald R. Ford came stunningly close to being the victim of an assassin today. And these are the only two times we know of that a woman has tried to assassinate a U.S. president. One was the protege of infamous cult leader Charles Manson.
Starting point is 00:20:27 I always felt like Lynette was kind of his right-hand woman. The other, a middle-aged housewife working undercover for the FBI in a violent revolutionary underground. Identified by police as Sarah Jean Moore. The story of one strange and violent summer. This is Rip Current, available now with new episodes every Thursday. Listen on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I've been thinking about you.
Starting point is 00:20:57 I want you back in my life. It's too late for that. I have a proposal for you. Come up here and document my project. All you need to do is record everything like you always do. One session. 24 hours. BPM 110.
Starting point is 00:21:14 120. She's terrified. Should we wake her up? Absolutely not. What was that? You didn't figure it out? I think I need to hear you say it. That was live audio of a woman's nightmare.
Starting point is 00:21:28 This machine is approved and everything? You're allowed to be doing this? We passed the review board a year ago. We're not hurting people. There's nothing dangerous about what you're doing. They're just dreams. Dream Sequence is a new horror thriller from blumhouse television iheart radio and realm listen to dream sequence on the iheart radio app apple podcasts or wherever
Starting point is 00:21:52 you get your podcasts. And we're back. We're back. And Dr. McInerney, as mentioned, I am an idiot on this AI stuff. I think I generally have, like, my version of AI up until last week, I guess, like, researching for our last expert episode, was what I had read in, you know, mainstream articles that went viral, and films, like Hollywood films, and then messing around with OpenAI or ChatGPT. So I had this kind of disconnect in my mind where it was like, from an outsider's perspective, we have this C-plus, like, copywriter thing with, like, ChatGPT, GPT-4, and then, like, the godfather of AI, who I'm just trusting people is the godfather of AI, but that's what everyone uses, that same phrase, like, the godfather of AI, just quit Google and says we're all fucked in the next couple years. And I think it's confusing to me because I don't know
Starting point is 00:23:08 exactly, like, I can't even like picture the way, how he thinks we're fucked. And there, there was this letter that was like, we need to pause development on AI for in the near future. And I guess I'm just curious to hear your perspective on that pause letter and what the kind of dangers of AI are in the near future. Yeah, because to Jack's point too, also we were talking with Joao Sadoc last week at NYU about it. And like at the end of it, we're like,
Starting point is 00:23:42 okay, so it's not Skynet, right? From Terminator. And they're like, oh, great. But, we're like, okay, so it's not Skynet, right? From Terminator. And they're like, oh, great. But then we realized there's a raft of other things that come along with just not being the Terminator. So yeah, that's, I'm also from a similar perspective, where I always assume Skynet. Yeah, I mean, I think you're totally not alone in the dominance of ideas like Skynet and Terminator because so much of our cultural framework for understanding what AI is comes from a very narrow set of movies like the Terminator, like the Matrix, which always positions AI as something that's going to dominate us, it's going to take over
Starting point is 00:24:16 the world, and it's going to control us. And it's important to, I think, highlight that that's definitely not the only ideas that we have about AI. We've got thousands of years of thinking about our relationship with intelligent machines. And there's a lot of different cultural traditions that have really different ways of thinking about our relationships with AI and intelligent machines that could be much more positive, much more harmonious. And so I do think our immediacy of jumping to this idea of Skynet is reflective of very much where we are right now, right? I'm in the UK, you're in the US.
Starting point is 00:24:48 These are countries that have a really long history of thinking about AI in very binary terms. So, yes, I think it's important that we think about these long-term risks of AI. And you mentioned the pause letter calling for a halt to generating more large language models like ChatGPT until it had a bit more of a moment to think about some of the long-term consequences of these models. But I think it's really important not just to think of the long-term risks, but to think about which long-term risks we prioritize. Because I think the Skynet Terminator fantasy eats up a lot of oxygen about how we talk about AI's risks, but there's a lot of different risks that AI poses. So another long-term risk that we don't talk about very much at all is the climate cost of AI,
Starting point is 00:25:36 right? Because AI, it's hugely energy intensive. Data centers require a huge amount of water to function. And we have this massive problem of e-waste produced by a lot of electronic systems. But that long-term problem of climate crisis is much less exciting. It's really scary. It's really grounded in a lot of our experiences. And so it just doesn't seem to get as much airtime. So that's something that I think is really important is changing the conversation a bit to say, okay, it's sometimes interesting,
Starting point is 00:25:59 sometimes scary to think about the Terminator option, but what are some of the other long-term options that could really shape our lives yeah like the degree to which the deck is being stacked towards the terminator option was surprising to me like we we dug in last week a little bit to the two stories i had always heard that are like kind of put into the Terminator version of AI taking over a category. There's the AI that like killed a person in a military exercise that decided to like eliminate its controller. And then there's the AI that hired a task rabbit to pass the CAPTCHA test. And like in both cases, those are like the ai that killed a person in the military exercise
Starting point is 00:26:46 like that was somebody claiming that and then when they went back they were like oh i was just saying it could hypothetically do that it was a thought experiment yeah it was a thought experiment of what could what an ai could do in the right circumstances and the the TaskRabbit one was more similar to the self-driving car Elon Musk thing, where it was just there was a human prompting it to do that thing that seems creepy to us when we like start thinking about, oh, it's like scheming to get loose and get like overcome the things that we're using to keep it hemmed in so it it does feel like there is an incentive structure set up for the the people in charge of some of these major ai companies to get us to believe that shit like to think to only focus on the ai v humanity like AI gets loose of its control of our controls for it and takes over
Starting point is 00:27:48 and starts like killing people version of it I'm just curious like what what are your thoughts on like why why why are they incentivized to do that when it would seem like you well you don't want to make it you don't want it to seem like like this self-driving car will take over and start. Kill your family. Yeah, start killing your family. It's so powerful. But it seems like with AI, they're more willing to buy into that fantasy and like have that fantasy projected to people who are not as closely tied to the ins and outs of the industry. Yeah, I mean, I think that's such an important point, because it's a weird thing about the AI
Starting point is 00:28:32 industry, right? Like you would never have this kind of hype wave around something like broccoli, where you say, oh, the broccoli, if you eat it could kill you, or it could like, transform the world. And then you wouldn't expect that to somehow get people to buy a lot more broccoli and just be like, oh oh i don't want to eat broccoli now but if they were like it's so fucking good and powerful that it'll make you explode like maybe like maybe that's what it is even worse you can then also eat you know what the broccoli would be doing yeah but i do think that we see this real cultivation of hype around AI and that a lot of firms explicitly use that to sell their products
Starting point is 00:29:08 and it gets people interested in it because on the one hand, people are really scared about the long-term and short-term impacts of AI. On the other hand, they're also scared then though of getting left behind. So you see more and more firms saying, well, now I've got to buy the latest generative AI tool so that I look
Starting point is 00:29:23 really tech savvy and I look tech forward and I look futuristic. And so it's part of this bigger hype cycle, I think, to draw a lot of attention towards their products, but also to make them seem like this really edgy, desirable thing. But I think what's also interesting about both the stories that you raised is when you looked under the hood,
Starting point is 00:29:40 there was human labor involved, right? There were people who were really central to something that was attributed to an AI. And I think that's a really common story we see with a lot of the type around AI is often the way we tell those stories erases the very real human labor that drives those products, whether it's the artists who originally made the images that trained the generative AI through to data labelers, all sorts of people who are really central to those processes. Right.
Starting point is 00:30:07 And I know like in your episode of The Good Robot, when you're discussing the pause letter, you know, I think the version that we see as like sort of the short term threats, at least in the most immediate way is like for me working in and around entertainment and people who work in advertising and seeing like an uptake in that section, I go, okay, that's easy. Like I can see how a company immediately goes, yeah, it's a tool. And then suddenly it's like, and now you're on your ass because we'll just use the tool now. And we don't even need a person to prompt it, or we need many, we just need fewer people to operate it. So to me, I'm like, okay, that's an obvious sort of thing I can see on the horizon.
Starting point is 00:30:46 And you did talk about, well, there was a lot of talk of these sort of long-term existential or quote-unquote existential threats, that there were a lot of things in the short term that we're actually ignoring. What are those sort of things that we need to bring a little bit more awareness to? Like, I know you mentioned the climate um and i i look at it from my perspective i see like the just massive job loss that could happen um but what are sort of like the more short-term things that kind of maybe are less sexy or interesting to the people who just want to write about killer terminators and things yeah i mean i think less sexy is exactly the right phrase for this which is a lot of the short-term issues are very much about entrenching existing forms of inequality and making them worse. And that's often something people don't really want to hear about because they don't want to acknowledge
Starting point is 00:31:33 these inequalities or because it takes away from the shiny newness of AI. It makes it very much like a lot of other technological revolutions that we've already seen. And that's super boring. Like you don't want to hear about how the wheel somehow brought about some kind of inequality. The wheel is racist and we all know it. The big hot take from today. But yeah, I mean, something that I look at, for example, are like very mundane,
Starting point is 00:31:57 but important technologies like technologies used in recruitment and hiring. So I look at AI powered video interview tools and look at how that affects people's particular likelihood of being employed and how they go through the workforce. And yep, it's less exciting seeming than the Terminator. But again, when you look under the hood and dig into them, you're like, oh, wow, this could actually really, really compound inequalities that we see in the workforce under the guise of the idea that these tools are going to make hiring
Starting point is 00:32:24 more fair. And that's a massive problem. Right. So because like the idea with those hiring tools, like it will actually take away these sort of like demographic cues that someone might use to like, you know, they'll apply their own biases, too. So, in fact, it is the most equitable way to hire. But is it because of just the the kinds of people that are creating these sort of systems because they tends to be a bit one note that that's inherently where like sort of that like it begins to wobble a bit it's a mixture so of course yes the lack of diversity in the ai industry is like very stark it's also sadly in the uk an industry where for example women's representation is actually getting worse not better so that's a sad slap in the face for a lot of the progress narrative that we want to see.
Starting point is 00:33:09 But sometimes it's not even necessarily that the people creating these tools have bad intentions. Maybe not even that they're using faulty data sets or biased data sets. These are two of the really big problems that are flagged. But sometimes the underlying idea behind a product is just bogus. It's just a really bad concept. And yet somehow it gets brought to market again because of all this hype around AI. So with the video interview tools that we look at, for example, they basically claim that they can discern a candidate's personality from their face and from their words. So how they talk, how they move, they can decide how extroverted you are or how open you are, how neurotic you are,
Starting point is 00:33:47 how conscientious you are, all these different markers of personality. To which I would say, firstly, no, there's absolutely no way an AI can do that. This is just a very old kind of racial pseudoscience making its way back into the mainstream saying, okay, we can totally guess your personality from your face like it's like your friends looking at someone's profile picture on like tinder or
Starting point is 00:34:10 whatever and being like they look like they'd be really fun at a party like it's about that level of accuracy you know and then second is that even a good way of judging if someone's gonna be good for a job like how extroverted do you want a person in a job to be maybe in your job that's really really helpful in my job i don't know how helpful it is so there's just kind of a lot of flaws at the very you know bottom of these products that we should be worried about just like a c minus level job hiring process like that's what i feel like so many of the things like when you get down to them and see them in action they're like not that good like it does feel like the whole thing is being hyped to a large degree and like that's something i heard from somebody i know who like works in
Starting point is 00:34:59 you know let like all of my friends who work in finance or any of those things, my brain shuts off when he starts talking about what he does. But he was saying, he pays attention to the market and he was saying there's a big thing propping up the stock market right now is AI. And it really is. That's where so much of people's wealth is tied up is in the stock market. so much of people's wealth is tied up as in like the stock market and it's just tied to like what you can get people excited about in a lot of cases so it really like that from that perspective the incentive structure makes sense like you want people talking about how your ai technology can do all these amazing things because that literally makes you have more money than you would have if they knew the truth about your your product without being like yeah how many seven
Starting point is 00:35:53 fingered trump pictures can we create and right be like yeah man fucking dump millions into this yeah yeah i mean that's kind of something that's really come out of the last few years is how many firms just use the label of AI to get funding. Like, I think there was a study a couple of years ago that said like 40% of podcast because eventually it could use AI. And before we were recording, Miles was actually putting in, he asked an AI to pitch him an episode of Friends in which the cast and the people on the Friends on the show deal with the fallout from the events of 9-11. And it wouldn't do do it so we can't quite claim that we are an ai podcast yet but but we did it did do it when i said do a uh pitch me an episode where joey and monica drive uber oh right from yeah and it did so clearly it because you can see where these guardrails are they're like don't do 9-11 stuff though that's right no but i mean so yeah i think there's two
Starting point is 00:37:07 things we're talking about here like from from one perspective like yes you could put it in the category of like well yes the wheel makes racism or colonialism more frictionless is a word that gets used a lot but like literally in the case of the wheel, frictionless, but AI and like a lot of technology is designed to make groups of people and like our interactions and the things that make people money more frictionless. And that's something that you guys have talked about on recent episodes of Good Robot. Like there's this one example that really jumped out to me that I think was from your most recent episode, or at least the one that's up most recently right now
Starting point is 00:37:51 as we're recording this, where you guys were talking about a company that asked a regulating body to make an exception to a law around like a high risk use of AI. And the law said that people had to supervise the use of AI, like just because it seemed dangerous. And the company appealed to the regulating body by saying, well, we just like that would cost too much and we would never be able to like scale this and make a profit. And it feels to me like our answer as a civilization to that complaint needs to be, that's not our problem. Then you shouldn't be doing it. But instead, it seems like the answer too often, not just in AI, but just across the board,
Starting point is 00:38:39 especially in the US, is like, okay, well, we have to make an exception so that they can make a profit around this technology or else the technology won't get developed. Because the only thing that drives technological progress is like the profit motive. But that's, you know, as I think you guys talked about in that episode, that's never been the best way to develop technology. Like it's, it's been a good way sometimes to democratize existing technology. But like, that's, I don't know.
Starting point is 00:39:09 I feel like that idea of you have to make it profitable. You have to make it easy on these companies to keep trying different things for AI to become profitable is baked in at a like cellular level at this point in how a lot of you know western colonial civilizations operate yeah i mean i think too often a lot of the technologies that shape our daily lives are made by a very narrow set of people who ultimately aren't beholden to us they're beholden to their shareholders or to their boss and right so they don't really have our best
Starting point is 00:39:42 interests at heart right like? Like, for example, take this whole rebranding of like Twitter to X by Musk. I remember waking up and finding my little Twitter bird replaced with this huge X and just being like, oh, firstly, because it was part of,
Starting point is 00:39:56 you know, Twitter's low decline. But secondly, it made me feel pretty disappointed or really aware of the fact that one guy could have such a huge impact on how literally millions of people use a social networking platform that's actually super important, you know, to their daily lives and has played a huge role in activist movements
Starting point is 00:40:15 and fostering different communities. And I think that's a story we see time and time again with some of these big tech companies, which is not only do they have their own profit motive at heart, they also, they're not beholden in any way to the public and they're not being compelled by regulation to make good decisions that necessarily benefit the public. So I think a really important question going forward is how do we support kinds of technology development that are very much based in the communities that the technology is for? I think one really big part of that is recognizing that so many AI models, as you mentioned, right, they're designed to be scalable.
Starting point is 00:40:51 And that's how they make money, this idea that you can apply them globally and universally. And I think that's a big problem, partly because it often is really homogenizing. It can involve like exploiting a model from the US usually out to the rest of the world. It's probably not actually appropriate to use in those contexts but also a lot of really exciting and good uses of technology i think come from these really localized specific community-based levels so sometimes i think it can be about thinking smaller rather than bigger yeah yeah i that was like another thing that struck me about like just all the warnings and even in that pause letter
Starting point is 00:41:24 sort of like the presumption that it's like well all you motherfuckers are gonna use this so we gotta talk about it where it's like i don't know i don't even fucking know what it is like a second ago i thought it was skynet and now like you know you have your company being like yeah we now have enterprise ai tools like welcome you're like but what am i huh like what and i think that's what's a really interesting thing about this as like a sort of technological advancement is before people even really understand what it is, there is like from the higher the powers that be sort of going into it being like, well, this is it. Like, everyone's using it, but I'm still not sure how. sure how and i guess that probably feeds into this whole model of generating as much you know excitement market excitement about ai is by taking the angle of like everything's different because everyone is going to be using ai most of y'all don't know what that is but get ready and i think that's what also makes it very confusing for me as it's like a lay person outside of the tech sphere
Starting point is 00:42:22 to just be like wait so are we all using it? And even now I really, I still can't see what that is and how that benefits me. And I think that's a big part of, I'm sure your work too, or even like any ethicist is to understand like, well, who does it benefit? Like first we're making this because it benefits who and how. And I think, is it right now it benefits the companies that are making it? It sort of feels like that's the way it's being presented, or slightly being like, yeah, you guys are going to love this, but really, we're going to benefit from the adoption of this technology. Yeah.
Starting point is 00:42:56 I mean, I think that's this crucial question is this stepping back and saying, actually, is this inevitable? And do we even want this in the first place? And I think that's what really frustrated me about the pause letter and about a number of kind of big tech figures signing on to it is that they're very much pushing this narrative of like oh this is like unstoppable and it's inevitable and it's happening we've got to find ways to deal with it and it's like you're making it like you're the people literally making these technologies in a lot of cases so if you really think it's an existential risk to humanity stop it honestly could even be that simple but that you know what
Starting point is 00:43:33 makes me really then question their motives and sort of coming forward with a lot of this kind of very doom and gloom language um i think it's also interesting if you look at for example countries as national ai strategies um so if you look at say like example, countries as national AI strategies. So if you look at, say, like China and the UK and the US and these countries that are now thinking about what their national long-term AI strategy is going to be, they also very much frame it around the idea that AI is completely inevitable, that this is going to be the transformative technology for imagining the future, for geopolitical dominance, for economic supremacy. And again, I think as an ethicist, what I really want people to do is step back and say, I think we're actually at a crossroad where we can decide whether or not we think these technologies are good for us and whether they are
Starting point is 00:44:15 sustainable, whether they are a useful long-term thing for our society, or actually whether the benefits of these technologies are going to be experienced by very few people and the costs are going to be borne by many. Right. We talked last week about the scientific application that, you know, used deep learning to figure out the shapes of proteins, the structures of proteins, and that that could have some beneficial uses, and that that could have some beneficial uses, will probably have some beneficial uses for how we understand disease and medicine and how we treat that. But there are ways to probably differentiate
Starting point is 00:44:54 and think about these things. Like it's not, you don't just have to be either Luddite or like AI pedal to the floor, let's just get out of the way of the big companies, you know, it feels like. But it is such a complicated technology that I think there's going to be inherent cloudiness around how people understand it
Starting point is 00:45:20 and also manufactured cloudiness because it is in the overall system's benefit the overall system being like capitalism it's in their benefit to generate like market excitement where there shouldn't be any basically yeah i mean i think it's easy to generate this kind of nebulousness around ai because to some extent we still don't really know what it is. It's still more of a concept than anything else because the term AI is stretched and used to describe so many applications. I spent two years interviewing engineers and data scientists in a big tech firm and they would sort of grumble, well, 15 or 20 years ago, we didn't even call this AI and we were already doing it. It's just a division tree. Again, it's kind of part of that branding.
Starting point is 00:46:04 But also we have these, again, thousands of years of stories and thinking about what an intelligent machine is. And that means we can get super invested and super cloudy very, very quickly. And yeah, I don't want people to feel bad for being scared or being cloudy about these technologies. It is dense and confusing. But at the same time time I do think that it is really important to come back to that question of what does this do for us so you know the question of like Luddism or being a Luddite I think is really interesting because I you know personally I do use AI applications there's certain things about these technologies that really excite me but I'm really sympathetic to some of the kind of old school Luddites who weren't necessarily anti-technology, but were really against the kind of impacts that technology were having on their societies.
Starting point is 00:46:50 So the way that new technology is like, I think would be things like spinning and weaving were causing mass unemployment and the kind of broader ramifications that was having for people in the UK socially. having for people in the UK socially. And that kind of has quite a scary parallel to today in terms of thinking about maybe what AI will bring about for the rest of us who maybe aren't researchers in a lab, but who maybe might be replaced by some of these algorithms in terms of our work and our output. Yeah. Can you talk at all about open source like models of because you know what when we talk about this idea that corporations have all this power and are incentivized to do whatever is going to make the most money which in a lot of cases is going to be the thing that removes the friction from consumption decisions and you know just how people interact and do these things which well as you guys talked about in your episode like removing the friction like friction can be really
Starting point is 00:47:51 good sometimes sometimes your system needs friction to stop and correct itself and recognize when bad shit when things are going wrong but you know there's also a history in even in the US where corporations are racing to get to a development and ultimately are beat by open source models of technological organizing around like getting a specific solution. Do you have any hope for open source in the future of AI? Yeah, I think I'm really interested in community forms of development. And I think open source is a really interesting example. I think we've seen other interesting examples around things like collective data labeling. And I think that these kinds of collective movements, on the one hand, seem like a really exciting community-based alternative to
Starting point is 00:48:45 the concentration of power in a very, very narrow segment of tech companies. On the other hand, though, I think community work is really hard work. We had Dr. David Adelani on our podcast, who's a very important figure in Masakana, which is a grassroots organization that aims to bring some of the over 4,000 African languages into natural language processing or NLP systems. And he talks a lot about how the work he does with Masakana is so valuable and so important, but it's also really, really hard because when you're working in that kind of collective decentralized environment, it can be much slower. And as you said, there can be a lot more friction in that process but counter to this move fast break things kind of culture sometimes that friction can be really
Starting point is 00:49:30 productive and it can help us slow down and think about you know the decisions that we're making very intentionally rather than just kind of racing as fast as we can to make the next newest shiniest product i was i'm also like in your work too you know you talk about how you know like looking at these technologies especially through a lens of like feminism and intersectionality and you know bipoc communities and things like that i don't like broadly in science there's like you know there's an issue of like language hegemony in scientific research where if things aren't written in english a lot sometimes studies just get fucking ignored because like, I don't speak Spanish or I can't read Chinese.
Starting point is 00:50:10 Therefore, I don't know if this research is being done and therefore it just doesn't exist because the larger community is like, we all just think in English. Like, so how do you like, you know, specifically,
Starting point is 00:50:21 cause you know, when hearing the description of your work, help me understand like, and the listeners to like of how we should be looking at these things from that also from that to your point, it's like the thing we of course, it's unequal. Of course, it's racist or whatever. But what are those ways that people need to really be thinking about this technology? Yeah. I mean, I think English language hegemony is a really good example of this broader problem of the more subtle kinds of exclusions that get built into these technologies, because I think we've all probably seen the cases of AI systems that have been really horrifically and explicitly racist or really horrifically sexist from, you know, Tay, the chatbot that
Starting point is 00:51:18 started spouting horrific right-wing racist propaganda and had to get taken down through to Amazon hiring tool that systemically discriminated against female candidates. These are really, I think, overt depictions of the kinds of harms AI can do. But I think things like English language hegemony are also incredibly important for showing how existing kinds of exclusions and patterns of power get replicated in these tools. Because to an English language speaker, very crucially, they might use ChatGPT and think, this is great. This is what my whole world looks like if they only speak English. Obviously, anyone who is not a native English
Starting point is 00:51:54 speaker or who doesn't normally speak English, it's going to be an incredibly different experience. And that's where I think we see the benefits of these tools being really unequally distributed. I think it's also important because there's such exclusions in which kinds of languages and forms of communication can get translated into these systems. So for example, I work with a linguist at the University of Newcastle and she talks about the fact that there's so many languages, like signed languages and languages that don't have a written language, but they're never going to be translated into these tools and never are going to benefit from them. You might think, okay, well, do these communities want those languages translated into an AI tool? Maybe, maybe not. I'd argue, of course, it's up to them, but those
Starting point is 00:52:35 communities are still going to experience the negative effects of AI, like the climate cost of these tools. And so I think it's just really important, like you said, to think about what kinds of hegemony are getting further entrenched by AI-powered technologies. All right, great. Let's take a quick break and we'll come back and finish up with a few questions. We'll be right back. I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church.
Starting point is 00:53:13 And we're the host of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades. Jessica and I will delve into the hidden truths between high-control groups and interview dancers, church members, and others whose lives and careers have been impacted, just like mine. Through powerful, in-depth interviews with former members and new, chilling firsthand accounts, the series will illuminate untold and extremely necessary
Starting point is 00:53:45 perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. This summer, the nation watched as the Republican nominee for president was the target of two assassination attempts separated by two months. These events were mirrored nearly 50 years ago when President Gerald Ford faced two attempts on his life in less than three weeks. President Gerald R. Ford came stunningly close to being the victim of an assassin today.
Starting point is 00:54:25 And these are the only two times we know of that a woman has tried to assassinate a U.S. president. One was the protege of infamous cult leader Charles Manson. I always felt like Lynette was kind of his right-hand woman. The other, a middle-aged housewife working undercover for the FBI in a violent revolutionary underground. Identified by police as Sarah Jean Moore. The story of one strange and violent summer. This is Rip Current. Available now with new episodes every Thursday.
Starting point is 00:54:56 Listen on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I've been thinking about you. I want you back in my life. It's too late for that. I have a proposal for you. 110, 120, she's terrified. Should we wake her up? Absolutely not. What was that? You didn't figure it out? I think I need to hear you say it. That was live audio of a woman's nightmare. This machine is approved and everything?
Starting point is 00:55:38 You're allowed to be doing this? We passed the review board a year ago. We're not hurting people. There's nothing dangerous about what you're doing. They're just dreams. Dream Sequence is a new horror thriller from Blumhouse Television, iHeartRadio, and Realm. Listen to Dream Sequence on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. So, I mean, one thing that I want myself to just, like, get out of this conversation is just a sense of, like, what we think AI might look like,
Starting point is 00:56:25 how it might impact what the world around us looks like in the not-too-distant future. And we've already talked about some ways that AI is being overrated as an autonomous killbot, like a Terminator-style killbot. There are some other examples. On one of the episodes of your podcast, an AI expert talks about getting an AI-enhanced cancer scan and assuming the scan
Starting point is 00:56:56 was, like, taking 3D video or doing 3D modeling, and it was just putting a box on a 2D image. And I believe the guest admits that they were influenced by Hollywood movies. And it seems like that's what people who are trying to make money off of this want. So we're going to have this steady push to make us overrate and misunderstand what the actual promise of AI
Starting point is 00:57:24 is going to look like, what tools are going to be given to us in the near future, and what are those tools actually able to do? So I'd just be curious to hear from you: what do you think are ways that AI might intervene in our lives in the near future, or might already be intervening in our lives? What are those things that we're underrating? And what are the things that you think are probably taking up too much of our bandwidth in terms of how we're picturing AI? Oh, that's a great question. I'll start with the second half, with what I think is maybe a little bit overrated, or maybe will take a bit longer to pan out. I would say some of these really high-tech applications, things like AI in medicine, for example. The idea that we might see them expand soon, that every hospital will have this tool, or, say, AI in education, that every classroom is going to have access to this tool, I think is unfortunately just really grossly overestimating the kinds of resources, certainly in this country, that schools and hospitals have.
Starting point is 00:58:35 I was talking to a friend who's a doctor here and she was saying, oh, well, what do you think about AI in medicine? And then she kind of stopped for a moment and just laughed and said, well, I don't know why I'm asking you that, because my hospital uses paper notes. And I said, what? And she said, yeah, our whole hospital is not computerized. And that was a good reality check for me, because it made me realize, like, wow, actually some of the stuff that I'm thinking about is so far away from the reality of what people like my friend the doctor are having to deal with on the ground. And same with schools. You know, here, hundreds of schools, I think, have had to be closed because
Starting point is 00:59:10 there's some kind of issue with the concrete and they might collapse, which, A, is horrific, but, B, is such a different level of infrastructural problem that thinking about AI in the classroom is really not on the radar in that scenario. Yeah, let's work on walls first. Exactly. Yeah, think about
Starting point is 00:59:26 autonomous AI Socrates. Yeah, it's like a low bar. Yeah, people are like, what if students cheat? And you're like, look, what if the building crushes them? Yeah, yeah. You know, so those are things that I think are maybe a little bit on the overrated side. In terms of ways things are underrated, I'm really interested in how AI can change how we see the world and how we see ourselves. Like, I think, you know, take, for example, something like TikTok. You know, I have to confess, I'm a big TikTok user. I love scrolling a lot of mindless garbage. It brings me a lot of joy and peace.
Starting point is 01:00:03 But even things like AI-powered beauty filters, right? Like, I think that has such a profound effect on how you just understand and see yourself and your own face. They're often really imbued with gendered and racist kinds of assumptions as well. I often look a lot whiter when I use a beauty filter, even though I'm multiracial. All of those assumptions, I think, get baked in, and they seep into our lives in really
Starting point is 01:00:25 subtle ways, but I think collectively it can have quite a big impact. Yeah, I was talking about the... I don't know if this is technically AI, but then again, I don't know what AI is or if there's, like, a specific argument. But, like, the way that iPhones take 40 pictures consecutively and then choose the best one from the 40, and, like, the live photo setting, feels like a thing, and it's really good at it. And it's something that just happened on my phone with an update, and I was like, yeah, this is just how you take pictures now, and pictures are way better than they used to be. I think I saw somebody speculate that some of the software, like Photoshop and other things that have traditionally had sort of a difficult learning curve, will suddenly get much easier for people to use.
Starting point is 01:01:17 So, yeah, I like those. Those are, like, having actual tools that are kind of easy to use and very simple, straightforward. We know what the goal is here, and we're able to use these enhanced kinds of programs to achieve that goal. Like, that feels more possible to me, and like something that we might see in the not-too-distant future. Totally. That's, like, a pitch you can understand: this will make it easier for you to Photoshop a friend out and just put someone else in. You're like, okay. Rather than, right now, it's like, this shit will end the world. And you're like, well, what is it? I don't know, I don't know, I don't know, but it will fuck you up.
Starting point is 01:02:02 And you're like, huh. And I guess that's such a different proposition up front. And earlier, Dr. Kerry, you were saying, like, there are things that genuinely excite you about these emerging technologies. And I would love to hear from your vantage point, because you are looking at this and, like, what is ethical and what is going to bring meaningful value to people? Like, what are those things where I can feel like, oh yeah, yeah, I can get down with that future? Yeah.
Starting point is 01:02:28 I mean, I recognize that I sound very doom and gloom a lot of the time that I speak and all my friends have to deal with my sort of constant existential anxiety about technologies. But at the same time, you know, I'm an active user of a lot of them. Like I use a lot of voice dictation software,
Starting point is 01:02:44 voice editing software, you know, for when we record our podcasts. And these are things that genuinely make my life so much better and easier. I love being able to transcribe what I'm saying and have it appear on the screen, and not have to type out my emails and have them appear. I recognize these are all very boring. I should have said something like, I DJ. No, not at all. I write emails using voice dictation software. But I think kind of extrapolating out from that,
Starting point is 01:03:17 there's really amazing applications when it comes to accessibility and when it comes to the kinds of access people can now have online because of advancements in AI-powered tools. And so I think, though, you know, what's really central there is kind of that broader vision of, you know, what is the kind of benefit this is meaningfully bringing to society? What problem is being solved? And are the people who are, like, most affected by that problem the ones leading the conversation and saying what they need? Because too often, I think we see tech developers
Starting point is 01:03:41 creating stuff that actually no one really asked for. I'm sure you have all, like, seen that thing on Amazon when you're like, who asked for that banana holder or avocado peeler or something? And too often, I think tech add-ons can be a little bit like that. Whereas I think when we have really interesting, conscious development of, say, for example, feminist tools that are designed to encourage good conversations, things like that, that really makes me excited about the kinds of futures we could have with technology.
Starting point is 01:04:11 Right. Yeah, it almost feels like the danger here is when someone has, like, a technology and goes, and this is going to be in every home. And you're like, this is a sales pitch, actually. Yeah. Because if you're not saying this is how we will work less and have more time to frolic, to enjoy our human existence, to connect with our families, then, like, miss me with that. Because it reminds me of crypto. It reminds me of the metaverse, and it reminds me of Zuckerberg being like, every worker will have this fucking headset on.
Starting point is 01:04:41 No, no. But nice fucking try, asshole. And I feel like this is kind of like, it has a similar tone of, get ready, folks, for this thing. And granted, they have such a shiny toy in the form of these large language models that are fun to, like, use, and really they're just skimming the internet and just giving it to you in a nice, tidy sentence. But yeah, like, I feel like I'm seeing that dimension when you see the hype around it, which feels a little more like,
Starting point is 01:05:10 y'all are talking about this to make money. Whereas the other things that you're talking about, like accessibility and trying to democratize certain applications, or things like that, there's less scale involved with something like that. So maybe it isn't talked about. Exactly. And I just think it's really important to recognize that,
Starting point is 01:05:28 you know, I'm not saying that, say, things like smart home technology, like Amazon Alexa and things, are inherently bad products. I'm sure heaps of people love having those technologies in their home. They find them useful. But I just think whenever you bring in a new technology, you also bring in new vulnerabilities and you bring in new costs. And sometimes the hype wave means we focus on the benefits.
Starting point is 01:05:49 You know, oh, if you bring this to every home, everyone's life will be easier, everyone will have a more seamless home experience, when actually that's not the case. There's real costs to bringing those tools into a lot of people's homes. Everything from, for example, the way that certain kinds of elder care is getting replaced by technological tools in the UK care system, through to the fact that those tools can really easily be used for the purposes of domestic abuse, intimate partner violence. It can be used to control or trap someone in their home. And these are really ugly truths about those technologies, and they're often not the ones that are put at the heart of technology development. And so that's a lot of my job, I guess, is just to say,
Starting point is 01:06:26 how do we put those vulnerabilities and those costs at the forefront and then judge holistically whether or not we really want this product to exist? Yeah, but here's the thing. I got a lot of my stock portfolio tied up in these companies. I need these fucking things to moon,
Starting point is 01:06:44 if you know what I mean. So I want to end the episode just doing something that you guys do on The Good Robot, which is talk about some, like, books that are recommended, or, you know, what works you recommend. You know, you talked on an episode earlier, I think it was in the summer, about how a lot of the ideas and fears and hopes that we have currently are very similar to what we've seen in movies. And, like, when you look at Elon Musk's 10 favorite books, they're all, like, fed by this same, like, Isaac Asimov,
Starting point is 01:07:21 I, Robot thing. I'd add a lot of Kubrick's ideas around AI go back to this very specific version of technology where it inevitably turns into a mean psychopath who is set on dominating all of humanity. So as an alternative, because we do like to talk about, you know, the importance of
Starting point is 01:07:46 expanding our imagination, like, you know, with regards to climate, a lot of it is, like, either apocalypse or business-as-usual capitalism, and people have a hard time imagining the alternative. On the subject of AI, first of all, is there any, like, mainstream kind of popular movie where you feel like that example of AI actually, like, got it right? And then are there other, kind of more obscure works that you would send people to in order to sort of feed their imagination of what a world with AI technology could look like? Yeah, I mean, that question of the imagination, and how our current imagination of AI is really narrow, is super key. And, you know, I think it's fascinating that, yes, you know, Musk and these folks have, like, a very narrow set of stories that they refer to. In those stories, it's always like, oh, this hypermasculine tech bro makes an AI, and then it looks like him, and then it takes over the world. And you're like, are you okay? You're like, your bedtime stories are weird. The mean sociopath bent on world domination thinks that
Starting point is 01:08:57 all AI is going to be a mean sociopath bent on world domination. That's so weird, right? Yeah, who would have guessed that? Yeah. But yeah, I mean, something that I really like are stories, particularly science fiction stories. I love sci-fi and fantasy. I'm, like, a huge believer in the way that it can help us imagine different worlds. In terms of, like, mainstream films, I guess the one that comes to the top of my head is Big Hero 6, the Disney Pixar film where you have an AI-powered kind of robot healthcare buddy called Baymax. And I think that's a really interesting example of an AI where, you know, at one point in the film he gets dressed up in armor, and, you know, the kid hero starts to try and use him as a kind of weapon, but, like,
Starting point is 01:09:40 this is an AI that so hasn't been designed to be a weapon, right? That kind of resists being weaponized in a lot of ways. And in that sense, it's really countercultural to a lot of the AI we see in Western film and cinema, which is often very weaponized. It's often either this, like, kind of sexy cyborg figure, or it's this hypermasculine Terminator figure. Yeah. So I find Baymax is kind of genderless and, like, just a big, puffy cloud-balloon person. But that's also so accurate of what a young boy would do immediately with that. It's like the tweet, I can tell you me and my friends would have killed E.T. with hammers. It's like, little boys are monsters. But that's a great example. Yeah, in terms of
Starting point is 01:10:27 like, some alternative, like, stories and ideas, sorry to do a really shameless self-plug, but we have a book coming out called The Good Robot: Why Technology Needs Feminism, and I'm plugging it because it's really beautiful. We worked with a science fiction illustrator called Cindy Lee, and it was myself and Dr. Eleanor Drage, my co-host. And the whole thing is these 2,000-word essays. They're really short and really punchy, by lots of different guests we've had on the podcast. And they all respond to this idea of good technology, dot, dot, dot.
Starting point is 01:10:55 So it might be, good technology is free, or good technology, you know, challenges power. And the idea is you can just dip in, read one that was illustrated, and then just dip out when you need a break or a moment. But yeah, I really would encourage you to pick that up, because it just contains the most incredible feminist philosophers, technologists, inventors, activists, who are really pushing forward different ideas of the kinds of technological futures we could have. I mean, apart from that, I read a lot of, like, Chinese diasporic and Asian American sci-fi as
Starting point is 01:11:25 well, particularly sci-fi that's thinking a lot about the climate crisis and AI and the intersections of that. And so I think a lot of those stories have really interesting and different perspectives on what AI is and can be, from, like, Aliette de Bodard's work, which explores, you know, the relationships between humans and sentient mindships in her space opera universe, through to, like, Larissa Lai's Salt Fish Girl, which is a very dystopian imagining of a future society based on labor from clones. All of these novels, I think, just deserve a lot more love and a lot more kudos. They're just absolutely interesting and imaginative, with gorgeous ideas about the future.
Starting point is 01:12:09 Amazing. Well, this has been such a fascinating conversation. Thank you so much for taking the time at what I have to assume is midnight where you are right now. It's almost nine. Almost nine. Okay. Yeah, that's too late. Well, thank you, Dr. McInerney, for doing the show. Where can people find you, follow you, hear you, all that good stuff? Yeah, thanks so much for having me on. It's been a blast. Yeah. Check out The Good Robot.
Starting point is 01:12:29 We're on YouTube, Apple, Spotify, and then keep an eye out for the book coming in February 2024. Amazing. Yeah.
Starting point is 01:12:36 We'll have to have you back for that. Yeah. Yeah, absolutely. And is there a work of media that you've been enjoying? Ooh,
Starting point is 01:12:42 media in terms of, like, a show, like a podcast? It could be a show, like a podcast, a tweet. Oh, I feel like I should choose something really intellectual, but I actually just sent my husband a tweet that really got me, because it was very me. It's actually a bit dark, but it was this news headline about this hiker who'd been missing, and the rescuers couldn't get in contact with them, and they didn't get rescued for 24 hours because they wouldn't answer their phone, because it was an unknown number. And this is just exactly me. Like, my husband's like, please pick up the phone, and I'm like, you know what, I will not, I will not answer an unknown number. And so that brought me a
Starting point is 01:13:21 little bit of joy today. Absolutely. And I feel like that's a great example of how technology has marched forward and completely ruined something that used to work: phone communication. But we can't answer our phones anymore because of the fucking bots. Miles, where can people find you? What's the work of media you've been enjoying? Yeah, at Miles of Gray, wherever they got at, I'm there. Soon to Blue Sky, shout out Ill Will. Christy, you got it, main. Yeah, you and I were hopping in. Okay, codes, we got the invite to check out the Blue Sky.
Starting point is 01:13:57 So, yeah, there shortly. Also check Jack and I out on our basketball podcast, Miles and Jack Got Mad Boostie. Check me out on my 90 Day Fiance podcast, 420 Day Fiance. And also the true crime show, The Good Thief, which has all eight episodes out. So please stream those, binge those. Talking about the Greek Robin Hood, the man who legitimately was kidnapping millionaires and doing some old-fashioned wealth redistribution in a positive way, never hurting people either. Never hurting people. Like, again, an ethical criminal, if there is such a thing. Some tweets I like: first one is from, you know, shout out to the WGA and all the negotiators, because it looks like, with the WGA strike, they're close to getting something ratified. So we like that. This tweet is from Jeff Yang, at Original Spin, who tweeted, just underscoring how WGA negotiators
Starting point is 01:14:48 told the studios the union wouldn't go back to work until SAG also has a deal. Because if the last five months proved anything, it's in together, win together. And there's this snippet from, it might be Deadline,
Starting point is 01:15:01 but it says, quote, the studios also inquired if once a tentative agreement is ratified by the scribes, if the writers would pick up their pens and hit their keyboards again very soon afterwards. The Guild, from what we understand, made it clear that they would not be going back to work until SAG-AFTRA also had a new agreement with the AMPTP, reflective of the WGA's feeling of solidarity between the two unions that has characterized their first mutual strike since 1960. Love the solidarity.
Starting point is 01:15:29 And I know IATSE is also going to have a contract that is going to need to be renegotiated next year. And it looks like, guess what, folks? They did it once. They only listened to one thing. Yeah. Putting the tools down. Put the tools down and pick them other tools up, which is getting in your solidarity back. So we'd love to see that.
Starting point is 01:15:48 And another tweet I like is from T-Pain. At T-Pain, just put, bartender just doing her job. Me, just this photo of Kevin James. The photo of Kevin James that's going viral. Shrugging. But it is this. Shrugging with his hands in his pockets.
Starting point is 01:16:04 Trying to be like, hey, hey, can I get your eye contact? Just a ginger beer if you can. Tweet I've been enjoying at BRNBNE. I don't know how to pronounce that, but their display
Starting point is 01:16:19 name is BBBBBB space BBBBBB, which I have to assume stands for bone, bone, bone, bone, bone, bone, bone, bone, bone. The tweet is, what if you went to E.T.'s planet and all of the other E.T.'s were wearing clothes? That really fucked up how I thought about E.T. Yeah. Yo, what's good, E.T.? Oh, damn. All right, E.T.
Starting point is 01:16:56 You can find me on Twitter at Jack underscore O'Brien and on Threads at Jack underscore O underscore O'Brien. And on Blueski, I'll have a username there eventually soon. And you can find us on Twitter at Daily Zeitgeist. We're at The Daily Zeitgeist on Instagram. We have a Facebook fan page and a website, DailyZeitgeist.com
Starting point is 01:17:18 where we post our episodes and our footnotes, where we link off to the information that we talked about in today's episode, as well as a song that we think you might enjoy. Miles, what's the song people might enjoy? This is an artist, Detroit's very own John FM, and I just figured, what an appropriate title. It's called White Science. And the track feels like, I don't know, like if Prince was making shit in the late night. It has, like, a Princey vibe. Like, it's Princey, and it's kind of got this vocal modulator on it that feels a little bit Princey.
Starting point is 01:17:52 And also it's just a really good track. So this is John FM with White Science. That's what John FM's mom calls him when he's in trouble. John FM, get in here now. The Daily Zeitgeist is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. That is going to do it for
Starting point is 01:18:12 us this morning, back this afternoon to tell you what is trending, and we'll talk to you all then. Bye. Bye.
