The Daily Zeitgeist - A.I.: Fear It or F#@% It? 01.23.24

Episode Date: January 23, 2024

In episode 1611, Jack and Miles are joined by Research Associate at the Leverhulme Centre for the Future of Intelligence, co-editor of The Good Robot: Why Technology Needs Feminism, and co-host of The Good Robot podcast, Dr. Kerry McInerney, to discuss... Is AI New Or Just The Old Stuff On a Continuum? CES Was A Joke... EVERYTHING Had "AI" In It, Other Interpretations Of AI, Sam Altman, What Is Good Technology And Is It Possible? Will This Affect What It Means To Be Human? Is The AI Arms Race With China Going To Make It Hard To Put Restrictions On AI Growth? And more!

At CES, everything was AI, even when it wasn't
AI Hits the Campaign Trail
Chaos in the Cradle of A.I.
Exclusive: Altman says ChatGPT will have to evolve in "uncomfortable" ways
LISTEN: Black Narcissus by Joe Henderson

See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 Kay hasn't heard from her sister in seven years. I have a proposal for you. Come up here and document my project. All you need to do is record everything like you always do. What was that? That was live audio of a woman's nightmare. Can Kay trust her sister or is history repeating itself? There's nothing dangerous about what you're doing.
Starting point is 00:00:18 They're just dreams. Dream Sequence is a new horror thriller from Blumhouse Television, iHeartRadio, and Realm. Listen to Dream Sequence on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Sniffy's Cruising Confessions, the new iHeart podcast, will broaden minds and help you pursue your true goals. You can listen to Sniffy's Cruising Confessions,
Starting point is 00:00:54 sponsored by Gilead, now on the iHeartRadio app or wherever you get your podcasts. New episodes every Thursday. Señora Sex Ed is not your mommy's sex talk. This show is la plática like you've never heard it before. We're breaking the stigma and silence every Thursday. You might recognize us from our first show, Locatora Radio. Listen to Señora Sex Ed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, the internet, and welcome to Season 322, Episode 2 of Der Daily Zeitgeist! It's a production of iHeartRadio.
Starting point is 00:01:38 It's a podcast where we take a deep dive into America's shared consciousness. And it is Tuesday, January 23rd, 2024. Yeah. You know what that is? One, two, three, two, four. Oh, one, two, three, two, three. Well, guess what? It's National Handwriting Day.
Starting point is 00:01:56 Shout out to everybody who's still nice with the pen and pencil or quill. And also National Pie Day. But yeah, handwriting. I realize they're not teaching cursive anymore out here in the U.S. So, huh. I, yeah. I haven't written cursive since like sixth, seventh grade, I think. Do you have a pen and paper?
Starting point is 00:02:13 National Handwriting Day is discriminatory against me as a left-handed person. Wait, do you have a pen and paper in front of you right now? No. Oh, I was going to say, would you write your name right now in cursive? I just want to see what that looks like. I mean, that one I can probably do. Right, right, right. My signature.
Starting point is 00:02:29 But like, yeah. Pizzeria? Pizzeria would be a real Billy Madison style problem. Rizzuto? Rizzuto? Yeah. Yeah. Those Z's.
Starting point is 00:02:40 Are you fucking kidding? Kidnapped ya. Those don't look like any letter I ever seen. Anyways, by the way, I was just being the type of person I hate, who's, like, really calling you out for calling it 2023. Oh shit, did I say that? Oh, yeah, yeah, you said one, two, three, two, three, and that's fine, for... I think we have a grace period until June, personally. After the pandemic, it's weird. Time has gone in a weird direction, and I think it's fine.
Starting point is 00:03:13 If somebody writes the year as 2022, that's a fun thing to laugh with them about. But 2023, give them a fucking break. You're being like the person at the sleepover who's like, you mean this morning, because it just hit midnight. Yeah. It's 12:03. And then he keeps correcting people. You mean yesterday. Yeah.
Starting point is 00:03:34 I feel like with that, I'm finally experiencing that form of time dilation our parents experience when they say "the other day," and it could mean three weeks to 13 years ago. Oh, yeah. Yeah, yeah. So, yeah. Um, yeah, the other day... it's, uh, 2023. Yeah, with movies that, like, came out, I think, recently, and it turns out it's over a decade ago. That really fucks me up, and it happens to me all the time. Anyways, my name is Jack O'Brien, AKA: the rich don't give a fuck about us, the rich don't give a fuck about us, the rich don't give a fuck about us, eat them on the half shell, Lambo power. That is courtesy of Lacaroni on the Discord, who... I was just fucking around when I said you can't write songs about me pissing my pants anymore, man. You can, you can totally write songs about that. That's, that's fine.
Starting point is 00:04:23 I'm the one who's choosing to sing them. Yeah. So, yeah. But through clenched teeth, wiping away tears every time. Yeah, but it's, it's something that my therapist has said I need to work through, uh, musically. So you're doing the work. I'm thrilled to be joined as always by my co-host, Mr. Miles Gray. It's Miles Gray, AKA: how about a deal to make you feel okay, we know you feel depression, how about a quick holiday. Uh, shout out to Andrew Bub on the Discord, because we were talking about Blue Monday, the most depressed day ever in history, but we don't really get that so much in the U.S., and just how it's really just a scheme to sell people vacations or McDonald's. Oh,
Starting point is 00:05:08 I missed that one. Which day is Blue Monday? Is it today? It was last Monday. It was yesterday. Oh, last Monday. Yes,
Starting point is 00:05:14 exactly. Oh, but they gave us a day off so that we wouldn't, like, be so sad. Well, it's just, like, a thing that was started in the UK with Sky Travel, where they basically had a guy be like, I have, like, mathematically calculated the most depressed day as, like, a function of time that has elapsed since Christmas, uh, sadness, and motivation. And it was all just bullshit, and, you know... Yeah. Yeah, but
Starting point is 00:05:37 hey, it's a great way to sell package holidays. I respect it. Absolutely. Miles! Yeah? We are thrilled to be joined in our third seat by an expert guest, a returning champion, a guest on our most popular episode of 2023. Uh-oh. She's a research associate at the Leverhulme Centre for the Future of Intelligence. I don't think I pronounced that word right. Future? Where she researches AI from the perspective of gender studies, critical race theory, and Asian diaspora studies. Also a research fellow at the AI Now Institute, the co-editor of the new book, The Good Robot: Why Technology Needs Feminism, and the co-host of The Good Robot podcast.
Starting point is 00:06:27 Please welcome back to the show, the brilliant Dr. Kerry McInerney! Dr. Kerry! Oh, thanks for having me back. Oh, thank you so much for accepting our invitation. So good to have you back. And what time is it where you are right now? It is about 10 to 8 p.m. And I am an early bird, so this is near my bedtime.
Starting point is 00:06:43 But that just shows how keen I am to get to come back on and talk to you both. But I, you know, usually after sort of 9 p.m., all my friends know, they're like, there's just going to be no response. There's no point trying to plan anything. It's bedtime. This is a special occasion. Oh, you honor us. You're such an impressive guest that we actually opened our call with you
Starting point is 00:07:03 by asking you, like like why you came back. We were like, so like you're, you like like us or something? You were cool with that? You're like, so we're like idiots, right? Why are you back? And yeah, we're thrilled to have you regardless of the questionable decision
Starting point is 00:07:20 by you to come hang out with us this late in the evening. The unlearned... Yes. Yes. Yeah, because so much has happened. I feel like there's always something happening with AI, and we're like, when should we... we gotta ask Dr. Kerry, like, what about this? What about this? And then now we've got the perfect opportunity to ask you a bunch of questions and also let you rain your knowledge down upon us and the listeners. Because, yeah, last time was really, really good talking to you, and gave us so much perspective. Uh, because we were definitely like, we like... AI, end of world, right? Like, it's like, so powerful. Oh, no, no. Just... okay, okay, okay, okay. Just... these language models are
Starting point is 00:07:55 doing some... a lot of the lifting right now. Okay, we've seen T2, we get it. It's wild how much that movie has a hold. Like, it's still raised in so many articles. Like, you know, journalists are like, as in the movie Terminator 2. Like, they're still like, that's the one I remember. So that's what we're going with for this model of understanding. But we're going to get into all sorts of AI questions. Your new book talks about the question of, is there good technology? Is that possible? Which is a real question for me at this point, because I've been so long at sea in this hyper
Starting point is 00:08:35 capitalist paradigm. So I'm curious, very interested to hear your answer to that. Unless it's just like, nah, it's not possible. Spoiler alert. Yeah. Super short book. But before we get into all of that, we do like to ask our guest: what is something from your search history that's revealing about who you are? Oh, gosh, I don't really know if I like what this reveals about me. But you know when you get, like, really obsessed with something, and it's not even recent, it's not trending, like, there's no reason why? But for all my search history for the last few days, it's just really wild deep dives on the 2019 film Cats. You know, the awful one with all the, like, CGI fur, and
Starting point is 00:09:17 you know, all the superstars kind of crawling around on all fours. And I don't know why, but for some reason, like, a specter just came back and warned me, and I was like, today I need to, like, listen to an hour-and-a-half YouTube video on the making of Cats the movie, 2019, and why it was a disaster. And so, I'm not busy enough. I think I need more hobbies, and I need
Starting point is 00:09:37 more productive uses of my time. But I am now an encyclopedia on terrible CGI, slash, why you shouldn't try and make very weird Broadway musicals into films. Yeah. I was immediately trying to connect the dots. I was like, oh, well, I think I remember that they used AI technology to remove the cat assholes. Wasn't that one of the things where it was about to come out?
Starting point is 00:10:02 And they did a test screening, and everyone's like, their pink cat assholes are in our face the entire movie. Are we going to just do that? And they went back and digitally removed them. But were you into the movie when it came out? Were you anticipating it? Or you just kind of have it? So I'm a dancer, so I love musicals. And, like, you know, I was one of the only people that was, like, unironically excited about the Cats musical. I
Starting point is 00:10:31 realized that's, like, four people who, like, actually would have wanted to see this. And then the trailer came out, and it was just so frightening. What it was a really good example of is, like, in AI and robotics, you've got the idea of the uncanny valley. This might be something you've heard of: this idea that the closer something looks to a human, or to, like, a living creature, while being different enough that you can tell, like, it's not quite human, the creepier it is. So this original robotics experiment... um, Masahiro Mori, the person who, like, writes this essay about this, uses the example of, like, a mechanical hand that moves, and it's like, that's super creepy, like, no one wants to see that. And all I could think of is this, like, essay from this Japanese roboticist when I saw those cat-human things moving, and I was like, it's gonna be so bad. Like, no one wants
Starting point is 00:11:19 to see this, whether or not the pink assholes are there or not, you know? Yeah, they just end up being all smoothed out, which is also awful. Yeah. So, yes, it was a kind of terrifying disappointment. Because, yeah, there's kind of a Streisand effect, I feel like, where people are like, but where are the assholes? You know what I mean? And then people are like, well, we actually did... like, I don't know, might have been better with them. Have you... speaking of Cats, and an interesting sort of obsession with them, have you listened to your fellow compatriots' podcast, Guy Montgomery and Tim Batt's podcast called My Week with Cats, where they keep watching Cats over and over and talking about it? No, but this sounds exactly up my street. Yeah, yeah, yeah. This is all over the house.
Starting point is 00:11:59 Yeah, we've had them on the show. Guy is one of our favorite guests and a fellow Kiwi. And yeah, like, it's such an absurd podcast. They just keep revisiting Cats over and over. Yeah. I mean, have you listened to the iconic Kiwi podcast, Who Shat on the Floor at My Wedding? It's kind of viral. I still haven't listened to it yet, but that's also on my listening list. Yeah, that's one that I've heard many times, be like, have you... come on. And all the write-ups about it, too, are like, it's absolutely the most riveting thing that we've listened to this year. So I feel like that's what most people say about that.
Starting point is 00:12:42 That's amazing. So are you... how many viewings deep are you of 2019 Cats? Or is it just like you watched it once, and then it's all YouTube explainers? Oh, yeah. I watched it once. I don't know if I could do it again, to be honest. I did it once on a transatlantic flight, and that was... myself, trapped in a metal tube, had this moment with Cats. And I was like, oh, it is as bad as everyone said. And then now it's just extremely scathing movie reviews on YouTube and in print. So I'm single-handedly keeping the movie YouTube review economy alive at this point. What is something you think is overrated? I mean, I feel like pretty much anything TikTok tries to sell me, but particularly those, like,
Starting point is 00:13:19 gigantic cups, those Stanley cups. I don't know if you've seen this, but they've gone so viral, but they're huge. My dad used to take a thermos tumbler of that size out fishing for the day so that he would have enough tea or coffee to keep him going. But now I just see normal people and their normal little walks with a huge flask.
Starting point is 00:13:38 I don't get it. So if anyone can sell me, why? What is so attractive about these gigantic cups? You've got to stay hydrated, Dr. Maffner. with them gotta stay yeah i mean that's what's important doctor yeah i've never heard of it not a medical doctor so don't come to me on the plane anything useful it's not the same oh no screen all right well i'm sorry i sent you so many medical questions earlier this week over
Starting point is 00:14:05 email. Didn't realize that wasn't your area of expertise. I mean, it's, like, one of those weird... it's just one of those consumer things that have taken over. It's like the era of, like, when I was a kid, like, millennials and stuff, when everybody wanted, like, a slap bracelet or a Tamagotchi or a Furby, like, or Beanie Babies. We're taking that into adulthood, and now it's like, do you have all the Stanley fucking Quenchers? Yeah. Which is a wild thing.
Starting point is 00:14:33 But then also, it's wild what it's done for the SEO, because when you search Stanley Cup, which is the trophy for the National Hockey League, it's mostly overtaken by these fucking Stanley insulated mugs. Yeah. I mean, it is wild, though. Those are $50 each, those things. Yeah, yeah, they're like at least $45, right? Yeah, $45 to get you in the door. And people are lining up the day before, like, to get an opportunity to pay $45 for this thing that, as far as I could tell, has been available on the market for a long time. Like the, you know, just well-insulated cups.
Starting point is 00:15:13 That thermos that you're talking about, Dr. Kerry, that's, like, the OG Stanley item that people have, that was, like, sort of, like, that minty green, very, like, every working person, every working stiff sort of insulated thermos kind of thing. We need a Stanley Stan on the show to just explain it to us. Like, what is so much better about the Stanley mug than, like, all the other insulated water-carrying devices out there? Yeah. All right. Look, look, Zeitgang, if you're a Stanley Stan, hit us up. We're interested in hearing from you. And also, I feel like there's a high likelihood that you are also involved
Starting point is 00:15:51 in some other, like, completionist collecting hobby in childhood. I'm just guessing. You know what I mean? Like, before, I was into, like... I had all the fucking Pokemon cards, or, like, I had all the Beanie Babies. Like, it just has that same energy of being like, got to get them, have them all, if you're on that wave. Because I know some people just like them, but some people have, like, way too many. Yeah, I'm, like, over here laughing at my wife, because, like, her most valued object right now is a Stanley tumbler she got, like, a week and a half
Starting point is 00:16:21 ago. She, like, wrote her name on it and was like, I will reward if found, because, like, she's really into it. And I was like, wow, like, why are you so obsessed? And she's like, I got in, like, right before... like, I got it before everything sold out, and, like, I really like this colorway. And when she was, like, talking about the color, like, the colorway of it, I was like, oh, it's exactly how I am with shoes. Like, I have absolutely no room to look down on this. Yeah, we all have things. Shoes are more expensive.
Starting point is 00:16:51 So, yeah. Jordans, which were a thing that I was into when I was a kid. What is, Dr. McInerney, what is something you think is underrated? Oh, well, I just finished watching, with my husband, season four of a show that I think is so underrated. It's called For All Mankind. It's on Apple. Have you seen it? I've heard
Starting point is 00:17:11 of it, but I have not seen it. Yeah. Okay, well, now I can pitch it. I don't work for Apple, for this show, just to be clear. Although I should get commission now, because I've pitched it so hard. But it's, like, an alternative history of the space race. So the idea is, like, what happened if Russia got to the moon first, and so space exploration became this, like, major site of, like, investment and travel. Um, and, like, I'm, like, not one of those people who's super into space. Like, I'm not someone who seeks out a lot of space-related sci-fi. But this was riveting. It's so well acted, and every season is a different decade. So it starts in the 1960s, and then 70s, 80s, 90s, and 2000s, kind of mapping this alternate history of the world.
Starting point is 00:17:51 And they've got lots of little kind of, like, Easter eggs, little plot twists and things like that. So in this alternate history, for example, and this is not a big spoiler: John Lennon is shot, but he doesn't die. So he's kind of still alive and, like, featured in a little plot point. Or, like, I think Margaret Thatcher gets assassinated by the IRA, or, like,
Starting point is 00:18:10 it's a lot of these, like, little tiny points in history that go differently. So I would recommend checking that out. It is a really great watch. Okay, and this all happens because Russia gets to the moon first? Because, as far as I can tell, with the race to the moon, America got up there and there was like, oh man, there ain't shit here. Turns out it's a lot of dust. We discovered nothing. Is it just, like, the bragging rights that kind of turns everything on its head? Yeah, I think that they kind of portray that, like, not only... I mean, a lot of it is the bragging rights and the international politics of, like, oh, we've got to be competing to show that we're more, like, technologically advanced. But also it shows, like, yeah, how the technological advances that came from continuing to invest in space, as well as the discoveries they make out there, plot a slightly different course in human history, from, like, new energy sources through to, yeah, like, new technologies for flight or for, you know, cultivating life in different places.
Starting point is 00:19:04 So it's just quite an interesting, like, speculative future. Yeah. Yeah, I see. I love a speculative future. I... I had Apple TV Plus or whatever, and then I... I let the thing lapse. So then I turn it on my, like, TV, and they're like, oh, you don't got this anymore. But then I realized you basically get it for free if you buy, like, any Apple product anymore. And I recently, like, replaced my phone. So I'd be like, oh, yeah, I'll cash in my whatever three-month period of freeness I get to maybe check this one out. Because I just wasn't watching a lot of stuff on Apple TV.
Starting point is 00:19:40 But it's also because, we talk about this all the time, there's just too many things to watch now. So I need, like, people like you, whose opinion I respect, to come on and be like, you got to check this out. And I'm like, see, I need something like that, rather than seeing some Twitter account being like, this is the fucking dopest shit ever. I need more human recommendations. Yeah. Human recommendation. Yeah.
Starting point is 00:20:02 All right. Well, we are going to talk about how AI stacks up to humans when we get back. We'll be right back. It's not really what we're going to talk about, but I wanted it to sound like I was good at my job. So I did like a transition that sounded like, yeah, that sounded like a show that I would listen to. I'm Dr. Laurie Santos, host of the Happiness Lab podcast. As the U.S. elections approach, it can feel like we're angrier and more divided than ever. But in a new, hopeful season of my podcast, I'll share what the science really shows, that we're surprisingly more united than
Starting point is 00:20:45 most people think. We all know something is wrong in our culture, in our politics, and that we need to do better and that we can do better. With the help of Stanford psychologist Jamil Zaki. It's really tragic. If cynicism were a pill, it'd be a poison. We'll see that our fellow humans, even those we disagree with, are more generous than we assume. My assumption, my feeling, my hunch is that a lot of us are actually looking for a way to disagree and still be in relationships with each other. All that on the Happiness Lab. Listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. I'm Renee Stubbs, and I'm obsessed with sports, especially tennis.
Starting point is 00:21:36 On the Renee Stubbs Tennis Podcast, I get the chance to do what I love, talk about how tennis and other women's sports are growing and changing and what the future holds. I think I just genuinely loved what I did. I loved this waking up, putting on my sports gear. I still believe it was so rewarding. Maybe you can relate to it as well. As a woman, I think it's a very powerful feeling to have a job at which you're able to see improvements in real time.
Starting point is 00:22:02 On the show, we dissect everything going on in the game straight from the biggest players in the world. Plus, serve up recaps of all the matches and headlines in the game, including a rundown of the US Open every Monday. Listen to the Renee Stubbs Tennis Podcast every Monday on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Capital One, founding partner of iHeart Women's Sports. MTV's official challenge podcast is back for another season.
Starting point is 00:22:33 That's right. The challenge is about to embark on its monumental 40th season, y'all, and we are coming along for the ride. Woo-hoo! That would be me, Devin Simone. And then there's me, Davon Rogers. And we're here to take you behind the scenes of... Drumroll, please.
Starting point is 00:22:48 No, no, no, no, no, no, no, no, no. The Challenge 40 Battle of the Eras. Yes. Each week, cast members will be joining us to spill all of the tea on the relentless challenges, heartbreaking eliminations, and, of course, all the juicy drama. And let's not forget about the hookups. Anyway, regardless of what era you're rooting for at home,
Starting point is 00:23:07 everyone is welcome here on MTV's official challenge podcast. So join us every week as we break down episodes of the Challenge 40 Battle of the Eras. Listen to MTV's official challenge podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back.
Starting point is 00:23:31 We're back. And all right. So I had something happen over the weekend. I just wanted to run by you. Remember, she's not a medical doctor, Jack. I just want you to look at this thing on my back. Yeah. I know you're not a medical doctor, but it's just weird, right? Something happened over the weekend.
Starting point is 00:23:48 I could just use your... No. So I had one of my son's friends. My son is really into chess. And his friend was playing chess on his iPad. And my son asked who he was playing. He said, I'm playing AI, instead of just playing the computer. It was the first time I'd heard somebody refer to it as AI instead of playing the computer. And as we knew you're coming on,
Starting point is 00:24:18 I was looking at all the headlines, and it felt like a lot of the stories that are coming up are basically on the same continuum of progress of things that the tech industry has been doing, but people have a hard time conceptualizing, like, so what exactly is AI, though? And I think there's a lot of people, like, capitalizing off of that. Because now it's like, it's no longer just, algorithms are making it hard to tell what is true and what isn't on social media. Now it's like, AI is gonna fuck up this election, everybody. You know, Miles was talking about how, like, CES was, you know, full of all these products that are like, yeah, but there's AI in it now. Rapid rehydration sports drink... everything was posted in this weird way to be like, yeah, this thing, and now it's got AI in it. And you're like, AI in it? What do you mean? We're talking about a
Starting point is 00:25:23 coffee maker. It's like, yeah, dude, it's got AI in there. And everyone's, like, breathlessly being like, oh my god, this new fucking thing has AI in it. But again, it just sounds like algorithms. But also, people are, I think, also confusing, like, large language models for being like, oh, so the computer is going to think like a person, and then my coffee machine is going to think like a human and problem-solve like a barista. Versus, like... So I just stick a little brain in there? Right. So I guess, like, the glib way of summarizing it would be that, like, you know, there were these large language models that impressed people quite a bit. And they've used that to change the word for all tech advances
Starting point is 00:26:05 to be just, like, AI, for branding purposes, kind of. I don't think that's true about all of it, and there's definitely some things that seem to be advancing more quickly than I'm comfortable with and than I think people are comfortable with. But I guess I'm curious to hear your thoughts on that, as you're seeing so many stories and products and all these things and children playing chess misusing the phrase AI. What are your thoughts? I mean, my first thought when you were saying that about the sports drink, I was like, maybe the Stanley Cup has AI in it.
Starting point is 00:26:37 That's why it's plugging in to TikTok and starting its quest for world domination. Yeah, I mean, I think this is one of those cases where two things can be true at once. It can be true that we've seen massive developments in large language models and generative AI and other kinds of big data projects. And at the same time, lots of firms are jumping on the AI train and marketing a lot of things as AI that even if you somehow decide on one definition of what AI is, we would probably say that these things definitely aren't AI enabled in any way. And actually, I spent like a couple of years interviewing scientists and engineers at this
Starting point is 00:27:17 big tech firm. And one of the things that they were super concerned about was this exact problem. So saying, you know, oh, well, 10 or 15 years ago, we would have just called this a decision tree. Now we're calling it AI. That doesn't mean there's something wrong with that technology, but it does mean that maybe the way AI is being used is more to do with branding, more to do with selling it as a product and less to do with its actual technical functions or what the tools can actually do. So a lot of my job as an AI ethicist is just like asking customers and people to just be a little bit more critical and ask those questions around,
Starting point is 00:27:51 can this tool actually do what it says on the tin? What does it mean for something to be AI or AI-powered? And is it going to actually help you get where you want to be? Because it actually doesn't matter if it has an AI function, if that AI function is not going to help you achieve your goal. So, like, an example of this is, like, a lot of tools, say AI-powered hiring tools, an example I know well: like, they'll say, okay, we have an AI function, right? So we use AI to help analyze, say, like, the words that a candidate uses in the interviews. But, like, a lot of these tools have, like, a toggle on-off function. So you can just, like, turn the AI part off, which is what a lot of companies will do, is they'll buy in an AI-powered tool.
Starting point is 00:28:30 But then because that opens them up to a lot of liabilities, if they actually use AI to process candidates, they'll just turn that bit off. So they kind of, you know, try to have things both ways, I think, where you're getting a tool that seems really high tech,
Starting point is 00:28:43 but you're like not actually using that function. So yeah, I think this kind of idea of AI as branding is also something that we should all really be a bit concerned about. Yeah. Yeah. Well, and also just seems like there's this weird thing too,
Starting point is 00:28:55 where companies are like, yeah, we use AI, and at other times, don't fucking tell them we're using AI. Like, you know what I mean? It's like, what the fuck is it?
Starting point is 00:29:02 Because recently, there's this video game, Apex Legends, that there was, like, some artwork that came out for, like, a new thing involving Final Fantasy, and people were like, wait, I think generative AI was used for some of these, like, models, because, like, the fingers don't look right, etc. And it was, like, a whole thing where it's, like, embarrassing to admit, or they shouldn't have used AI. So there's, there's, like, this interesting balancing act where people are like, don't use it there. Yes, I'm glad my insulated mug has AI in it. But, like, we're still kind of finding, like, what is useful, what is acceptable, what is embarrassing, or whatever. So, yeah, it just feels like so many things are still up in the air in regards to its
Starting point is 00:29:42 use. Yeah, yeah. And I think you're right. There's this interesting dynamic where, like, on the one hand, I saw a lot of critiques on TikTok, which is, again, how I spend most of my time, sort of mindlessly scrolling past mugs and other various bits and pieces. But there were a lot of critiques of that new Disney film, right, Wish, which is kind of blending 2D animation and 3D animation. And one of the critiques people kept saying was, like, the song sounds like it's AI-generated, or the art looks AI-generated. It was being used differently, in this, like, quite derogatory way.
Starting point is 00:30:11 Whereas, you're right, like, a lot of AI startups sell themselves on the idea that, like, we're an AI-first company. I think there was a study that was done, like, a few years ago with, like, European-based AI startups, which showed, I think, you know, like, something like up to 40% of those AI startups
Starting point is 00:30:27 like didn't actually use AI. They were just marketing themselves as AI startups. So there's a lot of kind of, you know, like, you know, weird sort of misleading advertising going on, but then it can also lead to some like really strange other phenomena,
Starting point is 00:30:42 like the Wizard of Oz phenomenon, which is when you actually have companies having humans pretending to be AI, not the other way around, so that they look more tech-forward. So you think you're talking to an AI chatbot, and you're not. You're actually talking to a human pretending to be an AI. So it's kind of just spiraled down to all sorts of absurdities. Isn't one of Amazon's big services called the Mechanical Turk, like, based on the fact that there was, like, an early instance where a robot was, like, really good at playing chess and would, like, beat all these people at chess? And it turned out they just put a chess master inside, like, a robot suit. And that's who was beating everybody.
Starting point is 00:31:20 Like, it feels like they're knowingly doing that. They're trying to make it seem like they are using advanced technology when they're not. I think one of the big questions that I've had all along is, in reading articles, interviewing Sam Altman or these people who are at the forefront, in these massive positions of power in the tech industry, and specifically around AI, they really seem to be willing to sell
Starting point is 00:31:54 the threats and evoke Terminator 2 type imagery. I think the last time you were on, we talked about how Sam Altman, the head of OpenAI, in a New Yorker interview was like, yeah, keep, like, keep a cyanide capsule on me, you know, just in case the robots take over. And I gotta, I gotta go because I don't want to see what that looks like. Yeah, I don't want my Stanley Cup to revolt against me.
Starting point is 00:32:29 And it's like, well, he of all people knows that that's, like, not a threat right now, like, at this moment. But he is doing this dance that we're talking about, right? Where it is driving major, massive volume for him to convince people with money that, like, the thing they're doing is not on this continuum with, like, past technological advances. It's this new thing that's going to, like, come in and, like, you know, be a magical hyper-intelligence. Yeah. And I think there's something particularly ironic when it is people like Sam Altman, who, you know, may very well actually believe all of these things, right? I don't know personally. I mean, I don't want to say, oh, he's just sort of trying to deliberately create this hype, like, in a very cynical way. Maybe he really does believe that.
Starting point is 00:33:14 Um, but surely, if you really thought, like, oh, there's actually, like, a meaningful risk that robots could take over and I'm gonna have to take my cyanide tablet, then you would just stop investing in and making these things. You know, these are people who, like, actually do have the power to shape the direction of tech development. And I think there's something particularly ironic, or, you know, quite hypocritical, about scaremongering about AI and AGI while also actively orienting your company towards building it. And also simultaneously, like, undercutting regulation. So, like, at the same time that Altman was going really public about the risks of artificial intelligence, OpenAI was, like, simultaneously lobbying for weaker regulation with the EU AI Act. And I think we see this a lot with Elon Musk, for example, warning around the dangers of AI
Starting point is 00:34:00 while simultaneously firing Twitter's whole ethical AI team. So to some extent, you know, I'm not saying that they deliberately try to weaponize this narrative, but there's just a real mismatch between what they're saying and, like, what they're actually doing in the domains that they can control. Yeah, at the same time, the people at the forefront of, you know, understanding what climate change was going to look like
Starting point is 00:34:17 the people at the forefront of, you know, understanding what climate change was going to look like were at Exxon and Chevron. And, you know, they were climate change was going to look like were at Exxon and Chevron. And, you know, they were perfectly willing to be like, this is fucking bad, you guys. It's not our fault. It's your fault, maybe. You should switch to paper straws, I think. So, yeah, the consensus among people like that, like I was in Davos this past weekend.
Starting point is 00:34:44 I don't know if you guys were there. No, I didn't make it. And just talking to some of the people there... I was not in Davos. But the idea that, sure, we can have these ethical concerns, but the progress, and what capitalism tells us to do, is inevitable. There's no arguing with that. It's just, like, we're going to always do what delivers the most shareholder value,
Starting point is 00:35:11 world economic forums website with like four takeaways from you know like the the gathering in davos like one of them was talking about like ai is all the buzz and a lot of it was definitely focused on like this is how like we're gonna fucking boost global gdp this is how we can close the gaps in like the lack of skilled labor like it's all gonna happen then this philosopher spoke like on something about like will we completely lose our touch to know what it means human anyway that was like a whole other thing but then this other person came on to talk about like they really like sort of nodded at maybe there's something to contend with here but i feel like more of the takeaway was like this is y'all this is how we're also gonna get more richer please so it's yeah like it's they're definitely
Starting point is 00:35:49 trying to do many things at the same time, which is, like, boosting in our sort of, like, zeitgeisty consciousness that, like, AI is, like, the fucking, like, the new plutonium, like, uranium, plutonium rod that can power everything and also destroy us, but also, like, good for profit. So it feels like this omni-issue, and it just depends on your own sort of, like, your sort of dispositions to sort of figure out, like, which version of it you're seeing. And then recently, I feel like the latest thing I saw, as it relates to, like, politics, because that's where I see, like, it having the most impact right now: like, I've seen on YouTube so many, like, Midjourney-created thumbnails to, like, make it look like certain things are happening in the world that aren't, just as a way to get people to click. But also, like, in New
Starting point is 00:36:33 Hampshire, where there's a primary, apparently there was, like, a fake Joe Biden robocall that people were trying to figure out whether it was, like, made with some kind of AI voice tool or an impressionist or whatever. That's where I feel like we're starting to see the real kind of stuff impact our real world, rather than being like, no one works at McDonald's anymore. I guess all that to say is that, you know, from your perspective... that's what I'm like, okay, Dr. Kerry, like, I get it. AI is in our toasters now or whatever. I can put that aside. But, you know, like, I think for us, especially in the US with this election year happening, that's where I see, like, a lot of potential for, like... it's just really
Starting point is 00:37:16 these tools that allow people to add to, like, a media narrative very easily, in a compelling way. That feels like one of the most potent things we're going to see this year. But am I off? Like, what, from your perspective, what are the things we should be looking at, or maybe some things that are distractions? No, no, I mean, I think you're definitely onto something there when it comes to the meaningful risks posed by AI and its ability to help generate disinformation. Um, and because we've got, I think, over 70 elections this year, so it's a huge year internationally for elections, including, as you mentioned, the US election.
Starting point is 00:37:52 And we've seen kind of over the past few elections in the US and also elsewhere, like the way that disinformation and kind of the mass spreading of forms of political ideology has been hugely damaging, I think, in many ways to our information society and the way that we conduct political elections. And we've seen, unfortunately, examples of specifically AI-generated misinformation. I think actually it was in the Slovakian elections a few weeks ago where there were a number of AI-generated clips and deepfakes, including one where one of
Starting point is 00:38:26 the candidates... I think the audio was edited in a way, but it appeared that they said they were going to double the price of beer, which in that context caused, like, a huge kind of outcry. Like, really very, very damaging. Not my president. So important. Yeah, some people were like, whoa, you know? Yeah. And so, you know, we do see examples of this. At the same time, I think, you know, for me this is one of those classic examples where tech illuminates a problem that's already there, right? So AI can provide people the tools to make more convincing misinformation, maybe to make it at more scale. These things are both bad. But I think we need to say, actually, what makes people vulnerable to misinformation in the first place? What makes people vulnerable to fake news? Why do people still choose to consume and share fake news, even when they know it might not be factually
Starting point is 00:39:14 true? These are all things where, again, I think technology is often treated as a domain just for computer scientists or mathematicians, but this is why we need psychologists and we need people from the humanities and the social sciences. We need political scientists to try and understand what makes certain communities vulnerable to these kinds of narratives and ideas. And I think until we do that and really
Starting point is 00:39:37 prioritize this kind of research, building better media literacy and better resilience to these narratives, we're not going to necessarily be able to find a quick tech fix. Yeah, I would just... I would just, like, accept going eight years into the past in terms of, like, how much media there was, like, how much money was being spent on media. Because it's just such a fucking wasteland out there these days. Like, all of the legacy media outlets are just gone or not spending money.
Starting point is 00:40:09 They seem as likely to be using AI tools. Like I got this headline from the LA Times this weekend sent through to my phone. The headline was, could your life could use more wonder? From the LA Times. I don't know that they're using AI. Maybe it was just a typo.
Starting point is 00:40:31 But like, I don't know, Sports Illustrated, we found out about them getting caught using AI. And it was, you know, the articles are just there and like saying nothing. So like, it's just this weird situation where the people who I think typically we would be counting on to be gatekeepers for this sort of misinformation are also the ones who are trying to get by using the technology to replace humans.
Starting point is 00:41:00 Yeah. So it's just such a weird environment. And it just feels like we're facing a blizzard of misinformation now, instead of, you know, something that you can, like, point to and be like, okay, here's one thing, one lie that they're trying to spread. Now it's like there's 40 a day, basically. Could your life could use more wonder? And you're like, maybe. But, like, it's also, I feel like it's just accelerated the degradation of, like, journalism in general. And, like, you know, look at what happened to Pitchfork and other places. Or, like, I think a lot of media companies do look at it as, like, yeah, sure, this website that used to have all these great journalists, people don't
Starting point is 00:41:42 come there for like their human writing anymore. It's just a brand that brings people to click on things. And if I can seed that sort of tent with all this fake tent or content, as y'all call it, that people will just go continue to click. It's like, and that's how we make money. It's not even that what's written there is important. It's just that I can create a little click website
Starting point is 00:42:03 that people do clickies on all day, and fuck all the human, like, contribution aspect to it. And, yeah, then we're left with more and more people getting laid off, and just these, like, really bizarre... like, you can tell, it's just wild, you can tell pretty quickly now, I feel like, when I'm starting to read something written with AI. I'm like, you're leaning into these, like, very specific details so much in the writing. It's like, it reminds me, again, of, like, when I used to bullshit on an essay in high school and be like, George Washington, who was also known as our nation's first president... I'm, like, just adding all this detail about George Washington rather than, like, getting to the point or commentary about what he did. Um, and so, yeah, like, it just... I feel like those are the other meaningful ways that
Starting point is 00:42:45 we see things going downhill. But it's, yeah. Yeah. I mean, I feel like just the sad way you said, yeah, was like a pretty good summary of how I feel going on Twitter nowadays. Like I just, you know, all my joy
Starting point is 00:43:02 from that platform is gone. But I think you point out something really important, which is, while the revenue model for news sites is clicks-based and advertising-based, like, I think we're not necessarily also going to see a solution to this problem. Because unfortunately, you know, and I know this from doing, like, my own social media promotion with the podcast I do, The Good Robot, um, the thing that generates money is controversy, effectively. Like, nothing does as well as the posts that I do on TikTok where someone has, you know, a go at me because they don't like my particular interpretation of Star Trek or whatever. It can be pretty banal. But as soon as a fight starts in your comments, that's when a video starts to get views. And so, you know, Cathy O'Neil, the AI ethicist, has a really interesting book on this called The Shame Machine, where she talks about who profits from public shaming, who profits from, like, these big Twitter pile-ons. Like, it's big tech.
Starting point is 00:43:54 It's firms like Twitter, or X, um, or whatever it will inevitably be called next year. Um, but, you know, it's the big platforms, and it's not the people who are doing the piling on or the people getting piled on. And so I think we see kind of a similar thing with a lot of these news media organizations, which is, when the incentive is just to get as much visibility as possible rather than to be informing, you know, providing, you know, yeah, true information... You know, I think we're not going to find a solution. Yeah. All right, we're going to take a quick break, and we're going to come back and talk about maybe some good things, some good possibilities. Like, is good technology possible still? We will be right back. I'm Dr. Laurie Santos, host of the Happiness Lab podcast. As the U.S. elections approach, it can feel like we're angrier and more divided than ever.
Starting point is 00:44:55 But in a new, hopeful season of my podcast, I'll share what the science really shows, that we're surprisingly more united than most people think. We all know something is wrong in our culture, in our politics, and that we need to do better and that we can do better. With the help of Stanford psychologist Jamil Zaki. It's really tragic. If cynicism were a pill, it'd be a poison. We'll see that our fellow humans, even those we disagree with, are more generous than we assume.
Starting point is 00:45:21 My assumption, my feeling, my hunch, is that a lot of us are actually looking for a way to disagree and still be in a relationship with each other. All that on the Happiness Lab. Listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. I'm Renee Stubbs, and I'm obsessed with sports, especially tennis. On the Renee Stubbs Tennis Podcast, I get the chance to do what I love,
Starting point is 00:45:53 talk about how tennis and other women's sports are growing and changing and what the future holds. I think I just genuinely loved what I did. I love this waking up, putting on my sports gear. I still believe it was so rewarding. Maybe you can relate to it as well. As a woman, I think it's a very powerful feeling to have a job at which you're able to see improvements in real time. On the show, we dissect everything going on in the game straight from the biggest players in the world. Plus, serve up recaps of all the matches and headlines in the game, including a rundown of the US Open every Monday. Listen to the Renee Stubbs Tennis Podcast every Monday on the iHeart
Starting point is 00:46:36 Radio app, Apple Podcasts, or wherever you get your podcasts. Presented by Capital One, founding partner of iHe Heart Women's Sports. MTV's official challenge podcast is back for another season. That's right. The challenge is about to embark on its monumental 40th season, y'all. And we are coming along for the ride. Woohoo. That would be me, Devin Simone. And then there's me, Davon Rogers. And we're here to take you behind the scenes of, drumroll please. No, no, no, no, no, no, no, no, no.
Starting point is 00:47:06 The Challenge 40, Battle of the Eras. Yes. Each week, cast members will be joining us to spill all of the tea on the relentless challenges, heartbreaking eliminations, and of course, all the juicy drama. And let's not forget about the hookups. Anyway, regardless of what era you're rooting for at home, everyone is welcome here on MTV's official challenge podcast.
Starting point is 00:47:28 So join us every week as we break down episodes of the Challenge 40 Battle of the Eras. Listen to MTV's official challenge podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. We're back.
Starting point is 00:47:48 We're back. And one of the things that your new book, Dr. McInerney, you talk about, is what is good technology, and is it possible? And I feel like I'm so used to this hyper-capitalism paradigm that I don't think we can have technology without having, like, loss of jobs and free will. But I don't know, I've seen this, like, recent reappraisal of, like, the Luddite movement, which is just a phrase that I grew up using to be, like, anybody who didn't want to use a computer, you know, was slightly resistant to technological progress. And now people are pointing out, no, they didn't just want to destroy all machines. They were focused on the ones that took jobs and led to wage losses. But we turned them into old man screaming at cloud, because of our paradigm of, like, yeah, but that's counter-progress. That's unrealistic. So what is my closed-off capitalist mind
Starting point is 00:48:47 missing out on when I think about the direction that technology can take? What are the good things that aren't just basically AI being McKinsey? Yeah, no, no. First, I do love this reappraisal of the Luddist movement. I know Brian Merchant has a book out called Blood in the Machine, which is like specifically trying to reframe the Luddites, this movement against automation in the UK, as the origins of the revolution against big tech. And I do love this because I do think the Luddites have been unfairly maligned as these tech haters.
Starting point is 00:49:26 these tech haters. But yeah, second, you know, so myself and my work wife, Dr. Eleanor Drage, co-edited this book called The Good Robot, which is the same as our podcast, on this like provocative question. And we mean it very much as like a suggestion or idea, not like an inevitability that technology, you know, maybe can be good. In a lot of spaces, that definitely doesn't sound like a particularly radical idea, particularly in the tech hype spaces we've discussed. harmful ways, like the way that technology is used to perpetrate gender-based violence. It can be really easy to be unable to see any kind of positive possibilities for a lot of these new technologies. But while I definitely think that there's a real place for just the total refusal of people like the Luddites or the neo-Luddite movement, which is kind of trying to bring back a lot of these ideas, we wanted to challenge ourselves and all the guests we have in our podcasts to say, what would it mean for technology to be good? And so for us, that's feminist and pro-justice and informed by all these different kinds of ideas about equality
Starting point is 00:50:35 and fairness. And also, what would that look like sort of grounded in our everyday lives? So for me, for example, a lot of thinking about good technology is trying to reclaim technologies that we might not think of as being very high-tech, often because they've been associated with women. So I knit and I have a pair of knitting needles on the table next to me. And knitting is often not understood as being a very high-tech practice. But in the 1980s, when people were trying to get more girls and women back into computer science, there was this idea that if you can knit, you can code. Because if you can read a knitting pattern, then you can use a coding program. And so sometimes now at computer science conferences, you'll see people put a knitting pattern up on the screen and they'll say, what coding language is this?
Starting point is 00:51:19 And then usually only one or two people, often one of the few female attendees will say, oh, that's a knitting pattern because they're the only ones that can understand that kind of code. So I think there's something really beautiful in reclaiming those particular kinds of technologies that have been maybe excluded from the way we talk about tech. On my, again, work wife who co-edited this book, well, she edited most of it actually, so who really did the heavy lifting on this book, Eleanor talks about the whisk as her example of a good technology. And she says she loves the way it looks. She likes how she can use it in all these different ways. And she says, you know, she's sure there's ways that you can misuse this, but, you know, it's something that just like makes her life better and is designed well in a very simple way. I unfortunately have already undermined this good technology for her
Starting point is 00:52:06 because I then told her about when I was about 14, my school went to a trip at the technology museum in Auckland. It's called Motet. If you're from New Zealand, everyone in Auckland's been to this museum because there's not that much to do in Auckland for a school trip.
Starting point is 00:52:22 And then one of the girls in my class wound the hair of another girl into like an antique egg beater. Then they couldn't get her out. She got stuck. And so, you know, children can make all technologies bad. But apart from that, you know,
Starting point is 00:52:35 I think like trying to find these like little examples of technology that aren't about the kind of like big hype of AI, but maybe bring us back into the ways that we use technology to reshape our world and make things a bit better is what I like to do with this question.
Starting point is 00:52:50 When you think of like, you know, I think that the one version is like, well, this generative AI, like it democratizes certain things. And I think while on one hand it may allow people access to like create things that they haven't before. It also makes other, like you're saying, it's the use of it that makes things that sort of ultimately determines whether or not a technology is good or, you know, used in a positive or negative way. Is there, like when you look at all, like for all the people that are preaching and proclaiming about how AI is opening the door to something new. What, like, as it relates to sort of these large language models,
Starting point is 00:53:28 what are the ways that that can't, like, is that more about a use case or we need to lean more into the regulations to make sure that AI isn't wielded by nefarious powers? Like, how do you look at that specific technology and think, okay, while there is definitely, like, a lot of biased or weird uses of it, like, there's also another way to look at this and not just kind of, okay, while there's definitely like a lot of biased or weird uses of it, like there's also another way to look at this and not just kind of like lean into the Skynet version. Yeah, I mean, I would take this in a lot of different ways. Like my first and sort of most important route in is I would say who makes these models
Starting point is 00:53:59 and who has control over them and who can afford to. Because, you know, one of the big changes, I think, that's happened in the last few years is that language models have gone from much smaller models that maybe like one researcher with a reasonable budget could train themselves in a lab to being these absolutely huge models that you need a massive amount of energy,
Starting point is 00:54:20 a massive amount of data, and a massive amount of money to create. And so what that means is that companies with a first mover advantage like Google, like OpenAI are the ones who can afford to make these models. And I think increasingly, it's going to be harder and harder as the models get bigger for small firms to enter that market. So what we end up with then is a monopoly. And I think we're starting to see some of the effects of that monopoly right now, where
Starting point is 00:54:44 you have a few big tech firms kind of having a hand on, like, most of the most powerful and effective models. And so I think, like, even though people say, oh, this is going to democratize AI because everyone can generate text with these models, it's like, yes, but, you know, very few people are profiting from it. And also, you know, I think very few people then have control as well over how long we're able to use those models for. One day, will they just all be turned off, or will they be shifted in some way? You know, so I think there's still kind of a concentration of control. Yeah. And I think second, like you mentioned, kind of biases and, like, weird stuff in the models, super important. Large language models are trained on data scraped from the internet.
Starting point is 00:55:26 The internet can be not the best place, as we all know; it can be rife with, like, all kinds of misinformation. But it's also full of a lot of exclusions. So, like, for example, again, when I was working, talking to these data scientists and engineers at this big tech firm, we'd ask things like, oh, well, where do you pull your training data from? And they'd say, Wikipedia, for example. And, you know, we'd say, oh, but like, Wikipedia is not a very equitable place.
Starting point is 00:55:51 Like, women are really, really vastly underrepresented on Wikipedia pages, both in terms of who writes them and also in terms of who gets Wikipedia pages written about them. So the physicist Jess Wade here in the UK has had this long-running project where she just adds a woman to Wikipedia, like, every day. And she's done that, I think, now for, like, years, but that kind of just shows how inequitable that distribution is. If you're training a model on data from Wikipedia, then implicitly, even if you're not trying to do this in any way, you're also training that model to believe in a world where, say, women make up 20% of the population,
Starting point is 00:56:28 not 50% plus. So there's a lot of, like, biases and harms that come just from exclusion. Another example of this is, you know, I have a good friend who is a linguist, and something she talks about is that communities that don't have written languages are already automatically just not going to be able to partake in whatever benefits might come from large language models, whether that's signed languages or languages that are only oral.
Starting point is 00:56:52 And so, you know, I think there's just a lot of different ways that, even beyond these kind of immediate harms of, like, the AI has produced something that we think is really offensive or gross, we could see the use of large language models maybe creating further inequities. Now, a lot of the AI promise, like the developments that are being promised, so even at Davos, like this past Davos, which I had to miss unfortunately, and I hate to miss Davos because I learned so much there. But you've been to four Davoses, haven't you? My fourth Davos was the best, man. We all did molly and just had a cuddle puddle. But Sam Altman, I don't know if he's, like, consciously scaling back people's expectations, but he was like, soon you might just be able to ask, what are my most important emails today, and have AI summarize them. I was just like, all right. Doesn't Outlook already offer, like, a shitty version of that already?
Starting point is 00:57:55 So it just feels like the versions of AI that I'm hearing, like there's this older New Yorker article that was like, I'm not that worried about AI, I think it's from, like, the kill-us-all perspective. I think it's going to be like a little McKinsey in everyone's pocket. Like it's going to be this, like, economic optimization tool that, like, everybody has access to, and that's going to just make everything shitty and boring. So I, I don't know, like, I'm just curious for your thoughts on that, and, like, if there are examples of just, like, functionality from AI that actually, like, capture your imagination, where you're like, oh shit, that would be, like, cool, that's a cool idea of, like, something that would be fun and, you know, improve people's lives, even if it's just, like, make their video games better or whatever. Yeah, I mean, maybe the email thing appeals to some
Starting point is 00:58:51 people. Personally, I want fewer emails. I don't want a summary of my emails. I just want my inbox to quietly shut down between the hours of, like, 5 p.m. to, like, 10 a.m. every day and just be like, I'm email-free. And then the writer Ian Bogost, I think, has this idea of hyperemployment, which is, like, the technologies that say they're going to make our lives easier and more stress-free actually make our lives much busier, and we now waste a lot more time. So he talks about emails as a way of, you know, saying, like, oh, we're going to have five fewer meetings and we're going to have to spend less time, like, sending each other letters or whatever, sorry, I'm from the post-internet generation, but then now we spend, like, so much time answering emails. And I think AI feels a bit like this. Like, when people say, like, AI is going to save you so much time, I'm like, you are not a
Starting point is 00:59:40 teacher and educator, because the amount of time we have wasted this year trying to figure out what to do with AI-generated essays, like, absolutely not. Um, yeah, and so I feel like things like the AI email summarizer, I sense, could end up in a very similar pile. But, you know, kind of to come to the more positive side of your question, like, what makes me excited, I think a couple of things. Like, one is anything where AI can genuinely scale up, in a way that is not too ecologically damaging or costly, a process that is already going well, where the statistics and the procedures in place are working for us. Because AI is able to scale things. It's not necessarily able to do new things always. So if we know we have a sorting or categorization process that works,
Starting point is 01:00:26 that's when I think AI and computer vision, these kinds of systems, can be really useful. Where it doesn't work is when you're asking AI to do something that we don't actually have good processes in place to do. So when a tool says, oh, I can tell a candidate's personality from their face, no, you can't do that. That's just straight-up phrenology. Please don't do that. But also, secondly, like, you know, if we had easy ways of telling if someone's going to
Starting point is 01:00:50 be good for a job, like, humans would be able to do it already. Like, this is much, much more complicated than you're making it out to be. Um, a second kind of use I think I find really exciting, or makes me, like, you know, really happy, I think, is particularly tools around trying to, kind of, like, support particular communities' needs in a way that is really driven by that community. So for example, in New Zealand, where I'm from, there's been a lot of effort put into different kinds of, like, AI-powered tools
Starting point is 01:01:18 and data sovereignty programs around Māori traditions, the Māori language or te reo Māori. And, like, I think, you know, this is an example of where, like, that's been led by Māori people and is in response to, kind of, the way that in colonial New Zealand, like, te reo Māori was very deliberately stamped out, and there's been a huge movement to try and kind of protect and revive the language. And I think it's, like, when you have projects like this, that makes me a bit more hopeful about the way that AI and machine learning could be used to promote these pro-justice projects. But I think those projects always have to exist in a little
Starting point is 01:01:55 bit of tension with big tech. And we've seen this with other organizations, for example, like Masakhane, which is an amazing grassroots organization, which aims to bring, like, the 4,000 different African languages into natural language processing models or large language models. You know, but I know that these kinds of groups often do struggle with this idea, like, do we commercialize? Because then will we be brought into this hyper-capitalist world? Do we keep this to ourselves? But yeah, I think it is important sometimes to step back and be like, there are really interesting community projects which are trying to use these techniques and these kinds of knowledge in ways that push back against the email summarizing bot.
Starting point is 01:02:34 Right. Is there a historical precedent? I've seen some articles about the competition between the United States and China, and how the U.S. is, like, trying to freeze export of, like, certain chips to China because they think those chips will allow China to catch up with them. And it feels like there's going to be, inevitably, an argument where they're like, we need to just go pedal to the floor on AI development because this is the new Manhattan Project, trust us. You talk a lot about just, like, this alternate possibility of what if we didn't use this technology for hypercapitalism and militarism. Are there examples, like, from history that you're aware of, where, like, technology has successfully been protected from those sorts of things?
Starting point is 01:03:34 Are there any, even if very small examples, where people have been able to keep technology like fenced off from that sort of thing? Yeah. No, I mean, this is something that really preoccupies me. I spend a lot of time mapping and tracking, with the AI Now Institute, this narrative of an AI arms race between the US and China, and how that story is super damaging because it can cause this race to the bottom and lead to us trying to develop AI faster and faster without necessarily trying to make it better or safer. And, you know, I think we've seen some positive movements when it
Starting point is 01:04:10 comes to AI regulation recently from like the US's commitment or declaration around AI through to the Bletchley Declaration in the UK and the EU AI Act. But at the same time, you know, I think that this sort of racing narrative like still lurks and is still sometimes used to try and push back against regulatory measures, particularly by people
Starting point is 01:04:32 with investments in big tech. But yeah, at the same time, I also think this question of like, can we look to history to find ways to tell different stories about AI, maybe bring about different futures is something that really interests me.
Starting point is 01:04:50 I think the comparison that is most often made between AI and another technology when it comes to, like, regulation and governance is nuclear, and specifically nuclear weapons. And like you mentioned the Manhattan Project, certainly this sort of language of, like, an Oppenheimer moment when it comes to AI, and this kind of idea of being on the cusp of a new cold war that will be AI-driven rather than nuclear-weapons-driven, is a very common media narrative. But something I try to do with my own research is to try and look for and support different kinds of historical analogies that maybe offer less hawkish futures when it comes to international politics. So, for example, Maya Indira Ganesh, who's a fantastic researcher at the Leverhulme Centre where I'm at, looks to, like,
Starting point is 01:05:33 histories of feminist cyber governance in the early 2000s as a way of saying, like, actually, we have a lot of precedents for thinking about the ethics of the internet, why aren't we bringing this into thinking about AI? Or Matthijs Maas, who's a legal researcher, looks at histories of technological restraint, so, like, when did we choose not to make a technology, even though we could, because we thought that it actually wouldn't be good for the world and for societies? And so I think, you know, like, making sure we have examples outside of nuclear, because while nuclear can still be useful in some ways, like, it's only one metaphor, and metaphors are inherently limited. They tell us something about the world, but they
Starting point is 01:06:10 can't tell us everything. And I think having these alternative historical examples can be really useful for thinking a bit differently. Yeah. So yeah, it's just funny, because, like, in the end, it always feels like anything with the potential to create unforeseen levels of productivity or power, it's like, and then we made it a weapon. Then we put it in the bomb. And yeah, we really do have to sort of break out of that thinking. With Oppenheimer and the Manhattan Project, it feels like we're just in this really weird pattern of always looking at something
Starting point is 01:06:48 that has the potential to unlock new levels of something and inherently always going, but how do our enemies kill us with it? And then we begin to lose the plot there. So yeah, I look to these other examples to try and, again, open my mind to looking at it less as, and then how do they make global domination with that.
Starting point is 01:07:11 Right. Yeah. I mean, I think it's either, it's like, how do we kill someone with it? Or, as I think the history of how tech is represented in Hollywood would show us, like, how do we have sex with it? And this is, like, a very classic trope in sci-fi, right? It's like, you get, like, a dude in his basement who makes, like, a sex bot. And I remember interviewing Jack Halberstam, who's, like, this very famous feminist and queer theorist, with Eleanor on the podcast, and he
Starting point is 01:07:35 was talking about the film Ex Machina. Have you seen this? It's from, like, 2014. It's kind of an indie film, but it had quite a lot of prestige and, I think, success, particularly in tech circles. Yeah, for sure. Yeah, yeah. And so I remember we were talking about Ex Machina, and Jack Halberstam was saying it sort of shows the limits of the imagination, particularly the tech bro imagination,
Starting point is 01:07:58 that he has all this expertise at his fingertips, all this data, and he basically just makes, like, sex robots, and that's all he can really think to do with it. And I think, you know, to some extent, like, we're still a little bit trapped in that imagination, which is why I think, like, both different projects to do with AI,
Starting point is 01:08:16 but also different stories about AI are really crucial. Right. Yeah. We have to get out of the fuck-or-fear paradigm, that we have the technology and it's going to do one of the two, man. So, yeah, we need new ways to look at it. I'm here to do two things. Fuck something or kill it because I'm scared of it. Yeah. Well, Dr. Kerry McInerney, what a pleasure having you back on the Daily Zeitgeist. Where can people find you, follow you, all that good stuff? You can find me at Kerry A. McInerney on Twitter slash X, reluctantly still there. And also check out the Good Robot book, which you can find at Bloomsbury for pre-order. It'll be out next month. I think also on Amazon. But yes, please take a look if you're interested in all things feminism and technology.
Starting point is 01:09:06 Amazing. And is there a work of media that you've been enjoying? Oh, gosh. All that's coming to mind right now is this tweet that I saw. I just moved to London quite recently, and it was a tweet where someone had said, should I, like, stay in bed, or get out of bed and inevitably spend $140? And I feel like that's literally my life right now. I just spend all my time deciding, do I risk going outside in this very beautiful but very expensive city? So I've been enjoying that.
Starting point is 01:09:34 Yeah, it's funny too, because even comparatively, coming from LA, I'm like, man, I wish there were London prices here. Yeah. The US is bad right now. Right now. But it's going to get better once AI kicks in. Yeah, exactly. Can AI bring the prices down? Uh, no, unfortunately. But, like, it's funny too, because we're just on the heels of, like, that story that came out about how, yeah, a lot of the cost-of-living ills that we're feeling are because of greedflation, really nothing to do with
Starting point is 01:10:03 anything else aside from companies being like, yeah, yeah, we can squeeze them. Let's squeeze them. If only there was a daily comedic news podcast that has been saying that for years! That makes me very mad. And if there is, let us know about that podcast. We'd love to listen to it. Yeah, that sounds like a good show.
Starting point is 01:10:19 Miles, where can people find you? Is there a work of media you've been enjoying? Yeah, let's see. You can find me at Miles of Gray at all the app-based platforms. You can find us, Jack and Miles, on our basketball podcast, Miles and Jack Got Mad Boosties. And if you like a bit of 90 Day reality trash, you can check me out on 420 Day Fiance that I do with Sophia Alexandra. Let's see, let's see. A tweet I like, this one is from The Hype, with, like, five or four Y's, who tweeted: boarding a flight, a guy's trying to sit
Starting point is 01:10:57 down in seat 16D, because that's his seat, but there's a lady already sitting there. He asks if she's in the right seat, and she says, no, I'm actually 15D, but 16 is my lucky number, so do you mind if I stay here? And he goes, what? What?
Starting point is 01:11:19 It's wild, like we always see stories like this where people are, like, just on, like, just for vibes purposes, like, do you mind if I, like, don't sit in my seat? Because I just like a window better. Like, huh? No, no, this is my pet peeve. I'm generally a pretty easy switch on the plane. I don't really give a fuck. I mean, obviously if I'm sitting next to my kids, but, like, if it's just me, I'll switch with you. Yeah, yeah. You can target me, weird vibes people.
Starting point is 01:11:44 I'll switch with you. I'll be like, oh, you seem attached to it. I mean, I get, like, if I'm just sitting in an identical seat just in a row behind, I guess not, but it's just so weird for the person. That's where it just irks me a bit, is that you feel that you are entitled to your vibe seat. God, like, what it must be like to live in a head like that, though.
Starting point is 01:12:09 The plane goes down, right? The plane goes down, right? The plane goes down, right? Yeah, then you know what? I'm not feeling too great. So you know what? I think I want my original seat. Thanks.
Starting point is 01:12:20 Get your ass to 15D. Dr. McInerney, you were saying that's your pet peeve, somebody trying that. It really is, because I'm like a wet towel of a person. So if someone just, like, sits in my seat and it's like, this is my seat now, I'm like, I resent you so much, but also, am I really going to pick a fight? But what if you're just like, well, 16 is my lucky number? Oh shit, it's mine too. That's crazy. Who's going to win on that? Yeah, but no, I resent that. So the person that tweeted, I'm with you. Yeah, out of my seat, please. I mean, I get, like, from Jack, your perspective is like, whatever,
Starting point is 01:12:52 it's not that big of a deal for me to, like, sit there, but there's also just this part of, like, how quickly the world just owed you everything. Oh, oh, I will be sitting in my new seat for the entire flight, sweating with anger at them. You're like, you know what, I'm going to say something. Ladies and gentlemen, we are making our descent, so if you'd like, please put your seat backs up. Oh, I should have said something. Should have said something. Next, I see, it's 27C, right across from the lavatory. Okay. The stink seat over in Stink Hut. Tweet I've been enjoying.
Starting point is 01:13:34 Something that got retweeted a lot was this tweet from Emily K. May, who tweeted: the thing about Taylor Swift is that she so perfectly encapsulates through her lyrics the interior lives of women. It's why we all can't stop listening. We're all saying, wait, you felt that way? We were all feeling this way. Do men have someone like that? And then metal.txt responded, we do, yeah, and screencapped the chorus from The Boys Are Back in Town, which reads: the boys are back in town, the boys are back in town, I said the boys are back in town, the boys are back in town, I said the boys are back in town,
Starting point is 01:14:07 the boys are back in town, like, seven times. I didn't realize that it said it so many times in a row. So yeah, we got that one covered. You want to know how men's minds work? We're just excited that the boys are back in town, and that's basically
Starting point is 01:14:23 it. You can find me on Twitter at Jack underscore O'Brien. You can find us on Twitter at Daily Zeitgeist. We're at The Daily Zeitgeist on Instagram. We have a Facebook fan page and a website, DailyZeitgeist.com, where we post our episodes and our footnotes, where we link off to the information that we talked about in today's episode,
Starting point is 01:14:45 as well as a song that we think you might enjoy. Miles, what song do we think people might enjoy? It's rainy here in L.A. I've been listening to some jazz, you know, some saxophone from Joe Henderson, who's a great sax player. This is a track called Black Narcissus. Just a nice rainy day song just to have on and just look out your window, do some reading, whatever you want to do.
Starting point is 01:15:08 But yeah, this is it. Joe Henderson, Black Narcissus. All right, we'll link off to that in the footnotes. The Daily Zeitgeist is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. That's going to do it for us this morning. Back this afternoon to tell you what is trending.
Starting point is 01:15:23 And we'll talk to y'all then. Bye-bye.
