The Daily Zeitgeist - Down The Stupid AI Rabbithole 06.25.24

Episode Date: June 25, 2024

In episode 1697, Jack and Miles are joined by hosts of Mystery AI Hype Theater 3000, Prof. Emily M. Bender & Dr. Alex Hanna, to discuss… AI Is Breaking The Internet and The World, Debunking Lies About AI Magic, Dangerous And Harmful Ways AI Is Actually Being Used and more! LISTEN: Out In The Sun (Hey-O) by The Beach-Nuts

See omnystudio.com/listener for privacy information.

Transcript
Discussion (0)
Starting point is 00:00:00 They had that, like, continuous Seinfeld generator for a while, which I didn't... Oh, really? Yeah. I forgot what happens. I think it just went to... and started doing racist things, though. It started talking like Jerry Seinfeld. Yeah, exactly. It started talking like Seinfeld and Michael Richards. Yeah. The IDF is doing great things! I'm telling you! What's with it? With these student protesters?
Starting point is 00:00:36 What's the deal? They all got the same tent! Who's giving them the tents? That is his material. Thanks. That is his material. I'm Jess Casavetto, executive producer of the hit Netflix documentary series Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me For I Have Followed. Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and Shekinah Church.
Starting point is 00:01:10 Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Gianna Pradenti. And I'm Jemay Jackson-Gadsden. We're the hosts of Let's Talk Offline from LinkedIn News and iHeart Podcasts. There's a lot to figure out when you're just starting your career. That's where we come in. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in people who do, like negotiation expert Maury Teherry-Poor. If you start thinking about negotiations as just a conversation, then I think it sort of eases us a little bit.
Starting point is 00:01:43 Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I'm Keri Champion, and this is season four of Naked Sports. Up first, I explore the making of a rivalry, Caitlin Clark versus Angel Reese. Every great player needs a foil. I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports.
Starting point is 00:02:07 Listen to the making of a rivalry, Caitlin Clark versus Angel Reese, on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Elf Beauty, founding partner of iHeart Women's Sports. Hello, the Internet, and welcome to season 344, episode 2 of The Daily Zeitgeist, a production of iHeartRadio. And this is a podcast where we take a deep dive into America's shared consciousness, America's deep brain, if you will. A little tip of the cap, because we're big AI fans. A little spoiler.
Starting point is 00:02:41 We're big AI fans now, folks. Miles and I have seen the light come around. I think when it's all you see on social media, you're like, this is art. Oh, yeah. Maybe this is cool. Yeah. It's Tuesday, June 25th, 2024. Oh, yeah.
Starting point is 00:02:58 Big day. It's National Catfish Day, which is odd because it's also my partner's birthday. Happy birthday, Her Majesty. Have you been catfishing me this whole time? You guys haven't met in person yet still, right? But one of these days. No, well, and every time I want to video chat, she says her phone's broken, so we just kind of stick to the phone call stuff. Um, but it's also very normal at this stage in a marriage. Very normal. Also, this is so weird, and this is just, like, some weird religious stuff: it's National Leon Day. Did you know this is National Leon Day? Do you know what that even is? It's because you're six months away from Christmas. Noel. No. It's the most opposite time of the year. What does this even mean? That Jesus gets a half birthday instead of... Ding dong, they dong, ding. Ah, Dr. Seuss. All right, anyways, happy National Leon Day. We're gonna need to workshop that. Maybe we can run it through an AI comedy writer, because those are good. And in no way just humans posing as AI comedy writers. Anyways, my name is Jack O'Brien, a.k.a. Big Ass Plumpers to Thin Ass Fools. Guys keep it coming like Grimace Spood. We tell it on the podcast, breakin' all the news.
Starting point is 00:04:20 Say I'll be goddamned if there ain't more raccoons. That is courtesy of Halcyon salad. I like that name. Halcyon salad. I see what you've done there and I enjoy it. I'm thrilled to be joined as always by my cohost, Mr. Miles Gray.
Starting point is 00:04:37 It's Miles Gray. I got them bell, big boots. When the weather suits. Yeah, I got that swag, because I'm a dad. Shout out Razzak on the Discord, because, yeah, like I said, I have zip-off cargo pants that go from pants to shorts, because I'm a dad. And that's, like, just mandatory swag that you have to have.
Starting point is 00:04:58 You unzip the bottom of the shorts. Yep. Which is what they're designed to do when it gets warm. At the knee, yeah. And then you flip them. You invert them and put the cuff at your knee so you've got little bell
Starting point is 00:05:11 bottoms, like two-part bell bottoms. Yeah. Three-part bell bottoms, I guess. I saw it on Pinterest. It could have been AI, but it looked like a thing normal people do. That's where we get our fashion from. Yeah, yeah. It might have been completely off, but hey. Most people say, hey, good luck on you.
Starting point is 00:05:27 Miles, we're thrilled to be joined by the hosts of the Mystery AI Hype Theater 3000 podcast once again. It's Dr. Alex Hanna and Professor Emily M. Bender. What's up, guys? Welcome to the show. Hello. What's happening? Welcome back. Hey, so glad to hear you.
Starting point is 00:05:46 What was the thing about the raccoon? A raccoon ripped a crow in half in my backyard recently. Oh my gosh! Yeah. Like, it's just WWE style. Just put it over its knee with its strange human hands. I didn't see it happen. I just found the crow.
Starting point is 00:06:03 The aftermath. I have grown friendly with the murder of crows in my backyard, as one does, and found the crow's body right next to a garbage can that had been flipped over by what could have only been a raccoon. And we do have a raccoon family living nearby. So if you have an in for raccoons... Have you ever seen the video where someone's feeding a raccoon cotton candy? No. Oh, and they, like, take it to the water. I've seen, like... Oh, they wash it. Yeah, they wash it, and it dissolves, and, like... No! Yeah, they always wash their food. Yeah. Very clean. Respect it. I just, like, have a much healthier respect for raccoons. We should be treating raccoons like handguns. They're very impressive and dangerous, and we should just give them the proper respect. They have opposable thumbs. Yeah, exactly.
Starting point is 00:06:56 And they hunt birds. So it's all... I didn't realize the food-washing thing. My mom famously has opened her home to possums and other neighborhood wildlife where she lives, and I remember in the kitchen where the cat food is, there was, like, a bunch of kibble in the water bowl, and I was like, what is going on? And Mom was like, I think that's what the raccoon does. Yeah. And I was like, oh. And it was said so casually that I was like, in your home? She's like, yeah, yeah, yeah, but then it leaves. And I was like, this is very... Okay, well, does it just use, like, the cat door? Yeah, yeah, it
Starting point is 00:07:31 comes through, washes the kibble, has a few bites, and then takes off into the night. It's like a pit stop for one of the raccoons in the night. Yeah. So I'm leading a raccoon-based Dungeons and Dragons campaign starting next week. That's going to be a heist that takes place in the warehouse where I play roller derby. I'm very excited. I've got it
Starting point is 00:07:56 really architected. I don't want to give any secrets in case any of my player characters listen to this podcast. Right. Okay. That sounds amazing. And what a coincidence. Raccoons having a bit of a moment on this podcast. Yeah,
Starting point is 00:08:10 they are. So in addition to being hosts of the wonderful Mystery AI Hype Theater 3000 podcast, which, podcast host being the highest honor one can attain in American life... But you both have some pretty impressive credits. Emily, you are a linguist and professor at the University of Washington, where you are
Starting point is 00:08:30 director of the Computational Linguistics Laboratory. Yep, that's right. Do we have that right? Okay. Alex, you are director of research at the Distributed AI Research Institute, both widely published, both received a number of academic awards, both have PhDs. We had you on the podcast a few months back, told everyone the truth about AI, that a lot of the stuff that we're scared of and a lot of the stuff we think it can do is not
Starting point is 00:08:58 true. It's bullshit. And I sat back and was like, well, we'll see what AI does after this one. And it's just kept happening, you guys. What the heck can we do, if anything? It's gotten worse since we told everybody the truth. What's happening? Truly, you know, everybody seems to want to believe, and it's absurd. Yeah, it is so wild. Yeah. And part of what we do with the podcast, actually, is, like, try to be a focal point for a community of the people who are like, no, that's not right, why does everybody around me seem to believe that it's, like, you know, actually doing all these things? Yeah. And so, yeah, that's what we say on our podcast: every time we think we've reached peak AI hype, the summit of Bullshit Mountain, we discover there's worse to come.
Starting point is 00:09:46 Like, it's not stopping. Oh yeah, this is just a base camp until you get to the real peak. Well, you just keep on thinking that, and it keeps on becoming true, and there's more and more things that these CEOs just, you know, are really
Starting point is 00:10:02 just saying incredible nonsense. I don't know if you saw this. I think it was last week: the chief technology officer of OpenAI. Oh, yeah. Mira Murati. Murati. Yeah. Who famously was... I think it was an interview on 60 Minutes.
Starting point is 00:10:19 And when they were talking about one of their tools, Sora, you know, they had asked if they... My favorite filmmaker. Yeah, exactly. My favorite, yeah. You speak Kubrick, now it's Sora. Thank you. Yeah, exactly. Just Sora really edging out, you know, David Lynch these days.
Starting point is 00:10:34 And so, you know, and they asked her, do you train this stuff on YouTube? And she kind of grimaced so painfully. Yeah, we covered it. And I remember a great Twitter comment that was like, well, if you're going to just lie about stuff, you at least have to have a good poker face about it. Yes.
Starting point is 00:10:52 And so then last week she was doing another interview, and she was like, well, some creative jobs are going to go away. Like, some artists should be, you know... Some creative jobs maybe shouldn't have existed in the first place. Right. As if, like, these jobs were an affront to God or
Starting point is 00:11:11 something. Some of them just shouldn't have even been there. But she does have a French accent, so it's really hard to be like, this is ridiculous. She's Italian. And that's what's amazing about her having a French accent. I don't... I'm not a cultured person. I don't know the difference between... They're all French to me. I'm American. Right. She has a Canadian accent, I think.
Starting point is 00:11:34 I'm not sure. I know. It's only, it's only plagiarism if it comes from the French region of Italy. That's right. That's right. That's right. Yeah, we're going to get into that story and just, yeah. All of the madness that has continued to happen, the bullshit has continued to reign even harder, it seems like.
Starting point is 00:11:59 Yeah. Which, yes, does make the mountain go higher, unfortunately, the bullshit mountain. But before we get to that, Emily, Alex, we do like to ask our guests, what is something from your search histories that's revealing about who you are? Alex, you want to kick us off? Oh, gosh. Okay, the thing is, I don't think... So I use DuckDuckGo, and so it doesn't actually keep the search history.
Starting point is 00:12:27 And if I actually look at my Google history, it's actually going to be really shameful. It's going to be me, like, searching my own name to see if people are, like, shit-talking me online. This is just how we tell if someone's honest, as if they actually give that answer. You actually search yourself.
Starting point is 00:12:42 But I think the last thing I actually searched was, like, queer barbers in the Bay Area, because I haven't had a haircut in, like, a year, and I think I need to trim up or, you know, air out the sides of my head for Pride Month. So that's, yeah, that's the last thing I searched. What are you going with? You're going full shaved on the sides? I think maybe trim it a little bit and spin it up the back and bring out the curls a little bit. Okay. Love it.
Starting point is 00:13:13 On board. I wish I could bring out my curls. You've got a few more days in Pride Month to get that done. I know. In July, you're like, you do discounts? Do discounts?
Starting point is 00:13:24 I'm late. It's like after Valentine's Day. Do I get an undercut at 50% off now? Right, exactly. Emily, how about you? What's something from your search history? So forgive the poor pronunciation of this and the rest of the story
Starting point is 00:13:38 because Spanish is not one of my languages, but champurrado. Oh, yeah. Is something I searched. Yeah. So I was in Mexico City for a conference last week, and at one of the coffee breaks, they had coffee and decaf coffee, and then they had champurrado con chocolate oaxaqueño. And you're kind of telling on the Spanish
Starting point is 00:13:57 pronunciation, by the way. I don't mean to take a stab at that. What do you see when you see that word, champurrado? Mexican hot chocolate. All right. Yeah. You're literally reading the Google results. So the labels all had, like, translations into English. And so it was champurrado with Oaxacan chocolate. I'm like, yeah, I got that. What's champurrado?
Starting point is 00:14:19 And so I look it up because I want to know what I'm consuming before I consume it. And it's basically a corn flour based thick drink. So like chocolate corn soup. It was amazing. Chocolate corn soup. You had me until chocolate corn soup. But the corn is just a thickening agent. Thick chocolate drink.
Starting point is 00:14:40 Thick chocolate drink with a slight corn flavor. Like, think corn tortilla, not corn on the cob. Yeah, yeah. Ooh, yeah, yeah, yeah. That sounds amazing. Yeah. It was really good. I love some corn flakes in a chocolate bar.
Starting point is 00:14:53 Uh-huh, uh-huh. So corn and chocolate. There you go. You got to arrive in your own way as to why that appeals to you. That's right. So I'm back on board with the thick corn chocolate drink. It was really good. And just awesome that it was there.
Starting point is 00:15:05 Like, you know, the coffee breaks had, like, the Mexican sweet breads and stuff like that. But otherwise, it was pretty standard, like, coffee break stuff. And all of a sudden, there's this wonderful mystery drink. Yeah. One of the big urns. It was lovely. That sounds great. What is something you think is underrated, Emily?
Starting point is 00:15:19 I think Seattle's weather is underrated. Okay. Yeah. Everyone makes fun of our weather and like, you know, fine. Believe that we don't need lots of people coming here. And it's true.
Starting point is 00:15:29 It gets dark in the winter, but like almost any day you can be outside and you are not in physical danger because you are outside. I guess that's, that's, I mean, if you're going for, yeah,
Starting point is 00:15:40 that's interesting. But I mean, the winters are just so punishing, though. It's so gray. It's dark. It looks like shit, but experientially, not bad for you. I mean, yeah, I know. It doesn't get all gloomy, I imagine, in the summer, right? You have wonderful blue skies and you can enjoy the summer.
Starting point is 00:16:06 The summers are gorgeous, fire season aside. Right. But yeah, from sort of mid-October to early January, it can be pretty, like, it's gray. And so, like, when the sun is technically above the horizon, it's a little hard to tell. Yeah, right, right, right. So, but, you know, compared to, like, Chicago, where you have maybe four livable weeks a year between the two hot and the two cold. Wow. Wow.
Starting point is 00:16:29 Don't do that. Because my thing was going to be Chicago, because I was just there. And I was going to say my answer was going to be that Chicago is the best American city. I stand on this, like, 100%. For two weeks out of the year, that's very true. No, absolutely not true. No.
Starting point is 00:16:51 I'll even deal with the winter. I mean, okay, I'll be honest: if the weather in Chicago... if I could bring Bay Area weather to Chicago, I would live in Chicago. I mean, there's other reasons, but I mean, it's... Look, the vibes? Immaculate. Street festivals, the neighborhoods. It's the
Starting point is 00:17:14 one place that's probably... the food. It's still comparatively affordable compared to the coasts. Radical history, you know. Just, you know, some of the best politics. Yeah. They shot The Fugitive there. They shot... What did they shoot there? The Fugitive. Oh, that's a deep cut. Yeah. I mean,
Starting point is 00:17:37 I think they've shot a lot of Batman movies there because, you know, the iconic kind of lower Wacker Drive and they call it Gotham. And it's, yeah. That's pretty cool. Great city, crappy weather, right? If you're going to dump on weather somewhere,
Starting point is 00:17:52 everyone makes fun of Seattle's weather. Honestly, Emily, this is a hot take: I'd rather take Chicago's weather than Seattle's weather, because I can't do gray. I feel like I'm on Crossfire. I can do frigid. I cannot do gray.
Starting point is 00:18:10 It's too depressing for me. Well, this is why I say, like, don't move to Seattle if you can't handle our weather. Like, the people who move here and then complain about the weather are the worst. Yeah, it's like, what'd you expect?
Starting point is 00:18:19 Like, all of this, what they say is true about it being gray. And they're like, oh, I didn't expect it to be that gray. Right. Why do you think people talk about it like that? All right, Alex, let's stay with you. What is you guys's overrated? And please do it in a point-counterpoint style that also contradicts one another. Well, I gotta think about what's overrated these days.
Starting point is 00:18:42 oh i just don't know what's in the... I know the name of the show is The Daily Zeitgeist, but I don't really know what's in The Zeitgeist. I mean, I guess Taylor Swift. I mean, I don't really have... Maybe that's controversial. I'm saying something that's hot take, but I guess that's maybe not controversial
Starting point is 00:18:59 No. So... Joining Dave Grohl on the attack this weekend. Yeah. Wait, what happened with Dave Grohl? He was just, like, implying that she's... He's like, well, we play our music live, like raw, live rock and roll, you know, unlike the Eras Tour. You know, we've got the Errors Tour. And then everyone's like, fuck you, Dave. Or other people being like, exactly, exactly. Yeah. I mean, Dave Grohl is also overrated, I guess. But, like, I mean, I enjoy... Look, I enjoy Everlong like the next, like, middle-aged sort of, like, dad figure. But I,
Starting point is 00:19:47 you know, I'm sure I'm glad that you played every part in that song. It sounds good, but, you know, it doesn't make you an authority on Taylor Swift. So I think I'm undercutting my own point. No, let's go, Dave. Yeah, you did the
Starting point is 00:20:02 counterpoint in your own overrated. In my own, yeah. And that's great. Which is excellent, because I don't even have an opinion about Taylor Swift. Never saw Tucker Carlson do that. Was that when that show was called Crossfire? Or was that the... Well, Crossfire was with, what's his face? Tucker Carlson and Paul Begala.
Starting point is 00:20:19 That was- Was it? The one that Jon Stewart came on and was like- Destroyed? Yeah. Yeah. It was like, this show is bad. And then they canceled it a couple of weeks later.
Starting point is 00:20:29 But then there was that one show, Hannity & Colmes, where Sean Hannity was supposed to be a conservative voice, and then Colmes... I don't even know the guy's first name. They kind of just had him as a token liberal on, and then they just... It was on Fox News, so they just attacked him relentlessly. He wasn't allowed to read the news. He's like, you argue the
Starting point is 00:20:54 liberal points, but you're actually not permitted to leave this room. We're going to keep you in here, Oldboy-style, for the entire season. Oh, that was the thing at the end of 60 Minutes that Andy Rooney would do. There was... Part of 60 Minutes was Point/Counterpoint, and it would be Andy Rooney, if that's what you're thinking, Jack. I don't know. No, no, there was a show. Yeah, it was right when I got out of college and worked for
Starting point is 00:21:16 ABC News, and so everybody was always watching news, and at that time there was a big show on CNN called Crossfire. Yeah, it was Tucker... Yeah, Tucker Carlson was the conservative, Paul Begala was the liberal, and they just, like, got on and yelled at each other. I'm looking it up now. This is good. Apparently, they had a revival, and then in 2013 and '14, on the left was Stephanie Cutter and Van Jones, and then Newt Gingrich and S.E. Cupp on the right. And then whenever they needed breaking news, they'd bring in Wolf Blitzer for some reason. Extracting him out of the Situation Room? Yeah, they released him from the cryogenic
Starting point is 00:22:05 chamber. He was helicopter lifted from the situation room three rooms over to the crossfire set just with dead pan. We need you. No hint of emotion on his face ever. You guys ever seen the Wolf Blitzer episode
Starting point is 00:22:20 of Celebrity Jeopardy? No. Do yourself a favor. Is it as good as the SNL parody is the Celebrity Jeopardy with Sean Connery? He's so bad.
Starting point is 00:22:37 No. And also incorrect. One after another. He had negative... went into the red. He was in there quickly. Going into Final Jeopardy, it was, well, Wolf, we're gonna spot you $3,000, because we can't have somebody be in negative numbers going into Final Jeopardy. And I think Andy Richter was on with him and just destroyed him. It was so good. That's so funny. Andy Richter, like, destroying
Starting point is 00:23:07 Wolf. This is the kind of crossover I didn't know I needed. Yeah, it's still up there, mostly, from what I could tell. It's on YouTube. Yeah, I am an old person. All right, we still have Emily. Your overrated. What do you think is overrated? Big cars. Cars are overrated. Oh, totally. We're sort of half-heartedly looking for our next car and can't find anything that is, like, reasonably small. And the other day, I was in the parking lot for a grocery store near here. Like, mostly I can walk for groceries, but occasionally I have to drive to this other store. And half the spots were labeled compact. And, like, all of those spots were taken up two at a time by what
Starting point is 00:23:46 we now have as regular cars, because somebody's decided that people in this country don't deserve normal-size cars. Yeah. I mean, it's to the point where, like, even the people who design parking lots are like, we have to tell the automobile manufacturers: the standard we've set as people who, like, create parking lots... they're pushing the boundaries of what we can actually do or how we measure things, because the cars are so fucking big. And our streets around here in Seattle, we have a lot of neighborhood streets where there's, like, parking on both sides and then sort of just barely enough space for two normal cars to go through. Or sometimes you have to, like, pull over to the curb, and the bigger the car is, like, the harder that gets. I love that thing.
Starting point is 00:24:31 I remember one of the times I went to Seattle, seeing how everybody just parks on whatever side of the street in whatever direction they want. I was like, all right. I'm like, all right, Seattle. I was not familiar. That's fine. Yeah. A little bit of chaos. It totally offends my spouse who's like, that's not how parking works. But that's it.
Starting point is 00:24:44 Automakers just seem to be getting bigger and heavier. They won't stop until they make a car that, like, is legally required to have a foghorn on it. Right. So, the Cybertruck. Yeah, the Cybertruck. I was gonna ask, have you considered the Cybertruck? Yeah, I've seen one in person. They are hilarious. Like, you can't not laugh. Exactly. When you see one. It's such,
Starting point is 00:25:07 it is an experience seeing one in the wild. It's like, wow. Just want to say that what we really need is functional public transit. But like short of that, we also need to not be doing bigger and bigger cars. Yeah. Yeah.
Starting point is 00:25:18 No, I just... I mean, I have a truck. I have a 2020 truck, and I really wish it was much smaller, because it's hard to park.
Starting point is 00:25:28 It's way too big. I mean, I think the peak of truck design was, like, a 1987 Toyota Tacoma long cab. You know, where, like, yeah, you had to bunch up your knees in the back if you wanted to fit four people in it. But you actually had a truck bed that actually had, you know, some carrying capacity, you know. And it was a car you could absolutely run into the ground with no problems at all. Oh, yeah. Oh, yeah. Versus now, people are like, my new Ford Lightning needs a software update. Oh, God. Well, that's the thing.
Starting point is 00:26:08 It's like, yeah, I mean, that's a big deal. I know in Oregon, which is where they had a right-to-repair bill... And I mean, in some ways, the people that were kind of into it, weirdly... Like, Google actually came out kind of into it. There was a good 404 Media podcast where they talked about this. Whereas Apple, because they have such a closed ecosystem, was so against right to repair. You know, even if you have right to repair, they'd actually add on all these things where you'd still have to send it to an authorized dealer because of firmware issues or whatever. Right, right. And then John Deere. Like, John Deere is this kind of thing where they have, you know... so much of their tractors are computerized. And so there's, like, a lot of these John Deere hacking kinds of things.
Starting point is 00:26:52 So people who are outside of the U.S., you know, programming these kinds of hacks for people running these tractors who can't run their firmware. Yeah. Farmers have all the good... The GPS, that was something. But did you hear about how the GPS was out for a while with the solar flare? Oh, yeah. I heard about that. This is, again,
Starting point is 00:27:13 the 404 Media podcast, but the way those tractors work for, like, planting is so precise. Yeah. And they use GPS. To the centimeter, I think. With the GPS off, they basically couldn't plant, because then the
Starting point is 00:27:25 seeds wouldn't be in the right spot for the next process. Yeah. And so they had to wait, and there's a really narrow window, apparently, with our, you know, currently genetically modified, very, very specific corn that Monsanto owns. Yeah. And so it was actually looking pretty bad for a while. I didn't hear any follow-ups, so maybe the solar flare was short enough and the GPS came back online. But apparently that was a big thing.
Starting point is 00:27:58 Yeah. Most of it doesn't even go to popcorn in the movie theater, right? Most of it goes to animal feed or ethanol, I think. Yeah. Right. Yeah, right, right, right. Yeah. All right. Well, let's take a quick break and we're going to come back
Starting point is 00:28:10 and dive into why Miles and I are excited about the future of AI. We'll be right back. Crossfire! I'm Jess Casavetto, executive producer of the hit Netflix documentary series, Dancing for the Devil, the 7M TikTok cult. And I'm Clea Gray, former member of 7M Films and Shekinah Church. And we're the hosts of the new podcast, Forgive Me For I Have Followed.
Starting point is 00:28:45 Together, we'll be diving even deeper into the unbelievable stories behind 7M Films and LA-based Shekinah Church, an alleged cult that has impacted members for over two decades. Jessica and I will delve into the hidden truths between high control groups and interview dancers, church members, and others whose lives and careers have been impacted, just like mine. Through powerful, in-depth interviews with former members and new, chilling, first-hand accounts, the series will illuminate untold and extremely necessary perspectives. Forgive Me For I Have Followed will be more than an exploration. It's a vital revelation aimed at ensuring these types of abuses never happen again. Listen to Forgive Me For I Have Followed on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:29:26 Hey, I'm Gianna Pradente. And I'm Jemay Jackson-Gadsden. We're the hosts of Let's Talk Offline, a new podcast from LinkedIn News and iHeart Podcasts. When you're just starting out in your career, you have a lot of questions, like how do I speak up when I'm feeling overwhelmed? Or can I negotiate a higher salary if this is my first real job? Girl, yes. Each week, we answer your unfiltered work questions. Think of us as your work besties you can turn to for advice. And if we don't know the answer, we bring in experts who do, like resume specialist Morgan Saner.
Starting point is 00:30:00 The only difference between the person who doesn't get the job and the person who gets the job is usually who applies. Yeah, I think a lot about that quote. What is it like you miss 100% of the shots you never take? Yeah, rejection is scary, but it's better than you rejecting yourself. Together, we'll share what it really takes to thrive in the early years of your career without sacrificing your sanity or sleep. Listen to Let's Talk Offline on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:30:33 I'm Keri Champion, and this is season four of Naked Sports, where we live at the intersection of sports and culture. Up first, I explore the making of a rivalry, Kaitlyn Clark versus Angel Reese. I know I'll go down in history. People are talking about women's basketball just because of one single game. Every great player needs a foil. I ain't really near them.
Starting point is 00:30:49 Why is that? Just come here and play basketball every single day and that's what I focus on. From college to the pros, Clark and Reese have changed the way we consume women's sports. Angel Reese is a joy to watch. She is unapologetically black.
Starting point is 00:31:02 I love her. What exactly ignited this fire? Why has it been so good for the game? And can the fanfare surrounding these two supernovas be sustained? This game is only going to get better because the talent is getting better. This new season will cover all things sports and culture. Listen to Naked Sports on the Black Effect Podcast Network, iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:31:25 The Black Effect Podcast Network is sponsored by Diet Coke. And we're back. We're back. So just for people who haven't listened to your previous appearance in a while, and I feel like this is a broad overgeneralization, but it feels like the stuff that AI is actually being used for and capable of
Starting point is 00:31:51 is not what we're being told about through the mainstream media. Like it is not an autonomous intelligence that is going to be the bad guy in a Mission Impossible. I mean, it is a bad guy in a Mission Impossible movie, but it's not going to be a bad guy in reality, the way it is in a movie. Yeah.
Starting point is 00:32:11 The way that our actual president believes it is. That was an amazing reveal that Joe Biden basically watched Mission Impossible and was like, we got to worry about this AI stuff, Joe. It's going to know my next move. But it is, like the large language models are basically more sophisticated autocomplete that is telling you what its dataset indicates you want to hear or what its dataset indicates will make you think it is thinking, talking like a person.
Starting point is 00:32:44 In many cases that means, uh, what they call hallucinating, which is actually just making shit up. What other jobs could you say that? You're like, sorry, I was just hallucinating, and they're like, oh, okay, all good. Oh, but you wouldn't last long as a precog. Yeah, you were hallucinating. Yeah, I would be the worst precog. On the IRS, I was hallucinating on that last tax return. That's probably what happened. Can people talk about using this to do your tax returns? Yeah, right.
Starting point is 00:33:14 There's actually, yeah, there's actually, I mean, in California, there's a, whatever, the Department of Tax and Revenue, there was some great reporting in CalMatters by Khari Johnson, and he was talking about how they were using this thing, some language model, to effectively advise the people who call in, or advise the agents who respond to people who call into the California Franchise Tax Board. And they're like, well, you know, the agents are still going to, you know, have the last word. But I'm just like, yeah, but they're overworked. Like, are they going to read this stuff, you know? Right, exactly. Oh, you're going to use this as an extra thing, just an extra expense to
Starting point is 00:34:00 do the product, to do your job even better. That doesn't sound like a company necessarily. Yeah. Yeah. So an interesting thing that we're seeing happen, we pay attention when there's an AI story that captures the zeitgeist. We covered the B-minus version of a George Carlin routine that came out.
Starting point is 00:34:21 They were like, AI just brought George Carlin back from the dead. We covered Amazon Fresh, having that store where the cameras know what you've taken. And so even if you try and shoplift, the cameras are going to catch it. And then you don't even have to check out. You just walk out and it charges your account, because of AI. And then what we're seeing is that when the truth emerges, it does not enter the zeitgeist, because you guys cover it on your show, which is why we're so thrilled to have you back. But, you know, we have updates on those two stories. Carlin, that was just written by a person. The Amazon Fresh, those videos were being fed to people working in India to try to
Starting point is 00:35:08 track where everything was going, which was why there was like a weird pause, like, as people were like, uh, I think we got... okay, yeah, we're just going to do a best guess. But it's straight up like Mechanical Turk. Which, again, Amazon named one of their companies the Mechanical Turk. So they know what's going on. They knew what they were planning to do here all along, maybe. But is that kind of the model you're seeing? Big flashy announcement: this is what AI integration can do.
Starting point is 00:35:41 And then when it falls short, people just kind of ignore it? Or how does it seem from where you're sitting? Yeah, we haven't seen really good implosions yet, and it's surprising, because, like, the stuff that goes wrong goes really, really wrong, and people are like, yeah, well, it's just in its infancy. Which is a really, really annoying metaphor, because it, first of all, suggests that this is something that is like a human, like an animal, at least, that's a baby and can grow, something that is learning over time, and also sort of pulls on this idea that we should be kind to these systems because they're just little babies, right? And so if something goes wrong, it's like, well, no, it's still learning.
Starting point is 00:36:24 And we get all of these appeals to the future, like how good it's going to be in the future. And there is at this point, I think, so much money sunk into this that people aren't ready to let go and own up to the fact that... Yeah. And it is, I guess, too easy to hire exploited workers for poor pay, usually overseas, to backstop the stuff. So you gave us the Amazon Go stores actually being monitored by people in India. There was one of the self-driving car companies that admitted that their cars were being supervised by workers in Mexico. And do you remember the stats on this, Alex? Yeah, it was... So it was Kyle Vogt, the CEO of Cruise,
Starting point is 00:37:08 and then there was this reporting in the New York Times where they said they use humans. And then he was like, wait, wait, wait, you're really blowing this out of proportion. We only use it something like 3% to 5% of the time. That's a huge amount of hours. And he posted this himself on Hacker News, which is this, you know, kind of, like, I don't know, 4chan for tech bros, I guess. Well, I guess 4chan is 4chan for tech bros, but, I mean, you know, with a little less overt racism, I guess. Just a little. Yeah. But it was still, yeah,
Starting point is 00:37:46 but we're seeing this in a lot of different industries. At the end of the day, it's just outsourcing to humans. Janet Vertesi is a sociologist at Princeton. She has a piece in Tech Policy Press, where the title is something like, AI is just outsourcing 2.0, effectively. And yeah, we're seeing a lot of the same patterns that we saw in the early 90s, when these business process outsourcing, or BPO, organizations were really becoming all the rage in the US.
Starting point is 00:38:18 Right. The other thing that I see a lot too is, like, I felt early on, especially when we were talking about it, the thing that intrigued us was when everyone's like, dude, this thing's gonna fucking end the world, that's how powerful AI is. I have a whole plan to take myself off this mortal plane if I have to, the moment in which AI becomes sentient and takes over. And, like, I think it felt like maybe the markets were like, hey man, you're scaring the kids, man. Do we have another way to talk about this? And I feel like recently I see more of, like, together, when we harness human intelligence with AI, we can achieve a new level of existence and ideation that has not been seen ever in the course of human history. Like that J-Lo movie, where, like, the entire crux of the film was this AI skeptic had to embrace the AI in order to, like, overcome the main problem, conflict in the film. Or just even now,
Starting point is 00:39:14 like with the CTO of OpenAI also doing a similar thing when talking about how AI, like, some creative jobs are just going to vanish. But that's because when the human mind harnesses the power of the AI, we're going to come up with such new things. That feels like the new thing, which is more like, we've got to embrace it so we can evolve into this next level of thinking, et cetera, computation or whatever. You guys on Mystery AI Hype Theater 3000 read the research papers so that we don't have to.
Starting point is 00:39:44 And Miles watches the J-Lo movies so that you don't have to. Got to know what they're saying. But I'm glad you're watching the J-Lo, because there's so many different cultural touchstones of this. Yeah. I had to look, because I thought
Starting point is 00:39:57 the movie you were talking about was the sort of autobiography, This Is Me Now: A Love Story. And I'm like, there's a film? And I was like, there's an AI subplot in that? I didn't know that J-Lo's life was a complete cautionary tale about AI and the inevitability of it. But yeah.
Starting point is 00:40:21 Right. But sorry, Emily was about to say something. I just wanted to be snarky. So, our colleagues Timnit Gebru and Émile Torres coined this acronym TESCREAL, which stands for a bundle of ideologies that are all very closely related to each other.
Starting point is 00:40:36 And what's interesting about the transition, you notice that they've basically moved from one part of the TESCREAL acronym to another. It's all stuff that's based on these ideas of eugenics, and of a real disinterest in any actual current humans, in the service of these imagined people living as uploaded simulations in the far long future.
Starting point is 00:40:59 It's utilitarianism made even more ridiculous by being taken to an extreme endpoint. So this thing, like, it's going to kill us all, comes partially from the longtermism part of this, where people are fixated on this idea of, we have to... and it's ridiculous.
Starting point is 00:41:14 they have a specific number, which is 10 to the 58, who are the future humans who are going to live as uploaded simulations in computer systems installed all over the galaxy. And these are people who clearly have never worked in IT support because somehow the computers just keep running. Yeah, it'll be fine. Yeah.
Starting point is 00:41:32 And the idea is that if we don't make sure that future comes about, then we collectively are missing out on the happiness of those 10 to the 58 humans. And that's such a big number that it doesn't matter what happens now, right? And I always say when I relate this story that I wish I were making this up, but there are actually people who believe this. And so that's where the sort of like,
Starting point is 00:41:52 oh no, it's gonna end us all stuff lives. And that's the L, the longtermism part of TESCREALism. But this idea that we should join with the computers and become a better thing, that's the T, that's the transhumanism. And it's all part of sort of the same bundle and way of thinking. And there's this great paper out in the publication called First Monday by Émile Torres and Timnit Gebru sort of documenting the way that all of these different ideologies are linked one to the next. There's overlaps in the
Starting point is 00:42:20 people working on them, there's overlaps in the ideas, and it all goes back to eugenics and race science. Wow. Okay. Just adding on, I mean, it's really... so the doomerism and the boosterism are, you know, two sides of the same coin, even though they kind of pose each other to be different. So if you imagine, there was an article in the Washington Post, and it had a very funny chart. It was written by Nitasha Tiku, and she had this kind of grid. And it was really funny, because it was like, on one end was this guy, Eliezer Yudkowsky, who's like a big doomer. He had this thing where he wrote an op-ed in Time magazine.
Starting point is 00:43:00 He was like, you know, basically, if we need to, we have to be willing to do airstrikes on data centers. Which he actually wrote, and he actually said. Actually, speaking of Tucker Carlson, I think Tucker Carlson also was like, oh, geez, maybe we should do that. And on the other end, you have, you know, Sam
Starting point is 00:43:18 Altman, and Mira Murati. Whoa, you got a Cybertruck? Sorry. No, it's okay. Miles just won a Cybertruck. You just won a Cybertruck? Incredible. Yeah, he's going to pull it. I've got to enter my social security number. One second. Yeah, yeah. Alright, sorry. Go on, Alex. It's fine.
Starting point is 00:43:33 I'll give you mine. I want to beat you to it. And so, you know, but, you know, they're actually two sides of the same coin, because they basically want to... I mean, in the middle was Timnit, who's on this TESCREAL paper. She's also my boss at DAIR. And I'm not saying that to kiss ass.
Starting point is 00:43:52 We actually get along quite well. But she's posed as someone in the middle of this grid. So doomers and boosters both basically see AI as this kind of inevitability. We're going to get there. And to me, I think the metaphor you allude to, Emily, is kind of like thinking about this like a kid that needs to be formed. And in some ways, I think it's colonialism. It's manifest destiny.
Starting point is 00:44:21 It's always five years away. But they both see development of AI to be really, really critical to whatever's happening next. When we can be like, no one is asking for this shit. And it's taking up...
Starting point is 00:44:38 And we're seeing a lot of memes online that are like, I'm so glad that we are draining a lake, you know, to generate this image. And there's one of the images that was like a toddler holding, you know, a crucifix, with a helmet that says police on it, and they're neck deep in water, and they've got big kawaii emoji eyes. Right. And if you blur your eyes, it's Jesus's face.
Starting point is 00:45:09 Yeah. Because that's another huge thing I see. There's so much AI nonsense art out there, too. And it's super environmentally disastrous. Right. This is the thing. And a lot of it is nonconsensual, in the sense that if you try to use Google these days to do a search, the first thing you get, depending on the search, but for many searches, is the AI Overview, which is the output of one of these text-to-text models, taking way more processing power than just returning the ten blue links, which is what they used to do. And you can't turn it off. So if you use Google to do a
Starting point is 00:45:39 search, you're stuck with that. You're stuck with its environmental impact. Yeah. The one positive I'd say that I've seen more and more since we last spoke is people being like, wait, who the fuck is this for exactly? Who is asking for this? Which I think is ultimately going to become a louder and louder question. This seems to be mainly for tech CEOs who are very wealthy. But especially in countries that aren't like the United States, where corporations are not more powerful and valued more than actual humans,
Starting point is 00:46:14 I feel like it will become more and more of an issue, and then it'll be a matter of figuring out if that message actually gets in in the US. But yeah. Well, we can keep shouting it. And that's a great question: like, who asked for this? Right, right. Who asked for this, and who likes where it's going? Yeah. Yeah. Because the luster has worn off from the early days of those early ChatGPT models that came out. Like, this thing could be a doctor, this thing could be a lawyer. And then most people, like the people that I know, their first impressions are like, yeah, it helped me write an
Starting point is 00:46:49 email. That's about the most I can do with it, because I hate writing formally, and so it helped that. But beyond that, I don't know, many, I guess, personally... and I know people that work in many different fields. Like, it sort of ends up being like, yeah, I don't really have a use for it, aside from, like, making funny songs that I share with my girlfriend. You know, we wrote a song about how she loves Chipotle, and it was in the style of, you know, Crystal Waters.
Starting point is 00:47:14 Yeah. And now that's it. Yeah. Yeah. I mean, I think that's, I mean, I like this phrase that Emily coined,
Starting point is 00:47:21 resist the urge to be impressed. Yeah. Where it's like this kind of thing where you've got this proof of concept and it's got a little gee whiz to it. But beyond that, I mean, are you going to use this in any kind of real product cycle or ideation or whatever?
Starting point is 00:47:42 And there's just been, I mean, we've had a tech lash for the last couple of years here. And I mean, I think it's really, the only people I ever see praising this, and I mean, maybe this is just a virtue of my timeline that it's curated and Elon Musk hasn't absolutely destroyed it yet. But the only people I see praising it
Starting point is 00:48:03 are typically people who are very, very much in the industry. You know, they're the same people who are like, I live in San Francisco, and I support, you know, London Breed saying she wants to clean up our streets from homeless people. And, I mean, some of them are not out-and-out that fascist, but, you know, they're brushing along with it. Everyone else I see, especially also a lot of people in technology, they're like, I work in technology and I hate this bubble. And I'm so tired of talking about this. And I just want it to go away.
Starting point is 00:48:39 I mean, this is, you know... and then I see teachers. You know, we see a lot of people from different professions: teachers, nurses, doctors. My sister's a nurse. She's like, I hate this stuff. My sister's a lawyer. And she's like, yeah, I've used this to sort of start a brief, but it gets so many things wrong, I need to double-check everything. And at the end of the day, yeah, does it actually help? Yeah, probably not.
Starting point is 00:49:02 Yeah, probably not. But it's just a little baby now, Alex. It's just a little baby. It's just a little baby. You wouldn't scream at a little baby. You wouldn't kick a little defenseless baby. That comparison is so interesting, too, because it's like, and it could be a lawyer or a doctor,
Starting point is 00:49:18 which is also what parents are hopeful of. I'm so impressed by my little baby. Like what they just said, they could be a lawyer or a doctor one day. So watch for the phrase 'in its infancy' in the coverage of this; it's all over the place.
Starting point is 00:49:34 Shout out, shout, shout out. And I just want to give a shout out to Anna Lauren Hoffmann, who's a professor at UW, a really good friend of mine and a colleague of Emily's. And she's done a lot of work sort of talking about these metaphors of this kind of
Starting point is 00:49:49 babiness and how it absolves AI of being racist and sexist and absolutely fucking up. It's like, oh, it's just a little baby. Yeah, it's a baby with a billion-dollar valuation, or whatever, however much it is.
Starting point is 00:50:06 But having AI, like, mentioned in the earnings call. But if we're going to keep that consistency with that metaphor, you'd be like, then why are you as the parent wheeling this child out to do labor that it's wholly unprepared for? You look fucked up, actually. So Miles, I have kids that are a little older than yours.
Starting point is 00:50:23 And so I can say that is actually good parenting to wheel them out and have them do whatever. And it brings in the cheddar. Yeah. Jack coming on the podcast. Hot take. Child labor actually underrated. And when it's your kid, come on. When it's your kid?
Starting point is 00:50:40 That should be your call. It should be your call. Those are called chores. And yes, if they're doing the chores for a multinational corporation, it gets a little hazy. Yeah, they got an eight-hour chore shift at Chick-fil-A. Hazy because of all that money coming in. So on this point about nobody wants this, I have a talk that I've been doing since the summer of '23 called ChatGP-why, and the subtitle is, When, if ever, is synthetic text safe, appropriate, and desirable?
Starting point is 00:51:06 And it's meant for sort of non-specialist audiences. I basically go through, okay, what's a language model? What's the technology for in its original use cases? How did we get to the current thing? And then what would have to be true for you to actually want to use the output of one of these things, right? So the first thing is you'd want something that is ethically produced, so not based on stolen data, not based on exploited labor.
Starting point is 00:51:25 Basically, we don't have those. Okay, so can't tick that box. Okay. You also want something that somehow is not an environmental disaster. We also don't have that. Okay. So assuming we somehow got past those two hurdles, then it's things like, okay, well, you need a case where it's not going to be misleading anybody. It's got to be a case where you don't actually care about the stuff that comes out of it being accurate or truthful, including, you know, being able to recognize
Starting point is 00:51:52 and mitigate any biases. You also don't care about the output being original, so plagiarism is okay. And, like, by the time you've done all that, it's like, yeah, this use case of helping you draft an email, because that's tedious. Yes. Right. And is that worth the two that we started with there? Right. You know, the labor exploitation, data theft, and environmental impacts. Right. We've just broken down all of human meaning for the past, like, for civilizations.
Starting point is 00:52:31 None of the things that we've always seemed to think matter anymore, because billion-dollar companies needed a new toy to hype up for the stock market. It feels like that's very frustrating. Let's take a quick break. We'll come back, because I do want to talk about what it is being used for. I mean, AI is a very loose term, the way it's being applied. But people want to use these new advances in computing to make themselves seem more profitable. And so I want to talk about what it is actually being used for. We'll be right back.
Starting point is 00:55:59 And we're back. We're back. And on your show, you've done some good stuff on just the surveillance side of AI. Which, I mean, it turns out a lot of the technology that we initially thought was promising was just eventually used for the purposes of marketing and surveillance in the end. And it seems like AI skipped all the promising stuff.
Starting point is 00:56:36 And it's just like, what if we just went right to the surveillance? We went right to harming people. Yeah, I will say, I mean, you had mentioned that this term AI is kind of being used loosey-goosey. And, you know, I mean, AI is kind of synonymous with large language models and image generators.
Starting point is 00:56:59 But, you know, things that have been called AI also encompass things like biometric surveillance, like different systems which use this technology called, quote-unquote, machine learning, which is kind of this large-scale pattern recognition. So a lot of it's being used, especially at the border, doing things like trying to detect or verify identities by voices or by faces. You've probably seen this if you've been in the airport; the TSA has been using this
Starting point is 00:57:35 and you can still voluntarily opt out for now, but they're really incentivizing it. I saw that TSA has this touchless thing now, which is this facial recognition, so you don't have to present your ID, you can just scan your face and go. And, like, don't do that. Like, yeah, take every option to opt out. And the fact that those signs are there saying that this is optional... was it Tawana Petty? Petty, somebody. Actually, Petty, yeah. The only reason we have those signs is because of her activism, saying, like, this has to be clear to the travelers that it's actually optional and you can
Starting point is 00:58:07 opt out. So it's posted there that you don't have to do this. Yeah. All right. Then I'm going to feel you up. Sorry. Those are just the rules.
Starting point is 00:58:15 Yeah. It's just, it's absolutely... But I mean, it gets, you know, leveraged against people who fly to a lesser degree,
Starting point is 00:58:23 but, I mean, folks who are refugees or asylees, you know, people on the move, really encounter this stuff in incredibly violent ways. You know, they do things like, they take their blood and say, well, we're going to, you know, sequence your genome and say if you're actually from the country you say you're from. Which, first, it's pseudoscience. I mean, basically all biologists have been like, you can't use this to determine if someone is XYZ nationality, because nationalities are, one, political entities, they're not biological ones. And so, like, we can sort of pinpoint you to a region, but it says nothing about the political borders of a country. There's a great book I started reading by Petra Molnar, which is called The Walls Have Eyes, which is about this kind of intense surveillance state, or intense surveillance architecture,
Starting point is 00:59:25 that's being used typically at the border, the US-Mexico border, but also the various points of entry in Europe, where African migrants are fleeing places like Sudan and Congo and the Tigray region of Ethiopia. And this is just some of the most violent kind of stuff you can imagine. And it's way far away from this kind of, ooh, here's like a fake little child, or a Jesus holding 12,000 babies, riding a truck with the American flag on it.
Starting point is 01:00:05 You know what I mean? Right. So the reality is, yeah, much more stark. And you see the one-to-many image matching, so you get all these false arrests of people, because the AI said that they matched the image from
Starting point is 01:00:22 the grainy surveillance video. And it's one of these things where it's bad if it works, because you have this, like, increased surveillance power of the state, and it's bad if it doesn't work, because you get all these false arrests. Like, it's just a bad idea. It's just a don't. Um, and it's not just image stuff. So we read a while back about a situation in Germany, I think, where asylum seekers were being vetted as to whether or not they spoke the right language.
Starting point is 01:00:54 This string, what language does it come from? But it was being done based on completely inadequate data sets by people who don't speak the language or who are not in a position to actually vet the output of the machine. And so you have these folks who are in the worst imaginable situation. Like you don't go seeking asylum on a lark, right? Because your Wi-Fi broke at home. Yeah.
Starting point is 01:01:16 Right. And then they're getting denied because some algorithm said, oh, you don't speak the language from the place you claim to be coming from. Your accent is wrong, or your variety is wrong, or whatever. And the person who's run this computer system has no way of actually checking its output, but they believe it.
Starting point is 01:01:34 And then you get these asylum seekers turned away. Yeah. So how does that... You know, with everything you said, how should we feel that OpenAI recently welcomed to their board the 18th director of the NSA, Paul Nakasone? Is that bad?
Starting point is 01:01:51 Or what should we take from that? So how should we feel? Not at all surprised, right? How should we feel when OpenAI is like... okay, whatever the rest of that is, it's bad. Seems bad, man. I don't know. It seems like, again, we're talking about this technology-to-mass-surveillance pipeline, and who better than someone who ran the fucking NSA?
Starting point is 01:02:11 And I know the way it's being spun is, like, you know, this is part of Cyber Command, he inherently knows what the guardrails need to be in terms of keeping us safe. But to me, it just feels like, no, you brought in a surveillance pro, not someone who understands inherently what this specific technology is, but someone who learns how to harness technology for this other specific aim. Yeah. And surveillance is not synonymous with safety. Like, the one use case for the word surveillance that I think actually was pro public safety is a long-term study in Seattle called the Seattle Flu Study. And they are doing what they call surveillance testing for flu viruses.
Starting point is 01:02:52 So they get volunteers to come in and get swabbed, and they are keeping track of what viruses are circulating in our community. Right. I'm all for surveilling the viruses. Yeah, sure. Especially if you can keep the people out of it. Yeah, I would add a wrinkle to that, just because, I mean, that is the kind of terminology they use with health surveillance, to detect virus rates and whatnot. I would also add the wrinkle that a lot of those organizations
Starting point is 01:03:18 are really distrusted by marginalized people. Like, you're going to do what? To me? You know? Especially thinking of, like, lots of trans folks, and especially underhoused or unhoused trans folks. Just, like, you're going to do what? You want this data on me for who? Right. Yeah. Understandably. Especially because surveillance in general is not a safety thing, right? It is maybe a, like, safety for people within the walls of the walled garden thing, but that's not safety, right? That's, yeah.
Starting point is 01:03:50 The other thing about this is that what we call AI these days is predicated on enormous data collection. And so to some extent, it's just sort of an excuse to go about claiming access to all that data. And once you have access to all that data, you can do things with it that have nothing to do with the large language models.
Starting point is 01:04:08 And so, you know, this is, I think, typically less immediately, like, threatening to life and limb than the applications that Alex was starting with. But there's a lot of stuff where it's like, actually, we would be better off without all that information about us being out there. And there's an example that came up recently. So did you see this thing about the system called Recall that came out with
Starting point is 01:04:29 Windows 11? So this thing, God, this is such a mess. So initially it was going to be turned on by default. Oh, yes. Yes. Right.
Starting point is 01:04:38 Yeah. This is kind of like the Adobe story, too. Yeah. Every five seconds it takes a picture of your screen, and then you can use AI to search for stuff that you've sort of seen. And their example is something stupid. It's like, yeah, I saw a recipe, but I don't remember where I saw it. So you want to be able to search back through your activity. And zero thought to what this means for people who are victims of intimate partner violence.
Starting point is 01:05:00 Right. That they have this surveillance going on in their computer that eventually ended up being shipped as off by default because the cybersecurity folks pushed back really hard. And by folks, I don't mean the people at Microsoft. I mean, the people out in the world who saw this coming. But that's another example of like surveillance in the name of AI. That's supposed to be the sort of, you know, helpful little thing for you, but like no thought to what that means for people. It's like, yeah, we're just going to turn this on by default because everybody wants this, obviously. It's like, no, I know how to look through my history, actually. I've
Starting point is 01:05:32 developed that skill. I don't need you to take snapshots of my desktop every three seconds. Your show's covered so many upsetting ways that... It doesn't seem like it's people implementing AI, it's companies implementing AI, in a lot of cases, to do jobs that it's not capable of doing. There's been incorrect
Starting point is 01:05:52 obituaries. Grok, the Elon Musk one, the Twitter one, made up fake headlines about Iran attacking Israel and put them out publicly as a major trending story. You have this great anecdote about a Facebook AI chatbot responding to someone who has this very specific question. They have a gifted disabled child. They were like, does anybody have experience with a gifted disabled, like, 2E child in this specific New York public school program? And the chatbot responds, yes, I have
Starting point is 01:06:28 experience with that, and just, like, made it up, because it knew that's what they wanted to hear. And fortunately it was clearly labeled as an AI chatbot, so the person was like, what the... Black Mirror. Yeah, that was a good quote. But the World Health Organization, you know, eating disorder institutions replacing therapists with AI. You just have all these examples of this being used where it shouldn't be and things going badly. And there's a detail that I think we talked about last time, about Duolingo, where they let AI take over some of the stuff that human teachers and translators were doing before. And you made the point that people who are learning the language, who are beginners, are not in a position to notice that the quality has dropped. Yeah. And I feel like that's what we're seeing basically everywhere now. The internet is so big, and they're using it in so many different places, that it's hard to catch them all. And then
Starting point is 01:07:38 there's not an appetite to report on all the ways it's fucking up. And so everything is kind of getting slightly to drastically shittier at once. Yeah. And I don't know what to do with that. I would say... Yeah. Well, go ahead, Emily. What you do with that is you make fun of it.
Starting point is 01:08:01 That's one of our things, is ridiculing this stuff to, like, you know, try to keep the mood up, but also just show it for how ridiculous it is. And then the other thing is to really seek out the good journalism on this topic, because so much of it is either fake journalism output by a large language model these days,
Starting point is 01:08:17 or journalists who are basically practicing access journalism, who are doing the genius thing, who are reproducing press releases. And so finding the people who are doing really good critical work and supporting them, I think, is super important. Yeah.
Starting point is 01:08:30 Alex, you were going to say? Well, no, you just teed me up really well, because I was actually going to say,
Starting point is 01:08:35 you know, some of the people who are doing some of the best work on it are, like, 404 Media. And I'll give a shout-out to them because, you know,
Starting point is 01:08:43 these folks were basically at Motherboard, the whole Vice place, which focused specifically on tech and AI. And these folks have been in the game for so long. They know how to talk about this stuff without being bowled over. You know, there's people who play that access journalism,
Starting point is 01:09:22 like Kara Swisher, who kind of poses herself as this person who is very antagonistic, but, right, is just fawning over AI people, like, all the time. Well, I trusted Elon Musk. And I was like, well, why did you trust this man in the first place?
Starting point is 01:09:38 Did you know, I was reading the Peter Thiel biography, The Contrarian. And, you know, it's a very harrowing read. I mean, it was fascinating, but it was very harrowing. It wasn't a hagiography; it was pretty critical. But, you know, they discuss the PayPal days, 24 years ago, when Elon Musk was like, well, I want to rename PayPal to X. And then everybody was like, why the fuck would you do that? People are already using PayPal as a verb. You know, it's effectively the same thing he did with Twitter.
Starting point is 01:10:21 Like, yeah, people are talking about tweet as a verb. Why would you... You know, he's been, like, an absolutely vapid human being with no business sense. Anyways, that was a very long way of saying Kara Swisher sucks, and also saying that there's a number of folks doing great stuff.
Starting point is 01:10:49 So, I mean, folks at 404; Karen Hao, who's independent but had been at The Atlantic and MIT Tech Review and the Wall Street Journal; Khari Johnson, who was at Wired and is now at CalMatters. There's a lot of people who really report on AI from the perspective of the people it's harming, rather than starting from, well, this tool can do X, Y, and Z, and just taking these groups at their claims.
Starting point is 01:11:10 But yeah, I mean, the larger part of it is, there's just so much stuff out there, you know, and it's so hard. It is like whack-a-mole. And I mean, we're not journalists by training. We're sort of doing a journalistic thing right now. We're doing commentary. I would not say we are journalists. I would say we are doing a journalistic thing. We're doing journalism, but we are not journalists.
Starting point is 01:11:38 We are not doing original reporting. Sure, sure. But, you know, I don't know who decides this, who the court of journalism is. What we're doing is more like, okay, well, let's look at those claims, and upon what grounds do those claims stand, rather than, you know, a whiz-bang CNET article or something that comes out of a content mill and says, Google just published this tool that can find 18 million materials. That's a pretty, pretty poor thing.
Starting point is 01:12:26 What we're doing is, first of all, sharing our expertise in our specific fields, but also, like, modeling for people how to be critical consumers of journalism. So, journalism adjacent, but yeah,
Starting point is 01:12:38 definitely without training in journalism. Yeah. But I think we want to... Do we want to do the M&M article? Oh my gosh, there's this article that has, like, broken our brains, because it just has this series of sentences that... I don't know. Because everything is degrading, like journalism. You know, there's that story about how the Daily Mail was like, Natalie Portman was hooked on cocaine when
Starting point is 01:13:01 she was at Harvard. You're like, no, that was from that rap she did on SNL. And that was, like, a bit, but because this thing's a scrape... And then the Daily Mail had to correct it at the end. They're like, she was not; that was obviously satirical, and that was due to human error. Like, they really leaned into that. You're like, no, yeah, yeah, yeah, of course.
Starting point is 01:13:19 Did I tell you about the time that a fabricated quote of mine came out of one of these things and was printed as news? No. So I also, like Alex, have searched my own name because I talk to journalists enough that I like to see what's happening. And there was something in an outfit called Bihar Prabha that attributed this quote to me, which was not something I'd ever said and not anybody I remember talking to. So I emailed the editor and I said, please take down this fabricated quote and print a retraction because I never said that. And they did.
Starting point is 01:13:46 So the article got updated, removed the thing attributed to me. And then there was a thing at the bottom saying, we've retracted this. But what they didn't put publicly, but the editor told me over email, is that the whole thing came out of Gemini. And then they posted it as a news article. And, you know, the only reason I discovered it was that it was my own name,
Starting point is 01:14:04 and, like, I never said that thing. Of course. ...enter the market till around August, like around when fall comes. But M&M, this is why we were covering it, because we are journalists, and so we cover the important stories. In May. Pumpkin spice already? No. But again, they were saying this is because apparently Gen Z and millennial consumers are celebrating Halloween earlier. But there's this one section that completely... Wait, wait, can we back up? What? Yeah, I don't know. That's what they're saying, according to their analysis. So let me read this for you.
Starting point is 01:14:52 Quote: The pre-seasonal launch of the milk chocolate pumpkin pie M&Ms is a strategic move that taps into Mars' market research. This research indicates that Gen Z and millennials plan to celebrate Halloween by dressing up and planning for the holiday about 6.8 weeks beforehand. Well, 6.8 weeks from Memorial Day is the 4th of July, so you still have plenty of time to latch onto a pop culture trend and turn it into a creative... That's all chaos. It doesn't make any sense. I know.
Starting point is 01:15:25 Look, Alex. I'm fixating on 6.8. Exactly. What does that even mean? What the fuck does that mean? And where did Memorial Day come from? And what is 6.8 weeks from Memorial Day? Because it's not any of the days that they said it was.
Starting point is 01:15:43 They said July 4th. And also, 6.8 weeks isn't a real amount of time. That's 47.6 days. What even is a 6.8 week? So if this were real, it's possible that they surveyed a bunch of people and asked, when do you start planning your Halloween costume, and those people gave dates, and then they averaged that, and that's how you could get to... I get that, and that's fair. But also, it totally sounds like someone put into a large language model, write an article about why millennials and Gen Z are planning their Halloween costumes earlier, or something. Yeah, it sounds like that. But also, it's just so odd to say, well, 6.8 weeks from Memorial Day is the 4th of
Starting point is 01:16:26 July. This article didn't even come out... it came out after Memorial Day. And yeah, nothing made sense, and I was like, I don't fucking understand what they're doing to me right now. But again, this is the insidious part for me. This appeared in Food & Wine. This is Food & Wine magazine, with a human in the byline. And I actually DMed this person on Instagram and said, do you mind just clarifying this part? I'm a little bit confused. And I've gotten no response. I'm wondering if it's because... I know there was some good coverage in Futurism, and they were talking about this company called AdVon Commerce, and the way this company has basically been making AI-generated articles for a lot of different publications, usually on product placement.
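As an aside, the 6.8-weeks arithmetic the hosts are picking apart does check out, and their averaging guess is one plausible way to land on a fractional week. A quick sketch; the survey responses below are invented purely for illustration, not taken from any actual Mars research:

```python
from datetime import date, timedelta

# The article's figure: planning starts about 6.8 weeks before Halloween.
WEEKS_AHEAD = 6.8
print(round(WEEKS_AHEAD * 7, 1))  # 47.6 days, as noted on the show

# One way a fractional week could arise: averaging survey responses.
# These numbers are hypothetical, just to show how 6.8 could emerge.
responses_in_weeks = [8, 6, 7, 6, 7]
average = sum(responses_in_weeks) / len(responses_in_weeks)
print(average)  # 6.8

# And 6.8 weeks before Halloween lands in mid-September,
# nowhere near Memorial Day or the 4th of July.
halloween = date(2024, 10, 31)
planning_start = halloween - timedelta(days=WEEKS_AHEAD * 7)
print(planning_start.strftime("%B %d"))
```

Which is to say: whatever the article did with Memorial Day, it wasn't this.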
Starting point is 01:17:19 Right. And so it makes me think, because Food & Wine may have been one of their clients. I forget the article, but they had, like, you know, Better Homes & Gardens, kind of these legacy publications like that. So I don't know if it's something like that, or this journalist said, write me this thing, and I'm just going to drop it and then go with God, you know. Yeah. Yeah. My other favorite example is this headline I saw somewhere: it's no big secret why Van Vaught isn't around anymore, with a picture of Vince Vaughn. They just, like, got his name completely wrong. Yeah. It's no big secret why Van Vaught isn't around anymore. I'm like, it's certainly not, you know? If I was just scrolling, I'd say, like, yeah, you know, I liked Van Vaught in The Internship. But then I would have looked at it and done a double take. Like, wait, wait, wait. Yeah.
Starting point is 01:18:22 Did he co-star with Owen McWilson or something? Yeah, yeah, exactly. Russell Wilson was in that. I think it was the Adweek reporting that you're thinking of, Alex. Futurism did a bunch of it, but then Adweek had the whole thing about AdVon, and I can't quite... No, no, no, it was Futurism. Because Adweek had the thing on this program that Google was offering, and it didn't have a name. Oh, right. Yeah. So AdVon was Futurism. Yeah. But it is happening. Yeah. Right. Yeah. I thought you were going to talk about the surveillance-by-M&M thing when you said M&Ms. So this was somewhere in Canada. There was an M&M vending machine that was taking pictures of the students
Starting point is 01:19:03 while they were making their purchases. And I forget what the ostensible purpose was, but the students found out and got it removed. Wow. They probably freaked out and made a big deal about it. Oh, we're taking pictures of people? Students, am I right? Well, I feel like we could talk to you guys once again for three hours. There's so much interesting stuff to talk about.
Starting point is 01:19:25 Your show is so great. Thank you both for joining. Where can people find you, follow you, all that good stuff? Emily, we'll start with you. Well, first there's the podcast, Mystery AI Hype Theater 3000, where you find any podcast, you can find ours. And we've also started a newsletter.
Starting point is 01:19:42 If you just search Mystery AI Hype Theater 3000 newsletter, I think it'll turn up. And that's an irregular newsletter where we basically took the things that used to be sort of little tweet storms, and since the social media stuff has gotten fragmented, we're now creating newsletter posts with them. So it's, you know, off-the-cuff discussions of things. On Twitter slash X and Mastodon and Bluesky, I'm Emily M. Bender. And I'm also reluctantly using LinkedIn as social media these days. It's going to be the last one. It's going to be the one that survives them all, because I know some people kind of need it.
Starting point is 01:20:19 Really talking up the virtues of social media. Yeah. Yeah. I'm at Alex Hanna,
Starting point is 01:20:28 H-A-N-N-A, on Twitter and Bluesky. I barely use Bluesky or Mastodon, but Twitter's the best place to find me. Also check out DAIR, that's D-A-I-R hyphen institute dot org.
Starting point is 01:20:44 And we're also DAIR underscore institute on Twitter and Mastodon. We're not on Bluesky yet, but we're on LinkedIn. That's where you learn a lot about what our institute's doing. Lots of good stuff, amazing colleagues and whatnot. Yeah, amazing. And is there a work of media
Starting point is 01:21:05 that you've been enjoying? Yes, I've got one for you. This, I think, started off as a tweet, but I saw it as a screen cap on Mastodon. So it's by Llama in a Tux. And the text is, don't you understand that the human race is an endless number of monkeys?
Starting point is 01:21:19 And every day we produce an endless number of words. And one of us already wrote Hamlet. That's really good. That's such a hyper-specific piece of media. I think last time I was on this, I was plugging Worlds Beyond Number, which is a podcast I'm just absolutely in love with. It's a Dungeons & Dragons actual-play podcast, but it's got amazing sound production. I would also just plug everything on dropout.tv. I mean, it's a streaming service, honestly.
Starting point is 01:21:52 It's Sam Reich, who is the son of Robert Reich, kind of a liberal darling and former Secretary of Labor in the Clinton administration, and he has turned CollegeHumor into a home for really great comedians. So they're putting out a lot of great stuff. So I'd say, you know, Make Some Noise, which is coming out with a new season today, is a really great improv comedy thing. And yeah, let's just go with that. Those Very Important People interviews are hilarious.
Starting point is 01:22:28 Those Very Important People interviews, with Vic Michaelis. I named one of my chickens Vehicular Manslaughter after an inside joke there. And another one, Thomas Shrigley. So yeah, just incredible, incredible stuff. Yeah. Shout out to Sam. He's one of the best.
Starting point is 01:22:46 Miles. Yes. Where can people find you? Is there a work of media you've been enjoying? At the at symbols: look for at Miles of Gray. I'm probably there. You can find Jack and I on our basketball podcast,
Starting point is 01:22:59 Miles and Jack Got Mad Boosties, where we've wrapped up the NBA season. And I have tears streaming down my face with pain and anger as the Celtics win again. And also, if you want to hear me talk about very serious stuff, I'm talking about 90 Day Fiance on my other show, 420 Day Fiance, which you can check out wherever they have podcasts. A tweet I like. First one is from past guest
Starting point is 01:23:40 Josh Gondelman. He tweeted, I bet the best part of being in a throuple is you have someone to do all three Beastie Boys parts at karaoke. I guess that's one way to look at that. And then another one, from another past guest, Demi Adejuyigbe, at electro lemon, who got his account hacked. And he tweeted, hi, hello, it's Demi. I got my account back. I feel the need to clarify that under no circumstances should you ever believe that I or anybody on this website is selling cheap MacBooks for charity or otherwise. And what benefit would my signature do to a laptop? So, yeah, thank you for clarifying. I actually remember, because I follow Demi, when his account got hacked, and I thought, man, that's really... And at first I thought it was a bit, because Demi is hilarious. But then I'm just like, what the hell? It's funny.
Starting point is 01:24:14 His follow-up tweet was, for anyone who thought I was doing a bit: what's the punchline? My jokes are never so obtuse. I love it. I want you to know it wasn't all that funny. Yeah, no, I was also trying to find out what the punchline was. Right, right. Yeah, wait for it. Well, because he's so funny that part of you wants to be like, well, hold on, what are you doing here? Like, what's the deal here? You don't want to immediately just dismiss Demi, because he's such a great comedic mind.
Starting point is 01:24:50 But yeah, if you do want good Demi content, Who's Welcome at the Cookout? You can find that; it's some Dropout content that you can get for free on YouTube. There you go. A tweet I've been enjoying: Sleepy, at sleepy underscore nice, tweeted, it's absurd that Diddy Kong wears a hat that says Nintendo. Patently ridiculous. There's no way he understands the significance. It would be like me unknowingly wearing a hat that coincidentally depicts the true form of the universe. That's incredible. Oh my god, it's so fucking good.
Starting point is 01:25:24 The second he showed up, you're like... I don't know, yeah. Nintendo. You can find me on Twitter at Jack underscore O'Brien. You can find us on Twitter at Daily Zeitgeist. We're at The Daily Zeitgeist on Instagram. We have a Facebook fan page
Starting point is 01:25:40 and a website, DailyZeitgeist.com, where we post our episodes and our footnotes, where we link off to the information that we talked about in today's episode, as well as a song that we think you might enjoy. Miles, what song do you think people might enjoy? I came across this track from, like, the 50s that is not really popular. It was playing on the radio, and when I heard it, I was like, wait, what is this song? Because I thought it was maybe, like, a newer artist doing sort of a send-up of 50s music, like surf music. It's called Out in the
Starting point is 01:26:14 Sun, parenthetical Hey-O. And it is a bit like Belafonte's Day-O, and kind of has this similar cadence to the verse. But when I heard it, I'm like, this sounds like the kind of song Tarantino would pluck from obscurity and then put under a really dark scene. It's like a beach song, but there's this darkness to it that I really love. But anyway, this is the Beach Nuts with Out in the Sun. So, yeah, check this song out. When did it actually come out? Is it recent?
Starting point is 01:26:48 No, it's from the 50s. Like, they're an actual band. Wow. The lyrics are like, hey there, girls, where are you going? And they're like, down to the beach is where we're going. The lyrics are so literal. But there's this, like, charm to it. And the instrumentation is cool. So, anyway, this is the Beach Nuts with Out in the Sun,
Starting point is 01:27:04 parenthetical Hey-O. All right. Well, we will link off to that in the footnotes. The Daily Zeitgeist is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever fine podcasts are given away for free. That's going to do it for us this morning. We're back this afternoon to tell you what is trending, and we will talk
Starting point is 01:27:26 to y'all then. Bye. Bye.
Starting point is 01:28:24 I'm Keri Champion, and this is season four of Naked Sports. Up first, I explore the making of a rivalry: Caitlin Clark versus Angel Reese. Every great player needs a foil.
Starting point is 01:28:45 I know I'll go down in history. People are talking about women's basketball just because of one single game. Clark and Reese have changed the way we consume women's sports. Listen to The Making of a Rivalry: Caitlin Clark versus Angel Reese, on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Presented by Elf Beauty, founding partner of iHeart Women's Sports.
