The Daily Zeitgeist - Weekly Zeitgeist 379 (Best of 4/28/25-5/2/25)

Episode Date: May 4, 2025

The weekly round-up of the best moments from DZ's season 386 (4/28/25-5/2/25). See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 I want you to ask yourself right now, how am I actually doing? Because it's a question that we rarely ask ourselves. All of May is actually Mental Health Awareness Month. And on The Psychology of Your 20s, we are taking a vulnerable look at why mental health is so hard to talk about. Prepare for our conversations to go deep. I spent the majority of my teenage years and my 20s just feeling absolutely terrified. I had a panic attack on a conference call.
Starting point is 00:00:26 Knowing that she had six months to live, I was no longer pretending that this was my best friend. So this Mental Health Awareness Month, take that extra bit of care of your wellbeing. Listen to The Psychology of Your Twenties on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. It's nostalgia overload as Wilmer Valderrama
Starting point is 00:00:44 and Freddy Rodriguez welcome another amigo to their podcast, Dos Amigos. Wilmer's friend and former That '70s Show castmate Topher Grace stops by the Speakeasy for a two-part interview to discuss his career and reminisce about old times. We were still in that place of like, what will this experience become? And you go, you're having the best time. But it was like such a perfect golden time. Listen to Dos Amigos on the iHeartRadio app,
Starting point is 00:01:07 Apple podcasts, or wherever you get your podcasts. Hey, I'm Jay Shetty, and my latest interview is with Michelle Obama. To whom much is given, much is expected. The guilt comes from am I doing enough? Me, Michelle Obama, to say that to a therapist. So let's unpack that. Having been the first lady of the entire country
Starting point is 00:01:27 and representing the country and the world, I couldn't afford to have that kind of disdain. Listen to On Purpose with Jay Shetty on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. The number one hit podcast, The Girlfriends, is back with something new, The Girlfriends Spotlight.
Starting point is 00:01:47 Each week you'll hear women triumph over adversity. You'll meet Tracy, who survived a terrifying attack. I remember that feeling of, okay, this is how I die. And turned that darkness into light. I want to take over the world and just leave this place better than I found it. So come and join our girl gang. Listen to The Girlfriends Spotlight on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:02:18 Hello the internet, and welcome to this episode of The Weekly Zeitgeist. These are some of our favorite segments from this week, all edited together into one nonstop infotainment laugh-stravaganza. So without further ado, here is The Weekly Zeitgeist. Miles, we're thrilled to be joined once again in our third seat by a research associate at
Starting point is 00:02:49 the Leverhulme Centre for the Future of Intelligence, where she researches AI from the perspective of gender studies. She's the co-host of the great podcast, The Good Robot. Please welcome back the brilliant Dr. Kerry McInerney. Dr. Kerry. Hi. Thanks so much for having me back. Oh, of course. Wonderful to have you back.
Starting point is 00:03:09 Thank you for accepting our offer to return, to class up the joint, as I say. It's great to have you. There's international diplomacy going on. We had to work it out, but we're thrilled to have you back. How have you been? I mean, how have we all been? We don't really know how to answer that question anymore. So I guess in the big scheme of things,
Starting point is 00:03:31 I'm personally doing really well, but yeah. Yeah. Thriving. Yeah. Yeah. What's the energy like in Europe looking at the cesspit that is aflame, also known as the United States? Is it like Iraq War anger like I experienced going to Europe?
Starting point is 00:03:51 We're like, what the fuck are you all doing over there? Or is it like how we felt after Brexit? People are like, oh, they're fucking themselves bad. I feel like a lot of confusion and fear. I don't know if it's necessarily even just anger as much as it's just people being like, what is happening? What are you doing? But I think there is a degree of shell shock to it where just so much keeps happening that
Starting point is 00:04:13 I think you start to become slightly numb, which is very bad, but also very understandable. I got a bit of a refresh from that because I was back home in New Zealand in March and there's nothing quite like being in a country where you watch the news every day and nothing happens. It's like a tree fell near the motorway, not even on the motorway or that kind of level of news. I came back here and I was like, oh my goodness, what is happening? This is a very different level of news. Yeah. Yeah. Yeah. Well, we're not getting numb to it, unfortunately. It's like, hey, damn, what the fuck? Anything for that hurts every day. Yeah. I mean, but that is the point, I think.
Starting point is 00:04:53 They want as many people to turn off and ignore what's happening. But I think as things ramp up here, people more and more realize how much the norms, as flawed as they were, were things that were worth keeping and trying to improve rather than being like, fuck everything, get rid of it all and whatever this is. Yeah.
Starting point is 00:05:13 What is something from your search history that's revealing about who you are? So I am working on a show called How to Live Forever right now, Hell yeah. which is, you know, I'm just so enamored with like how these billionaires are spending all their lives, like believing that they're going to live to 160.
Starting point is 00:05:32 And, and, but they're also like crazy things like cats that like, you know, I think Rick Perry's cousin is helping them live to their thirties. They're like cats live to their thirties. Yeah. With like tiny thimbles of wine and and there's a whole exercise course for them. The wine thing. Wait, is the wine thing real? Because I always assumed that was just wishful thinking.
Starting point is 00:05:54 Is it real? No. Well, for cats it is. I don't know about us. Cats do better with a little bit of wine. Wow. Okay. I mean, it takes the edge off, I guess. But there are all these like,
Starting point is 00:06:08 and there are miracles of science right now. There are these dancing molecules that, in a lab at Northwestern, they have severed mice's spines to make them paraplegic, and then give them this injection, and within a month, they're walking normally again. The advances in science are unbelievable, but also there's all this like snake oil, and what happens when billionaires stop giving
Starting point is 00:06:33 their money to society and taking the billionaires' pledge and just conserving it for when they're living to 200 or whatever. But in all this research, I stumbled into this rabbit hole on the Enhanced Games. Have you been paying attention to this? Like, it's basically an all-drug Olympics, and they're doing it. I know that it was a Peter Thiel pitch at one point, but I didn't know that they did it.
Starting point is 00:06:56 It's supposed to happen next year. And, uh, and there's all this craziness of like, trying to figure out where is it going to be because they're like, you know, you can, I think, dope up as much as you want for these Olympics. They're getting people who are like retired Olympians and like people who've sort of like failed out of the, you know, they were like a fifth place like swimmer or whatever and they're getting roided up and other drugs, obviously. And then they're getting these huge cash prizes to beat Michael Phelps record or like,
Starting point is 00:07:29 and it's just the whole world is fascinating to me. The fact that the Olympics themselves are pretty corrupt. There are things where Ben Johnson and Carl Lewis took- Ben Johnson. Yeah. They took the same drugs, right? And one of them is like a hero and the other is like this pariah. And like.
Starting point is 00:07:54 Wait, Carl Lewis took those drugs too? Yeah. Come on. Carl. 92 Barcelona Olympics, Carl? So that's been documented since? Yeah. Wow.
Starting point is 00:08:09 Has he admitted? No, but there's a documentary about it, about steroids, that talks about it. Ben Johnson just didn't hide it as well, because he definitely looked more on steroids than I think anyone I've ever seen. Well, I think also, I think some countries are better at hiding it than other countries, right? Certain doctors are better at hiding it. So whether it's Russia or China or various places, East Germany was very, very good
Starting point is 00:08:35 at hiding it and not hiding it. But anyway, the fact that countries are trying to bid on whether they can like sports-wash through like an all-drug Olympics is kind of stunning to me. Right. And also just, the people are invested in this because they feel like, people like Peter Thiel are invested in it because they hope that all these longevity drugs will come out of an Olympics. And it's hard to tell, you know, is it going to be like a Fyre Festival of sport, or is it going to be like,
Starting point is 00:09:07 so it's something that makes money and becomes like, you know, regularly watched on TV or just having it, where's it located? Haven't announced yet. Oh, international waters. Yeah. Is that, or we're going to see it like the first televised like race where people are having like a heart attack during the butterfly stroke of a swim competition and you're like, oh boy, oh my gosh. I got to catch an air while doing the butterfly stroke. You can literally see his heart pumping out of his chest.
Starting point is 00:09:39 I feel like it either needs to be Vegas or Detroit in honor of RoboCop. I'm not sure. Yeah, right. Or like, or it'll just be like Saudi Arabia to your point. Yeah, Saudi Arabia. That makes a lot of sense. Yeah. Yeah.
Starting point is 00:09:56 And it just like plays into this whole like weird, there was a GQ article about like, everyone you know is on steroids, you know, and it feels like all of this. No, I'm not, Jack. I told you I'm not. Yeah, I had to ask my mom twice. She denied it. This article says everyone I know. That she lifted the couch with one hand. It was crazy. But got terrible back acne too, I noticed. But it is, I mean,
Starting point is 00:10:25 this whole world of biohacking and human augmentation where people showed up six inches taller after the pandemic because they'd had those leg-lengthening procedures. This whole world is so bizarre and it's fascinating to see what'll stick. I have some suspicions. There was one time where I ran into someone I hadn't seen in a while, and they were definitely taller than they used to be.
Starting point is 00:10:51 And I just wouldn't... Because I hadn't really thought of that as being a real thing, I just wouldn't let it rest. I'd be like, how are you so damn tall? That's crazy. It's like you hit a goddamn growth spurt in your 30s. And eventually they're like, yeah man, just like stop. Just, they're like, no, I've always been this tall.
Starting point is 00:11:09 I don't know what you're talking about. Oh man. You should've said like, bro, yeah, I had my bones extended. You're like, oh, okay. Bone extension. Yeah, exactly. And you were saying, before we recorded, Brian Johnson is hosting that show with you. That's your co-host the guy So that's cool
Starting point is 00:11:32 That's That's that is super interesting. I can't wait to have them have these games and They can't even get close to the real Olympian records because they were all on steroids too. Mm-hmm. You know, we'll see. What is something you think is underrated? Mashed potatoes. Underrated. Okay.
Starting point is 00:11:57 Mm-hmm. You think they're more versatile, just more delicious than people give them credit for? Yeah. Like, where do they rate? I think they're top potato. So you're saying underrated because we're emphasizing the fry too much? I think so. I think we're getting too fancy with potatoes. Being in Idaho brought me closer to the potato, and
Starting point is 00:12:21 yeah, you can just throw butter and salt and that's good shit. It doesn't need anything else. People gussy it up too much. So mashed potatoes, underrated, a lot of butter. You know, if you really want to, really fucking a lot of butter, like enough that there are people like, are you, are you OK? I'm like, yeah, you OK?
Starting point is 00:12:40 Because this is delicious. When you worry about yourself. Yeah. No one just makes a bowl of mashed potatoes as they should. Right. Yeah. Yeah. I did that. Sit down with the movie.
Starting point is 00:12:53 Three nights ago, I had a bag of potatoes where like one started to sprout a little, you know, a little eye out of it. And I was like, all right, I got to cook these straight away. Yeah. Just right away. I was like, maybe I can roast them. I'm like, no, dude, I want to eat a big ass bowl of mashed potatoes, and I did it, and my life is better. Exactly. You don't even need gravy. It's just comfort, filling, easy.
Starting point is 00:13:12 Yeah, mashed potatoes are underrated. Where are we on the KFC bowl, where mashed potato is sort of the base, the base that you're working off of? I loved it. I've been suicidal at one point in my life, but never that sad to have one. KFC bowl. A dark experience that is also worth it. That's the thing. I know it's great.
Starting point is 00:13:42 It's just I can't look at myself in the eye. It almost feels like, yeah, probably the reason I never did heroin. I was like, I've gotten close. But part of it is like, it just ain't calling me like that. I feel like if it did, it would be all bad.
Starting point is 00:13:58 It would be all bad. I did feel like William Burroughs talking about heroin. I was like, it's dark, it's a dark experience, but it's actually worth trying. Just once. Down a road that you have to go down at least once in your life. Yeah. I haven't.
Starting point is 00:14:15 Make it lunch indeed. Yeah. I haven't tried that monstrosity where it's the fried chicken, cheese, bacon, fried chicken. What was it? Oh, the double down? Yeah, that thing. Yeah. Have you guys tried it? No. I can't bring myself to. That one just seems gimmicky to me. But the mashed potato bowl always made sense to me. I was like, yeah, no, this is something I would
Starting point is 00:14:38 make at home. It's basically what I did with Thanksgiving leftovers for my time of life. So why not? Yeah. It makes total sense. And I think because my family's Southern, KFC is just like an abomination of fried chicken. Oh, right. So that, I do have a bias. So that probably goes into it.
Starting point is 00:14:59 Yeah. Wow. What's your favorite fried chicken? Honestly, it is Ralph's. Ralph's. Fuck yeah. Roast with fried chicken? Honestly, it is Ralph's. Ralph's grocery store fried chicken is really good. Hell yeah.
Starting point is 00:15:10 Yeah, so a California grocery chain has probably some of the best fried chicken I've had. That's not my Mimos. So. Yeah, right. Yeah, bad day: fried chicken, mashed potatoes, couch. Ralph's is Kroger. I'm wondering if Kroger elsewhere has good fried chicken or if it's just something about Ralph's. Oh yeah, how is it up by you, do you have it anywhere else? Um, we don't have Ralph's. Like, I'm on the border of California and Nevada, so we have a Raley's, and it looks just like Ralph's logo.
Starting point is 00:15:47 Uh huh. But it's different. I think it's privately owned. Yeah, that's not even a name. They just made that shit up. Yeah. Right. Raley's. Yeah, that's Raley's. It's just as expensive as Ralph's. Yeah, but different.
Starting point is 00:16:02 They don't have Kroger brands. I don't know what their brand name is, their store brand. It's like, gang, let us know what is the best grocery store chicken. Because I do agree. I mean that and pavilions, pavilions, when you get a fresh, fresh batch of pavilions fried chicken too, yes. That and their donuts. Holy fuck. When I lived in Orange County, our pavilions made donuts every morning.
Starting point is 00:16:21 They were like still hot and a little greasy from the fryer. Oh my God. Maybe they fry the chicken in the donuts. Maybe they fry the donuts in the chicken. Oh my God. KFC, are you listening? Yeah, I know. KFC Krispy Kreme collab. It's like, yeah, we brought over the fryer oil from a KFC to fuck up these donuts over here.
Starting point is 00:16:51 Just bring in a weed brand and I'd never get anything done again. Wow. Brian, what is something you think is overrated? Oh, yeah. So that, my Pit thing was... anti-heroes. Yeah. I think they're overrated. We're done with it. Leave that shit in when we weren't in the Age of Aquarius, because we are in the Age of Aquarius
Starting point is 00:17:13 currently, as of a few months ago. That's great news for me. Is that good? That's from Hair. It's when all the, it's like, historically, I don't know anything about what I'm talking about. Historically, it's when like the huge revolutions and shit happened. Oh, good. All right. Oh, hell yeah. I think we kind of need that right now, guys. Low key. I think we need that right now. All right. So that was your under- and overrated. Underrated is The Pit, overrated is antiheroes. Antiheroes?
Starting point is 00:17:45 No, underrated was... fuck, actually, forget what I just said. No, you said The Pit. No, The Pit was me leading into antiheroes. I think antiheroes are overrated, and underrated is something else, not The Pit. Wait, oh, oh, I fucked this up. Processed food. Oh, that's okay. So it gets underrated?
Starting point is 00:18:10 Oh, I'm dead. Underrated is processed food. Okay, so I think we flipped your overrated and underrated. What's something you think is underrated? Processed food. Okay, why? Processed food. No, which one? What kind?
Starting point is 00:18:23 What's the most underrated processed food? I think like cheese. Processed cheese, American cheese. Yeah. I'm just kind of like, oh, healthy. It's like picking the grossest one. Yeah. American cheese is just oil and water in the shape of fucking... I am fully in agreement
Starting point is 00:18:39 that I love American cheese. Like the melt, the melting point on that shit is so crucial. Like, yes, I get it was made in a lab. I used to rake my teeth over the edges of the plastic to make sure I was getting all that cheese off when I was a kid. You know what I mean? Yeah. It's like, no, no, no.
Starting point is 00:18:57 Stays melted even at like room temp. It's like, melts at room temp is a wild quality for a substance to have. Watching the Kraft Singles get squirted into the plastic and then the plastic adhering around it is pretty fascinating. Poetry. Poetry. Yeah. That's the Hopecore edit that I should be watching. Yeah.
Starting point is 00:19:19 I guess, I mean, we shouldn't be surprised that processed food is your underrated because you said that your favorite cookie is white chocolate macadamia nut cookies from Subway. Yeah. That's fine. They're always moist. They are. They are.
Starting point is 00:19:36 And you know why? Because they are sprayed down with chemicals that aren't available to anybody except, no, no, Fortune 500 corporations and the US military. Brian just trusts the process. That's right. You got to trust the process. I think we need to stop trying to have so much control over everything. That's right.
Starting point is 00:19:56 Trust the process. It's delicious, right? Control works in the short term; long term it's a bad strategy. It's going to fuck you up trying to control everything. Just let it go. Eat the fucking processed food. Trust the processed... trust the process.
Starting point is 00:20:12 And we, we know how well that worked out for the 76ers, so it'll probably work out for all of us as well. That's my basketball team, right? You need a Hopecore Sixers edit, man. Really? I do not. I'm going over here. I'm dead, man.
Starting point is 00:20:29 I need to not think about that shit for a little while. Yeah. Yeah. Well, all right. Great to have you back, Brian. We're going to take a quick break. We're going to come back. We're going to talk about some of this news.
Starting point is 00:20:38 You heard about this stuff, the news. We'll be right back. Hey, my name is Jay Shetty and I'm the host of On Purpose. So let's unpack that. Former First Lady Michelle Obama and someone who knows her best, her big brother Craig, will be hosting a podcast called IMO. What have been your personal journeys with therapy? We need to be coached throughout our lives. My mom wanted us to be independent children. And she would always tell me, stop worrying about
Starting point is 00:21:26 your sister. Having been the First Lady of the entire country and representing the country in the world, I couldn't afford to have that kind of disdain. What would you say has been the most hardest recent test of fear? Listen to On Purpose with Jay Shetty on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. It's nostalgia overload as Wilmer Valderrama and Freddy Rodriguez welcome another amigo
Starting point is 00:21:51 to their podcast Dos Amigos. Wilmer's friend and former That '70s Show castmate Topher Grace stops by the Speakeasy for a two-part interview to discuss his career and reminisce about old times. We were still in that place of like, what will this experience become? And you go, you're having the best time. But it was like such a perfect golden time. Listen to Dos Amigos on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:22:20 And the dream season is now complete. The Golden State Warriors are the 2015 NBA champions. On the new limited podcast series Dub Dynasty, it's been 10 years since their shocking run to a championship. We examine the controversial move that made it possible. It's never a great conversation as a player when you hear that you're being benched. For the entire behind-the-scenes story of Golden State's incredible 10-year run, listen to Dub Dynasty on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, hello, Malcolm Gladwell here. On this season of Revisionist History, we're going
Starting point is 00:22:54 where no podcast has ever gone before. In combination with my three-year-old, we defend the show that everyone else hates. I'm talking, of course, about Paw Patrol. There's some things that really piss me off when it comes to Paw Patrol. It's pretty simple. It sucks. If my son watches Paw Patrol, I hate it. Everyone hates it, except for me. Plus, we investigate everything from why American sirens
Starting point is 00:23:21 are so unbearably loud, to the impact of face blindness on social connection, to the secret behind Thomas's English muffins, perfect nooks and crannies. And also, we go after Joe Rogan. Are you ready, Joe? I'm coming for you. You won't want to miss it. Listen to Revisionist History on the iHeart Radio app, Apple Podcasts, or wherever you
Starting point is 00:23:44 get your podcasts. And we're back. We're back. And we, on this podcast, we ask the hard-hitting questions such as, where are we at with AI? Generally. Just like generally, what's the vibes, Dr. McInerney? I'm so weary because I remember when we first had you on, we were like in the midst of like, AI, AI, ay-ay-ay, ándale, ándale, mami.
Starting point is 00:24:12 Yeah. Yeah. Oh, AI researchers giving us like the warning to like, this could be the next it's going to end the world. I'm, I'm taking cyanide now to preemptively, excuse myself from this apocalypse. After a while, I think that's as we started to speak to more and more experts, we all became rightly unconvinced by its capabilities and like,
Starting point is 00:24:35 oh, it's a fancy computer program. That's like a toy basically, but they're saying you can do so much more. Then we see more and more articles about how people using AI are like fucking up in legal proceedings. Like the MyPillow guy, his lawyers used chat GPT recently, and the judge was like, there are like 30 critical errors in your response here. Like, I don't even know what this is.
Starting point is 00:24:57 But now I'm like, are we now sort of, I think because we're sort of past the thing of like, this is that world-ender game changer, but still these companies have to make the line go up, are we like now seeing more of a strategy to just kind of normalize AI use to get people to slowly warm up to it, rather than sort of like the big explosive debut with like ChatGPT and things that we saw? Because I feel like now I hear more people talk about AI when they are like Miyazaki-fying themselves or getting a recipe recommendation rather than like this is going to replace the entire medical field. Yeah, I mean, I think normalization is the exact right word.
Starting point is 00:25:36 And I was just thinking of this today as I went on to Google search. And of course, they now have that AI overview summary. Yeah. And even someone like me who- My friend, yeah. You're talking about my friend Gemini? Yeah, we're pretty tight actually.
Starting point is 00:25:47 So careful. Gemini, they're providing us some answers. Even things like that, I think people like me, who, say, probably wouldn't have gone to ChatGPT just to do basic information searches, I have to admit, I do read the AI summaries quite a lot. And I think that it's a way of just making it really frictionless and really easy to immediately use an AI tool.
Starting point is 00:26:10 I think we see that with a lot of AI rollout. On the one hand, I'm not completely against the use of these AI tools if you're really finding it useful and if it's reliable, and you're getting the full picture behind the tool and what it's for. But on the other hand, I think something I do worry about is something that I think has to be integrated into all our ethical tech use, is a meaningful sense of the opt-out or a meaningful sense of the opt-in. So do I have the right and the ability to say,
Starting point is 00:26:35 no, I don't want to use this? Am I being actively consulted and being able to consent to using particular tools or programs and so on and so forth. And yeah, I think the current rollout of AI tools is not really complying with that particularly well. Right. Yeah, they're just excited. They have a new toy that they want to use, that they're going to make a Super Bowl commercial about. It's the Gouda cheese, the most consistent. What was it? Gouda cheese?
Starting point is 00:27:04 Did you see that, Dr. McInerney? That Google advertised their AI product with a Super Bowl ad that had incorrect information in it. The premise was this is a cheese farmer, and he is a whiz when it comes to making good cheese, but he's all thumbs when it comes to writing marketing material. And so it showed Google AI telling him facts that he could put on his cheese.
Starting point is 00:27:29 And one of the facts was that Gouda cheese is the most popular type of cheese, and is responsible for 60% of the world's cheese sales. What? Which is just, on its surface, obviously wrong. And they still put it in a Super Bowl commercial. Like, somebody caught it once the ad had like been up. And so it didn't make it to the Super Bowl, but it made it like online and was viewed millions of times. And it's just, that seems to be like,
Starting point is 00:27:56 if it's not going to be 100 percent, if it's not going to be right 100 percent of the time, it's kind of useless, because it's just like that. I mean, yeah, I feel like everybody should be, would be opting out of it if they knew. Just so you know, one out of every, I don't know, 20 of the things that you search is going to be blatantly
Starting point is 00:28:22 wrong in a way that is going to be humiliating to you. Yeah. And we're not going to tell you which one. Enjoy the product. Yeah. I mean, first, I love the idea that it's not even an error, but it's just big gouda is out there trying to spread some misinformation about the popularity of gouda cheese. But second, like, yeah, I feel like one way I've heard people describe it is this idea
Starting point is 00:28:42 of like, yeah, something like chat GPT or the AI overview can be useful for getting like a sense of a topic. But yeah, I guess the issue for me is like that often requires quite a lot of expertise, to be able to know whether something is right or not. And so for example, like my husband is from North Carolina, massive basketball fan, like I am sort of forcibly inducted into the NBA enough that, yeah, I could probably tell the 80% maybe, actually, that might be overestimating my knowledge. But yeah, that 20% would be totally out of my knowledge.
Starting point is 00:29:11 And so I think that's my fear is if you're relying on these tools, it's not that people can't tell that something's wrong, but you know, that when we're using it for like a really wide range of applications, that does actually require a lot of expertise in all those different things. And I think certainly speaking for myself, like that's not something I would be able to do or to discern. Yeah. Cause I mean, sure. Like you think of it like, well, if I got to be on my test, but if you're
Starting point is 00:29:34 asking something to like explain the civil war and you get all the generals and the battles and dates, right. But the, the, the reason for the American civil War wrong, where it's like, and they all fought over economic rights. And then now it doesn't matter that 80% is completely negated by this other piece of information that has completely colored the description of something. That's why I'm like, even every time I see those summaries, I'm like, no. I'll click on the links that are saying, we're using these to tell us. I look at them like, this is not really even exactly what's happening here to the point that I feel like it's causing
Starting point is 00:30:11 more harm than good because I've at least learned to try and just research myself. That's how I got my theories on the Earth shape and things like that, my own research. Miles has some interesting ideas on that. I don't know if you have a couple hours, Dr. McInerney. Well, every time I've emailed Dr. Kerry, she's respectfully declined to entertain the conversation,
Starting point is 00:30:32 which I understand she has a lot going on. But it is fun. She can be a useful source to you though. You're looking for somebody who's been in an airplane. Look down. Are there cool uses of AI that aren't getting attention? Or I guess even tech breakthroughs that, we asked this the last time you were on,
Starting point is 00:30:51 but we're still in the early stages of AI. Are there directions that have popped out to you that the future of technology, and this technology in particular, could take that are promising for the bettering of the world for more than 12 rich guys, which I feel like is the current model. It's like these 12 guys are like, yeah, this would be amazing if we could replace all the people. Did you see me with Totoro?
Starting point is 00:31:19 I was with Totoro from the Miyazaki movie. But where are you seeing hope? Where are you seeing hope, I guess? I'm genuinely terrified. I'm going to say something and you'll be like, that's the exact same answer you said two years ago and destroy any remaining hope. Yeah. I mean, I think I'm always quite excited by more creative uses of AI or people who are
Starting point is 00:31:44 really trying to think about, instead of saying, how can we make one product that works for the whole world, which tends to be the approach of things like ChatGPT and then like Spoon. They obviously don't work for the whole world in all these different cultural contexts and all these different linguistic contexts
Starting point is 00:31:57 because I don't think a single product can. But I do think there are really interesting examples of groups like Te Hiku Media in New Zealand, where I'm from, which have been using AI and machine learning techniques to try and focus on te reo Māori, the indigenous language of New Zealand, and language preservation. And so because of long histories of the state suppression of te reo Māori, there was a period where there weren't that many native te reo Māori speakers, or it was really aggressively suppressed.
Starting point is 00:32:25 And now as a result, people are trying to really reinvest and support the revitalization of te reo Māori. And Peter-Lucas Jones, who's the CEO, I believe, of Te Hiku Media, and their whole team, have been really intentional and really world-leading, I think, in how they've been trying to use AI and machine learning for this. I think a big part of their project, though, is that they're very insistent on indigenous data sovereignty, or making sure that their platforms aren't sold to Big Tech or reliant on Big Tech.
Starting point is 00:32:54 And I imagine that's a really challenging project because this sort of small handful of Big Tech firms are incredibly dominant in this space. But a lot of that has been around, like, no, we really want to make sure that this remains technology by and for Māori people and for our organization. And so I think projects like that, I find really, really inspiring and really important. But I think there was just a great example of AI development being done super well, which is like you have a clear problem and you have clear ideas and ways that you think that AI machine learning can help us address in part some of this problem without like positioning it as the solution.
Starting point is 00:33:31 Because I think if anyone comes out of the gate and is like, AI is going to solve this problem, that's when I think you should always be a bit like, oh, I don't think it is, especially if the problem is something like really, really massive, like discrimination. It's kind of like, well, we just have to reject that one sort of straight out the gate and kind of think a bit more specifically about this. And I'm sure those companies are like, and when we made that claim, that wasn't actually meant for you to be the receiver of that message. It was for Wall Street and investors when we said this thing will solve everything. Because like, yeah, I mean, like that application feels like the kind of thing that like, you know, in the U S right now, Trump is very focused on eliminating
Starting point is 00:34:08 any semblance of equity or diversity inclusion, obviously, as we've seen, like the woke DEI initiative, like crusade against all those things. And that sounds like exactly the kind of thing that Trump would be like, that's not useful. It just has to be this other thing. That's a money-making endeavor. Because right now, the people within the administration, like the head of science and technology, have said things like Biden's AI policies were divisive, and it's all been
Starting point is 00:34:36 about the redistribution in the name of equity. And naturally, Trump has fired many of the AI experts Biden hired because it was clear, like obviously Biden hired them. He's like, there's a huge bias problem with any of this stuff. And if it's even going to be a product people use, like that's probably a thing worth tackling. But now, like, it's become sort of like normalized within this administration to say this is all harmful now because it's trying to advance equity. When we've spoken to people like you and other experts, it's like, no, you have to get rid of those inequities or else it doesn't even fucking work.
Starting point is 00:35:13 Like when you're talking about things like, for someone who has a darker complexion, how does a self-automated car even identify that pedestrian as an object to avoid? Because again, these biases affect all of these different systems. But it feels like now, like at least from the American conservative side of things, or just generally the tech conservative movement, it's like,
Starting point is 00:35:37 yeah, maybe it's just fine if it misidentifies like black people or doing these other things that just kind of show that at the end of the day, it only needs to work in the way that we want it to work and other applications would never be damned. Keeps identifying black people as traffic cones. Do we think that we need, can we go to market with this still? Or is that, are we good here? That's, yeah.
Starting point is 00:36:01 I'm just, I'm amazed at how, you know, I think how much, like, objectively, this is a thing. If you want a product to work, it has to be able to A, be used around the world. So how useful is that in a place that doesn't have like an ethnic majority, that's all white people, uh, in the same way, like that, you know, again, these all just feel very counterintuitive, but that seems to be the name of this year. This year's theme generally.
Starting point is 00:36:27 Yeah. Yeah. I mean, the idea of this is the era of counterintuitive things really resonates. Um, yeah. And I think it's like not only disappointing because it's like, I do think that it shouldn't be super hard to buy into the idea that like, yeah, AI that is like more equitable, less biased, and more fair, like genuinely is actually in the long run good for everyone. Although I know there's like a reactionary group that feels like any kind of
Starting point is 00:36:48 equity or equality is, you know, kind of impinging on their own kind of share of the pie. But, you know, I really think like AI ethics and safety is for everyone. But yeah, I also think it's very sad because like this has huge knock on effects for the rest of the world because like the US is a world leader in AI and tech production. And so yeah, if you have an environment that is kind of saying let's throw ethics out the window, then that does have knock on effects for the rest of the world that buys and uses still a lot of this technology. So yeah, I think like these rollbacks obviously massively affect the US domestic context, but they certainly don't stop there.
Starting point is 00:37:22 Right. Is there any do you see any like, this is as stupid on its face as it sounds right to be like, we have to we have to stop with these inclusive efforts to address biases within AI like models. There's no like, it's as dumb as that sounds, right? Because in my mind, and everything I've read, people are like, No, no, no, like, it works worse when it has all these like inbuilt biases, like it will not work as good, therefore is not viable. So it is that is bad, right? There's no like secret things like, well, you know, some bias is good for these things.
Starting point is 00:37:58 I mean, if someone if that is the that is the secret, so listen, please tell me, definitely not what I would think. I mean, yeah, I think it just comes down to, again, I think there's a fundamental irony of saying the power of these AI enabled tools and products is that we can perform all these massive tasks at scale. And this idea of, again, a product that can be sold across the world, a product that can be used at scale,
Starting point is 00:38:21 with the kind of knowledge that this only really works for a very, very narrow base. I mean, my assumption, to be fair, though, is like, I think that a lot of people who make products that have these kinds of exclusions or biases aren't necessarily going in being like, I know my product is really biased and I actually just like don't care. Like, I don't think that usually is it. I think often to me, it's just this kind of mindset of either we just like haven't really thought about it. I think this is particularly common with accessibility, which is that often accessibility has to be
Starting point is 00:38:49 integrated in from the very beginning of the design process. It can't be slapped on at the end. And too often, I think that's how it gets approached. And so people just haven't even begun to think about, you know, oh, like, will my language, I mean, sorry, will my model work for disabled people, will my product work for people with these different kinds of like physical disabilities? Like, you know, I think it's just that whole groups of populations just get ignored. Or, you know, maybe they've realized and they think,
Starting point is 00:39:13 oh, it's actually a really bad thing that this product doesn't work for this particular group. But I think I'm just gonna make the trade off and decide, like, I think that's a small enough consumer base that I can still sell my product. And like, I don't think I'll get too much into pushback. I think it'll be fine. And I think this might often be the case for populations that are perceived as being very,
Starting point is 00:39:31 very small. So I'm thinking of, say, like trans or gender diverse people who often get erased from certain data sets because they're like, oh, well, this is statistically a very small percentage. And it's like, yeah, but those people's exclusion really, really matters. It has a huge impact on their life if they go through a scanner and their body is not recognized or is seen as non-normative, or if they're excluded from different databases. These do have huge knock-on effects for people. So, yes, I guess I would say that the kinds of people maybe who are not seeing AI ethics as a priority aren't doing it because they're just like,
Starting point is 00:40:04 you know, oh, you know, de-biasing, whatever, that's fine. I think it's, yeah, it's probably more just to do with a lot of different blinkers, like a certain kind of narrow mindset about who your consumer is and who's actually using these products. Right. Yeah, it does feel, probably, but for the Trump administration, I'd say they're probably very much focused on the fact that, because to them, it doesn't matter. It's like, I don't know, even if it makes everything unsafe, I just have to say the words, I don't like equity. And that's without any consideration for what that means. And if the whole world breaks, all the better for him to consolidate power, you know, like that feels. Yeah, I mean, did you see the Semafor article about like the group, the Mark Andreessen
Starting point is 00:40:45 group chats? So he's been having like these Signal chats since the days of the pandemic. And it's like he has like gone out of his way to bring in all of these tech leaders and then like fucking super far-right-wing people, like Richard Hanania, like the guy who's like an outright white supremacist, and like put these people in group chats together. Like at one point he like brought in some progressive people, and then they wrote a New York Times op-ed criticizing laws that were banning critical race theory.
Starting point is 00:41:25 He just had a meltdown and was like, you betrayed me by writing this thing and kicked them out of the group chat. Since then, it's just all this extreme right wing propaganda that's being vomited back and forth between these people who are the biggest, most powerful oligarchs and like the people who are in charge of the direction that tech takes. And so I, yeah, I mean, it feels like this whole thing is developing in a way that feels particularly like non-optimal and like stupid and narrow-minded and racist and white supremacist and all those things and like this
Starting point is 00:42:08 was very helpful for context. I just wonder, like, we're seeing a model that's being developed not in the best way possible, it seems like, to say the least, and we're probably going to discount a lot of these possibilities because they're so shitty at what they're doing. But do you see that sort of white supremacist mindset kind of pervading in the tech mainstream? Oh, what a question. I know. So it's a big one.
Starting point is 00:42:43 Yeah. No, I mean, I feel like, I guess, like, what I do think this gets you towards is this, like, very public repositioning of Silicon Valley, which I think always had this, like, relatively liberal veneer, even if it's not clear how deep those roots actually went, kind of very explicitly aligning themselves with the right and with Trump. And it's a little bit hard to tell, like, how much of this is, like, political expediency, people saying, well, clearly, you know, I'm going to do anything to avoid heavier regulation, anything to avoid this kind of like, sort of punitive regime that Trump is exacting on many different institutions. So we're going to, you know, put ourselves in his camp. And how much of
Starting point is 00:43:20 this is like an expression of like deeper ideologies and deeper kind of political beliefs that have maybe sat dormant or sort of have kind of been cultivated within Silicon Valley and tech firms, and are now kind of finally bursting onto the scene now that they feel a bit more empowered, as there's been this like global shift towards the right. I'm not sure how much we can discern between the real deep feeling and the kind of political expediency argument. But I think it's just undeniable that you have people like particularly Mark Zuckerberg, who would have been normally more kind of center, possibly center-left, on a lot of issues, even though he was obviously running a massively exploitative, gigantic,
Starting point is 00:44:01 quite dangerous tech company, now sort of really aggressively rebranding as a kind of wooing Trump, but also sort of buying into a lot of these like hyper-masculinist tech bros, sort of languages and ideologies that I don't think certainly five years ago, we would have seen him sort of supporting in the same way.
Starting point is 00:44:20 So yeah, it feels like there's been this sort of very visible shift in Silicon Valley, but again, as someone who's not based in Silicon Valley, like I probably couldn't tell you like how much that feels like this is, this is actually the real face of the beast versus this is actually just like what people are doing for the moment. Yeah. Cause I mean, you see how much, like, Andreessen got Silicon Valley money together to get Trump into office, along with a bunch of other crypto people. It's like $78 million from like the Andreessen side of things.
Starting point is 00:44:47 And you know, like these group chats, they do, it's like our modern day smoke filled lounges where you get to see these very powerful people sort of debate these topics and get their takes out there that are like wacky as hell. And that's what I think, like reading this article, it was a little bit more like, damn, I mean, I don't know if anybody in this group chat thinks that but there are definitely some vocal people in this group chat that definitely think they are the ones who are going to solve these very complex issues. But in the most
Starting point is 00:45:19 like inelegant, one-size-fits-all kind of way that's just all about power. And that's what I'm like, oh, I think these are really starting to blend together, especially as we see how much people's like media diets and the information they receive are informed by what these people who are running these companies believe about, like, how information should move and how people get siloed into information bubbles and things like that. That's when I started to begin to be like, hmm, this feels a little bit more like a smoking gun.
Starting point is 00:45:50 When you hear like, you know, these ideas are being exchanged and knowing that like, there's this guy, Curtis Yarvin, that we talked about a few weeks ago, who's like basically like a tech monarchist, who has a lot of ideas that people like Elon Musk and Peter Thiel like are really subscribed to. Katie, yeah. Yeah. And like in these group chats, it's where these a lot of these very influential people start getting introduced to these kinds of ideas. So that's when I'm like, this feels very sinister at best. Like when you see this and then thinking that these are the people that control the levers that just normal people who are using the internet and stuff get affected by their policies,
Starting point is 00:46:30 it feels like a little bit like they figured out this is how they exercise their immense control over people, in these sort of less, not like in the ways that we think in terms of governance or whatever, but through the spread of information and ideas. Yeah. I guess I feel like the Signal chats raised two things about this pivot to the right, which I think are both really important. And I think the one is what you've pointed out, this small-scale influencing. I think often when we talk about disinformation or misinformation, we're thinking about that in terms of, oh, someone makes like an AI deepfake and they release it onto the internet. Somehow that deepfake like convinces like many, many, many people
Starting point is 00:47:08 that this event occurred or didn't occur and it has like seismic effects. And like, you know, again, I don't think that's necessarily how disinformation works. I think that, yeah, these kinds of like small cultivated circles of trust are maybe the things that we should be really, really concerned about, which is like what actually causes people to change their mind or change behaviors. Like maybe these are the levers that can like be significantly more effective. But yeah, the second is kind of a point you were raising and I think does again like tie into what happens when you build these bonds of trust, which is just like following the
Starting point is 00:47:37 money and like you mentioned, sort of the massive amount of tech funding that got Trump into office. And I had a colleague who now works at The Guardian, and he and the team kind of just like tracked all the funding behind the Trump and Harris campaigns. And what was just really astounding was just like the extraordinary amount of money that specifically just came from tech. And like that's been a really big change in the last five years or so is just like
Starting point is 00:47:59 how much money sort of Silicon Valley has been putting it into political lobbying. And so I think, you know, yeah, it's even regardless of your political orientation, but particularly right now with the Trump administration, I think that's something we should all be really concerned about. Yeah. I mean, because it feels like this is the best way. In some level, the tech industry is sort of like, well, we're actually now the
Starting point is 00:48:24 ones that are really able to manufacture consent through social media and misinformation. You marry that with a presidential administration that would really benefit from that full court press propaganda making. It feels like everyone wins in their own way, although clearly some people in tech had their regrets after all the tariff chaos. And they're like, no, no, no, no, no, no, no, no, that actually also.
Starting point is 00:48:49 Don't trust that, don't trust that. Oh, fuck. Fuck. That one's actually really valuable. I just wanted less regulations and for people to say the N word on Twitter more. That was my whole thing. And now I have no money or less money, but yeah,
Starting point is 00:49:03 it feels like it's just like the most clown show version of all this, I think which is also very frustrating to watch or just have to sit idly by as it's all happening to us. Very intentional. Also, it's just really pathetic. Just seeing how easily influenced these people are. I mean, it makes sense because it feels like they're from a church that believes that wherever the most money is equals the right thing.
Starting point is 00:49:34 You just get them on a group chat with a billionaire and they're like, well, I mean, that guy must be the smartest guy in the world. His ideas I have to listen to. Yeah. I do think as well, to some extent, I'm like, yeah, to that point, much less interested in what people quote unquote, really think and much more interested in just what they do. And so to some extent, I'm like, yeah, does it really matter whether Mark Zuckerberg is personally invested in these quite misogynist ideologies about what it means to be a man, or does it matter that he has gone on Joe Rogan and now propagates a lot of these ideas
Starting point is 00:50:09 and has publicly shifted Meta to align with or support Trump's administration? That to me is what really matters right now, which is that we have these active shifts of power in the tech industry that go beyond politically signaling in certain ways. Like they're really, really tangible. And I think Musk is like the flagship of that, in terms of someone who's just been like very, very visible in the administration, but you know, he's certainly not the only one. Yeah, it feels like there's no shortage
Starting point is 00:50:36 of these tech billionaires who are dead set on a horrifying ideology. All right, let's take a quick break and we'll come back and keep talking about AI. We'll be right back. Hey, my name is Jay Shetty and I'm the host of On Purpose. I just had a great conversation with Michelle Obama. To whom much is given much is expected. The guilt comes from, am I doing enough? Me, Michelle Obama, to say that to a therapist.
Starting point is 00:51:08 So let's unpack that. Former First Lady Michelle Obama, and someone who knows her best, her big brother, Craig, will be hosting a podcast called IMO. What have been your personal journeys with therapy? We need to be coached throughout our lives. My mom wanted us to be independent children and she would always tell me stop worrying about your
Starting point is 00:51:32 sister. Having been the First Lady of the entire country and representing the country in the world I couldn't afford to have that kind of disdain. What would you say has been the most hardest recent test of fear? Listen to On Purpose with Jay Shetty on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts. It's nostalgia overload as Wilmer Valderrama
Starting point is 00:51:56 and Freddy Rodriguez welcome another amigo to their podcast Dos Amigos. Wilmer's friend and former That '70s Show castmate Topher Grace stops by the Speakeasy for a two-part interview to discuss his career and reminisce about old times. We were still in that place of like, what will this experience become?
Starting point is 00:52:12 And you go, you're having the best time. Yeah, yeah, yeah, yeah. But it was like such a perfect golden time. Listen to Dos Amigos on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts. And the dream season is now complete. Apple podcasts or wherever you get your podcasts. conversation as a player when you hear that you're being benched. For the entire behind the scenes story of Golden State's incredible 10 year run, listen to Dubb Dynasty on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hello, hello.
Starting point is 00:52:57 Malcolm Gladwell here. On this season of Revisionist History, we're going where no podcast has ever gone before. In combination with my three-year-old, we defend the show that everyone else hates. I'm talking, of course, about Paw Patrol. There's some things that really piss me off when it comes to Paw Patrol. It's pretty simple. It sucks.
Starting point is 00:53:20 If my son watches Paw Patrol, I hate it. Everyone hates it, except for me. Plus, we investigate everything from why American sirens are so unbearably loud, to the impact of face blindness on social connection, to the secret behind Thomas' English muffins' perfect nooks and crannies, and also we go after Joe Rogan. Are you ready, Joe? I'm coming for you. You won't want to miss it.
Starting point is 00:53:47 Listen to Revisionist History on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And we're back. We're back. I do just want to add: the Floam reference, Miles. I don't want that to go by without being remarked on. Okay. Like I said, check my bio: encyclopedic knowledge of the '90s. Is that the one that was like, that would almost, like, crackle when you molded it? Yeah. It was like a bunch of little beads almost, like, held together. It was really... what if Rice Krispie Treats was a toy?
Starting point is 00:54:23 You know, what if Rice Krispie Treats gave you bowel problems if you even touched your mouth after handling it? But now it's like, if you ate them, they were in your body for the rest of your life, and also your children's lives. Your progeny. Also, now every third person is, like, gooning to ASMR shit, and we have Floam to thank for that. I know, exactly. We were just out here just trying to figure it out on our own with Nickelodeon Floam.
Starting point is 00:54:49 I gave it the full brand name because I felt like people would forget if I didn't properly say Nickelodeon Floam. Nickelodeon Floam. A larger and larger part of their body is just made out of Floam. It's like, instead of a left knee, I have Floam down there. I can just mold it and remold it as I walk. Anyways, old debate. And this is not, this is not the story that I thought I was going to be talking
Starting point is 00:55:14 about today. And yet, like I said, we report on this, like, guys, we don't make it. Yesterday, I had the experience of checking social media and it just being wall-to-wall 100-men-versus-one-gorilla, you know, commentary videos. And so this is an old debate. I mean, this sort of debate, you know. There is one of my favorite articles from the early days of Cracked, like 2008 Cracked, I think it was by this writer, Chris Buckles, that was just, like, how to fight 20 children as one person.
Starting point is 00:55:49 It was just really well thought out and good advice that has shaped me as a person. That you've used many times. Yeah. I'll link off to that as my work in media. But it is one of those questions, once you start thinking about it, it's kind of hard to stop. I will say the original question as originally posed on Reddit five years ago, I think has a fairly simple answer because they really stacked the deck against the humans.
Starting point is 00:56:23 Oh yeah. Yeah. The original question was like, all right, first round, you can only go two at a... two men at a time. Yeah. Like, against the gorilla, and they're in a gorilla exhibit. The second one is at a construction site, and you can go 10 at a time. They also say that it just has to be, like, 100 average people.
Starting point is 00:56:41 And then the third one, I guess you can go all at once, but you're not going to have many people left after those first two rounds to go all at once, which is, like, really your only hope. I think all of the, the sort of modifiers are meaningless, because really the question is, how can a hundred people win in a fair one? Okay. Yeah. No tools, no weapons.
Starting point is 00:57:06 Not because this guy's basically bringing up, like, Rainbow Road type shit, where, like, the one where the gorilla will fall off the edge or some shit. No. What is it going to take for 100 human beings to beat the fuck out of a gorilla? Yeah. Like a cage match or something like that. Yeah. Yeah. And my first reaction is, it's not happening. It's not happening ever. My big thing with all of this is human beings don't have fangs.
Starting point is 00:57:34 They cannot draw blood. So you're counting on purely overpowering a gorilla. I mean, we do have fangs, Miles. But can you draw blood through that gorilla's fur? Probably, if I had to, if he stayed still for a little bit, if I bit the finger real hard, like Charlie. Yeah, I think I probably could. I can bite through a lot, is what I'm saying.
Starting point is 00:58:01 That's how I see a gorilla going down, somehow. Like, if it fought a tiger, it's because the tiger slashes the gorilla, right? It's like, it's leaking. Yeah. In a straight-up hold, the gorilla... a fucking hundred people are going down. The first one that gets ripped in fucking half by this thing, all 99 other people are gonna go, no. Right, you have to overcome... The things that I think you need: I think, first of all, we need a draft. Like, and not, not a, like, military draft.
Starting point is 00:58:31 What, are we going by birthday? I think we need to, MacGruber style, get to put together a team. MacGruber the film, not the SNL sketch. Oh, thank you. Which, I mean the film, not the sketch. The sketch is a completely different thing. The film is a work of art. The, you know, I think we need to be able to intentionally assemble a team.
Starting point is 00:58:54 I think there needs to be some sort of a Squid Game-style motivation that is going to prevent what you're talking about, where they see a person get ripped in half and everybody's like, this is not worth it. It needs to be worth... like, it needs to be 100 men and the gorilla in the cage, and, like, it's not over until... That would be sick in Squid Game, like in the actual, like, a new season of Squid Game, like, when there's like 400 people left, it's like, all 400 of y'all versus a gorilla right now.
Starting point is 00:59:18 Right? Yeah. Yeah. You all are gonna die. A lot of y'all gonna die. But 400... 400 feels like... maybe 400. I don't even know how 400 is different than 100, other than, like, I feel like a gorilla killing 300 people with its bare hands may tire the gorilla out. It'd get tired. And then the other ones are just, like, coming up and, like... kind of, yeah.
Starting point is 00:59:49 Then you have a shot. I feel like that would probably be true after, like, 50, right? Like, that's a lot of people to kill. I could see that. I almost think... now I think it can almost be a random assortment of a hundred people. You don't even need to do a draft. You just need a coach. That's right. Yeah, like, like a maestro, a conductor. Yes, a maestro who's running plays, and basically the goal of every play is somebody goes for the eyes. I think once you get the eyes...
Starting point is 01:00:22 Brian... Okay, they're... I like that. Okay, they're kind of down. That is right. It's got to be the eyes. Although, all right, so this is what I'm saying. It's got to be the eyes. That convinced me in a split second.
Starting point is 01:00:36 At first, I was like, we need a legendary football coach. Dude, I think Brian's the coach. I know. Well, now I'm thinking... still, I want Mike Tomlin. Mike Tomlin from the Steelers. And the Steelers, like, just every year they have a terrible team. Everyone's like, these guys are gonna win two games this year, and every year they are, like, a two-seed, right? And he just has energy. He, like, yells at everybody, cut your eyelids off, cut your eyelids off, cut your eyelids off
Starting point is 01:01:07 when they go out to play, which is just like, so I don't know. And also a great tactician, although we might not need one with Brian. No, no, no, this Brian has absolutely- I was thinking primatologist would be helpful. Like somebody who can really speak to it. But like a primatologist where you've like, you know, kidnapped their kid because otherwise they're gonna be like,
Starting point is 01:01:28 we should let the gorilla kill us. Yeah. Because I don't wanna introduce like, you know, there could be something like gorillas hate, like reflective material or something like maybe like a non-weapon object that evens it up. But I don't even like that idea because now we're not talking, we're talking about a hundred people.
Starting point is 01:01:44 Yes, a hundred people with tools can beat a gorilla, right? We're just... if we're getting down to brass tacks, bare hands and feet. I think, Brian, yes, you take out the vision. Now you're dealing with an animal that has no situational awareness. And now you can kind of get a little, like... but even then, you're gonna be teasing it. Yeah, yeah.
Starting point is 01:02:05 Yeah. That was a mental game. That's right. But either way, terrified. Don't we still... we still haven't solved the part where you negate the absolute, exponentially higher strength of the gorilla? Because the second you grab it, the gorilla will just put hands on you, rip you in half, even if it can't see.
Starting point is 01:02:27 And I feel like you're just going to tick the gorilla off even more with his eyes being all poked out. That gorilla is PO'd. Yeah. I... so I was, I was fully on the "you just overwhelm it with a hundred at once." And then this, uh, Guardian article about this debate pointed out that somebody did a three professional soccer players versus 100 children.
Starting point is 01:02:51 Oh, I saw that. That was in Japan. In Japan. Yeah. And the professional soccer players won. Oh yeah. But they beat the shit out of them. They won, I mean, they won one-nothing, but, like, I don't... that may be really...
Starting point is 01:03:02 If you see the video, they had, like, 60 kids in goal, like literally 60. Like, there were, like, 60 kids dressed as goalkeepers in the goal, and they just had to kind of head it in over them. But, like, the way the three of them just spread the field out and were just hitting long balls that, like, just had the kids chasing. Right. And it was pretty, it was pretty impressive. If it's the one I'm thinking of, because I've definitely seen that one. I was like, yeah, their skills are just way too high for the kids to handle that.
Starting point is 01:03:31 I think if it's 100 random people, it depends. I do think you're probably going to get one person who screams "go for the eyes" in that group. You know what I mean? Then you might have a chance. If we're allowed to build a team, I do like the humans' chances. Like, if we're allowed to draft, even if we're allowed to draft 10. There's no answer. There's no answer. Like, there's no conceivable answer for how you negate... What are you going to do?
Starting point is 01:03:54 Like, have 10 guys try and hold the arms? Right. Like, a gorilla will... I'm talking Mike Tomlin and the foremost primatologist. And then eight... Dr. Ryan Bahi. And then, like, Lawrence Taylor, or whoever the closest person to Lawrence Taylor currently is.
Starting point is 01:04:14 I like the eye thing, but I keep coming back to the thing of, like, you can't negate the strength. You just can't negate the strength. I do think, even with the eye thing, I do think 10 people will die for sure. Oh yeah. Oh yeah. Ultimately, the people will win. I do, I do think that. Like, you draw something up. Everyone's pushing vibes right now. No one's giving me real
Starting point is 01:04:36 plays, and then you, you show them. Brian, give me an example of the play. What happens? How are you gonna, like... you can't damage a gorilla's Achilles tendon with your teeth. You know what I mean? I'm trying to think of everything. The first 10 are absolutely going to get fucked up. That 11th person goes for the eyes, and then the 90 rush the fuck out of the gorilla.
Starting point is 01:04:58 I think a big difference is going to be if it's in a ring where, no, you can't pick up something that's close by. Because, like, our access to tools, or like our ability to use whatever is at hand as a tool, was like the crucial thing that allowed us to, like, get out of the food chain. So, like, if it's a gorilla enclosure at a zoo, or even, or like, you know, I feel like that is almost an advantage, because then I do feel like you can, like... you have rocks, you know. Okay, how about this? I see, but that's the thing. Yeah, I get that.
Starting point is 01:05:30 But I think that the thing that is appealing about this problem is, how could the pure, just, strength of the human body overcome a gorilla? Obviously the human is powerful. That's... you've been watching these sports, these hope core videos, that have brought him, uh, to a place where he believes... No, I'm with him. I just, like... over that, like, like, track, you just hear this song playing. That's, like, the hope, and all the hope core.
Starting point is 01:06:06 You're like, oh man. And then the hundred guys ran after the gorilla. It's just very graphic. And you just... I'm saying one, like, access to one rock. I don't know. So there's this story I like about a guy's a device versus a guy named the life. Thank you. Sports hope core.
Starting point is 01:06:36 We don't... Way to bring it back. All right, that's gonna do it for this week's Weekly Zeitgeist. Please like and review the show if you like the show. Means the world to Miles. He needs your validation, folks. I hope you're having a great weekend, and I will talk to you Monday. Bye. I want you to ask yourself right now, how am I actually doing? Because it's a question that we rarely ask ourselves. All of May is actually Mental Health Awareness Month, and on The Psychology of Your 20s, we are taking a vulnerable look at why mental health is
Starting point is 01:08:04 so hard to talk about. Prepare for our conversations to go deep. I spent the majority of my teenage years and my 20s just feeling absolutely terrified. I had a panic attack on a conference call. Knowing that she had six months to live, I was no longer pretending that this was my best friend. So this Mental Health Awareness Month, take that extra bit of care of your wellbeing. Listen to The Psychology of Your Twenties on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. It's nostalgia overload as Wilmer Valderrama and Freddy Rodriguez welcome another amigo
Starting point is 01:08:35 to their podcast, Dos Amigos. Wilmer's friend and former That '70s Show castmate, Topher Grace, stops by the Speakeasy for a two-part interview to discuss his career and reminisce about old times. We were still in that place of, like, what will this experience become? And you go, you're having the best time. But it was like such a perfect golden time. Listen to Dos Amigos on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, I'm Jay Shetty, and my latest interview is with Michelle Obama. To whom much is given, much is expected. The guilt comes from, am I doing enough?
Starting point is 01:09:09 Me, Michelle Obama, to say that to a therapist. So let's unpack that. Having been the First Lady of the entire country and representing the country and the world, I couldn't afford to have that kind of disdain. Listen to On Purpose with Jay Shetty on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. The number one hit podcast, The Girlfriends, is back with something new,
Starting point is 01:09:35 The Girlfriends Spotlight, where each week you'll hear women share their stories of triumph over adversity. You'll meet Luanne, who escaped a secretive religious community. Do I want my freedom or do I want my family? And now helps other women get out too. I loved my girls. I still love my girls. Come and join our girl gang.
Starting point is 01:09:58 Listen to The Girlfriends Spotlight on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
