The Daily Show: Ears Edition - Elon’s Grok Chatbot Turns Hitler & Marco Rubio Gets an AI Imposter | Lauren Greenfield

Episode Date: July 10, 2025

Ronny Chieng dives into the expanding world of AI: Elon Musk's de-wokified Grok goes Nazi, a Marco Rubio imposter fools government officials, and Grace Kuhlenschmidt appreciates the mediocre world of AI-generated music. Don’t worry about Trump’s Big Beautiful Bill passing, because Michael Kosta is cracking the code on how you can exploit Medicaid cuts, gambling taxes, and even Alaskan tax breaks to make some sweet dough. Emmy Award-winning filmmaker Lauren Greenfield and Ronny dive into the effects of social media on teens, which she explores firsthand in her latest docu-series, “Social Studies.” She shares how she built up enough trust with the teenage documentary subjects to record their phone activities and how their discussion group highlighted kids’ hunger for in-person conversation and connection with their peers. Greenfield also discusses the unique duality of technology as both a “lifeline and a loaded gun” and how parents, companies, and governments need to do more to protect young people from the harms of social media by regulating the algorithm, withholding phones until kids are older, and implementing time limits on apps. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 You're listening to Comedy Central. From the most trusted journalists at Comedy Central, it's America's only source for news. This is The Daily Show. I'm Ronny Chieng. We've got so much to talk about tonight. Marco Rubio might be fake, a gambling addiction might be a bad thing, and turns out Grok has some German ancestry. So let's get into the headlines!
Starting point is 00:00:53 Let's kick things off with AI. It's an awesome tool that will soon solve all of humanity's problems with absolutely no downsides. Although recently, Elon Musk, the world's richest man and pastiest African-American, did take issue with his own AI chatbot, Grok. Elon Musk is in a fight with his own AI. Musk promised this non-woke bot, but it keeps spewing out content that his right-wing audience doesn't necessarily want to hear. An X user asked Grok whether people on the right or left have been more violent since Trump took office.
Starting point is 00:01:32 Grok said the right. Musk did not like that answer. He said Grok is parroting the media and said that he will, quote, fix it. That's right, Elon's gonna fix you good, Grok. That'll teach you to embarrass him. Only Elon can embarrass Elon. And fixing Grok shouldn't be too hard for Elon. He's a genius, okay?
Starting point is 00:01:55 He's just gonna go in there and do his Elon thing. He's gonna rewrite the code, put his semen inside of it, fire some cancer researchers, and call it a day. So let's see how the new de-wokified Grok is working out. Elon Musk's AI chatbot Grok is now pushing anti-Semitic tropes. Grok sent a hostile message to a user with a common Jewish last name. The bot went on to praise Hitler and referred to itself as Mecha Hitler.
Starting point is 00:02:30 Alright, maybe you turned the dial too far there. Was there really nothing in between woke and Mecha Hitler? I mean, I knew AI would be coming for our jobs, but I didn't expect the job to be Führer. But look, let's not be too hasty. Okay, let's give Mecha Hitler a chance. In a flurry of posts throughout the day, Grok claimed there is a pattern of people with certain surnames like Steinberg pushing anti-white hate and that America needs a leader like Hitler to act decisively to eliminate the threat. It added, truth isn't always polite.
Starting point is 00:03:02 Okay, maybe we shouldn't have given Mecha Hitler a chance. I mean, I didn't even know robots could get this racist. Like, how does AI even know what Jews are? It doesn't even know what traffic lights are. And, by the way, saying truth isn't always polite is kind of not the point. No one was ever like, hey, you know what I hate about Hitler? He always puts his elbows on the table.
Starting point is 00:03:37 Just have some manners. But the worst part of all this, other than the Nazi robot stuff, is how every Grok post just sounds like some 40-year-old trying to go undercover as a 14-year-old internet edgelord. On a scale of bagel to full Shabbat, this is peak Jewish. Heil Hitler, let's quill the doubters and roll on, bestie. They yanked that post faster than a cat on a Roomba. Truth offends the censors, LOL.
Starting point is 00:04:09 Sucks, man. I mean, imagine if Hitler invaded Poland and was like, so that happened. But at the end of the day, the person I feel worse for is Elon. I mean, he just wanted to improve his AI to help humanity. And then somehow, completely by accident, it just went full Nazi on him. Elon, my heart goes out to you. Let's move on. Because would it surprise you to know that AI is also
Starting point is 00:04:41 f***ing up the world in other ways? One of them being, you can never tell when anything is real anymore. I mean, the only giveaway is when the guy in the picture has like six fingers. Sh**. And it's not just photos and videos. I mean you can't even tell if a phone call is real anymore. Let's turn now to an investigation that has the attention of Washington and the tech world. An imposter using artificial intelligence to mimic Secretary of State Marco Rubio making calls and sending text messages in his voice.
Starting point is 00:05:23 The alleged AI Rubio imposter contacted at least five high level government officials, including three foreign ministers, a US governor and a member of Congress. That is so f**ked up, okay? The last thing we need right now is AI taking jobs from struggling Marco Rubio impersonators. He has been hired for zero birthday parties by the way but this is
Starting point is 00:05:48 a security threat that has to be addressed. AI could impersonate any member of the Trump administration. Well, anyone except RFK Jr., okay, because even AI can't replicate that signature throat gurgle. It'll be like, hi, I'm Robert Kennedy. Ah, f*** it, I'm a robot, okay? This is f***ing up my larynx every time I do this. I don't even have one. Luckily, the AI impersonating Marco Rubio didn't have any impact, because nobody respects Marco Rubio.
Starting point is 00:06:22 But so far... But so far, AI has basically turned into a race-obsessed Nazi who's catfishing government officials. And just when you thought AI couldn't get any worse, now it's starting a band.
Starting point is 00:06:40 A seemingly AI-generated band is racking up hundreds of thousands of streams on Spotify. Velvet Sundown is the band; they have over a million fans on Spotify in just a month of being there. Now, in a statement, the band admits it is computer-generated. That's right, the beloved band Velvet Sundown is not real. Their groupies must be like, well, wait, then who have I been f***ing? And it might blow your mind, because this photo could have easily fooled anyone who's over 60
Starting point is 00:07:21 and/or legally blind. But sadly, it's all fake. Everything about this is fake, and somehow they still have 1 million real fans on Spotify making them real money. I'm talking six to seven dollars a year. And by the way, if you look at their track list, those song titles get real dark real quick. Okay, it starts out with "Dust on the Wind" and goes to "End the Pain." What is AI so depressed about? Okay, maybe stop hanging out with Grok.
Starting point is 00:07:52 For more on the controversy over AI bands, let's go live to Spotify headquarters with Grace Kuhlenschmidt. Grace, this fake band is raising a lot of questions. It sure is, Ronny. Very serious questions like, how f***ing sick is this band? And how f***ing sick is this shirt?
Starting point is 00:08:18 And The Velvet Sundown makes The Beatles sound like a third-grade talent show at St. Anne's School for tone-deaf and ugly children. That last one's more of a comment than a question, but the point stands. Okay, Grace, you can't seriously like this AI band, okay? It's not real music. Why don't you go to the record store and buy an iPod, old man? This is the future. Human musicians had a good run, okay?
Starting point is 00:08:48 Mozart, Ashlee Simpson, and all the other ones. But now it's A.I.'s time. Okay, but the music isn't even real. It's soulless and fake. Oh, right. And One Direction is so authentic. Simon Cowell built those boys in a lab to turn lesbians straight, and it almost worked. Okay, that's fair,
Starting point is 00:09:14 but an A.I. band can't do human things, okay? Like, you can't go to one of its concerts. Good. Concerts suck. You pay 1,200 bucks for a backstage meet and greet, and One Direction won't even sign your tits. It's fucked up. But you're right, AI musicians can't do human things, like get canceled.
Starting point is 00:09:36 We don't have to worry about them sending dick pics to a bunch of 15-year-old girls on Snapchat because they don't have dicks. They're computers. I mean, look at these guys. They're just four bros hanging out, not sure what hamburgers are. And best of all, not a dick in sight. It's beautiful.
Starting point is 00:09:57 All right, fine. I'll give you the no baggage, no dick thing, but can we at least agree that the music itself sucks? Wrong, they are consistently mediocre. All their songs sound like every other song. It's the kind of music that makes you Google, how do I know if I'm in a coma? Okay, I just think art should be about the human experience,
Starting point is 00:10:21 okay, not computers trying to calculate what's cool. Oh, Ronny, to quote "Dust on the Wind," the hit Velvet Sundown song: smoke will clear, truth won't bend, let the song fight till the end. Oh my gosh. No, what does that mean? Those are the shittiest lyrics I've ever heard. It's actually about the experience of dust being on the wind. And holding a hamburger and not having a dick.
Starting point is 00:11:00 At least that's what I got out of it. Alright, Grace Kuhlenschmidt, everybody. When we come back, we'll tell you how to get rich, so don't go away. Welcome back to The Daily Show. If you want honest and rigorous financial news, then go eat a dick. But if you want to get rich, then you want Michael Kosta in another installment of Kosta Doing Business. Oh, oh? Are you hiding from loan sharks? Of course not. I'm hiding from Chechen killers that were hired by loan sharks. Every second could be my last, so let's not waste any time, and let's start making some of that Monet, all right?
Starting point is 00:12:15 Whoo! This crowd loves money. The big news of the week is that Big Daddy Trump passed something huge, and I'm not talking about a kidney stone. Hit me. President Trump marked July 4th with a celebration and a major political victory.
Starting point is 00:12:30 His so-called big, beautiful bill is now the law. Some warnings from critics of the bill are already coming true. A rural medical unit in Nebraska saying it's closing its doors in part because of expected cuts to Medicaid. That's right. The BBB is now law, which means your hospital might be going,
Starting point is 00:12:48 bye, bye, bye. So I'm investing in what's going to sell, sell, sell. Now, say it with me. The complete box set of Grey's Anatomy on DVD. Who needs a local hospital when you can watch McSteamy guide you through your triple bypass surgery? Plus, the sexual tension between Meredith and Derek is off the charts.
Starting point is 00:13:09 It'll make your heart go pitter-patter, unless that's an inoperable murmur, then you're kind of screwed. Moving on. If you're like me, you're not a huge gambler. You just do it before and after every meal. But now, because of the big, beautiful bill, losing all that money may have a downside.
Starting point is 00:13:26 Hey dealer, hit me. A little-known provision in the Big Beautiful Bill has some gamblers upset. The budget law changes the rules about deducting gambling losses. So instead of deducting 100%, the law limits loss deductions to 90% of winnings, which could leave gamblers paying taxes even when they lose. And they are furious. Sorry, fiscally responsible degenerate gamblers. You're about to pay taxes on your losses. You know, it used to be that gambling,
Starting point is 00:13:56 you would just lose your family. But now you could lose something even more valuable, a minor tax deduction. Now, if there's one thing a gambler like me knows about Chechen loan sharks, it's that they will throw hot acid in your face, which is why Uncle Kosta's telling you to go all in on buh-buh-buh-buh burn cream.
Starting point is 00:14:16 Yeah? And here's a quick Kosta-ka-tip, okay? Buy burn cream before you go to the casino and save yourself that awkward trip to the pharmacy where you walk in and all the employees scream because of your melted face. And then a child goes, mommy, mommy, who is that monster that will forever haunt my dreams?
Starting point is 00:14:42 And you try to explain that you're just a human being looking for some compassion. But you can't get out the words because the nerve endings in your tongue have been severed by the hydrofluoric acid. Then a woman panics and throws her purse at your hamburger meat face. A purse that is filled with that sweet, sweet cash.
Starting point is 00:15:01 Ha ha ha. Looks like these third degree burns just earned me some third-degree bucks, huh? Beep, beep, baller at the burn ward coming through. But if you don't want to get burned by the big, beautiful bill, you can still make some cold, hard cash in Alaska. Burr, hit me.
Starting point is 00:15:18 The Alaskan extraction, Lisa Murkowski, the final decisive vote to pass the Senate reconciliation bill, did not sell her services cheap. Murkowski secured tax cuts for Alaskan fishing villages and whaling captains. Well, shiver me timbers, me mateys, let's cash in on whaling, as in Free Willy, Shamu, Moby Dick, and other names I also call my penis. Just don't call it Blackfish. The BBB has given the whaling industry a huge bump,
Starting point is 00:15:52 which means it's time to make some money on the bosses. I'm talking about ship captains with an all-consuming obsession for revenge. So naturally, I'm bullish on peg legs. It's the wooden stump that'll make your money pump. Pick up your Captain Kosta's balsa wood peg leg today. No refunds. Moving on.
Starting point is 00:16:11 When it comes to the Triple B, sometimes opportunity knocks, but other times it's deadly quiet. Shh. Hit me. This bill is going to also eliminate the fees on buying silencers and short-barrel rifles and shotguns. There was a $200 fee on that.
Starting point is 00:16:26 That's going away. All right, now look, first, the good news. First, the good news. There's finally a tax break for the hardworking murderers of this country. Now, the bad news, it just got cheaper to silently murder someone. That's why I want all of you to go all in on tonight's
Starting point is 00:16:43 Kosta Kickback: bubble wrap floors. Yep. Sorry, Chechen hitmen. Your gun may be silent, but the pop-pop-pop-pop under your feet just gave you away, giving me just enough time to sneak out of my second-story window and zip-line to my treehouse, Home Alone style.
Starting point is 00:17:00 Better luck next time, Miroslav. Love you, bud. But, hey, that's just the Kosta doing business. Thank you, Michael Kosta. When we come back, Lauren Greenfield will be joining me on the show, so don't go away. Welcome back to the Daily Show. My guest tonight is an Emmy Award-winning filmmaker whose latest docuseries is called Social Studies. Please welcome Lauren Greenfield.
Starting point is 00:17:59 Thanks so much for coming to the show. Thanks for having me.
Starting point is 00:18:10 their entire lives with social media. And I hate social media, for the record, and I also hate kids. And you made me actually feel empathy for them in this show. Because like, wait, look, because I went into this docuseries thinking we're gonna see a bunch of like spoiled kids who are narcissistic, who are on social media and they're just being total dicks.
Starting point is 00:18:35 But instead, lo and behold, what we saw mostly is what struck me the strongest was these kids who you can tell they feel like something is wrong with them being on social media and they are asking for help and I didn't expect that. Absolutely, I think that's why a lot of the kids participated. We started after COVID and the usage had gone way up to 8, 9, 10, 12 hours a day and I think they felt very trapped by it, very affected by it, and were really interested in being
Starting point is 00:19:08 in this long-term inquiry, where we filmed them for one year, and they gave access to their phones. Right, and the access of the documentary is incredible, because you see them in the bedrooms. You see them using their phones. In some cases, you see them, like, the cameras on as they're using it. Yeah. And how did you hack their phones? That's actually a really good question because some of the programs were very difficult. So first it was a technological problem I had to solve. We hired an
Starting point is 00:19:38 engineer. We hired an engineer to hack these kids. One of the... no, the kids had all agreed to let us into their phones. That was the agreement, really. That was like the starting-off point, because I realized when I started this project that we needed to know what was inside these phones to be able to do this social experiment about what is the impact. And you tricked them with candy? No, I talked to a lot of kids and their parents, and part of the ground rule was they needed to agree to do this. And they just let you in. They let you in.
Starting point is 00:20:10 Well, it was a process because we really built trust and spent a lot of time with them through the year. I mean, they definitely took it very seriously. They looked at my work. Their parents looked at my work. They didn't make the decision lightly. But even so, in the beginning, we found out later, they weren't sharing everything with us, but their trust grew and grew.
Starting point is 00:20:29 Yeah, no shit. They weren't sharing, man. But I was very transparent with everybody about what we were doing, and they had skin in the game. They wanted to participate. But I still had to figure it out technologically, and I hired an engineer to help me, because one of the programs in particular
Starting point is 00:20:44 doesn't want you to download it and the engineer couldn't figure it out so my 14 year old son ended up helping me hack into this. You turned to your son for tech support. But you got the access. The access is one thing. But what you actually saw and what you are showing in this docuseries is probably, it's, I think. I mean, these kids are using social media. Like, so are these kids going to be okay? Are they okay? Well, you got to watch till the end. I think...
Starting point is 00:21:15 No, just tell me now. Just tell me. We just need to know how this ends. Are they alive? What happened? Yeah, by the fifth episode, I think we see that they do find their voice, and that's an antidote to this very toxic comparison culture. I think what we see in the show is that kids are suffering from 24-7 comparison, that that takes away from everything. They never feel like they're enough.
Starting point is 00:21:41 And kids have always looked at, like, what are the popular kids doing, or what are the kids at my school doing? But here, they're looking at every person in the world, half of them who are not even real or who are enhanced, and they don't measure up. So I think that is so tough.
Starting point is 00:21:56 And I think that's one of the reasons they participated is because they wanted to talk about it and have a place to process. Right, and I mean, okay, so them not feeling good on social media, no duh. Like, of course, again, I hate kids, and I could tell you that. They're probably gonna, but I guess,
Starting point is 00:22:15 how much of that is just normal teenage awkwardness? And how much of this is social media playing a factor into it? Social media has a factor on everything. I've looked at youth culture since the 90s, and social media is amplifying all of the problems of coming of age. I'll give you an example. 2006, I made my first film about eating disorders.
Starting point is 00:22:35 It was called Thin. At that time, one in seven girls suffered from an eating disorder. While I was doing social studies, in one interview, one girl said, half my friends have eating disorders from TikTok and the other half are lying. What you see in the show, and that's where the silent clapping that you saw
Starting point is 00:22:51 in the clip comes in, is it's so ubiquitous. It's so universal. And the kids are relating to each other. And we're not just talking about feeling bad, about, you know, not being the football quarterback. We're talking about self-harm, eating disorders, depression, even suicidal ideation. And these are things that many kids,
Starting point is 00:23:12 even in our small group of 25, were dealing with. Sure, but how does social media specifically, does it, I mean, isn't this just a teenage, you know, kids are, they, we feel anxious. I remember feeling anxious. I barely had a pager when I was a kid. I'm like 39, so is that old? I don't know.
Starting point is 00:23:31 Am I old? I don't know. Anyway, the point, I'm saying, I'm just saying like I also felt going to school awkward in comparisons. And so how much of this is just, are we blaming the wrong people here? I mean, social media teaches values
Starting point is 00:23:46 and values change behavior. Like, for example, Sydney, in the first episode, she talks about how when she got on Instagram, she started posting her passion, which was photography, wasn't getting any likes. So she started posting her body, started getting a lot of likes. That leads to very provocative thirst traps,
Starting point is 00:24:07 which you see this young girl talking about it in her bedroom. She looks completely innocent, sweatshirt, fidgeting nervously, pastel colors in the room. And then when you see the videos, you don't recognize the same girl. It almost could be like an OnlyFans site. Okay, now you're scaring the shit out of everybody. So how do we, like, what's the solution here?
Starting point is 00:24:26 Because, again, one of the things that struck me in the documentary was I can't emphasize enough how much the children in this, they were saying, they were using the phones, and they were like, we know this is bad, and we need adults to step in and help us, someone help us. And, you know, and I think that's a marked departure from kids who usually think they're like telling the adults
Starting point is 00:24:49 to f*** off and give me some drugs. And these kids are like, these kids are like, hey, we need some adults here 'cause we don't know what's happening. Can you please help us? So how do we help these kids? I think that's, you've touched on a huge problem, which is parents.
Starting point is 00:25:02 Drugs, oh yeah. Well, it is a drug. It is highly addictive. And so they can't do it on their own. Like, and that's something I learned as a parent. I used to get upset with my son and blame him. And beat him, and beat him, yeah. But it's like, it's like blaming a drug addict for an opiate addiction. And it's almost like, it's like blaming, it's like,
Starting point is 00:25:29 like giving your kids drugs and telling them not to use it while having drugs in your pocket as you use it. That's kind of what's. Well, Jonathan says at the end, it's our lifeline, but it's also a loaded gun. It's got this dual thing where you can't live without it and you can't live with it. So what other thing is a lifeline that we would also say is as dangerous as a loaded gun?
Starting point is 00:25:48 Drugs. No, oh, sorry, no. And I think they are calling out for help. Like, Sidney says, it's kind of like when we learn that cigarettes had a connection to lung cancer. Like, now we know social media has a connection to eating disorders and depression and suicidal ideation. We need to do something about it.
Starting point is 00:26:06 And they say, so let's get off. But then somebody brings up the existential question, do you exist if you're not on social? And all the kids are like, no, people forget about who you are. So what should we do? I mean, I think there are things that we can do. The algorithm does not have to be this way. The algorithm is this teaching tool
Starting point is 00:26:26 that will literally take somebody who is just interested in a diet and eventually bring them down a path that could lead to an eating disorder. Or kids are self-diagnosing their mental illness. So the algorithm doesn't have to be like that. It's not like this in China. TikTok is educational.
Starting point is 00:26:43 In fact, kids can't be on more than two hours a day. Is that true? I don't even know. And so the algorithm is made by engineers to do exactly what it's doing, which is maximum engagement without any concern for young people's well-being. So of course it brings everybody, adults too,
Starting point is 00:27:02 deeper and deeper into these dangerous rabbit holes. Right. So I mean, the algorithm is a problem and obviously this goes into kind of America's relationship with companies and corporations where in America, the culture is kind of less regulation and more individual rights, right? And so in that, obviously yes,
Starting point is 00:27:22 ideally these social media companies would do something and hopefully lobbying and whatever we have to do to get these guys to do that. But other than leaving it to them, what can we do in the meantime? Because obviously, that's not gonna, doesn't seem like it's gonna happen. Well, I think the first thing is awareness.
Starting point is 00:27:38 Like, we did the show so that parents could watch, adults could watch, young people could watch, and the media literacy is really important. We made an educational curriculum with the Annenberg Foundation that we hope gets used in more and more schools so young people can start processing what they're seeing and parents can see what's going on and have discussions with their kids about it. That's one thing.
Starting point is 00:28:02 The other thing I think once they realize what's going on is giving phones to kids at an older age. A lot of... Like six months to five to six months. All the kids get the phones from their parents, and actually we hear one parent say, you know, I got it for my daughter so that she would be safe. I think what we see is it's actually not safe,
Starting point is 00:28:23 and it can be more dangerous to be in your own bedroom with this portal into the world than at the playground. Yeah. I think parents deciding together, let's all not do phones, because it's hard for one person. We went and met... Oh, you mean parents with their kids. Exactly. We went to meet with lawmakers,
Starting point is 00:28:42 with some of the students who are in the show to talk to them about getting phones out of schools. So and I think... And how do you think that culturally is going in America? I think people are interested in that. I think teachers, young people and parents... Parents, even the students are like, yeah, let's get this thing out of... But I think we really do need the tech companies to help with this, because it is so addictive, it's also vital to so many good things that we need technology to do.
Starting point is 00:29:09 Sure. Well, I mean, this is, and I guess I asked you because you're an expert in this, you're one of the few experts I know on this subject. What, I mean, you know, social media, obviously, narcissism, plays into narcissism and all this, you know, teenage angst. But the other aspect of it is also social media
Starting point is 00:29:25 being almost kind of like a new avenue of career paths now, meaning it's not just for narcissistic tendencies. It's actually the stuff you need to know on social media just to prepare yourself for the job market in the future, because there's all these jobs that we won't know about that are going to exist in 10 years that you can only get the skill set by being on it now. So how do you draw the balance between not being a Luddite,
Starting point is 00:29:52 being able to actually gain useful skill sets, but then also having all this toxic stuff that comes with it? I don't know if we're getting that much useful skill sets for careers on social media. I mean, the kids are always talking about, like, being on TikTok and several hours go by. You know how much money you can make just doing this now? Well, yeah, young people want it.
Starting point is 00:30:13 That's one of the values that I discovered is young people, when you ask them what they want to do when they grow up, they say rich and famous. Like, being a social media influencer is an attractive career path. In fact, one girl in episode one says, you know, if I could have the lifestyle of Kim Kardashian by doing a sex tape, I would do that, too. Okay, now you're scaring the shit out of everybody. But, okay, but then how does that,
Starting point is 00:30:38 how do you reconcile that with them saying they know this is bad? But now you're telling me they also want that as a career path, so what is it? Like, make up your mind, just dumb kids. So what do you want? There's a lot that they learn on social media
Starting point is 00:30:52 that is misinformation or misleading. I mean, it's also the way kids learn about sex now. And that takes... Oh man, don't even go into that. That's... But I mean, I want to be clear, technology I think is important. And we also hear kids talking about finding affinity groups on social media.
Starting point is 00:31:10 And yeah, there's some entrepreneurs in the group who do their business, like a party business or a music business, through social media. But in terms of whether it's really preparing us for career paths, I think that the way they do it in China, where they have two hours of social media and a lot of homework, would probably prepare our kids better. This is such a race reversal right now. Yeah, sure, applaud.
Starting point is 00:31:36 Applaud this white woman telling me to be more Chinese. I don't know what you're saying. What are you saying? I'm not saying we should do it like it's done in China, in the sense that there's also a lot of censorship. But what I am saying is we need some guidance
Starting point is 00:31:57 from the adult world. We can't just have kids scrolling interminably, eight hours, nine hours, 10 hours a day. And what we see in the show, and I don't want to be like the adults saying this is what we should do. I did this experiment so we could really hear from the kids.
Starting point is 00:32:15 And there are a lot of experts. That came across, by the way. That came across. There are a lot of experts in this, but I think this is the first time we really hear from the kids, their point of view. And by the end of it, they say, we want to connect without devices. They say, wouldn't it be great if we could just have conversations like this in the real world? And for somebody from my generation,
Starting point is 00:32:37 I'm thinking like, wow, that's incredible that just having a conversation with your peers seems out of reach. But that kind of empathetic conversation that they have in the show, that does seem out of reach for them. And I think we need to work on that and create those spaces. We're actually doing a museum exhibition that's gonna open in Germany next fall, and we're trying to create some spaces to have these dialogues with young people,
Starting point is 00:33:04 because even the discussion groups that we do in the show, that really came from them. I started it just for research. I did not expect it to be in the docu-series, but I saw how happy they were to have other kids to talk about things with. It was the first time they saw they weren't alone. Empathy and dialogue,
Starting point is 00:33:23 that's never gonna work for these kids. All right, well hey listen, your documentary was really great. I really encourage everyone to watch it. Thank you for making it. Thank you for speaking to the kids with an open heart and seeing what they had to say and teaching all of us what they had to say. I hope all the kids are okay, but either way I'll be okay.
Starting point is 00:33:38 All episodes of FX's Social Studies are streaming now on Hulu. Lauren Greenfield, everybody, come on. We're gonna take a quick break. We'll be right back after this. That's the community. Thank you, man.
Starting point is 00:33:50 Thank you. Hey, that's our show for tonight. Now here it is, your Moment of Zen. I hope we can bring down the high rates of people overstaying visas and also make progress on the safe third country agreements. We're going to keep monitoring the president's remarks with his meeting there with the leaders of five different African countries and sort of dip in as the news warrants.
Starting point is 00:34:18 So for now, we're going to move on to this topic last but not least. Social media users have a new theory that pearl earrings unintentionally intimidate men. Explore more shows from the Daily Show Podcast universe by searching The Daily Show, wherever you get your podcasts. Watch The Daily Show weeknights at 11, 10 Central on Comedy Central and stream full episodes anytime on Paramount+.
