On with Kara Swisher - A Silicon Valley Satire That Feels Uncomfortably Close to Reality

Episode Date: March 19, 2026

"Raising money off frothy numbers to sugarcoat the rotten apple is what built this town.” That’s according to Duncan Park, the protagonist of "The Audacity," a new dark comedy from AMC about tech ...CEOs and their relationships with their families, therapists and enablers. Live from SXSW, Kara spoke with executive producer and showrunner Jonathan Glatzer ("Succession," "Bad Sisters," "Better Call Saul"), star Billy Magnussen, and executive producer Gina Mingacci ("Killing Eve," "Orphan Black"). The series follows Duncan, an ambitious, arrogant data-mining CEO determined to join Silicon Valley’s billionaire elite, and holds up a mirror to the greed and ego shaping modern tech culture. They discuss how tech’s obsession with disruption and data has come at the cost of privacy and real human connection. While "The Audacity" is a sharp satire, it also finds surprising humanity in even the most unlikable tech bros — and even made Kara feel empathy for them, despite her best intentions. Questions? Comments? Email us at on@voxmedia.com or find us on YouTube, Instagram, TikTok, Threads, and Bluesky @onwithkaraswisher. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 I never run into a banker that goes, what we're really here to do, Kara, is community through money. Hi, everyone, from New York Magazine and the Vox Media Podcast Network. This is On with Kara Swisher, and I'm Kara Swisher. My guests today are the executive producers and the star of The Audacity. It's a dark comedy series from AMC that explores the lives of tech CEOs and their relationships with their families, therapists, and enablers. The show follows Duncan Park, the ambitious, arrogant and often delusional CEO of a data mining company in Silicon Valley.
Starting point is 00:00:46 He's a wannabe tech titan who's set on joining the billionaire ranks of the Valley's powerful elite. And like real-life counterparts, his ego, greed, and insecurities help push him towards his goal and also get in the way. Emmy Award winner Jonathan Glatzer is The Audacity's executive producer and showrunner. He's also written and produced for Succession, Bad Sisters, and Better Call Saul. Billy Magnussen stars in the series as Duncan Park, and the show is executive produced by Gina Mingacci, whose credits include Killing Eve and Orphan Black. I interviewed Jonathan, Billy, and Gina live last Sunday on the Vox Media Podcast stage at South by Southwest in Austin, Texas. I was excited to talk to them because this is a really good series. It's sort of the next version
Starting point is 00:01:33 of Silicon Valley, which I was an advisor to, but it's a lot darker and a lot more realistic about the mania of these people and the damage they do. In this case, instead of adorable people, the principals in this series are more villainous, and I think that's appropriate for the time. And a quick note before we get into it, I have a new series from CNN called Kara Swisher Wants to Live Forever, premiering April 11th. We watched the trailer ahead of this conversation at South by Southwest, and you'll hear me make a few references to it during the interview. It was a fun conversation, so stick around. Support for this show comes from Odoo.
Starting point is 00:02:25 Running a business takes everything you've got, and a lot of the tools out there that are supposed to make your life easier just aren't great at talking to each other, and that means you end up having to toggle between a dozen different apps and services just to keep the lights on. Enough of that. Now there's Odoo, the all-in-one fully integrated platform that actually might help you get it all done.
Starting point is 00:02:44 Thousands of businesses have made the switch, so why not you? Try Odoo for free at odoo.com. That's O-D-O-O dot com. When WestJet first took flight in 1996, the vibes were a bit different. People thought denim on denim was peak fashion, in-line skates were everywhere, and two out of three women rocked the Rachel.
Starting point is 00:03:07 While those things stayed in the 90s, one thing that hasn't is that fuzzy feeling you get when WestJet welcomes you on board. Here's to WestJetting since 96. Travel back in time with us and actually travel with us at westjet.com slash 30 years. Thank you for joining me on the Vox Media Stage
Starting point is 00:03:28 at South by Southwest for a live taping of On with Kara Swisher. Okay. So The Audacity is full of characters who are greedy, narcissistic, nihilistic, basically assholes, essentially. But it's only coming 12 years after HBO's Silicon Valley, which presented a much more optimistic, value-driven focus on sort of quirky characters. And it did nail them in a satirical way, but it was sweet. And at the end, the characters want to do the right thing,
Starting point is 00:04:04 which never happens in tech. But it also mirrors a larger shift in how people perceive Silicon Valley. So talk about your own perceptual changes in tech, for each of you. Let's start with Gina, then Billy, then Jonathan. We kind of try to, and we've said this from, like, moment one with the show, when Jonathan only had one script, which is, it really is only about these characters, Kara, like, you know, and we really focused on families and how, you know, these parents do what they do.
Starting point is 00:04:40 The kids really just want to be loved, right, by their parents. So we're kind of always trying to show, honestly, in every episode, if everybody sticks with it, good and evil in one scene, right? So we really, we want to show both sides of that, right? There's hope, there's dreams, I'm going to try and make this thing, that it might fuck up your kids. We hold both of those in each scene because that's real, and we're going to keep doing that. So why set it in tech? Could have been Hollywood, right?
Starting point is 00:05:10 Parents making stuff that may or may not be great for you, having their kids, you know, be affected by what you're making and doing and the values and the morals could have very easily been Hollywood or, you know. So I think tech provides, I mean, it's what we're all talking about right now. You can't escape it. Jonathan and I said literally from the beginning, we're like, we want our parents to see this show. Like, we want people that wouldn't have watched other shows about tech to watch this show. Because it really is our world right now. And it's a snapshot of this is the world we're living in. It's the conversation we're all having.
Starting point is 00:05:49 It's the conversation a lot of us don't want to have even with our own families, right? Like too much social media. Get off your phone. You know, we can't escape it. So it felt rich. I mean, one of our characters, even Billy's character tells us therapist, you could have come anywhere. You came to Silicon Valley. Right.
Starting point is 00:06:06 Right? This is where money, power, largest concentration of the richest people in the world, you know. Yeah. So it just felt like fertile ground to find stories about how we have a responsibility. to the next generation, to not fuck up the world so that it's, we can't put it back together. Right. Which right now we feel like we might be on the verge of not being able to put it back together. In fact, we are.
Starting point is 00:06:35 Go ahead, Bill. With change in tech, well, I think it's a double-edged sword more than anything. I, again, the original or genesis of the idea was to help people. It was exciting to, I was on the cusp of after college, finally tech and social media and these phone devices. I think I got like one of the iPhones, smart phones when I was like 27. And I remember, you know, before when you were young, you could be like, hey, I want to meet my buddy on the corner here.
Starting point is 00:07:03 And you would have to meet them at 8 o'clock that night. But now with tech, you could go like, hey, we can meet anywhere, change everything. But also now they could pull out of meeting you at 8 o'clock. They could, like, get away from you now. Instead of, like, making it better, there's this double edge sort of, sorry, I'm so nervous as shit.
Starting point is 00:07:24 I'm saying it's these things you have this opportunity to connect us even more, but at the same time, it distances us. Absolutely. What was your perception? Because your character is pretty fucking loathsome at all times, at all times. Like one of the things I was saying to a friend of mine
Starting point is 00:07:40 was I can't stand him and there's pathos to him. And so when I was watching it, I thought, I don't want to feel sorry for these fuckers. I don't, and I did. I felt sorry for you. Well, I think it's because we're all humans and the humanity of these characters. Again, his intention to go through the valley
Starting point is 00:07:59 and punch a hole in the sky was genuine. It was something to make the world a better place. And then with power, with money, with corruption, you know, these people are still innocent deep down, they're children. Well, that's certainly true. But go ahead. Well, I think that for me, the change is, first of all, any comparison to the show Silicon Valley is an incredibly flattering compliment.
Starting point is 00:08:26 Not that you did that exactly in terms of which one you liked better, but we're going to work on that. I think it's in the title, The Audacity. It's an attribute that is a double-edged sword. It's something that has fascinated me for a long time in terms of it's like the superpower that we all have but most of us don't use because it means crashing through the gates and, you know, moving fast and breaking things. And which makes you a bull in a china shop, which used to be not a good thing, right? Right. It never was, but go ahead. It never was. It still isn't. But it is now regarded. I think the culture in Silicon Valley changed to make breaking things, disrupting things, a positive. and the effects are very clear in people's daily lives today.
Starting point is 00:09:24 And I think that, you know, that's a change that seemed inevitable to a lot of people. I've been paying attention to you for many years, and you've been calling this out. This is where we're headed, people. But I think that there is a madness that needs to be stopped, and I don't know how to do it. And I don't think satire is going to do it. The only weapon that I have is to sort of hold up a mirror to them and to ourselves. What they're doing. One of the interesting parts was, I'm not supposed to talk about the thing,
Starting point is 00:09:56 but there's a character who seems to be based on me, because it's the Code Conference. Not possible, by the way. And they're not possible. No, there's only one carousel. Let's be honest. Red chairs, Code Conference. But one of the things, his character, he says explicitly what has been implied for years. And I think one of the things we're
Starting point is 00:10:17 going through right now is these tech moguls are saying explicitly, like Sam Altman this week saying, we're going to take all the intelligence that you have and we're going to steal it and sell it back to you. That's right. Yeah. Like, we're just going to do that, or we're going to take this information, and you
Starting point is 00:10:32 articulate that. So Billy, again, you play this sort of loathsome tech entrepreneur in Duncan Park, and you've got that energy, that sort of frenetic, I-can't-be-irrelevant energy.
Starting point is 00:10:53 And there's a lot of lines that I just love, where you said belief is affirmed by fuckability, which is, by the way, beautiful writing. Thank you. That's a very insecure character. Yes, exactly. Another one is, empathetic is just pathetic with a prefix, or something. Jonathan Glatzer, everyone. When he wrote that, he was so proud of himself. He was. It's because he gets
Starting point is 00:11:15 diagnosed as possibly empathetic, which is a problem for him. And then my favorite one is, raising money on frothy numbers to sugarcoat the rotten apple is what built this town. Exactly right. Hello, AI. But you think your stock's about to crash, which is terrifying for you because your stock price and your dick are, quote, one and the same. Talk about him under the surface. How do you relate to a character that's deeply flawed and insecure? I've always said this with any character I've created, good or bad or whatever, that they're always, they're the hero in their own story, and he thinks he's championing and creating the best idea for the world around him. I love the hypocrisy of the show, as someone that's trying to make a perfect product for people or make the world perfect for people while at home he's a trash fire. Emotionally, personally, doesn't know how to take care of things.
Starting point is 00:12:11 And I think building the character around the insecure child he is deep down and the love he wanted from his father and the world. He just wants affection and attention. It could drive people to do crazy, crazy things just for acknowledgement. So people are onto him. They're like, you're in it for yourself, constantly through the show. And he seems not self-aware in any way. Not self-aware, but also fighting it. He's trying to put the mask on and create
Starting point is 00:12:41 the illusion of the world he believes he should be living in. And I think that's a fascinating character to dive into. So, Gina, you said the show looks at Silicon Valley through a small town lens, which it is. It is a high school. It's a high school cafeteria in a lot of ways. So Tiny Town, Little Bubble is how you put it. Talk about that approach where people don't know much about the Valley.
Starting point is 00:13:01 It's probably not, because now they're global. These people are global and famous. And to have met them back then, when Jeff Bezos, I met him when he didn't have any money. I met Sergey and Larry in the garage. Although I have to say, jerky people were jerky then, and they're just jerkier now. Talk a little bit about the tiny lens, small bubble kind of thing. Well, we took trips there, right? And we went and, you know, we went to Buck's, the diner, and we saw all the things hanging on the wall. This is the booth where this was.
Starting point is 00:13:33 Yeah. They love that. They love that. And, you know, it's not dissimilar to Hollywood, right? So, you know, where we like to, like, hold on to the tradition of the place, right? The classical tradition. For us, it's like, I mean, even when we screened this in San Francisco for a tech crowd a couple weeks ago, everybody was coming up to us afterwards going, that's where I get my coffee. That's where I, you know. And it's, I feel like part of what, you know, we're trying to hold on to is the idea that everybody's up in each other's business.
Starting point is 00:14:08 Everybody knows what's going on. You can't hide, right? What's dissimilar about, let's say, Hollywood and this world is, you know, it's good to have a failed company and get on to your next one, right? Like, we hide from those failures in Hollywood, right? Here, it's like how many empty, like, we were driving around doing location scouting, and it was just gigantic empty buildings everywhere, you know? So it just feels like everyone knows your business.
Starting point is 00:14:46 Well, that it's always a scheme. It's always a Ponzi scheme, essentially. But I do want to say something going back to the Billy thing, because I think this is really important to understanding, like, you know, kind of one of the themes of the show we're hoping lands. Later in the season, Billy goes to see his ex-CFO from his first company, Fafa. It's a great thing, yeah.
Starting point is 00:15:08 And Randall Park plays this character. And he's about to maybe screw Duncan over, we don't know. And Duncan says to him, like, we did this. We built this company together. Don't screw me over. And he's like, you thought it was a movement and it was a startup, you know? And part of one of the themes is keeping that alive. It's like part of what Duncan is
Starting point is 00:15:34 maybe struggling with a little bit is, what did bring me here? Why am I in the valley? Did I come here to create something? Am I the man or am I always going to be the con man? What was my? Well, it's the idea that it's supposed to be a greater-than kind of thing. I never run into a banker that goes, what we're really here to do, Kara, is community through money. Actually, one of my first stories for the Wall Street Journal was all the bullshit when I got there. I had just gotten there and I was a person and someone said, cover it like you're a foreign correspondent. Like, go in and just listen. And it was all about like, we're here for the community.
Starting point is 00:16:11 We're here together to make decisions. And I was like, well, why do you have controlling shares then if it's for all of us? Or we're just simple people. And I'm like, well, we're just wearing hoodies and comfortable clothes. I'm like, that's a $600 hoodie, cashmere hoodie. I'm sorry. It just doesn't gel kind of thing. And so the bullshit that went on from the beginning.
Starting point is 00:16:32 I think was trying to, and they're still doing it. We're just simple people. We just happen to have a yacht, a plane, chiefs of staff, cooks, et cetera, et cetera. I think the cult-ish aspect of that, though, is starting to come through. You managed to land that. So you were a writer and producer on Succession. We've talked about this. When you first got approached to write this series, your reaction was, you don't know anything about tech. You don't know shit about tech. But when you went to Northern California, as you said, and spent some time, what were your observations? Well, it was very monoculture. I'm embarrassed to say I was looking for people of color. Yes, good luck with that. And it was difficult. Try to find a woman. Go ahead. That too. That too. It really is, it is, because I think that
Starting point is 00:17:17 and what they do with everything is they say, well, we put it through an algorithm and this is clearly the way to go. And it's clearly the way to go because culture and society has sort of fed the algorithm, and why so many algorithms come out somewhat racist and misogynistic is because of that. But in terms of what I saw up there was this kind of grinding ambition that masked this utopian vision
Starting point is 00:17:47 and a real kind of, like, why are these people, who have such difficulty communicating about themselves, about their work, and being honest about what they're up to, and who put all of these aphorisms ahead of the truth, why are they in charge of how the rest of us are going to talk to each other? And I heard a clap. You know, and it's not just how we talk to each other. It's every fucking thing.
Starting point is 00:18:17 It's how we meet each other, how we eat, how we shop, how we masturbate. And they are paying attention to all of those things and marking how we do each of these things in a way that is genuinely, it's terrifying the more you know about it. But it's also, I think, much more terrifying that we don't know about it, that we don't know how they do it.
Starting point is 00:18:43 When Google first put out its version of a social network, which failed miserably, several of them did, they were like, we don't know why we can't get it. I'm like, because you are the least social people I've ever fucking met in my life. It's like, why are you telling us how to socialize, which was interesting? But it's also, some of the stuff that, say, Facebook did was addictive at the same time. Yeah.
Starting point is 00:19:02 Look, it's like, you know, dermatologists who had acne as kids. They're trying to fix their own problems in a way through us. And it's really, you know, a fix-thyself position. Yeah, a good point. I was very struck when we went, we were shooting in Palo Alto. I was very struck one night, like, we went out to, like, get ice cream or whatever. And it was a long line of couples, like local couples. And like the guys didn't know, like they didn't know how to talk to girls.
Starting point is 00:19:30 They didn't know how to, like, I was just watching this long line of people that didn't know how to actually communicate. And now they're creating a generation of other guys that don't know how to talk to girls either, because all they know how to talk to is their phones, not each other. We'll be back in a minute. Support for this show comes from Odoo. Running a business takes everything you've got, and a lot of the tools out there that are supposed to make your life easier just aren't great at talking to each other, and that means you end up having to toggle between a dozen different apps and services just to keep the lights on. Enough of that. Now there's Odoo, the all-in-one fully integrated
Starting point is 00:20:20 platform that actually might help you get it all done. Thousands of businesses have made the switch, so why not you? Try Odoo for free at odoo.com. That's O-D-O-O dot com. Support for On with Kara Swisher comes from Gruns. If you're looking for a health goal that you can actually stick to, you might want to check out Gruns. Gruns is a simple daily habit that delivers real benefits with minimal effort. It's a convenient, comprehensive formula packed into a snack pack of gummies a day. This isn't a multivitamin, a greens gummy or a prebiotic. It's all of those things and then some at a fraction of the price. And bonus, it tastes great.
Starting point is 00:21:02 Gruns ingredients are backed by over 35,000 research publications. While generic multivitamins contain only seven to nine vitamins, Gruns has more than 20 vitamins and minerals and 60 ingredients, which include nutrient-dense whole foods. That includes six grams of prebiotic fiber, which is three times the amount of dietary fiber compared to the leading greens powders and more than two cups of broccoli. It's a daily snack pack because you can't fit the amount of nutrients Gruns does into just one gummy. Plus, that makes it a fun treat to look forward to every day. Kick off the new year right and save up to 52% off with the code Kara at gruns.co.
Starting point is 00:21:40 That's code KARA at G-R-U-N-S dot C-O. Where are my gloves? Come on, heat. Winter is hard, but your groceries don't have to be. This winter, stay warm. Tap the banner to order your groceries online at walla.com. Enjoy in-store prices without leaving your home. You'll find the same regular prices online as in-store. Many promotions are available both in-store
Starting point is 00:22:18 and online, though some may vary. Billy, your co-star, of course, is trying to figure out how to get that way. Your co-star, Sarah Goldberg, plays a therapist to tech CEOs. It's a little like The Sopranos, right, that relationship, Dr. Melfi and Tony Soprano. Her name is Dr. Joan Felder. Their fates become intertwined after Duncan blackmails her. Without giving anything away, talk about the power dynamic between those tech billionaires on the show
Starting point is 00:22:48 and the professional class. I thought you nailed that, like, that's the one thing, you always sort of focus in on the billionaires, but there's a whole series of enablers. Yeah. You know, and it's, they're really interesting because they're, they're barely seething. You could feel that from them. But I remember being drawn into one, among the Google founders, of the chiefs of staff of various people fighting with each other. But the chiefs of staff were the ones fighting with each other on behalf of the billionaires.
Starting point is 00:23:15 And literally they started dragging me and I'm like, you need to get away from me. I don't want to hear about your stupid chief of staff beefs, essentially. But talk a little bit about this relationship, because that dynamic is well done. First of all, I just would love to say, like, Sarah Goldberg is a fantastic actor, and every day working with her has been an absolute treat. I'm sad she's not here on this trip, but, God, she is fantastic. She's from the second that. Yeah, she's wonderful to work with.
Starting point is 00:23:41 You know, for me, I think the people in your world, if you pay them enough, they're going to tell you everything you want to hear. That's correct. That's one thing. But then it's funny that for some reason, Silicon Valley has made the importance of these billionaires the most, the top peak of what people have to pay attention to.
Starting point is 00:24:01 And it's just made up rules. Everyone in that valley are making shit up. And so for some reason, that's where the money is. The people around them are also saying like, oh, this has to be important because they're making money doing it. Right.
Starting point is 00:24:12 And so that is a perpetual cycle of just bullshit on top of bullshit on top of bullshit. You do a lot of, like, drinking different elixirs and things like that, and then throwing them at the assistants who keep bringing you things, whatever you happen to ask for, which is really very funny. Of course. Of course, you'd make an excellent billionaire. Yeah, thanks. I wish. Come to know. Yeah, came very easily to him. Is there anyone you were focused on that you see publicly? Um, yeah, just look at the Valley. Again, as an artist, as an actor, like, you like to, I like to, I like to see the things I love about people and I love-hate about people. And I make my own
Starting point is 00:24:52 melange. That's like the gift of acting, I think, is having the freedom to create these characters and build all the kinks and weird things and the unique qualities that they have. So yes, of course it's based off people out there. And then it's a melange of the ideas I incorporate myself. So there's also a Zach Galifianakis character, Carl Bardolph. He's fantastic, by the way. He's a tech pioneer, a former idealist who made much money off of spam, ultimately, which just hurts because he did. And everyone knows it, but he tries, and he's filthy rich, he's still miserable because he thinks he's underappreciated. Carl tells Joan in therapy, where's our parade? All I see is pitchforks and ingratitude, which is a great
Starting point is 00:25:35 line, and just angry about it. He's like a really mean Steve Wozniak. I don't know who else, and Steve Wozniak is not mean, by the way. What does he represent to you? Because there is a class of early Valley that is so wealthy that they've loved. And he's lost all perspective of everything. And at the same time, one of the things that's always, I try to communicate, is the victimization they feel is so profound. I've had so many people say, you're so mean to me. And I was like, you need to get a dog or you need to speak to your parents about hugging, but it's not my problem. And so talk a little bit about that, the victimization. Zach's character, Carl Bardolph, is, he's actually the only billionaire in the cast. Billy's
Starting point is 00:26:18 character is, wants to be. And I think that part of that desperation to get to that class is what drives our Billy's character and so many of the others. But, but Carl landed there with spam. And he's embarrassed about that. He doesn't like being reminded. Yeah, it's the, he's also about legacy, which is also very important to these guys. Carl is stuck between being aware of the bullshit and also wanting to be a part of it. Yeah, he needs relevance. He needs relevance. He's desperate for relevance. He does, but at the same time, he knows that that's an absurd notion. And so that sort of, that split, that splinter in the brain creates in Carl, and really played brilliantly by Zach, this kind of, you know, this need to fill the hole. Yeah. And he thinks that maybe Duncan could
Starting point is 00:27:17 be a unicorn. He's very attracted to Duncan. He's very attracted to Duncan. He wants to kill him essentially at the same time. But he's never met anyone as relentless as him. It doesn't matter how many times you knock him down, he keeps popping back up like an inflatable clown that you punch. He just keeps coming back up at you. And actual punches, actually. And he does actually, yes, there's some punching involved as well. But yeah, he's somebody who understands that back in the day, we all talked about, when we were making the sausage, making the world a better place. And we all, all the billionaires kind of come to a fork in the road and most of us go Dr. Evil. Right. And he does, he's not comfortable doing that. He's not ready to do
Starting point is 00:27:59 that. But he also really wants to be in the club. Of relevance. Of relevance. The next thing. As we're writing season two right now, which we were just greenlit for, so we're very excited about that. Thank you. We find out the backstory of, like, why he doesn't feel like he's in the, what the incident was that caused him to not feel like he was, you know, part of the gang, the, what we call God Mountain. Yeah. The figures of God Mountain. Yeah. And, uh, it's very, very small and petty. It is. It's amazing, and still angry. I was, I'm in, I'm in Washington now and they've followed me there. Um, because they can't get enough of me. And I ran into one of them, one of these people, well-known name. And they came up to me and
Starting point is 00:28:46 they're in with Trump now, right? And they go, yeah, it looks like we've won. And I thought in my head, but you're still an asshole. And they crumble. I still had them. I'm like, what do you think of that, like, that change of that idea? Like, there was a time when Silicon Valley started that they kept the separation from the government. They were like, oh. Because they figured out you can buy things.
Starting point is 00:29:03 I think they had a real disdain, especially, like, Bill Gates was always bragging he didn't have lobbyists. And then they realized, especially as they moved into things that were more analog,
Starting point is 00:29:19 like cars and things like that, that, oh, wait a minute, we disdain them, but we can buy them. And Trump is the ultimate coin-operated president, as we know. So...
Starting point is 00:29:29 In the Michael Lewis... Which is quicker. Sam Bankman-Fried. There's a moment where Sam Bankman-Fried is like, oh, this is actually very affordable. Affordable. It's affordable. You can do it with...
Starting point is 00:29:39 And Trump's explicit about it. Like, give me the money, you get a pardon. Give me the money, you get to stand in front at the inauguration. It's very easy. And I think, privately, they insult him constantly.
Starting point is 00:29:50 And just so you know, Kara, I'm like, get the fuck out of here. I'd rather talk to Marjorie Taylor fucking Greene than you. Because at least she believed in him. She actually did. Not any longer. She's still terrible. So just ask her about trans issues and then you'll stop loving her so much. You could have made Duncan the CEO of a food delivery app or social media,
Starting point is 00:30:13 but it's data mining, what's happening here. It's this godlike, omniscient idea, which has been around forever, which they have now. They really do. And I think you articulate that really well. You're making a point about privacy, obviously, or total lack of it. And, you know, that started back with Scott McNealy's, there's no privacy, get used to it, kind of thing. Talk about what that means, why that's such an important thing, because that runs throughout this show, this constant surveillance.
Starting point is 00:30:43 And I talk a lot about surveillance culture. It's obviously very key in China. Here it's more for selling you shit. But eventually, when you have an administration that feels treacherous, it could be problematic. It feels really problematic. Talk a little bit about that. So yesterday, on the panel that you did with us,
Starting point is 00:30:59 Rob Corddry, one of the actors in our show, told you about something that was very real that happened with all of us, and that was, you know, one of the characters in a later episode, who works at Cupertino — she's telling her room full of employees, who are all young, in their 20s, you know, not to accept cookies, right? Yeah.
Starting point is 00:31:22 And they all go, no, I don't really know that. Right? And we all kind of looked at each other, like, we didn't know that either. Right. Yeah. And I'm not afraid to say it. So that was a big conversation with all of us. Like, oh, shit.
Starting point is 00:31:39 So really, I personally, in this, I'm coming at it through that lens — like, do my parents know that? Every time my mom gets online to shop, does she know what she's giving away? And the truth is, no, people don't know what they're doing. So that's a big lens. Also, again, to bring up the kids again — and we get into this a little bit in season two, which there's a writers' room up for now — I didn't grow up with social media, and I'm really glad I didn't, you know?
Starting point is 00:32:11 Like, imagine if every single thing you did could be broadcast to the world, right? It's just a different time we're living in now. And there needs to be a different level of protection that we are not offering right now. So we kind of just want to ask, from a very sort of on-the-ground place, do you know what you're signing up for every time you click that button? And the answer is no. So that's one of the things that's important to us. Yeah. So early in the first episode, we see a billboard for a company
Starting point is 00:32:41 called Spookle. The names are really great. Yours is Hypnosis. Hypernosis. The names are fantastic. Yeah, thanks. You could call a company that, actually. But the tagline is: you agreed to this. Yeah.
Starting point is 00:32:57 Yes. Yeah, the truth is, we agreed to this. That's the honest truth. Yes, we did. So, Billy, when you were preparing for the role, did you lean into that uncomfortable truth — we agreed to this? And you articulate that later,
Starting point is 00:33:08 because on some level we have accepted the terms and conditions. Because you do lean into it as a character — which I suspect you don't believe, what you were saying. This guy really says, too fucking bad. Too fucking bad, but I think he also has such a need or desire. His want is so grand that nothing can stop him. Any steel door is going to be knocked down, either with a pickaxe or a blowtorch.
Starting point is 00:33:34 He's going to keep fighting through it. But he never thinks it's a bad thing. You really do deliver on that — that he never questions it. I think a lot of the other characters do know what they're doing is vaguely evil. Yes. I think the mindset Duncan has is, like, no, I'm doing this for you. It's what you need. It's the medicine you don't know you need. I'm giving it to you.
Starting point is 00:33:55 Don't worry. I'm going to make your life better and you don't even know it. So that's a dangerous person, in my opinion. Absolutely. He's terrifying, actually. I was just going to say, I mean, I think that at the end of the season — and I don't think I'm giving too much away — Duncan does basically confront all of Silicon Valley at a conference that Kara did not host.
Starting point is 00:34:20 I would never wear those chunky glasses, but go ahead. But there is a sense that they're angry at him because, basically, he's going to market people's personal data, commodify it. And they're angry at him because what's really going to happen is a disruption of their profit center — which is us. And that's the thing. When I first went up to Silicon Valley, one of the things that I asked about was, what should this guy do, what's this company? And I didn't get the answer, but then I said, what about data? And they're like, oh, yeah, that's evergreen. That's not going away.
Starting point is 00:35:00 Right. Especially with AI. Especially with AI. And then I did more research into it. The truth is, how they make their money is off of us — through advertising, how they sell to third parties, all of that stuff. And so if there's a guy that comes along and is going to sort of pull back the curtain and say, this is what we're doing, this is what we're doing, and you guys, you're such hypocrites, you're already doing this — you know women's menstrual cycles,
Starting point is 00:35:30 you know when to sell them sweatpants and when to sell them stilettos. That is what you're already doing. I'm just saying it explicitly. I mean — Sam Altman, as I said, recently said, we see a future where intelligence is a utility like electricity or water, and people buy it from us on a meter. And of course, the intelligence comes from us, which is an astonishing thing. He went on to say intelligence needs to become, quote, too cheap to meter. And of course, he also said that it takes a lot of energy to raise humans, so what's the problem with data centers? Which is unusual coming from someone who just had a baby, but that's another story.
Starting point is 00:36:07 We're not going into that. But when you think about that idea — Walt Mossberg named them rapacious information thieves 15 years ago on stage. He said to Sergey and Larry, you're rapacious information thieves, is what you are.
Starting point is 00:36:24 And just this week, Grammarly was stealing from me. They were selling me to people, without asking me if it was okay, by pulling in all my stuff. Yeah. How do we not agree to this, Gina? We do agree to this for convenience, as you noted.
Starting point is 00:36:42 It's really a trade of ourselves. We are the product, in exchange for the convenience of a cheap, free map or a dating service, or a piece of information that we actually paid for in the first place. Yes. Yeah. You know, going back to Duncan's first company, which was called Fafa — we hear a character go, yeah, I bought my first Subaru on Fafa. There was a moment when it really was that, like, you know, it really was something, you know. Candles and canoes and shit.
Starting point is 00:37:15 Yeah, yeah. We have another character, Anushka, who is the Cupertino executive, say, okay, great, we can get Q-tips delivered overnight, but really, what have we done? What are we doing? We actually go there in one of those scenes, and we have someone saying exactly what you're saying, which is, let's take this moment and know the world we're living in. And it's like, but really, what have we done? Right. What have we done, other than we can get shit delivered in 24 hours, really? You know,
Starting point is 00:37:45 and that's a question. That's really a question. Less than 24. Yeah. And it's also, you know — did we do anything that we set out to accomplish? Are we a more tolerant society? Of course not. Did we do anything positive for our kids? Provably, no. Did we, you know, help the environment? No, these data centers are, you know, using more power than a city. Right. One of the things — I'm going to talk about the larger Hollywood business in a second — but one of the things that's most poignant here are the kids. Yeah. I happen to know a lot of these kids that you're depicting. And I feel for them, you know. A lot of them I actually — some of their parents aren't talking to me anymore, but the kids are, just FYI.
Starting point is 00:38:28 And I went out with one of them, one whose parent took an ugly turn, I would say. And he asked me — 16 years old, and we had dinner, his mom asked me to have dinner with him — and he goes, what do we do? I said, here's what we do. When your dad dies, we're going to take the money and give it to the Sierra Club. That's what we're going to do. We're going to do good things. Very much like the people who ran the gas companies, how the Rockefeller Foundation turned around. I was like, you have an opportunity to shift it into something better, that kind of thing. But the kids, I think, are the key to this entire show. All the kids, and the ways they're hurting. That scene — there's a scene where they're at
Starting point is 00:39:10 dinner, which is devastating, the not-paying-any-attention kind of stuff. Yeah. So that to me is the heart of the show. I agree. Yeah. And it is — you know, one of the main characters is this kid Orson, kind of based on me. I grew up in a house with two psychiatrists, and I could hear them. Yeah. But in our story, he moves from Baltimore into this bubble where everyone is optimizing their brains and their bodies. And he is an awkward pubescent with IBS. And the idea of, like, can we come up with somebody who is so genuinely human — genuine, you know, his corporeal self is betraying him
Starting point is 00:40:08 in this culture that is basically saying, hey, you could take this, you could take that. You know, you can be a better person mentally and physically. And he's seeing it happening all around him. And there really are literal billboards for — I'm sure. No, that's what my whole series is about. None of it works, but you're going to die in the end. Just a plot spoiler for my series, but you're all going to die someday. And thank goodness, too. We'll be back in a minute.
Starting point is 00:40:47 Support for this show comes from Odoo. Running a business takes everything you've got, and a lot of the tools out there that are supposed to make your life easier just aren't great at talking to each other, and that means you end up having to toggle between a dozen different apps and services just to keep the lights on. Enough of that. Now there's Odoo, the all-in-one fully integrated platform that actually might help you get it all done. Thousands of businesses have made the switch, so why not you? Try Odoo at odoo.com. That's O-D-O-O dot com. One plus one equals more of the greatest stories.
Starting point is 00:41:25 Hulu on Disney Plus. Stories about... Survivors. The most dangerous planet. Family. Retribution. Murder. Prophecy.
Starting point is 00:41:33 Beer and propane. The ultimate soldier. Chicago, all right? The best of the best stories, now with even more from Hulu. Amazing. Have it all with Hulu on Disney Plus. You don't need AI agents, which may sound weird coming from ServiceNow, the leader in AI agents.
Starting point is 00:41:57 The truth is, AI agents need you. Sure, they'll process, predict, even get work done autonomously. But they don't dream, read a room, rally a team, and they certainly don't have shower thoughts, pivotal hallway chats, or big ideas.
Starting point is 00:42:10 People do. And people, when given the best AI platform, are freed up to do the fulfilling work they want to do. To see how ServiceNow puts AI to work for people, visit servicenow.com. I want to move to the industry in general, because it's also affecting all of you. All of this that you're depicting here — the rest of us are being pulled around by the nose by this industry.
Starting point is 00:42:36 So I just had Matt Belloni on the podcast, who's a journalist, a very good journalist. He told me the biggest question in Hollywood is, will I lose my job? For a lot of people, the answer is yes. Actors, we'll see. At least in the short term, you're okay, Billy. Thanks, thanks. Not for long. But there's a lot of job losses.
Starting point is 00:42:54 Billy, first, talk about — you've been working as an actor for 20 years. What do you think the next 20 years is going to be like, with AI and the rest? What is top of mind for each of you right now, the mood you have? The mood I have is... sometimes always, rarely never. It's a line from a musical. All I can relate it to is kind of like the music industry, weirdly.
Starting point is 00:43:17 And I think of, like, the original idea of Napster, or all these things — to have access to all these beautiful programs, or songs, I would say. And then slowly, over time, you realize the artists that were creating this music couldn't make it anymore, because the money never made it to them. They couldn't sell an album. And I think it's happening in our industry, where you used to be able to do DVD sales or have part of the VHS sales, and you could support a life as an actor. Because of this technology — and, like, I applaud AMC for not having a tech company
Starting point is 00:43:54 supporting them. They're doing it against all this shit. Today. Today, today. Yeah, but I hope they keep strong. Yeah, they're just stealing all the pot and not giving the resources to the people who are actually creating these pieces of work, these pieces of art. And that scares me. What do you think about the tech companies buying — which I wrote about many years ago, this is the next thing — their health care and this? Well, if they're going to organize it properly — we already know, like, AI has the ability, if I have a bottle in a scene and they're going to show this movie in, like, Thailand, to put whatever product in Thailand is the best-selling product in my hand at the same time.
Starting point is 00:44:37 Shouldn't that performer, or the people who created that show, get a cut of that profit? Yes. Yeah, they should. You won't. I don't know who's fighting for it. I want to, but yeah, that shit scares me. What if it's Hitler beer? Jesus.
Starting point is 00:44:51 You don't even know. Say in it too, I think. That's just on Grok, and they're not doing very well, but go ahead. Well, I came up in '90s independent film in New York, in the Miramax world, which is a whole other podcast. And at that time, we were working with, you know, lots of filmmakers who were auteurs, and we shifted from film to digital in that time. And that was a huge conversation, right? What does this mean?
Starting point is 00:45:25 This is not real filmmaking. This is not — we're going to die from this, right? And, you know, I'm a creative producer. I view my job as a producer as bringing other people's stories — helping them bring their stories — to life. So I kind of saw at that time that, as with any innovation, especially in the creative industry,
Starting point is 00:45:51 there is a reluctance to change, right? Like, this is how film is made. We need to carry cans around, we need to, you know, have it in our hands. Gradually, everybody accepted it, and they accepted it because some of the old guard, like Scorsese and Coppola,
Starting point is 00:46:09 started to say, no, this is great. This is a new democracy for filmmaking. Now everybody can do it. It's not just this specialized thing for, you know, a certain group of people. Now anyone can be a filmmaker. So we found a silver lining then, which was — shooting a film on my phone, I can be a filmmaker. I can be a kid making a film in my backyard anywhere in the world, right?
Starting point is 00:46:35 I don't know what that silver lining is yet for the phase we're about to be in. I believe we will find one, because we always have, as creative people. We've always found a way to hold onto the stuff that's most important, to keep telling the stories, to keep putting it out there,
Starting point is 00:46:52 but we have to do that, and we do it together. We find it together. Jonathan? I mean, are you — look, tech companies are going to own these things. We don't know how AI is going to transform
Starting point is 00:47:03 how — you've seen what's happened. I don't know if you've watched the DOGE staffers. It's really disturbing to watch them. They're these kids who used ChatGPT to cancel over $100 million in humanities grants without any reason, just because
Starting point is 00:47:19 they were doing searches and replacements, essentially. Yeah. What is that — how do you think of it as a producer? I would be worried if I were you. Yeah, sure. I'm warning you. No, first of all, I feel incredibly fortunate to be able to be making this show
Starting point is 00:47:35 and to be working. A lot of my friends are in trouble. The amount of shows being made is reduced, just because attention spans are going elsewhere. Maybe that's the natural flow of economy and art, you know, the format being adjusted, as it always has over time. I do have some faith that storytelling is not going to go away. We've been doing it for thousands of years. I don't expect that we're going to stop. I think that the desire to see a story play out for more than 60 seconds is also not going to go away, although it will be diminished. It
Starting point is 00:48:18 has already been diminished. Attention spans. I mean, one of the most frightening articles that I read recently was how film schools are having a difficult time getting their film students — who are paying to go there — to watch a full feature film. They just can't, you know. With books, same thing. With books. Oh, with books, for sure. That's been going on since I was in school, anyway. But my refuge was the movies. And, you know, two, three hours in a movie was a fantastic place and way to spend my time. And that is a difficult thing to get people to do. Especially when you can minimize it and do it quickly, as you say, point people to it. Although I don't think AI could write lines like "cheaters never lose and losers never cheat."
Starting point is 00:49:07 And information is not insight. I thought that was one of the best lines. Thank you, yeah. And going back to succession, I don't think any AI could write, you are not serious people and have it mean so much. I think right now AI can do a procedural pretty well. Yeah. Especially something that's been around, you know, 20 seasons. You know, 20 seasons of Law and Order, I'm sure that that can be replicated.
Starting point is 00:49:34 I'd like to think, as a viewer and as a fan of storytelling, that most people, at some point — I'm sure there's going to be a fad, I'm sure there's going to be something that comes out there, the way it is now in music, where there are actual AI celebrity singers and songwriters, if you can call it a writer — there's going to be something that comes out that is going to have that AI newness. But at some point, when you're being asked to emotionally invest in characters and story, do you really want it to be written by a fucking computer?
Starting point is 00:50:10 I don't. I don't know. I don't. I think that we have to acknowledge — and this is, I think, in many ways, the real thesis of this show — that we are losing touch with our humanity, and that no matter how good they get, they're only going to get good because they are absorbing our experiences and then spitting them back, as Mr. Altman said.
Starting point is 00:50:40 It does get better. In this series, I create a bot of myself, a 3D bot, and it gets better and better. I hate to say it. But at first, it's creepy, and then it's like I had a real conversation, just like you do in the show. And I was like, wow. And they're like, what do you think about me? I said, I'm trying to figure out a way to kill you.
Starting point is 00:50:59 Wow. But that's your instinct. And I think that that's the instinct — perhaps it doesn't come out as bluntly in others. But I did kill it. You did? That's exciting. That's exciting. I made it.
Starting point is 00:51:13 I am the creator and the destroyer. So, last question from me. I'd like to know if you think this series is hopeful or dystopian. What word would you use to describe it — Billy first, then Gina, then Jonathan? Oh, man. What word would I describe it with? I can't encapsulate everything that's in Jonathan Glatzer's mind, and the talent that has been putting this show together, in one word, because, like human life, you can't. How do you feel doing it?
Starting point is 00:51:47 For me — I love it. Every day, waking up and showing up to set was a gift. The people I work with, the language that I get to speak — it was always a gift. It was a feat. And I feel the responsibility, as an artist: you are supposed to put the mirror up to reality. And, like, I'm not saying we're good or bad or whatever, but this is what's out there,
Starting point is 00:52:13 and this is how we see it. And it's your choice, as an observer or someone who digests it, to interpret it your own way. That's the point of art. It's like — I have a feeling, you have a response. If we can connect on an emotional level out of it, that's the magic of art. And I don't think an AI model can paint a picture that will touch my soul, like someone who could just draw a line like fucking Picasso, you know. It means something. Hopeful. Hopeful. The kids in our show are hopeful. They're redeemable. And that's, that's
Starting point is 00:52:51 hopeful. By the end of the series, they're not going to be their parents, I don't think. That's absolutely true today, I have to say. I also struggle to come up with the word, and hopeful is not it, I'm sorry to say. That's why we're great together. I know, I know. It's true. It's true. You know, hope to me is — I think I said to you yesterday, you know, the old line that satire is the protest of the weak.
Starting point is 00:53:18 And it is. I am weak. And I don't have the ability to change society. But if we can just remind people of what is unique about our species and how much of that is being diminished by this tech, and remind them that they're going to die — as you say — you know, the brackets of life that we have: for, you know,
Starting point is 00:53:54 millions of years, we understood, fairly soon after we're born, that we're going to die. That's what gives life meaning. And it is something that I would love for them to understand — that their own fallibility is a gift. Which was, interestingly, the message from Steve Jobs when he did that speech before he died, about that. We talked about death as the great — it's actually a big theme in this, how it shifted, like, as you say, the fork in the road,
Starting point is 00:54:20 which is important to understand. It's everything. It is. And I have a great allergy to certainty. I think anybody who claims absolute knowledge of any particular topic is, you know, about to step in a bucket of shit. Yeah. And they just don't know it. And you often do.
Starting point is 00:54:45 And he often does. Your character, not you. Don't get — I don't know what you're doing. You're really — Myself as well. That's fine. Yeah. Anyway, this is a wonderful show. I truly — I watched it all last night. It was wonderful. And I have to say, I don't want to spend another minute with these people —
Starting point is 00:55:01 not these people, but the people in tech. And I really felt for them, and I'm furious that you made me feel for them in any empathetic way, because it's just pathetic with a prefix. Anyway, I really appreciate it. Thank you so much for joining me: Jonathan Glatzer, Gina Mingacci, and Billy Magnussen. Thank you so much. The Audacity premieres on April 12th on AMC and AMC Plus.
Starting point is 00:55:27 Thank you. Thank you, Kara. Thank you. Today's show was produced by Christian Castro-Wissel, Aloy, Catherine Millsop, Megan Bernie, and Kalyn Lynch. Nishat Kurwa is Vox Media's executive producer of podcasts. Special thanks to Catherine Barner. Our engineers are Rick Kwan and Fernando Arruda, and our theme music is by Trackademicks.
Starting point is 00:55:54 If you're already following the show, we're glad you agreed to this. If not, all we see are pitchforks and ingratitude. Go wherever you listen to podcasts, search for On with Kara Swisher, and hit follow. Thanks for listening to On with Kara Swisher from Podium Media, New York Magazine, the Vox Media Podcast Network, and us. We'll be back on Monday with more.
Starting point is 00:56:43 Getting ready for a game means being ready for anything, like packing a spare stick. I like to be prepared. That's why I remember 988, Canada's Suicide Crisis Helpline. It's good to know, just in case.
Starting point is 00:57:11 Anyone can call or text for free, confidential support from a trained responder anytime. The 988 Suicide Crisis Helpline is funded by the Government of Canada. Thank you.
