The Bulwark Podcast - Kara Swisher: A Tech 'Tough Love' Story

Episode Date: March 1, 2024

Tech has allowed some very bad people to do some very bad things—including in the democracy arena. Swisher joins Tim for the weekend pod to share her burns of Elon, Trump, and the vampires of Silicon Valley. Plus, a capitalist case for government doing more to rein in Big Tech. Show notes: Kara's new memoir, "Burn Book"; Tim's Walter Isaacson interview

Transcript
Starting point is 00:00:00 Hello and welcome to the Bulwark Podcast. I'm delighted to be here with Kara Swisher, host of the On with Kara Swisher podcast, co-host of the Pivot podcast, which is always right around the Bulwark Podcast on the Apple charts. Not that I obsess over those. I do. I haven't looked. And she's the author of three books, including the brand new memoir, Burn Book. Thanks for doing this, Kara. Thank you for having me. So if you don't mind, I want to start, like, as I was reading the book, I became obsessed with one question that's kind of not really a topic of the book, but it's adjacent. So if you don't mind, can we do one big picture question? Nope, not at all. It's your podcast. Let's do it. Sure. So, you know,
Starting point is 00:00:43 because you started this, what, in the 90s? In the 90s, you started reporting on this? 90s, early 90s. So I was thinking about this as I was reading through, and I was like, going back through the 90s into the mid-aughts, maybe even late-aughts, if you asked people if they thought the technological advances to date were good or bad for society, basically everybody would have said good. There are always some Luddites, but basically everybody would have said positive. Today, if I raise that question with my peers, there's a lot of uncertainty. And so I wonder where you fall on that spectrum sitting here now, in 2024. Has it been a net positive, this transformation you've covered, or net negative? It's interesting. I don't think you could even net it out, right? I think one of the key quotes in the book is the Paul Virilio
Starting point is 00:01:23 quote, which is, when you invent the ship, you invent the shipper. When you invent electricity, you invent electrocution. And I think that is, you know, is electricity a net positive? Are cars a net positive? It is, but maybe not if the planet burns up, right? We don't know what we don't know in the future and where things are going to come out. And we never will because things change over time. I would say it's a net positive and has the potential to be a real net positive. But I would say the negatives have far outweighed the positives in some critical areas like democracy, right? And the things, the deleterious effects of wealth, the deleterious effects of partisanship
Starting point is 00:02:03 have been boosted and amplified by social media and these technologies. And it's given some very bad people an ability to be very bad at scale. And so that's been super problematic for any kind of comedy. And it doesn't have to pull us apart, but that's what these tools have been used for, for the most part. Yeah. I mean, I wonder if that's true, though, that it doesn't have to. And I guess my challenge is, I think about the phones, right? Because in the book, I admitted to you in the green room, I hopped around. It's good. It's long. But, you know, I'm trying to get through everything. But I hopped around and I hopped
Starting point is 00:02:35 to the Apple chapter or the various parts where you talk about Apple. And, you know, Cook and Jobs are on balance. I mean, you paint three-dimensional pictures, but on balance, you know, more towards the white hat side of things. I would agree. You know, I really think about this question and it kind of comes down to a lot of the negatives that have happened have been downstream of the hardware phone question, right? And if you think about the loneliness, the teen suicide, the democracy, the polarization, right? The fact that we're getting all of this right now in our handheld device. And I just, I do wonder if you look back on that with a little bit of man,
Starting point is 00:03:09 I don't know, is this a cigarette situation? I'm not doing a, you know, guns don't kill people, people kill people thing here. But in this case, you know, it's just a phone. It can be used for a lot of things. It's a tool. You know, there's Brad Smith at Microsoft had a book, which I thought was very smart. I think it was tools and weapons. It's either a. Brad Smith at Microsoft had a book, which I thought was very smart. I think it was Tools and Weapons. It's either a weapon or a tool. Every single nuclear power tool, yes, for sure. And a very promising one. Weapon, absolutely. You know what I mean? That kind of thing. And I think it's just how you use these things. In this case, a screen, a TV screen, is that a weapon? Because it's used to broadcast propaganda by Donald Trump? Yes, but it really is the propaganda, right? Really, it's what he's saying. And he would find any
Starting point is 00:03:50 media. I say, you know, Hitler didn't need Instagram, did he? He had lots of other tools. But if he had it, very powerful piece of technology for him. And so I tend not to blame the items themselves, even the software itself. What I do blame is when they, for example, Facebook, when they allowed lots of people to come in some of these chat groups, the way they did it, they didn't limit it. So rage could move very fast. I do blame certain social networks for pushing more virality over context and accuracy. I blame them for doing it at speed. I blame them for not putting safety
Starting point is 00:04:27 measures in place. Once it's deployed, how they manage it is more what I worry about. And so I think it's very hard to blame the device itself, except if it's built in a way that is addictive, which I think some of these things are, some of the software is, or it's built in a way so you can't put it down. It's like cigarettes. It's addiction. Yeah, I basically agree with that. It's just, so I'm addicted. So I'm on the addict list. I'm just trying to work through it. I can see that. I can see it from your activity. I don't hide it. I don't hide my addiction at all. And so, you know, but when I'm on X, you see somebody will put up a viral chart and they'll be like, oh my goodness, loneliness. Look at how much it's dropped. Or happiness, look how much it's dropped.
Starting point is 00:05:11 Or polarization, look how much it's up. It always feels like it's 2011, 2013. That chart starts to go up or down. And it's like, what was that? And it's like, well, it was about the time when everybody had phones and social media in their hands. Yeah, it wasn't phones as much. It was the social media on top of it and the addictive nature. Yeah, the smartphone element, right?
Starting point is 00:05:29 Not like the flip phone. Yeah, the social media on your phone, I guess. The combination. Yeah, exactly. But it's a combination of all of them. Tristan Harris has become an advocate against this stuff, against the way tech is used. He worked for Google. And one of the things he is absolutely right about is this stuff crawls down your brainstem, right? It appeals to human nature.
Starting point is 00:05:49 There's lots of parts of human nature of wanting to coalesce and be with people. And then there's a part of human nature that wants to be by themselves, you know, you know, sniffing the whatever, right? And with your addictive, whatever it happens to be in many cases, the phone, these things are built for addiction and they don't have to be, right? You can turn, for example, if you hit the side of an iPhone three times, it becomes black and white and it becomes less interesting to people on a visceral level. And you don't touch it as much if it's black and white. I know it sounds dumb, but it really does work.
Starting point is 00:06:19 And they could do a lot of things that don't make you descend into addiction. They could put like an Uber app. Are you spending a lot of time on your Uber app you descend into addiction. They could put like an Uber app. Are you spending a lot of time on your Uber app? No, you call it, you use it, it goes. That should be on the front page. Facebook should be deep in a folder. So it takes a minute to get to it, but they don't naturally do that. The other thing is when they design these things, it's very much like a casino where if you push this button, you want to push it, don't you? Push this button. And it goes way back to AOL days when I was covering it. It was here in Washington, DC.
Starting point is 00:06:54 Britney Spears was a big clicker. I mean, people would click on anything about Britney Spears back then. And one of the pictures was fuzzy of her and the front page of AOL. And I said, why is that fuzzy? Why is that picture fuzzy? And this guy who ran the front page said, well, we make it fuzzy so they click in. They lean in. They click in. They want to see more. You know, it's the same thing as a casino or whatever is else used. Well, now we literally have the casinos on our phone too. That's right.
Starting point is 00:07:16 You know, the sports gambling. So we do both. You don't have to design it like that. It doesn't have to be designed so it appeals to addictive qualities. And that is on tech companies. The way they design the software is designed to make you not be able to put it down. You described yourself and looked kind of as Cassandra about some of these threats. There have been some critics that have said, oh, well, even you were too chummy early on.
Starting point is 00:07:39 You just kind of talk about that process where you're living amidst this. I'm sorry to tell you, it's largely from men I competed with and beat. So fine. All right. We'll put it there. When I started my career at the Washington Post and Wall Street Journal, I was a beat reporter. You know what that is. You can't go on these assholes, right? Like you cannot do that. This happened today at Google. This happened today. You know, you're a news reporter. That's what you do. And, you know, it's like when they accuse people who cover Trump of that. I'm like, they're beat reporters. What do you want? You want them to go in with like a hammer at him? Like, I'm not sure what. We need to learn. We need to know. I'm always a Maggie
Starting point is 00:08:13 Haberman defender on this. There needs to be Maggie Habermans and people that shout about how terrible what she reports is. She's just not. It's just, it's not true. It's just, but I see why they do it because they hate him. They want her to do something about it because she's near him. Fine. She's a beat reporter. I'm sorry, kids. That's what she does. She tells you what they're doing. Every now and then the times makes a mistake, but usually they do a pretty good job covering it in general. I know, I know they're mad at the age thing, but whatever. I don't, that's because they want to win. That's, that's different from anything else. So I was a beat reporter. And then when I left to do all things D, a lot of these things are written by people who were born years after we were covering this
Starting point is 00:08:50 stuff. We did very heavy duty coverage and very critical coverage of Google trying to take over the search market. We did extensive coverage about sexual harassment in Silicon Valley around the Ellen Powe trial. We were one of the leading groups of people pushing on the disaster that was Uber, including its terrible CEO, Travis Kalanick. This was in the Times. This thing wasn't tough until 2020. Hey, why don't you look at the archives of the Times in 2018, my very first column for them. I call them digital arms dealers. That's nice. Like, give me a break. I'm sorry. It's just not true. And so, you know, chummy, I don't, I don't, I have to know them. I have to speak to them. You don't seem very chummy to me, by the way, just if I'm grading.
Starting point is 00:09:34 I would pick any 10 fortune covers back then over Kara Swisher. Like we were known as mean. And what was really interesting in this phenomena is all these PR people that I cut, that I had to deal with back then was like, I don't know who the fuck you were talking to, but she terrified us and was really not very nice to us. Like not, not was tough on us. And yet the PR people defending you, I guess, I don't know. But one of the things that drives me crazy about it is that we were among the first to call in the question in the, through these interviews, Mark Zuckerberg and the anti-Semitic stuff. 2010, I did an interview with Mark in which Walt and I really drilled him on privacy needs so much so that he
Starting point is 00:10:09 started sweating and had to take off. He asked him to take off his hoodie. Yeah, exactly. So in a lot of ways, I'm like, what do you want me to do? I have to speak to them. It centers around Elon. I literally say in the book, I really loved what he was doing when he was doing space and cars. And he was a little bit of a narcissist. He was a little bit juvenile. I didn't see this coming. And for some reason, people are like, we knew it was coming.
Starting point is 00:10:34 I'm like, where? Where did you write it was coming? Nobody did. Nobody saw this dramatic shift. Very few people, maybe one or two, in the industry around him. Everybody loved this guy. And he was more interesting than everyone else because he was doing significant things around. Starlink was amazing. I'm sorry. It just is. And the fact that you say Starlink's amazing, everyone's like, you love Elon. I'm like, I really don't. You can see I don't.
Starting point is 00:10:59 But Starlink was amazing. What he did with Tesla, it pushed forward electric vehicles. I'm sorry to tell you, but it was dead until he pushed it forward. It was. Same thing with space. He's innovated space. And this is a guy who attacks me regularly. I still say, I'm sorry, but you have to be honest about his accomplishments, even though he's become one of the more dangerous figures in technology. And now he is. So what are we going to do about that now? So that's what drives me nuts. I'm sort of like, okay.
Starting point is 00:11:28 Who do you think one of my, when I was asking around, what do I ask Kara? You said he was one of the most dangerous. Who is the most dangerous right now of our overlords? He is. Yeah. He's got money and means to sue. He's been suing all kinds of, he just sued OpenAI today because he's, you know, he's hurt that they kicked him out, I think.
Starting point is 00:11:44 But he has some cockamamie reason for it. He sued another Roberta Kaplan case, this group that was pointing out the hate on. He's trying to quash their free speech is what he's doing. He's got his myths all over space. And he can decide things that our government should be deciding. He's got his myths in Ukraine. He really is ill-equipped to do so. Our space program depends on Elon Musk right now. So that's not good.
Starting point is 00:12:09 I interviewed Walter about this. I love Walter Isaacson. And his point is, right, it's a government problem. I heard your interview, and we talked about your interview with him, and there were some very good critiques of his book. But he is right on this point about the Starlink thing. This is our government. How do we get in a situation where this crazy person is responsible for this? I would agree. I think that's correct. It is our government's fault. But the fact of the matter is, that's a privatization that's been going on forever, right? The privatization of space. Our government, which built the internet, by the way, paid for the internet, created it, and then everybody else
Starting point is 00:12:43 made money off of it except our government, really has abrogated this responsibility in basic research, in AI. AI now is being run by private companies. That's why Elon's suing. He wants to get in, right? He wants to get in on it, and he's doing his own thing. But right now, AI is dominated by Microsoft. OpenAI is a smaller company, but it is dominated by all the bigs again. And so this is a critical national security issue and everything else. And our government is sitting on its hands. So, you know. Yeah. I want to, I want to get to the AI thing, but just a couple more things on Elon. Just really quick, there's an NBC story yesterday that's really good that lists all of the various oversight things Elon's dealing with right now from the SEC to all, you
Starting point is 00:13:23 know, all the various agencies and how I think he's also lost his mind, but he's financially motivated, incentivized to try to help Trump this time because of all the threats facing him. He is. I'm curious your take on the psychology of this. He did a tweet yesterday. I never went to therapy on my gravestone. We highly recommend therapy on this podcast. I don't know if you have any mutuals anymore with Elon, if you can help him get there. You know what I think that was? I said something publicly and they said, what can Elon do? I said, seek therapy. It might've been a joke, but it was good advice. Actually, he should seek therapy. And I think probably he probably read that. I think the big thing about Elon is,
Starting point is 00:14:01 is it related to Twitter, right? Does is there something about the Twitter platform that breaks people's brains? Because he's not alone on this. And or is or was this underlying and you have this hilarious Harambe story in the book where like you, you introduce Salzberger to him, like, and you knew him personally. So like, it was this craziness always underneath and something triggered him? Or was it something about the app? Like, how do you assess? Well, you know, he's always been a troubled person and he doesn't hide it. Like if you go back and look at some New York Times stories, he's sort of very emotional around when Tesla was in big trouble. And he's talked about it compared to a lot of people.
Starting point is 00:14:38 He talks about his mental health struggles. He has several times. He said he's manic depressive, I think, at one point. He doesn't hide his unhappiness and he never was. And that made him unlike people. Cause a lot of them feel robotic and Elon always felt emotional all the time. You know, you could see, I ran into him at a party once and I go, how's it going? He goes, I'm really lonely. And I was like, Oh, okay. TMI. You have nine children.
Starting point is 00:15:02 Yeah, I know. I was like, Oh, I didn't know what to say. I was like, oh, well, okay. All right, then I'll get a drink over here. Maybe if you had hugged him. Maybe if you'd hugged him in that moment, we wouldn't be here right now, Kara. No, thank you. You know, he wasn't dating someone, I think. It was weird. I remember being, I felt bad for him. And I think he has long mental health struggles. I think, as you saw in the Wall Street Journal, he enjoys medicating himself with a variety of drugs, self-medicating. And I think that story was very important to write because it links to some of the behavior. radicalized during COVID, the vaccine stuff. For some reason, he got pulled into that whole anti-vax kind of thing or questioning the vax. And then he got into ivermectin. And we had an
Starting point is 00:15:51 interview during that period where he just went off the rails and he had never done that. I have to say in an interview for sure, where he threatened to leave the interview because I doubted his intelligence on COVID. And I was like, I just don't think you know what you're talking about. And that offended him greatly. And he didn't leave the interview, of course, because he's such a paper tiger in that regard. And so I think it built, and there was always an element of these dank memes, boob and penis jokes, ha ha, boobs. And I remember thinking when he did it a couple of times, God, this guy is in his 40s what is he doing this is kind of sad like how sad that kind of thing but it was a minor part of his prefrontal cortex
Starting point is 00:16:32 yeah i was like whatever it's so juvenile but okay but i think twitter did help do that i think it was a combination of covid i think he's got as he got richer you know all these people it happens in politics too and they're not even rich. They have people around them licking them up and down all day. They think they own the world. They're so hypocritical. Like you saw that Hunter Biden thing with Matt Gaetz, where he goes, you know, what, did you take drugs? He's like, you're not the person to be talking to me about that. But that's how Matt Gaetz is. He's, you know, come on, Matt Gaetz. We know you're a partier. It's ridiculous. And to be so high handed about drug use. By the way, I don't find any, whatever,
Starting point is 00:17:09 take whatever drugs you want. But I think he changed. He got radicalized. I'm going to stay away from needles, kids. Needles, needles, kids, yes. I'll just say that. Stay away from needles, kids. I'm talking about, you know what I'm talking about. So he changed. He became radicalized. I know it sounds crazy, but the one thing that I remember him getting so upset about in one of the interviews was Biden did not invite him to a car confab, electric car confab. He had all the big ones. And I got to say, he is the pioneer of that, right?
Starting point is 00:17:35 He was the pioneer of that. And he wasn't invited, and he was so mad not to be invited. He was like a little much. I deserve to be there. I like, ugh. That's where he turned on Biden. And I remember calling someone from the Biden administration. I was like, you should have invited him. They're like, you know, they didn't because of the union issues. That was
Starting point is 00:17:53 what was the problem there. Cause it's not a unionized shop. Tesla isn't. And he was furious about that. It was fascinating to me. I'm like, what do you care? And he was like, I deserve to be there. So to sum it up, I think he's become more radicalized. I think he's changed. And he thinks because he's so rich, he thinks he's untouchable. And who does that remind you of? Who has changed also, by the way? Trump was not this way all the time. Are you sure?
Starting point is 00:18:21 He was a little bit, but it was harmless. It was harmless and silly and performative when he was on that show. A lot of it was tongue in cheek. You know what I mean? And then he became the character he was playing on TV. And it fed into the way he was. And by the way, now that we see all the sexual assault stuff over the years, it's like, oh, yes, no, he was always like this. But he hid it well, I guess. He hid it well. I see a little bit of a different parallel that is kind of similar,
Starting point is 00:18:48 though. When you talk about this rich guy resentment, that's hard for me to get. And one thing I was dying to ask you about is the Andreessen Manifesto. Mark Andreessen is one of these guys people don't know, big venture capitalist, also a brilliant guy, started Netscape. And he had a manifesto about tech optimism. I'm interested in your take on, and I just want to read one bit for it. Our enemy is the ivory tower, the know-it-all credentialed expert worldview, indulging in abstract theories, luxury beliefs, social engineering, disconnected for the real world, delusional, unelected, and unaccountable, playing God with everyone else's lives, with total insulation from the consequences. I two questions about this one why are the
Starting point is 00:19:27 richest people in the world so resentful of people in the supposed ivory tower and do they why do they not realize he's talking about himself here he's talking about himself very confused he's a very he's always been a very troubled person i don't know what else to say he's he's a very difficult complex person and uh in when i knew him I used to talk to him almost nightly, which was interesting. Really? Yeah. We used to text. Like he'd call you?
Starting point is 00:19:48 No, we text. We'd talk about politics or text about different things. He's very gossipy. He's a very gossipy personality. He was. I'm sure he still is. That's about him. That is about him.
Starting point is 00:19:59 These people in Silicon Valley, it's a miracle that they can see themselves in mirrors there. You know what I mean? It's a miracle. They're like vampires. They can't see themselves. And so... Why? What is it about? What is the resentment about? A combination of mental challenges, of extreme wealth, godlike tendencies. They all think they're in a video game in which they're ready player one. They think they know better because they know about one thing. They know about, oh, I'm going to tell you about Ukraine or whatever. By the way, one of the good things
Starting point is 00:20:28 about tech is a natural questioning of the status quo. That's a good thing. Why are we doing it like this? But instead of why are we doing it like this, now the thing is what they're doing is bad and we must kill it. It's changed from let's try a new way to let's kill them because they're hurting us. And so they're contrary for a contrary sake, which is ridiculous. And it's, it's infected people in the media too. You know, it really badly, some people, everybody, like there's a whole bunch, there's a whole strain of, you know, Matt Tybee, those people who are like lapdogs to Elon Musk. And then he kicked them, which was a surprise. He kicked all of them. He's kicked everybody in that Twitter files thing. He's
Starting point is 00:21:10 kicked them all. It's fantastic. I knew it would happen. That was kind of satisfying. It was sad. It was sad actually for them, but you knew where that was going. They really feel like they're victims. One of the things that I used to get, because I was considered, although many men think I'm not tough enough, too bad. Mama's not mean enough, too bad. They used to call me mean. Like they'd always, they would call me, these tech moguls, when I'd write something and they're like, you're mean to me. And I'm like, what are you talking about?
Starting point is 00:21:38 Your company collapsed. I said it collapsed. Like, they're like, yeah, that's real mean. And I was like, again, I would always be like, I'm not your fucking mama. I don't know what your problem is. We're not friends. I'm not trying to get you. It's just facts.
Starting point is 00:21:51 Your company collapsed. I would get that a lot. You're not nice to me. There was that scene in the book with the Google guys where I called them. I said I was writing a story about them trying to take over Search. And this was early 2000s at some point, 2008 maybe. And I wrote this thing and Dr. Seuss was saying, would not, could not have a monopoly or something like that.
Starting point is 00:22:12 I made it rhyme. I had covered the Microsoft trial where they were, the antitrust trial many years before in the 90s. And I said, at least Microsoft knew they were thugs. These people pretend they're not. They have their giant colorful balls and their pogo sticks and their soft food, but they're the same. It's the same killer. So they called me all hurt.
Starting point is 00:22:33 They're like, that really hurt us, calling us thugs. And I was like, well, I think you're thugs. I don't know what to tell you. And they said, we're not bad people. And they referenced their don't be evil thing. And I said,, they referenced their don't be evil, you know, thing. And I said, you know what, guys, I don't think you're evil. I really don't, actually. I said, I'm worried about what you're building.
Starting point is 00:22:52 The next person is going to be evil. And they're coming. You know, they're coming. Evil is coming for this. These tools are so powerful. They're so pervasive. They can amplify really bad things. What you're building is dangerous.
Starting point is 00:23:03 Even if you aren't bad, the next guy is sure to be bad or he's coming, the bad guy's coming. And they never got that. They never understood history or anything else. And that was very troubling to me about these people. And they would always say, you're mean. And I'm like, I'm not mean. I'll tell you one other example is I wrote a column in the New York Times in 2019, in which I said, if Trump loses the election, this is my hypothetical. If Trump loses the election, he's going to start saying it was stolen. He's going to say it was a lie. He's going to perpetrate it up and down the online ecosystem. It's going to have resonance because it's going to go up and down, up and down. And it's going to,
Starting point is 00:23:41 it's going to radicalize people. And then he's going to ask people to do something about it in the real world. It's going to jump off online into offline. And we are fucked if he does that, like this is going to get violent because he had already started with violent phrases on Twitter before that. And I said, I think it was back to 16. He was doing it.
Starting point is 00:23:59 Exactly. Right. I put this scenario out, right. Which is happened. Right. And I said this scenario out, right? Which happened, right? And I said, this is the most likely scenario based on what I've seen this guy do. I got calls from every one of those social media sites saying, how dare you say this?
Starting point is 00:24:15 This will never happen. I'm like, this will happen. This is exactly where this is headed. And they were mad at me for saying so. And, you know, and I said, I think at this moment, you are quickly becoming handmaidens to sedition. That's what you're doing. Yeah, let's do the Trump thing, because JVL in the newsletter yesterday for the Triad wrote, created JVL's law, which I really liked, which is relevant to this. It says any person or institution not explicitly anti-Trump will become a tool for Trump's authoritarianism eventually.
Starting point is 00:24:46 And this was true of all – and he was talking about the courts and Mitch McConnell. But this is true of the tech companies too. And I just – all these guys – you write about this in the book about how Trump wins and then they all go to try to work him over, to try to meet with him, to try to be on the inside. That even includes the white hat guys. Tim Cook is out there trying to work Trump over and they're putting out press releases together about manufacturing or whatever. Talk about that, how that was happening in real time and what you write about in the book about these guys accommodating Trump and the dangers of that. Well, I hadn't been a beat reporter for a while, but I got the tip that they were all going, which was a shock to me because
Starting point is 00:25:22 nobody said anything. And you know, these people are so performative. Everything they do requires a press release or a tweet or whatever. But suddenly it was silent because they were embarrassed. They had trash trumped to me off the record a million times, right? Like, oh, what a clown, what a buffoon. Buffoon was the common word. And he can't win. And he's an idiot. We can work with Hillary. You know, that's what they thought was going to happen. And some of them were more explicit. Sheryl Sandberg was a big supporter who was at Facebook, was a big supporter of Hillary Clinton. Meg Whitman had famously shifted. Now, by the way, she didn't go to the meeting.
Starting point is 00:25:54 She said he's a despot is what she said. She was a Republican. She was like the only Republican. The Never Travers did the right thing. We were the ones. We see it clearly. Meg is a traveler. Yeah, she was. For her to shift like that was really quite something to watch.
Starting point is 00:26:10 And cause she was, she's conservative, conservative, but she, you know, she's a typical Republican and being a Republican in Silicon Valley in California, right there, she was a unicorn. There's a couple of them. John Chambers, I think was one. There's a couple, but not many. And there certainly were no Trumpers. There were no Trumpers. And so when they, I heard about this, I was literally with my son at a farmer's market and I'm like, they're going where? All of them? And then I started to see who was going. And I was like, it's all of them going. And so I said, surely they're going to say something publicly about his comments on immigration because immigration built Silicon Valley. Surely they can't go to this meeting without making a statement about immigration. And I got on the phone with all
Starting point is 00:26:48 of them, including Elon. And he was the one who actually was like, listen, I don't think he's going to do this Muslim ban. I'm going to stop him, blah, blah, blah, blah. Like I'm Jesus kind of thing. And I said, you're not going to stop him. He said he's going to do it. This guy, for all his ridiculous clownishness, I think's gonna do it like he said so he promised his people this is not a hard thing to do like the wall or whatever but he said it and i counted it up and i was like he said it 712 times on the campaign trail he's gonna he doesn't let he's a racist he's a longtime racist this guy has persistently been attacking people of color so i feel like he's gonna do it and different people. And it's just, I don't know.
Starting point is 00:27:25 Anyway, I talked to all of them. They thought he wasn't going to do it. And they're like, we'll talk to him off the record. And I'm like, no, you're the powerful people. You're the ones who stand up for immigration because it's helped build your industry. And none of them did. And it was really something to see. And then they skulked out. They never made a statement. And Trump used the entire thing as a press release. Trump was smart enough to use it. And he did a little bit. Multiple times he used all of them for press releases.
Starting point is 00:27:50 Love me. I put them on my council. They're on my side. The smart guys are on my side. Tim Apple's bringing the jobs back to America from China, the whole thing. Which he wasn't. But okay. And he got it wrong in lots of ways.
Starting point is 00:28:02 But when he got it wrong, they didn't correct him either, by the way, which was fine. I got that. Someone from Apple was like, what are we going to do? Say the president's an idiot? I said, we could start there. Yeah. But they can't. I got that one.
Starting point is 00:28:14 I got that he's a polite man. He's not going to call him out right there. But all of them were happy to call me and insult him. But none of them were happy to do it on the record, which I thought was really nefarious. I just was like, you're kidding me. Welcome to my life, Carol. Yeah, I know. They wanted their money back. There was all this income, and they wanted the money repatriated. It hadn't been repatriated, this cash that they wanted. They wanted tax breaks, and they wanted no regulation. And so that's what they got.
Starting point is 00:28:40 I want to do another area where you were warning and how it ties to now, which is media stuff. You warned all these companies, the Murdochs. You told Don Graham in the book this would wipe out his classified business. He laughed and said, ouch. Ooh, guess he was wrong on that one. You can talk about that if you want. But I'm also more curious about where your warnings would be now to these media companies, particularly with regards to the AI and how
Starting point is 00:29:05 things are going to get even worse, frankly, or more complicated, at least, maybe not worse. When we have these technological upheavals, there's one in farming a long time ago. A third of people used to be farmers. Now it's so tiny, the population of people who do farming. Same thing with manufacturing, mechanization, and robots and things like that. That's changed that completely. Now it's coming for the white collar. This AI stuff is for white collar, really. And it's going to decimate certain industries, and it's going to really change the way we work. And media is one of those places.
Starting point is 00:29:36 I don't think decimation, but I do think we've already had the shit kicked out of us in terms of online advertising, which is now dominated by two tech companies, which is Google and Facebook or Meta and Alphabet. They have sucked up all the digital advertising for the most part. And then some companies do okay, like the New York Times and some others. So the economic stuffing is knocked out of it to start with. And now these tools will make it so every single company that has information will be able to be much more efficient and cut costs. And where do you think the costs are? People. That's where most costs are. And so anything, you know, one of the lines I have in the book is anything that can be digitized will be digitized.
Starting point is 00:30:17 Now, it's not just digitized, but it's smart digitization. Like it'll take, it'll do head, like in media, it'll do all kinds of things. Now, it's not going to write stories or report them. That is not true. But it can collate and collect information in a way that people used to do, that we don't need people to do that anymore. I worry a little less about the job than I do about the consumers. I had your co-host, Scott Galloway, because he's kind of AI optimist-ish with caveats, you know, smart about it. And so when I was pushing him back on this, the one area where we kind of both were like, yeah, this one's tough is,
Starting point is 00:30:52 I said to him, I was like, if I sometimes get confused, not that often, but every once in a while I get tricked by something online. And I am a, we just talked, I'm an addict. I consume more information than anyone. So if I'm getting tricked, what is my aunt going to do? What's my, you know what I mean? What are people that didn what is my aunt going to do? You know what I mean? What are people that didn't go to college going to do with AI? I don't think anybody's even trying to come up with an answer to this.
Starting point is 00:31:16 Well, I think it's going to not affect blue-collar workers as much at all. I mean, some of it is. I mean, people are worried about, say, autonomous vehicles. I think there's not enough truck drivers, and I think a lot of truck driving should be done. It's a dangerous job. And so it could change that industry in a good way, actually. You could see it. But it's very hard. I think one of the problems with tech is that it's addictive, but it's also necessary. You can't do your job in a white collar situation without digitization. You just can't. And so it's unavoidable. It's unavoidable and addictive, and it knocks the stuffing out of
Starting point is 00:31:45 the economics of most businesses. That's really scary. What about the misinformation side of it, though? What about people getting confused, people not knowing what's real and what's fake? Well, it started with cable, like with Fox News, which is very effective, but now it's at scale, right? Now it's at scale. So if people are getting all their news from Facebook, what Facebook picks to put in front of them is important. The problem is the people at Facebook don't care what they put in front of people. Nazis or cat videos, it's all the same to them, right, kind of thing.
Starting point is 00:32:15 And then it also, it gives you what you want. So if you start down one road, you get to the other road, right? And so it's a path of radicalization that happens. It used to be called propaganda, but now it's propaganda at scale and that you do it yourself propaganda. They don't have to put up a poster in Berlin in the 30s depicting Jewish people as vermin, for example. They don't have to do that. You know what they can do? They can send an individual message to one person. They know their fears. They know what they like. They know their fears, what they like. They know their habits. They can send messaging that is so designed at them
Starting point is 00:32:55 that it's dangerous. It's designer propaganda is what it is and very much aimed at individual people. I've talked about my mom just being totally, you know, just complete. And that's just Fox news during COVID. She's like, it's just the flu that went on for a while. I did one time, which was incredible. I did an interview with Hillary Clinton and my mom called me and she goes, oh, that Hillary Clinton, she's saying this, this is this about people like me, people like me is their favorite phrase, right? They're trying to get us people like me. And I said, oh, that, right? They're trying to get us, people like me. And I said, that sounds vaguely familiar.
Starting point is 00:33:28 And I said, can you just tell me more about it? And she started to tell you. And it was my interview she was quoting, except it was through the lens of right-wing media, right? Right. Which wasn't accurate at all. They had twisted it. And I said, mom, that's not what you said.
Starting point is 00:33:41 She's like, no, that's what she said. And I go, it's your daughter. And it was my interview. that's not what you said. Oh, she's like, no, that's what she said. And I go, it's your daughter and it was my interview. It's not what she said. I made her go listen to it. And she did. And she came back. She's like, okay, that's not what she said.
Starting point is 00:33:53 But she's still plotting against our country and really secretly running it. And I was like, oh, yeah, yeah. Like it didn't matter. So that's propaganda. And it's very good. It's propaganda on speed is what it is. My just hope level for our politicians' ability to actually regulate this in a way is just basically nil. I know that Mark Andreessen's worried that he's being overregulated.
Starting point is 00:34:15 Our mutual friend Luther Lowe, he texted me and I was like, what should I ask Kerry? He said, these guys can't even end the self-preferencing thing he's obsessed with, right? Which is like Google is putting its own products at the top of Google search. So if government, if these guys can't regulate just the basic stuff about privacy, about self-preference, how in God's name are they going to handle the AI side of this? And what is the optimistic angle on that? There isn't, because this is so private, right? That's the issue is that not just AI, but space is private. Everything that's important that government used to have a hand in is private.
Starting point is 00:34:54 AI is run by big companies. It's not run by our government. It's not. Our government doesn't have a handle on it. This is something our government, because of national security issues, because of all kinds of things, should be deep into. And they just aren't in the way they used to be, at least. And so now decisions are being made by big companies. I'm not sure what could happen. And also, there's a ton of money at stake. Like, Sam Altman is raising $7 trillion for a chip factory.
Starting point is 00:35:20 Microsoft is a multi-trillion dollar company. NVIDIA, which makes chips, a multi-trillion dollar company. Apple, a multi-trillion dollar company. NVIDIA, which makes chips, a multi-trillion dollar company. Apple, a multi-trillion dollar company. You know, they're all- We've got guys making 180 grand a year in DC in charge with trying to, you know, put some bumpers on this. And there's just no hope. But they haven't. They've had the chance for three decades now, and they haven't. And, you know, one of the things is I was at an event last night, and Amy Klobuchar has tried her hardest to get even a basic antitrust bill through, you know.
Starting point is 00:35:52 And she got kneecapped by the tech companies who were spending in districts, including by Democrats, FYI, who just pulled away from her bills because they got kneecapped in their own districts or whatever. These companies have enormous lobbying organizations now that really can move the needle here. And it's like Standard Oil got it together before we could break it up, right? They got it together. And there's so many of them. There's so many of them. This is my free market, my old free market coming through. I think that the regulatory side of this is more important than the competition side, right? I don't know. To me, I always think that the
Starting point is 00:36:28 obsession with the anti-trust is a little overstated. I'm just using it as one example. With the exception of Google, right? Because, yeah, right? Because some of these, we see now, I mean, even Google now, ChatGP, like there are other companies that are disrupting it. You know, people's like, Facebook's a monopoly. I'm like, really? There are 19 social media companies. All right, I'm going to push back on that because I am also a capitalist myself. I built lots of businesses. But in that time, they got to dominate digital advertising. In that time that Amazon didn't have to pay sales taxes when other retailers did, they got to dominate. So they get to dominate and build a great business off the backs and they don't pay the cost.
Starting point is 00:37:04 Facebook got to dominate and didn't pay the price for propaganda, anti-Semitism. Guess who? It's like they're opiate makers. And we're saying, thank you. And you don't have to pay the price of your damage. Pharmaceutical companies don't get to do it. They definitely break laws, but there's laws to break. There are zero laws in place.
Starting point is 00:37:24 I mean, there should be more than one, right? There just should be. And there aren't any. Antitrust is just one of them. I just think we need to update our antitrust laws. It's 100 years now. Companies have shifted and it should be done smartly. I think we've got to update our algorithm. What is in those algorithms? How do they make decisions? Let's talk about safety. These are just basic things that don't hinder capitalism. Privacy. Why are they scraping your content and mine? At least we have copyright laws that are good.
Starting point is 00:37:52 Should they be able to just shoplift your shit, Tim? No. I mean, why? And you have to sue them to get it back? I want a penny. Give me that Spotify cash. I want 0.001 pennies for every time they use my tweets. You know, AOL was doing it.
Starting point is 00:38:07 At one point, they're like, we make $50 for every user. I'm like, where's my vig? Because it's my information. Passed or made by walking, it's a very famous quote. That's my walking. I want to be paid for my, you know, they scrape everybody's information and then they count themselves. We're down here in the content mines. Right.
Starting point is 00:38:25 And then they say, you're welcome. And also say, oh, you know, because it's capitalism. I'm like, is this capitalism? No, it's the really top people get to use their positions to help them in other ways. I don't think that's competition. The way America wins is through innovation at the lower level. And if every, if all AI is dominated by big companies, do you think there's going to be a lot of little companies
Starting point is 00:38:49 which are the lifeblood of this country in terms of innovation? You know, I like these big companies, good for them, but we need little companies growing almost constantly to one, be innovative and two, to keep up in terms of things. And that's what's going to get killed here is real capitalism, which is what I am for, not, you know. I agree with that. Okay, we're about out of time. A couple rapid fires. I've got my happy person, my happy character for the book.
Starting point is 00:39:14 I did not know this about you, but you're kind of responsible for America's best governor, Jared Polis. I am. You know, because you told his mother to cash out on their digital greeting card company. Give us one sentence on their digital greeting card company. Give us one sentence on that. Oh, Jared.
Starting point is 00:39:31 He was like, you know that Michael J. Fox show where he was the conservative and his parents were hippies? Alex P. Keaton. He was Alex P. Keaton. A gay one, but he was Alex P. Keaton. Same. Maybe this is why I'm so alive. Yeah, it was really funny. Yeah, that was interesting. It was called Blue Mountain Arts.
Starting point is 00:39:42 At the time, they were buying traffic. Everybody was trying to buy traffic. And so they sold their company. It was a funny. Yeah, that was interesting. It was called Blue Mountain Arts. At the time, they were buying traffic. Everybody was trying to buy traffic. And so they sold their company. It was a greeting card company. And he was doing candies and flowers. He was so funny. He was a funny little entrepreneur. And you got no big on that either.
Starting point is 00:39:54 But you suggested to his mom, maybe it was time to sell. They used to send me cards. I was like, I don't want your cards, real cards and stuff like that. I'd like cash on the barrel. On the other side of the equation, are the worst people in the world, the friends of the actual genius entrepreneurs who get rich off RSUs like David Ballsacks? Are those the worst characters in the book? He's not in the book. He's just on the back of the book. I ignore him completely. I'm not interested in talking about enablers and minions. I'm not.
Starting point is 00:40:25 They don't interest me. Suck-ups. They only interest me because they're real. I do think. In some ways, I kind of, I at least dislike Elon and Teal and these people, but at least they were innovators. Kind of people that are riding on their backs, the ones that bug me. Okay, this is a request from a listener.
Starting point is 00:40:41 You can reject it if you want. And we're going to PG it up a little bit. Kiss, marry, disappear. Elon Musk, Mark Zuckerberg, Sam Altman. It's tough. Disappear, Elon. Go to Mars. Enjoy yourself. It would be great.
Starting point is 00:40:54 We've had enough of you, and take Bill Ackman with you. I probably would marry Sam Altman because it would be a beautiful gay marriage together with us. That's sweet. I guess I'd have to kiss Mark. He's a nice fella. I'm not going to kiss Mark. He's got his muscles now, though. He does.
Starting point is 00:41:14 He was so skinny. So was Jeff Bezos. He was skinny, skinny. They're both skinny little things. Speaking of gays, I think the most lesbian clause ever written is in this book, The Hardware Store is My Safe Space. Okay, my final question. I asked you if you could tell one ever written is in this book. The hardware store is my safe space. Okay. My final question. I asked you, you can tell one story that was in the book.
Starting point is 00:41:28 I want to hear the ice sculpture story. Okay. So during this period of craziness, when everyone was adorable, my wife worked for Google many years after I'd started covering them, but she went there. I stopped covering them when she went there. We went to this baby shower for Ann Wojcicki and Sergey Brin. They've since divorced, but they were having their baby. And you walked in and there were all these baby photos.
Starting point is 00:41:47 And when you walked in, they always have these assistants. They're full of assistants, these people. And they all have swingy blonde hair, all of them, women. They're all women. And they said, would you like a onesie? Do you have a swingy blonde hair? No, as you can see, I do not. It's just Kara and me and my Eeyore back there.
Starting point is 00:42:03 They said, would you like a onesie or a diaper? And I was like, what? And so they made, they love dressing up. These people like forced fun. I used to call it forced joy. Um, so they made you put on a diaper or onesie and then gave you a sucker and a baby hat and a bottle of a fake baby bottle to put liquor in. And I said, I'm not putting any of this on. There's no fucking way, you know, I'm putting any of this stuff on. And a rattle, there was a rattle involved. And, and, and so I ran inside before they could make me do it. And they like chased me. And I was like, I'm not putting on this shit. And I walked inside and it was like, it was a dystopian version of people pretending there were babies. And there was a bounce house. There was baby food, everything
Starting point is 00:42:41 in little baby food jars. All the food was in baby jars. People were, Sergei was in a onesie on roller skates. There was all kinds of bouncy balls. It was just like, it was a nightmare. And I had toddlers. And so I was like, this is bullshit. Like, this is really weird. You know, I was like, I don't need this. I have it at home. I don't need this stuff. And I didn't like, they'd always tried to get you to act like a child, which I hated. They had slides in their office. There i think yeah like we're childish no child like and i was like you're childish that's for sure and so there was a nice sculpture there too which i was riveted to as a full it's a torso of a woman and out of the breasts came white russians that you put your cup up and they to the the boob, like it was breastfeeding.
Starting point is 00:43:26 It was so ridiculous. And I look over and right near the breastfeeding is Gavin Newsom, who was mayor of San Francisco at the time. And he's in one of his fantastic suits. That guy can dress, right? And he didn't have a diaper on. And I was like, how did you, he's like, how did you get out of it? I said, I ran. I wasn't going to do it.
Starting point is 00:43:42 Dignity. And he's, I said, how did you get out of it? He said, I knew you'd be here and you would take my picture. And that would be the end of my political career in the diaper at the behest of billionaires. And we were laughing hysterically because it was so, calling to fact check this with him was so funny. We were laughing the whole, he was like, oh my God. This was not a hallucination, right? This really happened. I just need you to confirm. And then we'd had some of the white rush and it was delicious. So I love that.
Starting point is 00:44:06 It was just so, it was everything wrong and right with that time of period. It was so fucking ridiculous. But at the same time, it was kind of sweet. It was weird and sweet and strange. And also what is wrong with these people. And also therapy inducing. Yeah. Take the full circle.
Starting point is 00:44:24 Kara Swisher, host of the podcast on with Kara Swisher and Pivot. She's got a new book, Burn Book. Go get it. Thank you so much for taking the time with us. And just remember, I never wore a diaper and neither did Gavin Newsom. So vote for him for president. We resisted the diaper. We resisted the diaper.
Starting point is 00:44:39 It's a low president bar these days, but you know, we're going to take it. That's where we are right now. Thank you so much, Tim. I love your podcast. I love your work. But I would get off the internet a little bit for you. I have to say you're very present. Thank you. That's a good advice. I appreciate that. My husband agrees. We'll talk to you later. Okay. Thanks, Tim. I can't stop thinking of your face La la la la la la I'm six feet under the Bodhi tree With my crap new age philosophy
Starting point is 00:45:10 Diamonds weather once per star As I'm sitting in Jane Mansfield's car Yeah, yeah I'm independence Yeah, yeah I'm borderline Yeah, I'm borderline. Yeah, I'm California. My mind's all screwed and upside down. But my heart's on overdrive. Yeah, my heart's on overdrive I need to take a shower when I look at you You sting and hurt like a bad tattoo
Starting point is 00:45:52 I wish you'd change my point of view I cruise the canyon to get some breeze With hidden treasures up my sleeve I like the light and hate the heat But I'll lick the blood right off your street Yeah, yeah, I'm cherry cola Yeah, yeah, I'm candy-eyed Yeah, yeah, I'm California
Starting point is 00:46:23 My mind's all screwed and upside down But my heart's on overdrive The Borg Podcast is produced by Katie Cooper with audio engineering and editing by Jason Breck.
