Tech Won't Save Us - The Problem with "CEO Said a Thing" Journalism w/ Karl Bode

Episode Date: April 23, 2026

Paris Marx is joined by Karl Bode to discuss how tech journalists coupled with corporate interests are irresponsibly boosting the profile of tech CEOs, further damaging public trust in institutional journalism and highlighting the need for publicly funded media organizations. Karl Bode is a freelance reporter and writes The Fine Print newsletter. Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon. The podcast is made in partnership with The Nation. Production is by Kyla Hewson. Also mentioned in this episode: Karl wrote about how the press mythologizes tech CEOs. The New Yorker published a brutal article on Sam Altman's compulsive lying to get ahead. OpenAI's Pentagon deal led to a large user migration. Shout out to Karen Hao's book Empire of AI. Allbirds, the shoe company, is now entering the AI space. Here is the latest on the attacks on Sam Altman's home. Meta has been found to amplify hate and contribute to genocide, something that is an ongoing concern with the platform. And we can't forget about attempts to force their 'Free Basics' internet on India.

Transcript
Starting point is 00:00:00 But that's what I think of, what I think of these outlets. It's like they're journalism, but they're just like dull, bland garbage designed not to offend anybody that's just not really useful. I think you'd be better off going and watching The Muppet Show for a half an hour, and you'd probably come away better informed than most of this stuff. It's just not good. It's not interesting. They're not really interested in tech, and I think it's embarrassing. And I think we have to really, I'm hoping this era ends with some sort of renaissance. Hello and welcome to Tech Won't Save Us made in partnership with The Nation magazine.
Starting point is 00:00:46 I'm your host, Paris Marx, and this week my guest is Karl Bode. But before we get to that, just a reminder that this month is the sixth birthday of Tech Won't Save Us. I've been doing this show for all these years now. I have interviewed over 300 amazing guests to give the many thousands of listeners to this show the insight they need on what these tech companies and these executives are doing to our world and to our individual lives as we have to interact with these products and the way that they, you know, change the society around us in ways that are really not serving us and degrading our existence as they become more and more powerful. So, you know, I think that these conversations
Starting point is 00:01:24 are really important. Many of you do as well. And if you do enjoy this, if you want to make sure I can keep doing this work, having these interviews, providing these perspectives on the tech industry, this month I'm trying to get 100 new supporters of the show over on patreon.com. So if you do want to help support the show, help support the work. that goes into making it. Take a minute to go to patreon.com slash tech won't save us, become a supporter, and, you know, helping sure that I can keep doing this.
Starting point is 00:01:50 Thank you so much. As I said, this week's guest is Karl Bode. He's a freelance reporter, and he has a newsletter called The Fine Print that I highly recommend you go subscribe to, you know, as people do with newsletters. By now you've probably seen this story about Sam Altman in The New Yorker.
Starting point is 00:02:05 It has been making the rounds. There has been a lot of commentary on it. And Karl wrote about the problem with how the media reports on these CEOs, and how they seem to get tricked or, you know, get the wool pulled over their eyes time and again by these people who, you know, are basically lying to us in order to enrich and empower themselves. And time and again, they get held up as these important figures who we should be paying attention to, who we should be, you know, giving some degree of benefit of the doubt to or, you know,
Starting point is 00:02:31 at least believing the types of things that they tell us. And time and again, it turns out that they are not really how they are being presented to us. You know, you have someone like Elon Musk who presented a certain image of himself, And of course, you know, was never always that person, you know, that person who he presented himself to be. And now he is embracing right-wing politics. He is having all of these detrimental impacts on our society. And now we see someone like Sam Altman who has followed a similar mold who, you know,
Starting point is 00:02:58 was praised as this kind of rising CEO genius. And now more and more, we have people admitting that he is seemingly a compulsive liar who will do anything to gain power and influence. And, you know, is that really the kind of person who we want? wielding this much power and influence in our society. I think not. And certainly Carl doesn't think so as well. So we dig into this report in The New Yorker on Sam Altman, what we take from it, what we think about it, but we also extend that into a conversation about how the media reports on these CEOs, reports on these tech companies, the real flaws with that, how, you know, there are a lot of
Starting point is 00:03:32 great tech reporters out there, but there is also a lot of reporting that happens that just serves to boost up these CEOs to reflect the narratives that they're trying to tell the world, and we're not served very well by that kind of reporting. So I think you're really going to enjoy this conversation. I always enjoy talking to Carl. It was really happy to have them back on the show. If you do enjoy this conversation, just as you enjoy many conversations on Tech Won't Save Us. Again, I would ask you to consider going over to patreon.com slash Tech Won't Save Us, helping us meet our goal for the show's sixth birthday so that I can keep doing this work, keep having these conversations and keep educating you on what the tech industry is doing to our society.
Starting point is 00:04:12 So thank you so much and enjoy this week's conversation. Carl, welcome back to Tech Won't Save Us. Hey, though. Thank you for having me. Nice to see you. Absolutely. It's always great to talk to you. I love watching your commentary online of everything going on with the tech industry, but in particular, these tech billionaires and everything that they're up to and how they're making the world a worse place. And you've been writing recently about the way that the media reports on these CEOs and these tech companies. Of course, not a new subject matter for you, but you've had some recent pieces that were really intriguing to me. And especially on the back of this Sam Altman piece in The New Yorker, I figured it was a great time to actually talk about how the media reports on these CEOs, reports on the tech industry again, you know, because we've certainly talked about it on the show in the past. But, you know, there's new details here.
Starting point is 00:05:02 There are things to dig into. And so I guess I just want to start with kind of a broader question. Why does tech media, and really, I guess media in general at this point, report on these tech CEOs in the way that they do? I think over time, they've just kind of become an extension of marketing. I think as media got consolidated under the ownership of mostly right-wing, very rich white men, they have a very vested interest. And it's not subtle when their news coverage is kind of polluted.
Starting point is 00:05:31 their motivations, right? So you get what I affectionately call a lot of CEO said a thing reporting. Well, they'll just mindlessly parrot whatever the CEO said. We're going to Mars. AI is going to be sentient in just three weeks if I get $1 billion. And they'll just like, they'll repeat it. They won't include any context, any history. It doesn't matter if the CEO has been full of shit for 10 straight years. They won't mention that. So it's a very specific class of reporting. And I see it all the time. I said, you know, Elon Musk has obviously been a huge beneficiary, Sam Haltman, Zuckerberg, just this weird panamime. It's like, it's not really journalism. It's like stenography and just parroting. And they don't, you know, they could easily call up like an
Starting point is 00:06:12 academic. Academics are desperate to be called up on the phone and ask questions because they've been studying subjects for 30 years, right? They wouldn't even include like a paragraph where an academic comes in and says, that's not really very plausible, you know, because that would, that would require actual journalism and actual work. And that's not what this class of journalism does. And I think as things have gotten consolidated and they've fired a lot of the real journalists. What's left is this weird simulacrum. It's just kind of pathetic and sad, quite honestly. And I think Sam Altman has really benefited from that.
Starting point is 00:06:42 Yeah, I completely agree with you, right? It's wild to look at some of these articles. Like, there's a certain level where you can understand why it happens, right? Where I feel like we have been through this era of, you know, the resources really coming out of a lot of journalism. You know, there's the pressure to produce. a lot of articles now that it's online and, you know, things are being detected by the search engines and you're trying to optimize for the search engines and I guess now the chatbots or something, I don't know.
Starting point is 00:07:09 But, you know, so you have that degree of, you know, I guess the pressures that are there. And then on top of that, you have less resources to actually put into the reporting on many of these things. And so it just becomes easy, as you say, to kind of repeat the press releases, to repeat the statements of the CEOs. And it's wild to read some of these articles. Like, for example, Elon Musk talking about going to the moon or going to Mars, it can't even include the fact that he said we'd already be there by now, you know? Right.
Starting point is 00:07:39 Or the optimum, the endless stories about the optimum robot, you know, when nobody's really seen anything that has a functional battery life. It's just, there's endless examples of it and it's constant. And as AI has been integrated into this, I think what these folks want that own these outlets is just like to create this auriborous. of click engagement. You know, everybody's, whatever gets people to click. So if people click on it, it's good.
Starting point is 00:08:01 They're going to automate that. They're just going to generate a bunch of ad revenue. They don't care about the ethical implications. They don't care if the tech works. They don't care how it impacts labor. They're just building this massive monolithic thing that shits out ad money for them without the pesky need to pay journalists a living wage or health insurance. So you see it everywhere.
Starting point is 00:08:20 You know, you'll see it everywhere. It's not been subtle. There used to be some debate. You know, media academics have warned about this stuff for 2030. years what we were building. And it used to be like they'd be people, they'd get like a stinky raised eyebrow at some of their claims. But these days, it's so unsettled, especially with all the authoritarian bullshit in the States. It's really not even up for debate anymore. Most of these outlets are just extensions of the extraction class. And so they're, they're creating these swaddling
Starting point is 00:08:44 narratives for rich people to tell themselves to feel good about what they're building. And a lot of it is just gibberish and bullshit. Yeah, I feel like, I've had Victor Picard on the show before, I believe. He's great. Yeah. And he does really great work kind of like looking at, you know, kind of assessing the media industry and what would have to be done to make it better, right? To better serve the public and, you know, the, you know, I guess what we expect the media and journalism to be doing in our societies and democratic societies in particular, which is probably like a reason for me to have him back on the show soon as well and talk about, you know, how this is all going. Because he's fantastic on, you know, these issues, right?
Starting point is 00:09:22 His big things that corporate power and journalism cannot exist side by side. They have just completely different financial interests. There's just there's no way to, you know, you can sometimes get something that kind of looks like decent journalism. I'm not saying that all corporate journalism is 100% awful. But his big point is that you really can't, those two can't cohabitate very effectively. And he's a big advocate for publicly funded media, you know, which Trump just destroyed the last vestiges of in the United States with his assault on NPR and PBS. But I think that's true. I think we need some kind of publicly funded, crowdsourced public media in this country to actually, that actually cares about the truth because it's very clear. A lot of these corporate outlets simply don't. No, I completely agree, right? Like, I think we need much more public media. And certainly it's the case in the United States. But even in a country like Canada that has a major public broadcaster, I would say that it needs more resources and more funding. And we need to look at more public structures in order to put more resources into journalism so that we can, again, get the benefits that come of it.
Starting point is 00:10:22 We need this institution that is holding power to account. And as it becomes decimated, as there are fewer newspapers, as there are fewer journalists, as there's less resourcing going into it, it is not able to do that. And we very much see that reflected in the way that the tech industry and these CEOs are reported on and have been reported on for many years. As you say, Elon Musk is someone who has benefited immensely from this type of reporting and the way that he can just throw out these big sci-fi ideas. and it will be echoed and repeated,
Starting point is 00:10:55 and he will get magazine profiles and be on the covers of these magazines, and everyone will be fawning over these big ideas. It doesn't matter whether he will actually deliver them. It's very important to kind of pumping up the valuation of the companies in the big picture, right? Right. They're creating an alternate reality, right? Where history doesn't exist in many ways. Like, he's an overt white supremacist.
Starting point is 00:11:17 He says hateful, ignorant, vile shit all of the time. And you could literally go to any story about, him, pluck any story writers, AP, New York Times, Washington Posts off the newswires, and you will not find, you might find a vague reference to the fact that he's controversial, but you won't find any overt mention that he's full of shit, consistently full of shit, an overt racist, unsubtally, you know, white supremacist, a fascist supporter, his incompetence in the Doge stuff, pretending he was going to cut government efficiency and then just blew up a whole bunch of money and stole a much data and ran off. Like, that's relevant, right? If I'm going to write, if I'm a real journalist writing a story about Elon
Starting point is 00:11:52 Musk. I'm going to have at least one or two paragraphs maybe about his history of just abject failure and disgusting comments. And they just, they just, they memory hold this stuff, right? Because that's not what a lot of these outlets are interested in. These outlets are actually interested in accumulating wealth. Tech is incidental for a lot of them, I think. They don't actually care how the tech works. They don't care how open AI really works. They don't care how electric cars drive, how the engine works, which actual engineers worked on this. They're not interested in that stuff. They're not interested in engineering. You know, they're not interested in They're interested in accumulating money.
Starting point is 00:12:24 And it's obvious. And once you look at it through that frame, it becomes very clear where they behave this way, I think. Yeah. Well, the kind of CEO says a thing, articles that you're talking about, like they feel very much like clickbait kind of articles, you know? It's the kind of thing that comes out. It's a grand statement.
Starting point is 00:12:38 Oh, what is Elon Musk saying now? I need to click this and see what it is, right? It's weird because I don't even know how many people read these articles. You know, I don't, I think they're aimed at kind of an MBA grad person who doesn't want to think too deeply about the ethical impact of tech. I think that's one of the target audience, you know, and I think they're aimed at people who are just gobsmacked and easily impressed by innovation in tech, you know, people who really want to believe that there's billionaires out there who are going to take us to the moon and solve all the problems. I think that's a comforting
Starting point is 00:13:08 narrative for them to push. And I think those two kind of get fused, but I still don't know how many people actually click on this stuff. And I think its primary interest is to swaddle, like I said, the extraction class and kind of these narratives that they are good people doing noble things. And, you know, their American ingenuity and innovation is the forefront. And a lot of nationalism creeps in. And it's just kind of, for an industry that pride itself, supposedly on telling the truth, it's interesting how challenging it is for them to recognize when they're not doing that. You would think 30 years as a reporter, a lot of these guys would be better at that. But I think over time, like I said, a lot of the more critical things,
Starting point is 00:13:47 reporters get weeded out and what's left is kind of whoever toes the line, with exceptions. You know, like that New Yorker article was an excellent, I think it took the long way home. I think it buried the lead in some spots. But that was an excellent analysis of Sam Altman, and that came from, you know, a corporate establishment media outlets. So it's nice to see, you know, the paradigm broken up occasionally anyway. It does feel like there's been like more permission for this kind of stuff to be written in the past few years in a way where it might have been harder to see, I don't know, 10 years ago, right,
Starting point is 00:14:21 this kind of reporting, this kind of image of the tech billionaires being presented. But because the public has swung so much against not just the billionaires, but the tech industry to a certain degree as well in Silicon Valley, especially in recent years, it feels like there's more of an opening for this kind of critical reporting on, you know, tech billionaires, the tech industry. but then it's odd to kind of see the critical stories about, you know, say the story about, you know, a worker who died in an Amazon facility recently and, you know, they just told everyone they keep working. And, you know, these kinds of like stories about the harms and the clear problems with the tech companies and their models, then sit alongside these stories that are just like, again, kind of the CEO says a thing or the kind of rewritten press release kind of stories. It's like, it's weird to see. these two sitting next to one another then. Yeah, totally. I do think the public
Starting point is 00:15:18 really craves the truth. I think they can see their lived in experience. Like young people trying to get into the job market can see that the promises about AI making their lives easier at four-day workways, where four-day work weeks, they can see that's bullshit, right? And they want somebody to tell them the truth. And so
Starting point is 00:15:33 they look to these corporate outlets and they don't. And so when somebody does, I think they can elevate above the mire, you know, especially in the AI era when everything's got this homogenized sameness to it. I think a lot of the reporting is going to be more and more of that. I think authenticity and truth is going to have a premium. At least I like to tell myself that. No, I definitely feel that way. It's interesting you were saying, like, you know, there are a lot of journalists who just don't seem to be able to see through the, you know, I guess the statements
Starting point is 00:16:05 that these companies make or the lies that these CEOs tell. And to me, it feels like, on the one hand, there's this like group of journalists who really want to believe in what the tech industry is selling. You know, they want to believe in the grand narratives of like transformation and, you know, the world getting better and innovation and the sci-fi future is being realized and all that kind of stuff. So there's like that class of people. But then it also feels like there's another group that is kind of like it's, it's risky to put your neck on the line and to say, this is bullshit and this is not happening because what if this is like the time when they're actually going to follow through and deliver something and then you've been shown to be wrong by questioning it, you know?
Starting point is 00:16:48 Yeah, yeah, but you could be honest. You could just choose to be honest. You could just be like, okay, I was a booster previously and this came out and it's good or it's bad. You know, you don't have to worry about that. If you're a journalist, you're just going to where the truth is, right? You don't have to be worrying about what your legacy is. I think that's the polluted thinking that comes from access journalism.
Starting point is 00:17:08 You know, you've commented on your show a lot about these certain tech access journalists who are very happy to be close to power and excited to be called up on the phone by CEOs. But the CEOs are picking them for a reason, right? They're not picking them because they're a good reporter. They're picking them because they know that reporter won't really push them. And even the ones that, you know, kind of sell themselves is like truth to power outsiders like Kara Swisher. I know you've talked about the past. Absolutely. Her entire legacy is that she's this, you know, outside the box, free thinking, you know, rugged guerrilla journalist,
Starting point is 00:17:38 who's giving it to the man and asking tough questions. And I think she's got a leather coat and the sunglasses, you know. She's got a pretty strong track record of being a little too cozy with these CEOs. You know, it happened with Elon Musk. It's happened with Sam Altman. And I just, I think the industry is really quite populated with those folks because that's what, you know, that's what billionaires want to give their money to. You know, the people that fund media are very rich, white conservative men or at best
Starting point is 00:18:07 centrist, you know. There's very few left-winger billionaires that are out there funding media, and the media reflects that. You know, if we had a bunch of progressive billionaires with hearts and ethics, you know, who actually cared about the functioning of the country and worth this just purely extractive, I think you might have some media outlets that reflect at that, but we most certainly don't. So I think the journalists that are left are the ones that tell stories that these people like. And they really like them if they tell those stories and pretend to be, you know, bold truth tellers. That's a certain, that's a certain, that's a a tough balance to reach, you know, to sell the public that you are simultaneously carrying the
Starting point is 00:18:41 industry's water and hyping up their products, but also, you know, holding them to account. We're being quite critical, but as you've said, you know, there are a ton of fantastic journalists reporting on the tech industry who do really great work at these publications, right? And I don't want to make it seem like we're shitting on every, all of them. Yeah. No. I mean, every outlet has, you know, a half dozen or more excellent reporters who in there. The Wall Street Journal has great reporters, Reuters, AP, New York Times, Washington, to post. They all have good reporters buried in there, but they are not the norm. Yeah. And these voices are not, you know, mainlined above all the other gibberish that corporate power wants you to consume.
Starting point is 00:19:17 Totally. It's like you see someone like Kevin Ruse, obviously, at the New York Times, who really wanted to buy into the crypto stuff until it all fell apart. And now is like, you know, a big AI booster. And this is one of their chief kind of people explaining to their readers what the tech industry is, how it's working. And it's like, you know, it's clearly selling. the industry's narrative to the public in a way that is very useful to the industry, but is really not informing people about what is the reality of these technologies, how they're really working. You know, it's kind of putting that through the lens of what the industry would want people
Starting point is 00:19:53 to believe. And then you have people being misled about how AI works, what it is actually doing at the moment, even though, of course, they would claim that that's, or, you know, someone like Roos would claim that that's not what he is doing, you know, what his reporting actually contributes to, but that's because he sees all this in a certain way. You know, you mentioned Zuckerberg earlier. It would always stand out to me that whenever Zuckerberg would make a big announcement, one of his first interviews would be with Alex Heath. He used to be at the verge now. He has his own kind of thing. And it was always presented as like, look, I got the big Zuckerberg interview. And it's
Starting point is 00:20:27 like, yeah, because he knows you're not going to ask the hard question. Yeah, it's not, it's not, it's not the point of pride you think it is because they chose you for a reason. You know, I mean, I mean, most of the time, they don't even choose journalists anymore. They'll choose, what's his name? Lex Friedman or somebody like that. There's this array of fake journalist podcasts that they'll go on because they know they're going to get peppered with softball. But yeah, if you land a CEO interview,
Starting point is 00:20:50 it's not like you're being chosen for your chops. And Roos is interesting to me because when he was talking about AI, you know, there was one interview he did, I think, with Casey New, last year that stuck in my craw a little bit where they really attributed all sorts of human. and malice and motivations to AI, which suggested to me they don't, or at least Roos, doesn't understand how this tech really works. It doesn't think or truly understand. You know, it's providing you a sentence structure based on just massive input of what it thinks
Starting point is 00:21:19 the correct answer is. It's, you know, there's some moment in the article where you talked about how it would be good for mental health therapy. And then there's another point in the article in which he says it has ulterior motives, you know, and stuff like that and has bad intent sometimes. That's just that advertises to me. If you want to see if a tech journalist understands AI at all, see if they attribute human intention because that's not what's actually happening. Absolutely. And it feels like on the one hand, there has to be this kind of, you need to take seriously the statements of the tech CEOs and the tech companies, you know, what they claim the technology is doing or what it's becoming, you know, because we hear from these AI CEOs how
Starting point is 00:22:03 the technology is starting to think for itself and AGI is right around the corner. Maybe it's already here. You know, the computers are starting to question things. You know, we see these stories from Anthropic like every six months about how the AI is thinking now. And they're so shocked and all this kind of stuff, right? Because they asked it whether it can think and it told them yes or something. Like it's ridiculous kind of stuff. So there's that like piece of it where you take what the companies are saying seriously and you give it weight.
Starting point is 00:22:31 but the growing number of studies about kind of the social harms of the technologies, what it's doing to people's critical thinking skills and all the stories about people's kind of mental health breakdowns and the way that it is, you know, assisting people at taking their own lives or in planning school shootings and all these sorts of things. Those details need to be kind of like not taken as seriously or not addressed in the way that they should be because what matters is what the companies say and not all of the kind of growing pool of evidence that what they're saying is not reflective of reality, right? Right.
Starting point is 00:23:06 Which is a, yeah, exactly, which is a shame because the technology really is interesting, how it works, what it can do. I mean, it's software, you know, it's interesting evolutions in software, some of which are actually useful, and there's interesting conversations to be had about how things work, but that's not what you ultimately wind up getting. You get a lot of boosterism, you get this hyperbolic, just gibberish that pushes the idea, like you said, that AI sentience is just right around the core. And it's a shame. You should easily have a paragraph or two where you explain context, history,
Starting point is 00:23:35 how the tech works, throw in an objective quote from somebody who's truly an objective academic and has actually really studied this. It's easy. And it's not like that's hard to do. They just choose not to you most of this particular type of journalist. Yeah. Yeah. And so, you know, we've been talking about this kind of broadly in a more abstract sense. You know, obviously we've pulled out specific details. But I did want to pivot a bit more to this story in the New Yorker that we're talking about, right? That goes into Sam Altman, who he is. And, you know, I guess the presentation that we have had of him for the past few years and how that is not really reflective of who this man seems to be and how reporting and journalism
Starting point is 00:24:14 has kind of contributed to giving us a particular image of who Sam Altman is that is maybe not reflective of reality. So I just, I guess I want to start with, what did you make of this piece and what really stood out to you from reading through it? I thought it was great. I thought it was good. Like I said, I thought it buried the lead a little bit. It kind of took the long way home to the point that this guy has consistently lied about everything all the time. One central theme through the piece is that he just tells everybody what they want to hear all the time. He's good at that. I remember this really came out in like Karen House book as well, I think, was one of the key things that really stood out from that for people who read it, right? Yeah, yeah. He's like Elon Musk. He's not as severe as Elon Musk.
Starting point is 00:24:54 I don't think he's got the same authoritarian trappings or white supremacist interest as Elon Musk's, but he's got a lot of the same skilled opportunism, understanding media, understanding how to tell people what they were looking for to get money, which is a talent. You know, it is a talent. But also like Musk, he's not really an engineer. The article goes into a little detail about how he really doesn't understand the finer details of the technology he's talking about. I wonder if he's also asking OpenAI employees, how many lines of code that they have
Starting point is 00:25:23 written. Right. Yeah, yeah, it's similar. I bet his management style is not a lot better. I'm sure we'll see more on that. But yeah, a lot of the stuff is stuff that the board member, I mean, the trajectory itself is fun, right? OpenAI was supposed to be, was started as a nonprofit that was concerned about the public interest. And like last month, they signed a deal with the Pentagon to be their chief, you know, surveillance and targeting partner, you know, and then you read their press release and they're pinky-swearing that none of this is being used nefariously, right? And you have no way to confirm that, and we no longer have functioning regulators in the United States to provide oversight, and we've never had any transparency into domestic surveillance. So that trajectory is amazing.
Starting point is 00:26:03 You know, a company that was started on a promise of good and has ended here in the toilet, partnering up with authoritarians to help them spy on people and target minorities, you know, in foreign countries around the world. That's an interesting story, right? And in 2022, I think, the OpenAI board made clear observations when they tried to fire Altman: that he was not a reliable narrator, that he had multiple financial conflicts of interest that you really don't want. This is not a kid you want with his finger on the button,
Starting point is 00:26:32 as one board member said. So this is stuff we knew already, but the tech press kind of broadly decided to just ignore it. At the time when those board members came out, highly critical of Altman, I remember the general tone in the tech press was that these board members were all hyperbolic cranks, you know, to be disregarded.
Starting point is 00:26:50 Sam Altman is just a pioneer. He's doing things on the edge of innovation. You've got to give him a lot of slack. These board members just don't understand how things work. And now here we are four years later, and this New Yorker article basically confirms everything those board members said. Now, to be clear, a few of those board members might have been a little crankish, but several of them were also very reputable people who ultimately were proven right across all of this. So I'm interested to see where did tech journalism fail us in that arc? Because this is a very unsettlingly clear story, again, like Elon Musk's story.
Starting point is 00:27:22 they were not there telling us the truth. So my questions are: why weren't they telling the truth? What were the motivations for them to downplay those concerns about Altman, and why are they still downplaying those concerns? Well, and I remember at the time of Altman's ouster as well, they weren't initially
Starting point is 00:27:38 very forthcoming with like the reasons that Altman had been pushed out, and it seemed like, you know, for lack of a better word, their media strategy for why they had done this was not very strong, yet you have someone like Altman who can immediately call in a ton of resources, a ton of influential people
Starting point is 00:27:57 to start putting out the counter narrative to defend himself. And you immediately have people in the press who, you know, are open to taking his version of events, believing what he has to say, and then echoing that as, you know, the conclusive narrative for the story, right? Whereas the board is just being, you know, hit with questions. It doesn't seem to be properly prepared for, you know, explaining why they had really gotten rid of this guy, or nervous to, you know, really come out and provide a real full justification. And, you know, there were certainly plenty of journalists presenting the Altman side of things, but it feels like nobody did that more vigorously than Kara Swisher, right? Yeah, yeah. I think, I think she called them cloddish.
Starting point is 00:28:46 I forget the exact quote. I saw the tweet the other day. Cloddish was her assessment of that on Twitter at the time. It was just completely like, there's no way this could be true. Altman is a boy genius. They were really invested in Altman being a boy genius, right? And I've sat and watched. There was one interview I saw him do with a bunch of Indian developers where, like Musk, he just kind of spewed out a lot of stale sci-fi tropes,
Starting point is 00:29:12 kind of jumbled them together and sounded smart. The audience was just so gobsmacked by him and so impressed. And it struck me as so phony. And it really stuck with me for a long time. And I don't think people are very discerning about truth. I don't think they're very discerning about what's actually innovative and interesting. I don't think they're very good judges of character or intelligence in the United States. I mean, look at the people we elect to office and the CEOs that fail upward.
Starting point is 00:29:39 So I find the whole thing really fascinating and sad kind of simultaneously to watch this play out. And over and over again, too. You know, you see these same cycles play out. There's a lot of differences between Musk and Altman, but there's also a lot of similarities in the way that they were able to exploit a lazy press, tell people what they wanted to hear, get millions of dollars. And I think an argument, somebody will make the argument, oh, but they're good businessmen. They got a bunch of money, so that makes them good businessmen, right? But we're halfway through the story. You know, OpenAI is very likely going to be one of the
Starting point is 00:30:10 first big casualties of the AI bubble when it drops. They're likely to get gobbled up for a song by one of these bigger, larger companies. So I don't think you can say, oh, but he was a good businessman, you know, because I don't think we've really written the book yet. He was good at telling people what they wanted to hear, and he was good at accumulating a bunch of money. And those are skills, but they are not necessarily ethical or valuable skills. There's still several directions that this story can go in because we have not reached the end of it. You know, as Altman has even acknowledged, there is very likely an AI bubble, and the question is just when it pops, and we see what is going to happen from here.
Starting point is 00:30:50 And there was even a quote in that story from a senior Microsoft executive, you know, not named, saying, I think there's a small but real chance he's eventually remembered as a Bernie Madoff or Sam Bankman-Fried-level scammer, you know, in reference to Sam Altman, right? And so it's like, there are even people in the industry who, you know, recognize what this guy is. But as you say, the story really shows how effectively he has played people throughout his career in order to advance himself, right? And in order to enrich himself and to empower himself, whether that is going back to his first company, Loopt,
Starting point is 00:31:26 whether that is at Y Combinator and how he, you know, kind of used that position and eventually was, you know, forced out. Obviously, there are multiple versions of the story of what happened there. And then seeing, of course, how things have played out at OpenAI and how he has used that as a vehicle to expand his power and influence, not just in the tech industry, but throughout the world, right? He has been very successful at using, you know, everything available to him, his network, his relationships, to his advantage. But there's a real question as to this company that has
Starting point is 00:32:03 delivered him this degree of influence and power that he has now, the degree to which he has stretched the truth and lied in order to get it to this position. And what is going to happen once, you know, the air kind of goes out of the room and everyone sees that a lot of what he claimed was not accurate, not true. And I think a lot of people are already realizing that. But it feels like it takes a bit of time for the markets to catch up, or for there to be a certain event that causes there to be a shift. And all of a sudden, everyone, you know, accepts or realizes or admits that they knew something was wrong, but they didn't want to say it. Like, you know, with crypto, right? When it all imploded all of a sudden.
Starting point is 00:32:45 And, yeah, obviously there were critics throughout. And, you know, I think the critics had a bit more of a voice in the crypto time and were taken a bit more seriously, you know, depending on where you're looking and whatnot. But there were still a lot of people who, you know, bought into it, who echoed the stories, who didn't want to be too critical, too questioning until it became clear. It was, you know, all a load of bullshit that was imploding. And then all of a sudden, a bunch of people were like, oh, yeah, I always realized that there was something wrong here, you know. Yeah, it always tracks back to the money, right? People are still making money off this hype cycle. We're still at the front end of it.
Starting point is 00:33:16 I saw an announcement today that Allbirds was pivoting. Allbirds, the shoemaker, was pivoting to AI, and their stock jumped 300%. So it's pretty clear. We're still on the front end of this. On the other end, when people start suffering, and all the people, all the tech guys that benefited off the front end of the hype cycle have exited their investments,
Starting point is 00:33:33 and all the public is eating the losses on the other side of it. And you'll see a renewed sense of skepticism, and you'll see a lot of those same access reporters, like Swisher, saying, I saw this coming all along. I warned about Sam Altman from the beginning. I warned about, you know, she did that with Elon Musk, too. I warned about him from the start.
Starting point is 00:33:50 But clearly they did not. Clearly, they were part of the mythology making that was required for them to make all this money on the front end of the hype cycle. So I think... Very much feels with Swisher that it was almost like she had this kind of breakdown with Elon Musk in the relationship.
Starting point is 00:34:04 It kind of totally fell apart. And she needed, like, somebody else, to fill that position or that void or whatever you want to call it. And Altman was like there and ready to become her new, like, kind of favorite pupil or her new favorite CEO. And, you know, she has like done events with him. She has, she has very clearly, you know, kind of boosted the company, talked about how
Starting point is 00:34:27 great it's doing, talked about how great he's doing. I remember when she was promoting her book, she did an event with Sam Altman, where she tried to present herself as like, you know, a truth teller who asks the hard questions and holds the industry to account while here's a major tech CEO, like, pumping up your book and telling you how great you are. Yeah. Gonzo outsider. Yeah.
Starting point is 00:34:50 And I don't think it's entirely malicious. I think a lot of these folks really do believe in the innovative impact of technology. They want to believe. I can't always fault them for wanting to believe. But it's still journalism. It still requires skepticism and doing the hard work and realizing and having a self-awareness that you're too close to your sources, holding these events with the top billionaires.
Starting point is 00:35:12 And what also frustrates me is there's so many people in desperate need of platforming and elevation. There's so many academics, people doing real scientific work, so many engineers making cool shit out there. But most people can't name a single Musk company engineer outside of Musk. Most people can't name a single OpenAI engineer outside of Altman. So if you're going to be a journalist, why not spend some time elevating those folks? If you really are that in love with technological innovation, why don't you spend a little more time with the people actually doing the work and a little less time with the extraction class masthead weirdos that we've plunked at the top to accumulate money.
Starting point is 00:35:48 Because I think, again, what it shows is that the interest is in the money. It's not in the tech. And I see that time and time again. Definitely. You know, you were mentioning before that there are people still making money off of this, right? And that's part of the reason why it hasn't imploded. And there are certainly people waiting to make their money or to cash out or what have you. And that is part of the reason that, you know, this is sustaining itself.
Starting point is 00:36:13 And I feel like we've seen this in the past where, you know, you look at the attempted WeWork IPO. And there were people really trying to get that over the finish line before it all fell apart so they could make their money. And it just didn't make it there, right? Whereas there are a lot of other ones that do get it there only to crash afterward because, you know, the people
Starting point is 00:36:35 who want to make their money have finally made it. And it doesn't matter anymore. And I feel like, you know, this discussion of an OpenAI IPO is really important to these conversations, right? And to the question of where the AI bubble or the AI industry or, you know, all this money that is infused in generative AI is actually going. Because I guess part of the question is if they're able to cash out, if OpenAI is able to get to the IPO, then is there less incentive to try to keep this thing going in the way that it did before if they can make their money and kind of get out at that moment? There's no logic to any of this, right? You know, as you see with the Tesla stock valuation, there's no real sane logic driving any of this. So they're going to ride this thing into the earth. They're just making money and they're going to ride it into the earth until they can't,
Starting point is 00:37:19 and then they're going to move on to something else like quantum computing. There's going to be a new thing that comes out and they're going to ride that. These are not people that care about tech. They don't care about people, a lot of them. They care about money. That's what they care about, it's not subtle. As somebody who writes a lot about consumer protection and government regulators, I think there's a real risk here that we have an AI bubble hitting us at the same time as all the rampant Trump deregulation and gutting of regulatory safety agencies and consumer protections.
Starting point is 00:37:46 I think we're in for a really rocky next five years. And I think the AI bubble is really going to be only one small part of that. But then again, for people who've waited for the Tesla stock to collapse, it's amazing how long this is taking to really flesh out and to see any sort of real world accountability for any of this. So I'm not going to get into the business, I don't think, of ever predicting when it'll happen. I just know that this is not sustainable when Allbirds is pivoting to AI. And all the CEOs are weird white supremacists with heads full of cottage cheese. That's not a sustainable vision for anybody.
Starting point is 00:38:19 Yeah. When I saw the story about Allbirds pivoting to AI and how it caused the stock to jump, one of the first things that came to my mind was when all the companies were making like Metaverse announcements and saying that they were planning to do things in a metaverse a few years ago. That's another perfect example of what I've been talking about. The pivot, you know, the $83, $85 billion that Mark Zuckerberg spent on pretending to be interesting and just the money they burned.
Starting point is 00:38:46 They just burned through money to create the most derivative cack, just the most uninteresting VR. He really thought, he was really confident, that he could just basically buy domination of the entire video game, AR, and VR industries, right? And the tech press was right there with him. They were like, yep, super exciting, boss. That's really innovative stuff. You know, there was, you know, you saw some skepticism of the metaverse stuff.
Starting point is 00:39:10 Like, this looks stupid. You know, the legs aren't there. But they were perfectly happy to accept the company's rebranding effort. You know, that whole Meta rebranding effort came as they were dodging privacy scandals. And the press was really happy to like sell the idea that Mark Zuckerberg really was revolutionizing work and play, right? That this guy who hadn't innovated, I don't think he's innovated, in 20 years, right?
Starting point is 00:39:32 It takes some skill to create such a large ad monopoly. I'll grant that, but I don't think he's done anything interesting in literally 20 years, but you can pluck, again, any of a million stories from the newswires about Meta or Facebook. You know, and you won't see any of them mention the fact that he fails all the time. The stuff he makes isn't very well liked. You know, they get very excited about the Meta Ray-Ban glasses. Oh my God. Yeah.
Starting point is 00:39:57 Like you'll see them talk, well, at least this is selling well, but I never see anybody wear those. Now they've got the reputation as the perv glasses that people are using to, like, stalk women. I just, it's stunning to me, with all that's happened and how clear and obvious all the failures are, that, like, basic journalism can't just say so. And I'm not just talking about, like, there's the Business Insider, you know, Fortune, Forbes. There's certain websites that are mostly purely business or like they're written for an MBA, you know, to feel good about themselves. But I also see this stuff across Reuters.
Starting point is 00:40:27 I see it in the Associated Press. I see it in, like, Wired magazine. They do the same thing where they avoid contextual history. They can't be honest about the thing they're writing about. It's like they're afraid of offending anybody. They don't want to lose advertisers. They don't want to upset sources. They don't want to upset ownership.
Starting point is 00:40:45 So you get this weird simulacrum of journalism. You get this weird, you know, it's like a Ken doll of journalism where the genitals have been sanded off to create a smooth hump so that nobody gets offended. That's kind of a gross visualization that I come back to a lot. But that's what I think of, what I think of these outlets. It's like they're journalism, but they're just like dull, bland garbage designed not to offend anybody that's just not really useful. I think you'd be better off going and watching The Muppet Show for a half an hour, and you'd probably come away better informed than most of this stuff. It's just not good.
Starting point is 00:41:18 It's not interesting. They're not really interested in tech, and it's embarrassing. And I think we have to really, I'm hoping this era ends with some sort of renaissance, you know, in terms of like, all right, let's get back to focusing on people and what matters and building interesting stuff and push some of these VC ghouls off into the periphery where they belong. I think they've really dominated the discourse. It's gross.
Starting point is 00:41:39 Yeah, I think so too. And I'm not gonna be able to get that image out of my head now for the next little while. Yeah, my apologies. That's what it is. It's like a bland, inoffensive fake journalism, because they're so afraid of offending people, or not offending people,
Starting point is 00:41:56 but losing ad revenue, losing clicks, offending sources, offending the ownership, and you see it all the time. No, definitely. And I just wanted to go back to what you were saying about the bubble and like the moment that we're in. And I agree with you, right? I don't think there's much value in trying to predict when it is going to implode because I think that there have been multiple moments, you know, so far where it's looked like, oh, it has to go right now. You know, it has to be on the cusp of finally imploding. But it keeps going, right? Because, you know, there are many reasons for it. There's a lot of government money going into AI.
Starting point is 00:42:32 There's still a lot of hype around that. There's still a lot of people hoping to make money off of it. There's still a lot of reason for the companies to keep sustaining this. And as you say, could this be like a Tesla-like example where despite all the realities that should cause the valuation of this company to be far less than it is, it just keeps being sustained because so many people are financially invested in keeping it that way because they would lose a lot of money if something else happened, you know, if it went in the other direction, right?
Starting point is 00:43:06 So, yeah, I think we still need to be watching where this is all going. We still need to be understanding how the company is operating and what's happening there. But in terms of predicting when an implosion is going to happen, it's not going to be nearly as easy as, say, with cryptocurrencies and things like that, you know. No. No. I think a lot of it will be tied to Trumpism's fate, you know, and the MAGA contingent's fate. As he loses power and wanes, we're going to have, you know,
Starting point is 00:43:29 there is going to be a resurgence of an interest in rebuilding the public trust, I think. I don't believe we're stuck in a permanent kakistocracy of just idiots and dipshits running this country into the ground. I won't allow myself to believe that that's just going to stay a permanent fixture. Maybe I'm deluding myself. But I do think eventually on the other end of this, there's going to be a need to recognize that, you know, regulators are important. Real engineering is important. Real science, foundational science investment grants, you know, restoring public media, restoring trust in cornerstone institutions. I do think eventually we're going to hit that point. And I don't think these huge idiot bubbles
Starting point is 00:44:06 are going to be useful at that point. I think you might see a slight shift away from them. I mean, America is what it is, right? You're foundationally always going to have what you see around, which is just this mad obsession with wealth and artifice, right? I think we're a country that's really in love with artifice. The illusion of smarts, the illusion of class, the illusion of power. And I think Trump is perfectly representative of that. And I think as MAGA wanes, I think you'll start to see maybe some sea changes. But again, we've been waiting for him to wane for a decade. So again, it's hard to get into the prediction business. But, you know, historically, based on historical evidence, I don't think we're in this permanently. And I'm hopeful,
Starting point is 00:44:49 again, that there will be some sort of renaissance that takes us in a better direction here, a little bit over the horizon. I think it's very important to hold on to that kind of hope, right, that things are going to get better instead of just believing that, you know, okay, everything is bad now and it's going to stay bad and nothing can ever get better. Like, you know, that is almost like a self-fulfilling prophecy then, right? You need people to have hope and, you know, hope that's grounded in something that things are going to get better and that that is possible and that there are things that can be done in order to realize that, right? Yeah, as a reporter, I wouldn't wake up in the morning if I just thought
Starting point is 00:45:25 there was no point to this. It was just going to be a slippery slope down into more, you know, just a perpetual horizon of Elon Musk selling me gibberish. I don't think I could make it through the day. So no, I refuse to believe that. And I do still hold up a hope that, you know, humanity has a better heart than what we've seen in the last decade or so. Absolutely. You know, and I feel the same about climate change too, right? It's like, things are looking bad, but we need to still have hope that we can actually do this and turn things around and address this problem, right? The young people I meet don't have a choice. They're like, we don't have a choice.
Starting point is 00:45:56 We have to live in this world. We're going to try to build something better. And they're just, you know, for those of us that have seen several incarnations of this level of stupidity over the last 20, 25 years, it's a much different viewpoint from a kid who's coming up into this and wants to build something, not just wants, but has to build something better for themselves. And I think we can do it. I just, you know, it would be nice if we had a journalism that was capable of telling people the truth as a backstop. And we had, you know, a resurgent interest in funding education properly. That
Starting point is 00:46:23 might be helpful. Yeah, instead of replacing teachers with chatbots or something. Right, exactly right. Yeah. I did want to ask you, you know, we've been talking a lot about Altman, we've been talking a lot about media coverage of the tech industry more broadly. As I was saying earlier, it does feel like there is kind of a public swing against the tech industry, against AI, against these major billionaires, especially as they have cozied up to Trump and, you know, become kind of implicated in that kind of politics. And I feel like, you know, that is kind of represented in a way by these recent attacks on Sam Altman's house in San Francisco. There was one initially with a Molotov cocktail, and then there was another one a couple days later. It was surprising to me, I feel like it was less surprising
Starting point is 00:47:08 that one happened, you know, because I, there was a couple months ago, there was a story about a guy in, I think, Tennessee who was looking to bomb the XAI data center and he was caught before, you know, he could actually put the bomb together or try to do it. And so it was like, I was like, okay, this stuff is out there, right? There are people who are angry about this, who want to do something about it. And so when I heard that Altman's house had been attacked, I certainly figured that or felt that this was like an escalation from, you know, where we have been in the past, obviously. But then to see just a couple days later that it had been attacked again, this was like,
Starting point is 00:47:42 okay, this really feels like something is shifting now if this can happen like multiple times, if this is seeming to become more regular. But I wonder what you think about that and what we've been seeing there. The rage against AI is white hot right now. It's wild to go out there and look at just how pissed off people are. And I don't think they're just pissed off at AI,
Starting point is 00:48:02 although they have very good reasons to be pissed off at AI. You know, with the energy consumption in the climate era, the attack on labor, people that are dictating the coordination and trajectory of AI are generally kind of terrible people.
Starting point is 00:48:14 So I think that, but I think a lot of the rage is just at the extraction class, right? We've had a decade of this, culminating in Trumpism, which is a grotesque, bulbous kind of, you know, extraction class icon. So, yeah, I'm not surprised. Sam Altman has made constant promises that, you know, Gen Y should be really happy that they're entering the market right now. And Gen Y is entering the market, and there's no jobs, and there's very little hope. And I understand why there's anger. I understand it completely. They
Starting point is 00:48:43 should be angry. This country has been hollowed out by corruption completely. I think the young generations can see it much more clearly than older generations can, and they have every right to be pissed off, and I understand why they're pissed off. And it's not just AI, you know, the tech industry lied constantly for the decade preceding this. AI is more recent. These are companies that spied on everybody constantly, lied about everything constantly. Facebook was grotesquely just growing into foreign markets with no concerns about whether their information platform was causing genocides. You know, Facebook was also big on going into India and trying to dominate the entire internet. They would offer like a free version of
Starting point is 00:49:20 the internet that was, you know, if you remember the Free Basics stuff that they tried to offer, they would offer free internet that was, you know, curated by Facebook where you could only access certain websites. These companies just engaged in repeated terrible behaviors at impossible scale, right? And then when authoritarianism came to town, they immediately dropped everything and cozied up to them completely like, oh, we're not going to engage in content moderation of your racist, bigoted propaganda on the internet? Sure. You know, they immediately, immediately cozied up to authoritarianism. So I don't understand why anybody would be surprised
Starting point is 00:49:51 that the public is angry about this, right? These are vile people. And the authoritarians in charge are vile people, and they've completely thrown all their ethics in the toilet to partner up. So yes, people are going to be mad about that, and I don't think this is going to be the end of it. I think it's going to escalate.
Starting point is 00:50:07 I think there's been a top-down class war going on from the extraction class downward, and I think the public is starting to wake up a little bit to that, and it's going to take all sorts of colors and shapes and forms. Some of them violent, some of them not, some of them smart, some of them dumb. And I don't think it should surprise anybody. If you've paid attention to U.S. history in the last, you know, 10 years, I don't think the fact that people are violently angry should be remotely surprising to the billionaires
Starting point is 00:50:36 who have a lot of responsibility for that anger. Part of me does wonder what it does to their paranoia. Like, you know, if you look at someone like Elon Musk, it's been clear for some time that he has really been paranoid about his safety and the way that the public sees him. I remember a few years ago, there was a story that someone had been tailing a car that his son, X, was in. And he felt that it was like, you know, someone who was looking to get him because he thought Elon Musk was in the car and blah, blah, blah. And it came out later that it was like a Grimes stalker that, you know, was just trying to, you know, I guess, find Grimes or whatever.
Starting point is 00:51:17 But it's like he is someone who has surrounded himself with security. He has created a vehicle like the Cybertruck that seems like, you know, designed to be impenetrable to protect him from the threats that he feels are outside. And it feels like this mindset has taken over a lot of these tech billionaires where they have very, very much insulated themselves from the world that exists around them, you know, in their kind of closed off communities with their securitized vehicles, you know, in their private jets and their exclusive terminals and, you know, their exclusive areas where they go so that they don't need to interact with the public. And part of me wonders
Starting point is 00:51:55 seeing like actual attacks against tech billionaires. And of course, Sam Altman isn't the first one, whether this, you know, further pushes them into the arms of the security state and, you know, supporting the repressive crackdowns on people's rights and things like that. Absolutely. Yeah. They're going to use this as evidence that the AI detractors are hyperbolic weirdos and zealots, and the more violence there is, it kind of is counterproductive to any kind of progress, because, you see it already, they will just treat this as outliers. You know, these radical extremists don't understand what we're building here.
Starting point is 00:52:32 We're innovators. Why are they so unreasonable? You shouldn't listen to what they say. That's going to be the play. I mean, you already saw it in his responses to the New Yorker article, right? You'll notice with Sam Altman pretty consistently that he never really takes ownership of anything he's done. You'll see it in headlines, especially. He tries to do this thing where he's not personally responsible for anything that's happening. He pretends to be on your side a little bit,
Starting point is 00:52:55 right? His company failed to create functional suicidal ideation guardrails in his chat about it, but he won't honestly own anything. And legally, you know, for many reasons you can't. But yeah, I think they will absolutely take violence as an act to harden up their security. I was surprised his house was that findable to begin with, quite honestly. I would have thought. thought that a lot of these guys would have built underground compounds long ago. He does have one of those as well. Yeah, I know Musk does. I know Zuckerberg does. So, yeah, I think they're only going to harden.
Starting point is 00:53:25 These are not people that are empathic and open to the Pleb's concerns. You know what I mean? If they were, we wouldn't be in the situation. So I think they're just going to get harder. They're going to get tougher. They're going to get more militaristic. And they're going to increasingly frame critics of what they're building. as outliers and radicals, much like the authoritarian's due with any criticism of them.
Starting point is 00:53:51 They're ideologically going to align. But in a country that's just desperate for infusion of money into functional resources and infrastructure, it's just, I don't think it's going to be a sustainable project for them to continue just being purely extractive. But we'll see. It's going to be an interesting decade here. And I just not entirely sure what the other side of this looks like. Just as a curious person, it's fascinating.
Starting point is 00:54:14 It's a terrible time to be alive in many ways. It's a very stupid time to be alive. Every morning you wake up and it's just the untold horrors are very dizzying. But it's also, if anybody that's interested in history or human beings, I find it fascinating from stem to stern. Yeah, no, it absolutely is, right? And I feel like we already see that with, say, Mark Andresen's Technooptimus manifesto where he's very clearly kind of calling out the enemies, you know,
Starting point is 00:54:38 the Luddites and the communists and the safety people and all this kind of stuff. And then, you know, the kind of long-termist ideas that these people echo where it's very clear that they don't seem to see much value in, you know, the lives of regular people. No. And they hate things like, I always like that they hate the humanities so much, like Elon Musk and all of these guys hate the humanities. But if they'd bother to study the humanities, they wouldn't make such stupid decisions all the time. They're just so mad, you know, the humanities shouldn't exist. We should eliminate the humanities. But if they actually understood the novel that they just read, they wouldn't have screwed up so much.
Starting point is 00:55:11 So, yeah, they're a very weird sect, and it is kind of a religious sect of people obsessed with money. And tech, again, is incidental to their goals. You know, it could be any other industry. You know, it just happens to be tech right now because that's where all the investment of money is. But in another era, it could easily be a different industry that they use as their spearhead. Definitely. And I feel like we see the religious angle of it more and more with the AI kind of moment and stuff. Yeah.
Starting point is 00:55:37 Who is that CEO Karp? Yeah, Alex Karp. Guy, like his statements are just so bizarre and radical. And yet again, you'll read an article about him the next day from a reputable news organ. It's like he didn't say anything insane at all. It's like, oh, what wonderful little nuanced, exciting innovation is he building for us? And then there's no reference to the fact that he's documentably quite mad. Yeah, that wonderful innovation from Palantir that we love.
Starting point is 00:56:05 Carl, I have one final question for you before I let you go. And it's kind of like a two-part question, I guess. But, you know, we've been talking about the way that the media reports on the tech industry, and we've been talking about Sam Altman in particular. And so I wonder, you know, do you think that this renewed attention on, you know, Altman's predilection for lying, you know, along with, you know, the way that people are clearly turning against him? Like, do you think that this is going to have actual real consequences for, you know, this
Starting point is 00:56:33 man who does seem to have been so empowered these past number of years? And do you think that there's any prospect? or hope of the media really changing the way that it reports on this industry as public opinion keeps seemingly turning against it. I think there'll be accountability for him eventually. I think this stuff kind of adds up. Oh, I remember that article a few years ago that said that, oh, I remember the board said this, and then it happens again because he's not going to change his stripes.
Starting point is 00:56:58 These guys can't change their stripes. So they're going to keep making these mistakes. And each time, I think it adds up in the public consciousness that, oh, these are not reliable narrators. And I can't really trust them. So on the short term, I don't think anything changes for him. You know, you've got that IPO looming. I think there's going to be a lot of hype.
Starting point is 00:57:13 There's going to be a ton of CEO set of thing, journalism about both the SpaceX and the Open AI IPOs. It's going to be, it's going to get dumber, I think, for a while still here. But I do think ultimately there'll be accountability when the money crash happens. You know, and especially if the U.S. economy really tanks out and people really struggle with this expensive gas and expensive services, I think that could potentially get much worse with the destruction of our regular. like I said. So I think if they are confused by the anger headed their direction right now, I'm not sure they've seen anything yet. I think it's going to get worse if people's independent direct immediate realities are more painfully impacted, which I think they will be.
Starting point is 00:57:53 As for tech journalism, I think we have to untether it from corporate power and advertising. I think that's a priority. I think if we're building anything in the new age, we have to publicly fund and crowdsourced journalism. You have to stop giving money to shitty corporate outlets and start giving money to independent reporters, start giving money to worker-owned news outlets. I think we have some control and agency over what we're sharing, what we're consuming, what outrage bait we're clicking on and retweeting or re-skating. I think it's important that the public develops a sort of immune response to bad actors and trolls. So there's a lot we can do to kind of reshape things. And yeah, I think it's possible to rebuild media, but right now it's not
Starting point is 00:58:33 looking great. The layoffs have been immense. The consolidation is immense. You've got, you know, deals like the warder and paramount mergers. Post-Trumpism could be something else, but it's a functional media that serves the public interest is something we're very much going to have to fight tooth and nail for in this country because it's very clear that corporate power does not want an informed electorate. So that's where we stand. You know, it's an ugly landscape, but I like to believe that winning some of these fights as possible. Yeah. Like we said before, right?
Starting point is 00:59:06 I think we need to hold on to that hope because that is really important. And otherwise, I think there's no prospect of even trying to have those wins, right? We need to believe that it's possible for it to become possible. Carl, it's always great to talk to you and to get your insights on all this. Thanks so much for coming back on the show. Yeah, I always like to talk to you and I appreciate all the work you do. Thank you. Carl Bodie is a freelance reporter and writes the fine print newsletter.
Starting point is 00:59:29 Tech Won't Save Us is made in partnership with the Nation magazine and is hosted by me, Paris Marse. Production is by Kylie Houston. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters and help us meet our goal for the show's sixth birthday by going to patreon.com slash Tech Won't Save Us and making a pledge of your own. Thanks for listening and make sure to come back next week.
