Tech Won't Save Us - The Fight Over the Future of OpenAI w/ Mike Isaac

Episode Date: November 23, 2023

Paris Marx is joined by Mike Isaac to discuss the drama around Sam Altman being temporarily removed from OpenAI, what it means for the future of the company, and how Microsoft benefits from its partnership with the company. Mike Isaac is a technology reporter at the New York Times. He's also the author of Super Pumped: The Battle for Uber.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Support the show on Patreon. The podcast is produced by Eric Wickham. Transcripts are by Brigitte Pawliw-Fry.

Also mentioned in this episode:

Mike summarized the OpenAI-Sam Altman affair with his colleagues in the New York Times. He's been reporting on it since it began.
Paris wrote about the Sam Altman-Microsoft relationship in Disconnect.
Semafor reported that in 2018, Elon Musk tried to take over OpenAI but was pushed out instead.
Forbes reporter Sarah Emerson went through Emmett Shear's old tweets — and yikes.

Transcript
Starting point is 00:00:00 People are rooting through his garbage now in a way that they were not before. Like you've seen these stories sort of circulate all across on Twitter, on Reddit, on Hacker News or whatever. Like I think the like rumor mill of what did he do has got people digging around. And like I couldn't tell you either way whether he has to worry about that or not. All the attention's on him. And that is good and bad. And that is the case for a lot of people whenever this shit happens.
Starting point is 00:00:45 Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx. And this week my guest is Mike Isaac. Mike is a technology reporter at the New York Times and the author of Super Pumped, The Battle for Uber, which became a television show made by Showtime. Now, I'm sure you've seen the drama that kicked off on Friday when OpenAI's board kicked Sam Altman out of the company. There was ample questioning of why that had happened. Then the next day, there were reports that Sam might be coming back. Then on Sunday, the board hired a new CEO. And so it looked like Sam wouldn't be coming back. And then he was hired by Microsoft, or at least so they said. Then Monday morning, it looked like Microsoft wasn't actually hiring Sam and he might be going back to OpenAI after all. And now I will say this interview was recorded on Tuesday afternoon, knowing that the story would probably change again after that. But we did have to kind of, you know, record it at some point and we couldn't leave it until the
Starting point is 00:01:34 last minute. And then, of course, Tuesday night, which we don't discuss in this interview, but we say is very likely a possibility. OpenAI did put out a statement saying that Sam Altman would be returning and that the board had been basically refreshed to ensure that the people who opposed Sam Altman was taken off of it. And now it's basically stacked with supporters of Altman and his vision for the company. And that includes all white men, including Larry Summers, who you've probably heard of before. He's quite an influential figure, unfortunately. But I guess one of the relevant things for this conversation is that when he was head of Harvard
Starting point is 00:02:11 University, when he was president of that university, he actually argued that there are fewer girls or women doing STEM fields because they are genetically inferior when it comes to doing science and math. And so this is one of the people who is now on the board of OpenAI to allow Sam Altman to do whatever he wants. And so I think that this is a fantastic conversation, even though we didn't get to this kind of final point of things by the time that we were recording. But we have a lot of insight into OpenAI itself, into the divides in that company, into Sam Altman and kind of what we might be learning about him. And of course, to that relationship with Microsoft and what Microsoft gets out of having this partnership with OpenAI that will now surely be continuing as Sam Altman is restored to the top job in that company. And since we didn't get to Sam Altman's return, I do want to provide a bit of an opinion or a reflection on that. One of the things that we discussed in this interview and that you've probably heard is that OpenAI was founded as
Starting point is 00:03:14 a nonprofit, right? That it did have kind of particular intentions at its founding. We can debate as to whether they were good or not, but at least it was, you know, to try to move away from the profit motive or to not be like so deeply affected by it. And continually there's been kind of a chipping away at that. And it looks like with Sam's return to the company, it is going to ensure that that kind of nonprofit influence or restriction will be basically gone entirely. And the way will be cleared for Sam Altman to do whatever he wants. You know, I described it as anything holding Sam Altman back is gone now. OpenAI is his fiefdom, and he can use it to do whatever he wants. And, you know, the Monarch, the company that has more money
Starting point is 00:03:58 that's enabling all this Microsoft, is not going to step in and try to do anything about that, but is rather going to enable him as much as they can. And so while this was a shaky few days, Sam Altman has come out with more power than he had before, which was already quite significant. So with that said, I hope you enjoy this conversation with Mike Isaac. A couple of things to note before we get into it. We did a kind of live Q&A for Patreon supporters over the weekend that went really well. If you are a Patreon supporter and you were not able to join that for whatever reason, you can still go on Patreon and find the link where you can watch back the live stream and the fun that we had over the hour and a half that we were doing it. On top of that, we've also started putting premium bonus episodes on the Patreon for,
Starting point is 00:04:41 you know, people who support the show. These are kind of longer versions of the interviews that I did with experts for the Elon Musk Unmasked series. There was so much insight that they were able to provide into the tech industry, into Elon Musk himself, that I thought it made no sense to just, you know, not publish them anywhere. And obviously I couldn't just stick them all up on the main feed. So I figured, you know, a little bonus for people who support the show, who make it possible for me to keep doing this. That made sense. So if you do want to receive those bonus episodes, you can of course sign up to patreon.com slash tech won't save us as well. Now, final thing before we get into the interview, of course, if you do enjoy the show, if you do enjoy this
Starting point is 00:05:19 conversation, make sure to leave a five star view on Apple podcasts or Spotify. You can also share the show on social media or with any friends or colleagues who you think would learn from it. And if you want to support the work that goes into making the show every single week, you can join supporters like Tom in Oxfordshire, UK, another Tom in the heart of Silicon Valley, Travis Dickey from Calgary, Valerie from Minnesota, and Sahil in Emeryville near San Francisco. So thanks to all those supporters. And if you want to join them, you can go to patreon.com slash tech won't save us, make a pledge of your own and get access to those premium episodes, future live streams, and all the other things that are on there. And I would say stickers are going out in the mail pretty soon. So you'd be able to get some
Starting point is 00:05:58 of those as well. So thanks so much and enjoy this week's conversation. Mike, welcome to tech won't save us. Thank you. Thank you for having me. Really excited to chat. You know, obviously, I've been following your work for ages. Loved your book, Super Pumped. Always meant to have you on the show. And for some reason, it just hasn't happened until now. But with all this going on, you know, it was finally time for Mike Isaac to come on Tech
Starting point is 00:06:19 Won't Save Us. So I'm very excited. I know. We needed another, like, implosion to make it happen. I'm glad we did. Thank you so much. I always love a good tech implosion, right? So you're on the show because you've been reporting on this drama that we've all kind of seen playing out over the past few days, really started on Friday, you know, with the announcement that Sam Altman was getting the boot from OpenAI, which seemed like a real shocker. So what happened exactly on Friday? And what was
Starting point is 00:06:53 your kind of reaction when you heard this news? Yeah. So the past, we're recording this on Tuesday, the past few days have been a super blur. And like everyone, it was funny, like everyone was like, well, time to go home for Thanksgiving, Friday, nothing, nothing's gonna happen because no one wants to work on a Friday. I mean, there is the like news dump before holiday thing. But anyway, around I want to say 1220 or something. My colleague Cade Metz gets word that this is happening. And then like seconds later, a blog post hits from OpenAI, like from comms, essentially giving this pretty scathing letter ousting Sam Altman, the CEO of what is probably one of the most valuable private AI startups in the world from the company, which is crazy because he's not a household name. Not most people know who Sam
Starting point is 00:07:51 Altman is. I spent a lot of my week explaining who this guy is, but he's a well-known figure in Silicon Valley and has been around for a while. Ran Y Combinator, which is a startup incubator out here, which like all these companies have come out of. But his celebrity has really risen in the past 18 months, I want to say, as OpenAI, this generative AI company that produced ChatGPT, has just gained momentum. The valuation has shot up in the weird mechanics of funding rounds that happen. And everyone's trying to get in there and all the other companies are freaked out and want to compete with them. So it was just a big shock for them to say, not only are we firing our CEO, we're doing it for cause because he's been not consistently candid with us over time.
Starting point is 00:08:40 A euphemism for lying, basically. Yeah, yeah, totally. Usually, like even if some bad shit went down at a company, you don't usually like stab them on the way out the door. The decorum in the Valley is like, give them a polite goodbye. And everyone like is like, they did a great job. And, you know, they're going to be an advisor, which of course they're not. And like, they just walk out. But that is not what happened here. And everyone freaked out. It was crazy. Yeah. You know, it's well known that you kind of fail your way up in Silicon Valley. Right. But even you think back, you know, we can talk a bit more about open AI, but you think back to like 2018 when Elon Musk tried to take it over and
Starting point is 00:09:17 he certainly didn't need the nice kind of, you know, we let him get away nicely, but we find out later, or at least Sam Altman said later, that what actually happened, they said at the time that there was a conflict of interest because Elon Musk's Tesla was working on self-driving technology. And so he decided to leave because of this conflict between OpenAI and Tesla. And Sam Altman said more recently that, oh no, actually what happened is Elon thought we were behind, tried to take over the company. The other founders were like, no, we're not going to let you do that. And so then he took off and stopped giving them the money that he promised them,
Starting point is 00:09:56 the billion dollars of funding. Right. So this kind of stuff happens, but usually, you know, you don't get the, this guy's a liar. We're kicking this guy out, you know? Yeah. That's the like gentleman's agreement, I guess, amongst these people is like it can be very dramatic and often is, especially when like millions or billions are at stake. But they'll just keep it sort of quiet for a while and then maybe it slowly trickles out over time or something if people care about it. Because like, again, like startups are messy. A lot of this shit happens a lot of the time and doesn't go reported because most people don't care or it's not valuable enough or like it's a stupid little app that whatever like is not important enough. But this happened to be, you know, Elon Musk and like all the sort of big wigs around here. So contrasting that squabble to this very high profile squabble that was immediately apparent something was up but it also kicked off this sort
Starting point is 00:10:52 of spiral in the valley of like everyone trying to figure out what he did right like and like you read a blog post like that you're like okay there's a laundry list of things that ceos get removed for right it's like did he do fraud did he have some sort of sexual scandal did he do whatever like just like what are the like cardinal sins of getting booted from a company as a ceo and all of those rumors to be clear were floating out there right those and many other things you know whether there was more human labor involved there than they were saying. And, you know, all these other kinds of speculations, because there was this void into which people were placing all of their, everything they were thinking about Sam Altman and OpenAI and these technologies and everything. Do we know though, like, have we figured it out
Starting point is 00:11:39 yet why he was actually booted from the company or is that still a gray area? Still up in the air as far as i know i mean to your point the information void was what got the whole thing spinning and like it was amazing it was on twitter it was on hacker news the y combinator website which is like where every engineer goes to post in the valley and ruminate. And it was on the OpenAI Reddit forum. And then just people texting, calling, meeting, da-da-da-da-da. It was just like the gossip mill was in full effect. But I think that was a real board miscalculation because that fed the hype train around this, right? Then the story became, what did this guy do? And I think that it was very
Starting point is 00:12:25 clear that they didn't have communication strategy. This was just sort of like, okay, we got to push this guy out, send a message and then assume everything will, I don't know, everyone will be like, okay, let's go do it, you know? And that was just not what happened. It was wild. Yeah. And so I guess, you know, what has kind of been reported over the last few days is essentially there was like a schism or like a divide within the company over how to approach, I guess, its mandatever, I might be butchering that pronunciation, and was very much kind of the one who initiated this push by the board to get SAM out. Can you talk to us about what that divide is? And like, I guess the different camps that exist within the company and the thinking on these AI tools and how to approach it? We're still trying to figure out like, the other dynamic here is that there's certainly folks who are talking and
Starting point is 00:13:25 spreading a lot of stuff to journalists and if you're paying attention you can tell who it is and who's being told what and who's being sort of used as a mouthpiece for someone trying to angle their way in and my colleagues and i have been very careful throughout all this to like i mean if you go back like this is the thing if you go back and look at like the many tweets or stories or shit oh there's a lot of wrong stuff that you're gonna have to like wait and see what shakes out but like it's been fascinating because like they're using the media very effectively to try to push this campaign to reinstate sam right and like normal people don't know that this is going on. They're like, barely just sort of like, okay, there's a headline. I don't really know what that
Starting point is 00:14:09 means. But here's a new drip, drip, drip, or whatever. And I think that was actually working for a while. It's gone sort of back and forth. But anyway, just to your question, there are differing sort of camps in this company and like what they believe in. And like, it's a super convoluted and I don't even know all those details of these philosophies, but it essentially boils down to the effective altruists versus the accelerationists. Those are the like two camps and effective altruists. This is a butchering of their philosophy, but essentially are very concerned about the effects of AI on the world and are often labeled as like doomers because they have the like Skynet thing in their heads all the time. Terminator is going to become self-aware and nuke us all. probably is where Sam Altman falls into the camp, which is like, no, we need to speed up development of this tech because think of all the people who are not being served
Starting point is 00:15:11 by it being out there. If there's a continent full of kids in Africa who don't necessarily have direct access to medicine equally across different countries, shouldn't we be able to at least give them a chatbot doctor so it gives them a minimal viable standard of care, even if it's not as good as a human? Like, this is the, like, thinking, you know, and there are certainly logical holes in that argument, but that's how they think about it, I think. And so that has been like a point of contention inside of the company, but really across the Valley and across
Starting point is 00:15:52 these camps for years and years now. And like OpenAI is almost a perfect vehicle and its implosion, a perfect vehicle for how that divide has really split people at this point. I think it's fascinating to hear you describe it that way, because I feel like when you look at Altman as a character, as the head of OpenAI, if you look at his actions, it very much aligns with this accelerationist camp, right? But then if you look at his rhetoric, he's using the rhetoric of both camps, right? We need to get these tools out there. People need it. But then at the same time, AI might destroy the world and kill humanity and all this kind of stuff. What do you make of the way both of those arguments that don Altman is, I guess is what I would say. He has gone from cult valley figure to figure on the international stage in a very short amount of time.
Starting point is 00:16:55 And I think that also probably plays into why some folks on the board are wary of him. But he's very smart in using the language of EA folks. And look, I will say that I do think that everyone involved in these conversations sort of believes parts of what they're saying, if not all of what they're saying, right? I do believe that the Doomers, as they call them, the EA folks, are concerned legitimately. This is something that's sort of in their mind, just as I think that Sam probably also believes in the, we need to speed this up stuff. But I think if you boil it back down to the original sort of purpose of this is a nonprofit, we're doing this
Starting point is 00:17:37 for ideals and then watch it sort of spin out of control as money gets involved and as it becomes like a more lucrative enterprise, that to me is the more sort of salient divide throughout a lot of this. Like the way I keep talking about it internally is like idealism running into capitalism and like it sort of like fighting out. And it seems like one side is winning if Sam makes his way back in and if he gains certain concessions, if he rids himself of the board, if he is able to change the profits or the structure of the company. Like that definitely says something about who won this sort of fight, if that makes sense. Absolutely. And can we talk about like I want to go back to the events of the weekend and how it all played out in just a second. But, you know, I think that this is an important point of it that you bring up right there as well, right? Because
Starting point is 00:18:28 a lot of these companies that we're used to are private for-profit companies, and then eventually they, you know, kind of go public. And, you know, that is kind of the route in Silicon Valley, right? But OpenAI was set up in 2015 as a non-profit, explicitly to not have this kind of profit motivation as it was trying to like figure out how to create an AGI that wasn't going to kill us or whatever. This was at least, you know, part of the justification there. To what degree did that kind of structure of the company and the fact that it kind of built a for profit arm underneath it in 2019, to what degree does that kind of play into the broader dynamics that are happening here? I think it fucked everything up in the sense
Starting point is 00:19:10 of like what they wanted it to be, right? Like, I mean, it would probably be a very different company and these people probably wouldn't have been involved in it. Like that's the whole genesis of it is that they wanted it to be a nonprofit and they didn't want market forces dictating what this should be. But it would have been a lot cleaner and probably a lot less board struggles and power plays at the top if they had just made it into an LLC from the beginning. But that's not what happened. And they ran into... So originally, they created this to be a nonprofit structure. Any profits are capped and then funneled back into the nonprofit instead of like paid among shareholders, which is how normal companies work. And that was supposed to sort of tamp down any incentives for people to do different things other than like more towards the mission of, you know, creating AI that can help the world sort of thing. To condense some of it, I think just over time,
Starting point is 00:20:07 it started becoming more complicated based on different founders' side activities or different founders' opinions on the direction they should go or who should be leading or the amount of money that could be made out of this. And Sam basically made it into a for-profit company backwards, basically. And so there's like these diagrams of the governance structure floating out there in the ether. And they're like, I think the meme is something like, who could have thought this governance would be a crazy thing? And it's like a very convoluted, like,
Starting point is 00:20:49 I don't know what those are called, but like, if this then yes, or if this then no, and like one of those trees. Anyway, the governance was sort of broken to begin with, just because all of the machinations that Sam and others did were to make it easier to recruit talent by, you know, making upside a possibility because AI talent is very expensive. Or to have different share sales to outside investors because the legitimate problem they had was it's very hard to get people to give you millions of dollars if there's no upside for them. Their philanthropy efforts hit a wall. People are like, yeah, great, whatever, but I'll give you a million dollars, not a billion dollars if you're not going to give me some stake in this, right?
Starting point is 00:21:29 So I would say it was like part necessity and then part lucrative sort of what do we have here and how are we going to sort of capitalize on this, if that makes sense. I found the Elon Musk story about him trying to like take over the company and all that kind of stuff really interesting as well, because the way that Sam Altman tells it kind of in hindsight seems to be set up in such a way as to justify the move toward kind of creating a profit model. Because it was like Elon Musk promised us a billion dollars and then he tried to take over open AI and we didn't let him and he stopped giving us the money. And so we needed to find money somewhere so that we could pay for all this stuff. And then like the following year,
Starting point is 00:22:09 2019 is like when they set up the for-profit kind of subsidiary or part of the company or whatever. And so I don't know if that's like the accurate telling of the tale, but like that's the telling of the tale that Simon has to justify it. I don't know if you have thoughts on that. We can go by what's out there in public. I guess all I can do is like, they're saying this stuff on the record, so we can point to that. But again, like the truth on a lot of this stuff is a mixture of who's willing to talk and whatever.
Starting point is 00:22:38 And it's very, this company in particular with all these like intense personalities at the top is going to be really hard to get to like ground truth on what actually happened, I guess is what I would say. Yeah, no, it makes perfect sense. And so I want to go back to this weekend. You know, we talked about Friday, this news comes out, there's all this speculation that's happening. Why did it happen? You know, all this kind of stuff. Then we move into Saturday and all of a sudden the story is, oh, by the way, Sam Altman The Times is like there are many layers to get something out, including like you need to verify. You need to get like multiple people to talk to you about it. And then you need to get editors on board and have it signed off.
Starting point is 00:23:32 And like those are good, proper checks. But it also puts me at a disadvantage when someone is tweeting random shit that they hear and then can go back and say, well, I was right because I tweeted five things and one of those turned out to be right. But anyway, I heard about it i remember this and everyone in my all my colleagues who i'm working with on the story were like what the fuck first of all we didn't believe it because like it sounded too stupid to be true right but it slowly became apparent that sam like once he was fired he put out the statement saying i love my time there my time there, whatever. And then Greg Brockman quits, who is his co-founder and was the president there, and has a lot of technical knowledge and was actually a big loss. And slowly, the investors, employees, and even executives start to rally around Sam.
Starting point is 00:24:29 And I think at some point he realizes, I'm not done here. I actually can make an effort here to pressure the board. The board is for people. It's very tightly knit. They're very interestingly obstinate people who are willing to take a lot of pressure, but they can't be removed by anyone except themselves, basically. And so, or they need to take a board vote to sort of do that. And like, if they need to dissolve the board or agree to resign. And so Sam wages this pressure campaign from the outside, basically all across Twitter and using media outlets to get different messages across.
Starting point is 00:25:06 I've been in between work, trying to go back and review the stories and different driblets that are out there. And it seems to kind of work. There was a point where the board was like... I mean, the fact that the board has reengaged shows that it is working to some degree. He didn't take it lying down. The board also knows the other part that made it work was the fact that all the employees were like, well, guess what? We're going to quit, which is actually existential to the company because you can live without your CEO, but you can't live without 760 of the 770 employees that work there, right?
Starting point is 00:25:46 Which is interesting because the tech companies do like to make us feel it's the other way around all the time. That's totally right. That's totally right. I've been thinking about this for a long time. And like employees do have more power than they realize, but they are treated so well that they don't usually have any reason to leave
Starting point is 00:26:00 or at least push back. But when shit gets really bad is when they start doing stuff or sort of banding together, which is what happened here, including the interim CEO, this woman, Mira Marotti, who was named, basically was smart enough to read the tea leaves and see that everyone wanted Sam. I mean, Sam's superpower here was marshalling support for him from everyone at the company and using that effectively to push back on the board. And so just to keep the timeline going, we walked through that. The Sam's just sort of pushing,
Starting point is 00:26:32 pushing, pushing, pushing. It seems like talks break down at some point and the board issues another letter basically slamming the door shut on Sam. So the weekend, I can't even remember what day it is, but like over the weekend, like it's all but inevitable that he's coming back. And then, oh, he's not coming back. And like, holy shit, everything is collapsing. And there's a ticking clock, at least for Microsoft, which we haven't even talked about.
Starting point is 00:26:56 We'll get to that. Because they need to do it before the market opens on Sunday. So dramatically on Sunday night, Satya Nadella, Microsoft CEO, tweets, we support OpenAI and also we're hiring Sam Altman and Greg to come join our company and anyone else can come who wants to, basically doing like a talent raid on OpenAI,
Starting point is 00:27:18 which is everyone loses their shit over that. It was wild. Yeah. And especially to come like, it was like right around midnight Pacific time. So it's like, oh, okay, this is really happening. Like just kind of capping off a really wild weekend. And of course, the one thing I would just add to what you were saying is about an hour or two before that, we got the news from OpenAI that they had chosen a new CEO, which was very much saying like-
Starting point is 00:27:43 I forgot about that. Yeah. Sam is not coming back because we chosen this guy, Emmett Shear, who used to lead Twitch, I believe it was, to be the CEO. Anything we should know about that? Well, it's telling that I forgot about that because he's such a non-entity in all of this.
Starting point is 00:27:58 Yeah, and immediately people started digging up like his old tweets and saying like just the worst kind of stuff. Yeah. I don't even know if I want to get into it. I know it's totally gnarly. And I'm like, nope, not touching that. Yeah. I'll put a link in the show notes if people want to know more. Yeah. Go look at the Forbes story. It was very good and very gross. And like it tells you the level of sort of rushing that was done here to get this stuff out, basically, because the way you usually do this, you go through and then it's like literally there are agencies that you can hire to scrub your your Bravo tweets over the years after years of like having this stuff up.
Starting point is 00:28:39 So you're not like Travis Kelsey situation or whatever that I'm talking about. Although that guy's tweets rule. He's like squirrel feeding tweet was the probably my favorite thing I've ever read. I think the guy Emmett Shear is an interim CEO who they were like, well, we trust this guy to lead in the interim. I'm not sure how they decided that, but clearly they made a very large miscalculation. I think it was someone they trusted. They apparently also talked to other people who turned them down, probably smartly. And just immediately the employees are revolting. They're like, no, we do not accept this new guy. This is not going to happen. So to your point, that happens, all hell breaks loose. And then at the end of the night, Satya says, the guys are coming over here. Now that would be an end if it was an end,
Starting point is 00:29:26 but like, it's not the end. You would think that's the end. I do want to come back to Microsoft because I want to continue on our timeline and then we can return and more deeply discuss kind of what is going on there. But as you say, you know, you would imagine, okay, Microsoft hires Sam Altman and Greg Brockman. That's the end of the story. You know, there's probably going to be some final things that happen over at OpenAI. Maybe some more people are going to leave, whatever. But that's like, you would imagine that's the conclusion. But then we move into Monday and all of a sudden the story is again, oh, by the way, Sam Altman might still go back to OpenAI. So what is then happening to still have this kind of moving forward? So yes, like everyone, besides being
Starting point is 00:30:06 delirious on Sunday night at like one o'clock in the morning, everyone was like, oh damn, this is a crazy capstone to this whole thing. And like Microsoft is an important player here because they have this very expensive partnership with OpenAI. They're the single largest investor, upwards of 10 to 13 billion invested. Although the caveat there is that most of that is in computing power, not straight cash. And so everyone's like, damn, coup de grace. Satya just killed OpenAI and stole all their talent. And this is crazy. But for me, A, it didn't end there because it's still playing out. They're still talking to the board or whatever.
Starting point is 00:30:46 But B, it's slowly sort of after talking to people, I've started figuring out. I was like, Satya would be better off with a living open AI compared to a dead one where he has to suck up all the talent, start from scratch inside of Microsoft, pay all these salaries. There's probably regulatory or legal risk there that I haven't even thought of, even though it's not a full-on acquisition. Maybe there's a legal argument where it's like a merger or this much absorbing talent plus some version of the IP that comes with the employees opens them up to legal risk. I don't know. But it was clear to me that the situation they have right now is much more desirable
Starting point is 00:31:27 to them as a functioning OpenAI than restarting from scratch. So if you believe that, which I do, you start to see that announcement as a bargaining chip against OpenAI's board. Then Satya goes on TV, CNBC and Bloomberg, and does interviews, basically not saying a lot, but basically saying, I need to kill the board. Then Satya goes on TV, CNBC and Bloomberg and does interviews, basically not saying a lot, but basically saying, I need to kill the board. And he straddles the line. This is the first time he kind of says publicly like, oh, well, you know, like we welcome Sam and them, but it didn't, he made it sound like it wasn't a done deal with their like hiring. Right. And so I think people start catching on like,
Starting point is 00:32:06 okay, they're still trying and they're still trying to force the board to do something, which is crazy. There's a crazy amount of pressure on the board. There's a trillion dollar market cap company or whatever CEO bearing down on him. Satya Nadella is an incredibly powerful person who is basically just railing against these four people. And that was Monday? know, that was Monday.
Starting point is 00:32:26 Yeah, that was Monday. Yeah. So today's we're recording on Tuesday. And so like the sort of status quo over the past few days has basically been employees threatening to quit board, taking up negotiations again with Sam or on and off. But I think they're, as of right now, they're negotiating again. And in the story that we just put up on the board story, they basically want containment around Sam if he were to come back. But like, basically, like they are demanding some concessions and Sam crying to create insurance for himself, basically. So the reason it's been locked in a sort of stalemate with board members who are like, these people are like zealots, man.
Starting point is 00:33:11 True believers in what they're like, the safety of the company and of the mission that they're doing, which is why you haven't seen them cave because any other person who didn't really give a shit probably would have stopped a long time ago. Yeah, really though. I think that kind of gives us a good in to talk a bit more about Microsoft as well, right? Because Satya Nadella does seem really good at this kind of stuff and kind of managing this relationship, even though he was totally blindsided by the board's decision to get rid of Sam Altman. But you can see in those interviews he did with Bloomberg, and I believe CNBC, I only watched the Bloomberg one. He has a good kind of command of this. And so, you know, if you're a major tech company like Microsoft,
Starting point is 00:33:54 I guess there's two paths that you can really go, right? Where you're building this massive AI division in-house to become a real kind of competitor and player in this space like Google has done or like Meta has done. Or there's another path where you have this partnership with an open AI and that allows you access to these kind of tools without having it directly in-house. So why did Microsoft choose this direction to go and what kind of benefits does it get from this relationship with open AI? Yeah, we've been thinking about that for a while. I think my colleague, Karen Wise, who reports regularly on Microsoft, made some really good points, which is like the arm's length relationship actually does behoove them in... Well, first of all, it was never clear they
Starting point is 00:34:37 could ever even swallow OpenAI with a straight-on acquisition. If it was a different regulatory environment, I imagine Satya probably would have at least considered it if not done it. But there are benefits to keeping it spun out, right? They have a 49% stake. That's a giant stake. They get all the upside of this partnership by saying they're powering Bing,
Starting point is 00:34:58 which again, not many people use Bing, but they're still like, they believe that generative AI search is a way to take on Google's search dominance. They're going to power not just Bing, but like all of their products, Microsoft Office Suite. Like Microsoft is this fascinating company that's huge and super powerful and like makes me go to sleep. But I really like they should be more heavily covered, you know, in different ways, I guess, is what I would say.
Starting point is 00:35:26 Yeah, I think the fact that they generally make us go to sleep like works for Microsoft pretty well. Totally right. Yeah. No one gives a shit about them because they're all like, well, what is Facebook doing today or whatever, you know? Exactly. Yeah. What's Google up to with its search engine dominance? And how is Facebook like fucking up the world today? And it's like oh you know i'm just using microsoft word on my computer and whatever right clippy isn't gonna kill me yeah at least unless we're all gonna be turned into little clippies you know that's that that would not surprise me so i think just to get back to my point is like, I think Microsoft creeps many of the benefits and also doesn't like if
Starting point is 00:36:07 opening, I like sold someone else or somehow went public or whatever, like it would be like something of a payday, but it, Microsoft doesn't really need that money. It, they need the tech, right.
Starting point is 00:36:20 They need the like competitive advantage they have here. If you believe that this is going to be some version of the future, then they have the biggest competitive advantage compared to Meta, Google, Amazon. I have no idea what Apple's doing, but probably Apple. So I think they get to have their cake and eat it too in that regard. There's also a reason that it didn't get done inside of these big companies. I've been thinking about this for months now. AI development and AI research is not new. It is not a new thing. This has been going on forever. And Google and Facebook, now Meta and Microsoft really have
Starting point is 00:36:59 been hiring and spending tens of millions, if not billions of dollars, hiring people to do this very thing for more than a decade. But the fact that this startup was able to get away with putting out this half-tested product into the waters and then explode caught them all off guard. And I think Microsoft sort of pouncing on that was like a really smart move by Satya. And so now he is still, regardless of like the sort of like cheering he got on Sunday and regardless of the many articles that I see saying Microsoft is the winner here, he's still trying to salvage this deal and like is still in some version of trouble because he just wants to hit reset on it. He wants it to go back to
Starting point is 00:37:45 like Friday at 11 o'clock in the morning, basically. I wonder to pick up on a few points of what you're saying there. I guess some speculations I have around Microsoft and see what you think. But I feel like, you know, if we're thinking about the way that like a startup, so to speak, or like a smaller company was able to really push this kind of AI stuff forward, it seems in a sense that they certainly got the resources of having the partnership with Microsoft and they had kind of a bit more of the freedom by not having these kinds of big bureaucracies to work with. But I also wonder if they were able to unleash these products in like a move fast and break things way in a way that a Google or a meta or something wouldn't have been able to because they
Starting point is 00:38:25 were smaller and don't have the level of scrutiny and like regulatory pressure and stuff like that, or at least didn't at the moment that they launched these products as, for example, if OpenAI was in Microsoft and Microsoft tried to do it on its own. I pitched that exact story a few months ago. I think we wrote about it, but like, that's exactly right. Like, people don't give a shit about open AI because they don't know what it is, so they can do whatever they want, right? But like, Facebook has spent the past seven years getting its ass kicked in front of Congress. And of course, that's given them PTSD on doing anything that could be even remotely controversial. Google, I have feelings about because they're just like this like giant,
Starting point is 00:39:04 lumbering, slow moving company with so many divisions, so much politics and infighting. So many people who think they're smarter than other people like getting something out the door. And like, look, I'm not saying that they should be doing these things. I'm just saying like, this is the reason why they are now finding themselves behind, you know, and so that is exactly right. Like being the like little guy can really pay off. I mean, this is probably some version of the innovators dilemma, Clay Christensen philosophy, which is once you get big enough, like you're going to die or at least like not be on it. But I think that this is probably another reason why the Microsoft
Starting point is 00:39:41 partnership helps Microsoft in the sense that they don't have to worry about this getting killed or bogged down internally you know absolutely and i felt like for google too like you know open ai going forward and pushing out chat gpt and pushing out dolly then also gives it permission to be able to push out its similar kind of ai products you know we can debate whether they're at the same level or whatever. But if it had kind of put out Bard without the context of OpenAI having released ChatGPT, I think it would have been a very different kind of conversation that we would have been having about these tools than what has happened, right? I will say that Congress moved faster in the states than it did in, let's say, 2016. 2016 and there's probably hangover there from 2016 and that's not saying that they're gonna do anything but like biden issued like a whole ai sort of
Starting point is 00:40:32 it was like an executive order i think is what it was like here are the guidelines here and it didn't it wasn't like crazy number of like you know things they should or shouldn't do but like they weren't doing that six years ago, seven years ago, you know? And like Josh Hawley, who's a prominent Republican senator here, was like calling AI meetings on the Hill all the time, right? Like it's, they're moving faster, I guess,
Starting point is 00:40:57 is what I would say, which is probably indicative of some like embarrassment over six or seven years ago. And then probably like just how everyone in tech is sort of freaking out over this and saying how important it is and that folks need to like be on it way more compared to last time is what I would say. Absolutely. You can see how on the one hand, it is good that they are being a bit more proactive about these things, even if we can still criticize the approach that they're taking and still being kind of a bit too driven
Starting point is 00:41:29 by what the CEOs say, then, you know, some of the watchdogs and critics and things like that. But to give you another point on Microsoft, I guess the other thing that comes to mind when I think about what you're describing there is, I feel like they have kind of set up these partnerships or are able to kind of jump on these moments of hype to kind of integrate them into their products, regardless of whether they will be the next big thing, but kind of to keep the investors happy, right? Like Microsoft was still a big pusher of the metaverse stuff, you know, when meta was really pushing that and making us believe that was the future, you know, it was integrating it into his products and all that kind of stuff. And then that quietly kind of, you know, got shuffled off to the
Starting point is 00:42:07 corner earlier this year when AI became the next big thing. And then it already had this partnership with OpenAI that it had made back in 2019, that then it could say, okay, OpenAI, you're the big kind of leader here, you're getting all the attention, let's deepen that partnership now, because we already have it. And then, you know, the benefit comes to Microsoft is still having access to these tools without facing the regulatory pressure and scrutiny that would come with having that division in-house or trying to acquire open AI in that moment. And then the one other thing I would add to that is that for me, I wonder if the goal is really to benefit from rolling out the metaverse or to benefit from rolling out the AI tools, or if it's much deeper to say, we are trying to dominate in kind of the cloud computing data center space. And by getting everyone to adopt these AI tools, even if we don't control them or own them ourselves, that's going to require a lot more computing power. We're going to be building out all these more data centers. And that gives us a lot more kind of influence in
Starting point is 00:43:08 whatever this kind of infrastructural piece of the industry is. Totally agree. I totally, I think there's probably a little bit of all of it in there, but I think that, I mean, look again, Satya is very smart. Zuckerberg, I think. Everything these, many, most of these guys do tends to, if they believe that it may work, there's like no downside to sort of just being like, we're going to try this. It'll be a net positive for the entire network of businesses that we do, you know, and fuck it, just move forward towards it. But like, we can do this because and we can make these very big and expensive bets because we have an underlying
Starting point is 00:43:50 business that is throwing off a ton of cash already, which is absolutely true for Facebook, which is absolutely true for Microsoft, which is absolutely true for basically for Google and for Apple and for Amazon. Like all of these companies do have businesses that are funding these next things. I think Zuckerberg actually does believe in the metaverse. I think he knows that. I mean, like he has staked a lot on it, his reputation, the company, like a lot of it. But like, I think he actually does kind of believe that this either should or might be a thing. The other companies, like they buy into these things to varying degrees but also i think the like strategy is kind of like a vc strategy which is like just like risk tolerance and like why not
Starting point is 00:44:31 sort of plow at least some portion of it into this in case it becomes becomes a thing and then if it doesn't then whatever everyone will forget about it and we can just shutter our metaverse division into like a corner of redmond washington and everyone will forget about them or something, you know? And then like in a few years, we'll lay them off or we'll transfer them and kill them. It's fucked up. But like, I feel like that's how it works. Totally. And they have the cash to make the bet and the investors still expect them to have something to do with whatever is exciting or getting all the hype in this moment. So for Microsoft, it can say, look, we're doing the metaverse or look, we got the big AI partnership when really they're using that, I think, in service of their more foundational businesses to kind of further
Starting point is 00:45:14 cement those or shore them up or whatever we would want to say. So, you know, we're talking about Microsoft there, its role in this, how it benefits from it. But, you know, the story is not over yet. Where, you know, and this is not over yet. Where, you know, and this is obviously speculation, I'm sure we'll have actual answers by the time the show goes live on Thursday. Maybe. Yeah, who knows? Where do you see this kind of going, you know, in the next few days? Do you think this ends anytime soon? Or is it even possible to tell at this moment? It has changed dramatically from not even day to day, like hour to hour, like this morning to right now.
Starting point is 00:45:47 And so I keep thinking we're getting closer to something, but like, look, the outcomes are Sam comes back or he doesn't. But if he doesn't, he sure doesn't seem satisfied with not coming back. And he's sure pressing every lever possible to get back in. So I think the board is trying to like safeguard the company and also accept that Sam might have to come back in some capacity because they're going to implode if they don't have him back. I think everyone needs to like take a breath on the cheerleading and take a breath on the like imminent return because the momentum builds and Sam and his allies have worked this very well. But I think like, I don't know how long it's going to take. Some people are that we've talked to who said, Oh, we're in this for
Starting point is 00:46:37 long haul. And other people are like, it's coming in an hour. Like it's, it's crazy. I don't know. I would like to have it happen before Thanksgiving so I can cook a pumpkin pie or something. To close it off then, what do you think this means for Sam Altman? Obviously, this is a man whose star has really risen over the past year as this kind of AI type has taken off. He was certainly already in a powerful position in the Valley as heading up Y Combinator in the past before going over the open AI, but his profile and his power, I think it's fair to say have significantly increased over the past year since then. So being pushed out of this company and now potentially going back into it or going to Microsoft or whatever, what does that mean for him?
Starting point is 00:47:19 We also had these stories that, you know, he was out there looking for money to raise, to start some other companies, particularly a chip company or something. So what does this mean for him going forward and his future position in the Valley? I think it's mixed. I think there's a version of it where he comes out on top, like he's been spending in front of everyone for a while now and on web pages across the world and whatever on TV. And then like, if he goes back, you know, he wins. If he goes to Microsoft, it might be a harder road, but he still kind of wins. But I also think the flip side of that is like, people are rooting through his garbage now in a way that they were not before. Like you've seen these stories sort
Starting point is 00:48:03 of circulate all across on Twitter, on Reddit, on Hacker News or whatever. Like people are like, I think the like rumor mill of what did he do has got people digging around. And like, I couldn't tell you either way, whether he has to worry about that or not. All the attention's on him and that is good and bad. And that is the case for a lot of people whenever this shit happens. I love that. Stay tuned for the Sam Alton dirt that is going to come after this whole drama process maybe concludes.
Starting point is 00:48:31 We'll see. Anyway, Mike, fantastic to chat. Fantastic to dig into this with you and learn everything about everything that's happened over the past little while, but also kind of the bigger picture of what it all means. Really fantastic to chat.
Starting point is 00:48:43 I know that you'll be back at some point. Thanks so much. Anytime. Thank you for having me. I appreciate it. Mike Isaac is a technology reporter at the New York Times and the author of Super Pumped, The Battle for Uber. Tech Won't Save Us is hosted by me, Paris Marks. Production is by Eric Wickham and transcripts are by Bridget Palou-Fry. Tech Won't Save Us relies on the support of listeners like you to keep providing critical perspectives on the tech industry. You can join hundreds of other supporters by going to patreon.com slash tech won't save us and making a pledge of your own. Thanks for listening and make sure to come back next week. Thank you.
