All-In with Chamath, Jason, Sacks & Friedberg - E168: Can Google save itself? AI takes over Customer Support, Reddit IPO teardown

Episode Date: March 1, 2024

(0:00) Bestie intros! Kato Kaelin (1:36) Groq update!: The key metric developer platforms should care about (5:55) Google's make or break moment: Should Sundar be fired, AI safety team's power, RIF potential, stock pressure, and more (29:17) How bloated HR departments slow down companies (40:15) Google's licensing deals: Is this TAC 2.0? How can Reddit capitalize on its data? (53:50) Klarna's huge AI innovation: AI disruption, open-source benefits, a new class of billion-dollar startups (1:13:06) Reddit S-1 breakdown (1:22:52) The Apple Car is dead: Apple cancels Project Titan Follow the besties: https://twitter.com/chamath https://twitter.com/Jason https://twitter.com/DavidSacks https://twitter.com/friedberg Follow the pod: https://twitter.com/theallinpod https://linktr.ee/allinpodcast Intro Music Credit: https://rb.gy/tppkzl https://twitter.com/yung_spielburg Intro Video Credit: https://twitter.com/TheZachEffect Referenced in the show: https://www.google.com/finance/quote/GOOG:NASDAQ https://twitter.com/PirateWires/status/1762842476744708171 https://twitter.com/lulumeservey/status/1762864255437516932 https://x.com/pmarca/status/1762532979317043515 https://twitter.com/shellenberger/status/1762900581134557636 https://twitter.com/shaunmmaguire/status/1760885099984261564 https://www.reuters.com/technology/reddit-ai-content-licensing-deal-with-google-sources-say-2024-02-22 https://techcrunch.com/2024/02/29/google-brings-stack-overflows-knowledge-base-to-gemini https://www.sec.gov/Archives/edgar/data/1713445/000162828024006294/reddits-1q423.htm https://static01.nyt.com/newsgraphics/documenttools/82a013b9ba852548/9d4b1790-full.pdf https://www.klarna.com/international/press/klarna-ai-assistant-handles-two-thirds-of-customer-service-chats-in-its-first-month https://www.google.com/finance/quote/TLPFY:OTCMKTS https://tech.facebook.com/artificial-intelligence/2022/10/ai-translation-unwritten-language https://www.engadget.com/metas-new-multimodal-translator-uses-a-single-model-to-speak-100-languages-133040214.html https://twitter.com/balajis/status/1763067849218850939 https://old.reddit.com/r/AskReddit/comments/3cs78i/whats_the_best_long_con_you_ever_pulled/cszjqg2 https://www.bloomberg.com/news/articles/2024-02-27/apple-cancels-work-on-electric-car-shifts-team-to-generative-ai

Transcript
Starting point is 00:00:00 Jason, where are you? Is that a virtual background or? Oh, right. That's a place. It did look architecturally familiar to me. It is architecturally significant. We'll bleep out the bestie's house. But yes, I am. This is like top three or four of places I like to be a house guest. I'm in rotation right now as a house guest. You're Kato Kaelin-ing through our friend group. Basically, you know, I'm just a great house guest. People like to have breakfast with me. People like having me around. So I just find myself in mansions around the world. Hey, Nick, do you have an updated picture of Kato Kaelin? Is he still alive and kicking? He's alive for sure. He's got to be 70 or something, right? He's got to be older than that. He was on one of those reality TV shows with Dr. Drew, I think,
Starting point is 00:00:42 a couple years ago. What does Kato Kaelin do? He's a good hang. There's a lot of these people in LA, you know, who just hang. The kind of hang. They set things up. Oh my God. Wow. Ooh.
Starting point is 00:00:55 Is that current Kato Kaelin? Yeah. Oh man. He's 64 now. Wow. God almighty. That's incredible. How did he survive?
Starting point is 00:01:04 He's got an Instagram account. I'll tell you how he survived. See no evil, hear no evil. Exactly. I didn't see nothing. All I know is, hey, listen, man, we gave you a pool house. Snitches get stitches.
Starting point is 00:01:16 I didn't see anything. What did you see? Nothing. Alright everybody, welcome to your favorite podcast, the All-In Podcast, episode 168. David Sacks, can you believe it? We've made it to 168 episodes with me again. The Rain Man, David Sacks. How you doing, buddy? Good. Yeah. I heard you got a big talk coming up. Giving another speech? Yeah. Yeah. Yes, I'm giving a talk. Yes, I'm giving a talk. All right. Get ready for that. All of the GOP fans out there, all you Sacks fans, you get a big keynote coming from Sacks next week. Also with me, of course, the Sultan of Science, formerly known as the Queen of Quinoa.
Starting point is 00:02:13 He's got another crop he's growing. David Friedberg. How are you doing? How's the crops? How's the fields? How's life in the fields? Worst cold open. Not a cold open. These are intros. Oh, intro. Oh, God, you're not producing. There's a reason why I'm the executive producer.
Starting point is 00:02:30 Go ahead, Friedberg. How's your crops? How's life in the fields? It's great. Got a great team making progress. Tech is awesome. Having a lot of fun. It's great.
Starting point is 00:02:41 Back in the CEO seat, huh? And of course, the chairman dictator, who's becoming completely insufferable because he invested in a company seven years ago that is absolutely crushing it, now called Groq. We talked about it last week. You're back. You're back, Chamath. Everybody is talking about Groq.
Starting point is 00:02:58 You're on cloud nine, it seems. Groq, Groq, Groq, Groq, Groq, Groq, Groq, Groq, Groq, Groq, Groq, Groq. Bitcoin, Bitcoin, Bitcoin. Groq, Groq. I mean, it is interesting how your book, just literally whatever Chamath's book is, that's his mood. It's just Bitcoin at 60, Groq. Also, the honesty level is going to be through the roof now. Oh, right. He's going to be running for governor and buying the Hamptons again. The truth bombs that Chamath is going to drop now.
Starting point is 00:03:29 There is nothing like peak ZIRP Chamath. I mean, are we going back? Are we back right now? On paper, what is your stake in Groq already worth? Look, we're not talking. We're not. We're not talking about that. So uncouth.
Starting point is 00:03:41 It's so uncouth, but he's buying the Hamptons and he's probably going to buy. When has it been uncouth on this podcast to talk about our wins or assets? These are not... it's not a win. We have not... you never count your chickens before they hatch. That's absolutely right. I am not counting any chickens. Don't book the win. Yes. There's a lot of hard work to do. There's a lot of hard work to do. There are a lot more people to sell products to, products to build. By the way, developers, by the way, on Groq, just this past week, in the queue, the waitlist tripled. Now there's almost 10,000 developers. That's a big
Starting point is 00:04:20 deal. Oh, that's crazy. That's a great sign. I think the most important North Star metric for these developer platforms is basically that. As goes the developers, so goes the platform. You can kind of count on some amount of pull through because some of those developers just statistically will land really important products. They'll consume more APIs. They'll just consume more stuff. It's just all goodness because if you have 9000 or 10,000 people just waiting in line.
Starting point is 00:04:47 Once those guys get in, somebody's going to create something magical. That's the great part of having a platform. It's a good thing. It's a good thing. People just take it and they build on it while you're sleeping. Something amazing can just get built and you get some amount of the credit for that. The other thing that's really interesting is the number of developers is increasing. I've seen two or three founders recently, just the last couple of weeks, who were previously IDF founders who are now have taught themselves to code. So, you know, there's 1% of the, you know, whatever number of millions of dollars in the world, I think it's going to like double or triple.
Starting point is 00:05:21 So the pool of people who are writing code, I think it's about to grow very meaningfully. I mean, you guys, you guys saw the release from the White House where they were like, we don't want you to go to see it C++ anymore. Yeah, that was very interesting. I mean, they were talking about memory leaks and like this is, I guess, the source of most memory leaks. Therefore, the White House wants you to learn
Starting point is 00:05:42 to develop in memory safe languages. Oh, great. Awesome. Yeah. So why are they involving themselves? Like they got nothing better to do? Yeah. I mean, they have an opinion on, you know, preventing security leaks.
Starting point is 00:05:55 All right. Listen, the top issue this week is the same issue as last week. Google's Gemini DEI Black Eye continues. We covered this woke AI disaster last week. It was kind of funny. I was watching CNBC and they had a hard time describing the problem. The woman just, I guess, didn't want to call it what it is. It's a racist AI.
Starting point is 00:06:15 You type in text and it gives you the opposite or just culturally insane responses. So, if you put in, you want a picture of George Washington from Google's Gemini or Sergey Brann, you might get back like a Benetton style diversity added with like George Washington being black or Sergey Brann being Asian, etc. And so this has caused a bit of a gruff luffle here in the industry to say at least the stock is down 5% since we talked about it last week. And Sundar sent a memo to the Gemini team. Of course, when they write these memos to a team, it's written to the entire world because you know it's gonna get leaked.
Starting point is 00:06:52 And so you're writing as such, it might as well be a press release. I'll give two quick quotes here and then I'll throw it to the besties because there's so many questions that we have to address. Quote number one, I know that some of its responses referring to Gemini have offended our users and shown bias. To be clear, that's completely unacceptable and we got it wrong.
Starting point is 00:07:10 And to be clear, it wouldn't show white people, especially ones like George Washington or just somebody who is obviously a Caucasian. And so the next quote will be driving a clear set of actions including structural changes. To me, structural changes means we're going to lay off a bunch of people and we're going to get rid of the DEI group. That's a newfound motivational riff quote in my mind. Freberg, will Sundar survive? And is Google too broken to fix?
Starting point is 00:07:41 I'm just going to ask you, since you work there, I don't wanna mean to make it uncomfortable, but what are the chances Sundar survives this? And what are the chances that Google can be fixed and produce great products quickly at the light users? I don't know how to answer the Sundar will survive because it's kind of an idiosyncratic organization. There are a couple of founders who have super voting shares, and ultimately comes down to their decision and the direction they want to take the company.
Starting point is 00:08:15 And I have no insights into what they individually think. So, frankly, I've spoken to a lot of folks who are investors in Google over the last week, and a lot of folks are just deeply frustrated and angry. On a number of fronts, the business, really, there's three businesses inside of Google. There's Search and Ads, there's YouTube, and there's Cloud, and the rest of it is kind of noise. And to give you a sense of how big these businesses are, there's YouTube and there's cloud and the rest of it is kind of noise. And to give you a sense of how big these businesses are, right, YouTube did 10 billion a quarter, roughly cloud did 10 billion a quarter. They have this devices business and subscriptions
Starting point is 00:08:53 business does about 10 billion a quarter. And then search does about 50 billion a quarter. And the margin on search is much higher than any of those other businesses. And so the search margin, the ad revenue on search, is probably 100% of the true operating profit of the business. So the real threat to Google is more, are they in a position to maintain their search monopoly or maintain the chunk of profits that drive the business under the threat of AI. Are they adapting? And less so about the anger around Woke and DEI, because most of the investors I spoke with aren't angry about the Woke DEI search engine.
Starting point is 00:09:32 They're angry about the fact that such a blunder happened and that it indicates that Google may not be able to compete effectively and isn't organized to compete effectively in AI just from a consumer competitiveness perspective. So, you know, investors are banging the table and in the past, we saw this with Meta, I think it was towards the end of 22, if you guys remember.
Starting point is 00:09:53 And it was a similar situation. Investors were like, why are you investing in VR and AR? This is crazy. Why do you have all these people that are getting overpaid? And everyone started to write off the stock and the stock took a big nosedive for a period of time, just like Google's is right now. And then the changes came. And much like Google, there's an individual with super voting shares who basically said,
Starting point is 00:10:13 you know what, I am going to step in and we're going to make these changes and we're going to fix this organization and we're going to write size and we're going to focus on the product winning. And since then Meta stock is up a tremendous amount. 5x since then Google shares are down is up a tremendous amount. Five X since then. Google shares are down about 10% over the past two weeks. By the way, I was one of the stupid people to sell Meta around that time.
Starting point is 00:10:31 And your thinking was that they just can't get out of their own way and the God King is... Yeah, it's exactly. It's another one of these idiosyncratic problems. You don't know what this individual is thinking and what he is individually going to do. The point I'm making is that at Google now, something has to give because the noise is so loud.
Starting point is 00:10:50 The board is hearing this left and right. Investors are banging the table. Analysts are banging the table. And I'll tell you a couple of anecdotes internally. Employees are now banging the table. A story I heard this week was that someone stood up in a meeting and said a couple of weeks ago, if I had stood up in this meeting and said, we can't show black people in the image generation at the rate that we're showing, I would have been cast a racist.
Starting point is 00:11:18 And I didn't have permission to do that inside of the organization. But today, and everyone's like, you're right, you would not have been able to stand up in the organization and say that. But the tenor has changed inside of Google with a lot of the employees that I've spoken with who are now saying, I can stand up, and I can say that this group called Responsible AI has too much power, and it's a one-sided asynchronous problem where they get to come in and say, we need to change this. And if you step up and disagree with them, you are deemed a racist, you are deemed culturally inappropriate.
Starting point is 00:11:49 You're canceled. So it's easier to keep your head down. It's easier to keep your head down. In this kind of a circumstance and keep collecting your RSEs. And all of a sudden you wake up one day and you see the blunder that happened last week. Yeah.
Starting point is 00:11:58 And so now internally, people are waking up and saying we need to change this. And I heard that that's made its way up to the higher ranks at Google. And they're very actively, you know. So there may be a moment here where Google stock, which currently is trading at just 17 times 2025 consensus earnings, which is cheaper
Starting point is 00:12:15 than all the other big tech companies by far, and is still growing. Core business is growing. Cloud is growing 20%. Search is growing 15%. All these other businesses, YouTube's growing 20% a year. It's a growth business that's very profitable
Starting point is 00:12:29 and it's trading at a very cheap discount. So there's, you know, the bull case is, now's a great time to buy because it's so cheap. And there could be this moment where, you know, you see some of the changes that are needed internally to get the AI products to where they need to be to maintain the lead that is inherent because of SIR.
Starting point is 00:12:46 So that would mean SACs, it would require, because we're using Zuckerberg as an example, we could also bring Twitter and X into this, it would require founder authority to come in there and make these changes. It would require Larry and Sergey who have the super voting chairs to come in there and say, hey, this is all changing. Enough of this. Do you think there's any chance of that happening or is Google just too broken to fix and they're going to just lose this opportunity? And it's Microsoft under Steve bomber missing mobile and cloud. Well, to quote Jefferson, the tree of liberty must be refreshed from time to
Starting point is 00:13:21 time with the blood of patriots and tyrants. And I think there's something analogous here. I mean, you have to, if you're going to refresh this company, you have to go in and you got to go in and make major cuts, not just a rank and file, but to leadership who doesn't get it. And that's the only way it's going to get fixed. Do I think that Larry and Serger, you're going to come in and pull an Elon and go deep and figure out which 50%
Starting point is 00:13:46 or 20% of the company is actually good and doing their jobs, probably not. But is it possible that they could make a leadership change? Yeah, it's possible. Probable? I don't know. I mean, I've heard that the company is the way it is because they like the way it is. I mean, that they're part of the problem in effect. They don't see the problem, or at least they haven't been told now. Maybe they'll
Starting point is 00:14:08 get the wake-up call. What do you think, Shama? Watching while this happened. I think it's basically a small cadre of a thousand people that have built literally the most singular best business model and monopoly ever to be created on the internet. And a whole bunch of other people that have totally transformed this organization, as Zach said, into ability and a platform to reflect their views. And so I don't, not a shareholder of Google and outside of the tools I use, I don't think I really have much voting power.
Starting point is 00:14:47 So I don't, and I have so many alternatives now. So I actually think like the, I don't really care that much, I guess, is the point. I think that the employees should care and the shareholder should care, and they should come together and vote. And I think SACs is right. I think the company is the way it is because they've chosen to be that way.
Starting point is 00:15:08 And I think Freberg is right, which is that there's a small group of people who have been protecting and breathing life into the single greatest business ever built, ever in the history of business. But now we need to have a confrontation amongst all of these three different groups of people and they need to make some decisions. Let me put a few data points in play here, J.Kal. This all speaks to the problems being, let's say, deliberate as opposed to a glitch or an accident.
Starting point is 00:15:38 So first you have Sundar's letter to the company, which there was a very interesting tweet by Lulu Cheng, Missouri, who does comms at Activision, and she writes a blog called Flack. She said, quote, the office she's a comms expert just for the audience. She's a comms expert. Yeah. So she kind of graded Sundar's letter and it was scorching. She said the obfuscation lack of clarity and fundamental failure to grasp the problem are due to a failure of leadership. A poorly written email is just the means through which that failure is revealed. So that was one reaction. Mark Andreessen had a series of posts indicating that the AI was programmed to be this way. Again, it's not like a bug. It's more of a feature and that it's not an accident. This is, this is happening because, because companies want to be that way.
Starting point is 00:16:31 They chose to be this way. And in fact, he goes further and says that these companies are lobbying as a group with great intensity to establish a government protected cartel to lock in their shared agenda and corrupt products for decades to come. Wow. I mean, again, that's a really scorching critique here. And then, you know, the question I would ask is, who's been fired for this? I mean, imagine if, I don't know, one of Elon's products had a launch
Starting point is 00:16:58 that went this badly. Do you think no heads would roll? No heads have rolled. And so you have to kind of wonder, well... What if it's a structural issue, Sax? I mean, the point does resonate with me that there is a group internally, and I think it's called the Responsible AI team at Google. And this team's job is to enforce those principles.
Starting point is 00:17:19 Remember that we brought up last week that they've defined. And so they go in and they're like, well, you know, if you just render an image of a software engineer and all the images are just white guys in hoods, that's inappropriate because there are plenty of non-white people. You need to introduce diversity. So then the programmers say, okay,
Starting point is 00:17:34 we'll go ahead and overweight the model and make sure that there's diversity. And you can't say no, because otherwise you are deemed a racist. So who's the individual that's responsible given that structural circumstance that exists within the organization? It's more of a cultural and structural problem to me than, you know, one, I guess ultimately there's leadership that's lacking. But actually, I agree with that to some degree. Let me describe how I think it works.
Starting point is 00:17:58 So what I've heard about Google is that every meeting above a certain size has a DEI person in it. I mean, literally. So it's kind of like in the days of the Soviet Union, their military, the Red Army would have in every division or unit, there would be, you know, a commander or a lieutenant and there'd be a commissar. Okay. And the commander reported up the chain and the commissar reported to the party and the commissar would just quietly take notes in all of the meetings of the unit.
Starting point is 00:18:27 And if the commissar didn't like what the lieutenant was doing, the lieutenant would be taken out in shot. Okay. Now, that's kind of a dramatic example, but the point is that in every large meeting at Google, you've got this DEI commissar who's like quietly taking notes. I'll point out that at Google, the only person we can ever remember to get fired was James Deymore, who was an engineer who complained about the political bias at Google.
Starting point is 00:18:52 In other words, he was a whistleblower about the very problem that's now manifested. He's the only person you can think of to get fired at Google. Google was mocked on the show Silicon Valley. It was called Hooli, but remember this? Yeah, you would not have the roof beingarrived the guys sitting on the roof. And you'd rest and nobody ever got fired because... It's a company where it's impossible to get fired unless you blow the whistle on the political
Starting point is 00:19:11 problem. And I think that if you're sitting in those meetings with a DEI commissar present and you know or you have suspicions that the AI product is not working right, are you really going to speak up and risk the fate of a James Deymore? Of course you're not. Right. So you're right, Freeburg, that it's a structural problem, that there's probably a low grade fear that's pervasive through the organization. And no one's willing to say the emperor wears no clothes.
Starting point is 00:19:39 The woke emperor knows where it's not close because they don't want to, why stick your neck out? You may not know you're going to get fired, but you know you're taking a chance. Three years ago on Twitter, nobody would talk about certain topics related to DEI because it was in the aftermath of George Floyd and people just did not feel comfortable even calling out something minor that was unfair. No, I would say they didn't feel comfortable until Bill Ackman broke the seal on this just like a month ago, where he really went after DEI.
Starting point is 00:20:06 I mean, because remember, part of the reaction to Bill Ackman was like, wow, he's really going there. Even though everyone's saying that, like, basically agreed with him and knew that he was making a correct point, but people were still afraid to, like, again, call out this woke emperor. Yeah. And the way this works for people who are not super aware, you have a language model and you write a bunch of code, but then there are guardrails put in,
Starting point is 00:20:31 and then there are red teams and people who test it. And so, at the very least, even if you were trying to do something with grade and 10, as you pointed out, Friedberg, hey, if we pull up a doctor, it doesn't necessarily always be a white guy. There are other people who are doctors in the world. Somebody should have caught it in testing. They probably did, Jason, and they didn't report it. Who wants to report that problem? That's an interesting rub, yeah. How do you miss it was my... They didn't miss it. They didn't miss it. They didn't have the guts to report it. You didn't miss it. You have to build a... Somebody built a reward model or people...
Starting point is 00:21:03 A reward model was built for the reinforcement learning from all the humans and their feedback. So these decisions were explicit. The question that you guys are framing is, was this though a case where it was explicitly imposed on people and people felt a fear of pushing back or did they just agree and say, this is a great decision, and these rewards make a ton of sense. And the whole point is that I think what this has highlighted is that's the truth that Google needs to figure out,
Starting point is 00:21:36 and they need to figure it out quickly, because what is going to happen now is, I think we talked about this a year ago. All Google needs to see is 300, 500 basis points of change. And the market cap of this company is gonna get cut in half, okay? Because there is only one way to go when you have 92% share of a market, and that is down.
Starting point is 00:21:58 And so the setup for the stock is now that people are looking at this saying, okay, if I see 92% go to 91 or 90 or 89, that's all that has to happen. And people will say the trend is to 50% and you will price this company at a fraction of what it's worth today. So I think it's really critical.
Starting point is 00:22:18 This is the moment that the senior leadership that really understands business, can separate it from politics and decide, is this a thing of fear where there's one rogue group that's run amok? Or is this what we believe? Because if it's the latter, you just gotta go with it. There's nothing you can do
Starting point is 00:22:36 because you're not gonna replace 300,000 employees or however many Google has. But if it's a former, Sax is right, you're gonna have to, some heads need to roll, and they need to tell the marketplace that this was a mistake where a group of rogue employees got way too much power. We were asleep at the job, but now we're awake, they're fired, and this is, so they have two choices.
Starting point is 00:22:57 But in all of these choices, what I'm telling you on the dispassionate market side is if you see perplexity or anybody else clip off 50 basis points or 100 basis points of share and search, this thing is going straight down by 50%. By the way, unless cloud takes off, right? Because the other hedge that Google has is GCP and the tooling that they've built in GCP
Starting point is 00:23:22 can enable and support and be integral to a lot of alternatives and competitors to what ultimately might be searched. So I think there's also a play here in thinking about some of the hedges that Google has implicitly built into the business. I'll just say one more thing on this structural point. I kind of thought about this as when I started speaking to people internally about how this happened from a product perspective. It felt a lot like when you talk to a lawyer,
Starting point is 00:23:47 as you guys know, you're making a business decision and there's some risk. I mean, think about Travis building Uber. And the lawyer will say, it is illegal, you need a medallion license to do what you're doing in the city. And Travis is like, well, you know what? I'm gonna take that risk.
Starting point is 00:24:01 Brian Chesky and Airbnb, he said, you know what, I'm gonna take that risk. And the problem is that like a lawyer's job is to tell you what you can't do, to identify all the peril of your action. And then you as an executive or a business leader or a manager, your job is supposed to be to take that as one piece of input, one piece of data, that you then make the informed business decision about what's the right way to build this product? What's the right way to build this company? And I'll take on some risk. And I think one of the challenges structurally inside of Google is that product leaders and other folks were never enabled to make the decision that there were these, as Saks pointed out,
Starting point is 00:24:38 kind of like policing type organizations that were allowed to come in and veto things and the vetoing or make unilateral decisions and those vetoing on whether it's waiting or some training data set or output, that ends up killing the opportunity for the smart business leader to say that doesn't make sense. We can't do that. This is where we're having founders in the organization every day, changes everything because the founders would say, hey, we're here to index the world's information, we're here to present the world's information, we're not here to interpret it. We're not here to win hearts and minds. We're not here for a political agenda. But there's a group of people there who it's apparent thing that they're there to change the world, that Google is a vehicle for them to make social changes in the world. And that's art. And there's other ways to do that. And just paradoxically, like Hamilton
Starting point is 00:25:26 making the founding fathers diverse and doing art, that can win Tonys in that context. You could win tons of awards and acclaim and make incredible, beautiful art. But on the other side, people are not looking for Google to do that. They're looking for Google to give them the answer and the data. And then these people are thinking, it's our job to actually interpret the data for you as we kind of touched on last week. So this is a good- Look, I think you guys are really close to the bullseye here,
Starting point is 00:25:55 but I would just refine slightly what you're saying. So here's how I think it happens. I don't think this is a rogue group. I actually think this is a highly empowered group within Google. I do agree with you. Yeah. Yeah. I think it happens. I don't think this is a rogue group. I actually think this is a highly empowered group within Google. I do agree with you. Yeah. I think what happens is Sundar says from the top that we're going to be on the
Starting point is 00:26:12 forefront of diversity and inclusion because he personally believes that and that's the way the social winds are blowing and they think it's good for the company on some level. Okay. To implement that mantra or that platitude really, HR hires a bunch of DEI experts. Okay, lots of them. I think the company has like this huge HR department.
Starting point is 00:26:34 And a lot of those people are basically fanatics. I mean, they're- They're a chosen career. Yeah, they're trained Marxists basically. And so they're the commissars and they're sitting in all these meetings. And again, they're the ones taking notes and they're the ones who push the company in a certain direction. But you have to then go back to senior leadership and say it's their fault for letting this happen because they should have made a course correction.
Starting point is 00:26:57 They should have realized what was happening. They should have realized that through a combination of bias and through this sort of like overly empowered HR team who are pulling a legal card. I mean, Friedberg's right about that. They're saying that our point of view is the law, which isn't true, but they're basically pulling the legal card and pushing the whole organization in a certain direction. It was up to leadership to realize what was going on and make a correction. And the thing that you're seeing now is that in the face of what's happened, the statement that we got really doesn't cut it. I wonder if they used to wear problematic. Yeah, exactly.
Starting point is 00:27:37 They're describing it as a glitch or a bug. It's not. It's a much deeper problem. And so therefore, it gives no confidence that the solution is going to be pursued in as comprehensive manners as necessary. Yeah. I think it's just a, that memo is a tip off that there's a 20,000, 30,000 person riff and a reorganization and they're just going to keep cutting the
Starting point is 00:27:59 DEI group and they've already met it and Google have cut the DEI groups a bit. I don't know if it's about as much about cutting as much as it is about empowering. If you said to the product leaders, you can make the decision. DEI and responsible AI and whatever other groups, they're going to inform you on their point of view, but they're not going to tell you what your point of view needs to be. Yeah, but that's the DI group is pulling the legal card. They're saying this is the law. They're saying that if you... They're saying that that's what needs to change, right? If you don't do things the going to tell you what your point is. Yeah, but that's the idea I was pulling the legal card. They're saying this is the law.
Starting point is 00:28:25 They're saying that. Well, that's what needs to change, right? If you don't do things the way we tell you, Google is going to be hit with a civil rights lawsuit. That's what they're saying. Oh, well, I mean, let's see if leadership can overcome that threat. But I think that's exactly the threat that is keeping the organization from resolving this problem or could keep the organization.
Starting point is 00:28:43 I think it's that, but I also wouldn't underestimate the bias. So, you know, everybody there, not, I shouldn't say everybody, but it's a very liberal culture, right? It's a monoculture. So when you're swimming in that much bias, it's hard to see, right? Yeah. When everybody is left to center. I mean, just look at something like political contributions, right? It's 90 something percent are democratic. I mean, like high 90s. So, it's just a liberal bubble, basically. And so, when everybody's liberal, it's very hard to see when the results are way off center as well. Also, if you give people this job, like, how are they,
Starting point is 00:29:20 where are they going to do every day for your burger? If they've been given the job DEI, and they've been given the job to trust and safety, they're looking to fill their time and make an impact. You gotta, I think maybe have less of these people. I honestly think it's about cutting people. You know, Sean McGuire is a partner at Sequoia and he used to be a member of the team at Google Ventures when it was called Google Ventures.
Starting point is 00:29:40 And he did a Twitter post, did you guys see this? Where he said, when he was at Google, he was told by his manager, I'm not really supposed to tell you this. It could get me fired, but you're one of the highest performing people here, but I can't promote you right now because I have a quota. My hands are tied. You'll get the next slot.
Starting point is 00:29:55 Please be patient. I'm really sorry. It ultimately led Sean to leave Google and, you know, the rest is history. He's a partner at Sequoia now. You're saying because he's a white male? Because he's a white male. Yeah. And so the, you know, the R&D.
Starting point is 00:30:10 Yeah. No, I had the same thing happen to me at AOL. When I was at AOL and Shabbat was there at the same time, the whole organization was white men from Virginia, whatever. And they gave me an SVP title and I said, well, I want the EVP one. And they're like, you know what? We can't have any more white males in that position right now. We have to like get some more women and people have called in that position.
Starting point is 00:30:34 And they told me you'll have the same comp and bonus, but we just can't give you the title. like race and gender driven quotas make sense for HR departments to try and enforce upon managers to increase diversity. I mean, is that a good objective for an organization to have? This is like a topic a lot of people I've heard kind of flip back and forth on. Hi, there is no company where I have majority control
Starting point is 00:31:03 where I have an HR department. You don't have an HR department? No. Say more. Why? I think that it's very important. You should go to a very respected lawyer at a third-party firm, someone very visible, an Eric Holder type person, and you should work with that law firm and retain them so
Starting point is 00:31:20 that you have an escape valve if there are any kind of serious issues that need to be escalated so that you can get them into the hands of a dispassionate third party person who can then appropriately inform the board, the CEO and investigate. So that covers sort of all the bad things that can happen. Then there are buckets of, I think, good things. One important set of things is around benefits. My perspective is that the team should build their own benefits package that they want. They should understand the P&L of the company that
Starting point is 00:31:57 they work for. They should be given a budget. And in my companies, again, what I do is I allow committees to form and I ask those committees to be diverse. But what I mean by diverse is I want somebody who has a sick partner. I want somebody who has a family. I want somebody who's young and single so that the diversity of benefits reflects what all these people need. They go and talk to folks, they come back, and they choose on behalf of the whole company and there's a voting mechanism. Then when it comes to hiring, I think what has to happen is that the person
Starting point is 00:32:32 for whom that person will end up working for, it's the head of engineering, it's the head of sales. Those are the people that should be running the hiring processes. I don't like to outsource it to recruiters, I don't like to outsource it to recruiters. I don't like to outsource it to HR. So when you strip all of these jobs away, HR doesn't have a role.
Starting point is 00:32:52 And what is left over is the very dark part of HR in most organizations, which is the police person, the policeman, right? What is SAX called? The Commiser. That is why everybody hates HR. I've never met a company where that is a successful role over long periods of time. They are this conflict creating entity inside of an organization that slows organizations down.
Starting point is 00:33:14 So that allows me to empower individuals to actually design the benefits that they want, to hire the team that they want, and I let them understand the P&L in a very clear, transparent way, and the results are what the results are. You want more bonuses, hire better people. And then what I do is at the end of every year, I talk about this distribution of talent, and I make sure we are identifying the bottom 5 or 10 percent. They need to be managed up or they must be fired. Every year. And it does not matter how big the company is. You must manage up or out the bottom five to 10%. And in some cases I'm talking about one person
Starting point is 00:33:55 because it's a small company. And in other cases I'm talking about 30 or 40 people. So just hire the best person for the job. No, you eliminate HR. Yeah. You empower the team, make the make, allow them to make their own decisions. Measure it, hold them accountable. Right. So if you have a salesperson who just hires their, I don't know, 15 sorority sisters or paternity brothers, whatever it is,
Starting point is 00:34:18 and it just becomes not a diverse group. Well, hold on. It is what it is. That still may be diverse. This is my point. Like the thing is that there's different ways to sell. There's different sales motions. There's the sort of elephant hunting kind of sales model. There's the dialing for dollars thing. So even those 15 sorority scissors, the way you describe it, could actually be diverse. My point is that kind of superficial marking based on immutable traits will not yield a great organization. Instead, it's you're empowered to hire whomever you want. Just know that at the end of the year, we're going to measure them. Your bonus, their bonus, the company's performance. So it's just all performance. Yeah. Performance.
Starting point is 00:34:56 I mean, Frank Slutman said this and he got barbecued at some point. He said, I don't have time to do this diversity stuff. No, but like, by the way, in my companies, like for example, like when you do SOC2 compliance, you have to generate these reports, okay, especially for some of our customers, some of our companies that actually show the diversity of our team. And when we measure them on the immutable traits, whatever they represent as their gender, whatever they represent as other dimensions, whatever they represent as other dimensions, we are incredibly, incredibly diverse anyways. But the way that I...
Starting point is 00:35:29 I've never seen a startup that isn't diverse. Silicon Valley attracts such a diverse mix of people. What do we know about success factors for startups? Number one, the ones that are successful have a culture of meritocracy. Number two, they're non-bureaucratic and they don't have too much DNA, basically overhead in the company. Now, all those things are contradicted by having a large HR team or especially a DEI organization, right? They add bureaucracy, they add overhead. And what's that? Pips. They cut into...
Starting point is 00:36:10 Well, pips are great. I pip people all the time. Does it work? I mean, I know some people are just like, these pips don't work. If you're an unperforming member of the team and you've been identified in the bottom 5 or 10%, we have a responsibility as management to coach you up or to get you to an organization where you're not in the bottom 5 or 10%. That's the right thing to do for people. You do that by being very transparent and writing it down.
Starting point is 00:36:33 You are not good at these things. You're underperforming in these things. Fix them or you will not be here. That's a very fair thing to tell somebody. Yeah, look, how do you want to implement your own meritocracy? I think there's different ways for founders to do that. The point is you want these companies to be a meritocracy. You don't want advancement in the company based on factors other than skill, merit, hard work, things like that.
Starting point is 00:36:56 Performance. We know performance. We know that's a bad, bad path to go on. I think a lot of founders don't understand that DEI is not something they have to do. They don't have to have a DEI organization. This has somehow become a thing. It's not required. And I think people are realizing, like, why would you do that? Why would you create this large bureaucracy in the company that undercuts the meritocracy, that adds a lot of costs, and that slows you down. Now those things will help your company.
Starting point is 00:37:25 What you need to have, I think, Tramath laid out some really good best practices. You should have an outside law firm that I would say is, you could call it HR law, but I would just call it employment law. I would say like a non-ideological partner, an expert in employment law who sets up your company correctly and to whom you can take a complaint. If an HR complaint gets raised through the sets up your company correctly and to whom you can take a complaint. If an HR complaint gets raised through the chain of your company, you do
Starting point is 00:37:49 have to take it very seriously and that's be a proper investigation and that's probably best handled by an outside lawyer. So get that outside lawyer and then I think it's- Sorry, by the way, always. There's never a case where your coworker should know the intimate details of any of those. And that's what also creates this horribly rotten culture in HR where these people act like gatekeepers of secret information, salary information, bonus information. And then all of the other things, oh, you know, did you know this person did this with, it's terrible to have that inside of a company. It should go to a dispassionate third-party person whose job it is to maintain confidentiality
Starting point is 00:38:26 and discretion while investigating the truth. Yeah. When HR is too powerful in a company, that's a red flag. At the end of the day, HR should be an administrative function. Their job should be to get people on boarded, sign their offer letter and their confidentiality and invention assignment agreement and set them up in payroll and get them benefits and that kind of stuff. It should fundamentally be an administrative function. And if it starts getting more powerful than that, it means there's been a usurpation.
Starting point is 00:38:56 You don't even need people for that. Like you have software that does that now and it's all automated. You know, you send them a link, you go to your favorite HR site and boom, it's done. Yeah. I don't think you need a lot of people doing this. Anything you want to add to this discussion for your burger as we move on to the next topic? I think that unfortunately the term diversity, equity, inclusion has been captured along, as Chamath points out, a single vector, which is this immutable trait of your racial identity or gender.
Starting point is 00:39:25 And I think the more important aspect for the success of a team, for the success of an organization, is to find diversity in the people that comes from different backgrounds, different experiences, different ways of thinking. And so I'm not a huge fan of race-based metrics or gender-based metrics, driving on generally more oriented around, be blind to those variables and focus much more on the variables that
Starting point is 00:39:54 can actually influence the outcome of the organization. Yeah, one of the great paradoxes of this as well is we are moving to a much more multicultural mixed race society anyway. People filling out forms, a lot of our kids could pick two or three of the different boxes on a DEI form. It's not going to make much of a difference in the coming decades. All right, issue two, Google goes splashy cash, it's licensing deals for training data, and it's now becoming a bit of a pattern. Google we talked about, I think just last week, had done a deal with Reddit for $60 million,
Starting point is 00:40:29 that's reportedly per year. Today, Stack Overflow is now using its Overflow API to train Gemini. No word on the contract value. I did get some back channel that it's a multi-year, non-exclusive deal. According to Retz S1, they have already closed 200 million worth of AI licensing deals over the next two to three years. So, yeah, maybe it's going to be 75 million a year, 100 million a year. Who knows how big
Starting point is 00:40:53 that business can get. We're going to talk about the S1 from Reddit in just a moment. And this is on top of all the other licensing deals that have occurred after Spring or an OpenAI. You remember that one? And OpenAI is in talks, reportedly, would see it in Fox and Time, to license their content that comes on the heels of that blockbuster New York Times open AI lawsuit that we talked about, I don't know, 10 episodes ago. And open AI's lawyers are trying to get that one just to give you a little update on it. They're trying to get that case dismissed saying that the New York Times hacked the GPD to get certain results and that the New York Times took tens of thousands of tries to generate the results.
Starting point is 00:41:29 Yada yada. And they said, here's the quote from the filing from OpenAI. The allegations in the Times complaint do not meet its famously rigorous journalistic standards. Both OpenAI and Google and Gemini have been furiously guard railing their systems as we talked about as well to stop copyright infringement and like trying to make pictures of Darth Vader and that kind of stuff. Chumab, you've talked a little bit about your TAC 2.0 framework.
Starting point is 00:41:52 Maybe you could talk about what you see happening here with all these licensing deals and what it means for startups in the AI space. Well, just to maybe catch everybody up, TAC is this thing called traffic acquisition cost, and you can see it most importantly in Google's quarterly releases, which is that what they realized very early on at the beginning of the search wars in the early 2000s is that they could pay people to offer Google search, People would use it, and then it would generate so much money that they could give them a huge rev share, and it would still make money. So I remember, Jason, when you and I were at AOL, this is the first time I met Omid.
Starting point is 00:42:35 We were flying back to California. We were both in Dallas at the same time in 2003 or 2004. And that's when Omid did the first big search deal between AOL and Google. And it was, it was, I want to say hundreds of millions of dollars back then. Where Google pays you up front, you have to syndicate Google search, and then they clean it up on the back end with some kind of revsure. So what's incredible is that that process has escalated to a point now where, for example, on the iPhone, it's somewhere between $18 and $20 billion a year is what now Google pays Apple.
Starting point is 00:43:11 So, that's the traffic acquisition cost 1.0 rule, TAC 1.0. And I just said that we should call this TAC 2.0, except now what Google is doing is instead of paying for search, they're actually paying for your data and saying, give it to me so that I can train my models and make it better. And I think that that's an incredible thing. Both it's very smart for Google, but also it's great for these businesses because it's an extremely high margin thing to do when you have a really good corpus of data that is very unique.
Starting point is 00:43:44 So in the case of Reddit, that $60 million deal, I didn't, I looked through the S1 to try to figure out whether it was a multi or deal or not. It wasn't totally clear. But the point is that, you know, Google is paying Reddit $60 million bucks. And Jason, you just said that they're, they've done a couple more of these things. That's incredible. This TAC 2.0 thing is amazing. So if you're an entrepreneur
Starting point is 00:44:10 building a website or building an app that has really unique training data or really unique data, you'll be able to license and sell that and that'll be an incremental revenue stream to everything you do in the near future. That's amazing. That's what TAC 2.0 is. This is going to be incredible for the entire content and community-based industries. You think this could sustain content creation where advertising has become very difficult, Saks? I guess my question to Chamath would be, do you think this is going to be available to small websites?
Starting point is 00:44:34 There'll somehow be some sort of program? Because I mean, Redd is one of the biggest sources of content on the entire web, right? It's like a top five traffic site with tons and tons of user generated content. Yes. Do you think like a small publication would be able to make these types of deals? Yeah.
Starting point is 00:44:50 And in fact, I think like if you go back to search 1.0, that's exactly what these small companies were able to do, which was in a more automated way. They were able to basically partner. And in that example, what they would say is, here at Google, why don't you just run your ads on our page? Right? And that was sort of in that web 1.0 world. What they would say is, here, Google, why don't you just run your ads on our page? Right?
Starting point is 00:45:05 And that was sort of in that web 1.0 world. So Google had solutions for the largest companies on the internet all the way to the smallest. And in this TAC 2.0 world, I do think that it works in that way as well. The problem in that world, if it's a small website that says, here's my training data, the question is, how do you attribute how much incremental value a model derived from it versus something else? And so I think that that part has to get figured out.
Starting point is 00:45:35 And so what Google will be able to pay you will probably be pretty diminumously small if you're small right now. So to your point, if the upper bound is $60 million for Reddit, then the average website is going to get a few hundred bucks. But that still may be a good start. And when Google figures out how to monetize this stuff or somebody else, where then they can give you back some way to make money, I think that there's a real monetization here. I really do.
Starting point is 00:46:01 You took the other side of this. But now that you see the market-based solution starting to emerge, what do you think of this market-based solution? You think it's got legs? Well, I'm not convinced that this looks like tech in the traditional sense where you're basically buying a continuous stream of traffic and then you're helping to monetize that traffic. That's effectively what Google did in the ad syndication business and DOVS today. That business makes about 10 billion a quarter at Google. And they're paying, call it 70 to 80 cents on every dollar
Starting point is 00:46:37 back out to the owners of that traffic, the folks where that traffic is derived from. I would say that this looks a lot more like the content licensing deals to build a proprietary audience, which is effectively what Netflix did. They paid studios for content. Apple does this. They have proprietary content that they pay producers to make and they put on Apple TV, Amazon does this, and so on. This is a lot more like that,
Starting point is 00:47:07 where there are content creators out there, whether that content is proprietary like the New York Times or user-generated like Reddit, and what they're trying to do is acquire that content to build a better product on Google Search. And I'm not sure how you get paid a continuous licensing stream for that content. Once you've trained the model, the content gets old, it gets stale at some point in a lot of cases like news.
Starting point is 00:47:29 And then eventually, if you don't have a high quality, continuous stream of content, it's not worth as much anymore. To give you guys a sense, humans generate in total, well, let me just give you some stats. There's about a million petabytes of data on the internet today. Well, let me just give you some stats. There's about a million petabytes of data on the internet today. And humans are generating about 2,500 petabytes of data, new data per day right now. Remember I shared a couple of weeks ago,
Starting point is 00:47:53 YouTube's generating about two petabytes of data per day. Half of all data generated is never used. So this is like records and files and stuff that gets put on. Log files? Log files get stored somewhere never accessed, never used. So this is like records and files and stuff that gets put on. Long files. Long files get stored somewhere never accessed, never used. The majority of the rest of that data is not in the public domain, it's not on the internet. So there is a lot of data out there
Starting point is 00:48:15 what some people might call dark data to train on. And I think that as the identification of better sources of training data and better sources of training data and the value of training data, right now we're in this kind of shotgun approach. We're trying to blast out and source lots of content, lots of data. Just like over time,
Starting point is 00:48:34 Netflix got better at figuring out what content to buy and what to pay for it, and they're the best at it. I think so too, will Google and others figure out what data is actually particularly useful, what it's worth, and what to pay for it. And so there's a lot of data out there to go and identify, to mine, to pay license fees to get access to, whether those are continuous license fees or one time is still TBD. That's a key issue.
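A quick back-of-envelope on the data-volume figures quoted a moment ago; the inputs are the show's round stats, not independently verified numbers.

```python
# Back-of-envelope on the data-volume figures quoted above; inputs are the
# show's round numbers, not independently verified.
existing_internet_pb = 1_000_000   # "about a million petabytes of data on the internet today"
new_pb_per_day = 2_500             # "about 2,500 petabytes of new data per day"
youtube_pb_per_day = 2             # "about two petabytes of data per day"
never_used_fraction = 0.5          # "half of all data generated is never used"

days_to_grow_corpus_1pct = (existing_internet_pb * 0.01) / new_pb_per_day
youtube_share_of_new_data = youtube_pb_per_day / new_pb_per_day
new_data_ever_touched = new_pb_per_day * (1 - never_used_fraction)

print(f"Days of new data to grow the existing corpus by 1%: {days_to_grow_corpus_1pct:.0f}")
print(f"YouTube's share of daily new data: {youtube_share_of_new_data:.2%}")
print(f"New data per day that ever gets touched: ~{new_data_ever_touched:,.0f} PB")
```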
Starting point is 00:48:59 Yeah. So I think we're still a little bit early to know if this is like, you know, a continuous model like a TAC-type business, or if these are chunky type deals, and we don't really know what the real value is yet, and that all changes over time. And remember, the rate of data generation is increasing. So while we're generating 2,500 petabytes of data per day as a species on the internet, that number is going up every day. And so every year, all the old data becomes worth even less. So this is all changing fairly dynamically. And I think there's a lot still to be figured out on what the monetization
Starting point is 00:49:31 model will be for content creators and how that's going to change over time. Yeah. And this is a really good point. Some things will be like the Sopranos or Seinfeld or Simpsons where that library is worth a fortune and people will pay a billion dollars, a half billion dollars. No one's paying a lot for old NFL games. Nick found it, by the way. Reddit's licensing deal was $203 million over, it says two to three years, so let's assume it's, call it three, just to be safe. So it's about $60, $65 million a year. It doesn't say whether the deal with Google is exclusive.
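Annualizing the reported figure, just to show the arithmetic behind the "$60, $65 million a year" estimate; the two-versus-three-year term is the open question flagged above.

```python
# Annualizing the reported ~$203M Reddit licensing figure over the reported
# two-to-three-year range; the term length is the uncertainty noted above.
total_contract_value = 203e6
for years in (2, 3):
    print(f"Over {years} years: ~${total_contract_value / years / 1e6:.0f}M per year")
```

The three-year case lands close to the $60-65 million a year quoted on the show.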
Starting point is 00:50:01 Wow. So they could do that same licensing deal multiple times. That's interesting. None of these are exclusive, it seems. This is what I don't understand. Why don't you do head deals with folks like Reddit, where you actually do it exclusively? Like it seems like it's more valuable to spend a multiple of this number
Starting point is 00:50:18 for one of the big seven who have tens of billions of dollars of cash anyways. And block the other players. And block everybody else. That just seems so much smarter. Well, then Reddit's multiple gets capped. If I'm Reddit, I don't wanna do that deal. Yeah, Reddit may not put it on the table. Yeah. Because then my multiple is capped.
Starting point is 00:50:31 Like the way I can monetize my content is now set. Then I'm done. This is why Reddit's gonna get bought. Reddit. And investors are like, oh, you're worth five times EBITDA. You know, that's it. Reddit, Quora, Stack Overflow,
Starting point is 00:50:41 they're gonna just get taken out. I think this is gonna be the new model. Well, Quora did a round, actually. Didn't Quora raise recently? I think they're going to get taken out. I think these businesses will become too valuable, because they do have ongoing content that just keeps getting generated. Take you, for example. You used to run, you started Weblogs, right?
Starting point is 00:50:58 Yeah, yeah, Engadget, blogging. But think about the value of that content today. It's negligible. Like it was very valuable at the time. And as time went on, more content was being created, 100 times, 1,000 times, 10,000 times more content, that started to overshadow the value of that content. At the time the acquisition was done, it made a ton of sense.
Starting point is 00:51:16 But all of a sudden, two years later, particularly with the rate at which data is growing on the internet, it's like, does it make sense to buy any content anymore? So you take, well, on the other side of that is historical content could be worth a lot of money, especially- Some could be, for sure. Yeah, so if you had the Charlie Rose archive as an example, what is, you know, he's probably interviewed Kissinger
Starting point is 00:51:34 10 times, and he's interviewed Kissinger for, you know, 10 hours. I've got almost 2,000 episodes of podcasts I've done with startups over 13 years. Some of those are good, right? Yeah, the This Week in Startups archive is going to be worth something at some point, right? And I don't think it's going to be worth $60 million.
Starting point is 00:51:48 What are old baseball games worth, right? Like, I mean, who watches old baseball games? And I don't think the data from them is particularly important. So I agree on that one. But historical stuff. We don't know, right? Yeah, we don't know.
Starting point is 00:52:00 And I just question how much of Reddit's content is actually, like, long-term valuable versus, like, they're covering a topic and they're talking about nursing stuff. That's a really excellent point. It's a really, really excellent point. Yeah. Uh, and we'll figure that out. Okay.
Starting point is 00:52:14 And that's why I think it's, like, the early days of knowing how to value all this content, particularly for LLMs. And so we don't really know yet. Over the next year, this will all start to become clearer. But again, the same thing happened to music licensing. Yeah.
Starting point is 00:52:26 Right. But if all the content creators kind of unionize, then it might increase. Like the music labels. I don't know if that's a cycle. Kind of like the music industry. The music industry has ASCAP and other collection societies. I love it. Well, I'm not advocating for this, but the point I'm making is if they can't unionize,
Starting point is 00:52:42 then there's just a huge number of vendors of content. And so models will need to buy some, but as long as they can get some, they don't need to have all. And therefore, it's basically highly competitive among suppliers, and there's a very limited number of buyers. So that tends to be a buyer's market. This is why the news industry should have always had a federation, because they could
Starting point is 00:53:02 have just said to Google, hey, we're going to de-index ourselves from the Google search engine. And so you won't have the New York Times, Washington Post, LA Times. You're just not going to have any of us unless you give us X, Y, and Z. And they were just too stupid and not coordinated to do it. Music industry, the exact opposite. You try to do anything with the music industry. They're going to come down on you like a ton of bricks. To this day, I see what's going on.
Starting point is 00:53:25 That's actually really interesting. Yeah. If all the old school legacy newspapers and magazines and so on had basically formed their own, whatever, cartel, trade association. Yeah. That would have been powerful. Yeah. I mean, that's what Murdoch said, when he was like, Google's the enemy here.
Starting point is 00:53:45 They're going to take all of our revenue and they're going to get all our customer names and we're not going to even know the names of our customers. All right, issue three, Klarna crushes customer queries with AI. You may have seen this trending on X and Twitter and in the press. If you don't know Klarna, they're a Swedish fintech company. They do that buy now, pay later stuff. I think they were the originators of that online. And they put out a press release with some really eye-popping claims. Their AI assistant is now doing the work of 700 full-time agents at
Starting point is 00:54:11 Klarna. They moved issue resolution times from 11 minutes with humans to two minutes with AI. And customer satisfaction is on par with human agents. And it said its resolutions are more accurate than humans, creating a 25% drop in repeat inquiries, which tracks. And so far, their AI, which they built with OpenAI, has had 2.3 million conversations, accounting for two thirds of Klarna's customer service chats. Klarna estimates its AI agent will drive,
Starting point is 00:54:43 wait for it, a $40 million increase in profits this year. We could talk a little bit more about Klarna and their valuation, but Freeberg, what do you think this means if we're in, this is the start of year two of ChatGPT as a phenomenon, let's say. What do we think year three looks like?
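A rough sketch tying together the Klarna press-release figures just quoted; it is purely illustrative, and the value-per-agent line at the end simply divides two disclosed numbers rather than being anything Klarna itself reported.

```python
# Rough sketch tying together the Klarna press-release figures quoted above.
# Purely illustrative; the value-per-agent line divides two disclosed numbers
# and is not something Klarna itself reported.
ai_conversations = 2_300_000      # conversations handled by the AI assistant so far
share_of_all_chats = 2 / 3        # "two thirds of Klarna's customer service chats"
human_minutes, ai_minutes = 11, 2 # average resolution time before vs. after
agent_equivalents = 700           # "doing the work of 700 full-time agents"
profit_uplift = 40e6              # "$40 million increase in profits this year"

implied_total_chats = ai_conversations / share_of_all_chats
handle_time_saved_hours = ai_conversations * (human_minutes - ai_minutes) / 60
implied_value_per_agent = profit_uplift / agent_equivalents

print(f"Implied total chat volume: ~{implied_total_chats:,.0f}")
print(f"Handle time saved so far: ~{handle_time_saved_hours:,.0f} hours")
print(f"Implied annual value per agent-equivalent: ~${implied_value_per_agent:,.0f}")
```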
Starting point is 00:55:21 a sock. That money gets reinvested. And that money gets reinvested in higher-order functioning work. And that's really where there's an opportunity to move the workforce overall forward, which is what I think is super exciting and I'm super positive about. We've seen this in every technological evolution that's happened in human history, from the plow in agriculture to automobiles to computing and to now AI, that humans moved from manual labor to knowledge work to now ideally and hopefully more creative work. And so I do think that it isn't just about eliminating jobs and making more money, but it's about enabling the creation of entirely new class of work, whether that's prompt engineering
Starting point is 00:56:05 or building entirely new businesses that simply can't exist today, or perhaps even downscaling businesses where you no longer need to have a 10,000 person organization, smaller organizations can be stood up as startups to start to replace large functioning organizations. So I don't know, I think it's a time of great opportunity. I know that some people would view it as being highly shocking. I think it's inevitable that human knowledge labor, where the job of the human is simply the ingestion of data and then communicating an output of data,
Starting point is 00:56:31 seems like it will eventually be replaced by computing somehow. And this is happening now in an accelerated way with these LLMs. So I think that what we should focus on and think about is what are all the new businesses, all the new jobs, all the new opportunities that just couldn't have existed 10 years ago that are now emerging, that are very exciting as the workforce transitions. Sacks, do you buy this techno-utopian view of this, that all these jobs that are going to obviously be retired are going to open up the opportunity for these humans to do even better work at Klarna, or do you think it's just going to go straight to the bottom line?
Starting point is 00:57:04 Well, it sounds like they're able to eliminate a lot of frontline customer support roles by using AI, which is what I would expect. I think this is a very natural application for AI. It was already the case that you could pretty much find answers to questions by searching the FAQ, things like this. This is an even better way of doing that. So look, I believe this will be a big area for AI: saving on, again, I use the word frontline, customer support, because the way that customer support is typically organized is there's level one, level two, level three, and the more difficult queries or cases get escalated up the chain, depending on how hard they are. And I think the AI will do a really good job eliminating level one, it'll start eating
Starting point is 00:57:51 into level two, but you're probably going to need humans to deal with the more complex cases. Now, the question is, where do those displaced humans go? I think there's going to be new jobs, new work. That's always been the history of technological progress. One of the things you're already seeing is there's a whole bunch of new AI companies that are exploiting this technology, and they need to hire people. So I basically agree with Freeberg that you will elevate people's work by automating away the less interesting parts of people's jobs and then creating more productivity and more
Starting point is 00:58:23 opportunity. By the way, just to use your example, Saks, so imagine if all the level one support people, some chunk of them can now do level two support, and so the customers are going to get greater hands on care, more customers will get access to a higher level of service. The organization can afford to do that. They'll be more competitive in the marketplace because customers feel better taken care of.
Starting point is 00:58:45 I just think that's how the organizations get leveled up as new technology shows up like this. That's a great point. And then those folks can have a much deeper level of interaction with their customers than they are today. But then the world gets more complex and then people might get better at the software and they might discover new features that you might be able to redeploy those people. If you look at coffee, like, I don't know, it was 40 years ago, you went
Starting point is 00:59:08 to order a cup of coffee, it was decaf or regular coffee, milk and sugar. Those are your four choices. And now you go order coffee. I don't know if you guys have used the Starbucks app or the, you know, the... I just had the Sweetgreen CEO on the pod, and man, the fidelity and the nuance of what you want to order is absurd. Chamath, where do you think this is all heading? Because there is the issue of displacement and how quickly people can be redeployed. And if we're seeing, in year two, customer support and developers getting 10x'd, what other categories
Starting point is 00:59:41 do you think we're going to see fall next? I think the truth is that, as you said, the real-world applicability of AI was not last year. So I think we're really in the first five or six weeks of the first year. So you consider that year zero? That's year zero. That was sort of like the, you know, where everybody was running around building toy apps. Proof of concept. This is one of the first few times where you're seeing something in production where there's
Starting point is 01:00:06 measurable economic value. The important thing to note about that is that it's not just what it means for Klarna, but what it means for everybody else. If you look at everybody else, for example, here's Teleperformance, which is a French company that runs call centers. They lost $1.7 billion of market cap when that tweet went out, about 20% of their market cap. So this is the real practical implication. Yes, Klarna replaced 700 people and they saved 40 million of OPEX, but Teleperformance, while they were just doing their everyday work, lost $1.8
Starting point is 01:00:41 billion of their market cap at the exact same moment. And so what does it mean? I think that what Klarna should do is open source what they've built. And the reason is that you want to give companies like Teleperformance a chance to retool themselves with the best possible technology so they can actually preserve as many of the jobs as possible. Because at the limit, if every single company is able to implement something that is
Starting point is 01:01:13 as economically efficient as what Klarna did, Teleperformance doesn't exist, and there's $10 billion of market cap and 335,000 employees that will not have a job. And so for Klarna, the reason to open source it is twofold. One is they don't lose anything, because you will still need to train it on your own data. And so there's no disadvantage that Klarna will have, right? They're just saying, look, I built this on top of GPT. Here's what it looks like.
Starting point is 01:01:40 And that production code can be used by anybody else. Go for it, but it has to run on your own data. That's a very reasonable thing. So I think it has the benefit of both A, setting a technical pace that can help them attract better employees and more highly qualified people who find the scope of work even more interesting. And B, I think it's on the right side of history with all this AI stuff where it's allowing everybody to sort of benefit in a way that is the least destructive. But I just wanted to show you that the destruction was quite quick and it was pretty severe.
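A small sketch of the market-cap arithmetic being described: a roughly $1.7-1.8 billion drop characterized as about 20% of Teleperformance's market cap. These are the conversation's round numbers, used only to show the implied starting point.

```python
# Implied pre-drop market cap if a ~$1.7-1.8B decline was ~20% of the company;
# both inputs are the round figures quoted in the conversation.
for drop in (1.7e9, 1.8e9):
    implied_pre_drop_cap = drop / 0.20
    print(f"${drop / 1e9:.1f}B drop at ~20% implies a ~${implied_pre_drop_cap / 1e9:.1f}B pre-drop market cap")
```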
Starting point is 01:02:11 And if two or three other big companies launch these kinds of tweets after real measurable results, Teleperformance will be a $1 billion company in short order. There's a third reason. I think it's a brilliant idea for Klarna to open source this tool, because it's not their business, right? This is just something they did as a productivity improvement. They get the benefit back to them of the community working to
Starting point is 01:02:33 advance that technology. So they don't have to put more engineers on advancing the ball on their customer support AI. Totally. They can just re-merge in the changes that the open source community comes up with. Yeah. And since they're not in the business of selling AI directly, there's no reason not to do it,
Starting point is 01:02:49 like Chamath said. So I think it's kind of brilliant. This is Meta's strategy, by the way. I mean, Zuck said the same thing. That's what I was going to say. They should be building this on Meta's open source products and Apple's open source products, right, Zuck? Yeah.
Starting point is 01:03:00 So what Meta said, what Zuck said on the last Meta call is the reason we open source everything is because we don't directly sell AI. We create products that AI makes better. So by open sourcing this, we allow the community to advance the ball and we get to reincorporate those changes. So it's a very smart strategy for companies that aren't directly selling the AI. Now if you're like Brett Taylor's new company, Sierra, obviously
Starting point is 01:03:25 you're not going to open source it, because your whole business model is to create a proprietary solution. Yeah, but then to Chamath's point, 8090, whatever he was talking about with his incubator concept and these things. It's a company. It's a company. Okay. Is it 8090 or 9080? I'm sorry. 8090, yeah. 8090. Is there a third word that comes after it, or are you just going to call it 8090? Just 8090. Got it.
Starting point is 01:03:49 Okay. 80% of the features at a 90% discount. So, back to that, what does this mean if these things are getting better? Freeberg, to your point about the pace of these things, is it going to improve 10% a year or 10% a month? If it's improving 10% a month, we're going to get to 98% of queries done this year. If it's doing 10% a year, okay, we're going to get to 98 or 99% of queries in four years. In other words, this is happening, folks, and it's happening at a blistering
Starting point is 01:04:15 pace. By the way, it's not just Teleperformance, which was a $10 billion, now a $7.58 billion USD company. But think about Zendesk, right? Zendesk was, I think, an $8 to $10 billion take-private by PE. Yeah, it was a $10 billion take-private in the hands of PE. The entire Zendesk workflow could be replaced by a handful of these open source agents, where all of a sudden people can eliminate a lot of OpEx. I think the thing to keep in mind here is
Starting point is 01:04:45 where the world is going has always been to try to lower costs. And the original foundational principle of SaaS was that there's these line items in on-prem software that are just extremely expensive over time, very hard to justify. And so when people moved to SaaS from on-prem, they were looking for cost savings. That was the initial thing. Now, it's actually not cheaper anymore, but it's much more feature-rich, so you get a lot more value in SaaS, etc., etc. But the point of these AI agents and bots and workflows is that it'll reintroduce the concept of cost savings, of this idea that you can have cheaper, faster, and better.
Starting point is 01:05:26 And the more that that stuff is open source, my gosh, I think it just makes it very hard for companies that have point products to survive. I was talking to a friend of mine, Josh Mora, who was the city head of Uber in New York, and he launched his own note-taking app. He's writing it himself, and he's obsessed with this concept of building a billion-dollar, a unicorn company with one employee. And this is something that a lot of people have been talking about. And my friend, Phil Kaplan from DistroKid, built a very large business in DistroKid, a unicorn with a very, like, I think low single-digit number of people. This could be the future of efficiency. You could
Starting point is 01:06:02 build, if you catch fire with a really hot company that gets a million customers. It's absolutely the future. It's absolutely the future. And then if you think about our jobs in terms of capital allocation, well, how much capital does that founder need? Do they need to dilute 10%, 20%, 30%? They're not going to need to dilute 60%, 70%, 80%. A one-person company should be able to spend less than a few hundred grand to get to product market fit in the next few years. Yeah. I mean, that's kind of what we're seeing. Just to go back to your question, J. Cal, where does this go next? Yeah, at least.
Starting point is 01:06:30 I think it's really interesting to speculate about that. So what Klarna seems to be talking about are email-based customer support cases. I think where this is going to go next is to phone. 100%. And these call centers use what are called IVRs, these interactive voice response systems, but they're very rigid. It's a lot of pre-recorded messages. And it says, push one if your problem is this, push
Starting point is 01:06:52 two, if your problem is this, everyone hates those things. Yeah. I think where it goes next is you'll call up the call center and you'll get a voice that sounds like a human. Just talk to you. Yeah.
Starting point is 01:07:03 And you won't even necessarily realize that you're talking to an AI, because there are already these AI companies that can do generative voices, any language, any accent. Oh, and they're fast now. And they're fast. Yeah, that's really- And multi-language. Think about that, just localizing them across the globe. You want to launch your product in Japan. By the way, did you guys see there is a Meta demo, which I thought was really cool, which was, it was run on Llama 70B, but it was a real-time translation tool where the person
Starting point is 01:07:30 was speaking in Chinese and the other person was speaking in English and they were able to understand each other. But Sacks, to your point, when that person calls, for example, B of A, now Spanish is not just one language. It's actually many dialects and many, many, many different accents, right? Depending on which country you're from, like the accent that you hear in Spain is totally different than the accent in El Salvador, the accent in Chile or Argentina. And so wouldn't it be amazing where you call your B of A app, it picks up your accent
Starting point is 01:08:01 and your tonality and it responds with the person of that exact same accent and tonality, that is incredible. Well, I'll take it to another level. You call, it recognizes your number, it knows what you're doing in the software, it knows the problem you've had, it knows the last three times you called and how long you've been using the software and it anticipates like, okay, I know this person has a Windows machine and it's still five years old and And it's like, are you still using that same five year old Windows machine? Yeah, we know it's a bug with Windows, whatever.
Starting point is 01:08:30 You should probably upgrade it. It's gonna know the entire context of this. And so it's gonna just get, it could be more efficient than a human could ever be. The best customer support interaction I had was, I called JP Morgan Chase, because I had an old credit card that I had had for 20 years. And I think that they had outsourced it
Starting point is 01:08:47 to somewhere in the Caribbean. And this woman picked up the phone, it was so cool. Chamath, what is the problem, man? And I was like, oh, this is the best. And I had this whole conversation with her for 15, 20 minutes, nothing to do with the phone, nothing to do with my credit card, rather.
Starting point is 01:09:02 It was great. She's like, I'm here, man. You want to cancel your card? I was like, yes, please. No, but you could do celebrities. The best. Celebrity voices. Yeah. J. Cal, do the DJT. Okay. Let's role play. You're Donald Trump, and I am JP Morgan. Hi, Mr. President. Chamath, you're huge. You've got big spending. Every time I go see Chamath, I go in.
Starting point is 01:09:25 Amazing. Mr. President, I would like to cancel my credit card, sir. Okay. You don't want to cancel it. We've got a great APR, 3.9%. But for Chamath, you know what? He's my Sri Lankan friend. How much do we love Sri Lankans? Okay? Huge.
Starting point is 01:09:40 And not J. Cal. Nasty. Nasty man, J. Cal. Very nasty. Yeah, it's like a dog. Okay, TDS. How much do we love our Sacksy poo? Great. Mar-a-Lago. I go to Sacks's house. His house, a huge, huge house. Almost as big as Mar-a-Lago.
Starting point is 01:10:01 Huge. A huge-ass house. Almost as big as Mar-a-Lago. Not quite. I got to work on it because I haven't done it for years. That's really good. It's really good. Have you seen that guy? There's a new impressions guy who's amazing.
Starting point is 01:10:15 He does Trump very well. I have seen him. He's on Howard Stern all the time, right? He does the Howard Stern. He does Howard Stern. No, not Shane Gillis. This is another kid.
Starting point is 01:10:23 Matt Friend. Yeah, yeah, yeah, yeah, yeah. He does Howard back to Howard. That video is good. Hold on. On other Trump impressions, there are so many people that try to do it, but you see the failed ones. Alec Baldwin comes out on SNL, which used to be a lot of fun, with a little shimmy. He's like this. He goes, we've got a great show. Boopity, boopity, boopity, boo.
Starting point is 01:10:51 I don't do that. I never say boopity, boo. I never say that. Stephen Colbert, he comes out like, I know. Da, da, da, da, da, da, da. The last one is Jimmy Fallon, and he touched my hair like a dog. He goes, okay, I look like a dog. We're rolling. Okay. I don't do these moves. They're all wrong. You see him doing Howard, see him do Howard Stern, it's so next level. The other one he does that's incredible, but subtle, is Stanley Tucci. How are you, by the way, Robin? You look beautiful. Right? Is this still you, or are you AI-powered? It's me. Yeah, right, because when I start talking to Robin after you leave, I get crazy.
Starting point is 01:11:39 Right, can we just talk about this? Right or wrong, right? Right. Right. And how it left or right, right? Right. Right.
Starting point is 01:11:52 Right. Right. Right. It's so great. He is killing it. I think we have our halftime act for the All-In Summit '24. Oh, for sure.
Starting point is 01:12:03 Oh, my God. Yeah, he'd be amazing. I love Stanley Tucci, by the way. He's great. Oh, you gotta see this kid, Stanley Tucci. I do, yeah, I'm gonna see that one. Do this, do this Stanley Tucci. It's so good. Have you guys watched the Stanley Tucci
Starting point is 01:12:13 on HBO, where he's cooking? Yeah, so great. Where he goes to Italy. It's so great. It's so great. He does it on TikTok too. He just randomly cooks something and he's like, I'm gonna make a frittata.
Starting point is 01:12:23 I have some leftover gnocchi and I'm just gonna, I do watch him on TikTok. Thanksgiving is a special time. I'm in London celebrating. So today I'm making Stan's stuffing and also sugo di carne or as it's known, gravy. Now, the thing about stuffing is, you might use a traditional white bread, but in my household,
Starting point is 01:12:45 I use a homemade focaccia, because I'm Italian on both sides, and nothing tastes better than bread with a little money, not a sauce. It's a time for gratitude and giving thanks. Enjoy your holidays and thank you very much. Felicità. Good. That is really good. He's so good. He's so good.
Starting point is 01:13:04 All right, listen, I don't know how I keep the show going here, but three, two, okay, issue four, Reddit's S1 is kind of fun. Let's break it down, everybody. 2023 revenue, 804 million up 21% year over year. They're still losing money, net loss, 91 million in 2023. They lost 159 million in 2022. So they're cutting the loss.
Starting point is 01:13:22 Free cash flow is negative. Here's a chart of their quarterly revenue and cash flow. So they're kind of bouncing along the break-even mark, as you can see there in the chart. They got a wonderful gross margin, because they don't pay to produce the content, unlike the New York Times or Netflix. And so an 86% gross margin, up 2% year over year. Their daily active uniques, 76 million, up 27% year over year. Average revenue per user is incredibly low,
Starting point is 01:13:49 three bucks and 42 cents. And their daily active unique users, here you can see another chart, growing nicely quarter over quarter. They got a billion two in cash. Most interesting probably, and could be challenging to execute on, is their directed share program. They're going to carve out a bunch of shares in the IPO to
Starting point is 01:14:12 sell to their most active mods. Those are the moderators, the people who run the different channels, or subreddits as they're called. What could go wrong? What could go wrong? Has this ever been done before? But they're going to invite users to participate in it on a rolling basis. A more unqualified retail buyer pool does not exist than the Reddit mods, except maybe the Reddit users participating in the program. Yeah. I think they'll buy after the lockup comes off. But let's just get started here. I think you looked at the S-1 a little bit, Freeberg, and you had asked me to put it on the docket
Starting point is 01:14:51 because you were digging into it. Anything stick out to you, or thoughts on the business overall? No, I mean, I wasn't pulling it up, I just asked if you guys had read it. Hey, I think I made a funny joke. Read it. Accidentally. You guys go on for a minute. Go ahead, tell me when you're ready.
Starting point is 01:15:17 It was a good pun. Too bad it was accidental. No, it was intentional. I'll give you credit. It was intentional. I'll give it to you. So I think the thing about Reddit, if you could pull up the chart with the quarterly average daily active user data, this was a business that the last couple of years, everyone was like, had flatlined because it was only growing 5% a year in terms of usage. And then all of a sudden in the last two quarters, so starting in late summer, early fall of 23, so just six months ago, the usage started to climb pretty significantly,
Starting point is 01:15:51 growing 15%, and in the most recent quarter, 27% year over year. Absent that growth story, it's a really challenged business, because a business without much growth gets valued typically on a multiple of the cash flow that they're generating, and there's less upside and all this kind of optionality goes away. That's kind of a key point. I don't know. I think for you to make an investment at a $5 billion
Starting point is 01:16:13 valuation here, you've really got to believe that the growth continues at this rate and it doesn't revert back to the mean growth rate of the last couple of years of basically 5%, which is roughly flatline. The other challenge they have is that their ARPU is only in that kind of $3 range, which is like less than 10% of where Facebook is at. And the data that Facebook collects on their users gives them the ability to do much better targeting on ads and therefore monetize their audience much better than Reddit has been able to do, to the order of over 10x.
Starting point is 01:16:44 And then if you look at the ARPU number, how much they've been able to grow that metric, it's also been a little bit flatlined. So this business, I think, is a real question mark. I mean, you could argue it's probably worth, in the best case, in the two to $3 billion kind of valuation range. And then you have to believe the bull case that the growth continues or accelerates from here, and they have a plan to address the ARPU problem.
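To make that framing concrete, here is an illustrative revenue-multiple sketch using the roughly $804 million of 2023 revenue quoted from the S-1 earlier against the valuation marks being debated. This is arithmetic only, not a view on what the business is worth.

```python
# Implied trailing-revenue multiples at the valuation marks being discussed,
# using the ~$804M of 2023 revenue quoted from the S-1 earlier.
trailing_revenue = 804e6
for valuation in (2e9, 3e9, 5e9):
    print(f"${valuation / 1e9:.0f}B valuation -> ~{valuation / trailing_revenue:.1f}x trailing revenue")
```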
Starting point is 01:17:08 They have other paths for monetizing their audience than what they're kind of doing today. What do you guys think this thing's worth? Do you buy it if it goes out of $3 billion or $5 billion? I think the first question which you nailed that a buy-side investor will ask is what happened in the last two quarters that was different than the last 15 quarters. That's going to be a very important question. I think they're going to have to have a very buttoned up answer for that. And if they can point to very specific, repeatable things, I think that'll be good. The thing that they, this IPO, if it goes off in the next four weeks, they won't have to wait. But if it doesn't get off in the next four weeks, they'll have to update the S1 probably with Q1. And so you'll see whether this thing is a
Starting point is 01:17:59 trend or whether it was a one time thing. Do you know what it is? Well, the growth in the logged out is probably largely because typically if you use it on a phone, it tries to force you to use the app, right, so that you can be in this logged-in experience. And if you just turn that off, you can get a lot more logged out, because Reddit gets tremendous rank authority from Google. So if you just turn that off, I think that you'll have a lot of logged-out customers, and that will grow very quickly. And so maybe it's a decision that they'd rather have the top line number grow than have logged-in users grow. But the logged-in user growth has still been pretty healthy. It's basically doubled in the last three years. But to your point, Freeberg, if they said, oh, our business is really only these 30-odd million
Starting point is 01:18:48 logged-in users, it would be worth a lot less than saying 75 million. I think you're right, it's kind of like in the mid, you know, kind of two, three, four billion dollar range. The big problem is the ARPU, because these are not users that represent sort of Facebook's bread and butter, the kind of $40-ARPU user who lives in a good suburb in the United States and is monetized like crazy. I just don't think that's what these users are. But you can see that as a challenge or an opportunity, because if their ARPU is only 10% of Facebook's, there's a lot of headroom there to grow it.
Starting point is 01:19:28 If those users become economically more valuable. ARPU is actually down 2% year over year, Sacks. The issue with this user base is they're incredibly sophisticated internet users who don't click on ads and are kind of anti-ads, as opposed to the general population on a Facebook or a generic service. And it's anonymous, so you don't know who the user is, which is how Facebook has such incredible demographic targeting capabilities.
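A minimal sketch of the ARPU gap being discussed, taking the $3.42 Reddit figure quoted from the S-1 and the roughly $40 Facebook-style comparison used on the show; the Facebook number is the podcast's round figure and is treated here as an assumption.

```python
# ARPU gap: Reddit's quoted figure vs. the ~$40 comparison used on the show.
reddit_arpu = 3.42           # from the S-1 discussion above
comparison_arpu = 40.0       # the show's round "Facebook bread and butter" figure; an assumption
daily_active_uniques = 76e6  # daily active uniques quoted earlier

headroom_multiple = comparison_arpu / reddit_arpu
revenue_per_dollar_of_arpu = daily_active_uniques * 1.0

print(f"Headroom if the gap closed: ~{headroom_multiple:.0f}x")
print(f"Each $1 of ARPU across {daily_active_uniques / 1e6:.0f}M users: ~${revenue_per_dollar_of_arpu / 1e6:.0f}M per measurement period")
```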
Starting point is 01:19:59 There have been a lot of conspiracy theories, long-con theories that came up, Sacks, that we were talking about on the group chat. Maybe you could summarize this long game that was played, allegedly, by Sam Altman and the founders of Reddit to wrestle control of Reddit back from their nefarious corporate owner, Condé Nast.
Starting point is 01:20:29 Well, this was a post by Yishan, who is a former CEO of Reddit, that was published back in, I think, 2015. And he kind of lays out what I think happened. Or I mean, he says at the end, just kidding, but if he's a former CEO describing these events, he must be describing something he knows about, I would just think. But in any event, what happened is that Reddit was sold for only about $10 million a year after it launched. So like really, really small. And I think that it kept growing, and the founders realized maybe they had made a mistake, or that this was actually a bigger property.
Starting point is 01:20:52 And so they started scheming on how to get Condé Nast to spin it back out. And so Yishan lays out the steps they went through: they recruited a CEO who they kind of pre-agreed on. Then they had that CEO demand options specifically in Reddit from Condé Nast, which meant that Condé Nast had to create a separate cap table for it. And then once they had a separate cap table as a subsidiary of Condé Nast, then they could sort of pressure to have an outside investor brought in for the expertise.
Starting point is 01:21:25 That just happened to be Sam Altman and his fund. And you know, eventually, like step by step, they worked it to the point where they got Condé Nast to spin off the company. And I guess this plan worked. Now, it should be said that the largest shareholder in Reddit, according to the S-1, is Condé Nast, or Condé Nast's parent company. So no one's going to benefit more from this plan, or scheme, if you want to call it that, than Condé Nast. It was a smart thing for them to do to spin out Reddit to allow the employees to have options. And then, I would say, to bring back the founder Steve Huffman as CEO several
Starting point is 01:22:04 years ago. So it worked out for everybody. You know, who knows if it was all premeditated. So they own 30%, and if it goes out for $5 billion, they got $1.5 billion. And they paid $10 million for it. So that's 150x. This is like a Web 2.0... Web 1.0? Web 2.0. No, it launched in 2005 and it sold in 2006.
Starting point is 01:22:25 Yeah, it was Web 2.0. It was the same time as, like, Weblogs Inc. and Delicious and Flickr and all that, that whole cohort of little web apps that used Ajax and other things, when, you know, the web was just getting faster. And there were a lot of users. Well, anyway, congratulations to the team. It was smart for Condé Nast to do the spin-out and give up 70% in order to have 30% of what's going to be a multi-billion dollar IPO.
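The return math on the spin-out, as quoted: a roughly $10 million purchase, a 30% retained stake, and a $5 billion IPO mark. A quick sketch of that multiple; all inputs are the conversation's round numbers.

```python
# Condé Nast's rough return on Reddit, using the figures quoted above.
acquisition_price = 10e6    # the reported ~2006 purchase price
retained_stake = 0.30       # stake retained after the spin-out
ipo_valuation = 5e9         # the valuation mark being discussed

stake_value = retained_stake * ipo_valuation
return_multiple = stake_value / acquisition_price
print(f"Retained stake worth ~${stake_value / 1e9:.1f}B, roughly {return_multiple:.0f}x the purchase price")
```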
Starting point is 01:22:50 Great outcome for everybody. All right, issue five. Apple doesn't have the fast car. Project Titan is DBA, dead before arrival. Apple, as you know, has been working on an electric vehicle for a decade, self-driving as well. They've invested billions in the project, according to Bloomberg.
Starting point is 01:23:07 And Apple was targeting a 100K price point, basically going after the Model S Plaid with FSD. The company had 2,000 employees working on this project. It was called Titan. They had designers from Aston Martin, Lamborghini, Porsche. According to the report, most of the team will be transferred to Apple's
Starting point is 01:23:24 generative AI division. We talked a little bit about MGIE, their generative AI image language model. There's going to be some layoffs. I'm not clear how many. What does building a car have to do with building an LLM? Well, they weren't just building a car. They were all in on self-driving and not having a steering wheel. So I understand. I'm just saying it doesn't make a lot of sense that 2,000 employees that were specialized in building a car all of a sudden now become the AI team. It would be more like the full self-driving team is probably getting those AI jobs, and the rest are probably going to be laid off with incredible packages. But what do you think, Sacks?
Starting point is 01:24:02 I was always skeptical that Apple was even working on a car. It's just a very different kind of product than anything else they make. And so I never really took it that seriously that they were going to make a car. So the surprise to me is not that they canceled this, but that it was even true that they were working on it in the first place. Yeah. No, it was, there were reports of them having test tracks and everything. It was pretty well
Starting point is 01:24:25 established. And you could see that people from Tesla and other places had gone there. Does it say why they actually killed it? No, this is just a report. And the speculation is that they're going all in on AI. They just see that as a much better future. Well, I think it's more core to what they do. I mean, the car never seemed that core. No. And there were reports, and Elon's talked about it publicly, so we're not speaking out of school here or anything, but Elon and Tim Cook reportedly had talked, or there had been overtures, that maybe Tesla was going to get bought, and Elon's been pretty clear that during the Model 3 rollout he was considering selling it to Apple.
Starting point is 01:25:07 That would have been a smart acquisition if they had done that. Can you imagine all the Apple showrooms having a Model S in it or something, or a Model Y? I mean, boom, they would have just sold so many of them, and having the Apple operating system on that display. Oh. Well, wait, was it before the Model S came out, or the Model 3? It was the Model 3; that was the time I think they were talking seriously. Yeah. But remember, no one really believed in the Model 3 until it came out and started selling
Starting point is 01:25:30 hotcakes. Quite the opposite. They thought it was going to kill the company. Right. All right, everybody, for the Rain Man, David Sacks, the Chairman Dictator, Chamath Palihapitiya, and the Sultan of Science, David Freeberg. I am the world of Kinoa
Starting point is 01:26:06 I'm going all in What your winners why? What your winners why? Besties are gone Go 13 That is my dog, Piggie I know this is your driveway Oh man
Starting point is 01:26:22 My avid actor will meet me at Blitz He should all just get a room and just have one big huge orgy because they're all just useless. It's like this sexual tension that we just need to release somehow. What? You're a bee? What? You're a bee? Bee? What? You're a bee? We need to get merch.
Starting point is 01:26:38 I'm going all in. I'm doing all in
