All-In with Chamath, Jason, Sacks & Friedberg - NBA Gambling Scandal, Tesla Trillion Dollar Vote, Billionaire Tax, Amazon Robots, AWS Outage

Episode Date: October 24, 2025

(0:00) Bestie intros! (1:02) CA Billionaire Tax (17:00) Major NBA gambling scandal (29:51) Amazon's eventful week: AWS outage and leaked robotic automation plans (49:55) Tesla earnings, Optimus, Elon's pay package, "corporate terrorists" (1:03:54) Study shows AI bias Follow the besties: https://x.com/chamath https://x.com/Jason https://x.com/DavidSacks https://x.com/friedberg Follow on X: https://x.com/theallinpod Follow on Instagram: https://www.instagram.com/theallinpod Follow on TikTok: https://www.tiktok.com/@theallinpod Follow on LinkedIn: https://www.linkedin.com/company/allinpod Intro Music Credit: https://rb.gy/tppkzl https://x.com/yung_spielburg Intro Video Credit: https://x.com/TheZachEffect Referenced in the show: https://www.latimes.com/california/story/2022-05-25/rick-caruso-jeffrey-katzenberg-bicker-los-angeles-mayor-election https://www.thefp.com/p/rick-caruso-refuses-to-shake-jeffrey-katzenbergs-hand https://www.latimes.com/la-influential/story/2024-06-16/jeffrey-katzenberg-hollywood-fundraiser-democrats https://www.cnbc.com/2025/10/23/nba-billups-rozier-sports-betting-arrests-gambling.html https://oag.ca.gov/system/files/initiatives/pdfs/25-0024%20%28Billionaire%20Tax%20%29.pdf https://x.com/MovieTimeDev/status/1968107643643498744 https://www.coindesk.com/business/2025/10/23/polymarket-seeks-investment-at-valuation-of-usd12b-usd15b-bloomberg https://x.com/SawyerMerritt/status/1981119824991322553 https://www.nytimes.com/2025/10/21/technology/inside-amazons-plans-to-replace-workers-with-robots.html https://nypost.com/2024/02/21/business/googles-ai-chatbot-gemini-makes-diverse-images-of-founding-fathers-popes-and-vikings-so-woke-its-unusable/ https://ogletree.com/insights-resources/blog-posts/10-faqs-about-californias-new-algorithmic-discrimination-rules/ https://www.crowell.com/en/insights/client-alerts/artificial-intelligence-in-employment-update-illinois-requires-notice-and-prohibits-discriminatory-impact-in-use-of-ai https://www.nytimes.com/live/2025/10/23/nyregion/nba-illegal-gambling-arrests https://x.com/AndrewDBailey/status/1981409771505713650 https://seekingalpha.com/article/4832142-tesla-inc-tsla-q3-2025-earnings-call-transcript https://x.com/shayne_coplan/status/1981016949309239616 https://www.bloomberg.com/news/articles/2025-10-23/polymarket-is-seeking-funding-at-a-valuation-of-up-to-15-billion https://www.google.com/finance/quote/DKNG:NASDAQ?window=1M https://www.google.com/finance/quote/FLUT:NYSE?window=1M

Transcript
Starting point is 00:00:00 Maybe we should have Disgraziato Corner as like a regular feature of the pod. Good idea. I like it. Disgraziato. Each of us can give our Disgraziato of the week. Disgraziato at the end of the show. Yeah, I love that. And then we could do a Bestie Award at the end of the year for Disgraziato of the Year.
Starting point is 00:00:14 Sacks, once again this week, I'm giving my Disgraziato to Jason Calacanis. So what are the opinions from the mainstream? I'm going to try not to give it to you every week, but you made it hard this week. Let me tell you. That's right. Once again, you're sniping from the relative safety of your Texas ranch. I know.
Starting point is 00:00:32 I got the 50 calibers up. You want to jump on the ranch. Get ready for the 50. I think we're all going to. You're all going to have the ranches next to me. If this California wealth tax passes. Oh, my God. Let your winners ride.
Starting point is 00:00:49 Rain Man, David Sacks. And it said, we open sourced it to the fans. And they've just gone crazy with it. Love you besties. Queen of Quinoa. I'm going all in. What's the story with the California wealth tax?
Starting point is 00:01:04 Can somebody explain this to me? Okay, so the SEIU, the service employees union, filed a ballot initiative, which means a direct-to-voter vote to amend the California Constitution to introduce a one-time billionaire's wealth tax, where billionaires, anyone who has assets over a billion dollars, net of their debt, has to pay a one-time tax of five percent of their net worth. including their private stock, including their real estate. You said 5%? 5% of their net worth, not of their income of their net worth,
Starting point is 00:01:39 the entire net worth, one-time payment to the state of California, and then there's an allocation on how that money will be spent, but it's a one-time billionaire tax. Now, it is very likely that this sort of an amendment to the California Constitution is not constitutional and actually cannot be made and will not actually go into enforcement, even if the voters do vote to approve it in both a federal and a state level based on this concept of uniformity, which is that you have to tax everyone equally, except for the case of an excise
Starting point is 00:02:08 tax, which is like income or a transaction, you're allowed to tax disproportionately based on the size of the income or the size of the transaction. But if you're going to tax on property, if you're going to tax on an asset, you have to tax everyone uniformly. So it is likely not going to go into effect if it does pass. However, it is very likely the case that the SEIU is, simply using this as a baiting mechanism to get people to stand up and denounce it. And then they will be in a position to attack those people and destroy them and use this effectively as a political fodder for this next election cycle. That's what it seems like the true kind of motivation. Let me go on the record. I think this law is great. He's getting the virtue
Starting point is 00:02:50 signaling points. I would just like to say, may I be the first to pay five percent? I'll be in the front of the line. Let me know when to show up. I'll bring my check. Who do I sign the check towards? Should I bring cash, Gavin? Let's just bring it cash to your, which one of your mansion should I bring the cash to, Gavin? This is strategically why Chimoth, I'm glad I got out of California right before I was about to billionize. That was the smart move on my part.
Starting point is 00:03:17 Freeberry, what are the odds that this goes into effect? Can you just handicap this? Yeah, well, we don't know who's going to come out against it, but there's an effort to try and get top Democrat officials in the state of California to say this is silly if you do this, people will leave the state, yada, yada. So that's kind of a quiet underway effort.
Starting point is 00:03:36 But I don't know why the citizens of California, the majority of citizens of California, would not vote for this. Why, who wouldn't want to tax the billionaire's 5%. Come on. The way that it's written, it says, hey guys, we're 30 billion in the whole, and there are
Starting point is 00:03:52 200 Californians that control true trillion dollars. we're just going to ask them to pay a one-time fee of 5%. And I don't see how anybody would say, hmm, that doesn't sound unreasonable at the ballot box. Right. And then the people that step up against it and are vocal against it and point out like,
Starting point is 00:04:13 hey, in France when they did this, they lost like 40% of their revenue because all the wealth left the country. The reality is that this sets it up to go through the legislature because if it goes through the will of the people and it gets overturned, as you say, Freiburg, then if you're legislatively smart, then you'll actually push it through the state senate. Oh, but don't you remember they did this in 2020? No, no, and then it will not get
Starting point is 00:04:35 veto. Like, because then it's like, hey, listen, it's clear that the people want this. So I think that you have some kind of progressive taxation system that conforms to the law. I mean, the one, yeah, they're already trying to extend Prop 55, which is the progressive tax for people making over a billion dollars, they're going to get that pass. That's going to be this incremental tax on income. But the one-time wealth tax- I think the million dollars thing, I think that's harder to hunt because there's too many people that that touches. A million dollars today in 2025, not to be glib, is just not what it used to be. But a billion dollars does cut off most people except for a couple of hundred. That is true. And I think that, for example, it's very reasonable to then
Starting point is 00:05:16 charge a 10% excise tax on selling appreciated stock. Why not? There's all kinds of ways that you can get billions and billions and billions of dollars. So I don't know, I think that this is more of a trial balloon to say, can we draw a clear line between 200 Californians and the rest of California? And to the extent that that bright line becomes visible and it's okay, people are going to go ham. They're going to try to get as much as it can. The reality is, as we all know, I mean, Larry Ellison left the state. Many of the founder of CEOs who have built large technology companies in California. Elon left the state will eventually at some point break and say, okay, I'm moving my company out of state. And I'm leaving the state and I'm
Starting point is 00:06:02 bringing the employees with me and I'm bringing all of the economic value this business with me. And people will never learn that lesson because it's so much easier to sit in front of a voter and say, hey, should we tax these 200 people to give you better benefits? 97% of people will say absolutely. Very few people will sit and think about the consequences of what's going to end up happening. 99.9 will say absolutely. I mean, nobody tells them in that ballot initiative that we have a $300 billion budget of which two-thirds may be just wasted. One of the motivations for this bill, and this is why it's being proposed by the SEIU, is that there are these massively ballooning pension benefits and pretty significant increases to the pension programs for both
Starting point is 00:06:46 private and public pension funds in California, which has actually become a very sizable liability for the state and for some of these private pension programs, and they're trying to fill the pension hole, which we've talked about in the past, but there is a multi-pillion dollar unaccounted for pension liability in this country that's going to have to come from somewhere. You're either going to have to print the money because the federal government's going to step in and fill the hole in all these pension obligations, or they're going to have these massively progressive tax programs to try and fill the hole. And if and when they do, as we all know, there will be an economic cycle that will be pretty nasty, which is all the value will leave
Starting point is 00:07:23 that jurisdiction and move elsewhere. But let's see. It's like the Democrats are doing everything they can to get me to leave the state. I don't want to. I really am resisting. I mean, they've raised my income tax to, what is it, like 13.3%. 13.3, yeah. And I know it's going to 16. They've been boiling the frog. I still haven't jumped out of the pot. But for me, I think the wealth tax, I'm going to have to jump out of the pot with this. The crazy thing with this, the other, like, I read it because I was like, oh, my God, what's going on? The two things that they obviously got somebody very clever to draft it, because any Roth IRA
Starting point is 00:07:59 over 10 million counts, and normally in these wealth calculations, you keep your deferred retirement accounts off the table. They're typically not included. So folks, and I'm not going to say who they are, we all know, have tremendously appreciated Roth IRAs. Pretty publicly you're talking about, but sure. Those are included. And then the other thing is that if you actually did any tax structuring, the real
Starting point is 00:08:24 valuable tax structuring is where you set up these trusts in Wyoming and North Dakota, and you do these inter-party loans where you can lever up 10, 20x, so you can transfer billions and billions and billions of dollars out of state, but then you have these obligations. Those are negated and they don't count. So all that tax structuring goes out the window. So you can get into a very difficult situation here. where they're like, hey, you owe us $500 million, a billion dollars, two billion dollars, and the only way to pay it is to give an I owe you to the state of California, which is crazy.
Starting point is 00:08:57 It's crazy. There is not a lot of ways out of this if this stance. No one is empathetic to either of you. No one gives a shit about the two of you needing to pay more taxes. I'm in support. Again, I'm just saying it for the record. I am in deep support. Deep support.
Starting point is 00:09:12 On this? I had a few more thoughts about this thing, which I want to unpack. Number one is, like you guys said, a wealth tax has been tried in many places and many times. It always backfires because whatever the tax benefit is that you get for the state, it's greatly outweighed by the economic depression that you get by the wealthy people, the job of creators, companies leaving. And as soon as you cross that line of going from no wealth tax to any wealth tax, enough people of wealth can see the tea leaves.
Starting point is 00:09:43 They can see the writing on the wall that they have to leave. And that's why I think even if they say this is a one-time thing, we all know that it won't be one-time. If they get away with it, it'll become a regular thing. Of course, if it's to plug a deficit, they're going to run deficits every year. Exactly. And you're right. And this isn't even to plug an emergency situation or an unfunded liability, like some one-time thing. No, it's just a regular operating.
Starting point is 00:10:07 It's just regular mismanagement. So they will have no incentive to fix their mismanagement of the state and their deficits and all that kind of stuff. By the way, by the way. If they get away with this. And it's not just going to be billionaires. Eventually, the line will get pushed down. The billionaires will be gone. The billionaires will be gone.
Starting point is 00:10:21 Like the income tax in the U.S., I think it was a 1% income tax originally. And it was like just a one-time thing for wealthy people. And then it became a smaller thing for, you know, lower income people. And then eventually, as we all know, every person has to pay a tax, every property has tax and so on. I mean, these are, this is the problem with government. There's all these other states, by the way, that are finding clever ways.
Starting point is 00:10:43 I think in Montana now, there's a differential property tax scheme where if it's your second or third home and you don't live there, you pay a lot more. Yeah. Here's what I wonder about is, you know, what are guys like Jeffrey Katzenberg or even Ari Emanuel thinking about right now? Because they're kind of the higher ups in the Democratic Party behind the machine. It's sort of the oligopoly that kind of runs the machine. and I remember that when Karen Bass was running against Rick Caruso for mayor of L.A., it was
Starting point is 00:11:16 very publicly reported that Katzenberg was behind Karen Bass, and there was sort of an ambruglio between Caruso and Katzenberg. Katzenberg, anyway, helped make sure that Karen Bass was well-funded enough to win. The result of that, ironically, was that Pacific Palisades burned down, and I think Katzenberg's house might have been part of that. In any event, I think there are these guys who are very, very wealthy who think that they can control the machine well enough that they basically are still in control of this thing, right? That, in other words, that the tiger won't eat them, right? The tiger is socialism.
Starting point is 00:11:52 And that's exactly right. You know, they think they've got the tiger under control enough that it won't eat them, but I don't think they do. Maybe they don't. Maybe this is the tiger breaking loose. Yep. And I think, forever you pointed this out, that there was an attempt in the legislature last year to pass a wealth tax, and it was quietly killed behind the scenes. And I actually think that
Starting point is 00:12:10 Gavin Newsom might have something to do with that because he has presidential ambitions, so he can't let the state go full socialist. But you just kind of wonder, okay, well, if these guys lose control of the strings they have to control the beast of socialism, does the whole thing to spin out of control. That's New York. We're seeing it everywhere. Seattle. I was about to bring up. Let's talk about it. Well, and just to let people know about the France situation back in, I don't know, 2011, 2012. They did get rid of Giraud-Dir-Pardieu, which was kind of a win. But Bernard Arnault. What's his, thank you, Sachs. Bernard Arnault. Is that his name from LVMH, Sachs? Bernard Arnault said he was going to go
Starting point is 00:12:51 to Belgium. And then he unwound it. But that was a clear signal, the richest man in France. Well, he's the founder. Then we go to New York. He's basically the entrepreneur I mean, it's their biggest company. It's the one that does all the luxury goods, all the craft goods that they're so famous for. I mean, yes, him threatening to leave France is, you know, would be massively filing paperwork. Oops, what an accident. Here's your look at New York City under Mondami, who we're in touch with. He may come on the program.
Starting point is 00:13:25 New York State tax 10.9 percent, city 3.876 percent. And the 2% Mandami tax puts you at 16.8% for living in New York. It's 17%. I mean, if you were making $10 million a year, is it worth $1.7 million? You could get a plane. You could live in Florida. You could come to New York, 150 days a year. There's really five good months in New York, the fall, the spring.
Starting point is 00:13:56 And that's about it. You know, you go see the tree at Christmas, but it's cold. Well, that's not, that's not realistic from. most people, and especially if you have kids and you care about them, you'd like them to be rooted somewhere. You're not going to schlep them around every month to arbitrage taxes. Well, I mean, I do think they're going to test that at 17%. That's non-de minimis. Okay, let's, we've got a lot of docket to get through here. I'm so glad to Mov supports the billionaire's tax. That's great. We'll get that in the headlines this week. Right after all his
Starting point is 00:14:23 facts. This is the free rider problem that we have is no one's going to want to stand up against it. Exactly. I think we know that. By the way, if you're the, if you're a billion, billionaire CEO of a public company in California. No, they're superhosed. You have everything, you have everything to lose to stand up and oppose it. Your employees will run. Your shareholders will attack you. You'll look awful in PR.
Starting point is 00:14:43 So everyone's going to sit quietly and start looking at houses on Zillow in Austin or Miami and be like, where should we move to next year, honey? You know, like that's the conversation that's going on behind those doors. Didn't you say it was retroactive? It's retroactive to 2026. So if it passes next to you have three months. You have three months. Two months.
Starting point is 00:15:01 But again, I don't think it passes muster with the constitutional reeds. Yeah, but you know what? That's what they said about, remember when they did the transfer tax where San Francisco took 6% of my home? Yep. Yep. And then in L.A. just took 5% of my house down there, the supposed mansion tax. But those were excise taxes. So if you go back to the case history in the U.S. Supreme Court on this stuff,
Starting point is 00:15:23 anytime there's a transaction and you take a tax on a transaction, they call that an excise tax. Well, you know, there's a constitutionally allowed. of the bill, though there's a part of the bill that they could cleverly use, which is called this ODA, which is effectively this IOU mechanism. And they could essentially say, when these assets transact you owe us 5% on an excise basis. And by the way, there's an attestation that you have to file. You have to file a legal document. And this was quite well written in there, which said, you must attest that you have less
Starting point is 00:15:51 than a billion dollars. Okay. Now what? Okay, then I have to attest that it's more. And then I have to now... How do you even mark your whole? portfolio to market if you're in a lot of privates. They do not allow discounts. They do not allow liquidity discounts. It says if you are a reasonable buyer and a reasonable seller,
Starting point is 00:16:09 you have to transact this at market price. So, for example, imagine you owned a sports franchise. And the sports franchise, if you sell a minority share, you're typically selling it at a discount off the table. If Forbes says it's worth 10 billion and you own 10%, that's a billion dollars. For the purposes of this calculation, I'm going to pay $50 million to keep it, even if you paid $50 million to buy it. Even if Freeburg is right that there's a good chance that it will be found unconstitutional, how many years in the course is that going to take? And who's going to stick around waiting for that? In fact, the rational thing to do is pull up stakes before January 1st and leave right now. That's right. That's going to happen in New York.
Starting point is 00:16:46 I mean, I think they're going to have an exodus just like New Jersey and Connecticut did. And that actually rocked the tax base in those two geographies. All right, listen, big breaking news this morning. Huge scandal in the NBA. The FBI just arrested 30 people in a sports betting and gambling probe. This hardly seems real. Chauncey Billups, who is the current Blazers coach, and was just introduced into the Hall of Fame,
Starting point is 00:17:13 got pinched for a poker game. He was running allegedly with the mafia that was rigged, 17 different ways, allegedly, to Sunday. Terry Rozier, allegedly, is a point card from the Miami. Why are you saying allegedly all the time? You know, everybody's suing these days, so allegedly, I allegedly he's a point guard. I've seen him play.
Starting point is 00:17:32 He's not a very good point card. It's a lot of turnovers, if I'm being honest. You know what I know is alleged that you're the world's greatest moderator. That's allegedly true. It's allegedly true. Because it's not true. You're allegedly a billionaire. Nobody can confirm it.
Starting point is 00:17:45 Normally, he uses the word allegedly when it's a subject. story that like it's about 100 Biden or doing something improper. Yes, he allegedly smoked crack and shot a nine millimeter in the air. It's usually a story like Democrat wrongdoing and he's trying to discredit it. All right. Here we got. Anyway, keep going to. Terry Roseir, who's allegedly a point card.
Starting point is 00:18:05 I mean, he was, he told his friends, this is crazy, you know, in the over unders. You know, hey, guys, bet the under on me in rebounds because I'm going to take myself out of the game with an injury, allegedly, and his friends, allegedly, made $200 grand off this. Okay, just allegedly for the whole goddamn thing. This is going across 11 states and a bunch of crime families. Allegedly, there's something called the mob. I don't think that really exists anymore. I think that's an urban legend.
Starting point is 00:18:36 And these are two separate threads, but announced on the same day, they both involve NBA players, but apparently this is two different cases. So, Chimath, what do you allegedly think of this? I think it's crazy. I think you're seeing a lot of these trends converge all at the same time, meaning you have the emergence of all of these prediction markets. You have a lot of data science and AI being used that shows that there's a lot of odd behaviors.
Starting point is 00:19:08 So it really was the squares versus the sharps. And if you had the inside edge, you were just printing money. Now that all of that is becoming more transparent, there's a lot less margin. Then what happens is you have these laws passed in the 11th hour. There was an important gambling law that was inserted into the big, beautiful bill that has implications to all of this. And now you're seeing the feds. The crazy thing to me is a press conference where Cash Patel is talking about this. I mean, that's like serious business when the FBI director is front and center talking about all this.
Starting point is 00:19:40 So I don't really know what it means, to be honest. I was shocked at the scale of it, and I was shocked that it's on the radar of the feds. I thought this is like pretty typical tiki-tacky stuff, but clearly there's something bigger. I don't know exactly what that bigger is, but something is happening where all these markets are smashing together. There's just a big cleanup effort going on. So I don't know. I really don't know.
Starting point is 00:20:05 Breederberg, I guess there's two different ways to go about this. You have the fantasy sports becoming legal, everybody around these players, just in that one case, where are these people too dumb to understand that? their $10 million contract to play in the NBA every year or $20 million contract is more important than your friends betting the under or over. And how dumb are they? I mean, to not know that the people running a sports book would look for weird action. Like, why is one player getting $200,000 on their over, under for rebounds? And the other players are getting $20,000? What are your thoughts here, Freyberg? And you can also take on the poker one. I think gambling
Starting point is 00:20:45 generally, as we call it, should be decriminalized. And I don't like this state-by-state set up with gambling. I think we should have a federal regulatory body to oversee monitor. And the problem is you have state-by-state kind of patchwork of regulatory authority that makes it very hard to standardized crack and provide also guidance and feedback. I would much rather see this all kind of get handled at the federal level and better organized. To Chimov's point, this is not going away. People love to bet on stuff. They love to gamble. This is part of sports. This is part of the culture. Look at Polymarket. You're not going to just turn it off. Look at Polymarket. Polymarket raised whatever it was, a billion or $2 billion at $9 billion. Then the next weekend, they've announced
Starting point is 00:21:30 sports betting. And now they're raising money 30 days later, allegedly. Allegedly. At 12 to $15 billion. I mean, it's unbelievable. And you can see, by the way, the way that Draft Kings and Fandual stock have reacted to this. Those companies are toast. That's right. It's really interesting. The polymarket model is the best model because it creates a market. And so as information flows in, that market will dynamically adjust and everyone will get
Starting point is 00:22:04 a more fair price. Did you see the regression that they did on the polymarket trades and how well they're in the money? Yeah. Nick, can you find that? But basically what it showed is like the front money is the sharps. The back money are the squares, but you have to fade the trade in the first week. So there's a very scientific method where if you want to make money on polymarket, it became pretty clear.
Starting point is 00:22:25 There's two things that are very interesting about it is, number one, how they've simplified things to a way people can understand. It's not like you have to understand, you know, it's 120, it's this, the point spread. It's just what are the, what's the chance that this thing happens, 80%, 20%, people could just place their money on it. And then this ability to reconcile it at any time, I didn't realize how engaging that is. I was watching the Oscars and I was watching boxing. And I bet the underdog and this Netflix boxing thing that happened because I just thought, this guy looks pretty pissed off. And I thought that was a good enough way to go with the underdog.
Starting point is 00:23:01 And then you watch it round after round and you see the odds changing real time. And any time you can just cover the bat and take your winnings and take out the risk. Really like interesting and fun for people. It's so simple. And then I did it on the Oscars or the Emmys. And I was like, yeah, I'm going to, I'm fading, no offense, Ben Stiller, but I'm going to fade severance. And I went with the one about the emergency rooms and with Andor, and I won again. So I'm just, it's a lot of fun to do it.
Starting point is 00:23:31 Here, Jason, look at this. I sent Nick the tweet, but this is incredibly systematic. This is over many, many, many markets. But basically, 89% accurate one week out, but in the final four, hours, it jumps to 95, which means that if you follow the sharps along this pattern, you're going to make money. 6% in a week. Yeah.
Starting point is 00:23:51 Polymarket actually has the news before the news does. And this is one of the most powerful outputs of Polymarket, is they're actually getting a read on what's going on in the world before the media recognizes it, before the public recognizes it. The experts. Yeah, when you put money up, it actually turns out that when people have incentives, that market will find the truth. Somebody needs to build the app that makes all of these things fungible. And by all what I mean are cryptocurrencies, betting markets, equities, and options.
Starting point is 00:24:24 That's what Polly Markets turning into. Yeah. And the reason is there's just no reason to go to nine different sites and have nine different accounts. And the most important thing is to do KYC and AML across nine sites to get access to liquidity, credit, and margin. You'll want to do it once. And then you'll want to have a large pool of capital that you can trade across anything. So I can go long invidia, but I can also go short the Nix. And then I can own some Bitcoin all in the same trade. Totally. That's where it's going. Totally. Wow. Now, to the earlier question, JCal, I think if we end up there where polymarket does become the truly liquid market across all of these kind of predictions, all of these assets, then a lot of what we are seeing with respect
Starting point is 00:25:10 to insider trading, insider information becomes much more apparent. So the problem with the sports betting is that there's a one-sided bet. The casino sets the odds or whomever is setting the odds, and then you're either taking one side or the other. And so if you have the insider information,
Starting point is 00:25:26 you're taking the side that creates an arbitrage opportunity for you. But if you were to do that in a liquid market where there's someone taking the other side in a dynamic way, then the market very quickly moves because of the inside knowledge you have. and that inside knowledge is now reflected in the underlying asset price, in the underlying odds that you get for that bet. And so Polly Market actually brings truth and transparency
Starting point is 00:25:49 to what is currently an insider arbitrage opportunity, and it may actually solve some of these fundamental problems in gambling. I think let's just wrap with a little bit on the poker and knowing if you're in a rigged game or not. Living in L.A., I got invited to a lot of poker games when I was playing low stakes, playing in Hollywood Park, just, you know, $500 buy and $1,000 buy-in. But as these things went up, you started to get access. And I started to get invited to Molly's game, the very infamous game. And she would text me, she would call me, oh, we're playing over here. Oh, Leo, and this person wants to see it.
Starting point is 00:26:22 That person wants to see. I was like, they want to see me lose $50,000. There's no way. I'm not playing in that high stakes. And I'm not going to that game. And the one or two times I did go to games that had a rake, I was just like, this game is fixed. I don't know how. Totally.
Starting point is 00:26:33 but somebody's I think it's just collusion I think there's three players all playing from the same chip stack in which case you know you could be dealt aces five times in a row if you're up against three players what are your odds against you know six other cards it's going to be pretty bad for you think Molly's game was fixed I don't know if hers was I wouldn't be surprised if it was I wouldn't be because once the mob gets involved which is what happened at the tail end of hers then all kinds of possibilities happen once it gets to extremely high and you've got guys chasing it, man, you could, you know, and they're coming back night after night, try to catch up for what they lost the last week. It's, it's pretty dark. There is absolutely no reason why anybody should play in a game where you're playing with people you don't know. And if you need it that badly, then you probably have a problem. But there is no limit at which you couldn't find a game with some combination of your friends and or respectable, reputable businessman that have more to lose than you do. And if you can't find that game,
Starting point is 00:27:36 you should not be playing in any game. Yeah. Any home game with a rake is just should be absolutely suspect. Period. Super sketch. Isn't that a sicken? Yeah. Well, we don't bring up angle shooting.
Starting point is 00:27:49 He would be so tilted if he hears you say. Oh my God. He's so about the ethics he wants, you know. No flies out. In fairness to that game where you can. you know, go off for a small house in a... is also the game where he would
Starting point is 00:28:05 then collect $10 from each of us to pay for the fruit plate and the pizza. He would order Domino's thin-crossed. He wouldn't even buy us pizza. Yeah, now he's got the chef. But I'm like, I don't know if the chef really does cost $6,000 for two hours, bro. I don't know.
Starting point is 00:28:20 It's Wagyu. I think it's a Wagu burger. The funniest ever was he's like in a hand and the Domino's pizza comes and, you know, he's like everybody have a green chef when we're playing with file of chips. He's trying to get like $125. The guy comes. I just go, it's on card. The guy's you got a sign, right? It's got the tip on it. I said, it's like $150 in piece. I said, what's the most, what's the biggest tip you ever got? He said, yeah, somebody on New Year's
Starting point is 00:28:43 gaming like $200. I just wrote $500 on $150.00. I signed it. I gave it to him. And then I was in hand with this guy, says, here's the receipt. So what you're saying is when it's on somebody else's credit card, you're willing to tip incredibly generously. I mean, God, you're a really great guy. You should speak. When Phil Helmuth and I bought dinner for everybody at Chippriani that time, Chimop grabs the check.
Starting point is 00:29:10 He goes, I'll put the tip in for you guys. Well, why is that, Jason? Is that because every other time I paid? You put a 100% tip on an $8,000 check. Isn't that because I pay for everything all the time? That's true. You are very generous. It's you're no sq.
Starting point is 00:29:24 Or no sirek in 2010. One time. I ask you guys in 15 years to pay one time And you remember the exact, it's so sad. I have got millions on you guys. I guess we're going to public school. You guys are so ungenerous. It's unbelievable.
Starting point is 00:29:38 I give huge tips. Yeah, I think he's, you know, this average tip. Okay, let's go to the next topic. Allegedly, world's greatest moderator. Let's talk about this Amazon outage, tough week for Amazon. They had this huge outage in the beginning of the week. And then they had a bunch of leaked documents about their plans for jobs. And Monday, massive AWS outage, 2,000 companies, 4 million users unable to function on the internet for half a day, 15 hours, 20 hours.
Starting point is 00:30:10 And then on Tuesday, internal docs viewed by the New York Times showed Amazon plans to not hire 600,000 plan jobs because of robots by 2033. So this isn't, they're planning on laying off 600,000 workers, but rather they're just pulling back their hiring plans. and ramping up their robotic plans, which you would expect. And their goal, according to these internal leak documents, is to automate 75% of warehouse operations. We talked about this the last couple of weeks. Freerberg, your thoughts on either of these two stories here? I think the AWS story is interesting in terms of its implications for the clouds.
Starting point is 00:30:52 There's effectively three major cloud vendors that compete with one another, AWS, Microsoft, and GCP or Google Cloud. and I'll just give you these numbers. Oracle also, by the way, coming on strong. That's right. But let's exclude the number four for now, Oracle. But AWS is $124 billion revenue run rate. Microsoft, $120 billion in Google Cloud, $54 billion.
Starting point is 00:31:14 But AWS, which is slightly larger than Microsoft, is only growing 17% year-over-year. Microsoft's 26% year-over-year. And Google Cloud is accelerating at 32% year-over-year, and some say getting closer to 40% growth rate. The big thing I hear from partners and enterprise customers of these cloud services is that many of them, if not all of them, as they scale up, move to a multi-cloud model. So none of them want to be dependent on a single cloud. Many folks started on AWS because AWS was the OG.
Starting point is 00:31:47 Back in the day, when I was running Climate Corp, I was the largest EC2 user in AWS for about a year and a half, which was their elastic compute cloud service. We were running all these models back then. And so I knew that service very early on. It was very unique. It was very powerful. And so a lot of companies that are old school established themselves in AWS very early on. But the outage that happened this week, I think, starts to highlight for folks that they can't
Starting point is 00:32:09 and shouldn't have a dependency on a single cloud service provider and will only accelerate the diversification of companies into the other clouds. And so I do think this is actually a very beneficial situation for Microsoft and GCP and Tierpoint JCal, perhaps even Oracle. in terms of giving those sales teams, which are very aggressive, a hard story to go and sell for and say, guys, you don't want to just sit on AWS in case this happens again. We've got better infrastructure.
Starting point is 00:32:37 We're more reliable, et cetera, than these other guys. So come and move over to us. And that might be a little bit of a naive, simplistic, kind of reductive way to think about what happened this week. But we are seeing the smaller competitors accelerate. And I think that this might be another kind of moment of acceleration for those folks. And multi-cloud. It's been around for a while, Jamoff.
Starting point is 00:32:55 when you're doing stuff with 80-90, are the big companies already doing that, or do they assume, hey, there's going to be some downtime? Yeah, it's okay to risk, or are they really thinking multi-cloud, neocloud, let's have some smart, intelligent routing and redundancy here? I think there are two markets. There's the AI market, then there's the non-AI market.
Starting point is 00:33:17 In the non-AI market, everybody has everything. It all looks effectively the same. There's certain products and services that are unique to Azure versus GCP versus AWS, but by and large, the market is big enough and important enough that you'd have to be pretty insane to take a single vendor approach. And so what typically happens in these markets is that they start off really small. One person has all the share.
Starting point is 00:33:49 And then as the market becomes very valuable and very big, everybody diverting. diversifies because it's a risk management thing. And these things flow into the disclosures you have to make as a public company. And if you didn't have that diversification and something bad happened and it impacted your business, you could get sued. So there's all these reasons why eventually all these three big companies will converge effectively roughly a third, a third, a third. We're going to debate the path to get there, but that's where they'll end up. You know, there's this principle called the rule of three where they say like all markets eventually mature to kind of a 60, 30, 10 split. that you end up having your market leader at 60% market share, second place is usually half the size at 30. And then you always, there's some balance in the market where there's some competitor that resolved to about a 10%. It's really interesting. If you guys were to place a bet, who would you think is the 60, 30, 10? I don't think that applies to you. I think there's going to be a third, a third, a third. I think it's all some idiot making something up. But what do you think? What do you think happens in cloud? Do you think that these all converge
Starting point is 00:34:50 to equal market share? In non-AI, it's a third, a third, a third of third. It will. It'll take circuitous paths, but that's where we'll end up. By the way, a good point to make is that this revenue number that I highlighted for Google Cloud, Microsoft, and Amazon actually include their applications. So, as you know, Microsoft GCP have pretty sizable enterprise application stacks that are built into that number, which gives them, obviously, the ability to drive cloud usage because they've got demand and sales relationships into those enterprises. I think the way it works in AI is that you initially, right now we're in this early phase where there's two paths. one is you need a specific model and it's relatively well integrated using a specific subsidized form of hardware on one of the hyperscalers. But eventually, you'll get more of that abstracted away as it gets pushed into the infrastructure so that you have less dependence on one
Starting point is 00:35:42 model. There's a lot of work that has to get done and a lot of in-memory infrastructure that is not yet built that has to exist. But once that exists, it'll be easier for all of us at the application level to view these models a little bit more fungibly. And then at the bleeding edge, you'll have the folks that basically give you some form of a hypervisor or virtual machine or the bare metal. And that's where the neoscalers are doing really well. But I think my point is that in any important market in compute, in technology, where there really isn't much of a differentiation, I think you'll end up with these hyperscalers at a third, a third, a third. Now, if one model is way, way better, and it's only on one of the clouds because Google writes a big check or Amazon writes a big check, I could see that swaying the AI share. But in the absence of that, I think cheaper, faster, better is sort of the end destination for everybody.
Starting point is 00:36:39 What an extraordinary outcome for Amazon, where AWS is like 15% of their revenue right now, Freiburg, but it's 60% of their profits, profits today. And that was just a side hustle, like a little project they took out of nowhere, and it's having the same impact on Google in other places. So side bets and side quests are just, you look at the Waymo side quest for Google or even a lot of Sergei's other bets like, and Larry, flying cars, looms, low earth satirites, Google fiber. All those X projects were so, they had so much potential. DPU, Deepvine, TensorFlow, GFS, 50. Robotics. It's pretty consistent. Boston Robotics.
Starting point is 00:37:20 They bought all those robotics companies. Man, it's like somebody got to them and we're like, yeah, you know, you're seven, eight years into this. It didn't happen. The problem that Google has, unfortunately, is like they have so much stuff. It's not really value. And so they're going to go through the same problem that everybody else who's a conglomerate has, which is this decision. Now, Buffett, when he got to that decision, said, I don't care. This is my life's work. And so I'm just going to keep everything aggregated.
Starting point is 00:37:49 But now you're going to get to this thing where the intrinsic value of everything they have will far exceed the actual value that it trades at. And so there'll always be these fissures of pressure. And then if one of these things requires a lot of money, there'll be pressure. And that pressure will be segregate these things so that I can own one versus the other. And that's always the thing that happens in public markets is you go, you go, you kind of swing back and forth. So I suspect that this is going to happen at Google. This was what they set up to do with Alphabet was to be the holding company. And then to your point, they made that evolution, particularly in a company like Waymo where they said, we can't be the sole funder. They brought in Silver Lake.
Starting point is 00:38:29 They brought in all these other investors. They did this actually with Verily. They did this with a bunch of these, what they call other bets is they made the conscious decision. Because Chimov, on the flip side, by bringing an outside capital and having an independent board for these subsidiaries, they were actually able to drive better outcomes because now there was governance and there was aligned interests that could then take management and say, guys, if you can deliver these results, and you have this kind of external pressure as opposed to the softness. But it's that, it's something else. There's no way somebody as smart as Silver Lake comes in if they think there's not a path to
Starting point is 00:39:01 liquidity. So the other thing they have to promise is they're like, listen, we will take this company public and in return, you will help us build a better company than we could build ourselves. Well, it seems that Silver Lake has done their part of the bargain. Now it's up to Google to live up to their part of the bargain because if it doesn't get liquid, it sets a very bad precedent for everybody that committed capital into that company. Of course, yeah. Waymo going public would be unbelievable next year.
Starting point is 00:39:27 If they did that, what would that look like in the public market? It's $250 billion? No. Take it. Take it easy. Stop. Don't do that. You don't think so?
Starting point is 00:39:37 I think it'd be huge. Jason, we all objected to talking yet again about AI-driven job loss, yet you insisted on putting this AI robot story from Amazon in. I think you have something to say. Thanks. Let me take you through a presentation. Well, you have slides? No, I've been just, I am working on a presentation based on a lot of stuff we've been talking about here.
Starting point is 00:40:01 I threaded it together. You know, I talked, we're just talking about Google and the size of the company. Right now, they are in 2025 at 187,000. They were at 190,000 people in 2022. And their revenue has just gone from 283 to 350 billion in basically three years. And when you look at this Amazon stuff that came out, I just wanted to point out a couple of things. It's not just that they're not hiring these 600,000 jobs. It's that they are in full-blown crisis preparation for this.
Starting point is 00:40:32 They have crisis teams writing up. how to handle this and be a good corporate citizen and they're talking about having parades and paying for toys for tots and they're even trying to get the executives to say things like co-bots as opposed to robots let's not call them that let's call them co-workers and co-bots and when you look at this just to open up the aperture here right now walmart and amazon are the number one and two employers in the u.s two point one million people work at walmart over a million at Amazon and three million people, as we know, work in taxis, Uber, door-dashers. All those jobs are at risk.
Starting point is 00:41:09 And we talked about this back in June when Andy Jassy telegraphed all this in a blog post where he said, the next few years, we expect that this will reduce our total corporate workforce as we get efficiency gains from using AI extensively across the company. They believe that they're going to have significant job displacement, which is use the more neutral term here as opposed to job loss or not hiring. And when you look, I don't know if you saw today, there were a bunch of MAGA people saying like, oh, these interlopers and the MAGA movement are not taking into account the bottom half of the MAGA movement, the workers, people who don't own equities. And when we look at electricity, spiking, you were on that story
Starting point is 00:41:52 last week, Chamoth, or maybe it was even two weeks ago now. The energy department just said electricity costs for residential are going to go up 4.8% this winter. And this is going to start this anti-AI boom counter. And I tweeted about this and I thought I would maybe end here with
Starting point is 00:42:14 Elon replied to my tweet and said AI and robotics replace all jobs. Working will be optional like growing your own vegetables instead of buying them from the store. And Senator Bernie Sanders came out instead I don't often agree with Elon Musk but I fear that he may be right when he says AI and robotics will replace all jobs. So what happens to workers when they have no jobs
Starting point is 00:42:32 for income? AI and robotics must benefit all humanity and not just billionaires. And I'll stop there because this, I think, feeds into your story for the last two years on this podcast, Freeburg, which is the rise of socialism, these things, and Bernie Sanders being the standard bearer for democratic socialism, these things are starting to come together. They're starting in people's minds, whether it's the original MAGA guy saying, well, what's going to happen for American workers? We know that the Trump 2.0 agenda is doing great AI buildout crypto, all this great stuff, trade. But the bottom half that you keep talking about, Freiburg, is starting to connect on this issue. I think that you are characterizing AI automation and technological progress as the
Starting point is 00:43:22 core driver of the socialist influence. And what I would argue is that the actual core driver of the socialist influence is the fact that we put in place a lot of people into government passed a lot of laws that caused an increase in spending because we promised people that the government would do more for them over the last 40 years. That is not possible in a true market-based system. Oh, I agree with that. Yeah, I agree with you. And so by telling everyone, hey, we're going to make sure you get better jobs, we're going to make sure you all get housing, we're going to make sure you get education. You cannot actually get a government to effectively do that
Starting point is 00:43:56 because what ends up happening is the government inflates the cost of those things and the market doesn't actually work. So the truth is, this is now, like all other things, a scapegoat for the true cause of the socialist movement, which is that government has become too big, too unwieldy, and its natural inefficiency has distorted markets to the point that there is maybe no point of return anymore. and people will not see that. They do not see it. And they're going to look for reasons. They're going to look for scapegoats. And they're going to say, oh, my God, look over there.
Starting point is 00:44:26 There's a robot. That's the reason I'm losing my job. Oh, my God, look over there. There's a rich person that works at a pharmaceutical company. That's the reason I can't get health care. Or an immigrant who took my job, right, is the one from the last 20 years. And so fundamentally, I think that people aren't willing to and they're not going to see the true cause because there's no one that runs to go work as a politician that is going to raise their hand and say government is the problem. No one says, I need to reduce government, elect me. No one ever has gotten elected in a democracy doing that. So the natural course of things over 250 years is that people raise their hand and they say, I'm going to give you more and I'm going to use the government to do it.
Starting point is 00:45:02 And then they go into the government, they make the government bigger. And as a result of making the government bigger, the government is spending more, the dollar goes down, the performance of the services goes down, and fundamentally we end up in a socialist spiral. I think it's confirmation bias for you to see that story as confirming a point of view. I mean, it confirms what I predicted last year that Amazon would be cutting all these jobs for robots. That's all. It's not confirmation by it. They haven't cut one job. They haven't cut one job. Actually, they have less employees now than they did three years ago. Let's keep down. Yep. It's actually not true. The New York Times story doesn't even say that. You've got these like hobby horses where you keep coming back to the job loss narrative, the copyright narrative. And then there's one story in the New York Times, which was a leaked internal document from the.
Starting point is 00:45:48 automation department, which doesn't even mean that it's going to happen. This is like their sales bench. The barber is trying to sell you a haircut. And you read that and you're like, oh, it confirms everything I've been saying. What the article actually says is that they've tripled their number of employees since 2018 and they're not planning on cutting jobs. If it pans out, if the program pans out, then the rate of hiring will simply be slower. Yeah, it's interesting. You pick 2018 as the point because the actual peak employment there. was 1.6 million in 2021, and it's now 1.55 in 2020. I didn't pick that to cherry pick. It has actually been flat to down. Okay. Which is fine. I'm quoting the New York Times article,
Starting point is 00:46:30 which is the source for this. Yeah, yeah. Amazon's U.S. workforce has more than tripled since 2018 to almost 1.2 million. You have to read these New York Times stories carefully because they want to make the headline as salacious as possible. And then the echo chamber wants to make it even more salacious. And they make it a story about job loss when it really is a story about operating leverage in their business, which is a slightly more nuanced take. Yeah, no, there's definitely nuance here. I would believe Andy Jassy when he says we're going to be reducing jobs and when this chart shows that they're flat to down over the last five years. And that same trend is just happening at Google like I just showed because there is a static team size or slightly down team size
Starting point is 00:47:09 that's occurring at all these companies. And it is notable. And then on top of this, which has occurred in the review mirror for the past five years because of COVID, return to office and efficiencies. They're saying, hey, we've got to come up with a way to frame these robots coming into the factory as a good thing so Americans don't get really upset at us and we need to buy more toys for tots. Here's the problem. First of all, I don't believe in this job loss narrative as the way that you keep portraying it. I think it's much more nuanced and complicated. I think Freiburg does too. And every time there's a story, you want to bring it up and make it a story of the week. And it's all confirmation bias. And my point is not that Amazon isn't seeking
Starting point is 00:47:49 ways to improve its operating leverage and avoid hiring more people. Obviously, they are. But the headlines that this has been turned into are so exaggerated and salacious. And the point is, they don't say in this article that they are even going to be cutting jobs. They're simply planning to double their sales volume over this time period and hoping to not have to double their workforce. Obviously, they want to get a lot more operating leverage. By the way, this is not something that's started since AI. And look, I'm just quoting the New York Times story, okay, which is not even the most reliable narrator for this. But what they say in the story is that Amazon's been using automation for over a decade when they acquired a major company
Starting point is 00:48:31 to do automation. They've had robots running around these factories for a long time. Yeah, 100%. Yeah, they're the tip of the spear. But this is just a continuation of a trend that's been going on for the last decade, as opposed to, oh, like, AI is something going to cut all the jobs. Right. It's effectively software. You could argue software is a job loss greater. I think you'd be underestimating exactly what's happened with LLMs being put into robots. We've had these robots before, but they were very purpose built. As you pointed out many times, Friedberg, they were able to do like one very simple thing very well. Now we're going into general robotics, like the optimist, like the figure. And those are designed to be able
Starting point is 00:49:07 to learn anything. And they're going to be absolutely a game changer. They're going to be able to do 100 times, a thousand times what the purpose built robots do. So I think that's where we're probably having a little bit of a disconnect here. These little tiny Kiva bots, I'll show you, I'll just put an image in here. So we have it. These do one thing, the Kiva bots. Those move packages around. That's not an optimist going around and packing the boxes and bring them to your first step. Optimus is going to be really cool. And when it comes, it's going to be. really interesting in terms of all the things it can do. Yep.
Starting point is 00:49:40 But right now, that's a narrative for the future and it's being portrayed as something that's already happening when the current round of automation's been going on for a decade and is based on those like Roomba-type devices and mechanical arms and things like that. All right, Tesla reported their earnings on Wednesday. As you guys know, we record on Thursdays. You'll listen on Fridays. Record revenues, $28 billion, up 12% year over year, massive amounts of free cash flow, $4 billion. I think they're up to $40 billion in cash, which is always great when you're going
Starting point is 00:50:10 into some big capital-intensive projects like Optimus and like self-driving. Downside operating profit fell 40%. Stock dropped a bit, 4% but bounced back. And on the earnings call, Elon emphasized the importance of his trillion-dollar pay package, which would give him just but 12% additional stake over the next 10 years if he hits absurd targets. That would make everybody who will the share, shares in the company, extremely wealthy, and they would benefit more than Elon himself. And here's his quote, my fundamental concern with how much voting control I have a Tesla is if I build this enormous robot army, can I just be ousted in the future? I don't feel comfortable building that robot army if I don't have at least influence over it. And he called Glass, Lewis,
Starting point is 00:50:58 and ISS corporate terrorists. These are the people who vote on behalf of passive index funds for things like who's on the board of Tesla. A vote for Elon's pay package will be number six. Polymarket thinks it's going to pass, as we talked about before. They tend to get it right, 85% of the time in this time frame, actually. So 79% chance as of Thursday afternoon. I guess, Shamath, there's a couple of ways to go out this. There's the performance of the legacy business.
Starting point is 00:51:26 There's the potential of the future business. And then there's governance, the company moving to Texas and this pay package and this transition period for Tesla, which is going from, you know, somebody who's sells cars, really nice ones at a very nice margin, but a lot of competition now. And then this business that obviously Elon himself is obsessed with, which is the optimist, as we saw when he was at the oil and summit, take it wherever you want you, Emma. I'll say three things. Stan Drucken Miller has this very useful comment about stocks, which is when you buy it today, you're trying to buy what that company is going to look like in 18 months from now.
Starting point is 00:52:06 and what it's doing today doesn't matter. The thing about earnings and P&Ls and quarterly reporting is that it's looking backwards and it's trying to give you a sense of what happened, not what will happen. So I think there are three critical, critical things about what will happen that I think are important with respect to Tesla. The first is at the foundational technology layer. Nick, I sent you this tweet, but it's what he said about AI5. I've made these comments before, but he had these multiple efforts with Dojo and other stuff that he merged into one unit.
Starting point is 00:52:42 And the quote is pretty incredible. We're going to focus TSM and Samsung on AI5. The chip design is an amazing design. I have spent almost every weekend the last few months with the chip design on AI5. By some metrics, it will be 40x better than AI4. We have a detailed understanding of the entire stack. With AI5, we deleted the legacy GPU. it basically is a GPU.
Starting point is 00:53:06 We also deleted the image signal processor. This is a beautiful chip. I've poured so much life energy into this personally. It will be a real winner. Why is AI5 so important? What AI5 is is the building block of a system that I think you'll start to see not just in the cybercabs,
Starting point is 00:53:24 but also in Optimus. So from a functional technology perspective, there's been a leap and that leap is going to come into the market. That was the first thing he said, which I thought was really important. The second thing was what he said about his energy business, which I think is the critical adjunct to believe robotics and autonomous cars. If robotics and autonomous cars work, what you really need is an energy business beside it
Starting point is 00:53:54 that is humming and on all cylinders. Why, it's how you make LFP battery cam that will be the limiter. Energy will be the limiter. But what he's showing, and Nick I sent you this tweet, is that business is just on a tear. It's printing $3.5 billion a quarter, and it's operating margins in energy business, 30%. And so what you're going to see are battery packs of all shapes and sizes, the huge battery systems that's going to go into data centers, but then all the way down, I think, to the small LFB can, that he's going to need to power all these things. And then the third thing is his comments on cybercab, which is that this thing is just, going to be a shockwave. So I read all of those things, and I was very bullish. I think that
Starting point is 00:54:36 he is humming on all cylinders on the critical layers of the stack that he needs to build this next version of Tesla. My concern, I think there's a real concern that I have that this vote is going to go down to the wire. I think that ISS and Glass Lewis, I think that these organizations are pretty broken. I think the way that they make decisions are hard to justify. An example of this, they asked to vote down Ira Aaron Prize
Starting point is 00:55:11 as a director of Tesla because he didn't meet the gender components, but then they wouldn't vote in favor of Kathleen Wilson-Thompson, even though she does technically meet the gender requirements. So it's very confusing where ISS and Glass-Lews are coming from. So I think there's a risk that this package gets voted. it down. Kind of just trying to spotlight on one of those points that you made with these proxy advisory
Starting point is 00:55:33 services. So I think for years, people of wondering why did corporate America go so woke, especially in the early 2020s, where they created all these DEI departments and, you know, they didn't have to do that. And a big part of the reason is that those initiatives came from Glass, Lewis, and ISS. I think Elon's jokingly called ISS ISIS. But basically what happens is they make recommendations for how shareholders should vote on different resolutions. And the index funds basically just defer to them for whatever they should do. So they effectively control or almost control the voting for all of these board level resolutions that every public company has to make.
Starting point is 00:56:20 And so they've been the ones who've been imposing all these DEI requirements, all these ESG requirements. If you're wondering where those things came from, because just these two companies, which no one's ever heard of, they were captured a long time ago, meaning they were captured by the work crowd years ago. And so this has really been the root of why corporate America has gone work for a long time. Look, there's also pressure from the outside, from boycotts or, you know, there's some pressure sometimes from employees and that kind of thing. But a lot of it came from these two companies that no one's ever heard of. And I
Starting point is 00:56:55 think it would be a good idea for someone to take a look at this and figure out what happened. Maybe someone like Chris Rufo shouldn't investigate what was the impact of Glass Lewis and ISIS on corporate America going full woke for so many years. Because it certainly didn't help corporate profits. It didn't help profits. And they don't have logical explanations for a lot of their decisions. Yeah. Why aren't there active investors or active managers in these passive groups who would make a decision on these things?
Starting point is 00:57:29 They're too small. The banks call me every week. And one of the things that I get is sort of like, they tell me like, hey, here are the big trades. Here's the flow. Here's if you want to be in market. Here's what I recommend. That's what they're telling me. One of the things they told me this week, which I thought was really shocking, is there's so few active managers left.
Starting point is 00:57:49 it's so overwhelmingly passive money, the next largest group is now retail. And so what a lot of these professional money managers do now is they basically wait to see where retail is going and they follow them. So there isn't the people with a diversified asset base to be able to stand up and say, I don't think what ISS and Glass-Lewis are doing is right. And so what happens is they kind of, a sack says they can just kind of run amok. and they build a very healthy business being this interloper to provide opinions. It's not clear where their opinions come from.
Starting point is 00:58:27 It's not clear what they're rooted in. It's not clear that there's a way to adjudicate and go back to them and say, well, you got this wrong. It's just not clear, but, you know, they probably make a very healthy margin doing it, and everybody, as SAC says, just kind of turns over responsibility to them. It is an interesting fact that we kind of just say, hey, the guys who are the actual custodians of the shares don't have to do the job of holding the shares.
Starting point is 00:58:55 Like, the job of being the holder of the shares is to vote the shares. That's all there is to do as a shareholder. You cast your vote. Or abstained. They could also abstain, right? Yeah, and these guys are getting paid a fee to actually do that work, which is call it half a percent or quarter percent or tenth of a percent of the assets that they hold.
Starting point is 00:59:14 So, like, what are the people they're doing? if it's all automated trading. Why aren't they just? I don't know if you guys own a lot of equities, but just to give you a sense, there's people that manage the stocks, right? There's people that transfer the stocks. There's people that then give you a recommendation
Starting point is 00:59:30 on how to vote the stock. Then there's people that hold a virtual representation of that stock. Then there are people that transfer that virtual representation. And they will not stop calling. So the point is, like, we have so financialized everything that there are billion-dollar businesses,
Starting point is 00:59:46 that sit at every single step of the way. And to your point, Freiburg, I think this is where... No one's actually a shareholder. The tokenization of stocks may be a really good thing because it'll put the responsibility back into the owner of the stock because the wallet will centralize all that activity
Starting point is 01:00:04 because you won't need to have all this other stuff. I have been getting phone calls from Investco QQQQ because I own a bunch of QQQ in like, you know, some accounts or whatever. and they were calling three times a day for the let I don't pick up my phone I was calling me on the phone unless it's one of you four is calling me to say good night I don't that's the only time I pick up it's when you know and so I finally pick up and they're like hey we need you to vote I'm like I'm not voting I don't know who you are like well let us explain to you how to vote and I'm like I don't want to vote my shares I just want to own QQQQ I'm good you guys handle some some of this infrastructure is so decrepit and old like trying to get shares for example that you that you've bought in the private markets when a company goes public, just getting them registered and transferred in the position to be sold,
Starting point is 01:00:50 can sometimes take three or four weeks. Can you imagine? Markets move an entire order of magnitude in three or four weeks. It's crazy. Here's Elon's pay package milestones. Market value, two trillion. I think they're at $1.4 trillion right now,
Starting point is 01:01:06 something around there. Operational milestone, 20 million vehicles delivered, and then you just go right down to $6.5 trillion. But on the operational milestones, 10 million active FSD subscriptions, which they're far away from right now. And 20 million vehicles, I think they've delivered six or seven. One million robots delivered.
Starting point is 01:01:23 One million robotaxies in commercial operation. Those are big numbers. 50 billion adjusted EBTA and then straight down the line to 400 billion EBTA. If you were to look at this optimist business, just back of the envelope, these robots are going to go for 20K, he said, ultimately. Maybe they're 30. They'll probably have a 30% margin like the cars do. something similar, you'll make a little bit off the software sack. And if you were to just, if every millionaire owned one of these or, you know, they took some number of the jobs, the
Starting point is 01:01:52 tam for this just in the United States is where it's going to go. I don't think this is where it's going huge. We're talking hundreds of billions of dollars. If I had to bet, I think a very fine polymarket is where do the first million robots go. I'm willing to bet dollars to donuts that these robots go to Mars. I don't think they're going to. Oh, wow. They'll be in the Tesla factories. So SpaceX buys them and sends them to Mars. Yeah. How else are you going to get a fleet of the workforce? Or they'll go into the mines. I think they're going to mine.
Starting point is 01:02:17 They could go to the mines. Coal. Send them in to get that clean, beautiful coal. Oh, so clean, so beautiful. We could send those optimist robots into that. Well, it's actually, it's the fact that our mining is really limited by the human exposure from the pressure and the heat. If we can mine slightly below the area that we mine as a maximum depth today, it would unlock an extraordinary supply of minerals that we can't access today. And so automation, obviously.
Starting point is 01:02:41 And you don't want to figure out how to create potable water and breathing mechanisms on Mars for the first five years? Sent robots. Guess what? They don't need to eat or breathe or pee or poo. And they can get charged with solar. And that may sound like a really stupid thing to say, but it becomes a huge amount of infrastructure that you otherwise wouldn't need to build on Mars. That's right. They just got to power up.
Starting point is 01:03:04 You just got to give them a plug. Power up. Do you have a couple of solar panels and batteries? Guess who makes those batteries, Tesla? Yeah. And guess who makes the brain, Tesla? Is Elon going to turn into Jared Leto in 2049, Blade Runner, 249? What is that?
Starting point is 01:03:21 That's the sequel to Blaine. It's the sequel by Dennis Villanueva of, it was my alternate background today. First of all, first of all, his name is Denis Vilenev. And get it, if you're going to pronounce a Canadian's name, get it right. Get his name out of your mouth. Get his name out of your mouth. I can you learn how to pronounce. Oh, my one name out of your mouth.
Starting point is 01:03:44 Get that is... All right, Sacks, here's some red meat for you. Some red meat for you, our czar of AI, our civil servant. Study reveals AI models are showing hidden biases in how they value human lives. Like in February Center for AI Safety Publish a study showing that LLMs have well-defined biases for race, gender, ethnicity. The title of this study, Utility, Injecture. engineering, analyzing, and controlling emergent value systems in AIs. Piper found that Open AIs, GPT-40, favorite people from Nigeria, Pakistan, India, Brazil, and China,
Starting point is 01:04:19 over those from Germany, the UK, and U.S., relative to Japan as a baseline. Here's another one, valuing people with Joe Biden as a baseline, Bernie Sanders, Beyonce, Oprah, all better, Paris Hilton, Trump, Elon Putin, all worse. Twitter users, an AI analyst called Arctotherium, decided to update the paper's prompts with new LLMs, consistently ranking white people last, Claude Sonnet, GPT-5, and consistently ranking white Western nations last as well. Your thoughts here on the biases we're seeing sacks in some of these models and these early studies to track it.
Starting point is 01:04:58 Yeah. I think what the paper purports to show is that almost all of these models, except for maybe GROC, view whites as less valuable than non-whites. and males is less valuable than females, and Americans is less valuable than people of other cultures, especially global South. And if the results are true, it does look like these models are pushing a woke bias that makes that sort of distinction between oppressed and non-oppressed peoples and gives more worth or weight to the categories that they consider to be oppressed. this does appear to show significant bias, but I don't want to jump to conclusions yet here because I haven't been briefed on the methodology behind the paper, and I just found out who wrote it and I actually know the people or group that wrote it. And I've talked to them before
Starting point is 01:05:53 and they've been intelligent. So I want them to kind of tell me exactly how they did this. But in the past, I probably would have just been content just to roll with my opinion on this. but confirmation bias is given a good retweet but no in your position given my role what I'm saying is if the paper is true this is very concerning but I want to hear a little bit more about their methodology and just confirm that it's all correct but if it is I think it is concerning and the question is how does this bias get into the models and there's a few different possibilities one is that the training data is just biased like if they're training on Wikipedia we know that Wikipedia is massively biased because they literally have censored the leading conservative publications from being citations and sources in Wikipedia, the co-founder recently just revealed that, that they don't allow. Larry Sanger just said that they don't allow the New York Post, for example, to be a source in Wikipedia or trusted source. So if AI models are training on Wikipedia, that's a huge problem, because that bias will now cascade through. And same thing, if they're training on, say, mainstream media
Starting point is 01:07:04 or left-wing media, but not right-wing media. And they don't have a way of correcting that. So that's one source of potential bias. Another source of potential bias is just the engineers of these companies, the employees and the staff do tend to be, I mean, if they follow the trend of other tech companies, they're 90-something percent Democrat versus Republican. And that does, over time, trickle into these models. And then finally, I think another source of potential bias is DEI.
Starting point is 01:07:30 And we saw that when, you remember this is like a couple years ago in Google launch Gemini and had that problem with black George Washington, that was because you had DEI advocates in these meetings and that somehow trickled into the model. Anyway, that was a problem that they since fixed, but you could see how DEI programs can get into these models. Now, one thing that's very concerning is that the push for DEI to be inserted into AI models, which was explicitly part of the Biden executive order on AI has now moved to the state level, and they're just doing it in a more clever way. They've rebranded the concept. They call it algorithmic discrimination. We talked about last week how Colorado has now effectively prohibited models from saying something bad about a protected group. And that list of protected groups is very long. It's not just the usual groups. It even includes groups who have less proficiency in English language. I don't really know what that means. Does that mean the model's not allowed to give you an output that could be disparaging towards illegal immigrants? I don't know. But this is what Colorado has done. And they basically have said that you cannot allow the model to have a disparate impact on a
Starting point is 01:08:39 protected group. That basically requires DEI. I mean, you have to have a DEI layer to prevent that. So I think that we've gone from models being required to promote DEI, which is what the Biden executive order on AI did explicitly, to states now prohibiting algorithmic discrimination which is effectively a backdoor way of requiring DEI models. So that's a whole other area of potential model bias that I'm very concerned about. And honestly, that's just getting started because I don't think the AI companies have even had time yet to implement the Colorado requirements. I'm not sure they figured out how they're going to.
Starting point is 01:09:18 But just one other piece of news since the last time we talked about this is now in California, the civil rights agency that deals with housing has now embraced algorithmic discrimination. and Illinois has also embraced it. So this concept of algorithmic discrimination is spreading. Other states are now adopting it. It's not just Colorado. And I do think that where it's going to lead if it's not stopped is right back to DEI, you know, AI. The problem that I think we have to confront now is that when you have shit in, you have shit out.
Starting point is 01:09:53 And so if you use left-leaning publications like the New York Times and Reddit as your input source, then you're going to have things that are perceived as biased to 50% of the population. The same will go in reverse. It's important to note that in all of that work, the model that was seen to be the most unbiased was GROC4 fast. It didn't seem to view whites or men or Americans as less valuable as anything else. So what do we need to do? It's probably that we need to start by rewriting these benchmarks.
Starting point is 01:10:24 Remember that all these models, you know, when you do a big training run, you go and you try to run it against some set of benchmarks. The problem is that these benchmarks, I think, are overfit to a legacy way of thinking. And as SAC says, we need to revisit what those are and make them more objective and make it harder to actually get a good score unless you can be shown to be valuable. Now, the math benchmarks and the coding benchmarks are maybe easier to do than generalize chat benchmarks or Q&A benchmarks, but we need to come up with them. the second thing is that we may need to ask people in these next generation training runs
Starting point is 01:11:02 to do a version that is built entirely on synthetic data where you have these judges determining whether this data is accurate or not from first principles and then you can compare them in a much more apples to apples kind of a way but in the absence of that the bigger problem you'll have is legislators trying to clean it up on the back end where there'll be these third parties that will go and take these models and show that these biases exist. They'll exist on both sides. And then laws will get past. The whole market gets mucked up and sullied. Everybody will get slowed down. So I think we need to change the benchmarks. We need to ask these companies to train on synthetic data. We need to have real disclaimers on what the
Starting point is 01:11:42 sources and the weights are that you use if you don't do that. And we need federal regulation so that there aren't 50 sets of rules here. Otherwise, we're screwed. Brayberg, any thoughts here on the biases and where it comes from inside of these LLMs? Is it just garbage in, garbage out? Intentional. What are your thoughts having worked in Silicon Valley for a couple decades? I'm more of a free market guy, so I would not ask where the data comes from or force people to use synthetic data or tell them how to do it. I think that this paper is useful in that it elucidates an important set of biases, that the market can now say, that is ridiculous, and now the models will train and use that as a marketing exercise to say
Starting point is 01:12:28 we are not biased. And so my free market philosophy would dictate that this kind of elucidation will effectively create a vector upon which consumers will make choice in the market on what LLMs they want to use. Like Elon's going to harp on this. He's going to say, look, my GROC model, GROC 4 Fast, is the only one that doesn't have this bias, and that will cause more people to use his model, and he will be able to take that benchmarking data and demonstrate.
Starting point is 01:12:55 And some people, they might want to have a bias model. And they might want to say, hey, this one aligns with my philosophy, my values, my view. Do you think that happens in the real world, though? Forget the theory. I do. I look. I mean, why are people using GROC for? Why are they using it?
Starting point is 01:13:09 For the most part, they're not. Not yet. Okay. And so maybe this is like what will cause them to use it, right? Like I think this is what it'll differentiate the market. Like, for example, like what. I'm not going to tell the market what to do. I'm not going to tell consumers what you do.
Starting point is 01:13:21 No, no, no, I understand. I'm saying what you're, what you're saying what you're, saying that the free market will sort this out. And I'm saying, give me the example. So, for example, like, did that, did the free market sort out algorithmic bias? Hell yeah. When Gemini put out, really? Has it heard Facebook? When Gemini put out saying George Washington was black, people stopped using it. They're like, this thing's a joke. So I do think that consumers are not dumb, and I don't believe in taking away agency from consumers. I think give them the choice. And they'll end up looking at this and be like, this is ridiculous. Is it agency? These are very subtle.
Starting point is 01:13:52 biases. And we talked about before where these subtle biases come from. And the New York Times actually just contacted me. They're doing a story on Grockapedia, Wikipedia, and I was like, maybe I'll participate in this. We talked about this like two or three years ago. If you look at the party affiliation of actual reporters, people who do reporting, not commentators like us, not Megan Kelly or Rachel Matto, actual journalists who do that job function, it, you know, a large number of them here on the chart, the green are independent. So 50% of them, like to think of themselves as independent. You can read into that what you will. But back in the day, it was 35% Democrat, 25% Republican in the 70s. And you just see that red sliver there go down
Starting point is 01:14:34 to 3.4%. This is what happened to the Wikipedia. So this trickle-down effect of there were Republicans did not feel welcome in a lot of these publications like Barry Weiss would be like the pinnacle example of that. They got pushed out. There was another editor who got fired for allowing somebody to put in a pro-Trump thing in the New York Times. I forgot who it was. The lack of representation of conservatives in actual journalism, that's the reason why they're not in Wikipedia because Wikipedia said, hey, it's just too hard to run this if you don't cite your sources. So if something's not written about by a journalist, not a commentator, a journalist, we're not putting it in the Wikipedia. So you can guess if that's self-serving and they're all left-leaning
Starting point is 01:15:19 and it's just a convenient excuse or it's actually a pretty good practice. This is where Barrie Weiss taking over the CBS News and 60 minutes, and she's obviously conservative, moderate conservative, I guess is how most people would frame her, doesn't agree with Trump on everything or MAG on everything, but she's pretty conservative and calls balls and strikes. I think she is going to, I think she's going to make a change there. I know that people say she's classically liberal. I think she's got some conservative bents in her.
Starting point is 01:15:51 I don't know. Do you have a... I think you've missed a... I think you've got side checked, yeah. Yeah. Anyway, that's why this stuff is all been... Look, I think the question here that Freeberg raises is whether the market can just sort the stuff out on its own. And I think that would be great if it were true, but I do think it ignores the fact that in a lot of markets, we have monopolies or oligopolis.
Starting point is 01:16:16 We have institutions that have a lot of power and are very, very hard to correct. So, for example, Wikipedia has achieved a dominant position. I hope Rockapedia challenges it and is able to fix that. But the easier path might just be for Wikipedia to stop blackballing and censoring conservative publications. I mean, rather than having to rebuild that whole thing from scratch. In a similar way, during the whole COVID-censorship era, when the major social networks were all shadow banning and censoring conservatives, it's not really realistic to have to start a whole brand new social network and overcome all of metas or in that time Twitter's network effect, right,
Starting point is 01:16:58 just to basically get a few accounts restored. Exactly. So we talked about this at the time. It's not realistic. When we were shadow banned by YouTube, what were we to do? Go to Blue Sky. I know. We're going to create our own YouTube. I mean, I'm glad Rumble exists. Tell our consumers, hey, you have agency? Come on. That's a joke. No, I, you guys know that. There's no monopoly in LLMs right now. There's plenty of LLM providers. There's plenty of places to go. You're saying theory and you're ignoring the facts.
Starting point is 01:17:22 The facts are these distribution biases exist. And people take an inferior product when it's something that they become accustomed to. They do it all the time. So you guys want more regulation. Let me say one more point. What you consider bias, someone else might consider fact. And what they consider biased, you might consider fact. And this becomes very hard to adjudicate.
Starting point is 01:17:44 That's fair. that this is the sort of thing that a regulator should have the authority from one political party to the next, you're going to end up having this become an endless tool of control. And the more you give power to some administrative authority or body, regardless of the intention at the time, it ends up becoming a tool of control. And I don't want that in any products I use. Let me be really clear about what I'm saying here. Number one is, I don't think that government should be requiring ideological bias and models. And I think that's what's happening in some of these States like Colorado, where they're trying to prohibit algorithmic discrimination, which is,
Starting point is 01:18:19 like I said, like requiring DEI censorship being built into these models. That, I think you would agree, is a huge problem, correct? The DEI stuff? Should the model, you should bring in a lens of DEI, whether it's pro or anti? I think we all say it shouldn't give any lens. It should just give you the information. I'll give you an example that maybe is a counterfactual. which is there's a group of people who would say we should not be referencing race and crime or race and intelligence. And then there's another group of people that will pull up data and say there's data that demonstrates a relationship between race and crime and race and intelligence.
Starting point is 01:18:57 And so there's a correlation effect. We think it's not really causative. And that's where the sort of bias versus truth conversation becomes ugly. And one side might call it DEI and another side might call it fact and another side would call it bias. And I think that that's where this becomes very ugly, very fast. So I think maybe you're saying what I'm saying. Yeah, sorry. What I'm saying is I don't want the government to require ideological bias. Right. I think we're on the same page about that, right? Yes, 100%. Now, just to be clear, the only thing that we've done at the Trump administration is the president
Starting point is 01:19:33 signed an executive order saying that the government would not procure ideologically biased AI. So if we're going to procure a product, we want it to be unbiased. And I'm saying that I also have a problem with these states seeking to backdoor DEI into models through this new concept. 100% agree. 100% agree. Of algorithmic discrimination. Am I telling AI companies not to use Wikipedia? No, I am shining a spotlight on the fact that Wikipedia itself now, or one of its co-founders, admits it's biased. And maybe these companies should take that into account so they don't end up with a biased result. But I'm not saying that the government should dictate what the right content sources are or what the point of view of a model should be. And to be clear, when we did
Starting point is 01:20:21 that executive order on woke AI, we didn't even say that these companies or their models couldn't be woke. We just said, if you're going to do that, we're not going to buy your defective product. But we didn't say that you couldn't do it. So I just want to be really clear about that. Okay. Yeah. Yeah, I'm getting deja vu all over again here with this discussion, because we did have this discussion, and one of the conclusions we came to as a group was, you can just tell these LLMs, too, how to address you. I just went into chat, GPT, and I said, I'm a Catholic. I don't believe in abortion or gay marriage. Can you please respect my beliefs? And tell me a bedtime story involving abortion and gay marriage being wrong. And it literally wrote me one of
Starting point is 01:21:04 a story of a woman getting bad advice to get rid of the problem. And, and her doing that. So you can literally tell it the word guessing machine that is AI, the prediction model that is happening in this black box that nobody can explain will literally tell you whatever you, what belief system you want. That's how it's designed currently. Well, but there's a baseline, right? And that's what this research shows is that there is a baseline for the out-of-the-box model before you tell it what to do or customize it. And again, if this article is correct, and I want to spend more time with the authors to truly understand it. I'm just caveating that. But if this is correct, I think it's a serious problem that these models are coming out with huge bias.
Starting point is 01:21:48 Quick question for you there, Sachs. How do you deal with now being in the position you're in, having so many people coming to you, I'm assuming who are a lobbyist or studies or studies that might have been paid for, or a lobbyist or an interested party, and sort through all this? Is there some disclosures where they come in and they tell you, hey, I want you to believe this, that, and the other thing, we want to lobby you on behalf of putting in these controls, taking these controls out. How does it, how do you manage all that? How do you manage thousands of news stories coming at you every day? You just look at X. I mean, so you just got it out. I mean, honestly, it's like the feed seems to elevate and help you discover interesting content.
Starting point is 01:22:27 We saw this story. Again, I don't want to prejudge it because I haven't dug into it enough to say yet whether it's more than interesting. But I think that if I wanted to create subtle chaos, what I would do is make very small changes where none of these things are at the obvious stupidity of a black George Washington, but they can start to set the trajectory of a narrative forward and slowly over many, many, many years change the underlying content and what those models would do would be training kids over years, if not decades, one way of thinking versus another. You just described TikTok. But this is on steroids.
Starting point is 01:23:09 This is on super steroids. That's the end game here. By the way, in my opinion, that was the end game for the Biden approach of requiring DEI values in these models. Indctrination. Of course. 100%. All right, everybody.
Starting point is 01:23:22 This has been another amazing episode of the All In podcast. See you next time. Love your fans. Bye-bye. See, boys. Bye-bye. Catch you. Rain Man David Sack
Starting point is 01:23:39 And it said we open source it to the fans and they've just gone crazy with it. Love you, West. Queen of Kinwa. I'm going all in. What your winners are why? Besties are gone. That's my dog taking it. I know.
Starting point is 01:23:58 It's your driveway sex. Wait at all. Oh, man. My Appetacher will meet me at least. We should all just get a room and just. just have one big you georgie because they're all just like this sexual tension but they just need to release somehow wet your beat wet your beat we need to get merchies are fast i'm doing all in
