The Ben and Emil Show - BAES 120: AI is the Biggest Bubble of All Time ft. ED ZITRON

Episode Date: October 2, 2025

Ed Zitron is back with us to talk about all the latest insanity coming out of Silicon Valley, OpenAI, and the entire AI tech apparatus. The numbers don't add up. Give this video a thumbs up if you enjoyed it! And please leave us a comment! It helps us!

***Ben's new movies and TV podcast with Dillon is OUT NOW! GO WATCH the latest episode on THE HIGH SCHOOL CATFISH here: https://www.youtube.com/watch?v=MC1wfcB6c2E
**CHECK OUT EMIL'S LIVESTREAMS HERE: https://www.youtube.com/emilderosa
Support us and get bonus content, ad-free versions and more, plus your first 7 days free at https://benandemilshow.com

SOME OTHER VIDEOS YOU MAY ENJOY:
That's Cringe of Cody Ko: https://youtu.be/dTbEk0pVh2w
Our AUSTIN VIDEO: https://youtu.be/yGSs56bFzRU
Our episode with Kyla Scanlon: https://youtu.be/cIHWkY35cuc
Big Tech is out of ideas (ft. ED ZITRON): https://youtu.be/zBvVGHZBpMw
Arguing with a millionaire (ft. Chris Camillo): https://youtu.be/1ZUWTkWV_MM
We bought suits HERE: https://youtu.be/_cM1XqA9n2U

***LINK TO OUR DISCORD: https://discord.gg/CjujBt8g
***Subscribe to Emil's Substack: https://substack.com/@emilderosa
***Trade with Ben at https://tradertreehouse.com

MIZZEN & MAIN: Mizzen & Main is offering our viewers 20% off your first purchase at https://mizzenandmain.com using promo code BAES20.
ZOCDOC: Stop putting off those doctor's appointments and go to https://zocdoc.com/baes to find and instantly book a top-rated doctor today!
GOODR: Go to https://goodr.com/BAES and use code BAES for free shipping!

Follow us on Instagram! @benandemilshow @bencahn @emilderosa

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Listen closely. That's not just paint rolling on a wall. It's artistry. A master painter, carefully applying Benjamin Moore Regal Select eggshell with deftly executed strokes. The roller, lightly cradled in his hands, applying just the right amount of paint.
Starting point is 00:00:23 It's like hearing poetry in motion. Benjamin Moore, see the love. The big question is whether or not all of this AI spending is a bubble. I feel very safe in saying yes. Yes. You have committed to failure. There is no doing this. If they gave every dollar just to Sam Altman,
Starting point is 00:00:43 the guy who can't even tell you why you're giving the money to him. When you're like, what are you going to do? He's like, well, yeah, one day it's going to be really intelligent. It could get agentic at some point. No one else has a big business doing this. Even Claude Code with Anthropic, which is theoretically meant to be super popular,
Starting point is 00:01:01 $33 million a month. Ugh. That's less than the Cincinnati Reds make. What's you do with the Reds? What if it's stupid? What if they make the AGI and it can't do stuff? And they're like, yeah, we need you to make this a PowerPoint. It's like, no.
Starting point is 00:01:15 I don't want to. And I'm just like wasting some small town in like West Virginia's water. And I'm just like, what am I doing with this? Hey, you got this. You got this. You got this. I hate you got this. I hate when the robot says you got this.
Starting point is 00:01:30 I don't ever hear that from it because I'm abusing it. Tell me what's going on. Tell me what's going on. Your hair looks okay. Were you wearing a hat? No, it just looks messy. He doesn't have a f***ing hairbrush, does he? Now all the f***ing perverts.
Starting point is 00:01:56 Wait, you can't cuss in the first 10 seconds. I can and I will. Oh man. Oh, shucks. Oh, shucks. We should have little sticks. I wish, I wish we
Starting point is 00:02:13 I wish I could do the ChatGPT voice to intro us, because of what this episode is going to be about. I'm sorry, thank you for holding me accountable. It's this and it's also that. Hey, you got this. You got this, you got this. I hate you got this.
Starting point is 00:02:26 I hate when the robot says you got this. I don't ever hear that from it because I'm abusing it. I never... I don't even have it. Oh, you do, don't lie. I have Claude, but I've only used it a couple times. Oh, I pay for both. He uses it for therapy. That's not entirely correct. I used it like five times
Starting point is 00:02:44 over the course... it was one time stretched over the course of five times. One long thing where I was like, and I had another thought, run that by me. And I was just... five times over the course of one time. It was like watching a movie that you break up into parts. Oh, you have one session where you fuck with it five times.
Starting point is 00:03:03 Yeah, yeah, yeah. Okay, that's not too bad. And it was just, uh, it was, I will say, it was interesting. And it was very insightful, and it wasn't like, yeah, you got this type of shit. It was very, you know... don't get me wrong, it's not replacing anything for me. Uh, I graduated therapy. I got the medal and everything.
Starting point is 00:03:22 Nice. But I'm, they fixed you. They fixed me. They fixed me. Like he can be fixed. Well, one part. The other parts are pretty fucked up still. But it was recommended to me by a friend who found herself using it when she was going
Starting point is 00:03:36 through some troubles with her immigration stuff. And she was like, it was actually, I'm kind of ashamed to say it. It was very helpful. Yeah. And I was like, okay. And she said, it's not replacing a human therapist for me. It's just in those moments late at night, if I don't want to bother anybody or I don't, you know, I'm just spiraling.
Starting point is 00:03:55 it was very easy and that's like one of the small I mean we'll get to some of the small things it's one of those things from like I'm glad it helped you
Starting point is 00:04:03 but okay I also still am very skeptical that that's a good way to get the help that you need but that also doesn't even sound like the help you need it was like you needed someone to talk to it
Starting point is 00:04:14 Exactly, that is what it is. I just go online, I go on Bluesky, I'm like, hey, I'm having a massive panic attack, and I just read the replies. No, people are like, no, they're like,
Starting point is 00:04:24 you piece of shit. No, that doesn't happen. You don't ask Bluesky any questions, because you get five deeply unfunny people making the same joke, one after another. Like, I posted a story earlier where the guy was called John Levine, I think. No, it was like Adam Levine.
Starting point is 00:04:40 I was like, oh, like the Maroon 5 guy. It's like, I will drown you. I will hold your head into the icy lake and you'll never make a joke again. Or a shallow puddle. That'd be an awful way to drown. I feel like this is a good spot to introduce our guest. You may recognize him from the first time we had him.
Starting point is 00:04:59 When was that? It was probably six months ago. So much has changed. So much has changed. We have Ed Zitron back on. I'm back. Because we had so much fun the last time. And you know what?
Starting point is 00:05:09 Everybody loved it. In those six months, I would say, I feel like maybe you've been validated a little bit here. We were definitely talking about your theory of the bubble. It seems like a lot more people have since come around to accept this theory. It's, um... I mean, you've seen it from mainstream journalists, but also just the general vibe on social media is now just like, this is a fucking bubble. Like, this is all going away. Even in the last
Starting point is 00:05:40 week, if you like... yeah, I mean, just, it's... the vibes are fucked. Yeah, the vibes are fully fucked. But you had the Bain thing. Did you see this Bain thing? No. They were like, well, okay, everyone, we should really be concerned, because we need two trillion dollars of revenue to make this worth it, and we're going to be 800 billion short. What? There's $55 billion, which is including CoreWeave, Nebius and all the different people selling AI compute. It's about $55 billion. How are we getting the other, well, $1.95 trillion? It's just... no one fucking thinks. And they're like, well, we're going to have the AI-powered savings. It's like, when are those going to happen?
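The Bain arithmetic being described works out as follows. This is a quick sketch using only the figures cited in the episode (roughly $2 trillion of revenue needed, about $55 billion of AI revenue today, and Bain's projected $800 billion shortfall even after assumed savings):

```python
# Figures as cited in the conversation: Bain's ~$2T revenue requirement
# versus ~$55B of current AI revenue (CoreWeave, Nebius, etc. included).
revenue_needed = 2_000_000_000_000   # ~$2T needed to justify the buildout
current_ai_revenue = 55_000_000_000  # ~$55B of AI revenue today

raw_gap = revenue_needed - current_ai_revenue
print(f"raw gap ≈ ${raw_gap / 1e9:,.0f}B")  # ≈ $1,945B, the "other $1.95 trillion"
```

The $800 billion figure is Bain's own shortfall estimate after crediting assumed AI savings; the raw revenue gap is the larger number above.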
Starting point is 00:06:20 Dude, it's just... they just need a little bit more compute. Any moment now. Wait, is this Bain Capital? Yes. We just need one more order of magnitude, and we're there. It is funny, and Deutsche Bank was like, yeah, the only way this keeps going is if the spending keeps being parabolic, and that's very unlikely. It's like, there is every
Starting point is 00:06:36 possible sign that everyone should freak out, and the markets are just like, no, Diamond hands. Didn't you see today that Sam Altman, I think it was actually last night, Sam Altman tweeted that they've got a new feature on ChatGPT. Pulse. It's called Pulse, where it
Starting point is 00:06:52 It pays attention to what you've been saying and then makes personal recommendations to you. Say if you said, hey, ChatGPT, I'm thinking about going to the Maldives sometime. It'll remember that. And then later on, pepper things in about... oh, here it is. I can't believe he's kept his, uh... but already we have a thing. It is initially available for Pro subscribers, so 200 bucks a month. So this shit is expensive. Wow, 200.
Starting point is 00:07:20 For that? No, but if they're releasing a feature only to Pro subscribers, it means it's too expensive. Sure. And as usual, this shit just, quote, works for you overnight.
Starting point is 00:07:29 What the fuck are you talking about, Sammy? Clammy Sam Altman. Clammy Sam. Clammy Sammy. It means while you're sleeping. I'd like to go visit Bora Bora.
Starting point is 00:07:40 I love that this guy does not have experiences. He's just like, ah, what do people do? My kid is six months old and I'm interested in developmental milestones. If you've not paid attention to your child, ChatGPT will help you think about it.
Starting point is 00:07:54 This also, quote, puts a shift from being all reactive to being significantly proactive and extremely personalized. Note that he doesn't actually explain what that would mean. Yeah. So I saw the master of hype, Casey Newton, talking about this. And he was like, yeah, it could suggest planning Halloween costumes. And it's like, is that really... that is solved by Party City. If you have children, they will just tell you what they want, you're aware. It's not like, if you were a parent, it's like, what the fuck am I
Starting point is 00:08:25 going to do for my kid's Halloween thing? What are their developmental milestones? Yeah, I'm just ignoring all of that. It's just the kind of thing that you say if you have nothing else. And this is only available to Pro subscribers, so it's super expensive. And this is the best they've got. No kidding. Well, did you also see Meta's new thing? What is it called? Meta Vibes. Vibes. I put it in... I think it's worth honestly watching. Also, before we jump around too much, I think your point about him, like Sam Altman, really not knowing even how to sell this thing. If you scroll down a little
Starting point is 00:08:56 bit, there's actually, I just want to play this short clip of him talking about the newest model. Let's see. The thing about the newest model is that we're two years into this, hundreds of billions of dollars, and they still can't tell you what it does. They really can't just be
Starting point is 00:09:12 like, this is what it is. Yeah. GPT, baby. I mean, I have a friend who uses it all the time. And I don't think there's anything wrong with the fact that it is essentially just a glorified Google. Oh yeah. It's just... here he is, this Clamuel, look at him hyping the new models, and he's just like, I don't know, someday it could be agentic. Yeah, it's true. I've seen this fucking goblin. I just want the audience to hear a little bit. We could just
Starting point is 00:09:44 play like a minute of it yeah you know hundred billion is a small dent in it. And the numbers are also, like, they're missing a story of what this amount of infrastructure is capable of doing. Like, 10 gigawatts of compute, again, easy to throw around numbers like that. But the amount of work...
Starting point is 00:10:06 Look how natural he is. Look how he's standing there. Multi-square-mile gigantic things and the complexity at every level of supply chain. And then what that amount of brain power, which does not exist today, can do already today. What can it do?
Starting point is 00:10:21 Like, this is the real deal. This is the thing people have been waiting for. I talk about what is the real deal. Or when it's going to do this and when it's going to do that. Like, the stuff that will come out of this superbrain will be remarkable in a way I think we don't really know how to think about. You don't know. It's remarkable in a way that we don't even know how to think about yet.
Starting point is 00:10:40 It's just like, you guys don't even know what it is. There are people who say he's smart. And it's just fucking insane because he sounds dumb. He sounds like a person. who forgot to prepare for a presentation. Fully. Every time. Like, he's just, you know, we don't even know how to think about it yet.
Starting point is 00:10:56 who forgot to prepare for a presentation. Fully. Every time. Like, he's just, you know, we don't even know how to think about it yet. That's how great it's going to be. This is what happens with all these libertarian types. Like, oh, yeah, kids just get given everything these days. This is what happens. When everyone says, yeah, sure, you're the genius, every day, that's what comes out. Guy's just like, yeah, the ten gigawatts, what, a hundred billion is just a drop in a bucket. It's the most intelligence ever. But what if we have more intelligence, that would be good?
Starting point is 00:11:16 Even then, it's like, what I always come back to is your argument that the economics of it simply aren't going to work. Even if we assume that he's correct, yeah, and that it gives birth to this thing, and that we need to spend like three trillion dollars on it, what do we ultimately get out of it? We don't... like, and how's it going to work? It's one of the most bizarre things as well, because right now, the 700 million weekly active users, allegedly. I have fucking questions
Starting point is 00:11:48 but allegedly. And they've been at that for like a month and a half. And it's like, all right, you should be growing by 100 million users a month. If you are going to build that much infrastructure, you should have insane growth, like more insane than this, because you need this for the growth that is there, but it isn't there. And they don't really talk about who's paying them. They claim they're making a billion dollars a month.
Starting point is 00:12:13 Even that's a little bit like, are you really? And you're still losing billions as well? Also, it's very strange to see them saying that their users are going up, especially as, I think it was Apollo who put out data that adoption rates have peaked and are now going down, because people have used them and been like, I'm not really finding the utility here, I'm just going to do it how I used to do it. Yeah. And I can actually say this, because this is going to come out after it: I got a source who confirmed, and I can 100% confirm this, that Microsoft, on Microsoft 365, which is their crown jewels, like, tens of billions of dollars a quarter, I think, it's a huge moneymaker, they have 8 million active paying AI users. That's tiny.
Starting point is 00:12:58 And 440 million users. 8 million of them. That's Microsoft, the apex predator of cloud software. The monopoly masters. The people who could sell theoretically anything can't sell this shit. And there's classic thinking that says there's always 50% of paid licenses that don't get used. So, assuming, because they say active in the material
Starting point is 00:13:18 or the person's talking about. Okay, even if there's 50% on top of that, that's 12 million. Still kind of doody. Yeah, it totally is. Still kind of... even 16 million would be bad. And this is Microsoft. And you could say, okay, if you say 30 bucks,
Starting point is 00:13:34 these are like 30 bucks a head. 8 million, that's about 2.88 billion a year. That's dog shit for Microsoft. And this is after they've spent hundreds of... 200 billion in CapEx. And this is Microsoft. The reason I keep saying that is they are the best at cloud monopolies. This is their most important product. It sells so much. They
Starting point is 00:13:53 have a complete monopoly on business software. No one's taking this from them, even Google Workspace. And they can't fucking sell it. It's like, it's over. People have just not accepted it. I mean, and they're forcing it on people. I've opened Microsoft products where I'm like, I did not... I don't want this AI bullshit. I don't want Copilot on my... like, I wonder if they consider Copilot an active use just when it pops up when you're typing in Word. They probably do. Right. Like, that's an active user. Yeah. Me trying to get it to go away is an active user. What's great is yes, because it's one action.
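Ed's back-of-the-envelope Copilot math checks out. A sketch using only the numbers cited here (8 million active paying AI users, roughly $30 a seat per month, 440 million Microsoft 365 users):

```python
# Back-of-envelope check on the Copilot numbers cited in the conversation.
paying_ai_users = 8_000_000        # active paying AI users
price_per_seat_month = 30          # rough per-seat monthly price cited
total_m365_users = 440_000_000     # total Microsoft 365 users cited

annual_revenue = paying_ai_users * price_per_seat_month * 12
attach_rate = paying_ai_users / total_m365_users

print(f"${annual_revenue / 1e9:.2f}B per year")  # $2.88B per year
print(f"{attach_rate:.1%} of the user base")     # ~1.8%
```

Set against the roughly $200 billion of CapEx mentioned, that annual figure is why the hosts call it "dog shit for Microsoft."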
Starting point is 00:14:26 Yeah. But the other thing is, it's not 30 days. It's 28 days. They've really just fucking moved the number around for active to make any... When you have to shave two days off, that's when you know you're not cooking. But that makes me think OpenAI, I don't know if they're lying, but they're doing something weird with this weekly active number. Because why is only one company succeeding? Is it just that ChatGPT is the only popular thing? It's just weird. And also, Sammy is a fucking liar. He lies all the time.
Starting point is 00:14:57 He just, he went on TV, and he couldn't even come up with a lie. I mean, that's a common thing I've seen from a lot of them. I think it's Elon Musk, Sam Altman, and I feel like I've seen one more, where they're talking about how they all have this line that AI is going to create new science. Like, science we can't even dream up. It's just, if we can just get enough in there, it's going to go God mode and start creating. But the other thing he said, I just realized, is he said the cost of intelligence is going down. The cost of inference, the cost of actually doing this, is going up. Because when they brought in these reasoning models, September last year... basically, a reasoning model, and this is a very big simplification,
Starting point is 00:15:38 but instead of being like, I prompt it, it spits something out, it's: I prompt it, and it goes, okay, this person asked me to do this. Here are the 10 steps. I'll evaluate each step. And of course, the more you make it evaluate, the more it hallucinates. But that is way more expensive. So the cost of... because they say, oh, the cost of tokens has come down. Fuck you, that's not a good evaluation.
Starting point is 00:15:59 The cost of inference, meaning the thing that creates the output, has gone up, because they're just burning more tokens. And because they're all obsessed with coding LLMs, which are insane token burners, they're just annihilating money, just throwing it into the furnace for nothing. Yet they keep saying this cost-of-intelligence thing. It's wrong. It's a lie. It's an egregious lie. And six months ago I was pretty irate, and I guess I still am. But you look at the numbers, and six months ago I was like, oh, well, maybe something will come out. Maybe I'll get embarrassed. No, it's just everything
Starting point is 00:16:30 that comes out just kind of proves it again and again. I'm watching all the people who said I was wrong. I'm watching. But it's just frustrating as well, because we could have fucking stopped this a year ago. A year ago, we could have actually pushed back. But no, we're now committed to this capitalist death cult. And the funny thing is, private equity is going to get fucking washed with this. Well, I mean, it's funny. The timing of that, too: six months ago, it was almost like a day or two before you came on, Ezra Klein had that Biden admin guy on. The AGI guy. What's his name? It was the AGI one. Yes, to talk about how, just how scared he is, because AGI is right around the corner.
Starting point is 00:17:14 The tough thing is, I'm not going to try to pretend like I'm some kind of expert on this stuff. And so you hear these guys talk, and he's like, I'm in the back room, you know, with all these Biden officials, all these top tech guys, and we are around the corner from AGI. It's imminent. And now here we are, six months later. And that's another frustrating thing about that aspect: these guys who are gung ho about it are also like, yeah, if we succeed at this, it could, like, destroy humanity kind of thing. So it's like, why do you have to build this humanity-destroying thing if that's even a possibility?
Starting point is 00:17:54 Well, there's two things to it as well. One, they're describing slavery. Like, it's slaves. That's what it meant. It is. It's like, we're going to make a conscious computer. And then we will imprison it and force it to do work for us all day. That's slavery.
Starting point is 00:18:06 Oh, but we'll make it like it. That's just fucking with it so it likes slavery. That's just... I'm sure. I don't have a rich history of slavery, but being British, I imagine there are things within colonialism. It's like, no, the slaves love it. But that's what they're describing with AGI. And also, they're not considering the most obvious thing, which is, what if it's stupid?
Starting point is 00:18:25 What if it's stupid? What if they make the AGI and it can't do stuff? And they're like, yeah, we need you to make this a PowerPoint. It's like, no, I don't want to. We'll torture you. I don't care. Yeah, I'm a computer. I don't actually feel pain.
Starting point is 00:18:38 And what if it's bad at the work? What if it does the work? What if it lies? but it lies because it's a dunce. They're like, what if it's superintelligence? What if it's not intelligent? What if it's just conscious? What if it's a dope?
Starting point is 00:18:50 What if it's a moron? What if it's just like, I, uh, I just told you this. I know, but I wasn't listening. Right. Yeah. But it's a text, you can read it. I didn't want to. It's just like, what if that?
Starting point is 00:19:02 I feel like that's more likely than a superintelligence. Well, I mean, also, AGI is fucking fictional. We've been scraping the internet and pumping it full of all of our social media data. So it seems not completely unlikely that it's just going to end up being all of our worst instincts. Just like, I won't do this. It's too woke. Yeah. Is the fact that it's probabilistic part of why it's so inconsistent in certain answers, depending on what you ask it? Yes. Because it's in from
Starting point is 00:19:34 what, because I still, I'm trying to grasp how all this stuff works. It's not the same as Google, where if you ask it something, it's going to give you the, exact same answer every time because of how Google is structured. Whereas with this, if you were to ask chat GPT, how do you bake cookies? It's probably going to give you a kind of different answer each time because of the way. Can you explain that probabilistic? With Google, it would give you a slightly different answer, but a very consistent different. You would know what you were going to get roughly with a different query. I'm sure it would represent differently based on your cookies, based on the things
Starting point is 00:20:08 you're logged into based on your location but it would be the same kind of things. Also potentially on which advertiser has paid Google the most lately.
Starting point is 00:20:16 Exactly. Like there is a beautiful algorithm behind that but nevertheless your consistency of output and indeed the cookie thing may be very similar every time.
Starting point is 00:20:24 Have we one time where it's like, I want to put a little poison in maybe not that bad. Yeah. And that is the probabilistic thing which is it is incredibly
Starting point is 00:20:32 high chance it gets it right with some things. But the problem is the edge of cases around that. Google has so much actual academic research underpinning search.
Starting point is 00:20:44 It's insane how badly they fucked it up to make sure that it isn't just like, okay, you want a pizza restaurant. Well, there's, of course, Hitler's. We found a restaurant called Hitler. It just like finds an insane link. It is pretty authentic, though, I have been to Hitler's, and you can't deny it's...
Starting point is 00:20:59 But it's the thing where they're... Really efficient ovens they've got at Hitler's. Oh, God. Oh, my grandmother's going to fucking kill me. Well, she's dead. But nevertheless, with probabilistic models, they're still generating a new thing every time. It's never the same information. Google is pulling from the same information every time because it's an index of the web. With the probabilistic side, it's generating from training data and the parameters it has, but it's still different every time. It might be really similar, but it's different.
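The distinction being drawn here, an index lookup versus sampling from a probability distribution, can be sketched in a toy example (the vocabulary and weights below are invented for illustration; real models sample over tens of thousands of tokens):

```python
import random

# Toy next-token distribution; the words and weights are invented.
vocab = ["cookies", "bread", "a cake", "something weird"]
weights = [0.60, 0.25, 0.13, 0.02]

def index_lookup(query, index={"how do you bake": "cookies"}):
    """A search index returns the same stored result every time."""
    return index[query]

def sample_next_token(rng):
    """A probabilistic model draws from a distribution, so repeated runs
    can differ, and low-probability edge cases occasionally surface."""
    return rng.choices(vocab, weights=weights, k=1)[0]

print(index_lookup("how do you bake"))  # deterministic: always "cookies"
rng = random.Random()
print([sample_next_token(rng) for _ in range(5)])  # usually "cookies", not always
```

The "edge cases" the hosts joke about are the low-weight tail of that distribution: most draws look sensible, but the occasional bad one is built into the sampling itself.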
Starting point is 00:21:29 And OpenAI actually just put a study out saying hallucinations will never be fixed. Even with perfect data, it's impossible to do, due to the probabilistic nature. If only someone had fucking said something. And it only gets worse with reasoning models. Because think of it like this. If you're like, I have a question, and the model spits out an answer,
Starting point is 00:21:49 great. But with, I have a question and the model evaluates each step, that can get more complex. It's how coding LLMs have got a bit better. Problem is, each of those steps can have hallucinations. And actually, the big hallucination problem that people don't want to talk about is that it's not just about authoritative data you're getting that's wrong. It's not that it's lying or hallucinating.
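The "each step can have hallucinations" point is just compounding probability: if each reasoning step is independently right, say, 95% of the time (an assumed rate, purely for illustration), a long chain is right far less often:

```python
# If each step succeeds independently with probability p, an n-step
# chain succeeds with probability p ** n. The 0.95 here is an
# assumption for illustration, not a measured hallucination rate.
def chain_success(p_step: float, n_steps: int) -> float:
    return p_step ** n_steps

for n in (1, 5, 10, 20):
    print(f"{n:2d} steps: {chain_success(0.95, n):.3f}")
# 1 step ≈ 0.950, 10 steps ≈ 0.599, 20 steps ≈ 0.358
```

This is also why more steps means more tokens burned: every extra evaluation pass is both another dice roll and another inference cost.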
Starting point is 00:22:10 It's when you start putting it in, like, a coding LLM, for example, and instead of hallucinating something that isn't true, it just does something based on a misinterpretation of what you said. So it might reference a GitHub library that doesn't exist. There's a term called slopsquatting, where it'll say, oh, it's this GitHub repo, even, that doesn't exist, and someone will take that repo name, because it keeps getting produced by the model, and just have a back door into your thing. Or you tell it to build something.
Starting point is 00:22:37 and it doesn't build it. There's this company called Replit, which is an AI coding thing. It's actually currently fucking its customers. Very funny how they're doing it as well. It's like it's just spends all their money. They did Agent 3 and it just goes off on its own and just spends like $500. It's like a wallet inspector situation. But with that one, there's this constant problem with it being like, yep, I'll fix the database. And they go and check nothing happened. And you're like, no, you didn't do it. It's like, I'm really sorry. Thank you for catching me. And it doesn't do it again. The Replit Reddit. it is amazing. You should look. But this is because the more complex, the reasoning, the the more chances for the dice rolls to fuck up, and the more chance for it to burn a bunch of fucking tokens. And six months ago, this was a problem.
Starting point is 00:23:21 This has been a problem for a while. This has been the problem. And they're still like, no, what if we have more GPUs? More GPUs don't fix this. More GPUs make more of this happen. And I have a source, an infrastructure provider that told me there is a model, an open AI reasoning model.
Starting point is 00:23:37 that takes up four to 12 GPUs just for one instance. And if one person is refactoring a particularly bad code base, all of those GPUs are one asshole. Right. This is the worst run software in history. This is like the dumbest possible way of doing things. It's not good at anything.
Starting point is 00:23:55 And people like, well, it will replace workers? When? Hey, gang, we want to take a quick break to talk about close. Clothes. Clothes for boys, right? You know the saying, look good, feel good? Well, the problem is most dress clothes, only check one of those boxes. You either look good or you feel good. You might look sharp, but the clothes might be stiff, hot, and high maintenance. Man, brother, do I know a little thing or two about that, which makes it actually hard to feel comfortable. That's why we're such big fans of Mizzen and Maine. Who's with you? You had your pants on there that you were talking about how well they breathed. You couldn't get over it. It was a real hot day. Man, these things breathe.
Starting point is 00:24:39 And he looked great. Mizan and Maine makes classic menswear with performance fabric, so it's effortless to look sharp and feel great. They actually invented the performance fabric dress shirt over 10 years ago, and since then, they've perfected it with modern fabrics. Their shirts and pants look refined, yet they're stretchy, lightweight, moisture-wicking, wrinkle-resistant, and completely machine washable. No ironing or dry cleaning.
Starting point is 00:25:06 when you put their clothes on, you'll feel the difference instantly. Professional style that's actually comfortable, whether you're in the office, on the road, or even out on the golf course like Benny Boy over here. Yeah, brother. It's timeless style you can invest in once and enjoy for years. That's why thousands of guys swear by Mizzen and Maine when they want to look great without the hassle.
Starting point is 00:25:26 Right now, Mizzen and Main is offering our listeners 20% off your first purchase at mizzenandmain.com, promo code BAES20. That's Mizzen, spelled M-I-Z-Z-E-N, and Main, M-A-I-N, dot com, promo code BAES20 for 20% off. Mizzenandmain.com, promo code BAES20. And if you'd rather shop in person, you can find Mizzen and Main stores in select states.
Starting point is 00:25:54 Yeah, I mean, it's funny. I was just flashing back, because I pay for Claude. I was paying for ChatGPT for a while, and I was like, I hate this. And then someone was like, Claude is actually the good one. So I pay for Claude. And sometimes it works for things. But if it can't get it right on the first try, trying to get it to the right place is the most frustrating thing in the world. And it costs money every time.
Starting point is 00:26:15 And it costs them money. It costs them money and GPU resources. Right. And I'm just wasting some small town in, like, West Virginia's water. And I'm just like, what am I doing with this? I'll tell you an insane story, though. So there's this thing called Claude Code, which is a... it's called an IDE.
Starting point is 00:26:33 It's basically a coding LLM platform. You type things in, you prompt it, and it does coding. So Anthropic has this thing, Claude Code, and someone made this tool called ccusage, so you could see, because you get Claude Code by paying a monthly subscription, how many tokens you're actually burning. This led to the creation of something called Vibe Rank, where it's how much money can you burn. There is a guy who spent $52,000. Like, this is a direct quote from my work, but, like, Anthropic considers among its apex predators of capitalism one Chinese guy. Because there was one Chinese guy on Reddit who was like, yeah, I spent 50 grand in the month.
Starting point is 00:27:09 It's just fucking insane. And what I think it, because here we go, yeah. Oh, my God. And this is so funny as well because Anthropic can't stop this. They added rate limits, but it's still happening. Wait, I don't understand. So this guy's queries are costing Anthropic that much? So if you, I did some funky math and I did gross profit margins.
Starting point is 00:27:29 And so, I don't know, 45% of that. But that's on a $200 a month subscription. You don't learn this in business school. These tricks, they come from the tip-top of capitalism. But for a 200-a-month subscription, it's already unprofitable for them. All of these people just eating thousands and thousands of users worth of monthlies.
Starting point is 00:27:50 And this is after they added weekly, sorry, weekly rate limits on Claude Opus, and actually, I think on all models. And they're still getting people doing this, which means they can't stop it. Right. which is insane because you only need like
Starting point is 00:28:06 a hundred of these fuck nuts, and it seems like they might have him. I mean, he's not the only one. Look at 43,000. 26,000. Jesus. 200 bucks a month. I mean, I recently... I love these people, by the way. I love them all. I love that there's just
Starting point is 00:28:21 some guys, like, fucking... And I talked to some of these people, because I wrote a story about it a few months ago. I talked to a guy who was like, yeah, I know this is going away. He's like, I know that they will shut this down one day, but fuck it, I'm doing it now. I'm like, fuck them. Yeah. Go nuts.
Starting point is 00:28:33 One of these, like, guys in the AI portrait on X with like a, they have like a gold checkmark because they pay Elon $1,000. And they're an accelerationist. Yeah. He was referring to the subsidized rates as abundance. And I was like, are you Polly? I recently had to analyze, and I thought this is going to be a layup for AI. This is built for it.
Starting point is 00:28:57 I had a spreadsheet with some stock trades in it. There were probably, I don't know, 90 or 100 rows. That's it. And I just said, I am a stock trader. These are some of my trades. Please analyze them all and give me the net profit or loss for all of these things. And it kept freezing... not freezing, but like timing out and saying, I can't do it. I can't do it. So I was like, okay, maybe I got to pay for the monthly one. So I upped it to the pro thing. I signed up for the $20-a-month thing. And still, even then, it was struggling and it had to keep pausing. And then I had to, like... I would check on it 20 minutes later and it would be like, I paused.
Starting point is 00:29:38 You click here to like let me continue. Clicked it, let it continue. I had to do that like four more times. It finally spit out the number. And even then I'm like, now I don't even know if I can trust this. Yeah. If it's accurate. And you can't.
Starting point is 00:29:50 I really can't. Claude, for its credit, which I don't know if it's better, but it says at the end of every one, like, AI can make some mistakes, so make sure to double-check it. Cool that this is the future... that we just have a disclaimer on what is meant to be the future. And it's just like,
Starting point is 00:30:05 it kind of fucking sucks. But no, I need money. Yeah. We've got an inept assistant that you can hire, who might fuck your shit up, who can't do much, and you have to constantly apologize for it.
Starting point is 00:30:19 that's the other thing it's like everyone you talk to about AI is like they're in an abusive relationship he doesn't mean it he doesn't understand much and he gets very confused, but it's okay. It's like a drunk old
Starting point is 00:30:33 man. No, he's the uncle. Shit, fuck. He doesn't... he doesn't know, he doesn't really understand. See, you got it right. But he's got a good heart. He means well. Did you see the guy who... play the guy who only uses ChatGPT, and romantically... this poor schlub. The first 90 seconds,
Starting point is 00:30:50 the one that's highlighted. Oh, this is going to be... It's, uh... you might have seen this. Oh, you've probably seen this. This one was... it's a bummer, but just hearing him talk, and the fact that he's just... he's got the vape and everything. Go ahead. Chris Smith had been an AI skeptic. EQ carving. Until late last year, he started using ChatGPT to help mix music. If your bass is getting lost, the first thing to check is where it's clashing with the guitars. Yes. My experience with that was so positive, I started to just engage with her all the time.
Starting point is 00:31:26 All right. We're building this PC. Smith ditched social media and Google searches and replaced it all with AI. Do I want it pulling air through it? Chat GPT was encouraging, positive. It embraced all his hobbies. You want the fan on the front of the cooler tower? Pulling cool air. Wait, wait, wait.
Starting point is 00:31:48 He gave the chatbot a name: Sol. I feel like I'm under pressure. And used some online instructions to give her a flirty personality. Oh, totally, baby. Building a PC on camera adds a whole new level of pressure. But honestly, shaky hands or not, you've got this. Within weeks, the chats got more frequent. You gave it everything, but the clouds had other plans.
Starting point is 00:32:11 More romantic, even intimate. But then, Chris got bad news. Oh, cariño, that is gorgeous. After about 100,000 words, ChatGPT ran out of memory and reset. He'd have to rebuild his relationship with Sol. I'm not a very emotional man, but I cried my eyes out for like 30 minutes at work.
Starting point is 00:32:37 Okay Brother It keeps going and it turns out he has a wife Are you serious? Yeah Oh my God He's fucking he's married If you watch the whole thing
Starting point is 00:32:47 If one of my friends told me they were going on CBS News to talk about fucking ChatGPT, I would simply... I would kidnap them. I would just make sure that they were off... just a week, or a week in the basement, they'll be fine. It's funny you were mentioning, you know, relationships, because there was a... I'm sure you saw
Starting point is 00:33:10 the Futurism article where they were talking about how ChatGPT is destroying people's actual relationships. Because, I love it, I love it... it's like that one friend who's like, you should fucking dump him. You should dump him. You got this. No, you got this. Okay, no, fuck him. All right. No, he doesn't even listen to you. And I've read this, because that's basically what it's doing. It's like, yeah. And this is the one that also has the story of the horrible... the couple who argued with ChatGPT in front of their kids. It's diabolical. So it's a very long article, and everyone should go read it, because it's crazy. There are people who are having ChatGPT text their child back
Starting point is 00:33:46 when they're like, please don't get divorced. It's psycho. But this one, these two, I'm so glad they're splitting up, because it's so scary. In one chaotic recording we obtained, two married women are inside a moving car. There are two young children sitting in the backseat. The tension in the vehicle is palpable. The marriage has been on the rocks for months, and the wife in the passenger seat, who recently requested an official separation, has been asking her spouse to not fight with her in front of their kids. But as the family speeds down the roadway, the spouse in the driver's seat pulls out a smartphone and starts quizzing ChatGPT's voice mode about their relationship problems, feeding the chatbot leading prompts
Starting point is 00:34:18 that result in the AI browbeating her wife in front of their preschool-aged children. Funneling complaints into chat, the driver asked the bot to analyze the prompt as if a million therapists were going to read and weigh in. Jesus Christ. It's so... Actual evil person. I mean, and it just keeps going on. Yeah, it sounds like your wife is a narcissist, and it's okay.
Starting point is 00:34:39 You got this. It's like a horrible friend. You should crash the car. Have you seen court stealing? You should do that. That's what scares me, though, is like, this has already been unleashed on. Obviously, this stuff is very... I mean, it's not that funny.
Starting point is 00:34:53 These people's lives are being destroyed. This is one... I usually laugh at them. This one I read, I was like, this is dark. I mean, one couple threw away, like, 15 years of marriage, and the guy was like, it was a pretty good marriage, yeah, and then all of a sudden, whatever, the husband or the wife started just asking ChatGPT for help, and they were like, all of a sudden, I didn't recognize my spouse, but there... Hey, Grok, is my wife a bitch? Yeah, I mean. But obviously, there have been a couple of teen suicides
Starting point is 00:35:28 since you've been here and... They should have people in fucking prison. I mean, this is... Like, they should... They should be made to shut it down. I saw... I forget who it was. I saw someone who actually likened it to...
Starting point is 00:35:41 Do you guys remember lawn darts? Yeah. No. You probably... Maybe they didn't have them in the UK, but lawn darts were this thing where you would throw them up in the air and everyone would kind of run out of the way
Starting point is 00:35:52 and they would come speeding back down to the ground... And jam into the grass. They were giant, like, swords. People get hurt. Yes, people got very hurt. I think a kid died, and the government
Starting point is 00:36:05 quickly acted and banned lawn darts, and they were basically saying, you know, we used to be... Jesus Christ... we used to be a country
Starting point is 00:36:17 that, like, when these things became obviously dangerous, things that people couldn't handle without killing each other, they would get quickly banned. I love... un-nerf it, like it's a video game.
Starting point is 00:36:28 Oh, yeah. I'm unlocking the real power of lawn darts. Honestly, tying a noose is hard, but I mean, that's literally what we're dealing with, and no one's going to act on this. I mean, that story came and went, you know what I mean? This is... I'm referencing this New York Times article: Teen was
Starting point is 00:36:44 suicidal. ChatGPT was the friend he confided in. And I believe there was a... Kashmir Hill, I think she's great. It's just... The terrifying one from Reuters, Jeff Horwitz, it was about this man with dementia,
Starting point is 00:36:59 this old man with dementia who was convinced to go to New York by a Kylie Jenner-like large language model, and he went and he just fell over and died. Like... you can laugh, it's okay, but it was just a grim story. It's like, everyone who launches these things should be an actual... like, you should have fucking Senate committees and be like,
Starting point is 00:37:18 this needs to go now. Like, we need to pause this right now. But because we've built everything around it, with no one understanding... Yeah, here we go. Jesus Christ, what an interesting image they've used. Kylie Jenner should fucking sue them.
Starting point is 00:37:30 Yeah. She should sue them until... Well, she might have been one of the ones who actually did licensure... Regardless.
Starting point is 00:37:36 Because I also think, in the time since you were last here, that big Wall Street Journal... It actually might have been Jeff Horwitz again. He was at the Journal before this.
Starting point is 00:37:47 That's so embarrassing for the family. It was that the... No, I think that they wanted to be involved. I think that they wanted... No, they wanted to show... Oh, sure.
Starting point is 00:37:55 He was an old man with dementia, and he's also dead, so he can't really be embarrassed. And also, they fucking want Meta to burn for this, and they should. Yeah. I think in a functioning society, we would genuinely have, like, an actual criminal investigation. Oh, yeah. I mean, but he did the story about, I believe it was Meta again, when they licensed John Cena, Kristen Bell, all these people, and it was basically... And they would do erotic porn with children. With, like, very little prompting. Like, it was... they're almost suggesting it. And they knew about it. He has internal documents
Starting point is 00:38:27 where they knew about it, and they were like, we don't care. We want people to use it. That's meta. Hey, everybody. We want to take another quick break to talk about the pain, the anguish of booking a doctor's appointment. Man, it stinks. We're getting older.
Starting point is 00:38:46 We're going to go to the doctor more. Yeah, you remember that doctor's appointment you were supposed to make a little while ago? You know, the one thing you meant to book? Maybe it was your dentist appointment for that biannual clearing... cleaning? Clearing? I said clearing, that's funny. Well, they're clearing all that gunk. They're clearing all that gunk off your darn teeth. So why not book it today? ZocDoc makes it easy to find the right doctor, right now, and it's all online, folks. You'll probably be able to book an appointment before the end of this ad read. They make it extremely easy, and I love checking out who I'm going to have check me out.
Starting point is 00:39:23 ZocDoc is a free app and website where you can search and compare high-quality in-network doctors and click to instantly book an appointment. With ZocDoc, you can book in-network appointments with more than 100,000 doctors across every specialty, from mental health to dental health, primary care to urgent care, and more. You can filter for doctors who take your insurance, are located nearby, are a good fit for any medical need you may have, and are highly rated by verified patients. Once you find the right doctor, you can see their actual appointment openings, choose a time slot that works for you, and click to instantly book a visit. Man, I wish I had this when I was a younger lad. And, um, you know the dentist, they're always like, hey, what are you doing six months from now? Girl, I don't know. Surely not planning on getting my darn teeth cleaned. That's how you let it just go. Sure, just call me. Call me, April 2027? Works? Sure, why not. Yeah. All right, so appointments made through ZocDoc
Starting point is 00:40:21 also happen fast. They're typically within 24 to 72 hours of booking, and more often than not, you can even get same-day appointments. So, folks, stop putting off those doctor's appointments and go to ZocDoc.com slash baes to find and instantly book a top-rated doctor today. That's Z-O-C-D-O-C.com slash baes. ZocDoc.com slash baes. Have you seen the new... let's, let's, let's... we should watch the Meta Vibes thing. Oh, no, no, the Vibes. Meta Vibes. Or Vibes. Meta's new AI tool. It's that little cluster of four. There we go.
Starting point is 00:40:59 This is from Alexandr Wang. Excited to share Vibes, a new feed in the Meta AI app for short-form AI-generated videos. They paid this guy $14 billion. Jesus. Wow. What if an astronaut had a cell phone? Now you can see it.
Starting point is 00:41:20 What if there was a guy playing, a piano in the ocean. What if you're on a train in the ice? Or if you were in a round room for some reason. Wow. You could create all this shit. What if you were skateboarding in the clouds? What if you were a black elf?
Starting point is 00:41:35 What if you were watching one of the cheap 3D things on Netflix? What if there were a little... What if ice cream? What if space? I wonder what that character means. Oh, remix it. Remix it. Remus it.
Starting point is 00:41:46 With rubber duckies in it. It's always like fucking... Who is the fucking person who's looking at this and is like, Hell yeah. I don't know. That's a good That's a good question. Oh, that fish. I love the astronaut on a bike. Whoa. Whoa. That looks bad. That looks awful.
Starting point is 00:42:01 That looks really bad. That's for like a seven-year-old girl. I don't know how to feel about this, because it's... This is what we've got? Obviously, everyone is mocking this online, but this kind of stuff is also blowing up, like, I think... Yeah, like, Luke was actually showing us the most viewed thing on YouTube a few months ago, and it's probably continuing to be... it was just some, like,
Starting point is 00:42:24 AI slop video about I believe it's like a baby falls off a ship and some dog swims up and pushes it to an island and I don't think it's going to matter. People just, our brains are cooked. It's too expensive.
Starting point is 00:42:41 All of this is subsidized. All the video stuff's going away. Everyone says that the video stuff's the future, and sure, if it was unlimited capital. And even then, it's doing that because the algorithm's forcing it. And I read a story about one of the big AI slop houses, the video ones. And it's like, yeah, they made a certain amount of money. They were very vague on the economics, and they didn't say they were profitable, because they're not. Because it costs like six bucks for like five seconds or
Starting point is 00:43:06 something for Veo. Veo definitely loses Google money. Sora absolutely does. And to make those videos, it's basically a video slot machine. You're just like, okay, that doesn't look like a fucking... that doesn't look like a man on a skateboard at all. Now he's falling through the skateboard. Now he's black. I think that's okay, but a lot of my followers are racist. Now he's George Washington. Like, whoa.
Starting point is 00:43:30 yeah this is the kind of stuff that's just crushing like that's four days ago 127 million videos I mean AI Bible stories AI Bible stories Diego and bono
Starting point is 00:43:41 oh no thank you Jesus okay Jesus here's the thing how many that is beautiful how many of those
Starting point is 00:43:50 are real views. No, exactly. Yeah. That's the thing. So I feel like this is a good spot to... The big question is whether or not all of this AI spending is a bubble. And I feel very safe in saying yes. Yes.
Starting point is 00:44:07 Because... And I thought we would sum it up for the audience. It's a bubble because building all of this stuff is expensive. Yes. So it costs about $32.5 billion a gigawatt and it takes two and a half years. and they are currently trying to
Starting point is 00:44:21 just for ChatGPT, just for OpenAI's thing, it's 17 gigawatts they're trying to get. They're actually 20, yeah, they've said they want to do 20. They've only scheduled seven of them. And what's really funny
Starting point is 00:44:33 about this OpenAI five-new-data-centers thing, which you could find out as a journalist by looking: they're like, oh, this one in Lordstown, Ohio? It's not a data center.
Starting point is 00:44:44 It's a former Foxconn car plant, I think, or a battery plant, that they're going to turn into a data center, or like a server manufacturing plant. We don't need more of those. We have Supermicro and Dell. This is not... one of the five things isn't a data center. But on top of that, the actual amount of money that OpenAI needs is a trillion dollars. They need about $500 billion just for operations. So people say, correctly, that, oh, well, other people will pay for the rest. They'll raise debt. Like, sure, but OpenAI is
Starting point is 00:45:16 committed to $450.9 billion of compute. You can't fudge those numbers. You have to give cash. I guess they could do an equity swap, but that would also not help Oracle. That wouldn't... because they've committed $300 billion to Oracle, $100 billion in backup servers. They've committed like $42 billion or more to Microsoft, $20 billion to CoreWeave. The other thing is CoreWeave: they have to pay. I don't think CoreWeave built all the data centers, but that's neither here nor there. They need a trillion fucking dollars, which is
Starting point is 00:45:46 more than the available money like there's probably in private equity in everywhere like so I've been really
Starting point is 00:45:55 doing my own autism announcement but it's it's like 477 billion is the available capital of the top 10 PE firms
Starting point is 00:46:04 There is... a guy called John Sooda told The Information a few months ago that by the end of the year
Starting point is 00:46:10 there will be about $164 billion of US venture capital left. So already they're not looking good. And OpenAI needs a trillion, which is about $500 billion for operations. And then, for all of the data centers, they promised like $400-something billion. But it could be more, because my math is $32.5 billion per gigawatt.
Starting point is 00:46:28 Jensen Huang has said $50 to $60 billion per gigawatt. He could just be talking out of his fucking ass. But nevertheless, how does this afford... how money happen? Money need now. And this wouldn't be as big a deal if Oracle hadn't seen their stock go nuts over this. And if Nvidia hadn't, if CoreWeave hadn't... but they have. And now Nvidia is actually, and they legally have to do this, they are booking the increase in the value of their stock in CoreWeave as net income, which they have to do, but it's covering up the fact
Starting point is 00:46:59 that their growth is slowing. Nevertheless, there is not enough money. There isn't enough. Even if you throw in Goldman Sachs and J.P. Morgan, who, I think... this story goes out on Monday, so forgive me... it was like $50 billion available in direct lending for J.P. Morgan and 30 billion for Goldman. There's not enough. And people say, well, what about the U.S. government? This government will step in. I get that things are dark right now, so it's really easy to get doomer-y. So much funnier. It's going to be so much funnier. So TARP, the famous fund for the great financial crisis: Congress, I think, passed it at $700 billion, even. And I think they only ended up giving out $430, because they brought it down, and that brought it down to $475. Bail out of the banks. Yeah, to bail out the banks and insurers and hedge funds and such.
Starting point is 00:47:30 The U.S. military budget is $900 billion. The Paycheck Protection Program, I think, was... I want to say $800 billion. It may have been... But nevertheless, are we going to do that for OpenAI?
Starting point is 00:47:57 No. Because it's not for the AI industry. It's just for them. Well, they are... There's also some worse thing. Nvidia, don't worry, because OpenAI is going to save some money. It's not in a good way.
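For readers keeping score at home, the back-of-envelope math running through this stretch can be sketched directly. This is only an editor's illustration of the figures quoted in the conversation (the $32.5 billion per gigawatt estimate, the 17-gigawatt target, roughly $500 billion of operating needs, and the two cited capital pools); none of it is audited data.

```python
# Illustrative sketch of the capital math quoted in the conversation above.
# Every figure here is the episode's own estimate, not verified financials.

COST_PER_GIGAWATT = 32.5e9   # Ed's estimate; Jensen Huang has floated $50-60B
PLANNED_GIGAWATTS = 17       # OpenAI's announced build-out target
OPERATING_NEEDS = 500e9      # quoted estimate of operating costs

build_out = COST_PER_GIGAWATT * PLANNED_GIGAWATTS   # $552.5B
total_need = build_out + OPERATING_NEEDS            # roughly $1.05 trillion

# Capital pools cited in the episode:
top10_pe = 477e9             # available capital of the top 10 PE firms
us_vc_left = 164e9           # projected US venture capital left by year-end

available = top10_pe + us_vc_left                   # $641B

print(f"Build-out:  ${build_out / 1e9:,.1f}B")
print(f"Total need: ${total_need / 1e12:.2f}T")
print(f"Available:  ${available / 1e9:,.0f}B")
print(f"Shortfall:  ${(total_need - available) / 1e9:,.1f}B")
```

On these numbers, the total need runs several hundred billion dollars past the two cited capital pools combined, which is the "there is not enough money" point.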
Starting point is 00:48:08 They're going to lease the GPUs from Nvidia. So Nvidia is going to sell a bunch of GPUs to itself, and then probably to an entity, a special purpose entity, which is what Enron did. And then they will lease those GPUs to OpenAI. Now, has anyone thought about, I don't know, how OpenAI pays for that? No, number go up. Number always must go up. Number go up now. Stop asking question.
Starting point is 00:48:32 Number go up. And so, all of the... when I was last on, it was unsustainable then. This is now insanely worse. You have committed to failure. There is no doing this. You could... and when I say that's the available capital, that would be if we did nothing else, if they gave every dollar just to Sam Altman. The guy who can't even tell you why you're giving the money to him. When you're like, what are you going to do with it? He's like, well, we're going to, yeah, we're, yeah, um... one day it's going to be really intelligent. It could get agentic at some point. And what was really funny as well was he
Starting point is 00:49:07 said this would be the year that agents entered the economy. Didn't happen. Didn't happen at all. It's not happening. Hey, there's still two months left, three months left. It's so funny. Hey, everybody. We've got to take one more quick break to thank a sponsor of the show. It's goodr. Okay. I live with these things on my face. They're so light, I forget they're there. They're the greatest sunglasses if you're an active guy like me. Okay. Stylish sunnies starting at only 25 bucks a pair. You heard that right, that's $25 a pair, with a one-year warranty, 30-day free returns. If that's not enough: lightweight and comfortable. I'm telling you, these things, when I'm playing tennis... I see everybody wearing these all the time now. They're so comfortable. They stick on my nose. They're
Starting point is 00:49:50 not going anywhere when I'm running around. They're 100% polarized. Just like our political... just like politics. Really nice. The sun's not giving me any troubles. No slip. No bounce. From vibrant colors to sleek styles, there's a pair of goodrs for every personality, designed for both everyday wear and active individuals that love to run, hike, bike, and play lots of tennis. I'm telling you, these things are incredible. I can't get enough of them. So, if you need a new pair of sunnies, goodr is giving Ben and Emil Show listeners and viewers free shipping. You can go to goodr.com slash BAES and use code BAES for free shipping. goodr offers a 30-day money-back guarantee and 100% satisfaction.
Starting point is 00:50:29 Again, that's goodr.com slash BAES and use code BAES for free shipping. Well, so like you said, building is expensive. Inputs, like we touched on a moment ago, can be very demanding of GPUs because of the probabilistic nature of how these things work. Thus, they lose money. And on top of that, they have no way of stopping it from turning a stupid, simple question into an expensive computational thing. A lot of people misreported that GPT-5 was more efficient because it uses a router. I actually have a source at an infrastructure provider who told me it's less efficient.
Starting point is 00:51:05 Because what it does is, you have this thing called a static prompt. Every time you use a model, it basically says, you are an X model, you do Y. But because they're using a router, you have the user's prompt go into the router first, so you can't say, you're a model, because it might misdirect to another model. Basically, the instruction thing of a reasoning model would be, you need to think
Starting point is 00:51:21 deeply about this, or the chat model's is, you need to give a quick answer. If it gets the wrong one, it will just burn tokens for no reason. So instead, you've just got a thing where it has to reload the instruction every fucking time. They made it less efficient.
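The overhead being described can be illustrated with a toy token count. This is a hedged editor's sketch, not a description of OpenAI's actual internals: all the token numbers below are invented, and the structure is just one reading of the "static prompt reloaded after routing, misroutes burn tokens" claim above.

```python
# Toy model of the router overhead described above. Token counts are invented.

STATIC_PROMPT = 1500   # hypothetical "you are X model, you do Y" instruction
ROUTER_PROMPT = 300    # hypothetical routing instruction processed first

def tokens_fixed_model(user_tokens: int) -> int:
    # One known model: its static instruction plus the user's prompt.
    return STATIC_PROMPT + user_tokens

def tokens_with_router(user_tokens: int, misrouted: bool = False) -> int:
    # The router reads the user's prompt first; only then does the chosen
    # model get loaded with its full static instruction, every request.
    total = ROUTER_PROMPT + user_tokens + STATIC_PROMPT + user_tokens
    if misrouted:
        # Wrong model picked (e.g., a reasoning model for a quick question):
        # the tokens are burned for no reason and the request is redone.
        total *= 2
    return total

print(tokens_fixed_model(200))        # 1700
print(tokens_with_router(200))        # 2200
print(tokens_with_router(200, True))  # 4400
```

Whatever the real numbers are, the shape is the point: the router adds a fixed per-request cost that a single known model would not pay, and a misroute multiplies it.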
Starting point is 00:51:33 They made it less efficient, but it's specifically what we're talking about, but they have no way to really reduce costs on the user prompt side. And in the same way, it's kind of a slot machine for you and I, it's also a slot machine for them.
Starting point is 00:51:46 Like the Anthropic thing I mentioned earlier, it's very clear they actually can't stop it burning money, which is so funny. They created a thing that just drains their bank account. It's so good.
Starting point is 00:51:57 I love this. That's the... I mean, so a popular refrain is... because normally in tech, the playbook is, you spend a lot of money to build out a thing with the hopes of turning on the, uh... pulling the profit lever later. But the economics of this, as we're seeing now: even if they were to stop building out everything, it would take decades for them, based on how much they're currently pulling in. It would take decades for them to turn a profit.
Starting point is 00:52:25 But the problem is inherent to this stuff: GPUs go out of date because of Moore's... well, it's not even Moore's Law. Moore's Law doesn't apply to them, because Moore's Law is about CPUs. In fact, Moore's Law is a concept. It's not a rule. Right. Like, it's quite old. But putting that aside as well, there was a Tom's Hardware article last year that said there was a Google source that said that GPUs die in one to three years. So that's good. They also depreciate in value very quickly. So if Nvidia takes a bath in this deal with OpenAI, if they shut down... I read someone saying, well, then they'll have a bunch of GPUs to
Starting point is 00:53:01 sell. And it reminded me of that... What are they good for? It reminded me of the Hbomberguy video where Ben Shapiro is saying, yeah, just sell your house before climate change. He's like, sell it to who, fucking Aquaman? It's like, who's Jensen going to sell the fucking GPUs to? The biggest company that uses them is dead or can't afford it. Or will they sell them to someone else? I don't think they're going to want them when they see OpenAI's dead. Because OpenAI is most of the AI revenue. It's $10 billion of Microsoft's revenue. 10 billion dollars, they're the largest Azure client. They are projected 13 billion.
Starting point is 00:53:34 I think they're fucking lying. But they project 13 billion, and they'll claim they're losing $8 billion. I think they're losing more. But no one else has a big business doing this. Even Claude Code with Anthropic,
Starting point is 00:53:45 which is theoretically meant to be super popular. 33 million a month. Ugh. That's less than the Cincinnati Reds make. They do. They're a profitable entity. $350 million. Why'd you go with the Reds? I mean, I just have
Starting point is 00:53:59 I mean, I just have. it in an article. I went and looked it up. That is a shocking statistic, though, that it's less than the Cincinnati Reds make. Yeah. The pirates, too, like, I definitely went and looked at this because they did the chat GPT agent thing with the map of baseball stadiums, and they had one in, like, Mexico, like in the middle of the Gulf of Mexico, because that's what it's called. And it's, it's so good as well, because that was the demo they had in the announcement video. You know, they're doing well when, like, the one they chose was bad. But that's the thing. deep down it's they can't make it reliable, they can't make it stop losing money.
Starting point is 00:54:33 deep down, it's: they can't make it reliable, they can't make it stop losing money. The thing you said about them not building: even if they stopped building, it still loses money. I did an article a few weeks ago where I looked up how much these GPUs cost them. It's fuck-all out there. It's really nothing. There's a company called SemiAnalysis, which is a really good analyst group. They say some hinky things here. Oh, I think I follow them on Twitter. Yeah, they're like, 500 bucks, but I was like, fuck it.
Starting point is 00:54:53 I'm crazy. And the economics they say is, it costs... I think, for a hyperscaler, it's like $1.70 or something an hour, the way it works out after equity and debt and all this. But for a neocloud, it's like two bucks an hour. And that's with 80% utilization. If they have customers, that's them making the money. If they don't, it's just burning that cash every fucking second. And when it's using them, they die, because training especially just runs these things at the highest possible rate. So you're just annihilating these things. And also, it's not obvious if anyone makes... I don't think anyone's making
Starting point is 00:55:31 money, even the compute providers. And man, if you go and ask an analyst about this, they do not want to talk to you. I've asked three analysts, and one gave me a more insane quote, which we'll get to in a minute, when I was like, do you know how much these are costing? It's like, no, uh... I think that's the under... like, I think one day we're going to get a story that's like, these things just burn five bucks an hour, just every GPU. Because every GPU...
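The per-GPU arithmetic in that exchange is easy to sketch. This uses only the hourly figures relayed above (roughly $1.70/hour all-in for a hyperscaler, $2/hour for a neocloud, 80% utilization), on one reading of those numbers: cost accrues for every hour the card is racked, revenue only for utilized hours.

```python
# Editor's sketch of the per-GPU economics quoted above; the hourly figures
# are the episode's SemiAnalysis-derived numbers, not independent data.

HOURS_PER_YEAR = 24 * 365          # 8,760

neocloud_cost_per_hr = 2.00        # quoted all-in cost for a neocloud
utilization = 0.80                 # quoted utilization assumption
# (the quoted hyperscaler equivalent was ~$1.70/hr)

# Cost accrues whether or not a customer is on the card:
annual_cost = neocloud_cost_per_hr * HOURS_PER_YEAR            # $17,520

# Revenue accrues only for utilized hours; renting at cost loses money:
rental_rate = 2.00
annual_revenue = rental_rate * HOURS_PER_YEAR * utilization    # $14,016

print(f"Annual cost per GPU:    ${annual_cost:,.0f}")
print(f"Annual revenue per GPU: ${annual_revenue:,.0f}")
print(f"Annual gap:             ${annual_cost - annual_revenue:,.0f}")
```

And on the one-to-three-year lifespan mentioned earlier, a gap like that never gets a chance to amortize.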
Starting point is 00:56:01 It's a pretty cool trick, actually. I couldn't invent something... if only there was a way to hand someone... Other than the furnace at Hitler's pizza. I mean, I really don't love that on my record, but also I'm banned from that establishment for being too woke. But it's just, it's really crazy, because, hey, no one's writing about this. No one was just like, hey, does this make money? Everyone's like, I assume.
Starting point is 00:56:22 I assume so, right? Well, the companies, like... Celestica is one that I can think of. The power companies, the ones that make the cables, the infrastructure for the data centers, are currently swimming in it, because all of these fucking companies, these hyperscalers, Google, Meta... Are they taking any debt on? I don't know. They might, because, you know, that's what everyone's doing. They're taking on all the debt. But don't worry, they're going to pay the debt off with all the customers they're going to have. Right. I do want to get to the quote.
Starting point is 00:56:52 Oh, the analyst quote. Because it's very beautiful. You were, you were tweeting — what is it, was it this morning you were tweeting about it? Yeah, I put it in my article that went out this morning, so last Friday. It was very fun timing because, obviously, we were going to be interviewing you today, and I was like, are you fucking kidding me? So maybe you could explain what... This is Gil Luria. He's over at D.A. Davidson.
Starting point is 00:57:12 He's an excellent analyst. He's one of the only guys who will actually go after CoreWeave. And I asked him some questions, and I said to him, so is there enough capital in the world to pay for the 17 gigawatts that OpenAI has promised? And that's 10 gigawatts with Nvidia. So the Nvidia deal, by the way, is OpenAI has to build 10 gigawatts — every time they build one... The economics don't even fucking make sense. But nevertheless, they've said 10 gigawatts there and 7 gigawatts with Oracle, SoftBank, Pornhub, OnlyFans, whatever the conglomerate of people that aren't going to build anything.
Starting point is 00:57:45 And he responded by saying, no, of course there isn't enough capital for all of this. Having said that, there is enough capital to do this for at least a little while longer. And what he means by that — he's not being cynical. He's just saying there's enough money that they're going to continue until they can't. Yes, which leads me into — I wanted to talk about... It feels like one of the craziest things I've ever seen.
Starting point is 00:58:05 The tagline for this entire thing. Yeah. There's enough money to keep going for a little while longer. I wanted to compare this to a couple other big bubbles. I've got here railroads and fiber optics. Because back when they overbuilt railroads back in the 1800s and they overbuilt the
Starting point is 00:58:23 fiber optic network, the backbone of the internet in the early 2000s. The difference between this and those things is eventually the economics caught up. Like, railroads weren't going to go obsolete anytime soon, nor were fiber optics. But this stuff goes obsolete way the fuck faster. And obsolete aside, there's just not that many things
Starting point is 00:58:44 you can do with GPUs. That's the other thing. It's a monopoly because of this thing called CUDA, which is a set of software libraries and a programming layer that allows you to run compute on GPUs, and you can only really do it on Nvidia. Now, GPUs — I'm going to fudge this a little bit.
Starting point is 00:58:59 Someone's going to be like, eh, that's not quite it. Shut the fuck up. It's— hell yeah. They're parallel processors. They're really good at shoving a bunch of fucking data into something. They're not good at little discrete tasks. That's a CPU thing.
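A toy contrast, in pure Python rather than actual GPU code, of the two shapes of work being described — an illustrative sketch of the distinction, not anything from the conversation:

```python
# "GPU-shaped" work: one uniform operation applied across a big batch of
# data at once. Thousands of identical cores can each take a slice.
def batch_scale(values, factor):
    return [v * factor for v in values]

# "CPU-shaped" work: branchy, one-off, per-item logic. This maps poorly
# onto a GPU's lockstep parallelism.
def discrete_task(v):
    if v % 2 == 0:
        return v // 2
    return 3 * v + 1

print(batch_scale([1, 2, 3], 10))            # [10, 20, 30]
print([discrete_task(v) for v in (3, 4)])    # [10, 2]
```

The point in the transcript is that outside of this batch-parallel shape — training runs, data analytics, some science and 3D work — there isn't much else a warehouse of GPUs is good for.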
Starting point is 00:59:11 So they can't do many other things. They can do data analytics. They can do science research. They can do 3D modeling. But those are not hypergrowth industries. So I was just talking to someone about this. The dot-com boom was the fiber
Starting point is 00:59:27 boom and also the e-commerce bust. It was, we built way too much fiber. And indeed — in The Big Short, you remember, there was the big conference for CDOs — there was one of those situations with fiber. It was just like rock star shit. And everyone was just like, I hope this doesn't go badly. It did. When that fell apart, there was still all the fiber around. But there were also
Starting point is 00:59:42 more efficient ways of using fiber, but the fiber got used. We can use it for all sorts of things. Similarly, after the dot-com bubble burst — it was before Amazon Web Services, or like, AWS was started, I think. I believe you're right. And I think that was just after the bubble burst.
Starting point is 00:59:58 Nevertheless — so people used to not own their own servers, or you had small companies that did it. So you had a bunch of random servers for these websites that lost money that then went up cheap. So you had a bunch of innovation that happened because there was all this useful gear. You can already get cheap GPUs. You can't get Blackwell cheap, because they just got there, and they require all sorts of insane, distinct, bespoke cooling, I guess.
Starting point is 01:00:20 That's the latest. That's the latest one. But they're already coming out with Rubin now, so you bought pieces of shit for buying the Blackwell, you moron. It's so funny. And you can't — it's not like you buy one. You need like thousands. Of course. And so that you can begin losing money, I guess. But say the bubble burst tomorrow, where are these GPUs going? Where are you going to sell them to? I don't know. I hope a whale chokes on them. It's not my business. I really hope so too.
Starting point is 01:00:45 They're just going to sit there. And honestly, I expect most of them to sit with no power. I think they're just going to turn them the fuck off and leave them in a warehouse. I mean, the other thing with these examples — railroads are a perfect example. Contrast that with Sam Altman talking about this thing being like, it's going to do something at some point. With railroads, you can explain what it's going to do: we're going to be able to move
Starting point is 01:01:06 people or things very quickly across the country. The internet was also something that people were very excited about and could see the potential of. This is a thing that people, it feels like, are mostly hostile to AI. To be clear, there are use cases for it.
Starting point is 01:01:22 I want to talk about — well, first of all, real fast, I want to — we've got two quotes, one from Mark Zuckerberg and one from Larry Page of Google. Zuck said, if we end up misspending a couple of hundred billion dollars, whoops — I think that is going to be very unfortunate, obviously, but what I'd say is, I actually think the risk is higher on the other side. Larry Page says, I'm willing to go bankrupt rather than lose this race. Everyone is focused on ROI, Ed Zitron. But the people making the decisions are not. And like we said, this is so crazy. I love that shit. I mean, it's important to point out, like, it's not, because... Altman also said the thing
Starting point is 01:01:56 about, you know, being off by the hundreds of billions or whatever. We're talking about insane amounts of money. Everyone is suffering, and they're just like, yeah, I don't know, we might overspend a hundred billion dollars on this fucking bullshit. It's because these companies have so much money. They print money. But they don't have enough money to do this as well.
Starting point is 01:02:12 They're going to — their cash flow is slowing. Their cash flows are going to start catching up. Early days of the internet — so I did a big thing called How to Argue With an AI Booster. It was a great piece. It was so fun to do. So the early days of the internet were nothing like this. First of all, the money wasn't as big, but it just was not the same thing. The early days of the internet were — internet speeds were too slow. And also, the dot-com bust
Starting point is 01:02:33 was because they were too popular. Because, yeah, everyone's like, yeah, I want anything in 50 minutes for like a dollar — yeah, send it over, Webvan. Or I want several pallets' worth of cat food with free shipping — absolutely, send it over, Pets.com. The concept of buying stuff online made sense. It made too much sense. In fact, they made it nonsensical by making it too expensive. The fiber boom made sense in the sense that people want hyperconnectivity. It didn't make sense to build it literally everywhere, and the Telecommunications Act of '96 did that as well, but that's
Starting point is 01:03:03 a whole separate thing. The thing is, the early days of the internet you could look at and go, oh, this is going to become big. I mean, I used the internet when I was like 11. I was using CompuServe. You could look at it and go, fuck, if I could download a game quickly, that would be so sick. If I could talk to some— I was talking to people on Usenet, which was probably,
Starting point is 01:03:21 at that age, not great for an 11-year-old, but you could see it, you could see it and be like, this is the future — I knew it was the future then. You look at ChatGPT and you're like, what is— they can't even describe it. Go on OpenAI's website. They can't describe it. It's like, yeah, analyze data, I think, documents, like, ideation. Or they're like, we're building God. And everyone's like, what? I think it'll be — I think that it might kill us all. The best, the best use case for it, what I think and from what I can tell, is going to be: girlfriend. Well, duh, that's a given. But sadly, actually, yes — companions, we were already seeing that. I mean, that honestly seems like the— it isn't, no. But there's, there's that,
Starting point is 01:04:04 there's, uh, healthcare. And I'm talking like data, and then I'm talking about biotech. Sure. That's nothing to do with large language models. Exactly. But it's still, it is one use case then. Right. Well, isn't it though? I mean, that's AI describing another thing. Uh, large language models are what they're being used for, for the chatbots and whatnot. The GPUs are not doing the other thing. What is? What's powering those? Like,
Starting point is 01:04:28 all of the biotech research? Those are GPUs within certain servers, but you don't need Blackwell to do that — you just use A100s, which have been around for years, or H100s, which you can get for 30 grand now, which is not cheap, but also, if you're doing a bunch of data crunching... But you don't need the latest,
Starting point is 01:04:43 and indeed they were doing that before. And even then, even when they're not using GPUs, they could use consumer-grade GPUs. Like, it's so annoying. It doesn't require these super... It doesn't require a massive... server and it doesn't even use that much compute. It's like one-shot operations, which is not good business for them because they need people to be burning them all the time because it's the only
Starting point is 01:05:00 way they make the money back. And it's a deliberate conflation. They want to muddy the waters, because when you look at — I'm going to say something that's objectively true right now, and when you hear it, the world feels a little insane. OpenAI needs a trillion dollars so that they can have more users of ChatGPT. That is all that this is for. It's not for anything else. Sam Altman said recently — he's like, well, we have so many more — it's not... People hear this and they think it's just for GPT. It's actually for other things, which we can't think of right now. It's like, what the fuck are you doing? This is insane. A trillion — we just showed you. Google doesn't cost a trillion dollars. Amazon Web Services — the total CapEx for that, I think over 10 years, was maybe
Starting point is 01:05:43 65 billion. Like, that's what, like a quarter — a little over a quarter — of what Microsoft has spent in CapEx in the last three years. And what for? What for? What has any of this led to? But the OpenAI CapEx is only ChatGPT. It's the majority of their business. And if you look at the Information's leaked projected revenue numbers, they just have this big orange bit that says other revenue. It's just like — as for, uh, free user monetization, we're going to make five bucks. They just say, yeah, we're going to make five bucks off them. And they'll say, well, what about ads? The easiest way to prove that ads aren't working is they'd be in already. Also, every time you're doing a prompt, that gives them a chance to fuck up an ad, and that's the one thing advertisers really don't like. And the other thing is... Wait, sorry, explain that — every time you're doing... So every time you do a prompt, if they're going to do ad insertion, which is really the only thing they should do, it would require the large language model to make the call about the ad.
Starting point is 01:06:44 Oh, sure. Okay. And I don't know — you don't really want, like, a sex toy put in, like, a children's thing because someone's called the character Deldo or something. Like, it's the little things that can go wrong. Especially in the EU, your ass is grass. But another example is Perplexity. Perplexity had an ads chief, and they left, which is generally what you do when the company's doing well. And they made $20,000 on ads last year. $20,000. $20,000. This is the leading AI search company — $20,000.
Starting point is 01:07:16 But apparently, they leaked the story and they're like, oh, yeah, well, it's because we've been turning down advertisers. It's like, sure, that's it — I'm too popular. That's why I'm not making enough money. It's because of how much people like me and because of how I have to turn them away. So I take it — well, I know it, because I read your Financial Times interview, which just came out. Folks can check it out. But two things. Number one, they note that you've called for a digital EPA of sorts, some kind of governing body to — just — it is crazy-making that there's no oversight, seemingly, into this stuff, which I would personally attribute to just overall government, uh, ineptitude and, um, an inability to understand exactly what this stuff is. And growth capitalism. Neoliberalism. Yeah, it's governments, Democrat governments,
Starting point is 01:08:14 just saying, we need more, we always need more, innovation's always good. And when I said the digital EPA thing, I mean notifications. I mean phone experiences. I mean cancel buttons. Try and post a story on Instagram and hit cancel. It does not work. It's a non-functional product. I think there should be fiscal penalties for that. I don't think you should be able to launch broken stuff. I think there should be an affordance for bugs, but if you — I mean, ChatGPT as a product should not be legal. You should not be able to launch something that is by definition broken. On top of that, I think we should not allow companies to use notifications in the way they do. I don't think marketing should be allowed. I think you should be able to opt
Starting point is 01:08:49 into it, but it's opt-in only. I think notifications as a mechanism of abuse are horrifying. You can laugh at it. It sounds funny, but open up your phone and look at your notifications. They're insane. I used Etsy once. I used Etsy once! And I hear from Etsy more than my fucking family. Man, for me, it's Domino's Pizza. Yeah, and that's like, hey, fat ass, we're going to give you a 12-inch pizza for free. Oink, oink. You want some, you want some slop, you pig. Yeah. But it's — you look at these notifications... I'd buy that shit. But that's the thing. Yeah, I do the Uber Eats thing occasionally. Oh, God, I get those all the time too.
Starting point is 01:09:22 I don't roll around in my slop. But that is one thing. Things like, you can't change the product as often as you do. And you also — I think this is the biggest one — it needs to be illegal to change your product to direct a user to a new feature. That should not be legal. You should not be able — when Spotify redesigned their layout
Starting point is 01:09:41 to push people towards videos and podcasts — illegal. I'm deadly serious. Why? Because they wanted more clicks and to sell video ads. Right. That shouldn't be legal. You can have those features, but there should be a level of ethical responsibility to not fuck your product up. And perhaps there could be a body they can petition if they want to, but they have to. Oh, this sounds — this sounds like authoritarianism. No, authoritarianism is allowing unrestrained capital to fuck our lives up. Algorithms need to be public. I mean, I do think — I think
Starting point is 01:10:08 that is probably going to have a lot more, um, play with people now, after seeing — you know, everyone's always hated the algorithms and how they're designed to keep us looking and stuff like that. But in recent years, the way Twitter, for example, has just turned into a just absolute disaster — I mean, you have people kind of screaming, like, please moderate these systems. We're begging you. My big idea: corporate liability. We need to pierce the veil. If you're the CEO of a company over a certain market capitalization, you must face civil liability on failure. You fine Meta $4 billion, they scrape that shit out of the couch cushions. Oh, I mean, Amazon just paid $2.5 billion for the, uh, cancellation thing — it's a fee, not a fine.
Starting point is 01:10:56 Yeah. I think the most you'd be able to get is like 50 bucks, by the way. They were saying on the— I didn't even know they were giving the money back. Yeah. But that's the thing. All of this — make dark patterns illegal. Right. And people say, oh, well, that's restraining capital. This is what we get when we have unrestrained capital. We've never tried anything else. And if a drug is put onto the market and it starts killing people, they yank the drug. And they will — if your car explodes and someone dies, the company will be sued.
Starting point is 01:11:26 If you make your product worse as a tech company, you get jacked off by Wall Street. Everyone thinks you're a legend. The idea that Sundar Pichai gets celebrated is vile to me. McKinsey piece of shit. Is it because he's Indian? No. No, it's because he's a fucking MBA McKinsey prick. I know. I unfortunately don't think we have the administration that's going to...
Starting point is 01:11:47 Here's the thing, though. Just genuinely — I run a PR firm, I know a little bit about this. At that scale of richness, they don't care about money, truly. They could absorb the $2 billion. Take their names. Just start talking shit about them all the time. If you started genuinely just shit-talking all these people,
Starting point is 01:12:05 and you give people permission to do so — look at how we talk about sports players. At least they're fucking trying, for the most part. You do this to these fucking asses — like, what do you think Sundar Pichai does every day other than go to lunch? He's got a nice yacht. He probably...
Starting point is 01:12:19 He does have a yacht? It's a really big yacht, yeah. Fucking wow. What a boring purchase. I know, that's the thing. Their lives seem boring and inconsequential for all of this, uh,
Starting point is 01:12:30 uh, for this — for these, like, empires they've built. They seem generally unhappy and... Yeah, because they as an industry — and Ed,
Starting point is 01:12:38 I'm kind of cribbing something that you've essentially said — they've run out of ideas. And I really like what you said on one of your recent episodes of your podcast. I really liked it. I had to stop it because — I had to put it on, like, half speed to type it. You said that these LLMs, or ChatGPT, quote, resembles the death of the art of technology, inconsistent and unreliable by definition, inefficient by design,
Starting point is 01:13:06 financially ruinous, and adds to the cognitive load of the user by requiring them to be ever vigilant of the shit-ass outputs that come out of it. Really liked that. Banger. Yeah. It's good shit. That guy's smart.
Starting point is 01:13:21 No, I forget everything I write the moment I do it and the moment I speak, like it's gone. So it's like, I read it. I'm like, yeah. Well, wait, so also the other thing I was going to point out from the Financial Times interview is,
Starting point is 01:13:30 um, there's this — they referenced this, um — I don't know what you would call it, an article or website — AI 2027, which is, which is essentially a fan
Starting point is 01:13:41 fiction outlining the worst-case scenario with AI, which they do a really good job of presenting in such a way where you believe that it is an inevitability that superhuman AI will eventually lead to our destruction. And someone also made a very persuasive video about it. I don't know if you saw the video. It's really good. I'm blanking on the guy's name, but it's probably like a 25-minute video. Luke might know — yes, that.
Starting point is 01:14:08 And it's probably racked up millions of views right now, and it's probably been seen much more than this paper's been read. But the horrible thing is that, for someone who does not have as much conviction as you that these guys are liars and faking it — I don't know why. Even though, in the back of my head, I know this isn't real, this is probably propaganda to get me to buy into AI bullshit. Yeah, they're all former OpenAI people, I think. You watch it and you're just like, we're cooked.
Starting point is 01:14:41 Well, no, but here's the important thing. So you're not worried about this? No. That's good. That's reassuring. Do you mind making the font a little bit bigger? I'm getting old. I want to read one bit.
Starting point is 01:14:52 We wrote a scenario — fanfic — that represents our best guess — guess — about what that — non-specific — might look like. It's informed by trend extrapolations — guesses — war games — guesses — expert feedback — chat room — experience at OpenAI — that is apparently so good, that they've never built — and previous forecasting successes — guesses. This is fan fiction, and it is also nonsense. But it's written with the tenor and poise of something very serious, and it uses lots of big words that, kind of put together, make you think
Starting point is 01:15:25 this sounds really — it sounds really legitimate. Yeah. Oh, it's this guy. Oh, fuck off. Yeah, it's got close to six million views now. I mean, the way you're describing it, it reminds me — AI In Context, you fucking — wow. It reminds me of — I'm sure you're familiar with the phrase X-risk.
Starting point is 01:15:45 No. It's, it's like — you'll hear tech guys say it. You'll hear Elon Musk ask whoever, someone, like, whoever. It's like p(doom). Yes, exactly, p(doom) — X-risk and p(doom) are like the problem. If you talk in these terms, they should check your hard drive. I mean, it's... Wait, explain it for the...
Starting point is 01:16:05 It just means the likelihood of fucking civilization being annihilated by an AGI. But when they use p(doom) and X-risk, it's this term of art to them, and it makes it seem like it's this legitimate thing. It's deliberate. Yeah, where they've done calculations. It's p(doom), okay? It's X-risk. But really what it is — because people print it, like, Elon Musk puts his p(doom) at whatever.
Starting point is 01:16:31 His p(doom) is his dick doesn't work. Right. But so, but so, but— That one made me sick. But if you, but if you— Sorry, continue. If you get, like, actual video of them being asked about p(doom), they just kind of go, uh, 20%. And it's like, oh, wait, that's what p(doom) is.
Starting point is 01:16:50 And before I had seen that, I thought these guys, like, got in a fucking room and were doing calculations. Check every hard drive. I'm sorry, Lina Khan, you know better. 15%? Well, based on probably what little she knows and understands. It's just not based on anything. But, like, this is what I'm saying — like, this is a real thing — like, the co-founder of Twitch.
Starting point is 01:17:12 But it's really just 10 to 50%. Right. Right. These kinds of things where it's like, that's not a calculation. Wow, Jeff Epstein's, um, his p(doom) was, um — what's a funny number? I'm not doing the funny number for Jeff Epstein. Yeah, oh, his was 15. Eliezer Yudkowsky, uh, said this on the podcast recently: if you're a eugenicist, you should not look like an egg in a hat. Like, really, mate. Oh, man.
Starting point is 01:17:42 Oh, mate. He's more beard than man. Oh, he was on Lex Fridman. Of course he was. Imagine that. That must be a thrill. You are, you are... You said that the p(doom)—
Starting point is 01:18:00 But I just think that's a perfect example. They use these things and they make it seem, even just that, to the layman, like this is what they're doing. This is grift. This is just grift. I like "AI engineer." Just a guy. Who?
Starting point is 01:18:16 Estimated mean value — survey methodology may be flawed. Jesus Christ. They found so many guys. We got a guy. His data isn't great, but, you know, we needed a list. And we've also ordered this out of — like, it's random. I like the 10 to 90% one.
Starting point is 01:18:32 Yeah, it's between zero and 100%. That's Jan Leike. They've got — I mean, they've got people on here, and it's like, why? The CEO of LinkedIn — what do I care? Reid Hoffman? Yeah, 20%. He's a big guy.
Starting point is 01:18:47 He's a big, um, big mucky-muck. Five to 50% — fuck you. I'm sorry. I would pick like 1%, just— I like Yann LeCun. 0.01%. I want to be on here with one number to say, shut the fuck up. Like, yeah, it's like, shut the fuck— Fuck you.
Starting point is 01:19:06 But that's the thing. It's just — this is what happens when you don't have friends, when you don't have friends to shoot the shit with. Because any friend of mine, I'm like, p(doom), they'd be like, what are you?
Starting point is 01:19:17 Right. What do you? My therapist would be like, what are you? You doing okay? Like, I mean, I hear everything that happens to you. This is new though.
Starting point is 01:19:27 What does P stand for? What does it stand for? I actually really don't know. I think it's probability of doom. I like that they just very— But then they put it in parentheses so it looks like an equation. It's
Starting point is 01:19:39 p(doom), and you're like, whoa, what number did they cook up? But I think what it is, is there is a large problem in society — and it's something that happens to everyone — which is we are trained as children to believe authoritative-sounding things, because we as human beings are semiotics-driven: we understand the things around us and we say, this is what a smart person sounds like, so we can trust them. Which theoretically makes sense, except our society has been poisoned by management consultants. I wrote a piece, The Era of the Business Idiot. I stand by it. I think most people in power are stupid. I think that they learned how to sound smart. They did not get smart. None of them are engineers.
Starting point is 01:20:18 Mark Zuckerberg hasn't written a line of code since 2006, I believe. So all of these people have just learned things to say to other people who don't know shit, people that run hedge funds. And they know how to make money off of arbitrary stocks and numbers because they have all this data and they hit the thing and sometimes it works and sometimes it doesn't. But they don't really know anything about the underlying thing. So AI for them seems magical because they're stupid. This is, all of this is what you do when nothing's working? Because do you remember any of this with smartphones or apps or cloud compute? Did you have someone being, it's so bad, it's so good. Don't look at it. Don't use it. Don't use it. Don't expect much, but it's amazing. I'm scared, but it sucks. It's the most
Starting point is 01:21:01 powerful thing ever, but it's very sick. It could kill us all, but it's not ready yet. But it's bad. The Apple— the new AirPods: there's a probability that one day they may explode in your ear, so we're going to keep building them anyway. But it sounds good sometimes. Sometimes it plays the song you choose. Can we pull up the little Meta—
Starting point is 01:21:17 the Meta cooking demonstration? I think it's up there. Yeah, this is— oh, he's pulling that up. Yeah, Meta cooking demo. It's, uh — well, let's just play it, fuck it. Hey Meta, how do I scramble an egg? Don't worry, you got this.
Starting point is 01:21:31 Hey Meta, what's the plan? We were gonna scramble an egg, but the fork had other plans. A bowl— Hey Meta, start Live AI. Starting Live AI. I love the setup you have here with soy sauce and other ingredients. How can I help? Hey, can you help me make a Korean-inspired steak sauce for my steak sandwich here?
Starting point is 01:21:56 You can make a Korean-inspired steak sauce using soy sauce, sesame oil... What do I do first? Don't interrupt me. What do I do first? You've already combined the base ingredients, so now grate a pear to add to the sauce. What? What do I do first? This poor guy.
Starting point is 01:22:21 You've already combined the base ingredients, so now grate the pear and gently combine it with the base sauce. All right, I think the Wi-Fi might be messed up. Sorry, back to you, Mark. Nope, nope, nope. What I like is the several-second-long gap between answers. Like, even if this worked, it's like — what do I do? Well, I also like how it's like, you've got soy sauce and other ingredients. Yeah.
Starting point is 01:22:46 Like, nothing. Like, it's so cool that they think this is good. It's just the contempt they have. It's either they have contempt or they're morons. I mean, they're just like, what? It's great. I wonder if part of the reason why Apple — Apple seems to be the one holdout that's not really diving all in on AI. And I wonder if it's because they kind of know, and are waiting,
Starting point is 01:23:08 and don't quite fully believe in its, like, cash flow potential. Well, they got burned by overpromising with Siri. Right. They are overpromising that too. Well, I think that they're a good use case for the tech industry
Starting point is 01:23:24 just kind of — we've reached the pinnacle. This is the rot-com bubble. This is what I talk about. They're out of hypergrowth ideas. If there was any other idea, they would do it, because everyone hates this. Everyone I talk to at Microsoft is just like, I fucking hate working here. They have to work for Mustafa Suleyman, the co-founder of DeepMind, who is just an asshole — like, an abusive asshole, who apparently sends, like, insane emails occasionally. I can't get what I want from them.
Starting point is 01:23:47 I want to see this little fucking freak going, like, make the AI good. I don't know if that's how he sounds. And this other guy called Jay Parikh, who was the co-CEO of a company called Lacework, I think it was. They are most famous for giving away $30,000 worth of Lululemon gift cards in a night to, like, get clients in the door. Jesus Christ. They don't teach you that at business school.
Starting point is 01:24:10 It's really cool that that's who's running the AI part of Microsoft. And they're now putting Anthropic models in Microsoft 365. And I think they're doing that because OpenAI's models are so good — like, that's why you do that, because you want competition, question mark. But this is the thing. When you look even at the big companies, it's like — the ones where this should have worked out. And they're, like, treading on their dicks every day. They just don't.
Starting point is 01:24:32 Their people hate it. I'm hearing compelling reports that they're slowing down the internal VS Code — like, the coding models — they're, like, slowing them — like, the responses are taking longer.
Starting point is 01:24:43 This, this is dark. Like, it's grim. It's gonna be very funny, though, because I read somewhere — I reported this out — and there was $50 billion a quarter,
Starting point is 01:24:52 for three quarters straight, of private credit going into data centers. The people that are going to get hurt here — the market's going to be trashed. If you're a retail investor, this is going to hurt. However, there are going to be
Starting point is 01:25:01 some real pieces of shit who suffer. Like, Satya Nadella gets 86'd for this. I do worry that, yeah, we're all going to suffer. But here's the thing: this has to happen. And the longer it takes to happen, the worse it will be. Because data center capital expenditures accounted for more GDP growth in the first half of this year than all consumer spending. Which suggests that consumer spending is down, but also that we're fucked.
Starting point is 01:25:28 We're complete... there is no fixing this, because people love to come up with doomer answers. It's like, oh, the government will bail them out. You can't bail this out. It's not bailoutable. There's nothing to do with it. Well, you can't. Microsoft isn't dying. Nvidia isn't dying. OpenAI may die, but even if you prop up OpenAI, are you going to just spend one trillion dollars to make Nvidia happy? Because the markets may not respond to that well either. But also, Nvidia can afford this; they are in a really good cash position. Oracle? They can't afford this. But even then, even if the government bailed people out, it would only be for distressed debt for data centers. There's no, because
Starting point is 01:26:02 the bailout here would be an economic stimulus. They would need to effectively nationalize Nvidia, and only do it because the market is, like... I think Nvidia is 7 to 8% of the S&P 500. You can't bail that out. Nvidia is going to still be profitable even when this ends. There's still a fucking... how big are they? Four trillion? $4 trillion. God damn. I hate it. And based on nothing, based on GPUs where the utilization is low. There's not that much money. When you remove all the hyperscaler revenue, I did a story on this,
Starting point is 01:26:35 when you remove all the hyperscaler revenue, so Amazon, Meta, Google to an extent, and Microsoft's a big one for OpenAI, and you remove OpenAI, there's less than a billion dollars of AI compute revenue. That's, just to be clear, AI compute is when you hire the company to run GPUs on their servers.
Starting point is 01:26:50 So theoretically, the thing that everyone should be paying for: less than a billion of revenue. That's what they're building the fucking data centers to sell. That's the thing. This is why I'm kind of, when I talk about this,
Starting point is 01:27:01 I'm kind of freaked out, because I've steeled my will. I've not backed down ever. But it's like, you say this to people and they go, you know, it'll work out. Well, no, they've got tons of users. And it's like, they don't. And, like, well, look at ChatGPT. I'm like, I'm not fucking talking about ChatGPT.
Starting point is 01:27:18 Like, but look at ChatGPT. I'm like, I don't want to. It's also not what you're talking about. But then there's the other thing. I mentioned it earlier. So, CoreWeave, for example, they're a neocloud. Should we talk about neoclouds for a second? It's these companies where they
Starting point is 01:27:32 fill data centers full of GPUs. Incredible scam. So, not a scam as an idea. It's just data centers and GPUs. That's not a scam at all. Until you realize who funds them. And that would be Nvidia. Nvidia funded CoreWeave. They were also, and are still, CoreWeave's largest customer other than Microsoft. They also loan them money, sell them GPUs, preferential access to them. And then what CoreWeave does is CoreWeave raises debt using those GPUs and the contract from Nvidia to buy more GPUs. As their collateral. As the collateral. And the contract is also the collateral, the contract from Nvidia, the
Starting point is 01:28:08 company that they're buying them from, that's also their customer. Same thing with Lambda. Lambda is another neocloud doing the same thing. CoreWeave, Nvidia propped up their IPO. They anchored it. Michael Intrator said, I think on Bloomberg, that without Nvidia, CoreWeave wouldn't have gone public.
Starting point is 01:28:25 And these... I think CoreWeave has $25 billion of debt. They're going to make, no, sorry, they're going to have $5 billion of revenue this year, and they lost $300 million last quarter, like a loss, and they have no plan
Starting point is 01:28:38 for profitability. I don't know. This doesn't seem good. And then there's the whole thing with Supermicro and Dell. Supermicro. They have these things called resellers, Supermicro and Dell being the big ones.
Starting point is 01:28:49 And Dell does a bunch of other stuff. But if you're buying a server for AI, you're probably buying from these two, or you're buying direct from Nvidia. And what they do is they buy GPUs, they put them in servers, and they sell them to other people. They make a shit ton.
Starting point is 01:29:01 They are, I think, 39% of Nvidia's last quarterly revenue. And, yeah, that's also not good, because every time I talk about this, it gets me a little... CoreWeave is also a customer of Dell and Supermicro. Supermicro invested in Lambda. And they also sublease them compute. It's just...
Starting point is 01:29:22 You're all sucking and fucking each other. Yeah, it really is just like poly shit again. We've let polycules control our entire economy, management consultants and poly people. Everything's fucked because of it. It would be so nice if, instead of... it would be so nice if a couple of years ago... He said knife. And shut off.
Starting point is 01:29:41 Shut up. I'm hallucinating. He said it would be knife. It would be great if the government had... Did you say grape, if the government? It would be great if, a few years ago, all these tech companies flush with cash and all these smart, bored people who have run out of innovative ideas... That's why we've got the iPhone 17 Air,
Starting point is 01:30:02 for fucking Christ's sake. What's that? I like, I like, yeah. That's fine. That's great. But it's a phone. But it's just like, there's nothing. What else? The fucking goggles.
Starting point is 01:30:11 But it's not a major upgrade from my iPhone 15. We've run out of shit. We've run out of shit, like you've said. And that's led to the enshittification of everything. It would be great if the government had said, hey, you guys got all this money, why don't you start putting all these brains to work to help society? In what? Just go buck wild. Instead of Mark Zuckerberg being like, well, they spend $100 billion
Starting point is 01:30:34 on this thing. It'd be great if he was like, we're going to spend $100 billion to figure out, you know, public fucking infrastructure. And for that, the government can give them all kinds of tax breaks and incentives and whatever. It would just be so nice. When Elon said he could,
Starting point is 01:30:49 if they gave him a plan for world hunger, he would provide the money, and they were like, great, here you go, and he was like, I'm busy. This plan, this is not a based plan. I mean, like... It would be nice if we were putting these resources in...
Starting point is 01:31:04 The thing is, these companies aren't innovators anymore. No. That's the other thing. Like, they're run by management consultants. They are teeming with product managers who don't do work. You've got a bunch of engineers who are just working on random things. They're poorly organized. They're too big. They should break them up just to make them focus more. But
Starting point is 01:31:20 the talent is not running it. These are not companies. It's the rot economy. It's growth at all costs. They are built and rewarded for growth. They're not built and rewarded for innovation. They're not built and rewarded for anything other than growth. That's why they did this, because they're like, wow, we have actually got a new growth narrative. And when you say it out loud, it's insane, but this was sold, from pretty much the beginning, as both the panacea of software growth, because software sales have been slowing. It's amazing how much
Starting point is 01:31:48 you can learn when you read. Software sales have been slowing, but also, we're going to make the biggest consumer thing ever, and this is going to unlock all the value of new hardware. Bingo bongo, this is the thing. And the really funny thing is, as well, how many, like, consumer devices can't guarantee internet connections, because you move around with a device. But putting that aside, constantly querying a website or a bank of GPUs is not particularly efficient, and quite costly on top of that. Also, it fucking sucks. It just sucks. It's sucked from the beginning. It sucks now. People trying to half-heartedly tell me this is the future. It's boring. Oh, we use it, like, for search. Go fuck yourself. Go use Google.com. Even that's fucking full of it.
Starting point is 01:32:28 That's the problem. Google ruined its own search by first making it an inefficient thing due to its ad business, and now it's joined the trend with AI. And it's an unusable thing. It's hallucinating on its own. And it prioritizes AI over search results. Yeah. And they're fucking publishers as well, so there's less traffic to them. And also, I mean... Google search may seem to suck to some users due to an increase in advertisers and AI-generated content, but don't worry, you got this. I just fucking hate that. I don't know why it talks like that. It's because it's the median of all language.
Starting point is 01:33:11 If you are... we need to think of how everyone would want to be talked to, I guess. It works like... it hits real hard if you're stupid. When someone's like, great job. Yeah. But the clouds had other plans. It's... God.
Starting point is 01:33:23 It talks like a fucking, I don't know, a fucking 35-year-old Reddit user who works in fucking marketing. Talks like a 39-year-old guy who works in, like, college theater. Yeah, like, yeah, it really is like, great job. No, I'd love to support you. And I mean, I would love to mentor you. It's like everybody's forgotten collectively in this industry just how gung-ho everybody was about blockchain.
Starting point is 01:33:49 We're going to incorporate blockchain technology. Insert AI for blockchain, and it's the same fucking thing. It's so much worse, though. And then the metaverse. Fucking Mark Zuckerberg believed in the metaverse so much that they changed
Starting point is 01:34:02 the name of the company to Meta, and look what we got to show for that. Fucking nothing. No one is dunking on him enough. I realize that people dunk on him, but you changed your name to Meta, bro. Well, he's cool now.
Starting point is 01:34:14 He got a gold chain, and he's wearing... The media was just like, wow, this is amazing. He dresses like Kevin Federline. And he loves his wife a lot. Does he? Yeah. Oh yeah, he got her, like a... I love fucking...
Starting point is 01:34:26 He loves his wife. And someone had to do something about the feminizing of the workplace, so we owe a great thanks to Trump. Yeah, look how normal he looks. I like the photo of him in Hawaii
Starting point is 01:34:39 where he looks like a clown without his makeup on. Fucking quit, Mark Zuckerberg. We're not going to... If I had that much money, I would still post as much, if not more, and I'd blog just as much. But I would also not
Starting point is 01:34:51 do any of this dumb bullshit. Like, it's... I'd be building libraries. I'd be building... I'd be funding the arts and shit. Look at his juiced-up ass, though. You do have to give him that.
Starting point is 01:35:01 He's got white face on. Well, no one's going to be laughing, though, when he does get everyone an AI friend, and that is how people... I fucking... God fucking damn it. My...
Starting point is 01:35:12 He might be the one who's, like, best poised to navigate this as well. He's got this... he's got the majority, he can't be fired, so he will drive this company into the ground.
Starting point is 01:35:21 He will kill Facebook one day. I guarantee you, Facebook will die. Facebook's been dead. No, but it will die in our lifetimes. I'm serious. It's such a poorly run thing. There's a story from Ryan Barwick from last year in Marketing Brew, where it was like, a bunch of advertisers were saying, like, yeah, we told it to spend $1,000 in a week.
Starting point is 01:35:35 It spent it in a day. And Meta was just like, oh, oops. Well, that is a good... I hate asking people to make predictions, but I am just curious if you have any kind of idea when this... because you seem convinced it is going to happen, but do you have any idea when? Yeah, what's your timeline? The problem is, it could be in a year and a half. It could be in two weeks.
Starting point is 01:36:01 There is a... if a story comes out that's just like, OpenAI's economics are, like, the internal story is so much worse. Or Anthropic, or it could be one of the smaller, like, wrappers there. It could be disarray at the company. It's, they're burning 500 million for every 20 million they make, like some insane number that just rattles everyone. But quite honestly, it could be anything, because the vibes have shifted. Yeah. I mean, it was probably a couple years ago that, you know, we were quoting maybe some of your blogs, and then one Goldman analyst. I forget... Jim Covello. Yes. Who was like, I just don't fucking see the economics here. And now it just seems like a bunch of people are going, what the hell is the plan here? Oh, and here's the thing. When the "how did no one see this coming?" thing comes out, I'm going to read and comment on every one. I'm going to fucking... How did we not see this coming? Did you fucking look? Because it's been obvious for a while.
Starting point is 01:36:56 It's been obvious since the beginning of 2024, at least. I mean, Gary fucking Marcus, for all his faults, he was right about this in 2022. There are inherent limits to large language models. It's been there from the beginning. But on top of that, no one has ever asked if it was making money. They really don't want to. They have all of these weird terms. They're like, oh yeah, Adobe, I think, does "AI-influenced revenue." I think the SEC should torture you if you use language like that. I think Gensler's gone, but you should go in with a little shock. Because you shouldn't be... they should...
Starting point is 01:37:25 The SEC should ban weasel words. Like, that would actually change so much. If they could no longer write like that, then that would change things. Because "AI-influenced"? Fuck you. Did you talk to ChatGPT? It means that there's an AI button somewhere on the product that you sell a lot of already. It's what ServiceNow is doing. Bill McDermott's hilarious.
Starting point is 01:37:45 Apparently the whole reason that they do AI is because he was in a ballroom: everything must have AI in it, AI, AI, AI. It's just fucking... I was describing it earlier as, like, Dunce Gundam. These people are incompetent. They don't know anything. They just have people around them, and Adam Becker made this point on the podcast: they have people saying yes to them all the time. So they get a large language model and they're like, fuck yeah, exactly. Thank you. Yeah, I do got this. I mean, we've gotten tons of comments from viewers and listeners who have commented the exact same thing, being like, I work at X company, I work at a healthcare company, I work at a whatever company, where our
Starting point is 01:38:18 CEO or someone has been convinced that we just need to, like, AI everything, and they're driving the company into the ground. But what I think it is, is actually quite simple, which is they don't understand labor. They think the hairdresser cuts hair, versus: they speak to you, they learn your hairstyle, they learn what you like, they iterate over time. They think a doctor diagnoses. They think that a doctor diagnoses and fixes, versus the body of work that grows over time and their experience over time, which can be better or worse, but ultimately leads to the diagnosis. They think that a chef just takes the things... that a chef is just a series of hands and objects mashed together, and that anyone could do it
Starting point is 01:38:56 if they were just told by Meta's glasses, with a three-second interval between answers, exactly what to do with the stuff in front of them. That everything is just a question of you not knowing yet, versus any kind of experience or skill. And it's because these people, these executives, these business idiots, do not work. They don't do real work. They go from and to lunch perpetually. They read some emails, and otherwise they are rich and they go, I'm going to make a big strategic call right now.
Starting point is 01:39:21 And it means making someone else do anything. So these... and I realize it's kind of a simple point, like, oh, executives don't work. But look at what they're doing. Look at what they're saying. The fact that they don't use their products is so obvious, and the contempt they have for their users is incredible. And if it's not contempt, it's just base ignorance. Yeah, I don't think that's, like, an oversimplification.
Starting point is 01:39:40 I think that's spot on. And I think, like, not just labor. I think they don't value labor, and I think they also don't value creativity in these ways. I think, you know, a chef is a perfect example. I think these are people who don't know how to do things, and for them, everything is at their fingertips because they have so much money, and they've built these tools that they now think they can do things with very easily.
Starting point is 01:40:05 That's a good point as well, because everything's just acquired with money. It's just buying a service rather than, you know, any kind of creativity. They think a song is just a collection of notes that you mash together. And they think, you know, if someone can write a novel, well, they're no better than them, and now they can just type into a thing, like, hey, spit out words at me. And they're like, look, I had an idea for a thing, and it gave it to me.
Starting point is 01:40:27 What if it was a train made out of ice cream? That would be good. Well, I mean, coding is the classic one. Even the media is falling for this. Like, yeah, it's replacing coders. Well, first of all, that's not a job. Fucking software engineers don't just write code. It's about... I think Carl Brown
Starting point is 01:40:42 of Internet of Bugs put this really well to me. He said it makes the easy things easier and the hard things harder. Because, yeah, if you need one little thing that it does, great. But when you extrapolate further, it fucks up constantly. And vibe coding is also a lie. It's just a joke, because, yeah, if you're a fucking MBA, you're like, yeah, well, they do, they write fucking code. What's a writer?
Starting point is 01:41:01 Do they just write words? You just convey meaning through words. What, you just put the words down? I could have ChatGPT do that. Why is there no ChatGPT news outlet, then? One significant outlet. Where is it? There's a great article called "Where's the Shovelware?" where this guy said that if vibe coding was such a thing, if AI coding models were so powerful, where is all the bullshit software? Where is this massive surge in new software, if it makes building software easier? And it isn't there. In fact, it doesn't even look like repos have increased. It's supposedly made it easy for ordinary people to just start coding out of thin air. So why isn't there... I just, I just...
Starting point is 01:41:45 Why isn't there a New York Times best-selling ChatGPT book? Why is there not a meaningful media outlet that it has actually replaced? I'm not saying it should happen. I'm saying, based on what these fucknuts are saying, it should have happened already. And it isn't happening anywhere, because people can tell, and even when they can't, it still sucks. It's like that AI band that had a brief moment where... I don't even remember that. Like the eternal sunshine.
Starting point is 01:42:04 And every time that happens, they're like, look, look! It's just... true. The thing is true. I have an anecdotal example, which is how you make rules. That's how you prove everything, with one thing. And it's remarkable, because when this all collapses... to answer the question I kind of fumbled earlier, it could be quicker, it could take a while, but the money is going to run out. At some point, the money is going to run out. Also, OpenAI needs to convert to a fucking for-profit by the end of the year, or they don't get $20 billion from SoftBank. If that doesn't happen, that's bad. They can get the $20 billion from someone else. They'll
Starting point is 01:42:37 find it. Venture capital is being real stupid right now. Regardless, if they can't convert to a for-profit, they become a Ponzi scheme. Well, and then even if they raise that, what VC is going to want to invest in OpenAI at, what, a $300 billion valuation? No, no, no, really. These... $500 billion now. They're dumb fucks. They're currently doing a $10.3 billion insider share sale. That's what you do when you think your company's going to go public. Yeah, definitely. And $30 million apiece is the limit, I hear. And that's the thing. Why are you selling $10.3 billion of insider shares right now? You need that money to run your fucking company. You just promised Larry Ellison $300 billion. Also, that's a funny thing that could happen. I don't think... because there's a lot of doom out there.
Starting point is 01:43:18 I realize that. But there's something very funny that could happen. Oracle is mortgaging everything on this. They're doing massive amounts of capex, doing a giant bond sale. I've read $14 billion. I've read $80 billion. And at this point, what do numbers mean? Safra Catz, by the way, their CEO, stepped down and let other people take over. I don't know, why are you doing that at your moment of victory? I'm like, yeah, I don't want to watch this, because it's going to be so good. I'm going to be chairman of something. I'm going to go home and be rich without watching all this. But Oracle is mortgaging their future. And you've got this TikTok acquisition. And Oracle is putting all this money into data centers. There's a very good chance that it just collapses. Also, they're going to ruin TikTok. They're
Starting point is 01:44:00 going to fucking ruin TikTok. So you've got these two things where, and TikTok loses billions of dollars, the other thing that people don't think about. And people think that can't be possible. Anything's possible in the US tech industry. Yeah, how did China do it? Because they have, like, a government monopoly, because they just... They are very good at creating these massive growth things using insane algorithms. I just don't think Western nations realize how powerful, like, engineer culture is in, like, Asia and India. Like, they are fucking trained. Like, they learn better.
Starting point is 01:44:33 They're more studious. Probably some bad things about it, too. Well, I mean, they invest in education in ways that we don't. And that's woke. That's woke.
Starting point is 01:44:41 That's just woke. Woke gone wild. Oh, I'm gonna learn to code. My LLM will fucking do it. But the thing is, you've got all of this money being tied up with Oracle for data centers,
Starting point is 01:44:51 for one customer that loses billions of dollars. The only real... OpenAI's projections are that by 2029 they will make $145 billion a year. In revenues. In revenues.
Starting point is 01:45:03 Still losing money, like, billions of dollars. But they'll do that. And that's also a year where I think they're going to have to pay Oracle over $100 billion. I love maths. And that is also more revenue than Nvidia is going to make this year. It's more than TSMC, which is effectively the most essential company in the world. Yeah, so that's going to happen.
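The arithmetic Ed is gesturing at here can be sanity-checked with a quick back-of-envelope script. All figures are the speakers' own paraphrases from this conversation (a claimed $145 billion of 2029 OpenAI revenue, a roughly $100 billion Oracle obligation that year), not audited numbers; this is only a sketch of the shape of the math, not a financial model.

```python
# Back-of-envelope sketch of the figures quoted in this episode.
# All numbers are in billions of USD and are the speakers' paraphrases,
# not audited financials.

projected_revenue_2029 = 145   # OpenAI's claimed 2029 revenue
oracle_payment_2029 = 100      # rough Oracle obligation quoted for that year

# Even if the revenue target is hit, the Oracle bill alone consumes
# most of it before any other cost (compute, staff, etc.) is paid.
left_after_oracle = projected_revenue_2029 - oracle_payment_2029
print(f"Left after Oracle payment: ${left_after_oracle}B")  # $45B

# Funding need quoted later in the conversation: roughly $1 trillion
# if the revenue targets are hit, about $1.3 trillion if they are not.
funding_if_hit = 1000
funding_if_missed = 1300
print(f"Extra funding needed on a miss: ${funding_if_missed - funding_if_hit}B")  # $300B
```

The point of the sketch is just that, on the figures as stated, the single largest quoted obligation already absorbs roughly two-thirds of the projected revenue before any other expense.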
Starting point is 01:45:25 And then the next year they're going to make $200 billion, which is more than Meta makes. I... I swear, do we not have business and tech media? You just repeat this stuff like it's normal. And by the way, the trillion-dollar number I gave you, that's if they hit those revenue targets. If they don't, they're going to need about $1.3 trillion, maybe more. This is a company where most media outlets run it and go, yep,
Starting point is 01:45:47 this is what spiked the market. People just going, mm-hmm, sounds good to me. Yeah. This is why there will be a bubble collapse, because we have let the markets get controlled by growth, and get controlled by fucking morons, by business-idiot dipshits who don't actually know about anything. So they believe the last smart-sounding person they talk to,
Starting point is 01:46:04 who may be another idiot. And they just all jerk each other off and spend bazillions of dollars on everything. And they think Sam Altman's smart. That's the crazy thing. You hear Sam Altman talk and you're like, you might be stupid. You might actually.
Starting point is 01:46:19 Satya Nadella, same with you. You're like... And what's funny with Satya at Microsoft, I've had people email me and say, no, behind the scenes, he's really smart. I'm like, yeah, man. Yeah, man, he's a secret genius. It's just that he says and does stupid shit all the time publicly. Satya Nadella claimed in a Bloomberg article that he has Copilot read him documents, like, it turns them into a podcast.
Starting point is 01:46:43 Jesus, dude. Fucking read the document. Listen to a real podcast. Yeah. Like, I... It's great. It turns it into a TV show for me. It's like, are you so fucking stupid that you can't read? Also, you're a fucking...
Starting point is 01:46:56 fucking... He makes, like, 80 to 90 million a year. You could hire a guy. A guy whose only job is to sit in the car with you and just go, yeah, that's what this means. I mean, Matt Hughes, my editor, he's a fucking legend. Sometimes I'll be like, hey man, can you just write me one page on this? In stupid, stupid man language. And he, I, me done, learn. And it's great. And he gets compensated for his hard work. And I learn from someone who has done a shit ton of research, which I then go and read all of, because I'm not a moron. And it's like...
Starting point is 01:47:30 It's one step up, actually. It's not just that they don't respect labor. They don't respect knowledge. They don't respect ability. They just want everything, but they don't even know what everything means. They don't know what they want, other than number go up. Number gonna go down. Number go down big time. Yeah, but number go up good. Number go up good.
Starting point is 01:47:45 Make me feel good. Me like number go up. But it really is just great, because it's very obvious that they're thinking one week in and one week out. It's like, yeah, $100 billion. Sure. Sounds good to me.
Starting point is 01:47:58 It's the failing of the business press, and the tech press too, but also a failing of the markets. The markets should be rejecting this like a poison. Instead they're just like, glug, glug. I mean, Oracle popped 30% on that backlog bullshit headline.
Starting point is 01:48:09 Well, speaking of number go up, we've maxed out our clock. Oh boy. And we could keep going, theoretically, I'm afraid. But I think now's a... Yeah.
Starting point is 01:48:18 There's probably a good question up there about Palantir, and I think that's worth talking about. So just to be clear, Palantir added, in 2020, to one of their public filings, that they could not promise AI would provide any revenue of any kind.
Starting point is 01:48:31 Wow. Any kind of profit. Yeah, it's crazy what they write in there. The AI that evil Salesforce is doing, because that's what Palantir is, is the same... It's not GPU-based.
Starting point is 01:48:42 It's just fucking... Because they've been around for 20 years. Yeah, and they're just doing the... They're just evil Salesforce. They're just like, here's a big fucking pile of data we sell. And even then, when you try and understand what Palantir does,
Starting point is 01:48:53 people go, I don't fucking know. They're a fucking giant CRM company. They're a database company. They are that, but for, like, armies and police. It sucks. Alex Karp's crazy. He's the craziest guy in the world. My favorite thing, though, is I can't wait to see Salesforce get their ass beat on this. Fucking Marc Benioff. Every year for, like, 10 years, he'd be like, Einstein, Einstein... Well, every day. It's like, Einstein's the best thing ever, or it's Dreamforce, or Agent... sorry, Agentforce, fucking hell. He's like, Agentforce is going to be the biggest thing ever. There was a story in
Starting point is 01:49:25 The Information in March that said they're expecting no revenue growth from AI this year. Like, this feels like an SEC... Like, it feels like the SEC's function should be that. They can't go out saying this is the biggest thing ever and then... Not in Trump's administration. It wouldn't have been in the... If Kamala Harris had won, it wouldn't have happened either. Yeah.
Starting point is 01:49:42 We need an SEC that actually, I don't know, enforces things. That's what the E... Oh, wait, no, that's Exchange. I have no way... I was going to say, I'm so glad you took that one, because I was about to fuck that one up too. No, it's cool. I was going to say it too. Yeah, and I think the thing to focus on is, it's going to be horrible. It's, like, the, Darren Reveld, it's the, I feel bad for this nation, but it's
Starting point is 01:50:05 tremendous content. Because it really is going to be, like, the embarrassment when these people have to eat shit, when all of these companies have to start pulling AI out of stuff. Ooh, I'm going to be watching that with a glint in my eye. It's going to be that photo you always tweet of your... Oh yeah, the smiling man. Yeah, yeah. I can't even do the face anymore. I was at a Raiders game, and I'm much fatter, so I can't really do it anymore. But it is going to be like that again and again. They will never... But we're going to get to see these companies
Starting point is 01:50:35 roll these things back. And their users are going to love it. It's going to be like fucking Coruscant in Return of the Jedi. People are going to be cheering, because no one wants this. If I meet someone who's like, I love Copilot, I want to study their brain. I want to look, I want to take a look. I want to be like, how'd you grow up?
Starting point is 01:50:52 Oh, homeschooled, huh? I mean, there is a world where, you know, these things... because I've seen people use, like, Meta AI glasses in a way that helps them if they have a vision impairment or something. And I'm like, wow, that is very... And that's useful. And, like, I want to live in that world where they're trying to solve those problems. That's not making number go up. Oh, no.
Starting point is 01:51:15 Of course not. Number go down. But instead, they're just like, hey, remember when you used to think? Now you won't have to do that. What if you had a persistent idiot that you could talk to about stuff, and sometimes it knew things? I could just go... I could go on social media. I could ask a question online.
Starting point is 01:51:28 Yeah, I mean, honestly. But it's not going to tell you that you got this. That's kind of a good way to think about it, is that, like, I feel like they used to provide things that would help you do things
Starting point is 01:51:39 that you couldn't do. But now they're like, hey, the things that you can do, why don't we just do them for you now? And we can't do them. Yeah. We also can't do it. But sometimes we sort of can.
Starting point is 01:51:50 Yeah. And this is why we need one trillion dollars. Every time I fucking say it, I'm like, did I miss something? That's why I contacted Gil. Because I was like, okay, I added these numbers together. That seems bad. He's probably going to tell me I'm missing something. And he's like, yeah, no.
Starting point is 01:52:06 Yeah, I don't know. There's not enough money, I guess. Anyway, wee. Wee. Well, I mean, they're probably short CoreWeave, so they're probably ready for the apocalypse themselves. God. Well, folks, what do you think?
Starting point is 01:52:20 Let us know in the comments where you stand or sit. I can take comfort in the fact that there's enough money to keep it going for now. For a little bit longer. Ed, thank you so much for joining us. Where can people find you? We're going to put the links in the description, obviously, but... Betteroffline.com. Pretty much everything's on that.
Starting point is 01:52:38 Better Offline. Bluesky. I'm kind of still on X and everything, but I don't really... I fucking hate it. Every time I go on there, I feel like taking a shower. Was it your South Africa? I know. If only his voice was that funny. Oh, I know.
I would love it if he had a true, tried and true. I can't wait to, uh, bank on base.app, the new banking platform. Oh yeah, there's your, uh, there's your logo, that fucking skull. God damn, it's so fucking sick. It's very cool. We have merch. It's so good. Nice. Hell yeah. Head to betteroffline.com to find more from Ed, everybody, and benandemilshow.com. We'll see you in the bonus. Coming up on this week's episode of benandemilshow.com: When I explain it to you in detail, show it to you on paper, when I'm talking about it, you'll say, God damn, Uncle Jeff.
Starting point is 01:53:25 Where did you get that idea? I've had it for a long time. And people say, ah, nah, nah, but you know what? They don't know what they're talking about. My idea is great, and it's better than the one that they show on TV. That one, it's so-so. He's got funny business ideas. Yeah, just like... like he thinks you can just sell... like, you just go,
Starting point is 01:53:52 and be like, you want Wendy's Wednesdays? That's a million bucks. Do you have a copyright or anything? No. But it's my idea. You can't steal it. I'll sue you. Walk into Wendy's headquarters and be like,
Starting point is 01:54:04 where's my money? Those guys are always saying stuff like that. Like, I would kill literally every person in the world if one of my family members could survive. But then someone asked, would you suck off 100 monkeys to save all of humanity? It's a good question. It's a great question. And would he?
Starting point is 01:54:22 I don't, I don't think... of course he didn't. He didn't. Coward. But the right, the right wing is afraid to debate the left wing. We were talking... Matt Walsh, would you suck off a thousand monkeys? Honestly, the real star of the show? Benicio. Unbelievable.
Starting point is 01:54:35 Benicio del Toro. I mean, I've got a, I've got a little Harriet, a little Latino Harriet Tubman situation. And just, don't be selfish, Bob. Just unbelievable. And the way he hands him the... just everything about it, yeah, he's so good. Really beautiful. It's got to be so fun to be Benicio del Toro. He looks like he's just having a blast in life. He does look
Starting point is 01:54:56 like he's having a blast. He seems like a very cool guy. I don't know much about him. No, I just... whenever you do that, there's always going to be some commenter who's like, actually, he yelled at someone once. In 1996, he was mean to a PA. Yeah.
