The Ben and Emil Show - BAES 147: Can Sam Altman be Trusted?

Episode Date: April 9, 2026

Ronan Farrow at The New Yorker just put out an incredibly long and thorough investigation into Sam Altman: his past companies, dealings, and lies. There's testimony from previous business partners (including the founders of Anthropic). PLUS, we get into the economics of the AI buildout and whether everyone is collectively jumping the gun, and touch a bit on OpenAI's conveniently timed press release about safety.

NEW MERCH OUT! Get 10% off when you sign up and also get bonus content, ad-free versions and more plus your first 7 days free at https://benandemilshow.com

***THE SOUTHWEST COMPANION PASS IS BACK GET IT HERE: https://www.cardratings.com/bestcards/featured-credit-cards?src=691608&shnq=520080,4028088,4048122,4028085,3006151,4048149,4028089,4048084&var2=

The newest acid video is out now so check it out! https://youtu.be/7vkFY3f5kkw

Give this video a thumbs up if you enjoyed it! And please leave us a comment! It helps us!

***Ben's new movies and tv podcast with Dillon is OUT NOW! GO WATCH the latest episode on our TOP MOVIES OF 2025: https://youtu.be/tbC-cMqcby8?si=tO0NK0PmpN2187ir

**CHECK OUT EMIL'S LIVESTREAMS HERE: https://www.youtube.com/emilderosa

SOME OTHER VIDEOS YOU MAY ENJOY:
That's Cringe of Cody Ko: https://youtu.be/dTbEk0pVh2w
Our AUSTIN VIDEO: https://youtu.be/yGSs56bFzRU
Our episode with Kyla Scanlon: https://youtu.be/cIHWkY35cuc
Big Tech is out of ideas (ft. ED ZITRON): https://youtu.be/zBvVGHZBpMw
Arguing with a millionaire (ft. Chris Camillo): https://youtu.be/1ZUWTkWV_MM
We bought suits HERE: https://youtu.be/_cM1XqA9n2U

***LINK TO OUR DISCORD: https://discord.gg/CjujBt8g
***Subscribe to Emil's Substack: https://substack.com/@emilderosa
***Trade with Ben at https://tradertreehouse.com

QUO: Try QUO for free and get 20% off your first 6 months at https://www.quo.com/BAES
RIDGE: Upgrade your wallet today! Get 10% Off @Ridge with code BAES at https://www.Ridge.com/BAES #Ridgepod

Follow us on instagram! @benandemilshow @bencahn @emilderosa

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 This episode is brought to you by Telus Online Security. Oh, tax season is the worst. You mean hack season? Sorry, what? Yeah, cybercriminals love tax forms. But I've got Telus Online Security. It helps protect against identity theft and financial fraud so I can stress less during tax season, or any season.
Starting point is 00:00:20 Plans start at just $12 a month. Learn more at telus.com slash online security. No one can prevent all cybercrime or identity theft. Conditions apply. This week we're talking all about Sam Altman again. This massive 18-month investigation done by, what's his name? Ronan Farrow. Ronan Farrow.
Starting point is 00:00:39 And another guy who no one's going to remember. Yeah, nobody cares about that guy. I don't understand where this charisma is. Because I sure haven't seen him. But here's another tech executive. He's unbelievably persuasive. Like Jedi mind tricks. He's just next level.
Starting point is 00:00:54 Shut up. Not even a technical guy. I thought that he was a coder whiz. The inventor of large language models or something. He's the inventor of Loopt? There's been a lot of criticisms about this and some asshole. What about this article? You can't be gay and write an article now?
Starting point is 00:01:09 No, someone came to Sam Altman's defense and said, Ronan Farrow is just a jealous gay guy. That's how gay guys are. They're catty and they lie about each other if they're jealous of their success. Also, why would all these people lie? Because they're gay. I, that's what they do. And remember, kids, take your, what's it called?
Starting point is 00:01:59 Iodine. Take your iodine pills. No, don't joke. That's, don't. So, trust me, by the time this comes out, nothing will have happened. Do I look worried? You think nothing's going to happen? I don't think anything's going to happen. By the time this comes out, he will have done some posturing, bombing. I don't mean to make it sound.
Starting point is 00:02:21 Oh, sure. Like nothing. I'm just saying, I think, I think that I predict that by the time this comes out, he'll have either done nothing or will have done a little bit so that he can say that he did. Bomb some more bridges and stuff. And then be like, they stopped me right before I was going to really fuck it up. Who stopped them? I don't know. The, the, the, Pakistan. Iran doesn't seem interested in a deal. I don't know. Obviously, I hope that's what happens. I hope he gets to keep tweeting from the White House. I did it. I won again. I won this war for the, ninth time and it's all over.
Starting point is 00:02:55 I just wish something would happen to him. We all do. Something, something permanent. If you're, if you're wondering what we're talking about, you're in the future. We are in the past.
Starting point is 00:03:06 Yeah. Which is, the past is current, is, hello up there. Is Tuesday at 2 p.m. Back when things were. Pacific time.
Starting point is 00:03:14 Yeah. Specific time. Waiting for Trump's deadline, which has gotten weirder and weirder with today
Starting point is 00:03:25 probably being the most insane threatening basically genocide. He said a whole civilization will die tonight, never to be brought back again.
Starting point is 00:03:34 I don't want that to happen, but it probably will. However, now that we have complete and total regime change, where different, smarter, and less radicalized minds
Starting point is 00:03:42 prevail, maybe something revolutionary wonderful can happen. Who knows? We will find out tonight one of the most important moments
Starting point is 00:03:49 in the long, complex history of the world. 47 years of extortion, corruption, and death will finally end. God bless the great people of Iran. And we're not here to talk about that today. No, we're not. But I do wish Melania would take a pillow and do something. Donald, come here. I have something to show you.
Starting point is 00:04:06 This pillow smells funny. And I want you to lay down. And I'll show you. Relax. I'm trying a new smell. And I put it on the pillow. And he goes, oh, wow. It smells real good. What do you think she would make it smell like? McDonald's? I don't know.
Starting point is 00:04:22 Who knows? Her taint area? I'm sure he loves that smell. Wow, Melania, you really managed to get your taint smell all over this pillow. It smells just whop. And then she, Donald, I'm sorry, I have to do this. I have to be best. Anyway.
Starting point is 00:04:45 So, yeah, that's, if you're wondering, we don't know. I don't know what Thursday will look like. We don't know what tonight will look like. There's a chance we go live because something has happened. I hope we don't go live because that will have meant something happened that requires that. I really hope it doesn't. So we're going to talk about other stuff, but I do, you know, I hope people know that it feels like talking about other stuff feels like I feel like I'm, I'm a kid again. And I'm in my room trying to do my homework.
Starting point is 00:05:14 And I could just hear my parents fighting. And I'm like, I'll just try to do the homework. nothing I can do about that Yeah Oh man Or you're at your friend's house Your friend got invited over And their dad is like
Starting point is 00:05:31 Fucking screaming at him Or the mom in that case I remember going to my friend Jason's house And his mom just Fuck just Was screaming and hitting him And I remember sitting there in the living room Just like
Starting point is 00:05:43 And I remember distinctly looking up And seeing a cross over the thing with Jesus on it. And I just remember me like, damn, that's bleak. They really got him up there, huh? They really got his ass up there. Anyway. And then Jason comes back and turns on the PlayStation.
Starting point is 00:05:58 You're like, let's just play. We don't even have to talk about it. Whatever they're talking about. Let's just fire up GTA. We'll take turns on that. There's no reason to talk about it. I also remember it. They lived, their house was back against a storm drain,
Starting point is 00:06:12 the storm channel. And they had these two big German shepherds. And when he had to go clean up the dog poop. They would just shovel it and huck it over the fence. Anyway. No, but this week we're talking all about Sam Altman again because there's a big old, big old New York magazine hoity-toity liberal article. It's the New Yorker.
Starting point is 00:06:38 The New Yorker is what I said. What did I say? I think you said New York Magazine. New Yorker. The New Yorker magazine. The New Yorker. And we're going to be talking all about that. And then we're going to cover a little bit of a, there was a Bloomberg article that was very surprising that came out talking about how AI's spending, not AI spending, the hyperscaler spending spree might be a little bit of a false start, as she calls it. And furthermore, we've got a bit of a look into OpenAI and Anthropic's finances before the IPO. Because
Starting point is 00:07:13 man, oh man, these companies, they are just chomping at the bit to go public. And OpenAI, SpaceX. And Open AI, we're going to talk about the paper they released where they're like, hey, we actually think good stuff should happen to people. Yeah, we might want to, you know, change our tune here, yeah, a little bit.
Starting point is 00:07:32 So it's going to be a good episode. It's going to be really funny. You're not even going to remember your parents are divorcing. It's like you're going to forget all about that stuff. You're going to have two Christmases. Look at it this way. Two Christmases, two birthdays. I will say regardless of what happens,
Starting point is 00:07:46 You're living in a different world on Thursday. Even if it all chickens out, this, a whole civilization will die tonight. It's what, we live in a different world regardless. Let's just hope it's not the horrific version we're all afraid of.
Starting point is 00:08:05 But either way. Speaking of, you know what I was getting lost in thought about on the drive over here? You can't guess. You didn't even let me try. Yeah, I guess I'll give you a hint. No, why don't you let me try?
Starting point is 00:08:17 Yeah, but I'll give you a hint. Okay, okay. Go ahead. I bet you were thinking about planes, B-52s, and... Close. Just planes. Okay? Now your second hint. In-N-Out Burger.
Starting point is 00:08:32 Planes, In-N-Out Burger? Come on, brother. How you ship them, I don't know. The In-N-Out near LAX is a classic plane spotting spot. And I was thinking, man, you know, I could throw a baseball at a landing airplane easily. How come more people aren't trying to take down full-on...
Starting point is 00:08:54 do this? Why don't they just give in to their basest instincts and do this all the time? Well, because they were talking about on the radio about the high speed rail here in California. And I just keep thinking that's such a bad idea. People are going to throw baseballs at that. I kept thinking... You could do this.
Starting point is 00:09:11 Why don't people go to overpasses all the time and just drop bricks off? That's true. You know why? Because people don't really have the urge to do these things. I know. Well, I worry about people. That's all. I mean, you know, all it takes is one methodic going, I bet I could throw a baseball at that thing. It'd also be really hard to hit a landing plane with the baseball.
Starting point is 00:09:28 I don't know, man. I got a good arm. I bet I could. I would say the population of people who could hit a landing plane with a baseball is small. I bet I could. That's all I'm saying. I got an arm on me, man. I don't think I could hit a landing plane with a baseball. It's 100 feet up there. I could throw a baseball. Actually, yeah, you're right.
Starting point is 00:09:44 I've been in that parking lot. They're not as close as you think. Yeah, that's true. Still, if you really gave it your all? I guess if, like... But it's also just a baseball. Or what if you had a water balloon launcher and you launched? A baseball is also not going to take down a jetliner. Maybe if it was on fire. We should try it.
Starting point is 00:10:04 You would probably get a bunch of tries before anyone noticed or anything. I highly, they would be on it like white on rice. I don't think so. Or white on my skin. I was at, uh, don't worry. We're going to get to the episode. It's just that, you know, we got to. We're trying to drown out the noise of our parents.
Starting point is 00:10:20 Who cares, man? Truly, who cares? Whatever. Go on. I was at Storm King, and I had taken acid, and we were... That's an art place in New York. It's an outdoor giant art sculptures, really fun place to take. And you were on drugs? Boy, that's no surprise. And I had got... There was this, like, little thing hanging off a tree. And I was like, I bet I could throw them. There was, like... There were, like, crab apples or something that were falling on the ground.
Starting point is 00:10:47 I was like, I bet I could throw one through. I was getting so close, so close. And then, you know, like nothing, you don't really think about rules when you're in that space. And a woman was coming over being like, excuse me, what are you doing? And it didn't, I felt no worry at all. I was just like, well, I'm trying to get it through that hole. And I just keep, and she was like, well, you have to stop. And I was like, one second. And I just, and then I nailed it. I was like, okay, I'm good. Did she say, wow, good job? She was very pissed at me. Oh, that's very funny. I'm glad you shared that. Anyway, speaking of, of trying to throw something through a very small hole
Starting point is 00:11:24 and people getting pissed at you. Sam Altman. The small hole being... Our assholes? No, the small hole being that... The landing plane. Open AI has a very narrow path in front of it in terms of not only getting their free cash flows
Starting point is 00:11:44 in the positive, but also the woman who's coming to stop you is the former colleagues and employees. I feel like I'm watching an airplane above In-N-Out Burger just getting absolutely pelted with baseballs
Starting point is 00:12:02 because I don't know if this plane's going to land. I think I did it. So it was this massive 18 month investigation done by what's his name? Ronan Farrow. Ronan Farrow. And another guy who no one's going to remember. Yeah, nobody cares about that guy.
Starting point is 00:12:18 So Ronan Farrow is gay, right? Right? Why does it matter? Because I saw someone's... Why does it matter? There's been a lot of criticisms about this and some asshole... About this article. You can't be gay and write an article now? No, someone came to Sam Altman's defense and said,
Starting point is 00:12:36 Ronan Farrow is just a jealous gay guy. That's how... That's how gay guys are. They're catty and they lie about each other if they're jealous of their success. Just like, all right, sure. And then someone... And then I think he also said, plus, gay guys lie. That's what they do in defense of Sam Altman's lying.
Starting point is 00:12:56 So what if he lied? That's what we do. I just thought it was interesting. So anyway, 18-month investigation. Can you, can someone in the comments confirm if that's what they do? Hey, guys, we got to take a quick little break here. You know, spring is a natural reset point. Spring cleaning. Spring has sprung, if you will. And, you know, if you've been putting off,
Starting point is 00:13:29 cleaning up the messier parts of your business, now is the time. Because streamlining communications is one of the quickest and easiest upgrades you can make, which is why today's episode is brought to you by Quo, spelled QUO, the smarter way to run your business communications. Quo is the number one rated business phone system on G2 with over 3,000 reviews built for how modern teams work. That's why more than 90,000 businesses from solo operators to growing teams rely on Quo to stay connected, professional and consistently reachable. Quo works wherever you are right from an app on your phone or computer and lets you keep your existing number. Add new numbers or teammates in minutes, sync your CRM, and rely on seamless routing and call flows as your business scales. Man, I love seamless routing and call flows. I mean, it's great. Your entire team.
Starting point is 00:14:21 I say, coly moly. That's very good. Quoli moly. Your entire team can handle calls and text from one shared number. No more missed messages or disconnected conversations. Everyone sees the full thread making replies faster and customers feel genuinely cared for. So make this season where no opportunity and no customer slips away. Try Quo for free. Plus, get 20% off your first six months when you go to quo.com slash BAES. That's Q-U-O dot com slash B-A-E-S. Quo, no missed calls.
Starting point is 00:14:53 No missed customers. So they collected internal memos, 200 pages of documents and private notes. They interviewed over 100 people. And basically they, it's a lot of things that people already kind of knew before. Yeah, they just went really deep on it.
Starting point is 00:15:12 Really deep. And I do find it, I find it compelling that they talk to so many people in his orbit. Former employees, just people who have worked with him at various startups. Also just interesting to see his like rise to Silicon Valley Sicko. He's not even a technical guy. I thought that he was a coder whiz. I thought that he was the inventor of large language models or something.
Starting point is 00:15:44 He really isn't. He's the inventor of Loopt. Yeah, which was, well, it's an app that lets you see where your friends are and what they're doing. But that's a perfect example. It's of his, you know what? I also can't help but notice. What? Remember when you go through the...
Starting point is 00:15:59 That he has insane vocal fry? No. God, the vocal fry in this man. It was kind of like the Epstein thing where they talk about like, like, what episode was it where we were going through all of those accounts of people talking about Epstein? And they're just like, he has this like magnetism, this charisma that he can get people to do things. Quite a few quotes where you're like,
Starting point is 00:16:21 why is every prominent person like this? And obviously everyone, I'm sure, knows those like weird studies where they're like, about a quarter of CEOs are sociopaths. We figure that out. And it helps, I guess. But it's just like,
Starting point is 00:16:35 this is from the article. Altman is often described either with reverence or with suspicion as the greatest pitchman of his generation. Steve Jobs, one of his idols, was said to project a reality distortion field, an unassailable confidence that the world would conform to his vision. But even Jobs never told his customers that if they didn't buy his brand of MP3 player, everyone they loved would die. When Altman was 23 in 2008, his mentor wrote, you could parachute him into an island full of cannibals and come back in five years and he'd be
Starting point is 00:17:04 the king. The judgment was based not on Altman's track record, which was modest, but on his will to prevail, which he considered almost ungovernable. And it's like, there's always so many of these things. Like Sam Altman can't be stopped by such flimsy rules, another wrote. But when you listen to him speak, he's so
Starting point is 00:17:25 fucking boring. Well, I don't understand what the, where this charisma is. You don't see it. I sure haven't seen it. But here's another tech executive. He's unbelievably persuasive. Like Jedi mind tricks. He's just next level.
Starting point is 00:17:40 Shut up. He's just, come on, man. I mean, I mean, maybe, sure. To a bunch of autistic tech guys, maybe they're like, wow,
Starting point is 00:17:50 this guy's really good. I mean, I don't know. We've obviously, we've seen clips of him and stuff, but he's boring as all get out. This is what people's general read of the guy is. Who?
Starting point is 00:18:02 He's got a mouth. He does, he does the bottom teeth thing. He, and he, and he has vocal fry. And he's always looking, concerned?
Starting point is 00:18:12 I fucking can't stand this piece of shit. I mean, is it persuasive or is it, because there's other parts of this article where they describe, many people describe just how comfortable he is with lying. Yes. And as Ben has told me, that might just be a gay thing.
Starting point is 00:18:28 Apparently, that's what he thinks. That's not what I think. That's what the stranger on Twitter said. But it is and it's funny because I'm sure you've seen that guy online who he does all the things where he talks to the chat. He talks to chat.
Starting point is 00:18:45 Oh, yeah, it's great. And he asks it to do things and it obviously can't do them. Basic things. Yeah, and he's making fun of it. And he goes... Time me running a mile. Yes. And that's like one of his ones going viral right now. And he literally goes, okay, time me.
Starting point is 00:18:59 And the thing goes, yeah, I'm going to time you. And then he goes, okay, start. And the thing goes, okay, we got it. And then he immediately goes, okay, stop. And he's like, what was the time I did? And he goes, 10 minutes, 34 seconds, whatever. And the guy's like, what are you talking about? Like, are you sure? It felt shorter than that.
Starting point is 00:19:13 And the guy's like, no, I timed it. I know what it was. Did you see Sam Altman get showed that clip? I did not. It's, there's nothing that remarkable about it. It's just how quickly. And goes, oh, sure. Yeah, that's, of course it's not able to do that basic thing.
Starting point is 00:19:32 No, he says, he says, oh yeah, we're aware of that. And that's the model just can't do that yet. And so in about a year, the product will be able to do that. We're actually currently working on those right now. And it's just like, wow, you're... Oh, yeah, here we go. Let's play this. Hopefully there's an ad first.
Starting point is 00:19:52 Can I show you a social video really quick? Have you seen this guy Husk online? I think you should bring him in as someone who breaks the models for Red Team. I'm going to run a mile and you just time me how long it takes, okay, when I say go. You got it. I'll be ready. Just say go when you're going. You start and I'll keep track.
Starting point is 00:20:11 Okay, ready, go. All right, I've started timing. Just focus on your run and I'll be here when you get back. Okay, I'm back. Welcome back. I've got your time right here. How do you feel about that? How long was it?
Starting point is 00:20:28 What? You clocked in at around 10 minutes and 12. I swear I was faster than that. Sometimes it feels faster in the moment. If it's going to save humanity, it's got to get it right. Sam. Maybe into the microphone.
Starting point is 00:20:48 Do you need to show that to your product guys? No, no, that's a known issue. Maybe another year. Another year? That works well. What's that known issue? The model, that voice model, doesn't have tools to like start a timer or anything like that. Fair.
Starting point is 00:21:01 But we'll add the intelligence into the voice models. That's fucking. It's just so weird. We're this far along and the voice model doesn't have a basic, ultra basic function. It was just very funny as all this was going. on and people talking about his ability to lie. Like, there were so many instances. And then so, and that just popped into my feed today where I was like, wow, it's weird.
Starting point is 00:21:23 on and people talking about his ability to lie. Like, there were so many instances. And then so, and that just popped into my feed today where I was like, wow, it's weird. Like, you're telling me that you built this massive world changing tech and it can't do a fucking timer? And then you're just going to sit there and go, well, they're working on it. Well, that's a known issue. And that model, unfortunately, just doesn't have the tools and we're implementing those. But I mean, this is, this is one of them. So this is from his time at Loopt. Most of Altman's employees at Loopt liked him, but some said they were struck by his tendency to exaggerate.
Starting point is 00:21:48 Even about trivial things, one recalled Altman bragging widely that he was a champion ping pong player, in quotes, like Missouri High School ping pong champ, and then proving to be one of the worst players in the office. Well, then Sam Altman said, yeah, I was probably kidding. Which I can see. To defend that, not that I'm defending him, obviously you guys know I'm not a fan of him. I can see him playing poorly and then being like, oh yeah, I was a total ping pong champion in high school, assuming that people will get the joke. But still, yeah, it's the fact that he's... I think it's also coupled with all these things, coupled with all these things. He transitions from Loopt into his role at Y Combinator, where he's like doing all kinds
Starting point is 00:22:32 of shady deal making. Because with his role as head of Y Combinator, he gets all this access to new Silicon Valley startups. Startups. And he's one of the first ones to see it. They call him a king maker
Starting point is 00:22:45 in this article. And so he starts doing weird side deals, cutting people out, and all kinds of people are finding out, what the hell?
Starting point is 00:22:53 Like, why is Sam Altman all over this thing? Yeah. Just obviously an alarming pattern. Yeah, so it's all, and then I always forget that he started
Starting point is 00:23:05 Open AI with Elon Musk. Because he had seen Elon, I believe he had seen Elon Musk talking about AI and the perils of it and the coming seriousness and gravity of it. And so he just sent an email to Elon. Yeah, he sends an email to Elon.
Starting point is 00:23:21 And their entire thing was safety that we need to get out there and pursue super intelligence, artificial intelligence, artificial general intelligence, whatever you want to call it. Because if we don't, someone else will.
Starting point is 00:23:38 And we need to do it with safety as our prime focus. And one of the fundamental parts of this article is just underscoring how as they scaled up and they got more and more investment from outside sources, safety was completely abandoned and thrown by the wayside despite what Sam Altman was saying publicly. And I was also surprised, I mean, in all of this stuff with the AI drama and all these different competing companies. It is mind-blowing that Dario Amodei,
Starting point is 00:24:12 he started Anthropic in 2021, just five years ago. Yeah. And I mean, it's so funny. I didn't know this. He was the safety lead. I knew Sutskever was, you know, very behind all of this, the firing and everything.
Starting point is 00:24:29 He was very worried about him. And I guess they call them the Sutskever memos or whatever. I didn't know Dario Amodei was also keeping this detailed journal of his time there and his concern for him. I think there's a good, so they talked to, as we were saying, over a hundred people. And they said that some of them just had completely benign feelings about him. Some of them thought, you know, okay, I think he's maybe a little goofy or gullible, whatever. But they said, most of the people we spoke to shared the judgment of Sutskever and Amodei.
Starting point is 00:25:04 Altman has a relentless will to power that even among industrialists who put their names on spaceships sets him apart. Wow. He's unconstrained by truth, a board member told us. He has two traits that are almost never seen
Starting point is 00:25:19 in the same person. The first is a strong desire to please people to be liked in any given interaction. The second is almost a sociopathic lack of concern for the consequences that may come from deceiving someone. Bab boom.
Starting point is 00:25:33 It's really, There's a lot of really wild Can you imagine getting that thrown in your face At a rap battle? Can you rap it to me? Yo, this motherfucker got two traits That almost never seen in a person First, he's got a strong desire to please people
Starting point is 00:25:48 To be liked in any given interaction Second is almost a sociopathic Lack of concern for the consequences That may come from deceiving someone You are a lying bitch And then I'd be like Oh shit
Starting point is 00:26:04 Hey guys, we gotta take one more quick break to talk to you about your wallet. I'm gonna steal your wallet. No, I'm not. I'm just kidding about that. I do need to know. Do we have people watching who still, because I made the switch,
Starting point is 00:26:19 I used to have the leather bifold. The thing was sticking out of my dang pocket. Didn't matter where I put a front pocket, back pocket, either way, uncomfortable as I was on boomer mode. Just not good. Yeah.
Starting point is 00:26:32 And then the nice folks at Ridge sent us each a free little sample of their products. And boy, I got to tell you folks, unique, slim, modern design that holds up to 12 cards plus cash. Sleek? Sleek? Yeah. Over 50 colors and styles to
Starting point is 00:26:50 choose from? They've all got a lifetime warranty. It's literally the last wallet you got to buy. They also have cool little keychains and my favorite thing that they've got, the power bank. I use this thing every single day. But also, did you tell them about the RFID blocking technology? No, I didn't want to. That's your big thing, okay?
Starting point is 00:27:10 leather wallet keep all your information safe? Because this thing gives you peace of mind. You get the RFID blocking technology keeping you safe from digital pick pockets, okay? So not only are you carrying all your cards with ease
Starting point is 00:27:25 You're doing it safely. Slim, feeling comfy, but you're also protected out there. All right folks. I am a huge fan of the power bank too. The power bank is awesome. And that's the thing. Ridge isn't just about wallets. They create premium everyday carry essentials like power banks, keycases, suitcases, and rings, all built with the same
Starting point is 00:27:42 sleek, durable design. That power bank comes with me on every trip. And they got free shipping, a 99-day risk-free trial, and a lifetime warranty on all their products. So for a limited time, our listeners and viewers, get 10% off at Ridge by using code BAES at checkout. Just head to ridge.com and use code B-A-E-S.
Starting point is 00:28:00 And you're all set. After you purchase, they'll ask where you heard about them. So please support our show and tell them that we sent you. And I just want to, I'm just going to rattle these off real quick because it is an extremely long article and I don't, I don't think anyone, like
Starting point is 00:28:16 I just like the fact that Ilya and Dario both agree. He's a liar and a major problem. That's basically there. But a terrifying liar. Yeah, a terrifying liar. I don't even care about the consequences. And before you, sorry, before you get to what
Starting point is 00:28:30 you were just about to say, I got to remind people that Ilya Sutskever was seen and still is in many ways. Seen as like, like a nini who was just... What, a nini? Like that he was like, oh my god, you know, they were like, shut up, Ilya, we're trying to fucking create AGI here. Oh, you're getting in our way. Yeah. No, I was going to say that. You were going to say the nini thing. He was going to say something cooler. Yeah, no, I was going to say that he was seen as the, as the pinnacle of machine learning, this artificial intelligence
Starting point is 00:29:04 nininess, the, like, the number one researcher for anyone to get. And when, when Open AI got him, it was like, okay, now we're cooking with gas. Oh, yeah. I think other places like Google were offering him like six million bucks a year. And he turned it down. Yeah, they were, I mean, everybody wanted a piece of Ilya. Also, everyone's talking about Ilya. That's honestly like, what a Sex and the City character. Like, that's what Samantha would say if, if, if, Darling. Carrie was like, I'm going out with Ilya Sutskever tonight. I'm kind of into the bald head and caterpillar eyebrows look.
Starting point is 00:29:40 I wish he would do some machine learning on my, you know what. Your pussy? Yeah. I'm talking about a vibe, you know, she's... Sure. I had him re-solder my vibrator. It's divine. It might speak to Sam Altman's persuasiveness here that he's been able to
Starting point is 00:30:01 you know, remove him from the $6 million being waved in front of his face. Yeah. One of Altman's batch mates in the first Y Combinator cohort was Aaron Swartz. I don't know if you remember that guy who killed himself. Unfortunately, very sad story. A brilliant but troubled coder who died by suicide in 2013 and is now remembered in many tech circles as something of a sage. Not long before his death, Swartz expressed concerns about Altman to several friends.
Starting point is 00:30:25 You need to understand that Sam can never be trusted. He is a sociopath. He would do anything. And this is, they've obviously had this relationship with Microsoft for a long time, but I didn't know they'd had any trouble. Multiple senior executives at Microsoft said that despite Nadella's longstanding loyalty, the company's relationship with Altman has become fraught.
Starting point is 00:30:47 He has misinterpreted, distorted, renegotiated, reneged on agreements. And this is the last thing I'm just going to say about people talking about him. A senior executive at Microsoft said of Altman, I think there's a small but real chance he's eventually remembered as a Bernie Madoff or Sam Bankman-Fried-level scammer. That's a massive claim. There you have it, folks.
Starting point is 00:31:13 I mean, I don't know what else to tell you. The guy stinks to high heaven. What I found most intriguing is when he and Elon eventually started butting heads and parted ways, there's quite a lot, well, not quite a lot, but there are excerpts in there about how they're all spying on each other. And I didn't even think of that. Because they're all trying to now... What do you call that?
Starting point is 00:31:40 When you're trying to futz around with other people's shit? Futz around with other people's shit. Yeah, they're trying to futz around with each other's shit. They're trying to sabotage. Sabotage. I knew what you meant, but I like... Corporate sabotage. Ooh, sexy.
Starting point is 00:31:52 And they're... They've got private investigators, like, monitoring his social interactions, where he flies, how often. They've got detailed profiles on them, and they're all doing it to each other, and it's just, it's gross, but it's also like
Starting point is 00:32:12 all right, yeah, fuck you. It did make me feel like, we did a big episode about it when it happened, the 2023 firing and rehiring of Sam Altman, but I'm like, we all moved on way too fast after that. And just as bizarre. Oh, yeah. And so that was a whole
Starting point is 00:32:28 that was a whole thing too in this article. They're kind of rehashing that and sharing some anecdotes that might not have been public at first. But basically Ilya and Mira Murati, who was also one of the chief scientists and, I believe, on the board of directors. Yeah, she also took over as CEO briefly before he was reinstated. They were all, they were all so concerned over Sam's increasingly distrustful behavior and the fact that, I think they weren't for-profit yet. Yeah, but I think what really changed was the introduction of ChatGPT.
Starting point is 00:33:05 I think before ChatGPT, they were all like, this is incredible. We're just a research lab. We're a nonprofit research lab and we're doing cutting-edge research on this breakthrough technology that may change the world.
Starting point is 00:33:18 And that does seem like what they were doing. But then as soon as ChatGPT gets released, it's like classic cutthroat Silicon Valley: destroy competitors, find massive amounts of funding. That's when you see him. I mean, there are some insane quotes
Starting point is 00:33:35 about the proximity of the journalist Jamal Khashoggi being literally strangled and his body dismembered, and then, like, three days later, Sam Altman being like, yeah, sure, I'll take some Saudi money from MBS, and people being like,
Starting point is 00:33:51 dude, I don't know if this is a great idea. He's like, why? What are you talking about? There's a lot of stuff in there about that and him receiving gifts from various foreign countries, foreign governments. And him just being like, I don't know, I get gifts all the time from people. What do you want?
Starting point is 00:34:08 And then shortly thereafter, he's photographed in that Koenigsegg, whatever, $3 million car, in the bucket seat. It's so... pull that up. Sam Altman in a supercar. Just Google it. He's just, he looks so small in it. And he just looks caught, you know.
Starting point is 00:34:25 But, yeah, when they... Yeah, there it is. Look at him. Look at him. Look at him. Look at his little head in there. Aye, aye, aye. And it is amusing that you've got,
Starting point is 00:34:38 you've got Elon Musk and him going at each other. It's like the Japanese guy in Godzilla when he says, let them fight. Yeah. Yeah, it's like that, you know. Let them fight. Yeah. Let them fight.
Starting point is 00:34:55 But yeah, when he was briefly ousted over that weekend and then there were a bunch of employees that signed on to back him. As it turns out, a big part of the reason that they were backing him is because they were just about to do another fundraising round wherein those employees were going to be able to cash out some of their options, their stock, to this investor, I believe it was called Thrive, that was about to do it. So they were like, shit, if we don't do this...
Starting point is 00:35:26 Kushner's brother's VC firm. Josh Kushner. Oh my gosh. All that stuff, Silicon Valley is so weird. They're like, Josh Kushner
Starting point is 00:35:38 had some thinly veiled threat towards, I think... Was it Mira? Yeah, they were basically like, well, so when he got reinstated,
Starting point is 00:35:49 when he said, okay, yeah, I'll come back as CEO. But I have terms: you've got to get rid of
Starting point is 00:35:56 the people on the board who tried to oust me. And they basically threatened all the employees, including Mira Murati. And he said, oh, and we're going to do an independent investigation into me, you know, because he tried to, like, save face. And then that investigation just kind of fizzled out, as did a lot of their safety things. In mid-2023, they said that they were going to dedicate one-fifth of their computing power to a superalignment team to prevent, you know,
Starting point is 00:36:31 rogue AI calamity kind of things. That team apparently only ever got one to two percent of computing power, on the oldest hardware that they had,
Starting point is 00:36:39 and eventually was just fully dissolved. Speaking of how weird Silicon Valley is, maybe my favorite parenthetical I've ever read in a story.
Starting point is 00:36:50 So it says in 2023, Altman married Mulherin in a small ceremony at a home they own in Hawaii. That's his husband. In parentheses, they'd met nine years prior late at night in Peter Thiel's hot tub.
Starting point is 00:37:03 Hell yeah. Man. What I wouldn't give to be a bubble floating around in that hot tub. Just the hottest guys. Okay. The hottest guys, the most influential guys. You got Peter Thiel, a young Sam Altman. I don't know what this bit is, me being horny for these guys, but I'm leaning into it.
Starting point is 00:37:23 I think you're horny for them, and that's okay. Maybe it's not a bit. They're so powerful, and you go absolutely off. They are very powerful. And that's another thing. It is very disorienting reading about the political shift, because obviously the moment is insane, and looking back at some of the things he's said.
Starting point is 00:37:42 And obviously, Peter Thiel and the rest of Silicon Valley's penchant for bending towards Trump and being like, you know what? I'm rethinking it. That whole section just... made me sick. Oh, where Sam Altman says, you know, when I met him in person, it was a lot,
Starting point is 00:38:01 it was a lot different. Then, maybe I should start thinking for myself. So, dude, they say Altman has long supported Democrats. This is his quote: I'm very suspicious of powerful autocrats telling a story of fear to gang up on the weak,
Starting point is 00:38:14 he told us. No kidding. And he says, that's a Jewish thing, not a gay thing. And I'm like, and somehow you went, okay.
Starting point is 00:38:22 In 2016, he endorsed Hillary Clinton and called Trump an unprecedented threat to America. In 2020, he donated to the Democratic Party and to the Biden Victory Fund. During the Biden administration, Altman met with the White House at least half a dozen times. He helped develop a lengthy executive order laying out the first federal regime of safety tests and other guardrails for AI.
Starting point is 00:38:43 When Biden signed it, Altman called it a good start. In 2024, with Biden's poll numbers slipping, Altman's rhetoric began to shift. In quotes, I believe that America is going to be fine no matter what happens in this election. Might be wrong about that one. But after Trump won, Altman donated a million dollars to his inaugural fund, then took selfies with the influencers Jake and Logan Paul at the inauguration, and on X, in his standard lowercase style, Altman wrote: Watching POTUS more carefully recently has really changed my perspective. In parentheses: I wish I had done more of my own thinking.
Starting point is 00:39:20 You got to be fucking kidding me. I mean, that's no surprise. These guys, none of this will ever really affect them. They're so, so insulated. But it does affect them positively, right? Yeah, that's what I'm saying. He gets into Trump's good graces. Altman is now one of Trump's favorite tycoons,
Starting point is 00:39:37 even accompanying him on a trip to visit the British royal family at Windsor Castle. Altman and Trump speak a few times a year. You can just, like, call him, Altman said. He's not a buddy. But yeah, if I need to talk to him about something, I will. So if you guys are out there wondering, why is this guy substantial? Why is he important? Why are we talking about him?
Starting point is 00:39:55 Why do people care so much? Well, it's because he is the face and the main cheerleader of all things AI. We would not be here, in terms of AI being front and center politically, culturally, all this massive, massive investment that is, like, buoying the stock market and is responsible for everything going up. We wouldn't be in this position, really, without Sam Altman. If he'd never sent that email to Elon Musk? Probably not. I mean, I'm sure that Microsoft and Google were obviously working on it, but not to this extent. He really was, he really is responsible for it. I think you're right. He's probably the face of it. I don't think there's a more recognizable
Starting point is 00:40:41 AI creator. No. Hopefully, as time goes on, there will be room for others. I mean, as we're seeing with Anthropics emerging. Obviously, Dario Amadeh is really hot on the scene. Oh, yeah. It's such a sexy guy, too. These guys are all so hot. I know everyone, like, everyone loves Anthropic because of the whole Department of Defense
Starting point is 00:41:04 showdown and whatever. I just, I don't know. I'm skeptical of all of them. I obviously use AI from time to time. But it's just, literally just today, the New York Times did a thing about how Anthropic's new model is so powerful.
Starting point is 00:41:23 They're not even releasing it to the public. They're giving it to other people to be like, hey, prepare for what's coming. It's just like, fuck. They say that every,
Starting point is 00:41:31 I've been, I swear they've been saying it since 2023. Oh, this model's crazy. This model just sucked me off. It's insane. You wouldn't believe the things it can do.
Starting point is 00:41:39 It uses its hand at the same time to suck you off. It's crazy. And they're calling it mythos. Maybe don't use Greek names, okay? The Greek people have been through enough.
Starting point is 00:41:48 We don't need to, You don't need to associate it with your self-sucking model. It's also important to talk about this guy because of the sheer size of Open AI. They're absolutely massive. They've got huge influence. They've got government contracts. They are kind of setting standards for how AI is used in war, in surveillance, in media, all sorts of things. And it's really scary that a guy like Sam Altman,
Starting point is 00:42:19 as is detailed in this article, as is detailed anecdotally, as we've heard from multiple people who've worked with him in the past, he is someone who is skeezy and, what, slimy, and not to be trusted. And this is a guy who's helping to dictate policy. And it's really,
Starting point is 00:42:39 deeply unsettling. We see the very real aspect of that two-facedness that they describe. At the same time, he's bizarre if you follow him. He'll literally, publicly go, yeah, I'm, like, I'm begging you guys to regulate me. Here I am, Congress. Like, please regulate me.
Starting point is 00:43:00 Let's do some regulation together. He knows that they're not going to do shit. But then he actively works against it behind the scenes and, like, funds all kinds of things to water it down or get them just completely sidetracked on it. Anyway, this man has a lot of power. Yeah. Also, why would all these people lie?
Starting point is 00:43:22 Because they're gay. That's what they do. That's one thing. It's like, okay, there's no way that they would be doing... Anthropic doesn't need to do character assassinations when they're already starting to really win. Oh, man. The popularity, the user base, the growth of users that Anthropic is seeing is actually staggering.
Starting point is 00:43:51 Well, it's Enterprise that they're really winning. OpenAI still has more individual retail users like us. Oh, I thought they beat them out heavily on the... Oh, maybe they did. Shoot, I don't know. The chart that I was looking at was, oh, maybe I was looking at projections. I don't know. By the way, CBPN, the show that I don't know if any of you guys even know what it is.
Starting point is 00:44:13 I bet 10... Tits, butts, but... Pussy nuts. Nuts. Tits, butts, butts. That's good, man. Pussies nuts. That's what I would say back at the rap battle.
Starting point is 00:44:23 And they'd be like, what? That has nothing to do with what I just said to you. Dude, you have to respond. I just said two devastating things about you. Yeah. Tits, butts, pussies, nuts. Oh, shit. Gonna get you cuts.
Starting point is 00:44:36 Hair cut. You're bald. He's like, I'm not bald. You're fat. You short. Neither of those things. I mean, that's a good way to win. Yeah, I guess.
Starting point is 00:44:47 Just go the Trump route and just lie, gaslighting. It works. It works. That's what feels so silly. You know, we're like sitting here going like, they've got him. Everyone in his life says he's a liar. He could be the next Bernie Madoff. And it's like, yeah, but until it all comes crumbling down,
Starting point is 00:45:04 we're just locked in here with them. We're locked in here. Well, so TBPN is a live streaming show on Twitter. And they just got bought out last week by OpenAI. For over $100 million. And I think that it was probably an all-cash deal because there is no way on God's green earth
Starting point is 00:45:30 that they're paying these motherfuckers cash. I bet it's all-stock that's going to take a few years to vest. You just said, I bet it's an all-cash deal. No, I bet it's not. I bet it's all shares. All-stock. All-stock. Whoops.
Starting point is 00:45:43 Man, guys, I don't know what's going on in my brain. Jesus. You bet it's all-stock. I bet it's all-stock, yeah. maybe a little bit of cash. Maybe like 5, 10% cash the rest stock. I'd take 10% of that cash, if you know what I mean? Brother, I would take, man, I would take a damn hand job.
Starting point is 00:45:59 From, for what? From anybody. In exchange for what? Nothing, man. Which I still don't understand that deal. In fact, I'll pay you. You don't understand? Yeah, it's a...
Starting point is 00:46:12 I don't understand that deal. Obviously, many people have gone. It's so ridiculous. For those of you who aren't familiar with TBPN, which you might because it's not that popular of a show. And that is not a dig at them. They've created a cool thing. They live stream three hours a day, I think, every weekday.
Starting point is 00:46:29 It's not exactly my cup of tea. I don't think they're doing any kind of, they're doing a lot of interviews, but they're very... It's pay to play. CEOs and stuff come on their show and they talk and that's it. And they know that they're... Yeah. They know that they're not going to get any hard questions. Yeah.
Starting point is 00:46:44 It's... And whatever. That's fine. They've carved out a... niche for themselves. A niche for themselves. And I think either one is fine. Oh, whoops. But if you're right, I'll publicly apologize in a very embarrassing way.
Starting point is 00:47:01 So they were already very deferential to the Sam Altman's, the tech leaders. Everybody. Elon Musk. If Mark Andreessen wants to drag his egg head on, and talk... I'm retard maxing lately. Yes. And they'd be going, wow, sir, wow. He literally said that shit.
Starting point is 00:47:25 Yeah. And get zero pushback on the maniac shit he's saying. Yeah. They had a perfect platform for that. I don't know what they get out of buying it out. I was going to say their staff, but at that point, just offer each individual staff member something other than, you know,
Starting point is 00:47:44 instead of doing the big-ass package. Maybe they get making. it more legit and putting resources behind it and getting their, turning it into an actual competitor to something like MSNBC or CMBC or, you know what I mean? I have no idea. It's crazy, man. I don't know. Hundreds. We don't know exactly how much it costs. The reporting is that it's in the low hundreds of millions of dollars. Which is absolutely. For, I'm not joking, devastating. I think they said they get, I mean, pull up. that, yeah, go back to that YouTube. What are they getting on there?
Starting point is 00:48:21 Well, that's just on their YouTube channel, because they are an X-first show. Okay, but for a show that just got bought out for $100 million? She was 2,000 views? Yeah, yeah. How much did Joe Rogan get paid? $100 million? You know, what a... Which sounded so expensive at first, but now I'm like, damn, they got that, they got a hell of a deal. There's another show, I'm not going to say...
Starting point is 00:48:43 Yeah, go to Joe Rogan. Like, Joe Rogan probably gets... Joe Rogan's gotten, you know, he gets a couple million per episode. Yeah, that's... Okay, so that's where I'm like, yeah, that's probably what a hundred millions had cost. Hundreds of thousands every time. Two million. Yeah. One million.
Starting point is 00:48:56 And that's not even including the Spotify numbers. So I'm just... Boy, they just don't put any effort into their thumbnails, do they? I mean, there's no need. Just don't care. There's this other show that's recently been getting a lot of... It's just so funny to see... I was like, who is that?
Starting point is 00:49:16 Priyanka, Chopra, Jonas. So much of what comes to me from this is just like, uh-oh, something controversial happened, but it is funny that he just also talks to Priyanka Chopra Jonas for two hours and 25 minutes. Why is that funny? Well, you don't care what she has to say? No, because everyone always talks about him as like, he platformed this dangerous phyrologist. And he's talking about whatever. And then he's like, um, and so do you like travel with Joe Jonas when they're on tour?
Starting point is 00:49:45 That's crazy. Wow, wow, wow. What do you think about woke? There's another show that has recently beginning a lot of stuff online. And I looked up, I'm not going to say the name. Who cares? She's not going to see it. It's called Sorcery.
Starting point is 00:50:06 I don't know what that is. And it's just, it's painfully obvious what's happening here, which is that some of these, I mean, that one excluded. She had the famous video with. Alex Karp doing the sword, yeah. But like some of these, okay, this one, Roblox, 67,000 views. All right, let's click that. Assuming, wow, assuming that these are all real views,
Starting point is 00:50:28 how many comments do you think this should have? I don't know, a couple hundred, three hundred, nine comments, 54 thumbs up. And it's like that on so many of these videos. And what I think is happening is every time they have a CEO or something, the CEO buys. Oh, wow. Someone actually commented right there. Sorcery is the most astro-turfed podcast in the world.
Starting point is 00:50:50 There are zero fans, and every big name she gets on is from her family connection. I think that the CEOs then go on and pay for views to make it look like they've got a lot of engagement. I mean, it definitely lends to... It's wild, man. Why Sam Altman would be interested in buying this, I think. Because... It's so easy to trick these. We should do that.
Starting point is 00:51:12 We should buy views and be like, look at us, man. We're getting hundreds of thousands of views per episode. Buy us. To TBPN's defense, they do weird, it does weirdly feel like they have some kind of cultural cachet. I don't know why exactly. Even before this, I remember seeing people talk about like, wow, it's so cool what TBPN has built. In a way that no one's talking about sorcery that way. Got a nice set and stuff and they were kind of the first to do the, uh,
Starting point is 00:51:44 I don't know. I don't know. Anyway. But I do think there's a real desire for evil tech billionaires to have a place where they can feel cool and get their little clips out and tell people that they've never experienced introspection in their lives. You know what the plan is, folks? I'll tell you what the plan is. We're going to get bought out by one of these. Mark Andreessen offers me $5 million, $10 million, whatever it is.
Starting point is 00:52:08 Dream hire. I'll take it. You know what I'm going to do? Can it get in his good graces? Not him specifically. anybody, anybody. Getting their good graces, really let their guard down, you know?
Starting point is 00:52:20 And then in time, suck them off. Suck them off, dude. No, no, a little bit of, uh, oizen. Oisen. What's oisen? Oisen.
Starting point is 00:52:32 Boys. Just a little bit inside their coffee or something. And then they go, my tummy hurts. I feel sick. Oh, gee. Oh, gee. Well, why don't you lay down, sir?
Starting point is 00:52:43 I'll heat you up some bone broth and it's more oizen. And then and then and then and then you know. You just kill their chances of getting $100 million. I hope you know. I'm just saying that that's... I hope the oizen was worth it. Playing the long game here.
Starting point is 00:53:03 I'm just kidding around. This is a comedy show. But no, I'm not going to do that. Wait, I'm just implicating myself cut two in five years. I'm like in an orange jumpsuit in court. like those guys when they have their Google search results read to them, like how to dispose of ex-wife body? Anyway, no, I'm just kidding around, folks. So let's look at OpenAI and Anthropics finances before the IPO.
Starting point is 00:53:30 This is a Wall Street Journal piece. It looks like OpenAI isn't even going to break even until at least the 2030s, and Anthropic could break even a little bit sooner. And this is where I was sourcing from: most of Anthropic's revenues currently come from enterprise. But obviously that's going to change. But yeah, a really frustrating thing that's going on with these... Dude, everything could change. Like, all this stuff is happening so quickly.
Starting point is 00:54:02 I don't think any of this is factored in yet. But, like, Futurism.com had an article today with almost half of the U.S. data centers that were supposed to open this year slated to be canceled or delayed, and they talk a lot about the situation in the Middle East with not getting all kinds of
Starting point is 00:54:20 investments from them. Investments or components that they're going to need. If one piece of your supply chain is delayed, then your whole project can't deliver. It's a pretty wild puzzle at the moment. Yeah. So these guys have already been operating on a pretty strange.
Starting point is 00:54:38 Yeah. Some pretty strange accounting. I think it's going to get, I think the accounting is going to get stranger. Well, speaking of strange accounting, a lot of the bankers that are in charge of bringing these companies to the markets are swinging their dicks around, trying to make it easier for them to get cash by pressuring the indexes to loosen rules
Starting point is 00:55:00 for quicker entry to give them access to more capital, to bigger pools of capital. So like, for example, you know, to join the S&P 5. You can trade on, I know it's a little confusing. You can trade on the stock market, but then there's joining an index, like the S&P 500 or the NASDAQ 100, I think it is. And yeah, these bankers are trying to make it way easier for them to essentially cut the line and get added to these indexes way faster, which is so fucking frustrating. Oh, yeah, here's some of the, let's see, the yearly A. model training costs in billions of dollars.
Starting point is 00:55:37 They're projecting by, it looks like, 2029, Open AI is projecting that their costs. How much more they're spending on training and getting worse results. I think that's their biggest issue. Yeah. And Anthropic, by contrast, isn't spending as much. It's the deep seekification of it all, right? Well, that's part of what was in this other thing. Let's pull up that Bloomberg article.
Starting point is 00:56:00 This woman put out a great piece that's echoing some of the stuff that Ed Zittron said, but kind of took it a little further, right? Because she's talking about how the data that's used to train these models is pretty much maxed out at this point. Yeah. They've scraped the entire internet. The New York Times let them feed it. Disney was like, sure, just
Starting point is 00:56:18 run every movie through that fucking thing. They got all my tweets. Yeah. They got everything. They scraped Twitter, your Tumblr posts from 2008. They got it all. You too. They have everything. Yeah. In the last couple of years, they've been kind of training on synthetic data from other models. So it's a real major.
Starting point is 00:56:35 self-suck. I mean, if you can picture GPD just sucking itself off, at infinitum, that's what's going on here. And she points out that it guesses, it approximates, it does everything except for actually reason and think. Yeah, because it can't do that. Right. And in some cases, though, like recently, I was using it, it, Claude at least, gets around, it just Googles for you. It just It's just a smarter Google. That's been my overall thing. Which is fine. I used to love using Google.
Starting point is 00:57:11 Okay, you guys will make fun of me because whatever, I'm a boomer. This motherfucker goes to www.gook. No, no, no, no. No, no. We grew up. We witnessed it happen. We used to use this thing called Ask Jeeves and these other weird things. And then one day, the nerdiest kid you knew was like, dude, you got to use Google.
Starting point is 00:57:29 Yep. And then all of a sudden, you're like, it just opened the internet for you. I remember exactly who it was, too. I remember who mine was too. Yeah, what was his name? I'm not going to out him. And I think he had a weird time. But mine was a Russian guy, a twin.
Starting point is 00:57:44 They were twins and they were on the cross-country team. But it opened the internet and then everyone could feel it. I mean, obviously it coincided with them turning their search engine business into a massive ad revenue business. And search just gets degraded. and it's just become a much worse experience. And sometimes I'm just like, this would be easier to just put into Claude because using Google is a fucking nightmare sometimes.
Starting point is 00:58:13 And I'm like, so at best, they created like a better search engine. Well, and then she points out that these, she's basically talking about how hallucinations are an inherent part of these large language models, that they are systemic flaws that you can't get rid of, no matter how hard these companies try, it is just a fundamental part of their function that you can't get rid of. You can only adjust for. And she's saying for those in technical positions who rely on it, who rely on AI, a big part of it.
Starting point is 00:58:51 And she's got anecdotes from people that she interviews that it works for the most part, but it's only like 90% reliable. and that other 10% they are actively having to go in and mitigate and account for these hallucinations. And it's ultimately, she says that this, these current iterations
Starting point is 00:59:13 of large language models are for low stakes tasks and not mission critical work. And no matter how much compute you throw at it, it will not solve for hallucinations. So you're saying we shouldn't let it pick targets for us to just blow up all across the world? Yeah, there's that. There's that.
Starting point is 00:59:29 And, but she does say that, In order to solve for hallucinations, you would need a complete fresh start, start over with programming and training and all that shit. And there are people actively trying to do that. But then rightfully so, as you said with the deep seek, deep seekification, there's already free models that basically do just about everything the large language models can do, but for free. So the entire house of cards falls apart. But do they have that nice cream and like desert sand red colorway that Claude does that makes you feel like you're doing? Yeah, I can do that. Okay, whenever you're ready.
Starting point is 01:00:10 I think it's fucking hate that. I think it's cool you want to kill yourself. You should do it. Yeah. Everyone else doubts you. And that's epic. It's epic that you want to jump off a building. And then she, again, I'm surprised more people aren't talking about the fact that.
Starting point is 01:00:27 Apple essentially has just been kind of sitting it out. And could massively win this whole thing because they're just sitting on a pile of cash when everyone else has just been lighting it on fire to create the hallucination machine. And if someone does solve it, Apple has, you know, three billion devices more, that they'll just go, okay, well, you want to put it on our shit? It's still so frustrating when I ask Siri a basic question. Here's what I found on the internet for you. It's like, no, you stupid son of a bitch.
Starting point is 01:01:03 Do you think maybe it's the way you talk about it? Do you think maybe it's the way you talk to? No, I do it very gently. I say, hey, blank, can you da-da-da-da? Here's what I found on the internet. Like, motherfucker, I'm driving right now. That's why I asked you the question. It's a very basic, easy thing that you should be able to search the end.
Starting point is 01:01:20 It just pisses me off the way. our tech our tech mommies and daddy. Why just ask Claude at that point? That's a great idea. Just do a little shortcut on your phone and say, Yeah, I didn't even think about it. Hey, my bitch wife, Siri, won't answer me. My bitch wife won't answer it.
Starting point is 01:01:35 Can you tell me who won the March Madness game today? It can do that. It can do that kind of thing. What can't it do for you, huh? I don't know, man. I can't even remember. I can't even think right now about what it can't do. What can it do? Oh, the weather outside is frightful, but the fire is so delightful.
Starting point is 01:01:53 Oh, you're singing that song to me again? It's always singing me Christmas songs. Who? Siri. Fucking God. I hate it. I do. Oh, see, it just said, hmm?
Starting point is 01:02:03 I do, I do briefly want to go through some of the things that... So OpenAI released this weird policy paper where they're touting all these, all these things that they... Great timing, by the way. They love timing these things coinciding with, like... All these things they want to, like, incentivize the government to be doing. They're talking about how the government could incentivize companies benefiting from AI to institute a four-day work week and give employees expanded health care and child care coverage, as well as larger retirement benefits. They could also help workers displaced by AI switch to industries that rely on human connection, like health care and education. Yeah.
Starting point is 01:02:42 Workers could be taxed less, and programs like... They could modernize taxes. ...Social Security and SNAP would be funded by increased taxes on companies benefiting from the AI boom. They, in this piece, it's this, like, 11-page thing that OpenAI just put out. Published yesterday? Yesterday, Monday.
Starting point is 01:03:03 And there's basically three main goals that they're sharing. Number one, share prosperity broadly, a higher quality of life for all. Doesn't that sound so nice? I mean, I'm all for that. Number two, mitigate risks. They're talking about how safety must scale
Starting point is 01:03:20 with the rise of AI, which is funny, because Sam Altman's whole thing, as they switched from non-profit to for-profit, has just been at the expense of safety. And then third, they want to democratize access and agency. And yeah, they offer all these solutions. Oh, you could do a public wealth fund, giving everyone a stake in the economic growth they stand to benefit from. Yeah, give people, like, cash payments. I think they bring up the Alaska oil wealth fund. Altman bucks.
Starting point is 01:03:51 They said that the government could also facilitate the expansion of electrical infrastructure, powering AI data centers in order to lower household electrical bills. Listen, these are all great hippie-dipy things, but it assumes a world where people and politicians can agree and work together. Imagine putting this out while the guy you said, if you guys could all just think for yourself a little bit, you would realize that he's not so bad while he's going, I'm going to end a civilization tonight. he's really not so bad shut up and saying I actually think it doesn't matter who we elect
Starting point is 01:04:27 I think it will all stay the same and yeah well folks what do you think about Sam Altman let us know in the comments
Starting point is 01:04:39 Should we play the clip of Trump saying the R word? Yeah, why not. Why not. All right, so in case you missed it, I doubt you did, but... I'll just let the clip speak for itself. There you go.
Starting point is 01:04:56 To protect us from Kim Jong-un, who I get along with very well, as you know. Do you notice he said very nice things about me. He used to call Joe Biden a mentally retarded person, okay? So don't tell me about your stuff. Joe Biden, he said he's a mentally retarded person. He was so nasty to Joe Biden. It was terrible. But to me, he likes Trump.
Starting point is 01:05:16 And you notice how nice things are with North Korea? It's very nice. That's the president. Mentally retarded person. That's the president. That's my president. I do like the guys who, so there's obviously the people who are going, we didn't vote for this.
Starting point is 01:05:32 We said no more war. I do like the crazy guys going, this is exactly what I voted for. I like this. He said retarded on television. There you go. He's the pet. You know who his voters are?
Starting point is 01:05:45 The guy with the Pepsi can. The clavicular guy. I don't know if I don't know if I believe that. They're all just guys like that. But I don't know. That's what I'm saying. That's all I'm saying. That's what I'm saying.
Starting point is 01:05:59 All right, folks. Why don't you join us in the bonus episode, where we're going to talk about all kinds of stuff. We're going to continue to pretend like we can't hear them fighting. Yeah. We're just going to do our homework and just hope that this all kind of goes away. We're staring down the barrel of, uh, I think it's 6 p.m.
Starting point is 01:06:22 Easter standard time right now. Maybe want milk? Two hours. Two hours from... Two hours from now. Two hours from the deadline. That is... Well, I hate to break it to you, man.
Starting point is 01:06:35 But I think something positive probably has happened since we've recorded because the S&P 500 is up quite a bit after hours. We are up 1% after hours. So, rumors of a ceasefire. gee who'da thought
Starting point is 01:06:53 yeah okay well wow but this has been well we'll see not gonna say I called it this I'm on
Starting point is 01:07:01 financial times right now Pakistan calls on Donald Trump to extend deadline for Iran talks uh I don't
Starting point is 01:07:12 let's see let's see real quick if there's a truth let's see if you put out a truth well folks well I think that we got to stop anyway. So we'll see in the bo-bo-bo-bo-b-b-b-b-b-b-b-b-b-b-b-b-b-b-b-b-b-b-b-bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb We hope you're all still kicking.
Starting point is 01:07:24 All right, bye. Bye. It's coming up on this week's episode of Ben and Emile Show.com. Look in my face. Look at that, what they're something else. Those are what they call sides in the industry. I was probably helping Erica with an audition. I know I said it's a nice restaurant, but it has mozzarella sticks. Nice restaurants can have those.
Starting point is 01:07:55 He doesn't even say that. He just starts doing it. Morris, blah, blah, blah. It's just like, oh. How old are we talking? The dog? No, the guy. I just told you, like in mid-40s, something like that. I was thinking about the war in Iran.
Starting point is 01:08:11 It's over. It's over. It's nothing's going to happen. How do you know? Because based on conversations with the prime. I mean Shabbat Sharia and Field Marshal Assim Munir of Pakistan and wherein they requested that I hold off the destructive force being sent tonight to Iran and subject to the Islamic complete immediate and safe opening of the straightway
Starting point is 01:08:29 I agree to suspend the bombing and attack of Iran for a period of two weeks. Wait, you're reading it so fast. I can't even understand what's happening. He's agreeing to suspend. For two weeks. What a dumb fuck. This will be a double-sided ceasefire. Nobody regular, it wasn't, size 16s weren't a regularly offered thing. so he exclusively wore new balances. Damn.
Starting point is 01:08:51 Yeah. What's your shoe size? 11 and a half. That's crazy. What? Why? Why is that crazy? That's kind of a small shoe for a guy with six...
Starting point is 01:09:01 Fuck you. You know what? Fuck you. For a guy who's 6'3, that's kind of a small shoe. No, it's not. My feet are fucking big. I got a big ass foot, dude. Put it up to mine.
Starting point is 01:09:14 Did you say naked? Yeah. Ooh, I'm footmogging him. Look at this, folks. Absolutely foot-mogged. Foot-foot-mogged, dude.
