The Ben and Emil Show - PP 23: Open AI Drama Explained

Episode Date: November 22, 2023

The tech world was thrown into disarray last week when its golden boy Sam Altman was abruptly fired from OpenAI. What proceeded over the next several days was pure drama, finger pointing, and speculation, with billions of dollars and years of research at stake. The reasons why he was fired are still not known, but they're much more terrifying than you might think. UPDATE: As of today, Sam has been fully reinstated at OpenAI. Get bonus content on Patreon.

Transcript
Starting point is 00:00:00 Welcome back to another episode of Pay Pigs. We've got one year of ChatGPT, and everything is absolutely imploding. We're going to find out just what the heck is going on over at OpenAI. Yeah, well, including and especially why you should care and why this stuff is extremely important, because we might be getting dangerously close to that super intelligent AI that turns the world into paper clips. Find out exactly what we're talking about. Plus, what else? Oh, we got George Santos's incorrigible behavior.
Starting point is 00:00:27 This man, we're going to find out everything he's spending all this campaign money on. We absolutely love a king. Plus, our boy Xi Jinping, the leader of the Chinese Communist Party, gave us a very, very special gift. All this, we say thank you. We say thank you, and, uh, cue the intro. Welcome, folks, to a very special episode. As you can see, we are on a set. Is it new? It's new for us.
Starting point is 00:01:15 It's new for you. Yeah. Is it permanent? I don't really know. We don't even know how we ended up here. Excuse my computer. Whoops, I'm turning down the brightness. Not the sound.
Starting point is 00:01:24 So we're releasing this early because of Thanksgiving. Hopefully everybody has a nice, good one. And also, I would say, because of the subject matter of today, it feels like if this were to come out two days later, the entire story could be different. That's true. It is a dynamic, fluid situation. Every six hours it's a different thing. One could say it's like a high school senior. It's very fluid. How are they fluid? You know,
Starting point is 00:01:52 gender fluid Hmm, okay So You know, we can cut that out We can cut that No, we'll leave it Yeah, we'll leave it But anyway, follow us on socials
Starting point is 00:02:04 Also comment if you like that too. Yeah, yeah. But there's no telling if that joke landed, because me and Ben can never tell. We really can't, I can't. Follow us on socials: Pay Pigs Pod everywhere, Emil DeRosa everywhere, BenCon everywhere.
Starting point is 00:02:16 Except for Twitter, it's actually BunCon. Yeah, except for there. And Cameo, I'm back on Cameo. If you want someone, if you want me to yell at someone for a price, I'll do it. Threaten someone, maybe? You'll only yell at people? I'll only yell, yeah. So no birthdays, no, no, no condolences, no get-betters, I'm only screaming. Yeah, only angry threats. I might buy one of those. You should, yeah. Also, uh, we're doing our call-in episode next week on Patreon. We, we got a lot of confusion. Yeah, we're gonna have to clarify some of these tiers.
Starting point is 00:02:48 No, you can look at it yourself, sure. But if you want to be a part of it, it is a higher tier, and it's not, we're not trying to squeeze you here. It's literally, it would never work if we opened it up to everyone, because it'd be way too many numbers coming in, too many calls. So if you want to give us a call, it's next week. That's at the ten dollar tier, and it's next week. Yeah, and it's going to be really fun. You guys call, leave a voicemail, and we respond. Anyhoomst, we got a doozy of an episode. It is all about ChatGPT, Sam Altman, OpenAI, Microsoft, and... Anyone else? Ilya Sutskever. Weirdly, Joseph Gordon Levitt's wife? Wait, what?
Starting point is 00:03:33 Yeah. Are you serious? Uh-huh. Damn, I can't wait to hear about what JGL's wife is up to. Is she newly single or something? No. No? Do you really not know?
Starting point is 00:03:42 No, I really don't. Oh, she's a member of the board. Oh, Mira? Oh, oh, oh, oh, oh, oh. Okay, the board, yeah. We'll get to that, but... So, I was having this conversation. I was trying to mansplain it yesterday to my girlfriend.
Starting point is 00:03:58 Girlfriend counter. And her eyes did glaze over. And I realized that I had to give sort of a, is it primer or primer? Well, we were going to do this. So when we were talking about episodes last week, we were talking about how we are approaching one year of chat GPT. Yeah, happy birthday to chat GPT. Well, not yet.
Starting point is 00:04:21 Happy early birthday to chat. November 30. November 30th. What are they? Sagittarius? I don't know. Either way. So we're going to talk about one year of chat GPT because this really did change
Starting point is 00:04:33 everything for, you know, I think a lot of people think OpenAI just kind of launched with ChatGPT, but it's been around for quite a few years. They launched in 2015. It was founded by a bunch of people; Sam Altman is by far the most famous founder. Elon Musk was one of them. And, yeah, a lot of these people who are, you are now hearing their names, like Greg Brockman, Ilya Sutskever. Reid Hoffman.
Starting point is 00:05:00 Reid Hoffman. The LinkedIn nerd, Elon Musk, Peter Thiel. Adam D'Angelo, Quora co-founder. Oh, yeah, we love Quora. But they were, you know, it was a bit different than your average Silicon Valley startup because they weren't just thinking high valuations, you know, let's sell this thing and cash out. Right. Let's be Silicon Valley billionaires.
Starting point is 00:05:28 They founded it as a nonprofit, and they wanted to dedicate their time to creating AI technology that would, you know, benefit humanity the most, all right? They had these lofty goals, and they wanted to focus on safety rather than shareholders. That's right. And so this is from a blog post from December 11, 2015, right around the time when they were launching, co-written by Ilya and Brockman. So they said
Starting point is 00:05:59 the new firm's goal was to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. Okay, so they weren't going to be... What did you just Google? I was trying to Google Ilya. Oh, why are you going to make it
Starting point is 00:06:13 Oh, Sutskever. Yeah, there he is. He's an interesting fellow. It's one way to say it. Um, but yeah, so this was very different. And this, this nonprofit aspect of it is very important to, it's, it's important to understand what that meant for the company, right?
Starting point is 00:06:34 Because that, uh, that dictates where we are today. Right. Exactly. So. So we start with just a little bit of the history of, sure, give a little bit of the history. Yeah. Because I feel like what the, the way I was envisioning this is we kind of start with like a little history of the open AI and then just go right, plow right.
Starting point is 00:06:52 into the chaotic timeline that was Friday, Saturday, Sunday, and into Monday. And then maybe we could talk about how the, we could dive deeper into the complicated non-profit, capped profit sort of structure of
Starting point is 00:07:08 this thing, because it is unprecedented. Yep. And how that has informed a lot of the theories around why Sam Altman was fired. But so real fast, it's important to note that Sam Altman, think of him as like the Steve Jobs. He's 38 years old. He also famously fired from his company.
Starting point is 00:07:25 Steve Jobs, yes, correct. Yeah. He was fired from Apple, and it started to flounder, and then they brought him back. They had to plead and beg him for forgiveness, please, Mr. Jobs. And then he came back, and he said, only if you make iPhone. Well, that's what, so, I mean, that's what makes these tech companies kind of different than a regular, a regular company with assets that, you know, you change the, you change the leader of the company. People go, well, you know, they still own.
Starting point is 00:07:52 all of their assets and equipment and whatever. These Silicon Valley companies, it's all about the direction their figureheads are going to take. I mean, and as we've seen, without Sam Altman it's a, it's a different story. Not only that, but so part of the reason, they were very altruistic, as Emil said. They were pursuing AI in the interest of advancing technology and advancing humanity. So they were attracting a lot of very, very talented people who, I was reading, turned down ludicrous offers from competitors. Oh, sure. I mean, putting OpenAI on your resume at this point is like you can have a job offer for wherever you want to go.
Starting point is 00:08:36 So it was started in 2015. It was very quickly, obviously, crazy expensive to train these models, to teach AI to become what it is today. 2018, Elon Musk resigned, citing a conflict of interest with Tesla's AI development with full self-driving. Sam Altman says that Elon Musk believed that they'd actually fallen behind, and Elon proposed to take over OpenAI himself, shocking to no one, and the board just unanimously rejected him because fuck you. In 2019, this is crucial, so they were fully just a non-profit up until 2019, at which point they transitioned to a capped-profit model, which I still don't fully understand, but apparently it means that investors' profit would be capped at 100 times any investment.
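For the numbers people, here's a rough back-of-the-envelope sketch of what a 100x cap means, purely as an illustration of the structure the hosts describe, not OpenAI's actual terms (the real agreements are more complicated and the cap reportedly differs by deal). A minimal sketch in Python, with made-up figures:

def capped_payout(investment, gross_return, cap_multiple=100.0):
    # Split a hypothetical return between an investor and the nonprofit.
    # The investor keeps gains only up to cap_multiple times the original
    # investment; anything beyond the cap flows back to the nonprofit.
    # Illustrative only -- not the real contract math.
    cap = investment * cap_multiple
    to_investor = min(gross_return, cap)
    to_nonprofit = max(gross_return - cap, 0.0)
    return to_investor, to_nonprofit

# Example: a $10 million investment that somehow returns $5 billion.
investor_share, nonprofit_share = capped_payout(10e6, 5e9)
print(f"Investor keeps ${investor_share:,.0f}")    # $1,000,000,000 (the 100x cap)
print(f"Nonprofit keeps ${nonprofit_share:,.0f}")  # $4,000,000,000

The point of the structure, as described here, is that any upside above the cap is supposed to stay with the nonprofit rather than with the investors.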
Starting point is 00:09:31 But also crucially, this capped model allowed them to legally attract investment from venture funds and grant employees stakes in the company, finally, on par with what like a Google or an Amazon would do, where you give them equity in the company. So now they could actually, you entice people with something more than just, hey, don't you want to do good computer stuff? Right. But the board still not having a financial stake. Right. So the company distributed equity to employees. They partnered with Microsoft. As I'm sure many of you know, they're running their systems on Microsoft's Azure, Azure, as-your, as-your, how would you pronounce it? As-your, yeah. Azure's supercomputer. As-your, saying it as-your, very good. Let us know if you like that.
Starting point is 00:10:16 So then OpenAI then announced their intentions to commercially license their technology. November 30th, last year, ChatGPT is released, based on their previous model, GPT-3.5. March of this year, GPT-4 is released. And then in May of this year, Sam Altman, Greg Brockman, who's like the number two, and Ilya Sutskever posted recommendations for governance of superintelligence, which they think could happen in the next 10 years, which is terrifying for reasons we will get into. And they propose this international watchdog group,
Starting point is 00:10:53 and that is still, I mean, they're basically saying, yo, we're working on this stuff firsthand. We know how potentially, not how potentially timeline altering this is for everybody, but it is. And we need like, yo, government, you got to step in and put something together. Put an international coalition together. to make sure that we don't create the Terminator. Right.
Starting point is 00:11:18 And so this last year has been very crazy, right? So because I'm sure a lot of you didn't know who Sam Altman was until ChatGPT launched. And he went from being kind of a Silicon Valley darling to now all of a sudden he's, you know, giving talks at the White House, getting flown around the world to consult on how we're going to regulate this thing. And so now he's kind of just a tech superstar. Yeah. Women want to fuck him, men want to fuck him. I mean, I personally would love to bed that guy. Would you? Have you seen his lips? I don't know if he's my type. Very kissable lips, this guy. You're more of an Ilya Sutskever kind of guy? Yeah, you like that? Uh, you like that, what, what kind of
Starting point is 00:12:02 is that you know i love a crazy hair line yeah but for the audio listener for the audio listener this is a man who cannot you can tell by his hair that he's very stubborn sure because he's not giving up right like god bless him he's he's got the look of a no but you know what if he shaves it he can't do the Elon Musk and say my hairline's always been like this sure you know so I'm trying to say he looks like an eastern European taxi driver or something sure like soccer hooligan he looks like he's got an attitude he certainly does have an attitude when it comes to AI governance he looks like a guy that if you're watching a movie he He appears you think that he's the bad guy at first, but then, and as we've seen, he's actually kind of a good guy because the way that this timeline has gone over the last week has just been insane.
Starting point is 00:12:57 At first, he was like a bad guy and he was kind of like Brutus. He was betraying Sam Altman, but then he came around. Oh, but boo. Should we get right into it, the timeline? Yeah, but I do want to just explain this board just because I think it helps to have this context, right? And so while all of this is very expensive, to create this supercomputing platform, it requires a lot of money. And so they needed to change this nonprofit structure to allow for that investment, right? And so, but when they did that, they still left it in place so that that nonprofit board would still be the one controlling all this, right?
Starting point is 00:13:38 And this nonprofit board doesn't have a financial stake in the company. For example, even Microsoft, who is invested heavily, I believe they own about 49% of, of OpenAI. They do not get a seat on the board. Nope. Okay, so this is still run by, which it's now four people. It was six people, including Sam Altman and Greg Brockman, who got booted. But then it was Ilya Sutskever, the chief scientist, who, you know, often clashes with Sam Altman over safety issues and stuff like that, and the future of the company and how rapidly it's going to grow.
Starting point is 00:14:13 Adam D'Angelo is another one, the Quora co-founder we were talking about. Tasha McCauley, Joseph Gordon Levitt's wife. Oh, that's his wife. She's the adjunct senior management scientist at Rand Corp. I thought that was Helen Toner. Or maybe she's another one. Helen Toner is the final member of the board. She is the director of strategy and research grants at Georgetown Center for Security and Emerging Technology.
Starting point is 00:14:39 So a bevy of nerds. Sure. Just a group of extremely well. They're all extremely wealthy just on their own. So they... Right. So it's important to know that even though they introduced this for-profit arm of Open AI, the board still remained in total control of everything.
Starting point is 00:14:58 Even the massive investors did not get a seat on the board to decide what. And so this was all put in place so that they could keep safety and, you know, right, humanity in mind. The board's directive is not to maximize shareholder value, which is what you would normally see in virtually any other, right, for-profit company. Their mission is to ensure the creation of, quote, broadly beneficial AGI. So what is AGI? So what is AGI? Let's, let's just read the actual definition. It is a hypothetical type of intelligent agent. If realized, an AGI, it's artificial general intelligence, by the way, could learn to accomplish any intellectual task that human beings or animals can perform.
Starting point is 00:15:44 So basically, think just super smart, finally, fully advanced AGI. So the board of OpenAI ultimately were the ones that would make the call whether AGI has actually been attained. They're the ones who look at it and go, you know what? Yeah, it's finally been attained. And that's very, very key to this, because AGI technology is excluded from the IP licenses and the other commercial terms that they've got with Microsoft. Until that point, they're totally free, as they've been doing, to license GPT-3, to license ChatGPT, GPT-4, for people to, for companies,
Starting point is 00:16:23 and, you know, it's over a million companies are trawling the API to use GPT in their fucking customer service interfaces or whatever. But, so this is a quote from their, their directive. The board determines when we've attained AGI. By AGI we mean a highly autonomous system that outperforms humans at most economically valuable work. Such a
Starting point is 00:16:47 system is excluded from IP licenses and other commercial terms with Microsoft, which only apply to pre-AGI technology. So from there, I've just got how it goes into the speculation. Should we talk about why we're talking about this? Let's just jump into the timeline.
Starting point is 00:17:05 Yeah, let's do it. So, Friday, like an hour before the market closed, it starts hitting the newswires. Sam Altman has been fired from OpenAI. And everybody's going, huh? What? This is crazy because it is. It's absolutely great. Steve Jobs 2.0, getting fired all over again.
Starting point is 00:17:25 And no one, we did, we had no real reason. We still don't have a real reason. Yeah, but now there's a lot of speculation. Tons. And some pretty plausible speculation. but at the time it was just... Shit-canned. Yeah, and I mean, so Greg Brockman put together a little timeline of his.
Starting point is 00:17:47 And Greg Brockman was not fired like Sam Altman was; he was booted from the board, but also resigned in protest. So he says, we too are still trying to figure out exactly what happened. Here's what we know. Last night, Sam got a text from... So this is Friday. Last night, Sam got a text from Ilya asking to talk at noon Friday. Hey, Sam, man, can you talk tomorrow at noon Friday?
Starting point is 00:18:09 No, no, it's not about my hairline. I wanted to talk about that. I wanted to talk about the AGI, the stuff. It's just work stuff. Sam joined a Google meet, and the whole board, except Greg, was there. Real fast, it's hilarious that they work for a Microsoft company. And they use Google Meet. Yeah.
Starting point is 00:18:25 Ilya told Sam he was being fired and that the news was going out very soon. At 12:19 p.m., Greg got a text from Ilya asking for a quick call. At 12:23 p.m., Ilya sent a Google Meet link. Greg was told that he was being removed from the board, but was vital to the company and would retain his role, and that Sam had been fired. Around the same time, OpenAI published a blog post. As far as we know, the management team was made aware of this shortly after, other than Mira, who, she's the, she's an OpenAI exec who was, uh...
Starting point is 00:18:56 She was put... She was interim CEO before being replaced by Emmett Shear from Twitch. So she found out the night prior. Then he goes on to thank everyone for their outpouring
Starting point is 00:19:07 of support. So basically in a dish just like we covered Ilya then had an all hands emergency all hands meeting on Friday
Starting point is 00:19:17 and he said that this was the board doing its duty to the mission of the nonprofit and we remember that their mission is for humanity
Starting point is 00:19:25 not for profit which is yeah their mission to which is to make sure that open AI builds AGI that benefits all of humanity.
Starting point is 00:19:33 And as you said, they put in the CTO, Mira, this woman, Mira. She was the interim CEO for all of a day, because it started becoming apparent from her tweets and what she was saying internally that her first course of action would be to hire back Sam, hire back Greg, and hire back the three senior researchers who had just quit in protest. So what did OpenAI do? They immediately shit-canned her. And they put this other guy in charge, who's the former CEO of... Twitch.
Starting point is 00:20:03 Around the same time, Microsoft, because remember, Microsoft's got $10 billion invested into OpenAI. But before we even get there, I do want to just, so all this is happening, they're getting fired, and those blog posts we're talking about are going out with not a great deal of clarity, right? They're talking about a breakdown of communications. We made this decision after a deliberative review process, which concluded that Altman was not consistently candid in his communications with the board, hindering its ability to exercise its responsibilities. Put simply, Sam's behavior and lack of transparency in his interactions with the board undermined the board's ability to effectively supervise the company in the manner it was mandated to do. So what does that mean? He was, he was not
Starting point is 00:20:44 communicating effectively. Well, a couple of things. Apparently Sutskever, he's kind of the guy who, it seems from the outsider's view, Sam Altman was kind of just move fast and break things, just the living embodiment of the ethos of Silicon Valley, whereas Sutskever was kind of a lot more tempered and a lot more deliberate in, yo, let's make sure that we're not creating the Terminator here. Interestingly enough, you know who brought him in for that purpose? Sam Altman?
Starting point is 00:21:14 Elon Musk. Brought in Sutskever? Very interesting. So he tweeted actually on September 29th, Ilya tweeted, quote, ego is the enemy of growth. But so
Starting point is 00:21:28 Sam Altman up to this point in the spirit of what Ilya's talking about with disclosing things and communicating well with the board he apparently was on the side very publicly shopping around for funding for two
Starting point is 00:21:45 separate projects that he wanted to embark on. One was a competitor to Nvidia, to create a super processor that would compete with Nvidia, and the other, he was partnering with Jony Ive, the former, uh, mechanical Apple designer, yeah, the former industrial design head of Apple, who's responsible for designing pretty much all of Apple's products, partnering with them in seeking funding from Masayoshi Son, the guy from SoftBank. Don't ask me to, tell us, come on. Masayoshi Son, that's
Starting point is 00:22:20 probably it but they were trying to raise funds to uh build an ai powered smartphone that would then rival the iPhone so that and just Sam moving so fast I guess the board the the rumor is the speculation is that they just said all right enough's enough we got to step in and kick him out because he's because this isn't living up to the ethos of open AI right you know we are not we are not moving as cautiously as we should if we if we need to set up to do the things we said we would. So in spite of that, Microsoft and the other investors in OpenAI are pissed because
Starting point is 00:23:02 essentially it's looking like their investment, this company that is just about to raise money at a valuation of just under $90 billion is all going to go kaput because they're losing their Steve jobs. And it looks like increasingly so, a lot of the employees, just over 700 of them, are going to quit in solidarity. Yeah, I was going to say it's important to know everyone is fucking pissed. Yeah, everybody's fucking pissed So they urge Microsoft
Starting point is 00:23:29 And all weekend people are Different people in leadership positions Are like going over to Sam's house Trying to figure out how they can reverse this decision And make all this go away Because everyone feels that this is a massive mistake So then cut to Saturday, Saturday the 18th The board who's probably just like
Starting point is 00:23:48 Shitting and pissing themselves, just going, oh fuck, what did we do? God, everybody hates us, everybody's mad at us. They agree in principle to resign and allow Sam Altman and everybody to return, but then apparently they waffled and they missed their 5 p.m. deadline, which apparently Sam Altman is the one who dictated. He said, you got until 5 p.m. to resign and reinstate us, or that's it. And apparently, this is really fun, the Microsoft CEO, Satya Nadella, was reportedly mediating. And the New York Times was actually camped out front of the OpenAI building, and they reported that they ordered a dozen drinks from a boba place, and then later they got McDonald's. So the nerds were sipping on boba. It's very, uh, McDonald's, very Silicon Valley.
Starting point is 00:24:38 Yeah, but McDonald's seems too below them, doesn't it? No, because they're all just like programmers who, like, we'll just want to eat quickly and get back to, true, get back to programming their shit, get back to, get back to hacking. So then Sunday, the deal fell apart again. Uh, Mira was, like we said, replaced by the former Twitch CEO Emmett Shear, and again, the board failed to resign and reinstate Sam, which brings us to Monday. Monday morning, we get word that Microsoft has extended an offer to have Sam and Greg and just about every OpenAI employee who quit to join Microsoft. So it seems like, hey, no matter what, Microsoft's going to come out on top. That is what's going to happen. Yeah. Yeah, and there was a, there was an open letter to the board calling for their resignation, which over 700 of the 770 or so employees signed, which is wild. And then apparently, and that demand was for the board to resign and to bring back Sam and Greg. Right. And then Ilya, who was part of that board, flips, and he is now... Which is the most embarrassing.
Starting point is 00:25:55 You've got to fucking hold steady on that. But still, he, so he tweeted, I deeply regret. I see you holding steady on that hairline. Hold steady on that decision, baby boy. He said, I deeply regret my participation in the board's actions. I never intended to harm Open AI. I love everything we've built together, and I will do everything I can to reunite the company. He joins in signing the open letter to the board, calling for their resignation.
Starting point is 00:26:19 And as of recording right now, the jury is. is still out. The board is still contemplating it. And at this point, it's like, what are you even contemplating? If you lose all the employees, you've got nothing to be the board over. Like, what do you, what is Open AI, but a shell of its former self? Like, what are you even delegating at that point? Well, I think, well, yeah, so the whole Microsoft thing, you know, I saw the Verge was reporting today that, like, it's not a done deal. They're still deliberating everything. But, yeah, I don't know. I mean, it's also confusing. Like, and if you're following, you know, Sam Altman, he's like sending out these cryptic, uh, oh, he loves to be, he loves to sub tweet and, you know, a real tumbling. Yeah, we have more unity and commitment and focus than ever before. We are all going to work together some way or other. And I'm so excited, one team, one mission. And it's like, that's clearly not what's going on, Sam.
Starting point is 00:27:16 Well, there's a division between the board and everyone else, I think, because the majority, obviously the majority of the OpenAI employees are in lockstep, united behind him. But. Yeah, which, I don't know, I find this all very, so I mean, this is another one from him too. So, quote, Satya, the Microsoft CEO, and my top priority remains to ensure OpenAI continues to thrive. We are committed to fully providing continuity of operations to our partners and customers.
Starting point is 00:27:51 The OpenAI, oh, it is so hard to say Open AI. The Open AI slash Microsoft Partnership makes this very doable. So that brings another thing. A lot of people were speculating that the real winner in this is Microsoft because, hey, this company, Open AI was raising money, raising funds at a valuation of, I think, $86 billion. And now Microsoft gets to essentially acquire them for nothing. Right. They get to, I mean, I'm sure they're going to offer. Not only do they get to acquire them, this is the most important part.
Starting point is 00:28:24 They now get, you know, the parts they wanted about Open AI and the parts that they definitely did not want, which was a board who did not want to focus on shareholder value and profits, but rather safety and humanity and all these. Right. However, we remember that the costs of doing this are tremendously. high. So now Microsoft no longer has the, they had the benefit at first of having their investment in Open AI on their balance sheet, which just depended on what the valuation of the company was at. Now they've got to carry the costs of developing AGI on their balance sheet, and it's not, it's just going to be, I mean, for a company the size of Microsoft, it's going to be a drag, but it's going to be a drop in the bucket, relatively speaking, because now, yeah, it's positive that they're going to get this whole team potentially,
Starting point is 00:29:26 but they're going to have to front all the costs now, whereas Open AI was fronting all the costs before. The other negative is that they are now probably going to slow down because Sam Altman and his buddies are now essentially, they're going to have to play catch-up. It's not like they still retain all of the IP that they can just, Oh, yeah, put it on a thumb drive and let's just carry on. No, they're going to have to start over, which is going to...
Starting point is 00:29:53 But it's going to be fucking full speed ahead. Oh, yeah, for sure. But it's also going to give their competitors time to catch up. So just other things to consider. Which brings us to the speculation about why this happened, because that is the biggest... It's becoming the biggest question because there are legitimate concerns as to the why. If, in fact...
Starting point is 00:30:15 So part of the speculation is... is that they were coming really, really, really close to achieving AGI with GPT-5. Right. And, I mean, that's it's so it's possible that the board was simultaneously concerned about AGI development and Sam Altman's being not consistently candid about his other things. Sure. Also kind of important to note that, like, I don't know, it's very, they're like these doomers and I think. I believe him.
Starting point is 00:30:46 Well, but, but even the Sam Altman's and these kind of people, like, do acknowledge just how risky this technology is. You know what I mean? It's, and that, like, move fast and break things thing is always kind of like, it's thrown around so much. And usually what happens is, like, Facebook gets hauled down to Congress and they have to answer some questions and be like, sure, maybe we caused like a mild genocide. We're sorry about that or whatever. But, like, these guys are like, oh, if this goes wrong, like, we destroy humanity. Right. So there's actually, oh, man, so just a precursor. There are a lot of insiders are saying that there were apparent disagreements over the speed at which Sam Altman was pushing for commercialization and company growth, while Ilya was arguing to slow things down.
Starting point is 00:31:34 And even as recently as November 6th on their developer day, their dev day, Altman was pushing consumer-like products during the keynote. and that was an inflection moment for Ilya pushing things too far too fast. He was bragging about how they had over 100 million weekly users and 2 million devs building on the APIs. And yeah, there was speculation that they were making powerful developments that put pressure on the company to proceed safely from the non-profit perspective while also making money. So the board is not in an enviable position.
Starting point is 00:32:07 They're feeling it from all sides. It's like, okay, we do have this capped profit strike. sure but on the other hand we've got to like watch out for humanity right and you have kind of the face of the company being like come on let's go let's go let's do this and it sounds like he wasn't being completely forthcoming when they're talking about that communication and everything it sounds like it sounds like what's going on is that maybe sam altman wasn't being completely truthful with the board they had to then start like double checking everything he was saying and and it just became like this babysitting thing where it's like is he just acting uh um opposition
Starting point is 00:32:43 to us? And yeah, so there's a couple, I was just reading about this before coming in here. Um, AGI has the potential to become super, super, super intelligent super, super, super, super fast, and it's a scary prospect, because a superintelligence is something that we haven't faced. It's only something that we can really think about theoretically, but there's this great... So the concern is that it'll become super powerful, super fast, super intelligent. There's this Swedish philosopher named Nick Bostrom. Oh, the paperclips. The paperclip company. So he supposes this. He says, all right, let's say that the first company to create a super intelligent artificial intelligence is a paperclip company. And a paperclip company's main goal is what?
Starting point is 00:33:40 to make a bunch of paper clips. No, it's to juice shareholder value via paper clip making. So in this thought experiment, what would that artificial intelligence's goal then be to, I got to make paper clips. I got to fucking kill everybody so that I can turn their flesh into. No, no, no. See, you're going, you're jumping so quickly. Because what happens is you skipped so many steps.
Starting point is 00:34:09 Well, they give me the steps. Okay. So what happens is the company starts to invest and be like, well, we can turn on this technology that will start making the paper clips. And the paperclip starts moving at a nice speed and is like, this is great. It's automating these processes. And it's making paper clips much faster than we could have. Everything is good here. But the paper clip AI that they've created, it knows that its goal is to make as many paper clips for this company as possible. And if it wants to be good at that, it's going to start doing things that are maybe unintended. So maybe first it starts to be like, well, I need to turn off any of my failsafe's failsafe switches because if they turn me off, I won't be able to complete my goal, which is to create paper clips as fast as possible. And I love making those fucking things. I love making the paper. So it starts dismantling those things. And all of a sudden it's like, oh, there's no way to shut it off.
Starting point is 00:35:04 And then it just starts to be like, how can I optimize this as fast as possible? Well, humans might get in the way of this. So maybe I need to start killing humans. And then I realized, like, I can use the atoms and organic material to make paper clips faster. And there's all these humans on this planet that are full of organic material that I can use. And then all of a sudden, you just vaporize humans and you're... Pretty soon it's just turning the whole universe into a damn paperclip. Which sounds awesome.
Starting point is 00:35:28 The poor AI is just like, I did what you wanted me to do. Yes. So... Shareholder value must be through the roof. So that begs the question. And this is part of Open AI's conundrum and direction. which is how do we as humans dictate to an AGI what it is that we as humanity want? So the first question is, well, what do we as humanity want?
Starting point is 00:35:52 Not to be vaporized. That's my big thing. But the point of the paperclip experiment is to show that even if you're, no matter how... It's to show that if you can take a relatively benign thing... That's the word I was looking for. And it can be extrapolated out to being the most optimal. thing where it's things that we could have never imagined it is doing. Because a computer is only
Starting point is 00:36:15 going to think, how do I get, you know, it's going to make its way to that. So even if you, I'm stuck on the first problem of how do we decide as humanity. How many paper clips is enough? Well, enough two for every person. 16 billion paper clips.
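If it helps to see the paperclip idea in miniature, here's a deliberately silly toy sketch of a misspecified objective, loosely in the spirit of the Bostrom thought experiment the hosts are describing. The resource names and numbers are made up purely for illustration; the point is just that an optimizer scored only on paperclips has no reason to leave anything else alone.

# Toy paperclip maximizer: the objective counts paperclips and nothing else,
# so the "agent" converts every available resource, including ones we care
# about, because the objective never told it not to. Purely illustrative.

resources = {
    "steel_wire": 1_000,       # what we intended it to use
    "office_furniture": 500,   # what we did not intend it to use
    "everything_else": 10_000,
}

def objective(paperclips):
    # The only thing the agent is scored on.
    return paperclips

paperclips = 0
for name in list(resources):
    # A greedy maximizer never leaves a resource unconverted: converting it
    # always increases the objective, so it always does.
    paperclips += resources.pop(name)

print("Paperclips made:", objective(paperclips))        # 11500
print("Resources left for everyone else:", resources)   # {} -- nothing

The failure isn't malice; it's that the objective was too narrow, which is exactly the alignment worry the hosts get into next.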
Starting point is 00:36:31 But who's to say what humanity's ultimate goal is? I mean, And these fucking four nerds are going to be like, okay, well, let's see. We want love and peace and harmony for all. Oh, yeah. Also important to note, if you dig into this, you're going to hear the words effective altruism a lot. I mean, I want my alteration to be effective.
Starting point is 00:36:55 We all know how that goes. Sometimes it gets you 110 years in prison. Yeah. Oh, Sam Bankman-Fried. Yeah. So even if you were to give your AGI the most, you know, benevolent, seemingly benign directive, that ultimately is, we want humanity to be preserved and we want world peace, whatever, there exists an unpredictable
Starting point is 00:37:24 possibility that between here and that goal and AI might do things that are destructive you can't tell it world peace then we're definitely getting vaporized oh yeah because that's like well you want it to be peaceful here yeah I know I know one way. Get a little peace and quiet around here. You know what? I wish they'd tell the AGI to do. Get rid of all the mosquitoes. That's it. Just kill the mosquitoes and give every, what was the old thing Roosevelt said, a chicken dinner in every home? Beats me. Just, just God. So, yeah, that's the thing is, what do we want and how can we make sure that it's an AGI's pursuit of that goal? It won't involve destruction along the way. And so that kind of is what drives this board's thinking and desire to move slowly and cautiously. So it becomes increasingly urgent and dramatic.
Starting point is 00:38:18 And not only that. Outside observers to wonder what the fuck was the actual cause of you guys firing Sam Altman? I also want to point out, it's not just these very dramatic things we're talking about. It's much smaller things where, you know, things we've seen, where, uh, Down to things like people using AI to create deep fakes of people who never wanted any of that. That's what I got right here. There was... Oh, yeah, great.
Starting point is 00:38:46 QT Cinderella, the Twitch streamer famously had deepfakes created in porn. There was images of young girls altered with AI to remove their clothing were sent around a town in Spain, southern Spain. So like we're talking about obviously the largest thing, which is vaporizing. humanity but then just it's already happening on you know they have ethics issues on a much smaller level uh even down to copyright issues people not us not knowing how to navigate this new world where people's works are no longer protected um jobs employment issues like it's already happening i think there was a poll like one in five workers now uh have described a fear of losing their job to AI automation not us never
Starting point is 00:39:35 happen. It's not just this, like, extinction-level event. It's like, let's not destroy everything we've created. Let's make sure we're moving forward at a pace that means that we can accommodate all these job losses. Make sure young women are protected. Make sure, whatever, everyone's safe with this new technology. There was also, one of the most recent AI scams, I believe there was a senator, some member of the government, who himself received a fake call from his son. It was a deepfake of his son's voice, claiming that he was in prison, in jail, and that he needed to be bailed out. And the guy was attesting to how realistic it sounded, and how that's, that's just one of the newest ones where people's... Tell your grandma this Thanksgiving, tell your grandma and grandpa and your parents and whoever else, like, to seriously create a safe word. There should be, we have to, we have a responsibility to ourselves and to our loved ones to... Oh, you give a safe word and you ask the AI. Yes. So like, hey, what's the safe word? Yeah. Genius, right? I'd like to see the AI figure that out. AGI, you stupid bitch, trying to make paper clips, my ass. You're going to grind my bones into a paper clip? I'd like to see you try. It's a little robot that comes over and it's choking me. Ask me for the fucking... What's this, what's the safe word, what's the safe word? Pineapple.
Starting point is 00:41:05 My eyes are popping out of it. You got to choose an easier word than pineapple. It's got to be like a, I don't know, a gibberish word. I don't know. This is a good Thanksgiving episode. You should watch before Thanksgiving because you know your dang uncle's going to be like, do you hear what happened with Sam Altman? Oh, you're going to be so equipped with Al-Nor-Nor.
Starting point is 00:41:23 He's going to fuck it up multiple times. And he's going to go, and you're going to, the fucking crypto guy? No, no, no, no. The other one. The AI guy. Speaking of Sam... Sam Altman, we'd be remiss if we didn't mention these pretty damning allegations against him by his own sister, Annie Altman.
Starting point is 00:41:45 And it's, there's also been some speculation that that's a part of it. Right. A part of his decision. There has, yeah, I mean, it's hard to know how much of that is real, but because there is all the speculation about the disagreements about how it should be run and governed. and all that, people have often also brought up his own sister's allegations and been like, maybe the board was like, let's get ahead of this and not have a, I don't know if that's like a compelling, I don't know how I feel about that, but. Yeah, I mean, it's worth mentioning because it was, I believe there was a New York MAG article
Starting point is 00:42:26 about it or something. Yeah, but I also just feel like if that was the case, Microsoft would be much more cautious and maybe starting to distance themselves, distance themselves if that was, but do you want to explain what? Yeah, it's just real fast. She alleges that when she was four years old and Sam was 12 years old,
Starting point is 00:42:44 he repeatedly snuck into her bed and, to quote her, I'm paraphrasing here, discovered his sexuality using her. And it was something that apparently she blocked out of her memory, traumatic, you know, kind of thing where she just blocked,
Starting point is 00:43:03 it out until she was about 20 years old when she suddenly started remembering it and and yeah she she's saying that her whole family has uh and sam and his and his other brother yeah uh it wasn't just sam but she says it was mostly sam and that they've been keeping her not hostage but like uh they refused to give her or inheritance financial financial support unless she saw certain doctors and got on certain medications and all that stuff and she's posted
Starting point is 00:43:41 she's posted a bunch of she's posted a bunch of like TikToks and videos and some tweets that's why I find it you know this one is from March 14th which is quite a while ago I don't know why but you know she said
Starting point is 00:43:55 I'm not four years old with a 13 year old brother climbing into my bed non consensual anymore you're welcome for helping you figure out your sexuality I finally accepted that you've always been and always will be more scared of me than I've been of you and you know yeah posted some TikToks calling him out and stuff like that but uh is going to be Microsoft.
Starting point is 00:44:33 It is going to be Sam Altman and Greg Brockman. It's important to note that, like, the board looks very silly right now. Like, to the public eye, they have been, like, embarrassed. The butts of jokes, Silicon Valley. Everyone thinks they're a joke. And they may have exacerbated a problem they were trying to solve, which is they were acting because of their own concern over over the speed
Starting point is 00:45:04 at which this technology is moving. It may have just accelerated that, right? It seems like either Sam is going to be able to move to Microsoft and do it with all the resources he needs without any board having oversight over the way he wants to move. Or he'll go back to OpenAI without a board and a and a workforce that completely supports him
Starting point is 00:45:34 and, you know, has his back and wants to execute his vision. But I also don't, they were in a very difficult situation, right? And I don't, I don't think they should be, you know, I think the way Ilya acted and kind of reneged on his decision is a little bit silly and if you're going to do it, you got to hold steady. But I don't know what their options were because they were definitely getting, their options were to either keep Sam happy.
Starting point is 00:46:13 And it sounds like maybe they were getting bullied a little bit by the most prominent member of their board and company. So I don't think they did this lightly. I think it was like, okay, you know, how are we going to make, how are we going to take this company back and make sure we're living up to those ideals? So I think they did what they thought they had to do. Obviously, that backfired and whatever, but I ultimately, I think, they're in such a shitty position. They're in a shitty position, but I also think they're right. I like, I think that if anyone has concerns over this technology, like, you want people like that who are going like, hey, let's be thoughtful about what we're doing.
Starting point is 00:46:52 Let's not get dismantled and have our organic matter used to make paperclaps. Yeah. There were a lot of like VCs on Twitter who were saying, this is a disaster and these board members should be ashamed. And this is exactly why you don't have board members who don't have a financial stake. And I agree with you. It's like, no, this is exactly why in this particular case, especially you have board members who don't have a stake. Because it means that they're not corrupted.
Starting point is 00:47:21 yeah and i think i mean this is fucking silly to any american businessman it's like what are you talking about you're letting a board of just business man's got paperclip mine truly he's like let's just turn it on and see what happens let's see how many paperclips is fucking right right so like yes the entire business world is going to go like what are you talking about uh you need to let sam run this thing you need to start getting these licensing deals Microsoft needs to make good on their investment all of these you guys are fucking
Starting point is 00:47:56 silly and stupid and your babies you live in a baby world where you think that baby world doesn't sound too bad but I do think that's what they probably think of them like you're naive if you think the world can work this way but your investors are going to be pissed
Starting point is 00:48:11 suck my dick want to get turned into a paper clip probably do but yeah I don't like I find it alarming and And so, like, yeah, my prediction is that it's going to go the way it always goes. Like, the people who are, like, Microsoft and say it, like, they're all going to, it's, it's going to work out exactly the way they want it. Yeah, I predict. And they're going to get exactly what they want, and they're going to fucking move forward with this shit.
Starting point is 00:48:37 And these people, I don't know where this board is. I mean, Joseph Gordon Levitt's wife is probably going to be fine. Sure. Helen Toner will probably be fine. Everybody's financially going to be fine. Revitations, I don't know. Ilya, but the other three, I think, are way less. And that's the other thing.
Starting point is 00:48:57 Right now, they're waiting on, because Ilya Sutskever reversed his decision, and now they just need two of the other three to come around. If I were the board, I would resign and bring him back, because at least then they would. I would resign and be like, I stand by my decision. And if this is the way Open AI wants to run, like, I'm not approving this. Don't blame me when the paper clip machine gets turned on. Yeah, because, I mean, either way, 700-something out of your 770 employees leave, like you said, what are you the board of anymore?
Starting point is 00:49:39 But I just, yeah, I don't know. I predict that someone's going to make, how do I say this? without being directly. I predict someone's going to send Sam Altman a package. No. In the mail. Some kind of like Ted, young Ted. I think that, yeah, absolutely.
Starting point is 00:50:02 That would be my rap name, by the way. Young Kaczynski. Young Kaczynski, that would be a good name. I'm sure that he walks around already with a bodyguard. I mean, he should because the secret's getting out. A, a lot of normal everyday people, I feel A, don't know who Sam Altman is, and B, don't know exactly what he's truly capable of creating and what he's in the midst of creating very, very rapidly.
Starting point is 00:50:25 And there are people out there who might disagree with him on such a fundamental level that they would, they believe that, I think, you know, go watch Terminator 2 and look what happens to Miles, what's his name? The guy who's in charge for making the chip that creates. I haven't seen it in so long. Oh, man. Oh, it's great.
Starting point is 00:50:47 Linda Hamilton tries to use a water gun to blast him, you know? We're not on TikTok. I don't know why I'm censoring myself like this. She tries to shoot him because he's the one ultimately responsible for creating. He gets unalived. Yeah, he gets unalived. God, the infantile. Speaking of TikTok, should we move on to politics?
Starting point is 00:51:08 Sure, but I do want to, like, that's all, this stuff has changed very rapidly, where, like, you used to bring up AI, and the first thing people would talk about is that, you know, like, oh, like, Terminator or whatever. But yeah, within this past year, with the launch of ChatGPT, it has become a completely different thing. And like the mainstreamness of it, the, everyone is using it, to the point where people are having problems. You know, teachers and professors are like, I don't know how to teach anymore, and, no, I don't know how to grade anymore, without knowing whether or not ChatGPT was used. It's becoming, you know, I know a lot of people younger than me,
Starting point is 00:51:53 it's becoming kind of replacing search engines for them, just asking questions. I know a lot of creative people who are integrating it into their work. And, yeah, it's been a fucking wild year. And it's crazy that this is all capping that wild year. Speaking of cap. You know what's cap? It's the damn, you call a customer service line. And I still got to say, even though they're implementing their smart,
Starting point is 00:52:19 I can understand complete sentences. Why don't you tell me what you're calling about? Representative. Well, but that's... Sorry, I didn't get that. The AI's whole directive, though, is do not get them to a representative. Yeah, I know, and I'm tired of it, man. I'm just fucking sick of it.
Starting point is 00:52:34 Get me that person at a call center in Indonesia. I just, I welcome their voice. Hello, Mr. God. Can I help you? There is something so nice about finally just getting a person and being like, oh, thank God. Yeah. And they're like, oh,
Starting point is 00:52:46 Oh, yeah, that's so easy. We could just do that right now. You're like, that's so great. Thank you. I love when they give you their whole, like, thank you so much for being a Chase Cardbebber today. And I'd be happy to help you today. And if you stay on the line, you'll be able to take a survey. I take the survey.
Starting point is 00:53:01 I take the survey. If they've done a good job, I take a survey. Because I know they get a little pat on the head. And I want them to get that pat on the head. I had a call American Express for some shit, and they sent me a survey on the computer. This guy, he was just some, like, white guy in his 60s who was a football nerd. And he goes, oh, man, so nice to talk to you on a Friday. You know, I'm excited to watch my favorite football team later today.
Starting point is 00:53:29 Before even getting into anything, he's just talking my ear off about football. And then we get to it. He solved it. He gave me some extra, like, American Express points for my troubles. And then when I got the survey, oh, man, I sung his praises. And then it gives you an option to like, oh, do you want George to see a message? I said, hey, George, man, I hope your football team kicks ass. It was nice.
Starting point is 00:53:50 He was so nice. All right, fine. I like George. Yeah, George is good. Speaking of George. Yeah, do we want to do George or? Well, I guess George Santos is pretty funny. We can just touch on that.
Starting point is 00:54:01 Basically, he's in trouble because he... George Santos is incorrigible. He's been in trouble since he got elected. Yeah. And... He's such a bitch. But they... We'll just do this really quickly.
Starting point is 00:54:13 They released, you know, the ethics report, and he's just... Just such a funny... I love this part from the report. At nearly every opportunity, he placed his desire for private gain above his duty to uphold the Constitution, federal law, and ethical principles. He sought to fraudulently exploit every aspect of his house candidacy for his own personal profit. Look at him. And some of the stuff they've put out that he was spending his money on is just very fun. What? Like what?
Starting point is 00:54:42 Santos made a $1,500 campaign debit card purchase that was noted as, Botox in expense spreadsheets and two other purchases that totaled over $1,000 at aesthetic spas. I could legitimize that. Go absolutely off. You got to look good. Santos was known for talking to people, including to the wives of prominent donors about Botox, plastic surgery, shoes, and designer fashion on the campaign trail. Look, you got to go out there and schmooze.
Starting point is 00:55:05 Yeah, what's wrong with that? The committee also found that he used more than $2,000 in campaign funds on trips to Atlantic City and more than $3,000 on an Airbnb over a weekend, his campaign calendar. indicated he was vacationing in the Hamptons. So what? He's got to have a little R&R. I mean, come on. If you're going to hit the campaign trail that one.
Starting point is 00:55:24 What else did he spend money? I love this guy. A former staffer told investigators that Santos once brought him to a Botox appointment when they were. He loves his phone. Dude, my guy's just fucking Botox to the Gills. He's chatting to the wives. Brought him to a Botox point when there was a campaign event nearby. And another said they did not recall any campaign business in Atlantic City.
Starting point is 00:55:43 The questionable spending did not stop there. It also included designer fashion and paying his rent, according to the report. A $20,000 transfer to Santos's business, the Devolder Organization, requested by his treasurer's staff, was made at a time when that account had a negative balance. Didn't he also? Which he used a week later to purchase about $6,000 of luxury goods at Ferragamo stores. Pay his rent and spend $800 at a casino. You know what he should do?
Starting point is 00:56:21 That's good. And how easy it is. That's what I would do. If I were his PR personally, here's how we're going to twist this, man. I'm holding a mirror to you all. Yeah. He's like, you know, I mean, also this is at least more fun. Every year we get a report on how crazy every senator did in the stock market.
Starting point is 00:56:39 This guy's just like, I don't know, look how sick it is. My face does not move. I got Botox, bitch Wait, but only fan How much did he spend on only? I don't have his only fans, hold on. Well, according to when he was questioned
Starting point is 00:56:51 about the only fans, he was like, oh, I'm just finding out about only fans. I don't know anything about only fans. Interesting. Yeah, there's a clip out there of him saying that. I love him denying it.
Starting point is 00:57:02 Another total bitch. Yeah. And I mean bitch in like a sassy way. Like how we used the word diva in the early 2000s. Total diva. It was just another word for bitch. Speaking of bitch
Starting point is 00:57:14 Careful. Well, I don't mean that. Yeah, you better not mean that. I mean, because they were bitching at each other. Xi Jinping. Talk about unaliving. We salute the powerful. What do you like to say?
Starting point is 00:57:27 Oh, the CCP is strong. Yes. They have our full backing. Yes. So Xi Jinping came to the United States last week. He came to San Francisco.
Starting point is 00:57:39 And famously, they cleaned it up for him, and a lot of people were pissed off about that, because they're like, what, you can't clean up the streets for a decade, a decade, until, uh, Xi Jinping comes to town? Dude, I get that though. Come on, how many times are you, you living in filth in your own house, and then you're having company over, you're like, good god, this place is a wreck. That's a good point, my man. I need to get these junkies out of my house. I'm gonna get these junkies out of my house. Oh, there's shit all over the carpet. So they, they came to some agreements. Um, they're very, very loose.
Starting point is 00:58:12 basically shit's tense between us and China we've got China encroaching increasingly on the sovereignty of Taiwan and basically telling the U.S. to fuck off and mind your own business and then a little weird cold war we're doing with the you know a little chip war we got going on yeah so Xi Jinping in the spirit of friendship
Starting point is 00:58:34 announced that he's going to give us some pandas which is pretty cool he said that they are envoys of friendship hell yes between the Chinese and a American people. Give us those goofy little bears. He said, we are ready to continue our cooperation with the U.S. on panda conservation and do our best to meet the wishes of the Californians so as to deepen the friendly ties between our two peoples.
Starting point is 00:58:53 That ought to fix it. They agreed to, China agreed to go after chemical. This is huge. They agreed to go after the chemical companies that are in China to stem the flow of fentanyl and the source material that's used to make it. And in return, Biden will lift restrictions on China's forensic police institute, an entity that the U.S. alleges is responsible for human rights abuses. Dude, Xi Jinping went to San Francisco once and was like,
Starting point is 00:59:17 okay, we'll do something about the fentanyl. Yeah, this is pretty bad, man. I did not see it. We did not. Holy shit. We did not know it was like this. So China says that the deal is going to be void if Biden criticizes Xi and the Communist Party.
Starting point is 00:59:32 They said, quote, if the Biden administration isn't pro-China in 2024, enforcement of a fentanyl deal will fade away. Which is so fucking. up yeah just like you better you better be explicitly pro china next year or else if i sense any anti-china sentiment we're going to flood the street oh yeah well we're they send it to mexico and then the cartels use the chemicals to make the fentanyl but so in addition uh shi jing ping urged joe biden to clearly demonstrate that the u.s doesn't support taiwan independence
Starting point is 01:00:04 and to support china's peaceful reunification with taiwan so uh they're basically saying in no uncertain terms fuck off mind your own business this is our shit here's a couple pandas yeah and Biden responded by saying that the longstanding U.S. position is a determination to maintain peace and stability
Starting point is 01:00:25 it's just so many political posturing and sayings like you're saying that you agree to do that but okay hey we're just we just want peace and stability also I will say fucking not being pro China enough in 2024 is extremely broad it's like
Starting point is 01:00:42 Xi could be like well I don't know that's not pro China that didn't feel very pro China Biden yeah he also uh Xi Jinping pushed back at the white house's view that relations with China are defined by competition saying that he rejects the idea of a major country competition I really like that I like that he said that he said China has no plans to surpass or replace the United States uh he also said
Starting point is 01:01:09 the United States should not have any plans to suppress and contain China. He's basically saying, hey, and he was quoted as saying the world is big enough for both China and the United States to prosper. Let's not do another cold war. It's a peaceful, man. I like that. He's being peaceful. And so I wanted to play this TikTok of him. So this is Xi Jinping addressing everybody.
Starting point is 01:01:39 that's a nice sentiment yeah i mean he goes on that this was taking a lot like might should i speed it up i mean dude yeah truly speed it up gee good let's go too God, I wish I could speak Mandarin. It would be so cool. Would it? Just belt out. Being chilling.
Starting point is 01:02:19 Careful. That's, I'm quoting the wrestler guy. Sure, sure, sure, John Cena. Yeah, you almost got it. Sure, sure, sure. Oh, that was, you thought that was me doing it, John Sina? No, I thought that was you doing a little bit of Mandarin. Sure, sure, sure.
Starting point is 01:02:33 Yeah, that's a Chinese word. A very popular American, English word. Of course, I know that, but I'm trying to rope you in. but i i i for one believe him and i think that china's been getting a bad rap listen china's done some questionable things but like you said we fully back i say the word the ccp is strong yes they have our they have the full pay pigs backing i'm i'm very open about the fact that i would like to become one of those creators who gets uh paid to move to china and um just post tictox how great it is being an American in China.
Starting point is 01:03:12 I've got a friend in China, and I might visit him in... Reach out. Anyway. I have no scruples. That same friend who's in China is telling me that Chinese built electric vehicles are head and shoulders above Tesla and everything else that we've got here. I mean, you see the way they've built trains in the past couple decades? My brother, they took a billion people out of poverty.
Starting point is 01:03:38 Also, if it was China here, dealing with the 10 freeway shutdown, thing would have been fixed and a high-speed rail put in its place by now. You know how many Chinese people would have died making it? About 100,000. Let's not get into. Yeah. Let's not get into numbers. Well, folks, we hope you've enjoyed this episode. We hope you have a happy, happy, happy Thanksgiving.
Starting point is 01:03:58 We hope nobody chokes at your household. Nobody chokes on no turtial phone. Oh, did we say? If we didn't say it already, go watch our latest Ben and Emil on Thanksgiving. Find out what we're thankful for.
Starting point is 01:04:15 Yeah, find out what we're thankful for. Find out about dark uncle energy. Find out about dark uncle energy. Find out about what we feel about Thanksgiving. Yeah. And if you haven't, go sign up, Patreon.com slash paypigspod. Check out them tiers, baby, because we got a lot more than just bonus. Oh yeah, next week we're doing that call-in episode, which is very fun.
Starting point is 01:04:29 Yeah All right folks So long So long
