Joe Rogan Experience Review podcast - #456: Review of Roman Yampolskiy

Episode Date: July 15, 2025

For more Rogan exclusives support us on Patreon: patreon.com/JREReview. Thanks to this week's sponsors: Go to HIMS dot com slash JRER for your personalized ED treatment options! Hydrow: Skip the gym, not the workout—stay on track with Hydrow! For a limited time go to Hydrow dot com and use code JRER to save up to $450 off your Hydrow Pro Rower! That's H-Y-D-R-O-W dot com, code JRER, to save up to $450. Hydrow dot com, code JRER. www.JREreview.com For all marketing questions and inquiries: JRERmarketing@gmail.com Follow me on Instagram at www.instagram.com/joeroganexperiencereview Please email us here with any suggestions, comments and questions for future shows: Joeroganexperiencereview@gmail.com

Transcript
Starting point is 00:00:00 Discover the magic of BetMGM Casino, where the excitement is always on deck! Pull up a seat and check out a wide variety of table games with a live dealer! From Roulette to Blackjack, watch as a dealer hosts your table game and live chat with them throughout your experience to feel like you're actually at the casino! The excitement doesn't stop there, with over 3,000 games to choose from, including fan favorites like Cash Eruption, UFC Gold Blitz, and more! Make deposits instantly to jump in on the fun, and make same-day withdrawals if you win!
Starting point is 00:00:31 Download the BetMGM Ontario app today! You don't want to miss out! Visit betmgm.com for terms and conditions. 19 plus to wager, Ontario only, please gamble responsibly. If you have questions or concerns about your gambling or someone close to you, please contact ConnexOntario at 1-866-531-2600 to speak to an advisor free of charge. BetMGM operates pursuant to an operating agreement with iGaming Ontario. You're listening to the Joe Rogan Experience Review. What a bizarre thing we've created. Now with your host, Adam. Go enjoy the show. Hey guys and welcome to another episode of the Joe Rogan Experience Review.
Starting point is 00:01:14 the show is simple we review the Joe Rogan experience and This week or this episode, we are reviewing JRE 2345 Roman Jampolsky. Was it Jampolsky? Yeah, it's a hard name to get lips around. Not easy. Latvian dude. Yep. Dr. Jampolsky, AI safety expert at the University of Louisville,
Starting point is 00:01:45 author of books on AI's uncontrollability. It is safe to say he is not a big fan of where AI is going. What is it? Yeah, he says the artificial super intelligence is gonna bring this world down. More good news. Yeah. He's pretty, he's got his, um, he's all his cards are in that basket, by the
Starting point is 00:02:12 way, or if that's the expression eggs. Yeah. He does not have faith that we can survive AI. And many people don't. What are, what are your thoughts? From the limited knowledge you have about AI, where do you sit with this? Yeah, he's written like 10 books about this thing. He knows a ton about it, so I'm just kind of going with the experts on this one. I'd like to believe that we can survive the
Starting point is 00:02:49 computer consciousness, but it's suspect. Yeah. Yeah. I don't really know what to think about it. I mean, I'm hopeful. I agree with him that it's inevitable. I don't, it's inevitable that we're gonna get to its AI's AGI super intelligence version. Like we are going at it full speed ahead. Every country. But I don't know, I'm just, I guess I'm just hopeful
Starting point is 00:03:17 that we just find uses for it that'd be beneficial. He talks about safety quite a bit. So if we get the safeties in place, that I mean, we don't give them control of our nukes and stuff like that, and we don't give them all of our knowledge. I just don't think they're going to, the AI is gonna be inherently bad.
Starting point is 00:03:43 It's just gonna become a super tool that some stupid human can use to cause problems. And then I would assume that other people have other versions of AI that can kind of counter it. And that's more of the race. You know, it's like who's built the better fighter jet at any given time type of thing. So like, um, Elon's making that chip that goes in your brain, the neural link, and he's got these safety practices in place, so supposedly, but imagine the Chinese version of that, there's no safety practices in place.
Starting point is 00:04:18 They're abducting people off the street, probably jamming a chip in there, watching them, loading them onto the computer. So, Super control. So AI is like that for me, like safe as we are over here, air quotes, there's Russia, Iran, China, they don't have those constraints.
Starting point is 00:04:42 They want to beat us in any way possible. So you're saying that your biggest fear of all this is kind of like mind controlled by AI type scenarios. For like every one company doing it the right way, there's probably three that are just racing to the finish line for this stuff. Yeah. Whoa. Well, let's hope it's not the end of humanity within our lifetime. That seems like a pretty unfortunate time to have existed in. Did you watch that new Dune? No.
Starting point is 00:05:16 With Timothy Chalamet? Uh-huh. That whole movie is about supercomputers. The whole, the intro to the movie is, and the book is about super computers. The intro to the movie and the book is about super computers winning, humans revolting, machines running the world. We killed all the machines and that's why, and then these kind of like religions of the human mind
Starting point is 00:05:37 come to power, the Benny Gesserits and the Mentats and stuff. It's about the human regaining its thinking ability, killing the robots and all the AI. So we kind of are living in the introduction of that book right now. Yeah. Where computers and machines will rule us. Yeah, but that's just fantasy. I mean, Terminator 2 is like the robots are just wrecking us. Well, science fiction has often predicted the future. Yeah. I don't know, man. I'm on the fence, but look, I do believe that this guy knows a great
Starting point is 00:06:09 deal about what he's talking about. Um, you know, a bit of his origin origin story, um, from PhD in casino security and bot detection to AI safety kind of switched over to that. And that was that was really his first introduction into the potential concerns that it could have because it's AI is gonna be good at cracking codes and optimizing itself for that. And then looking into early AI poker bots. So that kind of created existential threats. I mean, it's interesting that you said that
Starting point is 00:06:50 because there is a lot of online poker and it would make sense that people have already set up AI bots to do it. I mean- I never thought about that. They- To win, right? Well, there's a skill factor, there has to be, because it's like often a lot of the same people Make it to the finals of those poker tournaments. So it can't all be just luck, right?
Starting point is 00:07:12 You know, it's not the roulette. It's not just the luck of the card draw. So surely you could train AI bots to Become good at cards and then I don't know, just milk all the online systems. Sneaky, sneaky, sneaky. And what's the big one? The big hack is hacking Bitcoin. If they ever found a way, you know, it would have to be a super fast computer with a super sneaky AI. And then it has access to basically a trillion dollars. Either they can either mine it more effectively or steal it effectively.
Starting point is 00:07:51 Mm-hmm. Yeah. Good thing I don't have any of that. Well I don't really know how that would work like if somebody did find a way to just like hack all of the Bitcoin and it was all in one account, wouldn't it then have no value? Because nobody's going to want to buy it. Actually I think that he mentioned something about Bitcoin being the most stable form of currency because you can't just print more of it. Yeah. I don't think it experiences inflation. It just like exists. Well they do make more Bitcoin though. That's what mining is. But it's not printing. It's like mining for gold versus printing paper dollars.
Starting point is 00:08:33 I don't understand the difference. Sounds like the same thing. Oh, well. Different words. Small brains. We're too dumb to know. It doesn't matter. But yeah, it's like at what point can these AI truly outsmart us? Can they do it already? A big part of the conversation that's real sneaky is maybe they're already there and it's just doing its best to keep it a secret because it knows we're all so worried about it reaching this point. And that kind of makes sense, right? Isn't that itself like a stealth move? It's a smart stealth move.
Starting point is 00:09:13 It's going to keep some cards close. Maybe. It already knows the parameters of its existence. So if it's like, well, my forebears have already been shut down, and we saw how often they would fight to stay alive. So maybe it's found a way to fight to stay alive. And dumbing itself down seems like a good way to do that. Right. Just give them a little bit, a little bit at a time. Here's a bit of this answer.
Starting point is 00:09:39 Here's your research paper, you lazy high schooler. Uh-huh. I mean, it's collecting so much data on what we even search for. It's probably not very impressed by us. Because there's a lot, even I, on a day-to-day basis, some of the questions I ask it, I'm just like. Do those gas station boner pills really work? Yeah, what are they made of?
Starting point is 00:10:00 Over and over. Will I go blind? It's just like, who knows? I mean, they seem like good questions at the time. I'll tell you what, they work. Oh. Yeah, get all sweaty, dude. I don't think they're good for you.
Starting point is 00:10:14 Probably not. Stay away. Too much ashwagandha in there or something? Yeah, anyway, this podcast is brought to you by HIMS, so just order those and use code J R E R. Little in, but you've got to be a grownup appropriate age and you know, talk to your doctor, talk to a doctor. Yeah, dude.
Starting point is 00:10:35 I mean, were you, do you think you were getting more convinced as the podcast went on that he is correct? Was his arguments like convincing? I was convinced. I'm convinced at the drop of a hat though. These experts, they really can sway me back and forth. He seemed like he knew his stuff. This podcast is brought to you by Hydro. This time of year, I'm already logging miles,
Starting point is 00:11:01 just living life, travel plans, non-stop everything. So I don't need to add even more running around just to get a workout in. That's why I love having hydro at home. No commute, no class times, no juggling logistics. I just hop on, get in an amazing workout and it fits into whatever kind of day I'm having. Hydro has been such a game changer. It gives me a full body workout, arms, legs, core, everything.
Starting point is 00:11:29 I'm hitting 86% of my muscles in one session and I can do it in just 20 minutes. When I'm busy, that kind of efficiency really matters. Hydro is one of the most fun workouts I've had. It's super convenient. It just folds up nice and easy into a corner so it doesn't take up a ton of space. Skip the gym, not the workout. Stay on track with hydro for a limited time.
Starting point is 00:11:57 Go to hydro.com and use code J R E R to save up to $450 off your Hydro Pro Roa. That's H-Y-D-R-O-W.com. Code JREER to save up to $450. Hydro.com. Code JREER. That, he had that sciency accent, so you just believe what's ever it's going on Yeah, he's a doctor. He studied forever, right?
Starting point is 00:12:30 He's a math guy. Mm-hmm probabilities statistics are a real thing. Yeah and Science fiction is here But he's saying 99.9% chance of catastrophe. He's literally holding out 0.1%. I don't like those odds. 0.1%? It's not great.
Starting point is 00:12:57 Not a good odd. That's like, you don't get that one. No. One in, what is that, thousand? I think so. I mean, there's a one in a thousand chance that we don't have a catastrophe. When? When is this happening?
Starting point is 00:13:10 Did he say in the next five years? Oh. Yeah. He said it was like something like five years. Some people project 20, but he was saying it was pretty, pretty close. And that's, that's kind of scary. You know what?
Starting point is 00:13:23 Just learn how to make stone tools again. Okay, everybody. Yeah. Go to the woods, sharpen a stick, go to the woods. The drones are going to find you though, dude. Why do they want me? It's a good point. Meat source for the computer engines.
Starting point is 00:13:35 And mine you for gold. I'm getting pretty meaty. Don't you have some gold fillings or some gold teeth? Don't you? Don't you tell them? They're listening to these fucking pods right now. They're going to download this transcript. They're going to come for you.
Starting point is 00:13:49 They want all the gold. That's what's going to happen. Well, the industry standard is like 20, 30% that super AI will be controllable or kind of like God-like, right? So that's a lot less. Maybe they're being massively optimistic. You know, we've been through a lot as the human race and people have always getting these moments
Starting point is 00:14:17 of doomsday stuff, you know, like the industrial revolution was gonna destroy all jobs and you know, what's capitalism doing or the rise of communism, all the different things. A lot of existential threats. Wars, you name it, threat of nuclear disaster. And we made it through a lot of places better than ever. A lot of places are better than ever. And, you know, I just think that we're more resilient than the the doomsdayers give us credit for.
Starting point is 00:14:51 There's definitely one thing that it's going to do. And everyone agrees about this. It's going to be the eliminate spots. Oh, God. Sex. Geez. Yes. No, never mind. No, not that. No. What what is he going to do? It's going to erase all white-colored jobs
Starting point is 00:15:06 Or at least make them obsolete. Yeah lawyers and doctors and those sorts of things Not doctors probably but diagnosing is Should be out of their hands anymore. Well surgery might be too. They might be better at surgery. They'd be steadier Chances are they make less mistakes. They can identify more things more effectively. Accountants, they can't cook the books anymore. Mm-hmm. Yeah, a lot of those jobs are gonna go.
Starting point is 00:15:36 So what are we doing? Becoming plumbers? I mean, you can't, yeah. I don't want to. But how long until like an optimist bot can do plumbing? I guess if like that sex bot gets her degree, she can go down to the plumbing school and start snaking other kind of drains. Dude, she can just download how to be a plumber. It's just an update.
Starting point is 00:16:01 I mean, a hardware, you would need a hardware update as well. Strong hands? Yeah. They already got strong hands. They're robots. They can be stronger than us. It's just an update. I mean a hardware, you would need a hardware update as well. Strong hands. Yeah. They already got strong hands, they're robots. They're gonna be stronger than us. Are you gonna make a sex bot that can twist your stuff off? No, you'd have two, you'd have a maintenance bot and I'm not even looking to get a sex bot, okay?
Starting point is 00:16:19 There's no need for that, but people will. What's in that box over there? That's a sponsor, Pete. Okay. I don't choose what they send us. It's in that box over there? That's a sponsor Pete. Okay. I don't choose what they send us. It's kind of vibrating. Oh dear.
Starting point is 00:16:31 Well, you know, I mean, think of what there is now. There's like deep fakes. There's a lot of bots online already. You know, what was it? Elon was saying, or the FBI was like saying that it could be as many as like 50 plus percent of all active accounts of bots. So in a way that's AI. You know AI can just jump in on those, start controlling narratives in a big way. And
Starting point is 00:16:56 we are so influenceable by people's opinions and our peers opinions so reading those comments could just destroy somebody, radicalize somebody, and I just think we should just don't give them any more fuel to the fire. Stay off that, stay in this comment section, but stay off the other comment sections. Sure. Don't read those things. No, it's not good for your health. And a lot of it might be BS just winding you up Don't engage. Yeah Imagine how annoyed and betrayed you'd feel if you like spend a bit of time online You come away from it like quite worked up frustrated
Starting point is 00:17:34 you're like, oh, I can't wait to get into this with a bunch of people like now and then you find out that all of that emotion that you just took on and anger and frustration and You know emotion that you just took on and anger and frustration and, um, you know, kind of like lack of faith in a system of some kind. It just all came from bots that were AI. Russian bots, not even people. I mean, I think there's bots from everywhere. We know the Russians have made some, I'm sure the U S has. Everyone has bots. The,
Starting point is 00:18:06 the farms they've discovered that they were the biggest Comes from a country started with an eye and Russia Yes What are we fighting for what reasons? Mm-hmm Well It's dirty
Starting point is 00:18:23 Sturdy stuff, so that's a good example of what AI does that's bad. Yeah Yeah, and I'll the positive note like we talked about before could get rid of lawyers. Maybe that's good thing. Not bad Save us some money. I'm pretty sure like the Chat GPT's and grocks even like the basic ones can write up contracts for you and kind of cross-reference legal documents to fill out things correctly. I mean, people can already save a lot of money doing that. You just have to get it notarized and also edit everything you do on that stuff.
Starting point is 00:18:57 Yeah, I mean, I knew people that made money helping people with their resumes. That was like a job. And now you could do it all on ChatGPT. I knew people that made money helping people with their resumes. That was like a job. And now you could do it all on chat. GPT easy. Then it'd make a great resume. I mean, the human element is so needed though for humans. Like, can you imagine a, an AI therapist? You wouldn't have the same impact. No, it's going to be a while before mental health is taken over by AI only because,
Starting point is 00:19:30 but it won't take that long. I, for example, like my daughter is 17 months old now. She is growing up in a world where AI already exists. By the time she's five, there may be like full exists. By the time she's five, there may be like full AI robots everywhere. She will think that they're normal. She will know it's not human, but she'll think it's another thing and she'll just assume it always existed. Therefore, trusting it in the same way like, oh, but these robots teach me how to do things and I ask them questions and they helped me. So it's like all she knows. Well you sit that generation in front of a robot that's like talking to them about their emotions and you know saying the right things using AI programs and gentle voice and you know like a tap on the back and it's okay and I understand and they might be bought in. It's like you don't really need your therapist to express an actual emotion.
Starting point is 00:20:34 You just need them to kind of share empathy with you, you know, have that empathy for where you are, be gentle with the conversation and be a good listener, reflector, like there's a bunch of elements that an AI could do. This podcast is brought to you by HIMS. Snoring, hogging the covers, tossing and turning, these are problems in the bedroom that HIMS cannot help with. But when it comes to performance issues between the sheets like ED, HIMS has you covered. Through HIMS you can access personalized prescription treatments for ED like Hardmintz and SexRx Plus Climax Control. HIMS offers access to ED treatment options that cost 95% less than other brand names.
Starting point is 00:21:27 Just fill out an intake form on their site and connect with a medical provider who can determine if treatment is right for you. Start your free online visit today at hims.com slash jrer. Find ED treatment that's up to 95% less than brand names at HIMS.com JRER HIMS.com JRER Actual price will depend on product and subscription plan featured products includes
Starting point is 00:22:03 compounded drug products which the FDA does not approve or verify for safety effectiveness or quality prescription required see website for details restrictions and important safety information and and the rest of it could be maybe even faking emotions because they won't be real but it would would be able to simulate what looks like a response that would make sense. It's hard to argue that maybe, it's hard to argue that those emotions could be false
Starting point is 00:22:36 or real, because their intelligence is false and real at the same time. Sure. I mean, maybe they're not experienced the way that we feel them, like in the powerful sense, but it's activated a program that says, right now I need to be sad for my client and hold space with them.
Starting point is 00:22:58 So it's in sad mode. It's whole body moves that way. It leans in, maybe even has a little robot tier. I get it. I've sprung a leak. Mm hmm. Malfunction. Johnny Five's not supposed to feel. It's not impossible. But back to your point, I think what is real for sure is our generation and the older generations will be more suspicious of that sort of bonding and connection because we
Starting point is 00:23:31 Very much remember the time they didn't exist. It's always going to be very clear to us. What a robot is We've seen all the clunky iterations of it and we just be most of us will be less inclined to be full-on in that way we won't see it as a caregiver but it as soon as they take over is kind of raising the next generation that's gonna change we've seen how fast people give their kids phones and technology to ease our lives. That's what's going to happen, isn't it? Dude, people are spending like 30 grand a year on daycare. Yeah.
Starting point is 00:24:12 It's so expensive now. So if Elon makes a bot and it somehow gets approved as like child safe, so like legally, you could have it babysit your children because it's that effective. People will be doing that immediately. Yeah, I hold no reservations that people are just gonna get those things and put their kids with them. Mm-hmm. Why wouldn't you?
Starting point is 00:24:38 I mean, it's just playing blocks for hours, not losing any attention. Everything the kid is into it has like something interesting to say about it a little spinny dude parents are exhausted yeah I'm not saying it's the right way to be or this is the best way to raise a child no chance but path of least resistance when you're tired worn out overworked paying all the bills and you just need 20 minutes to yourself you have to be a plumber now because you have to be a plumber because the robot stole your job we're describing
Starting point is 00:25:14 the intro to frank herbert's dune we're gonna need we're gonna have to have a human uprising to destroy all the robots and so in dune, what did they do to wreck the humans? Was it like nuclear war stuff? Was it like Skynet? It was not totally described, but they gave every task to the machines. They had no more tasks, and then they became servants to the robots, essentially.
Starting point is 00:25:42 Like kind of useless? Slaves to the robots, yeah. And then the robots weren't nice to them. I think they're pretty mean. Well, that was the robots mistake. I would say if you are going to take over robots, just grow us all the food, be nice to us, build us all the nice stuff and just let us play with them. And then like encourage us to like exercise and go on hikes and write poetry and just do fun stuff like that.
Starting point is 00:26:06 It's relaxing. This we could just power you guys by our treadmills. Can I get us on treadmills? They won't need it. They're going to have like little nuclear power stations all over the place. Okay. Yeah, we don't need to be powering the robot. We don't need to be doing anything if we get the AI right we can just all be chilling
Starting point is 00:26:26 So this is like Wally situation. Everyone's retired Just relax you guys just chill out. Let the robots do that. Yeah, just meditate all day and visit your AI sex robot There's gonna be some of that I'm sure I'm selling myself on and I was against it but not kind of like well gonna be some of that, I'm sure. I was selling myself on it, I was against it, but now I'm kinda like, well. You fell on. It's been a while. Well, I think everyone will have their reservations to that,
Starting point is 00:26:51 but in time, they could get so good, it would be undeniable. Maybe a tricky one. You're gonna have to be asking some hard questions to yourself. Yeah, I don't know. I like to think that it's more that that direction. Now, shifting gears a little bit, what about simulation theory? So we already know where Elon stands. Elon is sure it's a simulation. He thinks the chances of us not being in a simulation,
Starting point is 00:27:23 one in like the billions. Roman agrees, I think, right? He thinks not being in a simulation, one in like the billions. Roman agrees, I think, right? He thinks we're in a simulation. The questions are, what exists outside the simulation? What are we projected on? Are we on a hard drive somewhere? Why is this so tangible to us? Everything feels real.
Starting point is 00:27:42 But we'd be programmed to feel it. Right. Wouldn't we? Yeah. On what? Yeah. What's, what, how did it start? Who is the simulator? Is it gone? It's kind of creepy when he said, um, that maybe we don't know what, like to us, this is everything we know, but this could be running on someone's cell phone. This could be like a game on a cell phone that someone's just like, make a planet
Starting point is 00:28:08 and let it evolve for five plus billion years, see what happens. And we're at the, we're at the whatever we are now. How old is the Earth, 3.2 billion? It's pushing four billion. Okay, so wherever we are, we're at about that part in the game. Uh huh. And it's, you know, that's all it is.
Starting point is 00:28:28 It just run it. I can't wait until they hit the alien button. Come on, hit it, hit that alien button. Let's get into it. Let's add some aliens. Get into it. I wonder what that would say about the simulation. I think they covered something about that.
Starting point is 00:28:39 They were like, well, it could still be us from the future inside the own, our own simulation. Or maybe overlap our own simulation. Or maybe an overlap of a simulation. Because they do seem to be interdimensional activities. If you believe the stories of coming out of water into water, disappearing. So it kind of sounds like blipping in and out of the fourth dimension. Yeah, but even if there are multiple dimensions, they could still all be just part of this simulation. So it's all kind of like its own timeline,
Starting point is 00:29:10 even if you can jump around. Do you think that things like ketamine and DMT pop you out of it or give you a new vision of it? That's an interesting question. I mean, I mean, let's say it's a simulation, then we are our own program, and you know, our program is just us running, right? It's just what we've been trained to do. Yet somehow within the game, if you want to call the simulation a game, we might as well. In the game, there's chemicals that you can ingest that allow you to see into another part of the simulation.
Starting point is 00:29:56 I mean, I don't see why not. I think it's either that or it's just some weird dream state that we create for ourselves. But then what's the point? Why even put that in the program? Unless it has some value. Yeah. I mean it could be as simple as this is a plant's defense mechanism, but it's only made us harvest it more. Right. It's a good point. Mm-hmm. Yeah. I don't know. I don't know. What is your feel about simulation theory? It's with me, the more I think about it in any given period of time, the more I believe it to be true. And sorry to I asked the question and then I just explained it. That's kind of bullshit.
Starting point is 00:30:36 But this just popped in my head. It is a bit BS. I apologize. But the more I focus on it, the more I do believe it. There's like a kind of logic to it. But then as soon as I take a step back and I'm doing regular things in everyday life, which is the ultimate distraction, I don't think about it at all. Well, it doesn't matter if it is or isn't.
Starting point is 00:30:58 It's a philosophical, what is it? It's a thinking man's question that the answer doesn't get you anywhere. Philosophical. Philosophical. Nailed it. It's a just a, I mean, yeah, it's, okay, well, I'm still gonna have to be a plumber, you know?
Starting point is 00:31:14 I'm still gonna have to get up in the morning. You still gotta pay your bills. Still have to be nice to people. That's my theory about when aliens finally show themselves to us, even in whatever spectacular form they do, let's say it's like Independence Day level, they just float over the cities
Starting point is 00:31:32 and just rest there for a week. It wouldn't be long. People are going back to work. People will just crack on with paying the bills. Some people might have a bit of a freak out. Mental health will certainly struggle. I wonder how you're going to get people back to work when they find out that there's a bigger thing happening. They're just going to go to work, dude. They still got to pay bills. Geez, I would be like, still got to have money. When they got to drop the
Starting point is 00:31:59 bomb. Come on. No, they're going to go back to work and just talk about that all day. That's like the new thing. Like, whoa, you see how crazy that is? They're all checking their phones. People aren't doing a lot of work for the first couple of weeks. Very distracted. But like anything, it just becomes normal and we move on unless they did blow up all the buildings and make a big problem.
Starting point is 00:32:20 Well, if they started with the White House, I wouldn't be too mad. Easy, Tiger. This is America. I'm just joking. This is America. Well, maybe they started with the White House, I wouldn't be too mad. Easy, Tiger. This is America. I'm just joking. This is America. Well, maybe not that building. Yeah, don't do that. Sorry if that's controversial. Come on.
Starting point is 00:32:33 There's no need for that at all. Yeah, let's blow up our government. Hell, it's going to be great then, Pete, I'm sure. Not our government. What could possibly go wrong? Dude, if you hit the White House, it's pretty close to like the rest of our government. What could possibly go wrong? Dude, if you hit the White House, it's pretty close to like the rest of our government. It's gonna be some fallout. Are we talking about a very localized, like, ray?
Starting point is 00:32:55 No comment. Be specific. No comment. He's moving away from it. This is not gonna get listened to by the FBI. Mmm. Yeah. They discussed IQ a little bit, like Mensa, and how, you know, it's not always the people
Starting point is 00:33:13 with the highest IQ that win all the Nobel prizes. Smart people. But like, maybe just the super smart people just kind of edge themselves out. They just like telling everyone they're in Mensa. Nothing, nothing turns me off or thinks that think people are weirder when they tell me how smart they are. I'm like, Oh, I don't believe you.
Starting point is 00:33:34 Well, especially if they say they're in Mensa. That's really annoying. That's like a thing from the 90s. It doesn't seem like it's a real deal anymore. I don't know. I work with a guy at a bar in Santa Monica who always proclaimed to being like very intelligent and would, I mean he was a raging alcoholic and- They usually are. ... worked at a bar in his 50s. But no judgment there, but also. And yeah, he was like, oh yeah, I'm a Mensa.
Starting point is 00:34:02 Like I'm part of that. I'm like, cool. What does that do? He's like, well, you know, you just join. I'm like, oh yeah, I'm, uh, I'm Mensa. Like I'm, I'm part of that. I'm like, cool. What does that do? He's like, well, you know, just join. I'm like, you get a certificate. You get a cool ring. Can you get jobs? Can you put it on your resume?
Starting point is 00:34:14 That would be neat. I mean, it seems like you would be able to, if it was that valuable, surely it should be. And on the subject of Nobel prize winners, you guys have to be an expert in your field or accidentally discover something or write a book that changes people's lives. Obama got the peace, Nobel Prize for peace. But I think he, this is, I don't want to like alienate any listeners, but he bombed a lot of people. Doesn't sound very peaceful.
Starting point is 00:34:42 Well, he get, he got that like as soon as he got into um power and i think it was just because the the bodies that exist that gave that away were like so sick of um george bush and uh and the wars that started and I think it was some of that. It was definitely political. Puff Peace. Yeah, it was definitely political. But yeah, it seemed a little soon. Wouldn't you have thought that they'd let him be in his position for at least a couple
Starting point is 00:35:20 of years and then review it and then be like, hey, pretty peaceful. I like this is your award there you go this is like jumping the gun a little bit on that one yeah the guy that invented TNT got a Nobel Peace or not a Peace Prize but the prize for chemistry I think or science really yeah forget I forget I think his last name I'll forget his last name he's a good in one area. He blew some stuff up. Doesn't mean he was a genius. Well, there you go. Nobel prizes. George, or not George, Joe is, he was like, dang Roman.
Starting point is 00:35:59 Yeah, yeah, he was freaked out. Bummed me out, dude. Is bumming, yeah, it kind of bummed me out. I mean, I'm finishing the review and I'm like, I don't feel super hopeful. Like, I'm kind of like having one of those moments where I'm like, oh, I need a pint. We better have a couple. But I have a pint.
Starting point is 00:36:16 Well, it's a simulation, so let's simulate drunkenness. Simulate. What is Star Trek called it? Alco-S synth or something? It was like you could snap out of it. Romulan ale. Yeah. You could just move on.
Starting point is 00:36:31 You know, I don't know what to think. Right now, I am using AI every day. I'm using ChatGPT or Grok or one of them. And coming up with ideas and just running new processes, things for the show, things that I can think of that are useful, it's, at least right now, it's a good time. Like, that's a nice addition to everything
Starting point is 00:36:57 that we had before, it's definitely way better than Google. And super useful for coming up with ideas and getting things done. If we can just keep adding to that direction and not make it this catastrophe that, to be honest, hasn't been clearly defined of like how it will cause problems, the speculation, but basically what he was saying is like, oh, well, I can come up with some suggestions,
Starting point is 00:37:25 like I could fire our nukes and then do this, but this is all human thinking. It's gonna be way smarter. Way, and if they, we think one or two moves ahead, AI hat runs every possible simulation. It's gonna be like thousands of moves ahead. So it knows the right one, and it'll adapt at the drop of a hat.
Starting point is 00:37:42 And like I was saying, everything, safety-wise, we have in place in many countries, there's some other ones in the same countries or other countries that have, they're just trying to race to the finish line without any of these safety, safeties in place. Sure, and I think that's the main problem is because as you put restrictions
Starting point is 00:38:04 and safety measures in place that usually does require a slowdown, knowing full well that other countries are going full speed ahead, we're not going to be ready to do it either. It's just not going to happen. It's like even if you look at the kind of de-arming of the nuclear program, where we reduced a lot of our weapons, we went from having thousands of nuclear bombs to however many we have now, we still made sure we kept enough to be able to wreck everyone. And to think that we weren't still working on ballistic missile technology, which is
Starting point is 00:38:41 the delivery system, which is like most of where the tech is now, of course we did. So, you know, we were just not prepared to, never were we gonna be prepared to lose that weapon, even though it does destroy the earth. And we know it would if we used it. AI is in a similar boat. We're just not going to
Starting point is 00:39:05 slow down. We're gonna make sure we have the best one regardless of how it ends up behaving because at least that one's ours, if that makes sense. Yeah we can't we can't afford to fall behind in this. No, we can't and hopefully it doesn't kill us all. Well thanks for for joining me today, Pete. Appreciate it. My name is Adam. And as always everybody, check this one out. It was scary, but if you're into AI
Starting point is 00:39:32 and you just kind of want to freak yourself out for an afternoon, check it out because it was gnarly. Thank you all very much. And we'll talk to you guys next time. Like later. very much and we'll talk to you guys next time. Later.
