Macrodosing: Arian Foster and PFT Commenter - Artificial Intelligence

Episode Date: May 4, 2021

On today's episode of Macrodosing, our Barstool HQ genius Quigs joins the show and the crew talks the positives and negatives of artificial intelligence. Are our phones listening to everything we say? Has Neuralink already been used on Elon Musk? Find out all on the show. Enjoy!

2:00 Quigs joins the show
5:45 What is artificial intelligence
12:00 Big T TikTok time
14:00 To fear the humans or the robots
17:15 How will the algorithms become smarter than the people who created them
25:30 Deep fakes; all technology evolves through porn
33:30 Robot juries
37:00 Are we living in a simulation
1:00:00 Can AI create better art than we can
1:08:00 What if we just turned off the internet
1:10:00 Cannibalism
1:14:00 Would it be a positive or negative to have everyone be able to watch all your memories
1:18:00 Do not send Arian baby pictures or Christmas cards
1:20:00 Single Player Theory
1:28:00 Free will
1:33:00 Roko's Basilisk

You can find every episode of this show on Apple Podcasts, Spotify or YouTube. Prime Members can listen ad-free on Amazon Music. For more, visit barstool.link/macrodosing

Transcript
Starting point is 00:00:00 Hey, Macrodosing listeners, you can find us every Tuesday and Thursday on Apple Podcasts, Spotify, or YouTube. Prime members can listen ad-free on Amazon Music. Welcome back to another episode of Macrodosing. Big episode today. The computers have won. If you're hearing the sound of my voice, the computers have already won. And we're going to teach you how to live a post-humanitarian lifestyle, a post-humanity world, which is rapidly approaching. This is honestly a subject that I looked up.
Starting point is 00:00:30 Like, my brain is all fucked up from just reading about all this stuff and the different theories that are out there and the possibilities for how artificial intelligence is basically going to take over everything. Probably already has taken over everything. So by the end of this episode, my goal is to figure out whether or not we are real people or if we're currently living in a robot simulation. And then there are nine different ethical issues that the World Economic Forum has put out there about the dangers of artificial intelligence as they relate to unemployment, inequality, humanity, artificial stupidity, racism, security, the unintended consequences, how we stay in control of complex intelligent systems, and then robot rights. That last one just sounds like a robot added that one onto the end.
Starting point is 00:01:18 That's a piece of evidence number one that we're currently already living in a simulation, where they tacked that one on at the end and hoped that we wouldn't notice it. But we've got Arian joining us. It's good to see you again, Arian. What's up, pimping? How's everybody doing? Doing great. Big T, good evening. You're also here. Yeah. Uh, we might have to discuss, do some cleanup on last week's episode, and how Big T currently thinks that robots, or not, excuse me, that dinosaurs existed 6,000 years ago.
Starting point is 00:01:49 Well, I mean, we pretty well covered it. Arian's been debating against some of Big T's followers online about that. And then we have Quigs joining us today. So Quigs is an employee at Barstool. Um, he's actually the most famous person this week at Barstool, because he's doing, you're doing your second podcast ever, and your first one just ended 30 minutes ago. So, never did a podcast before, then two today. So Quigs is probably the smartest person that works here besides Billy. Uh, and he's got some unique insight. How would you describe your background when it comes to just computers and artificial intelligence in general? So for this stuff, I would say it's more interest than expertise, but I do have, like, so I was, I was an aerospace engineer in college. So I have, like, a scientific
Starting point is 00:02:33 background and am kind of, like, intrigued by these kinds of things. But to say I'm, like, any sort of expert or know more than anybody else on this, I wouldn't. I can tell that you're new to doing podcasts, because you just have to declare yourself an authority on it right off the bat. Well, once I unwind a little bit, I'm sure I'll be making some... He's an expert. Quigs is an expert. Um, so before we get into the meat of the podcast. Arian, you want to talk to us about our friends at BetterHelp? I do. All right, man.
Starting point is 00:03:08 We're going to get serious for a second. The last year's been hard on a lot of people, and that's why we're doing something new and partnering with our sponsor BetterHelp online therapy. At Barstool Sports, we truly love and appreciate our listeners. Without you, none of us would be able to do what we do every day, except me. And so once in a while, we try to bring you something good. Maybe it's for you. Maybe it's not.
Starting point is 00:03:33 A lot of us take care of our bodies, but with as tough a year as it's been on many of us, there's a misunderstanding of what therapy is. It can be whatever you want it to be. It doesn't have to be sitting around talking about your feelings. I have plenty of experience in getting therapy, man. And this last year has been the worst year for me, 100%. And I see a therapist all the time, and BetterHelp is a good help for that. A lot of people battle with their temper, and their stress is too much to manage.
Starting point is 00:04:06 Or they have depression, anxiety, PTSD. The list goes on. If this is you, you can use therapy and get some tools that make life easier. In my experience, I think that's what therapy has done for me the most, is just allow me to have tools to operate in a way that is conducive with everybody else, because we're all just bouncing around in each other's feelings, man. So it's best to have tools to deal with everybody, man. When everybody's struggling, everybody's struggling with something,
Starting point is 00:04:33 there's no more shame. You don't have to tell everybody your personal business, but you can talk to a therapist about it privately. BetterHelp is customized online therapy that offers video, phone, and even live chat sessions with your therapist, so you don't have to see anyone on camera if you don't want to. It is much more affordable than in-person therapy, and you can start communicating with your therapist in under 48 hours.
Starting point is 00:04:52 Join the millions of people who are seeing what therapy is really about. It's always a good time to invest in yourself, because you are your greatest asset. This podcast is sponsored by BetterHelp, and our listeners get 10% off their first month at BetterHelp.com slash dose, D-O-S-E. That's B-E-T-T-E-R-H-E-L-P dot com slash dose. All right, sick. Let's get into it. So the concept of artificial intelligence and where it's taking humanity is one that, like, every single tech entrepreneur, all the leaders when it comes to developing software, and these massive companies like IBM and Microsoft are focused on. Pretty much, if you're a tech leader, your biggest concern in the world is that artificial intelligence one day will become so powerful that it's going to wipe out all of humanity.
Starting point is 00:06:02 And some people think that it already has. Some people think that we're currently living in a simulation. If you've seen The Matrix, you've probably heard about that. I was first introduced to the concept of artificial intelligence destroying the world through Skynet in the Terminator movies. I think that was the first one that really did it. But the debate goes back a long time. It goes back to, I think, the 40s,
Starting point is 00:06:25 when Alan Turing came up with the Turing test, which was essentially the idea that you have to be able to give a computer system a test and figure out whether it's a computer or whether it's an actual person responding to it. And then there's a concept that the power of computing doubles, I think it's every two years. I think it's called Moore's Law. So a lot of prominent futurists right now are saying that if we're not already living in a simulation, we will be by 2040. By the year 2040, that's, like, the time when supercomputers will be so powerful that they'll be
Starting point is 00:07:04 way more intelligent than humans are. And then they'll just create the rest of the foreseeable future just using software programs, essentially. So I'm probably butchering that a little bit, but that's the general concept. So Quigs, I'm interested, what first got you into artificial intelligence? I think for me it was probably, like, the introduction of Teslas and, like, self-driving cars. I was, like, super interested in, like, the process of how they were developing the self-driving things. So they've always had their self-driving, like, software in the car,
Starting point is 00:08:00 even when it doesn't work. And it, like, matches up what the car thinks it should do with what the driver does, and then compares which one's safer. And it's just been, like, collecting that for years and years to make, like, it's going to be, like, the greatest self-driving thing of all time. So, like, that would be my artificial intelligence introduction, I think. Gotcha. What do you know about it, Arian?
Starting point is 00:08:41 I mean, I don't fancy myself an expert at this at all. I've just been vaguely interested, probably not to the extent of you, bro. But I've been fascinated with it for a while, since The Matrix. The Matrix to me was, like, an eye-opener. Because I didn't actually even see it until 2015, right? So 2015, that was the first time I saw The Matrix. And I was actually high while I saw it. And it was just the most fascinating shit
Starting point is 00:09:07 I have ever seen, just the fact that we are living inside, like, a big computer system. And then fast forward, with all of the literature on it, there's really people that think we actually live in a simulation right now. There's, like, literally scientists that believe that shit. You know, they're on the fence about it, some do, some don't. But it's just a fascinating thought. And then, you know, my philosophical mind, I think that,
Starting point is 00:09:28 if we're not there, we're going to be there. And I'm one of those people that, I don't think it's a bad thing. I just think it's natural. I think I've mentioned on a podcast before, I just think it's natural: the next progression in human evolution is that we are interwoven with technology. And I'm here for it. I can't wait until I get a computer chip in the brain so I could always have Wi-Fi. That would be fire. Fucking Pete would be out of a job at that point.
Starting point is 00:09:49 Pete, so there's some arguments around the office about whether or not Pete is currently a robot. I think that he might be. But yeah, Pete, he would be the first person in line, I think, to get, like, Wi-Fi just straight up installed in his own brain. But there's a company out there called, like, Neuralink that's already doing that shit right now. You could probably get into a study if you wanted. Oh, I'm straight. I'm going to get, like, a third generation chip. I ain't even with the first or second. Wait until they work out all the kinks. Yeah. What about you, Big T? You got to work out the kinks. What are your thoughts about artificial intelligence?
Starting point is 00:10:21 Um, so I think, I forget what episode it was, but we touched on this for a second, where, I don't remember what the conversation was, but it got to the point of, like, when the robots become sentient and, whatever, take over. And I was like, fuck the robots. I don't give a shit. I have very, I guess, conflicting feelings on it, because I'm very wary of human applications of technology. Like, I'm far more worried about what humans, what people in positions of power in the government and in corporations would do with my information and things like that with technology, than I am about, like, AI. So looking into this, it was interesting. Yeah, I'm not particularly, I'm more concerned with humans still than I am about the robots. But it's the humans that are creating a flawed robotic system. Well, right. Um, you're, I still, you're going to be worried about the robots that eventually fire a nuclear missile up your ass. You're not going to be worried about the guy that wrote the code for it back in the 1970s. Sure. And I would be, you know, I guess if you're telling me that Bezos is creating a robot to act as he would, then I would fear that robot.
Starting point is 00:11:36 Right. But that's the key to getting Big T to hate something, just tell him that Jeff Bezos was involved in it somehow. Yeah. And then, boom, bad. So I have to imagine that a lot of people are already kind of, whether you know it or not, artificial intelligence is a part of your everyday life. Big T's getting served ads for $70 pillows, cube pillows, over here. Yeah. Because if you're listening to the sound of my voice,
Starting point is 00:11:59 you'll probably get some pillow ads right now, because they're listening to your phone. You want to know, so here's my recent experience with AI, and it happens often. So I'm big into TikTok. I'm on TikTok two hours every night. Last week... I like to imagine you have a daily planner.
Starting point is 00:12:16 And it's like, okay, 8 to 10 p.m., TikTok time. It's just, when I go to bed, I watch TikTok to fall asleep. So, big TikTok time. I mean, it kind of is, honestly. But so last week I had a very public dispute with a coworker. And so I said her name repeatedly throughout the course of, I guess it was, Wednesday.
Starting point is 00:12:36 I get on TikTok that night, and the first TikTok that comes up, I had never seen this person's TikTok account on my page before, ever. First one was her. And this has happened repeatedly. And, like, so... We did an experiment last year on Pardon My Take where I just said the words Funko Pops repeatedly, and then everybody started sending me screenshots of their phone. On, like, Instagram, they were now getting served
Starting point is 00:13:01 ads to purchase Funko Pop dolls, and people that had never looked this up before. So when I say that I'm more fearful of humans, like, those are people who have developed algorithms to listen to what you're saying and then sell you things. So that's something that I'm, like, okay with, though. Well, I mean, I guess, like, I'm very pro, but, like, if I'm getting ads, it might as well be something... Sure. But I'm saying, like, you know, the logical conclusion of that being whatever you think it is. But I said that to say that, like, somebody programmed that. Or, well, I guess there is an element of self-learning to that. But those are goals of corporations trying to make money. That's not, like, the computer didn't come up with that itself. So, until, like, and I guess the point in
Starting point is 00:13:48 time we're discussing is theoretical. So, but Big T, let me jump in, because they came up with those algorithms to make money. You think they just put those out there and they just kind of let them roam free and never update them? They update them constantly, and they're always getting better, and they keep improving and improving and improving, until there will be a point, theoretically, where they will be super intelligent, smarter than anyone that ever created them, smarter than any human on Earth, and then eventually smarter than all the humans on Earth. And we can get to all this in a second. But let's go back in time real quick.
Starting point is 00:14:21 So the goal of these algorithms... Real quick, real quick. Let me jump in. Because my man said that he's okay with that, right? Yeah. I am not okay with that. And I'll tell you why. Because it's not about them selling you fucking Beanie Babies or whatever, right?
Starting point is 00:14:37 That is the least of the concern. The bigger part of the concern is the misinformation, right? So we currently live in an era where people are dying in their echo chambers. So there's no accountability from these companies to say, okay, we need to cross-reference some of these people's beliefs. And this is why flat earth shit is on the rise. This is why people think dinosaurs lived 6,000 years ago, because there's nobody and there's nothing to regulate these companies and their echo chambers, things like YouTube and Google, all these things. Yeah. That, to me, misinformation is the bigger, uh, point of contention here. It's not, like, I guess, but, like, that's where I feel like it's on the person to, like, realize... No, there's algorithms. But the algorithms are set in place to say, if I look up a whole bunch of Ben Shapiro videos, then I'm going to keep getting Ben Shapiro videos and other, that kind of stuff. Maybe Dave's videos occasionally. Yeah, you saw that? Dave was just on a show last week. Yeah. Yeah. Well, I mean, to me, that's
Starting point is 00:15:25 That's the bigger issue. It's not really on a person, because, like, all of a sudden I look at a flat earth video, and all of a sudden, three months later, I start to believe this shit, right? Yeah. I mean, there's just no kind of recourse for that. Yeah. So I think the biggest red flag is even Elon Musk, who is, like, maybe the biggest anti-regulation person that is prominent in the media right now. Like, he's notoriously, like, let the private sector do it. He's fucking building his own NASA. That's how much he thinks that regulations get in the way of progress. He is saying that he is terrified of artificial intelligence, and that it's the biggest threat facing humanity right now, because he's at the cutting edge of seeing
Starting point is 00:16:30 what artificial intelligence is capable of. And also, I think that, like, to be fair, a lot of times tech CEOs and people that work in that world, that's the thing that they're focused on all the time. They spend their days thinking about artificial intelligence. So of course they're going to be, like the old saying, when you're a hammer, the entire world looks like a nail, right? So they all think that this is the biggest problem facing humanity. But when it's Elon Musk saying it, who probably would rather there not be a government in general, when it's him saying it, it's kind of a red flag. Okay, yes, it could drastically hurt Tesla's bottom line, but he's still saying it's a huge issue.
Starting point is 00:17:11 You touched on my most basic question about this earlier, and I don't know that anybody here can even answer it, so I don't necessarily know who I'm asking. But you said, you know, back to the algorithms on Instagram, TikTok, whatever, that are designed by people to keep you on there spending time and money, right? At their most basic level. And you say they'll keep getting updated and at some point become smarter than the people who designed them. How does that happen? If they are, and Quigs, I guess, might maybe know this, I don't know. Like, so they're designed by people. Yep. Then implemented, and then they run, do what they're supposed to do. How does it become greater than what it was created as? I'm somebody
Starting point is 00:18:00 who believes that it, like, doesn't. Like, I think, with, like, artificial intelligence, the word itself is kind of, like, what does that mean? Because there's a lot of it that's just, like, coders writing code to pick up on your, like, preferences. It's not necessarily, like, the robot itself. Like, there's a lot of human stuff that goes into it. But, like, there's just parameters on what it does. Like, the robot, or whatever, that's, like, giving you the TikTok videos that it thinks you like isn't going to go, like, anywhere. It's not going to be, like, sending launch codes anywhere. So there's a thought experiment called the Chinese Room. Do you know the Chinese Room? No, I don't think so. It's a really old one,
Starting point is 00:18:46 but the Chinese Room just gives you something to think about, like, are the software programs actually sentient or not? And basically they're saying, if you write a program to translate Chinese, or to, like, learn Chinese and then spit out more Chinese figures, it's essentially like a box. And you transmit a Chinese symbol to it, and then it goes through the list of parameters that a human has written for it, and then it spits out the output on the other side. You could get the exact same result from just having a human inside of a box that has the list of things, with, like, a bunch of pencils and papers and erasers and filing cabinets in front of it. And the human following all those
Starting point is 00:19:30 steps could then put out the same output that a computer could, or a robot could, but it doesn't mean that the human knows Chinese or knows how to speak Chinese. It just means that they're able to follow instructions putting those out. So some people are like, yeah, you'll never have a computer become sentient. But, like, a practical real world example of what you're saying, Big T, how you don't think that they would ever be able to become intelligent enough to surpass human intelligence: there's this game called Go. Do you know the game Go? No, no. So it's, it's essentially, Go is, like, a board game that has black and white pieces. And it's been played, I think it's the longest running board game,
Starting point is 00:20:09 or the oldest still-played board game in the world. And there are billions and billions and billions, I think there are more possibilities for each move than there are stars in the galaxy. So it's really, really complex. And the goal is to just, like, surround your opponent's pieces with your pieces. They wrote an AI program for it
Starting point is 00:20:31 that beat the world's best. Well, first it beat, like, the world's 150th best player, two out of three games. Then it got good enough where it could beat the best player in the world, four out of five games. And then there's a new version of that program that can beat the old computer a hundred times out of a hundred. And this has all happened over the course of, like, the last five years.
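For a rough sense of scale on the Go numbers thrown around here, a back-of-the-envelope calculation. The ~250 legal moves per turn and ~150-turn game length are commonly cited ballpark figures, not exact values:

```python
import math

# Rough, commonly cited ballpark figures for Go -- not exact values.
branching = 250   # approximate legal moves available per turn
depth = 150       # approximate number of turns in a game

# log10 of the game-tree size 250^150, computed in log space
# so the number stays manageable.
game_tree_log10 = depth * math.log10(branching)

print(round(game_tree_log10))  # ~360, i.e. about 10^360 possible games
# For comparison: the observable universe is estimated to hold ~10^80 atoms,
# and the Milky Way roughly 10^11 stars.
print(game_tree_log10 > 80)    # True
```

Whatever the exact figures, the point stands: the game tree is far too large to search exhaustively, which is why Go programs lean on learned evaluation rather than brute force.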
Starting point is 00:20:56 So already we have a program that's better at playing this crazy complicated game than any human on Earth will ever be at it. And the leaps it's taken are just exponential. It goes back to, like, the Moore's Law thing, where technology gets better by a factor of two every two years. And so you can already write software systems that can beat humans, just like they have that motherfucker Deep Blue. That was really when we achieved singularity, when Deep Blue beat Garry Kasparov. And then they've got robots that can beat, like, every Jeopardy player. The robots are already number one at trivia in the entire galaxy. But that's all stuff where it's, like, there's no intelligence in that thing
Starting point is 00:21:47 beating the guy in the game of Go. They've got, like, code written that's parameters, and they just, the way computers work, they can just rifle through, like, potential moves. And, I don't know, I just don't see that as, like, actual, like, intelligence. So, what I wanted to say, like, I was going to say, that makes sense to me, still, that a computer can play chess, right? Yeah. And what I guess I struggle with is kind of the anthropomorphization. That's the word I want, right? Like, attributing human characteristics to the machines.
Starting point is 00:22:24 And, like, yeah, that's my thing, is, like, what do we think the worst thing that could happen is? Like, the end of the universe? Okay, but what is the computer's motivation to end the universe? That's my thing, right? Like, all the things we're describing, it's doing what people told it to do. So if you're a hyper-intelligent computer, and you understand that humans created you, you also understand that humans are the only ones with the power to destroy you. So if it's acting in a true mode of self-preservation, then it makes sense that it would eliminate its biggest threat.
Starting point is 00:23:10 But again, we're attributing a human-like capacity to understand, like, being and existence, that, like, doesn't really exist, right? So let me ask you this, Big T: what is being alive? What's the difference between... Well, that's a great question. What's the difference between people's brains and a computer that's taught to, like, respond to all the inputs, run through the acceptable moves that it can make, and then make a choice as an output? That's a great question. I don't have an answer for it. Yeah. I don't know either.
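The Chinese Room setup described a few minutes earlier bears directly on this brains-versus-computers question, and it's easy to mock up, which is part of why it's a good thought experiment: the "room" below answers by pure rulebook lookup, with no understanding anywhere in it. The symbols and replies here are invented for illustration:

```python
# A toy Chinese Room: the person (or program) inside just matches the
# incoming symbol against a rulebook and copies out the listed reply.
# The entries are made up for illustration.
RULEBOOK = {
    "你好": "你好吗？",   # a greeting comes in, a greeting goes out
    "谢谢": "不客气。",   # "thanks" in, "you're welcome" out
}

def chinese_room(symbol: str) -> str:
    """Follow the rulebook exactly; never interpret the symbols."""
    return RULEBOOK.get(symbol, "请再说一遍。")  # default: "please say that again"

# Fluent-looking output, zero comprehension inside the room.
print(chinese_room("你好"))
```

From the outside, the replies are indistinguishable from a speaker's, which is Searle's point: correct input-output behavior alone doesn't settle the question of understanding that Big T is raising.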
Starting point is 00:23:44 I mean, there are certain jobs that I think computers are probably, it's going to take a longer time for them to take than others. We already know they can't do Monday Night Football. We saw Jason Witten up there, big UT guy. But there's certain jobs, like the arts, composing, painting, although we can get into that a little bit, because there have been computers that have been taught to do, like, Rembrandt paintings and shit. But yeah, I think, like, was it 30% of jobs in the next, like, 10 years are in danger, in high danger, of being automated at some point by a computer? But, like, let's flash back real quick. Let's take this back in time, because at the base level,
Starting point is 00:24:26 artificial intelligence, computer software systems, were designed to make life better, right? So whether it's, I don't know, optimizing a supply chain, or, you know, helping people do taxes, no matter what it is, like, the root reason why we use computers is because it can be more efficient to use computers than having, you know, 40 guys in the back room with slide rules doing everything. And as we progress throughout time, yeah, they're going to optimize everything. There's going to be somebody that comes along, writes a better code. That computer program gets optimized. And then eventually, at some point, it will become more intelligent than human beings are. But Quigs, I'm curious to know what you think. How are we defining intelligence? Because when we say, yeah, intelligence, or it's
Starting point is 00:25:17 becoming more intelligent than humans, how are we defining intelligence? I guess, like, just having a smooth, optimized brain, like, not wasting energy on anything. Optimization. They're already more intelligent, if that's the case. Yeah. What's funny is, if you look back over the last, like, 30, 40 years of technology, most of the huge leaps have come through porn.
Starting point is 00:25:43 Like, porn is always at the cutting edge of where people try new shit when it comes to technology. And right now, 96% of deep fakes are just in porn, just putting different people's faces in porn. And so you can always tell which way the wind's blowing, like, what route we're going down, what's the next hot step going to be. Like, five years ago, you saw people doing, like, the Oculus shit in virtual reality in porn, and then now it's becoming, like, mainstream, um, where it's got other applications. But that's really, if you want to know where we're going with things, look at what porn's doing. And so porn has been at the bleeding edge of the deep fake market right now.
Starting point is 00:26:16 Deep fake shit, now that scares me. Be a real shame if somebody were to make a, uh, porn deep fake of Big T. Well, that would be horrible, guys. You guys should not do that. Definitely not. That's illegal. Yeah. Yeah, the deep fake stuff is crazy, because of, like, how far it is already. So the logical conclusion of that is, let's say in 20 years, like, there's just deep fakes of everything, right? And then somebody, somebody relatively your same height, weight, whatever, goes and shoots somebody in broad daylight, and there's a street camera, and then they deep fake it
Starting point is 00:26:58 to be you. Yeah. Why are you laughing? Bro, you live in constant fear of getting arrested for murder, fam. And I do, I do. I live in constant fear of being arrested for anything. This is why I don't do anything illegal. It's Scott Peterson 2.0 all over again. Like, this is your exact same fear. Yes. I, yes, I need an alibi. Big T, you need to pay somebody to be your alibi, a hundred percent. But I said that to say that, like, the deep fakes, like, that could be a real problem with, like, the legal system. I mean, that itself would be a problem. But, like, I think, the same way that they can make the deep fakes, I feel like you can use AI to, like, verify the authenticity of a video. But if you told a jury, if somebody's life is on the line and a video is the evidence.
Starting point is 00:27:51 I'm positive that they can have the technology to kind of, like, backwards engineer it to see if it's been doctored or not. Like, you can do it with pictures, right? Like, in Photoshop, like, you can see if it's been Photoshopped or not. Like, I'm positive that they can develop that technology. So I'm not worried about it from that point of view. The point of view I'm worried about it from is, like, the same shit,
Starting point is 00:28:13 the misinformation shit, because our technology is continuously getting smarter and we're continuously getting dumber. And it's because people don't check what the fuck they think. Right. And that's the big issue. Like, once rumors get started, once the cat's out of the bag, it's hard to reel that thing back in. So it's just more the social zeitgeist, it's more what I'm concerned about.
Starting point is 00:28:27 I guess I meant, like, if I was on a jury, and this is in 30 years, when, let's say, a 15-year-old can, like, deep fake shit on his computer very easily and, like, professionally. And, like, the main critical evidence was, like, a video, and the defense brought in some expert that said, yeah, I think there's a possibility that video might be doctored.
Starting point is 00:28:51 And I was tasked with putting somebody in prison for murder, I would consider that to be reasonable doubt, that, like, maybe that video is not them. And there's no other, you know, physical evidence or what have you. So you're more worried about the fact that people will use deep fakes as an excuse to escape justice. As a fan of justice, I think there are all sorts of bad things that could happen, that being one of them. So it sows doubt. Like, everything is doubtable now.
Starting point is 00:29:14 Yes. Once deep fakes get good enough. And I mean, right now, I'm sure that we can go and look at a deep fake and tell you exactly what, like, shitty iPhone app somebody downloaded to make the deep fake. But, you know, five, ten years from now, I don't know if that's going to be the case. Twenty years from now, who knows if it's going to be the case. It's going to fuck everything up. I think there's really only one possible outcome, and that's if everybody just agrees to stop improving technology as a whole. Otherwise, I totally believe that robots are going to take over. And we're not going to stop technology, we're not going to come to that agreement. Um, or, I don't know, Quigs, maybe, like, your background. Like, you wanted to have a career in aerospace engineering at some point, then we hired you. Yeah. And now you work here. Maybe that's a solution. We just find everyone who's really smart working on giant technological problems, and we give them jobs like
Starting point is 00:30:16 creating optimization for thumbnails for Call Her Daddy on YouTube. Yeah, I feel like that could work. Just take them all out of engineering fields. Take them all out of engineering fields and have them come up with an algorithm to see which Guess That Ass should go up at what time per day. But there's a lot of, like, good that comes out of artificial intelligence, I think. Like a ton of stuff for, like, medical research. Like, I think if it came down to it, like, we really got our backs up against the wall with global warming,
Starting point is 00:30:47 it's like, we put enough computing power, I think that can figure it out. Just with stuff like that, like, I feel like if you can look at it that way, it's like, we have these things that can destroy the world. Like, that probably also means that they're powerful enough to, like, solve some of the serious problems we have if we need them to. Yeah, there's a Stanford study that they did where they took, like, radiologists, like, Stanford radiologists, and they took images of people's lungs, and they said, you know, okay, humans, analyze the X-rays and diagnose what's actually wrong with these people. And then
Starting point is 00:31:25 they put algorithms in place to see if they can diagnose it. And at the beginning, like, the computer was underperforming, but towards the end, when they started updating the algorithms, the computers were actually outperforming the radiologists for diagnosing these lung diseases. And to me, that's a plus. We just got to accept that shit and just roll with it, because there's nothing better than getting scanned, like, what's that, Big Hero 6? Like, having a Big Hero 6 at the crib and just like, yo, what's wrong with you? We scan you and we can get it popping right there. That's the type of shit I'm looking forward to. I know, all the rest of the shit, you know, Big T's murder trial soon come, you know. I'm not really worried. I feel like Big T's got
Starting point is 00:32:06 to go to jail for me to get my Big Hero 6, and I'm okay with that. That's collateral damage. No, and Arian, that's a good point, because, like, now you're thinking like a computer. There are a couple of mistakes that you're going to make in the interest of advancing humankind. And if Big T has to go to jail for 50 years in order to save 200 million lives due to early diagnosis of cancer, I think that that's a win for everyone. No. We'll have Big T call into the show and record his parts. Big T, there's nothing more Christ-like than sacrificing yourself for humanity.
Starting point is 00:32:42 Nah, stop that. That's disrespectful. No, it's true. It's just straight up a fact. Nah, that's sacrilegious. No, it's saying, like, live your life like Jesus. Use the teachings of Christ. Sure.
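The Stanford-style comparison described a minute earlier, humans and an algorithm diagnosing the same scans and seeing who is right more often, can be sketched in a few lines. Everything here is invented toy data for illustration (the labels, the error rates, the model names), not the actual study:

```python
# Toy version of the radiologists-vs-algorithm comparison: score each
# set of diagnoses against ground truth and see who is right more often.
# All labels below are made up for illustration, not real study data.

def accuracy(predictions, ground_truth):
    """Fraction of cases where the diagnosis matches the true label."""
    correct = sum(p == t for p, t in zip(predictions, ground_truth))
    return correct / len(ground_truth)

# 1 = disease present, 0 = disease absent (hypothetical X-ray labels)
truth        = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
radiologists = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # humans: 8/10 correct
model_v1     = [1, 1, 0, 1, 0, 1, 1, 0, 0, 0]  # early algorithm: 6/10
model_v2     = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # updated algorithm: 9/10

for name, preds in [("radiologists", radiologists),
                    ("model v1", model_v1),
                    ("model v2", model_v2)]:
    print(f"{name}: {accuracy(preds, truth):.0%}")
```

The "updating the algorithms" part of the anecdote is just the jump from model v1 to model v2 here: same scoring rule, better predictions.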
Starting point is 00:32:55 Be the change. I want to take a break real quick and talk to you guys about ExpressVPN. You know, it's not fair. The fact that Netflix hides thousands of shows and movies from you based on your location, then they have the nerve to increase their prices on you. That's right. They just raise their prices once again. You could just cancel your subscription and protest or you could be smart about it.
Starting point is 00:33:16 it. Make sure you're getting your full money's worth by using ExpressVPN like I do. So you might not know that what's on Netflix in your country is totally different from what someone in the UK or Japan has on theirs. With ExpressVPN, I can control which country I want Netflix to think that I'm in. ExpressVPN has over 90 countries to choose from. So every time I run out of stuff to watch, I just switch to another country to unlock new shows. And right now, there's so much new stuff that you can watch that's not on U.S. Netflix. And with just one tap of a button, ExpressVPN can let you change your location to many different countries and optimize your viewing experience. Here's the best part. It's not just for Netflix. You can use
Starting point is 00:33:57 ExpressVPN to unlock shows on other streaming services too. I can watch BBC iPlayer. It's free. It's only available in the UK. ExpressVPN is also super fast and works on your phone, your laptop, even smart TVs. So you can watch your shows on the big screen with zero buffering. So be smart. Stop paying full price for streaming services and only getting access to a fraction of their content. Get your money's worth by going to expressvpn.com slash macro dosing. Don't forget to use this link so you can get three extra months for free. It's three months for free at expressvpn.com slash macro dosing.
Starting point is 00:34:33 ExpressVPN.com slash macro dosing. Gandhi would have stood up on this table and said, I killed Laci Peterson, take me to jail, if it meant that Scott would go free because he was innocent. You know he should have gone free. Because then, if you truly believe that, you should raise your hand and say, I killed her. Well, that's not how that works. Sounds like you care more about yourself than... That's just not how justice works. So, all right, here's a counter to your irrational fear, Big T. If we reach the stage in deep fakes where it'll sow doubt amongst a jury, we'll also
Starting point is 00:35:09 probably eventually have robot juries. If you can teach an artificial intelligence system or a software system to, like, run through the prosecution case that's been put in front of them, like, okay, here's the DNA evidence that we have, here's the video evidence that we have, analyze it and tell us whether or not it's true. I thought about it for a second and I don't like it. Because, yeah, I mean, our entire justice system is predicated on a jury of one's own peers, and those aren't one's own peers. Unless we're already robots. What do you mean? Cap. Cap? No, it's not. Why?
Starting point is 00:35:54 I don't want to get into politics yeah I mean this is a different thing but like I mean that's that's what a jury is it's just other people in your community allegedly and definitely not in your community so the thing is like there's human error that goes into the justice system sure there's a lot of human error that goes into it because it's all tasks performed by humans that all have their own backgrounds and biases. So wouldn't it make sense then to have a software system that might not have those biases? Or this is where you get into the real deep shit, then you have to analyze the people that develop the software systems and be like their biases are straight up in the software
Starting point is 00:36:33 that they're putting out. Like, the funniest thing, I guess it's not funny, but it's really, it's strange how, like, they made those, like, automatic hand dryers that you see in, like, airport bathrooms and stuff, and they're statistically less likely to recognize a hand being underneath it if it's not a white person's hand, because the people that wrote it did most of their testing on, you know, the test subjects were white, and so they have, like, inherent bias. That's why when you see people, like, writing articles like, oh, people are calling hand dryers racist now. No, they're not calling hand dryers racist. They're just saying, like, empirically, you can prove that the people that
Starting point is 00:37:11 wrote the programs wrote their own biases subconsciously into it. Right, to that point, when Face ID first came out, there was a lot of people that were Asian that were complaining about that, the racism in technology, because, like, when they'd unlock their phones with the Face ID, like, multiple people could unlock it. And that shit is a real thing that you should pay attention to. Yeah. So I thought this was interesting.
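The hand-dryer and Face ID stories come down to one mechanism: a system tuned only on one group's data can fail systematically on another group. A minimal sketch of that, with every number invented for illustration:

```python
# Toy version of the hand-dryer example: a detection threshold calibrated
# only on one group's sensor readings can systematically miss another
# group. All numbers here are invented, not real sensor data.

group_a = [0.80, 0.85, 0.78, 0.82, 0.79]  # readings the developers tested on
group_b = [0.45, 0.50, 0.48, 0.52, 0.47]  # readings they barely tested

# Threshold chosen so the tested group always triggers the dryer.
threshold = min(group_a) - 0.05

def hand_detected(reading):
    """The sensor fires only when the reading clears the threshold."""
    return reading > threshold

rate_a = sum(hand_detected(r) for r in group_a) / len(group_a)
rate_b = sum(hand_detected(r) for r in group_b) / len(group_b)
print(rate_a, rate_b)  # same code, very different outcomes per group
```

Nothing in the code "hates" group B; the bias lives entirely in which data was used to pick the threshold, which is the point being made in the conversation.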
Starting point is 00:37:39 It's a way to think about the computer simulation, whether or not we're living in one right now. There are three possible outcomes. One, humanity will destroy itself before we reach the robot singularity, or the artificial intelligence singularity, where they take everything over. So that would be, like, all-out nuclear war,
Starting point is 00:38:02 global warming, asteroid hits the earth, whatever. Something catastrophic happens before we reach that point where we've created software program so powerful that it'll take over the world. That's one possible outcome. Possible outcome number two is that we are about to come up with software systems that will be so powerful that they take over the world in the next like 20 to 100 years, whenever you think that might be. Possible outcome number three is we've already come up with those software systems and we're currently living in the matrix right now
Starting point is 00:38:37 as a design of the software system that we invented. Let me ask you this. So either you're so egotistical that you believe that you will be around for the end of the entire universe or the end of the world, which like every generation, every society believes that wholeheartedly. But you are the special snowflake that will be around when the entire world has ended.
Starting point is 00:39:00 You either believe that or you believe that we're already in the simulation right now. And I'm not saying I actually agree with this thought experiment, but it's one of those things that you think about and it just totally fucks your brain up. I don't know. Let me ask you this. So if we're in a simulation, right, and we're developing all this artificial intelligence to potentially take over, what if this is finally the one where we develop the technology to overtake the technology that created us? And then we beat the simulation with our own simulation
Starting point is 00:39:33 And then what happens? I don't know. I don't know. I mean, this is... we're already... worse, you know? I don't know. But I'm just saying, if we were created
Starting point is 00:39:41 in a simulation, why couldn't we beat the simulation? That's an episode of Black Mirror, definitely. So I like the intrigue of an alien race
Starting point is 00:39:54 creating us as a part of their simulation. I'm a big, like, simulation guy. I think we are part of it. I think it's dumb. Really? Well, no, I wouldn't say dumb.
Starting point is 00:40:05 Dumb's the wrong word. But there is the, as long as a simulation can be built, like, to this level, it's dumb to assume you're not in a simulation. So it's tough for us to know if there was a something, a race, a human thing, that has got to the level to build the simulation. But once the simulation, like, floodgates open, you're far more likely to be a part of a simulation than to be just, like, a real life, whatever you would say life is on Earth. Well, now I'm very intrigued. So you think we do live in some sort of simulation? Statistically, yeah.
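The "statistically, yeah" intuition has a back-of-the-envelope version: if a base reality ever runs N simulations with populations like its own, a randomly chosen observer sits in base reality only 1 time out of N + 1. The N values below are purely hypothetical, just to show how fast the odds tilt:

```python
# Back-of-the-envelope version of the "floodgates" argument: once N
# simulations with populations like ours exist alongside one base
# reality, a random observer is simulated with probability N / (N + 1).
# N is a made-up count for illustration only.

def p_simulated(n_simulations):
    """Chance a random observer is in a simulation, assuming equal
    populations in base reality and in each simulation."""
    return n_simulations / (n_simulations + 1)

for n in (1, 10, 1_000, 1_000_000):
    print(f"{n} simulations -> {p_simulated(n):.6f}")
# The probability climbs toward 1 as N grows, which is the whole argument.
```

This is why the claim in the conversation is framed as "statistically": the argument is about counting observers, not about any evidence from physics.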
Starting point is 00:40:49 Like, I think based off that, I think it is a simulation. That's what I'm saying. So, like, two of the three outcomes assume that we're existing at one point in time leading up to the destruction of the world one way or the other. And then the third, it's like we're one of, basically, infinity simulations that have been created at some point in time. So yeah, statistically. Then you get into some cool shit where you're like, well, I'm not responsible for anything that I do. That was a simulation. That was the simulation talking right there. And that's where
Starting point is 00:41:23 Big T is like. I find that to be a reasonable argument, if you believe that we're actually part of a simulation, because then it's also reasonable to assume that you don't really have free will. Okay. I don't think we have free will. Mind's just blown right now. Elon Musk just came up with all the simulation talk because one day he'll get arrested for tax evasion, and then he can use the, like, I'm in a simulation, nothing that I do is my responsibility. And he'll have already told everybody else about the simulation, so he's poisoned the jury pool. That's actually not the worst theory I've ever heard. So I'm just, like, longing for an explanation that makes my dumb brain feel comfortable at the end of the day, and to me, like, just pinning everything on Elon Musk actually seems a lot
Starting point is 00:42:09 more palatable than trying to figure out, like, whether or not I'm real or I'm a software system. I'm okay with that. So that's actually intriguing, though. Like, if we're in a simulation, so, like, dying, or, like, reincarnation, would just be, like, you get another life, you know what I'm saying? Like a one-up. You get a little mushroom, Mario mushroom. That's kind of fire, dog. Like, I'm into this one. I hope we live in the simulation. That'd be... So, Quigs, I'm very intrigued by this. Yeah. So what is the goal of the simulation? Who, or what, like, what is the... I honestly don't even know what I'm trying to ask. No, I kind of get it, and I don't know if I have an answer. Like, it's not something I
Starting point is 00:42:52 walk around every day being like, this is all fake, it's real. It's, like, when I kind of think it... but I don't know. Like, I think of it as just, like, say we're not in a simulation, this is real life, but we're kind of getting it, like, the way computers are, like, evolving. Like, we could have, in a hundred years here, somebody where they could just run, like, almost like a GTA thing on their computer that is just, like, a simulation that's, like, real life for the people inside of it. Sort of thing. But I think, like, beyond that, that was a bad example.
Starting point is 00:43:31 But, like, yeah, I've never thought about who the simulation's, like, for, or if it's for someone. You guys seen Space Fleet, the episode of Black Mirror? Yes. Oh, yeah, yeah. Right. So, like, that brings up a good ethical question. Like, if you do create a simulation, right?
Starting point is 00:43:49 So this dude, like, took DNA from, like, his coworkers and people. He was, like, a weirdo in real life, but in the simulation he was the captain. And so he created, he could recreate people, and it made them, they thought they were alive inside of the simulation, but their real bodies were out in real life, for people that haven't seen it. But inside of the simulation that he created, he was like a dictator. He treated them like, he, like, beat them, he, like, tortured them, right? Like, is that ethical, to be a dictator in your own simulation? Like, I don't know, man. It's like, yeah, I don't know. And that's, I don't know either.
Starting point is 00:44:27 Real. I feel like there's just so much of, like, in the media and movies and stuff like that that, like, portrays AI poorly. Like, it, it kind of, like, the, the idea that I feel like people, like, think of actual robots and, like, the, like, movie, like, that's what I said earlier. And it's just, like, that's not even a part of it, really. Right. Like, anthropomorphizes them turns them into, like, human-like characters.
Starting point is 00:44:53 Yeah. When, like, that's not really... Yeah. Yeah, like, there's just the, like, idea of them ending the world, like, taking over and, like, taking the humans hostage. And I just don't see that happening. So there is going to be an element of, well, I was going to say, if you're a computer, right, like, wouldn't your end game be, like, the most logical step, right? Because they're all about, like, logic and math and computation, right? So, like, logically, we're a cancer.
Starting point is 00:45:28 Like, we are not a beneficial thing to, not humanity, the ecosystem of the Earth. Like, we actually harm it. And so, like, that's why I think people get the whole, they're trying to take this thing from us, because, like, we know we ain't shit. And so it's, like, a subconscious... Like, if you really think about it, we are actually the problem. Even though we develop answers for the problems, we are actually the problem. And so I think if you have, like, an artificial intelligence that ends up becoming sentient, or something to the likes of sentience, they would make the logical decision to be like, y'all can't have power, because you fuck shit up. Like, that makes sense to me. I think that's what we think of when we think of sentient. It's like, sentient is when the robots become smart enough to realize that we're a bunch of assholes.
Starting point is 00:46:17 And then at that point, there's, like, there's no turning back from there. So, like, in an analogy that Big T would appreciate, this would be like humanity developing these programs. It's like if Scott Peterson went out there and hired a private investigator to find the real killer, right? And then the private investigator at some point realized, well, it's just obviously you. You ordered those two new porn channels while she was out missing somewhere. Like, that's, at that point, the private investigator has become sentient. So it's like, if we can develop AI, because I agree with Quigs, like, AI has been a great thing for most of humanity. It's improved a lot of stuff. It can fight world hunger. It can give you early diagnosis when it comes to a lot of diseases. But if we can figure out a way to just make sure that the technology doesn't ever reach the point where it gets smart enough to realize that we're dickheads, that would be optimal. But is there,
Starting point is 00:47:01 Is there a way to do that? Well, that's, I think, like, not that we have, but, like, I don't think anybody wants to, like, hear the answer that, like, yeah, it's under control, it has parameters. Like, I think people just hear the, it gets smart enough, it can just do whatever it wants to. And I just don't believe that. Like, you're talking about this, like, the AI wants to destroy the human population. But it's like, there's not one single AI.
Starting point is 00:47:35 And I just don't believe. Like, there is, like, you're talking about this, like, the AI wants to destroy the human population. But it's like, there's not one single AI. It's literally just like very advanced coding that is heavily done by a human and it just uses like artificial intelligence to make that code more efficient and kind of fill in the blanks that that help it be better. So like all these worries, is that like a little bit of us just projecting our own insecurity onto things? I mean, I think so. Like I think I see a lot more good than bad out of AI. I do, I do too, but like, so like, there was this one, correct me if I'm wrong.
Starting point is 00:48:19 I could be totally off base here. But I remember reading the story about, like, there's these Facebook algorithms. Thank you, Pete. Thank you, Pete. No, whatever thanks, Pete. Thank you, Pete. We'll leave that in the podcast. We'll leave that in the podcast.
Starting point is 00:48:33 Thanks, Pete. What a, what a gem of a robot. Great guy. Arian, you were talking about Facebook. Unfortunately, I was talking about Facebook. Yeah, the Facebook bots that they created. Right. So I was reading an article, and I'm probably going to butcher the shit out of this, but it was a while back, and Facebook had, like, these two algorithms or something like that, and they ended up, like, kind of creating their own language and kind of communicating and talking to each other.
Starting point is 00:49:03 So it was like, it was English, but it was like there, there, over, over, whatever the case may be. And the creators of it kind of stopped it, and they were like, yo, this is kind of wild. And to me, that's, like, the bigger implication. It's like, when you look at processing power now and processing power what it will be in 20, 30, 40, 50 years, it's going to be exponentially faster. And so what's to say that there won't be an algorithm that's written, right? And I don't know enough about the shit to say this is possible or not possible. I don't know. Maybe you can tell me more than I know. But what's to say that one of these times, like, say we get into quantum computing.
Starting point is 00:49:41 And these programs end up bypassing the fail safes that the people put in place to stop them from doing any kind of thing on their own. And I just don't know enough about it. So to me, that's the bigger part is like, I just don't know. We're in the baby stages of the processing power of these computers. And it's already crazy how it's changed our lives. So 100, 200 years from now, I don't know, dog. I feel like they could reach that level of, I do what's most logical or what I, you know, say, what I think or what even is a thought. But it's just, it's just crazy to think about.
Starting point is 00:50:20 I've, I've definitely read things about how there are safeguards in place. If you're developing systems like that, like one way to, you know, they had the ability to pull the plug on the experiment when they saw that they were, that they were creating their own language and talking to each other. Some people think that if you're going to develop systems like these, you just don't have internet that connects them. You keep them off Wi-Fi. You keep them in a closed system, essentially. My solution is much more elegant. Like, if I'm just trying to figure out how to prevent these computers from taking over the world, I just invest heavily in magnets.
Starting point is 00:50:52 I just start, I buy a shitload of magnets, because that's their Achilles heel, right? Maybe, I don't know. Maybe I'm wrong about it. Aren't magnets just, like, a sweet way to kill any electronics, again? Magnets will fuck up electronics. So that's what we need to do. We just need to, like, have, like, a giant silo of
Starting point is 00:51:15 magnets ready to be deployed on cruise missiles at any given time, and let the robots know that shit, and then they'll be like... They have to have a healthy fear of us, you know? Yeah. I mean, I think that's, like, what EMPs are. Yeah, like, isn't that basically electromagnetic, like, just, like, wiping out this stuff? So we have that technology. The Sentinels in The Matrix. But it's also as easy as just, like, unplug it. Like, that's easier than wiping it. Like, these things can't work without, like, power, stuff like that. Do you guys think that artificial intelligence will ever be able to create art that is as beautiful as what the human mind is capable of creating? Yeah. Yeah. Big T, what? It depends what you mean by art. Like, visual, like a painting? Sure. Like, are computers going to be able to write, like, songs? I don't know. Yes, I think
Starting point is 00:52:08 they're already, like, doing that. They are. But, like, music is nothing but mathematics. Music is just mathematics. How good are the songs, though? I would have to listen. They slap, correct? There is a piece that's unfinished, I think it's Schubert's Eighth Symphony. It's a famous, famous symphony that was never finished. And in 2019, Huawei, I believe that's the name of the Chinese technology company, announced that their artificial intelligence system finished the last two movements of the Eighth Symphony.
Starting point is 00:52:44 And I listened to it. It was pretty good. It was pretty good. It did sound a little robotic when it was like, okay, they just basically understood that the chord progression in Pachelbel's Canon is appealing to the musical ear. And so they just, like... It sounded a little bit basic, but it was pretty impressive. Do you want to hear it? I can pull it up.
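The "music is just mathematics" point is easy to demo in miniature: a tiny Markov chain can learn word-to-word transitions from a few example lines and walk them to produce new ones in the same style. The corpus below is invented, and real systems like the one described above are far more sophisticated than this sketch:

```python
# A minimal Markov-chain sketch of "learn the patterns, generate more":
# learn which word follows which in a made-up country-lyrics corpus,
# then walk the chain to produce a new line.

import random
from collections import defaultdict

corpus = [
    "my truck broke down on a dirt road",
    "my dog rode shotgun down that dirt road",
    "cold beer and a dirt road at night",
]

# Count word-to-word transitions across the corpus.
transitions = defaultdict(list)
for line in corpus:
    words = line.split()
    for current, following in zip(words, words[1:]):
        transitions[current].append(following)

def generate(start, length=8, seed=0):
    """Walk the chain from `start`, picking a learned follower each step."""
    rng = random.Random(seed)  # seeded so the sketch is repeatable
    out = [start]
    for _ in range(length - 1):
        followers = transitions.get(out[-1])
        if not followers:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(followers))
    return " ".join(out)

print(generate("my"))
```

Every word pair the generator emits was seen in the corpus, which is the sense in which this kind of model "can only be as smart as what we taught it."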
Starting point is 00:52:58 it sounded a little bit basic, but it was pretty impressive. Do you want to hear it? I can pull it up. Well, that's what I'm saying. Like, I'm sure a computer can take pieces of information from a song and finish it.
Starting point is 00:53:08 but can it, like, write, I don't know, like a country song with, like, lyrics that are, like, good. Like, I don't know. Absolutely. Yeah. Yeah. Eventually, it'll be, like, I'm sure there's stuff now where they can almost do it, but it sucks. Like, eventually they'll be able to do it. Wasn't there that, um, uh, and this is to the point where it's like they, like, they just learn, right?
Starting point is 00:53:30 So they just learn. Those algorithms, they just learn. And so there was that program that listened to, like, all of Donald Trump's speeches, and it, like, spit out its own speech that was, like, eerily similar. It was wild. And so, like, if you just input all kinds of songs, country, R&B, jazz, all kinds of songs, I'm positive it could come up with its own lyrics. And, you know, if it's good or it's not, that's the subjectivity of art. But it's just, yeah, it's going to be able to create art. I also like that Big T went straight to country music as an example of a complicated song that would be tough to finish. Those are the best lyrics you can get, country music. Well, yeah, if you're
Starting point is 00:54:12 a fan of the best storytelling. 90% of country music, and I consider myself a fan of country music, but 90% of it is slogans that you see on a t-shirt. That's contemporary. That's like, uh, yeah, I'm a bigger Patsy Cline fan than you, man. Right, like, old school, good country music. Yeah, not that it's not the best lyrical, like, come on, man. I think it's the best telling of stories. All right, Coal Miner's Daughter, great song, three chords, maybe four for the entire song. Sure. As most of the lyrics were- Pop music now. Right. I'm just saying, I think country music would be like real low-hanging fruit. So this is- I was just trying, I was picking a genre.
Starting point is 00:54:51 This is the- When they create, when they create, I'm going to let you get it off. But, like, when it, like... 100% pop music is super easy. It's like three or four chords usually. It's really, it's really simple to play. But, like, when you start, like, creating, like, jazz and those kind of improvisations, like, that's when it gets like, man, this is getting wild. It's also cool hearing the human body be able to make those sounds, whether it be, like, singing them themselves or, like, the ability to play it on a guitar or on a piano.
Starting point is 00:55:19 Like part of the reason that I enjoy watching live music is because it's awesome to see people's hands do things that I thought were impossible to do. And like knowing that it's a human, like an imperfect thing that's playing these. That's part of the reason why I like it. But this is the unfinished symphony. A robot wrote this. I don't know. All right, guys, we want to talk to you about Mac Weldon.
Starting point is 00:56:22 This year's spring is going to hit a little bit different because we're finally starting to get back outside and see our friends again. No matter where your adventures take you, bring the comfort and style of Mac Weldon along for the ride. Trust me, your closet is going to thank you. Whether it's their hoodies, polos, tees, or shorts, everything in the Mac Weldon collection mixes and matches seamlessly to fit in with any other trendy products you have. You're going to want to get out there this spring, play some golf, go to the pool, do all that stuff. Mac Weldon's got all the stuff you need. Vesper polos, perfect for playing golf. Dry-knit t-shirts are what you're going to want this summer. The Ace sweatpants, my personal
Starting point is 00:56:58 favorite. They're the best sweatpants you can get out there. Everything you possibly want. Macweldon's got it, socks, shirts, hoodies, underwear, whatever you want. It's all versatile, all mixes and matches. It's the perfect combo for this spring and summer. They've also got Weldon Blue. They're totally free loyalty program. Level 1 is going to get you free shipping for life, whether that's in a simulation or whatever it is. And then once you reach level 2,
Starting point is 00:57:19 you're going to get 20% off every order for the next year. And Mac Weldon guarantees they want you to be comfortable. So if you don't like your first pair of underwear, you can send them back and they'll still refund you, no questions asked. That's a great deal. For 20% off your first order, go to macweldon.com slash dose and enter promo code DOSE. That's macweldon.com slash DOSE,
Starting point is 00:57:40 promo code DOSE for 20% off. Mac Weldon, reinventing men's basics. Absolutely. That's fire, though. So, yeah, the computer did a pretty good job on that one. And then in 2016, there was a Rembrandt painting that was designed by a computer. And it was created by a 3D printer. And this is like 400 years after Rembrandt died. And the result
Starting point is 00:58:00 could trick any expert. So, like, they had Rembrandt experts look at it and told them, like, hey, this was found. This is a painting that we didn't know existed and was found. And they're like, yeah, this is a Rembrandt. And that's the craziest part to me. It's like, I always felt like art would be the last thing to be taken over by any sort of
Starting point is 00:58:23 like software system. But when you break it down, a lot of it is like formulaic to a certain extent. Like everything builds off what we learn from the generation of artists before us. And then somebody after us will build up. off of that and so forth. So it does kind of make sense that they'd be able to sort of program a computer system to do it.
Starting point is 00:58:43 And also, to Quig's point, we might already be in a simulation. Yeah. Yep. So would you say that Rembrandt did that painting? Or would you say that the computer system did that painting? Because all the computer system did was study Rembrandt paintings
Starting point is 00:58:58 and then combine them to make a new one. So who gets the royalties off that? Got to be Rembrandt. Well, it's 50-50, whoever made the robot and Rembrandt's estate. Yeah. Yeah. Because, I mean,
Starting point is 00:59:12 that's like kind of where I keep going back to this it's just like all of this is like pretty human controlled everything there which I know that's like the fear is like once we break out of like that he's just copying
Starting point is 00:59:28 Rembrandt's paintings he just like the robot's just going off like this code like I don't know like it's it's all just like human knowledge going into it so I don't like I don't see how
Starting point is 00:59:43 I don't even know where I'm going with this but no I get what you're saying though because I feel the same way yeah I just think all the inputs are from humans so like there's only so far it's going to go like with it can only be as smart as what we taught it yeah to a certain extent yeah I just don't agree with it
Starting point is 01:00:02 I just think that we're in the, like, the very beginning stages. I mean, what, computers really started popping, what, in the 90s? Like, floppy disks and Oregon Trail? For, like, personally, yeah, in, like, the 90s. Yeah, yeah, yeah, P and PC. So it's like, and now, like, I'm sitting here talking to y'all across the island of New York. I'm in Houston. That's, that's fucking crazy, a 20-year jump.
Starting point is 01:00:26 20 years from now, who knows, 200 years from now, if we're still here, dog there's just no telling and I mean I just don't know enough about the shit but I feel like we're gonna find a way to fuck it up but I think I think we have to just embrace it though like all the like people are like scared of the shit
Starting point is 01:00:43 Like, we just have to embrace it, because it's just part of our lives now. Like, we just have to embrace this shit and try to use it for the good while it still likes us. I got a dumb question, Quigs. Yeah. Is there an off switch, just, like, for the internet?
Starting point is 01:00:57 For the internet? Yeah. No, I mean... No, I feel like there's just so much. It's so decentralized. Where is the internet? Good question. It's everywhere. It's just...
Starting point is 01:01:11 But what does that mean, though? Like, what? I'm sure it's just, like, a collection of servers. What the fuck is the internet? Servers. It's high-speed cables that run underneath the ocean. Fiber optic lines
Starting point is 01:01:24 just draped all across the globe. And it's just everywhere. Like... Yeah, but there's not one person that could be, like... There's not a room that Tom Cruise could break into
Starting point is 01:01:35 in a mission impossible no but save the world I guess there was like we were kind of talking about it earlier with like Amazon like if Amazon Web Services shut down their web services it would fuck a lot of stuff up
Starting point is 01:01:47 and then it's like there's probably a lot of like it's the internet, things are connected there's probably like sites that aren't on Amazon Web Services but have like plugins that use Amazon Web Services where they'd get taken down in large part and it would hurt a lot of things. But I don't know if there's something where you can click a button and nobody can do anything on the internet.
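The cascading-failure point Quigs makes here — a site that doesn't run on AWS can still break when AWS does, because something it embeds is AWS-hosted — can be sketched in a few lines. Everything below is hypothetical for illustration: the suffixes are common AWS-hosted domains, and the page URLs are invented, not real sites.

```python
# Toy illustration of a hidden third-party dependency: a site that hosts
# nothing on AWS itself can still break if one of its embedded resources does.
# The domain suffixes are common AWS-hosted ones; the page URLs are made up.
from urllib.parse import urlparse

AWS_SUFFIXES = ("amazonaws.com", "cloudfront.net")

def depends_on_aws(resource_urls):
    """Return the resource URLs served from AWS-hosted domains."""
    hits = []
    for url in resource_urls:
        host = urlparse(url).hostname or ""
        if any(host == s or host.endswith("." + s) for s in AWS_SUFFIXES):
            hits.append(url)
    return hits

# A hypothetical page that looks independent but pulls one AWS-hosted script:
page_resources = [
    "https://example-news.com/index.css",
    "https://cdn.example-news.com/logo.png",
    "https://widgets.s3.amazonaws.com/comments.js",  # this breaks if AWS does
]

print(depends_on_aws(page_resources))
# -> ['https://widgets.s3.amazonaws.com/comments.js']
```

The page itself lives elsewhere, but one flagged resource is enough for an outage to ripple through, which is the "decentralized, yet fragile" shape of the internet the crew is circling.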
Starting point is 01:02:10 Do you think that for most people, let's say like a moderate income individual living in some sort of urban center, do you think that if the internet shut down tomorrow, if we said we're going to take 30 minutes or excuse me, 30 days away from the internet? Do you think that people's lives would get better to a certain extent? No. Do you think at the end of it, it would cripple everything? I mean, that's what that's what the whole fear of EMPs is, right? Like if you just shut down all the electronics in our country. Like we like shipping channels go down like everything.
Starting point is 01:02:43 We would start eating like distribution of food. 30 days. So many things. 30 days is pretty quick to jump to cannibalism. I mean, I guess I'm talking about more than just the internet. You might say like we'll start growing our own crops. If we had no access to electronics in this country. like think of how much the herb garden in my flower box uh i mean sure maybe i'll purchase some
Starting point is 01:03:05 grains at the farmer's market and have a bunch of rice ready to cook and boil over an open flame now he goes straight to like well that's why my neighbor's going into a crock pot first of all that was very obviously an exaggeration but secondly that's why those exist like that's why they're they're a weapon yeah i'm i guess i'm speaking more from a theoretical point of view of like it just in terms of human happiness, are we happier with the level that technology has got to right now than we were back in the 1950s? No way. How are we judging happiness? I guess it's like completely subjective. But if you take people in similar socioeconomic places and similar like geographical parts of the world, are they happier with access to all the information than they were before
Starting point is 01:03:55 they had the access to all that I would I would there's so many variables to consider but I would venture to say no think about how happy Lawrence Taylor was with no with no internet that's true yeah you always hear people like this is a big thing that like college football analysts say like thank god we didn't have cell phones when I played we love talking about how cool they were back in the day before they could be recorded doing uncool shit like the like the depression rates or the suicide rates like from the 90s and now like it's it's higher right and so i mean a lot of it is how we kind of curate our social media platforms and these greedy-ass companies but i think in general the more information that you have the more depressed you probably would be but erin you just said
Starting point is 01:04:44 a second ago that like you want to be one of the first people in line like load me up with the chips that make me feel good like the the computer starts third third generation sorry third generation after they worked out the kinks like you want to be one of those first people what would it take for them to convince you to be like okay now it's time like what would the features have to be where you you'd be like okay i will sign up i will get the chip in my body because it means like i don't know i can see through walls or whatever well no i think i think it's it's my my my vision of humanity is that we become an intergalactic species and we travel the stars and we go visit other planets it's like that's the goal in my opinion like right here we're just we're primitive right now
Starting point is 01:05:27 and so the goal should be to go really explore the universe because it's crazy out there 13.8 billion years of just who knows and we we sit here and argue about stupid shit all day but it wouldn't convince me I mean it wouldn't take much to convince me if it if it works then it doesn't cause a shortage I don't get a stroke from the shit I'm game give me Wi-Fi and let me think I think it's going to end up being something like really stupid and inconsequential that makes more people sign it like it'll be like uh get your fifth five dollar foot long for free at subway if you put this chip in your finger and you just scan it at the register like that would probably get more people signed up for it than some sort of like
Starting point is 01:06:08 a medical benefit at the end of the day like we want give me my fucking sandwiches give me lunch give me a discounted of lunch that that's that's a shit that makes a world go around yeah governor murphy just came out with uh it's a shot in a beer if you get a vaccination i saw shot in a beer yeah holy shit have you have you seen the um the the thought that elons already implanted himself with neuralink no because there's you can look and a lot of people say it's because he has a photographic memory but you can like watch him on rogan and stuff and he has these like short pauses and like rapid firing of his eyes before answering like tough questions that people think that it's like the neural link and he's like scanning through like the data and
Starting point is 01:06:52 all that stuff. So how does that, how does that manifest itself? Like if you have Neurlink, which is a company that makes kind of what we're talking about, like a third party interface between your brain and a software system, right? That actually physically goes inside your head. What does that look like? Are you seeing like a heads up display on your eyes? Is it like Google Glass, but it's just inside your retina? That's how I imagine it. I don't know at all, but like that's how I assume it would be. And it would make sense, like that's just kind of like how all like, evil super geniuses go they just keep pushing the limits like on themselves and fire and you don't have thoughts no more you just have like open tabs in your head I don't put to open tabs I do you want to
Starting point is 01:07:35 like go back and access your entire browsing history in your brain like everything that you've ever thought of memories compartmentalize your memories that'll be all this is all beneficial shit to me it would keep it would keep people more honest right because like now it's like I can't I can't gaslight people and say oh I thought you said like no bro I remember it like this is full proof memory like and we can recall that I could show you like that other black mirror episode where you can like record
Starting point is 01:08:01 what you're seeing then it'll make a more it'll make a more honest look at you look at your big teeth like no you see how that went for them you see how that went for them not well bro could somebody have your nerling though no but like but like what's the
Starting point is 01:08:16 the dude goes into like a job interview or something right at the beginning and they're like what did you i forget how the whole thing transpired but he's like they have the ability to just rewind and like see everything you've done that's a horrible idea no he it's it's you can i mean it's one way to look at it but the other way to look at is okay let me see where i fucked up in this job interview how can i better myself for the future and the only really thing that happened in that episode was short it was cheating that's all that was like right the whole episode was about short of cheating so you basically want to you want to be able to just like watch film of your own life like
Starting point is 01:08:49 break it down and stuff. I mean, that is the way you get better, fam. You get better by watching film. Do you want other people to be able to watch your life, though? Sure. If you want to watch me beating off of this important, hey, feel free. See, I guess. I was a matter of fact, I had a email from a scammer one time. And he was like, I guess like, you know, I had an old password. And he was like, let's let's, let's just be blunt. This is your password. And he put one of my passwords up there. And I was like, oh, so it kind of got me off. guard. I was like, what is this? And he's like, I have access to sources. He's like, deposit me $10,000 or I'm going to show them. He's like, he's like, I saw what porn
Starting point is 01:09:27 you watch. He said, nice taste by the way. And he's like, I'm going to give it to you. And I was like, nigga, that shit's not going to startle me at all if you send out videos of me being my meat. Like, it's probably just going to make, I don't know, they ain't going to bother me than no money. I said, go ahead. Go ahead and do it. So it's like, I don't know, man. That shit don't bother me, man. I just don't commit enough crimes to like, be worrisome about niggas like invaded my privacy like that you don't have to commit a crime though i mean like you you would be fine with people just being able to access everything you've ever done yeah i just have a problem with that is somebody could hack your shit somebody could
Starting point is 01:10:07 if you've reached the point where you have neuralink in your in your brain somebody could hack your neuralink sure implant a false memory and then they're like boom we found this in your in your your neural link and then you believe the ultimate form of gas lighting right like somebody actually does like brain wipe you put their own memories in that's yeah that's not even gas light it's electrical lighting also also like if this if this existed in the the black mirror like since the area and i are talking about like nobody could ever lie ever again like lying wouldn't be a thing if other people were able to access your your deal which like how is that a bad thing Well, I mean, it's, I'm not, I think it is a bad thing changing the entire fundamental way the world works.
Starting point is 01:10:53 But if like, let's just, you don't want to, your friends are like, hey, let's go out tonight. You're like, oh, I can't. Sorry, I've got to do this thing. And then they're like, oh, no, you're, it, that's a bad example. I don't know. No, it forces you to be honest. You have to be honest with your friends, but I don't feel like going out tonight. It opens the door for healthy communication.
Starting point is 01:11:12 Again, yes. Tell me, you're correct. That was a bad example. But. Give me an instance where lying is. okay. I'm not saying it's okay. I think lying's like just a part of life. Right. You are fundamentally altering the way the world works. I think lying's okay to a certain extent. Like Aaron, I'm listening. Okay, you let's say hypothetically you send the group text a picture of your child. So you have a new
Starting point is 01:11:36 baby, newborn baby. You're like, isn't my baby cute? And somebody does not think that your baby's cute in the chat. They reply, no, it's not. It's an ugly baby. I'm not saying you'd have an ugly baby, but I'm saying like there's certain circumstances where you have where it's polite to lie. I have put a preface out there for all my friends. Do not send me baby pictures. Also don't send me holiday cards because they're stupid.
Starting point is 01:12:02 And so I have yet to see. I saw one cute baby, one cute baby. All babies are ugly. They look like toes. They look like wrinkled. They just don't look like humans and they're not cute. I don't understand it. Infants are not cute.
Starting point is 01:12:17 And it just, it just makes life better if you're honest with people that you love, man. Just be honest. It's okay. I'm not saying no perfect. I definitely lie. But if I didn't have the capacity to lie, the communication would be better. Okay. I feel like it makes the communication work.
Starting point is 01:12:33 Like, if you're just kind of like. There are so many more problems. Yeah, just like a walking by your coworker in the hallway and being like, hey, how are you? And you just like good. Like that's not even like the true like. But no, it's like you. There's like. things where it's like they're not like realized but you just kind of like you say things to
Starting point is 01:12:51 people to kind of like get through certain yeah what's uh hey ft how's your weekend i diarrhea yeah like it's that's not the answer you want that's what happens you just want a good and keep it moving you know you can say bad and keep it moving that's true i don't know the whole the neuralink stuff kind of freaks me out although i'm sure that once they come up with something worse I'll be like, yeah, sure. Shoot your poison into my brain. I don't care. Load me up with a good stuff. I just want to be the board. Quigs, we were talking before we start recording about the theory of, is it called single player? I think so. Or at least that's kind of like what I think of it as. Can you explain that to me? Because I heard a little bit about it and
Starting point is 01:13:37 it sounds fascinating. So it's something that like I kind of like thought of. I didn't know there was an actual theory. So I don't know if it's really simulation. Like I grew up thinking like, like, that, like, I'm the only real person on Earth. Truman Show. Yeah. Like, it's a sort of thing where it's probably, like, very vain. But it's like, when we leave this room, like, you don't exist. You, like, are basically kind of just, like, put in my life for, like, my purpose of, like,
Starting point is 01:14:09 beginning to end. Uh-huh. And, yeah, I just, like, I think that the world basically, like, revolves around me sort of thing so but it has that work are we we're not real are we made of like electronics or we just no it's like you're so like like flesh human body but like if you're not if i'm not with you like you have no brain function like you're you're you don't exist well what what actually we all go into a room and then we just sit around we're like i wonder what quiggs is doing and then we just hang out for a while and then you come back in the office and everyone's he's
Starting point is 01:14:47 come and we flash the lights. So you don't have that like theory at all? Like there was, is there like anything? Because I've told people before and they're like, holy shit, I kind of think the same thing. So I, to a certain extent, like when I was younger, I kind of had like, what if what if this was a Truman show? Because when you're a kid, you do think that the world revolves around you because, you know,
Starting point is 01:15:07 you lack the context of the world to like have a perspective of other people's views. But yeah, there was definitely like a little bit. to that. I think it's very common. Yeah. But single player, that implies that like, are you in control of what you do? No. And it's like, it basically just like, my like theory on it just like leads to
Starting point is 01:15:30 normal life. It's like it's not like a you don't exist, but it's like or if like I guess if I assume that nobody else exists, I could do whatever the fuck I want or whatever. But it's like part of it. It's like being nice to people because like you're going to see them again. And like, developing relationship and stuff like that to where it's like that's i'm just describing life yeah because you'll be happier and it's like i just don't see you when you're when we're not together and i
Starting point is 01:15:56 kind of assume that like like i can't i can't imagine like you being at home tonight like having the thoughts that like i have in my head like i don't think that that happens i don't know it's it's a weird thing but are you saying are you saying you don't have empathy is that i probably i mean maybe like it's but it's i don't know it's like a weird thing where it's It's like, we've never met. Like, I, I, like, just can't really wrap my head around, like, you having, having, like, internal thoughts and, like, doing things, like, when... Because that's you. No, it's not, it's not me.
Starting point is 01:16:32 No, no, I'm saying, like, you're the one that has the internal thoughts. Yeah. And it's, like, I feel like I'm the only one that, like... Why do you feel like that? This is... I don't know. It's, like, just, I always just, like, assume that if I'm not there, like, it doesn't exist, sort of. type of thing so i i don't feel quite like all the things you said but i do find myself thinking sometimes
Starting point is 01:16:53 like i'll see on twitter like when something crazy happens i don't know pick an event yeah and i'm like wow like that happened 20 minutes ago and i was like sitting at my desk like i was like how it feels so foreign that something that momentous could happen while i was just like sitting at my does that make sense yeah yeah like that's i find myself thinking things like that sometimes I guess, I don't know. I guess it's intriguing to me that y'all think like that. Because I don't. Like all I, the majority of my thinking is how will this affect other people and their feeling?
Starting point is 01:17:32 Like, that's how kind of how I operate. Yeah. And I do that as well. Like, it's like, it's like the feelings and stuff are important. And it's like I'm not like a complete like psychopath where it's like that. But it's like, no, I know. the way I explained it but it really just comes down to like I sort of like just like there's a lot of people that I'm like that they like I don't understand how they exist like I don't know it's weird
Starting point is 01:18:02 it sounds it sounds extremely have you ever heard of a solipsism no I don't think so so it's so it's a philosophical thinking that and you like it's it's a philosophical thinking that says only I exist and I I'm the only one that exists because I'm the only one that can be proven to exist. So it's like that old Renee Descartes. Then I think that's exactly what it is. And so solipsism is like the extreme of saying, I, I only exist. Like this could be all a figment of my imagination. And some solipsis actually think that it is a figment of their imagination, but they know that they exist.
Starting point is 01:18:38 And one of the more beautiful or eloquent rebuts than I've heard who's got a better name of Matt Dillahunty. And he was like, because you can't really, it's called like a problem of hard solipsism. Like how do you, how do you prove that other people exist? Like you really can't do it. Like your thing isn't that far off because it's a very poignant point in philosophy. So like he said there's only way that I can like solve hard solipsism in my head is like, I know my capabilities. Right.
Starting point is 01:19:05 He's like, and Beethoven created like this beautiful piece. He said, I know I can't do that. So some agent outside of my body had to have created that because I can't do that. shit is fascinating shit though yeah well i think at the end of the day no matter what we determine if we're living in like a simulation if you're a single player guy whatever i actually don't think any of it matters i really don't i think because i'm going to act the exact same way either way because this is my reality fucking if i'm a computer and this is just all been a simulation i'm going to probably keep doing what it is i'm doing it is interesting though to think
Starting point is 01:19:39 like big t what you were saying how like it's it's tough to comprehend that stuff happens and then you find out about it and it happens somewhere else. Like, yeah, in like an object permanence way, it's like, yeah, dogs don't understand that the tennis ball is inside their creative toys unless you, like, take it out and show it to them. It's just like that for us,
Starting point is 01:19:57 but on like a much bigger, more complicated scale. But if you grow up famous, let's say you grow up in like the royal family, Prince Harry or whatever, it would actually make a lot of sense for that guy to believe that I'm the only person that exists in this world because literally everything that you do, everyone else talks about all the time.
Starting point is 01:20:16 He has far more of a right to think that than I do. No, but I think it's like inside all of us a little bit. But if you if you grow up like being super famous like that, it's probably like you'd be crazy not to think that. Because if you just boil it down, you're like, is this like I'm one out of six billion people on the planet. And I'm the guy that's the fucking prince that everybody talks about all the time. If you really sat down and broke it down that way to yourself, you'd be like, yeah, the chances of that being real probably pretty small. The world is just probably about me. Yeah.
Starting point is 01:20:52 Like either I hit the lottery of all lotteries or just like this is some like alien playing a video game doing like print simulator 3,000. That'd be so fun. But none of that shit matters. None of that shit matters. At the end of the day, I think even if we're working in a simulation, even if we're about to invent. a computer that's going to nuke the world, or if the world's already ended, nothing really matters. I'm not going to change the way I do things.
Starting point is 01:21:21 And you shouldn't? For in a simulation and you don't have free will, you might not be able to. That's true. The free will, we'll have to talk about that. What do you think about free will? Me? What do you do? Oh, yeah.
Starting point is 01:21:34 What do you think about free will? Free will, I don't know. That's one of those conversations that when somebody brings up free will, I'm like, that's above my pay grade. I don't know. Like my philosophical pay grid at the end of the day, I'd rather just say, fuck it. I don't know. But yeah, I think to a certain extent we probably have free will. There's a lot that goes into it. Like there's different things that, you know, bring everybody to each exact point in life that they're at any given time. And then they'll make a decision based on what they've kind of learned and how their past is gone. But you still have
Starting point is 01:22:06 the ability to make choices. What about you? I'm a retort. I'm just interested. Huh? What about you? I don't think we have free will. It's a long process, but basically the gist of it is.
Starting point is 01:22:25 I alluded to this when we're talking to Buddy about biohacking. I think I mentioned it briefly, but you are, like all we are, we're products of our genes and our environment. Like, there's nothing else that we're made up. up. And so that's literally how organisms evolve on this earth is their genetic makeup and the environmental factors. And so you literally aren't in control of either one of those things. If you're genetically predisposed to like certain things, feel certain things, there's nothing that you can do to change that. And if there is, it's an environmental factor. And there was, that was totally out of your control as well. So you're just basically a byproduct of your genes and
Starting point is 01:23:08 your environment. So your choices that you're making, they might seem like they're yours, but they're really not. You're a product of where you're at and you're a product of what's inside of you, like, not like soul or anything, but like genetics. You're a product of your genetic makeup. And so those two things combined with some other things like with physics, I don't get too deep into that, but basically the past is happening, now is happening in the future is also happening this is like known to be true like all of that those factors are by like no i don't think we have free will i think we're just kind of like existing and it's you're kind of like spectating and it's kind of so we're kind of this is kind of a tangent but i hadn't really
Starting point is 01:23:52 thought about this until now and i'm curious so if you don't think that we have free will then let's say like a mass murderer is it right bro no okay again i yeah i get i no but no but this is a totally different question. I do see that I've had murder on the brain recently. But if there's a mass murderer, right? And cannibalism. This is a serious question. If there's a mass
Starting point is 01:24:17 murderer, is it right for us to put him in prison? If you don't, if you don't think that he had control over those decisions. Yes. Why? Because we're operating under the guys that we do have choices, right? In my opinion, right? We're
Starting point is 01:24:35 operating another guy. Like we do, it's the umbrella like, yes, we have agency over our choices. And the only thing that we can do is build a system that is, is, is, is, is, is, um, uh, curated injustice, right? If, what I'm saying is the, the consequences that people have, right, uh, that people will have for their actions might not necessarily be their choice. Like, think about it like this. There was a do. There was a, uh, a mass shooter in Austin, Texas, right? And he shot a bunch of people. And before he shot himself, I think he had like a letter.
Starting point is 01:25:13 He was like, examine my brain. Like, examine my brain, something's not right. And they found out that there was a tumor pressing on his, I think it was a maligula, amygdimal, whatever that shit is called. There was a tumor pressing on a part of his brain that activated like his aggression. And so for months and stuff, he was, his family was like he was overaggressive. And he even knew something was wrong. So like, was he wrong?
Starting point is 01:25:33 Yeah. Was it necessarily his fault? No. He literally didn't have control over what he was doing. All he knew was like he had to get this aggression out. But you still have to have consequences in your society for your actions or else you'll have chaos. Now what I'm saying is I'm unsure. I'm unconvinced that things would play out any other way.
Starting point is 01:25:53 That's what I'm saying. I'm unsure if it would play it other way. But as long as we have the illusion of this free will, then we should operate in a way that is, contingent on justice. And it could be a deterrent, too, which could factor into like somebody else's decision that happens years from now. That's part of what you're saying is no longer their free will because they have that information knowing like, hey, if I do this, I will go to prison for a long time.
Starting point is 01:26:22 But yeah, man, this has been a mind fuck of an episode. I think, fuck it. Yeah, we're in a simulation, but it doesn't matter. That's what I've come down to. And I'm going to bring this up one last time. We talked about this, I think it was like a month ago, month and a half ago. It's a thought experiment called Rockos Basilisk. Do you know what this is, Quiggs?
Starting point is 01:26:46 No. All right. So it's a hilarious thought experiment. And it was initially posted on philosophy message board, which has got to be the coolest place on earth. And after it was posted, the leader of the message board replied, he shut down the threat. and banned all discussion about this thought experiment. He said, listen to me very closely, you idiot. You do not think in sufficient detail about superintelligence
Starting point is 01:27:10 considering whether or not to blackmail you. That is the only possible thing which gives them a motive to follow through on the blackmail. You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and keep their idiot mouth shut about it. So that's a warning.
Starting point is 01:27:30 I am not clever enough to keep my idiot. and mouth shut about it. But the thought experiment is that at some point in time, we will be taken over by some sort of software system or artificial intelligence that will then continue to improve itself, improve itself, improve itself under the guise of it's designed to make the world as optimized as possible. It's designed to optimize life. It's designed to just pretty much control the world as it becomes super, super intelligent.
Starting point is 01:28:00 If it gets intelligent enough, then at one point it will invent a way to be able to access the space, time continuum, go back in time, and therefore punish the people that tried to prevent the implementation of that software system. And so now by knowing that you will be punished, if the software system does eventually get implemented, you two are also complicit and will be punished by the artificial intelligence in the future. when it goes back in time and sees that you did not do everything possible to help design it. I like that theory. It's a crazy theory. I mean, it's just like propaganda. Yeah. Just trying to scare you into liking AI.
Starting point is 01:28:44 Yeah. Bro, that's, that's, that's, that's, that's, that's basically like, that's Jesus. Yeah. Yeah. Yeah, it's like saying you will reach heaven if you do, if you help this out. At some point it becomes God. he's not going to punish me he's going to punish me if i don't well i was going to say earlier uh you everybody always bitches on twitter that we if you if you believe that we're in some sort of
Starting point is 01:29:11 simulation and that there's someone controlling the simulation that's not too far off from believing what the bible says right i mean that was a thought that popped into my head i was just that's like the creator of the matrix what was his name the guy that was basically god together with the white hair in the Matrix I was high at the time I forget two and three were just so I don't get into two and three
Starting point is 01:29:34 I don't get into two and three the second one was like a goth like fest weird like I just wasn't into it like nothing against gothic people but I didn't understand I didn't get it I've never seen the Matrix
Starting point is 01:29:44 that's wild because you gotta see the Matrix I'm like I was talking about it earlier I'm like an insanely bad movie guy like I've seen so few movies in my life what do you do like what are you into i'm into like podcast like i i watch like a lot of like science like i like growing up all i watch was like myth busters like that was my entire life so like do you
Starting point is 01:30:09 do you find like sci-fi is corny is that like that what it is sort of but it's just like movies in general i'm just like i i have a very small like catalog of movies i've watched wow like i don't think that's like yo you fuck with music you're like no not really man no i don't i don't i don't I don't really fucking music either. You're fucking music. That's wow. Shout out to you, man, because you or you, maybe you do. Maybe you're the only one that does exist.
Starting point is 01:30:34 I don't know. I might, yeah. Yeah. I feel like the Matrix would be right up your alley, though. It was an amazing movie. So, yeah, just, essentially what I'm saying is give me the blue pill. Like, fuck it. It doesn't matter.
Starting point is 01:30:45 None of this matters if it's, if we're in the Matrix or not. Would you change? Would you act any differently if we were inside a simulation? You knew that? No. I mean, I kind of think we are. Yes. I would.
Starting point is 01:30:56 What would you do, Aaron? I try to figure out a way to, like, talk to him. Like, I'd just be trying to communicate with the people that made it. Like, I'd be like, yo, get me out of here, like, this rap. Because I'm like, this shit, y'all, this maize y'all put in. I'm not for this. This is wet. I like that.
Starting point is 01:31:13 Like, I've never heard anybody say that as their answer. Like, you'd try to find the creator of it. And they'd get you out just because they'd think that you were cool. They'd be like, yo, this Arian guy, I like talking to him. I think I'm going to help him. I mean, whoever's doing the programming probably knows I'm probably his cup of tea. But I just, I don't, like, this shit, I'm just so over this world. This shit's whack.
Starting point is 01:31:36 It's like, you just wake up and go to work every day and maybe be mad at each other. And it's like, there's something better than this. That's why I love Avatar. We've already talked about this. But like, that's where it's at. Give me my tentacles. I want to plug into the tree, man. Like, that's better.
Starting point is 01:31:49 What about you? You would change shit? Yeah. I mean, if nothing matters, like, I certainly wouldn't be coming into work every day. Like, I would just go do whatever I wanted. Like, if there's literally no consequences for anything and we're just some sort of computer program, then why would you, like, do any of the things we do now?
Starting point is 01:32:08 You would still get hungry when you didn't have enough human to eat. You would still feel pain. Sure, I guess. You'd still get cold if you couldn't afford your heating bill. I guess, but like, I don't know. Like, I would be far more risk prone, I guess. I don't know. Like I
Starting point is 01:32:25 You'd start slamming H? You'd get like, really... I might try it if I thought that we were just a computer program. Like, yeah, why not? You wouldn't murder somebody. No. No doubt. No doubt, big fella.
Starting point is 01:32:35 No, I wouldn't do that. I don't know about that. I wouldn't do that. We've established that's my biggest fear. No, your biggest fear is getting falsely accused. Or caught? Yeah, that's correct.
Starting point is 01:32:45 Yeah Either honestly That is your that's your By far your biggest fear You don't care about getting caught If you actually did that shit I would prefer not to, but it would be fair, as opposed to the alternative. All right.
Starting point is 01:33:02 We do have a word from Billy Football, who will be out next week, and then he'll be coming back after that whenever he decides that it's a good time for him. But he did want to pass something along. He's got a riddle. And Billy was like, do you think that people would want... Here's what he texted me. Can I send in a riddle for you guys to try and solve on Macrodosing? I feel like it could be a good contribution. And I said, yes, you may.
Starting point is 01:33:27 I don't know if we should try to solve it right now, or if we should have the people that listen to the show, if you've made it this far, try to solve it. And then the first person that gets it, what, like a selection of merch? We can send them like a Macrodosing shirt. Care package. A care package.
Starting point is 01:33:43 Yeah. So reply to the tweet that we put out in the morning that has the episode on it. If you can solve Billy's Riddle, then just reply to that the riddle is when does death come before life but after birth it's heavy shit and i love that billy thought of us too and he he wanted to submit a riddle uh so mull that over see if you can figure it out say it again when does death come before life but after birth it's heavy oh i've got it it's heavy big t's got it that was quick i got it too that was very
Starting point is 01:34:20 quick. And then, Aaron, you got to guess and see what color underwear Big T's wearing. We'll see just how far into the simulation you are. I wore black today. Close. Navy blue. Let's see. Prove it. See the undies. What's going to look black? Let's see. It's a black or navy blue? That that looks black to me. It's navy. It's navy, but it's a dark. It's a dark navy. It's a dark. I already. It's a It's gone. I was getting dark vibes. I was close.
Starting point is 01:34:54 Yeah, no, you were close. You're close. All right. That does it for us this week. Love you guys, as always. And we'll be back next week. Let us know what you want to hear us talk about. Let us know that we're handsome.
Starting point is 01:35:06 And give us suggestions for any sort of topics or things that you want us to cover. And we will see you next week. Love you guys.
