SmartLess - "MIT Professor Max Tegmark: LIVE in Boston"

Episode Date: July 13, 2023

Find out more about mechanical cats doing Cats on Broadway… with our esteemed surprise guest: physicist, cosmologist, and machine-learning researcher Max Tegmark, LIVE in Boston. (Recorded on February 4, 2022.) Listen to "SmartLess Live" episodes four weeks early and ad-free on Wondery+. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 Hey, listener, and welcome to SmartLess. Before we get into this incredible episode, I want just a moment of your time to set the stage a little bit. Sean and Jason and I went on a SmartLess tour last year where we recorded 10 episodes live in front of thousands of our biggest fans, from Brooklyn to Los Angeles. And guess what? Right now there are more live episodes from our tour on Wondery+ that you can listen to. You can listen to these episodes four weeks early
Starting point is 00:00:26 and ad-free on Wondery+, after which you can hear them for free wherever you get your podcasts. Find Wondery+ in the Wondery app or on Apple Podcasts. All right. Welcome to SmartLess. This is officially a cold open, I guess. This is a cold open.
Starting point is 00:00:45 Right? This is like a... Like on the podcast, we just kind of talk before we even say the name of the show. Like an intro, and then we got to be like... Yeah. We got to be like... Professional.
Starting point is 00:00:56 And like, oh, we... We get to... We usually... You're just copying what I'm saying. You're just copying what I'm saying. I don't know how to tap into this. That's okay.
Starting point is 00:01:08 Just say the only thing that you know how to say. Welcome to SmartLess! Smart! Less! Smart! Less! We're so happy to be in Boston. Nice to be here. Oh my God. Tonight. And you guys, you guys rolled out your nicest weather for us.
Yeah. Oh, felt like being home in Canada almost. But thank you, thank you, thank you, not only for listening to our garbage,
but coming out and looking at our garbage. Yeah. That's a better way to say that. No, that is a better way to say that. Um, so, all right, let's sit down. Yeah, we're just excited. Sit down. Um, so, uh, oh wait, somebody left their phone here.
Starting point is 00:02:02 That's me. I brought it into my pocket. I don't know why. If I do get a call though, we're just gonna take a quick break, okay? From the inspection. It is mine. Sorry. The camera was on too, Granddad. Yeah, my flash was on. Yeah, put it over here. And I'm showing gum.
Starting point is 00:02:15 So really professional operation here. All right, so let me ask you something. Yeah. Do you guys, how is traveling going for you with all of this? Great interview question. What about signs? I did a little preparation, is that? No, I wonder what I want to know that because,
Starting point is 00:02:32 as of this morning, Will started calling me Katy Perry because I bring so many outfits with me. I don't know. This is the best you could use. It was just, I know. Well, the true story is the sweet girl who picks out my clothes for me because I don't know how to do that, clearly. She sent me, she sent over, Godlover,
Starting point is 00:02:58 she's a really close family, but she sent me over two of the exact same outfits today. So I kind of put this together, Willie Nilly. It is a new thing for us because we, you know, we do this thing on our laptops. And we were. And we know we're together, we're together, we're together. We wear pajamas. Yeah.
Starting point is 00:03:15 So this whole notion of having to dress and actually have a specific time and all that stuff, it is odd. Yeah, I think. And that there's people here. Yeah, yeah. And usually one of us is a few minutes late. I am commonly late.
Starting point is 00:03:30 Oh, that's you. I am. OK. But here's it. I don't like to be, if you're early, you've wasted time, right? You have an issue. I know.
Starting point is 00:03:39 I've just got a lot of milling around. Go out there. Just wait. I'm interested. I go to a doctor's office now late, so I don't have to wait the 20 minutes they make me wait. Right. I just go.
Starting point is 00:03:51 By the way, you still have to wait. But do they like that? They love me for that. This sounds familiar. I feel like we talked about this, maybe. I don't think we have a time. I don't think we've ever. I don't think we've ever.
Starting point is 00:04:02 You're never early or late, are you? No, I'm right on time. You're always right on time. What is that? I'll never ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever ever I'm going, I don't know, ask Kenny Perry over here. Like that, so we'll just bully. And then we all have, you're gonna make me roar in our head. And Jason, you've been singing that all day. I can't, I'm so, I can't, is everybody, when you get a song you can't get it out, am I the only one like, but like, it lasts an abnormal amount of time, like,
Starting point is 00:04:41 a week, I'll get something stuck on my head. They say, like, to count down backwards from a certain amount of number and that will make you stop thinking of time, like, a week. I'll get something stuck on my head. They say, like, to count down backwards from a certain amount of number and that will make you stop thinking of the, are you doing it right now? It just worked for songs. Yeah. Because there's a lot of shit I'd like to forget.
Starting point is 00:04:54 Like that kind of thing. Yeah, so if that works, I want to do that. Well, the other thing that happens to me yesterday was I found, my back has been itching, and then we have, we have, are you gonna get it? Yeah, so we have this. He's backing into a plug for hypoconyl.
Starting point is 00:05:09 No, I know, I was other podcast. Then it is. And, and, and. So Sean, wearing conversations in the hotel room, and Sean does a lot of this, he's like, uh-huh, what's going on, and he gets up against the door and he's going, really?
Starting point is 00:05:18 Yeah. Like a dog, like a non-stop. Like a dog, and scratches his, yeah. So then we, our friend Eli's with us, and who we love, and it's a very good friend of ours. And I'm like, what's going on with my back? He goes, take a short off, let me see. So I took the thing and I'm back and he goes,
Starting point is 00:05:33 that's shingles. Truly. Yeah, he said I have shingles. Oh, you know there? No. No, he went. Oh my God, that shingles. And I'm like, what? I just got the vaccine for shingles.
Starting point is 00:05:44 How could I have shingles. And I'm like, what? I just got the vaccine for shingles. How could I have shingles? Right. Because if there's a vaccine for it, it's in my body. Yeah. Yeah. Oh. Yeah. Right.
Starting point is 00:05:56 Yeah. Good health isn't political. No. Say that again. Thank you. No, thank you. You can get a show. You just got a pen. No. Say that again. Thank you. No, thank you. You can get a show.
Starting point is 00:06:05 He's got a pen. Yeah. I used to pen earlier to write my intro. And since we're talking about science, it's a great segue. Fellas. I wanted to tap into the brain power of this city. OK.
Starting point is 00:06:20 We got a big brain coming out. This fella has a master's and a PhD from Berkeley. He's a fellow at Princeton. He has tenure at Penn. He arrived here at MIT in 2004 where he still works today. He does it all from physics to cosmology to quantum stuff and computers. He's going to explain what it is.
Starting point is 00:06:45 Stephen Hawking. Will Ferrell. Stephen Hawking. Please welcome a guy you can definitely make us all more smart, not less smart. Smart less, you get it? Max Tagmark. Oh!
Starting point is 00:06:58 What? Max Tagmark. Come on, Max. Oh my gosh. Get out of here. There he is. Max. This is Max, Mark. Hello, man.
Starting point is 00:07:09 It's so nice to meet you. Come on, please. This is so exciting. Well, see, we have the same stylist. Oh, wow. He wears a little better than you do. He certainly does. Now, can you do a better job than I just did explaining?
Starting point is 00:07:23 What it is that you do, what it, first of all, how do you introduce yourself, call yourself what you do a better job than I just did explaining what it is that you do? What it is? First of all, how do you introduce yourself? I call yourself what you do. By the way, you look like a rock star. Well, it depends on what I want. Like if I were on a long flight, and I just want to be left alone, and the person asking you what you do, your pedophile. Physics. That was my worst subject in my school. Five hours of silence. Right. Right, right. But you are a wanna talk, yeah.
Starting point is 00:07:48 I'm the old astronomy. I'm like, oh, I'm a Virgo. Oh, right. Well, maybe I could say cosmology, they'll be talking about eyeliner, makeup. Okay, so the class you teach is, oh, it's whenever they want me to torture the students with that year.
Starting point is 00:08:06 So it can be either torturing the freshman, it came out of high school with the basic physics of how stuff moves to doing the torturing, the grad students with some advanced topics about our universe. Okay. Or I taught, most of my time I spend actually torturing my grad students, doing research on artificial intelligence. Okay, okay I toured most of my time I spend actually torching, torching my grad students, doing research on artificial intelligence. Okay, okay good. By the way, this is everything I'm for. Yeah, well you marry me, no kidding.
Starting point is 00:08:33 I want to go, you probably have a question. That's what I want to go over there. Well, I saw this documentary on artificial intelligence and what I was surprised to learn is that it's not about robots, like the Steven Spielberg movie. It's more about the amount of computing speed that we now can do such that, like, I think they said in the documentary, you can put all the books that have ever been written
Starting point is 00:09:00 into a computer now, and you're gonna tell me whether I'm right or wrong, I bet I'm close to right, but probably wrong. You can put all the books into a computer now, and you're going to tell me whether I'm right or wrong. I bet I'm close to right, but probably wrong. You can put all the books into a computer, and the computer will ingest all that information, separate it out, and be able to give you an answer about anything that you can ask them if the information was in any of those books from languages to rocks to... Isn't that called Google, though? of those books from languages to rocks to the, I mean, isn't that called Google though? Well, I'm sure you could explain that,
Starting point is 00:09:29 but it's like that's artificial intelligence. That's, well, yes and no. So on one hand, yeah, you can take all the books that were printed and put them on a memory card. It's so small that you might have a hard time finding it in your pocket, but that doesn't mean that a computer necessarily understands what's there, just because it can store it and kind of regurgitate it back to you, right? And I think the truth is, despite a lot of hype, that artificial intelligence is still
Starting point is 00:09:56 mostly pretty dumb today compared to humans or even cats and dogs. But, you know, that doesn't mean it's gonna remain that way. I think a lot of people make the huge mistake of thinking just because AI is kind of dumb today, it's always gonna be that way. Right. Well, shouldn't we keep it dumb because if we let it get too smart, et cetera? What is that threshold of the point of no return? Yeah, because remember that thing about Facebook where they started, I don't know if this is true, they started doing like AI technology,
Starting point is 00:10:29 they just started talking to each other and they shut it down, is that true? They were gossiping. Yeah. It's true, but I think Hollywood and media often make us worry about the wrong things. They, they, they, the, what do you mean? Yeah. First of all, people often ask me, things. What do you mean?
Starting point is 00:10:46 First of all, people often ask me if I should fear AI or be excited about it. The answer is obviously both. AI is like any technology. It's not evil or good. If I ask you, what about fire? How do you feel? I'm forward or against it. What are you going to say?
Starting point is 00:11:02 Right. It can hurt if you use it incorrectly. Exactly. And the same thing with all of the technology, the only difference is that AI is going to be the most powerful technology ever. Because look, why is it that we humans here are the most powerful species on this planet?
Starting point is 00:11:18 Fuckin' A. Is it because we have bigger biceps, sharper teeth? Then the tigers know it's because we're smarter, right? So obviously, if we make machines that are way smarter than us, which is obviously possible and most researchers in the field think it's going to happen in our lifetime, then it's either going to clearly either going to be the best thing ever or the worst thing ever. Yeah, so my question is, when it's the worst thing ever, by the time it becomes the worst thing ever, then we're fucked.
Starting point is 00:11:48 Then it's too late, yeah, you want to kind of... So let it be the worst thing ever. So that's the catch, though. We humans have had to play this game over and over again with where technology got more powerful. I mean, we're trying to win this race, making sure the wisdom with which you manage the tech keeps pace with the power manage the tech keeps pace with
Starting point is 00:12:05 the power of the tech. And we always use the same strategy, learn from mistakes. But it seems like the big safeguard that we have as humans, that we don't yet have with machines, is that we have ethics, we have empathy, we have emotion. And what is the computer program that you would need to put together to inject that into this new machine with all of this information. We put some snuggles in it. Yeah.
Starting point is 00:12:30 That's a fantastic question. What's the snuggles recipe? So you're hitting exactly my wish list. If you want to have ever more powerful AI that's actually beneficial for humanity, right? So you can be excited, right? I'm horrified about the future. There are three things you're going to need. First, you're going to want to need the AI
Starting point is 00:12:46 to understand our human goals. And then get it to actually adopt the goals, and then to actually keep those goals as it gets smarter. And if you think about it for a little longer, each of these are really hard. So suppose you have a tell your future self-driving card and take you to Logan Airport as fast as possible. And then you get their covered in vomit
Starting point is 00:13:03 and chased by helicopters and you're like That's not what I meant. Yeah, and the car goes like that's exactly what you ask for you know, it's clearly like they do talk like that literally Literally are you he's the Terminator? He sounds like the Terminator and he's talking about Terminator stuff So we humans have so much more background knowledge that a machine doesn't take over, because it's like very alien species of a source.
Starting point is 00:13:30 So that's hard for starters. And then, suppose you can get that right. Now, let me stop you there. Is there any chance of getting that right? In other words, the formula, the equation that equals emotion, responsibility, ethics, can you even create a computer equation for that? I think right now we don't know how to do it.
Starting point is 00:13:50 It's probably possible. We were not working enough on it. Yeah, the catch is, for computers, or just like, if you think of a baby, six months old, you're not going to explain the fine details of ethics to them, because they can't quite get it yet. By the time they're a teenager, they're not going to sit to you anymore. Those of you who have kids out there, right? So, but you have a window with human children while when they're smart enough to get it and may still malleable enough
Starting point is 00:14:18 to hopefully pay attention, right? With computers though, they might blow through that so quickly that window. Did you see X Machina? Did anybody see X Machina? with computers though, they might blow through that so quickly. That window. Did you see X Machina? Did anybody see X Machina? Yeah, that's amazing. That's amazing. What did you see? Yeah, let's put that for X Machina.
Starting point is 00:14:32 No, what is the most accurate film to science? Like, is it how in 2001 or X Machina? Those are my top two actually, because how emphasizes this key thing that the thing you should really fear in an advanced AI is not that it's going to turn evil, but that it's going to just turn really competent and not have its goals aligned with yours. That's what happens in how, like, no spoilers. Right. And like that, taxi, I mean.
Starting point is 00:14:57 60 years in were good, but. But then the other thing you should also worry about is even if you can solve all these things And I think it might take 30 years to figure this out, which is where we should start now Not you know the night before some folks on too much red bull switching on right? I mean I got two cans in me right now Not in the can not in my can but in super tellers away from Okay, but the other thing is even if you manage to solve those technical problems, which we should have more research on, you also have to worry about human society.
Starting point is 00:15:31 Because just think for a moment about your least favorite leader on the planet. Don't tell me who it is. So we don't offend anyone in the audience. Thank you for thinking about it. You're just a leader. Imagine their face for a moment here. And they imagine they are the one who controls the first superintelligence and take control
Starting point is 00:15:46 over the whole planet and impose their will. How does that make you feel? Yeah, not good. None of it makes me feel great. Listen, after a lifetime of doing all this stuff with the right, how does it feel to talk to an actual robot? Like that must feel... Yeah.
Starting point is 00:15:59 And we will be right back. Thanks to True Classic for supporting the show. This is my favorite time of year, because you've got your beach parties and your barbecue is in hanging at the beach and you name it, getting on your bike and with the kids and outside and everything. And while I love the warmer weather, it can be hard to dress for the heat, especially for us guys who just want a comfortable t-shirt that fits great.
Starting point is 00:16:27 And that is why I love true classic, because they make it easy to look and feel great all summer long with their perfect fitting shirts, shorts, pants, all at prices that your wallet will love. I feel like I'm 53. I know, hold your amazement, but I feel like I've spent my I know it. Hold your amazement. But I feel like I've spent my entire life looking for a t-shirt that fits right. You just want all that, you want it to fit right through the shoulders, right through here, right through there, that it's not too snug, but it's not too, but all those things.
Starting point is 00:16:59 And it has taken me this long to find it, and it's thanks to True Classic. So let's talk about these True Classic teas for a second. And honestly, I'm not kidding when I say, I've been blown away by how perfectly this fits me. It's like my new secret weapon almost. And it just makes me look super fit with the way that it hugs my arms and my chest. If I do say so myself.
Starting point is 00:17:24 And just the right amount of room through the body. So, and even better, they've got this new summer ready colors and the coming packs so you can have a shirt for every day of the week. So, if you're ready to upgrade your look and your wardrobe, shop now at TrueClassic.com and save 25% off with the code SmartLess. Look your best, feel your best, and be your best with True Classic. SmartLess is supported by NetSuite by Oracle. If your business earns millions or tens of millions of revenues, stop what you're doing and take a listen because NetSuite by Oracle has
Starting point is 00:18:02 just rolled out the best offer we've ever seen. NetSuite gives you the visibility and control you need to make better decisions faster. And for the first time in NetSuite's 25 years as the number one cloud financial system, you can defer payments of a full NetSuite implementation for six months. That's no payment and no interest for six months
Starting point is 00:18:23 and you can take advantage of this special financing offer today. NetSuite is number one because they give your business everything you need in real time all in one place. To reduce manual process, boost efficiency, build forecasts and increase productivity across every department. Look, having all of your information in one place is kind of the name of the game, and that's what NetSuite has. More than 36,000 companies have already upgraded to NetSuite, gaining visibility and control over the financials, inventory, HR, e-commerce, and more. If you've been sizing NetSuite up to make the switch, then you know this deal is unprecedented.
Starting point is 00:19:02 No interest, no payments. Take advantage of the special financing offer at netsuite.com slash smartless. Netsuite.com slash smartless to get the visibility and control you need to weather any storm. Netsuite.com slash smartless. He's a Swav billionaire owner of the Delus Mavericks and the sharkiest shark on Shark Tank Yes, I'm talking about Mr. Business himself Mark Cuban and our smartless interview with him is available now four weeks early on Wondery Plus. Mark unravels the secrets behind the phenomenal success and recounts the magical moment when
Starting point is 00:19:37 he became a member of the billionaire club. Our interview with Mark Cuban was recorded live in Chicago in front of thousands of our biggest fans from our SmartList tour. This is the seventh of 10 interviews with new episodes releasing every Thursday. We're talking with celebrities and icons like the Great Will Ferrell, Conan O'Brien, Kevin Hart, Jimmy Kimmel, and so many more. And it wouldn't be SmartList if Scotty and I didn't take the opportunity to pitch Mark Cuban our new idea.
Starting point is 00:20:00 Will we land a deal of a lifetime? You'll have to listen to find out. We call that a tease in the biz. You can listen to these episodes four weeks early and add free with Wondery Plus, find Wondery Plus in the Wondery app or on Apple Podcasts. And now back to the show. How so is there is do do people have proprietary right over certain stuff or does one country control a lot like who's leading China's leading the AI is are they not us in China are both very strong I mean most research suggests that it's us is still kind of ahead but there there's a lot of hype around both countries of course try to try to research how in USA I'm gonna lose it and I'll bet one
Starting point is 00:20:42 day and I'll bet and I'll bet when they say the USA, they're talking about MIT and they're probably talking about him. Well, you know how it goes. Both countries are trying to proclaim that the both countries researchers are trying to claim that the other one is the heads of it, because you get more funding. That's how we researchers always do it. But seriously, the interesting key here,
Starting point is 00:21:03 I think, is ultimately, it's not really, you know, matter which country gets it for. It's going to matter most, is it going to be us who control the machines, or are they who control us? I mean, but it really is. It's all joking, so I'm obsessed with the Terminator movies and anything. So I find that. Sure, all joking aside. Yeah, joking aside.
Starting point is 00:21:20 Yeah. Yeah, let's put the jokes aside and in no talk about it. No, but that's kind of the idea behind a lot of Hollywood movies and stuff. Is that what if the A.I. has become more intelligent than the human? But here's the thing, I saw something on 60 minutes years ago, which fascinates me to this day, and I'm not going to get this right, but it's some guy. And he rune's eyebrows. That's it.
Starting point is 00:21:39 No, so... It's never wonder why. Never wonder why. I opened my old desk drawer and I got a tie from the 1968 Democratic Convention. It's got soup on it. I don't like soup. No one knows who Andy Rooney is.
Starting point is 00:21:54 Yeah, yeah. It's also his DAC-shepard impression. But anyway. It was true. But anyway, so there is this guy. The interviewer was interviewing the scientists who claimed to have come up with this idea. This thing was wrapped around his ear, and it was tied
Starting point is 00:22:12 to his side of his head. And the interviewer was asking him a question, like, what's the population of Utah? And all he had to do was think of the answer and it popped up on the screen. Do you know what I'm talking about? Yeah, well, this sort of stuff you have to do is think of the answer and it popped up on the screen. Do you know what I'm talking about? This kind of stuff you can already do with your grandma if you have a connection to Google.
Starting point is 00:22:32 Oh, yeah. Ergo, you saw it 15 fucking years ago. What are you talking about? No, no. No, no, no, but that you can do. Oh, can we meet Peter Jamm? No, that you can, if you think of a response to the computer. Oh, OK.
Starting point is 00:22:44 And he heard you. That's it. Let of a response that you got it. Okay. Okay. That's it. Let me move on to another question something current. So you know we were talking earlier about We're gonna kiss it out later. You mean when you read people's brainwaves? Well, I found it fascinating. I saw the segment where the guy was thinking and answer And it popped up on the screen again. Again, the third time is really clear. Hey, Sean, your best robot voice, go quick. He's gonna sing Katy Perry watch. Do you want to play a game of war?
Starting point is 00:23:19 No. Do you want to play J.S.T.A.K.D.O.? Another 15-year-old. Do you want to play J. Tektop? Another 15-year-old. Jason. Do you want to play a game? That's it. That's it. Just seeing himself up.
Starting point is 00:23:33 This is the voice-over artist here. Let's have it. Do you want to play a game? No matter how I say it, it's just a game. I'm sorry. I tried to do that. Do you want to play a game? Yeah. No matter how I say it, it's just the gayest computer ever. I'm sorry. I tried to do that. Do you want to play a gay?
Starting point is 00:23:48 Yeah. Okay, yeah, sorry. And what's your best computer voice? I'm sorry, Dave. I just love that. I'll sing again, because it's great. It is great. It's what you really should worry about. But coming back to, there are two things.
Starting point is 00:24:05 One again, summarize. You need to make sure the machines can actually align their goals with ours, because if you have something much smarter than us, other goals were screwed. It's like playing chess today against the computer that has the goal to win when you have, it's no fun. Well, what's the next, because when somebody says, hey, I, all I picture are those mechanical dogs that walk around.
Starting point is 00:24:29 And they don't really do anything. Just like, oh, look, we invented a robot dog, and it doesn't really do anything. So I want to know, like, what's the next thing that we can use it like that's like mechanical cats? Yeah. Mechanical cats? Yeah, no, you know what I mean? Like, what's the, oh, imagine mechanical cats. Yeah. Yeah, you know what I mean?
Starting point is 00:24:45 Imagine mechanical cats doing cats. Oh my god, we amazing. Sorry, we'll get back. You're saying what's the next thing we can look forward to enjoying out of science? Yeah, in the pop sense. Okay, in the pop sense. So first of all, so just to finish off, we talked about, you know, Hollywood makes us associate AI so much with robots and the Boston Dynamics dogs.
Starting point is 00:25:08 And you should check them dancing, by the way, if you haven't. Dancing dogs? The dancing Boston Dynamics robots. Super cool. But the biggest impact right now AI is having is actually not robots at all. It's just software. I mentioned this, improve the news, the work project we're doing, which is just a little academic thing, but if you think about social media, that's
Starting point is 00:25:29 all about AI. One of the reasons people hate each other so much more in America now is the effect of AI, not AI that had the goal of making people evil, but just had the goal of making people watch as many ads as possible. But the AI was so smart, figuring out how to manipulate people into watching ads that it realized that the best way to do it is to make them really angry and keep showing the more and more extreme stuff until they were so angry they wouldn't just fragment and feel they're really pissed off.
Starting point is 00:25:57 You then research that thing even more and then you get more ads and all that stuff. So that's one. All of social media, I mean, yeah. And the other one is, let's talk about positive things because AI intelligence, right? It's human intelligence that's given us everything we like about civilization. So clearly, if we can amplify it with artificial intelligence, we can use it to solve all sorts of problems. We're stumped on now, like cancer and lifting everybody out of poverty and so on.
Starting point is 00:26:24 Well, there ever be, so we go ahead, sorry. Yeah, no. So I was just going to say, like, another pure software thing that has nothing to do with bots is use AI for doing better science, better medical research. For example. Can I ask about that? So is there any, like I read a long time ago about like, you know, like you put a locator chip in your dog or you cat, whatever, you, I heard that they might be making a chip
Starting point is 00:26:46 that has all your medical files and you put it under your skin so you can just scan it because filling out all those fucking forms over and over. It's like, I just, I just filled out the form and now you're asking me the questions all over again. Read the thing, I just got 20 minutes off. I'm not, I'm dying. I'm dying.
Starting point is 00:27:01 Right, I'm like, what? And it's like, what's your name? I just filled three forms out that says what my name. I think I personally get a pass on that ship implant and just ask the hospital to have a less stone age computer system, but the series live course. Did you care about Sean showing up to doctor's appointments late so that he doesn't have to wait?
Starting point is 00:27:20 It's riveting. I don't know if that's AI or whatever. Something huge that happened this year, for example, is biologists have spent 50 years trying to figure out just from the DNA how the shape is going to turn out to be of different proteins that our bodies make. It's called a protein folding problem. And then Google DeepMind solved it. No way.
Starting point is 00:27:41 Yeah, with an AI. And now you can develop medicines faster. So this is a fantastic example, I think, of AI for good. Another one. But then the robots that are having probably that are going to have the biggest effect, I think, on us and the job market in the next five years, are probably cars, actually.
Starting point is 00:27:58 This is autonomous vehicles. That's pretty cool. I'm worried about everybody with their cars and the automatic driving or whatever, and then they show up, and then people just gonna show up at the valley, like dead in their car. It's gonna be really, you know what I mean? Like people are gonna get in their car,
Starting point is 00:28:14 and then they're gonna be like, the pizza guy shows up, and he's like slumped over. You're open the door. They're open the door. Yeah, just out. You know what I mean? Like that's what I'm worried about. Now, with the combination of what you know about computers,
Starting point is 00:28:34 what you know about space, what you know about intelligence, I know, I know what you want to know. You know what I'm asking for? Yes, John wants to know. He wants to know if there's aliens, but we're not going to ask him that. No, not that. What I think is something about you.
Starting point is 00:28:47 What I want to know from you is, is it based on your knowledge of all those areas, does it seem possible to you that there is the requisite amount of intelligence and technology at a place other than Earth? Ooh, see, he asks, is it a different way than I would? Well, of course, he's a different way than I would. Well, of course it's possible, although, you know, my guess, based on spending a lot of years dealing with telescopes and thinking about these things is that when we, for the first time, get way more advanced technological intelligence showing up on this planet, it's probably going to be in our lifetime, and
Starting point is 00:29:25 it's probably not going to come from outer space. It's going to be something we built. What do you mean? What do you mean? So you're saying maybe, when you're building something to bring them here, what do you say? No, I mean, we're basically, if the goal of artificial intelligence has always been to build stuff that's part of that.
Starting point is 00:29:40 Yeah, they just don't have cars. They need to get here. So you're basically, oh, they're basically going to be dead when they get here? No way, keep going. So we're really, if you basically build a new species, a new life form, that's way, way smarter than us. Yeah. Right?
Starting point is 00:29:53 Yeah. That's alien. That's alien. That's incredibly alien. It's much more different from a chipmunk or a tiger and that it really has nothing in common with our evolutionary history. Nothing doesn't necessarily care even about food
Starting point is 00:30:05 or reproducing itself. So if we do that, it's going to be just as big an event on Earth as if aliens show up. And that's why I'm kind of weirded out that people talk so little bit about this. That's what I'm saying. That's what I'm saying. As despicable as human beings were all, and gloomy, everybody,
Starting point is 00:30:24 we're just so despicable that if the announcement came that like, oh my God, there are aliens from there, they're visiting our planet, people be like, oh, okay, I gotta check my Instagram. Like, I don't think people would be like, give it to me. Based on that, it seems to me that what you would wanna do is make sure that somehow built in all this stuff is some kind of a kill switch.
Starting point is 00:30:43 And that the wise men who are, and women, you know what I mean, but like a group, right? Probably made up of you and your other colleagues, male or female from around the world that are the leading scientists in this area would get together on some encrypted platform and say, let's make sure we only us five know about this one thing
Starting point is 00:31:07 that we could press to shut all these AI robots down that we've created on our way. I'll be one, I won't tell anybody. I won't tell anybody. All four kill switches, all of this is what I do. I know, let's find bar room. This sounds a little too, let me just for me, because the idea that somehow,
Starting point is 00:31:25 you know, I wish I had a dude who were like, no, a lot about AI, he should decide, humanity's future. That's kind of how, I wanted in your hands. But that's how it is now, you know, if we don't, if the rest of us, everybody doesn't get engaged in these things, who's gonna make all these decisions?
Starting point is 00:31:40 It's gonna be probably a bunch of dudes in some back room or some who have not been elected. Are people who are super AI nerds like specialists in what human happiness is? Only they would know the ramifications of something getting in the wrong hands. But are they the ones who should be deciding what makes them not some elected weirdo? Yeah. So I don't trust particularly elected people, but I also don't trust tech nerds with being experts in psychology and what they're like.
Starting point is 00:32:07 Who can we trust? Then we got to show them the best. Everybody. I mean, they should trust it to the nerds. What are you talking about? Talk about. Talk about it. I'm going to ask you guys, suppose you have a magic wand that you can wave, okay, and
Starting point is 00:32:22 create this future 35 years from now, okay? When there is this very advanced AI, and you get to decide how the planet is organized, what it's used for, and what it's not used for. So it's not gonna be your standard dystopian Hollywood flick. What is this future like? What do you want it to be like?
Starting point is 00:32:40 Well, that makes me want to take a nap. Oh. Well, you want me want to take a nap. Well, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, you would, You point it all in that direction, and then good decisions come from that. I don't know. Yes. Brave New World. Yeah. Well, that sounds great. So it's a frighteningly expected response.
Starting point is 00:33:13 Let's compare that with what we're most extending AI money on now, right? So a massive amount of money is spent on advertising, you end up making teenage girls on a rexick, and then we have a massive amount of enormous amount of money now building robots to kill people. For the first time, they were used in Libya last year. They hunted down these fleeing people and just killed them because the robots decided that they were probably bad guys. And I think- I did not know about that.
Starting point is 00:33:41 It's- I don't know. Did you guys all know about that? About's a... I don't know. Did you guys all know about that? About the robots and hundreds of people down? Yeah, let's not gloss over that. What happened? This is the shit that I'm talking about, man.
Starting point is 00:33:54 Yeah, so it's... It actually has some dark comedic value, I think. Yeah, that's hilarious. The current policy, actually actually of the US government on killer robots and slaughterbots is three things. First of all, the US says, you know, these are murder hornets. First of all, we're saying this is nasty stuff that we don't ever want to delegate kill decisions to machines.
Starting point is 00:34:19 So we're not going to do it. Second, it's going to have a decisive impact on the battlefield. And third, we're going to reserve the right for all other countries to build them and export them to whoever they want. So like this was this Turkish company decided to sell them to Libya against the arms embargo and that's why they hunted these people down. We went like a really short span.
Starting point is 00:34:38 We went from replicating a sheep to slaughterbots. Like it seems like that happened really quickly. Yeah, man. And so is there, we had a, we were talking to a guest earlier today about time travel. And now, so it's on, it's on our minds. Now, and'm not... And we do have to ask you, like we did before,
Starting point is 00:35:07 if you're good to, did you time travel here? I do. If... If... Is there any chance... And I won't bore you with the same question that I asked that astronomer that we had about the mirrors... At the rest, I said.
Starting point is 00:35:23 It was a real highlight. Oh, no. But, yeah. But in the... No, because we'll try here. So, the... Will said, Neil the Grass Tyson was on. And Jason asked a really long question
Starting point is 00:35:36 and about time travel. And Will said, Hey, do you think we could put enough mirrors and go travel back in time to the beginning of Jason's question? But... And it's a valid... It's hard for me. It's question. And it's a valid. It's hard for me.
Starting point is 00:35:46 It's a valid. It's a valid. So the light we get from the sun has been traveling seven minutes. Eight minutes. Eight minutes. OK. So we're basically feeling something that's eight minutes old. We're back on this.
Starting point is 00:35:57 That's right. That's right. Right? OK. So isn't there a way to have a mirror that creates anyway? Yes, actually. There is. Well hang on, there is.
Starting point is 00:36:10 He says yes. He likes my thinking. I'm sure it will be built for us actually. In the middle of our galaxy, there's this monster black hole that weighs about four million times as much as a sun. And it's black. Oh, that's mine. Oh, no, it's a black hole.
Starting point is 00:36:24 Okay. And if you look at it purely carefully, light that went from you actually was bent by its gravity so much that it comes back on the other side of the black hole. Like, to mirror. So if you look at the really good telescope, in principle, you could see your own reflection, except no way, no way, no way, no way, no way, no way,
Starting point is 00:36:40 that you weren't born yet. Are you serious? Oh, wait, so Tom. And now, a word from her sponsor. Since his death in 2009, the world has struggled with how Michael Jackson should be remembered as the King of Pop or as a monster. I'm Leon Nefak, the host of Fiasco
Starting point is 00:36:58 and the co-creator of Slow Burn. And I'm Jay Smooth, a hip-hop journalist and cultural commentator. Michael Jackson was accused of child molestation for the first time in 1993. Our new podcast Think Twice, Michael Jackson, is the story of what came before and what came after. Throughout the podcast, we explore what makes Michael Jackson seemingly uncancelable. And we dig into the complicated feelings so many of us have when we hear Billie Jean at the grocery store. Through dozens of original interviews with people who watched the story unfold firsthand, think twice as an attempt to reconcile our conflicted
Starting point is 00:37:34 emotions about Michael Jackson, the man with our deep-seated love of his art. Listen to think twice, Michael Jackson, wherever you get your podcasts, or you can binge the entire series ad-free, unaudible, or the Amazon Music app. And now, back to the show. Wait, so talk, go. So that's basically time to, in other words, these telescopes, like the one we just launched up the web. They're looking so far back, they might actually see the big bang at some point, or something
Starting point is 00:38:02 we're seeing the galaxy at an earlier stage than us. So eventually, if you get a telescope strong enough, you could see the start of Earth potentially, so it came all came out a big thing. We should tell you, we all took mushrooms before. All right, it's just from the beginning. Yeah, and we were so interested. Somewhere in there is there an answer about
Starting point is 00:38:18 a lot of hot pot. Some of us took a double. A possibility of time travel. Yeah, so this is a kind of sea, but not touch time travel. OK. The sky is a kind of sea, but not touch time travel. Okay. The sky is a time machine, just like that. You see me, three minutes, not time travel.
Starting point is 00:38:31 Yeah, three nanoseconds ago, you see the sun, eight minutes ago, you see stars at night, if it's clear, so along go the people over there looking at us would see maybe the Boston Tea Party, and we can see things that happen over 13 billion years ago. You can also travel forward in time for real. I know bullshit, real time travel. You go to this blockhole here. That's my...
Starting point is 00:38:53 I was told, actually, when I moved to the US, that America possession is 95% of the law. I think. Is that true? You know what? Now it's yours. So anyway, if you just orbit around this black hole really close, which you can actually do, I give this as a homework problem to my MIT students, then your time will actually slow down so much that if you're on Skype with you, you'll be hearing him go like, hello, I'm
Starting point is 00:39:24 here. And then you're gonna hear him say, oh my God, that's a word of the word, what's going on there? Well, that's accurate. Because, we're times are actually running at different rates. And then when you come back, you look so good, so useful.
Starting point is 00:39:38 Because you're actually younger than, yes, yes, you would have been otherwise. So are we gonna be alive when we start to see any of this stuff that's gonna really blow our minds? Yeah, and Mars, are we living on Mars, go. So this is an upside of artificial intelligence. Roll that window, no.
Starting point is 00:39:55 That's not a follow-up, it's a different subject. By the way, you mentioned before, Boston's in party too soon, you know. This is the upside of artificial intelligence, no. It's just for us too, though, huh? This is the last. This is the upside of artificial intelligence, because either just for us, too, so what? This is the last. This is the upside of artificial intelligence. Either we can use it to go extinct in our lifetime
Starting point is 00:40:09 or we can use it to bring all these awesome things about in our lifetime. We used to think, oh, this is going to take 10,000 years to get like the sci-fi novels, because we humans have to figure out all the tech ourselves. No. If we can build this incredibly advanced AI that can then build more advanced AI, et cetera, et cetera,
Starting point is 00:40:26 we might be able to build this tech, you know, 30, 40 years from now. Right. And suddenly, we're not limited by our own pace of developing tech. We just go, boom, and we're limited by the laws of physics. Well, and by the laws of ethics, like if just because we can, should we?
Starting point is 00:40:42 Like, how do we know when we, as a society, are mature enough to handle some of the technology that we can, should we? Like how do we know when we as a society are mature enough to handle some of the technology that we can access? I think going and having some fun, joy riding around black holes is ethically okay. Yeah. It's my inner nerd speaking here. As long as you don't force other people to go with you.
Starting point is 00:40:58 But on the other hand, that's one of the- No, man, don't be nervous, just go into the black hole. So, like, we're not forcing, but it is peer pressure. The- The- The- That's one of the questions that they asked in Jurassic Park was just- Yeah, good Jurassic Park, good. Yes, I'm not-
Starting point is 00:41:17 I'm not- Yes, I'm not. Just because- Keep on the science. Just because you can create this island with these dinosaurs, Yes. Should you? Now, the- Because you can create this island with these dinosaurs. Yes. Should you?
Starting point is 00:41:27 Right. Based on the science of that, the amber that was frozen in there with the DNA of the dinosaurs, is that real? Yes. Is that real? Well, now, you know, we have my friend George Church, down in Harvard here, he's talking about already bringing back the mammoth by just taking the DNA together assembling it error correcting a scooter than
Starting point is 00:41:46 Basically DNA printing out the mammoth DNA and boom mammoth. We can do a lot of these things Leaving we can come back to the ethical question. They're just thing him exactly who's what's the what's the key? Which is name again George church? What's the one to show? What's the count? That's all that's gonna say yes or no? Let me just say a bigger thing first though, just to get the controversy out of the way, you know. I think we humans to really get the ethical decisions right and let people forfeit some dumb stuff. You have to remember how much upside there is also.
Starting point is 00:42:16 We're living on this little spinning ball in space where they almost ate billion people on it and been spending so many years killing each other over a little bit more sand here and a little more forest there. Where in this huge universe, right? We thought was off-limits? Well, with AI, it could be on-limits again. We could go to the Alpha Centauri system in a lifetime.
Starting point is 00:42:40 We could have a future where life is flourishing in our galaxy and in our other galaxies, where there's such an amazing abundance that people are going to be wondering, why did these guys fudge around for so many years in this little planet and fight squabble about breadcrumbs instead of going on? Most of this universe, despite all the Star Trek episodes out there No offense So far really don't seem to have woken up and come alive in any major way And I feel that we humans
Starting point is 00:43:12 Have a more or less responsibility to see if we can help life It more our universe wake up some more and help life spread Well, what if we run out of time with our use on this planet because of environment where we don't I'm reading your mind. Yeah, we're getting to Mars. So can we point the AI to our challenges here regarding the environment, fix that real quick, and then we can explore everywhere else? I think we need to fix things here in parallel. The reason that the main forest is partly gone, the reason we're messing up our climate and so many other things isn't because we didn't know 10 years ago what to do about it. It's because we kind of already built another kind of AI, these very powerful systems,
Starting point is 00:43:57 corporations, governments, et cetera, that have goals that aren't so aligned with the rainforest and maybe the overall goal is of. If we can use the AI to tell them how they can make more profit doing things that aren't so aligned with the rainforest, then maybe the overall goal is over. If we can use the AI to tell them how they can make more profit doing things that don't kill the earth, then they'll stop chopping down the forest. Well, maybe we should take the biggest step back, you know, the whole point of having, I mean, you've got stock and exon, don't you?
Starting point is 00:44:21 He's not comfortable answering this. My undergrad was an economics. So I am very sympathetic to the free market. You're doing things more efficient. But the whole point of the free market is that you should get done efficiently things that you want to get done. And then you should have, of course, some guidelines.
Starting point is 00:44:41 That's why we decided to ban Shia labor in the US. That's why we decided to invent the shy labor in the US. That's why we decided to invent the weekend. So, you know, you don't like the weekend? I think you said it's very stressful. But right now, we have, if you create something, you know, whether it be a super powerful dictatorship or it beats the, you know, the back-hole company that tries to convince you that cancer, that smoking isn't dangerous, or whatever. It has its own goals, and it's going to act.
Starting point is 00:45:11 It's good to think of these things a little bit like an AI, even though it's not made out of robots. It's made of people. Because there's no person in a back of a company that's consignal-handedly change its goals, right? If the CEO decides to stop selling cigarettes, he's just going to get fired, right? So we should start thinking about how do we just align the incentives of all the companies. I want to keep private companies incentives of people and incentives of companies and incentives of politicians with the incentives of humanity itself to get what you were asking for, you know, as a society in the future where we work.
Starting point is 00:45:41 The change is more cultural rather than scientific if you will. Yeah, although you do need to kick out a bit a lot about the whole business within incentives. Like, why did we invent the legal system in the US? Well, because we realize it's not so smart that people always kill each other every time they get into a squabble about a hot dog, right? Right, so a consequence. So you change consequences and not only think twice, and they'll just punch each other instead, or have said lippons or other way.
Starting point is 00:46:07 We, we, alignment is kind of the big slogan a lot of us nerds have for this. You want to align the incentives not just the machines but also the organizations with what's actually good for humanity. And we're in this unfortunate situation now where whenever an entity gets too powerful, it doesn't have to be a machine or a dictator, it could even be a company, that they start to now like take over whoever was supposed to regulate them, and turn them into like a rubber stop. Now there's suddenly not going to be so lined with what's good for America anymore, or good for humanity.
Starting point is 00:46:42 And this problem, we cannot wait for AI to solve it. We have to start solving that in the meantime. Amen. But you know, I mean, that's been the modus operandi up till now. And we are running out of time. And people are taking their profits because they figured they're going to be dead before the ramifications of it really.
Starting point is 00:47:02 Like, so I think the computers have to help us out. Yes, yes, absolutely. So this is why I'm so into this AI empowerment thing. I want to think about how we can use AI and put it into the hands of people so that they can very easily, like, catch other powerful entities that are trying to screw them over. And it's a way of using technology to strengthen democracy. Are we going to live on the moon at all? You want to? Yeah.
Starting point is 00:47:30 Is that possible? Are we planning on that? Would you want to? Should I believe that? Do you want to? Yeah, I would totally live there. All of these things are certainly possible. I'm very much of the opinion that it's easier to make a really comfortable and pleasant life on this planet.
Starting point is 00:47:45 So I would also like to make sure we don't ruin it. Is there one project that you're working on right now, just one, that you feel extremely passionate about right now, that you could share with us? It's actually improvethenews.org, this thing I mentioned earlier. It's just this free little news aggregator,
Starting point is 00:48:05 but it's all powered by machine learning. So that's why it can actually read 5,000 news articles every day, which I can't. And then what we're doing is, instead of just saying, OK... Today we have a lot of news sites where you can go and read about all the good things that Democrats have done and all the bad things
Starting point is 00:48:21 Republicans have done. And then there are other ones where you can read about all the great things the Republicans have done and all the bad things Democrats have done. Whereas with this one, the AI figures out which articles are about the same story. Maybe it finds, say, 62 articles about the US national debt passing 30 trillion, or whatever.
Starting point is 00:48:39 And then you can come in and it says, OK, here are the facts that all the articles agree on. Boom, boom, boom. If you're a facts kind of guy, you can now click away and go to the next story. But if you want to know all the narratives, it separates them out: here's this narrative, here's that narrative.
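For readers who want to see what that might look like under the hood, here is a minimal sketch of the two steps he describes: grouping articles that cover the same story, then surfacing what they all agree on. To be clear, this is not the actual improvethenews.org code; the toy articles, the TF-IDF similarity measure, the 0.3 cutoff, and the word-overlap notion of "agreed facts" are all illustrative assumptions.

```python
# A toy sketch (NOT the real improvethenews.org pipeline) of the two steps
# described above: (1) group articles covering the same story,
# (2) surface what all of them agree on.
import re

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus standing in for the ~5,000 articles a day.
articles = [
    "US national debt passes 30 trillion dollars, Treasury data shows.",
    "The national debt of the United States has exceeded $30 trillion.",
    "New space telescope sends back its first images.",
]

# Step 1: embed each article (plain TF-IDF here; a real system would likely
# use learned embeddings) and group pairs above a similarity cutoff.
vectors = TfidfVectorizer(stop_words="english").fit_transform(articles)
similarity = cosine_similarity(vectors)

THRESHOLD = 0.3  # assumed cutoff, chosen only to make this demo work
clusters, assigned = [], set()
for i in range(len(articles)):
    if i in assigned:
        continue
    cluster = [i]
    assigned.add(i)
    for j in range(i + 1, len(articles)):
        if j not in assigned and similarity[i, j] >= THRESHOLD:
            cluster.append(j)
            assigned.add(j)
    clusters.append(cluster)

# Step 2: within each story, treat the words every article shares as the
# agreed-on "facts"; everything left over is candidate narrative/framing.
words = [set(re.findall(r"[a-z0-9]+", a.lower())) for a in articles]
for cluster in clusters:
    shared = set.intersection(*(words[i] for i in cluster))
    print(f"story {cluster}: agreed-on terms {sorted(shared)}")
```

On this toy input the two debt articles cluster together and the telescope piece stands alone; a production system would swap in real claim extraction, but the cluster-then-compare shape is the same idea.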
So would it have photos? Would it have a photo of Will from last January 6th? You mean at the Capitol?
Starting point is 00:49:01 I got a little one. You got a little one? No way. With the goggles, to go with the goggles. Yeah. And the thing is, there's no way you're getting that photo. How much will you pay me again? Yeah. What's your best Wordle score? Go.
Starting point is 00:49:11 And be careful. He got an unbeatable score today. Got it in two. This guy. Not bad. Yeah. Not bad. That sounds amazing.
Starting point is 00:49:22 But it's so exciting. Also just all the emails you get from people. Because, I think, yeah, bits are free. You can give them away to the whole world. And AI sounds fancy, but it's just code. Yeah, yeah, yeah. Well, listen, I want you with a fresh mind tomorrow when you get back at it, so I don't want you to stay up any later today. Thank you for joining us.
Starting point is 00:49:44 Do you guys feel a little smarter? Yes, I do. A little bit smarter. I feel smarter. I definitely do. Please say thank you to Max. Thank you. Thank you.
Starting point is 00:49:52 Thank you very, very much, buddy. Here, pal. Thank you very much. Thank you. Max. Thank you, Max. Thank you, Max. Woo. Woo.
Starting point is 00:50:04 Now here's the thing. Now, how much... I... go ahead. How much dumber do you think you are than him? Like, on an IQ score, what do you think his score is versus yours? He reminded me how much smarter I am than you, which was great. That's fair. Actually, I feel very buoyed by that whole experience.
Starting point is 00:50:22 Do you think it's double? His, and tell the truth. Double? His, over mine? Over mine. Oh, easily. No, no, no, no. I mean, he just has a very big brain.
Starting point is 00:50:34 I could talk to him for hours. I don't know that he would listen to me for more than five minutes, but I could talk to him for hours. Yeah. I love all of the fantastic guests. Yeah. Right up my alley. Right up my alley. Very cool. And I think that if I think that if we all spent more time thinking about that kind of stuff just a little bit that
Starting point is 00:50:55 maybe we could get around to solving some big issues tonight. Let's solve it tonight guys. Everybody, let it up. Yeah, I'm so cool. Thank you. I know you. We all kind of share. We love all of our guests. Those are nice pops because there's stuff that we don't usually cover on the podcast. And I've just repeating myself, but I just love that stuff. I could ask him a million more questions.
Starting point is 00:51:20 Well, it's the original conceit of this thing, air go the title. We thought we'd bring people on that can educate us a little bit more on things that we don't know about. We happen to get lazy and ask some of our famous fancy friends to come on. This is a real treat to be able to access these big, big thinkers in this incredible town. So, thanks for having me.
Starting point is 00:51:42 And now, it's incumbent upon us to really kind of do something about it. We can't sit around all day in our pajamas, you know, and in our slippers, you know what I mean? Or then just go to the golf course and then get in our test list. We have to.
Starting point is 00:51:58 Do you think that's important for, and I don't want to single anybody. I will never be one of those people. The ruining things. But no, it is true that thank you for you've educated us a little bit more. It's pretty bad. And I think, you know, I could talk to them. I want to talk to them about the web telescope
Starting point is 00:52:16 because, you know, those kinds of things. Oh god, here comes a buy, everybody. Get you feel it when it starts to ramp up the engine. Sean, if you don't land it, you can't do it. No, you just gotta get into it more subtly. Well, I'm just saying you a mile ahead. I'm just saying like telescope, like that is much better than any thing like this.
Starting point is 00:52:34 What are these call, all those are, are you scared? Hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, hey, boss, and thank you so much. Smart. Smart. Smart. Smart. Smart. Smart.
Starting point is 00:52:53 Smart. Smart. Smart. Smart. SmartLess is 100% organic and artisanally handcrafted by Michael Grant Terry, Rob Armstrong, and Bennett Barberco.
