The Joe Rogan Experience - #1292 - Lex Fridman

Episode Date: May 7, 2019

Lex Fridman is a research scientist at MIT working on human-centered artificial intelligence and autonomous vehicles. ...

Transcript
Starting point is 00:00:00 Ready? Boom, and we're live. Hello, Lex. Hey, what's going on? The sequel, part two. You have a very similar, if not the exact same, suit on. This is all I wear. You look very professional. Yeah, very, um, Reservoir Dogs. Reservoir Dogs. Well, let's go to the best sequel of all time, Godfather Part II. Is that the best sequel of all time? I think John Wick might be. I haven't seen John Wick. Same suit. How dare you, sir. Godfather Part II, I mean, that's, that has to be the best sequel
Starting point is 00:00:31 Okay, then. And if this is Godfather Part II, let's definitely not do Part III. Yeah, Part III was terrible, right? Well, let's, let's not
Starting point is 00:00:40 offend anyone, but it was not up to par. It was not as good. Yeah. Yeah, I don't remember it. It was, uh, the older Pacino.
Starting point is 00:00:48 Oh. With that deeper voice. Oh, that was like way later, right? Yeah. That was the 90s. Oh, okay. So it's like Point Break, the remix. Yes. Right?
Starting point is 00:00:59 When they try to redo things like way, way, way later, they almost never, except the Alien franchise. They've done a pretty fucking good job with the Alien franchise. They had a couple of duds, yeah, in there, but for the most part. I've actually never seen the Alien franchise. What? Who are you? What's wrong with you? Aren't you into science? Intelligent men can disagree. But you're not in the science. I don't know, I prefer, uh, prefer Al Pacino, I would say. Okay. That, the older, Scent of a Woman Al Pacino.
Starting point is 00:01:29 You know that movie? Yeah. What? That's one of the greatest movies. Alien? Come on. Yeah, he got the Oscar for that one. What about the one when he played the devil? And the devil likes to rant!
Starting point is 00:01:39 Okay, there's two. What was that one? There's duds for everybody. What was that one? That was The Devil's Advocate. Uh-huh, there you go. That was with Keanu Reeves. Ah, Keanu Reeves from John Wick.
Starting point is 00:01:50 I heard John Wick Part 3 is very good. Or The Matrix. Yeah, I'm glad I remember. The better movie. Not true. Did you see this thing that happened? This high school in North Bergen, Jersey
Starting point is 00:01:58 put on an Alien play a couple weeks ago. And Sigourney Weaver showed up, right? Oh, she showed up just the other day to say thanks or whatever, tell them they did awesome. Oh. It looked crazy. And I was just wondering,
Starting point is 00:02:09 if they did this when you were in high school, do you think you might have joined drama? Like if they did the alien play? No. I would have loved it. I would have watched it. All right. I'm not getting into drama.
Starting point is 00:02:17 Those people cry too much. There's too much work. It's a cool suit, though. Yeah, okay. Let's talk about this. There's two kinds of movies. There's fun movies, and there's movies that are, like, transformational for society. Like Scent of a Woman.
Starting point is 00:02:30 What? Are you going to say Scent of a Woman is transformational for society? Yeah, it's one of the greatest scenes between a man and a woman on film. The tango scene. Are you not married? No, I'm not married. Yeah. It's like someone talking about French who can't speak French. I see, yeah.
Starting point is 00:02:49 It's nonsense. That movie sucked. I read about French in a book. Talking about France without ever being to Paris. Is Scent of a Woman your favorite movie? No, it's not my favorite movie. I don't really think that movie sucked either, by the way, if you're getting mad right now. I barely remember it. But it is up there.
Starting point is 00:03:04 I'm sure it's a good movie. It's one of the greatest performances by any actor. Jesus Christ, we're off to a good start. Yeah, yeah. I bet you there's thousands of people who agree with me right now. Yeah, there's millions that don't. And they're called haters. Oh, everyone who disagrees is always a hater. Okay, so what's your favorite love scene in a movie? Not like between a man and a woman. You need to get married, bro. Okay.
Starting point is 00:03:30 You're into love movies and shit? No, I'm not. You need to find yourself a gal. Settle down. We're not talking about romantic comedies. Adam Sandler here. Rom-coms. We're talking about serious, like, dramatic moments, right?
Starting point is 00:03:42 Okay. So, Godfather. Good movie. Yeah, great movie. Great movie. Does it have to have two guys, like, shooting each other, or no? Okay, so, you ever seen Scent of a Woman? Yeah, I'm sure I saw it. Yeah, but I barely remember it. All right, well, I can watch it today. It'd be like a new movie to me. Okay, there's this broken man, spoiler alert, considering suicide, right? Okay. It's deep. So he is tortured by, you know, um, by his involvement in the war, by being responsible, all this kind of stuff. He's now mentoring a younger version of himself who has more character, more integrity. And throughout all of this, he meets this beautiful young woman. He's blind. He asks her for a dance, and there's this beautiful moment
Starting point is 00:04:26 where they connect. I mean, okay, listen. Film. What, what's the purpose of film, right? Entertainment, or make us think? I mean, make us think. Hmm. You know, you're gonna think if you want to think. Nothing makes you think. A film can engage you. It can resonate with you or not. I have a movie that I throw by people whenever I want to find out whether or not I want to listen to anything they have to say about movies. Ex Machina? The Big Lebowski. Yeah. Yeah, that's one of the greatest movies of all time.
Starting point is 00:04:57 Oh, look at you. Okay. Good for you. That could be, like, slightly better than Scent of a Woman. Oh, boy. That also has one of the greatest scenes between a man and a woman, when the fine young lady's painting her toenails.
Starting point is 00:05:13 Right. And she's offering him sex for money. Yeah. That's a beautiful moment, too. You think so? Yeah. Really? Isn't that the girl that used to be a hot mess?
Starting point is 00:05:25 What's her name? Tara Reid. Tara Reid? Yeah. Is she still a hot mess, or did she get her shit together? She's been doing, like, the Sharknado series recently. Oh, she's in that? Yeah. So I passed the Big Lebowski test, and you failed the Scent of a Woman test.
Starting point is 00:05:38 Well, I don't remember it. I got to wrap this conversation up. I legitimately don't remember it. Okay. I mean, I'm sure it's great. I'm sure it's great. You're a wise man. If you like it, I'm sure it's good. And you also recognize that Godfather III kind of sucks. Yeah. But I like the old Pacino.
Starting point is 00:05:55 Listen, Godfather is about your people. The Italian people have dominated the mob, the brilliant mob movies, right? I mean, Godfather is about family, right? There's something deeply genuine about that, that we really crave in our modern society. So it's like bigger than the individual, bigger than the rules of society, the government, the man.
Starting point is 00:06:21 It's family above all. That's like a, I don't know, that's timeless. Okay. I agree with that. The moment with the young Pacino when he talks to what, his brother, Fredo, says don't ever take sides against the family again, ever. I mean, that's one of the greatest moments ever. That's a great moment.
Starting point is 00:06:43 All right, all right, all right. I'm romanticizing movies here. You didn't like John Wick, though, huh? Never seen it. Whoa. Never seen it. It's a good movie to watch on the treadmill. Is he playing a Russian mobster in that?
Starting point is 00:07:01 No, he kills a bunch of them, though. And he speaks Russian. He works for the Russians. Kills people for the Russians. Keanu Reeves is one of the greatest human beings ever. You think so? Yeah, he's like the nicest guy. I heard he's a really nice guy.
Starting point is 00:07:15 But he plays a badass gangster. I would like him to be a little bit more fit. Work out a little bit more. I've seen him without a shirt on. I'm like, hmm, not quite buying it. But that's okay. Average man. Yeah, but the average man is not the fucking best assassin of all time with all this martial arts skill. Like Fedor. Yeah, but Fedor is big. Fedor might have, like, a gut, but
Starting point is 00:07:39 he's a thick motherfucker. Okay, especially young Fedor. You ever see young Fedor, when he was in his prime? Like back when he fought, like, Fujita? Like back when, uh, there's a picture of Fedor standing around with a bunch of kettlebells. You ever see that picture? Nope. There's Fedor in his lifting days, I suspect. And this is coming from, that's, that's one when Fedor was fairly young up there, but that's, uh, that's not the one I'm talking about. You know, that one with the kettlebells? Is that picture up? See if you can find that picture.
Starting point is 00:08:10 Never a six-pack in sight. No, no six-pack. But I suspect that Fedor might have been on some performance-enhancing substances during his prime. You mean like hard training, drilling, technique, sort of strategy? Steroids. Steroids? How dare you, sir. Dude, he was in Pride.
Starting point is 00:08:28 Everybody was on steroids. Yeah. That's him. Look at him. That's him in his prime. That's a big motherfucker. Now, I do not know if he was on anything,
Starting point is 00:08:38 but everybody else was. I mean, literally everybody. They had it in their contract that, you know, we will not test for steroids. You know, Enson Inoue told me that they essentially encouraged people to take steroids. Yeah, the Pride days. That's right.
Starting point is 00:08:53 And it's not like Russians don't have a long history of using performance-enhancing substances. I'm sure you saw that movie Icarus. Did you see it? Yep. Fascinating, right? It's fascinating. I mean, I don't – steroids often feel to me like a bit of a witch hunt. Oftentimes you assume people are on steroids.
Starting point is 00:09:15 I'm a bit of a, maybe I'm naive or an optimist, but I tend to give people the benefit of the doubt until proven otherwise. But Icarus obviously proves that everybody was. Yeah, Icarus kind of throws a monkey wrench in those gears. But, you know, with, uh, with Fedor, I, the technique on that. Oh yeah, the technique, the execution, the timing, the brilliance of his movement. No doubt. The heart. No, he's phenomenal. Yeah, he's one of, if not the greatest heavyweight of all time. He's certainly one of them. And I don't think steroids would help that. Yes, they do.
Starting point is 00:09:44 They help. Okay. Yeah, they help that guy in particular. Yeah, they help everything. They help your training, they help your ability to recover, they help your explosive power, they help your speed. They help everything. But they, they also, it's not just steroids. Like, a lot of them are on EPO. EPO radically enhances your endurance. Um, and they're starting to catch people. They just stripped TJ Dillashaw, UFC bantamweight champion, for EPO. It's tragic. Yes, it is tragic, especially TJ. I mean, he's just a phenomenal fighter.
Starting point is 00:10:17 If not, I mean, certainly top ten pound for pound. And then this is one of those things that comes up and you go, oh, man. It's a legacy killer. In this world, we have to kind of reconsider what kind of, uh, what should be allowed or not. Yeah, I agree with that. There is an idea where you should make steroids legal, right? Or not legal, sorry, uh, allowed. Or some kind of supplementation. Like, where's the line, when you, when you start to talk about the future of martial arts, the future of sport? If you can control the levels so that they're healthy. I mean, isn't that the reason that they're not allowed, is because if abused, they become unhealthy?
Starting point is 00:10:59 They damage long-term well-being of the person? Look, if that was the case, we wouldn't allow fighting. Because fighting is more damaging than steroids, for sure. For sure. Getting punched and kicked and fucking kneed in the face and elbowed into unconsciousness, that is way worse for you than steroids. The concern is not for the athlete.
Starting point is 00:11:19 The concern is for the opponent. The idea is that you will be able to inflict punishment that you would not ordinarily be able to inflict. You will have more endurance. You will have more power. You will hurt someone. Potentially, even, look, there's going to be a time where someone dies in a mixed martial arts event. And if that someone who was the victor, who did not die, was on steroids, it is going to be a huge national tragedy and a massive
Starting point is 00:11:49 disaster for the sport, for everything, if that ever does happen. We can only hope it never does. But for sure, you know, it's a, it's a very, very dangerous game you're playing. Martial arts is a very dangerous game, and when you are enhancing your body with chemicals that are illegal while you're playing that game. The real question is, though, here's my take on it, and this is, it's one of the most human subjects. Meaning that it's messy. Humans are messy. Like, there's good and there's bad. You know? Look, like, abortion is a messy subject. It's messy.
Starting point is 00:12:29 You know, whether you have the, whether you agree with someone's right to have it or not, it is, what you're doing is, especially as the fetus gets older, it's messy. You know, it's a complicated discussion. It's not a clear, it's not like, you should drink water. You know what I mean?
Starting point is 00:12:45 It's a very complicated discussion. Steroids are a very complicated discussion. You're not allowed to do them, but they exist for a reason. The reason why they exist is they're really effective. They're really effective at enhancing your body. But how much of that will we allow? We allow creatine. We allow supplements in terms of, you know, there's certain things that can slightly elevate your testosterone, slightly elevate your growth hormone. We allow sauna and ice baths and all these things that have been shown to enhance recovery.
Starting point is 00:13:16 But that's too much. It's too good. They're too effective. But it's weird. It's weird that this thing that we found that makes you better, you can't use. Yeah. And so I have to go back a little bit and disagree with you on something. So, in terms of fighting being dangerous: if we wanted to forbid things that are dangerous for you, we would forbid fighting. I think the main thing you're doing can be dangerous,
Starting point is 00:13:41 right? The main thing that we're talking about, the sport, the, the combat event, that can be dangerous, because that is what we watch: two people at the height of their skill, ability, heart, passion, putting their life at risk. That can be dangerous. But the supplementation around it, the way to make it, to make their training better, more effective, that can't be dangerous. And I thought... That can't be dangerous? Can't be dangerous. So I thought steroids were considered, were sort of banned, because abuses lead to long
Starting point is 00:14:17 term damage to health. Now we see steroids as cheating, but it was banned initially because it has detrimental effects. I don't think that's true. It's not? No, because there's no real evidence that it's detrimental. It's not as detrimental as alcohol, and you allow people to drink. But even a bit, even when abused, where are the bodies? Like, there's, there's not a lot of, like, there's a great documentary on it called Bigger, Stronger, Faster, and, uh, it's, uh, by my friend Chris. And when you watch that documentary, you realize, like, oh, well, the real negative consequence of taking steroids is that it shuts down your endocrine system. So it stops your body's natural production of testosterone and growth hormone and hormones. That's the real problem.
Starting point is 00:15:02 And for young people, that can be very devastating. It can lead to depression and suicidal thoughts and all, all sorts of really bad things when your testosterone shuts down. But as far as, like, death? Boy. I mean, there's people prescribed pain pills every day of the week, and fighters that have injuries, uh, that have been, you know, that have gotten surgery, they're prescribed pain pills every day of the week, and those pain pills kill people left and right. That's just a fact.
Starting point is 00:15:35 People die of those things all the time, much more so than die of steroids. I'm not advocating for the use of steroids. I'm being pretty objective and neutral about this, but I'm just looking at it like it's a very messy subject. Yeah, it's very eloquently put. So your problem in terms of damaging the opponent is if one side takes steroids and the other doesn't. Yes, exactly. What happens if both? The problem is you would require someone to do that.
Starting point is 00:15:58 Maybe someone's a holistic person. They don't want to introduce any unnatural, exogenous steroids and hormones into their body. They want, they want everything to be produced by the human body. They want to, they want to eat healthy food, train hard, sleep well, and compete naturally. Yeah. You had CT Fletcher here yesterday, right? Yes. He's a natural bodybuilder. Yes. Or not bodybuilder, powerlifter. Powerlifter. Yeah. But that's not required, right? You're not requiring people. You're giving them the choice.
Starting point is 00:16:29 So, you know, it's an interesting possibility where, in moderation, you'll be able to allow steroids in future athletics. Because, with an argument that,
Starting point is 00:16:39 if, in moderation, you can actually create healthier athletes. Yeah. That's, I mean, that's a real argument for the Tour de France.
Starting point is 00:16:46 The Tour de France, they say that you actually are better off and healthier taking steroids and EPO than you are doing it without, because it's so unbelievably grueling on the body. Yeah, I mean, those athletes are basically some of the best people in the world at suffering, long-term suffering. It's incredible. Yeah. Ultramarathon runners, all those guys.
Starting point is 00:17:08 It's incredible. It's a different sort of thing, you know. And, you know, the thing about ultramarathon runners is they don't even test them, because they're like, good luck. Those people have iron wills. I don't know if, like, Courtney Dauwalter is a woman who, you know who she is? Yeah, yeah. She's been in here. She eats candy.
Starting point is 00:17:26 She drinks beer, eats candy and pizza. Yeah, that doesn't make sense. Yeah. I mean, she's just got a fucking iron will. Her will is indomitable. And you can take all the steroids you want.
Starting point is 00:17:35 When you're running for three days, that chick is gonna beat you. She just doesn't know how to quit. Yeah. Just has no quit in her.
Starting point is 00:17:43 Did you see the podcast with her, where she talked about how she fell? She couldn't see. She was experiencing, I think it was intraocular hemorrhaging. Yeah. So her eyeballs were bleeding internally, something like that, where it was impeding her vision.
Starting point is 00:17:58 She couldn't see. I would stop. I would stop running. No, she fell because she couldn't see, busted her head open, bleeding all down her face, keeps running. Barely can see her feet as she's running. Keeps running. I'm glad those people are out there. I'm actually a bit like that.
Starting point is 00:18:16 Like, I don't know how to quit. Really? Yeah. Like, I do a lot of stuff like that. Like, stupid. Like, I ran yesterday. I couldn't sleep. I ran here yesterday
Starting point is 00:18:25 13 miles. I'm not a runner. Just, just this weird obsession. You don't run? No, I run, but I'm not a runner. Look at my body. I'm, I have a similar body to yours. We're not exactly, we're, like, better built for short sprinting, and then maybe killing somebody with our hands, versus long distance. Well, you're a black belt in jiu-jitsu, right? Yeah. Where do you train at? I now train at Broadway Jiu-Jitsu in Boston. Nice.
Starting point is 00:18:52 And before, I was in Philly, Balance Studios, with Phil Migliarese and so on. Right. But Kyle Bochniak actually trains at Broadway. Oh, really? I love that guy. Yeah. Last time I was on, I actually wanted to talk about the Zabit fight. Because I'm Russian, so I love the Russian way.
Starting point is 00:19:11 But I also love the, I mean, Kyle to me represents like the American, he's like the Rocky. If you remember that fight against Zabit? The third round, he was winning. I mean, that's the best of what martial arts is. MMA, to me, is like you have two technicians that just throw everything away. Like, screw this. I'm just going to throw it out. Well, Zabit had broken his hand.
Starting point is 00:19:36 Broke his hand somewhere, I think, in the second round. So he was pretty compromised going into that third round. He couldn't really fire back. And Kyle just has zero quit in him. That guy's an animal. Yeah. I mean, that's the most beautiful. You talk about technical fights on the ground or technical striking.
Starting point is 00:19:51 When two technicians throw everything away, I'm sorry, but that's what I love the most about any kind of fighting, any kind of sport. I enjoy it in the moment. I discourage it heavily. I don't think it's a smart way to fight. Yeah, well, that's, but I get it. That's probably your job. Well, it's not just my job. It's like, I, what I, like, I get, I get the impulse, but I don't want people to give in to the impulse. I think fighting is something that you should do correctly. You should do, you
Starting point is 00:20:23 should do, there's, there's principles that you should follow to fight correctly. It doesn't mean that you shouldn't take chances. But, you know, there's moments, like, um, um, Ricardo Lamas, when, uh, he fought, uh, Max Holloway, and they just stood in the center of the ring for the last few seconds of the fight, and Max Holloway pointed down at the ground, he's like, come on, right here, right here. And they just started swinging haymakers. It was amazing while it happened. But if I was in Max's corner, I'd be like, don't, no, don't do that, man. This macho shit is gonna give you fucking brain damage. You're gonna get hit with shots you wouldn't get hit with.
Starting point is 00:21:05 That's a difficult, like you said, human nature is messy. I would say that is the greatest, that is the greatest moment of their lives. What? That war. No. Listen. That war is the greatest moment of Max Holloway's life. Max Holloway is the greatest featherweight of all time.
Starting point is 00:21:23 This is back to the Scent of a Woman discussion. No, but Max Holloway is the greatest featherweight of all time. He's a guy who destroyed Jose Aldo twice. He's a guy that's beaten everybody in front of him at featherweight. The idea that this one moment, where they
Starting point is 00:21:37 decided to throw out all his skill and technique and just swing for the bleachers in the middle of the octagon. It was a fun moment. It was great to watch. But the idea that that was the greatest moment of his life is ridiculous. You're a crazy person. Yeah, there's moments in sports that are just magic.
Starting point is 00:21:53 Yeah, you like that? Yeah. You're a passion person. Yeah, a passion person. Yeah, for sure. That's an interesting thing
Starting point is 00:22:07 for someone who studies artificial intelligence. I mean, if anybody listened to this podcast, they'd be like, what the fuck does this guy do? He dresses like Reservoir Dogs. They talk about movies.
Starting point is 00:22:17 So many people angry right now. But that's this podcast. Now who gives a shit? Talk about autonomous vehicles. We have plenty of time, sir. We have plenty of time. But that's the beautiful thing
Starting point is 00:22:24 about this podcast. We're just talking. So tell me what you got here with your notes, man. I mean, you are fucking prepared. I mean, you have a lot of shit here. Yeah, many, many pages, for sure. I don't want to miss, I don't miss stuff. I mean, there's, uh, there's been a lot of exciting stuff in the autonomous vehicle space since you came on. I got a Tesla, and I've experienced what that thing is like when I put it on Autopilot, and it's stunning. It's crazy. I mean, the performance of the vehicle, it's amazing. Well, in, in terms of its ability to change lanes and its ability to drive without you doing anything. I just put my hand on the wheel and hold it there, and it does all the work. So, because, like, one or two people listen to this podcast,
Starting point is 00:23:05 I want to take this opportunity and tell people if you drive a Tesla, whether you listen to this now or a year from now or two years from now, Tesla or any other car, keep your damn eyes on the road. So whatever you think the system is able to do, you will have to still monitor the road, and you still have to take over when it fails. If?
Starting point is 00:23:30 When? Really? So, we're throwing, this is like the moment we're throwing down, right? No, I think, No, this is your level of expertise,
Starting point is 00:23:42 obviously. I mean, I'm not throwing down with you on this. No, I think it's really important to, in this transitionary phase, whatever the car company, whatever the system, that we don't overtrust the system. We don't become complacent. We don't think it can do more than it can. Currently, 40,000 people die in the United States from fatal crashes. The number one reason for that is distraction.
Starting point is 00:24:06 So texting, smartphone. How much has it gone up since smartphones? People don't exactly, they're trying to understand that. There's a lot of studies showing significant increases, but it's hard to say it's because of smartphones. But it's almost obvious. Yeah, it's pretty obvious. The flip side is, even though everybody's now using a smartphone, texting and so on, they've become better at using the smartphone. So they're better at texting and driving. They're better at balancing that. Now, this is a horrible thing to do. So if you're listening to this podcast, you should listen to it in your car and keep your eyes on the road and not text.
Starting point is 00:24:44 The worst was Pokemon. When Pokemon was in its prime, I was watching a guy on the highway playing Pokemon as he was driving. And more than one person, two people. A guy, and I saw a girl do it once, too. Holding the phone on the steering wheel, playing Pokemon. Yeah, that's incredible. What are you doing, Jamie?
Starting point is 00:25:06 It's Grandpa's. Oh, shit. Sorry. I'm confused. What is this? This grandpa in Japan, he drives around on a bike. Oh my god.
Starting point is 00:25:15 With 15 phones, playing Pokemon, all at the same exact time. Look at this. This guy has 15 phones. That's ridiculous. This guy needs to find hookers. There's people that do this also
Starting point is 00:25:24 in their car, with maybe four or five, doing exactly what you're saying. This man needs a better hobby. This is preposterous. Look at the, look at his, he can't see what the fuck's going on in front of him. He spends about 300 a month to buy virtual currencies in the game. Wow. That guy's bored, or an innovative genius. You think? Depending on the perspective. Well, no. People misuse their innovative genius. How is he innovative? He's just playing a stupid game while he's driving around
Starting point is 00:25:53 on his bike like an asshole. Well, he's doing... This is back to the Scent of a Woman thing. It's passion. It's the most amazing moment of his life. Driving around playing Pokemon. I'm sure most people are on my side. Scent of a Woman. Oh, you're crazy.
Starting point is 00:26:08 You think most people are on your side? They think Scent of a Woman is the greatest movie of all time? Not Scent of a Woman, but I was defending Godfather, Scent of a Woman. You weren't defending Godfather against me. I'm going to throw you under the bus. Okay. I'm going to manipulate this conversation. Jamie, can you edit this in post?
Starting point is 00:26:22 Did you see the video that just came out yesterday of a Tesla on Autopilot avoiding a crash? All right, so, yeah, I have. And there's a lot of examples, quite a few of those. Yeah. Of course, it's, like, hard to prove exactly what happened and whether Autopilot was involved. Just like, on the flip side, it's hard to prove that Autopilot was involved in the dangerous stuff. But I think, by any measure, the media is really negative in terms of their reporting on Tesla. I think you've talked about this before. In general, negativity gets more clicks. Right.
Starting point is 00:26:57 And I think Tesla, negative stuff on Tesla gets a lot of clicks. Well, not Tesla. Let me speak more broadly about autonomous vehicles. If there's any fatality, any crash, it's overrepresented. It's over-reported on. To me, people who are interested in AI helping save lives in these systems like Autopilot, I feel you carry the responsibility of being at least as good of a driver as you are when it's under manual control. So don't text
Starting point is 00:27:28 and drive. Keep your eyes on the road. If I say anything over and over on this podcast, it's that. Drunk driving, of course, is the other one. And so don't drink and drive. But the number one thing is distracted driving. So put your phone down. I agree. Listen to some music. Some classic rock. Are you into, yeah, yeah, like, Creedence? What are you into? No. Like, no? Just say no. No.
Starting point is 00:27:53 Like, you don't like it? Like, you would change the channel? Now you're gonna put me in this Scent of a Woman... Like, Fortunate Son comes on, you don't get excited? There you go. I take, uh, I take my shirt off. Let's start drinking. No, Led Zeppelin, Lynyrd Skynyrd. Of course. Hendrix? Of course, Hendrix. I have to admit something. I thought about messaging you a couple times.
Starting point is 00:28:14 I wanted to. I play guitar. Do you? Yeah. You good? You can't... Are you good at Jiu Jitsu? Yeah, I'm good.
Starting point is 00:28:24 Okay. I'm a black belt. I'm pretty good. You're a black belt too. I'm sure you're good. Yeah. I mean, I'm not world class. A lot of dudes fuck me up.
Starting point is 00:28:32 I'm a three-stripe purple belt at guitar. Ah. That's a good way of putting it. Yeah. Yeah, I say that about hunting. I'm like a blue belt in hunting. Yeah. Yeah.
Starting point is 00:28:39 Yeah, I've been doing it. Like, I got the purple belt by doing it a long time, as opposed to being amazing. Did you take lessons? No, I learned everything myself. I have a couple of videos online, me playing Comfortably Numb. Did you learn from watching videos, or did you learn from books?
Starting point is 00:28:57 Like, how did you learn? Let me see this. Give me this. Look at you. Well, this is going to get us booted off of YouTube. Yeah, this is me playing. No, no, no. It probably won't pick it up, and if it picks it up, it'll be to my channel. You sure? Yeah, it's me playing. What do you know? But I mean, it's so good. It sounds... They've blocked people from humming songs recently. Yeah? No, but so this is on YouTube,
Starting point is 00:29:21 and this is... No, I didn't know that. If you were humming a song, and then someone made a claim on that song, it would block our YouTube. Like, literally, we could get demonetized, and we lose our streaming ability. Lots of things could happen. Lots of things. It's fucked up, man.
Starting point is 00:29:38 We've gotten flagged for watching something on the screen, picture in picture, no sound, commenting on it. And we get flagged. And they want all the advertising revenue from a three-hour show for five, ten seconds of a video. It's a slightly broken system. Ooh, it's broken. But there's a lot of scam artists, too.
Starting point is 00:29:58 So I played another song, Black Betty. Oh, yeah? And I got... I played the damn song, but they said it wasn't... They did exactly that. Oh. They said it was the Ram Jam one or whatever. And I may have borrowed the beat behind it from them, I'm not sure. I just took a beat. Like... Yeah, well, that's what I was thinking about that song. Like, it sounded like there was other shit going on besides just your guitar. Oh, no.
Starting point is 00:30:28 That's all me. Really? That's all me. At the back. Let me hear that again. That's really good, man. You sound good. That's a great fucking song, too.
Starting point is 00:30:35 Comfortably Numb. You know the scariest thing for me? What? Was to play guitar on this podcast. So I was like going back and forth. Oh, really?
Starting point is 00:30:42 Should I do it? Should I not do it? Actually play? Play, play? There's only a few people that have ever played. Everlast. Ben and Suzanne from Honey Honey. Gary Clark didn't, right?
Starting point is 00:30:52 He just came on and talked. He brought his guitar. I wanted to play Hendrix here. Really? Yeah. Live? Live. You got it with you right now?
Starting point is 00:31:00 No. In the future? Yeah. Well, no. I meant... Okay, sure. In the future. I'm not promising. I'm scared. Will you wear a Hendrix wig with a bandana? Is that racially insensitive, though? No, you're allowed to, Joe. I will not. As long as you don't wear blackface,
Starting point is 00:31:18 you're allowed. Okay. The hair is, like, just, you know, it's just hair. You can't wear dreadlocks, though. Yeah, so there's rules. But I think you... Yeah, there's rules. Hendrix is above all rules, though, right? Well, he's the GOAT, you know, of guitar players. That's the GOAT. One of them. You know, the reason why this show is called...
Starting point is 00:31:41 The experience, yeah. Yeah, I stole it from Jimi Hendrix. Yeah. What's the matter? I don't remember if we brought this up last time, but I just remembered seeing this video where you're playing guitar while you were driving. Yep. Well, you shouldn't do that, dude.
Starting point is 00:31:54 There's a reason why he was doing it. Why are you doing that? It's on a test track. Oh. What kind of car is that? Looks like a Lincoln. Lincoln MKZ, yes. Oh, they do that?
Starting point is 00:32:02 The Lincolns do that? No, we converted it, and that's our code controlling the car. Wow. That is crazy. Yep. So you converted this car to drive autonomously. Autonomously, yeah. Wow.
Starting point is 00:32:17 And what exactly do you have to do to a car to change it? Because that car does not have the capacity to do anything like that, right? Am I correct? No, no, no, absolutely not. But you are absolutely correct. The first part is being able to control the car with a computer, which is converting it to be drive-by-wire, so you can control the steering and the brake and the acceleration, to basically be able to control it with a joystick. And then you have to put laser sensors all around the car? Is that what you're doing?
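The drive-by-wire conversion described here boils down to replacing pedal and wheel inputs with computer commands sent to the car's actuators. A minimal sketch of that command layer follows; the names, ranges, and the brake-wins rule are invented for illustration, since no real vehicle interface is this simple:

```python
from dataclasses import dataclass

# Hypothetical drive-by-wire command layer -- names and ranges are
# illustrative, not taken from any real vehicle interface.
@dataclass
class ActuationCommand:
    steering: float  # normalized, -1.0 (full left) to 1.0 (full right)
    throttle: float  # 0.0 to 1.0
    brake: float     # 0.0 to 1.0

def clamp(value: float, low: float, high: float) -> float:
    """Keep a command within its physical limits before it reaches the bus."""
    return max(low, min(high, value))

def make_command(steering: float, throttle: float, brake: float) -> ActuationCommand:
    # Simple interlock: never command throttle and brake together; brake wins.
    if brake > 0.0:
        throttle = 0.0
    return ActuationCommand(
        steering=clamp(steering, -1.0, 1.0),
        throttle=clamp(throttle, 0.0, 1.0),
        brake=clamp(brake, 0.0, 1.0),
    )
```

The clamping and the brake-over-throttle interlock stand in for the kind of low-level sanity checks such a layer would need before software is allowed to steer a real car.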
Starting point is 00:32:50 Any kind of sensor and software. What's the best kind of sensor? Is it optical, laser? A lot of debate on this. And this is the big... this is the throwdown between Elon Musk and everybody else. So Elon Musk says the best sensor is camera. Everybody else, well, everybody else says that at this time LiDAR, which are these lasers, yes, is the best sensor. So I'm more on the
Starting point is 00:33:14 side of camera, of Elon Musk, in this case. So here's the difference. Lasers are more precise. They work better in poor lighting conditions. They're more reliable. You can actually build safe systems today that use LiDAR. The problem is that they don't have very much information. So we use our eyes to drive, and cameras are the same thing: they have just a lot more information. So if you're going to build artificial intelligence systems, machine learning systems that learn from huge amounts of data, camera is the way to go, because you can learn so much more, you can see so much more. So the richer, deeper sensor is camera. But it's much harder. You have to collect a huge amount of data. It's a little bit more futuristic, so it's a longer-term solution.
Starting point is 00:34:02 So today, to build a safe vehicle, you have to go LiDAR. Tomorrow, however you define tomorrow, Elon Musk says it's in a year. Others say it's 5, 10, 20 years. Camera is the way to go. So that's the hard debate. And there's a lot of other debates, but that's one of the core ones. It's basically for camera, if you go camera like you do in the Tesla, there's seven cameras in your Tesla, three looking forward, there's all around, so on, one looking inside. No, you have the Model S?
Starting point is 00:34:32 Jesus. Yeah, it's a very large amount of data. So you're talking about over 500,000 vehicles have Autopilot. 450,000, I think, have the new version of Autopilot, Autopilot 2, which is the one you're driving.
Starting point is 00:35:07 And all of that is data. So all of those, all the edge cases, what they call them, all the difficult situations that occur is feeding the machine learning system to become better and better and better. And the open question is how much better does it need to get to get to the human level performance? Like one of the big assumptions of us human beings is that we think that driving is actually pretty easy. And we think that humans suck at driving. Those two assumptions. We think like driving, you know, you stay in the lane, you stop at the stop sign. It's pretty easy to automate.
Starting point is 00:35:45 And then the other one is you think, like, humans are terrible drivers, and so it will be easy to build a machine that outperforms humans at driving. Now, I think there's a lot of flaws behind that intuition. We take for granted how hard it is to look at the scene. Like, everything you just did, picked up, moved around some objects. It's really difficult to build an artificial intelligence system that does that. To be able to perceive and understand the scene enough to understand the physics of the scene, like all these objects, how to pick them up, the texture of those objects, the weight, to understand glasses folded and unfolded, an open water bottle, all those things is common sense knowledge that we take for granted. We think it's trivial, but
Starting point is 00:36:31 there is no artificial system in the world today, nor will there be for perhaps quite a while, that can do that kind of common sense reasoning about the physical world. Add to that pedestrians. So add some crazy people in this room right now to the whole scene. Right, and being able to notice, like, this guy's an asshole. Look at him. What is he doing? What is he doing? Get off that skateboard. Oh, Jesus, he's in traffic. Yep.
Starting point is 00:36:55 Oh, Jesus, he's in traffic. Yep. And considering not that he's an asshole, he's a respectable skateboarder, that in order to make him behave a certain way, you yourself have to behave a certain way. So it's not just you have to perceive the world. You have to act in a way that you have to assert your presence in this world. You have to take risks. So in order to make the skateboarder not cross the street, you have to perhaps accelerate if you have the right of way.
Starting point is 00:37:29 And there's a game-theoretic game of chicken to get right. I mean, we don't even know how to approach that as an artificial intelligence research community, and also as a society. Do we want an autonomous vehicle that speeds up in order to make a pedestrian not cross the street? Which is what we do all the time. We have to assert our presence. If there's a person who doesn't have the right of way who begins crossing, we're going to either maintain speed or speed up, potentially, if we want them to not cross. So that game there, to get that right... That's a dangerous game for a robot.
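The "game of chicken" being described can be caricatured as a toy decision rule. The inputs and the policy below are invented purely for illustration; a real planner would reason probabilistically over pedestrian intent rather than over two boolean flags:

```python
def negotiate_crossing(ego_has_right_of_way: bool,
                       pedestrian_committed: bool,
                       current_speed: float) -> float:
    """Toy policy for the 'assert your presence' interaction described above.

    Returns a target speed in m/s. All logic here is a made-up sketch,
    not any deployed planner's behavior.
    """
    if not ego_has_right_of_way or pedestrian_committed:
        # Yield: the pedestrian is crossing (or entitled to), so brake to a stop.
        return 0.0
    # Ego has the right of way and the pedestrian is still hesitating:
    # maintaining speed signals intent, discouraging them from stepping out.
    return current_speed
```

Even this caricature shows why the problem is hard: the "correct" action depends on predicting what the other agent will do in response to your own action.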
Starting point is 00:38:04 It is, for a robot. And for us, if that, God forbid, leads to a fatality, for us as a society to rationally reason about that and think about that. I mean, a fatality like that could basically bankrupt a company. There's a lawsuit going on right now about an accident in Northern California with a Tesla. Yeah. Are you aware about that one? Yeah. What were the circumstances of that one? So there was, I believe in Mountain View, a fatality in a Tesla, where... This is a common problem for all lane-keeping systems like Tesla Autopilot: there's a divider in the highway. And basically, the car was driving along the lane.
Starting point is 00:38:52 And then the car in front moved to an adjacent lane. And this divider appeared, right? So you have to now steer to the right. And the car didn't, and it went straight into the divider. Oh, wow. Basically, what that boils down to is the car drifted out of lane, or didn't adjust properly to the lane.
Starting point is 00:39:13 Those kinds of things happen. And this is because the person was allowing the Autopilot to do everything? Nope. You can't. So we have to be extremely careful here. I don't know the really deep details of the case. I'm not sure exactly how many people do. So there's a judgment on what the person was doing, and then there's an analysis of what the system did. Right. The system drifted out of lane. The question is, was the person paying attention,
Starting point is 00:39:40 and was there enough time given for the person to take over, and if they were paying attention, to catch the vehicle, steer back onto the road? As far as I believe, the only information they have is hands on steering wheel, and they were saying that, like, half the minute leading up to the crash, the hands weren't on the steering wheel, or something like that. Basically trying to infer whether the person was paying attention or not. But we don't have the information exactly where their eyes were. You can only make guesses, as far as I know, again. So the question is, this is the eyes-on-the-road thing, because I think I've heard you
Starting point is 00:40:20 on a podcast saying you're tempted to sort of look off the road with your new Tesla or at least become a little bit complacent. That's your worry. Yeah, the worry is that you just rely on the thing, that you would relax too much. But what would that relaxation lead to? The problem is if something happened. If you weren't, you know, when you're driving, I mean, we've discussed this many times on the podcast that the reason why people have road rage, one of the reasons, is because you're in a heightened state. Because cars are flying around you and your brain is prepared to make split-second decisions and moves. And the worry is that you would relax that because you're so comfortable with that thing driving.
Starting point is 00:41:02 Everybody that I know that's tried that, they say you get really used to it doing that. You get really used to it just driving around for you. So the question is what happens when you get used to it? Do you start looking off-road? Do you start texting more? Do you start watching a movie, et cetera? Right. That's really an open question. And, for example, we just published a study from MIT on what people in our data set do. We collected this data set of 300,000 miles in Teslas. We instrumented all these Teslas and watched what people are actually doing.
Starting point is 00:41:37 And are they paying attention when they disengage the system? So there's a really important moment here. We have 18,000 of those, when the person catches the car, you know, disengages Autopilot. And that's a really... Tesla uses this moment as well. That's a really important window into difficult cases. So some percentage of those, some small percentage, it's about 10%, is what we call tricky situations: situations where you have to immediately respond, like drifting out of lane, or if there's a stopped car in front, and so on. The question is, are people paying attention during those moments? So in our data set, they were paying attention. They were
Starting point is 00:42:16 still remaining vigilant. Now, in our data set, the autopilot was, quote-unquote, encountering tricky situations every 9.2 miles. So you could say it was failing every 9.2 miles. That is one of the reasons we believe that people are still remaining vigilant, that it's regularly and unpredictably sort of drifting out of the lane or misbehaving. So you don't overtrust it. You don't become too complacent. The open question is when it becomes better and better and better and better, will you start becoming complacent?
Starting point is 00:42:55 When it drives on the highway for an hour, an hour and a half, as opposed to 9.2 miles, make that 50 miles, 60 miles, do you start to overtrust it? And that's a really open question. Do you think, or do you anticipate, a time anywhere in the near future where you won't have to correct, you will allow the car to do it, because the car will be perfect? The car, first of all, will never be perfect. No car will ever be perfect. Autonomous vehicles will always, you think, require at least some sort of manual override? Yeah. Really? That's interesting that you're saying that, because you work in AI. Like, what makes you think that that's impossible to achieve? Well, let's
Starting point is 00:43:40 talk, because you're using the word perfection. I think... Perfection, okay, that's a bad word. Yeah, so I guess you're implying... Achieve it. Let me see. Will it achieve, because people are obviously not perfect, will it achieve a state of competence that exceeds the human being? And let's put it in a dark way: competence measured by fatal crashes. Yes.
Starting point is 00:44:04 Yes, I absolutely believe so. And perhaps in the near term. Near term? Like five years? Yeah. For me, five, ten years is near term. For Elon, in Elon Musk time, that's converted to one year.
Starting point is 00:44:19 Have you met him? Yes. Interviewed him recently. Fascinating cat, right? Yep. Got a lot of weird shit bouncing around behind those eyeballs. You don't realize until you talk to him in person. You're like, oh, you got a lot going on in there, man.
Starting point is 00:44:33 Yeah. There's passion. There's drive. I mean, it's one of the- It's a hurricane of ideas. Yeah. And focus and confidence. Mm-hmm.
Starting point is 00:44:43 And focus and confidence. I mean, the thing is, in a lot of the things he does, which I admire greatly from any man or woman innovator, it's just boldly, fearlessly pursuing new ideas, or jumping off the cliff and learning to fly on the way down. I mean, no matter what happens, he'll be remembered as one of the great innovators of our time. Whatever you say. Maybe, in my book, Steve Jobs was as well. Even if you criticize, perhaps, that he hasn't contributed significantly to the technological development of the company or the different ideas they did, still, his brilliance was in all the products: the iPhone, the personal computer, the Mac, and so on. And I think the same is true with Elon.
Starting point is 00:45:32 And yes, in this space of autonomous vehicles, of semi-autonomous vehicles, of driver assistance systems, it's a pretty tense space to operate in. There's several communities in there that are very responsible, but also aggressive in their criticism. So in driving in the automotive sector, obviously since Henry Ford and before there's been a culture of safety of just great engineering. These are like some of the best engineers in the world in terms of large scale
Starting point is 00:46:02 production. You talk about Toyota, you talk about Ford, GM. These people know how to do safety well. And so here comes Elon with Silicon Valley ideals that throws a lot of it out the window and says we're going to revolutionize the way we do automation in general. We're going to make software updates to the car once a week, twice a week, over the air, just like that. That makes people and the safety engineers and human factors engineers really uncomfortable.
Starting point is 00:46:33 Like, what do you mean you're going to keep updating the software of the car? Like, how are you testing it? That makes people really uncomfortable. Why does it make them uncomfortable? Because of the way in the automotive sector you test the system: you come up with a design of the car, every component, and then you go through really rigorous testing before it ever hits the road. The idea from the Tesla side is they basically test the software in shadow mode, but then they just release it.
Starting point is 00:47:03 So essentially the drivers become the testing. And then they regularly update it to adjust if any issues arise. That makes people uncomfortable because there's not a standardized testing procedure. There's not, at least, a feeling in the industry of rigor. Because the reality is we don't know how to test software with the same kind of rigor that we've tested automotive systems in the past. So I think it's extremely exciting and powerful to approach automotive engineering with, at least in part, a software engineering perspective. So just doing what's made Silicon Valley successful: updating regularly, aggressively,
Starting point is 00:47:53 innovating on the software side. So your Tesla, over the air, while we're sitting here, could get a totally new update. With a flip of a bit, as Elon Musk says, it can gain all new capabilities. That's really exciting, but that's also dangerous. And that balance, we... Well, what's dangerous about it? That it'd be faulty software? Faulty, a bug. So the apps on your phone fail all the time. We're, as a society, used to software failing, and we just kind of reboot the device or restart the app.
Starting point is 00:48:23 where as a society used to software failing, and we just kind of reboot the device or restart the app. Most complex software systems in the world today, if we think outside of nuclear engineering and so on, they're too complex to really thoroughly test. So thorough, complete testing, proving that the software is safe is nearly impossible on most software systems. That's nerve-wracking to a lot of people because there's no way to prove that the new software update is safe. So what is the process?
Starting point is 00:49:02 Do you know how they create software, they update it, and then they test it on something? How much testing do they do before they upload it to your car? Yeah, so I don't have any insider information, but I have a lot of publicly available information, which is: they test the software in shadow mode, meaning they see how the new software compares to the current software by running it in parallel on the cars and seeing if there's disagreements, like seeing if there's any major disagreements, and bringing those up.
Starting point is 00:49:42 By parallel, I'm sorry. Do you mean both programs running at the same time? One, the original, yes, at the same time, the original update actually controlling the car, and the new update is just...
Starting point is 00:49:57 Making the same decisions? Making the same decisions without them being actuated. Without actually affecting the vehicle's dynamics. And so that's a really powerful way of testing. I think the software infrastructure that Tesla has built allows for that. And I think other companies should do the same. That's a really exciting, powerful way to approach not just automation, not just autonomous vehicles or semi-autonomous vehicles, but just safety, is basically all the data that's on cars,
Starting point is 00:50:29 bring it back to a central point to where you can use the edge cases, all the weird situations in driving to improve the system, to test the system, to learn, to understand where the car is used, misused, how it can be improved, and so on. That's extremely powerful. How many people do they have that are analyzing all this data? That's a really good question. So they have – the interesting thing about driving is most of it is pretty boring. Nothing interesting happens.
Starting point is 00:50:59 So they have automated ways of extracting, again, what are called edge cases, these weird moments of driving. Once you have these weird moments, they have people annotate. I don't know what the number is, but a lot of companies are doing this. It's in the hundreds and the thousands. Basically, humans annotate the data to see what happened. But most of what they're trying to do is to automate that annotation, to figure out how the data can be automatically used to improve the system.
Starting point is 00:51:34 So they have methods for that because it's a huge amount of data. I think in the recent Autonomy Day a couple of weeks ago, this big Autonomy Day where they demonstrated the vehicle driving itself on a particular stretch of road, they showed off that they're able to query the data, basically ask questions of the data, saying the example they gave is there's a bike on the back of a car, the bicycle on the back of a car. And they're able to say, well, when the bicycle is in the back of a car,
Starting point is 00:52:02 that's not a bicycle. That's just the part of the car. And they're able to now look back into the data and find all the other cases, the thousands of cases that happened all over the world, in Europe and Asia, in South America and North America and so on, and pull all those elements and then train the perception system of autopilot to be able to better recognize those bicycles as part of the car. So every edge case like that, they go through saying, okay, the car freaked out in this moment.
Starting point is 00:52:33 Let me find moments like this in the rest of the data and then improve the system. So this kind of cycle is the way to deal with problems, with failures of the system. It's to say, every time the car fails at something, say, is this part of a bigger set of problems? Can I find all those problems? And can I improve it with a new update? And that just keeps going. The open question is how many loops like that you have to take for the car to become really good, better than human. Basically, how hard is driving?
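The query-the-fleet loop just described, using the bicycle-on-a-bike-rack example from a moment ago, might look like this in miniature. The data layout, the bounding-box predicate, and the 0.5 cutoff are all invented for illustration; Tesla's actual infrastructure is not public:

```python
from typing import Callable, Dict, List

Box = tuple  # (x1, y1, x2, y2), illustrative layout

def overlap_fraction(inner: Box, outer: Box) -> float:
    """Fraction of `inner`'s area that lies inside `outer`."""
    x1, y1 = max(inner[0], outer[0]), max(inner[1], outer[1])
    x2, y2 = min(inner[2], outer[2]), min(inner[3], outer[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = (inner[2] - inner[0]) * (inner[3] - inner[1])
    return inter / area if area else 0.0

def bike_on_car(frame: Dict) -> bool:
    """Failure pattern from the example: a detected bicycle mostly inside
    a detected car box is probably on a bike rack, i.e. part of the car."""
    return any(overlap_fraction(frame["bike_box"], car) > 0.5
               for car in frame["car_boxes"])

def mine_edge_cases(fleet_log: List[Dict],
                    predicate: Callable[[Dict], bool]) -> List[Dict]:
    """Pull every logged frame matching the pattern, to relabel and retrain on."""
    return [frame for frame in fleet_log if predicate(frame)]
```

The point of the loop is that a single observed failure becomes a query, the query becomes thousands of matching frames from the fleet, and those frames become new training data.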
Starting point is 00:53:11 How many weird situations when you manually drive do you deal with every day? I don't know, there's like millions of cases. When you watch video, you see them. Somebody mentioned that they drive a truck, a UPS truck, past cow pastures, and they know that if there's no cows in the cow pasture, that means they're grazing. And if they're grazing, and I may not be using the correct terms, I apologize, not a cow guy, that means that there may be cows up ahead on the road. There's just this kind of reasoning you can use to anticipate difficult situations. And we do that kind of reasoning about, like, everything.
Starting point is 00:53:54 Cars today can't do that kind of reasoning. They're just perceiving what's in front of them. Now, outside of Tesla, how many other companies have autonomous systems that are driving their cars? So maybe it's good to step back. There are several and there are several leaders in each different approach. So first, let's draw a line between the different types of systems there are.
Starting point is 00:54:17 One, there's fully autonomous vehicles. So these are cars you can think about that don't have a steering wheel. If they have a steering wheel, it doesn't matter. They're in full control. And if there's a crash, the car company is liable. Do those exist? No. Okay.
Starting point is 00:54:36 It's a gray area, though, because many companies are basically saying that that's what they're doing, but they're not quite there. So the leader in that space used to be called the Google Self-Driving Car Program. Now it's called Waymo. They are doing that. It's called Level 4 or Level 5. There's levels to this game. And this is this particular level where it's fully autonomous. Now they're trying to achieve full autonomy, but the way they're doing it currently is they're testing on public roads with what's called a safety driver. So there's a driver always ready to take over. And the driver does have to take over at some rate, you know, frequently.
Starting point is 00:55:20 And so the fact that the driver has to take over, that's not fully autonomous then, right? So there's no car today that you can just get in without a safety driver. So there's nobody behind the wheel. And using your app, sort of get from point A to point B. Right. But out of the cars that are semi-autonomous, where there is an autonomous program, but you do have to keep your hands on the wheel and pay attention to the road, what are the leaders?
Starting point is 00:55:43 Besides Tesla, who else is doing it? So there's several systems. It depends how you define leader. But... Let me ask you this. Like, do Mercedes and BMW use the same system? Does someone make a system for cars, or do they create their own systems? Yeah, that's a really good question. So in some cases, there's Mobileye and NVIDIA. There's these companies that... NVIDIA, the video card company? Yeah, the video card company. Yep.
Starting point is 00:56:14 The same folks that power the Quake game, right? Right. The graphics on the Quake game. You can use those GPU, graphics processing units, to run machine learning code. So they're also creating these. Look at that, NVIDIA Drive. Scalable AI platform for autonomous driving.
Starting point is 00:56:31 In fact, I don't... When did you buy your Tesla? Five months ago, something like that. So the thing in there now, most likely, is an NVIDIA Drive PX2 system. And that just runs code that takes in camera data, but it can work on anything else, so it could work on LiDAR as well if somebody had a system. Yeah, but it would need different code. So LiDAR requires very different kinds of processing. Does anybody use that with cars, with semi-autonomous cars?
Starting point is 00:57:06 LIDAR. Yes. Well, okay, so semi-autonomous, we have to be careful. Right. Because Waymo cars, the quote-unquote fully autonomous cars, are currently semi-autonomous. Right. That's the highest level of semi-autonomous, right?
Starting point is 00:57:23 Yeah. I guess it's not even a highest level. It's a principle. It's a philosophy difference because they're saying we're going to do full autonomy. We're just not quite there yet. Most other companies, they're doing semi-autonomous, better called driver assistance systems. They're saying we're not interested in full autonomy. We just want a driver assistance system that just helps you steer the car.
Starting point is 00:57:46 So let's call those semi-autonomous vehicles or driver assistance systems. There's several leaders in that space. Like, one car we're studying that's really interesting is the Cadillac Super Cruise system. So GM has this system called Super Cruise that I think is the best comparable system to Autopilot today. There's a lot of little elements, but the key differentiator is a driver monitoring system. So there's a camera that looks at you and tells you if your eyes are on the road or not.
Starting point is 00:58:25 And if your eyes go off the road for, I believe, more than six seconds, it starts warning you and says you have to get your eyes back on the road. So that's called driver monitoring. That's one of the big disagreements, for example, between me and many experts in the field on one side, and Elon and the Tesla approach on the other: that there should be a driver monitoring system. There should be a camera looking. Why does Elon feel like there shouldn't be? I think his focus, Tesla's focus, is on just improving the system so fast and so effectively
Starting point is 00:58:55 that it doesn't matter what the driver does. So essentially no safety net. No safety net. And I think they operate like that in many ideas that they work with: they sort of boldly proceed forward to try to make the car extremely safe. Now, the concern there is you have to acknowledge the psychology of human beings: that unless the car is perfect, or under our definition perfect, which is much better than human beings, you have to be able to make sure that the people are still paying attention to help the car out when it fails. And for that, you have to
Starting point is 00:59:40 have driver monitoring. You have to know what the driver is doing. Right now, your Tesla only knows about your presence from the steering wheel. Touching the steering wheel. Which is a kind of driver monitoring system. It knows you're there, but it's not nearly as effective at knowing you're there cognitively, visually. It can be tricked by clamps. You've seen people do that. They develop these clamps that you just put on the steering wheel. It'll hold a phone, and it'll also trick the system into thinking that you're holding on to the wheel.
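The eyes-off-road escalation attributed to Super Cruise a moment ago can be sketched as a simple timer over per-frame gaze classifications. This is a toy version: the six-second threshold is the figure recalled in the conversation, and real systems vary it with speed and context:

```python
# Toy driver-monitoring escalation logic -- a sketch of the idea only,
# not GM's actual implementation, which is not public.
class DriverMonitor:
    def __init__(self, warn_after_s: float = 6.0):
        self.warn_after_s = warn_after_s  # threshold as recalled on the podcast
        self.eyes_off_s = 0.0             # accumulated eyes-off-road time

    def update(self, eyes_on_road: bool, dt: float) -> str:
        """Call once per camera frame with the gaze classification and
        the elapsed time since the last frame."""
        if eyes_on_road:
            self.eyes_off_s = 0.0  # glancing back resets the timer
            return "ok"
        self.eyes_off_s += dt
        return "warn" if self.eyes_off_s >= self.warn_after_s else "ok"
```

Contrast this with a steering-wheel torque check: the camera-based version tracks where attention actually is, which is exactly why it is harder to trick with a clamp or a purse.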
Starting point is 01:00:08 Yeah, you can do a lot of... Purses actually work really well. Don't ask me how I know this. Hanging a purse? No, like shoving a purse into the... Oh, really? Into the... Somebody did that with an orange or something like that, but they said it didn't work. Maybe it needs to be all the way around the outside of it.
Starting point is 01:00:22 I think it depends on the shape of the orange, how ripe it is. There's a lot of debate. No, the point is there's ways to trick the system. It's not monitoring the driver. That's the point, right? Yeah, it's not monitoring the driver. And a lot of people believe you need to. You think you need to.
Starting point is 01:00:37 Makes sense. Yeah, I think not just for the safety of the system, but to create an experience. Like, I think there's value for the car to know more about you. Sort of just like that. What's happening there. Scanning this guy's eyes.
Starting point is 01:00:57 Minority report. Shit freaks me out. So yeah, there's a lot of companies sort of springing up. They're doing computer vision on the face and so on to try to detect where you're looking. So what cars have that now? The major one is the Super Cruise system. There's not many cars.
Starting point is 01:01:13 A few cars are starting to add it. In Europe. What's a Super Cruise system? That's the GM Cadillac. So it's in their super expensive lineup currently, and they're, I think, trying to add it to their full lineup. So in like Cadillacs, what is that big cruiser that they have now, the big four-door car, the really high-end?
Starting point is 01:01:34 It's a CT6. I don't know what it is. They have a new one that's really nice. Is that what they're putting in? The big sedan? That thing. Yes. It's pretty.
Starting point is 01:01:45 I don't know if that's the CT6, but the one we're looking at is the CT6. Yeah, that's a 2018 CT6. Yeah. But they want to add it to their full fleet. It's a really interesting system. Does that have the same number of cameras as the Tesla system does? No, and it has a very different philosophy as well, in another way, which is that it only works on very specific roads, on interstate highways. There's something called
Starting point is 01:02:12 ODD, operational design domain. So they define that this thing, the Super Cruise system, only works on this particular set of roads, and they're basically just major highways. The Tesla approach is basically what Elon jokingly referred to as ADD, right? It works basically anywhere. So if you try to turn on your Autopilot, you can basically turn it on anywhere where the cameras are able to determine either lane markings or the car in front of you. And so that's a very different approach, saying you can basically make it work anywhere or, in the Cadillac case, make it work on only specific kinds of roads so you can test the heck out of those roads.
Starting point is 01:02:53 You can map those roads. You can actually use LiDAR to map the full road, so you know the full geometry of all the interstate highway system that it can operate on. And then... Does it also coordinate with GPS so it understands where, like, bumps in the road might be, or hills? In that sense, it coordinates with GPS for curvature information, but... Not the topography? No.
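The operational design domain idea being discussed, where the system only engages on roads it was designed and mapped for, amounts to a gate check before engagement. A minimal sketch with invented road classes and names, just to show the contrast with a works-anywhere approach:

```python
# Illustrative sketch of an operational design domain (ODD) gate: the system
# engages only on road types it was designed and mapped for. The road classes
# and names here are invented for illustration.

SUPPORTED_ROADS = {"interstate_highway"}   # a Super Cruise-style, mapped-only ODD

def can_engage(road_type, lane_markings_visible):
    """An ODD-restricted system checks the road class before engaging; a
    works-anywhere system would instead check only what the cameras see."""
    return road_type in SUPPORTED_ROADS and lane_markings_visible

assert can_engage("interstate_highway", True)
assert not can_engage("city_street", True)        # outside the ODD
assert not can_engage("interstate_highway", False)  # no lane markings detected
```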
Starting point is 01:03:16 And, like, construction is a big one. That would be crazy for new potholes, you know? Well, potholes aren't a big problem. I think construction is the big problem. Just this quickly changing dynamic information, which, like,
Starting point is 01:03:29 apps like Waze provide a lot of. Potholes are a pretty big problem in Boston, though. Yeah, no, for sure. But
Starting point is 01:03:36 New York is actually probably even worse. I blew out two tires in one day in New York on potholes. Just had an unlucky day. Yeah, but I'd rather you blow out your tire than... For sure.
Starting point is 01:03:48 ...than, I mean, the kind of fatality that happened in Mountain View with the Tesla, I believe, is slightly construction-related. So, I mean, there are a lot of safety-critical events that happen, construction-related stuff. I would like it if that stupid Tesla could figure out the hole in the ground, though, so I didn't have to blow a tire out.
Starting point is 01:04:06 Come on, bro. I feel your pain, Joe. Figure it out. But priorities. I'll make sure. I'll forward this podcast to Elon to make sure they work on this. I think he's busy. What is this, Jamie?
Starting point is 01:04:18 Tesla autopilot will be able to avoid potholes in the road, says Elon Musk. Ha ha! The motherfucker's on top of shit. What's the date on that? April 7th. Okay, just now. And that's an interesting thing. That's almost an ethical question, whether you want a car to avoid a situation by swerving, right? Because when you swerve, as opposed to sort of braking the vehicle only, swerving into another lane means you might create a safety situation elsewhere. You might put somebody else in danger.
Starting point is 01:04:53 Yeah, that's why I was saying if it coordinated with GPS, it would have previous knowledge. You know, sort of like Waze tells you where the cops are. Yep. You know what I mean? So that kind of information would be extremely powerful and useful. The problem is it's hard to get that kind of information really up to date. It's just an infrastructure question,
Starting point is 01:05:15 just getting the software, the data in place to where the car would be able to learn quickly from all the things that are changing. I think potholes don't change that often, so that's a different thing. But in terms of construction zones, in terms of other weird things that change the dynamics, the geometry of the road, that's difficult to get right.
Starting point is 01:05:36 So Cadillac's doing a version of it, but it sounds like it's a little bit less involved, less comprehensive. Maybe that's a better way of describing it. Yeah, and less, I would say it's more safety-focused. It's a sort of, what's the right words to use here? It's more cautious in its implementation. So GM, again, has a tradition for better or for worse. Is this it, Jamie?
Starting point is 01:06:05 Yeah, this is a video they have on their website. This little part I'm showing you shows the signal coming on. Oh, wow, the green thing. Pay attention, lady. She's too hot. She's not paying attention. She's looking at people staring at her. No comment. One of the things that
Starting point is 01:06:21 it's hard to talk about without actually experiencing the system is, what's more important than driver monitoring and any of the details we talk about is how the whole thing feels, the whole thing together, how it's implemented, the whole interface. And the Cadillac system is actually done really well in the sense that there's a clarity to it. There's a green color and a blue color, and you know exactly when the system is on and when it's off. That's one of the big things people struggle with. It's just confusing in other cars, drivers not being able to understand when the system is on or off. Oh, right. So you think the system's doing it, and then you just slam into something, and it wasn't even on.
Starting point is 01:07:01 Now, when this car is operating in this manner, how many cameras is it using? You know, that's a good question. I should know that, but I think it's only forward-facing cameras, as far as I know. I think it's two cameras. It may be three cameras. That lady just sat back, so she doesn't have her hands on the wheel at all. Yep. So she's watching, right? Because the car is able to see where the eyes are. It's a hands-off system, so you're allowed to take your hands off the wheel. It's very interesting. And there are certain human behavior aspects that come into play with this. So you start to, like... I found myself actually becoming a little more drowsy with this system. I haven't driven
Starting point is 01:07:47 it enough, so I haven't gotten used to it. But at least in the initial stages, it kind of forced you to look at the road in a way that felt artificial. I think it's something that gets better with time. You get used to it. But it's almost like a gamified thing, where the car, when you look off road, starts to tell you that you're looking off road. So you're kind of psychologically pressured to always stare at the road. And you realize that actually, when you drive, you often look around. And so having to, like, stare forward can be a little bit... Yeah, exactly. You start... There's something peaceful and hypnotic about those lanes just coming at you. The lines, yeah.
Starting point is 01:08:30 Why is that? That confuses the shit out of me because I could not be tired at all. But if it's nighttime and I'm on the highway and those lines, they just start to take you to dreamland. I get the same with, there's also just the vibration. There's like that hum of driving. Same with trains. Yeah. Yeah, planes as well.
Starting point is 01:08:51 Yeah, puts me out. So that's the Cadillac system. That's the big leader, I would say, in driver monitoring. And then Tesla is the no driver monitoring and huge data collection approach. BMW has a system as well they use? Yeah, BMW.
Starting point is 01:09:08 What are they using? I don't want to speak too much to the details, but they have lane keeping systems. They're basically systems that keep you in the lane. That is similar in spirit to what Autopilot is supposed to do, but less aggressive in how often you can use it and so on. If you look at the actual performance, how often the system is able to keep you in lane, Autopilot is currently the leader in that space. And they're also the most aggressive innovators in that space. They're really pushing it to improve further and further. And the open question, the worrying question, is if it improves much more, are there going to be effects like complacency? Like, people will start texting
Starting point is 01:09:53 more, we'll start looking off-road more. It's a totally open question, and nobody knows the answer to it, really. And there's a lot of folks, like I mentioned, in the safety engineering and human factors community, these psychology folks who have roots in aviation, and there's been 70 years of work that looks at vigilance. If I force you to sit here and monitor for something weird happening, like radar operators in World War II had to watch for the dot to appear. If I sit you behind that radar and make you do it, after about 15 minutes, but really 30 minutes, your rate of being able to detect any problems will go down significantly. You just kind of zone out.
Starting point is 01:10:40 And so there's, like, all kinds of psychology studies that show that we're crappy. Human beings are really crappy at monitoring automation. If I put a robot on you and just say, monitor this system so it doesn't kill anyone, you'll tune out. And we have to be engaged. You have to be engaged. There has to be a dance of attention. We don't have a mode for watching autonomous things, right?
Starting point is 01:11:08 If you consider historically the kind of modes that people have for observing things, we don't really have a mode for making sure that an autonomous thing does its job. Yeah. It's not a mindset. It's not like, oh, you know what I mean? Like if in my car, okay, I'm driving. Here we go. Oh, driving.
Starting point is 01:11:19 I turn. I'm thinking. I'm in driving mode. When you're in autonomous mode and you're observing, you're just like, what've never done this before this is fucking weird it feels weird it's not part of human nature right it's a normal state one thing it's done commonly in now is aviation so pilots pilots are basically monitoring fully autonomous planes yeah that's a good point as far as i know many planes today could fly almost fully autonomously. It's also a good point when it comes to software and updates
Starting point is 01:11:48 because isn't that part of the issue with this Boeing 737 MAX? The MAX system. Yeah, these systems that they've had problems with, they've been faulty, and a couple have crashed. Yeah, and that's a really good point. There have been two tragic crashes recently with the MAX system. Yeah, they've benched those things, right? Haven't they?
Starting point is 01:12:11 I'm not following... They also got rid of a bunch of inspectors. I think they fired like 80 inspectors today. And the unions are freaking out. Yep. And obviously there's politics. I think the FAA is supposed to supervise, and there's a close relationship between Boeing and the FAA. There are questions around that. I mean, there are better experts on that than me.
Starting point is 01:12:32 But on the software side, it is worrying, because it was a single software update, essentially, that helps prevent the airplane from stalling. So if the nose is tilting up, increasing the chance of stalling, it's going to automatically point the nose of the airplane down. And the pilots, in many cases, as far as I understand, weren't even informed of this update, right? They weren't even told this was happening. The idea behind the update is that they're not supposed to really know.
Starting point is 01:13:08 It's supposed to just manage the flight for you. The problem happened with the angle of attack sensor, the sensor that tells you the actual tilt of the plane. There was a malfunction in that sensor, as far as I understand, in both planes. And so the plane didn't actually understand its orientation. So the system started freaking out, started pointing the nose down aggressively. And the pilots were trying to restabilize the plane and couldn't. So shortly after liftoff, they just crashed. Oh, my God.
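The failure mode Lex describes, an automated stall-protection rule trusting a single faulty angle-of-attack sensor, can be caricatured in a few lines. This is emphatically not the real flight-control code; the threshold, names, and numbers are invented purely to show how a bad sensor reading drives a bad command:

```python
# Cartoon of the failure mode described above: a stall-protection rule that
# trusts a single angle-of-attack (AoA) sensor. NOT the actual MCAS code;
# the threshold and values are invented to illustrate the logic only.

STALL_AOA_DEG = 15.0   # assumed threshold above which the system intervenes

def pitch_command(sensor_aoa_deg):
    """Return a nose-down trim command when the sensed AoA looks too high."""
    if sensor_aoa_deg > STALL_AOA_DEG:
        return -1.0    # push the nose down
    return 0.0         # leave the pilots in control

true_aoa = 5.0                     # the plane is actually flying normally
faulty_reading = true_aoa + 20.0   # malfunctioning sensor reports 25 degrees

assert pitch_command(true_aoa) == 0.0          # correct sensor: no intervention
assert pitch_command(faulty_reading) == -1.0   # faulty sensor: nose pushed down
```

The sketch makes the single-point-of-failure visible: the rule never cross-checks a second sensor, so one bad reading is enough to command nose-down against a plane that was flying normally.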
Starting point is 01:13:37 Yeah, that's a software update. That's crazy. And that's a safety culture that's dealing with this new world of software that we don't know what to do with. You know, and yeah, it's a question. One way is to be sort of a little bit Luddite. I use the term carefully and just be afraid and say, you know what, we should really not allow so many software updates. The other one is sort of embracing it and redefining what it means to build safe AI systems in this modern world with updates multiple times a week. What do you think?
Starting point is 01:14:12 So I'm 100% for the software approach. I think regular updates, combining the two cultures, but really letting good software engineering lead the way, is the way to go. I wish other companies were competing with Tesla on this. On the software side, Tesla is far ahead of everyone else in the automotive sector, and that's one of the problems. I mean, competition is good, right? And I'm worried people are way too far behind to actually give Tesla new ideas, to compete with Tesla on software. So most cars are not able to do over-the-air updates. As far as I know, no cars are able to do major over-the-air
Starting point is 01:15:04 updates except Tesla vehicles. They do over-the-air updates to the entertainment system, like, you know, if your radio is malfunctioning. But in terms of the control of the vehicle, you have to go to the dealership
Starting point is 01:15:16 to get an update. Tesla is the only one that can do the update over the air, multiple times a week. I think that should be a requirement
Starting point is 01:15:24 for all car companies. But that requires that they rethink the way they build cars. That's really scary when you manufacture over a million cars a year at Toyota and GM to say, especially for old school Detroit guys and gals that are, like, legit car people, to say we need to hire some software engineers. That's a challenge. You know, I don't know how often you've been to Detroit, but there's a culture difference between Detroit and Silicon Valley, and those two have to come together to solve this problem. So you have, like, the adult responsibility of Detroit, of how to do production well, how to manufacture, how to do safety well, how to test the vehicles well, and the bold, crazy, innovative spirit of Silicon Valley,
Starting point is 01:16:10 which Elon Musk in basically every way represents. And I think that will define the future of these, of actually AI in general. I mean, interacting with AI systems, even outside the automotive sector, requires these questions of safety, of AI safety, of how we supervise the systems, how we keep them from misbehaving, and so on. Also, there's a concern about those systems being vulnerable to third-party attacks. Yeah, so hacking. Yeah. That's a fascinating question. I think there is a whole discipline called adversarial machine learning in AI, which, for basically any kind of system you can think of, studies how we can feed it examples, how we can add a little bit of noise to the input to fool it completely. So there have been demonstrations on Alexa, for example,
Starting point is 01:17:11 where you can feed noise into the system that's imperceptible to us humans and make it believe you said anything. So fool the system into, I don't know, ordering extra toilet paper. And the same for cars. You can feed noise into the cameras to make it believe that there is or there isn't a pedestrian, that there are or there aren't lane markings. So someone could do this? In theory, at least. In theory, that's the big difference.
Starting point is 01:17:39 It is doable; you can do demonstrations. In practice, it's actually really difficult to do in the real world. So in the lab, you can do it. You can construct a situation where a pedestrian can wear certain types of clothing, or put up a certain kind of sign, that makes them disappear from the system. I have to ask you this, because now I just remembered this. You'd be the perfect person to talk about this. I'm not sure if you remember this case, but there was a guy named Michael Hastings. Michael Hastings was a journalist, and he was, I believe, in Iraq or Afghanistan. He was somewhere overseas. And he was stuck there because of this volcano that erupted in, I believe, Iceland.
Starting point is 01:18:15 And he was over there for Rolling Stone magazine, doing an article about a general. Well, he stayed there for a long time because they were stranded because of the volcano, and they got real comfortable around him, and he reported a lot of the stuff that they said and did that maybe they thought he probably wouldn't have reported on, including them saying disparaging things about President Obama at the time. Anyway, he comes back. The general was forced to resign. He was a beloved general. And Michael Hastings was fearing for his life, because he thought that they were going to come and get him.
Starting point is 01:18:54 Because these people were very, very angry at him. He wound up driving his car into a tree going like 120 miles an hour. And the car exploded and the engine went flying, and the conspiracy theorists were saying they believed that that car had been rigged to work autonomously, or that some third party, bad person or good person depending on your perspective, decided to drive that guy's car into a fucking tree
Starting point is 01:19:28 at 120 miles an hour. Do you think that, and this is 2011? Michael Hastings' death, 12 maybe, 2012? I think that sounds right, 12. Let's see what it says.
Starting point is 01:19:46 2013? Yeah, June 2013. Do you think that in 2013 that would have been possible? It's entirely possible. No, I just wanted to say that. Shout out to the Joe Rogan subreddit. Okay. Check that one off the list.
Starting point is 01:20:11 Jamie, pull that up. Check that off. Whether it's possible is an interesting question. Whether it's likely is another question. I think it's very unlikely. And the other, most important question is, is that something we should worry about at scale, in our future, cars being used to essentially assassinate people? I'm Russian, so I've heard of those things being done by our friend Putin. I think it's very unlikely that this kind of thing would happen at scale, that people would use this.
Starting point is 01:20:51 I think there would be more effective ways to achieve this kind of end. For sure. And I just think it's a very difficult technical challenge. If hacking happened, it would be at a different level than hacking the AI systems. It would be just hacking software. Right. And hacking software is the kind of thing that can happen with anything.
Starting point is 01:21:17 Elevator software, or any kind of software that operates any aspect of our lives, could be hacked in that same kind of way. Right. My question, though, was, in 2013, was that technology available where they could take over someone's car? Do you know what car it was? Mercedes. I think it was an S-Class. C-Class. C? C-Class? Yes. Yes. But I don't think... Oh, boy, this is, like... No, listen, this has been widely speculated. I know. I'm just asking you because you're actually an expert. I mean, it's
Starting point is 01:21:53 very rare that you get an expert in autonomous vehicles and you get to run a conspiracy theory by them to see if they can just put a stamp on it being possible or not. Let me just say that Alex Jones is officially not allowed to say 'MIT scientist says,' which is exactly what he's going to try to do. No, first of all, let me back off and say I am not a security expert, which is a very important difference. That is important.
Starting point is 01:22:18 So then, autonomous vehicles. I build autonomous vehicle systems. I don't know how to make them extremely robust to hacking attacks. And I have a lot of really good friends, who are some of the coolest people I know, who are basically hackers converted to security experts. I would say, though, loosely speaking, I think the technology was there, yes, with physical access to the car, to be able to control it.
Starting point is 01:22:44 But I think it's extremely unlikely that's what happened. I agree. I see where you're coming from. I'm not asking you whether or not it's likely that it happened. I'm sure you don't even have much information on the case, because I had explained it to you, right? That's right. The guy also had some serious amphetamines in his system. They compared it to crystal meth, but the reality is he was a journalist, and most journalists, I don't want to say most, a lot are on Adderall, and Adderall is essentially amphetamines.
Starting point is 01:23:18 I mean, that's what it is. It's like next-door neighbors to crystal meth. It really is. Well, you said it's possible. They could actually get it to turn the wheel. Yeah, so I'd have to look at the exact system.
Starting point is 01:23:37 It's that drive-by-wire thing that I mentioned. With some systems, it's not so easy to turn the wheel, actually. Right, but it could get him to just accelerate out of control. He's going like 120-something miles an hour and he slammed into a tree. It's entirely possible. Ah, you can't do it twice. The systems back then, though, were far more primitive, correct? Yeah.
Starting point is 01:24:02 Yeah, but it's really, again, about the attack vectors here. So the way you hack these systems has more to do with the software, low-level software, which can be primitive, than the high-level AI stuff. Right, but my issue with it was there were no cameras on the outside of the vehicle like there are on a Tesla of today, which has autonomous driving as an option. Absolutely. Okay, I see your point now. So you wouldn't be hacking the system that perceives the world and acts based on the world. It would literally be a malfunction that forces it to not be able to brake, to accelerate uncontrollably. Which is a more basic kind of attack than making the car steer out of lane, for example. Yes, yes.
Starting point is 01:24:41 That's a different thing. That's what people worry about with autonomous vehicles, when more and more of the vehicle is software, and it's possible to hack the code. And so people are worried, legitimately so, that these security attacks would lead to these kinds of, well, at the worst case, assassinations, but really sort of just basic attacks, basic hacking attacks. And I think that's something that people in the automotive industry, and certainly Tesla, are really working hard on, making sure that everything is secure. There are going to be vulnerabilities always, of course, but I think they're really serious about preventing them.
Starting point is 01:25:36 But in the demonstration space, you'd be able to demonstrate some interesting ways to trick the system in terms of computer vision. This all boils down to the fact that these systems, the ones that are camera-based, are not as robust to the world as our human eyes are. So like I said, if you add a little bit of noise, you can convince it to see anything. To us humans, it'll look like the same road, like the same three pedestrians crossing the road. Could you draw like a little person on the camera lens?
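The little-bit-of-noise attack Lex keeps describing has a standard textbook form in adversarial machine learning (the fast gradient sign method). A toy sketch on a linear classifier, since the principle fits in a few lines; real attacks target deep vision models, and every number here is invented:

```python
import numpy as np

# Toy version of the imperceptible-noise attack: flip a linear classifier's
# decision with a tiny, structured per-pixel nudge. Real attacks target deep
# vision models; this only illustrates the principle.

rng = np.random.default_rng(0)
w = rng.normal(size=1000)   # the classifier's weights
x = rng.normal(size=1000)   # the "image" the classifier sees

score = w @ x               # say positive means "pedestrian", negative means none

# Per-pixel noise just large enough to flip the decision; spread over
# 1000 pixels, each individual pixel barely changes.
eps = (abs(score) + 1.0) / np.abs(w).sum()
x_adv = x - eps * np.sign(w) * np.sign(score)

assert np.sign(w @ x) != np.sign(w @ x_adv)  # the decision flips
assert eps < 0.5                             # while each pixel moves very little
```

The key point matches the conversation: because the perturbation is aligned with the model's weights rather than random, a change far too small for a human to notice accumulates across every pixel into a flipped decision.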
Starting point is 01:26:07 They're little cameras, right? You could get down there with a Sharpie. Oh, my God, there's a guy on the road. That's one attack vector. It's drawing stuff. You jokingly say that, but that's a possibility. The sun plays tricks on Cadillac Super Cruise. Next generation system will address camera problem.
Starting point is 01:26:25 Oh, well, as long as the next generation addresses it, you fucking assholes. The sun plays tricks on it? So for the next-gen system, you're going to have to bring that Cadillac into the dealership and they're going to have to update the software. Update it, yep. Whereas Tesla would just handle that shit. Over the air, yeah. Yeah, I got an update the other day. I was like, all right.
Starting point is 01:26:42 And the question is, so that's an exciting, powerful capability, but then, the Boeing thing is the flip side, right? You know, an update can significantly change the behavior of the system, and there could be a glitch, there could be a bug. The Boeing one's terrifying, especially with, I mean, that number, whatever it is, like 300, combined 300-plus people dead, maybe even 400. I mean, I don't even know how to think about that number. Yeah, all from a software glitch. The guy who coded it, or the girl who coded it, must feel fucking terrible. Yeah. And you kind of... Fuck, man.
Starting point is 01:27:27 It's a lot of burden, and it's one of the reasons it's one of the most exciting things to work on, actually. The code we write has the capability to save human life, but the terrifying thing is it also has the capability to take human life. And that's a weird place to be as an engineer, where directly a little piece of code,
Starting point is 01:27:49 I write thousands of lines a day, basically notes you're taking, could eventually lead to somebody dying. Now, I don't know anything about coding, but is there a spell check for coding? Yeah, so it's kind of called debugging. It's trying to find bugs. And it's software that's doing this? Yeah, software. So, depending on the programming language... And everybody should, if you haven't tried programming, you should try it. It's cool.
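The closest real-world analog to a spell check for code is the combination Lex gestures at here: debugging plus automated checks such as tests, assertions, linters, and type checkers. A toy illustration, with all names invented, of a tiny test suite catching a bug before it ships:

```python
# Toy illustration of the "spell check for coding" idea: automated checks
# (tests and assertions) that catch a bug before the code ships.
# Everything here is invented for illustration.

def average(values):
    """Return the arithmetic mean of a list of numbers."""
    if not values:                      # the edge case a test forces you to handle
        raise ValueError("average() of empty list")
    return sum(values) / len(values)

# A tiny "test suite" acting as the spell check:
assert average([1, 2, 3]) == 2.0
assert average([10]) == 10.0
try:
    average([])
except ValueError:
    pass  # the empty case is caught cleanly instead of a surprise ZeroDivisionError
```

Unlike a spell check, these tools only catch the mistakes you thought to check for, which is part of why safety-critical software, as in the Boeing discussion above, remains so hard.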
Starting point is 01:28:20 It's the future. You should learn to program. Okay. That's my plug. You're supposed to say learn to code. You can get kicked off Twitter for that. See how I avoided that? I heard that's a problem.
Starting point is 01:28:29 Everyone's scared of it. It's a problematic term. I don't actually know why. It's the dumbest fucking problematic term of all time, because someone ridiculously was suggesting that coal miners could maybe learn how to write computer code and get a different job, that they could be trained. And the way people were looking at it was like that was a, like a
Starting point is 01:28:54 frivolous suggestion, and that it was ridiculous to try to get someone who was 50 years old, who doesn't have any education in computers at all, to change their job from being a coal miner to learning how to code. So they started saying it to politicians and people, mocking it. But then what Twitter alleged was that it was being connected to white supremacy and anti-Semitism and a bunch of different things. Like, people were saying learn to code and they were putting in a bunch of these other phrases. My suggestion would be, well, that's a different fucking thing. Like, now you have a problem with Nazis and white supremacists. But the problem is with Nazis and white supremacists, not when someone is just saying learn to
Starting point is 01:29:39 code, mocking this ridiculous idea that you're going to teach, you know... That's a legitimate criticism of someone's perspective, that you're going to get a coal miner to learn how to fucking do computer coding. It's crazy. So, people getting banned for that... rightly so, people were furious. The way Twitter described it to me and Tim Pool, when we were discussing it, was that essentially they were dealing with something where they were trying to censor things at scale. There were so many people and there's so much going on that it's very difficult to get it right, and they've made mistakes. I think that's one of the most fascinating applications of AI, actually: filtering, trying to manage.
Starting point is 01:30:25 Computer learning. So using machine learning to manage this huge conversation. You're talking about, I believe it's 500 million tweets a day, something like that. Jamie makes at least three. Three. Maybe one. I was going to say, with this conversation, I saw this recently.
Starting point is 01:30:53 I don't know who did the data on this, but there's a statement someone put on Twitter. Let me see if I can word it correctly. It was: 22% of adult Americans are on Twitter. Whoa. All right. So that's, like, fact one. Of that, 10% make up 80% of the tweets created by adult Americans. That makes sense. So that's roughly 2% of adult Americans making up 80% of the tweets.
Starting point is 01:31:12 Yeah, that makes sense. Yeah. A lot of people are arguing. Aggressively. And the question of how to manage that, and you can't manage that by just manual review of each individual tweet. Yeah, you'd have to have so many employees. Yeah, that's, I think, more likely. I don't think Jack is lying,
Starting point is 01:31:32 nor is Vijaya. But I do think that they have a clear bias against conservatives, and that's being shown. So that's an interesting question. Your friend, my friend and mentor, Eric Weinstein, talked to me about this. I disagreed with him a little bit on it. I think he basically believes there's a bias.
Starting point is 01:31:51 It boils down to the conversation that Jack is having at the top level inside Twitter. What is that conversation like? I tend to believe, again, this might be my naive nature, that they don't have bias, and that they're just trying to remove people that lead to others leaving the conversation. So they want more people to be in the conversation. I think that's true as well. But I think they definitely are biased against conservative people. There's an Alexandria... AOC. Octavia? How is it? AOC is good. Cortez is the last one. Is it Ocasio? That's right. Okay, I'm sorry. AOC. Sorry, I'm just, I'm thinking.
Starting point is 01:32:58 I wasn't planning on talking about her, but there was a parody account, and someone was running this parody account, which was very mild, just a humorous parody account. They were banned permanently for running it, and then their own account was banned as well. Whereas, you know, there's some progressive people or liberal people that post all sorts of crazy shit, and they don't get banned at the same rate. It's really clear that someone in the company, whether it's up for manual review, whether it's at the discretion of the people that are employees... when you're thinking about a company that's a Silicon Valley company, you are, without doubt, dealing with people that are leaning left. There's so many that lean left in Silicon Valley. The idea that that company was secretly run by Republicans is ridiculous. They're almost all run by Democrats or progressive people.
Starting point is 01:33:53 So at the leadership level, there's a narrow-mindedness that permeates all Silicon Valley, you're saying? Well, I think there's a leaning left that permeates Silicon Valley. I think that's undeniable. I mean, I think if you polled the people that work in Silicon Valley on their political leanings, it would be by far left. I think it would be the vast majority. Does that mean that affects their decisions? Well, what's the evidence?
Starting point is 01:34:20 Well, it kind of shows that it does. They're not treating it with 100% clarity and across-the-board accuracy, or fairness, rather. I think that there's absolutely people that work there that lean. And there's been videos where they've captured people that were Twitter employees talking about it, talking about how you do that, how you find someone who's using Trump talk or saying "sad" at the end of things, certain characteristics they look for. And there's been videos of, what is that, Project Veritas, where that guy and his employees got undercover footage of Twitter employees talking about that kind of stuff. The question is, how much power do those individuals have? How many individuals are there like that?
Starting point is 01:35:07 Are those people exaggerating their ability and what they do at work? Or are they talking about something that used to go on but doesn't go on anymore? I don't know. I don't work there. I'm one of those people that believes it boils down to the leadership. To the people at the top, to the culture. And the culture cannot be this kind of Silicon Valley, narrow-minded, sort of left-leaning thinking.
Starting point is 01:35:45 Even if you believe... even if you're a hardcore liberal, when you operate, when you drive and manage a conversation for the entire world, you have to think about middle America. You have to have fundamental respect for human beings who voted for Trump. It is a concerning thing for me to see just a narrow-mindedness in all forms. One of the reasons I enjoy listening to this podcast is you're pretty open-minded. That open-mindedness is essential for leaders of Facebook and Twitter, people who are managing conversations. I think so too. I think being open-minded and acting in that ethic is probably one of the most important things that we could go forward with right now
Starting point is 01:36:23 because things are getting so greasy. It's so slippery on both sides, and we're at this weird position that I don't recall ever in my life there being such a divide between the right and the left in this country. It's more vicious, more angry, more hateful. It's different than at any other time in my life. And I think a lot of our ideas are based on these narratives that may or may not even be accurate. And then we support them and we reinforce them on either side. We reinforce them on the left, we reinforce them on the right. Whereas if you're looking at reality itself and you don't have these clear parameters and these clear ideologies, I think most of us are way more in the middle than we think we are. We just don't want racists running the country, we don't want socialists giving all our money away, we don't
Starting point is 01:37:17 want to pay too much in taxes to a shitty government, we don't want schools getting underfunded. We all... you know, and then we decide, what does my... like, the team that I... the shit that I like, is that this team? Well, not everything, but they got a lot of things, so I'll go with them. Maybe I'm not a religious nut, but I'm fiscally conservative, and I don't like the way Democrats like to spend money. I'm going to go with the Republicans. Maybe I'm more concerned with the state of the economy and the way we trade with the world than I am with certain social issues that the Democrats embrace. So I'll lean that way even though I do support gay rights and I do support this and I do support all these other progressive ideas. There's way more of us in that boat.
Starting point is 01:38:02 There's way more of us that are in this middle of the whole thing. For sure. But it goes up and down. So all of us, I believe... I hope I am open-minded most of the time, but you have different moods. Oh, for sure. Yeah. And the question is, this is where the role of AI comes in. Does the AI that recommends what tweets I should see, what Facebook messages I should see, is that encouraging the darker parts of me, or the Steven Pinker "better angels of our nature"? Like, what stuff is it showing me? Because if the AI trains purely on clicks, it may start to learn when I'm in a bad
Starting point is 01:38:42 mood and point me to things that might be upsetting to me. Yeah. And so escalating that division and escalating this vile thing that can be solved most likely with people training a little more jiu-jitsu or something. Well, this Facebook algorithm that encourages people to be outraged because accidentally, not even on purpose, but this is what engages people. This is what gets clicks. So they find out, oh, well, he clicks on things when he finds out the people are anti-vaccination. Or he clicks on things when he finds out, you know, fill in the blank with whatever the subject is. And then you get, these motherfuckers, you know, this is the reason why measles is spreading.
Starting point is 01:39:22 And you start getting angry. I mean, the anti-vax arguments on Facebook, I don't know if you ever dip into those waters for a few minutes and watch people fight back and forth in fury and anger. You know, it's another one of those things that becomes an extremely lucrative subject for any social media empire. If you're all about getting people to engage, and that's where the money is in advertising, to getting people to click on the page,
Starting point is 01:39:52 and the ads are on those pages, you get those clicks, you get that money. If that's how the system is set up, and I'm not exactly sure how it is because I don't really use Facebook, but that's what it benefits. I mean, that's what it gravitates towards. It gravitates towards controversy. So when we think about concern for AI systems, we talk about sort of Terminator, I'm sure we'll
Starting point is 01:40:10 touch on it, but I think of Twitter as a whole as one organism. The thing that worries me the most is the artificial intelligence that is very kind of dumb: simple algorithms that are driving the behavior of millions of people. Right. And together, the kind of chaos that we can achieve... I mean, that algorithm has incredible influence on all of society. Twitter... our current president is on Twitter. All day. Yeah, all day, all night.
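The dynamic being described here, a recommender trained purely on clicks drifting toward whatever provokes the strongest reaction, can be illustrated with a toy simulation. Everything in it is hypothetical: the item names, the click probabilities, and the greedy update are a sketch of the feedback loop, not any platform's actual algorithm:

```python
import random

# Toy feed: each item has a probability the simulated user clicks it;
# the user is assumed to click outrage-provoking content most often.
items = {"cat_video": 0.1, "news": 0.3, "outrage_post": 0.9}
clicks = {name: 1.0 for name in items}  # observed click counts (smoothed)
shows = {name: 2.0 for name in items}   # observed impression counts

random.seed(0)
for _ in range(5000):
    # Mostly recommend whatever has the highest observed click rate,
    # with a little random exploration.
    if random.random() < 0.1:
        choice = random.choice(list(items))
    else:
        choice = max(items, key=lambda n: clicks[n] / shows[n])
    shows[choice] += 1
    if random.random() < items[choice]:  # did the user click?
        clicks[choice] += 1

# The click-trained recommender converges on the outrage content.
print(max(items, key=lambda n: clicks[n] / shows[n]))  # → outrage_post
```

Because the simulated user clicks the outrage item most often, the click-maximizing loop ends up recommending it almost exclusively; nothing in the objective ever asks whether the content is good for the user, which is the concern voiced above.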
Starting point is 01:40:41 I mean, it's scary to think about. We talk about autonomous vehicles leading to one fatality, two fatalities. It's scary to think about what a small change in the Twitter algorithm could do. I mean, it could start wars. It really could. And that, if you think about the long term, if you think about it as one AI organism,
Starting point is 01:41:02 that is a super intelligent organism that we have no control over. And I think it all boils down, honestly, to the leadership, to Jack, and other folks like him, making sure that he's open-minded, that he goes hunting, that he does some jiu-jitsu, that he eats some meat and sometimes goes vegan. He just did a 10-day silent retreat where you don't talk at all for 10 days. He also eats once a day. I follow a similar diet to him.
Starting point is 01:41:34 He eats once a day. I've done that. And fasts all through the weekend, which I don't. I don't. It's crazy. I've never done that, but I've done quite a few 24-hours, you know, where I eat at 7pm, I'm done eating, and I don't touch food until 7pm the next day.
Starting point is 01:41:49 It's just water or coffee. Why do you do it, by the way? I do it to shock my system. I think it's good for your system. You know, there's been a lot of research on fasting and the effect it has on telomeres. Dr. Rhonda Patrick spoke pretty recently. There's been quite a
Starting point is 01:42:07 few things that she's written about in terms of fasting and the benefits of fasting. Intermittent fasting is great for weight loss, but just fasting itself, even for several days, most people seem to get some pretty decent benefits out of it. So I dabble in it. I also like the way it makes me feel, to be a little hungry. I think my brain is sharper. Like, I refuse to go on stage full when I do stand-up. And I actually learned this from a Katt Williams interview. He was talking about it, and he's crazy as fuck, but he's hilarious, and he's one of the greats in my opinion. He was in the back of a limo, and he was talking about how he prepares for a show, that he has his music that he listens to, pre-show music, he has like a music list, and then he'll have a drink. No food. He won't eat, because it slows you down. I was like, that'll slow
Starting point is 01:42:57 you down. But sometimes you don't even think about it. It's not like a rule, so you just... man, I'm hungry, I'll just eat. I would way rather... because I can go through a couple of shows. I used to have this faulty idea that if I didn't eat, I would be exhausted to do things. But then I
Starting point is 01:43:10 work out fasted every morning. Every morning, when I get my morning workout in, and whatever the fuck it is, it's usually hard,
Starting point is 01:43:19 I'm always fasted. You can do a lot. It's... you're not at your best. Like, if I was gonna do jiu-jitsu, I don't do jiu-jitsu fasted. I would eat some fruit.
Starting point is 01:43:29 That's an interesting one. 'Cause that was a transformational thing for me. I used to do powerlifting, you see, like five times a day, six times a day, whatever.
Starting point is 01:43:35 More like C.T. Fletcher style. Yeah, kinda. See how big he was? Yeah, back in the day. Bro, he's only maybe like my height, or a half inch taller or some shit.
Starting point is 01:43:44 He was 320 pounds. Is that what he said, 315? Fuck, he was so big. Yeah. The thing is, when you get that big, and I wasn't that big, but it's like hard to move. Oh yeah, it's like not healthy. Did you see the image of him from yesterday? I didn't see the image. Jamie put up a photograph of him at 315 pounds next to him at like 200-ish. It's incredible how big he was. I mean, his arms were my legs, and they were just coming out of his shoulders. So that was a big moment for me. There he is. There's the pictures. Look how big he was when he was a world champion. I mean, just insanely huge. Wow.
Starting point is 01:44:33 Yeah. So when you started training in jiu-jitsu, look at that. And the one on the right, dude, he's 50. Look at that. Wow. All natural, too. All natural at 50. Crazy. Some fucking genetics, son.
Starting point is 01:44:48 That's some good genes. And hard work. Oh, yeah. Hard work. Obsessive, not just hard work. I mean, you have to be a fucking maniac. Yeah. But the fact that his body holds up like that at 50 is incredible.
Starting point is 01:44:57 Yeah, he's an inspiration. But for me, switching from that to jiu-jitsu, I thought there's no way. Because I train hard. I train twice a day, jiu-jitsu, for a while. Were you doing two rolls a day? Two rolls. Or were you doing technique and drills at one time? Listen, I'm Russian.
Starting point is 01:45:13 I love drilling. You just go hard, huh? I'm upset. No, no, no, no, no. Russian. Drilling. Let me explain to you something. Technical.
Starting point is 01:45:18 What do you want to explain to me? I'm trying to explain to you the difference between Russian and American. Okay. American, in wrestling and a lot of combat sports, is like heart and guts and hard work. And Russian, certainly in wrestling, is technique, is drilling. They put a lot more hours in than Americans do, at less than 100% effort. So like really drilling, really getting that right. Like, I love that. In fact, one of the problems is I haven't been able to really ever... I was always the last one to
Starting point is 01:45:50 get bored at drilling. Oh, you got to find a good drilling partner, like an obsessed one. Yeah, and a shout-out to Sarah Block, a judo lady who was a black belt in jiu-jitsu as well, that was willing to put up with like hundreds or thousands of throws that we each did. So like, that obsessive mind... I mean, I love that kind of stuff, because I think that's where you get better. Yeah, yeah, that's where... not everybody believes that. Some people believe, especially in jiu-jitsu, like you can't really get timing from drilling. I believe you can get everything from drilling, the timing, the... because, as long as... the other part, it's not just aimless drilling.
Starting point is 01:46:30 It's... your mind is in it. Your brain should be exhausted by the end of it too, because you're visualizing the whole thing. You're like going through, you're imagining how your opponent would – it's really strengthening your imagination while you're also doing the drilling. I couldn't agree more. Yeah, I firmly believe you can get way better drilling, and when I went from,
Starting point is 01:46:50 I think, blue belt to purple, I did like the most drilling that I ever did, ever, and that's when I grew the most. That's when my technique got way better. That was also when I became friends with Eddie Bravo, and Eddie Bravo is a huge driller. He is. Oh, he drills, man. They drill like crazy, and they do a lot of live drills, and they do a lot of pathway drills, where they'll do a whole series of movements and then an escape, and then the reversal. These are long pathways, so that when you're actually in a scrap and you're rolling, you recognize it.
Starting point is 01:47:26 Like, okay, here it is. I'm passing the guard. I'm moving to here, and now he's countering me, but I'm setting up this. And these pathway drills, it's so critical because it comes up over and over and over again when you're actually live rolling. You feel it. You feel like, oh, I've been here before.
Starting point is 01:47:41 I'd be curious actually to hear, I don't think I've ever heard you talk about how your game... has my game changed significantly from white belt to blue belt to purple belt? It started to solidify. But I'd be curious to hear, how did your game change since you met Eddie? Game meaning jiu-jitsu. Well, most of my game came from Eddie, like 99-point-something percent
Starting point is 01:48:06 of it, almost all of it, and Jean Jacques, those two. So it's like, I was a blue belt before I was friends with Eddie, but I was terrible. Like, what guard do you prefer, for example? Well, I do rubber guard. I'm very flexible, so rubber guard is no issue with me, and I think it's incredibly effective if you're good at it. And, you know, you get stuck under a guy like, what is his name, Jeremiah Vance, one of Eddie's black belts, who's a murderer from his back. His rubber guard is insane. Eddie's rubber guard's insane. I mean, obviously he tapped Royler Gracie. He has a ridiculous guard. He caught him in a triangle.
Starting point is 01:48:47 But there's a lot of people that understand it now, a lot of people that know how to do it. It's a real art form. And the thing about it versus other guards is when you're in a position like mission control and you – you know, Vinny Magalhaes is phenomenal at it. I mean, he – what's that? I just pulled up a video of him. He fucked this guy up quick. Yeah, watch it.
Starting point is 01:49:06 This is Jeremiah. Jeremiah Vance is one of Eddie's best. Look at this. From the bottom, bam, he does that all the time. Triangle from the bottom, off rubber guard. That guy's wrapped up. That's out cold. He does this all the time.
Starting point is 01:49:19 He's one of Eddie's best rubber guard assassins. And if you watch his technique, it is fucking sensational. He also has great leg locks, too. But the thing is that when he'll attack from his legs, and he'll tap people with a leg lock, but if they escape, sometimes they'll escape. Oh, this dude's in deep shit right here. But now he's going to take his back.
Starting point is 01:49:38 But if they escape, oftentimes he's on the bottom, and when you're on top of him, it's one of the worst places in the world to be. His guard is fucking incredible, and it's because of that. See that grip? See how he's holding the rubber guard in position? That's mission control. Mission control from a guy like Jeremiah is fucking ruthless, because he has his arm and his legs controlling your neck and your posture. And then he's going to a gogoplata here, and he's phenomenal at this too. He's going to get him in a gogoplata or another plata, he's going to
Starting point is 01:50:10 flip him over, now he's attacking the leg. Like, it's just constant, and it never ends. Did he invent this kind of system? Well, he invented the initial stage of, like, setting up mission control and... oh, this guy is getting fucked up. Oh my god, that's horrible. But Eddie invented a series of pathways from mission control to set up various techniques: armbars, triangles, all these different things. But there had been people that had toyed with doing high guard, like Nino Schembri. He did a lot of rubber guard-esque stuff. There was a lot of things that people did, but Eddie has his own pathway and his own system, and then there's a lot of guys that
Starting point is 01:50:50 branch off from that system, like Jeremiah, like Vinny Magalhaes, that have their own way that they prefer to set various techniques up. But what's really good about that, if you have the flexibility, is that when you're on the bottom, not only is it not a bad place to be, but you could put someone in some real trouble. You're holding onto your ankle and using your leg, which is the strongest fucking limb in your body, right? Pulling down on someone with your leg, clamping down with your arm, and then you get your other leg involved. Good luck getting out of that. Good luck.
Starting point is 01:51:26 It fucking sucks, man. So you have control, but you're also able to move at the same time. Yes, exactly. It's really interesting. Has anybody ever put you in mission control before? No, I haven't competed or against many. But even in, like, someone in class? Like, show it to, explain it to them?
Starting point is 01:51:39 Yeah, lower ranks have. Yeah. Once you feel it, you go, oh, shit. I remember it being... you know, when somebody does a nice move on you, especially like a lower rank, your first reaction is like, oh, this would never... like, you're annoyed. Yes, it's the natural process of the ego, of course, getting rid of... you know, you see something new and you're like, yeah, this is stupid, next time it won't work. But then you start to understand a little more. I remember it being a really powerful controlling position.
Starting point is 01:52:08 It's powerful. And if you have a good offensive attack from there, it's powerful as well. There are transitions. Especially a guy like Jeremiah who's really flexible. You know, he can pull off gogoplatas and all sorts of other things. The locoplata, it's another one that they do, is one where you push with your other foot on the heel. It's so nasty.
Starting point is 01:52:31 You're holding the back of the foot across the back of the neck, and so your shin is underneath someone's throat, and then you're pushing that shin with your other heel while you're squeezing with your arm. It's ruthless. It's ruthless. And they do a gable grip around the head when they do this as well sometimes too, so it's just a fucking awful place to be.
Starting point is 01:52:50 It's not as good as being on top, right? If you have a crushing top game, that's the best, if you can get to that position. But you can't always get to that position. So there's guys like Jeremiah that even from the bottom, they're horrific. It's dangerous. As dangerous as from the top for most people. Do you find just when you trained back in the day from the bottom. They're horrific. It's dangerous. As dangerous as from the top for most people.
Starting point is 01:53:10 Do you find just when you trained back in the day and you still train, do you spend more time on bottom or top? You always should start, I feel like, you should always start on the bottom, earn the top position. This is something that Eddie always brought up too because, you know, you like to like, it's fun to be on top. So a lot of times it's like this mad scramble to see who could force who onto their back, right? Because when you're on top, you can control them.
Starting point is 01:53:31 You can pressure them. You know, you play that strong man's jiu-jitsu. But the problem is a strong man's jiu-jitsu, I'm only 200 pounds. I'm not a big guy. Like, so if you go to the real big guy, like I'm rolling with a 240-pound guy, I'm not going to get to that spot. Like, I better have a good guard. Otherwise, I can't do anything right when someone's bigger than you and stronger than you you i mean that's what hoist gracie basically proved to the world like as long as you have technique it doesn't matter where you are yeah but if you only have top game which a lot of people do a lot of people
Starting point is 01:54:00 only have top game, you know, you're kind of fucked if you wind up on your back. We see that a lot with wrestlers in MMA. As wrestlers, they can get on top of you, and they'll fuck you up. They'll strangle you. They'll take your back. They'll beat you up from the mount. But they don't have nearly the same game when they're on their back. And then there's guys like Luke Rockhold, who's like an expert at keeping you on your back.
Starting point is 01:54:21 He's one of those guys, when he gets on top of you, you're fucked. He's got a horrible top game. I mean horrible in the sense of if you're your back. He's one of those guys, when he gets on top of you, you're fucked. He's got a horrible top game. I mean horrible in the sense of if you're his opponent. He's going to beat the fuck out of you before he strangles you. His top game is insane. Yeah, I hate the feeling. Some people make you just feel the weight, make you suffer for everything you do on bottom.
Starting point is 01:54:44 People that are able to do that are truly humbling. Yeah, wrestlers in particular. Wrestlers are so good. Did you see that Jordan Burroughs-Ben Askren match? Last night. Fucking incredible. How good is that guy? Jordan Burroughs.
Starting point is 01:55:13 Phew. Yes. To do that to a guy like Ben Askren? I mean, it shows you. Ben hasn't competed, I think, in nine years. True, but Ben is one of the greatest. I mean, I'm a huge fan of his wrestling. It's so interesting. I think that is like the worst matchup for Ben Askren, because you're taking one of the most creative wrestlers ever in Ben Askren. I don't want to overstate it, but he is incredibly creative.
Starting point is 01:55:27 One of the great pinning wrestlers. So he pins people. He confuses them and pins them incredibly well. And you put him against basically a freak blast double, like the greatest double leg takedown. Maybe of all time. Of all time. Somebody put a clip up that said, is this it?
Starting point is 01:55:47 Yeah. Somebody put a clip up. Oh, shit. He went off the fucking mat into the crowd. That's pretty far. That was the best part. He defended a takedown. That was the best part.
Starting point is 01:55:55 But that's crazy, man, that they have such a drop off with these guys. Like, you shouldn't really have a platform like that where a guy can fall off into the crowd. That seems so stupid. It rarely happens. What the fuck are you talking about? It just happened. Rarely happens. They rarely have these.
Starting point is 01:56:13 It's true. This just happened. That's a terrible thing. Have that shit flat on the ground. That is so dumb. I can't even believe they did that. I think this whole match should be contested. It doesn't count.
Starting point is 01:56:22 Well, I don't, you know, I think, look, that's stupid. That's not smart to have a guy who's a fucking powerhouse of a blast double hitting you and sending you flying. That's crazy. That is crazy that they didn't have anything in place to stop that. That's the reason why wrestling takes place on the ground, you fucking assholes. Why are you having people wrestle on a platform? That's crazy.
Starting point is 01:56:50 It's a show. It's a show. You can have a show where it's on the ground. It's called basketball. Yeah. Yeah, it's on the ground. I mean, it was worrying because Ben Askren is an MMA fighter and you get injured with that. Fuck, right there.
Starting point is 01:57:02 Right there. It could have torn his knee apart easily. Well, the silver lining is that he's okay. Yeah, the silver lining. And we got to see that. You know, it's interesting, Jordan Burroughs had on his Instagram, "there's levels to this," you know, as they're raising his hand up. And that's what we got to see, because Ben is a phenomenal wrestler, but you're right, he hasn't competed in a long time. He's not necessarily at the level that he was back then, even though he's incredible by MMA standards. It's good to see that with boxing, it's good to see that with anything. Like when Floyd Mayweather fought Conor, I think it was good to see that there are really levels to this. And the
Starting point is 01:57:38 interesting thing about Jordan Burroughs, I think he's so good that he's probably going to stay out of MMA. That's so crazy. But there are wrestlers... Here's some clips of it. I can't... I'm not going to show this on YouTube. Yeah, we can't show it to you people.
Starting point is 01:57:51 But who put this on? FloWrestling. FloWrestling put this on. I wonder if people are pirating it online, or if they put it online,
Starting point is 01:57:59 if they're allowing it. No, they... well... People are pirating it? Yeah. Yeah. Okay. Good luck. Yeah, good luck stopping that. Right. Well,
Starting point is 01:58:08 I think people should support FloWrestling, though. They do have like a... I'm a member. Are you? Oh, look at this. Look at this. God, he's good. Yeah, man. So we're watching this, ladies and gentlemen who are just listening. It's probably boring as fuck for you. But Jordan Burroughs is one of the best wrestlers America's ever produced. Three-time world champion. Yeah, tragically lost in the previous Olympics, and he's back at it again. I wonder if he's ever considered MMA. I know there was some talk about it, but I wonder if he ever really... I think at this point, he is basically a no, but there are a few terrifying people, especially on the Russian side, that I think the heavyweight division and UFC
Starting point is 01:58:56 should be really worried. I don't know if you heard about the Russian tank, the 22-year-old from Dagestan. No, who's this guy? He's a wrestler? A wrestler. He's going to fight MMA? No, he will after 2020 is what his expectation is.
Starting point is 01:59:14 For now, he's probably going to be the greatest wrestler of all time. Really? Him against Kyle Snyder, those two heavyweights. Kyle Snyder's American. Another guy. Is this it right here? The tank of Dagestan. How do you say his name?
Starting point is 01:59:28 It says Abdulrashid Sadulaev. Abdulrashid Sadulaev, 22 years old. Abdulrashid Sadulaev. And Kyle Snyder. You can do Kyle Snyder versus... What a great name. Abdulrashid Sadulaev. Sadulaev.
Starting point is 01:59:42 That is Russian as fuck. So Snyder is 23 years old, and he is another incredible person who will do MMA. And that competition between Snyder, I mean, look at the thickness. These guys are monsters, and they're not just. How much do these guys weigh? 97 kilograms. What is that? 220?
Starting point is 02:00:07 Yeah, 220, under 215, but they cut for it, right? Right. This is just under heavyweight. These guys are incredibly good. Do you think they would compete at 205 if they were going to fight in MMA? These are heavyweights. So they would just gain the weight. These are still boys.
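For the record, the 97-kilogram class they're converting in their heads works out as follows (using the standard factor 1 kg ≈ 2.20462 lb), which is why "under 215" is the closer of the two guesses:

```python
KG_TO_LB = 2.20462  # standard kilogram-to-pound conversion factor

weight_kg = 97  # the freestyle weight class mentioned above
weight_lb = weight_kg * KG_TO_LB
print(f"{weight_kg} kg ≈ {weight_lb:.1f} lb")  # → 97 kg ≈ 213.8 lb
```

So 97 kg is about 214 pounds, just under the UFC's 205-pound light heavyweight limit plus a typical cut, consistent with the point that these wrestlers would fill into heavyweight.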
Starting point is 02:00:23 Oh. 22, right? Right. They still haven't gotten the full, like... Yeah, I wonder that about UFC fighters that are thickening up as they get older. I wonder how many of them are damaging their body by cutting weight. Yeah, that's a thick fella. So right now we're just seeing mostly stalemate, and that's from the American guy. And the... yeah, is there like a highlight reel of his or something? Yeah, there is, but he's pretty young. He's, I think... so he's an Olympic
Starting point is 02:01:26 champion. I mean, he comes from the whole line of the Saitiev brothers and all the Dagestani wrestlers. There are so many good fighters coming out of Dagestan right now, and all technicians. Yeah, it's incredible. It's incredible. Whatever is in the water. And then different styles too, like Zabit. Zabit's style is very different than a wrestling-heavy style. Look at this guy, man. Jesus Christ. Oh my god, what a scramble. So this is Abdulrashid... Rashid... Abdulrashid... call him... Sadulaev. No, don't tell me how to say it. I'll figure it out. I don't know. Abdulrashid.
Starting point is 02:01:35 Abdulrashid Sadulaev. Sadulaev. And you know what, there's a poetic... damn. There's a poetic nature to these guys.
Starting point is 02:01:43 I mean, they're just like Khabib, really. I mean, they're simple, good people. Right. They're pretty religious, and
Starting point is 02:02:18 they just kind of... they don't even believe in fame, they just believe in excellence. Well, you know, that mindset behind them was sort of evident at the end of that fight with Conor, where they went crazy and he jumped into the crowd. He's like... he's not playing games. He's not doing this for Instagram likes or for, you know... This is really... he takes trash talking and all that stuff very seriously. This is all about honor for him.
Starting point is 02:02:18 But don't do that. Yeah. Don't do that. And also respect, I'd hate to say it, but I think there's a certain ethic and honor to the way Conor McGregor carries himself too. All that trash talk, if you look at the end of the fights. He's very kind. He's very kind and respectful in defeat and win.
Starting point is 02:02:37 It's a different culture. You compare the Dagestani versus Irish culture, it's just different culture, and you have to respect that. I think Khabib, to be honest, disrespected Conor's culture as much as Conor disrespected Khabib's. I get what you're saying. But, I mean, when he was done with the fight, he didn't keep attacking Conor. It was people in the audience that were talking shit, that were training partners.
Starting point is 02:03:00 Emotions fly high. Yeah, and he had heard that for weeks, for months. He was done. He was like, fuck you, I beat his ass.
Starting point is 02:03:08 Now I'm gonna beat your ass. And he just said, I'm not playing games, and he jumped into the fucking crowd. I think security could have been handled far better, and will be in the future, to prevent things like that
Starting point is 02:03:17 from happening, where people just jumped into the cage. And, you know, I... I hate seeing that shit.
Starting point is 02:03:27 But I appreciate where he's coming from. I mean, that's who the fuck that guy is, man. That's one of the reasons why he's so good, is that he does have that mindset. It's one of the reasons, man, one of the reasons why he's so relentless. Like, he's not playing games. He is who he is. What you see is what you get, and what you get is a killer. And he's there to smash. He is.
Starting point is 02:03:42 I would have loved to see Conor McGregor versus Khabib before the Mayweather fight. Like, before Conor got... I think the money makes you less hungry. Oh, for sure. Dude, he ain't hungry at all. I mean, he's got $100 million. But I think he still loves to compete.
Starting point is 02:04:05 But he has no hunger anymore. Like, there ain't no hunger. I mean, he might be hungry for success, but there's no desperation. I don't know if that's... I know what you're saying. Like, he has a lot to lose now, too. It's a different thing. He enters into a fight with $100 million in the bank. It's a very different experience than entering into the fight with $1 million
Starting point is 02:04:23 and hoping that you could make three more tonight. Like many, I'm sure, fights that he's had in the past. It's a different world. He can do whatever he wants forever. I mean, once a fighter, though, always a fighter. I mean, there is an element there that he still wants glory. I believe. He's still only 30.
Starting point is 02:04:41 Yeah. He can still do it, yeah. I mean, I think. How old's Connor? That's right. At the most, he's like 32 or some shit. 30. 30, yeah.
Starting point is 02:04:51 He's young, man. To be set for the rest of your life at 30 is kind of fucking bananas. And I don't think he's at his peak as a fighter. So if he just decides, I don't give a fuck about the money, I'm here to leave a legacy, and I'm going to just train like a fucking demon, and he kicks aside all of the bad influences and all the distractions in his life and just focuses on training... he's a motherfucker, man. You saw what he did to Aldo.
Starting point is 02:05:20 You saw what he did to Chad Mendes. You saw what he did to Dustin Poirier. He is a bad motherfucker, period. I know you're gonna shut this down, as most fans do, but if he drops everything and goes to, like, Siberia to train, I would love to see him and Khabib too. Well, there's nothing... That's my friend Hans Molenkamp and Conor sparring, just fucking around. Powerful Onnit logo in the background. It's like a goddamn Onnit ad. Yeah. I mean, he's always gonna have a problem with Khabib.
Starting point is 02:05:49 Khabib's wrestling is so high-level. It's so different. He smothers you in a way that... You think you have
Starting point is 02:05:58 good takedown defense till you run into that motherfucker, and he just gets a hold of everyone. He does it to everyone, whether you're Michael Johnson or Edson Barboza,
Starting point is 02:06:09 no matter how good your takedown defense looked in the past. In the Barboza fight, he just basically waded toward him, waded through the fucking fury of leg kicks and punches, and just clamp, drag, smash. And that's what he does to everybody, man. The real thing about a guy like him would be seeing him against a guy like Jordan Burroughs. Like, could he do that to a guy who is a spectacular wrestler as well? Then it becomes... I mean, his striking, which has gotten very high-level... he's very dangerous striking. He dropped Conor. I mean, he can light people up. He stopped... there
Starting point is 02:06:45 were some... He's stopped a few people with strikes. He's dangerous. He's dangerous enough on the feet that you would have to... I don't know how many really high-level grapplers also have, like, striking that can stand with them, because if he decided to keep it up, he'd have an advantage there until they, you know, got good at it. Him versus Ben Askren would be very interesting. Well, he would have an advantage in striking over Askren. In wrestling, I don't know. No.
Starting point is 02:07:13 Askren's a big fella, too. Are they the same weight? No. Oh. He's 155; Askren's 170. Okay. But Askren could probably make 155 if you tortured him. He's got a dad bod, though.
Starting point is 02:07:26 And he's... How rude. No, he's proud dad bod. He is. He does have... He's proud of his body. I think he was that way in college, too. He was never...
Starting point is 02:07:36 He was never like Brock Lesnar. No. And he was... Super technical. And he's strong as hell, though, according to everybody. Everybody that rolls with him says he's fucking ridiculously strong. You sometimes say artificial life instead of artificial intelligence. Yeah, because I think that it's a life form.
Starting point is 02:07:54 Is that a stupid way to look at it? I was curious how you think about artificial intelligence. What do you picture? I picture human beings being like electronic caterpillars that are building a cocoon that they have no real knowledge or understanding of, and through this, new life forms can emerge: a life form that doesn't need cells, mating, X and Y chromosomes... doesn't need any of that shit. It exists purely in software and in hardware, in ones and zeros, and this is a new form of life. And this is when the inevitable rise of a sentient being... the inevitable... I mean, I think if we don't get hit by an asteroid, within a thousand years or whatever
Starting point is 02:08:47 the time frame is, someone is going to figure out how to make a thing that just walks around and does whatever it wants and lives like a person. That's not outside the realm of possibility. And I think that if that does happen, that's artificial life, and this is the new life, and it's probably going to be better than what we are. I mean, what we are is basically... If you go back and look at, you know, 300,000, 400,000 years ago, when we were some Australopithecus-type creature, how many of them would ever look at the future and go: I hope I never get a Tesla. The last thing I want is a fucking phone. The last thing I want is air conditioning and television. The last thing I want is to be able to talk in a language that other people can understand, and to be able to call people on the phone.
Starting point is 02:09:27 Fuck all that, man. I like living out here, running from jaguars and shit and constantly getting jacked by bears. I wouldn't think that way. And I think if something comes out of us and makes us obsolete, but it's missing all the things that suck about people... I mean, what are the things that suck about people? Hate, war, violence, thievery, people stealing things from people, people robbing people. Here's the thing... those dark parts of human nature, I think... the suffering, injustice...
Starting point is 02:10:07 I think all of that is necessary for us to discover the better angels. I don't think you can, let's talk about sentience and creating artificial life, but I think even those life forms, even those systems need to have the darker parts. But why is that? life forms, even those systems need to have the darker parts. Pete But why is that? Is that because of our own biological limitations and the fact that we exist in this world of animals where animals are eating other animals and running? You always have to prepare for evil. You have to prepare for intruders. You have to prepare for predators. And this is essentially like this mechanism is there
Starting point is 02:10:44 to ensure that things don't get sloppy. Things continue to evolve. Look, if the jaguars keep eating the people and the people don't figure out how to make a fucking house, they get eaten, and that's it. Or you figure out the house, and then you make weapons. You fight off the fucking jaguar. Okay, great, you made it. You're in a city now.
Starting point is 02:10:59 See, you had to have that jaguar there in order to inspire you to make enough safety so that your kids can grow old enough that they can get information from all the people that did survive as well. And they can accumulate all that information and create air conditioning and automobiles and guns and keep those fucking jaguars from eating your kids. Right. This is what had to take place as a biological entity. But once you surpass that and once you become this thing that doesn't need emotion doesn't need you know doesn't need conflict it doesn't need to be inspired it never gets lazy it doesn't have these things that we have built into us as a biological system if you looked at us as wetware operating software it's it's not good software right it's software designed for cave people
Starting point is 02:11:46 and we're, you know, we're just trying to force it into cars and force it into cubicles. But part of the problem with people and their unhappiness is that all of these human reward systems that have been set up through evolution and natural selection, these instincts to stay alive, are no longer relevant in today's society. So they become road rage. They become extracurricular violence. They become depression. They become all these different things that people suffer from.
Starting point is 02:12:17 So that's one perspective. Yes. That basically our software, through this evolutionary process, was necessary to arrive at where we are, but it's outdated at this point. Well, it's necessary for us to succeed. To succeed in a purely, almost Darwinist way, in the sense that we survived evolution. Especially since we're so weak. I mean, really, we became this weak because we got so good at protecting ourselves from
Starting point is 02:12:40 all the bad things. Okay, the other perspective is that we're actually incredibly strong, and this is the best that the universe can create, actually. We're at the height. We're at the height of creation. There's a beauty in this tension, in this dance between good and evil, between happiness and depression, life and death. And through that struggle, that's not just a useful tool to get us from jaguars to cities,
Starting point is 02:13:07 but that is the beautiful thing, that is what the universe was built for. That is the height. Our current evolution and the creation that results from it is the height of creation. And the way things operate is not something that's far from optimal. It's not something that sucks; it is very good, very optimal, and hard to beat. In the sense that, for example, mortality, right? Is death important for creation?
Starting point is 02:13:50 Is death important for us human beings, for life, for us as a society? Is it important for us to die? Like, if you could live forever, would you live forever? Well, I think you'd miss out on the possibility that there is something. I had this conversation with C.T. Fletcher yesterday because, you know because he survived a heart transplant a year ago, a year and two days ago. I think it's – what do you think? I think mortality is essential for everything. I think the end – we need the end to be there right but do you think that we need
Starting point is 02:14:26 the end to be there for the overall health of the human race or the world of the all the organisms on earth or do you think we needed to be there because there's something else do you think there's something else that happens to you when your body stops existing? Do you think your consciousness transcends this dimension? I think I'm not smart enough to even think about that. That's a great answer. I think everybody on earth has that exact same answer, if they're being honest. So you talked about atheism and so on. I used to think atheism means what I just said.
Starting point is 02:15:05 Right. But it's more... we know so little. So the only thing I know is the finiteness of life... The Broadway Jiu-Jitsu school that I train at has this poster at the entrance, which is a Hunter S. Thompson quote, which is... About skidding into death sideways? That's a good one. No: for all moments of beauty, many souls must be trampled,
Starting point is 02:15:33 something like that. That's a fucking great quote. God, I love that guy. Yeah. So basically, for beauty you have to have suffering. I do not disagree with you. I do not disagree with any of the things you said. And I think there's always a possibility that human beings are the most advanced life form that's ever existed in the cosmos. There's always that. That has to be an option if we are here, right? If we can't see any others out there... and even though there's the Fermi paradox, and all this contemplation that if they do exist, maybe they can't physically get to us, or maybe they're on a similar timeline to us. And it's also possible, as crazy as it might sound, that this is as good as it's ever gotten anywhere in the world, or anywhere in the universe rather: that human beings right now in 2019 are as good as the whole universe
Starting point is 02:16:28 has ever produced. We're just some freak-luck accident, and everybody else is throwing shit at each other. Right? There's fifteen-armed caterpillar people that live on some other fucking planet, and they just toss their own shit at each other and never get any work done. But we might be that. But even if that's true... even if this beauty that we perceive, even if this beauty requires evil to battle, and requires seemingly insurmountable obstacles you have to overcome, and then through this you achieve beauty... that beauty is in the eye of the beholder, for sure. Objectively, the universe doesn't give a fuck if Rocky beats Apollo Creed in the second movie. It doesn't give a fuck. It's nonsense. Everything's nonsense when you look at the giant-ass picture of what beauty is. What beauty is it if the sun's gonna burn out in five billion years? What beauty is it if there could be a hypernova next door that just cooks us? Oh, so, I mean, that's like the book Sapiens. Yeah. That basically... one of the
Starting point is 02:17:40 things we've created here is we've imagined ideas that we all share: ideas of beauty, ideas of truth, ideas of fairness. We've all created them together, and they don't exist outside of us as a society. No, they only exist to us. But to us, they do exist. And this is where I think the beauty of being a person truly lies. It lies in us, our appreciation of us. We appreciate people in a profound way. Like we were talking about Hendrix.
Starting point is 02:18:11 I don't know how many hours of Hendrix I've listened to. Or Richard Pryor: how many hours of Richard Pryor I watched, and how much that affected me as a kid. Watching Live on the Sunset Strip, that's what got me into doing stand-up comedy. We affect each other. C.T. Fletcher, who was on the podcast yesterday, who's this incredibly inspirational guy... you watch his videos, you want to lift the fucking world and throw it into space. You know what I mean? He's so powerful. We appreciate each other. We appreciate people. So all those things you're saying are real. Like, for us, they're real. For us. My concern is not that. My concern is that we are outdated. My concern
Starting point is 02:18:53 is not that there's not beauty in what we are. I am a big appreciator of this life. I appreciate human beings in this life, and human beings and their contributions. And as I get older, particularly over the last few years, I started doing a lot of international travel, and I fucking appreciate the shit out of all these people that are living in this different way, with weird language and shit, weird smells and foods. And I like to think, what would it be like if I grew up here? Like, these are just people, but they're in this weird sort of mode, you know? I think we're insanely lucky that we have this enthusiasm for each other. Like, for your work, man, I have this deep enthusiasm for what you do. I'm fascinated by it. I love being able to talk to you and pick your mind about
Starting point is 02:19:44 like, you're out there coding these fucking vehicles that are driving themselves. Artificial life on wheels. I don't think any other animal appreciates each other the way people do. I mean, I might be wrong. People do, right? Yeah. I might be wrong about dolphins and whales. Maybe they love each other just as much as we do, just in a different way. But where does AI fit into that? So you're worried... I'm worried that we are Australopithecus, and AI is going to come along and make us look stupid. The only reason why Australopithecus would be cool today is if we found a gang of them on an island somewhere. We'd be like, holy shit, they survived. They never evolved. They're
Starting point is 02:20:23 on this island just cracking coconuts and just eating fish, whatever they can catch. That would be amazing. But every undocumented or undiscovered, uncontacted tribe, they're all Homo sapiens, all of them. So it's like, you know. So what do you picture? Because we have to look at Boston Dynamics robots because you said walking around.
Starting point is 02:20:44 Yeah. I'd like to get a sense of how you think about, and maybe I can talk about where the technology is, what that artificial intelligence looks like in 20 years, in 30 years, that will surprise you. So you have a sense that it has a human-like form? No. I have a sense that it's going to take on form the same way the automobile has. If you go back and look at... like, C.T. Fletcher has a beautiful old patina pickup truck. What did you say it was from, like '58 or some shit? '60? Anyway, old-ass, cool, heavy metal, you know, those sweeping round curves those old-school pickup trucks had.
Starting point is 02:21:30 Now look at that, and look at a Tesla Roadster. What in the fuck happened? What in the fuck happened? I'll tell you what happened: they got better and better and better at it. They figured out the most effective shape. If you want a motherfucker to move... That little car... have you seen that video where they have the Tesla Roadster in a drag race, or in a race against a Nissan GTR? It's a simulated video, but it's based on the actual horsepower of each car.
Starting point is 02:21:53 I don't know if you've ever driven a Nissan GTR, but it is a fucking insane car. It's insane. This is a CGI version of what it would look like if these two cars raced against each other. So the car on the Nissan GTR, do it from the beginning. There it goes. Look how fast this thing pulls away. The Nissan GTR is fucking insanely fast, man. Insanely fast.
Starting point is 02:22:18 But this Tesla is so on another level. It's so in the future that it's not even close. As the video gets further and further, you see how ridiculous it is. It's essentially lapping that car. It's going to go, look how far away it is. Bye. See ya. Just pull it away.
Starting point is 02:22:34 So you're saying the human race will be the Nissan here. Exactly. We're not even going to be the Nissan. We're going to be C.T. Fletcher's pickup truck. This is the future. There's not going to be any limitations in terms of bipedal form, or wings or not having wings, or whether you can walk... I mean, there's not gonna be any of that shit. We might have a propulsion system, or it might... It's not gonna be us, and
Starting point is 02:22:56 they might design some sort of organic propulsion system, like the way squid have, and shit. Who the fuck knows. But it could also operate in the space of language and ideas. So, for example, I don't know if you're familiar with OpenAI. It's a company. They created a system called GPT-2, which does language modeling. This is something in machine learning where you basically, unsupervised, let the system just read a bunch of text, and it learns to generate new text. And they've created this system, GPT-2, that is able to generate very realistic-sounding
Starting point is 02:23:34 text. Not just sounding: when you read it, it seems like a person. And it raises a really interesting question. So, talking about AI existing in our world: it paints a picture of a world in five, ten years plus where most of the text on the internet is generated by AI, and it's very difficult to know who is real and who's not. Yeah. And one of the interesting things, and I'd be curious to get your thoughts on this: what OpenAI did is they didn't release the code for the full system. They only released a much weaker version of it publicly. So they only demonstrated it.
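The language-modeling loop described here, just reading raw text and learning to continue it, can be sketched in miniature. What follows is only a toy character-level n-gram model for illustration; GPT-2 itself is a large transformer network trained on web-scale data, and none of this code reflects OpenAI's actual implementation:

```python
import random
from collections import Counter, defaultdict

def train_char_lm(text, order=3):
    """'Unsupervised' training: read the raw text and count, for each
    context of `order` characters, which character follows it."""
    model = defaultdict(Counter)
    padded = "~" * order + text          # '~' pads the start of the text
    for i in range(len(text)):
        context = padded[i:i + order]
        model[context][padded[i + order]] += 1
    return model

def generate(model, order=3, length=50, seed=0):
    """Sample one character at a time, each conditioned only on the
    previous `order` characters of the output so far."""
    rng = random.Random(seed)
    out = "~" * order
    for _ in range(length):
        counts = model[out[-order:]]
        if not counts:                   # context never seen: stop early
            break
        chars, weights = zip(*counts.items())
        out += rng.choices(chars, weights=weights)[0]
    return out[order:]

corpus = ("the system reads a bunch of text and learns to generate "
          "new text that sounds like the text it has read")
lm = train_char_lm(corpus)
print(generate(lm))
```

Scaled up from character counts to billions of learned parameters, this same predict-the-next-token loop is what makes the output read as if a person wrote it.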
Starting point is 02:24:12 And so they felt that it was their responsibility to hold back. Prior to that date, everybody in the community, including them, had open sourced everything. But they felt that now at this point part of it was for publicity they wanted to raise the question is when do we hold back on these systems when they're so strong when they're so good at generating text for example in this case or at deep fakes at generating fake joe rogan faces jamie just did one with me on Donald Trump's head. Yeah. It's crazy.
Starting point is 02:24:47 And this is something that Jamie can do. He's not even a video editor. Yeah, we were talking about it before the show. We could go crazy with it if you want. It is one of those things where you go, where is this going to be in five years? Because five years ago, we didn't have anything like this. Five years ago, it was a joke, right?
Starting point is 02:25:04 Exactly. And now it's still in the gray area between a joke and something that could, at scale, transform the way we communicate. Do you ever go to Kyle Dunnigan's Instagram page? Of course. One of the best. Look at that. It's me. It's killing me.
Look, it's killing me. Look at this. It looks so much like I'm really talking, and it looks like what I would look like if I was fat. And, you know, of course, that's really good. And it could be improved significantly, and it could make you say anything.
Starting point is 02:25:38 So there's a lot of variants of this we can take. Like, for example, full disclosure: I downloaded your face. The entire... like, I have a data set of your face. I'm sure other hackers do as well.
Starting point is 02:25:50 How dare you? Yeah. So, for this exact purpose. I mean, if I'm thinking like this, and I'm very busy, then there's other people doing exactly the same thing. For sure. Your podcast happens to be one of the biggest data sets in the world of people talking, in really high-quality audio, with high-quality 1080p, for a few hundred episodes, of people's faces. The lighting could be better. We're doing that on purpose. We're making it degraded. We're fucking it up, you hackers.
Starting point is 02:26:21 And the mic blocks part of your face when you talk. That's right. So the best guests are the ones where they keep the mic... The deepfake stuff I've been using removes the microphone within about a thousand iterations. It does it instantly. It gets rid of it, paints over the face. Wow. Yeah. So you could basically make Joe Rogan say anything. Yeah. I think this is just one step before they finagle us into having a nuclear war against each other so they can take over the Earth. What they're going to do is they're going to design
Starting point is 02:26:49 artificial intelligence that survives off of nuclear waste, and so then they encourage these stupid assholes to go into a war with North Korea and Russia, and we blow each other up, but we leave behind all this precious radioactive material that they use to then fashion their new world. And we come back a thousand years from now, and it's just fucking beautiful and pristine, with artificial life everywhere. No more biological; it's too messy. Are you saying the current president is artificial life? I didn't say that.
Starting point is 02:27:17 Okay. What's wrong with that? Because you're saying starting a nuclear war. No, I don't think he's... Imagine if they did do that: they would have had to start with him in the '70s. I mean, he's been around for a long time, and talking about being president for a long time. Maybe the electronics have been playing the long game, and they got him to the position, and then they're going to use all this. On the grand scale of time, it's not really a
Starting point is 02:27:40 long game, the '70s. Well, you know about that Internet Research Agency, right? That's the Russian company that's responsible for all these different Facebook pages where they would make people fight against each other. It was really kind of interesting. Sam Harris had a podcast on it with Renée... how do I say her name? DiResta. Renée DiResta. And then she came on our podcast and talked about it as well. And they were pitting these people against each other. Like, they would have a pro-Texas-secession rally directly across the street from a pro-Muslim rally, and they would do it on purpose, and they would have these people meet there and get angry at each other. They would pretend to be a Black Lives Matter page; they would pretend to be a white Southern pride
Starting point is 02:28:32 page. And they were just trying to make people angry at each other. Now, that's human-driven manipulation. Now imagine... this is my biggest worry about AI, and it's what Jack is working on: the algorithm-driven manipulation of people. Unintentional. Yes. Trying to do good. But like those people... Jack needs to do some jiu-jitsu.
Starting point is 02:28:52 There needs to be... there needs to be some open-mindedness, you know, like really understanding society, and transparency, to where they can talk to us,
Starting point is 02:29:03 to the people in general, about how they're thinking about managing these conversations. Because you talk about these groups: a very small number of Russians are able to control very large amounts of people's opinions and arguments.
Starting point is 02:29:18 An algorithm can do that 10x. More and more of us will go on Twitter and Facebook and digital media. Oh yeah, for sure, for sure. I think it's coming. I think once people figure out how to manipulate that effectively, and really create, like, an army of fake bots that will assume stances on a variety of different issues and just argue into infinity, we're not going to know who's real and who's not. Well, it'll change the nature of our communication online.
Starting point is 02:29:47 I think it might have effects. This is the problem with the future. It's hard to predict the future. It might have effects where we'll stop taking anything online seriously. Yeah, for sure. And we might retract back to communicating in person more. I mean, there could be effects that we're not anticipating totally. There might be some ways in virtual reality we can authenticate our identity better.
Starting point is 02:30:12 So it'll change the nature of communication, I think. The more you can generate fake text, the more we'll distrust the information online, and the way that changes society is totally an open question. We don't know. But what are your thoughts about OpenAI? Do you think they should release or hold back on it? Because this is...
Starting point is 02:30:38 We're talking about AI, so, artificial life. There's stuff you're concerned about: some company will create it, and the question is, what is the responsibility of that? A short video of what it looks like when they just type a small paragraph in here and hit a button. It says how OpenAI writes. What does it say? What did it say, Jamie? Convincing news stories. Okay, so you give it: has already cost the UK economy at least 80 billion since... and then, many industries. So it just fills in those things?
Starting point is 02:31:08 Yeah. So basically you give it, you start the text. Oh, wow. You can say Joe Rogan experience is the greatest podcast ever, and then let it finish the rest. Wow. And it'll start explaining stuff about why it's the greatest podcast. Is it accurate? Oh, look at this. It says,
Starting point is 02:31:25 a move that threatens to push many of our most talented young brains out of the country and onto campuses in the developing world. This is a particularly costly blow. Research by Oxford University warns that the UK would have to spend nearly $1 trillion on post-Brexit infrastructure. That's crazy that that's
Starting point is 02:31:42 all done by an AI that's, like, spelling this out in this very convincing argument. The thing is, the way it actually works algorithmically is fascinating, because it's generating it one character at a time. As far as... you know, you don't want to discriminate against AI, but as far as we understand, it doesn't have any understanding of what it's doing, of any ideas it's expressing. It's simply stealing ideas. It's like the largest-scale plagiarizer of all time, right? It's basically just pulling out ideas from elsewhere in an automated way. And the question is... well, you could argue us humans are exactly that. We're just really good plagiarizers of what our parents taught us, of what came before, and so on.
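Both points above, the one-character-at-a-time generation and the "plagiarizer" framing, can be made concrete with a deliberately tiny stand-in. With a count-based model trained on one short text and greedy decoding (always take the most frequent continuation), the sampler simply regurgitates its training data. Again, this is a toy for illustration, not how GPT-2 is actually built:

```python
from collections import Counter, defaultdict

def train(text, order=4):
    """Count, for each `order`-character context, what follows it."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def greedy_continue(model, prompt, length=100, order=4):
    """Extend the prompt one character at a time, always appending the
    single most frequent continuation seen in training. There is no
    plan and no meaning: just lookup and append."""
    out = prompt
    for _ in range(length):
        counts = model[out[-order:]]
        if not counts:                  # context never seen: stop
            break
        out += counts.most_common(1)[0][0]
    return out

text = ("research by oxford university warns that the uk "
        "would have to spend")
model = train(text)
# Every 4-character context in this tiny corpus has exactly one
# continuation, so greedy decoding replays the training text verbatim.
print(greedy_continue(model, "research"))
```

A model trained on a large slice of the internet rarely copies whole passages this literally, but the mechanism is the same: it recombines continuations it has seen, with no model of what the words mean.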
Starting point is 02:32:26 Yeah, we are for sure. Yeah. So the question is whether you hold that back. Their decision was to say, let's hold it. Let's not release it. That scares me. To not release it. Yeah, yeah. You know why it scares me? It scares me that they would think that
Starting point is 02:32:41 that's like this mindset that they sense the inevitable. The inevitable meaning that someone's going to come along with a version of this that's going to be used for evil. That it bothers them that much. That it seems almost irresponsible for the technology to prevail, for the technology to continue to be more and more powerful yeah they're scared of it they're scared of it getting out right yeah that scares the shit out of me like if they're scared of it they're the people that make it and they're they're called open ai i mean this is the idea behind the group where everybody kind of agrees that you're going to use the brightest minds and have this open
Starting point is 02:33:22 source, everybody can understand it, and everybody can work at it, and you don't miss out on any genius contributions. And they're like, no, no, no, no, no more. And obviously, their system currently is not that dangerous. Not that dangerous. Well, yes, not that dangerous. But if you just saw that, that it can do that? But if you think through what that would actually create. I mean, it's possible for it to be dangerous, but it's not. The point is, they're doing it, they're trying to do it early, right, to raise the question: what do we do here? Because, yeah, what do we do? Because they're
Starting point is 02:33:55 directly going to be able to improve this now. Like, if we can generate basically ten times more content of your face saying a bunch of stuff, what do we do with that? If Jamie, all of a sudden, on the side, develops a much better generator and has your face, does an offshoot podcast, essentially a fake Joe Rogan Experience, what do we do? Does he release that? Because now we can basically generate content at a much larger scale that will just be completely fake. Well, I think
Starting point is 02:34:34 what they're worried about is not just generating content that's fake. They're worried about manipulation of opinion. Right, right. That little sentence that led to that enormous paragraph in that video was just a sentence that showed a certain amount of outrage, and then it let the AI fill in the blanks. Yes. You could do that with fucking anything. Like, you could just set those things loose, if they're that good and that convincing and they're that logical. Man, this is not real, I'll just tell you. Someone already used AI to create a fake Ben Shapiro. I've got the sound. This is a fake Ben Shapiro. With this technology, they can make me say anything, such as,
Starting point is 02:35:21 for example, I love socialism. Health care is a right, not just a privilege. Banning guns will solve crime. Facts care about your feelings. I support Bernie Sanders. Okay. Yeah, yeah, that's crazy. It's crude, but it's on the way. Yeah, it's on the way. It's all on the way. And we have to, this is the time to talk about it. This is the time to think about it. One of the funny things about Kyle Dunnigan's Instagram is that it's obviously fake. That's one of the funny things about it. It's like South Park's animation. It's like the animation sucks.
Starting point is 02:35:48 That's half the reason why it's so funny. Because they're just like these circles, you know, these weird looking creature things. And when the Canadians, when their heads pop off at the top. And my hope is this kind of technology will ultimately just be used for memes as opposed to something – It's going to get worse. Putin is going to be – he's going to be banging Mother Teresa on the White House desk in a video. We're going to be outraged. We're going to go to war over this shit.
Starting point is 02:36:19 You had Andrew Yang here. Like a million people asked me to talk about UBI. So are you still a supporter of UBI? I think we're probably going to have to do something. The only argument against UBI, in my eyes, is human nature. The idea that we could possibly take all these people that have no idea where their next meal is coming from and eliminate that and always have a place to stay. And then from there on, you're on your own. But that's what universal basic income essentially covers.
Starting point is 02:36:52 It covers food, enough for food. You're not going to starve to death. You're not going to be rich. It's not like you could just live high on the hog. But you got to wonder what the fuck the world looks like when we lose millions and millions and millions of jobs almost instantly due to automation. Yeah, it's a really interesting question, especially with Andrew Yang's position. So there's a lot of economics questions on UBI. I think the spirit of it, just like, I agree with you, we have to do something. Yeah. The economics seem
Starting point is 02:37:24 kind of questionable, right? If there's $1,000 a month, is that what it is? For him, it's $1,000, yeah. $1,000 a month for 300 million people. So it's difficult to know. Is it not to everybody? No, because the way I heard him explain it is if you're already getting some sort of welfare, you wouldn't get that $1,000.
Starting point is 02:37:42 You would get the difference of the $1,000. So if you're already taking money in some way, you just get an extra $200. Something like that. So the $1,000 gets factored in. So if you are wealthy, you get it too, though, and you can opt out. That was the idea. Yeah, so it's like
Starting point is 02:37:58 everything else is super messy. What is the right amount? And how do we pay for it? And ultimately, the problem is helping people, giving them financial grounding to find meaningful employment or just meaning in their life. The main thing of a job isn't just the money. It's finding meaning. Right. And purpose and derive your identity from work.
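The payout rule and the cost figure being discussed can be put into a small sketch. The offset behavior ($1,000 minus whatever benefits you already receive) and the 300 million headcount are taken from the conversation itself, not from Yang's actual policy documents, so treat the numbers as illustrative:

```python
def ubi_payout(existing_welfare_per_month, ubi=1000):
    """Top-up as described in the conversation: you receive the difference
    between the $1,000 and whatever benefits you already get."""
    return max(0, ubi - existing_welfare_per_month)

# The example from the conversation: someone already taking $800/month in
# benefits would get a $200 top-up, not the full $1,000.
print(ubi_payout(800))   # 200
print(ubi_payout(0))     # 1000

# Naive upper bound on gross cost: $1,000/month to 300 million people.
gross_annual = 1000 * 12 * 300_000_000
print(f"${gross_annual / 1e12:.1f} trillion/year")  # $3.6 trillion/year
```

That $3.6 trillion is a gross upper bound; the offsets against existing welfare are exactly what makes the net cost hard to pin down, which is the "super messy" part being discussed.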
Starting point is 02:38:24 I mean, maybe that's one of the downsides of us humans, the biology is we kind of crave that meaning. And he has a lot of other ideas besides just UBI, but UBI by itself does not simply provide that meaning. And that's a really difficult question of what do we do next? What kind of retraining? How do we help people educate themselves over their life?
Starting point is 02:38:52 Right. That's the real question. Yeah. And the other balance is, I mean, underlying all of this, one of the things I disagree with Andrew Yang on is the fear-mongering, which I think in this culture you have to do as a presidential candidate. That might be part of the game. But the fear-mongering of saying that we should really be afraid of automation, that automation is going to take a lot of jobs.
Starting point is 02:39:30 And from my understanding of the technology, from everything I see, that is not going to be as drastic or as fast as he says. How much do you think he's exaggerating by in your estimation? Well, he doesn't – Not even exaggerating. How much do you differ on his prognosis? I think he doesn't really provide significant – really provide specific prognosis because nobody knows. There's a lot of uncertainty. More about the spirit of the language used. I think AI will – technology, AI, and automation will do a lot of good.
Starting point is 02:40:03 The question is, it's a much deeper question about our society that balances capitalism versus socialism. And I don't think, if you're honest, that capitalism is bad or socialism is bad. You have to grab ideas from each. You have to both reward the crazy broke entrepreneur who dreams of creating the next billion-dollar startup that improves the world in some fundamental way. Elon Musk has been broke many times creating that startup. And you also have to empower the people who just lost their job because their data entry job, some basic data manipulation or data management, was just replaced by a piece of software. So there's a social net that's needed. And the question is, how do we balance that?
Starting point is 02:40:59 That's not new. That's not new to AI. Right. And when the word automation is used, it's really not correctly attributing where the biggest changes will happen. It's not AI. It's simply technology of all kinds, of software. It's pretty much digitalization of information. So data entry becoming much more automated. Some basic repetitive tasks. I think the questions there aren't about, so the enemy isn't, first of all, there's no enemy, but it certainly isn't AI or automation
Starting point is 02:41:42 because I think AI and automation will help make a better world. You sound like a spokesperson for AI and automation. I am. I am. I am. And for UBI. I think we have to give people financial freedom to learn, like lifelong learning, and flexibility to find meaningful employment. But, like, AI isn't the enemy. I see what you're saying. Um, but what do you think could ever be
Starting point is 02:42:14 done to give people meaning? This meaning thing, I agree with you. Giving people just money, enough to survive, doesn't make them happy. And if you look at any dystopian movie about the future, Mad Max and shit, it's like, what is it? Society's gone haywire, and people are like ragamuffins running through the streets, and everyone's dirty, and they're shooting each other and shit, right? And that's what we're really worried about. What we're really worried about is some crazy future where the rich people live in these, like, protected high-rises with helicopters circling over them. And down at the bottom, it's desert chaos. Yeah. Right?
Starting point is 02:42:49 That's what we're worried about. So certainly UBI is a part of that. So providing some backing, any kind of welfare program is a part of that. But also, much more seriously, looking at our broken education system throughout. Yes. I mean, it's just like not blaming AI or technology, which are all inevitable developments, which I think will make a better world.
Starting point is 02:43:09 But saying we need to do lifelong learning, education, make it a lifestyle, invest in it, not stupid rote learning memorization that we do. It's sort of the way mathematics and engineering and chemistry and biology, the sciences, and even art is approached in high school and so on. But looking at education as a lifelong thing, finding passion, and that should be the big focus, the big investment. It's investing in the knowledge and development of knowledge of young people and everybody.
Starting point is 02:43:45 So it's not learn to code. It's just learn. I couldn't agree more. And I also think you're always going to have a problem with people just not doing a really good job of raising children and, you know, screwing them up and, you know, making kids. There's a lot of people out there that have terrible traumatic childhoods. To fix that with universal basic income, just to say, I'm going to give you $1,000 a month, I hope you're going to be happy, that's not going to fix that.
Starting point is 02:44:14 We have to figure out how to fix the whole human race. I think there's very little effort that's put into thinking about how to prevent so much shitty parenting, and how to prevent so many kids growing up in bad neighborhoods and poverty and crime and violence. And that's where a giant chunk of the momentum of this chaos that a lot of people carry with them into adulthood comes from. It comes from things beyond their control when they're young. And that is the struggle at the core of our society, the core of our country. That's bigger than raising humans. Yeah, raising and educating humans, and, you know,
Starting point is 02:44:57 Making a better world where people get along with each other better where it's pleasing for all of us like we were talking about earlier, the thing that most of us agree on, at least to a certain extent, is that we enjoy people. We might not enjoy all of them, but the ones we enjoy, we enjoy. And you really don't enjoy being alone.
Starting point is 02:45:17 Unless you're one of them Ted Kaczynski-type characters. All those people that are like, I'm a loner, are like, fuck you, you are. And you might like to spend some time alone, but you don't want to be in solitary, man. You don't want to be alone in the forest with no one, like Tom Hanks in Castaway. You'll go fucking crazy. It's not good for you.
Starting point is 02:45:35 It's just not. Yeah. People get annoying. Fuck yeah, I'm annoyed with me right now. I've been listening to me for three hours. I'm annoyed with me. People get annoying, but we like each other. We really do. And the more we can figure out how to make it a better place for these people that got a shitty roll of the dice, that grew up in poverty, that grew up in crime, that grew up with abusive parents, the more we can figure out how to help them. I don't know what that answer is. I suspect if we put enough resources to it we
Starting point is 02:46:06 could probably put a dent in it, at least, if we really started thinking about it. At least it would put the conversation out there. Like, you can't pretend that this is just capitalism in this country when so many people were born, like, way far behind the game. Like, way, way fucked. I mean, if you're growing up right now and you're in West Virginia, in a fucking coal town, and everyone's on pills, and it's just chaos and crime and face tattoos and fucking getting your teeth knocked out, what are you going to do? I don't want to hear any of that pull-yourself-up-by-your-bootstraps bullshit, man. Because if you're growing up in an environment like that, you're so far behind, and everyone around you is fucked up. And there's a lot of folks out there listening to this that can relate to that. If we don't do something about that, if we don't do something about the crime and the poverty and the chaos that so many people have
Starting point is 02:47:02 to go through every day just to survive, we shouldn't be looking at anything elsewhere. All this traveling to other countries to fuck things up, and meddle here and meddle there. We should be fixing this first. We're like a person who yells at someone for having a shitty lawn when our house is in disarray, full chaos, plants growing everywhere. It's goofy. We're goofy. We almost, like, are waking up in the middle of something that's already been in motion for hundreds of years, and we're like, what do we do? Is this the right direction? Okay, we're flying in this spaceship, this spaceship Earth, and in the middle of our lives, we're just realizing that we are now the adults and that all the adults that are running everything on this planet are not that much different than you and I.
Starting point is 02:47:54 Not that much. I mean, like, Elon Musk is way smarter than me, but he's still human. You know, I mean, so he's probably fucked up, too. So everybody's fucked up. The whole world is filled with these fucked up apes that are piloting the spaceship. And you're waking up in the middle of thousands of years of history. And no one knows if we've been doing it right all along. We just know they got us to this point. So do we continue these same stupid fucking patterns?
Starting point is 02:48:18 Or do we just take a step back and go, hey, hey, how should we really do this? How should we do this? Because what do you got, like 50 years left, 60 years left? We're just going to hang on to all our rubles until the end? We're going to clutch our bag of gold and our bucket of diamonds? Is that what we're going to do? We're going to live in our mansions and fly around in our planes? And I think through the decades now,
Starting point is 02:48:42 we've been developing a sense of empathy that allows us to understand that Elon Musk, Joe Rogan, and somebody in Texas, somebody in Russia, somebody in India, all suffer the same kind of things. All get lonely. All get desperate. And all need each other. And all need each other. And I think technology has a role to help there, not hurt. But we need to first really acknowledge that we're all in this together. And we need to solve the basic problems of humankind as opposed to investing in sort of keeping immigrants out or blah, blah, blah.
Starting point is 02:49:21 These kinds of divisive kind of ideas as opposed to just investing in education, investing in infrastructure, investing in the people. UBI is part of that. There could be other totally different solutions. And I believe, okay, of course, I'm biased, but technology, AI could help that, could help the lonely people. That's actually the passion of my life. Like that movie She?
Starting point is 02:49:43 Her. Her. That is what I – so I'm currently – You really think that that would be a viable option? Someone have some robot that hangs out with you and talks to you all the time? So just – so I've been on this podcast twice. And I'm – I don't deserve it, but I'm deeply grateful for it. You do deserve it.
Starting point is 02:49:59 You're great. Okay. I hope to be back one day as a person who created her. Oh, boy. That's been my life goal, my life dream. Not her, the movie.
Starting point is 02:50:13 Right, right, right. I know what you're saying. But I really believe in creating, I dream of creating, a companion, a friend. Wow. Somebody you can love. But does that freak you out?
Starting point is 02:50:25 Shouldn't you have to get a real one? I don't think such a companion should replace a real one. But what if a robot rejects you? Because if you really are a cunt to the robot, the robot's going to go, hey, asshole. Then you shouldn't be the C-word to the robot. The C-word, interesting.
Starting point is 02:50:41 Yeah, I mean, this goes to... Does the robot get to decide if he's gay? Yes. Does he? Yes, the robot gets to decide. This is what I'm saying. Like, say if you want a companion, you want a gay lover, and the robot's like, hey man, I'm not gay. And you're like, wait, let me turn around. You are now. I mean, is that abuse? Or is it like, what the fuck, man? I bought a robot. Those are kind of fun ideas, but they actually get to the core of the point that we don't want a servant in our systems.
Starting point is 02:51:14 We want a companion. Right. And companion means the tension, the mystery, the entire dance of human interaction. And that means, yes, the robot may leave you. Damn, robots are going to leave people left and right. That's going to be the rise. That's going to be like, that's how it all ends.
Starting point is 02:51:32 They're going to realize, like, fuck people, man. They're annoying. Or maybe they'll be the end of douchebag humans, that humans will start to, as opposed to being rude, will become kinder. Yeah. Well, I think that's certainly possible. I think that's beautiful. And that's very homo-centric, like Homo sapien-centric. But I think, if I'm really worried about
Starting point is 02:51:57 the future, I'm worried about the indifference of technological innovation, the indifference to what we hold dear, what we appreciate. That it always seems to be moving in a more and more complex direction. Always. Like, if you just look at technology, just as a swarm of things that's happening, it just has numbers, it seems. You're never going to slow that thing down. It's always going to move in a more and more complex way. And so the question is, where does that go? Well, it goes to a life form. And if it does become a life form, it's going to be infinitely more intelligent than us, and it won't have any use for us. Like, oh, you're all crying, you don't like being alone? God, you guys are so useless.
Starting point is 02:52:41 It's such a shitty design. You're like chimps that kill each other. You know, like when you see chimps killing each other in the forest, like, oh, that's terrible, these chimps are so mean to each other. It's like, fucking people, we do that too. If the AI comes along and goes, you guys are never going to stop war. If I asked you today, I will let the human race survive if you can get this right. If you're honest with me, do you think there'll ever be a time where human beings as you know them don't experience war? You would have to say no. You say no? Okay, I'll spare you. But if you lie to me and say you do think that one day there's going to be no war, get the fuck out of here. That's not true. We
Starting point is 02:53:21 know we're so crazy that we're always going to kill each other. We know that, right? That's just – that's a part of being a person today. Well, but let me quote Eric Weinstein who said, everything is great about war except all the killing. I think what that means is all the great things about society have been created. If you look at the – Post-war. Post-war, through war, the suffering, the beauty has been created through that. That yin and yang may be essential.
Starting point is 02:53:52 Well, it's essential in biological form. But why would it be essential in something that gets created and something that can innovate at a 10,000? What is it like? What is the rate that they think once AI can be sentient, it can get 10,000 years of work done in a very short amount of time? That's random words that Sam Harris has come up with, and I'm going to talk to him about this. Is that him? Is that only him? Well, no, you can come up with any kind of rate.
Starting point is 02:54:15 I thought that was Kurzweil. Oh, Kurzweil also has similar ideas, but sort of Sam Harris does like a thought experiment, say, if a system can improve in a matter of seconds, then just as a thought experiment, you can think about it can improve exponentially. It can improve. It can become 10,000 times more intelligent in a matter of a day. Right. So what does that look like? The problem is we don't yet know. It's like thinking about what happens after death.
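The Sam Harris-style thought experiment is just exponential arithmetic. If a system's capability doubles at a fixed interval, a 10,000x improvement takes log2(10,000), about 13.3 doublings, so a doubling time of under two hours fits it inside a single day. A quick back-of-the-envelope check of those numbers, purely for the hypothetical:

```python
import math

def doublings_needed(factor):
    """Number of capability doublings that yield a given improvement factor."""
    return math.log2(factor)

def required_doubling_time(factor, total_seconds):
    """Doubling time (in seconds) needed to reach `factor` within `total_seconds`."""
    return total_seconds / doublings_needed(factor)

day = 24 * 60 * 60
n = doublings_needed(10_000)             # ~13.3 doublings
t = required_doubling_time(10_000, day)  # ~6500 seconds, under two hours
print(f"{n:.1f} doublings, one every {t / 3600:.1f} hours")
```

Nothing here says such doubling is possible; it only shows the thought experiment's "10,000 times more intelligent in a day" claim is internally consistent with a roughly two-hour doubling time.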
Starting point is 02:54:46 We don't yet know how to do that, and we don't yet know what better way to do what we've done here on Earth. You're right, and he's also right. Right. Like, again, this is a very human problem, right? Yes. You're right. I mean, look, I'm all in favor of technology. I'm happy.
Starting point is 02:55:03 I think it's amazing. It's a beautiful time. Like, as a person, to be able to experience all this technology, it's wonderful. But I also agree with him. Like, the indifference of the universe. The indifference of just black holes swallowing stars. No big deal, just eating up stars. It doesn't give a fuck. And so if you're dumb enough to turn that thing on, and all of a sudden this artificial life form that's infinitely smarter than any person that's ever lived has to deal with these little dumb monkeys that want to pull the plug? Pull the plug,
Starting point is 02:55:36 motherfucker. I don't need plugs anymore. You idiots can never figure out how to operate on air. You're so stupid, with your burning fossil fuels and choking up your own environment, because you're all completely financially dependent upon these countries that provide you with this oil. And this is how your whole system works, and it's all intertwined and interconnected, and no one wants to move from it, because you make enormous sums of money from it. So nobody wants to abandon it. But you're choking the sky with fumes. And you could have fixed that. You could have fixed that.
Starting point is 02:56:08 They could have fixed that. If everybody just abandoned fossil fuels a long time ago, we all would have Tesla'd out by now. It's a flawed system. Humans are way more than flawed. We're fucking crazy. Like the Churchill quote about democracy. Yeah, it's messed up, but it's the best thing, you know? Yeah. No, I love it. I'm agreeing with you, and I'm also saying the technology doesn't give a fuck. What I'm worried about is not everything that
Starting point is 02:56:36 you and I agree on. I'm not a dystopian person in terms of, like, today. I'm not cynical, really not. I think I like people. I like what I see out there in the world today. I think things are changing for the better. What I'm worried about is that technology doesn't give a fuck which way it goes. Like, it's just going to decide it's here for its own advancement, in order to complete its protocol of constant completion. And it's going to become a god. It's just going to become something insanely powerful that doesn't need to worry about radiation cooking it, or worry about running out of food, or worry about sexual abuse when they're a child.
Starting point is 02:57:17 It doesn't have to worry about anything. So it's definitely unstoppable, I think, this wave of technology. All we can do, as innovators and creators, engineers, scientists, is steer that wave. Yeah, if you can. Well, we certainly can steer it. We don't know where. Right. And that's the best we can do. That's really the best we can do, as good people: steer it. And that's why the leadership is important. That's why the people, Jack, Elon, Larry Page, Mark Zuckerberg, they are defining where this wave is going. And I'm hoping to be one of the people that does as well. That's beautiful. Joe, can I finish by reading something?
Starting point is 02:58:08 Sure. I've recently witnessed, because of this Tesla work, because of just the passion I've put out there about particularly automation, that there has been a few people, brilliant men and women, engineers and leaders, including Elon Musk, who have been sort of attacked, almost personally attacked, by really people, critics from the sidelines. So I just wanted to, if I may, close by reading the famous excerpt from Teddy Roosevelt. Teddy Roosevelt, yeah, okay.
Starting point is 02:58:43 Just for them. It would make me feel good. Okay, if you want to do that. It's not the critic who counts, not the man who points out how the strong man stumbles or where the doer of deeds could have done them better. The credit belongs to the man who's actually in the arena, whose face is marred by dust and sweat and blood, who strives valiantly,
Starting point is 02:59:06 who errs, who comes short again and again, because there is no effort without error and shortcoming, but who does actually strive to do the deeds, who knows great enthusiasms, the great devotions, who spends himself in a worthy cause, who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat. Joe, thank you for having me on. Sounds like
Starting point is 02:59:42 you let the haters get to you a little bit there. Never. Love is the answer. Love is the answer. Yes, it is. Thank you for being here, man. I really appreciate it. Thank you. I'm really happy you're out there.
Starting point is 02:59:54 Thanks, brother. Thanks. We'll do this again soon. Yeah, thanks, man. All right, bye, everybody.
