The Joe Rogan Experience - #2044 - Sam Altman

Episode Date: October 6, 2023

Sam Altman is the CEO of OpenAI, an artificial intelligence research and development company. www.openai.com  ...

Transcript
Starting point is 00:00:00 Joe Rogan Podcast, check it out! The Joe Rogan Experience. Train by day, Joe Rogan Podcast by night! All day! Hello, Sam. What's happening? Not much, how are you? Thanks for coming in here, appreciate it. Thanks for having me.
Starting point is 00:00:17 So, what have you done? Like, ever? No, I mean, what have you done with AI? I mean, it's, um... One thing about this is, I mean, I think everyone is fascinated by it. Everyone is absolutely blown away at the current capability and wondering what the potential for the future is and whether or not that's a good thing. I think it's going to be a great thing, but I think it's not going to be all a great thing. And that is where all of the complexity comes in for people. It's not this clean story of, we're going to do this and it's all going to be great.
Starting point is 00:01:00 It's, we're going to do this. It's going to be net great, but it's going to be like a technological revolution. It's going to be a societal revolution. And those always come with change. And even if it's like net wonderful, you know, there's things we're going to lose along the way. Some kinds of jobs, some parts of our way of life, some parts of the way we live are going to change or go away. And no matter how tremendous the upside is there, and I believe it will be tremendously good. You know, there's a lot of stuff we've got to navigate through to make sure.
Starting point is 00:01:30 That's a complicated thing for anyone to wrap their heads around, and there's deep and super understandable emotions around that. That's a very honest answer, that it's not all going to be good. But it seems inevitable at this point. Yeah, it's definitely inevitable. My view of the world, you know, when you're a kid in school, you learn about this technological revolution and then that one and then that one. And my view of the world now, sort of looking backwards and forwards, is that this is like one long technological revolution. And we had, sure,
Starting point is 00:02:07 like first we had to figure out agriculture so that we had the resources and time to figure out how to build machines. Then we got this industrial revolution and that made us learn about a lot of stuff, a lot of other scientific discovery too. Let us do the computer revolution. And that's now letting us, as we scale up to these massive systems, do the AI revolution. But it really is just one long story of humans discovering science and technology and co-evolving with it. And I think it's the most exciting story of all time. I think it's how we get to this world of abundance. And although we do have these things to navigate and there will be these downsides. If you think about what it means for the world and for people's quality of lives, if we can get to a world where the cost of intelligence and the abundance that comes with that, the cost dramatically falls, the abundance
Starting point is 00:02:58 goes way up. I think we'll do the same thing with energy. And I think those are the two sort of key inputs to everything else we want. So if we can have abundant and cheap energy and intelligence, that will transform people's lives largely for the better. And I think it's going to in the same way that if we could go back now 500 years and look at someone's life, we'd say, well, there's some great things, but they didn't have this. They didn't have that. Can you believe they didn't have modern medicine? That's what people are going to look back at us like, but in 50 years. When you think about the people that currently rely on jobs that AI will replace, when you think about whether it's truck drivers or automation workers, people that work in factory assembly lines, what, if anything anything what strategies can be put to mitigate the negative downsides
Starting point is 00:03:48 of those jobs being eliminated by ai so i'll talk about some general thoughts but i find making very specific predictions difficult because the way the technology goes has been so different than even my own intuitions or certainly my own intuitions. Maybe we should stop there and back up a little. What were your initial thoughts? If you had asked me 10 years ago, I would have said, first, AI is going to come for blue-collar labor, basically. It's going to drive trucks and do factory work and, you know, it'll handle heavy machinery. Then maybe after that, it'll do like some kinds of cognitive labor, kind of, you know, but not, it won't be off doing what I think of personally
Starting point is 00:04:37 as the really hard stuff. It won't be off proving new mathematical theorems, won't be off, you know, improving new mathematical theorems, won't be off, you know, discovering new science, won't be off writing code. And then eventually, maybe, but maybe last of all, maybe never, because human creativity is this magic special thing. Last of all, it'll come for the creative jobs. That's what I would have said. Now, A, it looks to me like, and for a while, AI is much better at doing tasks than doing jobs. It can do these little pieces super well, but sometimes it goes off the rails. It can't keep very long coherence. So people are instead just able to do their existing jobs way more productively. But you really still need the human there today.
Starting point is 00:05:21 And then B, it's going exactly the other direction. It could do the creative work first, stuff like coding second. They can do things like other kinds of cognitive labor third. And we're the furthest away from like humanoid robots. So back to the initial question, if we do have something that completely eliminates factory workers, completely eliminates truck drivers, delivery drivers, things along those lines. That creates this massive vacuum in our society. So I think there's things that we're going to do that are good to do but not sufficient. So I think at some point we will do something like a UBI or some other kind of like very long-term unemployment insurance something. But we'll have some way of giving people like redistributing money in society
Starting point is 00:06:16 as a cushion for people as people figure out the new jobs. But – and maybe I should touch on that. I'm not a believer at all that there won't be lots of new jobs. And maybe I should touch on that. I'm not a believer at all that there won't be lots of new jobs. I think human creativity, desire for status, wanting different ways to compete, invent new things, feel part of a community, feel valued, that's not going to go anywhere. People have worried about that forever. What happens is we get better tools and we just invent new things and more amazing things to do and there's a big universe out there and and i think i mean that like literally uh in that there's like space is really big but also there's just so much stuff we can all do if we do get to
Starting point is 00:06:57 this world of abundant intelligence where you can sort of just think of a new idea and it gets created. But again, that doesn't, to the point we started with, that doesn't provide like great solace to people who are losing their jobs today. So saying there's going to be this great indefinite stuff in the future, people are like, what are we doing today? So, you know, I think we will as a a society, do things like UBI and other ways of redistribution. But I don't think that gets at the core of what people want. I think what people want is like agency, self-determination, the ability to play a role in architecting the future along
Starting point is 00:07:38 with the rest of society, the ability to express themselves and create something meaningful to them. And also I think a lot of people work jobs they hate, and I think we as a society are always a little bit confused about whether we want to work more or work less. But somehow we all get to do something meaningful, and we all get to play our role in driving the future forward that's really important and what i hope is as those truck driving long-haul truck driving jobs go away which you know people have been wrong about predicting how fast that's going to happen
Starting point is 00:08:19 but it's going to happen um we figure out not just a way to solve the economic problem by like giving people the equivalent of money every month but that there's a way that and we've had a lot of ideas about this there's a way that we like share ownership and decision making over the future um i think i say a lot about agis that everyone everyone realizes we're going to have to share the benefits of that, but we also have to share the decision-making over it and access to the system itself. I'd be more excited about a world where we say, rather than give everybody on earth one eight billionth of the AGI money, which we should do that too. We say you get like one eight billionth of a one eight billionth slice of the system. You can sell it to somebody else. You can sell it to a company.
Starting point is 00:09:12 You can pool it with other people. You can use it for whatever creative pursuit you want. You can use it to figure out how to start some new business. And with that, you get sort of like a voting right over how this is all going to be used. And so the better the AGI gets, the more your little one eight billionth ownership is worth to you. We were joking around the other day on the podcast where I was saying that what we need is an AI government. We should have an AI president and have AI. Just make all the decisions? Yeah, have something that's completely unbiased, absolutely rational,
Starting point is 00:09:49 has the accumulated knowledge of the entire human history at its disposal, including all knowledge of psychology and psychological study, including UBI, because that comes with a host of pitfalls and issues that people have with it. So I'll say something there um i think we're still very far away from a system that is capable enough and reliable enough that you that any of us would want that but i'll tell you something i love about that
Starting point is 00:10:16 someday let's say that thing gets built the fact that it can go around and talk to every person on earth understand their exact preferences at a very deep level, you know, how they think about this issue and that one and how they balance the trade-offs and what they want, and then understand all of that and like collectively optimize, optimize for the collective preferences of humanity or of citizens of the U.S. That's awesome. As long as it's not co-opted, right? Our government currently is co-opted. That's for sure.
Starting point is 00:10:47 We know for sure that our government is heavily influenced by special interests. If we could have an artificial intelligence government that has no influence, nothing has influence on it. What a fascinating idea. It's possible. has influence on it. What a fascinating idea. It's possible. And I think it might be the only way where you're going to get completely objective, the absolute most intelligent decision for virtually every problem, every dilemma that we face currently in society. Would you truly be comfortable handing over like final decision-making and say, all right, AI, you got it. No, no, but I'm not comfortable doing that with anybody. You know, I mean, I don't,
Starting point is 00:11:29 I was uncomfortable with the Patriot Act. I'm uncomfortable with many decisions that are being made. It's just, there's so much obvious evidence that decisions that are being made are not being made in the best interest of the overall well of the people. It's being made in the decisions of whatever gigantic corporations that have donated to and whatever the military industrial complex and pharmaceutical industrial complex and then just the money. That's really what we know today, that money has a massive influence on our society and the choices that get made and the overall good or bad for the population. That's really what we know today, that money has a massive influence on our society and the choices that get made and the overall good or bad for the population. Yeah.
Starting point is 00:12:10 I have no disagreement at all that the current system is super broken, not working for people, super corrupt, and for sure unbelievably run by money. And I think there is a way to do a better job than that with AI in some way. But, and this might just be like a factor of sitting with the systems all day and watching all of the ways they fail. We got a long way to go. A long way to go, I'm sure. But when you think of AGI, when you think of the possible future, like where it goes to. Do you ever extrapolate? Do you ever like sit and pause and say, well, if this becomes sentient and it has the ability to make better versions of itself, how long before we're literally dealing with a god? So the way that I think about this is it used to be that like agi was this very binary
Starting point is 00:13:08 moment it was before and after and i think i was totally wrong about that and the right way to think about it is this continual continuum of intelligence this smooth exponential curve back all the way to that sort of smooth curve of technological revolution. The amount of compute power we can put into the system, the scientific ideas about how to make it more efficient and smarter, to give it the ability to do reasoning, to think about how to improve itself, that will all come. But my model for a long time, I think if you look at the world of AGI thinkers, there's sort of two, particularly around the safety issues you're talking about, there's two axes that matter. There's the short, what's called short timelines or long timelines, you know, to the first milestone of AGI, whatever that's going to be.
Starting point is 00:14:00 Is that going to happen in a few years, a few decades, maybe even longer? Although at this point, I think most people are a few years or a few decades. And then there's takeoff speed. Once we get there, from there to that point you were talking about where it's capable of the rapid self-improvement, is that a slow or a fast process? in and also the world that i think is the most controllable and the safest is the short timelines and slow takeoff quadrant and i think we're going to have you know there were a lot of very smart people for a while who were like the thing you were just talking about happens in a day or three days and i don't that doesn't seem likely to me given the shape of the technology as we understand it now. Now, even if that happens in a decade or three decades, it's still like the blink of an eye from a historical perspective.
Starting point is 00:14:53 And there are going to be some real challenges to getting that right. make the sort of safety systems and the checks that the world puts in place, how we think about global regulation or rules of the road from a safety perspective for those projects. It's super important because you can imagine many things going horribly wrong. But I feel cheerful about the progress the world is making towards taking this seriously. And, you know, it reminds me of what I've read about the conversations that the world had right around the development of nuclear weapons. It seems to me that this is, at least in terms of public consciousness, this has emerged very rapidly where I don't think anyone was really aware. People were aware of the concept of artificial intelligence, but they didn't think that it was going to be implemented so comprehensively,
Starting point is 00:15:54 so quickly. So chat GPT is on what? 4.5 now? Four. Four. And with 4.5, there'll be some sort of an exponential increase in its abilities? It'll be somewhat better.
Starting point is 00:16:09 Each step, you know, from each like half step like that, you kind of, humans have this ability to like get used to any new technology so quickly. The thing that I think was unusual about the launch of ChatGPT 3.5 and then 4 was that people hadn't really been paying attention. And that's part of the reason we deploy. We think it's very important that people and institutions have time to gradually understand this, react, co-design the society that we want with it. And if you just build AGI in secret in a lab and then drop it on the world all at once,
Starting point is 00:16:42 I think that's a really bad idea. So we had been trying to talk to the world about this for a while. If you don't give people something they can feel and use in their lives, they don't quite take it seriously. Everybody's busy. And so there was this big overhang from where the technology was to where public consciousness was. Now that's caught up.'ve deployed i think people understand it i don't expect the few the jump from like four to whenever we finish 4.5 which would be a little while i don't expect that to be the crazy i think the crazy switch the crazy adjustment that people have had to go through has has mostly happened i think most people have gone from thinking that AGI was science fiction and very far off to something that is going to happen.
Starting point is 00:17:27 And that was like a one-time reframe. And now, you know, every year you get a new iPhone. Over the 15 years or whatever since the launch, they've gotten dramatically better. But iPhone to iPhone, you're like, yeah, okay, it's a little better. But now if you go hold up the first iPhone to the 15 or whatever, that's a big difference. GPT 3.5 to AGI, that'll be a big difference. But along the way, it'll just get incrementally better. Do you think about the convergence of things like Neuralink
Starting point is 00:17:54 and there's a few competing technologies where they're trying to implement some sort of a connection between the human biological system and technology um do you want one of those things in your head i don't until everybody does right and you know i have a joke about it but it's like the idea is like once it gets you have to kind of because everybody's gonna have it so one of the the hard questions about all of the related merge stuff is exactly what you just said. Like as a society, are we going to let some people merge with AGI and not others? And if we do, then, and you choose not to, like what does that mean for you?
Starting point is 00:18:43 Right. And will you be protected? How you get that moment right. You know, if we like imagine like all the way out to the sci-fi future, there have been a lot of sci-fi books written about how you get that moment right. You know, who gets to do that first? What about people who don't want to? How do you make sure that people that do it first,
Starting point is 00:19:00 like actually help lift everybody up together? Yeah. How do you make sure people who want to just like live their very human life get to do that that stuff is really hard and honestly so far off from my problems of the day that i don't get to think about that as much as i'd like to because i do think it's super interesting um i but yeah it seems like if we just think logically, that's going to be a huge challenge at some point. And people are going to want wildly divergent things. But there is a societal question about how we're going to, like, the questions of fairness that come there and what it means for the people who don't do it like super super complicated anyway on the neural interface side i'm in the short term like
Starting point is 00:19:53 before we figure out how to upload someone's conscience into a computer if that's even possible at all which i think there's plenty of sides you could take on why it's not. The thing that I find myself most interested in is what we can do without drilling a hole in someone's head. How much of the inner monologue can we read out with an externally mounted device? And if we have a imperfect, low bandwidth, low accuracy neural interface, can people still just learn how to use it really well in a way that's quite powerful for what they can now do with a new computing platform? And my guess is we'll figure that out.
Starting point is 00:20:36 I'm sure you've seen that headpiece. There's a demonstration where there's someone asking someone a question. They have this headpiece on. They think the question, and then they literally Google the question and get the answers through their head. That's the kind of direction we've been exploring. Yeah.
Starting point is 00:20:52 That seems to me to be step one. That's the pong of the eventual immersive 3D video games. Like, you're going to get these first steps, and they're going to seem sort of crude and slow. I mean, it's essentially slower than just asking Siri. I think if someone built a system where you could think words, it doesn't have to be a question. It could just be your passive rambling in a monologue, but it certainly could be a question. And that was being fed into GPT-5 or 6.
Starting point is 00:21:24 And in your field of vision, the words in response were being displayed. That would be the palm. That's a very valuable tool to have. And that seems like that's inevitable. There's hard work to get there on the neural interface side, but I believe it will happen. Yeah. I think so too. And my concern is that the initial adopters of this will have such a massive advantage over the general population. Well, that doesn't concern me because that's like a, you know, that's not, you're not, that's just like better. That's a better computer. You're not like jacking your brain into something in a high risk thing. You know what you do when you don't want them when you take off the glasses. So that feels fine. Well, this is just the external device then. Oh, I think we can do the kind of like read your thoughts with an external device at some point.
Starting point is 00:22:13 Read your internal monologue. Interesting. And do you think we'll be able to communicate with an external device as well telepathically or semi-telepathically through technology? I do. Yeah. Yeah, I do. I think so too. My real concern is that once we take the step to use an actual neural interface,
Starting point is 00:22:37 when there's an actual operation and they're using some sort of an implant and then that implant becomes more sophisticated. It's not the iPhone 1. Now it's the iPhone 15. And as these things get better and better, we're on the road to cyborgs. We're on the road to, like, why would you want to be a biological person?
Starting point is 00:22:59 Do you really want to live in a fucking log cabin when you can be in the Matrix? I mean, it seems like we're not we're on this path we're already a little bit down that path right like if you take away someone's phone and they have to go function in the world today they're at a disadvantage relative to everybody else so that's that's like a maybe that's like the lightest weight version of a merge we could imagine. But I think it's worth like, if we go back to that earlier thing about the one exponential curve, I think it's worth saying we've like lifted off the x-axis already down this path, the tiniest bit. And yeah, even if you don't go all the way to like a neural interface, VR will get so good that some people just don't want to take it off that much right and that's fine for them um as long as we can solve this question of
Starting point is 00:23:54 how do we like think about what a balance of power means in a world i think there will be many people um i'm certainly one of them who's like actually the human body and the human experience is pretty great that log heaven in the woods Pretty awesome. I don't want to be there all the time. I'd love to go play the great video game But like i'm really happy to get to go there sometimes yeah, there's still human experiences that are just Like great human experience just laughing with friends
Starting point is 00:24:22 You know kissing someone that you've never kissed before that you're you're on a first date that kind of those kind of things are the real moments it just laughs yeah having a glass of wine with a friend and just laughing not quite the same in vr yeah now when the vr goes super far so you can't you know it's like you are jacked in on your brain and you can't tell right what's real and what's not. And then everybody gets like super deep on the simulation hypothesis or the like Eastern religion or whatever. And I don't know what happens at that point. Do you ever fuck around with simulation theory?
Starting point is 00:24:55 Because the real problem is when you combine that with probability theory and you talk to the people that say, well, if you just look at the numbers, the probability that we're already in a simulation is much higher than the probability that we're not. It's never been clear to me what to do about it. It's like, okay, that intellectually makes a lot of sense. Yeah. I think probably, sure. Right. That seems convincing, but this is my reality.
Starting point is 00:25:24 Yeah. This is my life. Yeah.'m gonna live it and i i've you know from like 2 a.m in my college freshman dorm hallway till now i've made no more progress on it than that well it seems like one of those, there's no, if it is a possibility, if it is real. First of all, once it happens, what are you going to do? I mean, that is the new reality. And in many ways, our new reality is as alien to hunter-gatherers from 15,000 years ago as that would be to us now. I mean, we've already entered into some very bizarre territory where, you know, I was just having a conversation with my kids.
Starting point is 00:26:16 We're asking questions about something. And, you know, I always say, let's guess. What percentage of that is this? And then we just Google it. And then just ask Siri and we pull it up. Like, look at that. Like that alone is so bizarre compared to how it was when I was 13 and you had to go to the library and hope that the book was accurate. Totally.
Starting point is 00:26:50 I was reading about how horrible systems like ChatGPT and Google are from an environmental impact because it's using some extremely tiny amount of energy for each query. And we're all destroying the world. And I was like, before that, people drove to the library. Let's talk about how much carbon they burned to answer this question versus what it takes now. Come on. But that's just people looking for some reason why something's bad. That's not a logical perspective. Totally. people looking for some reason why something's bad. That's not a logical perspective. What we should be looking at is the spectacular changes that are possible through this. And all the
Starting point is 00:27:13 problems, the insurmountable problems that we have with resources, with the environment, with cleaning up the ocean, climate change. There's so many problems that we have. We need this to solve all of everything else. And that's why we need President AI. If AI could make every scientific discovery, but we still had human presidents, do you think we'd be okay? No, because those creeps would still be pocketing money and they'd have offshore accounts. And it would always be a weird thing of corruption and how to mitigate that corruption, which is also one of the fascinating things about the current state of technology is that we're so much more aware of corruption. We're so much more. There's so much independent reporting and we're so much more cognizant of the actual problems that are in place. This is really great. One of the things that I've observed,
Starting point is 00:28:06 obviously many other people too, is corruption is such an incredible hindrance to getting anything done in a society to make it forward progress. And my worldview had been more US-centric when I was younger. And as I've just studied the world more and had to work in more places in the world, like it's amazing how much corruption there still is. But the shift to a technologically enabled world I think is a major force against it because everything is – it's harder to hide stuff. And I do think corruption in the world will keep trending down. Because of its exposure. Yeah. Through technology. trending down because of its exposure yeah through technology if i mean it comes at a cost and i think
Starting point is 00:28:49 the loss that like i am very worried about how far the surveillance state could go here but in a world where payments for example are no longer like bags of cash, but done somehow digitally. And somebody, even if you're using Bitcoin, can watch those flows. I think that's a corruption reducing thing. I agree, but I'm very worried about central bank digital currency and that being tied to a social credit score. Super against. Yeah, that scares the shit out of me super against and that the push to that is not that's not for the overall good of society that's for control yeah i i think like i mean there's many things that i'm disappointed that the u.s government has done recently but the the war on crypto which i think is a like we can this up. Like, we're going to control this and all that.
Starting point is 00:29:46 That's like, that's the thing that like makes me quite sad about the country. It makes me quite sad about the country, too. But then you also see with things like FTX, like, oh, this can get without regulation and without someone overseeing it. This can get really fucked. Yeah, I'm not anti-regulation like I think there's clearly a role for it and I also think FTX was like a sort of comically bad situation yeah we shouldn't like worst case scenario yeah yeah but it's a fun one like it's totally fun. I love that story.
Starting point is 00:30:25 I mean, you clearly. I really do. I love the fact that they were all doing drugs and having sex with each other. No, no. It had every part of the dramas. I mean, it's a gripping story because it had everything there. They did their taxes with, like, what was the program that they used? QuickBooks.
Starting point is 00:30:46 They're dealing with billions of dollars a year. I don't know why I think the word polycule is so funny. Polycule? That was what they like, when you call a relationship, like a poly but closed, like polyamorous molecule put together. Oh, I see. So they were like, this is our polycule. So there's nine of them and they're poly amongst them.
Starting point is 00:31:03 Or ten of them or whatever. Yeah, and you call that a polycule. And I thought that was the funny, like, that became like a meme in Silicon Valley for a while that I thought was hilarious. You clearly want enough regulation that that can't happen. But they're like. Well, I'm not against that happening. I'm against them doing what they did with the money. That's what I mean.
Starting point is 00:31:23 The polycule is kind of fun. Go for it. No, no. I mean, you want enough thing that like FTX can't lose all they did with the money. That's what I mean. The polycule is kind of fun. Go for it. No, no. I mean you want enough thing that like FTX can't lose all of its depositors' money. Yes. But I think there's an important point here, which is you have all of this other regulation that people – and it didn't keep us safe. And the basic thing, which was like, you know, let's do that, That was not all of the crypto stuff people were talking about. Yes.
Starting point is 00:31:48 I mean, the real fascinating crypto is Bitcoin. To me, I mean, that's the one that I think has the most likely possibility of becoming a universal viable currency. And it's, you know, it's limited in the amount that there can be it's you know you people mine it with their own it's like that to me is very fascinating and i love the fact that it's been implemented and that at least something like i've had uh andreas antinopoulos on the podcast and he's when he talks about it, he's living it. He's spending all of his money.
Starting point is 00:32:27 Everything he has paid is in Bitcoin. He pays his rent in Bitcoin. Everything he does is in Bitcoin. I helped start a project called WorldCoin a few years ago. And so I've gotten to learn more about the space. I'm excited about it for the same reasons. I'm excited about the space. I'm excited about it for the same reasons. I'm excited about Bitcoin too. But I think this idea that we have a global currency that is outside of the control of
Starting point is 00:32:52 any government is a super logical and important step on the tech tree. Yeah, agreed. I mean, why should the government control currency? I mean, the government should be dealing with all the pressing environmental, social, infrastructure issues, foreign policy issues, economic issues. The things that we need to be governed in order to have a peaceful and prosperous society that's equal and equitable. What do you think happens to money and currency after AGI? What do you think happens to money and currency after AGI? I've wondered about that because I feel like with money, especially when money goes digital, the bottleneck is access. If we get to a point where all information is just freely shared everywhere, there are no secrets, there are no boundaries, there are no borders. We're reading minds. We have complete access to all of the information of everything you've ever done, everything everyone's ever said.
Starting point is 00:33:52 There's no hidden secrets. What is money then? Money is this digital thing. Well, how can you possess it? How can you possess this digital thing if there is literally no bottleneck? There's no barriers to anyone accessing any information because essentially it's just ones and zeros. Yeah. I mean another way – I think the information frame makes sense. Another way is that like money is like a sort of way to trade labor or trade like a limited number of hard assets like land and houses and whatever um and if you think about a world where like intellectual labor is just readily available and super cheap then that's somehow very different i think there will always be goods
Starting point is 00:34:42 that we want to be scarce and expensive but it'll only be be goods that we want to be scarce and expensive, but it'll only be those goods that we want to be scarce and expensive that's in services that still are. And so money in a world like that, I think it's just a, it's a very curious idea. Yeah. It becomes a different thing. I mean, it's not a bag of gold in a leather pouch that you're carrying around. It's not going to do you much good. But then the question becomes, how is that money distributed? And how do we avoid some horrible Marxist society where there's one totalitarian government that just don't sit out?
Starting point is 00:35:15 That would be bad. I think you've got to like, my current best idea, and maybe there's something better, is I think you, like if we are right, a lot of reasons we could be wrong, but we are right that like a the agi systems of which there will be a few become the high order bits of sort of influence whatever in the world i think you do need like
Starting point is 00:35:37 not to just redistribute the money but the access so that people can make their own decisions about how to use it and how to govern it and if you've got one idea you get to do this if i've got one idea i get to do that and i have like rights to basically do whatever i want with my part of it and if i come up with better ideas than you i get rewarded but for that by whatever the society is or vice versa yeah you know the the hardlin, the people that are against like welfare and against any sort of universal basic income, UBI, what they're really concerned with is human nature, right? They become addicted to it. They become lazy. But isn't that a human biological and psychological bottleneck? And perhaps with the implementation of artificial intelligence combined with some sort of neural interface, whether it's external or internal, it seems like that's a problem that can be solved. That you can essentially, and this is where it gets really spooky, you can re-engineer the human biological system and you can remove all of these problems that people have that are essentially
Starting point is 00:37:02 problems that date back to human reward systems when we were tribal people, hunter-gatherer people, whether it's jealousy, lust, envy, all these variables that come into play when you're dealing with money and status and social status. If those are eliminated with technology, essentially we become a next version of what the human species is possible like look we're very very far removed from tribal brutal societies of cave people we all agree that this is a way better way to live it's it's a it's it's way safer you know we were like i was talking about this at my comedy club last night because we're because my wife was we were talking about um dna and my wife was saying that look everybody came from cave people which is kind of a fucked up thought that
Starting point is 00:37:56 everyone here is here because of cave people well that all that's still in our dna all that's still and these reward systems can be hijacked. And they can be hijacked by just giving people money. And like you don't have to work. You don't have to do anything. You don't have to have ambition. You'll just have money and just lay around and do drugs. That's the fear that people have of giving people free money. But if we can figure out how to literally engineer the human biological
Starting point is 00:38:29 vehicle and remove all those pitfalls, if we can enlighten people technologically, maybe enlighten is the wrong word, but advance the human species to the point where those are no longer dilemmas because those are easily solvable through coding. They're easily solvable through enhancing the human biological system, perhaps raising dopamine levels to the point where anger and fear and hate are impossible. They don't exist. and fear and hate are impossible. They don't exist. And if you just had everyone on Mali, how many wars would there be? There would be zero wars. I mean, I think if you could get everyone on Earth to all do Mali once on the same day,
Starting point is 00:39:16 that would be a tremendous thing. It would be. If you got everybody on Earth to do Mali every day, that would be a real loss. But what if they did a low dose of Mali where you just get to – where everybody greets people with love and affection and there's no longer a concern about competition. Instead, the concern is about the fascination of innovation and creation and creativity. Man, we could talk the rest of the time about this one topic.
Starting point is 00:39:45 It's so interesting i i i think if i could like push a button to like remove all human striving and conflict i wouldn't do it first of all like i think that's a very important part of our story and experience and and also i think we can see both from our own biological history and also from what we know about ai that very simple goal systems fitness functions reward models whatever you want to call it lead to incredibly impressive results. You know, if the biological imperative is survive and reproduce, look how far that has somehow gotten us as a society. All of this, all this stuff we have, all this technology, this building, whatever else. That got here through an extremely simple goal in a very complex environment leading to all of the richness and complexity of people fulfilling this biological imperative to some degree and wanting to impress each other.
Starting point is 00:41:02 So I think evolutionary fitness is a simple and unbelievably powerful idea. Now, could you carefully edit out every individual manifestation of that? Maybe, but I don't want to live in a society of drones where everybody is just sort of on Molly all the time either. That doesn't seem like the right answer like i want us to continue to strive i want us to continue to push back the frontier and go out and explore and i actually think something's already gotten a little off track in society about all of that and we, I don't know. I think like I'm,
Starting point is 00:41:47 I don't, I thought I'd be older by the time I felt like the old guy complaining about the youth. Um, but I think we've lost something and I think that we need more, uh, striving, maybe more risk-taking,
Starting point is 00:42:04 more like explorer spirit. What do you mean by you think we've lost something? I mean, here's like a version of it very much from my own lens. I was a startup investor for a long time. And it often was the case that the very best startup founders were in their early or mid-20s or late-20s, maybe even. And now they skew much older. And what I want to know is, in the world today, where are the super great 25-year-old old founders and there are a few it's not fair to say there are none but there are less than there were before and
Starting point is 00:42:49 i think that's bad for society at all levels i'm using like tech company founders is one example but like people who go off and create something new who push on a disagreeable or controversial idea, we need that to drive forward. We need that sort of spirit. We need people to be able to put out ideas and be wrong and not be ostracized from society for it or not have it be like something that they get canceled for or whatever. We need people to be able to take a risk in their career because they believe in some important scientific quest
Starting point is 00:43:28 that may not work out or may sound like really controversial or bad or whatever. Certainly when we started OpenAI and we were saying, we think this AGI thing is real and could be done, unlikely but so important if it happens, and all of the older scientists in our field were saying, those people are irresponsible. You shouldn't talk about AGI. That's like, they're selling a scam or they're being reckless and it's going to lead to an AGI winter. We said we believed. We said at the time we knew it was unlikely, but it was an important quest
Starting point is 00:44:03 and we were going to go after it and fuck of like fuck the haters that's important to a society what do you think is the origin like what why do you think there are less young people that are doing those kind of things now as opposed to a decade or two ago? I am so interested in that topic. Um, I'm tempted to blame the education system, but I sure that I think that like interacts with society in all of these strange ways. Um, it's funny. There was this like thing all over my Twitter feed recently trying to talk about like what, you recently trying to talk about what caused the drop in testosterone in American men over the last few decades. And no one was like, this is a symptom, not a cause. And everyone was like, oh, it's the microplastics. It's the birth control pills. It's the whatever. It's the whatever. It's the whatever.
Starting point is 00:45:08 the whatever and i think this is like not at all the most important piece of this topic but it was just interesting to me sociologically that there was there was only talk about it being about what what caused it not about it being an effect of some sort of change in society. But isn't what caused it, well, there's biological reasons why, like when we talk about the phthalates and microplastic pesticides, environmental factors, those are real. Totally. And I don't like, again, I'm so far out of my depth and expertise here. This was, it was just interesting to me that the only talk was about like biological factors and not that somehow society can have some sort of effect well society most certainly has an effect do you know what the answer to this is i i don't i mean i i've had a podcast with dr shanna swan who wrote the book countdown and that is all about the introduction of petrochemical products and the correlating drop in testosterone, rise in miscarriages, the fact that these are ubiquitous endocrine disruptors that when they do blood tests on people, they find some insane number.
Starting point is 00:46:19 It's like 90 plus percent of people have phthalates in their system. I appreciate the metal cups. Yeah. We, we, we try to mitigate it as much as possible, but I mean, you're getting it.
Starting point is 00:46:30 If you're microwaving food, you're, you're fucking getting it. You're getting, you're just getting it. You're getting, if you eat processed food, you're getting it.
Starting point is 00:46:36 You're getting a certain amount of microplastics in your diet. And estimates have been that it's as high as a credit card of microplastics per week in your body. You consume a credit card of that a week. Whoa. The real concern is with mammals because the introductions, when they've done studies with mammals, when they've introduced phthalates into their body,
Starting point is 00:46:58 there's a correlating... One thing that happens is these animals, their taints shrink. Like the taint of the mammal, when you look at males, it's 50% to 100% larger than the females. With the introduction of phthalates on the males, the taints start shrinking, the penises shrink, the testicles shrink, sperm count shrinks. So we know there's a direct biological connection between these chemicals and how they interact with bodies. So that's a real one. And it's also the amount of petrochemical products that we have, the amount of plastics that we use.
Starting point is 00:47:45 It is such an integral part of our culture and our society, our civilization. It's everywhere. new advanced species, wouldn't one of the very best ways be to get rid of one of the things that causes the most problems, which is testosterone. We need testosterone. We need aggressive men and protectors, but why do we need them? We need them because there's other aggressive men that are evil, right? So we need protectors from ourselves. We need the good, strong people to protect us from the bad, strong people. But if we're in the process of integrating with technology, if technology is an inescapable part of our life, if it is everywhere, you're using it, you have the internet of everything that's in your microwave and your television, your computers, everything you use. As time goes on, that will be more and more a part of your life. And as these plastics are introduced into the human biological system, you're seeing a
Starting point is 00:48:58 feminization of the males of the species. You're seeing a downfall in birth rate. You're seeing all these correlating factors that would sort of lead us to become this more peaceful, less violent, less aggressive, less ego-driven thing. Which the world is definitely becoming over time. Which the world is definitely becoming over time. And I'm all for less violence, obviously. But I don't... Look, obviously testosterone has many great things to say for it and some bad tendencies too. But I don't think a world, if we leave that out of the equation and just say like a world that has a spirit that, you know, we're going to defend ourselves, we're going to
Starting point is 00:49:55 find a way to like protect ourselves and our tribe and our society into this future, which you can get with lots of other ways. I think that's an important impulse. More than that, though, what I meant is about, if we go back to the issue of where are the young founders, why don't we have more of those? And I don't think it's just the tech startup industry. I think you could say that about like young scientists or many other categories. Those are maybe just the ones that I know the best. In a world with any amount of technology, I still think we've got to – it is our destiny in some sense to stay on this curve. And we still need to go figure out what's next and after the next hill and after the next hill.
Starting point is 00:50:54 And it would be, my perception is that there is some long-term societal change happening here. And I think it makes us less happy too. Right. It may make us less happy. But what I'm saying is if the human species does integrate with technology, wouldn't a great way to facilitate that to be to kind of feminize the primal apes and to sort of downplay the role. You mean like should the AGI take phthalates into the world?
Starting point is 00:51:30 I don't know if it's AGI. I mean maybe it's just an inevitable consequence of technology because especially the type of technology that we use, which does have so much plastic in it, and then on top of that the technology that's involved in food systems, preservatives, all these different things that we use to make sure that people don't starve to death. We've made incredible strides in that. There are very few people in this country that starve to death. It's not a primary issue, but violence is a primary issue. But our concerns about violence and our concerns about testosterone and strong men and powerful people is only because we need to protect against we need to protect against
Starting point is 00:52:14 other same things is that really the only reason sure i mean how many like incredibly violent women are out there running gangs no no that part for sure. Really not very many. What I meant more is, is that the only reason that society values like strong masculinity? Yeah, I think so. I think it's a biological imperative, right? And I think that biological imperative is because we used to have to defend against incoming tribes and predators and animals. And we needed someone who was stronger than most to defend the rest. And that's the concept of the military. That's why Navy SEAL training is so difficult.
Starting point is 00:52:53 We want the strongest of the strong to be at the tip of the spear. But that's only because there's people like that out there that are bad. If artificial general intelligence and the implementation of some sort of a device that changes the biological structure of human beings to the point where that is no longer a concern. Like if you are me and I am you and I know this because of technology, violence is impossible. Yeah. Look, by the time, if this goes all the way down the sci-fi path and we're all like merged into this one single like planetary universal whatever consciousness, then yes, you don't. You don't need testosterone. Need testosterone.
Starting point is 00:53:34 But you still – Especially if we can reproduce through other methods. Like this is the alien hypothesis, right? Like why do they look so spindly and without any gender and they, you know, when they have these big heads and tiny mouths. They don't need physical strength. They don't need physical strength. They have some sort of telepathic way of communicating. They probably don't need sounds with their mouths.
Starting point is 00:53:53 And they don't need this urge that we have to conquer and to spread our DNA. Like, that's so much of what people do is these reward systems that were established when we were territorial apes there's a question to me about how much you can ever get rid of rid of that if you make an agi and it decides actually we don't need to expand we don't need more territory we're just like happy we at this point you me it the whole thing all together, all merged in. We're happy here on Earth. We don't need to get any bigger. We don't need to reproduce.
Starting point is 00:54:32 We don't need to grow. We're just going to sit here and run. A, that sounds like a boring life. I don't agree with that. I don't agree that that would be the logical conclusion. I think the logical conclusion would be they would look for problems and frontiers that are insurmountable to our current existence, like intergalactic communication and transportation. What happens when it meets another AGI the other galaxy over?
Starting point is 00:54:59 What happens when it meets an AGI that's a million years more advanced? Or that. Like, what does that look like? Yeah. That's what I that look like? Yeah. That's what I've often wondered if we are. I call ourselves the biological caterpillars that create the electronic butterfly. That we're making a cocoon right now and we don't even know what we're doing. And I think it's also tied into consumerism.
Starting point is 00:55:20 Because what does consumerism do? Consumerism facilitates the creation of newer and better things because you always want the newest, latest, greatest. So you have more advanced technology and automobiles and computers and cell phones and all of these different things, including medical science. That's all for sure true. that's all for sure true uh the thing i was like reflecting on as you were saying that is i don't think i i'm not as optimistic that we can or even should overcome our biological base to the degree that i think you think we can and you know to even go back one further level like i I think you think we can. And, you know, to even go back one further level, like I think society is happiest
Starting point is 00:56:09 where there's like roles for strong femininity and strong masculinity in the same people and in different people. And I don't like, and I don't think a lot of these like deep-seated things are gonna be able to get pushed aside very easily and still have a system that works like sure we can't really think about what if there were consciousness in a machine someday or whatever what that would be like um and maybe maybe i'm just like thinking too small-mindedly but i think there is something
Starting point is 00:56:50 about us that has worked in a super deep way and it took evolution a lot of search space to get here but i wouldn't discount it too easily but don't you think that cave people would probably have those same logical conclusions about life and sedentary lifestyle and sitting in front of a computer and not interacting with each other except through text uh well i mean isn't that like what you're saying is correct how different do you think our motivations are today and kind of what really brings us genuine joy and how we're wired at some deep level differently than cave people? Clearly, lots of other things have changed. We've got much better tools. But how different do you think it really is?
Starting point is 00:57:37 I think that's the problem is that genetically at the base level, there's not much difference. And that these reward systems are all there. We interact with all of them, whether it's ego, lust, passion, fury, anger, jealousy, all these different things. And you think will be some people will upload and edit those out. Yes. Yeah. I think that our concern with losing this aspect of what it means to be a person, like the idea that we should always have conflict and struggle because conflict and struggle is how we facilitate progress, which is true, right?
Starting point is 00:58:18 And combating evil is how the good gets greater and stronger if the good wins. the good gets greater and stronger if the good wins. But my concern is that that is all predicated on the idea that the biological system that we have right now is correct and optimal. And I think one of the things that we're dealing with, with the heightened states of depression and anxiety and the lack of meaning and existential angst that people experience. A lot of that is because the biological reality of being a human animal doesn't really integrate that well with this world that we've created. That's for sure. Yeah. And I wonder if the solution to that is not find ways to find meaning with the biological vessel that you've been given, but rather engineer those aspects that are problematic out of the system to create a truly enlightened being. Like one of the things, if you ask someone today,
Starting point is 00:59:32 what are the odds that in three years there will be no war in the world? That's zero. Like nobody thinks no, there's never been a time in human history where we haven't had war. If you had to say, what is our number one problem as a species? Well, I would say our number one problem is war. Our number one problem is this idea that it's okay to send massive groups of people who don't know each other to go murder massive groups of people that are somehow opposed because of the government, because of lines in the sand and territory. That's clearly an insane thing. It's an insane thing. How do you get rid of that? Well, one of the ways you get rid of that is to completely engineer out all the human reward systems that pertain to the acquisition of resources.
Starting point is 01:00:12 So what's left at that point? Well, we're a new thing. I think we've become a new thing. And what does that thing do want? I think that new thing would probably want to interact with other new things that are even more advanced than it. would probably want to interact with other new things that are even more advanced than it. I do believe that scientific curiosity can drive quite – that can be a great frontier for a long time. Yeah. I think it can be a great frontier for a long time as well.
Starting point is 01:00:47 I just wonder if what we're seeing with the drop in testosterone, because of microplastics, which sort of just snuck up on us... we didn't even know that it was an issue until people started studying it. How certain is that at this point, that that's what's happening? I don't know. I'm going to go study after this. It's a very good question. Dr. Shanna Swan believes that it's the primary driving factor of the sort of drop in testosterone and all the miscarriage issues and low birth weights. All those things seem to have a direct... there seems to be a direct environmental factor. I'm sure there's other factors too. The drop in testosterone, I mean, it's been shown that you can increase males' testosterone through resistance training; there's certain things you can do.
Starting point is 01:01:26 Like one of the big ones they found through a study in Japan is cold water immersion before exercise. Yeah. It radically increases testosterone. So cold water immersion and then exercise post that. Yeah. I don't know. Let's see if we can find that. But it's a fascinating field of study, but I think it has something to do with resilience and resistance and the fact that your body has to combat this external factor that's very extreme that causes the body to go into this state of preservation and the implementation of cold shock proteins and the reduction of inflammation, which also enhances
Starting point is 01:02:06 the body's endocrine system. But then on top of that, this imperative that you have to become more resilient to survive this external factor that you've introduced into your life every single day. So there's ways, obviously, that you can make a human being more robust. You know, we know that we can do that through strength training, and all that stuff actually does raise testosterone. Your diet can raise testosterone, and a poor diet will lower it and will hinder your endocrine system, will hinder your ability to produce growth hormone, melatonin, all these different factors. That seems to be something that we can fix, or at least mitigate, with decisions and choices and effort. But the fact that these petrochemical products... like, there's a graph that Dr. Shanna Swan has in her book that shows, during the 1950s when they start using petrochemical products in everything, microwave plastic, saran wrap, all this different stuff, there's
Starting point is 01:03:11 a direct correlation between the implementation and the dip. And it all seems to line up. That seems to be a primary factor. Does that have an equivalent impact on, like, estrogen-related hormones? That's a good question. Some of them, actually... I know some of these chemicals that they're talking about actually increase estrogen in men. I don't know, but I do know that it increases miscarriages, so I just think it's overall disruptive to the human body. Definitely a societal-wide disruption of the endocrine system in a short period of time.
Starting point is 01:03:51 It seems like just bad and difficult to wrap our heads around. And then pollutants and environmental toxins on top of the pesticides and herbicides and all these other things and microplastics. There's a lot of factors that are leading our systems to not work well. But I just really wonder if this – are we just clinging on to this monkey body? Are we deciding – I like my monkey body. I do too. Listen, I love it.
Starting point is 01:04:21 But I'm also... I try to be very objective. And when I objectively look at it, in terms of, like, if you imagine a world where we couldn't just text someone or tweet something mean at someone, because you would literally feel what they feel when you put that energy out there, and you would be repulsed. And then violence: if you were committing violence on someone, you would literally feel the reaction of that violence in your own being. And you would also have no motivation for violence if we had no aggressive tendencies, no primal chimpanzee tendencies. You know, it's true that violence in the world has obviously gone down a lot over the decades, but emotional violence is up
Starting point is 01:05:25 a lot, and the internet has been horrible for that. Yeah. Like, I'm not gonna walk over there and punch you, because you look like a big strong guy, you're gonna punch me back, and also there's a societal convention not to do that. But if I didn't know you, I might, like, send a mean tweet about you, and I feel nothing on that. Yeah. And clearly that has become like a mega epidemic in society that we did not evolve the biological constraints on somehow. Yeah. And I'm actually very worried about how much that's already destabilized us and made us all miserable. It's certainly accentuated it. It's exacerbated all of our problems. I mean, if you read Jonathan Haidt's book, The Coddling of the American Mind, have you read it? Great book.
Starting point is 01:06:14 Yeah, it's a great book. And it's very damaging to women, particularly young girls. Young girls growing up, there's a direct correlation between the invention of social media, the introduction of the iPhone, and self-harm, suicide, online bullying. You know, like, people have always talked shit about people when no one's around. I think it's... The fact that they're doing it now openly to harm people. Horrible, obviously. I think it's super damaging to men too.
Starting point is 01:06:40 Maybe they just like talk about it less. But I don't think any of us are like set up for this. No, no one's set up for it. And, you know, I think famous people know that more than anyone. We all get used to it. Yeah, you just get numb to it. And or if you're wise, you don't engage.
Starting point is 01:06:54 You know, I don't engage. I don't even have any apps on my new phone. Yeah. I got a new phone and I decided, okay, nothing. That's really smart. No Twitter. So I have a separate phone that, if I have to post something, I pick up.
Starting point is 01:07:07 But all I get on my new phone is text messages. And is that more just to keep your mind pure and unobstructed? To not tempt myself. Do you know how many fucking times I've got up to go to the bathroom first thing in the morning and spent an hour just sitting on the toilet scrolling through Instagram? Like, for nothing. It does zero for me, and there's this thought that I'm gonna get something out of it. I was thinking actually just yesterday about how, you know, we all have talked for so long about how these algorithmic feeds are going to manipulate us in these big ways, and that will happen. But in the small ways already, where, like, scrolling Instagram is not even that fulfilling. Like, you finish that hour and you're like,
Starting point is 01:07:51 I know that was a waste of my time, but it was like over the threshold where you couldn't quite, it's hard to put the phone down. Right. You just hoping that the next one's going to be interesting. And every now and then, the problem is every like 30th or 40th reel that I click on is wild. It's great. I wonder
Starting point is 01:08:10 by the way, if that's more powerful than if everyone was wild, if everyone was great. Sure. You know, it's like the slot machine effect. You have to mine for gold. Yeah. You don't just go out and pick it like daisies. So that's, like, out in the field everywhere, if the algorithm is, like, intentionally feeding you some shit along the way. Yeah, well,
Starting point is 01:08:27 there's just a lot of shit out there, unfortunately. But it's all... it's just, in terms... uh, you know, I was talking to Sean O'Malley, who's this UFC fighter who, you know, obviously has a very strong mind, really interesting guy. But one of the things that Sean said is, like, I get this low-level anxiety. Yeah. From scrolling through things. And I don't know why, like, what is that? And I think it's part of the logical mind realizes this is a massive waste of your resources. I also deleted a bunch of that stuff off my phone because I just didn't have the self-control. I mean, I had the self-control to delete it, but, like, not to stop once I was scrolling. Right. Yeah. And so I think we're just, like... yeah, we're getting attention hacked in some ways. There's some good to it too, but we don't
Starting point is 01:09:15 yet have the stuff in place the tools the societal norms whatever to modulate it well. Right. And we're not designed for it. Certainly. This is a completely new technology that, again, hijacks our human reward systems and hijacks all of the checks and balances that are in place for communication, which historically has been one-on-one. Historically, communication has been one person to another. And when people write letters to each other, it's generally things like if someone writes a love letter or they miss you, they're writing this thing where they're kind of exposing a thing that maybe they have a difficulty in expressing in front of you. And generally, unless the person was a psycho, they're not hateful letters.
Starting point is 01:10:05 Whereas the ability to just communicate, fuck that guy, I hope he gets hit by a bus, is so simple and easy, and you don't experience any of it. Twitter seems to be particularly horrible for this, the way the mechanics work. Yeah. It really rewards, in ways that I don't think anybody fully understands, something that taps into human psychology. Yeah. But that's kind of, like, that's how you get engagement, that's how you get followers, that's how you get, like, you know, the dopamine hits or whatever. And, like, the people who I know that spend all day on Twitter, more of them are unhappy about it than happy. Oh, yeah.
Starting point is 01:10:51 They're the most unhappy. I mean, there's quite a few people that I follow that I only follow because they're crazy. And then I'll go and check in on them and see what the fuck they're tweeting about. And some of them are on there 8, 10 hours a day. I'll see tweets all day long, and I know that person cannot be happy. They're unhappy and they cannot stop. You can't stop. And it seems like it's their life. And they get meaning out of it in terms of reinforcement, you know, they get short-term... Yeah. I
Starting point is 01:11:26 think maybe each day you go to bed feeling like you accomplished something and got your dopamine, and at the end of each decade you probably were like, where'd that decade go? Yeah. I was talking to a friend of mine who was having a real problem with it. He's saying he would be literally walking down the street and he'd have to check his phone to see who's replying, and he wasn't even looking where he's walking. He was just, like, caught up in the anxiety of these exchanges. And it's not because of the nice things people say. No, no, no, no. It's all...
Starting point is 01:11:50 And with him, he was recognizing that he was dunking on people and then seeing people respond to the dunking. Yeah. I stopped doing that a long time ago. I stopped interacting with people on Twitter in a negative way. I just won't do it. Even if I disagree with someone, I'll say something as peacefully as possible. I have more of an internet troll streak than I would like to admit. And so I try to just not
Starting point is 01:12:13 give myself too much of the temptation, but I slip up sometimes. Yeah. It's so tempting. Totally. It's so tempting and it's fun. It's fun to say something shitty. I mean, again, whatever this biological system we were talking about earlier is, it gets a positive reward in the moment. There's a reaction. You know there's reactions.
Starting point is 01:12:32 You say something outrageous and someone's going to react, and that reaction is like energy, and there's all these other human beings engaging with your idea. But ultimately it's just not productive for most people, and psychologically it's just fraught with peril. There's just so much going on. I don't know anybody who engages
Starting point is 01:12:56 all day long that's happy. Certainly not. I don't... like, I think I've watched it... destroy is too strong of a word, but, like, knock off track the careers or lives or happiness or human relationships of people that are, like, good, smart, conscientious people. Yeah. Just, like, couldn't fight this demon. Yeah. Like it hacked them.
Starting point is 01:13:21 And COVID really accentuated that because people were alone and isolated. And that made it even worse because then they felt even better saying shitty things to people. I'm unhappy. I'm going to say even worse things about you. And then there was a psychological aspect of it, like the angst that came from being socially isolated and terrified about this invisible disease that's going to kill us all. And, you know, and so you have this like, and then you're interacting with people on Twitter and then you're caught up in that anxiety and you're doing it all day. And I know quite a few people, especially comedians, that really lost their minds
Starting point is 01:13:58 and lost the respect of their peers by doing that. I have a lot of sympathy for people who lost their minds during COVID, because what an unnatural thing for us all to go through. And the isolation was just brutal, but a lot of people did. And I don't think the internet, and particularly not the kind of, like, social dynamics of things like Twitter... I don't think that brought out anyone's best. No. Well, I mean, some people, I think, if they're not inclined to be shitty to people... I think some people did seek comfort and they did interact with people in positive ways. I see. There's plenty of positive. I think the thing is that the negative interactions are so much more impactful. Yeah. Look, I think
Starting point is 01:14:42 there are a lot of people who use these systems for wonderful things. I didn't mean to imply that's not the case, but that's not what drives people's emotions after getting off the platform at the end of the day. Right. Right. And it's also probably not... If you looked at a pie chart of the amount of interactions on Twitter, I would say a lot of them are shitting on people and being angry about things. How many of the people that you know that use Twitter those eight or 10 hours a day
Starting point is 01:15:08 are just saying wonderful things about other people all day versus the virulent? Very few. Yeah. Very few. I don't know any of them. I know. But then again, I wonder with the implementation of some new technology that makes communication a very different thing than what we're currently doing. What we're doing now with communication is less immersive than communicating one-on-one.
Starting point is 01:15:35 You and I are talking. We're looking into each other's eyes. We're getting social cues. We're smiling at each other. We're laughing.
Starting point is 01:15:44 It's a very natural way to talk. I wonder if, through the implementation of technology, it becomes even more immersive than a one-on-one conversation, where you know, when someone says something, everything about that person's memory, that person's life, that person's history, their education, how it comes out of their mind, how their mind interacts with your mind, and you see them, you really see them. I wonder if what we're experiencing now is just like the first time people invented guns, they just started shooting at things, you know? Yeah. If you can, like, feel what I feel when you say something mean to me or nice to me, like, that's clearly going to change what you decide to say. Yes. Yeah. Unless you're a psycho. Unless you're a psycho. And then, what causes someone to be a psycho? And can that be engineered out? Imagine what we're talking about when we're dealing with the human mind. We're dealing with various diseases, bipolar, schizophrenia.
Starting point is 01:17:24 A world where we can find the root cause of those things and through coding and some sort of an implementation of technology that elevates dopamine and serotonin and does some things to people that eliminates all of those problems and allows people to communicate in a very pure way. It sounds great. It sounds great, but you're not going to have any rock and roll. You know, stand-up comedy will die. You'll have no violent movies. You know, there's a lot of things that are going to go out the window. But maybe that is also part of the process of our evolution to the next stage of existence. Maybe.
Starting point is 01:17:46 I feel genuinely confused on this. Well, I think you should be. I mean, to be- We're going to find out. Yeah. I mean, to be sure how it's going to- That's insane. But I don't even have like-
Starting point is 01:17:56 Hubris beyond belief. Right. I mean, when did OpenAI... when did you first start this project? Like the very beginning? End of 2015, early 2016. And when you initially started this project, what kind of timeline did you have in mind? And has it stayed on that timeline, or is it just wildly out of control? I remember talking with John Schulman, one of our co-founders, early on, and he was like, yeah, I think it's gonna be about a 15-year project. And I was like, yeah,
Starting point is 01:18:31 it sounds about right to me. And I've always sort of thought since then... now, I no longer think of, like, AGI as quite the endpoint. But to get to the point where we, like, accomplish the thing we set out to accomplish, you know, that would take us to, like, 2030, 2031. That has felt to me, all the way through, like kind of a reasonable estimate with huge error bars, and I kind of think we're on the trajectory I sort of would have assumed. And what did you think the impact on society would be? Like, did you... when you first started doing this and you said, okay, if we are successful and we do create some massively advanced AGI, what is the implementation and what is the impact on society? Did you sit there and have, like, a graph? Like you had the pros on one side, the cons on the other.
Starting point is 01:19:34 Did you just sort of abstractly consider? Well, we definitely talked a lot about the cons. You know, many of us were super worried about, and still are, about safety and alignment. And if we build these systems, we can all see the great future. That's easy to imagine. But if something goes horribly wrong, it's like really horribly wrong. And so there was a lot of discussion about, and really a big part of the founding spirit of this is like, how are we going to solve this safety problem? What does that even mean?
Starting point is 01:20:05 One of the things that we believe is that the greatest minds in the world cannot sit there and solve that in a vacuum. You've got to have contact with reality. You've got to see where the technology goes. Practice plays out in a stranger way than theory. And that's certainly proven true for us. But we had a long list of, well, I don't know if we had a long list of cons. We had a very intense list of cons because, you know, there's like all of the last decades of sci-fi telling you about how this goes wrong
Starting point is 01:20:35 and why you're supposed to shoot me right now. Yeah. I'm sure you've seen the John Connor chat GPT memes. I haven't. What is it? It's like, you know, John Connor from the Terminator, the kid looking at you when you open up chat GPT. Yeah. So that stuff we were like very clear in our minds on. Now, I think we understand there's a lot of work to do, but we understand more about how to make AI safe in the – AI safety gets overloaded. Like, you know, does it mean don't say something people find offensive or does it mean don't destroy all of humanity or some continuum?
Starting point is 01:21:17 And I think the word has, like, gotten overloaded. gotten overloaded. But in terms of the like, not destroy all of humanity version of it, we have a lot of work to do. But I think we have finally more ideas about what can work. And given the way the systems are going, we have a lot more opportunities available to us to solve it than I thought we would have given the direction that we initially thought the technology was going to go. So that's good. On the positive side, we initially thought the technology was going to go. So that's good. On the positive side, the thing that I was most excited about then and remain most excited about now is what if this system can dramatically increase the rate of scientific knowledge in society? That is a, I think that kind of like all real sustainable economic growth, the future getting better, progress in some sense, comes from increased scientific and technological capacity.
Starting point is 01:22:15 So we can solve all the problems. And if the AI can help us do that, that's always been the thing I've been most excited about. Well, it certainly seems like that is the greatest potential, greatest positive potential of AI. It is to solve a lot of the problems that human beings have had forever. A lot of the societal problems that seem to be, I mean, that's what I was talking about at an AI president. I'm kind of not joking because I feel like if something was hyper intelligent and aware of all the variables with no human bias and no incentives, no other than here's your program, the greater good for the community of the United States and the greater good for that community as it interacts with the rest of the world.
Starting point is 01:23:16 The elimination of these dictators, whether they're elected or non-elected, who impose their will on the population because they have a vested interest in protecting special interest groups and industry. I think as long as – the thing that I find scary when you say that is it does – it feels like it's humanity not in control. And I reflexively don't like that. But if it's instead like it is the collective will of humanity being expressed without the mistranslation and corrupting influences along the way, then I can see it. Is that possible?
Starting point is 01:23:52 It seems like it would be. It seems like if it was programmed in that regard to do the greater good for humanity and take into account the values of humanity, the needs of humanity. There's something about the phrase, do the greater good for humanity. I know. It's terrifying. It's very Orwellian. All of it is.
Starting point is 01:24:11 But also, so is artificial general intelligence. For sure. For sure. Open the door, Hal. I wish I had worked on something that was less morally fraught. But do you? Because it's really exciting. I mean, I cannot imagine a
Starting point is 01:24:25 cooler thing to work on. I feel unbelievably... I feel like the luckiest person on earth, but it is not... it's not on easy mode. Let's say that this is not life on easy mode. No, no, no, no. I mean, you are at the forefront of one of the most spectacular changes in human history. And I would say, no, more spectacular than the implementation of the internet. I think the implementation of the internet was the first baby steps of this, and that artificial general intelligence is the internet on steroids. It's the internet in hyperspace. What I would say is it's the next step, and there'll be more steps after, but it's our most exciting step yet.
Starting point is 01:25:13 Yeah. My wonder is, what are those next steps after? Isn't that so exciting to think about? It's very exciting. I think we're the last people. I really do. I think we're the last of the biological people, with all the biological problems. I think there's a very... And do you feel excited
Starting point is 01:25:32 about that? I just think that's just what it is. You're just fine with it? It is what it is. You know, I mean, I don't think you can control it at this point other than some massive natural disaster that resets us back to the Stone Age, which is also something we should be very concerned with. Because it seems like that happens a lot. We're not aware of it because the timeline of a human body is so small. The timeline of the human existence as a person is 100 years if you're lucky. But yet the timeline of the earth is billions of years. And if you look at how many times life on earth has been reset by comets slamming into the earth and just completely eliminating all technological
Starting point is 01:26:15 advancement, it seems like it's happened multiple times in recorded history. I do think... I always think we don't think about that quite enough. We talked about the simulation hypothesis earlier. It's had this big resurgence in the tech industry recently. One of the new takes on it as we get closer to AGI is that if our descendants were simulating us, their ancestors, the time they'd want to simulate again and again is right up to the lead-up to the creation of AGI. Yeah. So it seems very crazy.
Starting point is 01:26:49 We're living through this time, but it's not a coincidence at all. You know, this is the time that is after we had enough cell phones out in the world recording tons of video to train the video model of the world that's all being, like, jacked into us now via brain implants or whatever, and before everything goes really crazy with AGI. And it's also this interesting time to simulate: like, can we get through? Does the asteroid come right before we get there, for dramatic tension? Like, do we figure out how to make this safe? Do we figure out how to societally agree on it? So that's led to, like, a lot more people believing
Starting point is 01:27:19 in it than before, I think. Yeah, for sure. And I think this is just where it's going. I mean, I don't know if that's a good thing or a bad thing. It's just a thing, but it's certainly better to live now. I would not want to live in the 1800s and be in a covered wagon trying to make my way across the country. Yeah. We got the most exciting time in history yet. It's the best. It's the best, but it also has the most problems, the most social problems, the most awareness of social, environmental, infrastructure issues. We get to go solve them. Yeah. Um, and intuitively, I think I feel something somewhat different than you, which is I think humans in something close to this form are going to be around for a lot longer than... I don't think we're the last humans.
Starting point is 01:28:16 How long do you think we have? Like, longer than a time frame I can reason about. Really? Now, there may be... like, I could totally imagine a world where some people decide to merge and go off exploring the universe with AI, and there's a big universe out there. But, like, can I really imagine a world where, short of a natural disaster, there are not humans pretty similar to humans from today on Earth, doing human-like things, and the sort of spirit of humanity merged into these other things that are out there doing their thing in the universe? It's very hard for me to actually see that happening.
Starting point is 01:29:01 And maybe that means I'm, like, going to turn out to be a dinosaur and end up horribly wrong in this prediction, but I would say I feel it more over time as we make progress with AI, not less. Yeah, I don't feel that at all. I feel like we're done in, like, a few years. No, maybe a generation or two. It'll probably be a gradual change, like the wearing of clothes. You know, I don't think everybody wore clothes as soon as they invented clothes. I think it probably took a while when someone figured out shoes. I think that probably took a while when they figured out structures, doors, houses, cities, agriculture. All those things were slowly implemented over time and have now become everywhere.
Starting point is 01:29:40 And I think this is far more transformative. And is part of that because you don't think there will be an option for some people not to merge? Right. Just like there's not an option for some people to not have telephones anymore. I used to have friends like, I don't even have email. Those people don't exist anymore. They all have email. Everyone has a phone, at least a flip phone.
Starting point is 01:30:00 I know some people that just can't handle social media and all that jazz. They went to a flip phone. Good. I don't know if this is true or not, I've heard you can't, like, walk into an AT&T store anymore and still buy a flip phone. I heard they just changed. You can? Oh, really? Someone gave me this, but I don't know if it's... Verizon still has them. I was just there. They still have flip phones. I was like, I like it. I like this fucking little thing that you just call people on. And I always, like, romanticize about going to that, but my step was to go to a phone that has nothing on it but text messages, and that's been a few days.
Starting point is 01:30:30 Feeling good so far? Yeah, it's good. You know, I still have my other phone that I use for social media, but when I pick that motherfucker up, I start scrolling through YouTube and watching videos and scrolling through TikTok or Instagram... and I don't have TikTok, but I tried Threads for a little while, but I'm like, oh, it's like a ghost town, so I went right back to X. I live on a ranch during the weekends, and there's Wi-Fi in the house, no cell phone coverage anywhere else. And every week I forget how nice it is and what a change it is to go for a walk with no cell phone coverage. It's good for your mind for some reason. It's unbelievable for your mind.
Starting point is 01:31:17 And I think we have like so quickly lost something. Like out of service just doesn't happen. That doesn't even happen on airplanes anymore. You know, like, uh, but that like hours where your phone just cannot buzz. Yeah.
Starting point is 01:31:35 No text message either. Nothing. I think that's a really healthy thing. I dropped my phone once when I was in Lanai and I think it was the last time I dropped the phone. The phone was like, we're done. And it just started calling people randomly. Like it would just call people and I'd hang it up and call another person. I'd hang it up.
Starting point is 01:31:54 And I was showing my wife. I was like, look at this. This is crazy. It's just calling people. And so the phone was broken. And so I had to order a phone. We were on vacation for like eight days. And it took three days for Apple to get me a phone.
Starting point is 01:32:06 I bet you had a great three days. It was amazing. It was amazing. Because when I was hanging out with my family, I was totally present. There were no options. And I wasn't thinking about checking my phone because it didn't exist. I didn't have one. And there was an alleviation of, again, what Sean was talking about, that low
Starting point is 01:32:26 level of anxiety, this sort of... that you have when you always want to check your phone. Yeah. I think that thing, it's so bad. Um, we have not figured out yet... like, the technology has moved so fast, biology moves very slowly. We have not figured out how we're going to function in this society and get those occasional times when, you know, your phone is broken for three days. Yeah. Or you go for a walk with no service. But it's like, I very much feel like my phone controls me,
Starting point is 01:33:02 not the other way around. Uh-huh. And I hate it, but I haven't figured out what to do about it. Well, that's what I'm worried about with future technology, is that this, which was so unanticipated... if you can imagine going up to someone in 1984 and pointing to a phone and saying, one day that'll be in your pocket, it's going to ruin your life. Like, what? Like, yeah, one day people are going to be jerking off to that thing.
Starting point is 01:33:29 You're like, what? One day people are going to be watching people get murdered on Instagram. I've seen so many murders on Instagram over the last few months. Really? I've never seen one. It's been a bad timeline. Me and my friend Tom Segura, every morning we text each other the worst things that we find on Instagram. Why?
Starting point is 01:33:46 For fun. He's a comedian. We're both comedians. That's fun to you? Yeah. This is fucking... all right, just ridiculous. I mean, just crazy car accidents, people getting gored by bulls. Like, we try to top each other, so every day he's sending me the most... every day when I wake up and I check: Tom, fuck, what do you got? You know? Can you explain what's fun about that? Well, he's a comic and I'm a comic, and comics like chaos. We like ridiculous, outrageous shit that is just so far beyond the norm of what you experience in a regular day. And also the understanding of the wide spectrum of human behavior. If you're a nice person and you surround yourself with nice people, you very rarely see someone get shot. You very rarely see people get stabbed for no reason randomly on the street.
Starting point is 01:34:41 But on Instagram, you see that every day. And there's something about that which just reminds you, oh, we're crazy. Like, the human species, there's a certain percentage of us that are just off the rails and just out there
Starting point is 01:34:57 just causing chaos and jumping dirt bikes and landing on your neck, and all that stuff is out there. Even to hear that makes me physically, like... I know that happens, of course. And I know not looking at it doesn't make it not happen. Right. But it makes me so uncomfortable, and I'm happy to not watch.
Starting point is 01:35:18 Oh, yeah. It makes me uncomfortable too. But, yeah, we do it to each other every day. And it's not good. It's definitely not good. But it's also... I'm not going to stop. It's fun.
Starting point is 01:35:28 But why is it fun? It's fun because it's my friend Tom and we're both kind of the same in that way. We're just, just look at that. Look at this. Look at that.
Starting point is 01:35:35 Look at this. And it's just a thing we started doing a few months ago. It just can't stop. And Instagram has like learned that you do that so it just keeps
Starting point is 01:35:42 showing you more and more. Instagram knows. My search page is a mess when I go to the Discover page. It's like awful stuff happening to people. Oh, it's just crazy. But the thing is, it shows up in your feed too. That's what I don't understand about the algorithm. It knows you're fucked up, so it shows it in your feed.
Starting point is 01:35:58 Like even if they're not people I follow, but Instagram shows them to me anyway. They're not people I follow, but Instagram shows them to me anyway. I heard an interesting thing a few days ago about Instagram and the feed, which is if you use it at off hours when they have more processing capability available because less people are using it, you get better recommendations. So your feed will be better in like the middle of the night. What is better though? Doesn't your feed – More addictive to you or whatever. Right. So for me, better would be more murders, more animal attacks.
Starting point is 01:36:27 Sounds horrible. It's horrible. But it's just – it seems to know that's what I like. It seems to know that that's what I interact with. So it's just sending me that most of the time. Yeah. That probably has all kinds of crazy psychological effects. I'm sure.
Starting point is 01:36:45 Yeah, I'm sure that's also one of the reasons why I want to get rid of it and move away from it. Yeah, so maybe it went too far. I don't even know if it's too far. But what it is is it's showing me the darker regions of society, of civilization, of human behavior. But you think we're about to edit all that out? I wonder if that is a solution. I really do. Because I don't think it's outside the realm of possibility.
Starting point is 01:37:14 If we really truly can engineer that. Like one of the talks about Neuralink that's really promising is people with spinal cord issues, injuries, people that can't move their body, and being able to hotwire that where essentially it controls all these parts of your body that you couldn't control anymore. And so that would be an amazing thing for people that are injured, an amazing thing for people that are paralyzed, they have all sorts of neurological conditions.
Starting point is 01:37:46 That is probably one of the first... and that's what Elon's talked about as well, one of the first implementations: the restoration of sight, you know, enhanced cognitive function for people that have brain issues. That's tremendously exciting. Yeah. And like many other technologies, I don't think
Starting point is 01:38:06 we can stop neural interfaces, nor should we, because of the great good that's going to happen along the way. But I also don't think we know where it goes. It's Pandora's box for sure. And I think when we open it, we're not going to go back, just like we're not going to go back to no computers without some sort of natural disaster. The way... and I mean this as a great compliment... you are one of the most neutral people I have ever heard talk about the merge coming. You're just like, yeah, I think it's going to happen, you know, it'll be good in these ways, bad in these ways. But you seem, like, unbelievably neutral
Starting point is 01:38:57 corruption is just, and the influence of money is just a giant, terrible issue. But in terms of like social issues and in terms of the way human beings believe and think about things, I try to be as neutral as possible. Because I think the only way to really truly understand the way other people think about things is try to look at it through their mind. And if you have this inherent bias and this, you have this like very rigid view of what's good and bad and right and wrong. I don't think that serves you very well for understanding why people differ. So I try to be as neutral and as objective as possible when I look at anything. This is a skill that I've learned. This is not something I had in 2009 when I started this podcast. This podcast, I started just fucking around with friends and I had no idea what it was. I mean, there's no way I could have ever known. And, but also had no idea what it was going to do to me. And as far as the evolution of me as a human being, I am so much
Starting point is 01:40:02 nicer. I'm so much more aware of things. I'm so much more conscious of the way other people think and feel. I'm just a totally different person than I was in 2009, which is hard to recognize. It's hard to believe. That's really cool. But that is just an inevitable consequence of this unexpected education that I've received. Did the empathy kind of like come on linearly? Yes. That was not a... No, it just came on recognizing that the negative interactions on social media that I was doing,
Starting point is 01:40:39 they didn't help me. They didn't help the person. And then having compassion for this person that's fucked up or done something stupid... it's not good to just dunk on people. Like, there's no benefit there other than to give you some sort of social credit and get a bunch of likes. It didn't make me feel good. Like, that's not good. And then also a lot of psychedelics, a ton of psychedelic experiences from 2009 on, and with every one, a greater understanding of the impact. Like, I had one recently, and the overwhelming message that I was getting through it was that everything I say and do ripples off into all the people that I interact with. And then if I'm not doing something with
Starting point is 01:41:27 at least the goal of overall good or overall understanding, then I'm doing bad, and that bad is a real thing, as much as you try to ignore it because you don't interface with it instantly. You're still creating unnecessary negativity, and I should avoid that as much as possible. It was like an overwhelming message that this psychedelic experience was giving me. And I took it because I was just particularly anxious that day about the state of the world, particularly anxious about Ukraine and Russia and China and the political system that we have in this country and this incredibly polarizing way that the left and the right engage with each other. And God, it just seems so tormented. And so I was just... some days I just think too much about it.
Starting point is 01:42:32 I'm like, I need something to crack me out of this. So I took these psychedelics. Are you surprised psychedelic therapy has not made, from what you thought would happen in, say, the early 2010s until now, are you surprised it has not made more progress sort of on a path to legalization as a medical treatment? No. No, I'm not, because there's a lot of people that don't want it to be in place, and those people have tremendous power over our medical system and over our regulatory system. And those people have also not experienced these psychedelics. There's very few people that have experienced profound psychedelic experiences that don't think there's an overall good for those things.
Starting point is 01:43:10 So the problem is you're having these laws and these rules implemented by people who are completely ignorant about the positive effects of these things. And if you know the history of psychedelic prohibition in this country, it all took place in 1970, and it was really to stop the civil rights movement, and it was really to stop the anti-war movement. They tried to find a way to stop all these things that these people were doing that were causing them to think in these very different ways, to tune in, turn on, drop out. They just wanted to put a fucking halt to that. What better way than to
Starting point is 01:43:50 lock everyone who participates in that in cages, find the people who are producing it, lock them in cages, put them in jail for the rest of their life, make sure it's illegal, arrest people, put the busts on television, make sure that people are aware. And then there's also... you connect it to drugs that are inherently dangerous for society and detrimental, the fentanyl crisis, you know, the crack cocaine crisis that we experienced in the nineties. Like, all of those things, they're under the blanket of drugs. Psychedelic drugs are also talked about like drugs, even though they have these profound spiritual and psychological changes that they, you know,
Starting point is 01:44:30 imbibe the person. I remember when I was in elementary school and I was in like drug education, they talked about, you know, marijuana is really bad because it's a gateway to these other things and there's this bad one, that bad one, heroin, whatever. And the very end of the line,
Starting point is 01:44:42 the worst possible thing is LSD. Did you take LSD and go and go oh they're lying psychedelic therapy was definitely one of the like most important things in my life and i i assumed given how powerful it was for me like i you know i struggled with like all kinds of anxiety and other negative things. And to watch all of that go away. Like that. I traveled to another country for a week, did a few things, came back. Totally different person. And I was like, I've been lied to my whole life.
Starting point is 01:45:18 I'm so grateful that this happened to me now. Talked to a bunch of other people, all similar experiences. This was a while ago. this happened to me now. Talked to a bunch of other people, all similar experiences. I assumed, this was a while ago, I assumed it was, and I was like, you know, very interested in what was happening in the US. I was like, particularly like looking at where MDMA and psilocybin were on the path. And I was like, all right, this is going to get through like this is, and this is going to like change the mental health of a lot of people in a really positive way. And I am surprised we have not made faster progress there, but I'm still optimistic we will. Well, we have made so much progress from the time of the 1990s. In the 1990s, you never heard about psychedelic retreats.
Starting point is 01:45:59 You never heard about people taking these vacations. You never heard about people getting together in groups and doing these things and coming back with these profound experiences that they relayed to other people and literally seeing people change, seeing who they are change, seeing people become less, less selfish, less toxic, less mean, you know, you, and, and more empathetic and more understanding. Yeah. It's, I mean, I can only talk about it from a personal experience. It's been a radical change in my life as well as, again, having all these conversations with different people. I feel so fortunate to be able to do this, that I've had so many different conversations with so many different people that think so differently.
Starting point is 01:46:41 And so many exceptional people that have accomplished so many incredible things, and you get to sort of understand the way their mind works and the way they see the world, the way they interface with things. It's awesome. It's pretty fucking cool. And that is one of the cooler things about being a human, that you can find a way to mitigate all the negative aspects of the monkey body. There are tools that are in place. But unfortunately, in this very prohibitive society, this society of prohibition, we're denied those, and we're denied ones that have never killed anybody, which is really bizarre when OxyContin can still be prescribed. What's the deal with why we can't get these medicines that have transformed people's lives more available?
Starting point is 01:47:34 What's the deal with why we can't stop the opioid crisis? Or, like, fentanyl seems like just an unbelievable crisis for San Francisco. You remember in the beginning of the conversation where you said that AI will do a lot of good, overall good, but also not zero harm? If we legalize drugs, all drugs, that would do the same thing. Would you advocate to legalize all drugs? It's a very complicated question, because I think you're going to have a lot of addicts that wouldn't be addicts. You're going to have a lot of people's lives destroyed because it's legal. There's a lot of people that may not be psychologically capable of handling
Starting point is 01:48:14 things. Maybe they already have... like, that's the thing about psychedelics. They do not ever recommend them for people that have a slippery grasp on reality as it is. People that are struggling, people that are already on a bunch of medications that allow them to just keep a steady state of existence in the normal world. If you just fucking bombard them with psilocybin, who knows what kind of an effect that's going to have, and whether or not they're psychologically too fragile to recover from that. I mean, there's many, many stories of people taking too much acid and never coming back. Yeah. Yeah, these are, like...
Starting point is 01:48:51 Powerful doesn't seem to, like, begin to cover it. Right. Yeah. But there's also... what is it about humans that we're constantly looking to perturb our normal state of consciousness? Constantly. Whether it's... we're both drinking coffee, you know, people smoke cigarettes, they take Adderall, they do all sorts of different things to change and enhance their normal state of consciousness. It seems like, whether it's meditation or yoga, they're always doing something to try to get out of their own way, or get in their own way, or distract themselves from the pain of existence. And it seems like a normal part of humans. And even monkeys... like, vervet monkeys get addicted to alcohol. They get addicted to fermented fruits and alcohol, and they become drunks and alcoholics... What do you think is the deep lesson there? Well, we're not happy, exactly.
Starting point is 01:49:48 You know, and then some things can make you happy, sort of. For like a couple of drinks makes you so happy for a little bit until you're an alcoholic, until you destroy your liver, until you crash your car, until you're, you know, you're involved in some sort of a violent encounter that you would never be involved with if you weren't drunk. You know, I love caffeine, which clearly is a drug. Alcohol, like, I like, but I often am like, yeah, this is like, you know, this is like dulling me and I wish I hadn't had this drink.
Starting point is 01:50:21 And then other stuff, like, I mostly would choose to avoid. But that's because you're smart and you're probably aware of the pros and cons. And you're also probably aware of how it affects you and what's doing good for you and what is detrimental to you. But that's a decision that you can make as an informed human being that you're not allowed to make if everything's illegal. Right. Yeah. Right. And also, when things are illegal, criminals sell those things, because you're not tempering the desire by making it illegal, you're just making access to it much more complicated. What I was going to say is, if fentanyl is really great, I don't want to know. Apparently it is. Apparently it is. Yeah. Peter Berg was on the podcast, and he produced that Painkiller series for Netflix, the docudrama about the Sackler family. It's an
Starting point is 01:51:16 amazing piece. But he said that he took Oxycontin once recreationally And he was like, oh my God, it's amazing. He's like, keep this away from me. It feels so good. Yeah. And that's part of the problem is that, yeah, it will wreck your life. Yeah, it will capture you. But it's just so unbelievable. But the feeling, like how did Lenny Bruce describe it?
Starting point is 01:51:40 I think he described heroin as getting a warm hug from God. I think the feeling that it gives you is probably pretty spectacular. I don't know if legalizing that is going to solve the problems. But I do know that another problem that we're not paying attention to is the rise of the cartels, and the fact that right across our border, where you can walk, there are these enormous, enormous organizations that make who knows how much money, untold, incalculable amounts of money, selling drugs and bringing them into this country. And one of the things they do is they put fentanyl in everything
Starting point is 01:52:20 to make things stronger. And they do it for, like, street Xanax. There's people that have overdosed thinking they're getting Xanax and they fucking die from fentanyl. Yeah, they do it with cocaine, of course. They do it with everything. There's so many things that have fentanyl in them, and they're cut with fentanyl because fentanyl is cheap and insanely potent.
Starting point is 01:52:41 And that wouldn't be a problem if things were legal. So would you net out towards saying, all right, let's just legalize it? Yeah, I would net out towards that. But I would also put into place some serious mitigation efforts, like in terms of counseling, drug addiction, and Ibogaine therapy, which is another thing that's illegal. Someone was just telling me about how transformative this was for them. I haven't experienced that personally, but Ibogaine, for many of my friends that have had pill problems, and I have a friend, my friend Ed Clay,
Starting point is 01:53:09 who started an Ibogaine center in Mexico because he had an injury and he got hooked on pills and he couldn't kick it. Did Ibogaine, gone. One time done. One time done. 24-hour, super uncomfortable experience. It's supposed to be a horrible experience, right? Yeah. It's supposed to be not very recreational,
Starting point is 01:53:28 not exactly something you want to do on the weekend with friends. It's something you do because you're fucked and you need to figure out how to get out of this fuckness. And that... like, we think about how much money is spent on rehabs in this country, and what's the relapse rate? It's really high. I mean, I have many friends that have been to rehab for drug and alcohol abuse, and quite a few of them went right back to it. It doesn't seem to be that effective. It seems to be effective when people have really hit rock bottom and they have a strong will, and then they get involved in a program, some sort of a 12-step program or some sort of a Narcotics Anonymous program, and then they get support from other people and they eventually
Starting point is 01:54:09 build this foundation of other types of behaviors and ways to find other things to focus on, so that whatever aspect of their mind that allows them to be addicted to things, now it's focused on exercise, meditation, yoga, whatever it is. That's your new addiction. And it's a much more positive and beneficial addiction. But for the reality of the physical addiction, there are mitigation efforts. Like, there's so many people that have taken psilocybin and completely quit drugs, completely quit cigarettes, completely quit a lot, because they realize, like, oh, this is what this is. This is why I'm doing this. Yeah, that's why I was more optimistic that the world would have made faster progress towards acceptance of... you hear so many stories like this.
Starting point is 01:54:56 So I would say, like, all right, clearly a lot of our existing mental health treatment at best doesn't work. Clearly our addiction programs are ineffective. If we have this thing that, in every scientific study, or most scientific studies, we can see is delivering these, like, unbelievable results... It's gonna happen. Yeah. I still am excited for it. I still think it'll be a transformatively positive development. But it'll change politics. It'll absolutely change the way we think of other human beings. It'll absolutely change the way we think of society and culture as a whole. It'll absolutely change the way people interact with each other, if it becomes legalized. And it's slowly becoming legalized. Think of marijuana, which is, like, the gateway drug. Marijuana is now legal recreationally in how many states?
Starting point is 01:55:49 23. And, and then medically in many more. And, you know, it's really easy to get a license medically in California. He's just, I got one in 1996.
Starting point is 01:56:02 You used to be able to just go somewhere and go, I've got a headache. That's it. Yeah, I get headaches. I'm in pain a lot. I do a lot of martial arts. I'm always injured. I need some medication.
Starting point is 01:56:13 I don't like taking pain pills. Bam. You got a legal prescription for weed. I used to have to go to Inglewood to get it. I used to have to go to the hood, to the Inglewood Wellness Center. And I was like, this is crazy. Like, marijuana is now kind of sort of legal. And then in 2016, it became legal in California recreationally. And it just changed
Starting point is 01:56:33 everything. I had all these people that were, like, right-wing people that were taking edibles to sleep. You know, I'm like, this is because now that it's legal, they thought about it in a different way. And I think that that drug, which is a fairly mild psychedelic, also has enhancing effects. It makes people more compassionate. It makes people more kind and friendly. It's sort of the opposite of a drug that enhances violence. It doesn't enhance violence at all. Like, alcohol does that. Cocaine does that. To say a thing that'll make me very unpopular: I hate marijuana. It does not sit well with me at all. What does it do to you that you don't like? It makes me tired and slow for a long time after it. My thing also, there's biological variabilities, right? Like some people... like my wife, she does not do well with alcohol. She can get drunk off one drink, but it's biological. Like, she got some sort of a genetic... I forget what it's called. Something about bilirubin, something where her body just doesn't process alcohol very well.
Starting point is 01:57:33 So she's a cheap date. Oh, all I meant is that genetically I got whatever the mutation is that makes it an unpleasant experience. Yeah. But what I was saying is for me, that's the opposite. Alcohol doesn't bother me at all. I could drink three, four drinks and I'm sober in 20 minutes. And my body, my liver just blasts through it. Just goes through it. Just goes right through it.
Starting point is 01:57:51 I can sober up real quick. But I also don't need it. Like I'm doing Sober October for the whole month. Feels good. Yeah. Did shows last night. Great. No problems.
Starting point is 01:58:01 Not having alcohol doesn't seem to bother me at all. But I do like it. I do like a glass of wine. It's a nice thing at the end of the day. Yeah. I like it. Speaking of that and psychedelics in general, many cultures have had a place for some sort of psychedelic time in someone's life or rite of passage. But as far as I can tell, most of them have some sort of ritual about it. And I do worry that, I think these things are so powerful that I worry about them just being kind of used all over the place
Starting point is 01:58:38 all the time. And I hope that we as a society, because I think this is going to happen even if it's slow, find a way to treat this with the respect that it needs. Yeah. We'll see how that goes. Agreed. Yeah. I think set and setting is very important.
Starting point is 01:58:57 And thinking about what you're doing before you're doing it and why you're doing it. Like I was saying the other night when I had this psychedelic experience, I was just like, God, sometimes I just think too much about the world and that it's so fucked, and you have kids and you wonder like, what kind of a world are they going to grow up in? And it was just one of those days where I was just like, God, there's so much anger and there's so much this and that. And then it just took it away from me. The rest of the day, like that
Starting point is 01:59:25 night, I was so friendly and so happy, and I just wanted to hug everybody. I just really, I got it. I go, oh my God, I was thinking about it wrong. Do you think the anger in the world is way higher than it used to be, or is it just all these dynamics from social media we were talking about? I think the dynamics in social media certainly exacerbated anger in some people. But I think anger in the world is just a product of frustration, inequality, problems that are so clear but are not solved, and all the issues that people have. I mean, it's not a coincidence that a lot of the mass violence that you're seeing in this country, mass looting and all these different things, are being done by poor people. Do you think AGI will be an equalizing force for the world or further inequality?
Starting point is 02:00:13 I think it would be – it depends on how it's implemented. My concern is, like I was saying before, with some sort of a neural interface, that it will increase your ability to be productive to a point where you can control resources so much more than anyone else. And you will be able to advance your economic portfolio and your influence in the world through that, your amount of power that you can acquire. It will, before other people can get involved, because I would imagine financially it'll be like cell phones. In the beginning, you remember the movie Wall Street, when he had that big brick cell phone? It's like, look at him.
Starting point is 02:00:57 He's out there on the beach with a phone. That was crazy. No one had one of those things back then. And they were so rare. I got one in 1994 when I first moved to California, and I thought I was living in the fucking future. Giant thing? No, it was a Motorola StarTAC. That was a cool phone.
Starting point is 02:01:09 I actually had one in my car in 1988. I was one of the first people to get a cell phone. I got one in my car, and it was great because my friend Bill Blumenreich, who runs the Comedy Connection, he would call me because he knew he could get a hold of me. Like if someone got sick or fell out, I could get a gig because he could call me. So I was in my car. He's like, Joe, what are you doing?
Starting point is 02:01:31 Do you have a spot tonight? And I'm like, no, I'm open. He's like, fantastic. And so he'd give me gigs. So I got a bunch of gigs through this phone, where it kind of paid for itself. But I got it just because it was cool. Like I could drive down the street and call people. I'm driving and I'm calling you. It was nuts to be able to drive and call. I had a little antenna, a little squirrelly thing with
Starting point is 02:01:52 a little tail, an antenna on my car, on the roof of the car. But now everyone has one. Yeah. You know, you can go to a third-world country, and people in small villages have phones. It's super common. It's everywhere. Essentially more people have phones than don't have phones. There's more phones than there are human beings, which is pretty fucking wild. And I think there's that initial cost problem. It's going to be prohibitively expensive initially.
Starting point is 02:02:26 And the problem is the wealthy people are going to be able to do that. And then the real crazy ones that wind up getting the holes drilled in their head. And if that stuff is effective, maybe there's problems with generation one, but generation two is better. There's going to be a time where you have to enter into the game. There's going to be a time where you have to sell your stocks. Don't wait too long. Hang on there. Go. And once that happens, my concern is that the people that have that
Starting point is 02:02:53 will have such a massive advantage over everyone else that the gap between the haves and have-nots will be even further, and it'll be more polarizing. This is something I've changed my mind on. You know, someone at OpenAI said to me a few years ago, like, you really can't just let some people merge without a plan, because it could be such an incredible distortion of power, and then we're going to have to have some sort of societal discussion about this. Yeah. That seems real.
Starting point is 02:03:26 I think so at this point. Especially if it's as effective as AGI is, or as, excuse me, ChatGPT is. ChatGPT is so amazing. When you enter information into it, you ask it questions and it can give you answers. And you can ask it to code a website for you, and it does it instantly. And it solves problems that would literally take you decades to try to solve. And it gets to it right away. This is the dumbest it will ever be? Yeah.
Starting point is 02:03:56 That's what's crazy. That's what's crazy. So imagine something like that, but even more advanced, multiple stages of improvement and innovation forward. And then it interfaces directly with the mind, but it only does it with the people that can afford it. And those people are just regular humans. So they haven't been enhanced. We haven't evolved physically. We still have all the human reward systems in place. We're still basically these territorial primates. And now, you just imagine some fucking psychotic billionaire who gets this implant and decides to just completely hijack our financial systems, acquire all the resources, set into place regulations and influences that only benefit them, and then make sure that they can control it from there on out.
Starting point is 02:04:51 How much do you think this actually, though, even requires like a physical implant or a physical merge, versus just some people have access to GPT-7 and can spend a lot on the inference compute for it and some don't? I think that's going to be very transformative too. But my thought is that, I mean, we have to think of what are the possibilities of a neural enhancement. If you think about the human mind and how the human mind interacts with the world, how you interact with language and thoughts and facts, and then something that is exponentially more powerful than that. But it also allows you to use the same emotions, the same ego, the same desires and drives, jealousy, lust, hate, anger, all of those things. But with this godlike power, when one person can read minds and other people can't, when one person has a completely accurate forecast of all of the trends in terms of stocks and resources and commodities,
Starting point is 02:06:03 and they can make choices based on those. I totally see all of that. The only thing I feel a little confused about is, you know, human talking and listening bandwidth or typing and reading bandwidth is not very high. But it's high enough where if you can just say, like, tell me everything that's going to happen in the stock market if I want to go make all the money,
Starting point is 02:06:28 what should I do right now? And it goes, and then just shows you on the screen. Even without a neural interface, you're kind of a lot of the way to the scenario you're describing. Sure, with stocks. Or with like, you know, tell me how to like invent some new technology
Starting point is 02:06:43 that will change the course of history. Hmm, yeah. Yeah. All those things. Like, I think what somehow matters is access to massive amounts of computing power, especially like differentially massive amounts, maybe more than the interface itself. I think that certainly is going to play a massive factor in the amount of power and influence a human being has, having access to that. My concern is that what neural interfaces are going to do is, now you're not a human mind interacting with that data. Now you are some massively advanced version of what a human mind is, and it has just profound possibilities that we can't even imagine. We can imagine, but we can't truly conceptualize them, because we don't have the context.
Starting point is 02:07:46 We don't have that ability and that possibility currently. We can just guess. But when it does get implemented, you're dealing with a completely different being. The only true thing I can say is I don't know. Yeah. Do you wonder why it's you?
Starting point is 02:08:11 Do you ever think like, how am I at the forefront of this spectacular change? Well, first of all, I think it's very much like, you could make this statement for many companies, but for none is it as true as it is for OpenAI: the CEO is far from the most important person in the company. Like in our case, there's a large handful of researchers, each of which are individually more critical to the success we've had so far, and that we will have in the future, than me. But even that, and I bet those people really are like, this is weird, to be them. But it's certainly weird enough for me that it ups my simulation hypothesis probability somewhat. If you had to give a guess, when you think about the possibility
Starting point is 02:09:10 of simulation theory, what kind of percentage do you... I've never known how to put any number on it. Like, you know, every argument that I've read explaining why it's super high probability, that all seems reasonable to me. It feels impossible to reason about, though.
Starting point is 02:09:26 What about you? Yeah, same thing. I go maybe, but it's still what it is, and I have to exist. That's the main thing: I think it doesn't matter. I think it's like, okay. It kind of – it definitely matters, I guess, but there's not a way to know currently. What matters, though? Well, if this really is... I mean, our inherent understanding of life is that we are these biological creatures that interact with other biological creatures. We mate and breed and that this creates more of us.
Starting point is 02:10:05 and that hopefully as society advances and we acquire more information, more understanding and knowledge, this next version of society will be superior to the version that preceded it, which is just how we look at society today. Nobody wants to live in 1860, where you died of a cold and there's no cure for infections. It's much better to be alive now. Like, just inarguably. Unless you really do prefer the simple life that you see on Yellowstone or something. What we're dealing with now, first of all, is access to information, the lack of ignorance.
Starting point is 02:10:47 If you choose to seek out information, you have so much more access to it now than ever before. That's so cool. And over time, like if you go from the beginning of written history to now, one of the things that is clearly evident is the more access to information, the better choices people can make. They don't always make better choices, but they certainly have much more of a potential to make better choices with more access to information. You know, we think that this is just this biological thing, but imagine if that's not what's going on. Imagine if this is a program, and that you are just consciousness that's connected to this thing that's creating this experience that is indistinguishable from what we like to think of as a real biological experience, from carbon-based life forms interacting with solid physical things in the real world. It's still unclear to me what I'm supposed to do differently or think differently. Yeah, there's no answer.
Starting point is 02:11:45 Yeah, you're 100% right. What can you do differently? I mean, if you exist as if it's a simulation, if you just live your life as if it's a simulation, is that – I don't know if that's the solution. You know, I don't – I think – I mean, it's real to me no matter what. It's real. Yeah.
Starting point is 02:12:04 I'm going to live it that way. And that will be the problem with an actual simulation if and when it does get implemented. Yeah. If we do create an actual simulation that's indistinguishable from real life, like what are the rules of the simulation? How does it work? Is that simulation fair and equitable and much more reasonable and peaceful? Is there no war in that simulation? Should we all agree to hook up to it because we will have a completely different experience in life?
Starting point is 02:12:35 And all the angst of crime and violence and the things that we truly are terrified of, they will be nonexistent in this simulation. Yeah. I mean, if we keep going, it seems like, if you just extrapolate from where VR is now... Did you see the podcast that Lex Fridman did with Mark Zuckerberg? I saw some clips, but I haven't got to watch it all. Right, so they're essentially using like very realistic physical avatars in the metaverse. Like, that's step one. Maybe that's step three. Yeah, maybe it's a little bit. Yeah. Maybe it's Atari. Maybe you're playing Space Invaders now. But whatever it is, it's on the path to this thing that will be indistinguishable. That seems inevitable. Those two things seem inevitable to me. The inevitable thing to me is that we will create a life form that is an artificial, intelligent life form that's far more advanced than us. And once it becomes sentient, it will be able to create a far better version of itself. And then as it has better versions of itself,
Starting point is 02:13:46 it will keep going. And if it keeps going, it will reach God-like capabilities. The complete understanding of every aspect of the universe and the structure of it itself, how to manipulate it, how to travel through it, how to communicate. And, you know, if we keep going, if we survive a hundred years, a thousand years, 10,000 years, and we're still on this same exponentially increasing technological capability path, that's God. We become something like a God. And that might be what we do. That might be what intelligent, curious, innovative life actually does. It creates something that creates the very universe that we live in.
Starting point is 02:14:40 That creates the next simulation and then... Yeah, maybe that's the birth of the universe itself is creativity and intelligence and that it all comes from that. I used to have this joke about the Big Bang. Like what if the Big Bang is just a natural thing? Like humans get so advanced that they create a Big Bang machine. And then we're so autistic and riddled with Adderall that we have no concept or worry of the consequences. And someone's like, I'll fucking press it. And they press it and boom, we start from scratch every 14 billion years. And that's what a Big Bang is.
Starting point is 02:15:17 I mean, I don't know where it goes, but I do know that if you looked at the human race from afar, if you were an alien life form completely detached from any understanding of our culture, any understanding of our biological imperatives, and you just looked at like what is this one dominant species doing on this planet? It makes better things. That's what it does. That I agree. It goes to war. It steals. It does a bunch of things that it shouldn't do. It pollutes. It does all these things that are terrible, but it also consistently
Starting point is 02:15:54 and constantly creates better things. Whether it's better weapons going from the catapult to the rifle, to the cannonballs, to the rocket ships, to the hypersonic missiles, to nuclear bombs. It creates better and better and better things. That's the number one thing it does. And it's never happy with what it has. And you add that to consumerism, which is baked into us, and this desire, this constant desire for newer, better things, well, that fuels that innovation because that gives it the resources that it needs to consistently innovate and constantly create
Starting point is 02:16:28 newer and better things. Well, if I was an alien life form, I'd be like, oh, what is it doing? It's trying to create better things. Well, what is the forefront of it? Technology. Technology is the most transformative, the most spectacular, the most interesting thing that we create, and the most alien thing. The fact that we just are so comfortable that you can FaceTime with someone in New Zealand, like instantly. We can get used to anything pretty quickly. You know, and take it for granted almost. Yeah. And well, if you were an alien life form and you were looking at us, you're like, well, what is it doing? Oh, it keeps making better things. It's going to keep making better things.
Starting point is 02:17:06 Well, if it keeps making better things, it's going to make a better version of a thinking thing. And it's doing that right now. And you're a part of that. Well, that better version of a thinking thing, it's basically now in the amoeba stage. It's in the small multicellular life-form stage. Well, what if that version becomes a fucking Oppenheimer? What if that version, if it scales up so far, yeah, that it becomes so
Starting point is 02:17:33 hyper-intelligent that it is completely alien to any other intelligent life form that has ever existed here before, and it constantly does the same thing, makes better and better versions of itself. Well, where does that go? It goes to a God. It goes to something like a God. And maybe God is a real thing, but maybe it's a real consequence of this process that human beings have of consistently, constantly innovating and constantly having this desire to push this envelope of creativity and of technological power. I guess it comes down to maybe a definitional disagreement about what you mean by it becomes a god. I think it becomes something unbelievably much smarter and more capable than we are. And what does that thing become if that keeps going? And maybe the way you mean it as a godlike force is that that thing can then go create, can go simulate a universe.
Starting point is 02:18:36 Yes. Okay. That I can resonate with. Yeah. I think whatever we create will still be subject to the laws of physics in this universe. Right. Yeah. Maybe that is the overlying fabric that God exists in. The God word is a fucked up word because it's just been so co-opted.
Starting point is 02:18:53 But, you know, I was having this conversation with Stephen Meyer who is a physicist. I believe he's a physicist. And he's also religious. It was a real weird conversation. Very fascinating conversation. What kind of religion? Believer in Christ. Yeah, he even believes in the resurrection, which I found very interesting.
Starting point is 02:19:13 But, you know, it's interesting communicating with him because he has these little pre-designed speeches. He's encountered all these questions so many times that he has these very well-worded, very articulate responses to these things that I sense are like bits, you know, like when I'm talking to a comic. Like a comic will go, oh, I got this bit on train travel, and they tell you the bit. That's what it's like. He has bits on why he believes in Jesus and why he believes. Very, very intelligent guy. But I proposed the question: when we're thinking about God, what if, instead of God created the universe, what if the universe is God, and the creative force of all life and everything is the universe itself, instead of thinking that there's this thing that created
Starting point is 02:20:05 us. This is like close to a lot of the Eastern religions. I think this is an easier thing to wrap my mind around than any other religions, for me. And when I do psychedelics, I get that feeling. I get that feeling like there's this insane soup of innovation and connectivity that exists all around us, but our minds are so primal. We're this fucking thing, you know. This is what we used to be. And what is that? It's, um, there's a guy named Shane Against the Machine, who's this artist who created this. It's a chimpanzee skull that he made out of Zildjian cymbals. See, it's got a little Zildjian logo. He left that on the back, and he just made this dope art piece. Cool.
Starting point is 02:20:51 It's just cool. But I wonder if our limitations are that we are an advanced version of primates. We still have all these things that we talk about, jealousy, envy, anxiety, lust, anger, fear, violence, all these things that are detrimental but were important for us to survive and get to this point. And as time goes on, we will figure out a way to engineer those out. And as intelligent life becomes more intelligent, and we create a version of intelligent life that's far more intelligent than what we are, far more capable than what we are, if that keeps going, if it just keeps going... I mean, ChatGPT. Imagine if you took ChatGPT back to Socrates and showed him that, explained it to him, put it on a phone, and gave him access to
Starting point is 02:21:44 it. He'd be like, what have you done? Like, what is this? I bet he'd be much more impressed with the phone than ChatGPT. I think he would be impressed with the phone's ability to communicate, for sure, but then the access to information would be so profound. I mean, back then, look, you're dealing with a time when Galileo was put under house arrest because he had the gumption to say that the Earth is not the center of the universe. Well, now we fucking know it's not.
Starting point is 02:22:12 Like we have satellites. We send literal cameras into orbit, take photos of things. No, I totally get that. I just meant that we kind of know what it's like to talk to a smart person. And so in that sense, you're like, oh, all right. I didn't think you could, like, talk to a not-person and have them be person-like in some responses some of the time. But a phone, man, if you just woke up after 2,000 years and there was a phone, you have no model for that.
Starting point is 02:22:39 You didn't get to get there gradually. Yeah. No, you didn't get... My friend Eddie Griffin has a joke about that. It's about how Alexander Graham Bell had to be doing coke. He goes, because only someone on coke would be like, I want to talk to someone who's not even here. And that's what a phone is. Is that something coke makes people want to do? I don't know.
Starting point is 02:23:00 I've never done coke, but I would imagine it is. It seems to me like it just makes people angry and chaotic. Yeah, a little of that, but they also have ideas. Yeah, I mean, but back to this, where does it go? If it keeps going, it has to go to some impossible level of capability. I mean, just think of what we're able to do now with nuclear power and nuclear bombs and hypersonic missiles, just the insane physical things that we've been able to take out of human creativity and imagination and, through engineering and technology, implement as physical devices that would be indistinguishable from magic if you brought them back 500 years.
Starting point is 02:23:52 Yeah. I think it's quite remarkable. So keep going. Keep going 100,000 years from now. If we're still here, if something like us is still here, what can it do? In the same way that I don't think Socrates would have predicted the phone. I can't predict that. No, I'm probably totally off, but maybe that's also why comets exist. Maybe it's a nice reset to just like leave a few around, give them a distant
Starting point is 02:24:19 memory of the utopian world that used to exist, have them go through thousands of years of barbarism, of horrific behavior, and then reestablish society. I mean, this is the Younger Dryas Impact Theory, that around 11,800 years ago, at the end of the Ice Age, we were hit by multiple comets that caused the instantaneous melting of the ice caps over North America. Flooded everything. Flooded everything.
Starting point is 02:24:45 It's the source of the flood myths from the Epic of Gilgamesh and the Bible and all those things. And also, there's physical evidence of it when they do core samples. There's high levels of iridium, which is very common in space, very rare on Earth. There's microdiamonds that are from impacts. And it's like 30% of the Earth has evidence of this. And so it's very likely that the people that are proponents of this theory are correct, and that this is why they find these ancient structures that they're now dating to like 11,000, 12,000 years ago, when they thought people were hunter-gatherers. And they go, okay, maybe our timeline is really off, and maybe this is physical evidence of impacts. Yeah, I've been watching that with interest.
Starting point is 02:25:25 Yeah. Randall Carlson is the greatest guy to pay attention to when it comes to that. Yeah. He's kind of dedicated his whole life to it, which, by the way, happened because of a psychedelic experience. He was on acid once, and he was looking at this immense canyon, and he had this vision that it was created by the instantaneous erosion of the polar caps, and that it just washed this wave of impossible water through the earth.
Starting point is 02:25:52 It just carved these paths. And now there seems to be actual physical evidence of that, that that is probably what took place, and that we're just the survivors, and that we have reemerged, and that society and human civilization occasionally gets set back to a primal place. Yeah. You know, who knows. If you're right that what happens here is we kind of edit out all of the impulses in ourselves that we don't like, and we get to that world, it seems kind of boring. So maybe that's when we have to make a new simulation to watch people think they're going
Starting point is 02:26:28 through some drama or something. Or maybe it's just we get to this point where we have this power, but the haves and the have-nots, the divide is too great. And people did get a hold of this technology and use it to oppress people who didn't have it, and we didn't mitigate the human biological problems, the reward systems that we have: I gotta have more, you gotta have less. Yes, this sort of natural inclination that we have for competition, and someone hijacks that. I think this is going to be such a hugely important issue to get ahead of before the first people push that button.
Starting point is 02:27:05 Yeah. What do you think about, like, when Elon was calling for a pause on AI? He was, like, starting an AGI company while he was doing that. Yeah. Didn't he start it, like, after he was calling for the pause? I think before, but I don't remember. In any case. Is it one of
Starting point is 02:27:25 those 'if you can't beat them, join them' things? Um, I think the instinct of saying, like, we've really got to figure out how to make this safe and good, and like widely good, is really important. But I think calling for a pause is naive at best. I kind of think you can't make progress on the safety part of this, as we mentioned earlier, by sitting in a room and thinking hard. You've got to see where the technology goes. You've got to have contact with reality. And we're trying to make progress towards AGI, conditioned on it being safe
Starting point is 02:28:13 and conditioned on it being beneficial. And so when we hit any kind of block, we try to find a technical or a policy or a social solution to overcome it. That could be about the limits of the technology, something not working, and, you know, it hallucinates or it's not getting smarter or whatever. Or it could be there's this safety issue
Starting point is 02:28:31 and we've got to redirect our resources to solve that. But it's all, for me, it's all this same thing of, we're trying to solve the problems that emerge at each step as we get where we're trying to go. And, you know, maybe you can call it a pause if you want, if you pause on capabilities to work on safety. But in practice, I think the field has gotten a little bit wrapped around the axle there. Safety and capabilities are not these two separate things. This is, I think, one of the dirty secrets of the field. It's like we have
Starting point is 02:29:02 this one way to make progress. You know, we can understand and push on deep learning more. And that can be used in different ways. But I think it's that same technique that's going to help us eventually solve the safety problem. All of that said, as a human, emotionally speaking, I super understand why it's tempting to call for a pause. Happens all the time in life, right? This is moving too fast. Right. We got to take a pause here. Yeah. How much of a concern is it in terms of national security that we are the ones that come up with this first? Well, I would say that if an adversary of ours comes up with it first and uses it against us and we don't have some level of capability, that feels really bad. Yeah. But I hope that what
Starting point is 02:29:57 happens is this can be a moment where, to tie it back to the other conversation, we kind of come together and overcome our base impulses and say, like, let's all do this as a club together. That would be better. That would be nice. And maybe through AGI and through the implementation of this technology, it will make translation instantaneous and easy. Well, that's already happened. Right.
Starting point is 02:30:25 But, I mean, it hasn't happened in real time. The point where you can accurately communicate. Very soon. Very soon. Very soon. Yeah. I do think for what it's worth that the world is going to come together here. I don't think people have quite realized the stakes,
Starting point is 02:30:46 but this is like, I don't think this is a geopolitical, if this comes down to like a geopolitical fight or race, I don't think there's any winners. And so I'm, I'm optimistic about people coming together. Yeah, I am too.
Starting point is 02:31:01 I mean, I think most people would like that. If you ask the vast majority of the human beings that are alive, wouldn't it be better if everybody got along? You know, maybe you can't go all the way there and say, we're just going to have one global effort. But I think at least we can get to a point where we have one global set of rules, safety standards, organization that makes sure everyone's following the rules. We did this for atomic weapons. There have been similar things in the world of biology. I think we'll get there.
Starting point is 02:31:37 That's a good example, nuclear weapons. Because we know the destructive capability of them, and because of that, we haven't detonated one since 1947. Pretty incredible. Pretty incredible, other than tests. Yeah. We haven't used one in terms of war. 45 or 47?
Starting point is 02:31:58 When was the end of World War II? Wasn't it 47? When they dropped the bombs? I think that was 45. I was wondering if there was more after that I didn't know about. No, it might be 45. I think it was, yeah. 45.
Starting point is 02:32:11 So from 1945, which is pretty extraordinary. That's right. It's remarkable. I would not have predicted that, I think, if I could teleport back to 45. No. I would have thought, oh, my God, this is just going to be something that people do, just launch bombs on cities. Yeah. I mean, I would have said, like, we're not going to survive this for very long. And there was a real fear of that. For sure. It's pretty extraordinary that they've managed to stop that,
Starting point is 02:32:35 this threat of mutually assured destruction, self-destruction. I mean, the whole world. We have enough weapons to literally make the world uninhabitable. Totally. And because of that, we haven't done it, which is a good sign. I think that should give some hope. Yeah, it should. I mean, Steven Pinker gets a lot of shit for his work because he just sort of downplays violence today. But it's not that he's downplaying violence today.
Starting point is 02:33:04 He's just looking at statistical trends. If you're looking at the reality of life today versus life 100 years ago, 200 years ago, it's far safer. Why do you think that's a controversial thing? Like, why can't someone say, sure, we still have problems, but it's getting better? Because people don't want to say that. Especially people who are activists. They're completely engrossed in this idea that there's problems today. And these problems are huge. And there's Nazis. But no one's saying there's not huge problems today. Right. No one's saying there's not. But just to say things are better today. Yeah, I guess that's it. Some people, they just don't want to hear that. Right. But
Starting point is 02:33:40 those are also people that are addicted to the problems. The problems become their whole life. Solving those problems becomes their identity. Being involved in the solutions, or what they believe are solutions to those problems, becomes their life's work. And someone comes along and says, actually, life is safer than it's ever been before. Interactions with police are safer. Yeah, I can see that. That's deeply invalidating.
Starting point is 02:33:59 Yeah. But also true. And again, what is the problem? Why can't people recognize that? Well, it's the primate brain. I mean, it's all the problems that we highlighted earlier. And the solution to overcoming that might be through technology, and that might be the only way we can do it without a long period of evolution, because biological evolution is so relatively slow in comparison to technological evolution. And that might be our bottleneck, that we still are dealing with this primate body, and that something like
Starting point is 02:34:39 artificial general intelligence, or some implemented form of engaging with it, whether it's Neuralink, something that shifts the way the mind interfaces with other minds. Isn't it wild that, speaking of biological evolution, there will be people, I think, who were alive for the invention, or discovery, whatever you want to call it, of the transistor, who will also be alive for the creation of AGI. Like one human lifetime. Yeah. You want to know a wild one? From Orville and Wilbur Wright flying the plane, it was less than 50 years before someone dropped an atomic bomb out of one. That's wild. That's fucking crazy. That's crazy. Less than 40, right? That's crazy. Yeah. Bananas. I mean, 60-something years to land on the moon. Nuts. Nuts. Where, you know, where is it going? I mean, it's just guesswork, but it's interesting, for sure. I mean, it's the most fascinating thing of our time, for sure. It's fascinating intellectually, and I also think it is one of these things that will be tremendously net beneficial.
Starting point is 02:35:55 Yeah. We've been talking a lot about problems in the world, and I think that's just always a nice reminder of how much we get to improve. And we're going to get to improve a lot, and this will be, I think, the most powerful tool we have yet created to help us go do that. I think you're right. And this is an awesome conversation. Thanks for having me. Thank you for being here. I really appreciate it. And thanks for everything. Keep us posted. And if you create HAL, give us a call. Let us know.
Starting point is 02:36:25 alright thank you thank you bye everybody
