a16z Podcast - The Techno-Optimist Manifesto with Marc Andreessen and Ben Horowitz

Episode Date: October 25, 2023

Subscribe to The Ben & Marc Show on Apple Podcasts: https://bit.ly/3SdsfNt
Subscribe to The Ben & Marc Show on Spotify: https://spoti.fi/3SclPOr
Read the full manifesto: https://a16z.com/the-techno-opti...mist-manifesto/

This past week, Marc released his new vision for the future – “The Techno-Optimist Manifesto.” In an article that has sparked widespread conversation across traditional and social media, Marc challenges the pessimistic narrative surrounding technology today, and instead celebrates it as a liberating force that can lead to growth, progress, and abundance for all. In this one-on-one conversation based on YOUR questions from X (formerly Twitter), Ben and Marc discuss how technological advancements can improve the quality of human life, uplift marginalized communities, and even encourage us to answer the bigger questions of the universe. We hope you’ll be inspired to join us in this Techno-Optimist movement. Enjoy!

Stay Updated:
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://twitter.com/stephsmithio

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.

Transcript
Starting point is 00:00:00 If you don't believe that you can be successful in life or with a new technology or with kind of moving the world forward, then you can't. I think like a lot of artists, I aspire to have a better class of critic. What do you do about tech companies getting so big and becoming monopolies? What do you do about banks getting so big and becoming monopolies? Do we want more Manhattans and Apollos? On the one hand, it's hard to say no. On the other hand, do I want a society that is that much more militarized?
Starting point is 00:00:27 If that's what's required, I don't know. Be really careful when somebody goes, oh, this is a problem with a new technology. So therefore, we have to stop the new technology. If you listen to the a16z Podcast, I have a feeling you saw our co-founder Marc Andreessen's new essay, The Techno-Optimist Manifesto. This manifesto was a beacon among the sea of noise for fellow optimists. I believe in the power of technology, markets, growth, excellence, and ultimately, abundance. As Marc raises the technology flag, he is joined in today's episode by a16z co-founder Ben Horowitz.
Starting point is 00:01:05 Together, they discuss the piece and its impetus, but they also address audience questions, including whether there is such a thing as effective pessimism, why victimization predominates the conversation, whether humans can become overly dependent on technology, the difference between being pro-business and pro-market, the role of private and public capital in driving progress, and even what Marc thought was the most controversial point in his entire essay. Now, if you enjoyed this episode, Ben and Marc actually just launched their very own podcast, The Ben and Marc Show, that you can find wherever you listen to podcasts by, of course, searching The Ben and Marc Show, or by grabbing the link in our show notes. They've actually already released three episodes that you can binge today and have many more to come. In the show notes, you'll also find a link to the manifesto in full, or you can find it on a16z.com. As Marc says in his essay, technology is the ultimate open society. And on that note, we are so
Starting point is 00:02:05 happy to have you here taking part. As a reminder, the content here is for informational purposes only, should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any a16z fund. Please note that a16z and its affiliates may also maintain investments in companies discussed in this podcast. For more details, including a link to our investments, please see a16z.com/disclosures. Look, there's the other scenario, and I would just call that one the Cheetos and Meth scenario. And PlayStation. And PlayStation, right? And like, and I like Netflix. I'm a fan of Netflix,
Starting point is 00:02:49 but, like, maybe not 12 hours a day. That's the existence of a cow. Cows are great. But, like, I don't think we should be cows. Hello and welcome to the Marc and Ben podcast. Today we're going to talk about a post that Marc recently wrote called The Techno-Optimist Manifesto. And like all good manifestos, many people loved it, many people hated it. And so that's given us a lot to talk about. I just want to actually point out my favorite of the people that hated it was an article that was published in TechCrunch called When's the last time Marc Andreessen has spoken to poor
Starting point is 00:03:32 people or a poor person or something like that? And the thing that's so funny about it is that Marc, of all the people I know, I probably don't know anybody who's more self-made than Marc, because he grew up in a tiny town in Wisconsin. He went to public schools, like not good public schools, like probably some of the worst public schools in the country, and like never got any money, you know, from home, not because his parents didn't love him. They didn't have any money to give him. And then the people who wrote the article all went to like the fanciest schools I've ever heard of and, you know, Ivy Leagues and wonderful private high schools and these kinds of things. So now we have people who grew up rich telling somebody who grew up poor and massively
Starting point is 00:04:17 succeeded what's good for poor people who want to succeed. So I just thought that was so funny. Anyway, so this one's gotten a lot of kick to it, so we're going to get right into it. The first question... Hey, Ben, can I... Can I weigh in on that? Yes, sir. Can I weigh in on that? You can't start that way and then not let me talk.
Starting point is 00:04:37 You know, let me say something. Sorry. So it's too tempting. Look, that response is a classic example of what the author Robert Henderson calls luxury beliefs, right? And so the definition of a luxury belief is it's a belief that can be held by somebody who's in sort of an elite position, a position of power, a position of wealth and comfort, about how society should be ordered, that is incorrect and the consequences of which would be disastrous, right, for the people that would be subjected to the consequences of that belief. But the people who hold the belief are insulated from the consequences, right? They live in, you know, kind of fancy places and have very good lifestyles and aren't going to suffer directly as a result. So anyway, it's a classic, it's a great example of a luxury belief. The sort of facile response to it, of course, is that capitalism and free markets are the machine that has lifted people out of poverty for, you know, 500 years. We've run the experiment many, many, many times, you know, and the other systems don't seem to work as well. And this is one of those things.
Starting point is 00:05:27 And, you know, some amazing things. Exactly to your point, we ran this experiment hundreds of times in the 20th century, and the results are very clear. And, you know, look, most recently in China, you know, there's a direct correlation between the degree to which the Chinese Communist Party kind of takes its boot off the throat of the people in terms of their ability to engage in markets and engage in trade and the extent to which their quality of life rises. And so you kind of still see that, you know, playing out today.
Starting point is 00:05:46 And how quickly it regresses when they move to central planning. 100%. And so, and you just see this over and over again. And then, you know, you see it happening in other parts of the world also. And so, you know, it's this thing where it's just like more times running the same experiment are not going to generate different results. And then the result of this, of course, is that this is kind of so obvious and well established at this point that you have to go to, you know, one of our finest elite, you know, private high schools and private, you know,
Starting point is 00:06:08 universities to really get inculcated into the luxury belief system that's going to hurt the people it's trying to help. Yeah. Exactly. So that's an excellent start. So this first question is from St. Louis of Techni. What does effective pessimism look like? How can people who want to mitigate risks make sure not to waste their time on moral panic that stymies progress? You know, look, this is really tough, right? You know, let's start by kind of giving the devil his due, kind of conceding the strength of the other side's argument on this, which is like, look, as I say in the essay, like, you know, I'm not a utopian. You know, technology is not purely a force for good. Technology is used for both
Starting point is 00:06:46 good and bad. And, you know, virtually every technology that man has ever invented has been used for both good and bad. And so it's not that there aren't downsides. And it's not that there aren't risks. What's that? Yeah, starting with fire. I mean, look, like, you know, this is a reference in the piece, you know, the myth of Prometheus, which is kind of the origin myth in Western society of sort of the implications of technology. And, you know, in the Prometheus myth, you know, Prometheus is the god that brought fire to man. And, you know, for that, he was punished by Zeus by being chained to a rock and having his liver pecked out every night by a bird. And then it would regenerate in the morning and then it would happen again the next day. So, you know,
Starting point is 00:07:17 a very exquisite form of torture that Zeus came up with. And, you know, the reason that myth is so powerful is because, look, fire was the enabler of heat and light and cooking food, right, and defense and shelter for early man. But it was also a weapon, you know, from the very beginning, it was a weapon of war, right? And if, for example, you're engaged in siege combat and you're going up against a fortified, you know, castle or city, the way that you win is you burn them out, right? And by the way, the way they retaliate is they may heat up oil to boiling temperatures and they pour it on your head, right? So both sides of these are true, and this continues to be the case to this day. You know, actually, effective pessimism is a clever framing. You know, these are kind of the two, I don't know, valences that you can apply to this question. One valence is basically, fundamentally, over time, net, you know, everything:
Starting point is 00:08:01 technology has been primarily a force for good, primarily a force for progress, and basically you embrace it and support it and accelerate it as much as you can. And then you deal with the issues as they arise, which is the story of the development of modern civilization. There is another valence, and some people incline more naturally to the pessimistic position, which is basically, and this is true, by the way, of both technologies and markets, people also apply the same kind of negative valence to markets, which is, well, primarily, you know, technology or markets are a generator of bad things, right? Technology is a weapon of war, you know, technology is something that has unanticipated negative consequences.
Starting point is 00:08:31 Markets have winners and losers, right? And so, you know, you start out focusing more on the winners or on the losers. And so you just kind of have to decide where you go in. You know, the accusation, of course, from the pessimists is that the optimists are too optimistic. You know, the counter-accusation, of course, is if you start out with a pessimistic frame, it's very hard to hold that in a moderate position, is what I observe. The pessimists sort of slide into greater and greater levels of pessimism quite quickly, and, you know, they end up very angry and bitter and hostile, and they end up advocating
Starting point is 00:08:59 for extremely, you know, I would say, draconian and kind of senseless policies. And so I think it's hard to be an effective pessimist. Yeah. I think it seems to be much easier to become, I would say, you know, either an ineffective pessimist or just a flat-out dangerous pessimist. Yeah, you know, Andy Grove had a great line on this. Somebody asked him, was the microprocessor good or bad? And he said, well, that's a crazy question. It's like asking is steel good or bad. It's like, you're not going to hold back progress. And so what you have to ask yourself is, you know, how do you make it good? But don't try and do that by banning it. Because if you proceed that way, you're going to get frustrated. Yeah, I mean, the only thing I'd argue with Andy on that is, you know, they do get banned, right? Steel didn't get banned. But civilian nuclear power did. And so, you know, the pessimists, I mean, look, like I mentioned in the piece,
Starting point is 00:09:45 you know, I think the single biggest policy mistake of my lifetime was the decision in the 70s, effectively in the 70s into the 80s, to ban, essentially ban, civilian nuclear power. And, you know, for sure throughout most of the U.S. and then, you know, throughout certainly most of Europe as well, with, you know, maybe France being the big exception. And then by extension throughout, you know, a lot of the rest of the world, because it would have been Western, you know, countries and companies that would have brought it to the rest of the world in that time period. And, you know, look, I think the consequences of that, like, I think if you're an environmentalist and you kind of are looking at things dispassionately, as Stewart Brand and others have
Starting point is 00:10:13 been doing now for a while, is you kind of say, you know, look, we had the silver bullet for a sort of unlimited zero-emission energy. We had it, and we chose not to use it. And, you know, with everything we know today, it's overwhelmingly safe and effective, with kind of zero risk of mass death, zero risk of contributing to carbon emissions, and so forth. But, you know, look, we collectively made a political decision to ban it, and we're, you know, paying the price for that today. And quite frankly, it's one of the reasons why Russia is able to do what it is doing in Ukraine: it has this flow of money from oil, you know, which, in a counterfactual universe where the world by now had cut over to civilian nuclear power, they wouldn't have that.
Starting point is 00:10:44 They wouldn't be able to do what they're doing. So the consequences of that decision play out decades later. And I think that's a great illustration of the risk of, let's say not the risk of effective pessimism, let's say the risk of dangerous pessimism. Yeah. And a dramatic example of narrative defeating data, because it's very obvious from the data that nuclear is far safer than, say, oil or coal. All right. Second question.
Starting point is 00:11:07 And this is from your friend and mine, Shaka Senghor. How can we distribute transformative philosophies like yours to marginalized communities where a culture of despair and victimization predominates? Yeah, so this is a really interesting question for me because I think that it does get to the heart of, like, one of the reasons why victimization predominates is because it's a techno-pessimistic world in those communities, in the marginalized communities. And I would say, I go back to, there have been some very kind of interesting and successful leaders; Marcus Garvey comes to mind. And I think it starts
Starting point is 00:11:47 with something that he really was heavily behind, which is this idea of self-determination, that an individual can change their own lives, change their own circumstances. And he had this idea at the turn of the 1900s, which was a kind of much more difficult time to do that, particularly for kind of black people in America. But, you know, he himself succeeded at it greatly. And I think it really starts with that mindset where if you don't believe that you can be successful in life or with the new technology or with kind of moving the world forward,
Starting point is 00:12:25 then you can't. And Henry Ford's famous line, there are two men, one believes he can do it, the other believes he can't do it, and they're both right, I think is the key to that whole thing: it just starts with the belief that you can succeed. And like, the odds are harder for some than others, but you always end in despair if you believe you can't do it. Yeah, and I'd just add, building a little bit on the sort of,
Starting point is 00:12:48 look, on the consumption side, so this is sort of one of the amazing things about free markets. Free markets are the most beneficial to the lowest-income and the least advantaged. And that's a counterintuitive mindbender for a lot of people who have been trained at the elite schools. But the truth is it's the best for the poorest. It's the best for the people who have the least. And by the least, I mean the least existing wealth, but also the least access, the least social status, right, who are kind of on the receiving end of not a lot of advantage in their lives. And you can kind of look at that on two sides of their lives. You can look at that for them as producers and as consumers. So on the production side,
Starting point is 00:13:19 markets open up opportunity, right, for people to be able to make their way in the world and to be able to have jobs, be able to make money, and then ultimately be able to support a family. And so, again, the counterfactual is not a poor person, right, trying to navigate their way through the hell of a capitalist economy versus somehow in a socialist or communist system, they'd be kind of handed everything they need, right? The reality is it's the other way around. The more capitalism, the more opportunities, the more available jobs, the more lines of work, the more openness and freedom and choice to be able to figure out how to succeed and how to make money and what work to do. You end up on the wrong side of an
Starting point is 00:13:52 authoritarian communist regime or a socialist regime or an authoritarian regime, right? You are screwed. Like there are no jobs for you, right? Nobody's hiring. There are no private employers. You know, if the state, for whatever reason, doesn't want to hire you, like you're out of luck. If you're in a disadvantaged class category, if you're in a disadvantaged race, ethnicity, gender, sexual orientation, any of these things, or just political views or whatever, or just that you're poor, and people look down on you, like, that's it. Like, you're done. Like, at that point, you're a ward of the state, and whatever small amount of grain they want to feed you to keep you alive, fair enough, but like, you're not going anywhere, right? And that was the story of low-income
Starting point is 00:14:29 people in most societies over basically the entirety of human existence up to the point of the invention of markets. And so that's on the one side. And then on the consumption side, I talked about this also in the essay, is one of the big things that technology does in the free market context is it drives prices down. And this is a big thing on inequality and especially income inequality that people I think miss, which is like one form of like determining the level of like income inequality or the value of one's income or whatever, right, is to look at it in terms of like, you know, what literally is denominated, measured by units of currency. Right. And so I'm either making more money or less money. Look, the other side of that is what is money used for. It's used to buy goods
Starting point is 00:15:03 and services. And so if the price of goods and services is falling, that has the same effect on you from a standard of living standpoint as if you're getting an actual raise. Right. And so what markets do and technology does is they drive prices down. And the more they're allowed to operate, the more they drive prices down. The traditional way that economists talk about this is they make the observation, which is as follows, which is the Queen of England always wore silk stockings. Right. Like the super rich throughout all of society, throughout all of time, have always had access to the best of goods and services, the best of available food and the best of available health care, right,
Starting point is 00:15:36 and so forth and so on, of their time and place. Traditionally, like, that stuff, the silk stockings and everything else, right, were just completely out of reach of most people. And it is precisely the engine of free markets and technology that brings down prices so that regular people can afford these things as well. Yeah, maybe the most profound example of that is the Internet plus the smartphone, because when you and I grew up, information was kind of an elite thing
Starting point is 00:16:01 to get to. And the big reason to go to university was the knowledge was all there. It was in books and the libraries and all these kinds of things. And you didn't actually have access to that otherwise. And computers, by the way, also, we didn't have computers for individuals. And now, at least the kind of poorest people in America, the homeless in San Francisco, have better access to information and knowledge than the President of the United States did in 1980, which is unbelievable. Let me give you one other one on that. And I find this totally mind-blowing. So, Ben, you remember when we first started working on the Internet in the 90s, there was just this sort of endless kind of hand-wringing at the time
Starting point is 00:16:37 about this concept of the digital divide. Yes. Right? This concept of basically digital technology, the Internet, computers, PCs, and the smartphones were going to basically widen inequality because it was basically well-off people that were going to have them and then poor people wouldn't. Right. And this is maybe the effective pessimism of its time,
Starting point is 00:16:51 as people were very worried about that. Look, sitting here today in 2023, the following is true. More people in the world have computers, in the form of smartphones specifically, and internet access, than have electricity or running water in their homes. Yeah. Amazing. Right. So the digital technologies people were worried about are actually the most egalitarian, right, of all technologies that have ever been produced,
Starting point is 00:17:10 even more than running water and electricity. So, I mean, we should still be worried about, like, literally the gap in access to fresh water, like more than the gap in access to the internet. And again, there's exactly one. The water divide, right? The electricity divide is, like, still a thing. But the digital technology divide actually turns out to not be a thing. And, again, the reason for that is very straightforward.
Starting point is 00:17:28 the reason for that is falling prices. The reason for that is as the global smartphone market went to 5 billion people, the price of a smartphone collapsed to, I don't even know today, in sort of the developing world, it's, I don't know, 10 bucks or something. And then the same thing, Internet access has plummeted in price over time because of Moore's Law and competition and innovation. And so the paradox, the flip side of this is if you wanted a plan to be able to drive something, any form of good or service that is important to lots of people
Starting point is 00:17:51 to have it be available to everybody, the thing to do is to lean harder into markets and into technology, right, not further away. Great point. Okay, next question from Max Churitich. Technology makes life easier and is necessary for a better future. However, how would you address humans getting overly dependent on tech to a point where we can't function without it? Yeah, so I would say there's kind of the full dystopian version of this, which is sort of the WALL-E scenario. Yeah, right? In the movie WALL-E, for those of you who haven't seen it, mankind in the future basically is all just like obese and literally sitting in these big like zero-gravity suspension chairs and basically binging Netflix and slurping. And some of that is going on now. Yeah, this might sound a little familiar in our times.
Starting point is 00:18:34 But yeah, look, there is this sense of like, okay, we kind of at some point, like we live in sort of this automated, I don't know, farm environment or something, and we're kind of farm animals and we're being kept fat and happy, but we've kind of lost agency, we've lost free will, we've lost choice, we've lost any sort of sense of self-reliance, self-sufficiency, any sense of adventure. Like, I think there's certainly some argument in that direction. You do see examples of that. What I try to do in the essay and what I believe on this is a little subtle, which is, look,
Starting point is 00:18:57 There are really big questions about the meaning of life that people have today and have had for a very long time, right? And a lot of the history of human civilization has been debates around religion and which gods to worship and moral principles and how to order a society and what the role of the collective versus the role of the individual is and all these policy questions that flow from this. And like the story of human civilization is in some way the story of trying to figure out all those questions. And of course, these questions are still, at least for a lot of people, still unanswered, or are still open questions, or are being opened anew all the time. I think it's putting, frankly, too much of a burden. I'm an enthusiastic proponent of technology and markets, but it's putting a little bit too much of a burden on technology and markets to expect technology and markets to answer those questions for all people. I think if you're looking
Starting point is 00:19:41 to technology and markets to answer those questions, I think you're probably looking in the wrong direction. I think you probably need to look inside, quite honestly, inside the human soul, where all the hard questions lie. And then the observation I make is rising technological capabilities and rising standard of living, right, through markets, open up the room, right, individually and collectively in our society to be able to ask those questions, right? So, and the most obvious example of that is, like, when people are hungry, they don't ask any of these questions. The only question is, like, where is the next meal coming from? Right. And you can kind of elaborate that all the way up and you can kind of say, look, like, if the ultimate human problem is that we have full bellies, our children
Starting point is 00:20:15 are going to live great lives, we're able to support our family, we're able to do all the things that technology is able to give us. And we still have these big unanswered questions about the meaning of life, then basically what technology will have done is to open up the aperture to be able to actually spend more time trying to figure out those big questions. Yeah. Yeah, that seems like a very champagne problem for sure. It's funny. It reminds me of when I was in elementary school, my brother was in junior high school.
Starting point is 00:20:40 I went to the school play. Calculators had just come out, and the whole play was about this society where nobody knew how to do math, and then the calculators all broke. And so those kinds of fears, I think, go with technology, but they tend not to play out in very real ways. At least all the calculators haven't broken yet. Yeah. Well, look, technology gives you the way. I mean, here's the other kind of serious observation to make is technology increases resilience to natural disaster.
Starting point is 00:21:09 So, for example, this comes up actually a lot in the climate debate, which is basically, and I'm not, this is not questioning climate change, I'm making a different point. But one of the questions over time is basically, is nature getting more or less dangerous, right, over time, as the world changes, as humanity changes and so forth. And basically, deaths from natural disasters have been in a systemic decline for a century plus at this point. No. Right. It used to be that if you were confronted by, you know, basically any kind of... What's that?
Starting point is 00:21:34 Cold. Yeah. There's no heat. Yes. People, yeah, if it was cold, you'd freeze to death. If it was hot with no air conditioning, you might die from heat stroke. Any kind of tornado, flood, mudslide. I mean, look, in Boston, like 150 years ago or something, there was like a molasses, basically a mass tragedy,
Starting point is 00:21:48 where people like literally drowned in a molasses flood. Like nature is vicious, right? Nature really has it out for you. And if you're unprotected in a state of nature, like, the old thing is, life in the state of nature was nasty, brutish, and short. Poor, solitary, nasty, brutish, and short. Hobbes. Yeah, exactly. And so, so look, the flip side of the question is technology is now buffering us against sources of mass death that used to be far more common. And so this is not to sort of try to get away from the kind of the doomsday question of, like, what happens if it literally all stops working. But like, how do we build defenses
Starting point is 00:22:06 against the really bad scenarios in which that would happen. It actually turns out technology is our friend on that.
Starting point is 00:22:25 Yeah, yeah, yeah. And our friend, Elon, is protecting us against everything by making us an interplanetary species, which also deals with the asteroid problem. Yeah, I mean, look, the dinosaurs had no plan B, right? Nope. Turns out. So John asks, how can economic systems evolve to prevent human corruption from infiltrating advanced technologies that surpass our capacity for understanding?
Starting point is 00:22:51 Yeah, Ben, what do you think? Or maybe you might narrow in on the question a little bit. Yeah, you know, I'm not sure if I totally understand what he's getting at here, but I think human corruption, there is this agency problem, and I think that you kind of alluded to it on the kind of nuclear fission issue, where as these systems evolve, how do you keep the human interest from fouling them? And this kind of gets to the heart of a lot of the things that you spoke about in the manifesto, which is, for example, we had this banking crisis and everybody's intention was to basically lessen our reliance on giant banks that became as powerful as many governments in the world. And of course, because of this Baptists and bootleggers issue, where the banks were the bootleggers and the government was the Baptist, we basically got the opposite. We got much bigger banks, much more powerful. And that trend is kind of continuing forward. So I think that when we,
Starting point is 00:23:51 look at how systems get corrupt and create real crises, we really have to beware of the agency problem. And we're going through a few of those now, I think, both on AI and also on crypto, where actually a lot of the answer to some of these huge, powerful monopolies, like, what do you do about tech companies getting so big and becoming monopolies? What do you do about banks getting so big and becoming monopolies? We actually have a magic technology that decentralizes power and actually creates a real form of stakeholder capitalism where all of the participants in the economy get rewarded for building the economy. And the biggest adversaries of that whole movement end up being the kind of Baptists and the
Starting point is 00:24:43 government conned by the bootleggers and the big banks and so forth, kind of over-highlighting real, but small, dangers of the technology and trying to stop the technology in its tracks, and kind of leading to this horrible agency problem where you have these very corrupt systems. So I guess my big thing would be, like, be really careful when somebody goes, oh, this is a problem with a new technology,
Starting point is 00:25:11 so therefore we have to stop the new technology. Like that, I think, is the pattern that's repeated over and over again and hurt us so badly on energy and it threatens to hurt us on intelligence and threatens to hurt us on decentralization. Yeah, that's right. Yeah, one of the ways to think about this for people who haven't run into this in their lives is there's a fundamental difference between pro-business and pro-market. Yeah. And they sound like they're the same thing and they're not. Because pro-business kind of begs the question of which businesses and then sort of, okay, what's the structure of the market? Are we talking about like, yeah. Right, exactly. Are we talking about lots of
Starting point is 00:25:45 companies having to compete and earn their way in the world, or are we talking about ultimately crony capitalism? And basically this is the pattern of basically what happens. This is kind of the point of the Baptists and bootleggers idea. What happens, which is, basically a new technology, something changes in the world. You have big incumbent companies that very much are opposed to that change, or want to control it and want to lock it down.
Starting point is 00:26:03 And so what they do is they go to the government and they basically say, oh, you need to regulate this. And they don't go in and say you need to regulate this for our benefit because they would get laughed at. So what they say is you need to regulate this. That would be a dead giveaway. Yeah, that would be a dead giveaway. So instead they say, you need to regulate this to protect basically the little people.
Starting point is 00:26:19 But what they're shooting for, 100% of the time, what they're shooting for is basically government-sanctioned barriers to competition for themselves. And I would even argue, like, I'll do a little effective pessimism. We don't in America today, in most industries, actually live in what you'd call a free market system. We live in more of a captured kind of big business cartel ecosystem. And you look just across, sort of, sector by sector. What you see are sort of two or three or four companies that have, you know, overwhelming market share in each sector and generate, you know, an overwhelming percentage of the profit, and have this extremely incestuous relationship with the government. By the way, often to the point where they're actually writing their own regulations, right?
Starting point is 00:26:52 Like, it's their lobbyists actually writing the regulations. It's the industry groups that they run. Well, the classic example is... I think that's how the majority of regulation works, exactly. Yeah, that's right. And, of course, and then there's this other form of corruption, right, which is the revolving door, which is like, okay, if you're a regulator, it is extremely tempting just out of pure self-interest to kind of do what these big companies want because they'll hire you, right?
Starting point is 00:27:10 After you're done doing that. And so this is sort of this corruption after the fact that happens with the revolving doors. So basically, you could be pro-business and be completely in favor of all of that, right? Because it is still businesses doing everything at the end of the day. Pro-market, right, says, no, none of that, what I just described is acceptable. That's not how things should work at all. The last thing any government should be doing is giving any particular company some special right or privilege, some ability to block out new competition.
Starting point is 00:27:33 And in fact, what you want is more competition. You want more competition, more markets, more capitalism, as actually the answer to that, precisely to keep the big companies from basically just taking over and not having to compete anymore. Yeah, 100%. And actually, that leads very well into the next question, which is how exactly will markets prevent monopolies? Please elaborate on your point. And I think this is so key because, look, the nature of companies, even monopolies, is that the older and larger they get, the less adaptive they become. And look, we've seen this with Google, who invented most of the new AI technology, and then were
Starting point is 00:28:13 a little bit asleep at the wheel. When OpenAI released GPT, they missed it just because, like, they're big, they're complicated, they're slow, they're not as good anymore. And so if the new AI companies are free to compete, if open source AI is free to compete, then all of a sudden, that's the best kind of way to break that monopoly.
Starting point is 00:28:46 that's a great insurgent technology to defeat those monopolies. And the thing that prevents that is, as you say, not a pro-market, but a pro-business, a pro-very-specific business, businesses that have enough money to bribe, corrupt, lobby the government into creating regulations
Starting point is 00:29:05 that prevent the new company from competing. Right, that's right. And we're seeing a lot of that right now. I find the heart to be terribly corrupt. With that in mind, can you elaborate on your statement, love doesn't scale? I don't know if you meant because the heart is corrupt, but perhaps you do.
Starting point is 00:29:29 Well, so I didn't say the heart is corrupt. I mean, look, I think what the question is alluding to there, I would assume, is sort of this perennial debate about human nature, which is, is man primarily good or bad, which we could talk about. But I think on the thing about love not scaling, let me hit that one directly and then we can maybe go to the bigger topic. So the formulation here is from a guy named David Friedman, who's an economist and Milton Friedman's son. And the thing that he said that really stuck with me is, look, there's only three ways to get people to do things for other people, right? Fundamentally, one is love.
Starting point is 00:29:57 And you see that in people's families and their friend networks, like, I'll do things for Ben without him having to pay me, right? And so that is an important force. Or without me threatening to kick your ass. Coming to the other one, yes, exactly. So without force. So there's love. Without love, there's basically two other choices, and they're basically money and force. Right. Money's the carrot, force is the stick. Look, money is capitalism's answer. And again, going back to just beat up on the communists a little bit more, like this was the big realization of all the communist societies in the 20th century
Starting point is 00:30:24 and today, which is like basically what communism and its derivatives, socialism and so forth, expect: they expect love to scale. And so they expect that you should work in whatever, the Siberian salt mines or whatever it is, the fields or whatever. And you should do that out of love for your fellow man, and you should do that for love of the society and so forth. And by the way, look, like the Nazis were the National Socialists, like they had their own spin on this. You're supposed to do things on behalf of the German people; it's the same thing. Like, you're supposed to love this macro unit. But the love doesn't scale.
Starting point is 00:30:49 The problem is people just don't naturally love people they don't know, right? And by the way, my view, that's not because there's something morally wrong with them. It's because they don't know the other people, right? And it's like, okay, are they being loved back? Right. And like, are the other people going to be pulling their weight, right? Or is there going to be a free rider problem? And, of course, the answer is at any level of scale.
Starting point is 00:31:07 Of course, there are free rider problems if people aren't required to work. And so this is the irony at the heart of the whole thing. A society built on the idea that love scales becomes an incredibly dark, dystopian, hostile, and ultimately brutalist place, because love doesn't scale. Force is one way around that, which is, okay, the way you get people to work, even though they don't want to because they don't love the people in some remote area that they'd be working on behalf of, is you just put a gun to their head. And then the third option that falls straight out of that is, okay, how about money?
Starting point is 00:31:29 And then sort of money, of course, is a proxy. Money is a tool for sort of, you know, what they call sort of rational self-interest, or enlightened self-interest, which is like, okay, I'm going to get paid money to do this. It's going to benefit these other people. I'm not primarily doing it because it's going to benefit people I haven't met. Look, maybe I love everybody and maybe I would love to meet all my customers. And by the way, look, when you walk into a restaurant and you've never met the owner or the host before, like they're thrilled to see you, right?
Starting point is 00:31:50 And so do they literally love you? Are they your new best friend? No. Are they excited to see you? Yeah. Well, they have a very positive sentiment towards you because they know you're going to pay, right? Anyway, so that's the best solution that we've come up with. Yeah.
Starting point is 00:32:02 barring some profound change in human nature in which people all of a sudden become far more generous than they've been historically. That seems like it's likely to be a stable state. Yeah, yeah, it's funny. It reminds me of an interesting conversation I had years ago with a friend of mine who grew up in the Soviet Union. And I made some offhand comment that like whatever, Stalin was a crazy psycho or that kind of thing. And he goes, what are you talking about? Stalin was very rational, very smart, go back, read what he wrote, look at his speeches.
Starting point is 00:32:28 He was a super systematic thinker, very intelligent. And I was like, well, like, what went wrong? Why did he kill 20 million of his own people? And he just said, when you take away the carrot, all you have is stick. And that is so true. And I think a lot, you know, that's a lot of the point. At scale. In your family, yes, you can run a communist family.
Starting point is 00:32:50 You can even run a communist kibbutz at that scale. It can work for sure. Yeah, that's right. Hayek made this point. And Deirdre McCloskey has made this point also, which is like, look, we do live in a superposition of the two systems, right? Because, like, you don't want to be the asshole who, like, you know, runs your family
Starting point is 00:33:05 like it's a capitalist enterprise. Like, you charge your kids rent, like, for sleeping in their bunks when they're eight. We all grow up in our family and friend environments, like, we're all communistic in that context, right? But then there's this kind of schizophrenia to it, which is like, okay, we go out into the world, and the world doesn't act like that. And all of a sudden there's a different way to exchange and a different way to relate. And so I think ultimately, you know, the answer lies in the superposition of the small and the large. And let's just say, yeah, the people who are too abstract on this, I think, get derailed. Yeah, I think that's the exact point that everyone gets confused on,
Starting point is 00:33:33 because who wouldn't want the country to run like their family? That would be so much better. The problem is it's impossible. Next question. This is actually one that I think a lot of people have, from Morton. How do you see private capital versus public research budgets when it comes to fundamental progress? Aside from AT&T's monopoly that drove Bell Labs,
Starting point is 00:33:55 I have yet to see systemic technological progress from private investment. I think I would disagree with the last part, but I'll let you start. Yeah, so I'd actually say there's actually three models. I would say there's private, there's public, and then there's a third that I'll come to. So look, like a couple things I think are true. So one is, again, I'll give the effective pessimists their due on this one also, which is, look, like, straight capital-R research, that is sort of the time-honored way of, like, discovering the principles of the universe and so forth. The best of that has no idea if or when it will ever commercialize, right? Because by definition, it has no idea whether or not the experiments will even pan out, right, or the theories are even correct.
Starting point is 00:34:29 And so there is this kind of research that historically has been publicly funded. And look, in the U.S., we have had these agencies like the National Science Foundation that have done a lot of that for a very long time. And whatever issues NSF has, if you look at the totality of spend in their existence and then the results, you would say, yeah, that was absolutely an outstanding investment from a societal standpoint. Right, right, hundreds of times the investment, yeah. Yeah, that's right. And again, to be an effective pessimist, a lot of that research may not have been research that
Starting point is 00:34:52 would get done by private companies. Look, on the other hand, I think private companies carry their weight more than people think. And look, part of it is, the questioner alluded to Bell Labs, but there is this thing that happens with these companies, you know, when they get big and powerful, they do more of this. They open these labs, right? And so AT&T did it, IBM and Hewlett-Packard and Google and Microsoft
Starting point is 00:35:12 and many others across many industries have done this. And I think, quite honestly, part of that is PR for them; they like to show that off. I think part of that is, look, they start to think in terms of longer time horizons and they do ultimately want new products at some point. Part of that is it's a recruiting and retention exercise: the way to get the best and brightest technical minds to stay there is to tell them that they can do research and publish their research. And so there is that. And look, Ben, you mentioned, like, look, the breakthrough on AI
Starting point is 00:35:34 just came out of Google, right? It came out of a private company, which is a great example of that. So, yeah, look, there's some tension there. They pulled quite a few researchers out of academia to pull that off. They did. And in fact, that's been a long-running source of tension, actually, between the tech companies and the universities for the last 15 years or so, which is, as these new tech companies have built these big research labs, they have pulled a lot of the good faculty out of universities. And by the way, that process, I think, is accelerating in AI specifically because the capital needed to actually do modern AI is so big that actually universities can't afford to do it
Starting point is 00:36:01 right now. So you get this kind of issue of like, are you draining the universities of their remaining competent professors and leaving only the other kind? And you could debate, and we could have a long debate, about the pros and cons of public versus private. There is a third model. And this is the one that people often talk about, you know, as sort of an aspiration, which is this idea of these very large societal projects. And in the U.S., you know, we talk in particular about the Apollo project and the Manhattan Project, right? And we say, this is kind of a frequent lament from kind of declinists, right,
Starting point is 00:36:26 which is, why can't we have more Apollo projects and Manhattan projects? But I think that's actually a third model, and the third model is military and militaristic, right? Manhattan was just a straight-up military project, right, to build a bomb run by the military. Apollo was sort of a civilian thing, NASA, but it was in the context of a military arms race
Starting point is 00:36:42 with the Soviet Union and a technological arms race with the Soviet Union in a much more militarized society than we have today. And, of course, when the military gets involved, two things happen. One is they can do things at speed, when they really want to, because they just tell people what to do, and then also they get access to a lot of money,
Starting point is 00:36:56 even when things are rough. And so personally, I go back and forth a little bit on this, is like, do we want more Manhattans and Apollos? On the one hand, it's hard to say no. On the other hand, do I want a society that is that much more militarized? If that's what's required, I don't know. Yeah, that could... What would that mean for what is happening in the world
Starting point is 00:37:11 if we get back to the point where we're as militarized as we were in 1941, right? Or even 1960. Yeah, no, it's actually one of the great things about the modern world is we have so many fewer wars, and at a smaller scale. And every war is still horrible. Look, and hopefully that continues, right? Yes.
Starting point is 00:37:29 Okay, here's a good, simple question from Tux. How do you bridge the gap between being anti-statist and supporting America First policies or maybe American dynamism as we do? Yeah, so look, I think that all these things end up being a question of power and how it works. So the extreme form of statism is communism where 100% of the power
Starting point is 00:37:57 is in the public sector and no power is in the private sector. And that has the issues that we spoke of. I think that having no public sector can quickly go to anarchy. And that's not, at least not something that I would advocate for. So I think, look, as a society, we kind of have this collective, there's a certain amount of collectivism in shared values, shared morals, what's okay, what's not okay. In America, we have things like freedom of speech, freedom to be who you are, kind of, and we're very anchored on these kind of free society ideals. And you need a state to do that, at least I believe; I think there are probably some people that don't believe that, but you need a state for that. And preserving that, those values and that way of life, is
Starting point is 00:38:48 extremely important. And that's primarily the role of the government, with all of us citizens participating. And we try to participate in that through our American dynamism efforts, and that's important. So when I say I'm anti-statist, I'm really saying I'm anti-communist, and anti-too-much weight on the public sector at the expense of free markets and basically freedom for the citizens. Yeah, Ben, I think you probably mean, broadened out, anti-communist also, or by extension, anti-authoritarian. Yeah, for sure, anti-authoritarian. I think those go together because it's hard to be an authoritarian unless you have an amazing amount of power, right?
Starting point is 00:39:27 Like, even if you wanted to be authoritarian in the U.S., and we certainly like had authoritarian tendencies from time to time from various politicians and leaders, but it's really hard to do because you don't have enough power to pull it off. And that's the greatest thing about our system, I think, is you can't gather enough power to become completely authoritarian in the U.S. and that's maybe the most fundamental thing we want to preserve. Zach T. Can you further define
Starting point is 00:39:56 the law of accelerating returns and how it may play out in the future? So I think it's, I never know whether to use this as a, it's such a great metaphor. I'll go ahead and use it. So Paul Romer uses this metaphor. He says, ideas have sex. Let's say ideas reproduce.
Starting point is 00:40:09 They go through the same reproduction and evolutionary process as living organisms. That's different than having sex, by the way. Yeah, you're right. Okay, that's bad, now that we have birth control and all these things. The longer form is probably better. So the argument is basically ideas beget ideas in quite the same way that people
Starting point is 00:40:25 beget people. So basically it's like the more ideas that you have, the more combinations of ideas you can have. And those combinations of ideas are themselves ideas. And then based on that, then they can further kind of replicate, you know, they can kind of crossbreed and have offspring. And you see this anytime, of course, you're building any kind of technological product is you're pulling in ideas from like all over the place.
Starting point is 00:40:44 that are below you in the stack, you're getting inspired by all the other applications anybody has ever tried to build. You're getting inspired by all kinds of things. You know, look, AI is inspired by the neural structure of the brain, quite literally, right, deriving from biology, crossbred over into computer science, mathematics. And so basically, if this process is working the way that it should, you should see sort of this accelerating explosion of variety and sort of speciation and reproduction and scaling of the number of ideas in the world that sort of catalytically feeds on itself like a chain reaction.
Starting point is 00:41:11 By the way, this was also the argument, to connect this back to the idea of human population. So this was the argument from this guy, Julian Simon, who I admire a lot, who wrote this book called The Ultimate Resource. And a lot of his work was in the 60s and 70s when there were all these pitched battles, of course, around natural resources, environmentalism. And he was kind of the avowed enemy of this guy Paul Ehrlich, who was the guy who predicted mass famine and death from kind of increases in population. And so Julian Simon said, no, actually what you want is you actually want a lot more people in the world. Humans, people, are the ultimate resource, right? Not any kind of raw material, but literally people. And he said, why do you want more people in the world?
Starting point is 00:41:41 Because if you have more people, you'll get more ideas. Yeah. Right. And so more people means you'll have more ideas, more ideas in combination with ideas leads to more ideas. Those ideas lead to ways to make things better in the world. Among the things that those ideas make possible are ways to support more people on the planet. Right. And so he said, like, that quite literally, the answer to natural resource consumption, right, for example, or natural resource limitations or environmental considerations or whatever, the answer is not the Paul Ehrlich approach of depopulate, right? Reduce the human population, therefore reduce the number of ideas, both the number of people and the number of ideas. The answer is to actually put the pedal forward, more people, more ideas, more solutions. And yes, clearly, I agree with him in that argument. Amazing. But by the way, Julian Simon is probably one of the most underrated economists and philosophers of the last hundred years. So if you're not familiar with him, that's a great one to read. On that real quick, a great point that he made that really blows people's minds when they read it for the first time is he made the argument that you'd never run out of any natural resource.
Starting point is 00:42:38 Yeah. And he had a bet on it, right? Like, he did. He did. So he had a famous, he had a ten-year bet with Paul Ehrlich, the population bomb guy, and it was a bet on the price of a basket of natural resource commodities 10 years in the future. And Ehrlich was, of course, 100% convinced. That was with the first peak oil. The whole peak oil thing, exactly, is that same thing. And he let Ehrlich actually define, I believe,
Starting point is 00:42:59 the basket of commodities. So he kind of loaded it in his direction. And the bet was, will the price of this basket be greater or less than it is today in 10 years? And everybody who's kind of in the sort of conventional thinking on this was like, well, obviously the prices of all this stuff are going to go up because there's more people, there's more consumption.
Starting point is 00:43:12 And that's what everybody wrote in the press for many years. Like nobody was saying what Julian Simon was saying. Right. And then the kicker is Julian Simon won the bet. The price of that basket was lower in 10 years. The point was we never run out of natural resources. His point was, with markets, the way that we know that a natural resource is becoming scarce is its price starts to rise. As its price starts to rise, self-interest says we should figure out ways to not need as much of that natural resource.
Starting point is 00:43:35 Right. Right. And so as the price of oil rises, then all of a sudden, we have an economic incentive to develop alternative energy, ranging, by the way, from solar to wind through to things like nuclear and then maybe in the future, even nuclear fusion. And so it's actually markets working at their best. As the price of something rises, your incentive, your self-interested incentive to come up with the alternative rises. And alternatives that were not previously price effective actually become price effective, right? So this was the rise of fracking, right? Fracking worked
Starting point is 00:44:01 because oil and gas started to get expensive enough that all of a sudden the additional cost of fracking was actually worthwhile. And then fracking brought the price right back down. And fracking was a classic example of an idea. It was a technological innovation born of human creativity. And so basically what Julian Simon says is that there's a homeostasis kind of built into the system. And this is not a dystopian scenario in which we are doomed to run out of everything and everybody's going to freeze and die. In fact, it's not necessarily utopian. Like, natural resources will still cost money. But it's a fundamentally positive view, which is that it is human ingenuity that is going to cause us to not have the problems that the doomsayers say we're going to have. And again, 300 years of this and so far so good.
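The substitution mechanism described here can be sketched in a few lines. This is a toy illustration with entirely hypothetical prices and costs, not a model of actual oil or fracking economics; it just shows how a rising price for the incumbent resource flips alternatives from uneconomical to economical:

```python
# Hypothetical per-unit costs for alternatives, for illustration only.
alternatives = {
    "fracked gas": 60,
    "solar": 70,
    "next-gen nuclear": 80,
}

def economical(incumbent_price: float) -> list[str]:
    """Alternatives worth developing once they undercut the incumbent's price."""
    return [name for name, cost in alternatives.items() if cost < incumbent_price]

for price in (50, 65, 90):  # the incumbent resource getting scarcer and pricier
    print(f"incumbent at {price}: viable alternatives -> {economical(price)}")

# As the incumbent's price rises, more alternatives clear the threshold,
# which is the self-interested incentive described above, operating purely through prices.
```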
Starting point is 00:44:32 going to have, which, and again, 300 years of this and so far so good. Yeah, it's an abundance view of the world, as opposed to a scarcity view of the world. not the abundance view is right, which is good news for all of us. Matthew has a question, which I think you have an unusual answer to, which is the dream of fusion stopping us from enjoying the insane gains we can get from fishing. Yes, so primarily what's preventing us from enjoying the insane gains from fishing. And by the way, they are insane.
Starting point is 00:44:57 Like the level of energy that we could be producing from modern nuclear fission reactors and the safety of it is like just absolutely... Really, France is doing it right now. Yeah, yeah. France is doing it for you. By the way, France just got, I mean, the European politics are so entertaining on this, because France is so pro-nuclear and the rest of Europe is so anti-nuclear. France just had to get a waiver from Germany to continue to run their nuclear reactors,
Starting point is 00:45:19 which they finally just got. Because most of the power in France comes from their fission reactors. Yeah, and the German greens have been determined for 50 years to turn those reactors off. So, yeah, so look, the big reason why we don't have widespread nuclear fission power today is because of the precautionary principle, because of basically the fear of disaster, which basically makes people emotional, and then it turns out here the emotional decision is a very damaging, bad decision
Starting point is 00:45:41 because the alternatives turn out to be things like gas and coal. And so that's overwhelmingly it. Now, having said that, back to the question is, yeah, so the new thing that you can say if you're trying to fight nuclear fission is, oh, we don't need it because we have fusion right around the corner. And by the way, look, I start by saying,
Starting point is 00:45:55 I hope that's right. I hope fusion really is right around the corner. It's been right around the corner for a while. I hope it's right around the corner. That would be great. It turned out to be harder than fission by quite a bit. It's quite difficult. And then, look, I think what's going to happen is, look, I think we'll get fusion to work at some point.
Starting point is 00:46:09 There's very smart people working on it. I think they'll get it there. I think the same forces and ideas in people that have prevented the deployment of nuclear fission will immediately prevent the deployment of fusion. And so I think this idea that all of a sudden... The same kind of bad narrative, corrupt pessimism. It'll be the same arguments. It's this incredibly dangerous thing. And who knows if it goes wrong and like, what if this and that and the other?
Starting point is 00:46:31 And like, we can't take these risks, and can they prove it's going to be safe forever, infinitely? And it's going to be the same arguments. And these nuclear fusion companies are going to start out being very optimistic and they're going to hit this wall of sort of regulatory and emotional and political and ideological resistance. And I hope they punch through it. But look, Richard Nixon did two things in the early 70s around this. One is he declared something called Project Independence,
Starting point is 00:46:51 where he said, we the U.S. would build a thousand new nuclear fission power plants by 1980, become completely energy self-sufficient, be able to withdraw completely from the Middle East, be able to be zero emission. And then he created the Nuclear Regulatory Commission, which then prevented that from happening. Nope. Yeah. How many new nuclear reactors have they approved?
Starting point is 00:47:10 Zero new plants in 40 years. Wow. So. Not good. Well, and again, here you go back to incentives. Okay, now imagine, okay, now, again, to give the devil his due. Like, imagine that you're the newly appointed regulator. You're the newly appointed chairman.
Starting point is 00:47:25 Ben Horowitz, you're the newly appointed chairman of the Nuclear Regulatory Commission in 1973. What are your incentives? How much glory are you going to get, right, if we build some new reactors, versus how horrible is your life going to be if there's another nuclear accident? Yeah, right, right, right. Another kind of incentive agency problem, yeah. Right.
Starting point is 00:47:42 And then by the way, you've got the existing energy companies in there doing their thing, right? Saying, oh, no, don't worry about it, oil and gas. And by the way, we're going to do clean oil, clean coal. We're going to be like Volkswagen, we're going to have clean diesel, right? Like, just let us continue working on the clean coal and the clean diesel.
Starting point is 00:47:59 And then, again, by the way, fusion's right around the corner, and that'll be better. And then you're like, wow, these companies, like, they're so big and powerful and successful. And by the way, they seem like they're kind of hinting they're going to give you a job when you're done here. Yep. And you're back to square one.
Starting point is 00:48:11 This is a fun one. Luke Croft, you say that humans want to be productive, yet so many hate their jobs and work only because they have to. When it comes to abundance due to tech innovation, should we allow people the option of not working? So there's these two views. And look, like, again, no utopianism here, right? Like, not all jobs are fun.
Starting point is 00:48:33 Like a lot of people work jobs that they really don't like, and they're doing it because they have to, and they're doing it because they're trying to support a family or support themselves. And so certainly nobody promised everybody a job everybody loves. My analysis is, if you close your eyes and imagine somebody who doesn't have to work, right? And I'm not talking about, like, stay-at-home mothers. I'm talking about, like, somebody who, like, in our system today would be working to be able to make money to support themselves. And then there's a future ordering of society in which they can elect not to work. And they will have the same or similar level of material comfort that they have today.
Starting point is 00:49:03 And they can just kick it. You kind of close your eyes and imagine that person, and there's kind of two possible versions of that person you can imagine. One is the person who is now, by the way, as Marx said, liberated to do, what was Marx's whole thing?
Starting point is 00:49:14 We're all going to be fishermen in the morning, literary critics in the afternoon, poets over dinner and musicians at night, right? Like we're going to be able to like self-actualize into all of these things that don't involve money and we're going to be, we're going to all be creating this and that
Starting point is 00:49:26 and we're going to be doing all these amazing things because we have all this time free. And look, there are some people for whom that's the case, right? There's certain people for whom I'm sure that would be the case. There's the other scenario, and I would just call that one the Cheetos and meth scenario, right? Yeah. Which is like...
Starting point is 00:49:39 And PlayStation. And PlayStation, right? And I like Netflix. I'm a fan of Netflix, but like maybe not 12 hours a day. Yeah. Right? And like, sit on the couch and get baked and just like, there goes all your motivation, right? Straight out the window.
Starting point is 00:49:54 And this is where I use in the essay the fairly provocative term, like farm animal. Like, that's a farm animal existence. Yeah. Right? That's the existence of a cow. Yeah. Right. And like, cows are great, but like, I don't think we should be cows. Like, I don't think we should be farm animals. I don't think that is an advance. You're back to the WALL-E scenario we were
Starting point is 00:50:12 talking about earlier. And I think we just need to be realistic about that. And again, this is very much one of these luxury belief things, where kind of the class of people who imagine themselves as being the poets in the morning, fishermen in the afternoon, musicians at night think that's a general thing. And maybe that's true for them. By the way, maybe if you went to one of these Ivy League universities and you hate your job as a barista, you'd be much happier and you'd be writing poetry if you didn't have to do your day job. Maybe that's true for them, right? But like the consequences of that misjudgment, if that's wrong, on the other eight billion people, I think are very profound and possibly extremely negative. Yeah. And actually, we've run this experiment in the U.S. to some extent.
Starting point is 00:50:48 And I've been studying this more lately, but, you know, perhaps there were a lot of kind of bad things that happened with the pilgrims and the Native Americans. But I think the worst thing may have been the reservation system, at least the longest lasting. And the reservation system is essentially UBI. It's $65,000 a year, I believe, per resident. And anybody who's spent time on a reservation knows, like, that just hasn't worked that well. Like, removing purpose from people's lives, generally, there are people who can deal with it, but that's, I think, certainly the minority. And it's, I mean, at least in my view, that's not been great for the Native Americans. And to kind of scale that to society, as many are proposing these days, is probably not the best thing.
Starting point is 00:51:32 The other thing I'll add is jobs have actually gotten much better over time. So not all jobs are great, but like for many hundreds and thousands of years, every job was farming. And look, not every human is suited to be a farmer. I know you weren't, Mark, because there's a lot of farming in your hometown. And then the thing that came after that, first, was these assembly line jobs. It's always kind of amusing to see politicians say, oh, we need more good jobs like these manufacturing jobs. Like, doing this all day like a robot is not like the greatest job in the world. It may pay well, but it's not the most fulfilling.
Starting point is 00:52:08 And so many of the people on those lines are on drugs. And in fact, Henry Ford famously doubled the wage. But the reason he doubled the wage was he had so much attrition, because people hated those manufacturing jobs so much. And so I think that the jobs that we're producing now have been increasingly interesting. And they're not all great. They're not all great for everybody. But one thing is for sure is we have a much broader variety of things that people can do.
Starting point is 00:52:35 So you have many choices. And hopefully people can find the job that's right for them. Oh, this is interesting. Coin Fluence X. So that's a pretty neat name. How do you present the contrast between the rigorous technological and mathematical education seen in countries like China, which emphasizes relentless advancement, with the evolving educational paradigms in the West, which prioritize intersectional perspectives and social values?
Starting point is 00:53:04 Yeah, Ben, why don't you start? I agree. That's an interesting one. Yeah, I mean, I think that it's tricky, you know, like, and it's been a long time since I've been in school and my children are all adults, so I'm a little far away from the actual thing that's happening in schools now. Having said that, boy, I think that there's kind of education that you can use, where you can go make something or build something or figure out how large systems work or perform, like, an in-demand function in society. And then there's a whole class of other things. Like, you can teach people about anything. Now they even have lots of, like, rap education in college and this and that and the other. And, you know, it's interesting because the great
Starting point is 00:53:55 musicians I know, and I know many of them, never took any kind of class like that. Most of them didn't take any class in music, by the way. It's funny, like my friend Kanye taught himself music. He was an art student. He taught himself music. And I think that for, you know, the great creatives, I don't know that studying to become a virtuoso in those things is necessarily even the best path. And then you have these other things, which are these social theories, unproven social theories that a lot of people are teaching these days, which I think those are fine hobbies, and you and I pursue them as hobbies, studying these social theories and new ideas on how to organize society or whatever or how that all works. I don't think it makes sense to charge people
Starting point is 00:54:44 $300,000 or whatever college costs these days to teach them a hobby. That, like, that seems really absurd to me. I know people have different views on this, but I think that the purpose of the education that we funnel young people into ought to be to enable them to make significant contributions back to society. And that probably, I don't know if it's as narrow as the Chinese education, but it's probably not as broad as some of the things that we're doing. Yeah, and then I might just pop the question up a level, which is, look, education itself is an industry. It's a field, an industry, a business. And look, both the nature of the American education system and, for that matter, a system like China's, they're primarily centralized and government controlled. And so the American system, you know, K through 12, is primarily a government monopoly with occasional offshoots. And the offshoots are very restricted. Yeah. And very small in number. And then look, the college system in the U.S. is a cartel. It's quite literally a self-regulated cartel, by virtue of the fact that it's on the sort of federal student loan drug it requires to work financially. And then the accreditation agencies that determine which colleges get federal student lending are run by the colleges themselves. So it's a self-reinforcing cartel, which is why you don't
Starting point is 00:55:54 see a lot of new universities, ever. That's why the major universities in the U.S. are older than the country, right? And so it's the same thing. And obviously in China, the system is government controlled, inherently, completely government controlled, as is commonly true around the world. And so we've sort of backed into this idea of education as, like, an abstract good. And then we have the specific implementation of these highly centralized, authoritarian, non-market-based approaches to it. And then you can argue the pros and cons and the results. And I thought the question set up this very interesting aspect of it, which is like, okay, when does it become kind of too much rote memorization versus when does it become kind of too much kind of wild theorizing, with maybe
Starting point is 00:56:26 a sweet spot somewhere in the middle. You know, look, education as a system can't really respond and adapt to the changing needs of people because it's primarily not market based. And so we kind of just get the system we get. We send our kids to it. We don't think we really have a choice for the most part. And I would just say, like, the big thing is, in all these societies, it's just stuck. And then there are all these signs that it's becoming dysfunctional across many societies. We can spend hours on that. And so I would just at least kind of put a placeholder in and say,
Starting point is 00:56:54 I think we should also squint and kind of wonder, like, is there a different kind of system that could get built that would be much more market driven, much more technologically sophisticated, that would be much better suited to modern needs and would be much more subject to actual competitive pressure and the need to be good. There are some very good education startups working on this, and there are some very good charter schools, and there are some very good new startup universities. And so there are some people taking swings at this, but at a macro level, like, we're
Starting point is 00:57:16 not there yet. And I think maybe we should think over time about trying to get there. Yeah, and that's interesting because we are in the middle of this very interesting crisis, the student loan crisis, where President Biden has kind of proposed, and I don't know how much of it has actually gone through all the legal objections and so forth, to kind of forgive some portion of the student loans. But the interesting thing about that is that's kind of the ultimate treating the symptom and not the cause, because why can't any of these students pay back their loans? And the obvious answer is college isn't worth the money you pay for it. And then it doesn't translate into a job that enables you to pay back what you borrowed to do it. And that's a much
Starting point is 00:57:59 bigger problem for young people, and an ongoing problem, that, like, you either kind of subsidize college in its entirety or you have to fix that problem. And I think that, given what colleges are willing to do with their tuition if they have free money, like grow it at double the rate of inflation and hire more administrators than they have students and all these kinds of things, another incentive problem, it's a pretty dangerous idea to make college free in that way. We do need a better mechanism for sure. Maybe now is the time.
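As a rough back-of-the-envelope illustration of why tuition growing at double the rate of inflation compounds into a real problem, here is a sketch with purely hypothetical numbers (the 3% inflation rate, the $20,000 starting tuition, and the 20-year horizon are assumptions, not figures from the conversation):

```python
# Hypothetical figures for illustration only.
inflation = 0.03                  # assumed annual inflation rate
tuition_growth = 2 * inflation    # "double the rate of inflation"
tuition = reference = 20_000.0    # starting tuition vs. an inflation-only baseline

for _ in range(20):
    tuition *= 1 + tuition_growth
    reference *= 1 + inflation

print(f"after 20 years: tuition ${tuition:,.0f} vs. inflation-only ${reference:,.0f}")
# Tuition ends up roughly 1.8x the inflation-only baseline, and that widening
# gap is what students end up borrowing against.
```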
Starting point is 00:58:30 Hopefully. Okay. Sam Arnold, to what extent do you believe we really control what technology does after its release? Can we exert any meaningful control after invention? Yeah, I mean, a couple of things. So one is, obviously we do control. I mean, there are pretty strict controls in a lot of areas, right, as we just discussed in the nuclear case. So one is, it's not quite that there aren't any. You know, I think maybe the underlying question here would be, can we predict what the consequences are going to be? And by the way, this is a very hot topic right now, because you have this thing where there are a bunch of AI practitioners who are making these, in my view, very extreme statements about what AI is going to do that's going to be horrible. And there's this very strong temptation on the part of people to make what seems like a very logical kind of assumption, which is that the people who invented the technology are in the best position to be able to predict what happens with it afterwards, and then are presumably also going to be in the best position to propose whatever regulations or controls are required to prevent those things from happening.
Starting point is 00:59:24 Yeah. I have not in my life and I have not in my reading of history seen a lot of examples where the inventors of technology are very good at this. Yeah. The example I like to cite is Thomas Edison, who invented the phonograph, and he was completely convinced that the use case for the phonograph was going to be to listen to religious sermons. He was a very pious man and he took religion very seriously. And he just assumed that if you owned a phonograph, the point of it would be you'd get home at the end of a long day at the office or in the factory.
Starting point is 00:59:49 You'd kick off your shoes, put on your slippers, and you'd have your dog by your side and your kids gathered around, and you would put on religious sermons. You wouldn't have to go to church and bother with all those people, which for Thomas Edison was a big thing. Exactly. And by the way, and then you could do this every night, right? You didn't have to wait until Sunday. This is great. You can do this all week. And I don't know, look, a few people do that.
Starting point is 01:00:07 Most people don't. Most people listen to music. And Ben, you'll recall what was the first form of music that was kind of the big hit genre on phonographs: it was jazz. Oh, yeah, sure. And you'll recall what people said about jazz at the time. Well, they said many things, but it wasn't real music. It was the devil's music.
Starting point is 01:00:26 It was very bad. It was a channeling of all kinds of unholy impulses. It was going to cause young people to fornicate. They said basically all the same things that people say about rap music today. I was thinking there was some racism involved, too, I'll just say. Oh, of course. It absolutely was. It was going to be, yeah, all of a sudden, a channel for these black musicians to show up in the homes of white people.
Starting point is 01:00:43 Like, no question. And so anyway, like, it turned out that people had like all kinds of issues with the technology. It just turned out they weren't remotely the issues that Thomas Edison predicted that they would have. And if you look at this, historically, you're kind of like, well, of course he didn't know. Because, yeah, he's a techie. He just invented the technology. He doesn't have like the crystal ball. He doesn't have some special foresight.
Starting point is 01:01:00 And in fact, he's a particular kind of person. He's the kind of person who spends all this time in a lab. Yeah, right. He's unusual. He's unusual, right? And one might speculate, you know, he might be psychologically a little bit different than most people, right? And he draws satisfaction from different things,
Starting point is 01:01:13 and he doesn't spend a lot of time, right, with, like, regular people. And he's certainly not an expert on, like, politics and society and psychology and sociology and all these other things. So anyway, I think where I would start back to the original question is just like, I actually don't think it's that easy to forecast these things. And then specifically, I don't think it's any easier for the people who invent the technology to forecast these things. And I think they carry a lot of unjustified credibility.
Starting point is 01:01:34 This also came up in the Oppenheimer movie. The same thing, it's like these physicists all of a sudden are convinced that they're the ones who are going to be working on, like, the game theory and the philosophy and the morality of the deployment of nuclear bombs. Yeah, well, they were. Well, they tried, but, you know, that was the nature of it. It was a lot of that, yeah.
Starting point is 01:01:49 It was, but, I mean, this was the scene with Truman, which is something that actually happened, right? Which is, like, Oppenheimer shows up to kind of wring his hands about the morality of the atomic bomb, and Truman is just, like, get this guy out of here. I'm the elected president of the United States. Like, let's listen to me.
Starting point is 01:02:02 Yeah. Right, and by the way, Truman, and again, not that Truman made all the right decisions or whatever, but Truman was the duly elected president of the United States. Like, he is the guy who should have been making that decision, not the guy who invented the thing. Yeah. And so I just think, like, I guess my big appeal here is just, like, humility.
Starting point is 01:02:15 Like, I think we as technologists need to be very careful. We're not, but we should be very careful about kind of crossing the line and deciding that we're going to do societal engineering kind of in our spare time. And then people who hear technologists kind of doing these things should be very skeptical that the technologists deserve any kind of unwarranted credibility on this. Right. And then look, there are going to be these big questions. What history basically says is we figure them out as we go.
Starting point is 01:02:36 Yeah. And I think the alternative to that is to just not do new things. And so I know where I come out on that. Yeah, yeah. I'll give a plug for a great book that you recommended, When Reason Goes on Holiday, which is sort of the story of how the great physicists and inventors did on politics and policy and game theory. You take super high IQ people and you put them in a research or university setting, and they go crazy. Like, they very frequently go very nutty on anything involving politics and society. And this was the American physics community in the 1920s and 1930s,
Starting point is 01:03:10 that basically, like, most of them went, like, hardcore communist. Like, Einstein was a Stalinist. Yeah. Like, we don't talk about these things anymore, but he was. These people end up, like, very radicalized. And this happens with other groups of people, right? Maybe at certain points, this happened in other areas in our public life. But, you know, it definitely, intellectuals go straight for it.
Starting point is 01:03:28 Thomas Sowell talked about this a lot. He's like, look, basically the problem with people who are hyperverbal and, like, work in ideas is they can get kind of arbitrarily unhinged. Yeah. And they can kind of talk themselves into crazier and crazier things. And their level of disconnection from the real world means that they no longer have any governors on how crazy their beliefs can get. And so I think that's what's happening in AI right now, and I think we need to be very cautious about who we listen to. Yeah, their trust in their own judgment
Starting point is 01:03:47 is profound. Okay, last question. This is actually one that I would have asked. It's from Bird, and it is: before publishing, what did you consider the most controversial point? Now that it is live, what makes people shimp the most? So, what did I think was the most? I don't know, Ben, let me ask you that. You read the drafts. What were you predicting was going to be the most controversial thing? Well, I thought for sure when you wrote it, everything on the economic front about free markets and love is unscalable and those kinds of things,
Starting point is 01:04:23 I thought those would be the most controversial for sure. Those are kind of time-honored crowd pleasers when it comes to making people mad. Yeah, yeah. It's just like the nature of humans that they disagree on that. That's a good question. I don't know. Yeah, I would say, I don't know.
Starting point is 01:04:38 And Ben, you tell me if you think I'm missing this. I think, with most of the attacks, I think like a lot of artists, I aspire to have a better class of critic. So this is one of these things in the modern climate. Like, a lot of the attacks have been on me. Yeah. More than on the ideas. Right, right, right. And, including the one you started out with earlier, like, the smug interpretation of that on my part would be, wow, the ideas must all be great if they're reduced to just attacking me. Yeah. Yeah, like, if that's all they've got, I must have otherwise made outstandingly excellent points, and they must not have counterarguments.
Starting point is 01:05:11 I would just tell you, I would love to see more, there have been a few, but I would love to see more substantive responses that are legitimately, like, carefully thought through. I agree with that. I mean, I think you've gotten a lot of emotional reaction. I mean, I think some of the more thoughtful ones are just probably based on a misinterpretation of what you said,
Starting point is 01:05:28 which is some people interpreted it as technology gone wild. Like the Girls Gone Wild videos of whatever era that was. It's like technology gone wild, just like, let it roam free. Which wasn't what you were saying; it's just more like, are we going to keep doing new things or not? Or are we going to kind of nip them in the bud the way we did nuclear fission? But like, you build something, you've got bugs, you've got issues. The Internet's had crime on it since like the very, very early days. And we keep working on that.
Starting point is 01:05:58 But the benefits of the Internet outweigh the negatives and these kinds of things. And I think people wanted you to be more on-the-one-hand, on-the-other-hand, but it's a manifesto. So, you know, you don't do that in a manifesto. Exactly. I thought there was one piece of feedback on Twitter where I thought somebody made a really smart point. One really smart point I saw was, back to that love versus money thing, somebody said, boy, like, isn't it the case that actually a lot of innovators,
Starting point is 01:06:22 like a lot of creators of new things, actually are doing it out of a sense of love? Yeah. By the way, this broadens out beyond technology, but like a lot of artists, a lot of musicians, a lot of writers, this is the concept of the labor of love, right? It's like they really feel like they're contributing something. And, you know, like, love of society, love of their fellow man, is, in fact, a motivation, even if they would also like to get paid. Yeah.
Starting point is 01:06:41 The accusation basically was I was presenting too kind of crimped a view of human nature when I narrowed down love that much. Yeah, I think that's right. Although, if you invent something and then you need to hire a bunch of people to work for you, that's where love runs out. Yes. They're not all going to join on that. And similarly with the musician, your publicist, your makeup artist,
Starting point is 01:06:59 you're this and that and the other. They seem to need money. As a reminder, if you got to the end of this episode, don't forget to subscribe to Ben and Mark's new show called The Ben and Mark Show. Be sure to subscribe in your podcast app of choice so you don't miss upcoming episodes.
