a16z Podcast - Marc Andreessen & Jack Altman: Venture Capital, AI, & Media
Episode Date: June 11, 2025

In this episode Jack Altman, CEO of Lattice and host of Uncapped, interviews Marc Andreessen on how venture capital is evolving — from small seed funds to billion-dollar barbell strategies — and why today’s most important tech companies don’t just build tools, they replace entire industries.

They cover:
The end of “picks and shovels” investing
Why missing a great company matters more than backing a bad one
The power law math behind fund size and asymmetric returns
AI as the next computing platform — and a test for Western civilization
Preference falsification, media power, and what founders can’t say out loud

This is a conversation about ambition at scale, the structure of modern venture, and the deep forces reshaping startups, innovation, and power.

Resources:
Listen to more from Uncapped: https://linktr.ee/uncappedpod
Find Jack on X: https://x.com/jaltma
Find Marc on X: https://x.com/pmarca
Find Uncapped on X: https://x.com/uncapped_pod

Timecodes:
00:00 What You Can’t Say
01:20 Founders, Funders, and the Future
02:00 Fund Size and Power Law Math
06:45 From Tools to Full Stack Startups
10:00 Market Sizing and Asymmetric Bets
13:00 Public Markets Mirror Venture Dynamics
17:00 The Barbell Strategy in Venture
20:00 The Conflict Dilemma in Venture
25:00 Staying in Early-Stage Venture
29:30 The Death of the Middle
32:00 Why It’s So Rare to Build a New Top VC Firm
35:00 The Case for Power in Venture
37:45 Limiting Factors for Big Companies
41:00 AI as the Next Computing Platform
45:30 Betting on Startups, Not Incumbents
48:00 How a16z Thinks About Risk
51:00 Building a Top-Tier GP Team
55:00 Taste, Timing, and Getting Into the Scene
57:00 Raising Capital Is the Easy Part
1:00:30 AI’s Existential Stakes
1:05:00 Autonomous Weapons, Ethics, and War
1:11:00 Tech, Government, and Power
1:13:00 Media, Mistrust, and Narrative Collapse
1:24:00 Preference Falsification and Cultural Cascades
1:32:00 The Thought Experiment
1:33:00 Career Advice for Young Builders
1:35:00 Marc vs. the Huberman Protocol
1:39:30 What Would Prove You Right?

Stay Updated:
Let us know what you think: https://ratethispodcast.com/a16z
Find a16z on Twitter: https://twitter.com/a16z
Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
Subscribe on your favorite podcast app: https://a16z.simplecast.com/
Follow our host: https://x.com/eriktorenberg

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.
Transcript
Here's what I would encourage people to do.
Here's the thought experiment to do.
Write down, on a piece of paper, two lists.
What are the things that I believe that I can't say?
And then what are the things that I don't believe that I must say?
And just write them down.
What happens when startups don't just sell the tools
but decide to take over the entire industry?
On today's episode, Marc Andreessen,
co-founder of a16z,
sits down with Jack Altman,
co-founder and CEO of Lattice,
to unpack how the venture industry is changing, from small seed funds to multi-billion-dollar
barbell strategies, and what that means for founders, funders, and the future of innovation.
Mark explains how the classic playbook of picks and shovels investing gave way to full-stack
startups like Uber and Airbnb, and why the biggest tech companies today are not just building
tools, but replacing entire sectors. He also talks about the realities of fund size,
venture returns, power laws, early-stage conflict dynamics, and why missing a great company
matters far more than backing a bad one.
And then it gets even bigger.
Mark dives into AI as the next computing paradigm,
U.S.-China geopolitical risk,
and why Mark thinks we're in a capital T test
for the future of civilization.
This episode is about asymmetric bets,
ambition at scale,
and the deep forces reshaping tech and power.
Let's get into it.
As a reminder, the content here is for informational purposes only.
It should not be taken as legal, business, tax, or investment advice,
or be used to evaluate any investment or security and is not directed at any investors or potential
investors in any A16Z fund. Please note that A16Z and its affiliates may also maintain investments
in the companies discussed in this podcast. For more details, including a link to our investments,
please see A16Z.com forward slash disclosures.
I am so excited to be here with Marc Andreessen. Marc, thank you so much for doing this with me today.
Jack, it's a pleasure.
So what I wanted to start with was the topic of small funds, big funds.
We had Josh Kopelman on the podcast, and he made a point that resonated around fund size, the outcomes in venture, and sort of just, like, looking at the math of all of it.
And I think as venture funds have grown, it sort of spoke to a lot of people about, like, kind of what the plan is and sort of how tech is going to go.
And so I guess to start, I'd be curious to hear your thoughts around that whole dynamic, obviously, you know, you've got a big venture firm.
And so I just want to hear kind of your perspective on this whole topic to start.
So let's start by saying, like, Josh is a longtime friend, and I think is a hero of the industry.
And I say that because, you know, he started First Round Capital back in the very dark days.
I forget the exact year, but, you know, back during the dark days after the 2000 crash.
And in fact, you know, there was a period of time back there when, you know, the total number of angel investors or seed investors operating in tech was, you know, maybe eight total.
And, you know, actually Ben and I were two of them.
But, you know, this was sort of the heyday of, you know, Ron Conway and, you know, kind of, you know, Reid Hoffman and a very small group of people who were kind of brave enough to invest in new companies at a point in time when, you know, basically everybody believed the internet was over, like the whole thing was done.
And so he, like, I just think that, like, that was an incredibly heroic, brave act. It obviously worked really well. It, you know, turns out buy low, sell high actually is a good strategy.
It's very nerve-wracking when you're trying to do it, but it does work. And he, he had brilliant timing for when he started. And, you know, the companies that he supported have gone on to become.
incredibly successful and we've worked with him a lot. So, you know, we're a big fan of his. And then
second, I would say, I heard, I heard there was a discussion. I never,
as a rule, I never read or watch anything I'm involved in. That's good. Well, you know,
I totally missed it. And to summarize, basically, what he was saying is he coined this
like venture arrogance score idea. But basically the idea is, you know, if you're going to own 10%
of a company at exit and you want to have a 3x fund and you're probably going to have a power
law of outcomes, you basically need your big outcome to be like really big. And so like,
how's the math shakeout? And basically, you know, the question he was sort of posing broadly is,
are the outcomes going to be much bigger? You're going to own a lot more. You can hit a lot more
winners. But it was sort of like that math question. So I'll say a couple things. So one is, look,
Venture is actually a customer service business in our view. So start with this. So it's actually
a customer service business. There are two customers. They're the LPs and there are the founders.
And we think of them both as customers. And so, you know, at the end of the day, the market's going
to figure this out. And the LP money is going to flow to where obviously they think the opportunities are.
And the founders, certainly, you know,
as you know, the best founders definitely pick
who their investors are.
It's actually a very unusual asset class, right?
It's the only asset class in which the recipient
of the capital, you know,
actually cares where the money comes from
and picks it.
So the market will figure this out.
I think the big thing, to respond to your general point,
I think the big thing is the world has really changed.
And so, you know, modern venture capital
in the form that we understand it is basically,
you know, there were examples of venture capital
going back to like the 15th century or something
with like, you know, Queen Isabella and Christopher Columbus
and whalers off the coast of Maine in the 1600s and so forth.
But modern venture capital was basically a product of the 50s and 60s.
Originally this guy, Jock Whitney from the Whitney family,
sort of created the model.
Georges Doriot, who was an MIT professor, created a version of it.
And then, you know, the great, you know,
the great heyday of the 1960s, Arthur Rock and those guys
and everybody that followed, Don Valentine and Pierre Lamond and Tom Perkins
and so forth, Gene Kleiner, you know, all those guys.
Basically, from that period, call it the 1960s through,
call it 2010, there was just, there was a venture playbook and it became a
very well-established playbook and it sort of consisted in two parts. One was a sense of what the
companies were going to be like, right? And then the other was what the venture firm should be
like. And so the playbook was the companies are basically tool companies, right? Basically
all successful technology companies that were venture funded in that 50-year stretch were basically
tool companies, right, picks-and-shovels companies. So mainframe computers, desktop computers,
smartphones, laptops, internet access, software, SaaS, databases, routers, switches, you know, disk drives, all these things.
Word processors, tools, right?
And so, you know, you buy the tool, the customer buys the tool, they use the tool.
But it's a general purpose technology sold to lots of people.
Basically, around 2010, I think the industry permanently changed.
And the change was the big winners in tech more and more are companies that go directly into an incumbent industry, right?
like insert directly.
And I think the big turning point on this was like Uber and Airbnb, right,
where Uber could have been, like, Uber in 2000 would have been specialist software
for taxi dispatch that you sell to taxi cab operators.
Uber in 2010 was, screw it, we're doing the whole thing.
Airbnb in 2000 would have been booking software for bed and breakfasts, right, running on a Windows PC.
Right.
And then Airbnb is just like, screw it, we're doing the whole thing.
And so, and you know, Chris Dixon came up with this sort of term, the full stack startup,
which is kind of what he meant.
But the other way to think about that is just, the company is actually delivering the entire, basically, promise of the technology all the way through to the actual customer.
Which is basically quicker to get there.
Also, I suppose you get more margin capture when you do it that way and you just get the technology seeped in rather than having to sell it through.
Is that the idea?
Prior to 2010, there were two kinds of tool companies, consumer tool companies and business tool companies, right?
So, you know, B2C, B2B, right, as we called them in those days.
And, you know, the consumer side was great.
But, like, you know, it's just like selling, you know, video games and consumer software is great.
you know, flying toaster screensavers, it was great, but there was only so far, you know,
that was going to go. And then the B-to-B side for things like taxi dispatch or for, you know,
bed and breakfast bookings, the problem is like you're selling advanced technology into incumbents
that are not themselves technology companies, right? And so are they actually going to take those
tools and then actually build the thing that the technologists know should actually get built?
More modern version of that is what you see now happening with cars, right? So who's going to
build the self-driving electric car, right? Is it going to be a, you know,
incumbent who's able to adjust, who's buying, you know, good components to be able to do that?
Or, you know, is it going to be a Tesla or a Waymo?
Yeah.
Right?
That's going to do that.
Same with SpaceX and NASA, I suppose.
Exactly, yeah.
You could, there are many companies that sell technological components that go into rockets,
but was any of that going to lead to the existing rocket companies making the rocket
that's going to land on its butt and then, you know, be relaunched within 24 hours, right?
And so, and by the way, same thing with Airbnb, or with Uber: had you sold the Uber-ized version
of taxi dispatch software to the taxis.
Wouldn't have been very good, yeah.
Would it have resulted in the Uber customer experience?
And so I think basically what happened was,
and there was sort of, you know,
as Peter says, these things are overdetermined.
So it's a bunch of things that happened.
But it was sort of the smartphone completed
the diffusion kind of challenge
for getting computers in everybody's hands.
And then mobile broadband completed internet access
in everybody's hands.
And then the minute you had that,
there was just no longer,
you just had this ability to get directly to people
in a way that you just never had.
You didn't have to like have a giant marketing campaign.
You didn't have to, you know,
have a giant established,
you know, consumer brand. And so there was a way to kind of get to market that didn't previously
exist. And then, you know, and then look, also consumers just evolved. And, you know,
people especially, you know, kind of Gen X and then millennials were just much more comfortable
with technology than the boomers were. And, you know, sort of Gen X and millennials were kind of
entering their consumer prime at the time this
happened. And then you started having these big successes. And so you started lining up Uber,
Airbnb and Lyft and SpaceX and Tesla. And, you know, you kind of, you start stacking these
up. And at some point you're like, all right, there's a pattern here, right? There's a thing that's
happening. And that's what's happening. We're 15 years into that. And what's happened now is basically
that idea now has blown out basically across every industry, right? And so, so, so the tech
industry used to be a relatively narrow tools, picks and shovels business. Today, it's a much
larger and broader and more complicated, basically, process of applying technology into basically
every area of business activity. The result of that is that the companies are much bigger. Like when you're
the whole company, when you're both the picks and the shovels yourself, the whole company,
you're much bigger. And that changes venture math. Yeah, you eat the market, right? And
And so Tesla ends up being worth more, like, there have been points in time in the last five years
when Tesla alone has been more valuable than the entire auto industry put together, right?
And SpaceX is, you know, like, you go through this.
And Uber is worth far more than the totality of every black cab operator and taxi cab company that ever existed.
Airbnb is worth far more than the bed and breakfast industry ever was.
And by the way, it turns out some of these markets just turn out to be much larger than people think, right?
When we do a retrospective on our analysis over 15 years, like one of the things that's been hardest for us to do is to do market sizing.
and sometimes we overestimate market size,
but it's more often the other way.
More often? Well, for the winners,
more often it's the other way.
I guess the net blend is that you underestimate it.
Yeah. And this goes to venture economics
we'll talk about. So the core thing on venture,
the core thing on venture bets, right, is,
because venture doesn't run on leverage,
right, because nobody will bank,
yeah, right, will bank a startup
or a venture firm for leverage,
because there's no assets when these things start.
Yeah, it's asymmetric. You can only lose one X.
Yeah, but you can potentially
make a thousand X. Yeah. And so that means, right, there's two errors in venture. There's the
error of commission, where you invest in the thing that fails. And then the error of omission,
where you don't invest in the thing that succeeds. And of course, just in the math, overwhelmingly,
the error that matters is the error of omission. And so if you run an analysis that, and by the way,
lots of people did this, you run an analysis that says ride sharing is only ever going to be as
big as taxicabs. Yep. That leads you to the error of omission and not making the bet, and therefore
the difficulty of market sizing. In your view, does that only
apply up to a certain size? Or, you know, you look at some of the rounds that now happen at huge
valuations, in companies that would otherwise, you know, be a large IPO. Like, let's say somebody's
raising 10 billion at 100 billion or something like that. Does the power law still apply up there? Like,
how do you think about that type of round? Or do you see venture capital sort of turning into private
equity at some level at the higher end of things?
Yeah, so I think there's two questions kind of embedded
in there. One is, why aren't these companies public, right? That's one question. Yes. And then
And then the second question is, like, even whether they're public or not, like, can they actually...
Is it still the lose-one, win-20 type of dynamic?
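[Editor's note: the "venture arrogance" fund math discussed earlier can be made concrete with a short sketch. Every number below — fund size, target multiple, ownership at exit, winner concentration — is an illustrative assumption, not anyone's actual figure.]

```python
# Hedged sketch of the fund-size arithmetic: how big must the flagship
# exit be for the fund math to work? All inputs are assumed examples.

def required_winner_value(fund_size, target_multiple, ownership_at_exit,
                          share_of_returns_from_top_winner=0.5):
    """Solve for the exit value the biggest winner must reach.

    Power-law portfolios concentrate returns: assume the single biggest
    winner produces `share_of_returns_from_top_winner` of total proceeds.
    """
    total_proceeds_needed = fund_size * target_multiple
    top_winner_proceeds = total_proceeds_needed * share_of_returns_from_top_winner
    # Proceeds = ownership * exit value, so solve for exit value.
    return top_winner_proceeds / ownership_at_exit

# An assumed $2B fund targeting 3x, owning 10% of its best company at
# exit, with half the fund's returns coming from that one company:
exit_needed = required_winner_value(2_000_000_000, 3.0, 0.10)
print(f"${exit_needed / 1e9:.0f}B exit required")  # → $30B exit required
```

Under these assumed inputs, a $2B fund already needs a roughly $30B flagship exit, which is why the question of whether outcomes keep getting bigger matters so much.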
Yeah, so I think there's a bunch of ways to look at that.
So, like, the smartest public investors I've met with basically have the view that the public market actually works just like the private market with respect to this dispersion of returns.
The extreme case I'll make sometimes is it may be that there's no such thing as a stock.
It may be that there's only an option or a bond.
Right.
Like, so for...
And the reason is because there's fundamentally two ways to run a company.
One is to try to shoot the moon,
to try to build for the future,
and then the other way is to try to harvest the legacy, right?
And if you're shooting for the moon,
the big risk of that is, you know,
you might fail, right?
It might not work, but if it works,
you have this telescoping effect in the public market
just as much as you have in the private market.
And historically, the returns in the public market
have been driven by a very small number of the big winners
in exactly the same way they've been driven by that in the private market.
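[Editor's note: the dispersion being described — a handful of winners driving nearly all returns — can be illustrated with a toy portfolio. The outcome numbers below are assumptions chosen only to show the shape of the distribution.]

```python
# Toy illustration of power-law portfolio math: one nonlinear winner
# drives the fund, so the error of omission (missing the winner) costs
# far more than the error of commission (backing one more loser).

def fund_multiple(outcomes):
    """Gross multiple on a portfolio of equal $1 checks."""
    return sum(outcomes) / len(outcomes)

# An assumed 20-check portfolio: one 100x winner, four 3x outcomes,
# fifteen write-offs (downside is capped at the $1 invested).
base = [100] + [3] * 4 + [0] * 15

# Error of commission: one extra $1 check that goes to zero.
commission = base + [0]

# Error of omission: the same portfolio, minus the 100x winner.
omission = [3] * 4 + [0] * 16

print(fund_multiple(base))        # → 5.6
print(fund_multiple(commission))  # barely dented (~5.33)
print(fund_multiple(omission))    # → 0.6, the fund is wrecked
```

One more zero barely moves the multiple, because losses are capped at the check size; removing the single winner collapses the fund. That is the "lose 1x, make 1000x" asymmetry in miniature.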
In fact, you see that playing out right now in the S&P 500.
So one of the things I've been saying for years now
is the S&P 500 is no longer the S&P 500.
It's like the S&P 492 and the S&P 8.
So there's like 492 companies in the S&P that have no desire at all, right,
just like watching their behavior to like really charge hard at the future.
Like they don't want to do it, they won't do it, they're not doing it.
And then eight are betting everything.
Eight are all in, right?
And then I always say, you know, who are the eight?
And everybody always knows who the eight are because it's completely obvious who the eight are
because they're the ones that are building all the new things.
And then again, if you disaggregate like public market returns over the last 10 years,
you see the, it's just, you see this just dramatic, you know, explosion of value.
among the eight, and you see a, you know, relatively modest, you know, growth of the 492.
So even the S&P 500 is like having a portfolio of like bonds and options.
Yeah.
And it's like, it's like incredibly barbelled.
And so I just, I think, and then people, people get cynical on this.
And they say, well, you know, if not for the eight, you know, the stock market...
You're like, yeah, but that's the whole point.
That's the whole point.
Right.
If you have a healthy functioning capitalist economy, the whole point is some number of these
things are going to go nonlinear.
This is like when someone says, ah, they're not a very good investor, but they invested in,
name that $100 billion company.
So they got lucky.
Well, you're like, okay, yeah.
That's the point.
That's the job.
That's the desired outcome.
That's the thing.
You know, this is, you know,
kind of the classic joke,
the classic joke of venture is like,
isn't there just a way to invest in the good companies and not the bad companies?
It's like, yeah, like, okay, for 60 years,
we've been trying to figure that out.
Here's a fun fact from doing the analysis.
Over the last 60 years, every one of the really great venture firms through that
period missed most of the great companies
while they were investing.
Yeah.
The best firms in the world, whether it's, you know,
Kleiner Perkins in the 90s or,
Benchmark in the 2000s or Sequoia in the 2010s or whatever,
like they just like flat out missed most of the winners in each cohort.
And on one hand, you're just kind of like, wow, I can't, can't you do better than that?
But you've had these super geniuses for a very long time trying to do better than that.
And, you know, we could have a whole separate conversation about why this is so difficult.
The thing you said about companies building, you know, the whole stack:
roll-ups are super popular.
Is it fair to take from what you said that you're bullish on that strategy, or not necessarily?
And basically just, you know, to walk it out.
And I mean, you know, instead of, you know, building accounting software and selling it to the accounting firms, just buy an accounting firm, become an accounting firm, AI-ify it yourself, which I think is becoming like a more popular strategy. Do you like that, or is there a nuance why it's different to buy something rather than build it yourself from the beginning? What do you think of this whole roll-up thing?
Yeah. Let's come back to the venture question because I was still lining up into that. But however, this is actually also relevant to that. So yeah, so there are a bunch of really good firms that are trying to do this roll up thing. I, you know, that, I mean, the opportunity with it is kind of very obvious.
The challenge with it is just, cultural change of an incumbent, of a legacy company, is just really difficult.
Charlie Munger was once asked, you know, a few years ago, he said, you know, GE, I think, was the company that was going through a big issue at the time.
And he was asked at a shareholder meeting, how would you fix the culture at GE?
And he's like, I have no idea.
I don't even know how you would change the culture at a restaurant.
Yeah, that's funny.
Right? Like, how do you do that?
It's really hard.
Right?
It's really hard.
Yeah.
And so, you know, you have to have a theory on that.
I mean, the people doing it do have theories.
Yeah.
I think we're much more oriented towards just trying to back...
Well, I think it gets a little into this, like, private equity mindset.
It's a little bit of the venture-private-equity blend I see happening, as related,
not even just in dollar size, but in the mindset here.
Well, this is where I go back to my bonds versus options thing.
Yeah.
Like, fundamentally, the way I'd always describe venture is, like, fundamentally,
we are buying long-dated out-of-the-money call options,
which, like, seems completely insane, except when they pay off, they pay off, like, spectacularly well.
But, like, a lot of them expire out of the money, and, like, you know,
statistically, top-end venture capital has a 50-plus percent...
Yeah, yeah, yeah.
Okay, yeah, I just wanted to get your hot take.
I really wanted to hear about this, but, yeah, we can go back to the venture
because I think there's a lot more in there.
Okay, good.
So, look, so, so, so, so anyway, so, so, so, so, so, so, so, so, so
so, so, so, the, the, the, the, the, the, the, the, the, the, the, the number of
companies that, that are, that are going to be important.
It keeps expanding.
The number of categories that those companies are in, uh, keeps expanding.
Those companies are more complicated now.
Yeah.
Because they're, they're, they're, they're, they're full stack.
They're, they're, they're, they're, um, and then the winners are getting
bigger. And you, and you, and again, you just look at that in the market.
Look, you know, the S&P 8, they're all venture-backed, right?
Every single one of them is venture-backed.
They are, on any given day, any one of them is bigger than the entire national stock market
of countries like Germany and Japan and the UK, right, and so the telescoping effect,
I mean, the numbers are just absurd.
The telescoping effect of victory is just incredible, right?
And so what Ben and I did was we looked at it, and we started our firm kind of as this was
happening, and we looked at it and we said, all right, like, this is different.
This is, you could, you could sit here and do things the old-fashioned.
way, but the world is moving on. And then it goes back to the customer service aspect. The founders
who are starting these kinds of companies need something different. Yeah. It's not sufficient anymore
to just, you know, to have investors who were operating the way that they were investing,
you know, for the previous 50 years. That's not the value proposition that they need. That's not
the help that they need. And so there's a different way to do it. And so I think what's happened
is like the, the industry, the venture industry, it had to restructure in order to basically
accommodate the change in the market. Now, having said that, I don't think that's an argument
that big firms just win everything.
That's definitely not my, not my thesis.
And by the way, that's also not how I'm deploying my own money,
which I want to talk about,
because I'm living what I'm about to say.
Which is, I think what happens is what Nassim Taleb calls the barbell.
And the way to think about the barbell is basically,
you basically have a continuum,
and on the one side of the continuum, you have high scale,
and on the other side you have high specialization.
And what you see in industries that mature and develop in this way,
including many industries in the last 100 years,
basically what happens is as they mature,
and enter their kind of full state
as they kind of flower.
What happens is they often start with generalists
that are neither sub-scale nor particularly specialized.
And then over the fullness of time,
what happens is they get disintermediated
and then there are scale players on the one side
and there are specialist players on the other side.
The most obvious example of this in everybody's lives
is retail.
When I was a kid, there were these things called department stores.
Pretty good selection at pretty good price.
But not a great selection and not a great price, right?
And then sitting here today, those are all out of business.
They're just gone.
I think it's crushed by Amazon on one end.
and then, like, amazing retail on the other end.
Exactly, exactly, right?
And so, and why do you go to Amazon or Walmart or, you know,
and by the way, there were even these big box guys,
you know, Toys R Us and so forth.
And then over time, like Amazon and Walmart, even ate that.
Because when you go to Amazon or Walmart,
what you get is just like an unbelievable selection
of basically anything that's a commodity, right?
You just buy at, like, super low prices.
And it's basically impossible to compete with that
if you're subscale on the one hand.
And then to your point,
and then the specialist retail experience
is like the Gucci store or the Apple store.
you know, the $15 candles.
They give you some Perrier when you walk in.
Oh, they love you, like, they're so happy to see you. Exactly.
Right, you know, they'll do private showings for you.
And, you know, they pour the champagne, and it's like, it's like an entire experience.
And so what's happening is, and again, you see this, like, from a returns standpoint, this is what's happened.
This is where the value is.
And then what happens is that just, like, gaps way out,
and it never comes back together again.
And then what the consumer does is they build a portfolio of their experiences
And so they buy things at unbelievably cheap prices of Walmart and Amazon
and then that gives them more spending money
to be able to spend on the boutique.
So this middle, the bar that's in the middle that's kind of screwed,
what is the mechanic by which they're in trouble?
Is it because the customers go away?
The founder customers go away.
Yeah, yeah, yeah, yeah, the founder customers go away
and then the office.
Who are neither getting sort of like the size and scale value,
nor are they getting like a special focus.
Correct, exactly.
Can you do focus, can you be a specialist with a $2 billion fund, let's say?
So obviously we're at scale.
But we do have a specialist approach inside the scale.
Yeah. We have investment verticals. They're discrete teams. They have, in some cases, discrete funds. And by the way, they have, like, trigger-pull authority. They can make investment decisions. Like, we don't run the firm where Ben and I sit and decide, is this a good investment or a bad investment. Like, our specialists make those decisions.
And you basically determine that by, this is the size at which we think you can function, the biggest you can function as a specialist in a highly successful way. And then we're just going to put a bunch of those together. Is that what defines the size?
Yeah. So it's sort of, yes, yes, but it's two parts.
One is the external view, which is, what's the size of the market opportunity? Just how much money does this strategy, does this vertical, need? How many companies are going to be there? You know, kind of, how complex is it? And then the other is the internal dynamic, which is, like, you know, if you're running a team, you need everybody around the table being able to have a single discussion, and that puts natural limits on how big that can be.
What's your limiting reagent to building an even bigger firm? Is it the number of productive partners that can do this? Or, like, conflict policy?
Conflict policy.
That's the single biggest issue by far.
Really?
So if you had 50, if you had all the great GPs all wanted to work here and you had
like that would still be the issue.
Yeah.
There would be issues.
There would be issues for sure to our point that would come with.
So what's the conflicts thing?
The conflicts thing.
So the conflicts thing is, the mainline venture firms forever, meaning the firms that
do Series A, Series B, especially Series As and Bs,
the relationship with the founder is just so...
It's too deep.
So it is too deep.
And if you as a venture firm invest in a direct competitor,
It's just, it's a giant issue.
The founder you're already invested in will be extremely upset with you.
By the way, do you think that's practical?
Do you think it's all emotions?
Like, do you think it's correct that firms shouldn't do conflicts?
I would say when we were startup founders, we felt this very deeply.
It's just, it's, okay, so when you're a startup founder, I'll channel the other side.
When you're a startup founder, the whole thing is so tenuous, right?
It's just like, is this thing going to work?
There's like 18,000 things can go wrong.
People are telling you, no, every day, no, I'm not going to come work for you.
No, I'm not going to invest in you.
No, I'm not going to.
And then your board member invested in a competitor, and you're like...
Dagger to the heart.
Dagger to the heart.
And then what literally happens is, as the founder, you have to go to the all-hands meeting and explain why your investor has given up on you, right? And you go in there and you do some song and dance about it.
For sure.
And your employees look at you, and they're just like: you, the founder, are so weak and lame.
Yeah, right.
You can't even get your board member to not invest in a competitor.
Exactly. What about the marginal stuff, though? Because, you know, all these companies are near each other; they blend, they evolve over time. So how does this play out on a practical level for firms?
It almost never plays out the way that the founders think it's going to play out.
And I say that in two dimensions.
Number one: historically, what we've seen is that the founders who think they're directly competing with each other generally end up not doing so, because one or the other of them changes strategies and then they diverge.
Which, by the way, is natural, because it's, like, specialization.
The companies specialize, and they end up not competing.
But the other thing that happens is two companies that were not competing that you're
already invested in pivot into each other.
Yeah.
And then they're mad at you.
And then they're very upset.
And you have to remind them that, you know, you didn't know that that was going
to happen.
And it's not your fault.
And then they're still upset.
And so, I would say, we also have very low predictability
in terms of where the conflicts are going to be.
But that doesn't ameliorate any of the emotion at the time.
And so it doesn't actually help.
It doesn't help for us to explain to the founder: oh, don't worry about this guy who you think is directly competitive, because he won't be in a year.
Yeah.
Because you can't prove that.
And the issue is in the moment.
What does that leave you... how does that impact your strategy? Meaning,
if you know conflicts are this huge issue, and you've got, you know, a big aggregate fund.
And so it's very important to catch winners.
And then you invested in, you know, Blue Origin, which is really good, but SpaceX is, you know, bigger or whatever happens.
Yeah.
What does that imply for your strategy when it comes to, like, should we, you know, doing seeds and A's and things like that?
Correct.
Versus, like, say, you know what, let's just wait till, like, the D.
Correct.
Let's have the D be our early stage.
That's right.
So the most obvious thing you do is you're just like, oh, we just need to wait. Because we need to wait for clarity. Just don't deal with this whole issue, right? Just keep delaying and keep delaying until it's obvious what the answer is. If it's big, it's going to be really big, so we can buy in later. But then the problem with that is, all right, now you're out of the venture business, right? Because now, as you said, you're basically doing Series D's; now you're a pure growth investor. And by the way, there are very good pure growth investors, but our determination is to stay a venture investor.
Yeah.
Because we think that that's kind of the whole point.
Why is it so important? Is that just because it's what you like, or is there a strategic reason that it's important to stay doing early stage?
So we've always wanted, I mean, that's the way we've always thought about it, is we've always wanted to be the founder's best partner. To be the one who's the closest in, the one that can really be relied upon, the one that's going to be around for the longest amount of time, the one who they can really trust. And it only happens early.
Yeah. It's your early guys.
And so it's hard to insert after that. And then look, the other thing is, there are
great growth firms that do invest later and have done very well. But I, we just think there's
so much information at the early stage. So for example, when we make a growth investment,
because we have the active venture business that we have, by the time we make a growth
investment, you know, we have either invested in the company for several years or, at the very least, we've met with them repeatedly over time.
And so we just, we end up with just like enormous amounts of information.
And then the other thing, by the way, is, you know, there's kind of time arbitrage,
which is, you know, sometimes the right answer is just like, okay, just invest in SpaceX
or whatever later on.
But sometimes the answer is, no, there's actually a new thing, you know.
Totally.
Do you invest in the MySpace growth round or the, you know, the Facebook Series C round?
Like, if you're not in the early stage, you won't know that, because you won't see the early things.
Yeah.
And then the other thing I'd just say is, financially, one of the things people say that is inaccurate is: if you're running a big fund, you're not going to have the time to spend on the early-stage opportunities, because you can't justify it for the money you're putting in. But that's actually not true in venture, because the aggregate dollar return opportunity
on early stage is just as high as any growth investment, right? Because if you get the right venture
investment and you can make $10 billion on the upside case, it's definitely worth my time to spend
with the early stage founders, you know, for that reason. So the barbell, there's, you know,
there's big on one end. There's something sort of like me on the other end. Selfishly,
I'd love to know, like, you know, I would assume you think it's better to be the big version,
but, you know, if you were conditioned on needing to be me at the small end of the barbell,
like, how would you approach it?
Nope.
They're both good.
They're both good.
This is the thing is they're both good.
They're both good.
And if I were, for some reason, not doing this, I would immediately do what you're doing, right?
So that's good to hear.
Yes, 100%.
And then I would say, I actually invest this way.
So my liquid assets are basically tied up either in a16z funds on the one side, or I run a very aggressive personal investment program in basically angel and early-stage seed funds. And it's because I believe in the barbell. I believe in the barbell so much. But the conflict thing I wanted to explain, because that's the issue. So the big firm, like, we do seed investing.
It's just, we have this problem every single time we're looking at a seed investment, which is:
are we really fully convicted that this is going to be the winner? Even at seed, it creates a conflict, even before a board seat. There's always debates on this, like: do the seed ones care as much? Do the growth ones care as much? What I'd tell you is, it's not a logical question. It's an emotional question. And we're just very sympathetic to the founder that needs to
be able to justify their, you know, authority.
You also definitely can't ask while you're... like, if somebody asked me while they were making an investment, hey, is it okay if we invest in a conflict in a couple of years? I'd be like, what are you talking about?
You know, we've done these things. We tried that. We used to have this separate branded thing called a16z seed, and we were like, well, we have a different conflict policy on this. And it's like, no, it's a16z.
And so the way I think about it, basically, is: the more successful you are as a venture firm, the bigger an issue this is going to be, because the more the people that you're investing in are going to care.
Yeah.
And so it's just, it's like the downside of success. But, like, success, you know...
Right, right.
The only investors you don't care whether they invest in a competitor are, literally, the ones where you don't care what they think about anything, right? If they just don't matter at all, and everybody knows that they don't matter at all.
So, therefore, both of these things can be simultaneously true.
Number one, we still definitely do lots of early-stage investing, and we do make seed bets. But it's also true that, structurally, we cannot do all of the seed investments that we would like to do. In fact, we can't even do a tiny fraction of them. Strategically, we just can't do it.
And so, and again, this goes back to the barbell.
So that means, structurally... it's the same reason why Amazon can't give you the champagne experience, right?
It's the same thing.
They're not set up for it.
They can't do it.
It's not a scale strategy.
And so what has to happen is there has to be the other side of the barbell.
There has to be the specialization and intense focus and deep-relationship thing.
Yeah.
Right.
And that's the role of the angel investor and the seed investor.
And that's, of course, incredibly important in startups, because the most formative, most fraught time in the life of these companies is when they're first getting started, right?
And as you know, half the time, these are people who haven't, you know, they haven't started a company before, they haven't run a company before.
Some of them haven't had a job before.
Yeah.
And so, like, they need to learn a lot and they need people to work with them on being able to do this and they need to figure out how to actually, you know, do these things.
And so there have to be, and there are, incredibly high-quality seed investors and angel investors on that side of the barbell.
The big firms, presumably... you know, if we succeed, we succeed by generating large amounts of aggregate dollars and a very good, you know, percentage return.
The seed investors have this perpetual opportunity to just absolutely shoot the lights out, right, on upside.
And, you know, there are seed funds that generate like 200x, 300x returns, right?
And so these are both good strategies.
They're both adapted to the current market reality.
There's just two things that fall out of that.
One is the death of the middle, which is: it just doesn't make sense anymore to have the old-fashioned, you know, Series A, Series B firm with six GPs and $300 million funds sitting on Sand Hill Road waiting for people to walk in the door.
Yeah. Those days are over, and those funds are shutting down; that model's going away. And then the other thing that happens, that causes some of the tension, is: what does a successful seed investor do? Right. He raises more money and wants to become a venture firm.
Right, right.
But then you're going from one side of the barbell back to the middle, and you're creating that same problem again. And I think that's where the tension is coming from.
I also feel like the mechanic that happens a lot of times is, when you grow the fund, you know, you raise a huge fund,
and then you start deploying it into things
just because you've got to deploy at some pace
and so the threshold drops: we've got to deploy $400 million this year, and I only see $700 million worth of investable things, so I'm going to do four-sevenths of them. Versus, presumably, if you only had to do one-seventh of it, you'd pick better, hopefully. Which I think is a huge issue.
Yeah, you can't do it.
So I think that's part of it
but I think the related thing is your competitive set has changed. What we find with seed investors who migrate up and then regret it later is that they didn't realize their competitive set changed.
Right, because now they're going for bigger, more competitive rounds against you and Sequoia.
Yeah, all of a sudden, okay, now you're competing for Series A's. $15 million? Good luck.
Right, right, exactly.
And so it's just like, and look, as a market fundamentalist:
if you have a better value proposition than Sequoia,
you should go for that.
But I just, I would not accidentally end up competing with Sequoia for Series A's.
Like I would just say, that's a bad way to live.
Yeah.
And I think that's what happened, what, that is what has happened to a bunch of the seed
funds that have gotten larger.
Why is it so rare for somebody to break through and get there?
I mean, you did it.
And that's one that happened in the last 15 years,
maybe there's a couple others, maybe,
but why is it as rare as it is?
It seems like almost more rare
than a new big company in a way.
Yeah, that's true.
In fact, our analysis, actually, when we started,
was that there actually hadn't been...
I think there had been two firms in roughly that whole period.
I mean, Thrive also.
So Thrive was, yeah, they were after us.
Yeah, yeah, yeah, yeah.
I mean, they've done great,
but in the 30 years before us,
we think that there were only two new VCs
that actually punched through to become top tier.
In other words, VCs that were not either firms
that were built in the 60s and 70s
or derivations of those firms.
Founders Fund?
No, no, no, no.
The Founders Fund started actually around the same time we did.
Okay.
They were a little bit earlier, but they were around the same time.
But I mean over the preceding, like, 50 years.
Okay.
Sevin Rosen.
You won't even...
No.
This is sort of the thing.
You don't even recognize these names.
I need to read a book or something.
So Sevin Rosen was the venture firm
that famously funded Compaq Computer,
which was the big winner.
And then they went on to become a successful firm.
This guy Ben Rosen was an early leader in this space.
And then there was a firm called Hummer Winblad,
which was a software specialist firm in the late 80s;
those are the only two that punched into the top end,
while they were operating.
Wow.
Neither one of them, you know, sustained it, but they got there.
They got there for a bit.
But that was like the success case, right?
So it's a little bit like Elon looking at the history of the car industry and
saying, you know, Tucker Automotive in the 1950s.
So it's so rare.
So why is it that rare?
Two reasons I think it's rare.
So number one, there's the intimate reason for it and then a sort of macro reason for it.
The intimate reason for it is just, like: as the founder, you're going to have this incredibly intimate experience, a very close, trusted relationship, with whoever you're working with.
And it's like, you know, can you reference them, you know,
do they have a history and track record of the kinds of behavior that you need
and the kinds of insight, you know, that you need?
And it's just like it's very hard to do that.
It's very easy for an existing firm that has a long track record of success to prove that.
It's very hard if you don't.
So that's like the close in reason.
But then the other reason goes back to the way the world is changing is we always believe
the thing that you want from your venture firm is power.
So the thing as a startup that you want is you want them to like fill in all the missing pieces
that you don't yet have when you're starting a company.
that you need to succeed.
And so you need power.
It means like you need the ability
to be able to like actually go meet customers
and have them take you seriously.
You need the ability to go get publicity in, you know, major channels (it used to be media, now it's podcasts) and be able to get taken seriously.
You need to be able to be taken seriously
by recruits, right?
Because there's thousands of startups
recruiting for engineers.
What makes yours stand out?
I sometimes describe it as: the venture firm is providing a bridge loan of a brand.
As you say, until you have your own brand that's as big or bigger for your own space than the VC's, you're borrowing the VC's brand.
Exactly. And that has been very effective for a long time. And that was how we looked at it when we were founders.
That's why you did media from the beginning.
Yeah. Oh, it's one of the reasons, yes, but a very, very powerful one. A very major one.
And then, by the way, you also need downstream money, right? You're going to need to raise money again.
And so they either need a lot of money, or they need to be connected to a lot of money.
Yeah, exactly. Right. Exactly. And so it's just better if they just have it. Yeah.
It's like being full stack.
Well, then, by the way, now you're also getting into, again, tools companies just never got into, for example, politics, right? Or, let's just say, global affairs, global events. Like, how do you navigate the world, right? How do you navigate Washington when the regulators show up and they want to kill you? How do you navigate getting into some giant fight with the EU? So, especially these full-stack companies, they're getting involved in very complicated macro political, geopolitical situations much earlier. And in some cases, they have to get to, you know, senior government officials, heads of state, heads of major sovereign wealth funds. They need to get to, you know,
CEOs of major companies, you know, how do you get to the CEOs?
You know, you're a new AI company and you're trying to redefine, you know, visual production for movies.
How do you get to the studio heads?
Yeah, right.
The studio heads just don't have time to meet with a thousand startups.
So where are they going to meet with you?
Right.
So basically, it's projection of power.
And this has been one of our theses in how we built our firm: optimize for the maximum amount of power in order to be able to give the startups access to it, right? Both the startups that are already in your portfolio, but also the startups that don't even exist yet, right? And again, this goes to why the scale thing matters so much. It's just like,
all right, there's just, there's a scale aspect of power. There's a big difference between being
able to get to everybody who matters and not.
Why is it rare for people to be able to accumulate power? Even if, let's say, everybody was trying to do it, it's not like everybody could do it. What's the cause of the rarity, the ability to build enough power in that sense?
It starts with: you have to want to. And so we met with all the GPs of all the
top firms, basically when we were starting out because we wanted to, you know, see who we could
be friends with, and it worked very well in some cases and not well in other cases. But one of them told us, this is a GP at a top firm in 2009, and he said: yeah, the venture business is like going to the sushi boat restaurant. The sushi boat restaurant is the sushi restaurant where they've got the boats, right? It's got a conveyor belt, and the little sushi boats come by, a lot of them, and there's a tuna roll and there's a shrimp roll and there's a this or that. And he said, basically, you just sit on Sand Hill Road, you're like, we're going to crush these guys, and the startups are going to come in. And he said, you know, if you miss one, it doesn't matter,
because there's another sushi boat coming up right behind it.
And he's just like, you just sit and watch the sushi go by.
And every once in a while, you reach into the thing
and you pluck out a piece of sushi.
And we walked out saying, like, what the hell?
That's funny.
Like, in what industry is that the attitude?
This was 2009 or something?
2009, yeah.
Like, that was very common. This, again, was the mid-sized venture model.
One of the reasons, when I came, like, look, in 1994,
I mean, it might have kind of been like that back then.
It was.
It was.
When I came to Silicon Valley in 1994, I had never heard the term venture capital.
I didn't even know the thing existed.
And then my business partner, Jim Clark, explained it to me.
And I was like, there are guys who are just sitting there waiting to give you money?
But you see this and you're like, this is going to get eaten alive.
Of course.
This is absurd.
Like the minute anybody takes this seriously, it's all going to change, right?
And so it was this very clubby cartel, you know, basically kind of thing.
And again, it was fine as long as the ambitions of the industry were constrained.
And then, but then again, look, the tools companies, they didn't need all the power.
They needed some of the power, right?
But they didn't need all the power.
You know, they weren't dealing with, like, governments, right?
Or, you know, these sort of big macro issues, you know, at least, you know, in the early years.
Well, okay, so here's another thing that's happened is just the world is globalized.
So startups 30 years ago, you would spend your first decade just in the U.S.
And then you would start to think about Europe and global expansion.
And now you have to think about being a global company up front because if you don't,
like other people are going to do it.
Yeah.
Right.
And so you just, you have to chin up as an entrepreneur.
Like, the expectations are much higher than they used to be.
Maybe one final question on this topic of fund size.
And then I want to go to AI.
What do you think?
And I know you thought about this a lot.
What do you think is the limiting factor for the creation of a lot more really big companies?
Do you think it's founders?
Do you think it's capital?
Do you think it's market maturity?
Do you think it's underlying tech stuff?
Like, if you had to pinpoint the one or two things that you think would allow for there to be way more big companies, like what is it?
So there's sort of the holy trinity of venture startups, which is, you know, people, market, and technology.
And I think the answer is sort of all three.
And the way I would describe it is there's some limiting issue with just,
market size: just how many markets are there, how big are they, how ready is the market to take something new? Then there's the technology question, which is: when is the technology actually ready? From the venture perspective, technology moves in stair steps, right? And so things become possible in the world of smartphones that just weren't possible before. You couldn't do Uber when everybody had a laptop; you had to wait until they had phones.
Yeah, right.
And so technology moves in a stair step. You get these paradigm shifts, platform shifts, and those just come when they come.
Yeah.
And until they come, you can't do it. And then there's the people side, you know, and this is
the one that, you know, I'd say, you know, vexes me the most, which is like, okay, like,
how do you just get more great founders?
Yeah.
Right. And I think part of that is, you know, I think there is definitely a training thing that is real, getting people into the right scene in the right way, like the thing that Y Combinator does or the thing that the Thiel Fellowship does. Those are real things, and those help a lot. But also, you know, there's an inherent... there are just not an infinite number of people running around who have the...
But you probably figure there's a lot of people who could have built big companies who haven't, though?
Hopefully a lot, or a few, yeah. I don't know, some number.
But there must be people who are just in academia or government, or
who are just doing something completely different who if they were attracted to startups would
have built a big company.
So, yes, but then the other question is, like, well, okay, then why didn't they? Why didn't they do the things required to get themselves in that position? Well, it could have been that, like, in 2001, too many people were just too scared to do it, or didn't know about it, or whatever.
What does that tell you about the people who didn't do it?
Yeah, that they followed the herd. I can tell you who didn't listen to that, right?
Mark Zuckerberg.
Are there more good...
But let's just press this point harder for a moment, which is like... I always call this the Test with a capital T, which is like, okay: if you're not in position to do the thing, the fact that you're not positioned to do the thing means that you've already flunked the...
Well, I guess the question would be,
is there a subset of people who could build Facebook
who, other than being too scared to do it,
would have had all the other ingredients.
And so when everybody's not scared,
you get more Facebooks.
You know, there's a line in the movie.
I actually never saw the movie,
but there's a line in the movie.
If you could have built Facebook, you would have built Facebook.
Yeah, yeah, yeah, yeah.
That's right.
That was a good line.
Right.
And so this is the thing.
It's like, you know.
Are there more great founders today than when you were starting out, let's say? In net, like, do you think there are more now than there were 20 years ago?
I believe there are.
But like, maybe there's how many more are there, right?
Is it five times more?
Is it like 50% more?
Or is it?
Well, so look, the number of wins is increasing.
So we used to talk about the 15 companies a year that matter.
That number, if you do the analytics, is probably up like 10x.
And there's like 150 companies a year
That like really matter
And the reason is because there's so many more sectors now
Right
So, again, the industry maturation. And so, kind of by inference, there kind of have to be, like...
You're saying the markets are better more than you're saying the founders are better.
Well, maybe a little bit of both. Also, I think the founders are getting better. Part of the founders getting better is they have better training. Well, start with: they're just all online. So when I showed up here in 1994, there were literally like three books in the bookstore, and I don't know that they were that great.
Yeah, it's not that the DNA is better; it's that the ecosystem has now matured to teach people better.
Yeah and like people come in
And then they've watched every video, you know,
they watched every episode, you know, your podcast.
Hopefully.
Right.
And they just walk in knowing all this stuff.
And then, yeah, look, Y Combinator didn't exist, and, you know, that definitely helps. And the Thiel Fellowship didn't exist, and that definitely helps.
You know, Brian Eno has this great term, scenius.
You know, scene plus genius, right?
And so it's just like, you know, the individual genius on his own, it's always hard to get things done.
Yeah.
Some people do, but it's difficult. More often, in a profession where you see creativity happen,
there's almost always a scene.
You know, Silicon Valley is definitely a scene in that way.
People come here and they just kind of get, I don't know, they just get better.
They just, you know, they meet more people who are like them.
They're able to aggregate together.
They learn from each other.
So, yeah, so look, the founders are getting better.
There's more of them.
But does that mean there's now 10,000 as opposed to 1,000?
Yeah.
I don't know.
And there's 8 billion people on planet Earth.
Why are we debating whether it's 1,000 or 10,000?
Yeah.
Right.
And so, and I just, that I don't know.
Yeah.
I would hope over the next, you know, years and decades we'll all figure out a way
to go make sure we get everybody who can do it
and get them to do it.
That's a good segue into AI.
Do you feel that we're now at the beginning
of what is, like, the next important, you know, paradigm?
Like, is this cloud but on steroids?
Oh, yeah, much, I think much larger
and I'll explain why.
So, yeah, so I described before, right, the triangle: people, technology, market. Ultimately, for venture, the driver is technological step-function changes; they drive the industry, and they always have, right? And so if you talk to the LPs, you can see this. When there is a giant new technology platform, it's an opportunity to reinvent a huge number of companies and products that have now become obsolete, and create a whole new generation of companies, which often, generally, end up being bigger
than the ones that they replaced.
And the venture returns map this.
And so they come in waves, and the LPs will tell you,
it was just like, yeah, there was the PC wave,
the internet wave, the mobile wave, the cloud wave.
Like, that was the thing.
And then, by the way, in venture,
when you get stuck between waves,
it's actually very hard, right?
Because you've seen this for the last like five years.
Like for the last five years, it's like, how many more SaaS companies are there to found?
Like, just, we're just out of ideas.
It's out of categories.
Yeah, yeah, yeah, yeah.
Right.
And so it's when you have a fundamental technology paradigm shift that gives you an opportunity to kind of rethink the entire industry.
It would have been very sad, by the way, if the AI breakthrough didn't happen.
Like, the state of venture would be sad, I think.
Three years ago, when we were talking to our LPs, we were basically saying, you know... so Chris Dixon has this framing he uses: in venture, you're either in search mode or hill-climbing mode. And in search mode, you're looking for the hill. And it was search mode.
Right.
And three years ago, we were all in search mode.
And that's how we described it to everybody,
which is like, we're in search mode,
and there's all these candidates for what the things could be.
And AI was one of the candidates, right?
It was like a known thing, but it hadn't broken out yet in the way that it has now.
And so we were in search mode.
Now we're in hill climbing mode.
Thank goodness, yeah.
Big time.
Yeah.
And then, you know, look, like, as I'd say,
on the technology breakthrough itself,
I think a year ago, you could have made the argument that, like,
I don't know if this is really going to work, because LLMs, you know, hallucinations... like, can they actually do math, can they actually write code? And now they obviously can. And this, I think, for me, the turning point moment, the moment of certainty, was the release of o1, from OpenAI, the reasoner, and then DeepSeek R1. When those happened kind of back to back, and those popped out, you saw what was happening with that, and the scaling law that was around it, and you're just like: all right, this is going to work, because reasoning is going to work. And in fact that is what's happening. And I would say just every day I'm seeing product capabilities, new technologies, I never thought I would live to see. Like, really profound.
I actually think the analogy isn't to the cloud or to the internet.
I think the analogy is to the invention of the microprocessor.
I think this is a new kind of computer.
Being a new kind of computer means that essentially everything that computers do
can get rebuilt, I think.
So we're investing against the thesis that basically all incumbents are going to get nuked.
And everything is going to get rebuilt.
Just across the board.
Just across the board.
Now, we'll be wrong in a bunch of those cases because some incumbents will adapt.
The power law, the things that are right will be super right.
We'll be super right, exactly.
And then, look, the AI makes things possible that weren't possible before.
And so there's going to be entirely new categories.
By the way, is your mindset there that you should just bet on, like, obviously incumbents are going to win some percentage and startups are going to win some,
but it's basically the dominant strategy as a venture capitalist to just plan to bet that startups are going to win it all and go for the power law?
Yeah, that's right.
That's right.
And again, the reason is, remember, we have two customer sets.
The way the LPs think of us, the way the LPs think of us is as complementary to all their other investments.
Yeah.
And so our LPs all have, like, major public market
stock exposure. Like, they don't need us to bet on, yeah, incumbent healthcare, you know, whatever company, right?
They need us to fit a role in their portfolio, which is, you know, to try to maximize alpha
based on, you know, based on disruption. Yeah. And then again, the basic
math of venture, which is you can only lose one x, you can make a thousand x, and you just, like, slam
that forward as hard as you can. So when you have a moment-in-time worldview like this, do you,
you know, as a firm leader, do you give a directive that's basically like, hey, everybody, we
need to deploy in this kind of way right now? Or do you just build a system that's always picking
birds out of the flock from, like, the bottoms up, and you're just like, well, they're smart, they're
going to see that every opportunity is good? Like, how much is it, like, top-down guidance versus,
you know, the market's just obviously good all around? Yeah, so we don't do, like I said, we don't
do top-down investment decision-making. So Ben and I aren't sitting saying, you know, we need to invest
in category X, or we need to invest in this company versus that company. And we run, we
have a legal investment committee, but we don't run a process where they come to us to get
approval. Because you're letting the leader of each group sort of make that. Yeah. And often in those
groups, it's actually delegated further, delegated to the individual GP or check writer. And the
reason for that is we just think that the knowledge of knowing what's going on, and which one's
likely to win, is going to be focused in the mind of the person who's closest to the specific thing.
But do you have like a risk slider? Are you like, hey guys, let's get a nine right now?
So this is the funny thing. So Venture is the only asset class in which the leaders of the firm
are in the position of trying to get the firm to take more risk, not less risk, on a regular
basis. Exactly. Because, right, because the natural orientation towards any kind of, anybody who's
in an existing business, there's a natural organizational incentive to try to reduce risk because
you just want to, like, hold on to what you have and not upset the apple cart. And so Ben and I
are generally on the side of like, take more risk. One of the, one of the applications of this is
an old Sequoia adage, which is, they say, when in doubt, lean in. Like, so, so for example,
you see this, I'm sure, when you do it. It's just like, okay, there's this thing,
there's this company that is, like, potentially very interesting. But, like, there are these
issues, right? And it's just like, it's too early, and this and that, and the guy's got a weird
background, and this, that, you know, whatever, I don't know, the issues, and,
you know, there's hair. Yeah. You know, there's hair on the deal. There's no hair on the GP.
That's funny. That's good. But there's hair on the deal. The founders tend to have really
good hair. Just hair on the deal. And it's just like, all right, like, how do you
calibrate that, right? And again, the history of venture is, when you see something that's very
promising and there's a lot of hair on it, sometimes when you invest, it's going to go to zero.
Yeah. Because the hair is going to kill it. And then sometimes when you invest, it's going to be the...
But it's like something where you're like, I love that.
I hate that.
It's much better than, yeah, everything's fine.
100%.
And this is the way we describe this is invest in strength, not in lack of weakness.
Or another way to think about it is it's not good versus great.
It's very good versus great.
Differentiating good from great is very straightforward.
Differentiating very good from great is actually very hard.
And again, the risk reducing way to try to do that, as you kind of alluded to,
it would be kind of the checkbox thing, which was like,
very good team, very good market, very good this, very good that.
And then you have this other one where it's like they've got six great things and nine like horrible things, right?
Yeah.
Okay.
Which is the better bet?
Totally.
Usually.
Yeah.
Usually it's the thing with the greater strengths.
Statistically, by the way, this shows up in the return data from the LPs, which is the top-decile firms have a higher loss rate than everybody else, which is what's called in baseball the Babe Ruth effect, which is home run hitters strike out more often.
Yeah.
So the top performing venture firms statistically tend to have a higher loss rate than the mediocre firms.
Right.
And it's for this reason.
To invest in the thing that just looks completely nuts but has that magic something. Yeah.
Um, and so when Ben and I think about trying to get the team to take more risk, it's
almost always, it's basically either that kind of thing, which is, like, look, what you're
doing is you're telling the person closest to it, go with your gut. Yeah. If your gut tells you there's
something magical here, like, go ahead. It's okay, because we're going to have some losses. So it's
okay to make the bet. If it breaks because of the hair, that's fine. And then the
other form of risk we try to do, and I do this a lot, is just, you know, I am trying to push the firm
constantly, is like, go earlier. Yeah. Right? Because, again, as we discussed earlier, the
natural inclination is to wait, right? And it's like, no, no, no, go earlier. Like, we do actually
want to make these, you know, we will make some seed bets, but we definitely want to make,
like, a lot of A bets. Yeah. And again, we're going to lose in a bunch of those. Like, we're going to
screw those up and miss the winner or whatever. But, like, we have to do that, because we have to
get into some of these things early. We have to, you know, get the level of percentage you get
in the A. Yeah. That kind of relationship. Yeah. And I guess there's risk that's of the flavor of, like,
do things that are more asymmetric, where there's hair but also brilliance. Correct. There's
also the flavor that's just, like, well, something I struggle with is the deals where I just
barely said yes and just barely passed. I'm like, I don't actually have that much confidence
that I can tell the difference between those. There's another flavor of sort of be more aggressive,
which would just say, like, just do a higher percentage of those ones where you're right
on the line. Do you give that kind of guidance? Like, do you think like that too, where you're like,
it's not just do the more out-there things and we're swinging for the fences, but it's also, like,
let's just do a little bit more right now in general? Yeah. So we used to run this process we
called the anti-portfolio, the shadow portfolio.
And so the shadow portfolio was,
we used to track this statistically
for, like, the first five years, exactly on this point,
which is every time we do an A,
every time we pull the trigger on an A round,
let's put in the shadow portfolio
the other company we were looking at around the same time
that we didn't end up pulling the trigger on.
Yep.
And then let's build up representative,
like build up the, you know,
the Earth 2 portfolio.
I'm so curious.
Well, so, and the good news is it turns out
generally the main portfolio did better than the shadow portfolio.
But the shadow portfolio was close.
It was a good book.
Did really well.
Yeah.
Right, exactly the point. And so then you're just like, okay, you're not that smart. But you're just like, okay, obviously, what does that mean? It means do them both, right? And again, this goes to the thesis of, like, how big should these firms get. It's just like, well, if you had the opportunity to do both the portfolio and the shadow portfolio, you should do them both. Yeah. Um, but generally speaking, you should try to do both. Yeah. And by the way, this is the, I don't know if it was Josh or one of the other podcasts where they were talking about this, but, you know, at least I saw a reference to, like, a statistical analysis of, like, win rate, or whatever, you know, percentage returns or whatever,
or percentage of wins.
It's just like it doesn't, in venture math,
it doesn't matter.
The thing that matters is,
were you in the next big thing
as early as you could get in
and buy as much as you did?
Like, that's the only thing that matters
because if you don't do that,
you miss out on the 1,000x gain.
The 1x losses don't matter.
They wash right out.
Yeah.
And so this idea that somehow there's some,
like, virtue to being, like, a, you know,
small, you know, we only make a few bets
so we have a higher percentage.
It does, yeah.
I'm glad people think that that's a... I would like to encourage people to keep thinking that that's a virtue they should shoot for.
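The fund math being described here, that the 1x losses wash out and the rare huge winner carries everything, and that the best-returning funds also have the highest loss rates, can be sketched in a few lines. The numbers below (fund count, check count, hit probability, the 100x multiple) are invented for illustration, not taken from the conversation:

```python
import random

random.seed(0)  # deterministic for illustration

N_FUNDS, N_CHECKS = 1000, 30  # hypothetical numbers

def simulate_fund(aggressive: bool) -> tuple[float, float]:
    """Return (fund multiple, loss rate) for one simulated fund."""
    outcomes = []
    for _ in range(N_CHECKS):
        if aggressive:
            # Power-law bet: usually a total loss, occasionally a 100x.
            outcomes.append(100.0 if random.random() < 0.05 else 0.0)
        else:
            # "Safe" bet: clusters around a modest multiple.
            outcomes.append(random.uniform(0.5, 3.0))
    multiple = sum(outcomes) / N_CHECKS
    loss_rate = sum(o < 1.0 for o in outcomes) / N_CHECKS
    return multiple, loss_rate

funds = [simulate_fund(aggressive=(i % 2 == 0)) for i in range(N_FUNDS)]
funds.sort(key=lambda f: f[0], reverse=True)  # rank funds by returns
top_decile, rest = funds[:N_FUNDS // 10], funds[N_FUNDS // 10:]

avg_loss_top = sum(f[1] for f in top_decile) / len(top_decile)
avg_loss_rest = sum(f[1] for f in rest) / len(rest)
# The top-decile funds carry a higher average loss rate than the rest:
# the 1x losses wash out, and the rare 100x carries the whole fund.
```

Under these assumed parameters, every top-decile fund is an aggressive one, despite losing money on the vast majority of its checks, which is the Babe Ruth effect described above.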
It seems like it's very hard to assemble lots of, you know, very good productive GPs into the same firm.
It's just objectively rare.
Yeah, that's right.
You've done it, but it's like, doesn't happen very often.
Yeah.
Do you, I guess my first question on this is, do you think of it as just finding greatness, and then you can't really teach it much?
You know, so you're basically just going to, like, hire people and see how it goes?
Or do you think that it's about creating the system and conditions
in which people do great work,
and you can actually create good investors?
Yeah.
So I think it only works if there's a point,
like if there's a reason why you would have
the aggregation of GPs in the first place.
And our answer to that is power, right?
Our pitch to GPs as to why they should join us
as opposed to go to a smaller firm
or start their own thing is if you come here,
you just like plug into this engine.
This is just like massively powerful.
And so everything that you do,
the effects of it are going to just be like blown completely out.
It's much more satisfying.
And you're going to be able to actually help the companies a lot more.
And you'll probably see more companies anyway.
Yeah.
So everything probably gets better. Yeah, that's right. That's right. And by the way, you know, some
people want to have colleagues, some people don't want to have colleagues. But some people do want to
have colleagues, and you'll be working with people you like, and, you know, who care about the same things
you do. So, but there has to be, there has to be a point to it. And of course, it's, you know, it's on us to
keep proving that, right? Because, you know, the devil's in the details of whether they'll
actually, you know, buy that. But so far, a lot of really great people have.
And then the second part of the question is, like, okay, who do you put in those
roles? Um, historically, our old model was basically we only hire GPs, we were
not developing them, and we could go through why that was the case. We changed that, like, eight years ago.
We now develop our own GPs, and we've evolved to where I think that's working quite
well. Um, I think the answer to your question, it's a two-part question. There's some level of just
objective, you know, are they good, are they good at doing the job? Yeah. Here's a big
thing we focus on when we evaluate them, which is, um, you know, it's fine to invest in a category, like,
five years early, or, like, whatever, something goes wrong, like, that's fine. What's not fine is
you invest in the wrong company
and you could have invested
in the right company.
Like, at the moment
you made the investment,
you made the wrong decision
in that moment
of which one you should invest in,
and you could have known.
And so it's like,
did you do the work
to fully assess the market?
How do you handle the fact
that, like, you don't know that
until, like, six years later,
and now you're going back
and you're like,
hey, you made this mistake
six years ago,
this isn't going to work out now?
So, it's generally, so that is a giant problem.
And I would say that
when we started, actually,
when we talked to our friends
in the business,
what they said basically was,
they said, number one,
you don't know
if somebody's a good GP for 10 years,
because you don't have their return data.
And then they said, number two,
is nobody ever wants to admit
that they made a mistake,
and so they never actually fire anybody.
So what they do is they just keep them on the masthead,
and they just kind of gently, like, you know,
retire them out, and they sit and pollute the firm.
One of the guys running one of the big firms
15 years ago told me, he said they hired a partner.
It's an older firm, so they hired a partner in 1984
who was, like, a big deal at the time in the industry,
and, you know, the LPs were very fired up about it.
And he said he then proceeded to just, like, nearly ruin the firm
over the next 20 years. That's crazy.
Because he said all of his investments were bad,
but then it was even
worse, that he talked them out of all the other good investments that came in. And he said, we couldn't get him
out. You know, the reputational damage was too great. So this is a long-running problem. And then, by the way, a lot of
these firms are partnerships. Yeah. The problem with the partnership is, partnership sounds good. Yeah. The problem is
you end up with lots of internal dissension, and then you can't make decisions. Yeah. So this is a big issue.
Um, I guess what I would say is, like, for example, the thing I talked about, it's just, like,
what I just described is a process issue, not an outcome issue, right? Which is, like,
are you doing the work? Yeah. Right? Like, it's an actual
job. Like, are you doing the work? If you're not doing the work, it's relatively clear
you're not doing the work. And you're probably not doing the work, not just on one thing.
You're probably not. So you do try to really look at the inputs. Oh, yeah, very much so.
Yeah. We evaluate the inputs just as much as the outputs. What do you do with an investor?
I'm sure you've had this at some point where the inputs are not particularly good. They hit
this one outlier thing. The outputs are objectively now good. And so you're looking at that
situation or the inverse. So this is the other part of it. The other part of it is I think
there's just a subjective criteria for venture, which is just, are you good at it? Yeah.
And like, do you have taste?
Yeah.
Which is unquantifiable.
This is one of the nice things about your model too,
where like you, somebody gets to make a call
versus in these partnerships, I think it would be very hard
when nobody gets to make calls like this.
Because at some point, someone has to just like
make a determination on this stuff.
Yeah, that's right.
And then even, you know, and even who even made the call,
you know, gets lost.
Yeah, so, so I think there's a taste thing.
And then look, I think there's also just like a network cohort
branding thing, which is the startups come in waves.
And it's not just new technology, it's also new people.
And these new scenes form,
and, like, are you in the scene or not, right?
And if you're not in the scene, like,
I can't fix that for you.
There's also a ton of path dependence, it seems like,
where, like, you make an investment
that gets you in the scene,
now other founders want to work with you
because you invested in this really cool company,
and then it just snowballs,
and you're like, well, I can't go back
and, you know, change history
and get you into the snowball.
Yeah. Like, and again, this is what I'm going to call
the test with a capital T.
So it's just different versions of the complaint, right?
So you brought up the one of the founder
who's like, well, I could have done it,
but I wasn't in a position to, right?
That's your own fault.
Yeah.
There's another version of it,
which is this is sort of the anti-VC narrative.
These VCs are so arrogant.
They don't see my unique genius, right?
Right?
You know, the critique they apply against Paul Graham
is, you know, he wrote this post on pattern matching,
and he always gets attacked.
It's like, you know, he pattern matches.
He's not looking for quality.
He's just looking for pattern matching.
And, like, you know, it's like,
founders don't match the pattern.
It's, like, very important
for founders to understand:
raising money from venture capitalists
is the easiest thing you will ever do
as a startup founder.
We are sitting here with
checkbooks, waiting to write checks.
We are dying for the next
person to walk in the door and be so great
that they convince us to write the check.
We don't care where they come from,
we don't care what country they're from,
we don't care, none of it matters.
It's just, like, do they know what they're doing?
Are they going to be able to do it? We're just dying for that.
Everybody else they're ever going to deal with,
candidates and customers and downstream investors
and everybody else, is going to be much harder
to deal with than we are. And so if they can't pass
the test of raising money,
yeah, like, they're not going to be able to do it. And it's the same thing with the
GP. Like, if you can't network your way in and make good investments, that's the job.
Okay, on that point, right, because there's going to be, I completely agree with what you just said
about how it's, you know, the easiest part of building a company. There's going to be a lot of
frustrated founders hearing that who are like, why can't I raise? You know, what's going on here?
One of the things that I've really, you know, you've done this for enough time now: when founders
get passed on, you know, get a pass note,
it's usually about something
that's related to the market
or the product or whatever.
And a lot of times,
it's what you just said,
which is that, like,
I just want the founder to be great.
Yeah, right.
But nobody says that to them.
And so they don't get the actual feedback.
And so I guess this whole dynamic
of, like, people aren't giving the,
because, you know,
what they're saying is not,
you're not great,
but it's,
I didn't perceive you as great,
or something like that.
Is there some way for there
to be a more honest,
useful back-and-forth around this?
Or is it just one of the impossible structural things,
and founders just have to go around frustrated
that people are saying the market's too small
or it's too big or whatever, and really
what it is, is they're just not landing as great?
I mean, it's like, yeah. I mean, I think you think your baby's
beautiful, but I think he's really ugly, right?
Yeah, yeah.
You know, this kid's going to have a really hard time in life, man.
He's really unattractive.
It's really hard. It's really difficult.
And by the way, you embedded two things in there.
One is, like, you know, one is do they come across as good,
which in theory is fixable.
But the other is, like, some people are better than other people at doing this.
Definitely.
And some people should not be starting companies.
Some people should actually just like be on a team.
Yeah, sometimes it's a correct assessment.
Sometimes it's an incorrect.
Yeah, that's right.
Like, there are some people who in the early days couldn't raise.
You know, there's a lot of great people who now we all know are really great, but they
couldn't raise a lot of money.
So they must have shown up in 60 VC meetings as not great, or whatever.
And look, yeah, exactly.
It's like, we don't know.
Yeah.
And we make lots of mistakes of omission.
You know, so we, like I said, most, even the great VCs most of the time are screwing up.
And so that's all true.
The thing I always tell founders is,
Steve Martin was asked this question
about becoming a great stand-up comic,
and he wrote this whole book,
a great book called Born Standing Up,
in which he talks about this.
And he says the secret of being a great,
he said, the secret is,
be so good they can't ignore you.
If your business gets good enough
and you prove that you're really good,
you don't have to show up in the one hour
with the VC as very impressive.
You just proved it on the field.
We're dying for people to come in
and just be like wow
and just be like I cannot believe
how good this is I can't believe
how good this product is
I can't believe how much the customers love it
I can't believe how much this person has gotten done
on a very small amount of money.
See, it's just the exact same thing.
If I'm a talent agent,
I'm just dying for the young comedian
to get on stage and make me laugh.
I also think the founders
who, like, really struggled
to, like, raise a round or two,
and then the business got working,
I think there's like a,
there's a real strength that comes out of that.
So it's not the worst thing that ever happened.
Yeah, no, look,
having said that,
look, there's breakage along the way.
Like there are.
Yeah.
Also, it sucks.
It's, like, really unpleasant.
That's right.
Yeah, it had to happen.
It sucks.
Yes.
Yeah.
So, but like, you know,
look, I just say like I, you know,
having been a founder.
Like, it's an incredible privilege to be in an industry and in a world and in a country at a time when you can actually do this.
Yeah.
Like, you know, most of history and most places, you just, this kind of thing can't happen.
And then, you know, we are genuinely trying to find the anomalies, right?
Like, our business is defined by anomalies.
It is true.
The thing you said about, it's like an audience that wants to laugh.
It's totally true.
So desperate.
Can't wait for somebody to finally tell a good joke.
So on AI, I want to talk about not just the startup side, but maybe, like, just some of your takes on, like, the broader
lens of AI. I guess my first question is around AI going wrong. And I know this is, like, a very
hard thing, but I'm just sort of, for fun, really curious what you think. You know, the downside
case that people are very afraid of would be something like AI embodies humanoid robots, and
now we have a Terminator situation on our hands. It gets agency. We have a big problem. You know,
that's one end of the spectrum. The happy path is that it's just, like, the sickest software that
anybody's ever seen, and, like, it's a tool that humans use and everything's great. Do you think
about this? Do you have any opinion on it, or are you just, like, it's going to be what it's
going to be? So, start by saying, it's an important new technology. Any important new technology is
what they call dual use. It can be used for good things. It can be used for bad things.
The shovel. It can dig a well and save your life. You can bash somebody over the head with it
and kill them. Fire, you know, the computer, the airplane. You know, the airplane can take you on
the most marvelous vacation with your new spouse. It can also bomb, you know, Dresden. Right. And so
it's just, I mean, atomic power was the big one because atomic power could be unlimited
clean energy for the entire world, or it could be nuclear bombs, right? Um, as it turns out, we just
got the bombs, we didn't get the unlimited clean energy. And so, um, like, that's just generally
true. These things are double-edged swords. The question is, like, all right, like, what are you
going to do about that? Um, are you going to, like, somehow put it back in the box, or are you going to
somehow, like, try to constrain it and control it? Um, the nuclear example is really interesting,
um, because, um, you know, there was a very big concern around, obviously, nuclear weapons, and
then there's a kind of big moral panic that developed around nuclear power, I mean, we kind of messed up,
meltdowns. We very badly messed it up. And what happened was the green movement in the 60s and 70s created something called the precautionary principle, which the same kinds of people are now trying to apply to AI, which basically says, unless you can prove that a technology is definitely going to be harmless, you should not deploy it. And of course, that literally rules out everything, right? That's just, like, no fire, no shovels, no cars, no planes, no nothing, no electricity. And so that is what happened to civilian nuclear power, which is they just killed it. The story I tell on that is President Nixon in 1971, the year I was
born. He saw the oil crisis coming, the Middle East. He declared something called Project
Independence. He said America needs to build a thousand civilian nuclear power
plants by the year 2000, go completely clean, carbon zero, completely electric, cut the entire,
you know, they had electric cars 100 years ago, so it's just obvious, you just cut over to
electric cars at some point. And basically, we need to do that, and then we're not entangled
in the Middle East, and we don't need to go, you know, do all the stuff there. He then created
the EPA and the Nuclear Regulatory Commission, which then prevented that from happening and
absolutely killed the nuclear industry in the U.S., right?
And then the Germans are going through the new version of that with Ukraine,
which is they keep shutting, you know, Europe ex-France keeps shutting down their nuclear plants,
which just makes them more dependent on Russian oil.
And so they end up funding the Russian war machine, which invades Ukraine,
and then, you know, they're worried now it's going to invade them.
And so the social engineering, I would say the moral panic
and then the social engineering that comes out of this,
the history of it has been quite bad, like, in terms of its thinking,
and then in terms of its practical results.
Yeah.
I think it would be a very, very, very big mistake to do that
with AI.
To, like, regulate early?
Yeah, yeah, yeah, absolutely, 100%.
To try to offset the risks and, in the process, cut off the benefits.
So start with that as number one.
Number two, I just say, look, we're not alone in the world.
And we knew that before, but especially after Deep Seek, we really know that.
And so there is a two-horse race. This is shaping up to be the equivalent of what the Cold War was
against the Soviet Union in the last century.
It is shaping up to be like that.
China does have ambitions to basically imprint on the world their ideas of how society should be
organized, how the world should be run, and they obviously intend to fully proliferate their
technology, which they're doing in many areas. And the world, you know, 20 years from now, is going to be
running on Chinese AI or American AI. Like, those are your
choices. You think that's how it'll basically play out? Yeah. Yeah. It's going to run on one or the
other. How will that play out? Like, let's say it's one or the other. So AI is going to be the
control layer for everything. So my view is AI is going to be how you interface with the education
system, with health care system, with transportation, with employment, with the government,
with law, right?
It's going to be AI lawyers, AI doctors, AI teachers. Okay, do you want your AI teacher,
do you want your kids to be taught, by a Chinese AI?
Really?
Yeah.
Like, they're really good at teaching you Marxism and Xi Jinping Thought.
Like, you know, another way to put it is, the culture's in the weights.
Yeah.
Right.
And so, like, how these things are trained and, like, who they're trained by, like, really,
really deeply matters.
And so, and by the way, this is already an issue in lots of
countries, because they're, like, number one, they may not want Chinese AI, but number two, do they
want, you know, super woke Northern California AI, right? That's another open question, right? So there are big
questions on this. And so I just think, like, there's no question, like, if you had a choice between
AI with American values versus Chinese Communist Party values, I mean, for me, it's just crystal
clear where you'd want to go. Yeah. By the way, there's also going to be a direct military,
a national security version of this, which is, okay, do you want to live in a
world of all CCP-controlled robots and drones and airplanes? Yeah. I mean, is that really
what you want. Warfare and defense, I guess, just is going to fully go AI over the next
20 years or something. I think that's very much true. And I think this is, you know, robots plus
AI, basically. Well, this was the signal for me: you probably saw the Ukrainian
attack on the Russian airplanes. You know, so those were autonomous drones. And then they
were doing AI targeting of the right structural points to be able to attack the planes
and destroy the planes. Yeah. Right. And so, yeah, 100%, that's happening. You know,
this is a major issue with our defense doctrine with respect, for example, to, you know, a
potential invasion of Taiwan. You know, can, if an aircraft... Ukraine,
it's been fielding AI-piloted jet skis.
So they take a jet ski, put an autonomous pilot on it, and they strap it
with explosives.
And, you know, you could send out 10,000 of those against an aircraft carrier, right?
And by the way, you could just keep sending them, right?
Because there's no loss of life.
You just keep sending them until you get through.
And so, yeah, so the entire, I think the entire supply chain, the entire defense industrial
base, all the doctrine of warfare, all changes.
You know, the idea of human beings in planes or on submarines just doesn't make any sense.
It's all going to change.
And then the symmetry or asymmetry between defense and attack is going to change.
You use the word dual use.
And obviously with like previous technologies, you know, they got used.
At some point I'm wondering, does it blend from getting used to being the user?
Like, a benign business example would be if you could tell an AI, hey, prompt: I want you to build me a software company, you know, make it roughly do this, serve these users,
and run that for the next five years
and just wire me the money to this bank account, go.
And if, you know, if that worked at some point,
you know, in the middle of those five years,
like, you know, is it doing its own thing
or are you telling them what to do?
Does that also happen, you know,
in like a warfare scale?
And I guess that's maybe like the thrust of,
to me, where, you know,
where it turns into something scary
or particularly when you get into, you know,
the embodied version in warfare
where it's just like, you know,
the prompt is like, hey, just, you know,
fight this war for the next year or something.
That's right.
That's right.
So the good news, the domestic version of it is straightforward, I think, which is we have, you know, U.S. law, Western law has a concept of responsibility, accountability.
If you use a machine to do something, it legally is your fault.
It's your, that's your problem.
But, by the way, if the machine goes wrong for reasons having to do not with you, then it's a manufacturing,
it's a product liability issue.
The manufacturer is liable.
But if you use it, you know, if I buy a shovel and I bash you over the head with it, right?
It's my, you know, yeah, the shovel killed you, but like I'm to blame.
And so I think that your example of the autonomous corporation, I think the legal system is perfectly prepared to deal with that, which is, yeah, you, that was your bot.
You set the whole thing up.
It's your fault.
And so there's a natural constraint.
I think there's a natural constraint on that.
The most obvious version of the military version of the question is autonomous targeting and trigger pulling.
Right.
And so, and this has been an issue in drone warfare for the last like 15 years, which is, is there a human in the loop on pulling the trigger?
Right.
So a Predator's flying overhead, da-da-da-da-da,
sees the bad guy.
Okay, how is the decision made for the predator
to launch the missile on the bad guy?
And by the way, the way that worked for a very long time
was it actually had to be an Air Force combat pilot
who would actually pull the trigger on the drone
very specifically.
Even if he wasn't otherwise responsible for like operations of the drone,
you'd still get somebody whose job it was
to make those decisions in the loop.
There are a lot of people in the defense field
who are like it's absolutely mandatory
that in all cases it is required for the human being
to make the kill decision.
And maybe that is the correct answer.
There's a very powerful argument that that should be the case
because it's the biggest decision that anybody can make.
And even if you don't believe in like the Skynet scenario,
it's just the idea of a human being not being responsible for that decision
sounds ethically morally very scary.
There is a counter argument, which is human beings are really, really bad
at making those decisions.
Yep.
Right.
And so any...
It's the self-driving cars thing.
If it's safer than a human driver, then, like, yeah, there will be accidents, but it's still better.
Correct.
And so every post-analysis of any combat situation that you read or any war later on, you discover all these shocking things.
So one is friendly fire.
Like, there's just huge amounts of casualties from friendly fire, people shooting at their own troops.
They're confused.
Number two is, you know, fog of war is just like, it turns out the commanders have very little idea what's going on.
They had some battle plan and it immediately goes sideways.
They literally don't know what's going on.
They're not making, they don't have the information to be able to make decisions.
Everything's confusing.
Number three, the physiological impact of stress, adrenaline.
It's, like, one thing to be on a shooting range making these decisions. It's another thing to, like, you know, have, like, a severe leg wound, coupled with, you know, adrenaline overload, coupled with two hours of sleep the night before. And, like, is the human, is even the highly trained person, making the decision right?
Yeah.
Um, and then there's just, like, a more basic thing, which, I think this is, like, a World War II retrospective, it's something like: in a lot of combat situations, it was estimated only like 25 percent of the soldiers even fired their rifles.
Wow.
Like, just generally, a lot of people just, like, don't act, right? And so anyway, the more you look at this, you're just like, wow, the human being is actually really bad at this.
And then there are all these other issues around collateral damage, you know, accidentally shooting civilians.
And so, yeah, you're back in the self-driving car situation, which is like, all right, if you knew you could get better outcomes by having the machine make the decision: better, safer, less loss of life, less collateral damage.
And so I, and I would say, I don't believe I have an answer to this, but I think that is a very fundamental question.
I guess this kind of actually feeds into the next topic, which to me is, I think, like, tech has now gotten to a place where with the government and politics, like, it's sort of now undeniable.
It used to kind of be an underdog, but now for reasons like this and a bunch of others, it's just, like, too important to, like, not be in the mix at, like, the national stage now, which I think has really, like, changed the dynamic even insularly for Silicon Valley.
because now people are looking at what people are doing
not just like in tech but pretty broadly now.
Yeah, that's right.
Yeah.
So I would say I deeply agree with that.
I believe it is mostly our fault.
Like the current situation is mostly our fault in tech,
which is, there's an old Russian, old Soviet joke,
which is you may not be interested in politics,
but politics is interested in you.
Yeah.
And so I think we, we and I would include myself in this,
I think we all got complacent,
or a lot of us got complacent between like 1960 and 2010
that basically just said we could just sit out here.
we can do our thing.
We can talk about how important it all is,
but like it's never going to, you know,
these are never going to be big social
or, you know,
cultural or political issues.
Yeah.
And we can just kind of get away
with not being engaged.
And then I,
for all the reasons we've discussed.
You're saying,
and then once it was undeniable,
we weren't prepared.
And then we weren't even,
I would say, remotely prepared.
And then, to use the metaphor,
the dog that caught the bus.
And the dog is being dragged behind the bus.
Yeah.
Tail pipe in his mouth.
Yeah.
It doesn't know what to do with the bus.
Yeah.
And look, you know, geography, I think,
has a lot to do with this.
We're 3,000 miles away, you know, it's just hard to get there.
They don't come here very often.
And, yeah, so I guess I would say, like, it worked.
Like, we actually, we always wanted to build important things.
We actually are building important things.
There are obvious political, cultural, social consequences to them.
If we don't engage, nobody's going to.
Yeah.
And then, by the way, the other thing I'll say is, you know,
it's not like there's unanimity even in the industry on a lot of these issues, right?
And so there's, you know, I would say two giant divisions right now,
big companies versus small companies.
Yeah.
You know, they often do not have aligned incentives right now, or aligned agendas.
And then the other is, you know, like, just on AI, obviously there's a big dispersion of views even in the industry.
I guess this probably goes to why it's important for, to some extent, at least some VCs to have relationships with the government because big tech has the resources to do it themselves.
Small tech can't.
And so if this is the state of the world, we actually as an industry, need somebody to be doing it on behalf of little tech.
Yeah, that's exactly right.
That's why we're doing what we're doing.
on media in particular.
I thought it was really interesting.
I can't remember how many years ago,
but Balaji many years ago started talking about
some fracturing, about how the sort of relationship
between tech and the media was going downhill.
I think this was mostly talking about media
and inside tech, but I think probably also
with the major publications and at sort of a larger scale.
From my read, as often, you know, I think this was right.
And from where I sat,
it seems like it did kind of continue to degrade
the relationship. What's interesting to me recently is I've seen a little bit of life, you know,
in the sort of tech publication stuff, but it's actually been from the inside. And so like,
Eric, who you just brought on as GP, is awesome, and he's been really good at doing this.
TBPN's really cool. And I don't think I've seen something like that pop up maybe ever inside tech.
What's your read, I guess, within our bubble of like the sort of tech media relationship and
where it's been? So my background in this is I, you know, I have a weird kind of history.
because of what happened in the 90s.
But, you know, I started dealing with the national press
and the tech press, business press, in 1993, 1994.
And I did an annual press tour to the East Coast,
you know, probably a week out of each year, usually in the spring.
And, you know, what that means is you kind of go around
and you meet with all the publishers, editors, and reporters, you know,
go up everything.
And I would say the, basically, the stretch from 94 to 2016
was generally, like, I thought it was like a quite healthy,
normal productive relationship, you know,
like they would run, you know,
they would do investigative reporting and they would run stories I don't like,
but generally they, you know,
the major publications in each of those categories
were trying to understand what was going on
and were trying to kind of be, you know,
honest brokers and trying to, you know,
kind of represent what was happening.
And so those sorts of meetings were, like, super interesting.
They always wanted to learn.
They always had tons of questions.
They were super curious about everything that was happening.
That was great until 2016.
It was the spring in 2017 that I went on the press tour
and it was like somebody had flipped a light switch.
And they were like across the board like unbelievably hostile.
Like, unbelievably, like, completely, and across the board, like, a 100% sweep. Absolute hostility.
Do you know why?
I think the obvious answer is Trump. Trump got nominated and then elected, and they blamed tech for both of those.
And by the way, there are a bunch of other factors. There's actually a business side to it, which is, there was the fear that the internet was going to eat the news business in the 90s. That actually didn't happen, and actually 2015, I think, was the best year in history for, like, revenues to newspapers.
Yeah.
And then it was really after 2015 that social networking went big,
and then their businesses started to collapse,
and, you know, they started having lots of layoffs,
and so that didn't help.
Yeah.
And then, you know, look, they would say, look, that was also,
you know, they would say, hey, smart guy,
that's also when you started doing all these things
that actually matter more, right?
And so, you know, everything we've been discussing,
like, the tech industry changed,
and so, you know, you're going to get a different level of scrutiny
because you deserve it because you're doing different things now.
The political thing was just a giant swamping factor,
and they, and, you know, this is a big, you know,
I don't want to get into the politics per se,
but if you just, you know,
This whole thing ran in parallel with everything
that's like in Jake Tapper's book about, you know.
So it's just like they just got locked in
on a mode of interaction.
They just became very polarized.
Yeah.
And very polarized and very lockstep.
And, you know, from the outside, you just,
you read it and you're just like, wow, these people,
they're all, like, really wrapping themselves around the axle.
I think one of the other hard things is as the truth has become more accessible
by other people,
you more often see something in the news that you know about
and you're like, wait, that's super backwards.
And then somebody posts about how backwards it is
And now, you know, you see a clip of, you know, some major publication
And, you know, here's the truth and everybody can tell
And it's like, okay, so should we just believe the rest of it or not?
I think the truth fact-checking went way up too with social media.
That's right.
And I would say, you know, the cliché has been,
and there's some truth to the cliché, that social media is where lies spread.
And there is some truth to that.
Yeah, there are a lot of lies on social media.
But the other side of it is what you're saying, which I think is right,
which is the truth spreads on social media.
And so the way I describe it is social media is an x-ray machine.
And exactly to your point, like, anytime there's a thing, and you see this in any domain of activity right now,
anytime there's a thing and there's just, like, evidence that it's just not the way it's being portrayed,
it is going to show up. People are going to see it.
Yeah.
And that is, there's this guy, Martin Gurri, who wrote this book called The Revolt of the Public in 2014.
And he was a CIA analyst who did what's called open source analysis for 30 years, which was studying basically what was in newspapers and magazines for the purpose of political forecasting.
And his prediction in 2015 in his book was that basically social media was going to completely destroy the authority of all incumbent institutions.
And the way that it was going to do that was it was going to reveal through this x-ray effect
that basically none of them deserved their credibility.
Do you think that's kind of happened?
And I think that's exactly what's happening.
Yeah.
And I think there's statistical evidence that's happening.
Gallup, they've done an annual poll, now for 50 years, on trust in institutions, every different
kind of major institution, including the press, and all the numbers are collapsing.
In light of widespread social media, what would be the correct sort of function or role of, like, journalism?
I mean, look, I'm a believer in the original idea, right? Like, I don't know, I'm a romantic. I like what journalism says that it is. I would like it to be like that. I like what the universities say that they are. I would like them to be like that. I like what the government says that it is. I would like it to be like that. You name it.
Yeah. Well, for journalism, it's just like, all right, number one, like, tell us
correctly and accurately what's happening. Well, actually, there's a, there's a conflict at the heart of the
journalism question, which is the journalists say two different things. There's one is they say,
you know, basically be fair and objective, right? And then the other thing they say is,
they say like hold power to account
or they'll sometimes say,
they have this phrase,
they'll say,
comfort the afflicted
and afflict the comfortable.
And like there's an inherent,
like, are you an objective truth-teller?
Well, yeah, I was going to say,
that has nothing to do with the truth.
It's just unrelated to the truth.
Exactly.
And so there was already a conflict
at the heart of the industry.
And there's a selection process
or the people who go into journalism
tend to be critical by nature, right?
They tend to want to be on the outside
looking in to be critical
because if they didn't,
they wouldn't be journalists, right?
And so there is an issue there.
But look, like, do we need people
to tell us the truth?
Yes, we do.
Do we need people to hold the powerful to account?
Yes, we do.
Like, I would like them to do that.
Do you think they can be like for-profit corporations and it works?
Because, I mean, I think another problem is they're getting all their distribution on social media.
Eyeballs are what drives the revenue.
People want to, you know, stay and pull.
So that also is unrelated to the truth.
In fact, it's antithetical to the truth a lot of times.
Yeah.
So two mentalities come out of that.
One is, yeah, the profit incentive warps it and you want it to not have a profit incentive.
So it can be true to itself.
The other argument is, if you don't like for-profits, you're really not going to like nonprofits.
Yeah.
Because at least for-profits have, like, a market test.
Yeah, at least there's, like, some discipline.
A nonprofit just becomes somebody sort of, like, this is my agenda, I'm going to do what I feel like.
Now arbitrarily crazy.
Yeah, they can go arbitrarily nuts.
It does sound worse.
Yes. And they're completely unaccountable. They're completely unaccountable, right?
In fact, it's the opposite. It's the opposite of accountability. Because of the tax break, you're actually paid, yeah, as a donor, to invest in the things that are the
most unaccountable.
Interesting.
Right.
And then they can spin into like crazy land.
Yeah.
And they don't come back.
They don't come back.
There's a history here.
Yeah.
They don't come back.
And so it's weird because like the citizen journalism thing is like a helpful fact check.
It's like good to have.
And sometimes it does feel like it's not quite sufficient to tell the full story on everything
all the time.
So I do think that there's an important role.
I just feel like it's it still feels like it's very in limbo right now.
So here is a theory that would be a reason for optimism, which is:
the last eight years were basically the human animal adapting to the existence
of social media.
It was basically, you take the human brain and you slam eight billion people in a chat room
together, and, like, we're just not used to it.
We weren't wired for it.
We're not evolved for it.
And it's just like, oh my God, everything goes bananas.
Yeah.
Marshall McLuhan, actually, the great media theorist, he talked about this.
He had this term called the Global Village is what happens and everybody gets networked
together.
And actually, what people miss about it is he didn't mean in a good way.
Because the nature of a village is basically gossip and innuendo and infighting and reputational
destruction, right? And civil war.
Yeah. Like, that's what happens in a village.
Yeah, right. Um, and so, which actually functions at a certain size.
Yeah, like, up to 150 people, you can kind of deal with that.
Yeah. You know, at the size of, like, New York City, it actually gets quite complicated.
Yeah. At the scale of the world, it's a disaster.
It's a disaster, right? Yeah. But you could say, look, like, we went through this
eight-year period where, like, everybody went nuts. Everybody went nuts
in, like, a thousand different ways. But maybe that was just, we had to get used to it,
right? Maybe we just had to adapt to it. And, like, if you talk to, I don't know, if you talk to, like,
young Zoomers now, you know, a lot of the time what they'll tell you is, yeah,
we don't take any of that stuff seriously.
Yeah.
Like, I just, of course you don't believe what you see on, you know, whatever TikTok.
Yeah, which is wild.
It's just all ops.
Like, of course it's all ops.
Like, whatever, right?
And they just have, like, they're adapting.
I'm glad people know.
It's just like that's a crazy state of the world.
Yeah, right.
Yeah, exactly.
So probably how people feel about, like, the news too.
Well, so this is the thing on the news.
So then this is the other thing on the news, which is, was the news ever as we were told
that it was.
And so my favorite example of this
is people always cite Walter Cronkite
as being the great truth teller
and the thing that they cite
for you young people
he used to be on TV.
I've heard of him.
I have not.
He was this guy where he would show up on TV.
Everybody would say,
oh my God, he's going to tell you the truth.
Like he was like the voice of the truth.
And the way that he built that reputation
is because he went negative
on the Vietnam War in 1968.
In 1968, he came out and he said
the Vietnam War is unwinnable
and we need to pull out of this.
And they aired all these reports
that showed that that was happening.
Everybody said he's the guy
who told the truth,
and hold power to account,
tell the truth.
Well, it's just like the problem with that is he went negative.
The fact that he went negative on the war in 1968, right?
He was positive on it before that.
Right.
Exactly.
What did he know the day before he said that that he wasn't sharing?
Yeah.
And then by the way, what else happened in 1968, which is the White House went from a Democrat
to a Republican?
So the Vietnam War was created by Kennedy and Johnson.
And then it was inherited by Nixon in 1968.
And isn't it convenient and interesting that he went negative on it when it began
Nixon's war as opposed to being Kennedy's and Johnson's war?
And so then it's like, all right, like, what was actually going on there?
what was happening in the preceding five years?
And was he actually on a side the whole time?
And then there's just the reality of it,
which is I grew up in rural Wisconsin,
we always thought the press was out to get us.
Like, we always thought the press was like the coasts
basically passing, sneering judgment on the center of the country.
Like, we never believed, like, the stuff to start with.
And we were always like, where I grew up,
people are like super resentful in the stuff in the media
and how it portrays them.
And so I think there's also like a more fundamental underlying issue here,
which is, you know, objective truth is a high bar.
Yes.
People have agendas.
Yes.
Like, maybe we just need to get.
all this out on the table.
Particularly in politics,
objective truth is not really how a lot of it works.
Like, people are like, oh, that's a lie.
I'm like, well, it's not a lie.
It's just, like, an interpretation of a situation
that, like, I wouldn't characterize that way,
but, like, sure.
And these are complicated topics.
You know, the ordering of a society
is a complicated topic.
is a complicated topic.
Right?
And the functioning of the economy is a complicated topic.
And it's just not so easy to understand.
And so I think part of it might,
the optimistic view would be humanity adapting
to being in the global village
is basically just taking on a little bit
of a more humble attitude,
basically saying,
all right, look, we're not going to have a lot of objective truth-tellers running around.
But also, at the same time,
we don't want to be in a complete panic about everything all the time.
And we need to kind of be able to, you know, take a deep breath, touch grass,
be a little bit more skeptical, be a little bit more open, be a little bit more understanding.
Right.
And so maybe we're starting.
And by the way, I think that's happening.
I mentioned the Jake Tapper book. Without getting into partisan politics, I happened
to go to an event that he did this weekend out here.
And, like, that book and the reaction to the book, and if you watch
the interviews on YouTube and the crowd response to that book, like, it feels like people are just
like, oh, if we just take a step back for a moment from, like, all the intense partisanship
of it all, like, there's actually some, like, maybe we can get back a little bit more. I thought
that book is a very positive step forward. It's just a little bit of a calmer approach
on these things. And then by the way, the other book I'd promote on that is the Ezra Klein book
on abundance. Yeah. Which I think, from somebody who supported a lot of
Democrats for a long time, is, like, the most positive, you know, kind of manifesto that's
come out, basically saying, you know, whether you're on the right
or the left, like, we need to actually build things. And I think that's also a healthy moment.
So sort of related to this topic a little bit adjacent, but I saw you talking about preference
falsification recently. And I think this is like a super interesting topic in general, but particularly
in the last, I don't know, call it five-ish years. I think a lot of preference falsification
became made apparent. So I'd be curious first to hear a little bit about what you think
happened over the last some number of years where these changes happened.
Maybe we can start there, and then I've got to follow up on it.
Yeah, so preference falsification, just a sketch and outline,
it's when people, it's actually there's two different elements of it.
It's when people are required to say something in public that they don't actually believe,
or they are prohibited from saying something in public that they do believe.
Right.
So again, so commission omission issues.
And then the theory of it, there's this great book by Timur Kuran.
The theory of it basically is it's easy to think about what this happens in the case of a single person,
which is, are you telling the truth?
or are your public statements mirroring what you actually think or not?
The thing that gets complicated is when that happens across a group or across the society.
And the thing that happens is if there's widespread preference falsification of society,
you not only have people lying about what they actually think or hiding it,
but you also, everybody loses the ability to actually know what the distribution abuse are.
Yeah.
Right?
And he says basically, if you look at a history of political revolutions,
a political revolution happens when a majority of the country realizes that a majority of the country
actually agrees with them.
And they didn't realize it, right?
And so that whatever system they were in had convinced them
that they were in a very small minority.
And then you get a, at some point, there's, you know,
the boy who points out that the emperor has no clothes.
It's like a catalyst.
There's a catalyst, catalytic moment.
And then, and then basically there's a,
what he's called a preference cascade.
Right.
And then all of a sudden...
Yeah, it's like the correct prisoner's dilemma's box
to live in all of a sudden flips.
Everybody realizes it at once.
Yes, exactly.
And he said, you can see this in,
you can see this like in a crowd
with like a speaker, controversial speaker,
where basically like you'll have a controversial speaker
and then there'll be silence in the crowd.
And then one brave person will start clapping.
Uh-huh.
And that person is at, like, severe peril,
because if they're the only asshole
standing up clapping, like that's it, they might get killed.
But then if it cascades,
then a second person starts clapping.
And then a third and a fourth and a fifth.
And then you get the snowballing effect.
And then the entire auditorium is clapping.
And then that's everybody realizing
that they actually are on the side of the majority,
which they didn't realize before.
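The clapping cascade described here behaves like a classic threshold model of collective behavior. As a purely illustrative sketch (a toy model, not anything Marc or Kuran specifies), each person claps only once enough others already are, and removing the single bravest clapper can keep an entire room silent:

```python
# A toy threshold model of a preference cascade.
# Each person i claps only once at least thresholds[i] OTHER people are clapping.
# One person with threshold 0 (the "brave clapper") can tip the whole room.

def cascade(thresholds):
    """Return how many people are clapping once the cascade settles."""
    clapping = set()
    changed = True
    while changed:
        changed = False
        for person, t in enumerate(thresholds):
            others = len(clapping - {person})
            if person not in clapping and others >= t:
                clapping.add(person)
                changed = True
    return len(clapping)

# A room of 10 people whose thresholds form a ladder: 0, 1, 2, ..., 9.
# The threshold-0 person starts, and each clap unlocks the next person.
print(cascade(list(range(10))))           # prints 10: the full room claps

# Replace the lone threshold-0 person with threshold 2 and nobody moves,
# even though 9 of the 10 would clap after seeing just a few others clap.
print(cascade([2] + list(range(1, 10))))  # prints 0: total silence
```

The point of the sketch is the fragility: the distribution of private thresholds is invisible from the outside, so a silent room and a room one brave person away from a standing ovation look identical.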
By the way, this is actually why comedy does well:
because people can't control the involuntary response of laughter.
Yeah, exactly.
And so when you get an entire group of people in a room
laughing out loud at something that, individually,
they would all swear they don't believe...
They can't help it.
They can't help it.
That's a great point.
And then the stress relief from that,
because they all know that they're part of a,
they've rebonded the community, right?
You're actually back in being a part of a community.
And it's just such an incredibly powerful feeling.
Yeah.
Yeah.
Okay.
So, it's very easy to apply this theory to like the Soviet Union, right?
Or like the, you know, the Eastern Europe, you know,
in the Cold War or whatever.
You know, Mao's China.
It's a lot trickier to apply this theory to, you know,
your current society.
I believe that, you know, we've lived in an era of like intense preference falsification.
I think the last five years, probably the last 10 years were like way more intense
preference falsification than the preceding 40 at least.
You know, probably going back to, I don't even know.
I mean, you have to go for sure back to the 60s, if not like the 1820s or something,
to find an analogous period.
I think this period was characterized both by people who were saying things they didn't believe,
and, critically, by people not saying things they did believe.
I think there are many reasons this happened.
And look, this has happened many times in history.
And so a lot of people want to say this is caused by social media.
Right.
Well, when you phrase it the way that you did,
it actually makes a lot of sense that,
if people are going to be part of this prisoner's dilemma matrix,
it actually just gets caused by nothing other than itself.
Like, it doesn't really need an outside catalyst for people to get into their own box.
That's true.
Although, you know, that's a good question.
Does there need to be some kind of oppression?
Does there need to be some kind of motivation for the cascade to have started
where people end up in that box?
It's social pressure.
So, yeah.
Specifically, I think the thing that happened in the last five years was...
I guess it needs to be a high-stakes enough issue for it to matter.
Otherwise, it's just like, who cares whether you think, like, the clouds are pretty or not.
Yeah, that's right. So at least it has to be that.
Yeah, and the way I think Timur Kuran described it is, it needs to have, like, political,
social, cultural salience. Yeah. Like, it needs to get to something fundamental about how
the community is organized. You know, we call that politics, but, you know, this
predates even the concept of politics, right? And so, um, and by the way, look, like,
you don't even necessarily want to say that all preference falsification is bad because
like, you know, I don't know that you want everybody out telling the truth about everything.
I don't think you do.
I think at least, in, like, a social setting, like, a lot of social graces come from people saying,
it's great to meet you, when I don't feel like saying it was great to meet you.
Your baby, I believe your baby is very attractive.
Yeah, exactly.
So some of it's good.
Right. Yeah.
So, yeah, but, but, but yeah, as your point, you get wedged in this box.
And so I think the specific thing that happened.
So the good news is preference falsification in a lot of totalitarian societies was administered
at the point of a gun.
You say the wrong thing.
They shoot you.
Yes.
That, for the most part, is not what happens in our society.
What happens in our society is the sort of nonviolent version,
which is: ostracized, canceled, reputation ruined,
fired, become unhirable, lose all your friends,
lose all your family, can't ever work again.
Still really bad.
Still really bad.
So you said it sounds pretty bad.
Very bad.
And so, and it just turned out, I think part of, you know,
the optimistic view would be part of adapting to the existence of social media,
was social media just turned out to be, among other things,
a very effective channel to destroy people reputationally, right?
And this is the social media mobbing effect, right?
that we're now all familiar with it.
And you think that helped create basically more false preferences.
Yeah, big time, big time.
Do you think it also unwound them?
Well, so this is the thing.
And this is maybe the thing that happened in the 2024 election, right?
Which is just like, oh, okay, like, we don't have to live this way anymore.
You know, certain, certain views become safe to say out loud.
Also the censorship regime.
Like, we lived under a very specific censorship regime.
Even in tech, for the 2024 election versus 2016: in 2016,
regardless of what you think, you know, who you wanted,
at least everybody can agree that it was taboo to support
Trump in '16, and it was not taboo to support Trump in 2024 in tech. And so something changed there.
Something changed. Peter had this great line in 2016. He said, because he was one of the only people,
you know, maybe the only person in tech who was actually pro-Trump in 2016. And he said,
he said, this is so strange. He says, this is the least contrarian thing I've ever done.
He's like, half the country agrees with me. Yeah. He's like, I've never had a point of view on
anything else in my entire life where half the country agrees with me. Yeah. And yet somehow,
this is such a heresy that I'm like the only one. Yeah. Right. And so yeah. So, so there was
that, that definitely changed. And then I just think, in general, like I said, optimistically
you could just say there's a process of adaptation, right? Where it's just like,
all right, if we all just decide that we're not going to, like,
live life by mobbing and scapegoating and personal destruction, and that just because somebody's offended
by something, or somebody says one thing, it's not going
to destroy their lives. Like, you know, you don't have to do that.
Do you think it's basically been unwound now, or do you think there are still a lot of falsified
preferences? I would say it's radically different than it was two years ago. I would say there's
still a lot of falsified preferences.
I would, but again, I would say,
I think probably in any healthy society,
there's lots of falsified preferences.
Do you have any guesses for something
that is currently falsified
that will become unfalsified
or is it too hard to call?
Sure.
Yeah.
Sure.
Okay, great.
Well, but it's far too dangerous to say.
We'll move on.
Yeah.
Dang.
Gosh.
But again, when you ask that,
that is a very key question.
Here's what I encourage,
breaking the fourth wall.
Here's what I would encourage people to do.
Here's the thought experiment to do.
Just write down two lists in the middle of the night,
with nobody around, doors locked.
Right, write it down on a piece of paper and let's pull it out in 10 years?
Well, write down, on a piece of paper, two lists.
What are the things that I believe that I can't say?
And then what are the things that I don't believe that I must say?
And just write them down.
Yeah.
And I bet, you know, if you're a reasonably introspective person (even the quote-unquote
NPCs can do this),
you know, most of us probably have
10, 20, 30 things on both sides of that ledger, right?
And again, most of those are things where, you know, I don't know,
like you don't want anybody to ever see that piece of paper.
Maybe five or 10 years from now we'll be back,
and everybody can reopen their papers and we'll see,
and it'll be safe to say whatever people wrote down at that point.
Exactly.
Okay, a few final topics I wanted to ask you about.
One is you're probably in a spot to be giving just sort of life or career advice to young people a lot now,
both in general, but also maybe specifically with, like, AI and, like, the current set of tech, you know, changes right now.
What do you most often find yourself repeating to a really smart, you know, recent grad
when they ask, what should I be doing with my career, if they get the chance to ask you that?
To start with, I never took any advice, so.
Advice, yeah, there's something there, but a lot of people do.
So maybe, fair enough.
That's like the, you know, if you could have built Facebook thing.
Maybe, yeah, maybe the best people probably shouldn't take any advice.
Okay, well, the rest of us.
But I would just say, especially for young people, you know, and again, I would say this,
like people are very different.
Like, I believe very deeply, some people are very happy being in the
middle of chaos.
Some people are very unhappy being in the middle of chaos
and they will actually get themselves out of a chaotic situation
as fast as they can. Other people love chaos so much
that they will create it.
Right? And so like you have to, you know, there's a level of
understanding here. You know, like not everybody
should be in like a high growth, high risk tech company
because it might just be too nuts.
Yeah. So I don't think there's a one size fits all,
you know, kind of thing at all. Having said that,
let's narrow it. So the young person who wants to kind of be in tech,
I think a big part of it is, I think it's like run to the heat,
or the scene thing we were talking about: where are the interesting things happening?
And that's a conceptual question, and it's also a place question and a community question,
a network question.
Yeah.
And so, you know, run to that as fast as you can.
And it doesn't mean, you know, running to the fads, but it means, like, trying to identify.
Trying to get into those hot networks or ideas or projects, basically.
Yeah, yeah, exactly.
And look, there's a geographic component to that.
And I think we all kind of wish it wasn't the case, but there really is.
And AI, I think, has very successfully unwound the geographic dispersion of what was happening in tech.
In a huge way, yeah.
It's kind of slammed everything back into Northern California.
I don't think that's good, really, for a lot of reasons, but I think it just is the case.
And so I would say, like, if you're going to like do AI, get here.
Yeah, and then look, and then the other thing is it's the Steve Martin thing, be so good they can't ignore you.
Like, time spent on the margin, getting better at what you do is almost certainly better than most of the other uses of time.
The old adage that you are the average of the five people you spend the most time with is also true, so you want to, you know, pick that carefully. And then I guess what I would say is, when I talk to people about what kind of company to go to, there are certain people who should only be in a raw startup, and there are some people who should only be at a big company. I think the general advice is, it's the high-growth companies, the companies that we would describe as being between, like, Series C and Series E, probably, or something, where it's like they've hit product-market fit, they've hit the knee in the curve, and they're on the way up. On average, that's going to be the best place to go,
because you're not going to have the downside risk of a complete wipeout, usually.
Yeah.
And then people who get into that position, like at those high growth companies,
if you're talented, you can pick up new responsibility very quickly.
Yeah.
Okay.
Next is your Andrew Huberman thing that I see on Twitter.
Like, what's... I actually can't completely parse what it is.
What's going on with that?
So we have a completely fake beef.
We're good friends.
We're very good friends.
He and I were actually neighbors in Malibu.
And I've been on his podcast, and, like, we're very good friends.
But you don't follow his protocol.
I don't do anything that he says.
I don't do a single thing that he says.
With one exception, we'll talk about.
But yeah, I don't do any of it.
You know, he says maintain a regular sleep schedule.
You're all over the place on sleep.
He says, always get up and, you know, see sunlight as soon as you can.
I'm like, no, the last thing I want to do when I wake up is see sunlight.
He says, don't drink caffeine for the first two hours of the day.
It's like, NFW.
It sounds like torture.
It sounds like being in a North Korean concentration camp.
Like I can't even imagine.
You drink a lot of coffee?
A lot of coffee.
Hot plunge, cold plunge thing.
I'm just not doing it.
The cold plunge is miserable.
I'm not doing any of that shit.
Yeah.
You think it's good for you, though?
Oh, I'm sure it's good for you.
I'm just not going to do any of it.
It all sounds just completely miserable.
That's good.
The one thing that he says that I do is stop drinking alcohol.
And I would say I am physically much better off as a result, but I'm very bitter and resentful.
It is.
Towards him specifically.
Why did you do that one?
Because it's much better for you physically.
Yeah.
It really is.
It fixes sleep and energy problems.
So it's the most tolerable of all of these.
And you're like, fine, I'll do one.
Well, no, no, it's completely intolerable.
It's horrible.
Okay.
I don't recommend it.
Like, I think it's a horrible way to live.
Yeah.
Like,
I'd much rather be drinking alcohol.
Does he think even like a glass of wine at night's bad?
He does, yeah.
Just all of it.
He did one of the great, he's actually had, I think, big influence on the culture.
And this is very, in seriousness, this is very positive, I think, at least for health.
So he did this big thing.
So what happened is, there's all these alcohol studies, basically, you know, saying this is like red wine,
and it's all, you know, heart-protective and all this stuff.
And it basically turned out that really sick people either drink a lot or nothing.
And then healthy people tend to drink
a little. Yeah. Right. So one is,
healthy people tend to be very healthy. Right. And then I guess, is that correlation or causation?
It's all in the sample set. So it turns out there's no health benefit to
alcohol. Uh-huh. In other words, just because... I see. Healthy people drink. I see.
Michael Crichton called this, uh, wet streets cause rain, okay? Wet streets, rain. Yes.
Right. So for some reason, unhealthy people stop drinking. Unhealthy people stop drinking because
they're, like, in the hospital. Because like, I can't handle this.
Yeah, their doctor says if you keep drinking, you're going to die.
Or, by the way, they drink a lot, right?
Because they're, right?
And then there's this, there's this fundamental thing, which is healthy people tend to be very disciplined.
But discipline, there's like a big inherent component to it.
Yeah.
Right.
And so people who are disciplined who drink moderate amounts of alcohol also do moderate amounts of exercise, also experience moderate amounts of stress.
Also, you know, they go to the doctor on a regular basis.
They take the medication they're prescribed.
They manage all aspects of their health like that.
I guess it'll take a while to see, but it feels like it should be a good thing that,
Andrew and other people have gotten
so many more people interested in health.
It's good for, it's good physically.
Right, yeah. Might not be good mentally?
No, I'll try to be funny again.
It's catastrophic emotionally.
Yeah. It's made me a much less happy person.
Do you think, are you actually, you think that?
Well, so, alcohol is a thing that, for
thousands of years, people have been using, number one, to fundamentally, like,
relax. Yeah. And then there's a very important social lubricant component to it,
you know.
And the de-stressing could be healthy, so it's...
Well, let's just say, maybe it's no accident
that the birth rate is crashing, right,
at the same time that we've all stopped drinking.
I don't think even Andrew would argue
you should live your life purely maximizing
for just physical health.
It'd be a miserable way to live.
I mean, it's like, what are you going to do?
Just like never leave the house,
never take the risk of crossing the street.
And so, you know, he certainly doesn't judge people
for drinking moderate amounts of alcohol.
He just says, look, scientifically,
you have to understand it is a poison.
Yeah.
Now, having said that, as you know,
speaking of scenes,
as you know, the displacement thing that's happening
as people in, like, our world,
they're not doing alcohol.
Instead, they're like doing hallucinogens.
Right, yeah.
It's not necessarily an improvement.
As you, Jack, you know, very well.
Yes, yes, tell us about your latest ayahuasca.
Yeah, exactly.
You're so much different than you were last time I saw you.
That's right.
Your personality has clearly completely changed.
Yeah, I do feel different.
So, so the other theory would be there's a law of, like,
conservation of drug use, which is every society is going to pick some drug.
Probably right.
And abuse it.
And apparently, in our case, it's going to be, like, LSD and mushrooms.
It seems like a good one.
No, no, okay.
Okay. Okay. My last question, when I tweeted out a request for questions, I got almost
ratioed by one question, so I'm going to ask this one, like, nearly verbatim. It was by
an anon named Signal. If you were frozen for 100 years and you woke back up and you looked
around, what would be the piece of data that you'd want to know that would tell you whether
or not your dominant worldview turned out to be correct in the fullness of time?
Yeah. So I will pick a very unfashionable answer to this. And I
would say United States GDP, just like straight-out U.S. GDP.
Because I would say embedded in that is the question of technological progress,
which is if you have rapid technological progress, you'll have rapid productivity growth,
which means you'll have very rapid GDP growth.
If you don't, you won't have rapid GDP growth.
So you'll see that in the GDP numbers immediately.
You know, number two would be just, like, are markets a great way to organize the economy?
And the U.S. is the best market?
And so, you know, is that going to keep working?
And then third is, is the U.S. going to be a great country?
And you are long all of this?
I am very long all three of those.
I am very convicted on all three of those.
But, you know, if I'm wrong about something big,
it's going to be something in there
and it will show up in that number.
Mark, this is amazing.
Thank you so much again.
Good.
Awesome.
Thank you, Jack.
Thanks for listening to the A16Z podcast.
If you enjoy the episode,
let us know by leaving a review
at ratethispodcast.com slash a16z.
We've got more great conversations coming your way.
See you next time.