a16z Podcast: Platforming the Future
Episode Date: October 13, 2017, with Tim O'Reilly and Benedict Evans. In this hallway-style podcast conversation, O'Reilly Media founder Tim O'Reilly and a16z partner Benedict Evans discuss how we make sense of the most recent wave of new technologies -- technologies that are perhaps more transformative than any we've seen before -- and how we think about the capabilities they might have that we haven't yet even considered. O'Reilly has seen more than one wave of new tech make an impact over the last three decades in Silicon Valley. But this time, O'Reilly argues in his new book, WTF?: What's the Future and Why It's Up to Us, is different, partly because of the combinatorial inventions now possible. But we are also in the midst of so much foundational change happening so fast that we as a society have some very large questions -- and answers -- to consider. What, for example, is the relationship between big tech platforms and the broader ecosystems they're in? What strategic choices (and responsibilities?) do they make on behalf of those ecosystems? Why and how do some platforms compete with their own ecosystems? And finally, how are algorithms optimizing our economy, culture, and markets, and how aware of it are we?
Transcript
Welcome to the a16z Podcast. I'm Benedict Evans, and we have Tim O'Reilly on today, here to talk about his new book, among other things.
So your book is called WTF and it kind of follows an arc talking about platforms in general, but then sort of flowing through to kind of the broader impact of how these companies interact with us all and with the economy.
WTF can be an expression of delighted amazement or an expression of horror.
And when we're faced with technology today, we really are hearing both.
The arc of the book is really a tour through 35, 40 years in the technology industry
looking at some of the great platforms and where they've gone wrong and where they've gone
right and what they do.
And in the course of that, I look at some of the latest platforms.
I spend a lot of time on Uber and Lyft and on Airbnb, kind of talking about how they have
to make both sides of the market come together.
A lot of people don't understand, for example, that the need to make a thick market in a new city
is sort of the driving factor behind the high costs of these companies.
Once that happens, their economics are going to be very different. But they've got to recruit
enough drivers, they've got to get passengers used to it.
They've got to get critical mass on both sides.
Critical mass on both sides.
And the thing that's interesting, though, is when you understand the implications of
something like that, you can really start to tease apart what people get wrong about these
companies' business models. For example, taxi companies, you know, traditional taxi companies, they thought,
oh, it's this app. If we have the app, it will work. We will be able to compete. But the app is only
a tiny part of the business model, because the fact that Lyft can deliver three-minute pickup times
is because of all these part-time drivers showing up in their own cars, so that supply automatically
rises as there's more demand. And it just doesn't work with their business model, which has a
fixed number of cars, which is sort of optimized for sort of middle of the day or whatever,
or, you know, it's optimized for a certain level of service. And you can't get there unless
you take on all parts of the business model. You'll see in some news stories, fatuous statements
like, you know, if, you know, Uber or Lyft had autonomous vehicles, they would get rid of all those
pesky driver costs and the cars would be fully utilized. And I go, those two things don't go
together, you know, because if you own all the cars, they're going to be lightly utilized at
the times when there's not lots of passengers. I mean, the one that really annoys me, actually,
is when people say cars aren't used for 98% of the day, and therefore you could fill all
of that up. And you think, well, yeah, okay, so what use do you think there's going to be
between 1 and 5 o'clock in the morning then with this new app that we're going to deploy?
You actually have to think about, well, what does rush hour utilization look like? And
rush hour utilization is, what, like 80%, 70%, 90%, whatever it is. That's right. And you actually
have to have enough cars for peak.
Not for those idle times.
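To make that peak-versus-idle point concrete, here is a minimal sketch with invented hourly demand numbers (hypothetical, not from the conversation, and it assumes one car serves one ride per hour): if the fleet has to cover the rush-hour peak, average utilization over the whole day stays well below 100%, whatever app sits on top.

```python
# Minimal sketch with made-up hourly demand (rides needed per hour over a 24-hour day).
# Simplifying assumption: one car can serve one ride per hour.
hourly_demand = [5, 2, 1, 1, 2, 10, 40, 90, 100, 60, 50, 55,
                 60, 55, 50, 55, 70, 95, 100, 80, 60, 40, 20, 10]

fleet_size = max(hourly_demand)  # the fleet has to be sized for the rush-hour peak
avg_utilization = sum(hourly_demand) / (fleet_size * len(hourly_demand))

print(f"fleet sized for peak: {fleet_size} cars")
print(f"average utilization over the day: {avg_utilization:.0%}")  # well under 100%
```

The sketch is only meant to show why "cars sit idle most of the day" and "the fleet would be fully utilized" don't go together for a fixed fleet; a supply of part-time drivers that expands and contracts with demand is what changes the math.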
I try to walk through an understanding of algorithmic systems
because they're so central to platforms today.
And I try to tease apart this idea
that every algorithm has something that's trying to optimize.
Google optimizes for relevance.
Facebook optimizes for kind of engagement.
And through the fake news controversy,
you can see how these algorithms can go wrong.
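As a toy illustration of that point that every algorithm has something it is trying to optimize (the items, scores, and objective names below are invented for the example, not how Google or Facebook actually rank anything), ranking the same content under a "relevance" objective versus an "engagement" objective surfaces different things:

```python
# Toy example: the same items ranked under two different objectives.
# Scores are invented; the point is only that the choice of objective drives the ranking.
items = [
    {"title": "careful explainer", "relevance": 0.9, "engagement": 0.3},
    {"title": "outrage bait",      "relevance": 0.2, "engagement": 0.95},
    {"title": "breaking news",     "relevance": 0.7, "engagement": 0.7},
]

by_relevance  = sorted(items, key=lambda x: x["relevance"],  reverse=True)
by_engagement = sorted(items, key=lambda x: x["engagement"], reverse=True)

print("relevance objective: ", [i["title"] for i in by_relevance])
print("engagement objective:", [i["title"] for i in by_engagement])
```

Same inputs, different objective, different feed; that choice of objective is where the "going wrong" can happen.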
I actually talk about trying to phrase it as automation,
mechanization, as a way of breaking the kind of blinkers that people get when you say AI
and suddenly people's brains shut down and they don't really process.
That's right.
And analyze, well, what are we actually building here?
And I think there's a sort of similar point here: when we say algorithm, people's brains shut down.
Yeah.
And it's kind of one, maybe one should just say like automation.
And I think this is particularly the thing you see with like, you know,
the sort of, you know, the recent problems with Facebook advertising or Google advertising and so on.
Or with sort of extremist content on YouTube.
And people just say, well, just use the algorithm.
And it's as though you've kind of rubbed the magic lamp three times and the genie appears.
More understanding will lead to better policy and better business decisions.
Why do these platforms go wrong and start competing with their ecosystem?
One of the things that I was seeing was a repetition of patterns that I first observed in, you know, the antitrust case against IBM and then Microsoft, which is that companies
that really are platform technologies that enable a whole ecosystem
decide at some point to start competing with their ecosystem
because there's just not enough to go around.
And at that point, they screw up.
The ecosystem starts to fall apart.
Government turns against them.
People turn against them.
And it's so amazing to see even idealistic companies
forget that an ecosystem has to work for all of its participants.
You can't just think about yourself.
You can't just think about users.
You actually have to think about your suppliers as well.
And if you start competing with them, eventually, basically the entrepreneurs who are making
the services that your ecosystem depends on, go look for one that's more friendly.
It's interesting, given the Google-HTC deal, where they spent a bit over a billion dollars
to buy something.
It's not entirely clear what, out of HTC; it looks like they've got some portion of HTC's
phone engineers and some non-exclusive rights on IP.
And, you know, you can see a parallel conversation with Microsoft and the Surface.
That's right.
How far does Google go in competing with Android OEMs?
You can see there's like a clear strategic problem for them, in that Apple has the high end and Google doesn't.
And there's reasons why Google as a reach business wants to be on those devices directly rather than just going through Apple to get to those people.
But then if they go and build a business that takes on the high end of Android, well, that's where all the OEMs make their profits.
So what does that look like?
you kind of see these interesting kind of tensions and dynamics, that like there's the high-level
strategy, that we need to do A, B, and C. But then you also have, like, all the mid-level people.
It's like the guy whose job it is to run that particular thing. Well, of course that person,
you know, is going to try and do X and Y in order to grow their business and increase the margins
and increase the profit, because that's kind of the logic of the organization. But sometimes,
when you add all of these things up, they don't end up getting to where you would want for the health
of the broader ecosystem. Right. And you see how
often, you know, great company decisions that end up creating enormous value don't follow
that logic of profit maximization. You know, I still think back to the origins of Google where they
were, you know, we were living in the world of Alta Vista with literally blaring ads, you know,
blinking at you on the page. And Larry and Sergey were like, we hate that. We're not going to do
that. And it was a moral rejection of that business model that eventually led them down the path
of we want ads to be as effective and as useful as search results. And that was the beginning
of Google's dominance. So you can have those strategic goals, which are, you know, I think
Google's in a hard place with this, you know, high-end, low-end ecosystem play versus do we do it for
ourselves. But sometimes you actually have to make choices on behalf of the ecosystem because
it's a longer term smart choice. Exactly. And you have to work out the difference between
the tactics and the platoons who are advancing because there's territory in front of them
so they'll advance and where you actually want the divisions to go and where you actually
want the army to go. Yeah. It seems to me that there are also really interesting questions
about understanding what your real business model is.
And so, for example, this is shifting to a different competitive front that Google is facing,
the battle with Facebook.
But it seems to me, to the extent that Google says,
well, we're going to deliver this service directly with Google-native content,
versus bringing it from an ecosystem of outside providers,
they're losing touch with one of their key strategic advantages against Facebook,
where the logic of "this is better for the user" led them, for example, to compete much more
directly with Yelp and TripAdvisor and people like that.
And you say, well, that actually, you know, it is often better for the user.
You know, I like having my weather results directly in Google.
I like having my stock quotes directly in Google.
You know, but that logic ends up where, again, just as with the Android case you gave,
you've ended up taking more and more of the valuable properties,
and you end up becoming, you know, a destination platform,
whereas in fact, Google's original strategic strength was as a switchboard.
It's interesting. I've had sort of variants of this conversation with Steven Sinofsky,
a partner here who used to run Office and then ran Windows at Microsoft.
And there's, of course, there's the inexorable logic of the organization.
But you need to sort of balance, well, are you pushing back against that?
Or is it just something where this is clearly what that product was always going to be?
Like, you know, Excel was always going to have charts, for example.
Yeah.
You know, and that...
No, I think that's absolutely true.
And the historical analogy is that sort of empires always tended to grow.
There was this mechanism where they would kind of subdue the rebellious tribe,
the troublesome tribe on the frontier that was raiding over the frontier and killing their farmers.
And so you go out and you kind of subdue this tribe, and then they get kind of civilized
and settled, and you build some fortifications.
And guess what?
There's another violent, violent, aggressive tribe.
over the next mountain range.
So, okay, so you man another expedition and you get them.
The Romans did a great job on that, though.
The Romans, basically, when they conquered a tribe, they made them citizens.
They gave them a path to citizenship.
But the problem is then there's the next tribe.
And then the next tribe.
No, I understand.
Yeah, there is always the next tribe.
But it is so true, though, that if you consider ways to bring people into the value that you're creating,
that actually is a really interesting way.
And obviously, that's what Silicon Valley giants do when they acquire
companies as opposed to, you know, killing them.
Well, the two ways to make money in software are bundling and unbundling.
Yeah.
I mean, we're kind of at an interesting break point at the moment. Years ago,
clearly, the thing to talk about was mobile.
And talking about mobile now is a bit like talking about PCs in 2000,
that you sort of, you know, the ecosystem is done, and we know what happened,
and we know, like, the primary questions.
And the question now is, what do we build on top of that?
Yeah.
You know, there's one other thing I would say that's different than what do we build on
top of it, which is, what are the capabilities of this device that we have not realized yet?
You know, when you think about Uber and Lyft, they were a realization of capabilities that were
there in the smartphone.
But you find new white space.
Yeah, you know, and that's, you know, just like, how could we not see it? You know,
there were connected taxi cabs and they had a screen in the back to show ads.
They had a connected credit card reader, but nobody put it all together and said, oh, wait, we
could summon the car, we could pay for the car. I could show you, like, research reports from
'99 and 2000, when I was a telecoms analyst, where people made all these lists of all these
things that were going to happen, and everything of course had an M at the beginning. So it was
going to be M-banking and M-this and M-that and M-the-other. And no one had, like, M-taxis.
You know, that wasn't on anybody's list. I mean, it's interesting you say that, though.
It's like, so the conversation I was sort of getting at is that you get the
platform and then there's white space on top of the platform and you build that out.
But to your point, actually, also sometimes you discover a whole new piece of white space.
And I think you could argue that's happening now with AR on smartphones.
That's like a thing that wasn't there before,
and now suddenly that's sort of become possible.
That's right.
And there's this new combinatorial innovation, you know,
that comes from multiple inventions.
It's almost like somebody's just dumping out pieces of a puzzle onto the table.
And you couldn't solve the puzzle until somebody brought the rest of the pieces.
One of the ways I describe what's sort of happening: the hardware components
are so much smaller and cheaper and more power-efficient than the components you had in PCs.
It's like someone took a shipping container full of Lego and dumped it out onto the table or dumped it out onto Sand Hill Road.
And everyone's kind of picking up the bits of Lego and going, well, what would we do with these things?
And now I think if one was looking at new curves, one would be looking at cars, which are not something that upends the computing ecosystem, but they're a kind of enormous and radical change in the world as we get autonomous cars.
And then machine learning, which is this sort of new foundational technology.
But it's not like it's a new platform.
Machine learning is like this new foundational technology
that in a sense will enable a bunch of new stuff and new companies
but it makes Google a better Google,
and it makes Facebook a stronger Facebook, and Amazon
maybe a stronger Amazon.
It doesn't unlock, it's not a shift that changes
the balance of power.
That's an interesting perspective,
although it is certainly possible that we're just too early
to see the way that one of those players may well
turn their particular machine learning platform
into the open platform,
in the way that Amazon turned
effectively cloud computing
into a platform
when other people could have done it
but weren't really ready to.
One of the things that happened
that was sort of non-obvious
that happened when we went to smartphones
was that on the desktop
it was really a pain to have more than one social network.
You'd have to open another browser tab
go to another website.
You'd get email notifications or something
and so you wanted to have everything
on one site
and you had this nice big screen
so they could just keep adding features.
and they could kill any competitor by adding another tab.
Whereas you go to mobile and the screen is much smaller and you have a home button.
So it's actually easier to get to another feature by tapping home and going to another app
than it is to navigate inside that app.
And then every app has your address book.
And so you get all of those things kind of come together.
It means even now it's really easy to have, like, four, five, or six social apps on your phone.
It's like there's a phone app.
Nobody's using all of those, but everyone is using like two or three of them.
And that kind of rolled out in, like, 2008 to
2010, and that gave us
Instagram and WhatsApp and like a hundred other
smaller ones. And Facebook bought
the two that won, but, you know,
that wasn't like inherent in the model.
It was, like, an inherently
plural model, and Facebook was able to adapt
to that. But they might not have done.
I think a lot about the no
screen model, which I think
is an important part of our future.
You know, Amazon showed
it with the Echo. I think the reason
that they got it right, where other people didn't,
was they basically built a
device that literally assumed there was no screen. Whereas, you know, in the first generation of Google
apps, and even now, sometimes, you know, you ask for certain functions, and it just opens the
smartphone app. And that's backwards from just building it in. I was kind of calling this like
frictionless computing. You don't have to turn it on. It's not even the screen. I mean, I think there's
a lot of overlap for me between the Amazon Echo and a smartwatch. Because they don't do
anything that you couldn't do on your phone, but they do it with less friction. That's exactly right.
If you remember the thing that you can do.
That's right.
And the other thing that's going to be really interesting is who creates the horizontal platform across multiple devices.
And when you talk about what's that next platform, I think it may be in the level of what is the software that's running in your car and in your watch and in your home speaker and in your phone and in your computer.
And I can just talk to this agent, for example, or I can interact with it in a variety of ways wherever I am on whatever device I am.
and it's a continuous experience.
I feel like that might be the second step.
I think, like, the immediate step is
that you get Venn diagrams.
Yeah.
So rather like electrifying your home.
So you get a washing machine and a dishwasher,
and the common standard is AC power.
That's it.
But you didn't, like, go to the store and say,
right, I'm now going to electrify my home.
You bought a washing machine.
And the same, like, in what one could call IoT:
you'll buy a connected TV
and you'll buy a connected speaker.
And maybe you'll have a connected
door lock and a connected alarm, and, like, the door lock will talk to the alarm but it won't talk to the
TV, and the TV will talk to the speaker, but none of them will talk to the smart meter.
Yeah, I didn't know about that. I remember actually seeing Jeff Bezos in 2003 give a great talk, and it
was about the history of electricity. Basically there were lights, and people were plugging all kinds
of new electric motorized devices in, you know, just massive nests of wires coming down from
the light socket, and, you know, people would get caught in their, you know, automatic
laundry-wringing machine and have their hair ripped out.
It was just sort of just all the things that had to come together for electricity to actually work.
And it's a lot like all these incompatible devices.
So the devices, sure, they did come in, but it wasn't until we got standardized power
and we got standardized interfaces for interacting with that power that it all came together.
I suppose the question is, well, how far up the stack does the standardization need to go?
It may be the standardization is Wi-Fi and
low-power radio and
AC power and DC power,
and it's not necessarily that they all
talk to each other
I suspect that it's going to be the ability
to understand commands
in some common
way. Yeah, it could be
I don't know. I've had,
like, I've had the demo of the full connected home,
and, you know,
I like devices and I like buying this stuff
and I like removing friction. And
you go there and you walk in and you say,
hey Google, hey Amazon, hey,
it's movie night, or turn the lights on, or lights on in the kitchen, and it just happens.
And I kind of look at this and I think I can like shut my eyes and imagine I want this.
And I can sort of write the script that says yes.
There's, like, the easy answer, which is, come on, dumbass, you know, just press the
light switch. But actually it's not one light switch, it's 10 light switches, because
there's side lights and there's a light.
So it actually is easier than just pressing the switch.
But then I just got a new apartment, and, like, am I going to go and buy all
these plugs and connected light bulbs to...
Well, but that's not really...
That kind of connected home vision isn't the way I think about it.
I think it's more that...
Just the way it's building up gradually that, you know, my calendar shows up in my car.
And I can select something from my calendar to get directions in the map.
I can talk to my phone or I can talk to my car.
And they have a common data substrate that knows about me that's centered on
me. And that is starting to be built. One of the big reasons, you know, like I have an iPhone and
I have an Android phone and I tend to live in Android because I have a suite of apps with my data.
You know, there is this gravitational attraction of who knows the most about me and who has more
of my data, who has built services that I already rely on. There's an interesting kind of
conversation around kind of privacy and permission there. I mean, I feel like it's as much,
I mean, it's as much to do with how you think about that company and what that company's
role is as what that company wants to do. You accept that your bank knows how much money you've got
because that's kind of how it works. And, you know, you accept that your phone company knows
where your phone is because that's how it works. And I think there's like, and it's, I feel like
there's an interesting distinction here between Google and Facebook that people accept that Google
knows a certain amount because that's kind of how Google works, whereas people just presume
that anything they put on Facebook is public and don't trust any assurances about privacy
because they just assume that they can't believe it and it's not reliable. And so people
kind of get this sense of what is the place or the permission that that company has. And so
you can, like, one could, like, imagine the same product from two different companies, where for one
it would be entirely normal and okay and no one would raise an eyebrow, and for another
it would be completely
unacceptable. I mean, I think the Face
ID that Apple announced actually would be a great example
of that. You know, Apple has spent enough time,
like, talking consistently and credibly about
we don't want your data, that they say,
well, okay, it takes a picture. It models your
face, and that's stored on the phone, and we can't see it,
and we can't give it to anyone, it's just unlocking your phone.
And if Google or Facebook had announced that,
we'd be having a completely different conversation.
Well, I had that conversation with Tony Fadell
about why Google didn't do something like the Echo,
and he said, could you imagine
if it were Google who had first come out
with the device that was always listening,
the blowback would have been enormous.
And the fact that it was Amazon,
which didn't have that Big Brother story already.
They were a retailer, you know, an e-commerce site
and a cloud computing site,
but there wasn't kind of this omnipresent
Big Brother, you know, narrative about them.
It gave them more room to innovate in that area.
I use the analogy of, you know,
thinking of these systems as really thinking of the corporation itself as a kind of artificial
intelligence. And just as Google has an objective function, we've actually told our corporations
that they should optimize for share price. And, you know, for economic return.
And of course, the theory behind that was that it would be good for society. But just as we've sort of
seen with problems with, you know, spam or, you know, fake news, these algorithmic systems don't always
turn out the way we thought. And we then go back and we have to try to debug them. And I think,
you know, the point that I kind of come to in the book is we have to actually come back and
effectively debug the rules by which we're running our society. Because just like these
platforms, it is not an unfettered, you know, natural free market. It's a constructed thing.
It's kind of interesting. I mean, you know, the old joke that, you know, communism has never failed because the communists will always say it's never been tried properly. I mean, I always feel like the free market system is kind of the reverse, in that, you know, almost everybody says, okay, the free market needs regulation and government intervention, and the arguments are about how much. And so you need laws and you need anti-monopoly. You don't just need the kind of the night-watchman state, but you also need, you know, health insurance and you need food health and safety regulations and so on.
And so the arguments are sort of about where you put that. My point is you have
this algorithm, if you like, you have this mechanism of the market price system. And then you
put your fingers on the scale every now and then you adjust it. And you're conscious that when
you do that, you're going to have unintended consequences, but you still feel like,
okay, here are these cases where we're going to make a decision that we're going to adjust
the scales. And you could argue certainly that people look at these sort of fully automated
systems like Google search or Facebook and so on and say, well, it's just fully automatic
and it just has to be fully automatic, and so don't touch it.
It's like, well, yes, it is fully automatic, and it has to be fully automatic.
We don't have price controls.
You know, we don't have central planning.
But that doesn't mean you don't change stuff.
Yeah, exactly.
And I think that that's why I think we need to change the metaphor that shapes our thinking.
Economists are kind of moving away from this consensus.
But for many years, there was this physics emulation, you know, where economics was going to be, you know, a hard science.
And I really think it's much more like game design.
You know, I mean, yes, there are rules of physics in, say, a game like basketball.
You know, you can't get the ball through a hoop if it's too small.
You know, humans can only throw the ball so far.
We have this incredible opportunity to rethink the rules of our game
or to play it in different ways with these unexpected moves.
I kind of wonder about, you know, AI.
With all the dark prognostications from people like Elon Musk,
there's a wonderful prognostication that AI could help us see ourselves more clearly: that you can, as a thought experiment, imagine, you know, a far, far more powerful AI looking at some of our economic problems and coming up with creative solutions that we haven't tried yet for adjusting the rules of our game.
In a sense, what AlphaGo did was the massive permissionless experimentation of moves that the free market does.
That's right.
Free market, you know, you want to start a business selling flavored water, you want to start a business selling cookies, you want to start a business doing that, you can do it.
It might work, it might not.
It's one set of creativity in one set of parameters.
That's right.
And AlphaGo brings kind of massive radical experimentation, which is sort of the point of a free market economy, that you get that radical experimentation.
And of course, it's also the point of the kind of the startup ecosystem in Silicon Valley, that you have experimentation, 100 experiments by startups instead of two experiments inside a big company.
All of these things are goal-seeking behavior.
And there was a time when the goal of a startup was to build a real business.
You know, we think about, you know, the founders of Silicon Valley.
They didn't have any idea that they were going to get acquired.
They had the idea that they were going to build a business doing this new thing.
And now you have an awful lot of companies where the goal from day one is the exit
and where the behavior is targeted towards the exit.
That makes them, to me, financial products rather than companies.
Yes, that happens.
And you get waves of tourists, and there have been waves of tourists in the past.
You know, the three MBAs who did a Harvey Balls matrix to work out what company was going,
what prospect was going to have the highest and quickest acquisition return,
and come in and say, well, we're going to sell it in two years to Google.
And once you fund us, we'll hire an engineer to build it.
We had a wave of that around mobile a few years ago.
And we've had a wave of that now around machine learning,
as a sort of a war for talent
and to some extent around cars as well.
But I think that's kind of froth.
I mean, the entrepreneur is not going to come in and say,
and if this all goes well, I'll sell it to Google in two years.
Whether they say it or not,
I think we're just in a period where there's more froth than beer.
You could certainly argue, I think,
that Google, Facebook, other big platform companies now,
because they're 10 times or 20 times larger than such companies were in previous cycles,
employ vastly more people and can pay them much, much, much more, and give them much, much,
much more freedom.
So the drive for those people to leave and do their own thing is smaller.
Your incentive to quit and live on ramen noodles and go and swing for the lottery ticket,
to mix my metaphors, is different to what it might have been if you're
at Microsoft or at IBM or at Cisco or so on in the past.
This is a question around how these companies evolve.
The rigor and aggression and self-awareness of how Google, Apple, Facebook, Amazon are run now.
It's kind of different from some of these leading companies in the past.
It's like they're, like, hyper-evolved organisms.
It's like they've all read the books.
They've all read Clay Christensen.
They all saw what happened to IBM and Microsoft and Myspace and Twitter.
And in all of them, except for Apple, the founders control the company.
And so you could argue actually that the problem is not so much people making stuff
because they want to get hired. It's the fact that Google will just go and vacuum everybody up
and then no one will leave. That's a really good point. And the other thing that's really
different today, you know, if you think about the days when Microsoft was so dominant,
there was really one company. Yeah, now there's four or five or six. And now there's four or five
or six that are serious competitors. And well, okay, so you kind of get this question of
where you're going to get to scale.
How many companies are going to win?
And so then, you know, if the options are, you know,
win, be acquired, or die, you know,
a lot more people are going to die than are acquired
and a lot more people are going to be acquired than win.
Isn't that just kind of saying, do great, do okay, fail?
And if you do okay, okay, maybe you drift off sideways,
maybe someone buys you?
But if the platforms take more and more of the capability,
the possibility, it seems to me that they have an obligation just for the sake of the economy
and for the sake of the ecosystem that they depend on to actually think about how do they
create opportunity for an ecosystem. I think this is a profound metaphor for our overall
society, because we have built an economy where a small number of people own and control
platforms, and many of them are using those platforms extractively and competing with the ecosystem
as a whole. That's why we see this vast income inequality, why we see, you know,
the hollowing out of the economy. I don't think it's technology. You told Steve Ballmer to be
less greedy and look at Microsoft now. You know, it's... They've rediscovered their soul.
Perhaps. They certainly don't dominate the tech industry. You could argue about
their relevance to the future of the tech industry. There's a machine learning story there that
will play out. But, you know, the wheel of fortune turned.
The cycle turned for IBM, the cycle turned for Microsoft.
It's not apparent how that cycle would turn for Google or Apple or Facebook or Amazon, but it will turn.
Oh, absolutely.
In one sense, there's a good creative destruction aspect of that story.
But when I look around, I take particular aim at financial markets as a platform for our society,
because they too are a platform technology.
This is the lifeblood of how businesses get funded, how they
grow. And increasingly in financial markets, we see a distortion of that original function. You
know, Apple, for example, here's Apple saying, well, we can't afford to make these phones in the
United States. Why aren't they saying, well, we actually could in fact afford to make these products
in America where they cost more? And we'd take money out of that pot that the shareholders are
claiming. And we say, we're going to give it to society in the form of investment in our country.
There's a bunch of issues with that. One of them is, of course, that the
cluster for making these devices is in Shenzhen.
It's not that you go there because you get cheap labor.
It's an awful lot more than that.
Yeah, that's fair.
That's where everything has to be, in a sense, because that's where it gets done.
There's a sort of second...
But that was a decision that was made on the basis of...
On the basis of: the reason we have to do that, the reason we were going to outsource this, is because
we're trying to optimize for shareholders.
And in fact, that shareholder optimization actually was bad for a lot of other
constituencies. Yeah, so there's a reason why it all went to China in the first place. Of course,
the fact that it went to a contract manufacturing model, on the other hand, enabled vastly
more innovation and vastly more people to make stuff. There's like a more general macroeconomic
pushback to this, which is that it's just comparative advantage that China gets rich and we get rich.
I think that's absolutely true, although I will say that the notion that the only constituency
that we have to take care of is shareholders...
I'm not arguing for protectionism per se. I am arguing, though, for strategic thinking about
what are all the things we're trying to optimize for. This situation is way more complex
than this sort of free market narrative that we normally give. And it seems to me it smacks
of justification. Cory Doctorow had a wonderful tweet. He said, you know, economists use equations
to justify the divine right of capital
in the way that court astrologers
use the stars to justify the divine right of kings.
The key thing for me, I think,
is to think of the free market at a macro level
and these ecosystems at a micro level
as a system.
And the system as it functions generally
is a very efficient way of producing things that we want.
But you have to point it and steer it
and you make adjustments and you decide,
well, that as it operates,
is not actually generating what we want.
So the market does not provide a cost of pollution.
Okay, so we're going to have to add a cost to pollution, and we're going to decide what that cost is.
We're not going to set it at a level where economic activity halts, but we're also going to stop people pumping sewage into the rivers.
And I think the same thing applies if you're running Google or Facebook or indeed Apple.
You set your balances and you set your trade-offs.
And to me, that's kind of the litmus test for rational discussion.
That one understands that all these policy decisions are trade-offs.
You can't have something for free.
And as long as you make a decision, okay, well, we're going to accept that cost in order to get this benefit,
then you're having a rational conversation.
There are lessons from technology that can apply to those policy discussions.
Because when you see the way that, say, Google or Facebook or Amazon are making
those kinds of adjustments to their platforms, it's very hard to kind of go back to the idea
that somehow the best system is one in which nobody is applying that kind
of intelligent test, learn, measure, respond
loop that we take so for granted in Silicon Valley and that is so often absent from our policymaking.
The best quote ever about monopolies is from the man who invented the idea of the invisible hand.
You know, that no sooner do people of a trade get together than they start coming up with schemes
to defraud the public. This is a person who invented the idea of the invisible hand and as he
invents it, he's saying, yes, but you've got to pay attention. That's right. We must believe
that the world we live in is not the only possible world. We are facing a set of
great changes driven by technology.
And the first step towards making the world new
is to believe that the world we're living in
and the rules we're playing by are subject to change,
that they are up to us, that they are not inevitable.
And efficiency is not the only goal.
The secret of technology is that it allows us to do more,
to do things that were previously impossible.
And taking that belief to heart
is the biggest thing that we can all do.
Well, that sounds like a good way of describing the book.
How can we make the world a better place?
WTF is out now.
Thank you very much.