The Peter Zeihan Podcast Series - The Geopolitics of... Gaming || Peter Zeihan
Episode Date: November 20, 2024
PC or console? Yes, I'm talking about gaming preferences... and if you answered PC, then we all owe you a big thank you.
Join the Patreon here: https://www.patreon.com/PeterZeihan
Full Newsletter: https://mailchi.mp/zeihan/the-geopolitics-of-gaming
Transcript
Hey, everybody, Peter Zeihan here coming to you from snowy Colorado, where we just got our first
nine inches and there's another 13 inches on the way. Boy, howdy. Today, we are going to take an entry
from the Patreon page's Ask Peter forum. And the question is the geopolitics of video games,
which, I know, I know, I know, some of you are just like, no. But this has actually, quite unplanned,
become one of the most important economic sectors in the world in the last five years.
I'm not sure whether or not it's going to continue. But let me
kind of lay it out for you.
For the period of roughly 2010 to 2021,
we had everything we needed for computing power.
I mean, yeah, yeah, yeah.
You'd upgrade your laptop every two or three years to get the newest chip.
But we had digitized most things that could be digitized.
We had moved into logistics and communication and information,
and all the low-hanging fruit had already been computerized.
And the question was, you know,
why do you need ever-faster processors
and ever more memory if you really don't have a need for them?
And yeah, yeah, we got Starlink coming up and running,
so satellite communications can be an issue.
We wanted to build a smart grid.
You know, these are all reasonable things,
but you only need so good of a chip for that.
And so as chips got better and better and better,
the number of people who were willing to shell out cash for them
got lower and lower and lower.
And then the gamers came in,
because they were a source of solid demand.
They always wanted the
fastest possible chips with the best graphics processing capacity so that they could join larger and
larger multiplayer forums and never have drag or lag. And it got to the point that they would basically
kick off people who didn't have good enough hardware, because they would slow down the experience for
everybody. The chip that is at the heart of that, where you had the largest drag and so the highest
demand among the gamers for improvement, is something called a GPU, a graphics processing unit.
And they are definitely the most advanced chips in the world today. But
a bunch of gamers sitting at home are not exactly what you would call the bellwether of global
economic patterns, even in technology. So there was only so much money that could go behind this
sort of effort. And then we developed this little thing called large language models and artificial
intelligence. And it turns out that the function of the GPU, which is designed to run multiple
processes at the same time so that graphics don't lag, is exactly what you need to run an efficient
large language model. And if you put 10,000 or 20,000
of these things running at the same time in the same place,
all of a sudden AI applications become a very real thing.
We would not have AI applications
if not for those people who sit at home in the basement
and play role-playing games all day.
So, thanks to the geeks and the nerds and the dorks
because it wouldn't have happened without you.
The question is what happens now?
You see, GPUs, because they were designed by dorks for dorks,
have some very dorky restrictions.
Normally, you only have one GPU in a gaming rig,
and you have several fans blowing on it,
because when it runs in parallel,
it's going to generate a lot more heat,
use a lot more energy than any other chip within your rig.
Well, you put 10,000 of those in the same room,
and everything will catch on fire.
So the primary source of electricity demand for data centers
isn't so much running the chips themselves.
It's running the coolant system
to keep these banks of GPUs from burning the whole place down.
Now, for artificial intelligence,
it's not that the GPUs are perfect;
they're just the best hardware we have.
And there are a number of companies, including Nvidia, of course,
who are now generating designs for an AI-specific sort of chip.
And instead of a GPU, which is, like, you know, the size of a postage stamp,
you would instead have something where there are multiple nodes on the chip,
so basically the size of a dinner plate or even bigger,
so that you can run billions, trillions, lots of processes simultaneously.
And because the chip is going to be bigger,
and because it's going to be designed for AI specifically,
cooling technologies will be included.
It won't have the same power draw per computation,
or at least that's the theory.
The problem is the timing.
Assuming for the moment that the first designs are perfect, and they never are,
we don't get our first prototype
until the end of calendar year 2025.
It will then be 18 to 24 months
before the first fab facility can be retrofitted
to run and build these new chips,
and we get our first batch.
So now we're talking the end of 2027.
And if all of that goes
off without a hitch, and it won't, we're not talking about having enough to outfit sufficient server
farms to feel the difference until probably 2029 or 2030. So the gamers have taken it this far.
And the question is whether the rest of us can take it the rest of the way, in an industry with
a supply chain that, let's just say, has some complications. So gamers, salute to you. We wouldn't be in this
pickle without you. But we also wouldn't be able to imagine
the future without you.
