Limitless Podcast - Why the Markets Ignore Amazon: AWS, Robotics, and AI Chips (Trainium 3)
Episode Date: December 23, 2025

Believe it or not, Amazon has some hidden strengths in AI, especially with the groundbreaking Trainium 3 chip and significant data center expansions. We discuss advancements in robotics and challenges with TSMC's manufacturing. With promising partnerships and interesting market outlooks, we'll definitely be keeping an eye on Amazon moving into 2026.

------
🗞️ LIMITLESS NEWSLETTER 🗞️
https://limitlessft.substack.com/p/2026s-most-asymmetric-bet

🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT

------
TIMESTAMPS
0:50 Amazon's AI Chips Revolution
3:35 The Asymmetric Bet of 2026
4:08 Trainium 3: A Game Changer
6:28 Comparing AI Chip Technologies
11:02 AWS: The Profit Engine
13:15 Amazon's Massive Data Center Plans
13:50 The AI Factory Concept
16:20 Amazon's Robot Revolution
19:08 Automation and the Future Workforce
23:22 Amazon's AI Strategy Unveiled
25:43 The TSMC Challenge
30:30 The Hidden Upside of Amazon

------
RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213

------
Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures
Transcript
The most contrarian AI bet of 2026 is the same company that you buy toilet paper from.
Many people will mistake Amazon as just an e-commerce company, but they're secretly a frontier AI
lab. For example, did you know that they manufacture their own AI chips that are as good as
Nvidia's GPUs, but 50% cheaper, which means that companies like Anthropic and OpenAI,
which have signed deals with Amazon, save tens of billions of dollars training their frontier models.
But that's not all. Amazon's compute platform, AWS, accounts for over 50% of their operating profits. They're running the same playbook for AI now, serving AI cloud to any company that
wants to inference or train their own models. And finally, Amazon has a secret up their sleeve,
which no one's talking about. Robots. For over a decade, Amazon robots have helped them scale
manufacturing and factory automation to the tune of tens of billions of dollars, which makes them
the perfect company to design and build the robots of tomorrow. Amazon is easily a $5 trillion
company hidden as a shopping platform. It's funny you mention the shopping platform part, because when everyone, myself included, thinks of Amazon, they very clearly think of a shopping platform. And for the right reason. I have some pretty
unbelievable stats. So last year, Amazon shipped 6.3 billion packages, which is 17 and a quarter million per day, which is 200 shipped per second. So that equates to about 30% of all U.S. parcels from one company. And that means one out of every three and a half packages was shipped by Amazon. So in terms of the amount of mass and atoms that they're moving throughout the world, relative to the other Mag 7 companies, they're moving the most amount of stuff, just raw stuff.
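Those throughput figures check out with some quick arithmetic, using only the numbers quoted in the episode:

```python
# Sanity-check the shipping stats quoted above (figures from the episode).
packages_per_year = 6.3e9

per_day = packages_per_year / 365        # ~17.26 million per day
per_second = per_day / (24 * 60 * 60)    # ~200 per second

# "One out of every three and a half packages" expressed as a market share:
share = 1 / 3.5                          # ~28.6%, i.e. roughly 30%

print(f"{per_day / 1e6:.2f}M/day, {per_second:.0f}/sec, {share:.1%} share")
```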
And you have to imagine that once you start to apply AI to this,
in terms of efficiency gains and improvements like you're mentioning with robotics,
the amount of stuff can really be optimized quite a bit
and have a meaningful impact on the business.
But what we're seeing with the stock price that's on the screen here
tells a very different story,
which is basically flat in a year where every company in the world that was building AI at Amazon's size went up an outrageous amount. I think the story behind this is Amazon's just very misunderstood. Like, to your point, in the last year, it's gone up 1.4%, which is just insane. It doesn't even beat inflation.
Doesn't beat inflation, which is just insane. And listen, there are theories as to why this might be,
but we're here to tell you the story why Amazon is basically the biggest AI beach ball underwater
that is about to pop up in 2026. And some of my foundational thesis behind this, Josh, is that the stuff
that they focus on is really unsexy. It's operational. And they're about to do the same for AI.
Like, think about it. Like sorting, fulfilling packages, delivering it to people isn't really a sexy
thing. And then if you talk about compute and serving compute to different companies, again,
I don't really care about it. But what most people don't realize is that the top software
companies in the world run on AWS. That's why when AWS servers went down a few weeks ago, the world couldn't function. I couldn't use X. I couldn't scroll my favorite social media platforms, because they ran on AWS.
So most people don't realize this,
and they're about to do the same for AI.
Yeah, and one last thing I want to mention on this chart, since we have it on the screen, is that little P/E ratio number, the price-to-earnings ratio. It's down to 35 now.
And for reference, back in 2018,
Amazon was trading at over a 200 times price to earnings ratio.
So the multiples have really compressed a lot, but this is a much more mature company, which actually has a vector of growth, which is adding AI to the mix to increase efficiency.
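As a rough, purely illustrative sketch of what that kind of multiple compression implies (it deliberately assumes a flat share price, which is not what actually happened between 2018 and now): since P/E is just price divided by earnings, the implied earnings growth is the ratio of the old multiple to the new one.

```python
# Illustrative only: what a P/E falling from ~200x (2018) to ~35x would imply
# under the simplifying assumption that the share price stayed flat.
pe_old, pe_new = 200, 35

# P/E = price / earnings  =>  E_new / E_old = (P_new / P_old) * (pe_old / pe_new)
price_growth = 1.0                                 # assumption: flat price
earnings_growth = price_growth * pe_old / pe_new   # ~5.7x

print(f"Implied earnings growth at a flat price: ~{earnings_growth:.1f}x")
```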
So, Ejaaz, you wrote all about this.
It's in the newsletter that we published,
but you actually created a proper article
going through the bull case for Amazon,
and I think we're going to spend a good amount of time
kind of going through the outline that you framed here
for why you believe it to be,
I mean, as the title says,
the most asymmetric bet of 2026.
And for those of you who are wondering,
hey, what is this article?
What is this newsletter?
It's the limitless newsletter.
And if you were subscribed to it,
you would have seen this article
about a week and a half ago.
So again, if you want the best alpha in AI, you have to subscribe to our newsletter. But yes, Josh, one of the first things, or one of my
first arguments as to why Amazon is an asymmetric bet here is this thing called AI chips. You heard
of them, Josh? You know, these little GPU things. I might have heard of a GPU, a little GPU, a whole bunch of them. All right, all right. Okay. So you've heard of Nvidia. It sounds like you've
heard of Google as well, as you know, we're very bullish on Google here. But what most people don't
realize is Amazon created their own chips. And it wasn't like they did this yesterday. They did this
10 years ago, Josh, when they acquired a company called Annapurna Labs in 2015, which marks the start of their designing and building AI chips to help them with machine learning inference, stuff they were figuring out way back when for recommendations on their shopping platform. Now, if you fast forward to today, in the last few weeks, Josh, they released
this chip called Trainium 3. It's part of their Trainium series of chips, which are used to build and scale large LLMs, or rather large AI models. And with Trainium 3, obviously, you know, the first question that pops into your head is, well, how does this compare to Nvidia? Well, let me give you a few stats to kind of whet your appetite, Josh: it is four and a half times more powerful than Trainium 2, but it's also four times cheaper than Trainium 2. So if you net both of those together, you get like a 10x kind of chip here, right?
But then it can also store five times more data than the previous chip that they had.
So all of these combined together gives you about 80 to 90% of the same performance as
Nvidia's latest GPU, Blackwell.
So I'm not just talking about Nvidia's second, third or fourth generation.
I'm talking about the latest generation that they have right now.
So automatically, Josh, if I was a frontier AI lab that is spending hundreds of billions of dollars each year on
Nvidia's GPUs, and you suddenly give me an option to cut 50% of that bill, so I save like $50 billion, why wouldn't I use this chip? Yeah, it sounds like a pretty good deal. And I think
we should probably start by outlining what exactly this chip is because there's a difference between
NVIDIA's GPU, Google's TPU, and then Trainium, which is something totally different. And if I'm not mistaken, they fit into two kind of basic categories. So GPUs, which is NVIDIA's Blackwell system,
that is a graphical processing unit.
It's kind of a general purpose supercomputer.
You can use it for a lot of different things,
graphics being one of them,
but also the matrix math required to do AI training as another.
With TPUs like Google's making and these Amazon chips through Trainium,
they're more focused on specific types of math.
They are not general purpose.
They're narrowly focused on AI training and AI inference,
and that's where you get a lot of the efficiency improvements.
So like we're seeing on the screen, 4.4 times higher performance, 4 times more memory bandwidth. The specs are insane, but they cannot be used for everything.
So it's a very specific type of customer that wants these types of chips.
Exactly.
So if you were to kind of compare the two, Nvidia's GPUs and Amazon's Trainium chips: Nvidia's GPUs can be used for a lot of broad use cases when it comes to training models. Like, there are several different ways to train an AI model. If you weren't sure which one to use, you would probably use an Nvidia GPU, because it would just be consistent across all of those things. But for Amazon's Trainium chips, you need to know exactly how you're going to train a specific model, and then it ends up being cheaper. So it's for like a highly specialized type of AI lab that wants to train their own model.
And what's interesting about this, Josh, is it's not just kind of the specific architecture of how this chip is designed. It's also very much the cost. I have a table pulled up here, and if you can take a direct look at it, it compares Trainium 3 to Google's TPUs and Nvidia's latest chip, the Blackwell, and it is half the cost of Nvidia's Blackwell. And that is excluding Nvidia's margin that they add on top of this, Josh. So if you look at this, the average cost of an Nvidia chip to manufacture is around $6,500 to $7,000, but they end up selling it for $40,000.
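Taking the episode's figures at face value (they're the hosts' rough numbers, not audited filings), the implied margin math looks like this:

```python
# Implied economics of an Nvidia chip, using the episode's rough figures.
cost_low, cost_high = 6_500, 7_000   # quoted manufacturing cost per chip ($)
sell_price = 40_000                  # quoted selling price per chip ($)

margin_low = (sell_price - cost_high) / sell_price   # ~82.5% gross margin
margin_high = (sell_price - cost_low) / sell_price   # ~83.8% gross margin
markup = sell_price / ((cost_low + cost_high) / 2)   # ~5.9x the cost

print(f"Gross margin ~{margin_low:.1%}-{margin_high:.1%}, markup ~{markup:.1f}x")
```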
That's the price that Amazon has to pay to buy these chips. That's the price that OpenAI has to pay to buy these chips. So I can think of two things here. If Amazon has an almost-as-good chip, then it's going to largely be a more attractive chip to buy for frontier AI labs, but it's also going to cut into Nvidia's margins drastically. So you start seeing Amazon and Google TPUs being able to eat into the market monopoly that Nvidia has. Okay, so interesting. I want you to
kind of help me understand this, because a lot of this has been new information to me. I got introduced to this through the article. And what I understand is that Amazon's chips, though they're not TPUs, are better on a per-watt basis. They're more efficient. They're more cost-effective. But I guess my question is, if Amazon were to start making these available to the public tomorrow, would they outsell Blackwell? Would people be more interested in these? Or would it just be a very specific audience?
No. Okay. So there's two things I want to bring up with you that kind of sway people's decisions when they're choosing between the chips. Number one, an Nvidia Blackwell chip on its own, and also an Amazon Trainium chip on its own, isn't useful, Josh. You need to stack them into these things called clusters. Now, for an Nvidia chip, you only have to put 72 of these chips together to get the same performance per watt. But with an Amazon chip, you need to stack 144 of them together. So it's a higher volume thing. Okay, so it's about double the amount of chips per little cluster that we have in a rack.
And do those chips, are they more expensive too?
Or are they like Costco?
Because it seems like that's a lot more complexity for like 80% of the efficiency.
See? Nope. They're cheaper. So on this table right now, they're half the cost. That's where the $3,000 to $3,500 figure comes from. So you may have more chips, which may take up more volume in your data center, but they are cheaper to run on a per-inference cost basis, right? The other thing, Josh, is you might have noticed I said they're not as good as Nvidia's GPUs. They're around 80 to 90% as good. And the reason why there's that difference is because of Nvidia's software moat. So this is something called CUDA, or Compute Unified Device Architecture.
So basically, if you have the chips, that doesn't solve your entire problem. You need software to be able to make these chips run really coherently together in their clusters. Nvidia has a stranglehold on this, Josh. Any AI lab that is using Nvidia GPUs, which is the majority of AI labs, runs the CUDA software stack. And typically, this has locked customers into using Nvidia.
So let's say they're interested in using Google's TPUs. They may not necessarily want to jump to Google's TPUs, because the software isn't the same. They would have to rewrite their entire code base.
Amazon saw that and thought,
hmm, I bet I could make this easier.
And so they released this thing called the Neuron SDK, which now allows you to copy and paste your code base from your Nvidia instance onto your Amazon Trainium chip in a few clicks.
Maybe this is a good time to point out the fact that this is a really big deal for Amazon, even if they don't sell these chips anywhere, because AWS is such a huge story. We were looking up before this episode was recorded how much of the internet is run by AWS, and it's about a third. And another fascinating thing that you kind of pointed out in the intro: as of last quarter, I believe, AWS revenue was something like less than 20% of the total company, but it accounts for close to 70% of the actual profit share. I think it's like 66% as of the third quarter. So this small part of Amazon's business accounts for over half of the total profit every single quarter that comes in. And with these plans to scale by using these new chips, by using this gigantic data center that we're seeing on the screen that we're going to get into, there is a very clear trajectory for it amplifying the one part of the business that actually matters the most for their bottom line.
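A quick sketch of why those two shares together make AWS "the profit engine" (the ~20% and ~66% figures are the rough numbers from the conversation, used here as assumptions):

```python
# How much more profitable AWS is per dollar of revenue than the rest of
# Amazon, given the rough shares quoted in the conversation.
aws_revenue_share = 0.20   # AWS as a share of total revenue (approx.)
aws_profit_share = 0.66    # AWS as a share of operating profit (approx.)

# Operating margin is profit / revenue, so the ratio of the two segments'
# margins depends only on these shares (total revenue and profit cancel out).
aws_margin_ratio = (aws_profit_share / aws_revenue_share) / \
                   ((1 - aws_profit_share) / (1 - aws_revenue_share))

print(f"AWS operating margin ~{aws_margin_ratio:.1f}x the rest of the business")
```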
And we were talking earlier, it's kind of similar to what Costco does with their membership, where Costco's business operates on very thin profit margins. They really don't make a lot, if any, money on the actual goods sold. A lot of that profit comes from the membership, from the lock-in. And Amazon, what they have with AWS, is this unbelievably profitable engine that is becoming much more efficient using these new Trainium chips.
But here on screen, we have these really crazy-looking data centers.
So maybe you can tell us more about what's going on here.
Yeah, I mean, listen, we're not strangers to crazy data center setups.
We've spoken about Elon Musk's Colossus 2.
We've spoken about Meta's Super Data Center that they're building.
Basically, all the top companies are spending tens of billions of dollars.
And Amazon is no stranger to this either.
They've invested $11 billion, and they're going to invest another $20 billion next year to build out their Indiana data center campus to create around 2.2 gigawatts of compute. That's equivalent to about 1 million homes' worth of energy by the end of next year, which is just an insane goal to wrap your head around.
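The "1 million homes" comparison depends entirely on what you assume a household draws. Using the commonly cited ~1.2 kW average US household draw (an assumption, not a figure from the episode), 2.2 GW lands in the same one-to-two-million-homes ballpark:

```python
# Rough check on "2.2 gigawatts ≈ 1 million homes". Assumption: an average
# US household draws about 1.2 kW (roughly 10,500 kWh per year); peak draw
# is several times higher, which is why such comparisons vary widely.
capacity_watts = 2.2e9
avg_home_draw_watts = 1_200

homes = capacity_watts / avg_home_draw_watts   # ~1.8 million homes

print(f"~{homes / 1e6:.1f} million homes at a {avg_home_draw_watts} W average draw")
```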
And to your point, Josh, like, if they're able to pull off what they did with AWS for AI compute specifically, there is kind of no reason, for me at least, to think why they wouldn't eat into all the other neoclouds' valuations. We've spoken about, what's his name, Leopold, investing in this company called CoreWeave, which was one of the top neocloud providers. And he invested to the tune of, I think it was like $350 billion, almost $400 billion, into this company. If Amazon just switches this on at the end of next year, companies who are already running on AWS will just switch to the AI version of AWS. It kind of doesn't make sense for them to flip to a competitor. And it's why I think they have such a sticky moat.
They help with all the unsexy stuff, Josh. I don't know if you saw this rollout of something called AI Factory. Did you catch this by any chance? No. AI Factory sounds interesting. We like factories.
Okay. Okay. So let me lay this out for you. A lot of enterprises and even governments want to create their own versions of AI models, but they run into two main issues. Number one, they don't know who to buy the chips from. Should they go to Nvidia? Should they go to Google? Should they go to Amazon? They have no idea. And then two, they don't want to set it up themselves, right? But they also don't want to rent compute directly through AWS. Why? Because, you know, they have some private information. They don't want to leak information. They don't want Amazon to own the information. They want to hold it on their own private servers. So Amazon looked at this and said, okay, we're rolling out a service called AI Factory, and here's what we offer. If you want Nvidia GPUs, we got you. If you want Amazon Trainium chips, which are, by the way, 50% cheaper, we also got you. If you want a hybrid or mix
of both of these things, we've got you. And what they do is they build the server racks, Josh,
for them. They basically build out a data center for them. And this benefits them in so many ways,
because they have the software and they have the hardware,
and the cost is 50% lower if they run on Amazon Trainium chips.
So it's like an all-in-one package where they have lower latency,
they save on costs, and they have premium frontier intelligence.
Pretty cool.
But there's an important distinction there where they're not building the factories for the company.
They're building their own infrastructure that they're lending to the company.
Right?
So if we're considering a company like Oracle,
whose job is to build data centers for companies,
You can think about Project Stargate with OpenAI.
They are building actual infrastructure that OpenAI will own.
What Amazon is doing is they are building the infrastructure for you, but they're just leasing
it to you.
They still own the underlying core infra.
So these other companies are essentially bankrolling the buildout of these factories,
but doing so in a way that doesn't incur a lot of debt.
Like they already have customers here.
And the switching costs, I was looking, because I was curious, as it relates to these big cloud providers: AWS is 30%. Azure and Google Cloud are another 33%.
And then the rest is just kind of this mix of things.
So a third of the companies are already,
they trust the security of Amazon.
They already use Amazon.
They have their whole custom software stack built on Amazon.
And in a world where they can just extend this to integrate AI into their offering, well, now 33% of the internet just gets a direct AI offering built in. And that's like a pretty powerful thing.
Yeah, Josh, you know which other little small company this reminds me of? Google. Who's that? Google already had their roots sown
into so many different products.
Basically, anyone who's ever graced the internet
has come across a Google product,
whether it's Gmail, Android, Play Store,
whether it's Google Search itself.
Amazon is the same bedrock for this,
except it's just,
hey, we've got the compute
that you're going to run your number one website
or internet product on.
And they're converting the same users,
as you said, into AI users.
I just think it's a complete no-brainer.
And yeah, to kind of build on your analogy, they're not building the entire data center for you.
You're going to still have to buy the warehouse and supply it with energy and an electrical grid connection. But they've got everything else for you. You don't have to have the upfront AI capex costs that, for example, OpenAI has when committing $1.4 trillion over the next five years to buy these GPUs.
Okay, so we've covered the chips.
We've covered the cloud.
Now we have the physical infrastructure.
The world of atoms.
This gets to our 6.3 billion package statistic, where Amazon, of all the Mag 7 companies, just moves the most amount of stuff through the universe. And through that, they stand to benefit a lot from automation, particularly as it relates to robots. So here we have news that they have three quarters of a million robots already deployed. Ejaaz, what are their plans going forward that are part of your bull case as it relates to robotics and warehouses?
Okay, what I'm about to say is going to sound controversial because I have said that Tesla is the number one robot company so many times.
but I was wrong.
Oh, wow, that is a hot take.
It's going to be Amazon.
They already have almost a million of these things out there.
And listen, listen, they might not be the humanoid robots that grace your home, that help you with the laundry.
I have laundry in my thing that needs to get sorted right now.
Wish I had one of those, right?
It's not going to be a robot car that takes me wherever I need to go from A to B with minimal accidents. But they're going to be the robots that scale very important things such as manual labor, Josh, which is like, you know, a multi-trillion dollar industry, or global sector in itself. If you are able to cut down costs by 50 percent or maybe even more, why wouldn't you do that? Amazon is the perfectly positioned company for that. They already have
a million of these robots. They are making very, very aggressive cuts on their labor force. I don't know
if you saw, but like they cut 30,000 jobs about two months ago, and they're aiming to scale that up to 600,000 warehouse workers by 2033, which is honestly quite scary to hear for a lot of people,
I'm sure, who are in these jobs. But I think those job roles will essentially evolve.
But it basically makes Amazon the perfect company to design and build these automation robots
of the future. Again, they're doing the unsexy stuff, the behind the scenes work.
They're not consumer-facing robots, but it's still a multi-billion dollar industry.
We're going to have to agree to disagree on that Tesla robotic statement, I think.
But I think there is an important difference to outline in the two different approaches the
companies are taking. So Tesla very much treats the factory as the robot. And the robots that they're going for are more humanoid, general purpose, or related to transportation.
What Amazon is doing and what we're seeing here is sure there's some robots that look like
humanoids, but a lot of the robots that are going to be in these factories, they're narrow
purpose robots. They're good for one task. It's a single arm that moves a specific way very quickly.
And I think, when we talk about robots, it's important to understand that there are narrow-band robots and general-purpose robots like humanoids.
And what Amazon most likely stands to benefit from the most, at least in the shorter term, is these narrow-band robots like these arms that you're seeing on the screen that are really, really efficient at doing one specific task.
Or whether it be those things that are rolling on the floor that can move all these boxes around.
So as it relates to that type of robotics, Amazon probably has the strongest case, to me, on how much they can profit from putting those into their product, because the factory very much is their product. How many packages can they ship per minute? If they can get that up from, what was the number, some crazy number, whatever, 200 packages per second, if they can get that up to 250, I mean, that's a huge improvement. So I think as it relates to factories, this automation is going to be
huge for them. I have a question for you, Josh. I noticed you said that it'll help them profit from their own products. Do you ever see Amazon selling this type of robot to other factory manufacturers in completely different sectors?
I would hope not,
but what I imagine Amazon does
is uses this to create more of a platform
to entice sellers.
So a similar business, which I'm kind of thinking of: people use Shopify a lot to create web stores and to sell things. Shopify is good at creating the tooling. It's good at creating the actual website. It's good at pairing sellers with consumers, taking care of all the infrastructure. But Amazon is the layer that sits beneath that and can actually handle the logistics for you if you're shipping physical goods. I suspect that Amazon won't sell these robots to other companies. They will just,
again, like they're doing with the data centers, they will build the infra and then offer the services
to anybody who wants them, because that's where they get the most amount of profit. So if you're a seller
on Amazon who wants to know, well, what are the sales going to be like in November and December during
the holiday season? Can you project that for me? How much do I need to make? Okay, how much should I
send to your warehouse so that you're able to get out all the orders on time? And as they get more of this intelligence, they start to build more of an understanding. And this gets to the consumer side,
too, where they understand their customer. They know what the customer is shopping for. They know
how to sell them ads. You get this really fully vertically integrated experience as a seller on
Amazon that you just can't get anywhere else. So if they're building these robots for their own
warehouses, they should keep them and make everyone else use them because that total vertical integration
where they understand the customer, they have all the automation, it creates the best experience
for everybody. I see it. I just don't know if I can
fully believe it just yet because, like, in the same way that they rent compute, or kind of capex data center hardware, to different people, I feel like they would probably do the same for their robots as well. But I saw a news leak last month, Josh, I don't know if you saw this, that they're planning to take on the US Postal Service. Hell yeah. For those of you
who don't know, Amazon pays the US Postal Service, or rather facilitates, $6.6 billion of revenue by using the US Postal Service. Amazon just didn't want to scale their own delivery service to the extent of the US Postal Service. Now they're in the market of actually doing that. So we might end up
with a company that is just all consuming and using all the goods for themselves. I don't think
they'll do it for chips, but maybe they'll end up doing it for robots. And Josh, I guess the
final point that I want to make and that we had in this essay is Amazon is just really good at
understanding what they're good at and not treading outside of that line. So what do I mean by that?
Well, many people think that, okay, if you're a frontier AI company, you should have a bleeding
edge model. The funny part about this is Amazon has arguably the weakest AI model that's out there,
but it's good for their in-house use. So they have a series of models, Josh, and it's called Nova.
But have you ever used Nova, Josh? Have you ever kind of heard of it? Heard of it, not used it. I actually haven't engaged with any Amazon AI yet. This is a new frontier. Okay. Well,
the reason for that might be because it is a speech to speech model, meaning that you speak to it and it
speaks back to you. So you might be like, well, when the hell would I ever do that? It's primarily in
customer support. That's where they've used their AI model. So they trained a very small but
hyper-efficient AI model to kind of handle all of that kind of stuff. So they've been able to reduce their reliance on human customer support for that and save costs on that end. The other side of things is Amazon is really good at focusing on enterprise customers,
and so they produced an AI model software service, which basically allows any enterprise to train
their own AI model using Amazon's model architecture so they can train it on their own proprietary data.
Why would you want to do that as an enterprise? Well, you have private data, and you don't want to give it to OpenAI or you don't want to give it to Google, but now you have a private instance where
you can just run it on Amazon's model product. And that's really cool. And that brings me to the final
point, Josh, which is like kind of Amazon's secret ability here. They've invested in some really
big companies. In fact, you might have heard of Anthropic. Actually, I think you told me this on a previous
episode. Yeah, a little known fact that most people don't know is that Amazon owns about 20%,
give or take a few percentage points of Anthropic. And it's funny because when I was considering the
bear case, the reason to be skeptical of Amazon, one of the big things was comparing the other major cloud providers.
We have Google Cloud Services, which is, I mean, a huge entity that runs on Gemini.
It benefits from Gemini.
Then we have Microsoft Azure, which is Microsoft's cloud service provider that partners with OpenAI,
and they receive 100% of the IP from OpenAI.
But then I was like, well, you know, Amazon actually does have their own big dog in their corner,
which is Anthropic.
They have this huge partnership with Anthropic where I'm sure a lot of resources are being shared,
but Anthropic is also training using these new chips, which is fascinating because right now,
Anthropic has the most unbelievably great coding model in the world. So something is working behind
the scenes. And I guess time will tell us to see how it bleeds out into the rest of the industry,
but they do have some big guns in their corner. Kind of going on the theme of the bear case, Josh, I have to put a realistic framing on this. Amazon's chips are awesome. They're almost as good as Nvidia's. They're 50% cheaper. But there's one major constraint, which Google actually faces as well. This is a little company in Taiwan called TSMC. Some of you might have heard of it. They require the Taiwan Semiconductor Manufacturing Company to be able to build the chips for them. Google relies on
them to build their TPUs as well. Can you take a guess at which company owns around 90% of the
capacity for TSMC next year? Oh, I'm going to guess there's only one correct answer to this question,
and that is Nvidia, the owner of all chips and GPUs.
Yeah.
So even if Amazon wanted to,
let's say they created a hit product
with these chips and everyone wanted to use it,
they couldn't even fulfill demand because
Nvidia holds the stronghold. And so they have to wait until the end of 2027, when TSMC will have scaled enough capacity to be able to service that. So they're going to start to look for alternative providers. Yeah. Is that a real constraint? Because, okay, so I'm of two minds here, where Amazon seems to be undervalued just on a
relative basis. Their price to earnings ratio is very low. Wall Street has basically priced them
as break-even throughout the course of the year, while a lot of the comparable companies have gone up,
like 80, 90, 100%. But then I hear things like this, and I'm like, well, is this bull case that we're laying out actually even possible? Because we do have this TSMC monopoly situation with Nvidia. There are alternatives. I know Samsung is working on building out chip architecture. Is that really the nail in the coffin? Like, is it possible for them to see this upside growth without TSMC? Or is it really just relying on TSMC?
So a consistent trend for Nvidia, which owns the monopoly of capacity on TSMC, is they want to be able to train frontier AI intelligence. Amazon's approach has been very clear. They want to be the cheaper alternative: once you've scaled and built your frontier intelligence, theirs is the cheaper chip for you to use to scale your product in general. Inference is where they plan to make most of their money, is my suspicion.
The other way I would address this is, I do think it's not going to be immediate, but eventually, over the next couple of years, companies like Samsung and other competitors are going to provide an alternative to TSMC. Now, critics will be jumping down my throat on that one, because they're going to say, well, for the last decade, nothing has changed. And I will just respond that AI wasn't really a big thing over the last decade. It's only been the last couple of years. And if big companies like Google and Amazon are completely constrained because of this one company, and Nvidia maintains the market monopoly there, it's going to force companies like Samsung to create an alternative.
We're already seeing this.
We had an episode this week, or rather last week,
on China copying the key components of a company called ASML
to try and fill this gap.
I just, if I was a betting man, which I am,
and I own Amazon stock, you know, just putting it out there,
I think a competitor is going to come into the ring.
Yeah, it seems like, well, the interesting thing is Samsung actually beat TSMC to the two nanometer chip architecture.
Yes, exactly.
Which was a really big deal.
And now they're going to partner with Apple and the largest companies are going to get their chips from somewhere else.
Is it going to be too long? Ejaaz, 2027? We're going to have, like, AGI on Mars by then. So can they survive? Like, is that not too long?
I don't know.
So I guess maybe we could wrap this up with where you currently stand. How long do you think these things will take to play out? How much upside do you think is really possible in the near term? How badly has the market mispriced it? Is this just a bubble underwater that is waiting to pop up? How does it look going forward into 2026, based on your understanding of it all?
Okay, so in short, I think it's bullish, because even though they are reliant on TSMC's capacity, they are still going to produce on the order of five to six million Amazon chips next year. In fact, Google's going to do the same as well. And one simple problem remains, Josh: there are not enough GPUs, TPUs, or Trainium chips to satisfy demand right now. Even with Nvidia's capacity, they're sold out.
So companies are looking for other types of chips to fulfill demand. That's why OpenAI last week signed a $10 billion deal with Amazon. That's why Anthropic signed a one-million-chip deal with Amazon. It's the same thing. Like, no matter how many chips Amazon creates, they're going to end up selling them.
Now, over the long term, are they going to win? Yes or no? It remains to be seen. You know, everyone thinks TSMC has a stronghold. I think there's eventually going to be a competitor, which makes Amazon and Google way more competitive against NVIDIA's moat.
If you build it, they will come.
And Amazon is building it.
And they're building a lot of it.
And their stock has run up the least out of everybody. So just based on that math alone, there's got to be some hidden upside there. Like, the way they're trading, it's such a cash-generating business. Everybody who's watching this video has interacted with Amazon once in their life.
It's just, it's a great company that will probably stand to benefit from AI.
When and at what scale, we don't know, but that is part of the fun.
We will follow it along as we go.
Ejaaz, are there any parting thoughts before we wrap up here?
Actually, yes, I have one final bit of lore. We started off the episode critiquing Amazon's stock price: it's flat. One of the main reasons that critics give is that they don't really believe in the CEO, Andy Jassy. I don't think I agree with that. He's led the company to, like, beat quarterly earnings time after time.
They're doing really well.
They've grown a lot.
But there's a small percentage chance, Josh, that Jeff Bezos reclaims the throne.
Similar to how Sergey Brin did so with Google.
Why not?
He's openly said that AI is the most important technological shift of the next decade. He started a company that's focused on AI for robotics. It feels kind of weird for him to just leave his baby behind.
He knows he already has the vessel there.
Why wouldn't he come back?
Okay. Well, it's certainly the same thing as Google. Google had their founders disappear; Sergey Brin came back, and the stock is up 80% in the 12 months that followed. So there is a precedent for this happening. I mean, Bezos is off building rockets with Blue Origin. He's doing robotics. He seems very busy. If he comes back to Amazon, I'm sure that would be bullish to some extent.
What the extent is, we don't know. But we will be here to cover it, as always, as we go along this journey. That will wrap up our episode today on the Amazon bull case.
If you enjoyed this episode, please do not forget to share it with a friend. Or, if you haven't, go rate it five stars on your favorite podcast player. There is a special ask, which is, actually, I'm not even sure this is an ask.
This is a value add to your life.
This is the alpha drop that comes two times a week in our newsletter.
Every Wednesday is a thought piece like the one that we covered today.
And every Friday is a short roundup.
It's five bullets on the most interesting things that we found really cool this week in the world of AI and Frontier Technology.
So if that seems interesting, we have the link in our description to go sign up.
It's a Substack. It's really easy, and it integrates with wherever you get your emails. And I don't know, I think it's pretty interesting. You also hear the podcast episodes before they come out, because that's normally where we curate our ideas before presenting them in long form. So I would highly recommend it. That concludes our first episode this week. We still have two more to go, so stay tuned for that. There's a lot of interesting stuff happening. Thank you so much for watching, as always, and we will see you guys on the next one. See ya.
