Limitless Podcast - Google's New AI Chip Could Actually Dethrone Nvidia (Ironwood v7 TPU)

Episode Date: December 2, 2025

NVIDIA faces mounting competition from Google's cost-effective Tensor Processing Units (TPUs), reflected in $NVDA's 13% decline, while we explore Google's strategic advancements and partnerships, particularly with Meta. The episode highlights how TPUs could disrupt NVIDIA's dominance and the implications for both companies' futures in AI. We also consider key investment opportunities and the potential for both firms to thrive in an evolving market.

------
🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT

------
TIMESTAMPS
0:00 NVIDIA's Trillion-Dollar Monopoly
0:50 The Rise of Google's TPUs
6:08 Google's AI Evolution
6:45 Competitive Landscape: Google vs NVIDIA
7:47 Meta's Investment Choices
11:26 Future of AI Compute
14:11 Google's Strategic Partnerships
14:35 NVIDIA's Response to Competition
17:33 The Future of AI Technology
19:26 Insights from Elon Musk
24:19 Conclusion and Future Outlook

------
RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213

------
Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures

Transcript
Starting point is 00:00:00 One of the most famous investors in the world, Peter Thiel, wrote a book called Zero to One. Some of you might have heard of it. Some of you may have not. But it's all about monopolies and how much of an advantage having a monopoly is in the world. Now, what we've seen recently is a company named Nvidia reaching a $5 trillion monopoly. It's the biggest monopoly to ever exist in business. And it's huge. It's made tons of people infinite amounts of money.
Starting point is 00:00:25 But something is happening. Something is wrong. It appears as if that monopoly is starting to slip out of their hands. And in this episode, we're going to talk about why and what that means for the entire market. I mean, when you're a $5 trillion asset, the size of your company makes a meaningful impact on the market. So when a company like Nvidia loses hundreds of billions of dollars over a short period of time like the last month, we've got to start paying attention to this because it's dragging everything else down with it. So up here on the screen, we have some charts that I want
Starting point is 00:00:53 us to walk through EJS. If you could just kind of show us the difference between Nvidia and who we believe to be the competition that is responsible for knocking these hundreds of billions of dollars off their market cap. Yeah. So this really scary looking red chart that you have in front of you is the last month's performance for Nvidia's stock. And it is down a staggering 13, almost 14%, which equates to over 500 billion dollars of market cap loss. That's a billion with a B, which is just an insane amount of money for the largest company or stock company in the world to lose. So the obvious question that comes in is why and where's that market, where's that money kind of flowing towards? Well, I want to show you another chart, Josh, which is Google's chart. And do you notice a similarity? Over the last
Starting point is 00:01:40 month, it is up almost the same amount in percentage market cap for a very peculiar reason. Or maybe it's not so peculiar. Josh, have you heard of these, you've heard of these things for TPUs, right? Oh, there's little things called TPUs. Yeah. There's nothing so TPUs. You know, Josh and I like to kind of go back and forth and discuss this a lot. In fact, we actually put out a bull episode on Google, which a bunch of you watched a few weeks
Starting point is 00:02:03 back. And I, you know, I don't want to be running victory laps here, but it turned out that Josh and I might have been onto something. But I want to, I want to dumb down what's going on here showed by this really hilarious graphic or comic from the semi-analysis team. And it basically goes, Google came out with this new rock. new shiny rock called TPU version 7. It's basically their version of Nvidia's GPUs, but it's built by themselves in-house. And it's actually really, really good. It gives you an average of 30 to 50%
Starting point is 00:02:34 cost savings for the exact same performance or equivalent of an Nvidia GPU. And it in some cases performs even better, up to one and a half to two times better, right? And so you've got on Nvidia in the leather jacket here on the right, which says, actually, my rock, my GPU is faster, right? And Google's like, is that true? And then everyone ends up using Google's TPUs. And the point being made here is, for the longest time, Josh, Nvidia held the monopoly on the AI training and inference market via their GPUs. It's all anyone and everyone could use to train their models. It was the only option that they had. And now Google presents a real threat to Nvidia's market dominance by presenting these TPUs.
Starting point is 00:03:16 Now, initially, they used these TPUs to train their own model in-house. In fact, Google's never purchased Nvidia GPUs to train their own models, and yet they have the best models, which tells us that the TPUs is something to really contend with NVIDIA's GPUs, but most recently, Josh, they've started selling these to other companies, supposedly, to train their own models. And so we're reaching a point now where Google and Nvidia is a direct comparison, and we're seeing that in the market share dynamics that are happening now, you know, you've got NVIDIA losing up to $500 billion and Google gaining the same amount over the same month. It's just pretty
Starting point is 00:03:50 insane to see. Yeah, I can't stress this enough how insane that delta between the two stock charges. That's 26% combined in one month. So the market is really pricing in the fact that this monopoly is starting to crumble. Now, I think we have reasoning why that's not necessarily the case that will come later in the episode. But for now, there is some real forces at play. I mean, EJ, as you were talking about them selling TPUs. This morning I saw Morgan Stanley make this announcement. They said about every half a million TPUs Google sells can add about 13 billion in revenue. And Google is planning to sell 12 million of those over the next two years. So it's a significant amount of revenue that Google can expect to come down the pipe. And it's the first time that we're really starting to see a legitimate
Starting point is 00:04:30 competitor to the Nvidia GPU cluster. Now, that's not to say the GPU is done for. There's a lot of competitive advantages to do a GPU. I suspect they're not going anywhere, but there is now another market force at play. And when we see a market force kind of cutting in, it starts to price cascade and the monopoly slowly starts to fade. I do want to give a brief history lesson on the history of Google and their AI program, because what a lot of people don't understand is Google really is the godfather of AI from the beginning of time to now. And they've just had this problem where they haven't been able to actually build products that scale or sell products to users. But they've been doing this since all the way back in 2011. And what we're seeing here is
Starting point is 00:05:14 the original paper that a lot of people would conceive to be the first time that a neural network proved that it was work. And they trained a massive unsupervised model on 16,000 CPU cores. This was before GPUs existed on random YouTube frames. They didn't use any labels. They used no supervision. and then one neuron spontaneously learned the concept of a cat. So this seems so stupid. It's like, oh my God, it can recognize a cat. But this was the first time in history, and a machine was able to identify something
Starting point is 00:05:44 without explicitly written instructions. And that moment inside Google stood off a light bulb that eventually led to them creating Google Brain, which was enough of a breakthrough for them to start creating AI inside of their in-house system. So, EGES, if you remember Google Translate, which has been around seemingly forever, Google Translate is a result of AI.
Starting point is 00:06:00 That was an early test implementation of Google Translate. And what that actually enabled is they could suggest to sponsors or advertisers on the platform which companies are more likely to click, which users are more likely to click. And that created the whole AdSense model. It created oftentimes when you type into Google search, it'll auto-complete for you. These were all very early versions of AI before we even realized what AI was, which led to the invention of the TPU almost nine years ago. And the TPU is this vertically integrated chip that we see today, taking over basically the entire
Starting point is 00:06:29 world, one company at a time now. So they've been doing this. I mean, a lot of people don't realize the TPU has been around for nine years now that they've been iterating. We're currently up to version seven, which is the Ironwood TPU. And it's just this incredible testament to the fact that Google actually has been doing this for over a decade now, almost 15 years. And we're finally starting to see the fruit of their labor grow and be exposed in public markets. And my God, it's explosive. It is insane that, you know, they've been working on this for over a decade, right? And like that compounded value is really starting to show now. Because like, I'm guessing, like, everyone back in the day was just kind of like,
Starting point is 00:07:02 what is this machine learning thing? Like, I can't imagine any kind of like a chatbot being beneficial to us. And then fast forward to 2022, chat YouTube goes viral. And suddenly everyone's kind of raving about GPUs and Google's kind of like quietly smirking and smiling, not buying any of VINDIAS GPUs being like, hey, we invested in this a decade ago and it's finally paying off, which is just kind of insane to think about. And like Josh, like they came up with the transformer as well, right? which is like 2017-earing architecture.
Starting point is 00:07:28 Exactly for this alarm. So super, super cool to see. Now, I want to kind of like step aside and kind of frame the narrative for what we're about to discuss right now. So we're talking about Google versus Nvidia, and there's many different ways that we can kind of compare the two, right? The most obvious one is through TPUs versus GPUs, with that you mentioned. And one of the biggest questions that I think listeners have on their mind, Josh, is like,
Starting point is 00:07:52 okay, well, if Google's going to compete with Nvidia, where's the proof? like, who are they selling this stuff to, like, surely they can't be selling to any major competitors, right? Surely they can't be selling to major companies. So can they actually really compete? And I just have one article I want to share with you, one little tweet. You might have heard of Mehta or Zuckerberg, who is rumored to be spending tens of billions of dollars in 26 on Google's TPUs to train their Lama or big Lama models coming up in the future. Now, of course, you're not too unfamiliar with Mehta's kind of progress recently, or rather their spending budget.
Starting point is 00:08:28 They've spent, I think, to the effect of $25 billion this year just to hire talent to train the model. We haven't even seen the model to begin with. So the fact that they're planning on using Google's infrastructure, supposedly, to train their models is no easy feat. And so you might be wondering, well, like, why would they choose GPUs over Nvidia's GPUs? That's kind of like the standard kind of framework to go down.
Starting point is 00:08:51 Well, I just want to show you the scaling chart. which basically shows that the TPU, which is Google's infrastructure, their GPU, versus the GB300, which is Nvidia's latest GPU, there's a significant cost difference, right? If you were to use Google's TPUs, the Ironwood TPUV-7, you would save 30 to 50% depending on the amount of TPUs
Starting point is 00:09:15 that you would use in training your model. Now, when you consider that META's Cappex spend for the next year, I believe is going to be something along the lines of $67 to $85 billion, that is a lot of cost saving if you are able to use Google's TPUs. So from an economical sense, it makes a hell of a load of sense, right?
Starting point is 00:09:38 And then the other thing I was thinking about, Josh, is why would META kind of use Google's TPUs for their own systems, right? Aren't they directly competing? Well, there's a secret kind of detail that I learned about this. The way that Google's TPUs are designed makes it really performant for something called recommender systems.
Starting point is 00:09:57 Now, a recommender system is the system that is used behind ads, behind social media algorithms. Meta is arguably one of the biggest companies which uses these things. So if they can have a hyper-performance TPU to train the AI model to use on their own social media platform that will make it inevitably better at 30 to 50% less cost, it seems like a complete no-brainer. And if you add this deal to the anthropic deal that are purchasing 1 million TPUs from Google, as well as another deal that we're about to talk about, that's just insane.
Starting point is 00:10:29 Do you think that's pretty insane, right? Yeah. And going back to that chart that you just brought up earlier, which shows the cost difference. Like, if you're a big AI company spending billions on training models, Google is now offering a system that can cost only 25 to 50 cents for every dollar you'd spend on Nvidia's best hardware. And that is a huge deal. Because, I mean, when it comes down to it, cost per compute, cost per unit,
Starting point is 00:10:51 of compute and training is so large when you're at the scale of these companies spending tens of hundreds of billions of dollars, that 25, 50% of the cost is massive. And granted, like you said, they're not good for everything. But if you're a company like Meta, who's building suggestion algorithms that's particularly good for, this is a no-brainer. And it seems to me like now the only threshold will be, how quick can Google actually create these, manufacture them, spin them up, put them in server racks, and get them online so people can start using them. Because this unlocks a whole new use case for AI that we haven't seen in the past that we'll see now because of the lower cost and also the increased efficiency of these ironwood TPUs. And Google's innovating
Starting point is 00:11:27 quick man. I mean, each one of these TPUs is coming out every single year and each one is significantly better than the last. Every 500K TPUs that Google sell adds 10% to their 2027 Google cloud rev and 3% to their 2027 earnings per share. That is insane. So they don't need to sell. Yeah, $13 billion. They don't need to sell. They don't need to sell. sell near as much GPUs as Nvidia cells, they just need to sell a couple hundred K. And if this is just one deal that they're cementing with meta, can you imagine how much revenue they're just going to churn from this?
Starting point is 00:12:01 It was rumored that the Anthropic deal, where they're selling around, I think it's a couple hundred TPUs to them, is going to earn them $50 billion next year just to train Anthropics kind of like next forward model. So just kind of insane to see. There is a counter thesis to this deal, which is, you know, I'm going to put my tin hole tin foil hat on here, Josh, which is metas kind of doing this so that they can negotiate better terms within Vida or AMD to kind of get like better chip deal saying, hey, look,
Starting point is 00:12:33 we'll go with Google unless you guys give us a cheaper kind of route. I think this is kind of like conspiracy theory. I mean, the metrics around Google's GPUs kind of prove themselves, but it's just something to keep in mind. I don't want to get too much into my bold thesis here. I was going through a lot of the deals that we're surfacing today. One of them being with Foxcon and Google, where now Foxcon is responsible for building a thousand server X a week.
Starting point is 00:12:54 Next year they're doing 2,000 server X a week. There was an announcement earlier today where Google is now partnering with AWS, the cloud server, to provide more infrastructure. So we're starting to see again more of these deals that are happening around the Google TPU worlds, which is super fascinating. And then this leads to the final point, which is the head of AI infrastructure in a meeting from a few months ago saying that Google must double AI compute
Starting point is 00:13:17 every six months to meet its demand. So there is no shortage of demand, there is no shortage of infrastructure, there is no shortage of support to get these TPUs out to the world. And what we're going to start to see is how big of an impact this really does have on a company like Nvidia
Starting point is 00:13:30 now that there is someone else in the market. There is a second seller for a company like Meta who wants to build massive AI systems. You could argue one of the most obvious bull signals to purchasing or investing in Google stock was the fact that Berkshire Hathaway bought a $3.5 billion stake in Google literally a couple weeks ago. And then, funnily enough, the leaked information around them selling TPUs to META
Starting point is 00:13:56 and them striking this deal with NATO kind of like surfaced, right? So there's a lot of momentum behind Google right now. There's a lot of big, valuable investors and kind of infrastructure providers getting behind the Google train right now. The momentum is palpable to say the least, right? And this NATO deal is another example of it, right? Like we're going like from like the hyperscaler kind of consumer level to the government level as well. So all types of organizations are treating this with a very high importance that Google was going to play an inevitably big role here. So then that begs the question, well, what's Nvidia going to do about this?
Starting point is 00:14:30 Are they just going to continue losing hundreds of billions of dollars in their market cap or are they going to strike back? And there's two frames of thought about this, Josh. Number one is, so Nvidia's next generation. of GPUs is going to be around the Rubin architecture. It's called Rubin, right? They introduced a new spec after Google's
Starting point is 00:14:52 TPUs, their latest TPUs got released, which upped a lot of the watts or compute performance for the Rubin architecture. Now, some might say this is just coincidental, but some might say this is a general reaction to the fact that Google just has a higher performance
Starting point is 00:15:07 TPU versus theirs, and so they needed to kind of like up the metrics of their next generation if they wanted to compete and appear attractive to their competitors or to their customers themselves, right? But then there's also the argument where it's just kind of like Nvidia and Google are kind of playing
Starting point is 00:15:23 in kind of like different ballpark and they already know this. They're playing different games. The argument here in this tweet being that Google's TPUs are great, but they're only for very specific niche use cases. If you have, you mentioned earlier, you know, the recommendation or search algorithm,
Starting point is 00:15:40 then, you know, these A6s are going to be really good. the benefit of Nvidia's GPUs is that they're highly generalizable. So if you wanted to train a model in a different way or test out a new method to kind of like inference or train your model, GPUs are by far the best architecture or the best infrastructure to use. So you could argue that Google, that, sorry, Nvidia is sitting pretty comfy. And Jensen went on a show or an interview this week, basically saying that he's not worried about Google. Obviously, you expect him to say that. But mainly for the fact that this is a positive,
Starting point is 00:16:11 some game. You know, Jevin's paradox, if you create more GPUs, it's not going to be a fact that you have oversupply. There's just going to be increased demand for compute. I think Jensen knows this, and that's why he's just kind of running fault. The fact of the matter is, there isn't enough Nvidia GPUs to supply the customers, even if you want it to, right? He needs to ramp up infrastructure production. That's why he's been visiting TSM for the last couple of weeks. So I think he knows this, and I don't think he's too worried, but he's definitely sweating a little. I don't know if you think the same. No, I mean, I'm sure it sucks. It's like you are just running the show and now suddenly there's
Starting point is 00:16:45 someone else who has a good product. It's not to say that it's going to harm the company too much. And I think for anyone who's listening, if you take away one thing from this episode, it's that both of these companies are going to succeed wildly. And there is going to be a shortage of supply for compute for a very long time. If you believe that AI is as impressive and as important as the technology as it really is, then you also have to believe that all of the compute around us must be replaced by it and must have it embedded inside of it. In order to do so, you need to shift the entire technological infrastructure of all the hardware that exists over to infrastructure that supports AI in everything. And we are just a fraction of a percentage through that transition.
Starting point is 00:17:24 So as a result, there could be many more Googles, many more invidias, and there would still be a shortage. Now, the question becomes, is there a short-term bubble? Are we overspending? That is to be determined, but this is a good type of bubble. This is one that even if it does explode, we are left with unbelievable technology across the board and a scaling infrastructure that will continue to be able to support this new type of technology that's permeating throughout society. So is this a good thing? Yes.
Starting point is 00:17:49 Is Nvidia going to suffer? Maybe, sure, the headlines suck. It's like, okay, we're not the coolest person in the world now. There's someone else who's also a cool kid. But they're still going to continue to produce the best products in the world. And EJS, to the point that you made earlier, they're just different types of chips. Like, a GPU is very different than a TPU. And a lot of people also need to understand that the whole world isn't actually training AI.
Starting point is 00:18:12 There's still a lot of other things that are happening, like graphics or simulation or financial technology, scientific research. TPUs just don't do that and GPUs do. So there's a lot more going on to the story. There's a lot more GPUs being sold. There's a lot of TPUs being. There's enough for everybody. And then there's still not enough for everybody.
Starting point is 00:18:29 So I think in the long run, like this is just great for both companies. This is positive sum. There's a lot of excitement around this, rightfully so. because I think it's great. There's another person stepping in, but it's not the end of anyone. It's just the beginning for so many of these companies still. I mean, like, don't take your and my opinions either, right? Why don't you just listen to one of the smartest men or smartest businessman?
Starting point is 00:18:49 Oh, we got Elon. Yeah, we got Elon. And he was asked this question in this interview clip that I'm about to show. This was released yesterday where he was asked, Elon, if you had to invest in any AI companies today and hold it for a decade, what would you buy? And you think, you know, Elon would shell his own companies? He didn't.
Starting point is 00:19:08 He shilled two companies, Josh, Google and Embedia. Let me show you a clip. I think, you know, Google is going to be pretty valuable in the future. They've laid the groundwork for an immense amount of value creation from an AI standpoint. Invidia is obvious at this point. I mean, there's an argument that companies that do, AI and robotics and maybe spaceflight are going to be overwhelmingly the older value, almost all the value.
Starting point is 00:19:46 So the output of goods and services from AI and robotics are so high that it will dwarf everything else. And so, you know, you hear that from South where he's basically describing Nvidia as a sort of toll collector because you kind of like need to basically pay the tall man for his GPS. used to get access to the intelligence that you're trying to build. And then Google's mode is kind of similar but quite different in the sense that they create the GPUs, their own TPUs, but they also like kind of own dominance across the entire AI stack, right, Josh.
Starting point is 00:20:21 And just to kind of maybe like round things up, I was looking at these crazy charts from the Financial Times this week, which basically showed that Google's Gemini model has now almost caught up in the number of users or monthly downloads that chat GPT has, which is just insane. That's the chart that we see here on the left. And then on the right, which I found the most interesting, is the amount of time that each Gemini user is spending on the app using the Gemini model has now beaten ChatGPT. This kind of blew my mind because I was like, surely everyone's still using ChatGPT because people tend to use ChatGPT for their own personalized things. They kind of like confer it therapy, ask about personal stuff. That probably spends a lot more time.
Starting point is 00:21:03 seems like the productivity aspect that people are getting from task orientation-based AI stuff using Gemini seems to be extending. And that just kind of like shows I've heard anecdotes from friends you and I were talking to a team member just before recording this. And he was like, yeah, I was talking to a bunch of my friends and they've fully switched to the Gemini app. So I think we're going to continue seeing this trend of Google gaining the advantage, not because of their infrastructure mode, but because they like own all the popular apps that anyone and everyone wants to use. And so all they have to do is plug in the AI model with whatever app, Gmail, maps, whatever you might kind of think of. And suddenly you have a really productive, useful app that
Starting point is 00:21:41 you and I want to use every day. Like Josh, you mentioned that you want to use Nando Banana. Or you're using Nano Banana, right? Yeah, there's a, there's an important shift that I found that has happened recently that hasn't happened before, which is I have Gemini on my home screen on my phone. And that's, that's a very high signal because I've been resistant of it because it just hasn't been good. And while it's still not great, I think the chat chapti app is engineered far better than Gemini, it is good enough to make me want to use it. So I went from not being a user. Like I would really, I'd use Gemini 3 Pro on desktop whenever I had a hard question, but I wasn't reaching for it. And Nanobanana Pro really, it really was the killer use case that had me like, oh my God, I need this
Starting point is 00:22:21 quickly accessible and in my pocket at all times because it is so far superior to any other product in the space like that. And I think as Google starts to roll out these products, as this is something that we talk about a lot with Open AI, where they're really good at creating an innovation than wrapping it in a product and selling it, as Google gets better at doing that, I really, I strongly suspect Gemini will continue the trend of taking over broader and broader people. Because when you think about how many monthly active users Google has, it's like gigantic, they're one of the few in the world that actually has more than chat GPT and Open AI. And if they can convert all of these services to pack AI into it into one coherent service and package, that's incredible.
Starting point is 00:23:01 We talk about a lot of times, like with meta, for example, they had their awesome hardware product, but no one really wanted to use it because the ecosystem sucked. Well, Google has like the best ecosystem ever. It's funny, even on my iPhone, I have an iPhone hardware with software from Google because their software is so superior. So as they're able to integrate these top tier models into all the products, this is serious shift. And I'm very bullish on Google. Yeah, I don't think we're going to see this trend reverse. We already know that Apple is going to be used using a Google Gemini-based model in their phone for Siri.
Starting point is 00:23:34 That's right. Yeah. We're going to start seeing a lot of Gemini-based apps just kind of like impair in our regular day-to-day. One other final point is like when you compare like Google versus Open AI, remember Open AI still isn't profitable. Google is massively massively profitable. So they don't need to turn on ads. They don't need to kind of like, like demean the user experience in any way. They could just keep giving you this stuff for free and gaining millions and millions more monthly active users, whereas OpenA at some point is going to turn on ads. And when they turn on ads, it's going to be an inferior performance.
Starting point is 00:24:07 It's going to be an inferior product to some extent. And that might shift to more people using Google, Gemini's products. And Google knows this. So they're willing to just kind of sit back. They own kind of every infrastructure layer. And they're just going to see how things play out. But I think that is it for today's episode. super exciting to kind of like see where Google and
Starting point is 00:24:27 MVIDIA ultimately end up. In my opinion, I think that both companies, to your point earlier, Josh, are going to do extremely well. It's a positive sum game. And the fact of the matter is there is not enough compute. There's not enough energy to feed the compute that both of these companies are pushing out. So I think we're just going to see both these companies
Starting point is 00:24:44 grow into two of the largest and most valuable companies in the world. Now, I need to take a quick victory lap for all listeners here who aren't subscribed. and who haven't rated our show just yet. We released an episode about the bull case for Google. What was it? Two months ago now. And a lot of it has now played out right now.
Starting point is 00:25:03 If you'd invested in Google back then, you would have participated in the hundreds of billions of dollars that their market cap is up relative to Nvidia right now. Now, I'm not going to say that we triggered it. Maybe we did. Maybe we didn't. But if you want to hear more bulk cases like this or more alpha in advance, subscribe to us, rate us.
Starting point is 00:25:19 Give us a thumbs up. Give us feedback. We love it. And we will hear more from you. or you will hear more from us rather on the next episode. See you down.
