Motley Fool Money - Nvidia’s New Chips, with a Side of Valuation

Episode Date: March 19, 2025

Jensen Huang sees a path to $1 trillion in AI infrastructure. Is Wall Street buying it? (00:14) Asit Sharma and Mary Long discuss Nvidia's "Super Bowl of AI," plus:
- The coming generation of chips.
- Increased competition from hyperscalers.
- Partnerships in fast food, autonomous driving, and robotics.

Then, (19:44), a number of Fool analysts answer questions from the listener mailbag about early stock analysis, how healthcare companies are using AI, and how to factor customer experience into investment decisions.

Got a question for the show? Email podcasts@fool.com

Companies/tickers discussed: NVDA, PYPL
Host: Mary Long
Guests: Asit Sharma, Jason Moser, Karl Thiel, Dylan Lewis
Producer: Ricky Mulvey
Engineers: Dan Boyd, Rick Engdahl

Transcript
Starting point is 00:00:00 This episode is brought to you by Indeed. Stop waiting around for the perfect candidate. Instead, use Indeed Sponsored Jobs to find the right people with the right skills fast. It's a simple way to make sure your listing is the first thing candidates see. According to Indeed data, sponsored jobs have four times more applicants than non-sponsored jobs. So go build your dream team today with Indeed. Get a $75 sponsored job credit at Indeed.com slash podcast. Terms and conditions apply.
Starting point is 00:00:32 There's a new chip on the block. You're listening to Motley Fool Money. I'm Mary Long, joined on this fine Wednesday morning with a Mr. Asit Sharma. Asit, thanks for being here. Mary, thank you for having me. Of course, of course. We've got some big news happening after we record, but before the show publishes this
Starting point is 00:00:55 morning, that is, of course, that a Mr. Jerome Powell will be making some announcements later today. We're not going to hit that on today's show. Ricky and Nick Sciple will cover that tomorrow. Instead, Asit, you and I have another big name dropping another announcement, or multiple announcements. That big name is Jensen Huang. Nvidia had its GTC, its GPU Technology Conference. It's running throughout this week.
Starting point is 00:01:18 And yesterday, Mr. Huang delivered the keynote speech. Asit, this is an event that used to be kind of predominantly catered towards academics. And now it's turned into what the New York Times dubs the Super Bowl of AI. We got a number of things coming out of that keynote that you and I will hit on on today's show. But we'll start with this. In late 2026, Nvidia will release its next generation of GPUs. They're calling this generation Vera Rubin. So what do investors need to know about this upcoming generation of chips
Starting point is 00:01:50 and how it's different from perhaps, should we call it its predecessor, the Grace? Mary, I'm going to call it Vera Rubin, just for some fun shift in pronunciation. It's probably Vera, but anyway. So what is Vera Rubin? Vera Rubin is an improvement on the Blackwell architecture. So that's Nvidia's current biggest and baddest GPU complex accelerator technology. And Vera Rubin does improve on grace. You're referring to the CPU, the chip that goes into Blackwell.
Starting point is 00:02:27 Blackwell has a GPU unit and a CPU unit. So Vera Rubin replaces that chip unit, Grace, with something called Vera, which is supposed to have two times the performance. Overall, this GPU system has a lot of compute power that improves on Blackwell. For example, it's got what is called NVLink scaling. This is when GPUs in a system communicate with each other, send information back and forth. That doubles the power over Blackwell, the ability to scale between GPUs via NVLink scaling. It has something called HBM4 memory.
Starting point is 00:03:06 You probably heard us, if you've been listed, to Mary and I or myself and Ricky over the last year or two talk about something called HBM3, which is a type of memory within GPU structures. So this is the latest version of that. And it just has much more compute power, almost 10 times the aggregate compute power of the Blackwell platform. So in many ways, Vera Rubin just represents a really big leap. And then it's going to be followed up married by something called the Vera Rubin Ultra. which is a little bit funny. Okay, I'm digging here a little bit at something that shouldn't be criticized too much.
Starting point is 00:03:49 I mean, it's going to be an advancement over an advancement the next generation, the company's platform. But come on, this reminds me of new and improved from the grocery store when we were growing up, Mary. Like, after a while, how do you communicate that something's even better? not Vera Rubin Plus, but Virerubin Ultra sounds about right. And I want to lay a criticism here on Nvidia, which is a company I admire. The CPU of this Verirubin system, which we called New Grace, they called New Grace, is complemented by the new GPU graphics processing unit in the system, and that's called the CX-9. Again, Nvidia may be running out of inspiration here when you start naming your GPUs after
Starting point is 00:04:33 Mazda SUVs. Awesome. You and I have a tendency to really harp on what's in a name and why a company decides to name a certain project, X, Y, or Z. And so we were talking a bit before the show about it's interesting to see, Nvidia in particular, we'll pick on them right now, to see Nvidia in particular give these names to all these different products, but they can be very hard. I appreciate why they're taking the time to name something, name a GPU, or Rubin,
Starting point is 00:05:02 rather than HBM or a series of numbers and letters, I do appreciate that. But it still can be very hard for the layman to differentiate between all these different names, between Kuda, which is more of a software platform, and the Grace, the Rubin and Blackwell and how these pieces fit together. So thank you for kind of giving us the lay of the land a little bit and giving us an insight into what all these new announcements actually mean. when it comes to what all these new announcements actually mean, some news that came out last week was that meta is planning to train its own AI chips. Amazon and Google are already in the process of
Starting point is 00:05:40 doing this. And this is all these companies, these hyperscalers are training their own AI chips in the hopes that they can ultimately reduce their reliance on Nvidia. So help us understand what Nvidia's head start looks like because they are kind of the go, not kind of, they are the go-to player in this game right now. realistically, what do meta, Amazon, Google, anyone else that's interested in training their own AI chips, what do they actually have to do to build in-house chips that genuinely rival what Nvidia is putting out there? Mary, one thing they have to do is to make simpler chips that have custom purposes. So when you train a model or when you get inference out of that model,
Starting point is 00:06:25 not all of it has to be done on the latest and greatest GPUs. And so to the extent that a company like Amazon with its Traneum chip, it's now in the three series, approaching the three series, a company like meta, which is getting into the game or even Microsoft, which is later to the game with a chip called Maya, what they're trying to do is to cut costs dramatically. So when you or I use a large language model on their cloud platforms, it costs less for us, the end user. We're paying some way. And enterprise businesses are paying direct to these companies. So they have a lot to gain because they really don't need the overkill of awesome Infida GPUs for many of these use cases. And that's why they're investing in this.
Starting point is 00:07:11 They're working with great design companies. They are taking models, sending them over to TSMC, a great foundry, getting back prototypes, and then gradually, I think Amazon is furthest along this race, just putting them into the data centers and showing a cost savings on Amazon's last conference call, CEO Andrew Jassy said, look, we're reducing compute costs by 30% in some cases. Now, Nvidia is looking at this and saying, ha ha, ha, why? Why is Nvidia smugly smirking and laughing? Because we want ever more powerful compute. And the world is moving in the direction of reasoning models where we're asking the models to think in steps, to go out and research on the web, come back.
Starting point is 00:07:55 back to us with answers. And really, the chips that are best designed for that are the GPU slash CPU complexes that Nvidia builds. So it believes it still has a really, really major role to play in this world going forward. I think, you know, maybe that the true story is somewhere in the middle and time is going to tell how it really shakes out. Patrick Moorhead, who's the founder of tech research for more insights and strategy, told the New York Times for an article kind of about the, again, this conference, the Super Bowl of AI, that, quote, the gravy train comes to a screeching halt if cloud companies stop spending, end quote. So that's the gravy train that is currently reaping the benefits of. What can Nvidia do today to freeze some of that gravy
Starting point is 00:08:39 for the future if there is a potential hibernation winter period ahead? Well, one thing they can do is to focus on the replenishment cycles. So even though data centers, may get crimped at the margins, the data center buildouts, you have all these existing data centers that need to be upgraded in terms of their capacity, server capacity, memory capacity, etc., to meet the demands of our compute. Just because AI might become less expensive doesn't mean we're going to stop using it. So, Nvidia can really think forward in terms of where they can start going to these customers and working on upgrades to existing data centers.
Starting point is 00:09:21 if they pull that new build out. And I don't think that'll necessarily happen as sort of this cliff moment, which some project. If you look back before what happened with generative AI, we had a really slow, steady, but almost inexorable march to build data centers just to handle the migration of enterprises taking their stuff off of premise into cloud.
Starting point is 00:09:45 And those percentages, if you look at them, they still have a long way to go. Even that, people have sort of forgotten. This movement is still, in motion. So over time, I think what we're going to see is, yeah, there could be some valleys, some unexpected slowdowns in data center builds, but the direction of the future, until we can make these things much more efficient, both from a consumption standpoint and an energy standpoint, is to keep building. And Nvidia stands to gain from that.
Starting point is 00:10:12 Oh, and the direction of the future that Jensen Huang sees is a trillion dollars in AI infrastructure buildout. That was a number that really stuck out to me. He mentioned. that the company has already built out about $150 billion of AI infrastructure, but again, that he sees a roadmap to a trillion dollars sooner rather than later. What did you make of that number, that statement, that vague timeline? Not much. It's not too different from a number that he dropped like three years ago. It's just that no one could see that future. He was talking about hundreds of billions within a few year periods three years ago.
Starting point is 00:10:50 And it just seemed like, well, you know, I guess if you're talking about extremes and everything falls into place and everyone would have to be using this stuff, right? I mean, everyone would have to be using Chad GPT for that to happen. I guess so, Jensen. So if we go back and extrapolate his original projections, guess what? Right on the mark a trillion bucks. So that's interesting context too, because, you know, we've seen Nvidia stock go down about 15%. That's what. far this year. It was down over 3% yesterday alone in spite of all these big advancements that Jensen Huang is touting. Invidia's peg ratio today is just under 0.3. So despite all of the advancements that they're rolling out, that they're talking about and hyping up, the market seems to think that this company's growth is going to stall out. Did you hear or see anything during
Starting point is 00:11:44 yesterday's keynote kind of throughout the span of this conference that made you think, oh, maybe the markets got it right? Or do you think this is another situation where we're kind of where we were three years ago where Jensen Huang is saying, this is the future? And folks just aren't buying it because they can't see it. I think we should bring a lot of skepticism to bear to Nvidia's growth ethos, just because now everyone wants to disrupt them on every level. And companies that didn't have to compete with Nvidia yesterday, let's look at ERISA networks in the networking space, suddenly find it as a competitor and they have to shift their models. Arista Network is an amazing company, by the way.
Starting point is 00:12:21 So it's not like they're going to completely reinvent their business, but now they're paying attention to what could be some competition for them. So I think when you think about this, you think about companies that are trying to figure out ways to make compute more efficient. You have small competitors that are questioning, even the way we build GPUs. I think we really have to be skeptical on the growth thesis. But I mean, you bring up something interesting, Mary.
Starting point is 00:12:46 What you're basically saying is, hey, you know, that PEG ratio, what if that means that it really is undervalued in relation to its potential to earn out in the future? And yeah, I saw lots of stuff in that presentation that told me this could happen. Just take one thing. So you mentioned Kuda, which is a collection of software acceleration libraries. We've talked about this before, Marion, and I've pointed out that, look, Kuda is lots of different libraries.
Starting point is 00:13:15 There's one for high precision math. There's one for aeronautics. There's almost one for every really cool use case you can think of across so many industries. And yet we see another one that was introduced yesterday, which is a version of something, those who know the language, Python programming language Python, are very familiar with. There's a library called Pandas and it has to do with something called data frames. It's essentially how we manipulate data. What Nvidia is doing now is taking the constant.
Starting point is 00:13:47 behind data frames and extrapoling that into massively parallel computing use cases, which just points to a future in which the assumptions we had about how we use big data and how fast we can wreak stuff out of it, maybe those assumptions are off because if we take the same concepts from Python, but throw those into these great acceleration libraries which are running on GPU sets that are much more powerful than what is on the ground to day, then it means that lots of fun things that we do with computation today we can do in a much shorter time and there will be potentially some very interesting advancements in science, in big data, et cetera.
Starting point is 00:14:31 That's just one little thing I saw that indicated to me that they keep innovating and creating a future use for their technology. So I wouldn't bet against the company. I think right now it's sort of like be skeptical, be very skeptical, but don't discount the possibility that three to five years from today, you and I could be talking about some ho-hum, amazing double-digit growth rate that Nvidia holds. One of the central themes of yesterday's keynote was that AI has use cases in every industry and that it's Nvidia that's really laying that foundation across all those industries.
Starting point is 00:15:08 So as a part of that thesis, we have this many announcements about partnerships between Nvidia and different companies. So I'm going to call out three different partnerships that stuck out to me, and then you let me know which one's most interesting to you, and we can kind of go from there and give a few more details on it. So we've got GM and Nvidia teaming up to build out GM's fleet of fully self-driving vehicles. We've got Walt Disney, Google's DeepMind and Nvidia, building a platform that will, quote, supercharge humanoid robot development. We've also got Nvidia partnering with Yum Brands. That's the parent company of Taco Bell, KFC, and Pizza Hut, and they're going to roll out AI order-taking. Which one of these do you want to take, Osset? Let's do them lightning
Starting point is 00:15:52 round, all three, because they're all fun. Let's start with the drive-through. Okay, so we'll start with the drive-through. First of all, my question here is, do you want AI ordering? I've seen a ton of fast food companies roll this type of stuff out. McDonald's, and then also roll it back. McDonald's this past summer ended their partnership with IBM when it came to AI ordering, I think that this idea of efficiency at all costs at the drive-thru, it does, I understand why companies are pursuing that. But it also runs counter to a story that we've seen play out at Starbucks where, hey, you lean too hard into efficiency, and that actually pushes some consumers away. So what do you think the future of this,
Starting point is 00:16:30 this Yum brand's Nvidia partnership could look like? The AI is getting better and better at the end of the day. What I want is really the human interaction. But you know what I'll take, Nick Mary is an AI that's smart enough to get something wrong in the order. So I have to correct them and just fool me. And you wanted large fries with that? No, ma'am. I said medium fries. Right? That's the AI interjecting something to make me think it's human.
Starting point is 00:16:58 So I'll take it in that instance. So then we've also got this self-driving feature. What's interesting to me is that GM has struggled in its attempt to build out fully autonomous vehicles. Last year, they pulled funding for its cruise, Robotaxi company. Were you at all surprised to see GM seemingly giving full self-driving another go? I guess I was, but obviously I wasn't paying enough attention because GM sort of signaled to this. We talked about it, Mary, like we're not going to focus on trying to have a full fleet of autonomous vehicles.
Starting point is 00:17:26 We're going to focus on the software side, getting that into our vehicles, working with simulations so that our software is second to none, the software that's in the vehicle for your driving experience. So it's assisted driver technology. That's interesting. Okay. Last but not least, we've got this robot situation. I mentioned a partnership of three between Walt Disney, Google's Deep Mind, and Invidia. I get Google. I get Invidia.
Starting point is 00:17:54 Walk me through how Walt Disney fits into that trio. So Walt Disney actually has a long history with robotics. Think the animatronics at their theme parks, they're very good at robotic motion and making it sort of human-like, they also have research labs out near Burbank, near their studios, which have been working for a long time on robotics and the science, the physics of how these robots work and the expression of that. And then you think of their animation and motion expertise that divisions like Pixar bring. And you actually have a company that makes sense when you put that puzzle together with those
Starting point is 00:18:34 gurus from deep mind and invidia. just very strong research into the robotics field and its forte in simulation and virtualization, I actually was surprised but thought to myself that that makes sense. We got Quantum Day coming up tomorrow. I know you said that you're really excited for that. Anything in particular that you'll be keeping an eye or ear out for? And Nvidia already said that it's working on some Kuda libraries for Quantum. And there are some really interesting problems that quantum computing needs to solve before it becomes big time.
Starting point is 00:19:11 One of those is correcting for errors in the computation. So without getting to any weeds because we haven't got time, I'd be very interested to hear about how their software might help on the quantum level, cut down the error rates when we ask these particles to do their thing and measure them and try to get a mathematical result out of that. So that's what I'll be looking for tomorrow. Always a pleasure. Thanks so much for talking in video with me today and for breaking down these often complex topics. Hot so fun, Mary. Thank you so much.
Starting point is 00:19:48 You got questions. We got answers. Up next, we continue with Mailbag Week and turn to a number of Motley Fool analysts to answer your questions about fundamental analysis, AI and healthcare portfolio management, and how customer service experiences ought to affect your investment decisions. What does leadership really look like? On the power of advice, a new podcast series from Capital, group. You'll hear from athletes, entrepreneurs, and executives who've led on the field, in the boardroom, and in their communities. It's not about titles. It's about impact. Discover what drives them and the advice they carry forward. Subscribe and start listening today. Published by Capital Client Group, Inc. Every now and then, we'd like to turn to our listener mailbag to see what kind of questions are on your mind. We noticed that there were a lot of questions that were roughly about how to get started investing. So, for
Starting point is 00:20:46 today's show, we rounded up a few analysts to answer your questions that are geared towards the beginner or intermediate investor. To kick us off, we got listener Cody King, who wrote in, fools, for new investors, what are some fundamentals to look at when considering investing in a stock? Is it PE ratio? Most recent quarter's earnings? News headlines? Are any metrics overreated? What is the key info to dive into to determine if a stock is a winning investment? For the answer, we turned to none other than fool analyst, Asset Sharma. Cody, I love your question. For me, there are two things that go on when I'm thinking about stocks or selecting stocks. One is to sort of screen for new ideas or turnover stones,
Starting point is 00:21:28 if you will. And in that sense, different metrics like the PE ratio can be very useful in just comparing companies and seeing what might be a really high PE. Sometimes that's a signal for a good stock because you're looking at a company that must be growing its revenues or has the potential to do so. But let's take a step back. Because your question really asked, what is the key info to dive into to determine if a stock is a winning investment? And that's quite a different thing. For me, the most important thing is the story, the narrative behind that company. If you can grasp that, then you can put an overlay of everything else. You can crunch the financials. You can look at ratios. You can compare it to other companies. One of the places
Starting point is 00:22:11 that I like to start is the MD&A section of a quarterly or annual report. You can go to Edgar. This is SEC.gov and search these reports. They come labeled in varieties of 10K for annual and 10Q for quarter. When you go to this section, what you're looking at is a required explanation of what a company does and how its recent past has fallen out. It's a required disclosure from the SEC. see it's part of regular reporting. And that's a great place to understand exactly what management thinks about its company, how it presents its products or services, where it sees the business going, and it discusses, that section discusses recent results. I like to start from there. And if I can grasp that and really feel that I'm starting to understand what a company does
Starting point is 00:22:59 and what its potential might be, then the other things that you mentioned, like listening to earnings transcripts, looking at the most recent quarter's earnings, think. Thinking about the news headlines, those are really key at that point to getting a beat on whether this is a company you want to put into your portfolio or take a pass on. We talk about a lot of stocks on the show. Some of them are genuine investment ideas. Others are just public companies that we find interesting or newsworthy, but don't necessarily think of as strong stock picks.
Starting point is 00:23:30 One listener, Sumit Maru, asked about how listeners ought to think about what we call radar stocks. They write, I listen to the show religiously every day. day. One of the questions I have is about the radar stocks. Every Friday, the team picks two radar stocks. I love what JMO, Maddie A, and Emily bring to the section, but I'm not sure what everyday investors like me should do with those picks. For the answer, we turn to the host of our Friday show, Dylan Lewis. I'm glad someone asked, because I love the radar stock segment, and I think sometimes it does go on without explanation, Mary. So for me, I think the radar stock segment puts us in a position to talk a little bit about stuff that is going on in the market that our
Starting point is 00:24:07 analysts really want to discuss or feel like we'll be popping up in the news at some point in the next week or two. Very often, you'll have our analysts previewing a company that's reporting earnings. Sometimes they'll be talking about a company that they're not particularly excited about, but they feel like it's something that they should bring up for listeners of the show if it's a company that's in their portfolio. And so I joke sometimes that people bring radar stocks on for good reasons because they're excited and they're following this business. Maybe it's a future watch list stock for them. Sometimes it's a radar stock for a bad reason, and it's more of moment for us to give people a check and make sure they're paying attention to some of the things
Starting point is 00:24:41 that could affect companies in their portfolio. Speaking of investment ideas you get from the news, it's no secret that excitement about artificial intelligence is everywhere. Dana in Cincinnati wonders how healthcare companies in particular are using AI and whether she might find any intriguing stock picks herself from looking at the intersection of those interests. In a multitude of ways, and that's probably why it's something we can talk about at greater length. But just to pick an interesting area has to do with how you design drugs. And some people may know that the Nobel Prize in chemistry in 2024 went to the developers of AlphaFold, which is a program that can predict protein structure just from the amino acid sequence. That's actually not a brand new thing, but using some AI tools,
Starting point is 00:25:28 they were able to make it better than it's been in the past. And the upshot of that is that you can actually visualize in three dimensions what target you're designing a drug against with a lot more ease than you could do in the past. Once you get to that point, you can start to use digital tools to help make molecules that sort of fit like a hand in glove into those targets that you're looking for. So these are all approaches that have been used for years. It's often called rational drug design. It's something that most companies do to some extent, but you're seeing it step up in its complexity. And so it remains to be seen how much this bends the curve when it comes to speeding anything up or making anything better than we have in the past. But you're seeing some interesting
Starting point is 00:26:17 programs sort of advance into clinical trials. And particularly interesting are companies that are going after targets that were once considered, quote unquote, undruggable. So a target where the pocket you're trying to bind into is very hard to fit without making basically a key that opens everything. If you make a key that opens everything, that's called having a ton of toxicity, right? You need it to bind to just that target. And so if you can do that and show that some of these work, I think you're really going to see even more emphasis on this area. You can learn a lot about companies not just through the news, but also through your own experience with them as a consumer. Yuan Lu wrote in with a question for full analyst Jason Moser.
Starting point is 00:27:00 which will paraphrase here. I know you like PayPal and want to ask, why is its customer service aggressively mediocre? It used to be that you could just pick up the phone and talk to someone or leave a private message on a message board and someone would get back to you. In the past few years, however, all I can do is initiate a chat with absolutely no ticket ID, no cue name, nothing to allow the next agent who happens to be on the thread to understand what my question was or what information has already been exchanged.
Starting point is 00:27:27 How much weight do I put on this experience when it comes to making a decision about buying or selling any kind of stock? So I think that's a good question. And one of the reasons is because there's not one definitive answer. This is something that each investor kind of has to answer for themselves. And so, for example, like I'll say one of the reasons why I like PayPal is just because it always works. I honestly don't think I've ever had to actually contact PayPal customer service. So I think based on the question I'm considering myself lucky at this point, because I'm not. because that stinks when you get bad customer service. And so I suspect if I had to interact with, you know, a company and continually got bad customer service,
Starting point is 00:28:07 that would absolutely make me question whether or not this was a business that I really ultimately wanted to own. Now, you see these stories play out all over the market. I think Comcast stands out in an example as a company that really owns their space in a lot of ways. I mean, certainly is one of just a handful of very big players in what it does. and I think perpetually, they've had a reputation for just awful customer service. Now, again, not a Comcast subscriber. We don't have it here. So I don't use it.
Starting point is 00:28:34 I don't have that experience. But based on everything I've heard, that starts to make you wonder, well, is this a company I feel really, you know, do I feel good about owning this? And so I think for every investor, they have to be able to weigh that a little bit in saying, well, is this a place where a company could improve versus are there a lot of positive qualities that this company already has, right? None of these investments is perfect. And I think that's something always to remember is that none of these investments is perfect.
Starting point is 00:28:59 We try to identify the areas of weakness where they can improve, try to identify the areas of strength. And then you kind of have to weigh those against each other. And I guess for me, with PayPal, I mean, I still own shares in the company and I've owned them for many, many years. It just strikes me as one of the companies that's really leading the way in this digital movement of money. Are there things that they could be doing better aside from customer service? Absolutely. And I'm hopeful that new leadership here in Alex Chris is really. spearheading that. He seems like he has some really neat strategies. And I think the market is starting
Starting point is 00:29:29 to take note of that. But ultimately, yes, this is a question that each investor has to answer for themselves. I think looking at that customer experience, that's something that you definitely need. Wait, that's a valid concern that every investor should take into consideration. Once you've found the stocks you want to buy, you'll have to figure out how big a role you want each of those positions to play in your portfolio. Over on X, Jorso asked us, for a beginner, Middle-experience investor, are you better off buying small shares of multiple companies or saving up and buying a larger share amount in fewer companies? For the answer, we look to Osset again. Jor, I think for beginner, middle, and even advanced or very experienced investors, we're
Starting point is 00:30:10 all better off buying small shares of multiple companies at the start of the game. Now, yes, I'm being partly facetious here because if you are an extremely experienced investor, you may have this actually already in practice, but in a different way because you're building up a lot of conviction and then putting your money in and putting serious capital to work. But the principle is that we are taking small stakes at the beginning to learn about businesses as they grow, and we're going to keep adding money to the businesses we come to understand better and that keep performing as time goes on. That's a really great way to make money. It's actually probably a better probabilistic strategy versus identifying at the outset a few companies where we're going to allocate a lot of capital to.
Starting point is 00:30:54 That implies that you've got a really good edge on the house in this game. And Ricky Mulvey and I had a great conversation, likening, investing a bit to probability and playing cards. For most investors, the opposite is the strategy to use, which is you're going to find out over time where you're going to concentrate your capital. Don't do it at the beginning. And I will say over time, even though Warren Buffett is a certain. associated with taking big and concentrated bets. We've seen him in some businesses scale up over time. Of course, his billions have a lot more impact than maybe our hundreds or thousands.
Starting point is 00:31:31 Nonetheless, this is a really fun strategy to employ because it allows you also to learn about a lot of companies and to extend your investing chop. So I'm all for going with more companies, smaller positions, taking your time, and then concentrating further investments on those winners and also adding new ideas to your portfolio as the world changes. We love getting listener questions. So if you've got a question that you'd like to hear an answer to, write to us on X or shoot us an email at Podcast at Fool.com. That's Podcasts with an S at Fool.com.
Starting point is 00:32:05 We'll be sharing a few more of these with you on tomorrow's show. See you then, fools. As always, people on the program may have interest in the stocks they talk about and the Motley Fool may have formal recommendations for or against. so don't buy ourselves stocks based solely on what you hear. For the Motley Fool team, I'm Mary Long. Thanks for listening, Fools. We'll see you tomorrow.
