Closing Bell: The Next Wave of AI 6/10/25
Episode Date: June 10, 2025. We take you inside the fast-moving world of AI. We discuss who is winning the race and – perhaps most importantly – where the next big investments are being made for tomorrow. You'll meet the people, the companies and the technologies on the cutting edge – as well as innovations that will transform the way we will live, work and interact with each other. We have exclusive interviews with AWS CEO Matt Garman, Anthropic angel investor Anjney Midha, Notable Capital's Jeff Richards and tech investor Ankur Crawford from Alger.
Transcript
All right, guys, thanks so much. Welcome to a special closing bell. I'm Scott
Wapner live from San Francisco. Over the next hour, we're going to take you inside
the fast-moving world of AI: who's winning the race today and, perhaps most
importantly, where the next big investments are being made for tomorrow.
You'll meet the people, the companies, the technologies on the cutting edge
innovations that will transform the way we'll live, work, and interact with each other in the years ahead.
To help us tell that story today, we have exclusive interviews with AWS CEO Matt Garman,
Anthropic angel investor Anjney Midha, VC Jeff Richards, and tech investor Ankur Crawford
on where she's placing her biggest bets today.
It's great to have you with us. The next wave begins now with our Deirdre Bosa here
at One Market.
Thanks for being with us today.
Well, I love that you're doing this show, Scott.
So I had to dig deep and figure out
some of the buzziest stuff that maybe people on Wall Street
hadn't necessarily heard of, but is big here.
So one of the biggest shifts in AI that I'm seeing this year
and constantly talking about with founders and CEOs
in the Bay Area
is the shift from the foundation layer to the app layer.
It is no longer about who is building the smartest model,
it's about who's turning it into a business.
So take AI-first coding, also known as vibe coding,
the idea that anyone can build software
just by describing or prompting what they want,
including my nine-year-old who's also taking a course here
and creating stuff.
Now you have startups like Cursor that are building
for professional developers who want to move faster.
Then there's Replit, another one that's aimed at users
with no coding experience.
And Windsurf was just acquired by OpenAI.
Now even the Megacaps, Google and Microsoft,
they both say that AI now writes about 30%
of their total code.
So it's happening at every layer here at every level.
For investors, it could signal a few things,
smaller engineering teams, faster product cycles,
better margins.
Now, another trend that's sweeping Silicon Valley
is next level AI video generation.
Specifically, Google Gemini put out a new version
of its video product called Veo 3,
and users are creating incredible synthetic video with it
from, maybe some of you have seen this,
Moses livestreaming the Exodus to Stormtrooper vlogs
that are going absolutely viral on social media.
So Scott, first it was text, then it was images,
and now video is the new battleground
as well as AI first coding.
I love the way you laid this out.
The idea of anyone anywhere can build.
The idea that AI has democratized innovation itself,
that you don't have to be a hyperscaler
or one of the largest tech companies in the world
to scale up your business.
You can be small and scrappy and a startup
and scale quickly if your technology is good.
Right, and that's true on the coding front
and that's true on the video front.
And I think you're gonna keep seeing this
in different industries, right?
The idea that AI makes things accessible
to so many different people and groups of people.
From, I mentioned it, from my nine-year-old,
he can now do an app, but also the megacaps,
Google and Microsoft, they're supercharging
their engineers, right, with this AI-first coding.
They're becoming more efficient, they're not hiring as much,
engineers can focus on different things.
So across the stack, I think that's happening.
And what's been surprising too,
is that a lot of the startups are able to do this
because they're proving nimble and they can move quickly.
They don't have these trillion-dollar businesses that they have to worry about disrupting, and
that's why I mentioned some of those companies. A lot of them are in the
private markets right now, like Cursor, which just raised a huge amount of money,
or Glean, who I just spoke to, which also raised a lot of money. These aren't
companies you'd think have the same kind of distribution-network advantages
that the incumbents have, and yet they're finding, you know, really significant traction.
That was a great look for us. Deirdre Bosa, thank you.
Deirdre says disrupting, it's a perfect segue to just that because just today
CNBC released its newest Disruptor 50 list, the companies leading that next wave,
everyone from Anduril to Apptronik to Anthropic. For the full list, head over to cnbc.com/disruptors.
Our next guest was an angel investor in Anthropic. Today he's on the hunt for
the next big thing in AI. Anjney Midha is a general partner at Andreessen Horowitz. He's with us here
at One Market for a CNBC exclusive. Anj, it's great to have you. Thanks for being here. Thanks for having
me. So we'll get to the next wave and all of that,
but I wanna start where you really started,
and that is as an angel investor in Anthropic.
Can you take me back to that period in which you call it
one of the hardest fundraisers you ever had to do?
Yes, I can look back on it with some wistfulness now
in a good way because I think things have gone well.
I'd say so.
For them, for the industry in general,
but this was around the end of 2020
when Dario and Tom gave me a call,
they'd been longtime friends,
and said, look, we've trained this little model
called GPT-3, we think it's pretty cool,
and we think there's an opportunity to transform humanity
if we can continue scaling up the training of these models,
if we can combine compute, data, and algorithms
in the right combination, we think we can sort of
predictably keep improving the performance of these models.
And so I said, okay, that sounds ambitious.
What do you need to get started?
And Dario said, look, I think we can get by with five.
And I said, okay, I'll wire over the five million next week.
And he said, no, I don't think you understand,
Anj, I'm talking about 500 million.
And I said, okay, that's gonna take a minute.
Just walk me through why.
And so we spent these sort of weeks
in the early months of 2021 on working sessions,
really trying to understand the capital requirement
that made the seed round so large.
And that was my crash course to the idea
of scaling laws in AI.
And I made 22 introductions for them up and down Sand Hill Road.
I thought the job of an investor, especially
an early stage investor, is to take a bet on a big, bold
vision before the rest of the world realizes it.
And I thought it would be a slam dunk.
And they came back with 21 nos.
And I think that was a wake-up call for me.
All the VCs that you went to on Sand Hill Road,
which is the incubation center of startups out in Silicon Valley
in many respects, and they all passed.
Basically, that's right. Yeah.
It used to be the incubation center.
I think there was a time in the early days of the venture capital industry,
if you look at the greatest names like Arthur Rock or Tom Perkins, you
know, these guys were often sort of shoulder to shoulder with scientists, you know, like
Herb Boyer, who was working on, you know, gene editing at UCSF.
And they would work with them on shaping the right business model to sort of take the research
out of the lab and impact millions of people.
And I'm not sure most venture capitalists know even how to do that anymore.
And so that was a wake-up call for me where I realized we probably need to reset the venture
industry because most investors look at a business plan like the Anthropic Plan, which
said we're going to need hundreds of millions of dollars to train our model, and if we succeed,
we'll transform humanity.
And instead of going, wow, that could change the world,
I'm in, they went, ugh,
that's not quite software as a service.
Interesting.
I'm more of a SaaS investor, so not for me.
What do you make of what's become of Anthropic?
Obviously, the billions of dollars in investment
they got from Amazon and this hypercompetitive landscape that they've now found themselves in?
Well, I think there's two frontiers in AI, right?
There's the capabilities frontier and then there's the efficiency frontier. Usually the capabilities frontier comes first.
It's often dominated by teams that have a pretty
sort of clear point of view on how these products will be used or new technology will be used by the most sophisticated users.
These are often consumers and developers.
And then there's the enterprise. These are the governments of the world, the mission critical industries of the world, defense, healthcare, financial services,
banks, the, you know, top Fortune 500, and they often need a little bit more hand-holding to figure out how to consume the technology.
They often want cheaper, faster, more control.
They often turn to open source.
And so I think what's happened with Anthropic and OpenAI is they've done an extraordinary
job pioneering the capabilities frontier.
And you're seeing that in their revenues.
You're seeing that in their adoption, whether it's an application like ChatGPT or a coding
model like Claude 4.
And then you have the world's enterprises going, this is really great from a proof of
concept sort of prototyping perspective.
And when it starts coming time to push that to production, they often need to consume
that technology in a different way.
They often turn to open source for that.
They often turn to partners who can sort of abstract away the complexity of deployment.
And that is, I think, being pioneered by companies like Mistral, right, which was started by the team that created Llama,
literally the first open-source language model
that was comparable to the closed-source alternatives.
Which you're an investor in as well.
That's right, I'm on the board of Mistral.
We led the Series A at Andreessen Horowitz a couple years ago.
They just launched their new reasoning model today
called Magistral.
That's just today that this happened.
Like six hours ago.
And that's a frontier open source reasoning model, right?
Which is the idea that these models can now think
for longer and longer before coming to an answer for you
that is three times faster and smaller
than the previous comparable model in that family.
And so you're seeing sort of the efficiency frontier
being pioneered by one kind of company like Mistral
and the capabilities frontier being pioneered by companies like Anthropic.
When you talk about Mistral and that technology, you're also, aside from being a very astute
investor in this field, you're trying to be a big thinker about how all of this is going
to change the way we live our lives.
If machines are going to think for us and learn for us and drive for us
and interact with us and guide us, how does that impact what our purpose is
going to end up being in life if we're reliant on machines to do all of what we
have always prided ourselves on doing ourselves? It's a tough one, and I
think it's going to be choppy.
I'm an optimist, so I believe that,
like with every previous fundamental technical shift,
humanity will be moved forward by this new AI technology.
I think the long term is going to be abundance.
It's going to be new use cases, new productivity growth,
new ways for us to connect with ourselves, our families.
That's the arc of technological progress.
But the transitions can be painful.
And I think it's going to be hard for a lot of people
who don't embrace the technology,
who don't learn how to test the technology for themselves,
to understand how to build a relationship
with the technology and instead view it with fear.
It's going to be tough.
You're thinking a lot about usable AI.
As the theme of our program today is the next wave,
you think that's going to be at the forefront:
taking AI and making it usable.
It's like a company you're invested in called Sesame.
Right.
Are we literally going to be walking around
sooner than we think, wearing glasses
that are going to sort of dictate how we observe the world
and how we interact with it?
That's generally the history of the computing industry, right?
Usually you have a fundamental breakthrough
in the software, or the software platform,
that allows the hardware interface to change
and become more natural to use.
And so when we had the idea of mainframes and PCs
and then smartphones and now ultimately
something wearable like glasses,
that went in lockstep with the development of
the operating system and then to a multi-touch interface on a smartphone. And then now with AI,
the idea is that you will just be able to talk to a computer instead of having to do many of the
things we're forced to do today by having to pull out a screen, tap it, and be much more
declarative about what you want to do with the computer.
Instead, you can just lean back and be imperative.
Just talk to it about your goals.
And if it's smart enough and it has the context
about your life, because it's a pair of glasses
that can see what you're seeing, hear what you're hearing,
know who you're with, and is fully privacy preserving,
you will turn to it for many of the tasks
that you today have to lean on
on a traditional computing interface.
Instead, I think you'll just talk to a companion.
How do we also think about competition?
And I don't mean competition among the hyperscalers
or the biggest innovators here.
I mean between nation states,
the United States versus China,
state funded versus private enterprise
sort of leading the charges on either front.
Who's gonna win? Who's ahead today?
And how do you see that developing? Yeah, so I think if you look at the core ingredients
of competing at the frontier of AI, there's basically three or four, right? The recipe is quite known
at this point. It's compute, it's data, it's talent, algorithmic talent, and it's good,
forward-thinking regulation. And when it comes to compute, there really have only been two regions so far
that I would call hypercenters.
These are regions that have the compute infrastructure
to train, host, run the inference
on their own frontier models,
and that's been the United States and China, right?
Now, the question for a number of other countries,
especially in the geopolitical environment
we're in right now is,
do I build, buy, or partner with a hypercenter?
Do I build out my own hypercenter infrastructure locally?
Do I buy it from somebody who I'm allied with
because we share the same democratic values
or some other value system?
Or do I just sit out the race?
And I think there's a few regions that have decided
they do wanna compete at the frontier as hypercenters, right?
You have the Middle East, the Kingdom of Saudi Arabia,
you have the UAE, you certainly have Europe,
you have Japan, India, Singapore,
these are the hypercenters
now building out their local infrastructure,
what they're calling sovereign AI capacity,
where the goal is to have infrastructure independence,
control over their own destiny.
And I think this is probably the single biggest
sort of geopolitical story of our times, right:
governments now see AI as a core piece of national
infrastructure, a core national capability, and some of them are realizing they want full
control over that stack. And so they're turning to open source partners like Mistral, for example,
in Europe, to build out that stack entirely locally, whereas other countries are partnering
with China. And I think our job here in America is to make sure,
kind of like we did in post-World War II
with the Marshall Plan, that our allies come along with us
and we build an ecosystem, a global ecosystem
that's built on a stack that reflects our values.
You've made the point as well
that this isn't an optional game.
Right. We have to win it.
That's right.
I mean, what we saw with the release of DeepSeek
earlier this year, right,
was even though there were a ton of experts going up in front of Congress a year ago saying
very confidently, oh, you know, the U.S. is six years ahead of China. They're so far behind
us in AI. And we should talk about slowing down the progress here until we can figure
out exactly how to manage all the risks. And what DeepSeek came out and said was, with or
without you, we're rolling forward. And we're gonna open source the best frontier models
and we're gonna ship them to the rest of the world.
And so I think that changed the calculus completely
where this race is happening with or without us.
Look, it was an extraordinary piece of engineering
that DeepSeek put out from China.
What it also showed was that our adversaries
aren't waiting around for us to get our act together.
And so one way or the other,
we don't have a choice but to win.
It's been great catching up with you
and hearing your insights.
Thanks so much for spending time with us.
Thanks for having me.
That's Anjney Midha joining us right here at One Market.
We are getting some breaking news on the trade front.
For that, we go to Megan Cassella,
and she has that for us.
Speaking of the United States and China, Megan.
Hey, Scott, that's right.
So talks now in London are restarting
for the third time today.
Chinese officials arriving back to the Lancaster House in London to meet with those U.S. officials
after they took a dinner break earlier today.
And when they arrived, Commerce Secretary Howard Lutnick spoke on the ground to reporters
there giving just a little bit of a readout and a sense of how things are going.
He said talks so far are going, quote, really, really well, and that they are trying
to finish things today. He says he thinks that talks will end tonight. He hopes they end tonight,
but that if need be they could continue into tomorrow. He also said the teams have their heads
down right now. So if talks do continue into tomorrow, Scott, one big question is whether Scott
Bessent would continue to be on the ground in London. He's scheduled to be on Capitol Hill tomorrow
for testimony. It was one of the reasons we thought there was a hard stop
tonight and that he would be back in Washington tomorrow for that. There is
some question, though, that Lutnick and the trade representative Jamieson Greer
could continue on even if Bessent does have to return home. So we have queries
out to the White House and Treasury Department about that, about what
tomorrow would look like if they get to that point.
And just to clarify one headline from earlier, Scott, earlier in the day there had been a
headline, a report that talks had wrapped for the day much earlier, several hours ago
now.
I was told at the time that that was false.
A little bit after that, these negotiators did take their dinner break.
But now that it's just about 8 p.m. in London, they are returning now for their third round
of talks even today after that full day of talks yesterday.
Scott.
I appreciate your reporting and the update.
We'll follow the markets over this final stretch here.
Megan, thank you.
Megan Cassella on the North Lawn of the White House.
We're just getting started on this special edition of Closing Bell today.
Up next, AWS CEO Matt Garman joins me exclusively.
How Amazon is looking to own the next wave of AI.
We're live today from One Market in San Francisco and you're watching Closing Bell on CNBC.
We're back on this special edition of Closing Bell live from One Market today in San Francisco.
Amazon among the hyperscalers thinking about the next wave of innovation in AI.
The company has pumped billions of dollars into Anthropic, making Amazon Web Services
its primary cloud and training partner.
Joining me now in another Closing Bell exclusive today, the man leading that effort.
Matt Garman is AWS CEO.
Welcome to our special program.
It's great to see you.
Awesome, thanks for having me.
I'd like to begin with news that's recent for AWS
and that's your data center investments.
$20 billion this week in Pennsylvania,
$10 billion last week in North Carolina,
billions more globally.
If we're at peak data center, Matt,
this would suggest that you don't think so.
Look, we think that there's a ton of business ahead.
As we look at cloud adoption,
we're still in the very earliest stages.
And with all of the excitement and potential
of generative AI and agentic AI,
we think that there's lots of opportunity for us
to expand our infrastructure globally.
And so, as you mentioned,
we've announced several large investments
here in the United States,
both in Pennsylvania and North Carolina.
Very excited about those, as well as internationally.
We announced just last month
a large investment with Humain in Saudi Arabia
to launch an AI zone there in the kingdom.
All around the world, we think that AI has the chance to really transform companies and businesses and, really, industries.
And we're incredibly excited to be powering that revolution.
How do we know we're not overspending and overbuilding?
Well, our view is that the potential out there is massive.
We think about this investment very carefully.
We look and forecast out many years of what our customers are going to need and what that
growth is going to look like.
We do a lot of risk analysis and careful analysis of what we think we're going to need.
We feel very good about the investments that we're making.
We feel that those are going to be important, but that's part of what we do for customers.
As we think about that, we plan many years in advance
so that we ensure that when customers
do need that capability, it's ready for them.
And that's part of what we offer them in the cloud.
So that customers don't have to worry about that.
They can trust AWS to scale for that need on their behalf.
At the same time, there are innovations and advancements
that happen that are unforeseen.
I was just speaking with my prior guest about what DeepSeek did and how it maybe re-informed
or reset the conversation around computing power and the cost thereof.
Are we still thinking about it relative to what DeepSeek did?
What was your biggest takeaway from that as you obviously continue to spend where you
see fit?
Yeah.
Well, we support DeepSeek actually.
We have a service in AWS called Bedrock where we make a wide variety of AI models available
for our customers to use.
And we're actually the first to enable DeepSeek on Bedrock together with Anthropic and a number
of other models that we support, including our own models called Amazon Nova.
And so when we look at those costs,
and one of the things that DeepSeek, I think,
helped people realize is that the cost of inference
is really coming down a lot.
That's really where most of the usage
is gonna come in the future.
Not from training these models,
where there'll be several providers
that do do those large training clusters,
but it really is gonna be in this usage
where every single enterprise out there
is gonna be using inference,
and they're gonna be using these AI models
to get more efficiency in their workplace,
they're gonna deliver new customer experiences,
they're gonna completely transform how they do work
and uncover new innovations.
And as you think about today, that cost is still high.
Even with DeepSeek showing that they could lower that cost,
it is still too high for many customers
and many use cases.
But as we see the cost of inference coming down over time,
we actually think that the usage is gonna go up
multiple times more than that.
So we expect over the next year or two,
we'll actually see another 10X decrease
in the cost of inference,
but we expect multiple orders of magnitude growth
in the amount of inference that's done,
which is why we're building all of this,
investing in this infrastructure,
so that we can help customers take advantage
of those capabilities as they roll out.
You mentioned your Bedrock platform.
The headlines would suggest that Amazon,
and this is the way they're writing about it,
that Amazon wants to become a global marketplace for AI.
You have your global marketplace business, obviously, but that you want to repeat that,
in a sense, as it relates to AI. You're model-agnostic, more of an aggregator than an innovator,
is how they would describe it, with Bedrock letting
customers choose between literally a hundred different large language models. Yeah, I would say that we're both an
innovator and an aggregator. I think customers want that choice, and so they
don't only want to use first-party products; they want to use the whole
suite of offerings that are available out there in the world. Today we're here
working with a number of startups that are building some fantastic new
capabilities.
Many of them are building new models,
many of them are building capabilities
on top of those models.
And we want our customers to be able to use
all of those capabilities.
And so in AWS, our goal is to give customers choice.
Sometimes they're gonna wanna use
our first-party offerings,
like Amazon Nova from the model providers,
but also they're gonna want a whole host of other things.
Sometimes they're gonna wanna use open source models.
We provide access to models like DeepSeek or like Llama.
Sometimes they're gonna wanna use proprietary models, and Claude 4, which was Anthropic's model
that just launched a couple of weeks ago, is now the best-performing model out there
in the world for many use cases, including coding.
All of that.
Customers are gonna wanna access to all of these things.
And actually, as it turns out, many customers wanna use multiple models at the same time.
What we're finding is that as they build applications,
a lot of times they'll take many of these models together,
stitch them together in unique and different ways,
and then use the rest of the rich AWS infrastructure
to build fantastic applications.
And so they'll use AWS databases, they'll use AWS storage,
they'll use AWS compute,
they use a bunch of the capabilities that we have,
and that's how the actual end applications get created,
and it's what customers are really excited about.
When we think about the startups that are here today,
that's what they're doing,
and that's how they're using AWS.
That's what actually brings you to town,
is dealing with startups and trying to get them to build
on the AWS cloud.
In fact, you just opened the 2025 Generative AI Accelerator
to support startups building foundational AI.
That just happened today.
What do you really hope to get out of that?
That's right, it's actually our third version of this.
We do this annually.
And really startups are, you know,
I've been at AWS now for 18 years.
And from the very beginning, startups were a core piece
of how we built the business.
And they're fantastic.
And we continue to invest in startups today.
The startups today, they're the enterprises of tomorrow.
And so what we do as part of this investment
in these startups is we really invest in them early
and help them get going.
And as part of this generative AI innovation center,
we invest in brand new startups
who we think are doing really interesting things.
I'll take one example from a couple of years ago,
there was a company called Audio Shake
and they're building a cool technology
where they can actually listen to a live audio stream
and actually pull apart the different tracks in there.
And when people are making movies and other things like that, it actually makes editing
easier.
There's another company called Vercel, which many of these customers are building
on top of, that actually allows enterprises and startups to build much more quickly.
And they all build that on top of AWS.
And our goal is to work with the entire ecosystem, folks like Vercel,
together with brand new startups to help them get a leg up and move more quickly.
You just said you've been there for 18 years.
Of course, you've been CEO of AWS for basically a year.
Do you think Wall Street has a better understanding now versus then over what Amazon's AI strategy
really is?
There was concern at some point
that you were behind Microsoft and Alphabet.
Have you closed the gap, do you think?
Well, I think that we've hopefully done a better job
of helping people understand what that strategy is.
And I think for us, it was super important for us
to think about operational excellence
and security as a baseline.
Because when you go out to the customers out there, they really see that as critical,
and they realize that their enterprise data, their unique IP, is the thing that differentiates
their company from everyone else's. And so we started from the baseline to say,
how do we ensure that we secure that data for every customer who's building on top of AWS?
And then we wanted choice, and we knew that customers were gonna want choice.
If you remember, three years ago,
everybody was saying there was just gonna be one model
to rule them all, and that would be the end of everything.
And we really believed that there was gonna be
this wide selection of choice,
some of them first party, some of them third party.
And so combining that operational excellence,
security, and choice is really how we built the platform
that we have today.
And I think that story is now resonating, and I think you see others kind of scrambling
to catch up to that story.
But that's how we built the business.
It's where enterprises are leaning in.
It's why startups choose us.
We're here with 100 of the top infrastructure startups in the world.
88 of those started their journey and built on top of AWS.
And so as we think about the vast majority of startups building on AWS, those
are the reasons we think they choose us.
I'd like to end where I began, and that's back on the idea of spending. And the question
of it can't really go on forever, can it? At some point, there has to be a return on
that investment. You demand that as CEO. Investors obviously would demand that too. There was at the end of April
a Wells Fargo report that said AWS had paused quote some lease commitments. Now you guys
responded to that and called it quote routine capacity management. But what would cause
you to change your spending plans?
I'm not really sure where they got that story from.
I will tell you that we're continuing to invest broadly across the globe, frankly.
And so from our perspective, you know, by most estimates, less than 10 to
20% of workloads have moved to the cloud today, and the number of workloads is growing every
single year.
So there is a massive demand for compute infrastructure.
And we regularly think about different leases that we have or
different investments that we make.
But on the aggregate, we are still significantly growing our capex
investment as well as our investment in data centers, as well as our
investment in servers.
Frankly, as well as our investments in custom silicon so that we can
deliver the absolute best performance and the absolute greatest set of capabilities for all customers out there.
And we want our builders and our customers to be completely unconstrained as they're
able to go and invent the next great companies out there.
The trade war having any impact whatsoever?
You know, look, we keep a close eye on it and we obviously have customers all around
the world.
So we work with people to figure out how that impacts us and how it impacts them.
And we actually think that the cloud is an enabler.
When a lot of our customers are worried about how tariffs may impact them or trades may
impact them, the cloud is actually one of those things that gives them agility to be
able to move their workloads around, to be able to flex up and down. And so many customers view the cloud as one of those tools
that can really help them better deal
with uncertainty out there in the world.
Want to ask you one more question.
And it's about chips,
because I ended up talking about Nvidia almost every day
with the investing community that comes on the programs
and certainly comes on our network every day.
You buy chips from Nvidia.
You're also designing your own chips.
Can you talk to me about the balance between the two,
the demand you see for your own and whether you can meet it?
Again, I think it comes down to choice.
You know, it's not about A versus B. It's A and B.
I think our customers out there want choice.
And we've been investing in our chip business
for over a decade now.
And we're on generation four of our Graviton chips
and they're incredibly popular with customers.
And we still buy a huge number of chips from Intel and AMD.
On the accelerator side, we're great partners with Nvidia.
We work very closely with that team.
We buy lots of processors from the Nvidia team
and we build our own chips,
which we think give customers a great alternative,
particularly on a cost performance basis
and particularly for certain workloads.
And so we think that choice is incredibly important
and we'll continue to offer that choice
for a really long time.
Nice to see you, Matt.
Thanks for spending time with us today.
Yeah, thank you for having me.
That's Matt Garman.
He's the AWS CEO.
Coming up next, how you can invest in the next wave of AI.
Notable Capital's Jeff Richards.
He maps out his forecast for the IPO market.
He has high hopes for it.
The special edition of Closing Bell back after this.
If our next guest is correct, you'll soon be able to invest directly in the next wave
as the IPO market finally picks up.
Jeff Richards is managing partner at Notable Capital with me here at One Market.
Had to talk to you when we're out here.
It's good to see you.
Good to see you.
Welcome.
Do you have high hopes for what our viewers are hopefully going to be able to invest in
in the months ahead?
We certainly hope so. I mean, if you just look at the last 20 IPOs,
some data I shared with your team,
average IPO is up over 50%
and you've got some stellar outperformers
that are up over 100%.
You've got companies like ServiceTitan
that went public in December, traded up over 40, 50%.
It's held up well.
Today's the lockup expiration.
We'll see how it does after that.
It's always a big test for new IPOs.
And then recently you had two or three
that were below five billion in market cap,
which has kind of been a barrier
that the bankers didn't really want to cross.
And so, a lot of appetite, they've done well,
and hopefully a good sign of things to come.
Which are the ones that we need to keep our eye on
in the next handful or two of months?
Well, we don't have access to the confidential filings
of folks that are on file, but rumored Figma is a big one.
Everybody talks about the same name.
Yeah, Adobe tried to buy Figma years ago.
I think the thing that a lot of folks are really waiting for is this next generation
of AI companies.
Could we see Perplexity or OpenAI or a name like Anduril or SpaceX or even Starlink spin
out of SpaceX?
Those are some of the creative ideas you hear people get excited about because there is
the class of unicorns from '21, '22 that are probably going to file and come out.
A lot of those are sort of pre-LLM, pre-GPT companies and what people are really hungry
for is give me the newer names.
And so, until then, what they're doing is they're betting on public names like Snowflake,
Cloudflare, CrowdStrike, the names you guys talk about every day.
Great names, great bets on AI, and many, many of them are up 50%, 60%,
which I think is a reflection of the interest you'll
see in these private companies when they come out as well.
I mean, the landscape has changed so much,
as you know better than most.
The kinds of companies that we just named, the ChatGPTs
and OpenAIs, or Perplexity, Waymo, Databricks, whatever,
back in the day, they'd be public already.
Now all these companies are staying private for longer.
So if you don't have access to investing in the private markets, you don't have a run
at what the next wave might be, but that could be changing too.
It's a real interesting dynamic. If you look back at 1996, we had 7,000 public companies.
Today we have 4,000. So the GDP and the economy have grown, I don't know, 10, 15x in that
time window, and yet more of the market is concentrated in private assets.
So who's winning, who's benefiting from that?
Obviously, our industry, venture capital.
We see a lot of opportunity here with AI.
We've had a challenge with liquidity over the last few
years.
Hopefully, we're starting to see some of that thaw out
with rising IPO and MA activity.
But the private equity firms.
So one way for your viewers to play this
is to bet on some of the private equity names.
TPG, Ares, Blue Owl, Blackstone, KKR. Those folks are lending capital to some of these companies,
and they're also investing in things like power and data centers, which are going to be super
critical to AI. So another way to maybe play it while most of these companies still stay private.
Yeah, I mean, it's the funding substitution. You don't need to go to the public market anymore.
Private equity will fund you, and then investors are having more access to alternatives like
private equity and everything else.
Lastly, what's exciting to you?
What part of AI is going to be the next wave of excitement in your mind as a venture guy
who sees people come through and pitch stuff all the time?
I feel like we are moving from generative AI to more agentic AI. Maybe it's overhyped
today but won't be tomorrow. The obvious ones are things like self-driving, autonomous,
robotics, the things we hear people talk about every day, and the rise of things like ChatGPT.
I would go into maybe two areas that people don't talk much about. One is small
business. So you and I have talked about this. 50% of our economy is small business, 55%
of employment.
Small businesses are going to benefit in a big way from AI, automating basic tasks like
scheduling, billing, collections, all these basic things that people who own local storefronts do.
And then another one is vertical AI.
So you've heard Marc Benioff talk a little bit about this, but these companies that sell
software and artificial intelligence into categories like the legal category,
the healthcare category, industrial and manufacturing.
Every single category is going to have a version of AI
that is sort of purpose-built for it.
A lot of it with the same infrastructure,
but purpose-built for that vertical
and then creating a ton of value inside the vertical.
So that's an exciting area.
An environment where I foresee, I guess,
based on what you're describing,
costs down, productivity
up, for any business size.
It's great to catch up with you out here.
Jeff, thanks.
Good to see you, Scott.
All right.
Jeff Richards out here joining us here at One Market.
Up next, Alger's Ankur Crawford.
She breaks out her big tech playbook.
She'll tell us how she's navigating the AI trade, the next wave in the companies she
is excited about.
Now, this special edition, Closing Bell Live
from One Market in San Francisco is back right after this. We're back on this closing bell special, the next wave in AI.
Our next guest counts the biggest names in tech among her top holdings, but is also thinking about the next wave of investments that will bring big returns. Ankur Crawford
is a portfolio manager at Alger and she joins me now. It's good to see you. I'm sorry I'm
not at post nine with you, but it's so good to have you part of our special. So I mentioned
you are invested in the biggest names, but you are uniquely also thinking about the themes of tomorrow.
In your mind, what are they going to be as it relates to tech and AI?
I think, you know, AI is the biggest theme for tomorrow.
But if you think about all of the tech that we need to be invested in, it's going to be
autonomous vehicles, EV, power, AI, robotics.
I think those five themes are really going to take us through the next decade.
When you tick all those off, you check the box of what stock?
Good question.
You know, the publicly traded stock that you can invest in today is obviously Tesla.
It doesn't touch all of those, but it touches a majority of them.
With humanoids, solar and batteries, autonomous, and AI,
it really touches three or four of the five trends.
And it's a really interesting stock,
despite all the kerfuffling that we're hearing recently.
How are you thinking about the competitive landscape
as it relates to all of these?
I think it was today, yesterday, day before,
and probably the day after,
we talk about deals that these hyperscalers are doing,
billion dollar investments that they're all making.
You own so many of them.
Are you thinking about a potential winner
in this arms race?
I think there will be several winners
because the market is so, it's so big
that I think it's hard for us to comprehend.
And let me give you an example.
I looked up my ChatGPT history, and I asked it to tell me how many tokens I have used
by month.
Over the last six months, my token usage is up 30-fold.
So I might be a power user, and I expect that over the next six months, my token usage will
grow again.
And so if that is the normal trajectory of usage and the general population is just starting
out on this journey, the amount of compute that we're going to need is simply almost
incomprehensible, which is why you see all of this expenditure and why you see the
hyperscalers spending the way they are. Now who wins in this race? I think there's
going to be many winners. I mean you're seeing Google with Gemini become
more used by the average user as it's more accessible. A lot of us use
Perplexity and ChatGPT, but I just started using Claude to design a new website.
So I think it will be a market where there will be many that win in order to support this effort.
Ankur, we'll see you back east. I look forward to that.
Thanks for being with us today on our special show here, The Next Wave.
That's Ankur Crawford of Alger. Still
ahead, a West Coast edition of
the Market Zone. Deirdre Bosa and
Steve Kovach both standing by
with their takes on the next
wave in AI, back in San
Francisco after this.
We are now in the Closing Bell Market Zone, a special edition. CNBC senior markets commentator Mike Santoli is standing by to break down these final moments of the trading day.
We start here at One Market with Deirdre Bosa once again, and our Steve Kovach is here as we
still talk about what the next wave of AI might look like and what role, let's just
bring it full circle here,
Apple will play in it.
Yeah, and so we're at the end
of developers conference season.
Apple wrapped it all up.
And I mean, my main takeaway here is after digesting,
not just what happened during
the developers conference season,
but also just in the last 24 hours of AI headlines we've had,
all these companies, they're doing donuts
on the lawn of Apple Park.
They're just saying, look what we got going on here.
You guys changed the design on the iPhone software
a little bit, announced some minor AI features,
and then we have the super intelligence announcement
coming out of Meta, whether or not that materializes,
it doesn't matter, and then the Google deal with OpenAI,
OpenAI yesterday announcing $10 billion in recurring revenue.
I mean, yeah, it's just the excitement just kind of left Apple on the AI front and it's
all in these startups and the hyperscalers.
Google, for example, you were at WWDC, Deirdre.
I'm sorry, Google I.O.
Apologies, I still have Apple in the brain.
But yeah, it's amazing.
And they're actually shipping stuff.
And you know, Apple has proven it can't execute and can't ship. Apple hopes to one day
say, get off my lawn. Exactly right. So you, you know, had this news today,
you know, a $15 billion investment here, there's $10 billion flying around there. It's incredible, as Jeff Richards was telling me as he was here.
What's happening out here right now, the amount of money being spent, is remarkable.
Well, I'm so glad that you got like a front row view to that this week, how quickly things move.
And also how quickly the power rankings get shaken up. I mean, you mentioned Apple and Meta,
and these are seen as two megacap companies. Meta, of course, has put so much more money
into its AI push, but is still kind of lagging behind. And you have openings for smaller,
newer companies,
not so small anymore, like an OpenAI or an Anthropic
or some of the vibe coding ones that I mentioned.
And then Google, which we've talked a ton about the valuation.
You know, it's down in the mid-teens,
but been shipping, as Kovach said.
Guys, it's been fun. Thank you.
Deirdre Bosa, Steve Kovach, Mike Santoli,
the last 30 seconds is yours on this day in the markets.
Yeah, Scott, I mean, look, we got the S&P up about half a percent, and it shows that it just twitched
higher on this headline from Commerce Secretary Lutnick saying the China talks are going well. To me, that just
shows the market is almost over eager to try to price in further de-escalation. We'll see if,
again, if the headlines cooperate there.
The thing I'm looking at though, as we sit less than 2% from the record highs from back
in February, almost close enough to just kind of reach out and lunge for, is this so far
orderly rotation, small caps outperforming, laggards today over leaders, and whether that
becomes at some point a more erratic pain trade with
momentum stocks getting hit again. I'm not saying it's happening but you do see
some funny vibrations in the market here even though it's right now very benign
and you know volatility continues to drain and you're starting to see people
assume that we belong here up at these levels. So again we have to see if in
fact the trade talks
and of course CPI tomorrow can accommodate
this sort of newfound comfort.
All right, Michael, thanks so much.
You're gonna hear the bell ring us out green
and you certainly are gonna stay focused
on those developments out of London.
That's all for us on this special closing bell
at One Market in San Francisco.
I'll see you back east in Overtime with Morgan and Jon.