Limitless Podcast - Analysis: What Investors Get Wrong About Apple and Google
Episode Date: November 6, 2025

Believe it or not, Google and Apple are undervalued. We discuss Google's aggressive AI investments and the potential to triple its market cap to $10 trillion, alongside Apple's cautious yet strategic approach leveraging local inference capabilities. We explore the historical contexts, evolving relationships, and the investment potential that could defy market expectations. Be sure to subscribe and share!

------

🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT
Substack: https://limitlessft.substack.com/

------

TIMESTAMPS
0:00 Google and Apple: A Tale of Two Giants
0:51 The Bull Case for Google
15:45 The Disconnect with Apple
24:16 Apple's Strategic AI Partnerships
27:42 Conclusion: Value Amidst Market Sentiment

------

RESOURCES
Josh: https://x.com/JoshKale
Ejaaz: https://x.com/cryptopunk7213

------

Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures
Transcript
There are two companies currently worth trillions of dollars that I think are the most undervalued companies in the world right now.
Google and Apple are a tale of two tech giants that have taken a very different approach to AI.
Google, on one hand, has poured billions of dollars into creating some of the world's leading AI models,
going from a corporate has-been to one of the titans of AI.
They have their own custom AI chips rivaling Nvidia's.
They have one of the largest user bases in the world with Gmail, Android, YouTube.
I could go on. Apple, on the other hand, has taken the opposite approach. They haven't invested in
AI at all. They've been late to update their AI assistant, Siri. They haven't spent money hiring
AI geniuses the way Zuckerberg has at Meta. But despite this, I think these two companies are poised to be
the biggest companies in the world. So in this episode, we're going to give you the bull case
for Google and Apple, revealing not just why you should buy their stocks, but also why these two
companies in particular are very unlikely allies. Yeah, I think this episode is going to surprise a lot of
people because there's this gross disconnect between public market sentiment and public opinion,
where stocks are trading very high and people don't quite agree with that. So I think in this episode,
we have a lot of what most would perceive to be contrarian takes about where these companies
have come from and where they are headed, and how we are actually much more optimistic than
a lot of people think on these companies. So Ejaaz, maybe you could help us start with the bull case for
Google, which is the first company we're going to be talking about here.
What if I told you that Google, currently worth $3.4 trillion, is an absolute bargain buy right now?
That's a crazy take.
You call me crazy.
But here's why I think Google is actually worth $10 trillion as an AI giant currently masquerading as a simple, humble search engine.
I've got four reasons for you.
Number one, Google has the best AI models.
You have Google Gemini, which is currently crushing it across all benchmarks and beating
ChatGPT on so many different fronts. You have Veo 3, their text-to-video model, which is creating
cinematic masterpieces in a matter of seconds. You have Nano Banana, their image generation model,
which not only creates really good images but also allows you to edit them. And finally,
who can forget Google's Alpha Gemini series, their science AI models, which have won
not one but two Nobel Prizes. Number two, they're the only real competitor that Nvidia has.
They've created TPUs, which are their version of the GPU,
built to run their own custom AI models specifically on Google's software and hardware stack.
Number three, they have some of the craziest user distribution. They have the number one mail app in
Gmail. They have the number one entertainment app with YouTube. They dominate search engines and
advertising. And they have the number one mobile operating system with Android. And number four,
a lesser-known fact: they're some of the biggest and best investors in the world. They
currently own 14% of competitor Anthropic and 6% of SpaceX. So no matter which way you want to look at it,
it's hard to argue why Google isn't going to become the most valuable company in the world.
They have all these strengths. They have the best models. They have the best data to train these
models. And they have all the compute to build this in-house. So I'm struggling to find an argument
to see why Google isn't going to be the most valuable company within a decade.
I like looking at history. Google is very much a large player in this Game of Thrones that we talk about
in the search for AGI. And it's helpful to have a little bit of context and backstory for how we got to
today, where you're saying $10 trillion is a reasonable market cap. It's a little outrageous. But there's
this funny story associated with the history of Google. It kind of starts in 2017, when they
released the paper that was basically the inception of the transformer. And for those
who aren't familiar, the transformer is the piece of technology, this little piece of code, that has
generated and created every single large language model that exists today. So by all means, Google kind of
invented the technology that is running everybody's AI today. Naturally, as a result, you would expect
them to be the frontrunner. But what they did is they kind of sat on this technology for a very long time.
So in 2022, when ChatGPT came out, Google actually didn't really have a publicly facing AI product.
They just had the technology, because they were very much a research lab. So while OpenAI acquired
a million users in the first week, Google was just kind of sitting there on, at this point,
five-year-old technology that they just hadn't actually implemented in a product. So they
rushed to implement one. And they created a series of AI products, starting with
Bard and leading to what we know today as Gemini, that just weren't really quite good.
And you get the idea that the reason they weren't quite good is that the culture was pretty
bad. At Google, there were a lot of problems with the workforce, and it caused a lot of disconnect,
where if you asked Bard at the time to generate the founding fathers, it would generate a series
of minority women, which was just objectively incorrect. And a lot of people
took that as, okay, Google actually doesn't have a chance at succeeding in AI. And the reality is
that wasn't true, because that changed when Sergey Brin came into the picture. So we have a post here,
Sergey Brin spotted IRL again, and apparently he's working on Gemini at Google. And this is true.
Sergey Brin, for those who don't know, is a co-founder of Google; he was there from the very
beginning. And he recognized a wave that was too important to miss. He came back to Google and
started operating the way they did at early-stage Google. And now we have Gemini, which is the most
powerful AI model in the world or at least very much at the frontier with everybody else,
and they're implementing a lot of new services. They incepted this amazing technology,
lost the lead, and now, as you're telling me, they're probably going to get back into the
lead, because that would be the implication of a $10 trillion market cap. Yep. And I think there are four key
reasons why they hold such a powerful moat and why they're super undervalued. I want to start off
with talking about their model and it's not just like an LLM, it is everything else, right? They've
tackled and are leading every other medium in terms of AI-generated content on the video side
with VO, on the image side with Nanobanana, and on the scientific discovery side with the Alpha
Gemini series. Just taking a look at this benchmark setup that I have here, this is something that
we've started to see with every Gemini model release, which is they're frequently and consistently
at the top, Josh. They're the number one LLM, they're the number one coding assistant, and then the number one
agent provider as well. And I think that this isn't by coincidence. They've
worked very hard to restructure Google from the corporate bloat
it typically had, up until AI became a thing, into this frontier AI
tech lab that is constantly innovating, changing, and adapting its strategies given whatever
environment it's faced with. So they consistently have the best AI models, but it's also
worth explaining how they've translated this into user dominance. So typically how OpenAI and
ChatGPT have done this is, they've created the model and then pioneered the interface on top,
basically the chat interface, right? They then did the same for
text-to-video with Sora. So they've done a really good job of surfacing this to new users. Now,
Google already has these users and they've done a great job of doing the same thing. They've integrated
AI into their search. So whenever you search anything on Google now, you have something called an
AI Overview, Josh. And something that I learned just this morning, which came out of their
quarterly report last week, is that this AI search,
which comes up at the top of their famous Google search page,
is monetizing at the same rate as the search links themselves, which is just crazy,
right? Because you'd think that Google is killing their own monopoly in search by doing something
like this, but it turns out that it's actually aiding it and probably going to offer them more profits
going forward. The other thing is they have this whole AI productivity suite, through which they
surface all these different models and features.
So I guess the point that I'm trying to make is Google isn't just a model creator.
They are a product creator.
They understand their audience, both from the enterprise side and the consumer side.
And I think it's highly undervalued and very misunderstood how good Google is at doing that job.
But what's probably surprising to most is that one of their products comes in the form of hardware:
their equivalent of the GPU, called the TPU, which has, unlikely as it sounds,
added the most value to Google's market cap.
Yeah, TPUs are amazing.
And I'm going to do my best to try to explain TPU versus GPUs
because on the surface it seems a little confusing,
but I think we can simplify it quite a bit.
I want you to imagine a GPU being a chef,
a master chef who can cook anything
and make anything that you imagine.
But the reality is that you and myself and the rest of the world,
we only really want sandwiches.
So while a chef can make a sandwich,
it will take a little bit longer to make a sandwich.
That is what a GPU does.
It can cook you anything you like,
but the reality is most of the world wants sandwiches.
A TPU, you can imagine, instead of being a general purpose chef, is a production line for sandwiches.
It is a singular line that has all the ingredients on top of it to hyper-efficiently make sandwiches for the whole world.
And because the world doesn't want salmon, they don't want steak, they just want sandwiches,
this production line is much more efficient at making sandwiches.
So while the GPU can do everything, and a lot of people want a GPU, people who are training hardcore AI, they just want a TPU.
It's called a tensor processing unit, and it is this hyper-focused
piece of hardware built to perform this thing called matrix math, which a lot of
AI computation is built upon. So what is so effective about these TPUs? Well, there are two ways of measuring
GPU, CPU, or compute in general: performance per watt, and cost per token. The TPU's
performance per watt is two times that of the Nvidia A100, which is one of Nvidia's
flagship chips. That means that for every watt of energy you apply, you get double the amount
of performance back. The other one is cost per token.
So for every dollar you spend, this is how many tokens you can yield.
And TPUs yield about one and a half to two times more tokens per dollar than Nvidia GPUs.
So these TPUs are remarkable.
They work really, really well.
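To make those two metrics concrete, here's a minimal back-of-the-envelope sketch in Python. The wattage, TFLOPS, and price figures are illustrative placeholders chosen only to match the rough 2x and 1.5-2x claims above (the A100-class numbers are approximate public specs; the TPU figures are invented), not measured benchmarks.

```python
# Two ways of measuring a chip for AI work, as described above:
# performance per watt, and tokens yielded per dollar of spend.

def performance_per_watt(tflops: float, watts: float) -> float:
    """Useful compute delivered per watt of power drawn."""
    return tflops / watts

def tokens_per_dollar(tokens_generated: float, dollars_spent: float) -> float:
    """How many tokens a dollar of compute spend yields."""
    return tokens_generated / dollars_spent

# A100-class GPU: ~312 dense BF16 TFLOPS at ~400 W (approximate public spec).
gpu_ppw = performance_per_watt(tflops=312.0, watts=400.0)
# Hypothetical TPU figures, invented to illustrate the ~2x claim.
tpu_ppw = performance_per_watt(tflops=275.0, watts=175.0)

print(f"GPU perf/watt: {gpu_ppw:.2f} TFLOPS/W")
print(f"TPU perf/watt: {tpu_ppw:.2f} TFLOPS/W")
print(f"TPU advantage: {tpu_ppw / gpu_ppw:.1f}x")

# Cost per token, with made-up prices per million tokens.
gpu_tpd = tokens_per_dollar(1_000_000, dollars_spent=2.00)
tpu_tpd = tokens_per_dollar(1_000_000, dollars_spent=1.15)
print(f"TPU tokens-per-dollar advantage: {tpu_tpd / gpu_tpd:.2f}x")
```

With these placeholder inputs the TPU comes out roughly 2x on performance per watt and about 1.7x on tokens per dollar, which is the shape of the comparison being made in the episode.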
And the question I asked myself when I was going through this is, well, why does the whole
world not use TPUs?
Well, the first answer is they actually don't sell them.
Google just owns the TPU.
It is a Google invention.
They own them.
They use them for their own stack.
And they've vertically integrated through the software.
So it's hyper-performant. And it's kind of like what we see with Apple chips,
with the M series. I always reference the M series because it's amazing: they've built a
chip hyper-optimized for their exact production workflow. And as a result, the cost efficiencies,
the performance efficiencies, every efficiency is through the roof. So that's kind of how I would
describe the TPU to a layman. That's kind of how I learned it myself, and hopefully that description
makes sense. That makes a lot of sense. Although I have one slight pushback for you,
Josh, which is, as of last week, they have started to sell their TPUs to a little-known AI company
known as Anthropic.
Supposedly the deal is marked at $50 billion, which doubles their compute revenue on the year.
So if this deal actually goes through, this will be the first major AI chip sale that has sat
outside of Nvidia's remit. So I already know Jensen's fuming,
but we will see whether Google takes this seriously.
I mean, to your point,
I don't know if Google has the
manufacturing capability or even the ambition to scale TPUs in the way that
Nvidia has done with GPUs.
I see kind of two major constraints.
Number one, Google's just focused on so many other things and NVIDIA's like hyper-focused
on doing this one thing really well.
And number two, TPUs, as you said, are highly specific and custom.
So I'm guessing that the TPUs that they're selling to Anthropic in this deal are very specific.
Maybe that's a niche that they can kind of take forward, but I don't think you could grow as
large as Nvidia. Do you think I'm wrong or do you think they're knocking trillions of dollars
of market cap off of Nvidia soon? Yeah, well, there are two cases. There's one in which you believe
TPUs are the next form of GPU, like a natural extension where everyone's going to want to
buy these. In that case, if they could do this production at scale, well, yeah,
they could knock a trillion or two off of Nvidia's market cap, sure. The reality is
that I don't believe it's a direct apples-to-apples comparison. There are a lot of different types
of training that Nvidia GPUs are good for. The performance per watt of Nvidia's GPUs is still higher,
and Nvidia is still innovating faster. They have access to the most cutting-edge lithography,
which happens only in a few places in Taiwan. They really have this monopoly on the supply chain
that Google just can't really compete with. But Google can supplement a lot more compute. So,
to the point of Jevons paradox, where demand for compute will just expand to fill whatever supply there is
around it, there's no shortage of demand for these processing units, whether they be
tensor or graphical. And if Google can start selling them, that just opens up another pillar for
Google. I'm not sure it necessarily hurts Nvidia, but it allows Google to just make more money and
print more cash. And I think that's a big win for a company that really hasn't made money off
of the infrastructure play yet, but now has a very clear opportunity to do so. Yeah. I mean,
I'm just looking at this tweet here. Morgan Stanley kind of upped their price prediction for
Google stock in 2026. But one thing that really caught my eye is the $158 billion backlog
that they currently have for their cloud offering.
And it got me thinking that Google has always been a powerhouse when it comes to
infrastructure provision, and they've done so really well with GCP, their cloud product.
You could also argue the same of Microsoft Azure.
Both of these companies, Microsoft and Google, have signed major deals in the last two weeks
for AI-specific capacity, not just training but also inference.
So I see this growing trend, Josh, of infrastructure, whether it's chips
or cloud services, getting really in demand as we scale these AI models up.
So I don't know, I'm super excited to see where this goes.
The final thing is Google is just everywhere.
They are the doorstep to the internet and actually the entire house.
You know, they run Gemini, their models.
They have the TPU chips, which we just mentioned.
But they also handle 90% of search.
They run YouTube, the number one entertainment platform in the world.
They have Google Maps.
They have Android.
They power all Android devices. It is just a very gluttonous monopoly that Google has, extending way beyond
just a simple search engine. And the point I want to make here is that this is a crazy positive
feedback loop when you're creating AI models. They have all the data. They have all the compute
in-house. So they don't have to rely on that tiny island in Taiwan and on Nvidia's dominance here.
And then they have all the apps to surface that through a very monetizable audience. We're talking about
billions of users here. So whereas OpenAI has had to build their audience from scratch,
and they've done so very impressively to 800 million weekly active users, Google kind of already
had that waiting and ready to go, and they've proven that, right? In their quarterly earnings,
they reported 650 million weekly active users for Gemini. I'm guessing the next quarter that's going to, you know,
increase by at least a third. So the point is Google's dominance extends way beyond a search engine,
and I don't think people are very aware of that. And I think that is one of the biggest
bull cases for their stock going forward.
A counter argument that I've heard to this, Josh, is that they just simply won't execute as well.
I would agree with you, except that the data tells us otherwise.
If you look at this chart over here, the retention rate, the three-month retention rate,
specifically for Google Gemini AI models, has been up and to the right.
It is pretty insane.
We're talking about a 90% three-month retention rate and an 85% six-month retention rate.
That is staggering for any kind of consumer internet product.
So, Ejaaz, you've laid out a pretty bullish case for Google.
And I'm not sure that mismatches public perception as of late.
I think that's changed a lot.
People really hated Google, but they're starting to come around and see the light.
It's trading just beneath all-time highs.
And it very vividly reminds me of another company of about the same size, a little bit bigger,
that is sitting at all-time highs.
But there is a very gross disconnect between public sentiment and the company.
And that's Apple.
There's this post on the screen here that says, what the hell is Apple doing?
They failed to make a car.
They have no AI investments.
Siri still sucks.
They're just releasing the same phone over and over again.
How can you miss AI?
How is that even possible?
And Josh, to your point, I agree.
They've swung and missed on just about everything they've tried recently.
And it's been a big disappointment.
But the stock is trading at all-time highs.
So where is this disconnect coming from?
How can one group of the internet be so polarized against the other group?
And yet all the money is clearly going into Apple,
because Apple is trading at like a $4.something trillion market cap.
It's a gigantic company.
There's a few ways to think about this.
And one that I really like is around local inference.
So the way AI works, and granted, this is kind of funny because I think Apple accidentally
stumbled upon this conclusion, but I'm going to portray it as if this was all by design.
There are two ways that companies handle compute.
There's cloud-based compute, which happens in a data center in a remote place;
that's what we see all the pictures of, these gigantic factories that have a bunch of GPUs.
And then there's edge inference, which happens locally on your device,
whether it be your smartphone or your laptop or your desktop,
whatever device it may be.
Compute in a cloud costs money.
So what is Apple doing?
Well, Apple just released a new Mac Mini,
which we have a post about here.
The M3 Ultra Mac Studio, or I guess that's what it's called,
is the only device on which you can run
open-source, state-of-the-art models
outside of a data center.
And that seems strange because there's a lot of really powerful GPUs, there's a lot of really
powerful computers, but the only ones that actually allow you to run these are these Macs.
Why is that?
Well, it's because Apple owns the chips, and they've optimized these chips for AI.
So even though there is an absence of AI currently running on Apple devices, they are fully
capable of running these very impressive models, on anything from a Mac Studio down to
your iPhone. Which leads us to the bull case for Apple: they are not subject to a lot of the
swings that we see in public markets because they're not investing all of this CapEx. Now, Ejaaz,
if you'll remember, there was a deal struck between Google and Apple in the past. This is not the
first time they've worked together. And that was for Google search. Back in the day, I think
we have a post somewhere about this. Yeah, here's a post you dug up from 2018, I think.
Right? How much is your data worth? So much that Google just paid Apple $9 billion to be the default
search engine on the iPhone. $9 billion to get people to use a free search engine. Well, that number
has actually jumped to $20 billion as of 2022, the most recently confirmed figure. And it's just
an outrageous amount of money to spend. But I don't think this is the last time they're going to
work with each other. So Ejaaz, this week we got some news that there is a new deal on the table. And this
isn't a search-related deal. This is instead an AI-related deal. Yeah, this is like the Google and Apple
deal 2.0. And it's so much bigger, back better than ever, right? So this week we had
news break that Apple has asked Google to create their own Gemini model just for Apple users that
will run on their private cloud. Josh, can you break this down for me? What's going on here?
This is amazingly important. So everybody and their mother is spending trillions of dollars
of capital building AI data centers. OpenAI is spending trillions; Google, Microsoft, name
anybody, they're spending trillions of dollars. There's a question that we frequently ask ourselves
on the show, and that public markets are asking: is there a bubble? When is this bubble going
to burst? How much CapEx is appropriate before you just can't justify the returns on your investment?
Apple, in a funny, messed-up, roundabout way, is completely immune to these swings,
to this capital expenditure, and to this bubble exposure, because they have not invested much money
into building out the resources needed to train their own models. What they're doing now is a
very clear and obvious answer: they're offloading that responsibility, offloading the
financial risk, to Google, and they're going to pay them X amount of dollars for
Google to create a custom Gemini model for the Apple ecosystem. And I think this is a really
powerful thing, because, as we mentioned a little bit earlier, there's cloud and then there's local compute.
And local compute is actually sufficient for a lot of the requests that people
have. A lot of prompts, and we've talked about this a lot, where with cutting-edge models,
I'm not even sure how to really test them, because my requests, my needs from an AI model,
aren't cutting-edge. They don't need massive amounts of compute or tokens or resources to solve my
problems. A lot of the time I'm just like, I need help navigating my day. I have a lot going on.
I need to know, like, what times am I able to do things? And the thing with Apple is it actually
has all this intelligence. And this was the promise of Apple Intelligence. We just never got it.
But by offloading this to Gemini, we start to see these unlocks, where Apple's the only one that
has more context than OpenAI. We think about memory a lot, and context, and how powerful it is
for a system. And what device, what system, have you been using longer than ChatGPT that has more info
about you than your smartphone? A lot of people will be reluctant to give up that data,
but in the case that it is private, and it runs locally, and it's able to take all the context
from your device, that creates this really powerful edge compute. And that's kind of part one
of the bull case for Apple. So I'll stop there to see if you disagree, agree. This is like we're getting
into the weeds here a little bit. I mean, as you know, I've been one of the biggest bears against
Apple for almost a year now. And rightfully so. Yeah, yeah. I think they've done a terrible job.
But as you pointed out, this works both ways. They've effectively shielded themselves from the crazy
fluctuations that you can see in AI, specifically in CapEx, right? So they're not investing trillions of
dollars. But also, when that roller coaster ride starts going down, they're shielded. They're
unaffected by it, and they can pick and choose the winners. There's also the argument that, like,
all this AI CapEx investment is kind of a race to the bottom, because whoever gets to AGI first
is going to absolutely disrupt and destroy every other competitor, or so the argument goes.
The other side of it is, Josh, I really like your point around the personalization and
localization and the importance of privacy of having your model on your own device. Who cares if there's
a general query that can be answered by some research professor versus an AI model? I want it to
be able to read my texts and understand the next thing I'm going to text. I want it to be able to
download apps before I even know about them and do things for me. That's only going
to work on personalized devices. And as you said, Apple has the distribution.
Yeah, the use cases for local inference are kind of remarkable when you think about having
access to all of your texts or emails. There's a second thing to this also, which is cost.
Ejaaz, if you remember, in a previous episode we covered Grok Nano, which was this very lightweight,
very easy-to-use, very low-cost model, and how within days of it being released, it went to the
very top of all the charts in terms of usage. Every developer in the world wanted to use it because
it was fast and it was cheap. And one of the trends that we see as AI develops over time
is this race to the bottom on cost per token; the race to zero is fairly aggressive.
And what Apple's done is they've kind of cheated.
And they've gone from the starting line directly to the finish line without doing the work in between.
Because inference on a local device that has an M-series chip or an A-series chip that Apple has is completely free.
It costs $0 to run a query on these local devices.
And there's billions of them that are capable of doing this.
So if you're a developer who wants to build a great experience for a customer,
there is absolutely no reason why you shouldn't be building for Apple devices, because the inference
costs are free. And a lot of these developers who are building applications for the world of AI,
they're pinging these servers, they're constantly looking for the next cheapest one because they want
to keep the cost down. But if you just throw an app in the app store that runs on these local
models that Apple will hopefully produce in partnership with Google, you suddenly tap into this
gigantic context window, this gigantic base of memory, and it costs you absolutely zero dollars to do so.
And I think market forces will come into play at some point here, where the natural tendency to trend towards lower prices will allow a lot more developer adoption and the creation of better experiences on top.
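The economics being described can be sketched in a few lines of Python. This is a toy model, not real pricing: the per-user token volume, install base, and per-million-token price below are made-up placeholders, and it assumes the marginal cost of a query on hardware the user already owns rounds to zero.

```python
# Toy model of the "race to zero" on inference cost: cloud inference has a
# nonzero marginal price per token, while on-device inference has roughly
# zero marginal cost once the hardware is in the user's hands.

def cloud_cost(tokens: int, dollars_per_million_tokens: float) -> float:
    """Bill for serving `tokens` through a metered cloud API."""
    return tokens / 1_000_000 * dollars_per_million_tokens

def local_cost(tokens: int) -> float:
    """Marginal cost of the same tokens on a device the user already owns."""
    return 0.0

daily_tokens = 50_000   # hypothetical per-user daily usage
users = 1_000_000       # hypothetical app install base
price = 0.50            # hypothetical $/1M tokens for a cheap cloud model

monthly_tokens = daily_tokens * 30 * users
monthly_cloud = cloud_cost(monthly_tokens, price)
monthly_local = local_cost(monthly_tokens)

print(f"Cloud bill/month: ${monthly_cloud:,.0f}")   # $750,000
print(f"Local bill/month: ${monthly_local:,.0f}")   # $0
```

Even at a bargain cloud rate, the developer's bill scales linearly with usage, while the local-inference bill stays flat at zero, which is the pull toward on-device models the hosts are describing.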
So while Apple completely and utterly failed at AI version 1 with Siri, and it is an abomination,
the worst AI product of any major company by far, and I don't want to get that twisted,
they may have accidentally, just by the fact of vertically integrating these chips, stumbled into the best-case scenario.
Everybody else is spending a bunch of money. Everyone is trying to build the AI first to get the user second. Apple has all the
users and they have all the data. They just need the AI to plug into it and Google could be that
person to do so. Yeah, it's super smart. I think the market is finally recognizing that. If I were to
boil all of this down to its core, I would say that Apple is actually doing what they've always
done best, which is create the best consumer experiences and products. And so rather than jump into
the infrastructure wars, the infrastructure race in AI with GPUs, they let everyone else do it.
They cherry-pick the best. And then they integrate it in the best, most artful way into their own
products and services. And that's why they have the addicted audience that they have, right?
I buy iPhones practically every single year, same with MacBooks at this point. I'm addicted to the Apple brand.
And there's a reason behind that because it has the best all-in consumer experience. And that's what Apple
knows that they have and they can afford to basically sign these deals and partnerships with the
Googles of the world. Another thing that I find interesting about this, Josh, is in this case,
they're paying Google to create their own AI model for them, which seems like an L on the balance
sheet for Apple. But if you play this out, Gemini and Google are definitely going to build their
own agent, right, which is going to use a bunch of apps and they need to get access to billions of
consumers. Who has that? Apple. I know Google already has it across an array of their different
products and apps that we explained earlier on, but Apple has that addictive user experience that
only operates at the hardware level through the cellular device. And if they maintain that
lead, that could be a case for Google actually paying Apple a hell of a lot more money to get access
to this. Is that fair? Is that crazy, or have I got this wrong? Yeah, at the edge, it makes sense
that the search deal dynamic happens again,
where Apple owns the user experience, Google owns the software stack,
and Google winds up paying Apple for the exposure.
Like I said before, a lot of these companies, they have the intelligence,
they don't have the products, they don't have the interface,
they don't have the trust or the users.
Apple has that.
And by licensing the technology from Google,
they insulate themselves from all the CapEx fluctuations and the crazy amounts of spending
without a promised return, in exchange for creating an AI experience that is frankly
the only one that I really want, the one that has all of my context. It can read all of my texts,
all of my emails, just, it could be the personal assistant that no one else can be yet. And that's
remarkably powerful. And if I want to outsource my thinking to an impressive AI, I could do that.
And sure, Apple could probably ping some sort of API through Google or whoever they're partnered
with, like OpenAI. But for most of my tasks, most of my AI needs, I just want a really helpful
assistant. And no one can provide that better than an AI on my iPhone. And I think that's kind of
the bull case in summary: as Apple is able to remove themselves from the CapEx, as they're able to
implement local inference on all these devices for $0, there are a lot of network effects that
will probably come swinging their way. And maybe it was accidental, maybe it was by design,
but the long-tail effects of Apple's position right now seem fairly optimistic. And I think
that's probably why it's priced sitting right at an all-time high right now, because we're not
the only ones who think that. And that just about wraps it up: why we think Google and Apple
are both undervalued while at all-time highs, and why perception might not always match public market sentiment.
And I think it's just, it's important to continue to evaluate these companies.
Like we said on yesterday's episode, these things matter a lot.
And how AI's implemented really matters a lot.
And it's changing the architecture of the world around us.
So it's an exciting conversation we're going to continue having as we go.
I just, as always, thank you so much for watching the episode.
We really appreciate all the support.
Things have been going well.
In fact, so well that 85% of the people who watch this episode on YouTube are not subscribed.
Which is a problem. So if you are watching this and you are not subscribed, please go ahead, click that subscribe button and also listen to us wherever you find your podcast. One place we're struggling to grow is on the RSS feed on Spotify because it's kind of tough to get people to go there. So if you listen and you want to support the show, please go there. Subscribe on Spotify. Subscribe on your RSS feed, wherever your player may be, Apple Podcast. Even if you don't want to listen, if you're taking a shower, just put it on play, put it in the other room. It helps us out a lot. That's what my dad does. It's great.
So yeah, we just appreciate the support of whatever you can give, all the comments.
It's been amazing.
We're coming off our best month ever, and we continue to keep our foot on the gas.
Thanks to all the support.
So thank you for watching.
As always, we will see you in the next episode, and peace.
