Limitless Podcast - Meta's Ray-Ban HUD: Groundbreaking Tech or a Stepping Stone?
Episode Date: September 18, 2025

In this episode, we cover the launch of Meta's Ray-Ban Displays at the Meta Connect event, featuring a 46-megapixel camera and 18-hour battery life. Ejaaz and Josh discuss the technology's practicality, Zuckerberg's vision, and other devices like the Oakley Meta Vanguard glasses. We explore the future of ambient computing and invite listeners to share their thoughts on these innovations. Join us for a dynamic conversation on wearable tech!

------

🌌 LIMITLESS HQ: LISTEN & FOLLOW HERE ⬇️
https://limitless.bankless.com/
https://x.com/LimitlessFT

------

TIMESTAMPS
0:00 Meta's New AI Glasses Unveiled
2:52 Initial Reactions to the Meta Ray-Ban
5:23 The EMG Neural Wristband Explained
8:01 Exploring the Technical Features
12:00 User Experience Insights
15:27 The Demos: Successes and Failures
18:06 Live Translation Feature Discussion
19:27 The Fail Demos: A Closer Look
30:16 Other Products Announced at Meta Connect
38:09 Are Glasses the Final AI Form Factor?
38:56 The Future of AI Consumer Devices

------

RESOURCES
Josh: https://x.com/Josh_Kale
Ejaaz: https://x.com/cryptopunk7213

------

Not financial or tax advice. See our investment disclosures here:
https://www.bankless.com/disclosures
Transcript
So just last night, Meta had their Meta Connect event where they unveiled three new flagship devices at the core of their AI hardware efforts.
The first one is called the Ray-Ban Meta Gen 2s. They had the Oakley Meta Vanguard, and the product that I think everyone's most excited about, which is the Meta Ray-Ban Display.
Now, basically what this is, a pair of glasses, you put them on your face, and suddenly they have a display in one of the lenses that allows you to see all the compute overlaid on the real world.
So I just want to talk about that first because that seems like the most exciting thing.
Can you walk us through what exactly was announced last night with these cool new Meta Ray-Ban Displays?
So imagine a pair of fancy sunglasses except it's AI.
It has a high-resolution display coupled with something called the Meta Neural Band.
So imagine like a wristband on your arm, something maybe akin to a Fitbit or a Whoop.
And if you make any subtle gestures with your hand or your fingers,
it controls the display that you're looking at through your glasses.
And so you might be asking, well, what am I looking at through my glasses?
Well, pretty much anything you can do with a phone.
You can scroll texts, you can reply to them.
You can watch HD videos.
You can take photos, edit photos, and send them to your friends,
all in real time.
And if you're thinking, oh, this kind of sounds like Google Glass and we saw how that went,
I'm afraid we've been brought into the 21st, maybe even the 22nd century here, where we have a full suite of apps, we have a 46 megapixel camera, and we have almost 18 hours of runtime.
But don't take my word for it.
We have a video here demonstrated by none other than Mark Zuckerberg himself when he walked onto the stage with a live demo.
Check this out.
So for those who are listening, he's sitting in his trailer outside of the stage and he's wearing these glasses, and a calendar event pops up, which says, hey, you've got your keynote now. So he's like,
okay, I guess I need to go onto the stage. He selects a track because he wants to listen to some music
whilst he walks onto stage. So he accesses his Spotify and he plays a song. I think it's called
Enjoy the Ride. And the music is playing through speakers that are installed into the handles of these sunglasses. I know what you're thinking. You're thinking, this sounds obtrusive, right? I don't want people listening to my music on the train or whatever. Well, it's isolated to your own ears.
So no one else hears it but you. It's a really remarkable attempt at a new form factor of
technology where you can interact with different types of social media, music or anything like that.
What you can see on the screen now is he's chatting live with his friends in real time.
He's saying, hey, I'm on my way. I'll be there in about 20 seconds. A friend replies saying,
okay, cool, I'm over here, and sends him a photo of what he's seeing through his Meta Ray-Ban Display.
It's all remarkably cool.
Before I go into the features, Josh, I want to get your initial gut take.
Is this cool to you or is this kind of novelty cringe?
So throughout the presentation, I feel like inside of me there were two wolves.
There's like the techno-optimist part of me that really wants this to be great,
that wants to believe that this is very much the future, this is here right now,
these products are incredible.
And then there was like the more pragmatic, realistic version who has been in the tech world for decades, who is just like, oh my God, this is a toy. This is not real. This is not a serious presentation.
This is like actually a joke. None of your demos work. Even the ones that do, the software is
clunky. The hardware is very much version one. And at times it felt like they were advertising
an action camera versus a real smart AI interface. And that to me kind of rubbed me the wrong way.
So I think in terms of wanting progress, this is amazing. I am very glad they're building in public
and they are doing this in a way that is available to consumers starting September 30th for these
glasses. But also, it feels like it is so early and so half-baked that even if you gave me one
of these glasses for free, I would want to demo it, sure. But I'm not sure I'd actually walk
around and interface with my regular life with them on. But there were some things that were
really interesting. And I think throughout this presentation, there were signs of life. There were
signs of hope that, okay, this very much is a new frontier that they're establishing. There's a lot of
potential here. There's a lot of technology that's built into not only the glasses, but the wristband
that we're going to talk about soon. And that to me was the exciting part. It was more the promise
of what we're going to get versus what was revealed today. And I'm wondering, Ejaaz, you seemed pretty enthusiastic about it. Did you have a different type of sentiment than me?
Okay. So I went through the same roller coaster that you did, which was incredibly
pessimistic when a few of the demos that they tried to attempt live failed.
And then I realized that what they were trying to achieve here,
what they were trying to build, was incredibly ambitious.
And for a V1, this was actually pretty impressive.
And I'll tell you why, right?
My gut at the end of the presentation was,
maybe this isn't the coolest V1,
but for the price of $799, I would buy it.
It has enough for me that I think it would be useful for me to at least try out and maybe see what it can do.
You know what it reminded me of Josh?
What's that?
Do you remember Steve Jobs's keynote for the iPhone 4?
I was like a little teenager at that point.
But I remember him launching it and he did a live demo on stage.
I think it was demonstrating, I think it was like some kind of like new camera feature.
and it flopped.
Safari and FaceTime.
Yeah, Safari and potentially FaceTime, exactly.
And it flopped in real time.
And that was because, you know, there was like Wi-Fi connections or whatever.
And that ended up being okay at the end of the day.
Now, I know this isn't V4 of these Ray-Ban glasses,
but I feel like we can give it a little bit of room to kind of grow.
But the coolest part of these AI glasses, Josh, isn't actually the glasses to me.
It's the wristband.
I think that is super cool. So to kind of dig into what the wristband does, for the viewers: it's called an EMG neural band. It basically lets you scroll, pinch, or even write in the air, and it transcribes straight onto the display that you have within your glasses. And the reason why I think this is cool is that it kind of moves away from the Nintendo Wii controller to something that is a little more accessible to the average day-to-day person, something that is way more useful.
It is also a remarkable invention of technology.
Josh, do you remember when we actually spoke about the research paper that Meta released on this neural band?
I think it was, God, it feels like a while ago, but I think it was literally only a month and a half, maybe two months ago.
When we went through that paper on our show,
it was remarkable how much detail and effort had gone into this. It's evident that Meta had been working on this for a number of years,
and Zuckerberg actually confirmed that on stage.
But even the slightest of gestures can be picked up by this neural wristband,
and it detects muscle movement.
So it's not really, kind of, like, checking your pulse or looking at any kind of sensory things through your nerves. It's just muscle twitches.
And you might be thinking, well, I might use my hand to do something else,
and it might cause the screen to do blah, blah, blah.
It's incredibly intuitive, as we've seen from some of the demos that have been released so far. So that's the thing I'm most excited about, less so the glasses. Yeah, same. So let's summarize. So
basically with these glasses, you get two things, right? You get the pair of wearable glasses and then you
also get this wristband, this EMG band. And the way these glasses compute and connect to the world is
through a tether with your phone. So if your phone has connection, well, your glasses have connection as
well. The cool thing about the EMG sensor that you talked about, I kind of want to walk through the
technicals of it because I think a lot of people don't understand the difference between an
interface like the Apple Watch. If you've used a new Apple Watch, it detects if you're pinching your
fingers. That's using muscular detection. It's not quite using the same technology that Meta is using today. So EMG stands for electromyography. And the band works by using surface electromyography sensors to detect these tiny little electrical signals in your forearm muscles, which is really cool. So it contacts your skin, it detects the signals, and it processes them using a machine learning algorithm to analyze the signals and detect the subtle movements that allow you to interact with the actual device. So I was watching a few demos of people using it.
A single tap with your pointer finger and your thumb is the select button. And then a double
tap with your middle finger and your thumb is the back button. And you navigate the menus by
actually moving your finger around your hand and simulating the actual movement that the band then
understands and intuitively allows you to interact with the device with. So it's this really cool
companion piece to the glasses. And I think in terms of this new frontier of computing, the way that
we interact with it is really important. Like, the last big one we got was multi-touch, where you interact with it actually just with your fingers and it accepts multiple touches at once. Well, the way you interact with this, like, AR (not that these are AR glasses, but this new, like, spatial world of computing), the electro thing is pretty freaking cool. So that got me really excited. And I think
that's the thing I want to test the most is, is how accurate it is and kind of thinking about
how much further they can push that. Because it feels like this technology is certainly not
limited to just navigating a simple menu, but there's a lot more here for them to unpack.
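To make that pipeline concrete, here is a minimal sketch of how surface-EMG gesture detection typically works: sample the electrical activity at the skin, extract simple features over a short window, and classify the window into a gesture that maps to a UI action. To be clear, this is our illustration, not Meta's actual code; the gesture labels, window size, and feature choices are assumptions, and the real band presumably runs a far more sophisticated on-device model.

```python
import numpy as np

# Illustrative mapping from the gestures described above to UI actions.
# The gesture labels and actions are assumptions based on the demos.
GESTURE_ACTIONS = {
    "index_thumb_tap": "select",        # single tap: pointer finger + thumb
    "middle_thumb_double_tap": "back",  # double tap: middle finger + thumb
    "thumb_swipe": "navigate_menu",     # moving the thumb around the hand
}

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic sEMG features, computed per electrode channel:
    mean absolute value, RMS amplitude, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, zc])

def classify_gesture(window: np.ndarray, model):
    """Feed one short window of multi-channel sEMG samples through a
    trained classifier and map its label to a UI action (or None)."""
    features = extract_features(window).reshape(1, -1)
    gesture = model.predict(features)[0]
    return GESTURE_ACTIONS.get(gesture)  # None for "rest" / unknown

# Streaming sketch (hypothetical helpers): sample the band continuously
# and classify a couple of hundred milliseconds of signal at a time.
# for window in stream_emg_windows(sample_rate_hz=2000, window_ms=200):
#     action = classify_gesture(window, trained_model)
#     if action is not None:
#         send_to_glasses_display(action)
```

The interesting design point is that everything above the raw signal is software, which is why the same band can in principle be remapped from menu navigation to air handwriting without new hardware.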
I also want to bring it back to the glasses itself, which is actually a pretty impressive invention
in itself. The amount of technology and compute that they've packed into what is essentially like,
something that is the weight of a couple of feathers is insane.
So I've pulled up a few stats here, Josh, that I want to walk through, because I know what a lot of people are thinking here, which was an initial thought that I had, which is, if I have a high-resolution display over my eyes, how am I going to interact with the world?
I'm going to walk into someone.
I'm going to knock into something.
They're going to be able to see what I'm seeing on my screen.
That isn't actually the case.
The 600-by-600 pixel resolution is only located in one of the lenses.
So you can pretty much have about 85% of your view unobstructed.
It's also a 20 degree field of view.
So whereas you normally would see kind of like everything everywhere through a 180-degree perspective, this kind of limits you in some sense, but gives you a wide enough angle such that you are aware of people that are walking around you and objects that are surrounding you, and you can interact with things in relative perspective.
It has about 5,000 nits of brightness,
which compared to the latest iPhone,
which is around 3,000 nits, I believe, it's considerably brighter, right?
So this is no kind of meager effort or mediocre effort.
This is high-grade, high-resolution technology
for the new age, essentially.
2% light leakage.
What this essentially means is for someone,
let's say I'm wearing the glasses right now, Josh,
and you're looking at me,
and you're like, why are there some flashing lights in Ejaaz's lenses? You actually won't see that at all, because the light leakage just does not occur.
It's all contained within these glasses,
which is crazy because it's an open device, right?
You can move the handles up and down.
There are speakers on these handles.
You would think otherwise, but it's super cool.
And then they have microphones, speakers, and cameras,
all the usual gizmos that you would get with your phone.
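One back-of-the-envelope that ties two of those specs together (our arithmetic, not a number Meta quoted): a 600-by-600 panel spread across a roughly 20-degree field of view works out to about 30 pixels per degree.

```python
# Angular resolution implied by the quoted specs.
display_px = 600       # one side of the 600 x 600 panel
fov_degrees = 20       # quoted field of view

ppd = display_px / fov_degrees
print(f"~{ppd:.0f} pixels per degree")  # ~30

# For rough context, "retina"-grade sharpness is often cited at around
# 60 pixels per degree, so this lands at roughly half that density:
# crisp enough for text and menus, short of print-like sharpness.
```

That squares with the hands-on impressions that come next: good, not noticeably pixelated, but clearly a first-generation panel.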
Yeah, so there's a lot of specs here.
There's a lot of numbers.
I was really interested in the actual usability.
So late last night after the presentation,
I went on X and I was scrolling through
and seeing videos of people actually using the device.
And the way it's used seems pretty unique.
I think the highlight features are the resolution is actually pretty good.
So when you are actually wearing the glasses, it's not very pixelated.
Because of that 5,000-nit display, you can look everywhere but the sun, and it will actually
show you a bright display, so you don't have to worry about going outside or being in direct
sunlight.
And in fact, the lenses are transition lenses.
So as you go outside into brighter territory, it may take a second to darken, but once it
darkens, the display will be very clear and obvious.
So it seems like in terms of the actual user experience of the device, it's pretty good.
So long as you don't mind the limited functionality of the device and using Meta's ecosystem,
which we could probably talk about in a little bit.
But Ejaaz, I know we have a couple of demos that we want to show, or maybe the lack thereof. Some worked, some didn't. Which one are we going to start with here?
Let's start with the ones that actually worked.
Okay, this one is cool.
I loved this one.
Exactly, yeah.
So one of the coolest features from the presentation was the fact that you could make subtle gestures for writing words on any kind of platform.
In this example, it's Zuckerberg writing on a kind of very, very conveniently placed pedestal which is right next to him, and he's writing out words, and it's appearing in actual text in his text chain, in his WhatsApp chat that he's having with someone live on stage. And this is all done live. So if you look at this video, his hand is casually placed on this pedestal and he is just writing out words, as if you would write in a notebook or a textbook, for example, and it appears live on his screen in his WhatsApp chat, and then he clicks send.
And what's interesting here is during this demo,
he kind of makes the case that whilst you're talking to a human
or whilst you're listening to a human,
you might want to make notes.
You might want to interact and have a conversation with someone else.
And what he's suggesting is you can do this subtly
because you can just use your hand and this EMG neural band.
I don't know if I agree with that.
I already get annoyed when people are wearing AirPods in real time
when I'm having a conversation with them. I don't care if you're not listening to music. It's kind of
disrespectful in my opinion. So I don't quite agree with that particular use case, but I think it's
cool that I don't have to type out, that I don't have to whip out my phone and tap with my
thumbs. I could just kind of make subtle gestures. Yeah, this was one of the few things that it rubbed me
a little bit the wrong way. The first one being when he walked out and he kind of led the conference
and presentation with like, hey, this is the convergence of superintelligence and hardware.
And this is a very far cry from superintelligence, and using the S word to lead the presentation,
it felt a little disingenuous and not real. This was the second one where, okay, you have these
glasses, which it's bad enough that now if anyone walks up to you with a pair of glasses,
the societal effects of that will be a little awkward, like, hey, am I being recorded?
Like, I know even if you're not going to record, is Meta recording? And now am I going to be part of Meta's database? But in addition, during this demo, he pitched it as a multitasking
device, which I thought was really interesting to use the word multitasking when you're already
going up against so much friction in this human-to-human interaction, where now not only are you going to be maybe perhaps recording the person across from you, but now also kind of distracted by
this device on your face where now, oh, and not only am I barely listening to you, but I'm
looking at this display because when you look at the display, your eyes kind of wander a little bit,
you don't make direct eye contact. And you're also like, oh, I'm also kind of writing on the side
and I'm doing these things. And he's kind of digging himself a deeper hole by pitching it as
this multitasking device, this intrusive device. It's just, it doesn't really sit well with me in
terms of how I want to use it or how I would suggest other people use it. So I don't know, the pitch, again, a little weird, but the demo was great. And one of the things that he bragged about
was I think he said it was 30 words per minute he was able to achieve with the actual handwriting.
So it creates this cool thing where there's this new acquired skill set. You'll need to learn to
engage with these devices, writing being one of them. And I think that's a fun thing. It's like,
okay, cool. We have this new tool.
We could learn how to use it, learn how to optimize it, learn how to extract more value out of it.
But yeah, a little weird.
I just remembered that he announced a new feature with these glasses, which was conversation focus mode, which is antithetical to the case study that he's pitching on stage right now.
For those of you who didn't catch it, it's basically like a button or a gesture that you can make, which kind of like drowns out any other surrounding noise and amplifies the voice of the person that's speaking in front of you.
So let's say you're in the room with me, Josh, and I'm speaking to you and you're like,
whatever, two feet, three feet, whatever away from me.
But, you know, we're in New York.
There's a lot of traffic.
There's a lot of construction.
It drowns all of that out.
And your voice is amplified into my ear through the speakers in the handles.
So it's kind of like this push and pull of, like, what are you trying to achieve with these glasses?
Is it multitasking or is it kind of focus?
Or is it both? It might work. I don't know. But I'm not convinced. Yeah, you mentioned this earlier,
the Steve Jobs demonstration. I think growing up watching Steve do these presentations has permanently
tarnished all future presentations because they were really very, very focused. You left with a very
clear vision of the future. They delivered on a very clear value in the present. And a lot of this
presentation left me not only like feeling like it was kind of like not a serious thing, but
wondering where, where's all this going? You're kind of seeing these small signs of life and
small signs of this AI convergence with hardware, one of them being the demo that you mentioned,
where it will actually pick up a voice and it will automatically amplify that voice. It's a smart
feature, but it's kind of pitched in this weird, convoluted, confusing way where I'm not quite
sure what I would use it for today, being someone who doesn't use meta's ecosystem. And I'm not
quite sure what I'm going to use it for in the future, because there really wasn't a clear vision
of where it's going, just like these things are going to get better. But I mean, even if the
voice amplification or the live translation gets better, like, I still... it's not going to do it for me. But there's another demo on screen. This wasn't the only demo that
worked. Here's the second one that worked. Yeah, yeah. It's a live translation feature. So for all my
subtitle reader fans out here, I'm one of you. The ones who read the subs, despite the movie being
in your native language or in English or whatever it might be, this is the ideal feature for you. It is
basically live subtitles as the person in front of you is speaking. So it appears on the high-res display. So
for those people who kind of like mumble or who talk quietly, no more. You'll understand anything and everything that they say in real time. It's pretty cool. I don't know if this is groundbreaking to me, but it was one of the few demos that actually worked. So we have to talk about it. There was, there's an interesting thing where you could see how this would be really cool for live translation in regards to like different languages. We saw that recently with the Apple event. We saw it again with the Google event where this live translation is becoming a cool thing. So this particular example was English.
which left a little to be desired.
But I think the actual technology is pretty cool.
The latency, it appears like it was pretty quick, so it's almost real-time translation.
But again, it's like, okay, do I want an $800 pair of glasses to amplify the person I'm talking
to and then give me subtitles?
It's like, I think these are a little uninspiring for the potential of this technology.
And they should have demonstrated it with a different language.
I looked up the features after this stream, and apparently German and Portuguese are available for live translation. So why not demo that in the demo? It's just surprising. And why only those two languages, too? That's bizarre. Going into your, maybe both of our, favorite topic: the fail demos.
There were some pretty, pretty awkward fails. So I had, I was looking at my notes. I was taking
live notes last night as I was watching this. And I have a line in my notepad that says,
OMG, I can't believe this is real. And it was happening as this demo was going on. My, like, skin started to curl as I was watching this because it was so awkward. And I love that they did live demos, and, like, I trust Zuck and the fact that, like, this works, sure. But, like, my God, that was tough to watch. It was super tough. And you know
why it was a double whammy? Okay. Well, firstly, let me give you some context. For those who are
listening, on the screen here, we have Mark demoing a live video call that you can execute from these
new glasses, except the call never went through. He got called maybe five times from the person that
he was trying to demo with, and he was unable to pick up. He kept saying, answer the phone, or he would
gesture with his hand to answer the call, and it just would not work. The reason why this is a
double whammy is the app they were using was WhatsApp, which is their own product as well. So it really wasn't a good demonstration of the technology or the app. Now, to kind of give them a bit of slack: there have been Wi-Fi issues that have failed live demos before. But you're Meta. You spent $15 billion, no, $25 billion to acquire 100 of the best AI researchers. Can you spend a couple of that on maybe some better Wi-Fi infrastructure? I'm not trying to
shit on you. I'm just trying to make a valid point here. If you're going to live demo something
that is a new form factor that is meant to kill the iPhone, you need to kind of like be on it.
Yeah, the live demos, I actually really admire, and I appreciate the fact they did them, because Google and Apple have recently strayed away from them, and I think it's just, it's kind of gross. Do it in real time. Let's see that it works. The thing for me is that even if all of the live demos worked fine, like, I actually
works. The thing for me is that even if all of the live demos worked fine, like I actually
don't care that they failed. I believe that they do work and I believe that they work well. But even
if those features do work, I still don't really care. I'm looking at him use the interface. I'm
looking at him navigate around, and I'm realizing, well, by signing up for these glasses, you're signing up for the Meta ecosystem. And that is a place where I have zero roots, zero allegiance, and zero interest in actually using.
I'm seeing him, he's using WhatsApp, and he's using Facebook Messenger,
and that's where all these messages are kind of populating.
And I'm like, this is not a device for me. And how many people is this actually a device for?
It's funny, you guys, I don't have glasses. I don't wear glasses. For this episode, I found a pair and I put them on because I was like, well, what does it feel like
to actually wear glasses?
Because I never do.
And it's just like, eh, if I have to wear an $800 pair of bulky glasses that I don't
think look that great, just to translate some subtitles in a native language
that I already understand.
It's like, okay, like, where are the really cool use cases that get me fired up? They were MIA.
Okay, I'm going to do two things now.
I'm going to give you some counterarguments to what you just proposed, because I kind of disagree with a few things.
Because I kind of disagree with a few things.
And then I'm going to kind of agree with a few things
and give examples of where they actually succeeded in the end.
Okay, what have you got? So firstly, remember when the AirPods first came out,
and people were like,
oh, these kind of look weird.
I like my wired headphones.
Now everyone and their mom wears them, right?
And the reason why is because they were just about cool enough
to make it without looking like chunky metal blocks in your ear.
I think the same is going to happen with glasses.
And to be honest, if I look at the progression from Google Glass,
which was this futuristic cringe-looking thing,
to where we are with these new Oakley and Meta Ray-Bans, I mean, it's literally in the name. It's Ray-Bans, right? Ray-Bans have been known for killing the sunglass game for decades now. I think we have a good shot at maybe producing a consumer wearable that
anyone and everyone will be cool with wearing, right? The second thing is I actually like the live
demos, and I think at the end of the day, I'm still optimistic about it, because I know Meta will probably work through the bugs. They're going to fix all of them, and it'll probably end up being a coherent, usable product. The other thing is, it's affordable.
Now, they didn't do the Apple thing where they were like, this is going to be thousands of dollars, or whatever the premium pricing is for whoever at this point. It is something that
is affordable, at least for most people, in tech, to try out. And I think the audience that they're
going after is, you know, these Gen Z kids, who want a MacBook, who want their iPhone, who want a bunch of different things. You know, they're willing to kind of spend this extra money to try
this. Now, where I agree with you heavily is the ecosystem. Yeah, I don't use WhatsApp. It's just
Instagram. So when I think about being plugged into Meta AI and the quality of their AI assistant, which, I don't know about you, Josh, but I've never actually used for any length of time outside of experiments. I'm not convinced.
This kind of feels like me using some low-grade AI assistant. Dare I say Siri? Shout out Apple. Sorry. But I don't think it's coherent or usable enough for me to be really invested in these glasses. I think the object, the device, is super cool, but I think the technology and software, which at the end of the day is going to be the thing that scales it, the thing that puts the nail in the coffin of the iPhone, if that ever happens, is still unproven and yet to be built out. And I'm not quite seeing
it with this release. Yeah. So there's two points that you made that I want to talk about,
because I think they're both interesting, is the Google Glass reference, because Google Glass
came out in 2013. That's 12 years ago. And a lot of the times when we're kind of analyzing these
companies, we're looking at the rate of acceleration, the rate of improvement year over year, as things get better, as a way to kind of judge future returns. So you just kind of assume that
they're on the flat part right now of the exponential curve. But the progress that is being made year
over year is really marginal relative to what it should be, where I can see the same presentation
happening four years in advance with kind of like iPhone level incrementation, like, oh, now
the camera instead of 3K, it's 4K. Oh, it has a wider field of view. Oh, it has double the pixel
count in the displays. And it still doesn't change the fact that it's not an interesting product
because it doesn't have the software ecosystem to back it up. And then the second thing was on price,
which is $800. It's like pretty affordable for what you're getting. It's like you're getting
these cool glasses with this display. You're getting the wristband. But do you not think that's subsidized in an attempt to gain users early, before other companies release actually good hardware? Because, I just bought... Apple has the Apple Watch Ultra 3 that drops tomorrow on Friday, and I paid $800 for a watch, just a single wrist strap. And yeah, it's made
of titanium and it has the cool chips, but it's, it costs $800. And that is from Apple, who has a
very well-established supply chain and manufacturing capabilities, and prices at reasonable
margins because they have a lot of competition. So for Meta to price a device this compelling in terms of frontier technology, I strongly suspect there's no way, including all the R&D costs and everything it took to get this thing out the door, that it really costs less than $800. This device absolutely costs more than $800. They have the wristband. They have the glasses. It's like,
I very strongly suspect that they are subsidizing the price in order to gain users to sync them more
into the ecosystem before, I mean, like I said earlier, before other companies actually make good
products. If Apple released a pair of glasses that were even marginally close to this, I would spend
twice as much. Even if Google did, I would probably spend a lot more because Google makes good
hardware products. The reality is, with Meta, this is just, like, a mediocre hardware product.
And they're probably doing all they can to gain users early, which is like maybe a hot take
in terms of the mediocrity. Because, I mean, again, I don't want to judge it without using it.
Because we haven't used it. And all the people who've tried it said it's remarkable, it's really cool. So I think we should probably reserve final judgment until we get to demo it.
But just based on first impressions, it's like, okay, this is probably their strategy.
Let me steelman what Meta's argument might be, right? Uh-huh. So, firstly: they haven't been in the hardware game
meaningfully. Yeah, they acquired Oculus
and they've gone down the AR/VR route,
but this is the first, I mean, correct me if I'm wrong,
like home-grown product.
Like, they have been working on this neural wristband
for years now, and it's their first kind of foray
into a new form factor by far, right?
So maybe pricing it at a cheap enough point
where the hardware is kind of gimmicky but good enough
is good enough to get distribution to a loyal fan base,
of which they already have tons of online users, right?
And then number two, I'm thinking maybe this is a problem that solves itself
just by chucking money at it from the manufacturing perspective.
That might be a super, super naive take,
but I'm guessing that that might be a strategy that Zuck has thought about,
and we might end up seeing a bunch of infrastructure or hardware,
manufacturing, supply chain partnerships over the next couple of months,
remains to be seen. Now, if I'm to look at your point around Apple, I would say that the reason people would pay $800, or maybe even double that, for the new watch is because you have ecosystem lock-in. It's because the software experience and the app experience is so damn good. And that's
Apple's leverage at this point. It's this kind of like coherent system where if they released an AI
hardware product tomorrow, I'm there. I'm buying it. All my friends are using the Apple ecosystem anyway, right? So I'm still an Apple fanboy, and the Meta AI stuff kind of icks
me, but I can appreciate the attempt that they're going for. And if they have even one, Josh,
okay, imagine this, just one novel use case that goes viral. And let me make a guess at what that
might be. Please. It is going to be some form of streaming. Streaming content is becoming
super popular amongst the Gen Z sports people. Social media is everywhere.
If I can stream a video or stream my life in general, clip the cool clips, post it on my social media, it goes viral, that might be good enough to sway me to use it.
Now, are all demographics of people using this? Probably not. Are you going to walk into your Wall Street job with these glasses on and stare at the charts?
Maybe 10 years from now, but certainly not now, right? I feel like this appeals to maybe a younger audience. That might be why Zuck is wearing gold chains every now and then.
But I feel like that's the, that's the demographic that he's going for.
Kind of like Gen Z, teenagers, social content people.
Yeah, we're not even talking about the outfits he walked out in on stage. We'll save that for later.
There were, like, two more things, I guess. One is the AI that runs locally on the device. I'm unsure, Ejaaz, if there is much of it. Like, I'm very interested in the intelligence, because he led with that superintelligence line.
He used the S word.
But the intelligence seemed pretty rudimentary.
It was kind of like, they showed a demo where they were in a surf shop and you look at a surfboard
and they're like, oh, here's an email where you talked about a surfboard.
And it feels kind of like what I would imagine Siri and Apple Intelligence would be, which is bad, but moderately helpful.
And then also, there were other things. We should probably round this out with the rest of the stuff that was announced during the presentation, because it wasn't just this one set of glasses. There were a few things.
We could kind of speed run through them if you'd like.
The first one, oh, here we are.
Yes, the Oakley Meta Vanguard glasses.
So prior to making this big announcement of the displays, they released a few glasses.
And to me, this felt like, so I love photography, videography, camera stuff.
This felt like I was watching a GoPro announcement.
It was an action camera.
They partnered with Red Bull to do a promo video.
It's like, it feels like a really bizarre anomaly for a company like Meta, who's going for AGI, to drop an action camera attached to glasses.
A friend of the pod, host of the pod, David Hoffman, he messaged us.
He said, I'm getting a pair of these.
This is my pair.
David loves climbing.
He loves, I mean, presumably, recording the climbing, and uses a bulky GoPro setup. So this is cool for people like that. A few specs on these glasses, because there were some interesting, noteworthy things for these. They're targeted at athletes, their first sport-specific glasses.
They are IP67-rated, which means they're dust-resistant and water-resistant. You can wear them,
get them wet, drop them in water. They have a 12-megapixel camera with a 122-degree field of view, which is pretty wide, for a lot of people who aren't familiar with that. The battery lasts up to nine hours, and it captures 3K video, for $499.
So the 3K video I thought was funny.
It's like, okay, why?
Like, people don't call 1080p video 2K, it's just 1080, even though it's 1920 by 1080. And for 4K, it's 3840 by 2160. So, like, the 3K is just interesting marketing.
I guess it sounds cool.
I don't think I've ever heard anyone market it at 3K before.
Yeah, because like no, like you do 1080p or 4K.
Like, you're just doing this weird incremental thing that I don't love, but whatever.
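For anyone keeping score on the naming, the "K" labels roughly track the horizontal pixel count, and a quick bit of arithmetic on the standard tiers shows where a "3K" frame would have to sit. Meta's exact frame size isn't something the presentation specified, so the tiers below are just the usual reference points.

```python
# Pixel math for the standard video tiers mentioned above. A "3K" frame
# would fall somewhere between these two; the exact size is unspecified.
tiers = {
    "1080p Full HD": (1920, 1080),
    "4K UHD": (3840, 2160),
}

for name, (width, height) in tiers.items():
    megapixels = width * height / 1e6
    print(f"{name}: {width} x {height} = {megapixels:.1f} megapixels")
```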
But the glasses, I mean, here they are.
They look interesting.
It's like, my past self, who was playing baseball games in the outfield, would love this. I could get some live content while catching a fly ball.
But I'm not sure.
What do you think of the camera smack bang in the center?
So ridiculous.
This is like, this was not a serious presentation.
Like, it really, like, I don't know how to say it.
It makes you look like a cyclops.
What is this?
But, like, this was a presentation that needed to be made in two or three years.
This is like not a presentation that should have been made today.
And what we're seeing is just like, this is not, I mean, it's a cute little action camera from,
God knows how many hundreds of billions of dollars of R&D budget.
So like, okay, cool.
They also did this not just with Oakley, but with the Ray-Bans as well. They released the Ray-Ban Meta Gen 2s, which are like those trademark... Yeah, here they are. Those, like, trademark glasses you see everywhere.
These look a little more compelling because they're just a little bit smaller.
They start a little bit cheaper, at $379.
But again, like, okay, they'll talk to you and you can kind of ask them questions.
There's no display integrated in them.
It's like it's a little bit better than the last generation.
And that was it for the hardware. It was three pairs of glasses, one wristband, and a lot of enthusiasm and open-ended questions.
I have a question for you, Josh, before we round up.
And this is a very important question, so think about it.
Are glasses the final form factor for AI consumer devices?
No.
I don't think so.
I think the final form isn't a single form.
And I think this is something that is going to take a while to train people's brains
because we've become so accustomed to relying on single devices like our iPhone to do everything.
One of the benefits of AI that we get is a lot of context that is distributed across devices.
So if you think of, like, Ejaaz as being a singular profile that exists in the cloud, that profile can be secured, and it can be full of context, and it can be uniquely you. And that profile can be carried to a suite of devices that isn't one singular iPhone.
So in the case of this new frontier, these new types of form factors,
glasses are absolutely not the final form, because there is no way in hell I'm going to be wearing a pair of glasses to, like, a private dinner, chatting with people and just letting it record. It's just, it's too intrusive. But to have a whole suite of devices: you have your phone, you have a display on the wall,
you have glasses on your face, you have smart AI in your earbuds, you have a little puck
on your desk. I think the future form factor is really varied. It exists in a lot of different
forms, but it uses this one type of intelligence, which is AI, this context window. And the term that
people are using for that is ambient AI, ambient intelligence, which just kind of exists in devices around you. It's very portable, it's very modular, but it's not locked into a singular device.
So while I think glasses are part of that ecosystem, I do not think they are the single form that
will lead the way in terms of replacing something like an iPhone. Agreed. It won't be the ultimate
form. I think the ultimate form, and you and I probably both agree here, will be some form of, like, chip in the brain, and the projection of the display will be on our actual eyelids, on our actual eye lens. On your neocortex. Forget the eyeballs. Those are lame. Okay, there you go. So we have basically
an AI brain that does all the cool functions for us. But that's going to take decades, like you said.
So in the interim, what do you think is the new cell phone moment? Could it be glasses, or is it the puck-like device that we discussed in the past?
I'll give you my take.
Yeah, let's hear it.
I think glasses are a fantastic step in the right direction. And I think they'll be one, if not the only, intermediary device that's going to work, at least for the next decade.
And here's my reasoning: you need something visual that you kind of can wear but that doesn't bother you too much. I think AirPods demonstrated that we're okay doing that with one form of medium, which is music and listening and calls, that part of our social life. But visual never really got upgraded from staring at a metal slab and having to pick it up, or having to go to a bigger metal slab. I'm gesturing to my laptop that's in front of me.
It's kind of clunky.
So I feel like something that follows me everywhere... yeah, glasses kind of seem annoying, but I don't know. If you can do it in a way where my vision isn't impaired too much, where there's no, like, crazy filter or colored filter, I could wear it. Just make me look cool and I could wear it.
That sounds good, right?
The other thing is, last week we spoke about Apple's new iPhone Air.
I have mine arriving tomorrow.
I cannot wait.
We're both getting new phones tomorrow.
Let's go.
We're both getting new phones.
And they're actually both different models, so we can demonstrate that live. I'm getting the other one.
But the amazing part of the iPhone Air is that the entire computer, the chips, everything aside from the battery, was contained in just the camera slot that you would normally see on your phone. So it's just in this tiny square. For those of you who are looking at this screen, I'm holding up an iPhone, and the camera part of it, the plateau, right?
That suggests to me that they are also heading towards a world where they're going to release some kind of small device, most likely
a wearable or something that you can wear on you, that will do something similar to what
these glasses are doing. It kind of indicates that glasses might be the thing. I might be getting
ahead of my skis on this one, but I think it's the ideal form factor for the next decade.
No, that's right. Yeah, I think glasses are, I've spent a lot of time thinking about it,
and the mistake that I caught myself making was thinking that there will be another iPhone. And I really, I genuinely don't believe there will be another iPhone. It is, like, the singular moment in history that will not be replicated, because future technology does not enable it to happen.
The glasses are awesome and I very strongly agree that everyone's working on them and everyone
will be wearing them.
But it is part of a larger ecosystem that kind of exists in this ambient space versus a
singular device.
And I think that's what Meta is going for. I mean, that's why they changed their entire name from Facebook to Meta. They are building these devices for a metaverse, for a reality that is multimodal. It exists in this, like, meatspace that we're in today, but also in an overlaid world that is littered with virtual and digital avatars and worlds.
So that's kind of where we land with Meta. It's like, okay, cool. Today, Ejaaz, are you getting... you're getting a pair?
I'm going to get a pair.
And I just want to point out, Josh, you're still wearing the glasses that you were demonstrating earlier.
So I'm just saying, I'm just saying maybe you may not care about it.
They're comfortable.
I can see why people wear these all day, even if they don't have a choice.
Yep.
But these glasses come out on the 30th of September.
Yes.
They are priced at $799, which is for the fancy new thing, the one that comes with the neural wristband.
So you're getting two devices for a singular price of $799.
And if that's still out of your price range, you can get the Oakleys at $499 and the Ray-Ban Gen 2s at, I think it was, $375. Something like that. $379? Okay. Cool. So much, much cheaper than the Apple ecosystem, but, you know, they come out on September 30th, the new ones.
We were yapping today. We're at 45 minutes long. If you made it to the end, thank you for
sticking with us. Guys, the comment section and the reaction to some of our most recent
episodes have been insane. Like, whoever's been sticking to the end, whoever's been sending comments and messages, it's been so generous. And I woke up this morning and I messaged Ejaaz, like, just go look at it, because it was really, it's so special. So we appreciate you, not only sticking to the end of a 45-minute-long rant about the new frontier of this Meta hardware technology, but also just engaging with all the content. We got a bunch of new
subscribers, we have a bunch of comments. Thank you. We are so excited to wake up and record these
episodes. We have so many more coming down the pipeline. And yeah, just appreciate you sticking
with us. So again, as always, if you enjoyed, share with your friends. Any parting words,
Ejaaz, anything to leave the people with? Leave more comments. Buy these glasses.
Or will you not buy these glasses? Tell us if you're not going to buy these glasses.
Tell us why it's not cool.
Tell us why it's impractical or tell us why it's the next best thing.
We don't know.
I'm curious.
Yeah, prove me wrong.
Tell me why it's sick and, like, why I didn't waste an hour of my time last night.
I'm kidding.
It was cool.
It was cool.
Techno optimist.
Keep shipping in public.
We weren't going to do anything else anyway last night, Josh. We're nerds.
That's so true.
It's so true.
But yeah, product teams, keep shipping, keep spending those billions of dollars.
We are building a brighter future.
So as always, thank you for watching.
We appreciate it.
And we'll see you guys in the next one. Peace. See you guys.
