TRUNEWS with Rick Wiles - TruNews Presents New World Tech Week - Blurred Lines: Augmented Reality in Everyday Life
Episode Date: March 11, 2022
Augmented reality and mixed reality can be described as a user's virtual experience combined with what they physically see, hear, feel or smell and the world around them, but sometimes this can be too much for any one person to process. IoT devices capture enormous amounts of data from the physical world and AR can compile all that data, analyzing it and providing back a condensed set of data to the physical world for people to view and interact with. Join us for this panel discussion on how AR can be used to improve safety, increase efficiency, share knowledge, and deliver real business benefits. Rick Wiles, Doc Burkhart. Airdate 3/11/22.
Transcript
The following program is made possible by the faithful prayers and financial support of listeners just like you.
To find out how you can help, visit www.truenews.com.
Nothing but the truth, so help us God.
I'm Rick Wiles. This is the wrap-up of New World Tech Week on True News.
We're providing you with exclusive video highlights from last week's Mobile World Congress in Barcelona, Spain. Now, today,
our topic is augmented reality in everyday life. This is going to be a really interesting
session. Doc Burkhart has all the details about what you're going to see and hear today
on True News. Thanks, Rick. Well, today on True News, we confront the coming reality of
augmented reality, where the blending of the virtual and real world is already taking place.
Now, augmented reality and mixed reality can be described as a user's virtual experience combined
with what they physically see, hear, feel, or even smell, and the world around them. But sometimes
this can be too much for any one person to process.
IoT devices are capturing enormous amounts of data from the physical world,
and augmented reality is capable of compiling all that data, analyzing it,
and providing back a condensed set of data to the physical world for people to view and interact with.
Now, on the panel discussion today, discussion is made on
how augmented reality can be used to improve safety, increase efficiency, share knowledge,
and deliver real business benefits. Those are the good selling points. Long-time viewers and
listeners will be observing with discerning eyes and ears of the dangers ahead for all of humanity
as the tech companies move forward to eliminate this reality with the reality that they're building for us. Our coverage of Mobile
World Congress will begin immediately after this urgent announcement from Heaven's Harvest.
Being prepared means thinking long term. Yes, you may have your immediate food and water needs
supplied, but what happens if there's a long-term crisis, or supply chain issues last months or even years? Do you have a long-term food solution?
Right now, our good friends at Heaven's Harvest are offering their premier Heirloom Vegetable Seed Kit
for the affordable price of only $139.99. Now, these are not your standard off-the-shelf retail seeds.
These are non-hybrid, open-pollinated, non-GMO seeds.
Heaven's Harvest offers 39 different varieties, all specially selected from high-quality,
established suppliers. And these heirloom seeds are meant for the long haul, having a 10-year
shelf life. Each seed selection is packaged in durable, resealable Mylar foil bags.
You know, right now is the perfect time of year to start planting.
And with the way things are looking in the world today
and the prices at the grocery store skyrocketing,
you'll be needing food by the end of summer.
Your Heirloom Vegetable Seed Kit is available now from Heaven's Harvest for only $139.99.
Use promo code SEED for free shipping.
Now, this promo is available only for viewers and listeners of True News.
Once again, your Heirloom Vegetable Seed Kit is available now from Heaven's Harvest for only $139.99.
Use promo code SEED for free shipping.
And once again, this promo is available only for viewers and listeners of True News.
Selling fast while supplies last.
Call today: 1-800-516-4773. That's 1-800-516-4773.
Or visit heavensharvest.com.
Hey, we will start our coverage of Mobile World Congress, Barcelona, Spain, in just a minute. But first, I want to encourage you to do what thousands of your True News family members have already done,
and that is to order a copy of Final Day.
This second edition of paperback has been flying out our doors.
It really has caught me by surprise how many people have ordered this book
in the first two weeks that the second edition in paperback has been available. It's $19.95 plus shipping, or a
personally autographed copy for a $100 donation to True News. I've said it many times, it's not a
book about the last days. It's a book about the last day. It's 10 chapters about the characteristics
of the Lord's second coming. I'm going to sign some,
three more, and the first one goes out to David. So David, thank you so much. I appreciate your
support of True News. David, thank you.
That's yours.
It's going to the mail very, very soon,
within minutes from now.
The next one goes to Laurie.
Laurie, God bless you.
Thank you so much. I always add John 6:54.
I just encourage you to read that verse.
It's one of my favorite verses.
Promise from the Lord.
And the next one goes to Dan.
Thank you, Dan, for supporting us.
All right, Dan, that's your copy.
All of these are headed to the post office within minutes.
I want to read a few paragraphs from the book.
And what am I going to pick out here today?
All right.
This is the beginning of chapter four.
His second coming shall be visible.
The Holy Bible teaches that everybody on earth will see the second coming of Jesus Christ.
The first chapter of the Apocalypse says,
Behold, he is coming with clouds, and every eye will see him,
even they who pierced him.
And all the tribes of the earth will mourn because of him.
How could that happen?
Presently, there are over 7 billion people alive on earth.
How could 7 billion people see the same thing
happening at the same time? In the heyday of religious broadcasting via satellites,
prominent Christian pastors, Bible prophecy teachers, and religious broadcasting executives
often said global television broadcasting was the explanation. They surmised that the second
coming of Jesus would be televised worldwide
in real time. Broadcast TV stations and satellites, however, are quickly fading into the communication
technologies of yesteryear. Internet streaming is currently the up-and-coming technology for media
content delivery systems. That too will change someday in the future. Regardless, all
three video communication technologies, broadcasting, satellite, and streaming, would require television
news networks to fix their cameras on an incoming object in the sky. Jesus would have to slow his
descent considerably as he approached the earth. Control room operators in the studios of television news
networks would need sufficient time to zoom cameras in and follow his entry into our atmosphere like a
UFO landing in New York City's Central Park. Using media technology to explain how everybody on the
planet could see Jesus when he returns also mandates that all seven billion people are awake
and watching television or a mobile device at the same time. Forget this idea. It won't be on your
TV or your mobile smartphone. How will everybody see Jesus returning to earth? The answer is quite simple. The answer is in the book. Okay. It's right there. I explain it.
It's so simple. You will, after you read it, you go, oh my, why didn't I see that before?
Why hasn't anybody preached on this? It's so simple. It's amazingly simple. But the
answer is in the book. And you will just be stunned that nobody has preached on this for
decades and decades and decades. So get the book, Final Day. Again, it's $19.95 plus shipping.
$100 donation to support True News if you would like my personal autograph.
Thank you so much for supporting us.
Very, very grateful.
Doc and I will be back in the studio on Monday.
And God willing, the world hasn't blown up with World War III.
And I don't think it's going to right now.
There's too many lost souls.
The Lord's not done saving souls.
We've got a job to do.
That's the only thing I think about day and night: lost souls.
How can we reach them with the saving message of the gospel of Jesus Christ?
I appreciate you being a partner with me.
And when you buy this book, you are supporting me in that mission. All right, let's go to Barcelona, Spain, our exclusive coverage.
...the world in a new light.
And with our different perspectives,
we can accelerate opportunities to create value that lasts.
How will you take advantage of the connections you make this week?
How will you see
connectivity
unleashed?
As the boundaries blur between online and real life,
we continue to rely on digital innovation and connectivity
to transform our world.
From the metaverse and NFTs to connected devices and IoT,
this year's MWC themes will spark bold new ideas
to create a positive future.
As an industry, we pave the way for transformation
and find new ways to help other industries innovate.
And now, together, we continue our exciting 5G journey.
Delivering on our industry's commitments to tackle the mobile internet usage gap
and reducing our industry's carbon footprint, creating a better future for all.
Because together, we will unlock the full power of connectivity
so that people, industry and society thrive.
This week, in our iconic host city of Barcelona,
we will put our convictions to work for our businesses, for our next generation of leaders, and for the world. Are you ready?
It's time. Good afternoon, everybody.
Welcome to this panel, Blurred Lines, Augmented Reality in Everyday Life.
We've got a great set of panelists covering from the software side through to the tech side,
all related to AR, to share some insights with you today.
Quick intro to me. I worked at the GSMA for a long time, and I recently joined Ookla.
Ookla, many of you will know, from Speedtest.
Many of you will have the Speedtest application on your smartphone.
But Ookla do a range of other things, too.
You might be aware of Down Detector. That's also part of Ookla.
If you want to hear more, come and visit us at our booth in Hall 2.
We'd be pleased to talk to you.
So, augmented reality, you know, the hot topic of this year's
Congress is undoubtedly the metaverse, Web 3.0 or spatial computing. All of these things,
they're essentially the convergence of three main tech areas. One is the evolution of networks as we move towards 5G and standalone networks and the capabilities they provide.
The other is the device and the form factor and the platform over which services are rendered.
And as we move towards more wearable forms of tech.
And finally, where the compute is done:
the move to the cloud, the move to the edge, moving off of the device to some degree.
And what role will AR play in this?
What role is it already playing?
How quickly will adoption happen?
These are all questions that I've had over the last few years. We've
had false dawns of AR and VR in the past. So what's different now? What are enterprises
and consumers using AR for, and what will they use AR for? I'd like to welcome the panelists
up to the stage if I can. And we have Ed on the line.
Sorry.
So welcome, everybody.
We have Maria Cuevas, who works for BT.
Jesse Stocks from VMware,
and Philip Langraff from HoloLite.
And on the line, we have Ed Tang from Avagant.
Now, guys, if we can just run through a quick round of introductions.
I've already done mine, but over to you, Maria.
Thank you very much.
I'm Maria Cuevas from BT,
here representing the research department in BT,
but also the whole of the activities that we do
across the business, across consumer and enterprise:
on 5G, the work that we're doing in, obviously, our 5G network
to enable these new 5G use cases,
and maybe to share with you some of the recent experiences
that we've had in that space.
Thank you.
Thanks, Maria.
Over to you, Jessie.
Hi, Jessie Stokes.
I'm a product line marketing manager at VMware.
For those of you who aren't familiar with VMware,
we're probably best known for virtualization,
but we offer solutions that span across multi-cloud,
apps, security, and the digital workspace.
In my role, I specifically focus on the go-to-market strategy
for our end-user computing business and Workspace ONE and how it can enable IT to securely manage devices across any use case and really create an exceptional end-user experience.
Really excited to be here today to talk about what we're seeing in the enterprise and what that means for the future of work.
Brilliant. And over to you, Philipp.
Yeah, sure. I think Holo-Light I need to introduce quickly because we are, I think, the smallest company here.
You don't know us yet. So what we do is we started six years ago with industrial AR applications.
We found two major issues. First was data security.
We were just not allowed to load CAD data on the mobile devices.
The second one was performance, right?
We solved this.
We built a streaming SDK,
which enables every XR app to be streamed.
And this is now our mission.
We enable XR streaming for everyone.
We do this by a great cloud platform, by easy-to-use
SDK, so all of you can have streamed experiences instead of on-device ones.
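The remote-rendering loop behind XR streaming is worth sketching out: the headset continuously uploads its pose, a remote GPU renders and encodes the frame, and the device decodes, reprojects, and displays it. What follows is a minimal, simulated sketch of that generic loop, not Holo-Light's actual SDK; every class, name, and number in it is a hypothetical illustration.

```python
"""Minimal sketch of an XR remote-rendering ("streaming") loop.

A generic illustration of the architecture described on the panel, not
Holo-Light's SDK: the headset uploads poses, a remote server renders and
encodes frames, the headset decodes and displays them. Hardware and the
network are simulated so the sketch runs as-is.
"""
import time
from dataclasses import dataclass

@dataclass
class Pose:
    t: float      # capture timestamp
    yaw: float    # simplified single-axis head rotation, in degrees

class SimulatedServer:
    """Stands in for the remote GPU: renders a frame for a given pose."""
    def render(self, pose: Pose) -> str:
        time.sleep(0.008)                      # pretend render + encode takes 8 ms
        return f"frame@yaw={pose.yaw:.1f}"

def reproject(frame: str, old: Pose, new: Pose) -> str:
    # Late-stage reprojection: warp the received frame toward the newest
    # pose, so residual network delay shows up as slight distortion
    # rather than the head-lag that causes motion sickness.
    return f"{frame} warped by {new.yaw - old.yaw:+.1f} deg"

def stream_loop(server: SimulatedServer, frames: int = 3) -> None:
    yaw = 0.0
    for _ in range(frames):
        pose = Pose(time.monotonic(), yaw)
        frame = server.render(pose)            # pose goes up, frame comes down
        yaw += 1.5                             # the head keeps moving meanwhile
        latest = Pose(time.monotonic(), yaw)
        print(reproject(frame, pose, latest))

if __name__ == "__main__":
    stream_loop(SimulatedServer())
```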
Thank you, Philipp. Ed, your turn. Thanks. So I'm Ed Tang. I'm the CEO and founder of Avegant.
Avegant is also a startup company based in Silicon Valley. We work on display engines,
tiny, tiny projectors that go into AR glasses.
So these small displays are what's going to power the next generation of AR consumer glasses
and really going to enable very small form factors to enable true glasses-like form factors.
Thank you, Ed.
Now, first off, let's just talk about some of the real-world AR use cases
that we're seeing today that your companies are involved in.
Maria, over to you first, because I'm particularly interested in what EE is doing in this space.
I know you're working with the iconic David Attenborough on a presentation in London.
Can you tell us more?
Absolutely. Thank you.
So this is one particular project, very recent. It's currently
on display at Regent Street in London. And it's a fantastic augmented reality experience showing
the art of the possible in AR technology. Sir David Attenborough is a host. We can see him in
a volumetric video, which is one of these holographic Star Trek-like representations,
but he's obviously the real person speaking to you.
And that's very impressive.
People know him very well in the UK, and he's very well known.
It brings to life the secret, if you like, world of nature.
And it's highly interactive.
People can use a mobile phone that is given to them at the entrance,
and they follow pretty much this path where they discover, if you like, the world of the green planet.
It's been a very interesting journey to get there.
A few technical details, perhaps, for those of you interested.
We've deployed a 5G private network at the site,
and this is 5G standalone,
which, as you would know,
obviously brings the next generation set of benefits from 5G,
including the capacity that you're going to need,
especially the latency.
So we've achieved very good latency on this particular deployment.
And maybe I can come back to some of the technical features of the project.
But the gist of it is the feedback is fantastic.
People love it.
It's not our first experience in augmented reality.
We've done lots of projects, but maybe we can come back to that later.
Yeah, and I haven't experienced it yet. I tried to get tickets, but they were
sold out. I know, I know. It's hard to get
tickets, unfortunately, but that's a success
that we're having.
Philip, if I could just come to you quickly.
You mentioned the XR Now streaming platform.
You launched that last year, I believe.
You're
also demoing here with
Deutsche Telekom. Could you talk a little
bit to that and then also some of the use cases
that you're enabling through this demo,
what it's targeted at?
Yes, sure.
So what we do here with Deutsche Telekom together
is we implement a technology called Managed Latency L4S.
What it does is essentially it gives you feedback
about the 5G network condition right now.
You can stream easily if your network is stable,
but you need to achieve really low latency, right? I think that's quite obvious because otherwise,
if you would move, the picture would not move with you. You would really get motion sick,
and that's just not a nice experience. So we are really focusing on low latency.
What Telekom enables us with is this technology based on 5G.
And with the network feedback, I can provide a really high quality while still achieving this low latency because the network gives me real-time feedback of how much bandwidth I can use while still having the latency.
And that's what we do together with them.
And that's, I think, a really important step for us with this XRNow platform to get it out there
because right now it works great in wireless LAN environments,
in small networks.
But this is giving us the kick that we can use it
wherever you are, just over cellular networks.
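For context, L4S (Low Latency, Low Loss, Scalable throughput; RFC 9330) has network nodes mark packets with explicit congestion signals before queues build up, and the sender scales its rate with the marking fraction. Below is a minimal sketch of that sender-side idea applied to an XR encoder; all constants are illustrative assumptions, not Deutsche Telekom's or Holo-Light's actual tuning.

```python
"""Sketch of the sender-side idea behind L4S-style managed latency.

L4S-capable networks mark packets early (ECN) instead of building deep
queues; the sender scales its bitrate with the marking fraction, keeping
queuing delay, and hence streaming latency, low. Illustrative only.
"""

def next_bitrate(bitrate_kbps: float, marked_fraction: float,
                 floor: float = 2_000, ceiling: float = 50_000) -> float:
    if marked_fraction > 0.0:
        # Scalable congestion response: back off in proportion to marking,
        # so the steady state sits at a shallow, low-delay operating point.
        bitrate_kbps *= (1.0 - marked_fraction / 2.0)
    else:
        bitrate_kbps += 500          # gently probe for spare headroom
    return max(floor, min(ceiling, bitrate_kbps))

# Example: a crowd arrives on the cell, marking ramps up, and the encoder
# sheds bitrate instead of letting frames queue up and stutter.
rate = 20_000.0
for marks in [0.0, 0.0, 0.1, 0.3, 0.3, 0.05, 0.0]:
    rate = next_bitrate(rate, marks)
    print(f"marked={marks:.0%}  ->  encode at {rate:,.0f} kbps")
```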
Wonderful.
Ed, if I can jump to you just for a moment,
as the tech guy here.
How do you see demand for AR evolving?
I mean, your focus is on display tech, but what are the kind of use cases that you're seeing your display tech used for now in the market?
And how do you see that changing?
Yeah, we work with pretty much all the consumer companies in the world.
And I would say, first and foremost, the most important thing about AR to get these
products into market is to get the size down. You know, if these products can't look and feel
just like regular glasses, consumers are not going to accept this type of technology.
And let me just show you a quick example about how small we have to get to. Here's an example
of how small our projection engine is, if you hold it up to the camera. This is basically
smaller or thinner than
a pencil. So here's a number two pencil that actually shows you how small these types of
displays can get. These are the type of technologies that you're going to need in order to enable true
AR glasses, right? Here's an example of what the future of AR is going to look like, where we can
get beautiful products in glasses-like, you know, true consumer glasses-like form factors while adding all the
intelligence into these displays. So I think first and foremost, the form factor, the size,
the shape is the most important thing before consumers will accept the device. And then
moving on to your question around applications, you know, I think these devices are going to go
through multiple generations of iteration. Think about like the very first smartphones or even the very first smartwatches.
The applications we're seeing today are very different than the earlier generation products that we see.
I think the same will be true for AR as well, where the first generation AR products will be limited in performance.
There'll be smaller fields of view, smaller images, potentially even a single eye. And so I think the first generation applications
will be very information, like smart glass type applications. But I think one of the key
differences between AR and any other platform that we have today is the contextual information.
How much information these products will have about the world and how intelligent they can be
in delivering information to us as consumers in real time
based on contextual environments and information it has based on the world.
Thank you. Thank you.
Jessie, coming to you, and VMware works with a lot of partners and enterprises.
Could you give us your take on what are the prominent use cases that you see in the market
from an enterprise AR point of view?
Yeah, sure. So we see a lot of, we're seeing growing interest in both AR and VR-based solutions in the enterprise.
And the biggest application for AR is headsets specifically for frontline workers.
So frontline workers are shift-based task or service workers that are focused on a specific task or set of tasks.
And headsets are great because they're giving
that worker content hands-free in real time
and delivering it directly to their line of sight,
whether that be videos, workflows, content, or remote help.
I think in terms of augmented workflows,
the biggest applications we're seeing are for workers in hazardous environments where they might not be able to have a mobile device on them.
Like if you think of service technicians on those giant wind turbines or on top of a power line.
And then we're also seeing it used for complex or detailed tasks. So unlike a dangerous environment,
these workers focused on complex tasks
like assembly line workers,
they have a workstation,
but there's a lot of back and forth
when you're focusing on a laptop
and then going back to what you're working on.
So what glasses do is make it so
there's uninterrupted concentration, which really
maximizes productivity and makes it so there's less room for error. And then, let's see, it's
also being used for training, making onboarding employees a lot easier, and then also for remote support and remote collaboration.
So it's a lot easier to stream a remote expert
and see exactly what the worker is looking at
instead of, you know, emailing back and forth
or flying the expert out.
Understood. And you mentioned VR there.
Similar application in terms of training, right? But how do you see VR set alongside AR, or how does it differ? ...Particularly for, again, frontline workers. So typically these workers would, you know,
go through a training course
and then they might shadow someone for one to two weeks.
But I think the majority of people, myself included,
are hands-on learners, right?
So what VR gives you is a hands-on experience
before you get out into the field.
It also helps organizations validate
that that worker is ready to go out in the field.
And this is really important when it comes to, again, dangerous environments where you can't
just let someone go out and try it for the first time. Understood. Understood. Philip, coming back
to you, we've spoken before, you've been working with BMW for some while now. Just want to understand,
what does your AR platform bring to a company like BMW?
Are there any other companies you can divulge that are using your solution?
What are the benefits?
What are the business benefits
that they're seeing through their work with you?
Sure.
So what BMW uses our stuff for, I think,
is a good way to start.
So what they do is they use the power of streaming in order to really use their CAD models.
We saw a lot of POCs in the beginning in AR where people worked on their CAD stuff.
They used it.
They reviewed the designs.
But this never really happened until now.
Why?
Because you just didn't have the power.
So all the POCs we saw, they took like three months
with a 3D artist just to get it running on one of these devices. And as we already heard,
these devices should get smaller and smaller. So we can't expect there to be any power increase in
them. Rather, we expect them to be lower powered, since they shouldn't use much battery, and more
connected. And that's where we go in. And how does BMW use this?
Imagine they're at their prototyping stage.
And it sounds crazy if you've never been there,
but they really build everything, all the time, out of 3D printers, handmade,
just to try out how the things work together.
And sometimes they build something and then find some easy mistakes,
like, oh, it's too large, it just doesn't fit in there.
And this is what they use AR for. So they load their CAD model and it sounds easy, but they go to the car frame and try if it fits in there and try if you can manufacture it,
if you can get to the parts you need to get to. And this saves them at least a year for every car
they build right now and design right now in the prototyping
phase. And I think that's really great. Another use case we have, which is completely out of
another field, and there you see why streaming is needed in sort of every field with large
data, is Enhatch. What they do is, and they work with surgeons, so we found out, or they found out
more, that there is a huge issue. They have to
do so many operations. Even if they are really, really good, mistakes happen. And another thing
is it's really hard for them to get data because you would say, like, yeah, use your laptop and
look at the CT scan. But how do you interact with the laptop without getting the sterile stuff away?
You can't just type on the keyboard and then go back to the patient.
So they use AR for that,
for a completely touchless interaction,
but they also need streaming,
since CT data is really heavy.
And they enable that a surgeon, for example,
can have a live 3D visualization of the CT data
while performing the operation,
which I think is really great,
and in the end even saves lives.
Wonderful, thank you. Maria, if I can come to you for a moment. As a service provider
in the room, network operator, from BT's perspective, what are the learnings you're
taking from, I guess, things like Green Planet and bringing back into the business? Part
of that's on understanding how the network performs, the latency, but also, I guess, from the consumer point of view, that demand.
There's obviously a lot of demand to attend the event.
What are you bringing back into the BT's business in terms of where you go on AR?
Yeah, no, that's a good question.
So through these projects, we learn both from a technical side of the house,
but also from a commercial and a customer experience
side of the house. And you can only learn these things through these
activities. It's the feedback that the people give you, it's the quality of experience, it's what
they feel and how they feel about the technology as well, how do they interact with it. And you
see, obviously, a generational change. I think we all see that, right, with the younger children
already kind of being brought up
with this technology.
But it's still very useful to know exactly how they're going to use it,
how can we improve it together with our partners
as part of a collaborative ecosystem.
So as I said, technical side,
I could tell you loads about all the KPIs and everything.
But as I said, the biggest learning for us
is to put it in the hands of the people
and see how they use it and how they experience the whole thing.
And obviously opening up opportunities for more, you know, business revenues, again, collaborations with our partners in that space.
Understood, understood.
Ed, I mentioned it before.
There have been false dawns for AR and VR in the past.
You know, the metaverse is the huge buzz topic
of this year's MWC, and, you know,
everyone's talking about it in various different angles.
I guess, is it different this time around?
Are we going to...
I imagine there's going to be hiccups along the way, right?
A lot of people are talking about closed versus open
walled gardens for applications,
for content, for platforms.
How do you see things evolving from your point of view?
Yeah, it's a difficult question.
The metaverse absolutely is a super hot topic these days.
I think one thing perhaps different about what we do compared to some of the other panelists
on the stage here, not just the fact that we do hardware, but the fact that all of our
customers are consumer focused.
We're seeing huge push in the consumer space
from all the major tech companies,
and of course in VR, but even more so in the AR space.
I think what's different now,
I think is we're seeing a lot more momentum.
And I think the technologies are getting to a point
where we can really check a lot of the boxes
it's gonna take to make successful consumer products.
This convergence of hardware enabling new platforms and some of the new software and new applications are really,
really converging, particularly in the consumer space.
So all the stuff we're seeing from companies like Meta with Oculus, we're seeing big movements,
even in the mobile space, you know, on the iPhone platforms and Android platforms, all the work that they're doing in those platforms to enable AR,
such as like ARKit and ARCore, as well as the sensor technology they're putting into it.
You know, it's great on the smartphone, but it's really just a precursor for what's coming
on these wearable type devices. And I think I've never seen a momentum like this before in the consumer space.
And the interest level and investments that we're seeing
from all the major consumer players
is an order of magnitude above what we've seen
even just a couple of years ago.
Yeah.
And in terms of timelines here,
when we've spoken before,
you've indicated five, maybe 10 years,
if I'm reading that correctly, for us to get to a real contextual, you know, interactive web, right, using AR to the fullest extent, a metaverse, right?
Does that sound right to you?
What are the key barriers you see at a device level which are going to need to be ironed out before we can actually adopt AR en masse? First, let me call out a timeline. I think that in 2024, you're going to be able to walk into a
store and buy consumer AR glasses from multiple major brands. This is how close we are to having true AR glasses for consumers.
But I think beyond that, AR is such a difficult platform to make from a hardware standpoint.
The size, the volume, the power, the performance that's necessary is so difficult, and it's just full of trade-offs at the moment. So that's why I think the initial types of experiences you see
are going to be quite limited, you know, smaller fields of
view, which means like the visual experience is going to be a little bit smaller. So I think
initial devices will be very much information based, providing you information, contextual
information about your environment intelligently, right? And I think because of some of the volume
and even power demands, you know, these super light, super small glasses can only have certain size batteries and can only dissipate a certain amount of
heat on your head. It's really going to limit what the first generations of devices get to.
But I think once we get to a second, third gen AR device, you know, this is probably going to be
four years from now, five years from now, when the experience becomes much more immersive for people,
it's going to enable all these new applications. For example, instead of us sitting here up on
the stage, we can all be virtually seeing each other, almost like a Kingsman experience from
that movie, if you remember that scene. These are technologies and experiences that are very,
very possible in the next handful of years. That's what's incredibly exciting about the AR space.
Incredible. Jessie, if I can come to you for a moment.
Working more on the enterprise side,
I want to understand what barriers you see in the market to mass AR adoption.
Sure.
So I think there's three main challenges that we're seeing
when it comes to barriers of adoption in the enterprise.
The biggest one is cost.
So especially when it comes to headsets,
some of these devices can cost thousands of dollars a piece. And that's just really not
realistic or scalable for a lot of organizations. I think we're seeing the price go down. And as
new vendors enter the market and we see new technologies emerge like MR and AR pass-through, I think that'll get better.
The next we're seeing is usability.
So you can have the perfect mix of hardware and software, but at the end of the day, you know, it has to work and the employee has to be able to use it correctly and get value out of it.
So I think just configuring the device and making sure that anyone can use it
and they don't have to be tech savvy.
And then last but not least, security is a huge concern
and it's something that we see with a lot of our customers.
They have a really successful pilot
and then they're all ready to go to roll out into production,
and they're delayed for quite a while.
And that's because they didn't engage the IT organization.
So it's really important that you're engaging IT and making sure that you're meeting your corporate security and compliance standards.
Philipp, if I can come back to your work with DT, you mentioned managed latency.
Can you explain why that's so critical to what you're looking to do with your streaming
platform and enabling AR applications?
Yes, sure.
So as we just heard, we always talked about how AR glasses only need to get smaller.
I think they need to become lifestyle objects, right?
And I think that will happen.
We talked about we have
to have streaming done.
And without managed
latency, it doesn't matter how
good your streaming platform becomes. It doesn't matter
how good your streaming is. You would
always get into the issue that if
a lot of people come together, your
network dips and either you go for
really low quality or you just go with stuttering.
And that is the main barrier we try to overcome also with our partners.
And the second one with L4S, which is important, but also with the streaming platform, is, as we just heard, usability.
It needs to become really easy.
Right now, a lot of people, if they try streaming on their own or so, it's possible.
But it takes you a huge amount of time.
It is really hard to develop on.
The experiences are not there.
We are aiming in this partnership and then also together with AWS
to make this really, really easy to use.
So you just develop as easy as every other Unity app. And I think that is also a really major thing that we
get to the same state with AR experiences as we are right now with smartphones, because it's really
easy that people create smartphone apps. That's why there are so many of them. And that has to be the same state for AR. And I think that is also what then will
sort of become the metaverse slowly by bringing all these things together, all these apps together.
Imagine you're always linked in, right? So you're walking around, everybody's wearing AR glasses,
you're linked in all the time. I think the transformation we are right now in
and what we are enabling there is the exact same
we had a few years ago
when we had the first good, easy cellular connections
and we had the first smartphones,
then social networks became a thing.
Now with AR, I think metaverse will become a thing,
but we need to overcome these barriers.
But I think we will all together.
And Ed, I know you've got opinions on this.
This transition, I guess, or this augmentation of AR devices alongside the smartphone, what
do you feel needs to happen as we move applications onto a pair of glasses or another wearable
device? Is it simply a case of, initially, Bluetooth connections there
to solve for the massive energy that will be required,
battery packs, et cetera, or for 5G to be part of the equation?
And what about on the app side, right?
You've seen with smartwatches in the past,
people have tried to move a similar type of experience onto your wrist,
which hasn't always worked. It has to be different for AR, doesn't it?
Yeah, I totally agree. I think with the first several generations of AR, because of the power
constraints and the size and heat constraints, these type of devices are very likely going to
be connected to a local compute device, something like a phone or watch. So it's unlikely that your AR glasses are going to completely replace your smartphone in the next
five years, right? And I think the dream is sometime in the next decade, these type of devices
will replace your smartphone. And I think technologies such as 5G are really exciting,
particularly the low latency part to be able to move compute resources off of these type of devices.
But I think in the near term, because of some of the power constraints that we have, we're
going to have these local compute platforms that these devices need to rely on, albeit
probably still wirelessly connected, which would be great.
But I think to your point about the applications and what do we call the UI or the UX of these
type of devices, I think historically, you know, technologists have
suffered with trying to take existing computing applications or user interfaces and just trying
to cram them onto a new platform. I remember in the early days, the very early smartphones
were Windows-based and they had a start button and a mouse cursor and just looked like a tiny,
tiny computer, right? But it wasn't until some of these new interfaces came up that really exploded the acceptance and adoption of these type of technologies.
I think the same thing will happen in AR,
where we need to be careful not to repeat this mistake
of taking a smartwatch or a phone experience
and just cramming it onto a pair of glasses.
We need to take advantage of what is unique about these type of devices.
And when you're wearing a pair of AR glasses,
part of what's unique is that it understands your world.
It has all these sensors and contextual information
about what you're doing.
So maybe it sees an item on the table
and knows you want to buy it.
Or maybe you can paint a line on the ground for directions
or even give you a restaurant review
when you're looking at a restaurant
that I think you're interested in.
These are the type of applications that AR can really enable that a smartphone or a smartwatch cannot do.
So that's something that I think we should all be thinking about.
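A rough motion-to-photon budget makes concrete why the local-versus-remote compute question Ed raises is so sensitive to latency. The roughly 20 ms comfort target is a widely cited rule of thumb, and the per-stage figures below are illustrative assumptions, not measurements of any panelist's product.

```python
# Rough motion-to-photon budget for remotely rendered AR (illustrative
# numbers only). The commonly cited comfort target is ~20 ms; whatever
# the network consumes must fit alongside everything else in the chain.
budget_ms = 20.0
stages = {
    "sensor sampling + pose fusion": 2.0,
    "encode (server)":               4.0,
    "decode (device)":               3.0,
    "display scan-out":              5.0,
}
network_allowance = budget_ms - sum(stages.values())
print(f"left for network round trip: {network_allowance:.0f} ms")
# -> 6 ms: roughly why edge compute and 5G-grade latency, rather than a
#    distant cloud region, come up in every AR streaming discussion.
```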
And Maria, if I can come to you.
The metaverse has been touted as the first true B2C low latency use case for 5G networks.
I think everyone creating PowerPoint presentations is glad they can finally put something up in that quadrant.
But what's your opinion from BT's point of view?
I know you're looking to deploy standalone,
which will help drive some of that low latency.
Can you get technical for us?
Explain how 5G networks will evolve to support things like AR?
Absolutely. My pleasure.
Yeah, I mean, there is so much, obviously, we have to do still and are doing in this space.
You've touched on 5G standalone.
Just to maybe touch on some of the key benefits there that we are experiencing and seeing already on the trials.
It's going to be about a number of things.
The capacity, of course, is absolutely key.
A lot of the use cases that we're talking about here
are very upstream intensive,
and we need to get this right
in terms of the downstream versus upstream capacity.
That's one key aspect.
5G standalone, as you would know,
touches not just the radio, but the core network side,
and there is a massive transformational program
for us as
telcos to move on to
a fully virtualized
core that can be very flexible, where
you can deploy functions where they're needed,
when they're needed, very dynamically,
and again, that's very key for
this type of use cases, where the
capacity is going to shift between
different areas, and we need to allocate
capacity and resources very flexibly,
not just on the radio side,
but when it comes to things like edge cloud compute, again,
where a lot of the, again, the content and the applications
are going to be loaded,
so that you can take that power off the glasses
and, you know, the processing power is in the network.
But having it in the right place is going to be absolutely key.
Bringing down the latency is an end-to-end thing.
We have to make all bits of the network support that ambition,
make it very flexible, very dynamic.
So moving things like, again, your user plane functions towards the edge
is going to help you get that compute capacity closer to where it needs to be.
There is an awful lot more work to do with sharing resources and infrastructure from a GPU perspective,
being able to very flexibly allocate and make the most of your capacity.
That's the only way to make it cost effective, especially as you move towards the edge because the scale goes down.
So getting all those pieces of the jigsaw puzzle right is probably the challenge for operators.
Of course, we're not at it alone. We're part of an
ecosystem and we have to work together to make this happen. So what I would say is, you know,
we're taking steps in that direction. Obviously, with metaverse, understanding even the impact
that is going to have on our networks is key. I mean, we're part of a number of initiatives
already exploring those initiatives very actively so that we understand the impact on our networks and we can be ready for it and be part of that ecosystem to make it happen.
Yeah, and making it cost effective is absolutely key, right, because you need those additional revenue flows to cover the potentially massive cost of millimeter wave deployment and other upgrades to the network, right?
Absolutely.
Yeah, big changes to come. Jessie, if I can return to you just quickly, the enabling tech behind AR from your
perspective, what are the critical components to help solve some of these challenges you're
seeing in the market? Yeah, so I think to enable those AR services, you have to have a solid digital foundation to support those devices that the
services are running on. So within VMware end user computing, we have our Workspace ONE platform,
which is a unified endpoint management platform that enables IT to have full visibility and
control over the device. So you can enroll and configure it, push applications,
integrate access control, all of that.
And then we also launched a solution called XR Hub,
specifically designed for VR and AR devices,
which will allow you to further customize the device with SSO, UI customization,
be able to push an app catalog to the device, and all of that stuff.
So we have a lot of exciting things going on.
And then outside of EUC, we also have networking and multi-cloud infrastructure that will help
enable those
high-fidelity experiences onto
any device.
So I guess if you're interested in learning
more about VMware, please stop by our
booth. We're in Hall 3.
Shameless plug.
Philipp, returning to you,
as AR devices
in particular become more performant, how does Holo-Light see the types of use cases you can enable on your platform evolving as we move towards this metaverse?
I mean, as I said, I think the devices won't become more performant, but the whole infrastructure will be. As we already heard,
the performance will be in the network,
not in the devices.
But if we get more performance there,
I think you can imagine that.
I heard a really cool thing.
Imagine you're rendering like a teacup.
That's what you can do right now.
In two years, you can render the whole room
where the teacup is in,
and in 10 years from now,
you can render the whole city.
This is what it enables.
So you can go really large-scale use cases.
You need to do much less to prepare the data
because it can just use whatever is there,
also by maybe AI algorithms.
Also, you don't need to reduce them.
You can just use them.
So this will really, really help us in the future.
Another big thing is the mobility.
Right now, most of the AR use cases,
if they do anything with a network,
they have to be sort of in a certain place.
And with VR, you always have to be
because otherwise you crash in a wall.
But this will enable us to use your AR stuff wherever you are. So we have use cases we can't do
exactly right now. Like actually we want to show a whole new city district which will be built
somewhere out there. We are working on this right now. It's a huge amount of work because you need
to get it somehow running locally on the devices or we build up a huge network infrastructure just to do this.
Coming in the future, we will have just our 5G or maybe even then someday 6G deployment.
You just connect to it and it will just work. Like where you right now need to plan infrastructure,
need to have an artificial setup network, need to have 3D artists working it. In five years from now,
you need a two-man team just building the app.
And this is where we are going.
And this is, I think, where the use cases just get much better.
Another thing is also something in AR which people wait for.
In the first days with AR, again, the things people wanted to do is,
I don't want to bring stuff to show it to you.
For example, I want to sell you a car. I can't always have all the models of the car
with me, so you have to somehow imagine it, right? But I want to show it
to you, and I don't want to bring it. Then people were like, I can do that with AR
or VR, and then they found out, yeah, but with the performance we have right now,
not really. Because if I show you the car in AR glasses with today's performance,
it just looks like plastic.
But if we have this performance,
we can use ray tracing,
we can use the algorithms they right now use
for making the cinema films, right?
And it will be photorealistic,
which is a completely new way of use cases
and also will be great for just like show experiences, like gaming experiences.
People just are so used to it.
And the last thing which is important is right now people, our industry, we are used to it.
But if somebody sets up an AR glass or an Oculus for the first time, they are like, that's
not sharp.
The density of the pixels is not good enough.
And if you calculate it, if you want to have the density you have on an HD screen on the
whole field of view, you're at like 8K resolution, right? And this is also
something where we need the performance to reach the resolution
people would like to have.
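The 8K figure follows from simple arithmetic: a display is commonly called "retina-sharp" at roughly 60 pixels per degree, and spreading that density across a wide field of view lands in 8K territory. A quick check, using rule-of-thumb numbers rather than any vendor's specs:

```python
# Back-of-envelope check on the "8K" claim (rules of thumb, not vendor
# specs). Human visual acuity is often approximated as ~60 pixels per
# degree; multiply by field of view to get the pixels needed per eye.
ppd = 60                      # "retina" angular resolution target
for fov_deg in (40, 100, 140):
    print(f"{fov_deg:>3} deg FOV -> {ppd * fov_deg:>5} px horizontal")
# ->  40 deg (today's AR glasses)  : 2400 px
# -> 100 deg (VR-class immersion)  : 6000 px
# -> 140 deg (near-human FOV)      : 8400 px, i.e. 8K-class per eye
```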
Got you, got you. We're almost at time. I can see Ed nodding in the background there.
Final question, I'm sorry. Maria, from the operator perspective, how do you see AR scaling and what impact do you think AR and the metaverse will have?
Have you started thinking about the impact that will have on BT's business five, ten years from now? Absolutely. I mean, there is no doubt it is going to have an impact.
I would say the topology of our networks
is going to have to change,
and I think I was alluding to that with edge cloud compute,
but I think we're talking about much more than edge cloud compute to me.
It's some sort of fundamental shift
in the way we build our networks.
So I think that's going to happen.
How long is it going to take the entire industry
to move down that road?
I suppose it's going to happen,
probably be accelerated, right,
by some of the recent announcements,
because, of course, it's when big things happen
that we all kind of go for it.
But as I said, you know, capacity is going to be, you know, we all know that the
volumes are growing, and these things are only going to be bringing more volumes. How we get
more capacity out of our radio resources is an absolute key challenge to solve. We do masses of
work, actually, in the radio piece to do with massive MIMO, experiences with adding artificial
intelligence to the way we manage radio resources very carefully
to be giving the right capacity to the right service.
So, you know, everybody talks about edge compute,
but let's remember that our most expensive asset is the radio.
So, you know, those densifying our radios,
again, something we're very actively looking at.
Small cell initiatives, like open architectures for radio,
which will bring more diversification to the industry,
will allow us to maybe have, again, you know,
smaller cells filling in coverage in dense urban areas.
A lot of those things are going to just explode, if you like,
because of the emergence of these use cases.
Capacity is key. Latency we've kind of touched on, on and off, many times already.
It's interesting with latency,
because everybody talks about a figure for latency,
and it's never quite clear whether that's an average
or a peak or whatever.
I'm an absolute believer
that what we need is predictable latency.
It's not just low latency.
And we work with loads of use cases,
not just AR, VR, to be honest,
remote control, mission-critical services.
And what I keep hearing from all of them
is low latency is great,
but what I cannot experience is a peak
or an unpredictable peak of my latency.
So those issues do have to be resolved.
And again, I can see things like this
accelerating the adoption of maybe,
actually, you know, service level agreement measurement tools that help us predict
that something is going to happen and act, or not react, actually,
but act proactively upon, you know, loads, load increases,
or perhaps failures in the network.
So there is an awful lot of work to do with orchestration,
with management of service-level agreements effectively across our networks again.
And as I said, that's not unique to the metaverse or AR
or the things we're talking about here.
I think we have to do this across the piece.
And it's when you put all those together that you understand
the massive impact that that's going to have on our networks.
But we're doing it. I mean, we're ready.
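In measurement terms, the low-versus-predictable latency distinction Maria draws is the difference between a mean and a tail percentile: an SLA that bounds only the average says nothing about the peaks she describes. A small illustration with made-up samples:

```python
# Why "average latency" can hide exactly the peaks Maria warns about
# (made-up samples). An AR-grade SLA needs a tail bound, e.g. p99.
import statistics

steady = [10, 11, 10, 12, 11, 10, 11, 12, 10, 11]   # ms
spiky  = [5, 6, 5, 6, 5, 6, 5, 6, 5, 62]            # similar mean, worse tail

for name, samples in (("steady", steady), ("spiky", spiky)):
    mean = statistics.mean(samples)
    p99 = sorted(samples)[int(len(samples) * 0.99)]  # crude percentile pick
    print(f"{name}: mean={mean:.1f} ms  p99~{p99} ms")
# Both look fine on average; only the percentile exposes the spike that
# would make a streamed AR frame stutter.
```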
You're obviously doing something right as well.
RootMetrics, I think, awarded you, second half of last year,
Yes.
Best network in the UK.
Yes.
Eighth year in a row, I think we will confirm.
Yes.
That's right.
I think we're basically out of time here, guys.
I want to first thank the panelists.
Thanks, Ed, for joining us.
I know it's probably pretty early where you are. But AR is very much an evolving story. I think what strikes
me is just the momentum that you realize when you come back to Congress. Everyone's here, everyone's
energized, everyone's talking about tech development. It seems to spur and move things along.
And it makes me think of this,
you know, the typical statement of we always overestimate what will happen over the course
of the next year, but we totally underestimate the amount of change that you'll see over the
longer term. And I think that probably rings true here if, come 2024, we're heading in and
buying AR headsets, even if it's just first generation,
or third, whatever generation,
from retail stores.
It's an exciting time to be at Mobile World Congress.
Thank you.
Thank you.
Thank you for having us.
[Applause]
The preceding program was made possible by the faithful prayers and financial support
of listeners just like you.
To find out how you can help, visit www.truenews.com.