ACM ByteCast - Neil Trevett - Episode 33
Episode Date: January 24, 2023
In this episode of ACM ByteCast, Rashmi Mohan hosts Neil Trevett, Vice President of Developer Ecosystems at NVIDIA and the President of the Khronos Group, a nonprofit consortium publishing open standards in a variety of areas related to computer graphics. He has worked to bring about standardization in the graphics world, giving developers the ability to extend and expand the capabilities of their visual systems. His accomplishments include bringing interactive 3D graphics to the web, creation of the glTF format for 3D assets, and recently founding the Metaverse Standards Forum. Neil talks about what drew him to computer science and how he became interested in the visual impact of 3D graphics, a field in which he has spent most of his career. He unpacks the evolution of computer graphics and discusses his role at NVIDIA, where his work focuses on helping developers make good use of GPUs. He also explains the benefits of standardization in industry and how open standards can enable innovation and interoperability. Neil also explains how 3D is changing the landscape of e-commerce and online shopping and gives his perspective on the Metaverse and how it can leverage other disruptive technologies.
Transcript
This is ACM ByteCast, a podcast series from the Association for Computing Machinery,
the world's largest education and scientific computing society.
We talk to researchers, practitioners, and innovators who are at the intersection of computing research and practice.
They share their experiences, the lessons they've learned, and their visions for the future of computing.
I'm your host, Rashmi Mohan.
The world of technology does not often lend itself easily to being associated with art,
except when it comes to computer graphics.
This niche area aided us in bringing our images to life on the screen,
progressing to giving us immersive and enriching experiences
that transform the manner in which we interact
with the digital and real world.
Our guest today is a pioneer in the world of computer graphics.
Neil Trevett is the Vice President of Developer Ecosystems at NVIDIA
and the President of the Khronos Group,
a non-profit consortium publishing
open standards in a variety of areas related to computer graphics.
He has worked tirelessly to bring about standardization in the graphics world, giving developers the
ability to extend and expand the capabilities of their visual systems.
His work involves bringing interactive 3D graphics to the web, creating the glTF format for 3D assets, and most recently, founding the Metaverse Standards Forum.
Neil, welcome to ACM ByteCast. Oh, it's a pleasure to be here. Thank you for inviting me.
Absolutely. Neil, I'd love to start with the question that I ask of all my guests.
If you could please introduce yourself and talk about what you currently do, as well
as give us some insight into what drew you into the field of computer science.
Sure.
Well, as you've kind of hinted in your introduction, I currently have three jobs right now that
are related to, I think, the topic of graphics and standardization.
My day job, as you mentioned, is at NVIDIA. I've been at NVIDIA close to 16 years now, and I work at NVIDIA to try to help developers make good use of GPUs for graphics and compute. But throughout my career, I found myself getting increasingly involved with standardization. Again, as you mentioned, first with the Khronos Group; I've been the president of the Khronos Group for over 20 years now. And now the Metaverse Standards Forum, which is much,
much younger, just a few months old. But I got into computer science right from the get-go at
college. My joint major was in computer science and electronic engineering.
And I really loved your introduction, because I did get into 3D graphics because I loved, and still love, the visual aspect of graphics.
I wish I could be a visual artist, but my hand-to-eye coordination is terrible. I love doing photography, but computer
graphics, the immediacy and the visual impact was just addictive to me from the get-go. So,
I've been fortunate. Basically, my whole career has been in 3D graphics.
That's wonderful. I mean, to think about it sounds like you have an innate interest in art,
and to sort of extend that into the world of computer science
or blend that with technology
is quite a unique sort of experience.
Not all of us have that sort of blend of our passion
and what our day job brings us.
So I'm really happy to hear that.
So what was that journey like, Neil?
So when you, you know, you sort of,
most of us go to college and do computer science.
What really sparked that interest in graphics?
Was it mostly the interest in art or was there a specific incident or a specific teacher
or any sort of moment that you felt, aha, this is exactly where I need to go?
It was when I first went to college and started studying computer science.
It wasn't clear what specialization I would end up in.
I did have this interest in the visual arts. But in the end, computer graphics attracted me in two ways,
not just the visual aspect that we talked about. It was also an amazing challenge. And this was
back in the early 80s. That's dating me. But the gaming, 3D gaming, wasn't yet really pervasive at all. And people were just trying to
figure out the very beginnings of computer gaming back in the 80s. But lots of interesting
challenges were coming up in the real-time nature of gaming and the interactions of users with a gaming system. Now, all these were cutting edge back in early
1980s. So the combination of the visual appeal plus the engineering challenge, both from a
hardware point of view and a software point of view. And my honours degree was in computer science
and electronic engineering, so hardware and software,
it just seemed like the perfect place to try to bring hardware and software together.
That is a great blend and really a wonderful use of your skills as you're developing them in the courses that you take. You mentioned GPUs or graphics processing units. I know that those
were first developed by NVIDIA not too long ago, I mean, in 1999, from what I read, and now are just sort of the industry norm, and the applications of GPUs have sort of exploded as well.
But I was wondering if you could sort of take us through the evolution of sort of computer graphics
as you remember it. Yeah, it's actually been an interesting journey. It feels a little bit like
the Forrest Gump movie, where I've just been fortunate to
be at some of the key evolutionary points in 3D graphics, sometimes as an observer,
sometimes involved. So almost my first job straight out of college was a 3D graphics
startup company in the UK, which is where I'm originally from. It was called 3D Labs. And we were one of quite a few companies working in the new field of 3D graphics. Silicon Graphics here in Silicon Valley had the
visibility and the high-end hardware, the workstations that many people, I'm sure,
used back in the day in many types of 3D applications.
3D Labs and other companies, we were trying to just build chips much cheaper than workstations
to bring 3D graphics onto PCs. And through the 90s, that was the main focus, helping bring 3D graphics
onto the PC. And now, of course, as you say, GPUs are everywhere.
And then at Khronos,
I was fortunate to have the opportunity
to initiate the OpenGL ES project,
which ended up bringing 3D graphics
onto mobile phones.
So that continues to be used
by billions of people every day.
And then also at Khronos, again, a lucky opportunity to be involved with bringing 3D to the web, kind of building on the past work of the previous platforms.
And glTF is used widely on the web as well.
That's the 3D asset format that you mentioned.
And now with the Metaverse Standards Forum
and the work at NVIDIA and Khronos,
the opportunity now is to make sure that 3D with open standards is brought to the
new platform, which is going to be the metaverse. It's amazing. I feel like I have to break that
answer down into four parts for me to really dig deeper into each one of them.
You mentioned OpenGL ES, right? And that was pretty path-breaking at the time when it was
created. And to recognize the need for you to
bring that to mobile devices, to bring OpenGL to mobile devices, was pretty insightful. Do you
remember how that came about? Yes. And I think everyone, myself included, who's involved with OpenGL ES owes a big debt of gratitude to the granddaddy
of open standards in 3D, which is OpenGL itself.
OpenGL ES was a subset of OpenGL suited for mobiles.
But Silicon Graphics and Kurt Akeley, who was the CTO there, made a quite amazing decision.
Back when Silicon Graphics was at the height
of its market power,
they had a proprietary graphics API called IRIS GL
and Silicon Graphics took the decision
to make it into an open standard,
which became OpenGL.
And they invited the other hardware vendors
and other workstation vendors to participate
in the governance and the management of OpenGL.
And that was really the beginning
of much of the 3D industry that we know today.
It was 30 years ago, almost to the day.
OpenGL was 30 years old just a few months ago.
And having the opportunity, because I was working at 3D Labs at the time, I was watching and helping
and supporting OpenGL as Silicon Graphics were launching it. I watched firsthand the power
of an open standard to encourage and enable and foster cooperation between engineers and
companies that normally would maybe be competitors, seeing a common good, and seeing that if everyone was to invest in an open standard that everyone had a say in evolving, it was good for them. It was good for the whole industry. It was good for the participants helping the standard grow. And that was the start of my journey in open standards, just seeing what a powerful force the right standard at the right time could be for the
good of the industry and for the good of the participants helping to build that standard.
You know, that lends so well into my next question, because I was going to ask you that.
I mean, it's not common.
You know, a lot of us work in industry, but not a whole lot of us participate in the manner
that you describe in terms of, you know, doing work in open standards or contributing to
open source.
And definitely we'll talk about the distinction between those two.
But I would love to also understand your day job. From what I understand, it is to foster and encourage developer ecosystems tied to the company that you work for. But are there principles from that work that you apply that also lend themselves well to the work that you do for the Khronos Group?
Yes, I think that there's a big overlap.
It's not 100%, obviously. Any company that's developing products, particularly in a competitive
field like computers and computer graphics, they're going to have a mix of proprietary technologies
and proprietary APIs and frameworks to enable their customers to use their products.
But very often, those same companies will want to use open standards too
because open standards can be very effective to help build a business.
Suddenly, a business doesn't have to invent everything that is incorporated into their products.
They can use open standards to interoperate with other companies, to build on the work that the standards community has invested in open standards, and really get the networking effect.
So for 3D graphics on the PC, it's a very obvious win-win-win.
If a graphics company selling GPUs was forcing people to use a proprietary API, in many cases that would be a friction point for their business. It's much more enabling for the broader market: if you can enable software developers to write once and to run across different hardware vendors, then the hardware vendors get access to more software.
The software
developers don't have to keep rewriting their code. And in the end, of course, the developer reaches more end users. The end user isn't so confused. And so that grows the market overall.
So the right standards at the right time can be a really positive thing for building business.
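As a small aside to illustrate the "write once, run across different hardware vendors" point, here is a minimal sketch in C using OpenGL ES 2.0. It assumes the platform-specific window and context setup (typically done through EGL) has already happened, and that a shader program and a vertex buffer have been created elsewhere; the names program and vertex_buffer are hypothetical. The drawing calls themselves are what the standard defines, so the same code runs on any conformant driver, whichever vendor's GPU is underneath.

    /* Minimal OpenGL ES 2.0 drawing sketch (C). Context creation via EGL and
       shader/buffer setup are assumed to have happened already. */
    #include <GLES2/gl2.h>

    void draw_frame(GLuint program, GLuint vertex_buffer)
    {
        /* Clear the framebuffer to a dark gray. */
        glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        /* Bind the shader program and vertex data, then issue one draw call.
           These calling conventions are what the standard specifies; how the
           GPU executes them is up to each vendor's implementation. */
        glUseProgram(program);
        glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (const void *)0);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }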
But of course, companies need to innovate too. So in a normal company, there's a mix between
proprietary and open standards and a smart company will use the right ones at the right time to
maximize their market reach. And so in my own role at NVIDIA, I'm fortunate in that NVIDIA lets me invest a lot of time in the open standards in the computer graphics domain. NVIDIA uses many of the standards that the Khronos Group produces, not all of them. And I'm not pulled in two different directions, because NVIDIA using some of the Khronos standards is good for NVIDIA's business. And that's where I focus my efforts at NVIDIA, as well as engaging with the larger open standards community. So there's a consistency to my role, whether I'm at NVIDIA or Khronos.
So again, I regard myself to be very fortunate to be able to do that and very grateful to NVIDIA to let me do
that. And hopefully, everyone is winning. I think, I mean, you're also being very modest
in that I feel like you also have these very special skills to be able to spot the opportunities
to participate in the open standards and also be able to sort of meaningfully contribute towards
the company and the goals that they have. But you know, one of the things that you mentioned,
Neil, which I thought was very interesting, you know, you were talking about the interoperability.
This is something I know I was doing research on the work that you've done. And you do talk about
this quite frequently. But I was curious, when we talk about companies that want to innovate and want to build something proprietary that they can potentially monetize and use to help grow their business, how do you recognize when a new standard is needed? So you germinate an idea, you build it out to a certain amount, it probably builds a little bit of momentum. But is there a tipping point when somebody recognizes and says, you know what, I think we need something more generic here?
Yes, that's a great question.
And it's very easy to try to, you know, once you get into the standards group, it's a big mistake to try and standardize everything.
And, you know, you should be very thoughtful about what should be standardized.
And maybe, you know, a certain thing shouldn't be standardized.
It should be left proprietary.
And timing is important too.
So my rule of thumb is, first of all, for a technology to be a candidate for potential
standardization, it has to be a proven technology.
It can't still be evolving. If it's evolving,
you're never going to get people to agree on what it is because it's changing too fast.
And probably companies still are getting commercial advantage from innovating and
coming up with the next revision of whatever that particular technology is.
And so you're not going to get agreement between the various companies as to what should be standardized. We kind of like jokingly say, don't do R&D by a standardization committee.
That's a very painful process. Don't do that.
So the right time to do standardization is when the technology is proven and quite pervasive. Everyone's kind of doing their own version of it, but the technology is basically understood, and the companies are no longer getting commercial advantage from doing that technology in slightly different ways. It's just become a frustration now, because everyone's kind of doing something similar, but in frustratingly different ways.
And people are beginning to recognize
that those differences are holding back the market.
And it would be much better if we could standardize,
get rid of that friction point
and move on to the next wave of innovation,
the next wave of technology
where innovation is still going apace and you can compete with the next wave.
With standards, you can consolidate what's proven and accepted throughout the industry.
And the superpower of standards is this: a standard is a specification that lets two things communicate with each other, be it hardware, software, or a client and a server device. The specification plus conformance tests, so you know everyone is implementing it correctly and reliably, enables multiple implementations of that technology in an interoperable way. And that lets that technology fan out pervasively across the industry, again,
you know, to everyone's benefit. That is pretty amazing. I do have a question, though. So at the time when maybe a standard is created, like companies recognize that, hey, we really need something that's more generic and standard across the industry, is there a step there where they're actually, you know, also spending time to adhere to that standard?
So it's almost like a step back.
I don't know if I'm expressing myself well, but it feels like in order to,
like I've been moving at a certain pace and building something out,
but now we're standardizing it and it might not always be exactly the way I built it out.
And so I have to actually spend some time adhering to this more generic standard. Yeah, that's a good question. I mean, there's a
number of aspects to that question. Yes, I mean, building standards takes time and successful
standards that come into existence are successful because enough companies that are going to use those standards
care and believe that they're going to benefit and that industry is going to benefit. It does take
some resource. And it can take longer to create an open standard with multiple companies agreeing
to it than just doing something proprietary quickly and doing it yourself. But that extra time and those conversations to build a standard, that's not a bug. That's the
whole point. It's the feature that you want. Building consensus on a standard that everyone
can benefit from is the foundation for cooperation and interoperability. So you do need a quorum of companies
that are going to be willing to put that investment in.
But once you've done it, though,
you can begin to benefit in many ways.
And another aspect to your question is
it doesn't hold you back.
Well, because I often get the question,
if I have to use an open standard,
aren't I being brought down to a lowest common
denominator? I don't have any opportunity for differentiation or advancing. A well-designed
open standard very carefully chooses its level of abstraction. And a well-designed open standard
will only define the minimum you need to interoperate. If it's an API like Vulkan or WebGL or OpenGL ES,
it's just the calling protocols that are defined in the API,
not how you implement that API.
So the Vulkan API, which is the new generation 3D API,
doesn't dictate at all how you implement your GPU.
So all the GPU vendors that are supporting Vulkan on their hardware, they're fully enabled to innovate in their GPU architectures however they want, as long as the final results and the calling protocols of the API are honored. They can do all the innovation that they wish to at the
implementation level. And the other way we make sure that open standards don't hold back the industry: again, well-designed standards are often extensible. So if you really need some functionality in an API like OpenGL or Vulkan, and you want to move faster than the group on a particular point, you can do your own vendor extensions to meet your own market and customer needs. And in the end, it actually turns into a pretty good pipeline.
Often the vendors are doing vendor extensions out ahead of the specification; some of them will fail. Some of them are really just very specific to a particular vendor.
But very often they prove out the new wave of technology that then gets adopted back into the mainstream open standard.
So a well-designed standards ecosystem will encourage and enable that kind of vendor innovation
through extensions.
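To make the extension mechanism a little more concrete, here is a minimal sketch in C of how a Vulkan application can ask the loader which instance extensions the installed drivers advertise; vendor-specific extensions show up alongside the cross-vendor ones. The enumeration call is part of the core Vulkan API, but treat the snippet as an illustrative sketch rather than anything discussed in the episode.

    /* List the Vulkan instance extensions available on this system (C). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <vulkan/vulkan.h>

    void list_instance_extensions(void)
    {
        uint32_t count = 0;
        /* First call: ask how many extensions are available. */
        vkEnumerateInstanceExtensionProperties(NULL, &count, NULL);

        VkExtensionProperties *props = malloc(count * sizeof(*props));
        if (!props) return;

        /* Second call: fill in the extension names and spec versions. */
        vkEnumerateInstanceExtensionProperties(NULL, &count, props);

        for (uint32_t i = 0; i < count; ++i) {
            /* Vendor extensions carry a vendor tag such as "VK_NV_..." or
               "VK_AMD_...", while cross-vendor ones use "VK_KHR_..." or "VK_EXT_...". */
            printf("%s (spec version %u)\n",
                   props[i].extensionName, props[i].specVersion);
        }
        free(props);
    }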
I really like how you describe
that and the distinction that you brought to the fore. I think that makes a lot of sense.
But this also begs the question of what is the right team to form to be able to develop an effective open standard. How do you know that you have the right composition of expertise,
whether that is domain related, or it is somebody
who knows how to build standards and knows how to sort of corral the troops, if you will, towards,
you know, an effective generation of the standard. So I'm just curious, like, what is that recruitment
process like? That's interesting. In my experience, the effective working groups, I mean, of course,
you need domain expertise. And you need expertise where people have been genuinely implementing the technology; that gives you the insight to keep the standard real. Again, a standard shouldn't be doing R&D, it shouldn't be inventing new technologies; it should be taking proven practice and figuring out how to reframe that functionality in a way that everyone can benefit from. So you need people that have been implementing that technology. They're the folks that need to be around the table as you design it, to make sure that the requirements and use cases are based in real-world needs as well. It's a very easy trap for a room full of implementers, with no users, no developers in the room, to go off in a direction that no developer is going to find attractive. And you end up with a standard that doesn't meet the needs of the developer community. For an API, again, like OpenGL or Vulkan, the developer community are your customers. And so you need to make sure you're talking to your users,
your customers, as you design the standard.
And if you don't have enough of your customers or end users in the room,
that is a definite warning sign.
Almost all successful standards I've seen have had a good mix
of implementers and users of the standard.
That is such a crucial point.
You're absolutely right, right?
Because I think it's like with any business.
I think listening to your customers and getting regular feedback is probably the best way to sort of guarantee some, you know, adoption as well as success.
But, you know, talking about users, obviously one of the largest user bases for some of the standards that you've defined, from what I understand, are game developers. And from what I hear, the market is so competitive that I wonder, do people want to sort of lock in their audience and build proprietary experiences? Is there something interesting that you might have experienced with that specific set of users, the game developers?
Yeah, it's interesting.
And I think it's a fundamental property of the universe
that you're always going to get a spectrum of,
in whatever the field of human endeavor is,
including the computer graphics market or the computer market
in general. Any company can decide, do they want to use proprietary interfaces and technologies
and try to lock in their customers, or do they want to be open? It's a spectrum. And it's not
a criticism either. Some very successful companies, their business model is trying to use proprietary technology wherever they can. Other companies want to use open standards, not having to do everything themselves and, through the networking effect, potentially get access to a larger customer base and larger markets.
And there will always be different companies all the way along that spectrum, different
points, people trying out different business models.
And I think that applies both to the platform vendors and the technology vendors, you know, the GPU vendors, and also to people selling applications; the same spectrum applies at each level in the food chain. Some applications will be much more open, others will be more closed. It's all good. It's the way of the world that the Darwinian mechanics will kick in and will select out the successful business models all the way across that spectrum.
But it's the role of the standards community
towards the open end of that spectrum
to give the companies that want to use open standards to get that networking effect the choice of effective and well-designed open standards, so they can live on that end of the spectrum if their business model
needs it. Yeah, that's great to hear. I mean, there's room for everybody and I think it sort
of depends on how you want to drive sort of your business forward, right? Whether it is
with the goal of being interoperable and if that's what's sort of really going to bring you the momentum,
then it makes more sense for you to align with a standard that is industry-wide.
ACM ByteCast is available on Apple Podcasts, Google Podcasts, Podbean, Spotify, Stitcher,
and TuneIn. If you're enjoying this episode,
please subscribe and leave us a review
on your favorite platform.
So one of the other areas, Neil,
that I heard you talk about
or write about quite a bit
is pervasive 3D in e-commerce, right?
And e-commerce is obviously
something very close to my heart.
I do a lot of online shopping. But I was wondering if you could maybe walk us through, you know, what do you
envision as 3D in e-commerce? And how do you think it is changing the world as we know it? I know the
glTF file format is something that, you know, you've been very integral to sort of developing and now is being adopted
pretty widely. So I was just wondering maybe if you could give us like an insight into that journey
and what you think, just maybe a little bit of a future vision into that.
It's interesting. glTF has an interesting genesis story, because we started work on glTF over 10 years ago now. And we were looking at all of the other media types, you know, images and video and audio, and over time they were all getting their, what I call their, social media moment. And it comes from having a compressed, pervasively available file format.
So MP3s arrive, and I'm going to show my age again now.
And Napster.
I recognize it, so you're all good.
Okay.
And now it's Spotify, right?
And the rest.
The JPEG arrives, and you get Instagram and Facebook, and then videos arrive,
and you get YouTube and TikTok. But there hasn't been that social media opportunity for 3D.
And back in the beginning of glTF, we thought, well, perhaps one of the missing pieces is that 3D
doesn't have the equivalent to JPEG. There's not a file format
that has been specifically designed to be easy to process and display, even on a mobile phone,
just like pictures and videos, and is pervasively available. And that was the reason we started to
design glTF. And we've been fortunate that the industry has widely adopted glTF.
And so now glTF is fulfilling that role of being a file format that's very widely adopted and supported on many platforms, on the web, in native applications.
Many tools generate glTF.
Many applications absorb glTF.
So it's been an interesting and fun journey.
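As one small illustration of how widely supported glTF tooling has become, here is a minimal sketch in C that parses a glTF asset using cgltf, one of several open-source glTF 2.0 parsers. The choice of library and the filename model.gltf are assumptions made for the example, not something from the episode.

    /* Parse a glTF 2.0 asset and report what it contains (C, using cgltf). */
    #include <stdio.h>
    #define CGLTF_IMPLEMENTATION      /* pull in the single-header implementation */
    #include "cgltf.h"

    int main(void)
    {
        cgltf_options options = {0};
        cgltf_data *data = NULL;

        /* Parse the JSON scene description of the asset (hypothetical filename). */
        if (cgltf_parse_file(&options, "model.gltf", &data) != cgltf_result_success) {
            fprintf(stderr, "failed to parse model.gltf\n");
            return 1;
        }

        /* Load the binary buffers the asset references (geometry, animation data). */
        if (cgltf_load_buffers(&options, data, "model.gltf") != cgltf_result_success) {
            fprintf(stderr, "failed to load buffers\n");
            cgltf_free(data);
            return 1;
        }

        /* The scene is now in memory: meshes, materials, and so on can be walked. */
        printf("meshes: %zu, materials: %zu\n",
               (size_t)data->meshes_count, (size_t)data->materials_count);

        cgltf_free(data);
        return 0;
    }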
The 3D commerce is one of the first real beachhead applications that I think will bring 3D into a wider audience.
Because commerce in general, and online commerce in particular, is a huge business. And there are many studies out there that show that if you can display a 3D model of an object that you're offering for sale, you do get more customer engagement and customer satisfaction and fewer returns than with just a JPEG.
It's actually quite a big margin, too.
It's very interesting because you can see more details,
you can interact with the model.
It becomes quite a compelling business booster
for the online commerce folks.
So there's a lot of sustained interest to figure out,
okay, we know 3D is an effective selling tool and we want
to use it. But then they discover it's very difficult to have millions of products being
generated by hundreds of different tools flowing through hundreds of different companies with storefronts, with different engines that are being used to
display these 3D objects, it becomes a real logistical nightmare if you don't have not just
the asset formats, but you have a broad understanding of how to create assets that
will run on a mobile phone and and how can you create them,
and how can you have engines on very different platforms,
like a web on a PC or a browser on a mobile phone
or an ad platform like Facebook or Snap,
all using different engines.
The 3D models, even if they use a common format like GLTF,
they end up looking completely different.
So your purple couch,
which of course now everyone wants a purple couch,
looking more red than it should do,
and it will get returned.
And that's a real pain for everybody.
And so 3D commerce has been this really motivated set
of companies wanting to use 3D, helping the 3D
industry really figure out how to get consistency in tooling and guidelines on how to use the tools,
what are good guidelines for building content that can be deployed everywhere.
And now we're onto the stage of making sure that all of the engines, like a web browser engine or a native app on a mobile
phone, they all display those 3D assets consistently to the end user. And that's essential for 3D
commerce. But of course, it's essential for everyone else in the 3D industry who wants to use 3D in this way to communicate real-world
information. So 3D commerce are a precursor. They're solving many of the problems that are
going to be very relevant to many more people across the industry, including the metaverse,
because the metaverse is going to want to consistently and accurately display 3D assets and avatars on multiple metaverse
platforms. So, you know, the 3D commerce is really a precursor to solving these problems
for the wider industry. It's so exciting to hear that. I mean, basically the marriage
of a use case with such a wonderful sort of way to apply it. And the genuine interest that comes
from that sort of, you know, that industry as well, must be so exciting. Because like you said,
there is a lot of passion, there's a real need to solve the problem, not just for themselves,
but also for, you know, those to come after them. So it's really actually very heartening to hear
that. One of the other things that I noticed, Neil, is, you know, I actually went back and did research on some of your talks from many, many years ago.
And one thing I noticed, and was very pleasantly surprised and impressed by, was that you really started to include a very diverse set of voices while building your standards committees.
So I know you've presented in China and in various
other sort of international destinations. What was the insight that drove that? And what do you feel
bringing those sort of really, you know, different voices and just ideas from around the world
brought in terms of value to, you know, building these committees?
Yeah, I'm so impressed by the research you've done. You know more about me than
I do. But no, we've always wanted to include the international community because an open standard
is not truly open unless it's accessible to everyone. As you say, in an inclusive way,
geography should not be a barrier to the use of open standards.
And over the years, as you've mentioned,
we've figured out ways of reaching out into the various geographies
and investing in the long-term relationships that's necessary
for the 3D industry in Japan or China or other geographies to really trust us. That's like any business, right? People do business with people, not companies, in the end. And, you know, you need to show up and continue to show up in those other geographies to build the recognition and the trust, so that companies in these diverse geographies can trust the standards that this organization that they've just met is producing.
And I think we have managed to build that trust in many different geographies. And of course, the reward is that many of the innovations and the energy and the input into the standards that we're building are now coming from those diverse geographies, and everyone then gains, including the folks here in the US. So again, it's another win-win-win.
The biggest problem, we kind of joke about it,
but actually it is a real problem.
The biggest problem it introduces
is that the earth is round.
And therefore it makes it impossible
to find a good meeting time that everyone can join in.
Because if you have participants from Europe, US and Asia,
of course, there is no time where someone isn't having to be up and on a Zoom call at two o'clock
in the morning, which is, you know, that's not fair either. So, you know, we struggle with that
one continuously. At the Metaverse Standards Forum, for the larger meetings where we do have a lot of international participation particularly, we've begun holding two sessions of the same meeting. We hold it once
in the morning Pacific time and once in the evening Pacific time with the same agenda.
And so, you know, people from different time zones can participate in a real way.
That's brilliant. That's a really good way to do it.
And I agree.
I mean, I think it's a small price to pay to bring in the value of both the cultural perspectives as well as the unique challenges that each of these geographies or working groups face, to add value and help you think in dimensions that you might not otherwise have thought about.
So that's great.
But yeah, we've spoken about the metaverse a couple of times, Neil, but that's about
as exciting and novel as it gets.
So I would love to hear more about, you know, maybe one, you know, just your interpretation
of the metaverse for our listeners.
And then also, what is unique in defining standards for the metaverse?
Yes.
So it's very interesting when people hear about the Metaverse Standards Forum, they
go, oh, you're standardizing the metaverse.
So what is the metaverse?
That's normally the first question.
And the answer is, we don't know.
We don't know what the metaverse is going to be in 5, 10, 20 years' time.
We're at the beginning of a very chaotic, in a good sense,
a very dynamic period of innovation
where it's going to be very hard to know what comes next.
The kind of analogy I use is the very first web browser,
the Netscape browser in 1994.
If you sat someone down in front of the Netscape browser in 1994 and said, what is the web going to be in 30 years' time?
I guarantee their guesses would not be anywhere close to reality as to what's happened.
It's just impossible to know, right?
Because there's so many innovations in unexpected directions.
You can't plan or predict that far ahead.
And that is precisely where we are
with the metaverse today.
So yes, we don't know what it's going to be
in 20 years' time.
But the metaverse is real, though.
I think the excitement in the industry
is coming from the fact that
for many people, what it means, what the metaverse means is we're bringing together multiple disruptive technologies in new ways.
And that is going to create opportunity and disruption at a pretty big scale.
So the metaverse often is described
as the 3D evolution of the web.
And I think, at least in part, that's true, yes.
But I think it's the connectivity of the web
with the immersiveness of spatial computing
that includes photorealistic graphics,
so the GPUs are in there
and using the GPUs for simulation and compute.
There's XR, augmented and virtual reality.
Though you're not going to be forced to use those.
It's going to be an option.
And some of the most immersive experiences probably will use them.
But I think most people will access the metaverse through their phones still.
And the magic pixie dust, though, which is truly disruptive, is AI and machine learning.
So that is letting our machines, our computers, do things that would have just seemed magical even a year or two ago. Natural user interfaces: language processing and understanding, interaction through gesture and body tracking,
scanning objects, scanning your environment,
and understanding them semantically.
Really uplifting the tools so people, normal, everyday people,
my mom soon will be able to build and scan 3D models because the AI will be able to help and assist the creation.
Text to VR.
And that innovation is happening right now, as we speak, on almost a weekly basis.
There are new innovations on how to deploy machine learning.
And so if you mix all those things up together, interesting stuff is going to happen. And there is going to be a constant stream of opportunity.
How can we standardize it if we don't know what it's going to be in 20 years?
Well, actually, it turns out to be quite simple because we know what interoperability problems we have today. If you're going to bring all these technologies to work together, then you need interoperability because interoperability is helping things work
together. That's the whole essence of standardization. And so we're finding this
wave of interest in 3D standards and XR standards, because people are beginning to realize that
they're going to be used as a part of this overall metaverse mix. And so we have a lot of
interest in standards that, in some cases, we've been working on for decades, and now a new wave of
people are interested because they're going to be used in the metaverse. So it's an opportunity for everyone,
including the standards community. It's fascinating the way you describe it. And
it's also very exciting to think about it as it sounds like the fact that the common denominator
of interoperability has been identified and is being used as a foundation, it just seems like bringing these three or four or, you know,
N technologies together will be more successful, if you will, right? It just sounds to me that
having the ability for these things to work together is the foundation of how we kind of
maybe imagine what the metaverse will be. That's right. That's right. Because if we're going to avoid just having a series of
vertical silos, like today, I mean, the closest many people get to the metaverse today are games
like Fortnite and Roblox. They are awesome. And they have many of the elements that I think people
imagine the metaverse to have: user-created content, real-time 3D graphics, social connectedness and connectivity. But they are verticals.
You can't design an asset or an avatar in Roblox
and take it to Fortnite or take it anywhere else.
And I think that many people's vision of the metaverse
is it's not just a series of vertical applications.
It is a larger platform where your investment
in your work and your avatar and your cool Gucci jacket,
you can take it with you across the different spaces
and environments in the metaverse.
And that is going to take this whole new level
of interoperability.
But the important thing to understand around the forum
is it's not a standards
organization. And that seems counterintuitive because there are many standards organizations
already in the industry doing excellent work. Now, Khronos is the one I'm involved with,
but there's the World Wide Web Consortium, there's the Open Geospatial Consortium.
There are dozens, even hundreds, more standards organizations today creating standards that are going to be relevant to different parts of the metaverse. The problem that we had was there typically wasn't a lot of communication between them. There were some liaisons, but in general all these organizations that are finding all this increased level of interest had nowhere to go to coordinate and communicate, and to ask the industry what the industry wanted from metaverse standards.
And that is just a very simple idea behind the forum.
It's not another standards organization.
It's a place for all of the other existing standards organizations
to come together to coordinate and to communicate
with the wider industry.
That's great. I do have a more sort of applied question, if you will, Neil, which is really the
way we're describing the metaverse. Are there things like bandwidth considerations, network
availability, prohibitive costs, are those being considered also? Like trying to make sure that the metaverse is not something that is only being built for the elite.
Oh, I think, yes.
That's a great question.
I think Darwinian business mechanics
are going to take care of that.
The most successful, or at least the most pervasively accessible, metaverse experiences, whatever they end up being, whether it's a game or a digital twin controlling a factory or augmented reality glasses navigating you through a strange city, are going to be more accessible the cheaper they are and the better designed they are to survive without needing gigabit Ethernet. The people that innovate and make their products and their technologies accessible and widely available probably are going to be the ones that stand a chance of getting the most
adoption. And that could lead in the right hands to successful business
and that's where your Darwinian mechanics will kick in. I mean, you may get high-end products at the bleeding edge of innovation; that's a typical pattern, of course, in the technology industry. But I think everyone's vision of the metaverse is, if you do this right,
it will be as pervasive as the mobile web.
That's really what we want.
And many, many people, of course,
around the world can have access
to mobile web technology.
We want the metaverse to be as pervasive as that.
Yeah, I'm very excited at the prospect of that.
For our final bite, Neil, what are you most excited about in the field of technology or
in what the metaverse is going to bring over the next five years?
It's been an interesting journey being part of this whole metaverse cycle.
I've learned not to believe the hype of the metaverse, because, you know, you see market forecasts. I saw one the other day that said the metaverse is going to be, I think it was, a 13 trillion dollar market by the end of the decade. And I'm going, how can you make that kind of prediction when we don't know what the metaverse is?
So some people are being carried away.
Don't get carried away on the hype.
Don't get caught up in the dystopian despair, too.
People say, oh, it's going to be terrible because we're all going to put on our VR masks and we're never going to take them off again.
And we'll lose all social contact.
And it's going to be very dystopian.
I don't think... I have faith in human nature.
I don't think people want to be disconnected from reality like that.
And as I like to say,
you can't buy a cup of coffee in the metaverse.
The thing is real.
So I don't think we'll go to the dystopian extreme either.
It goes back to what I was saying before. The metaverse, though, it's an exciting place to be because it is going to
create this constant stream of
remarkable commercial opportunities out of the combinations of the technologies that we were talking about.
It's going to be an amazing endeavor for the industry
over the next years and decades.
Just like the web has taken 30, 40 years, the metaverse will too, I think.
But if you force me to say, okay, of all the technologies that are coming together,
which one is potentially the most disruptive?
I think machine learning and AI.
If it stays on anything like its current trajectory of innovation,
things are moving very quickly,
and a lot of things are going to change.
So hopefully the metaverse will be a good place
where we can apply that kind of technology
for the good of everyone.
Wonderful.
I have learned so much through this conversation, Neil.
It's been absolutely amazing.
Thank you so much for taking the time
to speak with us at ACM ByteCast.
Of course, Rashmi.
And thank you.
Thank you for the wonderful questions.
It's been a pleasure to talk to you.
Likewise.
ACM ByteCast is a production
of the Association for Computing Machinery's Practitioners Board.
To learn more about ACM and its activities, visit acm.org.
For more information about this and other episodes, please visit our website at learning.acm.org/bytecast. That's learning.acm.org/bytecast.