Future of Coding - Scott Anderson: End-user Programming in VR
Episode Date: November 7, 2021
Scott Anderson has spent the better part of a decade working on end-user programming features for VR and the metaverse. He's worked on playful creation tools in the indie game Luna, scripting for Oculus Home and Horizon Worlds at Facebook, and a bunch of concepts for novel programming interfaces in virtual reality. Talking to Scott felt a little bit like peeking into what's coming around the bend for programming. For now, most of us do our programming work on a little 2D rectangle, with a clear split between the world around the computer and the one inside it. That might change — we may find ourselves inside a virtual environment where the code we write and the effects of running it happen in the space around us. We may find ourselves in that space with other people, also doing their own programming. This space might be designed, operated, and owned by a single megacorp with a specific profit motive. Scott takes us through these possibilities — how things are actually shaping up right now and how he feels about where they're going, having spent so much time exploring this space. This episode was sponsored by Glide, and the transcript was sponsored by Replit — thanks to them both for making this possible.
The show notes (with copious links) and transcript are available here: https://futureofcoding.org/episodes/053
Support us on Patreon: https://www.patreon.com/futureofcoding
See omnystudio.com/listener for privacy information.
Transcript
Welcome to the Future of Coding. My guest today is Scott Anderson. Scott is a game developer and a
game designer with a special focus on graphics, and that has taken him some very interesting
places in his career. He's done indie games like Luna and his own project Shadow Physics, but he's also worked
on big AAA blockbusters like Call of Duty.
And he's currently at Unity as a senior graphics engineer.
But before that, he worked at Facebook on Horizon Worlds and an unreleased VR scripting system in Oculus Home. This interview was conducted several months ago, but in light of last week's announcements from Meta (Facebook), the things that we talked about in this interview are suddenly a little bit contentious, because I think some people are probably sick of all this talk about the metaverse. But there's a really interesting angle to it: to do a proper metaverse, you need a huge focus on user-created content, and on really good tools to enable people to make content, and if that content is going to be at all interesting, it needs to have very rich behavior. And so a large part of what we talk about today is what alternative interfaces for programming one could create if they wanted to, for instance, bring a whole bunch of people into a brand new environment that is computational, and how we might enable those people to create whatever it is that they imagine, while at the same time accounting for all the things like new possibilities for abuse, and the questions of power and responsibility that come from the imbalance between a large corporation and the individuals who are going to come and play in their playground. There are questions about the technology that goes into building these tools and building this new so-called metaverse. How would we actually go about building this kind of stuff on a technical level? And especially with respect to graphics, what things are out there right now, and how do they work, that we might bring to bear in this new area?
So this is a fairly long interview and we go some places and we go quite deep.
So I hope you will enjoy it.
One thing that I will say right off the bat, just because, you know, this is something that weighs on my conscience a little bit, is that Facebook, especially in light of the recent revelations that have come out, you know, the papers as it were, but of course in a much longer arc of their relationship with society over the past, oh, let's say decade, maybe their entire existence, depending on how you want to frame it: they're a company that has a deserved reputation for being a little bit unscrupulous, to put it mildly. And I just wanted to say that the material that we discuss on this show, and the thoughts that we have about what they're doing, are relevant regardless of who among the, you know, large corporate overlords that rule our lives ends up pushing to create this new metaverse that we find ourselves confronted with. So the fact that it's Facebook is, you know, unfortunate, but it is kind of incidental. If Apple come out with their thing in the next couple of years, I think this conversation would apply just as well to what they're doing.
If HoloLens from Microsoft ends up taking off in a big way, these ideas are going to manifest there as well.
So set aside the fact that this is Facebook, with their questionable morals, and instead look at the fact that we are on the cusp of what might be another major technological revolution on the scale of the phone. At least, that's what these large companies like Facebook are hoping it will be, because it will give them an opportunity to wrest power. And so if they're going to do that, and if there's going to be a role for creating new kinds of culture within this new environment, on this new platform, I think we need to be having these kinds of conversations about it, regardless of which of the corporate overlords ends up pushing their particular vision for it.
And, you know, regardless of whether or not this is entirely in VR or is just another manifestation of the internet powered by regular mice and keyboards
and screens. So in any case, all of that stuff being said, there was one other little caveat I
wanted to throw in. I normally try to push for the absolute highest possible sound quality on this
show. Scott's recording has a little bit of background noise in it. I've done my best to
clean it up. It's not pristine,
but it should be more than good enough for everyone to enjoy what Scott has to say and to benefit from the absolute trove of wisdom and experience that he brings to our show.
The show notes and transcript for this episode are available at futureofcoding.org
slash episodes slash 53. Thank you to Replit, as always, for sponsoring that transcript.
And thank you to our other
episode sponsor today, Glide. You'll hear more about both Replit and Glide throughout the show.
So with that, let me kick it over to Past Scott and Past Ivan to give you a whirlwind tour of
what it might be like to do end-user programming in the metaverse.
I've seen a lot of what I would call programming-in-VR projects that are just like a rectangular screen with a virtual keyboard, and whenever I see those I just think to myself, like, we have to be able to do better than that, right? That seems sort of like the thing you see whenever some new medium emerges: the first thing that everybody tries is, let's just recreate the old medium inside the new medium. So, like, early films being, let's just record theater; early podcasting being very informed by radio before it kind of found its own voice. So I'm kind of wondering, just broadly, to set this up: what are your thoughts on programming in VR? What kind of projects have you seen where people are doing interesting things in that space? And do you have any of your own kind of VR-native programming ideas that you've been thinking about?
Yeah, I'll start with the first part of your question. Like, just opening up a rectangle in space and having a virtual keyboard in VR is a pretty terrible experience right now.
You know, it's something that people get excited about,
I believe because it's something that they're familiar with.
Right.
And there is like a cool factor to some of the environments that do that,
where like you type code in a rectangle,
but there are things happening around you that are physical.
Like you spawn a cube or,
you know,
some rigid bodies fall from the sky or whatever,
you know,
it gives you this kind of godlike effect in the universe that you wouldn't get in a flat screen game programming
environment, even if it was live coded, right? But at least with current and especially, you know,
kind of last gen VR hardware, screen resolutions aren't really good enough for that to be a good
experience. Virtual keyboards kind of suck on most platforms, but especially in VR. But you can imagine a future where, you know,
we've seen big like advances in VR HMD resolution, right? So you can imagine a future where all VR
HMDs are 8K plus, right? Where you have features like eye tracking and varifocal. So
you can actually look at something and it'll change your lens focus, right? One kind of standard feature of VR hardware is that the lenses don't change focus, and they're optimized for, you know, I think it's roughly like two meters in front of you, which is great for a lot of games, right? But it's not really good for reading text, so you actually want to change the focal length for reading text. So you can imagine, with all these hardware advances, right, and with a tracked keyboard, so you can use a real keyboard and see it in VR, which is a feature that's available on Oculus Quest 2 right now if you buy the right keyboard, that the experience, even if it's a traditional programming experience, gets good enough that it's still the way people code in VR.
But as you kind of hinted at, it's still not really taking advantage of the medium, right?
You're bringing what people are familiar with, the present of coding, into VR and making it work in VR, rather than looking at VR as maybe a future-of-coding device.
Right. And looking at, oh, well, how does, like, physically embodied coding work? Right. So, you know, that kind of brings me back to when I first got interested in coding and VR, which was, I believe, like,
2015, early 2016, sometime around then, when, you know, around when Oculus Rift and the original
Vive were launching. I was playing around with some VR prototypes. I had just, I believe,
started to work at Funomena on a VR game called Luna,
and I was doing a lot of prototypes for that game. Luna as a game has a creative mode where it's
almost like a gardening system, or like playing with a toy-building thing. A lot of my inspiration for some of the design of the UI was actually just playing with toys, because the game itself is very playful.
It's got a children's storybook illustration art style.
So, you know, I did a bunch of, you know, VR tracked-controller interaction features. Like, you can plant trees, and to plant the trees, you drop them on, like, a terrain, right? But then you can edit the trees, and to edit the trees, you grab different points along the tree. So if you grab the top and pull up, you can scale the tree up and down. If you grab the top and pull to the side, or grab the middle and pull to the side, the tree itself is an IK chain, so you can actually bend the tree and kind of warp it. And if you grab the bottom of the tree, there's a color picker, and you basically rotate your hand around the tree to color pick, right?
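As a rough illustration of the kind of grab-point mapping being described, here is a hypothetical sketch in Python; none of these names, thresholds, or fields come from Luna itself, they just show the idea of routing one hand gesture to scale, bend, or color depending on where the tree was grabbed.

```python
import math

def edit_tree(tree, grab_height_frac, hand_pos, hand_start):
    """Route a grab gesture to scale, bend, or color based on grab height (0 = base, 1 = top)."""
    dx = hand_pos[0] - hand_start[0]   # sideways pull since the grab started
    dy = hand_pos[1] - hand_start[1]   # vertical pull since the grab started
    if grab_height_frac > 0.8 and abs(dy) > abs(dx):
        # Grab near the top and pull up or down: scale the whole tree.
        tree["scale"] = max(0.1, tree["scale"] + dy)
    elif grab_height_frac > 0.4:
        # Grab the top or middle and pull sideways: bend the IK chain.
        tree["bend_angle"] += dx
    else:
        # Grab near the bottom: rotate the hand around the trunk to pick a hue.
        angle = math.atan2(hand_pos[2] - tree["base"][2], hand_pos[0] - tree["base"][0])
        tree["hue"] = (math.degrees(angle) % 360.0) / 360.0
    return tree

# Example: grabbing low on the trunk and circling the hand changes the hue.
tree = {"scale": 1.0, "bend_angle": 0.0, "hue": 0.0, "base": (0.0, 0.0, 0.0)}
edit_tree(tree, grab_height_frac=0.1, hand_pos=(0.5, 1.0, 0.5), hand_start=(0.5, 1.0, 0.0))
```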
And then like Luna had a lot of fun, little interactions,
but I felt like I was just kind of getting into,
you know, the like really basic prototyping phase,
even though all this stuff shipped in the game, right? So the game, you can buy it now, it's on multiple VR platforms, pretty much all of them, and all this stuff shipped, and it was good. But it made me kind of think, like, oh, well, what are deeper ways these systems could work, right? Like, what does a volumetric color picker look like? You know, what do more complex IK chains look like, etc.?
Or, like, adding new nodes, or what's the term for that in IK chains? It's been so long since I've done bones. Can you add, like, new bones to an existing tree with an IK chain in it, or is that...?
Yeah, you cannot do that in Luna. You can't, like, edit mesh, there's no mesh editing or anything like that, right? And there's no terrain editing. So it's not actually, like, an art tool or anything, right? It's a game, and, you know, the gameplay is, like, decorative gameplay. Like you would have in Animal Crossing, but with fewer items, right?
So the modifications are pretty limited, right? But those things that you brought up would be kind of interesting things to add to Luna or some other similar application that is maybe more advanced, right?
So Luna and some other VR prototypes I did got me thinking of a lot of different things, but one of them specifically that I
wrote really early on was tangible coding in VR, right? So this idea that you could bring
physical interactions or things that felt like real physical interactions into a digital world
and still have all the affordances of being in a game engine and, you know, being able to
be fast and loose with physics and fast and loose with,
you know, matter and mass, like, obviously, you have constraints, right, in terms of performance,
like CPU and GPU performance, and how many things you can render and what complexity they can be,
etc. But, you know, you have a lot of constraints that, that are entirely lifted, or don't exist
without, you know, complex systems that would exist in the real world,
right? So one thing with tangible coding, I've always been interested in it, but you need to
have a lot of physical blocks, right? There's not really too many real commercial tangible
coding systems and they tend to be expensive and they're all universally targeted at kids,
right? There's no professional tangible coding, really. You could argue that electronics, when you start to move into electronics, that's technically, like, tangible coding, but not really, right?
So you're thinking, like, Arduinos and that kind of thing?
Yeah, yeah. But when I talk tangible coding, I mean, like, Google's Project Bloks, kind of things that are like, let's write code with a bunch of Lego blocks, or let's write code with, you know, magnetic connecting pieces or something like that, right? Um, you know, and that was kind of just my one-line blurb. I didn't
really have a lot of ideas about the system or like what it would do or how it would actually
work at that point. But for various reasons, I did not have time to prototype that.
But then I interviewed with the Oculus Home team right before they were launching the kind of new
at the time Oculus Home, which is still the Oculus Home that you launch into when you go into the PC
version of, you know, Oculus software, right?
Can you give us, like, a one-sentence... what is Oculus Home?
Oculus Home is, you know, effectively you have, like, a house or an apartment in VR. It's kind of like your VR desktop. That's the idea, right? And Quest has a similar thing.
It doesn't have the modification aspects of it. But similar to Luna, right, or, you know, the kind of Animal Crossing-style decoration software I mentioned, you'd have an inventory of items, and you could decorate with those items. You could put out, like, TVs or computer monitors or desks. There were, or there are, some interactive objects, like bows and arrows and, you know, kind of an NES-style zapper gun
that shoots lasers and makes sounds
and basketballs and stuff like that, right?
So there are a lot of interactive elements,
but pure interactivity is cool.
But if you want to actually start building
even small games with that stuff, right,
you need some sort of programming language
scripting, you know, some way to inject some dynamism.
Yeah, exactly. So I was hired to, you know, work on visual scripting in that environment. For various reasons, I never actually shipped any of my work in there, and we'll get into that later. But I did a feature, I spent about six months doing a feature, that allowed for screen sharing, because it also had pretty good PC desktop integration, right? So you could actually see all of your desktop windows in VR on various virtual screens that you placed in the world. So we had TVs and computer monitors and things of that nature. And then we added multiplayer,
and for multiplayer, you could screen share with everyone in your space, right? So you could choose
to broadcast, and you could use it to, you know, watch videos together. Or, you know, there were even prototypes, none of this shipped, but there were prototypes that would allow for kind of co-located game playing as well.
So like couch co-op, but, you know, two people a thousand miles apart kind of thing?
Yeah, yeah. And latency was an issue. For video watching, the latency wasn't that big of a deal, but the latency wasn't ideal for game playing, because it's obviously all hosted on one person's PC, right? So if you're just forwarding inputs, you can't predict in that case, right?
You have to solve game streaming
without the benefit of a huge cloud, yeah.
Exactly. It's peer-to-peer and it could be over really long distances. It's not really a solvable
problem in terms of making it good.
Thanks, physics.
Right? But that's why it was a prototype.
It was kind of fun to just see it.
Yeah.
You know what?
To me, that kind of thing feels so inevitable
that I can see there being games created specifically
to accommodate that kind of latency as a baseline, where it's not going to be a twitch shooter or a platformer or something, it's going to be something more strategic, maybe, or something like that. But I can't imagine a future of VR without, like, two people in a room together who want to play a game on the same, you know, virtual TV in front of them.
Yeah, and you could absolutely do something like Jackbox, or, you know, kind of a party-game-style thing, or Codenames or whatever, a party-game-style game where latency just doesn't matter at all, or it matters very little, right? And it would be a lot of fun in VR. Definitely more fun than playing it on Zoom, right?
Yeah. And I know there's going to be
people out there in the audience who hear me say something like
that, and who hear us talking about, like, let's put TVs in VR, and it's like, no, that's the problem! Like, VR gives us this... you want to go, like, live on Pandora or whatever, that's what VR lets you do. Why do we keep porting the screens and the rectangles and that kind of hunched-over, narrow view into what should be the ultimate dynamic world? Short of, like, you know, I put some fancy contact lenses on and then get perfect AR overlaid on the real world, but let's set that aside for a minute. In VR you can define the space to be whatever you want, so that should unlock a ton of new creative potential and let us revisit a lot of our assumptions. But I think, like, one of the points that you made, about, hey, Facebook Home is kind of like Animal Crossing, reminds me of how, back in the late 80s and early 90s, when I was a kid, a lot of the early GUI programs that I was playing with were about kind of decorating your computer and making it feel more like a place you can put stuff in. And I think the urge to recreate the kind of familiar surroundings from
before in the new medium is a good urge. And I think it's just worth being conscious of how much of that is about giving people an easy way to get themselves familiarized with the new environment and make themselves a little home so that they're comfortable in it, and then making sure that that's not, like, the stopping point. That we keep going, and we're like, okay, now that everybody's got a nice little cozy space that they've made for themselves, now can we go and let them, like, swim around inside metaballs or something like that? I want to wave my hands around in the air and have that conjure up computations like they're magic spells. So I'm curious, like, where do we keep pushing
to get closer and closer to that kind of bigger reinvention?
Yeah, definitely. I mean, I definitely agree. I think there's two sides of that, right? Where one side is, you know, making sure users are familiar. Making sure that, like, if you're selling people a product, and this isn't a hundred percent true by any means, right, but if you show them something that they're somewhat unfamiliar with and they don't see the value in it, they may not be interested in it, right? So it's like, oh, I can watch videos on a big screen with my friends without owning a big screen, right? That's a fantasy that a lot of people have, even though it's kind of mundane, right? And Facebook especially is kind of especially bad at this, right? Like, I feel like as an organization in general... and, you know, it's unfortunate in some ways that they're shepherding, like, VR at this point, right? And, like, no one's close to Facebook, in my opinion, right now. You know, it's a big company. It's, you know, relatively corporate, right? And they're not necessarily going to want to, you know, kind of push the boundaries in that sense, right?
Um, like, it's probably not a priority for them that VR has a really strong demo scene right out of the
gate.
Yeah. And I mean, it's a little bit unfortunate, because I feel like VR in general has been that way, with maybe a few exceptions, right? There's some people that... like, I remember really early on, and even later, like Isaac Cohen, who's kind of a, you know, creative coder slash demoscene person, made a game called Larp, and he still does a lot of kind of more out-there VR and AR experiments, right? And there was, or there is, some really weird stuff, right? But, you know, I think VR especially has always been very much about how quickly can we get a new smartphone, effectively, right? How can we get a new mobile gold rush, where there's this new platform that everyone has to move on to and everything? And you see it now with metaverse conversations, right? It's like, oh, well, everything needs to be 3D now, and we're gonna have a new web, and it's gonna be great, and we can make the same billions of dollars we made on the web doing the same stuff over again, but for 3D, right? And, like, there,
I think, you know, a lot of this conversation has been, like, what gets lost in that kind of thinking. And, you know, I think there are a lot of just weird experiments, ones that would fail, or will fail, or aren't for everyone, that can push the medium, that just don't happen because of the way these things are funded.
Um, like Apple being so restrictive about, you know, what kinds of runtime environments you're allowed to ship inside your app. Like, are you allowed to have a scripting environment? What are the limits on that scripting environment? That sets a real creativity limit, I think. And so I do kind of fear the same thing happening when it comes to VR and AR more broadly, just because there's such a drive, like you said, for this to be the beginning of a new market and not the beginning of a new culture. And just because of who's stewarding it.
I don't know what the right recipe to get around that is, because there was just as much financial motivation going on in the 80s and 90s, and other times as well, when new platforms were emerging. But those platforms... like, yes, the personal computer isn't all that it was supposed to be, but I think it's arguably close. Whereas mobile is definitely not close to what it could have been if it had been a little more, you know, a little more of a wild west, maybe. And I know the arguments in favor of, you know, having mobile be as locked down as it is. But the fact that VR is so much more intimate and so much more personal, like, as we get computers closer and closer to the cores of our being, it kind of heightens the amount of damage that could hypothetically be done by that technology, that computer, being used maliciously. And I just hope that the safety harnesses that are put around computation by these corporations, to ensure that they work well as, you know, things that make Jim Cramer salivate, even if he can't really articulate what the metaverse is... I don't know if you saw that clip. It was hilariously distressing.
Yeah.
But I hope that those kinds of restraints that are put on them aren't the kind that keep us from having a really strong culture, and keep us from creating new kinds of art, and especially from coming up with new philosophies about how to program computers, and how to have a dynamic medium, and how to, you know, actually use these platforms not just to serve as, like, another platform for consumption, which is, you know, the often-used refrain when talking about mobile devices, but as actual tools for thinking with and for augmenting human ability.
So, yeah, I mean, all of that stuff is up in the air right now, but it is kind of... I don't know. I'm nervous about it. But that's a whole thing separate from: what can we do, and what are we doing, to reinvent programming? Current financial constraints be damned. What are the possibilities that have opened up?
Or, what are the things where it would be like, hey, VR might someday allow us to do this thing, were it not for resolution and latency and focal distance and other factors that contribute to, like, nausea or quality of experience? If it weren't for the limitations on that kind of stuff, where can we imagine this going, hopefully in the near future? What do you think?
Yeah. Um, you know, you bring up that, like, safety, or those driving the metaverse, will kind of force... force isn't really the right word, but push VR into a direction where it's encouraged to be kind of, you mostly live in walled gardens, right? It's more of a consumption device, like a mobile device,
which, you know, mobile's form factor is kind of optimized for consumption, as you mentioned, but also communication. And that makes sense, because that's where mobile came from: it's a communication device, and that's really still the best strength of a smartphone, right? Even though you can use it to watch videos or read, it's not the best reading or video-watching device, it just tends to be the most convenient, right? Which is one challenge with VR in general: it's not particularly convenient. But one thing about VR, as I've kind of mentioned, and I haven't talked about Horizon at all yet, but as I mentioned before, you know, both Oculus Home and Luna are creative apps. There are a lot of art tools that are popular in VR, like Tilt Brush and Medium and Quill, right, and it doesn't go as far as a professional or hobbyist art tool in either of those cases.
Right. But there's still kind of micro creative tools.
Right. So, once again, because I have a game development background, and a lot of indie game development, I thought about a lot of game ideas. But as I started thinking about game ideas in VR, I was actually like, well, the strength of VR is kind of in this physical-digital mix that I was talking about, right? That you have tracked controllers, that you can move your body through space. And that is kind of ideal for creation, not consumption, right?
Like VR experiences are cool because they're immersive, right?
They're more immersive than film, right?
It feels like you're in the place, right?
You can do interesting things with,
and people are doing interesting things with immersive theater, right?
A former coworker of mine has a startup called Lila, where... you know, it's funny, because I talked about tangible coding in VR, but even back in those days I was like, oh, it'd be cool... and, I don't know, are you familiar with the book The Diamond Age at all?
No, never heard of it.
I feel like future-of-coding people should definitely read The Diamond Age in general. It's a Neal Stephenson book. Snow Crash is more popular, right, and Snow Crash comes up a lot, you know, it coined the term metaverse, etc. But The Diamond Age is similar to Snow Crash in certain ways, right, but it's further in the future. And the story is mostly about this girl who's kind of growing up in poverty.
And one of the richer... they call them, I believe, clades, right? They're, like, groups of folks that are political entities or groups or whatever. And one of them, a Victorian-themed clade, invents this thing called, like, a primer for young ladies. So they have, you know, a fancy name for it, but it's basically a digital book. An educational book, you know, that can do probably a lot of the things that you think of a tablet doing today, right, but more advanced, right.
But one interesting thing about it is that instead of
having an AI entirely drive the experience, there's actually an actor that effectively
serves as a teacher and kind of a parent to this girl, right? So she's interacting through the book,
but through this actor, right? And they call them ractors, because they're interactive actors. And the conceit is that this ractor is very special, and the book, in combination with the ractor, helps this girl who grew up in poverty have a much better outcome in society, right? And they actually play around with it: there are other copies of the book, and some of the copies of the book have no ractor. I'm dropping spoilers now, but that's the basic conceit of the book, right?
So the idea was basically to do, like, a VR app where, and I described it as, because back in those days especially it was fun to call everything an Uber for something, but Uber for actors, right? Where you could have sets set up, and experiences set up, and it'd be immersive theater, effectively: a digital immersive theater experience that actors could actually get in a headset and get paid to run sessions for people, right? And you could also consider this for role-playing as well, right, where you hire a DM, and there are some startups that do that now, but you would do it in a VR environment instead. And a former co-worker of mine actually is doing this as a real startup, not just as a random idea. But I think there is a lot of power in that human element, right, in VR, that you can't necessarily get in other places. Because VR is that much more personal than mobile,
which is that much more personal than the PC.
It makes me wonder if the visions of what it would be like to program within this environment aren't just going to be about the fact that you are now inside an immersive 3D virtual space where you can bring all of that great game engine technology to make things look, and have physics, and animate, and maybe have a bit of tangibility to them, however you want. But, like, what we're seeing with VTubers on YouTube now. For the benefit of the audience, if they haven't been following this: people will buy full-body mocap suits and set up digitizers so that they can livestream on YouTube or Twitch as some kind of a virtual character.
So they'll make some 3D model of a character, and that character will be who you tune in to watch, and the person behind
it is just like playing the role of this character but they're doing it with technology in a way that
feels kind of novel and interesting because they can change things about their appearance
on the fly in ways that in the past you know like in theater we'd accomplish this with like
oh somebody's gonna run off stage and have a quick costume change and then run back on and
like fast costume changes are like a big part of theater like how do you iron out the production
so that everybody can transition scenes very quickly and the set spins
around and new backdrops fly in and that sort of thing and it's like as we're getting more and more
technology that allows us to have these human relationships in more and more seamless ways
or more wild kind of different ways like you can actually make yourself look like a photorealistic alien or
something like that like it gets better and better at crossing the uncanny valley in some sense and
it makes me think that like like programming has kind of gone through a similar sort of evolution
where it's like from punch cards or even before punch cards right from like doing computation
where it's like on paper or in your head or with something like
tablet weaving or something like that where it's like not aided by the machine at all where it's
like entirely driven by the person and then we get you know large physical computers that you can
program with with switches or whatever the big physical mechanisms were into, you know, terminal-based
programming and teletype and that sort of thing. And now, you know, we have programming editors
that have, you know, syntax highlighting and you can have like thousands of files and you can have
type hinting and we have all these like richer and richer tools that are gradually getting programming to be closer and closer to people and to what a person is and like a better fit for the human being, like using color, using space better, letting you organize things in a way that suits how you think.
Like having your choice of programming language so that the kind of thinking that you feel most comfortable doing can be reflected by the kind of tool that you have available.
And it's like it's all turning into machine code in the end,
but there are so many different ways that you can approach it to suit who you are.
And so it makes me think that maybe, as VR expands in popularity and becomes more available to more people, and the technology gets better, and we kind of get over some of these early little hiccups in getting it figured out and out there, maybe one of the areas where we'll see VR programming really improve on what came before isn't just in, like, it's in a game engine, and 3D stuff, and whoa, cool, you know, shaders, but in using it to be a better reflection of our humanity.
And maybe that does involve like it allows us to get more people together working on the problems
in a more intimate way than before,
where it's like right now we're using GitHub
and that was a big change from the processes
that came before it.
It's like the social networkification of programming.
And so maybe VR gives us, like, I don't want to say the gamification of programming, that's a loaded term, that's not what I mean, but maybe there's some aspect of it where it changes what the incentives are, or it changes what the experience of getting an error is like. Like, maybe it lets you explore errors in a way that is less infuriating and frustrating. I think an easy way to look at this would be: look at the parts of programming that are miserable right now, and what can we do with more intimate technology to take some of that misery and reduce it, or take that misery and forge it into something that is more purely about what we're wanting to do with the tool, rather than just affordances because the machinery isn't very robust or very rich right now. I love thinking about this idea of, like, here's how we're seeing it apply to theater, right? Like, maybe you could have an Uber-for-theater-in-VR kind of thing, like you suggested. And I'm thinking, yeah, maybe the factors to pay the most attention to in thinking about VR programming are the human factors, not the technology factors, which are kind of easier to point to because they're what exists already. The culture is not there yet, so we don't know what the human factors are all going to be, but we know what game engines let us do, and so it's easy to pay a lot of attention to that stuff. Anyways, those are my two cents.
Yeah, yeah, definitely. Um, you know, I think that theater thing was one idea that I was messing with that was somewhat along those lines, like, it's a mix of consumption and creation, right?
But a lot of the prototypes that I was interested in, outside of the tangible coding one, were like... are you familiar with, you know, Verlet physics at all in games, the concept?
Yeah, I am, but our audience might not be.
Basically, it's... I'm trying to think of a reference point. I don't know, World of Goo might be a popular example, or certain bridge-builder games use this tech pretty often.
Soda Constructor, right?
Fantastic Contraption is an example in VR that kind of does a similar thing, right?
Basically, you have a bunch of particles.
It's a physics system, right, where you have particles that move through the world,
they collide, but the particles all have constraints between them,
and usually those are rendered as lines or, you know.
And they behave like springs where they don't let the particles get too close or too far.
Yeah, and they can be stiff springs.
And the Verlet name comes from the fact that it uses Verlet integration, because that integrator is more stable than, like, an Euler integrator, for example, right? So you can actually apply these constraints and, you know, you get a reasonable amount of stability. It tends to not explode, right? And it's fun to play around with.
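To make the idea concrete, here's a minimal sketch of Verlet integration with distance constraints in Python, in the spirit of the Soda Constructor / World of Goo style systems mentioned above. The names, gravity value, and iteration count are illustrative, not from any shipped game.

```python
class Particle:
    def __init__(self, x, y):
        self.pos = [x, y]
        self.prev = [x, y]  # the previous position stands in for velocity

def integrate(particles, gravity=(0.0, -9.8), dt=1.0 / 60.0):
    # Verlet step: next = pos + (pos - prev) + acceleration * dt^2
    for p in particles:
        for i in range(2):
            vel = p.pos[i] - p.prev[i]
            p.prev[i] = p.pos[i]
            p.pos[i] += vel + gravity[i] * dt * dt

def satisfy_constraints(constraints, iterations=4):
    # Each constraint is (particle_a, particle_b, rest_length). Repeatedly nudging
    # each pair toward its rest length acts like a stiff spring and stays stable,
    # which is why this setup "tends to not explode".
    for _ in range(iterations):
        for a, b, rest in constraints:
            dx = b.pos[0] - a.pos[0]
            dy = b.pos[1] - a.pos[1]
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            correction = (dist - rest) / dist * 0.5
            a.pos[0] += dx * correction
            a.pos[1] += dy * correction
            b.pos[0] -= dx * correction
            b.pos[1] -= dy * correction

# One simulation tick for a two-particle "stick":
a, b = Particle(0, 0), Particle(1, 0)
sticks = [(a, b, 1.0)]
integrate([a, b])
satisfy_constraints(sticks)
```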
So I was thinking about doing, like, a Verlet builder in VR, or, like, a signed distance field editor kind of thing, right? A signed distance field: you can define an object basically by the distance to its surface, right? It's signed because it's both positive distance and negative distance, depending on whether you're outside or inside. So, for example, a sphere would be defined by the distance to its center point, and you know you're inside or outside based on that distance compared to the radius. And there are various techniques to actually render that, right?
on the distance to its center point. And there are various techniques to actually render that, right?
You could render it as polygons, you know,
using marching cubes or some other meshing system.
But one common way to render it is actually render it directly
using sphere tracing or ray marching.
So, you know, it's kind of shoot a ray out
from the camera position to the pixel you're looking at, right?
Or you're trying to draw.
Evaluate the entire distance field,
which you can do, you can either store it explicitly,
like in a 3D texture,
or you can, the common way to do it
is to kind of calculate it implicitly. And then you move along the ray by whatever the shortest distance you got from that method was. And if you're close enough to a surface, by some margin, you decide you've hit the surface, and you draw that as a solid hit. If you march far enough away and you haven't hit anything, then it's probably a miss, and you just draw a background color or something like that, right? So the kind of technical details of rendering signed distance fields aren't super important for this, but they're cool. People should go learn how to do this stuff, it's really fun.
Yeah, you should, yeah.
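To make the sphere-tracing idea concrete, here's a minimal CPU-side sketch in Python of a signed distance field for a sphere and the marching loop described above. A real renderer would run this per pixel in a shader; the sphere position, epsilon, and step limits here are just illustrative.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    # Signed distance: negative inside the sphere, positive outside.
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def sphere_trace(origin, direction, sdf, max_steps=128, hit_eps=1e-3, max_dist=100.0):
    # March along the ray by the shortest distance the field reports.
    # Close enough to the surface: call it a hit. Too far: call it a miss.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + direction[i] * t for i in range(3))
        d = sdf(p)
        if d < hit_eps:
            return t          # hit: distance along the ray to the surface
        t += d
        if t > max_dist:
            break
    return None               # miss: draw the background color

print(sphere_trace((0, 0, 0), (0, 0, 1), sphere_sdf))  # ~4.0, the front of the sphere
```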
Um, it's more of that, you know... there aren't really too many VR ones, but there are tools like Clayxels within Unity, and Dreams, obviously, on PS4 is another one that uses signed distance fields kind of as its base, even though it does not do this simple ray march, because the scene complexity is way too high. It's going to be more complex than that, but it does do a ray march kind of at the micro level to actually draw the surface. And MagicaCSG is another one, and that's a free PC tool, which would probably be closer to this thing. You know, and it's nice for modeling
because you're dealing with volumes and shapes
instead of, you know, edges and points and polygons.
Yeah, like it handles intersections really nicely.
Yes, so intersections aren't as weird.
You don't have to worry about, you know,
managing mesh topology or T-junctions or like kind of, there's a lot of
ugly things that happen with meshes that require a lot of handholding and babysitting that,
you know, you don't really have this problem with signed distance fields. Um, there, there is
actually one relatively popular VR signed distance field modeler: it was Oculus Medium, it became Adobe Medium, and now I believe it's called Substance Modeler.
Oh, Substance picked it up?
Well, Adobe picked it up, and Adobe picked up Substance, and they rebranded a lot of their 3D tools as Substance, because it's a better brand.
Better brand, yeah.
It's that kind of thing, right? So it's all under Adobe's umbrella. Um, but this one would be a lot simpler, probably. It would have been a simple version of MagicaCSG. So, yeah, I had all these ideas for creative tools in VR, and I still have more, and I think that's really the strength, or one of the major strengths, of VR
and and still have more and i think that's's really the strength or one of the major strengths of VR
that gets overlooked sometimes.
I think the problem is,
and we hinted at this earlier,
is that a lot of the time... some of these apps are built from first principles, but a lot of the time these apps aren't really built from first principles. It's like, let's add a VR mode on top of a game engine editor, right? And you quickly run into, well, okay, you have to translate all the UI from the whole game engine to VR, right? So you end up with a lot of floating panels that have hard-to-read UI. You end up with, you know, maybe navigation methods for navigating a large level that aren't ideal, right? So a lot of this stuff didn't really take off. And, like, performance: a lot of the time game engine editors aren't really performance-tuned, as in most of the time, right? Because, oh well, if you're editing a level on a desktop PC and it runs at 10 FPS or 15 FPS, that's
interactive rates, right?
Even if the game itself needs to run at
60 or 120 or whatever,
right? But in VR
you want to keep
higher frame rates consistently. You want to keep
90 or 120.
10 or 15 is unlivable.
10 or 15 is vomit-inducing.
Yeah.
So, you know, there are a bunch of challenges there that made it not really usable.
But you can imagine a game development tool that exists entirely in VR, that was designed from the start for VR, that just works.
So, to kind of bring it back to stuff that I've actually worked on, and some ideas that I've had as well: in Oculus Home, eventually I started working on an actual scripting language, and the initial implementation of that was just kind of a list of commands that you could apply to objects, right?
So it was a very like turtle
style programming language. In that version, there's actually none of what we might traditionally consider programming, right? There were no variables, there were no operators. You know, you weren't doing any data transforms, really, it's just a list of commands. And the reason for that was, one, it was an easy thing to bootstrap.
But also, I kind of looked at three parts, especially of an end user game creation system, right?
And breaking into three parts is entirely arbitrary.
There's more stuff going on than just these three, right?
But there are three things that I could focus on that would take most of my attention, right?
And the first one is behaviors, which I decided was most important, because if you can write interesting programs, or you have a cool UI or whatever to build those programs, but they don't actually do anything, it's not very interesting. And doing something generally means, like, moving things or, you know, changing their material properties, assuming that objects are mostly 3D meshes, 3D models, in these environments, which they pretty much are, right? Some of them are particle systems and stuff like that. But playing animations, if they're animated, playing audio... there's only a handful of things that are really output, especially when you're talking about the kind of higher-level game creation environment that you care about. And most of those are moving things, or checking collisions, right? You know, I added collision events, I added some input events, and I had behaviors to, you know, rotate this thing by 90 degrees over one second, and you could edit these things, and there were all kinds of parameters that were hard-coded, right? And that wasn't everything, obviously, but that was a good start.
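To illustrate the shape of such a "list of commands" system, here's a hypothetical sketch in Python: no variables or operators, just commands applied to an object over time. The command names and fields are invented for illustration and are not the actual Oculus Home or Horizon code.

```python
from dataclasses import dataclass

@dataclass
class RotateBy:
    axis: tuple        # e.g. (0, 1, 0)
    degrees: float     # e.g. 90
    duration: float    # seconds

@dataclass
class PlaySound:
    clip: str

def tick_behavior(obj, command, elapsed, dt):
    """Advance one command by dt seconds; return True when it has finished."""
    if isinstance(command, RotateBy):
        step = command.degrees * (dt / command.duration)
        obj["rotation"][1] += step * command.axis[1]   # toy example: Y axis only
        return elapsed + dt >= command.duration
    if isinstance(command, PlaySound):
        obj.setdefault("audio_queue", []).append(command.clip)
        return True
    return True   # unknown commands finish immediately

# An "on collision" behavior might simply be a list of such commands:
on_collision = [RotateBy(axis=(0, 1, 0), degrees=90, duration=1.0), PlaySound("ding")]
```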
Yeah, those are just, like, assumptions made so that you can focus on the part of the problem you want to focus on.
Yeah, yeah. And then the other portion of that was the actual programming portion, which you could call logic, or you could call it scripting, or you could call it, you know, the VM, or whatever you want to call it in this case, right? And then the last part would be the UI. So that was kind of my staged plan, right? I did behaviors first, like, okay, this is an interesting set of behaviors where you can do a lot of things if you build on them, right? And then kind of logic second, so I wrote a VM. And then UI last, right? And UI is actually the hardest. In VR, it's the hardest, or the most interesting, part.
I mean, it's the hardest everywhere.
Yeah, yeah, definitely. So in Oculus Home, the trick is, I didn't really get to UI. I just had this list-based UI. I wrote the VM, actually, so I could actually apply some logic. But without a UI to actually write any code, it wasn't easily testable.
You know, I was hard coding some tests and that's it, right?
And then kind of feeling like I was close
to getting this scripting system working on Oculus Home,
you know, that got pretty far along.
And then I was moved to a different organization than the team I was in, and I needed to work on scripting in an entirely different program, which is Facebook Horizon, which had no interactive behavior when I started, right? So, you know, effectively I rewrote a lot of the work I had done, but not entirely, in this new environment, right?
Was that
like a useful exercise, in helping you distill your thinking on this a little bit, or get closer to the heart of the issues? Or was it just like, all right, fine, got to rewrite it, you know, mechanical work?
Yeah. A little bit of both, a little bit of both. You know, I could simplify some of my ideas a little bit.
Initially, I actually planned for the long-term goal
for it to be kind of a multi-application.
Maybe not the language itself, right?
Because it becomes tricky when you start talking about UI, right?
And in different environments and across different game engines.
You know, assuming you still want to use that game engine's game object model,
which in this case, both Oculus Home and Facebook Horizon did,
there's not a lot of room to port that aspect of it,
but the VM could theoretically be entirely portable.
It was pretty self-contained. That was not the case in reality, right? Because I was like, oh, you have three
months to get scripting working in this new app, right? So, but yeah, it did help me define some,
or refine some of the ideas, right? And I pushed things a little bit further,
clearly because I worked on it longer. Am I remembering that it didn't end up shipping?
I think there was a Twitter thread that I read of you talking about that at one point.
Oculus Home did not ship any scripting stuff.
So there was some working scripting stuff in Oculus Home that didn't launch, right?
Oculus Home is interesting because it is a live product, right?
So it shipped in the sense of like,
if you had the right internal Facebook employee ID, and you were, like, you know, whitelisted for it, you could use it, right?
Right.
Not everyone could get access to it. And it wasn't far along. You know, I kind of described how far the polished work I did was, and it wasn't far enough to actually be considered a usable product, right? But Facebook Horizon shipped. It's out in beta. It has not launched, there's no open beta of Facebook Horizon, right, it's invite-only.
And what is Facebook Horizon?
So Facebook Horizon is a social VR application, right? There are other ones that are as popular, or maybe more popular, they're definitely more popular in terms of users, like Rec Room or VRChat, those are probably the two most popular ones. But it's basically, you know, kind of a multiplayer game slash creation and game-creation environment, right, slash online hangout space. So when we talk about the metaverse, a lot of the metaverse apps right now are actually really just these things. They're not really... I don't know, metaverse is a super loaded term, so I'm not going to say they're not the real metaverse, or they're...
I think it's fair to say that.
It's fair enough, yeah.
Because, like, there is definitely, as we talked about earlier, a business reason to make, you know, the new buzzword, the new hot thing, to get people excited about it. And it is not the vision of the future that we technologists have been chasing.
Yeah, that's fair. So for these apps, right, they're kind of like mini metaverses or whatever, right? Or they could just be considered online games with relatively robust end-user creation tools as well, right? So usually there's a world builder. They're multiplayer, so, you know, four to 100 players, depending on the maximum player count, can join together in a session and play games together, or just hang out and talk to each other, right? You can build spaces and share them, right? And you build spaces usually with various shape primitives,
sometimes with meshes that you can import,
you know, from a third-party modeling tool.
You know, you can kind of change material settings
on them, et cetera.
So, you know, it's a creative tool,
but they usually have some sort of scripting
or interactive behavior in them as well.
So you can make your own games, and the scripting is usually, but not always, visual scripting; some integrate Lua or other, you know, text-based languages.
I think, like, Roblox is not a VR app, right? But Roblox is probably by far the most popular version of this. It's a little bit different, in the sense of not just not being VR, but that the creation tools are very much outside of the game that most users are experiencing, right? And that's mostly true for VRChat as well. But in Rec Room and Facebook Horizon, the expectation is that the world-building tools are part of the game, right? And not everyone is going to world-build, some people will, many people will consume, but you can easily just go into the world-building tools and decide to make your own space, or dive into scripting if you want to do something interactive.
And that'll be big. Like, having... you know, to me, like, you referenced Roblox. I think the one that maybe arguably
you referenced roblox um I think the one that maybe arguably
popularized this in the first place was second life. And there again, if my memory serves,
the editing tools were outside of the actual playable environment. Yeah, they were. And it's
this dream of like, you know, something like a hypercard or like dreams on PS4, but in a collaborative space where there are lots of people who can, you know, be together editing something.
And you see that a little bit with like with Minecraft communities where it's like a bunch of people get together on a server and like, you know, build the giant statues that were carved into the cliff face in in lord of the rings or that kind
of thing like like make a you know scale model of the uss enterprise or something like that but
like putting it into vr should with you know some of these other technologies you've talked about
like maybe with you know sdf for modeling so that the modeling is just like that much richer and
feels more like working with actual clay in the real world rather than working with like triangles that can do all sorts of you know degenerate shit it just feels like we're on the
the cusp of kind of pulling together a lot of these threads in a in a really satisfying empowering way
which is exciting and so it's cool to hear that like like facebook of all companies i've like
heard of horizon but i haven't heard exactly what it is
and what it's about so it's cool to hear that like the plan is that editing tools are in the
environment that even consumers will will be in but if they so choose they can go from being a
consumer to a creator without having to you know do what we do now where it's like how do you go
from being a consumer of software to a creator of software? Well, you got to go get a, you know, a tool chain
and an editor and learn how to compile code and, you know, jump through all of those hoops. Just
like, I think shortening that gap between consumer of a thing and creator of a thing is, is
unambiguously good to do. Yeah. Yeah. And like a lot of my, I mean, I have, I have a lot of
influencers, right? Like when working on this stuff, I think, you know, a lot of people,
when they see the tools, they're just like, Oh, it's cool. It's VR scratch, which is like,
that's kind of the reductive version of it. It is pretty much VR scratch, right? Like
in the sense of like, it's a block-based editor, right. Um, for actually editing logic for the code.
Right.
I hinted at this, but I actually didn't implement UI for Facebook Horizon. Even in Horizon, I did mostly behaviors, and, like, almost all of the VM, right?
In your translation of the work that you started on Home.
Yeah, yeah. But they had other folks work on UI. Which, I think not right this minute, but maybe a little bit later, I'll talk about some UX ideas that I had that I didn't get a chance to try out.
But yeah, it's shipped. It's out there.
People are building stuff with it.
You know, I think, the kind of Scratch-type editor, the reason why Horizon has that is... at one point there was a kind of node-based editor in Horizon, like, early. This never shipped or anything, this was never public, but we actually did implement a node-based editor, right? And there was this idea of what people were calling black box scripts, right? Where, I think part of this was, you know, like I said, I wrote an entire VM in about three months. And it's a very simple VM, right?
months. Uh, and it's a very simple VM, right?
It's not anything to brag about or anything, right?
But it's like, it works.
But I think there was a little bit of hesitancy
as to whether or not anyone could get
a full visual scripting language working
in the timeframe that they wanted to at least get to alpha.
Because everything came in kind of hot.
Right. So there were things that felt like they were hedging bets and not necessarily ideal for the product. But one of them was to do these black box scripts, which were mostly higher-level logic written in C#, and then you could wire it together.
And that was actually pretty promising.
It's just the nodes themselves: in VR, when you talk about fully 3D node and wire placement — it's already hard to manage nodes in 2D, on a plane — in 3D space it's like a nightmare, right? You have 3D spaghetti, right? You can have things behind you, occlusion is a big problem, you can have nodes occluding each other, right? And this
is one of the challenges um we haven't really talked about about programming in 3d for real
right like actually building full you're you're placing objects or or placing text or or you know
whatever you use to represent your your of, you know, programming elements,
right? In full 3D, occlusion becomes a problem. Things being behind you become a problem suddenly,
right? So it was definitely a problem. And, you know, performance is also a problem, right? And
you have so many nodes, you have X amount of nodes rendering it slows things down right so
so that got kind of removed entirely even though there were some things there that were promising
and there are other environments that have gone with that approach. I think there's an app called NeosVR that had scripting like that around the same time, and Rec Room has similar kind of node-based scripting, right. And a lot of the ways people solve it — and Dreams actually does scripting like this, both in VR and in 2D, with kind of a microchip circuit thing, but they're really nodes, right, and they do kind of data flow programming with these nodes that they call microchips.
And usually the way to solve it is to still constrain most of the nodes to a plane, right?
So you kind of have a window in 2D in the 3D world, right?
And also making it like a somewhat zoomable UI where you can collapse and expand node graphs, right?
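(To make that concrete, here's a minimal illustrative sketch — plain Unity-style C#, not anything from Horizon, NeosVR, or Dreams; every name here is hypothetical — of snapping free-floating nodes onto a 2D panel in the 3D world, plus the "zoomable UI" collapse idea:)

```csharp
// Illustrative only: keep node-graph nodes on a 2D panel floating in 3D space,
// so the graph never turns into "3D spaghetti" behind the user.
using UnityEngine;

public class NodePanel : MonoBehaviour
{
    public float zoom = 1f;  // scale applied to node layout on the panel

    // Snap an arbitrary grab position (e.g. where the user released a node) onto the panel,
    // whose plane is defined by this transform's position and forward vector.
    public Vector3 SnapToPanel(Vector3 worldPosition)
    {
        var plane = new Plane(transform.forward, transform.position);
        return plane.ClosestPointOnPlane(worldPosition);
    }

    // Collapse or expand a sub-graph by scaling it toward its anchor --
    // a crude version of the zoomable, collapsible node graph mentioned above.
    public void SetCollapsed(Transform group, bool collapsed)
    {
        group.localScale = Vector3.one * (collapsed ? 0.1f : zoom);
    }
}
```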
So, I mean, I think Horizon could have worked with a node-based UI if we went that approach.
But, you know, I think blocks work well.
The kind of one regret with the UI that they ended up having, and, you know, I wasn't really involved in this,
but I was hoping that you know the language
would would maybe uh influence some of this was that it still required that users spend a lot of
time on a virtual keyboard and the idea there was to not do that and i had a i have a you know kind
of going back to the Luna editor — I had a bunch of ideas for number pickers and vector pickers and, you know, rotation gizmos and stuff, right. Where you,
generally speaking, you would never touch the keyboard to input constants, right. And then for
variables, you would name them once and you just copy and paste them. Right. And for the block
base editor, it's kind of...
The final block-based editor
ended up being very much driven by...
You know, it's still like kind of a flat 2D UI, right?
You know, a lot of people are familiar with Scratch.
It's...
So you don't have free canvas placement
like you do in Scratch a lot of the time.
It's still like kind of a list of things, right?
But you can...
It's kind of an AST editor, right? Where you get empty slots in the blocks and you can put things
in the blocks, right? Originally, I actually wanted to use the exact same 3D world building tools to build the AST for the scripting language, which meant it would be kind of two-and-a-half-D, right? And I was inspired by Scheme Bricks, which has kind of a two-and-a-half-D look from its rendering, right?
And there are various points where the Horizon scripting language
was kind of two and a half D,
but apparently they had a design meeting — I don't know, I wasn't there, but it feels like they had a design meeting or something — and at some point they were like,
we need to unify the design of this thing.
And we're going to go with flat design.
So everything's flat now.
Right.
Which is, you know, I kind of hinted at like big companies, right, and doing stuff, right.
But for a block-based scripting language, it's actually really cool or potentially really cool to like be able to reach into you know your your blocks instead of
just hover over them right um and occlusion is not so bad in that case right and you can
highlight blocks and stuff and for for folks who haven't seen scheme bricks like it it looks kind
of like a scratch like programming environment but the blocks are sort of stacked on top of one
another in a way where
it sort of it makes like sort of like a little triangular kind of shapes coming out towards you
a little bit which um looks really neat like it's it's this kind of it has almost a kind of a
fractally look to it but it is still like if you took away the graphical aspect of it it is still
lines of text code kind of one after another
with indentation so it's like perfectly readable as code it just uses the blockiness of the ui
to imbue a sense of depth to it so it really helps you get a sense of like what stuff is
nested inside of other stuff because that stuff comes closer and closer to you in how it appears
yeah and you can think of it as like stacking blocks like in a tower right where you have a base uh you know that's
kind of your your entry point or or you know your your top level event or whatever your your main
function or your function definition whatever yeah you'd stack statements and then you would
stack operators and expressions inside of there right um? It's like literally turning the indentation
into depth information,
where the more indented something is,
the closer it is to you.
Exactly.
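(A tiny sketch of that indentation-as-depth idea — hypothetical names, illustrative C# only — just to show the mapping Scheme Bricks suggests:)

```csharp
using UnityEngine;

// Illustrative sketch: lay out nested blocks so that deeper nesting sits closer to the viewer,
// the "indentation becomes depth" idea. Not Horizon's API; every name is made up.
public static class BlockLayout
{
    const float DepthPerLevel = 0.02f;   // metres each nesting level steps toward the user
    const float IndentPerLevel = 0.05f;  // conventional horizontal indent, kept for readability

    public static Vector3 LocalOffset(int nestingLevel, int lineIndex, float lineHeight)
    {
        return new Vector3(
            nestingLevel * IndentPerLevel,   // x: indent, like text
            -lineIndex * lineHeight,         // y: stack statements downward
            -nestingLevel * DepthPerLevel);  // z: pop nested blocks out toward the viewer
    }
}
```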
And that's kind of like,
it actually started out a little bit more like that,
and then they've moved away from that since.
But I also had an idea that got vetoed quickly
by a UX designer,
but in order to reduce the number of options you were selecting from a list or category, right?
The idea was, or the number of blocks or whatever you want to call it, the idea was to, once again, take advantage of VR.
And you basically pull out, and I was probably inspired by like weird role-playing game dice or something, right? But you'd pull out a cube instead that would have operators on each of the faces,
and you could rotate the block and then place that.
So I don't know.
There were weird things to kind of make that scratch UI more 3D,
and a lot of them didn't get in, or some of them got in and kind of got reduced, right?
But it is still a little bit compromised in the sense of that it's like safe, right?
It's not really, you know, it's like, okay, how can we use VR for this thing that mostly
exists, but it's not really like an entirely brand new paradigm, you know?
And that gets into some of the stuff that I wanted to prototype.
And I would have if I was maybe on the Auk team and, you know, had a slightly longer prototyping phase, but maybe I wouldn't have, right?
You know, I kind of mentioned cellular automata before, and I was pretty heavily inspired by this environment called the Movable Feast Machine. It showed up on the Future of Coding Slack. I believe Dave Ackley is the creator of it. And it's this thing called robust-first computing, right? Where basically you have a grid of small, independent-ish... I'm not sure if they're actually independent in the implementation — if they're actually independent threads or processes, or running on different machines — but they're modeled as independent processes, and they could theoretically
be independent processes and the horizon scripting language actually does this too where like each
individual script instance on an object is treats itself as a distributed independent process
effectively uh and they communicate via message passing and message passing could happen locally
or through through the network and theoretically it could have been threaded and stuff like that
even though it wasn't um it's like it's inherently meant to be async yeah yeah exactly um i was
really inspired by that and i like once again i was already doing kind of the in the path of doing
the message passing stuff, even with Oculus Home in the early days, right. You know, I started developing a message passing system where it ends up looking a lot like broadcast in Scratch to people, or just like a delegate in C# or something, right — different people have different ways of thinking of it. But, you know, it was really inspired by Smalltalk, and specifically an environment called Croquet — which is, like, early... you know, we talked about Second Life, but even before Second Life there was an early distributed 3D web type environment called Croquet that had ways to sync.
And this wasn't the networking model because it is for both Oculus Home and for Facebook Horizon.
They're peer-to-peer networks, but you have a fixed number of clients.
You don't have people moving through space and connecting to a different variable rate of clients.
You don't have to sync across a large kind of distributed peer-to-peer world, as cool as that would be, right? So the problem to solve is, like — the same synchronization problems just don't exist, right?
Like, you send a reliable RPC
or something to a client,
they'll get it eventually, right?
There's latency issues involved
and stuff like that,
but, you know, for scripting,
it's not, like, the end of the world
in most cases.
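(As a rough illustration of the message-passing model being described — a self-contained C# sketch, not the actual Oculus Home or Horizon implementation; every class name here is made up. Each script instance gets an inbox, "send" never calls the receiver directly, and delivery is eventual, whether local or over the network:)

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Illustrative sketch of broadcast-style message passing between script instances
// that are treated as independent processes. Single-threaded pump for simplicity.
public sealed class MessageBus
{
    readonly ConcurrentDictionary<string, List<Action<object>>> subscribers = new();
    readonly ConcurrentQueue<(string topic, object payload)> inbox = new();

    public void Subscribe(string topic, Action<object> handler) =>
        subscribers.GetOrAdd(topic, _ => new List<Action<object>>()).Add(handler);

    // Fire-and-forget, like a Scratch broadcast or a reliable RPC: it just enqueues.
    public void Send(string topic, object payload) => inbox.Enqueue((topic, payload));

    // Called once per frame/tick by the host environment; handlers run asynchronously
    // with respect to the sender, which is the "inherently async" point above.
    public void Pump()
    {
        while (inbox.TryDequeue(out var msg))
            if (subscribers.TryGetValue(msg.topic, out var handlers))
                foreach (var h in handlers) h(msg.payload);
    }
}
```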
Yeah, so for the movable feast machine
kind of inspiration,
the idea is, like,
I wanted to play around with using some of the automata.
So you basically have like a voxel editor in VR that you're using to write your code, right?
And whether it looks like a Wireworld thing, or whether it looks more like the Movable Feast Machine, which has a scripting language called ulam, right?
Where it's like you place nodes and each object would be a node, right?
But you'd still write, you know,
kind of code of some sort to decide how those nodes work.
But you do it at a very, at a more granular level
than maybe like an object.
Like, and I guess I haven't really described
what an object is in like these kind of social VR apps, right? It's really like a
collection of shapes or a 3D model that kind of like form a literal object that can transform by
itself in the space, right? And there's a single script. A single script could be associated with
that object, right? So like a cellular automata inspired thing would probably be more granular
than that, right? Where you'd have multiple cells.
You know, it's tricky to think about how it would interact with, you know, it's cool.
Like if you're just making Minecraft, you're just making a voxel world.
Easy, right?
Because everything's voxels already.
Everything's on a grid.
You know, cellular automata make a lot of sense.
If you have more freeform placement of meshes in the world, it's tricky to think about how well that works with the cellular automata approach, right?
Like, you can do things with, you know, there's still input and output.
You trigger, like, a cell.
A cell gets triggered, you know, assuming the Wireworld approach.
A cell state changes based on a specific input, like a collision event.
And then you have, you know, a bunch of intermediate logic cells that do stuff and route
a signal based on that input and then it will route to a specific output or convert to a specific
output like it's tricky because it didn't really work with these environments so it i'm not sure
how worth it it would be prototyping that one but uh you know it's definitely something i considered
especially when you think about having a volumetric space to deal with.
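(For the curious, this is roughly what the Wireworld-style idea amounts to — the classic Wireworld rule lifted onto a 3D voxel grid. Purely illustrative; nothing like this shipped:)

```csharp
// Wireworld on a voxel grid: signals ("electron heads") propagate along wire cells,
// so the logic is literally built in space. Classic rule set, extended to 3D neighbours.
public enum Cell { Empty, Wire, Head, Tail }

public static class Wireworld3D
{
    public static Cell[,,] Step(Cell[,,] grid)
    {
        int sx = grid.GetLength(0), sy = grid.GetLength(1), sz = grid.GetLength(2);
        var next = new Cell[sx, sy, sz];
        for (int x = 0; x < sx; x++)
        for (int y = 0; y < sy; y++)
        for (int z = 0; z < sz; z++)
        {
            next[x, y, z] = grid[x, y, z] switch
            {
                Cell.Head => Cell.Tail,   // a signal always moves on
                Cell.Tail => Cell.Wire,   // and leaves plain wire behind
                Cell.Wire => CountHeads(grid, x, y, z) is 1 or 2 ? Cell.Head : Cell.Wire,
                _ => Cell.Empty,
            };
        }
        return next;
    }

    // Count Head cells among the 26 neighbours of (x, y, z).
    static int CountHeads(Cell[,,] g, int x, int y, int z)
    {
        int n = 0;
        for (int dx = -1; dx <= 1; dx++)
        for (int dy = -1; dy <= 1; dy++)
        for (int dz = -1; dz <= 1; dz++)
        {
            if (dx == 0 && dy == 0 && dz == 0) continue;
            int nx = x + dx, ny = y + dy, nz = z + dz;
            if (nx >= 0 && ny >= 0 && nz >= 0 &&
                nx < g.GetLength(0) && ny < g.GetLength(1) && nz < g.GetLength(2) &&
                g[nx, ny, nz] == Cell.Head) n++;
        }
        return n;
    }
}
```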
Right. And there are some 2D programming languages that kind of do this — that aren't necessarily completely cellular automata based. Like, there's one called AsciiDots, which — I don't know if people have seen it, but it's still text-based — you basically draw ASCII art to move a ball through kind of a pipe system. I don't know what the best metaphor is, right?
And there's different transforms and stuff,
and you can split the ball or combine the ball,
destroy balls and stuff like that.
They can change state of various cells in the world.
I think of it almost like trains on a little weird railway system.
Yeah, yeah, that's probably a good metaphor for it. Let's see, what else did I consider... There's an environment that Ken Perlin made called Chalktalk, which is gesture-based, right? So you draw.
And I think he demonstrated it in VR actually at one point, right?
Well, I haven't seen that.
I've only seen him demo it like on a projector kind of screen.
Yeah, I tried it.
Like the source was released and I tried it myself.
And I saw demos on the projector only.
And like in his videos, I think he's only done projector demos.
But I think he said it worked in VR or something,
or maybe he was just planning on porting it to VR and into math, and I'm not sure.
But yeah, I did think a lot about gesture-based systems,
you know, and Chalktalk's gesture matching was kind of lackluster for me.
And doing good gesture matching is kind of hard,
but it would be kind of easy to beat them, right?
So that was one thing I considered and might be fun.
You know, another variant of gesture-based would be just to do,
and this is not future, it's pretty present of coding,
especially kind of in light of Facebook has released another social VR app
that's focused on enterprise recently called Workrooms.
And Workrooms actually uses the same tech — the same engine, of course, because they both use Unity under the hood, but there's also a lot of extra tech for networking and avatars and world management and just random features and gameplay code and stuff required in Horizon. And Workrooms uses the same code base as Horizon, even though there doesn't appear to be any scripting or worldbuilding or anything like that in Workrooms.
Theoretically, there could be, right?
But Workrooms also has an infinite whiteboard.
And I think one thing that came out of Workrooms,
and we talked about this earlier when we had the screen discussion
in Oculus Home, was that there's a little bit of a lack of imagination,
right? It's like, okay, we can be cartoon avatars in an office conference room that looks exactly
like a Facebook office conference room. But now we have an infinite whiteboard instead of a finite
whiteboard. And it's like, well, you know, maybe I don't want to write on a whiteboard. Yeah. Like,
why can't we be on the beach or something or like on some fantasy world or
whatever right like regardless of that aspect of it it might be fun to you know as as people
probably know like big tech companies have a really big whiteboard interview culture and
just whiteboard culture in general so so i thought it'd be fun and one of the big complaints about
whiteboard interviews is is uh you can't actually execute the code right so and this is
this is like half serious half trolling right but i thought it might be fun to do a thing where you
actually can like write and i actually haven't seen this demoed especially not in vr or like
even on a smart whiteboard but maybe it exists somewhere right where you can write code on a
whiteboard and it actually executes right like it does like ocr or whatever
yeah yeah yeah exactly you do like gesture recognition stuff and then like which once
again that's not like super interesting from it's it's interesting from a from an interaction
standpoint right where it's like there are fun things you can do you could potentially add to a
whiteboard coding environment that you couldn't do in a standard IDE, right?
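(A half-serious sketch of the "whiteboard that executes" idea: assuming some handwriting-recognition step — not shown here, and very much the hard part — has already turned the ink strokes into a code string, actually running it is straightforward with something like Roslyn's C# scripting API:)

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis.CSharp.Scripting; // NuGet: Microsoft.CodeAnalysis.CSharp.Scripting

// Hypothetical executable whiteboard: evaluate whatever code the recognizer produced
// and hand back a result (or an error) to draw next to the ink.
public static class ExecutableWhiteboard
{
    public static async Task<string> RunAsync(string recognizedCode)
    {
        try
        {
            var result = await CSharpScript.EvaluateAsync<object>(recognizedCode);
            return result?.ToString() ?? "(no result)";
        }
        catch (Exception e)
        {
            return "error: " + e.Message;  // show the failure on the board instead of crashing
        }
    }
}
```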
To me, this feels almost the same as like some of the tools that allow you to put executable
examples inside of your documentation, just to make sure that like, if you change something
about the thing that's being documented, and it breaks the example, like you get a compile time
error or whatever, you know, there's this space that has previously been used for talking about computation but it hasn't been a computational space and we
should like it's like incrementing from where we are now to having smart dust any space where we're
talking about computation that is not in itself a computational space like that's that seems like
low-hanging fruit for somebody to figure out how to do it and
make it like,
let's get the actual dynamic medium to be everywhere where we are talking
about the dynamic medium.
Yeah.
Yeah,
definitely.
Yeah.
And then — even though you don't really see it explicitly, I think in some of the things that shipped in Horizon, definitely — you know, especially at the time, I was pretty inspired by Dynamicland, when you talk about bringing computation to a space, right? And Dynamicland explicitly says it's not an AR or VR space, right? So it's kind of not in the spirit of Dynamicland to be like, let's just make Dynamicland in VR, right? It doesn't
really make a lot of sense. But one thing about Dynamicland that I liked — and I think this did actually come through in Horizon a little bit, via the message passing system and some other systems in place — was the idea that objects, or in the case of Dynamicland a page, right, whatever your atomic program thing is, both has a physical location and might have other physical things associated with it that aren't code. Sometimes they have behaviors, sometimes they're just visual, but it doesn't really matter — that's definitely the case in Horizon, right — but they're all kind of self-contained things
and they can work together as a whole,
but they don't necessarily require the whole, right?
So it's not like a whole world program,
which, you know, came up as a possibility, right?
But that couples things in a way that's not as shareable or as remixable — and you also lose that decoupling. And it's interesting, because I feel like a lot of programmers,
you know,
professional programmers get really scared when you talk about decoupling
things to the point where,
you know,
Oh,
any object can send a message or make a claim or whatever.
And it'll just do something because it's like,
well,
you don't know what else,
like you bring something in the environment that could completely wreck the
rest of the environment, right?
And it's like...
Yeah, it introduces fragility, yeah.
Yeah, yeah.
But it's like, maybe that's not a bad thing, right?
Especially when, you know, users have control
of what's in the environment, right?
And like can decide what, you know,
one thing I really wanted to do in Horizon
and they didn't implement it,
but they talk about safety a lot in social VR, especially because of Facebook.
But really the only safety features are, like — there's a recording feature, which has questionable privacy implications already for some people.
Because it's like, why is this app always recording my gameplay and sending it to Facebook in some cases. But theoretically, that's actually a privacy feature because it will, if you get reported, it sends the gameplay footage so they can review it.
But once again, it's all surveillance tech, right?
Ultimately.
And then there's like a safety button and a panic button. But the problem with the safety button and the panic button — or even reporting or banning someone, right — is that they all require user input in reaction
to something happening that they don't like in the program and it's actually easier to just take
off the headset and never log into the thing than it is to like deal with panic buttons and emergency
modes and like all this stuff right it's easier for the user just to say this isn't for me take
off the headset not log in ever again right that's the easiest thing right at least right now like i
mean you could say the same thing about twitter or any other case where you get cyber bullying but
it's like that i think the social momentum to actually participate in something once it becomes
a part of the culture is like that's a very very fundamentally strong force that so i think it is important to be
thinking about this kind of stuff yeah i think it's important definitely but um like i'm more
hitting at that like there can be more like even in twitter once again you do proactively ban a lot
but who you follow and like who follows you makes a big difference in your experience right and that
kind of happens before you the ban is reactive right and you still need those reactive elements
but if you only have reactive elements so you mostly have reactive elements and obviously who
your friends are and horizon and stuff matter and like what spaces you enter matter but um
you know, harassment is a big problem, especially for women in VR spaces, right? And it's, you know, physical harassment — even though it's not physical in the real-life sense, people are entering your personal space, so it might as well be the same thing. Like, the whole point of making this technology more personal is not just to kind
of have it, you know, both ways where it's like, yeah, it's so intimate and personal. And it's
like, you're really there and inside the world. But when you know, somebody does something
inappropriate, oh, it's just, you know you know virtual reality my phone is an extension of my
mind in the same way where it's like i don't feel comfortable giving my phone to anyone not because
it's like i don't trust them but because it's just like it's like the same way i wouldn't feel
comfortable like detaching my arm and giving it to somebody. This is now a part of my being. I completely am on the
page of, you know, we need to respect people's physical autonomy within virtual spaces to the
full extent that we socially respect one another in non-virtual spaces, or at least the way we
should respect one another in non-virtual spaces yeah the way we should that definitely the way we
should yeah but you know you can because it is a virtual space and it's immediate like theoretically
you can do more so one thing i really want to do at least in the scripting space and a lot of this
was inspired by Second Life again, and various attacks that you saw in Second Life that were enabled by scripting — like, you have flying objects flying in people's faces, right, and stuff like that.
Or other inappropriate things on live TV.
Yeah, yeah. Or you could, you know, teleport someone to an inappropriate place, or move them super fast — you know, physical things that, in VR, can actually make
people physically sick right so like my thought was to have like a permission system and you know permission systems
aren't like future of coding really even right like uh like you know you have permissions in
android and there there are known issues with permissions as well where people just say yes
to everything because they don't really understand but you know you if you can imagine permissions
in a vr scripting language right or you know, you know, VR worlds or, you know,
however you want to, whatever level of granularity
you want to put it at, right?
Where you could say like,
oh, I don't want anyone in this world
to be able to access my name.
Like, so I'm going to hide the name,
but also scripts can't get my player name, right?
I don't want them to be able to read my position
in the script, right?
So that way you can't have objects that chase you or
enter your personal space.
I don't want anyone to be able to change my
movement parameters or teleport me,
right? And it could go further
where you could say, oh, I want this world to actually
be entirely static.
I don't want to run no script
mode effectively in the browser.
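(A sketch of what such a permission system could look like — illustrative C#, not Horizon's API; the flags, defaults, and names here are invented for the example. Every sensitive script call is gated on a capability check, and a cleared "run any scripts" flag gives you the no-script world:)

```csharp
using System;

// Hypothetical per-player script permissions for a social VR world.
[Flags]
public enum ScriptPermission
{
    None          = 0,
    ReadName      = 1 << 0,
    ReadPosition  = 1 << 1,  // denied => objects can't chase you or enter your space
    MovePlayer    = 1 << 2,  // teleports, speed changes, forced locomotion
    RunAnyScripts = 1 << 3,  // cleared => "no-script mode" for the whole world
}

public sealed class PlayerPrivacySettings
{
    public ScriptPermission Granted = ScriptPermission.ReadName;  // conservative default

    public bool Allows(ScriptPermission requested) => (Granted & requested) == requested;
}

public static class ScriptApi
{
    // A denied request fails softly instead of leaking data or crashing the script.
    public static string GetPlayerName(PlayerPrivacySettings p, string actualName) =>
        p.Allows(ScriptPermission.ReadName) ? actualName : "Anonymous";
}
```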
And Horizon doesn't implement any of that stuff, as far as I know.
And those are thoughts on, like, a technical level — there's also the whole suite of things where you can build these systems to embody
level there's also like the whole suite of things where you can build these systems to embody
principles and i think one of my favorite examples of this is like SimCity,
like the original SimCity was presented as, you know,
this is an objective depiction of the, you know,
the systems in a sort of a systems thinking framework that are taking place within a city. And of course,
it's like some really great recent reporting has shown. No,
it actually is this like really wildly unrealistic model of this disproven
sort of like libertarian utopian idea of what you know cities should be run like and it like
embeds that within the the model of the simulation in the game so that you get all this like almost cartoonish deviation from reality
because it's sort of backdooring in this worldview and and i love that as like a like a counter
example like whenever you're building a kind of a virtual space that's meant to like embody parts
of the real world you're going to be doing that through a framework and what framework you choose will
establish the culture and the norms and the relationships between real people as well as
like the virtual relationships between systems and and even things like the way you consider like how
to model communication between objects and like your idea of like no script or that kind of thing
like those things have cultural ramifications and so I think if you are deliberate and conscious about how you establish
those kinds of systems and, and what things you're simulating from the real world and how
you're simulating them, like you can have a ton of leverage over what culture emerges and what
is acceptable and how people will like treat each other and and just as one more example of
this one of my favorite ever video games is a game called journey and one of the things that they did
in the design of that game because this game came out i think in 2011 maybe a little earlier it was
at the time where online multiplayer games were just this absolute toxic like cesspool of harassment and people swearing and using racial slurs
constantly and like the big you know microsoft and sony and whatever trying to clamp down moderation
and it being this cat and mouse game. And Journey's premise is that it's a multiplayer game where — sort of, you know, spoilers, ten years in the future — you're not supposed to necessarily realize that it's a multiplayer game as you're playing it.
And they do that by like very carefully allowing
like the sense of another person
being in the world with you to seep out
in a way where it makes you feel
when you're playing it for the first time,
like there are these other characters here
and I can't tell if they're real people
or if they're AIs.
And as you go through
the game which is this like very powerful emotional story told without dialogue or language or
anything like that just told through imagery and the experience of play you go through this very
transcendent emotional experience and you realize that you are going through it together with other
people and you get to the end of the game and it's
like it's the same exact you know gamers are playing that game one moment and then they're
playing halo the next moment and are like just swearing at each other miserably but in this game
you get to the end of it with another person and there's this little like patch of sand on the
ground and this culture emerges where it's like you draw like a little heart or like you know
write this little like expression of your fondness for the other person on the sand on the ground at
the end of the game just because of how they constructed the way that you relate to other
people and they did this very deliberately and it worked and so i think there are there are
absolutely ways that you can structure the way that people are allowed to
relate to one another so that it encourages us to like bring out the best parts of ourselves and
make like really genuine connections rather than just you know indulging in all of our worst impulses
Yeah, definitely. I guess I'm a little bit cynical, because I don't have a lot of hope that that type of thinking... you know, I didn't work on Journey at all, but I've worked with and I know a lot of people that did work on that game, and they have similar thoughts — like, Funomena was founded by two of the developers of Journey, right? So I've had a lot of discussions about that type of thinking with people, and they've almost all been people that have some connection to that team, right? And
then, you know, you kind of go into a larger company like Facebook that does similar work,
and there's just kind of like... it feels almost like ignorance, right? And you see it with newer social apps often too, where they follow a lot of the same patterns, because the focus is really on, you know, user growth and virality and all of these engagement metrics — even if they do say, like, oh, we want to make a safer or more inclusive place. And I think some of it's also just that systems literacy isn't really great. And this goes beyond just, like,
software or social media or you know online games or you know but just like kind of how
society and politics function in general right where? Where like, you know, a lot of systems are taken for granted.
You know, they were made as like arbitrary or kind of ad hoc choices,
or they were made a long time ago, right?
And then they become kind of dogma, right?
This is, and you know, obviously this is Future of Coding podcast, so.
That's the thesis of this show, exactly.
Yeah, a lot of people listening to this will feel the same way, right? But, you know, it extends beyond coding as well, right?
I entirely agree with that.
I actually would love to see social networks and, you know, social VR programs and online games and anything like that... and that's one thing that the whole metaverse push has shown, I think, at least a little bit: software isn't so siloed, right? Like, what's the difference between a browser and a game engine, or what's the difference between a game and a social network or whatever? Really, there's not that much. It's culture, mostly — it's just the culture, and how serious we consider it, right? Like, if something's for business, it's clearly more important, right? Arguably more important — or it's going to get more funding, let's put it that way.
Yeah, or it's going to get more funding, right. Or, you know, games often have marketing problems, right, or they only appeal to a niche audience.
That's why I'm turning this into a gaming podcast. The people here who are going to be building the future of coding had better love games when they're doing it, because there's, you know, just so much value in that that is ignored.
Yeah, but you know, like
it's all it's all kind of the same thing in a lot of ways but but yeah i think i would love to see
these apps and and journey is a great example,
and maybe Sky and some other things are great examples.
I think mobile games take a little bit more
of those lessons to heart, right?
Because they are kind of quick play a lot of the time.
They are, you know, like you're going to get lots of churn
and playing against lots of randoms and stuff,
and they're a more casual player base a lot of the time. Even if people may or may not agree with the monetization strategies in some of these games, the interaction that you experience is a little bit less toxic — I should say, not that toxicity doesn't exist, but
yeah i'd love to see you know a major social network or something that's designed you know
with with kindness or community or whatever first uh and you know obviously that's designed, you know, with kindness or community or whatever first.
And, you know, obviously that's hard.
And whether that happens in VR or on mobile or wherever, it doesn't matter so much, right?
But yeah, I'd like to see that being the first principle rather than like,
how many users can we get, right?
Or how much funding can we get?
Or how much money can we make?
Or how many ads can we sell?
Which is maybe unrealistic, you know know in a lot of ways but yeah yeah i mean i i maybe it is
unrealistic or maybe it's just the moment we're in like i remember myspace very fondly because
myspace was sort of on the one hand a social network in the way that twitter and facebook and
instagram and all those
are social networks about you know people getting together and and talking online and presenting
themselves and maybe peacocking a little bit but much in the same way that instagram is like a
social network about photos myspace was really a social network about music like it was for
musicians at the time it was this just absolute phenomenal thing that happened and now
that we're in this era of sort of with the exception of maybe tiktok and instagram and
snapchat and a couple of those ones like we're in this era where the dominant social networks are
just based on conversation and link sharing and creating the opportunity to inject ad units that are aligned with the way that
the network is being used for communication so that they kind of just slip in there a little more
easily and i think that maybe what it would take for vr if what becomes like a dominant social
relationship on vr is about a thing like it's about like a creative pursuit in the way that
instagram or myspace are like that might and you know instagram's not a great example because
there's plenty of of problems with that that i think maybe myspace avoided just because it was
so early or like relatively not huge in in mainstream culture compared to social networks
of today but i just i feel like like
every new technological paradigm is a chance for a do-over and maybe you know maybe the corrupting
forces of major corporations will ruin it once again this generation will have to try again in
a decade with whatever the next thing is but it just seems to me like, you know, your cynicism is earned, but I'm still going to take the side of, like, hope and, at the very least, using whatever I can to encourage people to, like, just get weird with it and, like, try and find ways to make it interesting and inspiring and about something more than just infinite whiteboards.
Yeah, yeah, definitely. I mean, I think the current,
like at least the current zeitgeist with all the metaverse stuff is that,
and even apps that aren't calling themselves metaverse,
right, there's like quite a few apps
that are getting funding or coming out
that are all nascent, right?
But they're really about like,
and a lot of these have existed for years on console, right?
Like, some of my inspiration was WarioWare DIY, right? Which I'm not sure if you've ever played that game, but it was a DS WarioWare game where you could make your own mini games. And it used a really, really simple scripting language. It had sound editors and, you know, a pixel art editor and stuff, right? And they're 30 seconds long by the nature of the WarioWare format, so it was really easy to make something — and they would do procedural suggestions and stuff like that. But yeah, a lot of people are coming out with new platforms that are designed
to make kind of like micro games or like you know shareable games by end users right and
Nothing has really exploded yet in that space, right? Like, Roblox is once again big, but it's not quite that space in particular. But, you know, as you bring up, with MySpace you had music, right, and that kind of came out of MP3s being widely available and people sharing them around. With Instagram you had photos, right, and that came out of mobile being a thing and being able to do filters on the photos and stuff like that.
Right. And everyone having a camera on their phone.
And later, you know, you saw Snap and TikTok and other networks that have taken, you know, video.
And I guess Vine or, you know, some other ones that take video.
And, you know, because everyone has a video camera on
their phone as well and did that, right? But like, we've kind of done like social video, we've done
social photos, we've done social music, all those things kind of exist out there, right? But, you
know, games are big, but you don't really, and obviously writing happened before all those things,
right? Where you had blogging platforms and,ging platforms and status updates and stuff like that.
But games are big, but the difficulty in making a game is significantly higher than producing those things right now.
So I kind of believe that we're getting to a point where people are starting to think about a wide popularity you
know kind of end user game creation platform but it's still not quite there right we're like pre
pre-MySpace level even there. I think, like, social micro game creation — where it's like I could make a little WarioWare-style "blow all of the nose hairs out of this giant person's face" micro game as easily as making a tweet — which is kind of like the relationship
between like a tweet and like a newspaper of old is sort of like the relationship between like some
hypothetical future very easy to make a game tool that doesn't exist yet but that hopefully someday
will and what it's like to make a game today yeah exactly, and you know, it's a good question of like,
what the audience is for that. And like, how many people would engage with that? It might be huge,
right? Like, you can look at the PICO-8 community — there's a thing called tweetcarts. But I think other communities have similar things, right? With, like, JavaScript and various 8-bit computers that people do this for, and stuff like that, right? Where it's like,
you do
like a a program which is usually kind of a mini demo but could be a game also that fits in a tweet
or two tweets or something like that right yeah or like the little uh procedural animations that
like bees and bombs and all those folks are doing where it's like they make some processing sketch
and then make like a little three second looping gif and put it in a tweet and it's like with all of those things the value isn't so much in the individual tweets or individual little processing
sketches but the values like for one the accumulation of all of this culture of having
made all of those things and the like the way that that lets humans relate to one another i think is
much more valuable than the artifacts themselves.
But then the biggest value, in my opinion personally, is that it teaches you how to relate to different parts
of reality and different parts of the human experience that you know like a social network
where you're building micro games would be valuable not because of the silly micro games
people make but because of the kind of the rising tide effect that would have on people's ability to
think and reason and all of that kind of you know skill development that would happen
as a second order consequence of of people making all these little silly games and sharing them with
each other yeah yeah yeah i think i think i mean even now i think like the kind of vanguard is
game creation right which um or gameplay slash creation right because you have both right in any environment right like
you have consumers and producers um in all these networks so i think way in when we talk about
future stuff right way in the future i think it'll be or maybe not that far in the future right but
like you know five ten years from now right i think it'll be more you know experience rather than game um which is a subtle differentiation
yeah what is a game scott yeah yeah you know because experience still implies interactive
right and if it's in software right does a game have to be interactive i guess it doesn't have
to be interactive does the game have to have goals does it have a win condition it doesn't
have to have goals or win condition i would i would go as far as saying yes it needs to be interactive depending
on your definition of interactivity right um yeah where interactivity can mean explicitly withholding
interaction yeah like yeah there are games where it's like don't push the button that kind of yeah
yeah um but you could choose right you have a choice there you could choose to push the button
right like a choice could be an entirely um you know in that in this sense like all almost all vr
content even if they call it film or whatever is a game right because you have a choice to
where you're looking at right and you could extend that and you can make it super meta and be like
well film's a game because you could decide to pause the film or rewind it and that's i think that's a little bit too meta for me but like
but um i don't i don't yeah the the what is a game discussion is a rabbit hole that i don't
want to get into right now yeah or or ever it's it's very much like the what is programming
discussion now the real discussion is is programming a game yes yes
definitely programming is definitely a game um but uh uh or are games programming some games
definitely are programming so i mean photoshop is programming as far as i'm concerned so yeah
That's true, yeah. I think, you know, I went off on some tangents there, but kind of back to the experience-sharing thing: you think about the metaverse and you think about, you know, large-scale reality capture, right? Objects captured directly. And there aren't really decent consumer tools for this.
There's not really good rendering for this even at this point.
Right.
So it's not ready yet,
even though there's research, right, in NeRFs — neural radiance fields —
which is a way to take a few photos and basically do some machine learning
on it and output a volumetric representation of that photographed object, right?
And it's a little bit beyond like photogrammetry or like LiDAR scanning or whatever, right?
Because it actually captures a full volumetric radiance field.
So the trick with photogrammetry is that it doesn't really capture, let's say, flat surfaces, because it's doing essentially edge detection and it generates a mesh, right? So you don't get transparencies, you definitely don't get volumetric-type effects, right? You end up with weird-looking 3D meshes a lot of the time, and a lot of the time they just look like photo textures — they don't capture the lighting separately, the lighting is baked into the texture.
Yeah, yeah, exactly. And for anybody who has no idea what the hell we're
talking about now we're talking about techniques for basically taking an object that is in the
physical world and getting a 3d virtual version of that object. Yes. Photogrammetry is like one of the, you know, popular techniques for doing that.
This is something that people want to do a lot when you start like trying to make a virtual world.
It's like, I'm holding this thing in my hand.
How do I get it into VR?
And so there's like emerging techniques for doing that.
Yeah.
Yeah.
When I first started working in VR, I had a lot of people come up to me and they're like well how do i just get an actor in vr right and i'm like uh you need a you
know a hundred thousand dollars in a performance capture stage or you know whatever like you know
you need like to hire artists to work for three months or four months to get one character you
know it's like and and there are easier ways to do that.
Even then there were easier ways to do it, right?
You could buy something off the asset store.
Now you could use like character creator
or like Unreal MetaHumans.
Like there are various options
to get modifiable, decent characters in a game faster.
Basically like a character creator from a video game.
Yeah, yeah.
It's like an advanced character creator from a video game. But it's still — like, if you just want to capture yourself
and get a 3d version of yourself that can be animated that's like not trivial right so you
can imagine a future where like that does become trivial all of a sudden right and like you can
you have a pretty easy way to capture volumetric
things in the world, like 3D things in the world, right? And display them and share them with people
and remix them, right? Like, I think that'll be pretty massive, right? And that's like,
that might be at that point, like, if especially we can do it on a relatively large scale,
that might be the point where you have something which you're calling the metaverse, right? Because I think, you know, as much as, you know, a lot of people want fantasy or even prefer
fantasy or prefer like stylized art or graphics, right? I think a lot of people want, like, even
if they don't explicitly say that's what they want, like they would prefer being able to experience things that remind them of the real
the real world or are like in the real world as well um vr tourism is a like if you just ask like
people that aren't really gamers that aren't really like you know super heavy tech people
necessarily right like vr tourism is one of the number one things and there are vr tourism apps
right? Like, you could do a 360 video, or you could rebuild an area by hand and kind of do it, right. But I mean —
I was on stage with Paul McCartney.
Yeah, yeah. But it's still not quite that compelling, right? And you can't do it at scale, right? You do one experience, and people are like, oh, that's a pretty cool experience, but they may not want to return to it,
but you can imagine something where there's effectively infinite content,
like a YouTube or something, right?
All these experiences, and maybe that's more interesting
for people going into the future, right?
To kind of circle this back around a little bit to programming,
sure, object capture like a big part
of it and you can get all of these objects looking photorealistic into the virtual world but unless we
somehow crack programmability like they're not going to behave in a way that is that is an analog
of the real thing and that's like that's why it's like you know i was on stage with paul mccartney
big deal is it's because if i was actually physically on stage with paul mccartney big deal is it's because if i
was actually physically on stage with paul mccartney in the real world i would want to go
over and like play one of the instruments or something like that or like like interact with
it in a way that it was giving back to me as much as i was able to give to it and that was a very
high amount whereas like virtual tourism and whatever it's like what you are able to give back to it
is almost nothing and what you are getting from it is like very very narrow compared to actually
being there yeah exactly so you definitely i definitely agree that you need interactivity
um and i think i i probably should have explicitly stated that right like this is this is on top of
you know you have your your virtual game creation
environment right so you do have programmability right whether it's just like in some cases it
might be fine just to set a set of standard properties right and this is this would be
something that you could do in uh rec room or horizon today right where you say this thing is
a rigid body so it's physicalized. That might be kind of janky in a lot of ways, right — in the sense of, oh, well, is it really rigid, what does the collision mesh on it look like, you know, with today's tech — but you can imagine a future where some of that's solved. We have Nanite for physics, or what have you.
Yeah, that would be a thing. That'd be pretty
amazing, actually. Yeah — like a super-scale position-based dynamics engine or something like that, with lots of levels of collisions or something.
Actually, wasn't PhysX kind of like that, in a sense?
No, PhysX is just the standard physics engine used in Unity or Unreal, right — it's a rigid body dynamics engine, it doesn't do anything super fancy.
I thought it was doing something like very small per-triangle penalty springs, or that kind of stuff, as opposed to your Havok-derived rigid bodies and then soft bodies that have some amount of deformation. I thought it was more like everything is a soft body and there was some fanciness going on there.
No, not in PhysX, at least. I mean, PhysX has a weird history where they briefly built physics hardware — and I'm not actually sure what it did, necessarily — but they got bought by NVIDIA, and they have a bunch of products that are named PhysX, so they do have some PhysX stuff that runs on GPU to do fluid sim and things like that, potentially. But yeah, they haven't really had that scale — or at least, if they do or did, it didn't become part of the... well, so NVIDIA has a thing, which is separate from PhysX, that is a GPU position-based dynamics system that kind of works that way, right? Not that many apps used it — there was a VR tech demo app that NVIDIA made, like a funhouse, that used it. That stuff isn't widely used in games. Basically, that's the answer.
Like, in most popular gameplay types,
rigid body physics isn't used very heavily in games.
You mean soft body physics aren't used very heavily?
Rigid bodies are not used very heavily.
I thought, like, everything's, you know,
like your character's a capsule until it ragdolls.
For collision, yes.
Yeah, your character's a capsule, but almost all character controllers in shipping games, especially popular games, are fully kinematic, right? They don't use dynamics.
Yeah, for, like, locomotion and that kind of thing.
yeah yeah they use like shapecasts and stuff and like most gameplay is just like raycast against
static geometry and but then it'll use dynamics for like you know effects like you know i shoot
the concrete yeah and it's gonna you know crumble and little chunks are gonna roll along the ground
like that's dynamic yeah so so it's actually used a lot and like vehicle sims are a little
bit different and they tend to use it a little bit more so there there are certain genres that
use it more heavily it's it's a little bit more common in vr because you can pick up and throw
objects and like objects need to feel a little bit more physical so you see more of it um and there are games that obviously use it for
Like, a lot of puzzle games use it — you know, like Portal, or the puzzles in Half-Life 2 or whatever — because you're kind of picking up cubes and stuff. My opinions about rigid body dynamics were partially influenced by a puzzle game that I worked on called Shadow Physics, which used rigid body dynamics pretty heavily for some objects in the world, and it was challenging to tune them and stuff like that, right. So I mean, they're used pretty widely — in most games, I would say, ragdolls is probably the most common usage in your average video game.
What a wild tangent.
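(For readers who want the "fully kinematic character controller" point made concrete: a minimal Unity-style sketch. The Unity API calls are standard, but the component itself is hypothetical — the capsule is swept against geometry each frame rather than simulated as a rigid body:)

```csharp
using UnityEngine;

// Kinematic movement: no forces, no mass -- we just sweep the capsule and let
// the CharacterController resolve contacts, the way most shipping games do it.
[RequireComponent(typeof(CharacterController))]
public class KinematicMover : MonoBehaviour
{
    public float speed = 4f;
    public float gravity = -9.81f;
    CharacterController controller;
    float verticalVelocity;

    void Awake() => controller = GetComponent<CharacterController>();

    void Update()
    {
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0, Input.GetAxis("Vertical"));
        Vector3 move = transform.TransformDirection(input) * speed;

        // Simple grounded check plus manual gravity -- still no dynamics involved.
        verticalVelocity = controller.isGrounded ? -1f : verticalVelocity + gravity * Time.deltaTime;
        move.y = verticalVelocity;

        controller.Move(move * Time.deltaTime);  // shapecast-style movement, fully kinematic
    }
}
```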
this episode of the future of coding podcast is brought to you by Glide. Glide's mission is to create a billion software developers by 2030
by making software dramatically easier to build. We all marvel at how successful spreadsheets have
been at letting non-programmers build complex software. But spreadsheets are a terrible way
to distribute software. They are an IDE and the software built in it rolled into one, and you
can't separate the two. One way to think of Glide is as a spreadsheet-y programming model, but with
a separable front-end and distribution mechanism. The way it works right now is that you pick a
Google Sheet, and Glide builds a basic mobile app from the data in the spreadsheet. You can then go
and reconfigure
it in many different ways, including adding computations and building some pretty complex
interactions. Then you click a button and you get a link or a QR code to distribute the app.
The data in the app and in the spreadsheet will automatically keep in sync. For the Glide team,
that's just the beginning. Glide needs to become much more
powerful. Its declarative computation system has to support many more use cases without becoming
yet another formula language. Its imperative actions don't even have a concept of loops yet,
or of transactions. Glide needs to integrate with tons of data sources and scale up to handle much
more data. To do all that, Glide needs your help.
If you're excited about making end-user software development a reality, go to glideapps.com
slash jobs and apply to join the team. My thanks to Glide for helping bring us the future of coding.
I think the folks in this
community already know Replit. I think they know Replit because of this show and because of the
fact that they have been a benefactor of ours for quite some time now, helping to bring us the
transcript that you can find at futureofcoding.org slash episodes slash 53 in the case of this
particular episode, but also because they keep cropping up on other
adjacent shows like the muse podcast the most recent episode that i just listened to
they talk about how replit kind of fits into the space of tools that really minimize the number of
moving parts that you need to concern yourself with if you're just trying to make a little
personal piece of software that's meant to go somewhere and be situated somewhere and just live forever without
needing to be you know tended to and maintained and and and be a constant suck of your attention
because in the case of replit they they abstract away so much of the pointless complexity that we
need to concern ourselves with if we're doing more conventional styles of programming like having to concern ourselves with what operating system our software
is running on and what you know versions of dependencies there are that you need to bring
into your project like having a build tool or or some kind of compiler or something like that or
concerning yourself with the physical hardware that your software runs on. If you, you know, have to do that, if you're setting up some little tool that you want
to run constantly and you put it on some home server, now you have to maintain that home server.
So Repl.it, if you are looking for a way to just make a little piece of software and set it up and
make it available on the internet somehow and let it just run in perpetuity, it's a wonderful place
to do that because they have a very nice contract between
your software and the environment that it runs within and it's a really interesting successor
to like what heroku gives you for example where you know heroku kind of has a very clearly defined
contract between your software and the system so that heroku can concern itself with changing the
underlying things and you don't have to and so replit is like like the next generation of that which is a really interesting way of framing what they offer it's not
It's not just a tool for giving you a nice developer experience, or a nice learning environment if you're new to programming, or a collection of nice integrations if you want to work with GitHub, or a nice multiplayer programming environment if you want to get a bunch of people together on a single screen and have them all typing away, the way we're increasingly used to with things like Google Docs and Notion and what have you. It's also an environment for thinking about making something that just needs to be low-maintenance for the long term. I really enjoyed hearing that framing come up on Muse. So I guess this is a double plug: Replit rules, and the Muse podcast is also really good, so go check both of those out. But in particular, because we're here to thank Replit: go to replit.com, sign up for a free account, and get started playing with a repl in, I don't know how many languages it is now, so let's just say all of them. I have a two-year-old daughter, and when I ask her, "Oh, how many raisins would you like?" she says, "All of them." So yeah, Replit has all of them languages. Go to replit.com. Thanks to Replit for sponsoring that transcript and helping bring us the future of coding.
Yeah, kind of back to the experience thing, and what programming looks like in, you know, a future metaverse thing where you have perfect reality capture. It's very nascent, right? Extremely nascent tech, but I've played with OpenAI Codex a little bit. It's a GPT-3-based transformer model that's trained and fine-tuned specifically for code, and it clearly has a lot of extra software functionality around it to actually function properly. It's basically a coding environment where you write a prompt. So you give it a natural language prompt, kind of like documentation or comments, really. You basically give it a comment. One example I did: they have an environment that will generate web apps for you, front-end web apps, so it generates JavaScript using standard browser stuff, and it might be able to generate React code or some other really popular, common stuff that you see on the web as well. My comment was "draw a spinning triangle using WebGL," and it generates the code for it. And it works, right? You can run it, and it compiles and stuff like that. They've had some cool demos. It's currently in closed beta, or alpha, or whatever you call it. But if you've messed with transformers at all, with any of these large language models, GPT-2, GPT-3, they do a pretty good job of generating readable text. This particular model generates runnable code, which is cool.
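For a concrete feel of that comment-as-prompt interaction, here's a small hypothetical example. The prompt is just an ordinary comment, and the function below it is the kind of completion a model like this might produce; it's invented for illustration, not actual Codex output.

```ts
// Prompt, written as a plain comment:
// Compute the average of an array of numbers, returning 0 for an empty array.

// A plausible completion (invented for illustration, not real model output):
function average(numbers: number[]): number {
  if (numbers.length === 0) return 0;
  const sum = numbers.reduce((total, n) => total + n, 0);
  return sum / numbers.length;
}

console.log(average([2, 4, 6])); // 4
```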
And GPT-3 can actually generate other things too, like well-structured JSON, and SVGs, so you can kind of generate images with it, but that's kind of an abuse of the system; it's not really a side effect. But you have these large language models that right now are still kind of novelties. They can be used for writing assistance. But they have a bunch of flaws, like they'll only generate 120 or 1,024 characters at once, which is very limiting for coding. You have to keep re-prompting it and try to keep a thread together, so it's hard to write larger programs. But you can imagine a future where someone is coding in one of these environments, or in general, and they're partially assisted by an AI, or fully assisted by an AI, and you're mostly writing very detailed documentation.
You don't care about implementation, which is even higher level than what we talked about earlier, I think: you start with assembly language or machine code or punch cards or whatever, then you get assemblers, then some higher-level languages, then better editors and debugging tools, visual programming languages, and other kinds of advanced tools to help you. But maybe the next level of programming assistance looks like something AI-driven.
And then, even then: designing a programming language so that it doesn't require this sort of hack of using an AI model that's trained on reproducing text from examples. What does an AST designed for AI generation look like? What does it look like to just cut out that middle layer of textual syntax? That might be an interesting frontier in which to explore this.
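As a rough illustration of that idea (purely hypothetical, not a description of any existing system), a model could emit structured nodes directly, and the host environment would consume them without ever parsing text:

```ts
// A hypothetical, minimal expression AST that a model could emit directly as
// structured data, skipping the textual-syntax layer entirely.
type Expr =
  | { kind: "num"; value: number }
  | { kind: "ref"; name: string }
  | { kind: "add"; left: Expr; right: Expr }
  | { kind: "mul"; left: Expr; right: Expr };

// The host program walks the nodes; there is no parsing step and no syntax error.
function evaluate(expr: Expr, env: Record<string, number>): number {
  switch (expr.kind) {
    case "num": return expr.value;
    case "ref": return env[expr.name];
    case "add": return evaluate(expr.left, env) + evaluate(expr.right, env);
    case "mul": return evaluate(expr.left, env) * evaluate(expr.right, env);
  }
}

// For example, (x + 1) * 2 arrives as data rather than as a string to parse:
const generated: Expr = {
  kind: "mul",
  left: { kind: "add", left: { kind: "ref", name: "x" }, right: { kind: "num", value: 1 } },
  right: { kind: "num", value: 2 },
};
console.log(evaluate(generated, { x: 3 })); // 8
```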
Yeah, yeah, yeah. I mean, GPT-3 ended up being just kind of the tool they had, right? And it's super expensive; it costs millions of dollars or whatever to train these models. You could imagine an approach that's more like: let's take a bunch of running programs as they exist. This is kind of like super-fuzzing, I guess, where, using various fuzzing tools, people have demonstrated the ability to create programs based on a certain input that can parse that input, or do some specific thing for you. And that's really basic, right? You drive it via, effectively, a genetic algorithm or something like that. But you can imagine, instead of taking a huge corpus of code, you actually take a huge corpus of running programs, and you train an AI (and I imagine a transformer model would not be relevant at all in this case) to generate that code. That's even lower level than the AST level, or any kind of language level. It's just: okay, we either label the program in a certain way, describing what it's doing, or we allow people to input screenshots or mock-ups or something that's similar to a program, and it generates a working program of the thing that you mocked up. And once again, for any of this to be useful, you need to do a lot of curation and a lot of very careful prompting.
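As a toy illustration of that "drive it via a genetic algorithm" idea (purely hypothetical, and nothing like what Codex actually does), here's the shape of a search loop that mutates tiny programs until one reproduces a handful of example inputs and outputs:

```ts
// A toy, hypothetical sketch of searching for a program that matches example
// inputs and outputs, in the spirit of a genetic / hill-climbing search.
type Op = "add1" | "double" | "negate";
type Program = Op[];

// Run a "program": a straight-line list of tiny operations on a number.
function run(program: Program, input: number): number {
  let acc = input;
  for (const op of program) {
    if (op === "add1") acc += 1;
    else if (op === "double") acc *= 2;
    else acc = -acc;
  }
  return acc;
}

// The specification is just examples of desired behaviour: f(x) = (x + 1) * 2.
const examples = [
  { input: 1, output: 4 },
  { input: 3, output: 8 },
];

// Fitness: total distance from the desired outputs (0 means it matches).
function error(program: Program): number {
  return examples.reduce(
    (total, ex) => total + Math.abs(run(program, ex.input) - ex.output),
    0,
  );
}

// Mutation: randomly insert or replace one instruction.
function mutate(program: Program): Program {
  const ops: Op[] = ["add1", "double", "negate"];
  const next = [...program];
  const index = Math.floor(Math.random() * (next.length + 1));
  const replaceOne = next.length > 0 && Math.random() < 0.5 ? 1 : 0;
  next.splice(index, replaceOne, ops[Math.floor(Math.random() * ops.length)]);
  return next;
}

// Keep any mutation that does at least as well, until the examples are matched.
let best: Program = [];
for (let step = 0; step < 10_000 && error(best) > 0; step++) {
  const candidate = mutate(best);
  if (error(candidate) <= error(best)) best = candidate;
}
console.log(best, error(best)); // typically ["add1", "double"] with error 0
```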
And even then, it still falls down a lot of the time, because there are issues with this. With Codex, at least, it's still probabilistic, so you don't get deterministic output based on your prompts. Your prompt will probably do a similar thing, or the same thing, every time you put it in, but it's not going to do the exact same thing. So you can write the prompt once, get a perfect result, and then write the prompt again, and it's not quite what you wanted.
And maybe that's not a bad thing.
I don't know.
Yeah, I don't mind that so much, because that's how people program too.
Yeah, yeah.
So I don't know. It's an interesting thought, and it's not necessarily related to VR in particular, but...
It's just another nascent technology that will have an influence on how we program.
Yeah.
Well, let's just wrap it up here. We've got two hours and 20 minutes' worth of wandering in the wilderness, which is exactly what this show is about. Even though we hit basically one of the five different areas we could have gone into, I think we hit it really hard, and that's super cool.
All right. Well, it was good talking to you. Hopefully that's a good set of material. We did go off into the weeds sometimes.
But, you know, that's what this show is all about. That's why Muse can do their tight, you know, 45-minute episodes, or, like, the Notion podcast, they can do their tight 45-minute interview with Alan Kay. Over here, we're going to get fringy. We're going to get weird. We're going to go very deep into hypotheticals and that kind of wilder side of futurism. So thank you, Scott, very much for going on that wander in the VR woods with me.
Okay, cool. My pleasure.
And that's the end of the interview.
Thanks, Scott, for coming on the show.
Thanks to Replit and Glide for sponsoring,
and thank you to you for listening.
You can rate the show or leave a review,
but don't, because that's not how this show spreads.
It spreads by word of mouth,
and so don't bother. It's fine. We have another interview coming shortly. I've already got it recorded, with Ella Hoeppner about the Vlojure programming environment, so stay tuned for that.
That's all for today, so I will see you in the future. Thank you.