CppCast - Game Development with C++ and Javascript
Episode Date: January 14, 2016

Rob and Jason are joined by Mark Logan to discuss his experience building a game engine in JavaScript and C++.

Mark started learning C++ with Borland Turbo C++ in high school, so that he could... build video games. After 20 years, he's finally starting to feel like he knows what he's doing. After graduating from Northeastern University's College of Computer Science, Mark spent 7 years at Google, mainly working on internal infrastructure and automation. More recently, he returned to his first love - game programming - and helped found a studio called Artillery. He's currently the tech lead on Artillery's free-to-play RTS, code-named Atlas. He spends his time working on performance optimization, networking, and solving cross-platform development problems.

News
New cppcheck released
How to make your own C++ static analyzer with clang
Improving your build times with Incredibuild and VS 2015

Mark Logan
@technicaldebtor

Links
Artillery
Artillery Blog
Transcript
This episode of CppCast is sponsored by Undo Software.
Debugging C++ is hard, which is why Undo Software's technology
has proven to reduce debugging time by up to two-thirds.
Memory corruptions, resource leaks, race conditions, and logic errors
can now be fixed quickly and easily.
So visit undo-software.com to find out how its next-generation
debugging technology can help you find and fix your bugs in minutes, not weeks.
Episode 41 of CppCast with guest Mark Logan recorded January 14, 2016. In this episode, we talk about building static analyzers in Clang.
Then we'll talk to Mark Logan from Artillery.
Mark will talk to us about his experience building a game engine in JavaScript and C++.
Welcome to episode 41 of CppCast, the only podcast for C++ developers by C++ developers.
I'm your host, Rob Irving, joined by my co-host, Jason Turner.
Jason, how are you doing tonight?
Doing all right, Rob. How about you?
Doing pretty good.
As I was just telling you, the next few weeks might be a bit hectic for the show.
I'm doing a business trip, and I'm also going to be moving again,
as I think I mentioned to you before, but I haven't brought up on the show yet.
It shouldn't be as much of a move.
I'm just moving across town, basically.
But I still have to do a lot of packing, and it might interrupt the show a little bit.
So apologies ahead of time to all listeners if we miss a week or two.
But I just wanted to let you all know.
You know, it really doesn't matter how far you move, you still have to pack everything you own and then unpack
it again.
That part just takes some time.
Anyway,
at the top of every episode, I'd like to read a piece of
feedback. This week, Matt
writes in about last week's episode
and he just wanted to say how
glad he was to run across the podcast a few
months ago. He really enjoys C++ and your guest discussions have been really good.
The more technical, the better.
He's very biased towards the Unix and Linux platform as a C++ development environment
because he's been working on it since the 80s.
So it was nice to hear that UndoDB was geared towards developers like him.
And he hopes we continue to include Linux as at least an equal peer to Windows. For him personally, that's one of the best aspects of your podcast: so many other programming podcasts focus on Windows or Mac development environments exclusively, so yours is unique in that way.
Yeah, I've tried really hard to make sure we're balancing Windows and Unix-specific topics, because
I think that's representative of the C++ community.
What did you say, Jason?
Yeah, I almost feel like there's maybe more Unix C++ developers than Windows, but that
could just be a complete fabrication.
I don't know if anyone's done a study on the numbers.
I remember way back when we did a poll asking about development IDEs, what people are using,
and I know Visual Studio
was up there.
Just based on that alone, you'd think there
are a fair amount of Windows
developers for C++.
Yeah, I'm sure that's true.
Anyway, thanks for writing in, Matt.
We'll definitely keep Linux topics in the show.
Don't have to worry about that.
And we'd love to hear your thoughts about the show as well.
You can always reach out to us on Facebook, Twitter,
or you can email us at feedback at cppcast.com.
And don't forget to leave us reviews on iTunes.
Joining us tonight is Mark Logan.
Mark started learning C++ with Borland Turbo C++ back in high school
so that he could build video games.
After 20 years, he's finally starting to feel like he knows what he's doing.
After graduating from Northeastern
University's College of Computer Science,
Mark spent seven years at Google, mainly
working on internal infrastructure and automation.
And more recently, he returned
to his first love, game programming,
and helped found a studio called Artillery.
He's currently the tech lead on
Artillery's free-to-play RTS,
codenamed Atlas. He spends
his time working on performance
optimization, networking, and solving cross-platform development problems. Mark, welcome to the
show.
Thanks, guys. Glad to be here.
So a lot of people got their start on the Borland tools. I personally kind of missed
those, but I love cross-platform development problems myself. Do you have any exciting
anecdotes that you would like to share?
Oh, I have several that are super technical
and might be a bit of a digression to start the episode off with.
You know, that's cool.
I think the most interesting thing I've had to deal with recently, and I guess we're going to be talking about C++ and JavaScript today, is the biggest cross-platform issue I've run into there: basically, interoperating C++ and JavaScript on Windows. The stack frames become undecodable, because V8, which is the JIT that we use,
uses its own calling conventions that Windows stack-walking code
doesn't know how to deal with.
So that's been kind of an ongoing tax on our development.
But yeah, that does sound terribly exciting.
Maybe we can dig into that a little bit later here.
Yeah, sure.
Okay, before we get to that, though, let's just go over a couple news articles real quick.
And Mark, feel free to chime in on any of these.
The first one is a new version of CPP Check.
Jason, did you want to talk about this one?
Yeah, anyone who's been paying attention to the stuff that I talk about knows I love static analysis tools, and using all the tools you possibly can for your project. There's nothing terribly notable, I don't think, about this particular cppcheck release, but I just wanted to let the world know: yes, cppcheck is still in development, and you should use it. It's a completely free, open-source static analysis tool.
Right, yes. I think you talked about this on our very first episode together, when I had you on as a guest, and you talked about how it was one of the better free GUI analysis tools. Or analysis tools that came with a GUI?
Yeah, it comes with a GUI, and it comes with Visual Studio integration if you want to work that way, or it also works great from the command line on Linux and Mac OS.
Very cool. So, kind of building on that, if you want to go and build your own static analyzer, there's this great blog entry going over how you can go about making one in Clang. I don't think I've seen such an in-depth article really going through how to use Clang in order to build your own static analysis tool. It's definitely something we've discussed a couple of times on the show, though, right, Jason?
yeah and it looks like a potentially a great primer for getting you started with libclang with C++.
The only other decent tutorial I've seen was using Python with libclang.
Oh, okay.
That sounds really... Oh, sorry, go ahead.
No, I was just going to ask you if you wanted to bring anything into the conversation.
Oh, I did read the post about building a static analyzer with clang.
I'm going to have to look into it more
because the thing I've always heard about Clang
is that it's built basically to make it easier
to develop tools on top of it,
easier to write your own extensions to the compiler.
But I've never had a chance to do that
or seen a lot of really good intros on how to do that.
So it seems pretty interesting.
Yeah, this article definitely provides that entry point
if you want to get started with seeing the power of Clang
as a library and analysis tool.
Yeah.
Okay, and this last one, this is actually an older article,
but I don't think we mentioned it before coming from the Visual C++ blog.
And they're just highlighting that in the newest release of Visual Studio 2015,
they added support for IncrediBuild,
which was a paid-for tool separate from anything Microsoft provided.
And they're now including it, bundling it in with versions of Visual Studio 2015,
which is pretty cool.
And what IncrediBuild does, if you haven't heard of it,
is if you have multiple machines with your coworkers
and you all have Visual Studio installed,
you all have this IncrediBuild package installed,
then when you go to build,
you can parallelize the compilation across multiple machines and get done significantly faster.
I think that sounds really interesting to me.
Compile times are sort of one of the big downsides of C++.
One thing I'd really like to see is a good compiler cache for Windows on C++.
If you guys know of one,
or if any listeners know of one, I'd love to hear about it.
I've been looking recently.
I have looked also and have not found one.
Yeah, it's kind of crazy
that it doesn't exist.
Well, I think Incredibuild provides part
of that, but I'm also a little confused
as to exactly what is free.
It seems like it's just the standalone, single-host instance of IncrediBuild that is free, which still gives you a good performance improvement because it can better parallelize on your local box.
Oh, is that what it is? Okay, so it's kind of just an entry into it. They're not giving you everything.
I think if you buy IncrediBuild from the company, you can
install it on as many machines as you buy licenses
for. Then all those machines can
take advantage of their processors to improve your builds.
It does work well. I do know I have a client that is using it.
They get good use out of it.
Very cool.
Okay, well, Mark, we have you on here, and as you mentioned, we're probably going to be talking about some C++ and JavaScript.
Let's get started with, just tell us a little bit about what you're developing with at Artillery.
Sure.
So we started out a couple of years ago trying to build a pure JavaScript game engine, and went quite a ways with that, and eventually determined that JavaScript, at least native, idiomatic JavaScript, was not going to be fast enough. So we started taking an incremental series of steps, getting into C++ more and more.
The first part of that was basically using Emscripten to compile C++ into JavaScript in the form of asm.js, which is a restricted subset of JavaScript. Doing that got us massive speedups. At this point, basically what we have is a game engine that runs in the browser and as a native application. This is all accomplished through either compiling C++ into JavaScript and running it in the browser, or through what is essentially a native port of the engine that embeds V8, runs the C++ code natively, and then calls back to JavaScript in order to run the JavaScript parts of the game.
So it's been kind of a pragmatic process. I think what's interesting about it is that we've come at things somewhat in the opposite direction from a lot of game studios. Typically they'll start out developing in pure C++ and then add a scripting language, often Lua or something like that, or their own in-house scripting language. We sort of started with a scripting language and then later started putting C++ into it, and arrived at a somewhat similar place in the end.
Right.
So you said that Asm.js, excuse me, C++ compiled to Asm.js was actually faster than the equivalent native, if you will, JavaScript?
Yes, it's far, far faster.
Interesting.
There's a couple of reasons for that. Probably the most important is that if you're just writing idiomatic JavaScript, there's very little you can do to get cache coherency. If you're using a native JavaScript array, you don't even necessarily know that it's contiguous. I'm sure in general it is, but the individual elements in it are probably going to be boxed numbers, so really every element in the array is going to be a pointer to somewhere else. And if you have an array of objects, it's even worse.
If you are writing C++ and you are controlling your memory layout, and you have just a big array of structs that you need to iterate through, that all gets compiled into JavaScript, but the JavaScript ends up looking a lot like assembly. Essentially, the heap in JavaScript is just a big typed array. For people not familiar with JavaScript, a typed array is basically an abstraction onto sort of raw memory that exists in most modern JavaScript VMs. Then you can have your C++ code executing in JavaScript, and all it's really doing is accessing contiguous elements in this array, just like you would be accessing contiguous addresses in memory if it had been compiled to native code.
Interesting.
Yeah. So that's, I think, the biggest reason, at least on modern processors, where cache coherency is so important.
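To make that concrete, here is a minimal C++ sketch, purely illustrative rather than Artillery's actual code, of the kind of loop that benefits: an array of structs stays contiguous when Emscripten compiles it, because the whole C++ heap becomes a single typed array.

    // Illustrative only: a contiguous array of structs, iterated in order.
    struct Particle {
        float x, y;    // position
        float vx, vy;  // velocity
    };

    void integrate(Particle* particles, int count, float dt) {
        // Compiled with Emscripten, these become sequential reads and writes into
        // one big typed array (the emulated heap), so the access pattern stays
        // cache-friendly. An idiomatic JavaScript array of objects would instead
        // chase a pointer to a boxed object for every element.
        for (int i = 0; i < count; ++i) {
            particles[i].x += particles[i].vx * dt;
            particles[i].y += particles[i].vy * dt;
        }
    }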
The other thing is that
when you compile C++ to JavaScript,
the JavaScript that you end up with,
it can be type monomorphic.
You never have to worry.
If you have a function that gets called
in asm.js
and it was produced from C++ code,
then you're never going to get a different type
for one of the parameters of those functions
because the type checker in C++
has already verified that that's not happening.
And so once you're doing that,
it basically means that a lot of the dynamic nature
of JavaScript is gone.
And different
JavaScript VMs take different approaches
to this.
Firefox's VM
called SpiderMonkey will
actually take the ASM.js
and just blast out native code from it.
So it just recompiles
the ASM.js to
native code.
V8 doesn't do that, but V8 is a heavily JIT-based engine,
and it's going to have a much easier time optimizing the code when it's type monomorphic
because it can say, okay, the last 20 times this function ran,
we got an int for this parameter and we got a float for the second parameter, or whatever.
And then it can
produce optimized code for that,
and it will never have to bail out of that optimization
because it's never going to get a call
to that function that violates
the measurements that
it had already taken that guided the
JIT optimization process.
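As a small illustration of that point, assuming nothing about Artillery's code in particular: a C++ function's signature fixes its parameter types, so the asm.js that Emscripten generates for it is only ever called with those types.

    // The C++ type system guarantees this is always called with (int, float), so the
    // JavaScript emitted for it is type monomorphic and the JIT's specialized code
    // never has to be discarded.
    int scale_damage(int base_damage, float multiplier) {
        return static_cast<int>(base_damage * multiplier);
    }
    // A hand-written JavaScript function doing the same job has no such guarantee: one
    // call site might pass a number and another a string or undefined, forcing the VM
    // to deoptimize or fall back to slower, generic code.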
Does that make sense?
Yes.
At the risk
of immediately diving into the weeds here,
I'm curious
what kind of performance differences
we're talking about. If you have just
ballpark numbers like idiomatic
JavaScript versus native
C++ versus
C++ compiled
with Emscripten.
Sure.
So ballpark,
we've been
kind of in this process of gradually
rewriting different pieces of the engine
in C++.
And in general, what I found is that a lot of the code we're rewriting is stuff that just has to do a very simple thing to a lot of different entities. For example, I don't know if you guys play many RTSes like StarCraft, but typically you'll have a little minimap in the corner that shows the positions of all the units on the game map. That's just code that has to go through all the units and put the position of each unit into a vertex buffer that gets uploaded to the GPU, so that it can be rendered as essentially a dot on this map. So all this code is doing is going through an array of things and writing an x, y coordinate into a memory address. When we moved that code into C++, the overall speedup was between 5 and 10x.
And the marginal speedup, in other words, the cost of adding one additional unit into that, disregarding the overhead, the marginal speedup was like 20x, just compiling to JavaScript.
So it was hugely, hugely faster.
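A simplified sketch of the kind of minimap loop being described, with illustrative names rather than Artillery's actual API:

    struct Unit {
        float x, y;  // world position
        // ... other gameplay state ...
    };

    // Walk every unit and write its position into a vertex buffer; the buffer is then
    // uploaded to the GPU and each entry is drawn as a dot on the minimap.
    void fill_minimap_vertices(const Unit* units, int unit_count, float* vertex_buffer) {
        for (int i = 0; i < unit_count; ++i) {
            vertex_buffer[2 * i + 0] = units[i].x;
            vertex_buffer[2 * i + 1] = units[i].y;
        }
    }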
And almost everything that we ported in this way,
we just suddenly didn't have to worry about it anymore.
It just became almost too small to measure, which was really nice.
And then we had more complicated systems like our renderer,
which has to do a lot more interesting things.
And it's not just sort of straight line, just blast through memory in sequential order.
That was not as dramatic an overall speedup, but it seems that the big benefits we get are at the margins. You can just keep adding stuff and it doesn't change very much.
But we only got like a 2 or 3x speedup from where we started on that one. So in general, I would say going from idiomatic JavaScript to C++ that's compiled back into JavaScript, you can get somewhere between 5 and 20x, depending. And then asm.js is commonly measured to be something like 1.5x to 2x slower than native C++ code.
But that's kind of a moving target because V8 and SpiderMonkey,
they just keep getting better and making things faster.
So have you tested it against IE's new JavaScript engine that was just open sourced?
We have not, we've been too busy for that. But it sounds fun.
Just curious.
We'll get to it if we can.
So, just to pull it back a little bit,
I had not heard of Project Atlas or Artillery much before going into this interview, but I just did a quick search and I found this interesting article: "How Project Atlas could kill the console and change gaming forever." So this is a pretty serious game.
That article, oh boy. Where was that written again? Was that on The Daily Dot?
The Daily Dot, that's where I found it.
Okay. So when you're fundraising, you tend to have a lot of these somewhat more hyperbolic articles written. Right now, I think we're focused on making one fun game, and we'll get around to killing the console later, I guess.
It looks like
comparing it to StarCraft 2, though, is
a pretty good comparison, just judging
by the images I'm seeing on the internet.
It looks like a pretty beautiful
looking game, and it would be impressive
to see this running in a browser.
Yeah, so it is inspired in a lot of ways by StarCraft.
It's one of our favorite games.
I don't know what image you're looking at. There are some very old ones out there that we kind of cringe at when we look at them now.
But I'm sorry.
I think I lost my train of thought.
Was there a question in there?
I'm not sure if there was, really. A question I did have, though: when you first started and you were looking at JavaScript, were you aware of Emscripten at the time, and you just chose to go with pure idiomatic JavaScript?
We were aware of it, but we didn't start out from day one building Atlas. We started out building some much simpler games that really had no need for the higher performance requirements. At a certain point we decided to move on to a much more ambitious game,
and that was when it started to become necessary.
So we had one demo that was essentially a clone of Bomberman, the old NES game,
and we were trying to position ourselves as providing tools and middleware for games on the web.
And we pivoted away from that at a certain point.
Did you consider any other languages
besides JavaScript at that time?
Since we were targeting the browser, no.
Oh, I remembered what I was going to say.
The game, in regards to your earlier comment,
the game does run in the browser right now.
And we use it a lot for internal development. We probably won't ship a playable run in the browser right now, and we use it a lot for internal development.
We probably won't ship a playable version in the browser because there's some somewhat insurmountable user experience problems in the browser.
Specifically, the escape key always takes you out of full screen, and there's nothing we can do about it and so um but we probably will be shipping uh
replay viewing uh and observing in the browser which we're pretty excited about because we think
it's it's something that hasn't been seen before in in a web browser for sure okay so what does
your overall architecture look like then do you have any anything that is compiled to native code from C++?
Oh, yeah.
We now mainly ship the game to playtesters as a native application, but it's the same engine that also works in the browser. The basic architecture is that the code that in the browser would have been compiled with Emscripten gets compiled to native code in the native application, and we call it via the V8 C++ API instead of calling it as Emscripten code inside the same JavaScript VM. So from an application perspective, it looks more or less the same. Underneath, it's asm.js in the browser or native code on the native clients.
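A minimal sketch of how a single engine entry point can serve both targets. This is a generic pattern rather than Artillery's actual build setup; the __EMSCRIPTEN__ macro and EMSCRIPTEN_KEEPALIVE come from the Emscripten toolchain.

    #ifdef __EMSCRIPTEN__
    #include <emscripten.h>
    // Browser build: keep the symbol alive so the asm.js module exposes it to JavaScript.
    #define ENGINE_EXPORT EMSCRIPTEN_KEEPALIVE
    #else
    // Native build: an ordinary function that the hand-rolled V8 binding layer wraps.
    #define ENGINE_EXPORT
    #endif

    extern "C" ENGINE_EXPORT void engine_update(float dt) {
        // ... advance the simulation by dt seconds ...
    }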
So do you have a lot of moving parts to manage to make that all work, or does it all kind of happen naturally?
I wouldn't say moving parts. Basically, when we started transitioning to the native platform, we had to build a binding generator.
So Emscripten has, I think, three different ways to generate C++ bindings that you can access from JavaScript. The one that we use is called the IDL binder, where you write your binding definitions in WebIDL. I don't know if you guys have seen IDL before.
I've seen IDL from a long time ago, things called IDL.
Okay.
So there's a subset of it called WebIDL, which I believe the Chrome team uses to generate their internal bindings, or maybe it's Mozilla that uses it, for all the JavaScript APIs that browsers present that are actually implemented in native code.
Okay.
So Emscripten built a binding generator based on the same IDL language, and we started using that. Then when it came time to make the transition to a native application, we had all these binding definitions written, so we just wrote a binding generator that could generate V8 C++ bindings for all our code, so that we could call into it from JavaScript.
Interesting. I was going to ask if you tried using SWIG, but SWIG doesn't give you the Emscripten side of it that you need; it only gives you the V8 side.
Right, exactly. And possibly we would have done that if we had gone in the reverse order, but since we started with Emscripten, that was the world we were living in. And the binding generator turned out not to be a huge amount of work. Since we were starting with a very clearly defined problem space, it basically just provided the same functionality that Emscripten provides, so it just became a matter of writing a bunch of code to spit out C++.
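As a rough illustration of that workflow, here is a hypothetical bound class, not one of Artillery's real ones. The WebIDL description and the matching C++ class are fed to Emscripten's IDL binder, which emits the JavaScript-facing glue; Artillery's own generator consumes the same kind of definitions but emits V8 C++ bindings instead, so the native build exposes the same JavaScript API.

    // WebIDL description handed to the binder (shown here as a comment), roughly:
    //
    //   interface Unit {
    //     void Unit();
    //     void setPosition(float x, float y);
    //     float getX();
    //   };
    //
    // The binder generates glue so JavaScript can write:
    //   var u = new Module.Unit(); u.setPosition(3, 4);
    class Unit {
    public:
        Unit() : x_(0.0f), y_(0.0f) {}
        void setPosition(float x, float y) { x_ = x; y_ = y; }
        float getX() const { return x_; }
    private:
        float x_, y_;
    };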
Right.
Yeah.
The unfortunate thing is that the size of the bindings becomes prohibitive once you have, well, not prohibitive, but it becomes inconvenient once you have a reasonably large application. We have, I think, about 400 classes that are exposed.
400 classes?
Classes, structs, etc. And some of that is third-party software. But anyway, it's enough that our native binding generator spits out about one and a half megabytes of C++, which takes a long time to compile, unfortunately.
I've played with Emscripten in the past and tried to compile heavily templated code, and it doesn't necessarily like that. It seems like I'm overwhelming the symbol table or something, I don't know. It seems to take a very long time.
Emscripten?
Yeah, the last time I tried. But there have been large improvements
every time that I've tried it, and it's been a while now.
I was just wondering if you've hit any limitations with it or had any problems.
The one thing with Emscripten that is a bit of a bummer is that there's no incremental linking. It's all built on top of Clang, so the front-end part of it compiles all your C++ files into LLVM bitcode. It's a custom fork of Clang, but it's mostly just Clang. So you end up with all these bitcode files, and then the Emscripten linker turns those all into JavaScript, which is kind of an interesting process, because obviously JavaScript is not assembly or machine code, so you don't have jump instructions and branch instructions. There's this process where they have to turn LLVM bitcode back into JavaScript constructs like ifs and functions and things like that. So that takes a while.
All the translation-unit-level optimizations happen in LLVM, so they get to take advantage of all that. But once it gets turned back into JavaScript, they have another pass that actually re-optimizes that JavaScript, and that part of it takes a while. So you end up with this link step, which is basically a non-incremental link for your whole project, and then an additional optimization pass over the output of that. I might have gotten some of the details wrong, but that's basically how it goes. So you end up with essentially just a long linking step even when you've only changed one file.
So that is a bit of a bummer.
Is it prohibitively long?
Like, does it mess with your development times?
No, it's probably, lately it's gotten up to maybe a minute sometimes.
So it's inconvenient.
It's not prohibitive.
I'd love for it to be faster,
but I try not to be too critical
given that the fact that
Emscripten even exists at all
is kind of a miracle.
I want to interrupt this discussion
for just a moment
to bring you a word from our sponsors.
Do you spend half your programming time finding and fixing errors?
Is printf your default go-to when you encounter a bug?
At Undo Software, they know that debugging C++ can be hard.
That is why their next-generation debugging technology for Linux and Android is designed for C++ users
and is proven to reduce your debugging time by up to two-thirds.
Harness the reversible debugging capabilities
of UndoDB, the reversible
debugger for Linux and Android,
and step backwards as well as forwards
to find the root cause of a bug.
Use watchpoints to reverse continue
straight to the time a variable was last changed.
Memory corruptions, resource leaks, race conditions, and hard-to-find bugs can now be solved quickly and easily. Visit undo-software.com for more information and start fixing bugs in minutes, not weeks.
One question I had: you're working on this game engine, and you want other game developers to be using it, I'm assuming. Are they going to be working with a JavaScript API, or will they be able to work with it in C++ as well?
You know, when we started, we were much more focused on building something for third-party developers. We came to the conclusion at a certain point that most platform companies get started by releasing a successful game.
Unity is the obvious exception, but other than that, there aren't a lot of really strong exceptions to that rule.
So right now we're just very focused on building a fun game, and the kinds of considerations about third-party development are not as important to us right now. I think if we do want to do that later on, after launching this game, it will involve a pretty thorough redesign or rewrite of the engine. But if somebody were to just jump into the engine right now, it would be essentially a mix of the two. There are some things that you want to do in JavaScript, and there are very clear ways to do it. And then the C++ side of things has its own set of abstractions that are oriented more around ensuring things like sequential memory access and cache coherency, and all the things that you want to do in order to make your code really fast.
So does the interaction between C++ and JavaScript go both ways, or is it only the JavaScript side calling the C++ libraries?
It goes both ways.
The mechanism that we use to do that is: in the IDL binding generator, the Emscripten one, and also the one that we built that mimics it, you can define an interface, which basically causes the binding generator to make a subclass of an abstract class, and then you can provide implementations of the virtual methods in JavaScript.
Right.
So you can pass in a pointer to your class that has these JavaScript virtual method implementations on it. You pass that into C++, and C++ calls that code, so it can call some virtual method. That call goes back into the bindings, and the bindings dispatch the call back to JavaScript. Then you have a function in JavaScript that takes some arguments that have been suitably handled by the binding layer. If you're just passing a number, you just get a number. If you're passing a C string into the virtual function, then, at least in Emscripten, you're going to get just a pointer as the argument, and then you have to call a special Emscripten method to turn that pointer into a string by reading it out of the heap. So it's a little finicky, but it works very well in practice, and it makes it quite easy to do most of the things that you typically want to do.
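Structurally, that pattern looks something like the following sketch, with illustrative names and a hypothetical dispatch helper standing in for the generated binding code:

    // Declared to the binding generator as a JavaScript-implementable interface.
    class InputHandler {
    public:
        virtual ~InputHandler() {}
        virtual void onUnitSelected(int unit_id) = 0;
    };

    // Hypothetical hook provided by the binding runtime (illustrative, not a real API):
    // looks up the JavaScript object registered for this instance and calls the named
    // method on it.
    void dispatch_to_js(void* self, const char* method, int arg);

    // The generated subclass: each virtual override marshals its arguments and forwards
    // the call into JavaScript. Numbers pass through directly; a C string argument would
    // arrive in JavaScript as a heap pointer that the script reads back out of the
    // typed-array heap.
    class InputHandlerJSProxy : public InputHandler {
    public:
        void onUnitSelected(int unit_id) override {
            dispatch_to_js(this, "onUnitSelected", unit_id);
        }
    };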
Well, it sounds like a great opportunity, then, to bring up the Windows problems that you mentioned right at the beginning of our talk.
Okay, yeah, so that's been a fun one. Actually, let me back up a bit and talk about the challenges of debugging in an environment like this, because I think that's probably one of the biggest downsides.
I was just thinking about that.
Yeah, and this will tie into the stack issues.
So if you're developing simply in Emscripten in the browser, with C++ and JavaScript, it's not too bad, because if you put a breakpoint somewhere and then you look at your call stack at that breakpoint, the JavaScript functions will of course just look like JavaScript functions. For the C++ functions, first of all, if you have optimizations on, things can get inlined, and obviously anybody who's using GDB regularly is familiar with that, but you don't always expect to see that in a browser developer console, so that can be a little confusing. You also need to be careful about what optimizations you turn on, because Emscripten will also run the Closure Compiler, which is basically a super powerful minifier for JavaScript, and it will replace all your function names with super short strings like ZN4 and Q8 and things like that.
So assuming you're not running into any of those issues, you get a call stack that basically shows you exactly what's happening. And if you actually look at the C++ code and you want to step through it, it's not really any worse; arguably, it's probably even easier than reading straight machine-code assembly. The JavaScript that gets produced looks a little bit like assembly, because all the variables are like $0, $1, $2. So it's like, okay, I guess those are my registers. And then for your memory reads, instead of a read instruction, you have something like HEAPU8 and an array index. So it's fairly straightforward to follow, but certainly not nearly as good as actually looking at your source code as it executes.
When you're doing native development is where it gets a little hairier, because now you have to choose: as I'm debugging my code, am I going to use GDB, or am I going to use the V8 debugging interface? You can either debug your JavaScript, in which case all your native code just shows up in the call stack as "native": it just says native and you can't find out anything else about it. Or you can debug in GDB, in which case you'll see all your native code, assuming you have debugging symbols of course, and then the JavaScript stack frames in between those are just memory addresses, because the JIT has just spat some code into some location in memory and called it, and there's no way for GDB to figure out what that address is.
I'm sure it would be possible to fix. Obviously, I've been saying V8 a lot; it's not the only JavaScript engine, but it's the one that we use in our native builds. Somewhere inside V8 is the information about what function that actually is, but building the plugin or whatever for GDB that would be able to find that out would be an interesting project, if anybody wanted to do that.
And then things get even worse on Windows
where if you are debugging in Visual Studio,
if you put a breakpoint somewhere in your C++ code
in Visual Studio and stop at that point,
if you've been called from V8, once the stack reaches the point of being inside the V8 JIT, it just goes off the rails, because V8 uses a calling convention that's kind of like cdecl, but not really. In any case, Visual Studio certainly doesn't know what to do with it, so you get these stacks that just wander off into insanity.
It'll say a thousand frames of the same function or whatever.
So that's probably one of the biggest drawbacks about doing things this way.
So that's only a problem when you're trying to debug from Visual Studio and inspect the interaction between the two?
Pretty much, yeah. And it hasn't really hurt us too much in practice. We're able to
debug most non-Windows specific issues on Mac as well. And the calling conventions are similar enough
that the debugger seems to have much less trouble most of the time.
Okay.
So do you find yourself choosing one debugging environment,
the native versus the JavaScript, just because it's easier?
Are you able to debug your issues, most of them,
in the JavaScript environment, for example? So typically, we still do a lot of our sort of
day-to-day development purely in the browser. Okay. And so, you know, 90% of the debugging
that we're doing is just kind of everyday logic bugs where, you know, something just isn't working
the way it's supposed to in the game. So you can go into the browser and pretty quickly zero in on it. The trouble is when we have platform-specific issues, like when we're trying to figure out why some GL call isn't doing what we wanted or something like that, and then you're stuck on the native platform. But even then, in practice, it hasn't been too bad, because we're typically looking at a pretty isolated piece of code. So the fact that you have five stack frames that make sense and then a thousand frames of nonsense, if we're only looking at those couple of stack frames, it hasn't been too much of an issue.
Cool. So were you and your team excited about the WebAssembly news a few months ago?
I've been following that a little bit. As I understand it, the major benefit of that is basically improving load times,
because you have something that is a lot faster to parse for the browser
than actually parsing JavaScript.
And since what we're developing, we have a lot of asset loading, basically, anyway,
so we have to load textures and meshes and all that stuff.
So the actual parsing time of our asm.js isn't all that bad anyway, compared to loading hundreds of megabytes of textures and meshes and things like that, so it's not a big pain point for us. But I think it's great that they're doing that. It's probably going to help a lot for smaller applications that aren't loading all this other stuff, so the parsing phase is much heavier for them.
Right. This all sounds kind of magical, like you can just compile your C++ to JavaScript land and it's magically almost as fast as native code. Why does anyone even write anything in JavaScript at all anymore, I'm sitting here wondering?
Well, myself, in a word: memory safety. I think that would be the big one. People always argue about dynamic typing versus static typing and which one saves more time, and personally I don't worry too much about that. I don't worry about anything until I have to deal with memory corruption, and then it's your worst nightmare. So yeah, I wouldn't advise that anybody start writing web apps in C++ unless they have a real performance need to do that, because yes, it is a lot faster,
but you do have to deal with all the C++ issues
that we all deal with when we're just writing nothing but C++.
Right.
Well, when do you project the game is going to be coming out, if people are interested in looking into it?
So, we've started doing limited external playtests. We had our first one in early December, with about a thousand people coming in to test out the game, and basically over the course of this year we're going to be doing a bunch of similar tests, growing each one, with the goal of being in a much wider beta by the end of the year.
Possibly
limited, possibly public
beta, but we're not totally sure.
Okay.
Where can people find more information about Artillery and Atlas online?
Artillery.com.
Cool. And where can people find you online if they want to see more of your stuff?
Well, I have a somewhat limited online presence. I was actually inspired to contact you guys after hearing, I forget which guest it was, but somebody was talking about dark matter developers. I don't know if you guys remember that episode. I said, hey, that sounds like me. But I do occasionally write posts on the Artillery blog; I have a couple up there. We haven't had a lot of time for blogging lately, because we've been pretty heads-down.
Okay, great. Well, thank you so much for your time, Mark.
Thanks a lot, guys.
Thanks for joining us.
Thanks so much for listening as we chat about
C++. I'd love to hear what you
think of the podcast. Please let me know if we're
discussing the stuff you're interested in, or if
you have a suggestion for a topic, I'd love to hear that
also. You can email all your
thoughts to feedback at cppcast.com.
I'd also appreciate
if you can follow CppCast on Twitter
and like CppCast on Facebook.
And of course, you can find all that info
and the show notes on the podcast website
at cppcast.com.
Theme music for this episode
is provided by podcastthemes.com.