CppCast - C++ Tour, Compilers and FASTBuild
Episode Date: December 7, 2017
Rob and Jason are joined by Arvid Gerstmann from Appico to talk about his new C++ Tour project, building your own C compiler, using FASTBuild, and more.

Arvid Gerstmann is a passionate programmer and computer enthusiast, with a focus on writing high-performance C++. His areas of expertise include, but are not limited to, writing compilers, implementing the included standard libraries, and creating game engines and games. He is currently the CTO of Appico. If he is not programming, he enjoys reading books while drinking a nice cup of self-brewed coffee. He currently lives in the sunny Hamburg, Germany.

News
Intel offers Parallel STL implementation to GNU libstdc++
Exceptions vs expected: Let's find a compromise
Interactive workflows for C++ with Jupyter
C++17 published
C++Now Call for Submissions
Embo++ call for papers and tickets are on sale

Arvid Gerstmann
@ArvidGerstmann
Arvid Gerstmann's blog
Arvid Gerstmann's GitHub

Links
Appico
Cpp Tour
FASTBuild
Kai Wolf's Effective CMake Book

Sponsors
Undo
JetBrains

Hosts
@robwirving
@lefticus
Transcript
Episode 129 of CppCast with guest Arvid Gerstmann
recorded December 6th, 2017.
This episode of CppCast is sponsored by Undo. Bugs like race conditions and logic errors can now be fixed quickly and easily. So visit undo.io to find out how its next-generation debugging technology can help you find and fix your bugs in minutes, not weeks.
And by JetBrains, maker of intelligent development tools
to simplify your challenging tasks and automate the routine ones.
JetBrains is offering a 25% discount for an individual license
on the C++ tool of your choice,
CLion, ReSharper C++, or AppCode.
Use the coupon code JETBRAINSFORCPPCAST during checkout at jetbrains.com.
In this episode, we talk about exceptions versus return codes.
Then we talk to Arvid Gerstmann from Appico.
Arvid talks to us about a range of topics.
Welcome to episode 129 of CppCast, the only podcast for C++ developers by C++ developers.
I'm your host, Rob Irving, joined by my co-host, Jason Turner.
Jason, how are you doing today?
I'm doing pretty well. How are you doing, Rob?
Doing okay. Getting into the holiday spirit, I guess.
Good. You know, well, there's a local channel that plays nothing but Christmas music
from Friday until
that's just the radio station that my car is set to now
yeah my wife always likes to listen to all the holiday music too
I can kind of take it or leave it
but I'm not opposed to it
I love it, even though it's like the same 10 songs over and over again, basically. I just keep singing along. I'm okay with that.
The same 10 songs played by, like, a dozen different singers over the years.
Yeah, yeah.
Well, at the top of every episode, I like to read a piece of feedback. This week we got a tweet, you should like this one, Jason, from John Foltz, referencing last episode at 16 minutes and 15 seconds:
You mirrored his frustration with CMake docs.
A great tool, but impoverished documentation makes for a steep and unpleasant learning curve.
Doesn't help that Google hits never seem to target the up-to-date docs.
Yeah.
Yeah, there was a later tweet in that conversation that said that the wiki pages aren't even officially updated, I guess.
Oh, really?
Yeah. Well, there's another tweet that I want to point out to you. This guy Kai Wolf is saying he is trying to fix that, meaning the bad CMake documentation, and he linked to EffectiveCMake.com.
It's Effective-CMake.com.
And it looks like he's writing a book or maybe an online book.
And it currently says it's at like 40% completion.
Yeah.
Yeah, maybe we should have him on when he's a little bit closer to being done with the
book.
Yeah, it looks like he's self-publishing it, if I am understanding that correctly.
It's an ambitious project.
Yeah, definitely. Well, we'd love to hear your thoughts about the show as well. You can always reach out to us on Facebook, Twitter, or email us at feedback@cppcast.com, and don't forget to leave us a review on iTunes.
Joining us today is Arvid Gerstmann. Arvid is
a passionate programmer and computer enthusiast with a focus on writing high-performance C++.
His area of expertise includes but is not limited to writing compilers, implementing the included standard libraries, and creating game engines and games.
He is currently the CTO of Appico.
If he is not programming, he enjoys reading books while drinking a nice cup of self-brewed coffee.
He currently lives in the sunny Hamburg, Germany.
Arvid, welcome to the show.
Welcome. Nice being on here.
Yeah, but is it sunny this time of year?
No. That is the joke in that sentence.
Okay.
Hamburg is not exactly known to be sunny. Unfortunately, we are right now in the very
dark and very rainy season of the year.
Do you get a lot of snow where you are?
No, unfortunately not.
We used to.
Hamburg has a very big lake in the middle of the city.
And a couple of years ago, the lake froze, the complete lake.
And what Hamburg City decided is to put the whole Christmas market on the frozen lake, which is actually pretty damn amazing if you think about it, and it was a lot of fun. But since then we haven't had that much snow, unfortunately, and if we get snow, it becomes icy very quickly and, yeah, not white anymore. Brown snow.
It's not much fun.
That's unfortunate.
Yeah.
Okay, well, Arvid, we've got a couple news items to discuss,
so feel free to comment on any of these,
and then we'll start talking to you about all the projects you have going on, okay?
Perfect. Sounds good.
Okay.
So this first one is Intel offering their Parallel STL implementation to GNU libstdc++.
I thought this was a pretty interesting offer, and apparently I think they offered it to Clang as well.
Yeah, I saw a follow-up comment, I think from Marshall, saying that they offered it to Clang as well,
but I did not see any conclusion to either of those offers if they were accepted.
Yeah, this was made pretty recently,
just about a week ago, it looks like it went out.
This is where
those two standard library implementations
are still lacking, but I believe
the Visual Studio implementation
is quickly coming into compliance.
Yes.
I think I saw some tweets from STL this week
that they're already at 75%, I think.
If you follow Billy O'Neal on Twitter, he's working on that and tweeting about his progress very often.
Very good, actually, his progress and/or frustration with the implementation of it.
Yes.
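For listeners who haven't tried them, what's being donated here are implementations of the C++17 parallel algorithm overloads; a minimal sketch of what using them looks like (assuming a standard library that ships them):

```cpp
#include <algorithm>
#include <execution>
#include <vector>

void sort_in_parallel(std::vector<int>& v) {
    // Same algorithm as before, plus an execution policy telling the
    // library it may run across multiple threads.
    std::sort(std::execution::par, v.begin(), v.end());

    // par_unseq additionally permits vectorization within each thread.
    std::sort(std::execution::par_unseq, v.begin(), v.end());
}
```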
Okay. The next article is from foonathan's blog: Exceptions vs. Expected, Let's Find a Compromise.
And it looks like this blog post was actually in reference or in response to two earlier blog posts,
one from Simon and one from Vittorio, both past guests on the show.
Yes. Have you been following along with this discussion at all, Rob?
I read this post.
I didn't have a chance to go through Simon and Vittorio's post yet.
You've been following this somewhat, Arvid, I believe.
I've seen your comments on Twitter.
I've been following that, yes, of course.
Yes.
I have a very strong opinion on exceptions.
Okay, well, I'll give a quick summation,
not of the full argument, but just of what's been going on.
There's been back and forth, like four blog posts now, plus associated Reddit discussions. And we haven't mentioned in the news that TartanLlama just posted on his blog a call for more data on exceptions. So now that's five articles and a very long Twitter thread
also about exceptions
versus using algebraic types
for returning error data.
So go ahead and give us
your strong opinion now, Arvid.
Actually, yes,
I have a strong opinion
against exceptions.
And I found the blog post by Jonathan actually very, very good.
He did a perfect summary of why exceptions are usually not the best thing you can have,
but sometimes are desired.
And although he did a very good blog post,
I think Boost.Outcome should have had a more prominent shout-out, I guess, because it actually offers you a perfect symmetry between having algebraic data types and having exceptions. So the blog post mentions the dual interface of Boost.Filesystem, which has one function which throws, and one function which returns an error code in an out variable. And if you use Boost.Outcome, for example, you can replace that with one single function, and this function, based on how you call it, will either throw or return you an error type, an error code, which you can then handle.
How does it...
It still has to be based on what parameters you pass to it or something, right?
It is an out...
No, it's a return value. It's a Boost.Outcome type with the value and an error. So it's similar to expected, but you can replace exceptions with that. And based on whether you have exceptions turned on or off, you can use them, or just crash, or use the return value.
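A rough sketch of the single-function idea Arvid describes, written against C++23's std::expected rather than Boost.Outcome's actual API (the file_size signature here is made up for illustration); the caller picks error-code-style or exception-style handling at the call site:

```cpp
#include <cstdint>
#include <expected>
#include <string>
#include <system_error>

// One function replaces the Boost.Filesystem pair of overloads
// (a throwing one, and one taking an error_code out-parameter).
std::expected<std::uintmax_t, std::error_code> file_size(const std::string& path) {
    if (path.empty())   // stand-in for a real failure
        return std::unexpected(std::make_error_code(std::errc::no_such_file_or_directory));
    return 42;          // stand-in for the real size
}

std::uintmax_t with_error_codes(const std::string& p) {
    auto r = file_size(p);
    if (!r) return 0;               // inspect r.error(); no exception machinery involved
    return *r;
}

std::uintmax_t with_exceptions(const std::string& p) {
    // .value() throws std::bad_expected_access<std::error_code> on failure.
    return file_size(p).value();
}
```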
And I have my perennial complaint with all of these types
that it makes you have to reason harder, if you will,
about the lifetime of your objects.
When you're returning something into an algebraic
type, do you now need to use std::move to get the efficiency that you would have gotten with return value optimization in the past? And when you get the algebraic type back out on the other end, if you need to extract that object, you have to pay the cost of another move, which is not free. It is a thing.
That is... absolutely, I share that sentiment. It doesn't make that much of a big difference, I think, but it can make a difference if you have tons of those, and especially given that exceptions, if they don't throw, are basically costless, right?
And in the hypothetical case that you've got something like a std::array that you're returning, that doesn't really have a move mechanism, it could be potentially expensive.
That is potentially very expensive. I've seen that in code I've implemented on my own,
where I have something similar to std::expected, and yes, if I have a type which is not movable, I usually go for a different solution, maybe an out parameter or something, because it just is too expensive to use.
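A small sketch of the cost being discussed, again assuming a std::expected-style type: a large std::array has no cheap move, so putting it into and pulling it out of the algebraic type pays for full copies that a plain return with RVO would avoid:

```cpp
#include <array>
#include <cstddef>
#include <expected>
#include <utility>

struct parse_error {};

using block = std::array<std::byte, 4096>;  // "moving" this still copies all 4 KB

std::expected<block, parse_error> read_block() {
    block b{};
    // ... fill b ...
    return b;                         // moved into the expected: a 4 KB copy
}

void consume() {
    auto r = read_block();
    if (r) {
        block local = std::move(*r);  // extracting it: another 4 KB copy
        (void)local;
    }
}
```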
Well, so all of our listeners can go and read all of those blog posts, and read up on the 30-message Twitter thread or whatever it was, and make your own opinion.
It's a thread.
Okay, this next article is from the Jupyter blog,
and it's about interactive workflows for C++ with Jupyter.
And I feel like we've talked about Jupyter on the show before.
We have mentioned it.
We have mentioned it, but I wasn't really sure what Jupyter is used for primarily.
Do you remember, Jason?
In my head, Jupyter is like this program, I don't even know if it still exists, that was called MathCAD. MathCAD, I think that was the right word. Now I have to try to Google that.
You mean... you don't mean MATLAB?
I don't mean MATLAB.
Okay. I have seen Jupyter used in a lot of Python-related code.
Okay.
Especially math, something you would implement in MATLAB and then port it over to Python in a Jupyter notebook.
Yeah.
So the basic concept is that you can program
or generate whatever, graphs, whatever, computationally
inside the middle of a document to some extent
so you can write up a paper and have it do the calculations and the graph generation in the middle of the paper for you.
Correct, that's what I've seen, yes.
Okay, and that's my understanding, yeah. MathCAD does still exist, just for the record. The initial release was 1986.
Okay. And I guess this article is talking about how they can now embed C++ code along with, you know, the graphs and such that you're talking about.
Yes.
Okay, that's pretty interesting. And with a proper API, so you can use C++ to generate GUI components and whatever in the middle of your document.
That's awesome.
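To make that concrete, a cell in such a notebook is just C++ handed to a Cling-based kernel; a hypothetical cell might look like this, with the printed output appearing inline under the cell (exact display hooks depend on the xeus-cling setup):

```cpp
// A notebook cell: includes, definitions, and statements all work, and
// anything written to stdout shows up right under the cell, in the
// middle of the document.
#include <iostream>
#include <numeric>
#include <vector>

std::vector<double> samples{1.0, 2.5, 4.0, 8.5};
std::cout << "mean = "
          << std::accumulate(samples.begin(), samples.end(), 0.0) / samples.size()
          << '\n';
```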
Another great use of cling.
Yeah, I personally have often felt like cling is a solution looking for a problem, but I would, I do, I really do like the idea of using it here.
I can offer you a problem later on because I have one.
Okay.
And although I haven't yet, I looked into Cling, but I found it to be very undocumented.
Yeah. But you are using it?
I've been looking into using it. Right now I'm using kind of a workaround, just calling Clang in a subprocess, which works but is really not that cool a solution.
Right. Okay. And then the next thing we have, and this isn't really an article but more of an announcement, is that the ISO C++17 paper is finalized. We already talked about how, you know, that basically was accepted a couple months ago, but you can now actually go and buy a hard copy of the standard if you want to.
Or a PDF.
Or a PDF, yeah.
Is that something you guys have ever done?
I mean, I've never thought about buying a hard copy of the standard.
But I guess if you're a compiler writer, then you definitely need it.
Even as someone who has written a C compiler, no, I haven't.
I can see, you know,
so it's supposed to be only cosmetic changes
from the last final draft of a particular version
to this PDF.
But if you were like working at Microsoft or something,
I could see having the official copy
so that you know what those differences were
would be valuable.
Yeah.
And with current exchange rates,
so our listeners know,
it's approximately 200 units of money,
regardless of what your currency is, right?
I mean, we're fairly close in the euro, the Swiss franc,
and the dollar at the moment.
Fairly close.
Yeah, on this link I have, it's not showing the rate in US dollars.
It's showing it in CHF, which I'm not sure what that currency is.
That's Swiss francs.
Okay.
Yeah.
Interesting.
Okay.
And then two more quick announcements.
We have a call for submissions from C++ Now,
which is going to be in Aspen again in May 6th to 11th, 2018.
And how long is that call for submissions open?
Oh, February.
January 24th.
January.
Yeah, and they'll send out decisions on the 26th of February.
Do you have your plan yet, Rob?
No.
Do you?
I do actually have my plan.
Do you have your plan, Arvid?
I have one or two talks in my head, but I'm not
decided yet, so I've got to do some brainstorming
there. But you are going to submit and
consider coming this year?
Yeah, I do. Would this be your first
year to C++ now? It would be, yeah.
Okay, that's what I thought.
It's a fun conference. Highly recommended.
And then we also have
an announcement about the
Embo++ conference
saying that they're also doing call for papers
although you just have this tweet here
Do they have an actual link on a blog, Jason?
I actually couldn't find a link
but I wanted to make everyone aware that this was published in November
late November
that they still have a call for papers open until the end of December.
And it'll be, I guess, too late for our listeners to buy an early bird ticket. If I read the website right, the early bird tickets expire tomorrow.
Yes.
Okay. Well, Arvid, let's start talking about, first of all, this C++ Tour that you are working on.
Yes, that's a project I want to actually announce here.
I haven't talked to anybody about that.
I wanted to take the unique opportunity.
Yes, it's a project which started on the C++ Slack, actually.
And it aims to create a kind of interactive tour
through the C++ language,
showing programmers who are beginning to program in C++, but maybe have some experience in another programming language, how you write C++ code.
The idea originally stemmed from the Golang tour,
because if you go to the Golang website, golang.org, there's an input window for writing Go code and there's a little tour button, and if you click on that, you come to a separate page and it guides you through every feature, or through most features, of the language. Like, here's how you write a function, here's how you write an interface, here's how you slice an array, here's how you declare a map, and so on and so forth.
And the C++ tour is aiming to provide the very same for C++, showing you how to write
a function, how to declare a variable and maybe even later have a tutorial for beginners.
So complete beginners can jump right in, have a list of lessons where they can go through,
like writing your first program, and then obviously everything will be dynamic and interactive.
So you will always have a code editor on the right and a lesson on the left,
allowing you to write code and see the lesson and
test everything you just learned. And another extension we might have in the future is to show new language features for already experienced C++ programmers. So, for example, for the C++17 standard, we could show, hey, this is an optional, you can use optional like this, it is useful for that specific reason, and here's class template argument deduction, whatever.
What are you going to be using for the interactive compiler? Is that going to be with Cling, which we were just talking about?
No, right now the solution is to use Wandbox, because Wandbox actually offers us a full API.
Although
in the future we might consider
migrating away from that and building our own
system, but
it's in very early stages. We started working
on it like a week ago,
and decided on Wandbox first.
So what is the current status?
Our current status is we have started
and we are looking for people to help us making that happen
because my goal is to make this a community effort,
like CPP references, for example.
So if you have a lesson on the left, you will always have a button to go to GitHub. Everything is right now hosted on GitHub, so you can open an issue with a lesson, or even edit it directly and submit the changes to us.
Okay, so do you have, what website should people go to to get involved?
Right now, the best way to get involved is to go
to github.com/Leandros/cpp-tour, or hit me up on Twitter, hit me up on the CppLang Slack. I'm @ArvidGerstmann on Twitter and Arvid on the CppLang Slack.
Okay.
Right now we have four people, and every person is appreciated to help with the project.
That sounds like a pretty ambitious project. Like, it could be a virtually unending project, really.
Yes, I have a talent for starting such projects, although I actually plan to go through with this one, because it helped me immensely to actually get started writing Go, although I haven't done that in a long, long time. All the other languages I look at, just to try out, it's like, yeah, here, download that compiler, that IDE, install that, install this.
And five hours later, you got yourself
your working environment.
And just to have a tutorial where you can jump in
and start writing code the second you start it,
it's just priceless for me.
Yeah, or five hours later, you're still wondering
why your IDE can't find the compiler you downloaded
Yes, exactly.
Yeah, it sounds like a really good goal. Do you have a specific outline of what things you want to cover first, so people can start contributing straight away?
Yes, we have a couple of issues opened, and there's a list of lessons we are working on or are going to start working on. So you can just look into that and start writing lessons, and we can then talk about including them and making that happen.
So I'm kind of curious. I mean, you said you're using Wandbox, which I know does not show the disassembly like Compiler Explorer does.
Yes.
And I've kind of gotten to this place in my own talks
that I show Compiler Explorer,
and whether or not the assembly is actually relevant
to the particular thing that I'm discussing, it's there.
Like, this is a normal thing.
We should all be used to reading x86 assembly.
And I don't know if I've gone off the deep end here or not. So did you guys talk about that at all in your determination of what to show and how to interact with the user, the learner?
We looked into that, and it appears that GCC, oh, Compiler Explorer, sorry, actually has an API as well.
It does, yes.
In the future, we will probably make use of it,
but hide it under some sort of settings or button
just to not confuse the beginner first.
Okay.
Yeah, I think the beginner is more interested
in seeing the return of the program
and not the disassembly output.
Yeah, well, you know.
If you come to the point where we can show new features like optional,
having the disassembly is absolutely useful.
I agree with you.
Yeah.
Yeah.
Okay, well, do you want to tell us a little bit about your background
in implementing high-performance systems?
My background?
That's a long story.
Long story short.
I've started implementing games
and it all rolled from there on.
Given that you need to have 60 frames per second nowadays
or even back then,
and you have about 16 milliseconds for a single update loop, you have to consider a lot of stuff very, very differently compared to a normal C++ programmer.
So did you work on any games that any of us would have heard of?
Probably not.
Probably not?
Probably not. I did a lot of game implementation behind the scenes, and the only games I've published so far are mobile games. Unlikely that you've played any of those.
Never know, but...
You never know.
That's right. You can
search for Tesla Tubes on the App Store, for example.
It's by Kilo
and our studio, Vuni Games.
It is written
at a time where C++11
and C++14 were currently very new.
And a lot of my current opinions on C++ stem from that project, actually.
Since auto was new, and since, well, you got new stuff, you use it.
And almost always auto is not a rule I follow anymore, let's say that.
Well, that's interesting, I guess, a side diversion here.
So you used to follow almost always auto,
and now you don't, you're saying?
Yes, I pretty much had auto everywhere,
and even if I haven't had auto,
I use decltype of something I declare with auto.
So I took it very far.
Right, that's pretty extreme, yeah.
And at one point I just noticed, like, this code becomes kind of unreadable if you put auto literally everywhere.
So do you still use auto, though?
Yes, of course. Auto is wonderful, especially for something like an optional return, I put that in an auto, or if you have iterators.
But the usual thing a lot of C++ programmers praise is like,
you declare an int and you put auto i equals a number.
I don't see the value in that anymore.
That's an interesting perspective,
because I think I've read some people who have gotten so turned off from auto that they've gone the opposite way, and they basically don't use auto at all. And I think that that's a real waste right there.
That absolutely is. Auto has absolute uses. It's just that the rule of almost always auto is probably not always good to follow.
Right. You're okay with bool, you're okay with int every now and then.
Absolutely, yes.
Cool. And auto has some uses where you can't get around it, I mean, if you have a lambda which is generic, for example.
Yes.
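A small sketch of the distinction being drawn here, with made-up names: auto pays off where the type is long or unutterable, and buys little when the type is already short and obvious:

```cpp
#include <map>
#include <optional>
#include <string>

std::optional<int> lookup(const std::map<std::string, int>& m, const std::string& key) {
    if (auto it = m.find(key); it != m.end())   // iterator: auto earns its keep
        return it->second;
    return std::nullopt;
}

void use(const std::map<std::string, int>& m) {
    auto result = lookup(m, "answer");                   // optional return: auto again
    auto twice  = [](const auto& x) { return x + x; };   // generic lambda: auto is required

    int count = 0;                                       // "auto count = 0;" buys nothing here
    if (result) count = twice(*result);
    (void)count;
}
```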
So in your quest to, you know, hit 60 frames per second, were there common performance issues that you ran into?
Common performance issues, usually you have...
There are some common things, yes.
One of the many things Donald Knuth said is like,
premature optimization is the root of all evil.
And I found that to be true in a lot of cases.
You write some math code in Intrinsics
and it turns out to be slower than what the compiler actually does.
I like that. I would love to see examples of that.
Yeah, I would love to see them too.
I think I've eliminated them in the engines we used in the past.
At least I hope I've eliminated them. Let's say that um one of the other common issues i found is i'm a lie i like as just as you i like
seeing and i also like writing x86 assembly or assembly in general and i did rewrite a bunch of
math code or a bunch of other code in assembly, which also sometimes turns out is not always the best idea to do. Sometimes even assembly can be worse than what the compiler, even handwritten assembly
you looked at and think it's the best thing, can be worse than a compiler can do nowadays.
Compilers are very, very clever. I just realized we mentioned intrinsics,
you had mentioned intrinsics, and I don't know if they have ever come up on the podcast before
would you mind explaining to our listeners what intrinsics are?
Sure, when we talk about intrinsics I think we are talking about
the Intel intrinsics generally and mostly
the Intel intrinsics are direct access to
the underlying x86, SSE, SSE2, SSE3, and so forth, and AVX instructions.
So you have, for example, an idiom called SIMD, single instruction, multiple data. You have a packed byte array, let's say, which contains four floats or two doubles, for example, and then you can do a mathematical operation on that. You can multiply those two doubles, or multiply those four floats with another four floats, which allows you to, for example, do operations on a vec4,
on a vector of four floats
in one single instruction
instead of four separate instructions.
Okay.
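A minimal sketch of the SSE flavor of what Arvid describes (x86 only; the function name is made up): four float multiplications collapse into one packed-multiply instruction:

```cpp
#include <xmmintrin.h>   // SSE intrinsics

// Multiply two packed groups of four floats, roughly what a vec4 * vec4
// in a math library boils down to.
void mul4(const float* a, const float* b, float* out) {
    __m128 va = _mm_loadu_ps(a);     // load 4 floats (unaligned load)
    __m128 vb = _mm_loadu_ps(b);
    __m128 vr = _mm_mul_ps(va, vb);  // one instruction, four multiplications
    _mm_storeu_ps(out, vr);          // store the 4 results
}
```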
And your experience was
you went through all this hard work
of using this intrinsics
and it didn't necessarily gain you anything.
Sometimes it doesn't necessarily gain you anything, exactly.
This usually comes from if you have a ton of code
which is doing mathematical operations
and sometimes the compiler is clever enough
to eliminate all the writes and all the reads
you actually need in between.
But since you're using intrinsics,
you kind of hide what your actual intent is.
And hiding the intent from the compiler
is usually not a good thing.
Though this has been quite a while back.
I'm not sure if compilers still do that.
I would say they don't.
You're saying there's better chance
that they can see around your intrinsics
and still optimize the code?
Yes, they optimize around
intrinsics a lot nowadays, and
I think they also used to do back then, but
if you do
something like switching between AVX
or SSE
in the middle of some other
operation, you pay the price
for that, for example.
I can imagine. It's not good to confuse your compiler.
No, definitely not.
I wanted to interrupt this discussion for just a moment
to bring you a word from our sponsors.
You have an extensive test suite, right?
You're using TDD, continuous integration,
and other best practices, but
do all of your tests pass all the time?
Getting to the bottom of intermittent and obscure
test failures is crucial if you want to get
the full value from these practices.
And Undo's live recorder technology allows you to easily fix the bugs that don't otherwise get fixed.
Capture recordings of failing tests in your test suites and debug them offline so that you can collaborate with your development team and customers.
Get your software out of development and into production much more quickly and be confident that it's of higher quality. Visit undo.io to see how they can help you find out exactly what your software really did
as opposed to what you expected it to do and fix your bugs in minutes, not weeks.
Since we're talking about compilers, you mentioned when we brought up the ISO C++17 paper's availability that you have written your own compiler. Do you want to tell us a little bit about that experience?
Oh, I have, yes. Many call me a madman for doing that. I have written my own C compiler and my own libc. In hindsight, it might not have been the best idea, but it was fun.
I have an interesting, let's say, relationship to C++, in that I write a lot of C++, and then I see some terrible code or write some terrible code and just jump back to C, because, well, it's easy and simple. And then I see why C is not always the best language
to implement something.
And usually the normal thing you would do is just,
yeah, go back to C++ and implement it in C++.
I did implement my own C compiler
and implemented a few extensions on top of it.
Like, I wanted to have RAII
and
implemented the
RAII GNU extensions,
I think, which kind of
allow for a similar idiom to use in C.
Interesting.
I didn't know that existed.
I think GLib,
the GTK
standard library, uses that a lot.
Okay.
I believe that.
Well, glib is kind of like, it's an entire framework that should have been written in C++ that's written in C.
Yes.
And if there's any C fans listening, they'll probably think that I just made a terrible comment there, but it's totally true.
It is object-oriented. It is using RAI everywhere.
And it is using many more C++ idioms I don't know right now, but I guess.
And it definitely should have been implemented in C++, or at least something which has those idioms or supports them in the language.
Right.
So how far did you get with your C compiler?
It works.
It compiles.
It is self-hosting at this point, actually.
It works.
It doesn't just compile still,
even after working on it for like two years now, I think.
Writing compilers turns out to be quite hard.
So is there anything that you learned that C can do that you wish you could still do in C++, if you will?
Yes. If we talk about current C++, yes, there are things you cannot do in C++ that you can do in C. One of those is, for example... I always forget what they're called. Let me Google that real quick. It's kind of bad to be live Googling. Yes, as the resident C guy, I shouldn't forget what the features are called.
Anyway, C99 introduced a feature
which is called Compound Literals, for example,
where you can have a function call
which takes a struct, for example.
And you can then, which takes an...
Wait, let's go back.
It doesn't take the struct by value,
but it takes the struct as a pointer.
It takes the pointer to this struct.
You can then declare that struct inside this function call
and pass a pointer to it.
So it creates a temporary object,
passes the pointer to the function,
and then executes the function and returns.
This is a way in C to get named function arguments, for example.
Okay.
It's a beautiful, wonderful feature.
Nobody knows it,
and it's one of those features where you can tell,
or go to a C++ programmer and say,
we can do that in C.
Can you do that in C++?
And they look at you like, what is that?
That is pretty cool, actually.
So that works because you can name,
you can, what's it called,
where you can initialize values of your struct by name in C.
Yes, designated initializers.
Which has just been approved for C++20, I believe.
Correct, yes.
It will come in C++20 with a few minor tweaks, though.
For example, designated initializers in C allow to zero initialize everything.
So if you have a designated initializer and you are initializing your struct,
all members you do not name will be initialized to zero.
Okay.
In C++, this will not work.
But you could give them default initialization values,
I would imagine.
I would imagine that.
I haven't looked that detailed into the proposal,
but I think you talked about that on the podcast as well,
if I remember correctly, I think with Gore.
Perhaps.
And he mentioned that, for example,
that you also cannot initialize the members in another order than they are declared in their struct,
which you can do in C, for example.
It all makes sense in the sense of C++, because you have default initialization and one member can depend on another member, so you have to initialize them in the correct order.
Yeah. Although I've never thought about using it like... For me, the designated initializer, I'm like, yeah, that's interesting, but why do I want it? But
the idea of using it as basically named parameters,
passing them into a function, like you're saying,
but in this case using a const reference stack variable,
you know, it could be pretty cool.
That could definitely be pretty cool, yeah.
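A sketch of that idiom in C++20 terms, with hypothetical names; the C99 version would pass a compound literal instead, e.g. draw_line(&(struct line_opts){ .width = 2 }), and would zero-initialize the unnamed members where C++ falls back on default member initializers:

```cpp
struct line_opts {
    int  width  = 1;        // default member initializers stand in for
    int  color  = 0x000000; // C's zero-initialization of unnamed fields
    bool dashed = false;
};

void draw_line(const line_opts& opts) {
    // ... use opts.width, opts.color, opts.dashed ...
    (void)opts;
}

int main() {
    // C++20 designated initializers: effectively named arguments.
    // (Designators must follow declaration order; skipped members keep their defaults.)
    draw_line({ .width = 2, .dashed = true });
}
```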
So in addition to writing your own compiler,
you said you also wrote the standard library
to use with that compiler?
Yes, I did.
It is, I think, so far the only C11 libc I know of which runs on Windows.
That you know of. There might be one which actually runs on Windows, but it's not the MinGW implementation? C11... is this glibc? That's the GNU ABI, it's not using the normal MSVC ABI.
It's using the MSVC ABI, yes. It runs on Windows and is C11, correct.
Okay.
Well, there goes my argument.
Well.
But that's, yeah, so that's also a side comment
that the Visual Studio one isn't quite up to C11 yet, right?
No.
It is very good in terms of C99 right now
because C++11 required C99 features.
And even though the MSVC compiler
is commonly known as a very bad C compiler,
it does actually support C99 features which are not available in C++,
like the designated initializers and the compound literals.
So it's come kind of far since MSVC's C98 heritage.
Right.
C89, sorry.
C89, right.
Have you ever attempted to
enable the extensions on the compiler
that you happen to be on that would let you use
these C features in C++?
I have to assume
GCC at the very least would allow you to do that.
GCC allows you to use
designated initializers, for example.
Yes, I actually have never
turned them on, honestly,
now that you say it.
I'm not suggesting you do.
I'm just wondering if you've done it.
I should give that a try, actually.
I'm interested how designated initializers
are actually implemented in C++ on GCC.
Right.
Did you run into anything particularly difficult
when implementing your own standard library?
Difficult, yes.
The standards are very vague
compared to what we actually think of
both in implementing compilers
and implementing standard libraries.
The documents try to not define anything. Like the C standard, for example, tries very, very hard to not define what a pointer really is.
Yes.
And so do the standard libraries, obviously. The functions are defined as: this is the function, this is what the function takes, this is the input, and this is what the function should output. There are one or two guarantees you have to think about, but that's about it; you can achieve that in any way you want.
So when you say it tries really hard to not define what a pointer is, for example, do you mean they try to not say, like, what size it should be, just so that it can work on any kind of system? Or do you mean that they don't even specify at all that a pointer is something that points to another location in memory?
They don't specify that at all, as far as I know, because of, for example, segmented pointers in the good old DOS days. You had far and near pointers. And so they could, for example,
you have like right now,
but something very specific or different, I guess.
Right.
Both C++ and C still try to be super portable
to very niche systems.
You can run C
compilers on these piece where there's
like 36-bit ints and
no other size.
Everything's 36-bit or something.
You like that idea, Rob?
36-bit everything.
No, I don't.
I think PDP-11 also had 36-bit words.
Was it something like that?
Yes, that sounds right.
Yeah, something like that, yeah.
I was recently looking at a system,
and I almost never talk about C++ Weekly on here,
but I might be talking about this system on an episode coming up.
That was 12-bit words, I believe, for everything.
Oh, what system was that?
PIC-16.
I've never heard of that.
Microchip still makes the PIC microcontrollers.
This was a particularly old flavor of it.
All right.
I'm going to have to double-check the specs on that
before I make an episode about it, of course.
But yes, anyhow.
But yeah, that's the problem, let's say, of C and C++ to some extent, that they try to be as portable as possible and be as vague as possible as well. A lot of my problems actually came from the preprocessor. The preprocessor is, in my opinion, very underdefined in what it actually allows you to do. There are a few things like, this is UB in the preprocessor, or this is implementation defined, please do not depend on it in your code. But I can think of only two or three different passages where this is the case for the preprocessor, and everything else is like, yeah, this is the macro, please give us this as the preprocessed output; how you do that, what you do in the end, I don't care.
Did you take any liberties, then, when you were implementing your own version, to make it better or different?
My restriction came from the fact that I wanted to use my previously written C code,
which was written on MSVC and was using the MSVC standard library
or the Windows headers to some extent, rather, not the library.
And to use the Windows headers, I had to implement some of the extensions
which the MSVC preprocessor uses.
Some of which are technically undefined behaviors.
So your program, if you include any Windows header, is technically undefined.
Well, that means that every program that includes a Windows header should just return zero and exit, basically, right?
I wouldn't take it that far. But there's this one particular construct in the preprocessor: if you have a macro and you concatenate other macros inside it, and what those macros expand to is 'defined', and then you have put another macro behind that, the MSVC preprocessor still evaluates that 'defined', even though you concatenated it from other macros, which is technically not allowed in the preprocessor. But Windows relies on it in some of the headers, and both GCC and Clang actually implement that as well, but with a very big warning you cannot turn off: please don't use this.
I like that, warnings that can't be disabled.
Yeah.
So you are also a bit of an evangelist for FASTBuild. For listeners who haven't heard of that, can you tell us a little bit about it?
Of course, yes. FASTBuild is a build system, like the name already says, and it has a focus on building your code very quickly, very fast. It has a few built-in helpers. For example, you can do Unity or so-called blob builds, where you have a bunch of C++ files and then just concatenate them and compile them in one compiler invocation.
Precompiled header support, so you can use precompiled headers on Windows, Linux, on all the supported compilers.
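For anyone who hasn't seen one, a unity or blob file is nothing exotic: it's an ordinary translation unit that #includes other .cpp files so the compiler front-end runs once for the whole group (file names here are made up; FASTBuild generates such files for you):

```cpp
// unity_blob_1.cpp -- generated, not hand-written.
// Concatenates several translation units into one compiler invocation.
#include "renderer.cpp"
#include "mesh.cpp"
#include "texture.cpp"
```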
And it has a very simple DSL,
which is, I always say, on purpose,
not Turing-complete,
although I don't know if it's actually not Turing-complete.
I have never tried to prove it.
And I also don't know if it's on purpose,
but I assume it is,
because it's lacking a lot of features. As I talked to the author of FASTBuild, like, there's no if construct, and he said this is on purpose, use something else. Like, you can concatenate variables, or you can dynamically construct variable names, so you can still have an if, but you cannot miss out cases.
For example, if you have Flags_Debug and Flags_Release, and then you want to get the flags values for your build type, you can say Flags_ and then the build type, and then you get the value. And you would obviously get an error if your build type is not a variable, which allows you to have a safe if. For example, in CMake you could say, if this is debug, do that, and on release, do nothing, so this could be an error, for example.
Let's talk a little bit more about what exactly you do with FASTBuild. Is this multiple platforms, or is it Windows only?
FASTBuild is available on any platform you would think it is available, where you want to use it. It's available on Linux, Mac, and Windows, and runs fine on all of them,
although it is listed as an alpha build on both Linux and Mac on the website. I've been using it for like two years now and I never had any problems, so don't take that as actual alpha quality. It works fine.
That's cool. Does it work with CMake at all? Because CMake's kind of our, you know, de facto standard here for the C++ world.
No, it doesn't, unfortunately. FASTBuild is a lot different compared to CMake, because it's not a meta-build system. It does not
generate any
code or any project
files for a different
build system
There used to be a project trying to implement a CMake generator for FASTBuild, though.
No, the other way around, generating FASTBuild files from CMake.
But I think the last time I looked
into that it was like 2k commits behind
master and that's been
six months ago so I don't think
that picked up again.
It would be nice to have that though
because FastBuild is actually
pretty fast and has a
caching mechanism, so you can, for example, stop using ccache,
and it has distributed compiling,
so you can have a build server
somewhere in your attic, for example,
with like 64 cores
and then just build over the network.
Yeah, that was the distributed compilation feature that made me think.
If I could easily plug this in on client projects
that I'm already working on or something that are CMake,
that would be really handy.
That would be really handy, yeah.
I have to look into the CMake generator again.
Maybe we can get that to work.
That would be pretty cool, actually.
That with the caching would be pretty awesome.
So it says, though,
that it does do Visual Studio and Xcode
project generation.
Yes, it does do that. It doesn't
do that on its own, though. That's the
catch there.
Okay.
It basically has a function which generates a Visual Studio solution file or an Xcode project file, and you have variables which then get filled in.
Which basically means you say these are my source files, these are my include files,
these are all my preprocessor defines and here's the command you have to run when I press run
or when I press debug.
And then it runs that command.
And usually you would just execute fast build in the background.
On my website, there's a talk on fast build,
and it has a PDF with the slides.
If you go to that and open it, you would see a screenshot of my Visual Studio, which has a bunch of targets, and I can compile with one click all targets, which includes Linux, Mac, and Windows, in two flavors, once MSVC and once Clang, which is a super powerful feature, because you just generate the solution file, and in the background you invoke FASTBuild, but not actual Visual Studio.
And that automatically takes advantage
of the distributed features to build on
the appropriate platforms?
Yes.
That's pretty darn neat.
It is pretty cool.
And as far as I know,
I think Ubisoft is using Fast Build for their games.
So it has some battle testing.
And I've talked to one engineer at Riot Games who's saying he knows the engineer working on the Fast Build
and they are using that internally as well.
So it has some users in the game industry, and I'd love to have more users of FASTBuild, because it's a pretty damn good build system, but it's kind of unknown. You never hear anything about it,
and the actual author is just silently working on it
in the background and improving it.
And he's actually really, really good.
He responds really fast
to all the issues you open.
He's been very
nice and very good.
Are you
contributing to the Fast Build project yourself,
or is it just this one author?
There are a bunch of people contributing to that. I haven't yet actually started contributing, but I'm in discussion, and I've opened a bunch of tickets on that and made feature requests and everything. But there are a bunch of people from AMD working on that, for example, as well. Like, there's this one woman who has a bunch of commits on that, and a few other people.
example as well like there's there's this one one girl who uh one woman who who has a bunch of comments on that and a few other people.
So there's some progress and some contribution
from the open source community.
Do you know where it came from?
What the author's original motivation was for this?
That's a good question.
Making a build system is a big project.
It is a big project, especially a build system like that, because the usual build system you see is like, we take Python, you just write your build files in Python or in Lua. He has written his own language; it has its DSL, which is parsed, and
FastBuild as well
is just a single
executable. So if you need to install
FastBuild on your system, you just download that executable
and put it in your path. Not like
CMake, which is
300 megabytes or something big, and
do cars and installer.
You can just...
In my case, for example, I even put FastBuild into my repository
so I can just drop it in and on the correct platform
picks the correct FastBuild binary
and just compiles the project.
It looks like it's written in C++ also.
It is written in C++, yes.
That's cool. Interesting.
Well, Arvid, what is your take on the future direction of C++? Are there any particular features you're hoping make it into C++20?
Maybe not 20, but I'm really looking forward to the metaclasses. I have, actually, on my GitHub, yet another shameless plug, I'm sorry.
There's a GitHub project I've started.
It's called Meta,
and it is currently trying to implement constexpr blocks in C++, which is the project I teased in the beginning, where a good Cling API would be very useful.
Okay.
Because right now I just parse the C++ source code using Clang, or libclang rather, and extract the constexpr blocks, put them in a C++ source file, and pipe them to Clang, and then just get the output and invoke the actual compiler in a subprocess, which is kind of bad and kind of hacky, but works.
All right, I'm trying to visualize this. It's like, almost like, in my mind, this is like a templated C++ file. You have like a chunk of code where you're saying, this part can be done at compile time, so you do that at compile time, then replace the code with the compiled version.
Exactly, pretty much. The current solution is, I'm parsing the C++ code, I'm looking for the constexpr blocks, I extract the constexpr block, replace all the arrow (->) blocks with printfs, basically,
and put all the code in a main,
execute that,
and then put all the output into the C++ file.
So it basically is a preprocessor to C++.
So you can plug it into your build system,
say this is the preprocessor now.
Please preprocess it
and then pass it to the compiler, which doesn't have
to preprocess it anymore.
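A very rough sketch of that flow under the workaround described above, assuming POSIX popen and made-up file names: generate a throwaway program from the extracted block, compile and run it with Clang in a subprocess, capture its stdout, splice that output back into the source, and hand the result to the real compiler:

```cpp
#include <cstdio>    // std::fgets; popen/pclose (POSIX)
#include <cstdlib>   // std::system
#include <fstream>
#include <string>

int main() {
    // 1. Stand-in for the extracted constexpr block: a tiny program whose
    //    stdout is the code we want to splice back in.
    std::ofstream("gen_main.cpp")
        << "#include <cstdio>\n"
           "int main() { std::printf(\"constexpr int answer = 42;\\n\"); }\n";

    // 2. Compile and run it in a subprocess, capturing its output.
    if (std::system("clang++ gen_main.cpp -o gen_main") != 0) return 1;
    std::string generated;
    if (FILE* p = popen("./gen_main", "r")) {
        char buf[256];
        while (std::fgets(buf, sizeof buf, p)) generated += buf;
        pclose(p);
    }

    // 3. Splice the generated code into the "preprocessed" source and invoke
    //    the actual compiler on it.
    std::ofstream("preprocessed.cpp")
        << generated
        << "int main() { return answer; }\n";
    return std::system("clang++ preprocessed.cpp -o final");
}
```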
Wow. And it works?
It works for some very simple things,
yes. I haven't yet
had much of the time I wanted to actually spend
on it, but
I really want to spend more time on it, and I hope on the holidays to spend some time on it.
How many of us take our vacation days to program on personal projects? You've got a lot of different projects to choose from now, don't you?
Yes. And that's only scratching the surface.
Well, so we've been talking about all the projects, well, not all the projects apparently, many of the projects you're working on, but we haven't actually asked you what you do at your work. You said you're the CTO of Appico. Do you want to tell us about that product, that company?
Ironically, anything I do in my free time is kind of unrelated to what I actually do at my work.
As the CTO, I'm just overseeing all the operations, all the engineering operations, and that usually requires me to jump into, like, any project we're currently working on, and if something goes wrong, I'm the guy who fixes it. So my time there basically is to pick up anything which is thrown at me. We've had a project where we implemented some backend for a postcard application in Node.js; I'd never written Node.js in my life before, and it was like, here's the backend, please make this happen. All right. We've got requests for writing custom Android distributions, for example, like, hey, we want to make a tablet, can you write us a custom Android for that? Like, sure. A lot of different stuff, different companies, big companies; we're working with Audi, we're working with Eventum, and we're focused a lot on the actual design. So I'm usually in the back, and if something goes wrong, it's my fault, but I'm usually just there to fix it.
Okay. Well, it's been great having you on the show today, Arvid. Where can people find you online?
They can find me on Twitter, twitter.com/ArvidGerstmann, or, if you are like probably anybody who doesn't know how to write my name,
is arvid.io
a-r-v-i-d.io
is my blog
and there you can find me
and on the CppLang Slack I'm Arvid,
so if you've got any questions
please hit me up
and we'll always kind of try to respond
in a timely manner
and I believe the outfit that you've chosen to wear
for our podcast, which is not going to be seen,
I thought you were wearing your suit.
I'm not wearing my suit, but it is hanging right behind me.
Okay, I thought that was a silent shout-out
to your Slack friends there,
because I could only see your shoulders.
Those of you who were at Meeting C++ would know what we're referring to.
If you weren't, then you should go to more conferences.
Absolutely, yes.
Okay, well, it's been great having you on the show today.
It's been fun.
Yeah, thanks for joining us.
Thanks so much for listening in as we chat about C++.
I'd love to hear what you think of the podcast.
Please let me know if we're discussing the stuff you're interested in. or if you have a suggestion for a topic, I'd love to hear about
that too. You can email all your thoughts to feedback at cppcast.com. I'd also appreciate
if you like CppCast on Facebook and follow CppCast on Twitter. You can also follow me at
@robwirving and Jason at @lefticus on Twitter. And of course, you can find all that info and the show notes on the podcast
website at cppcast.com.
Theme music for this episode is provided by podcastthemes.com.