Transcript
Hello and welcome to the C++ Club. This is episode 16, for meeting 141, which took place on the 16th of December 2021.
C++23 progress. There is a link in the notes where you can see all the papers targeted for C++23. A couple of papers that I will mention are P2365, Standard Library Modules, std and std.all.
This paper got voted on in November and achieved consensus in favor. The poll was
send this paper to the Library Working Group for C++23, classified as a focus (bucket 1) item.
The results were 20 strongly in favor, 9 in favor, 1 neutral, 2 against, and 1 strongly against.
I wonder who that was, and what were their motives.
The paper is currently targeted for C++23, according to the issue tracker.
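To give a flavour of what the paper proposes for user code, here is a minimal sketch. It is an assumption of eventual usage only: no shipping compiler supported standard library modules at the time of this episode, and the exact module names (std, std.all, std.compat) varied between paper revisions.

    // Hypothetical usage once standard library modules land:
    import std;   // the whole standard library available through one named module

    int main() {
        std::cout << "Hello, modular standard library!\n";
    }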
And the other paper is P2300, std::execution. This paper was discussed in a super-telecon
over several days in December. It's currently targeted for C++23.
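For a flavour of the sender/receiver model P2300 proposes, here is a rough sketch using the names from the paper. The header location and availability are assumptions; no standard library shipped this at the time, and reference implementations used their own namespaces.

    // Sketch only: names follow the P2300 proposal, not any shipping implementation.
    #include <execution>   // assumed location of std::execution if P2300 is adopted

    int main() {
        namespace ex = std::execution;
        // Build a chain of work lazily: produce 41, then add 1.
        auto work = ex::just(41) | ex::then([](int i) { return i + 1; });
        // Block the current thread until the result is ready.
        auto [answer] = std::this_thread::sync_wait(std::move(work)).value();
        return answer == 42 ? 0 : 1;
    }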
Mold, a modern linker. Now this is some big news.
Yesterday Rui Ueyama from Tokyo, the original author of LLD, the currently fastest LLVM linker,
released the first stable and production-ready version 1 of Mold, his new modern linker.
It's a drop-in replacement for existing Unix linkers, and it's several times faster than
LLD. Mold benchmarks look completely crazy. Why is it so fast? Quote, one reason is because
it simply uses faster algorithms and efficient data structures than other linkers do. The other
reason is that the new linker is highly parallelized."
The plans are to release version 2 for macOS and version 3 for Windows.
In the Reddit thread even STL, Stephan T. Lavavej himself, is excited.
I'm really looking forward to trying Mold. It's not often that a new linker appears,
never mind the fastest one.
Exciting times.
The Mold logo could be better, though.
It's a mouldy slice of bread.
Ugh.
Will Rust replace C++?
Here is a good example of Betteridge's law.
If you're not aware, Betteridge's law of headlines says that
any headline that ends in a question mark can be answered by the word no.
A redditor asks will Rust replace C++? The thread has some entertaining replies of which the first
one conforms to the above-mentioned law of headlines. Quote, No. End quote.
There is a more serious reply from another redditor.
Quote, In all seriousness, Rust isn't, nor does it pretend to be, a C++ killer. If anything, it's better positioned as an alternative to C, but even then, it's not a threat so much as a complementary tool. End quote.
Another reply.
You mean like every competitor to C/C++ successfully replaced it during the past 30-40 years?
I don't think so.
And here is another.
Without exceptions, templates, inheritance?
Good luck with that.
And more.
Probably not. And also I hope not. Looks like C++ is safe from Rust for now.
Choosing the appropriate container. A redditor involved in the std::hive paper
posted a brief and incomplete guide for selecting the appropriate container from inside and outside the C++ standard library,
based on performance characteristics, functionality, and benchmark results.
The benchmarks the guide is based on are the plf::colony benchmarks and hash map benchmarks.
Links in the show notes.
The selection guide doesn't cover all scenarios, multithreading, or technical nuances
like CPU architecture differences.
The guide is in the form of a human-readable algorithm
and looks like a very valuable resource.
A curious compiler bug.
A redditor posted a code snippet which causes ICC, GCC and MSVC to generate incorrect code.
In this quite contrived example, the virtual base is destroyed twice instead of once because
of an exception thrown from a constructor of a derived class that uses a delegating
constructor.
The original poster and the commenters filed bugs for MSVC, GCC and ICC.
Someone asked in the thread if there was a chance that this was an expected behavior,
given that only Clang generates correct, in quotes, code.
A useful tidbit from the thread.
Quote, The delegating constructor is significant because delegating constructors have a special behavior: when the body of the target constructor has finished executing, the object is now considered fully constructed, so any exception thrown in a delegating constructor will cause the destructor to be called. End quote.
So no, this is not an expected behavior. It's clearly a bug in all those compilers.
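The original snippet from the thread is more involved, but here is a minimal illustration of the rule the quote describes: once the target constructor's body finishes, the object counts as fully constructed, so an exception thrown in the delegating constructor's body runs the destructor.

    #include <iostream>
    #include <stdexcept>

    struct Widget {
        Widget(int) { std::cout << "target constructor\n"; }

        // Delegating constructor: once Widget(int) returns, the object is
        // considered fully constructed, so the throw below triggers ~Widget().
        Widget() : Widget(42) { throw std::runtime_error("constructor failed"); }

        ~Widget() { std::cout << "destructor\n"; }
    };

    int main() {
        try {
            Widget w;
        } catch (const std::exception& e) {
            std::cout << "caught: " << e.what() << '\n';
        }
    }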
C++ links.
Here is another set of curated C++ and related links by Matt PD. Link in the show notes. A huge hierarchical list of really interesting stuff.
PLF C++ library. This library by Matt Bentley provides alternatives to standard library
containers and offers some additional utilities and data types, like plf::colony,
which apparently is not coming to C++23 as std::hive.
Maybe C++26 will get it?
The library is header-only and comes under a permissive zlib license. It supports C++ standards from C++03 to C++20 and builds with MSVC, Clang and GCC.
There are links to talks that the author gave at various C++ conferences on the colony data type
and how to design a faster list data structure.
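For a taste of what the library offers, here is a minimal usage sketch of plf::colony. The header name and API details are as I recall them from the project's documentation, so treat this as an approximation rather than a verbatim example.

    #include <iostream>
    #include "plf_colony.h"   // header-only; check the project docs for the exact header

    int main() {
        // An unordered bucket-array container: insertion and erasure do not
        // invalidate pointers or iterators to other elements.
        plf::colony<int> numbers;
        numbers.insert(1);
        numbers.insert(2);
        auto it = numbers.insert(3);
        numbers.erase(it);   // cheap erase, no shuffling of other elements
        for (int n : numbers)
            std::cout << n << ' ';
    }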
When to use pimpl? This post on Reddit asks when to use the
pointer-to-implementation, or pimpl, idiom. It used to be recommended for breaking
dependencies between components in big systems, to decrease the number of files
to recompile when a header changes, and to guard against ABI breaks to a degree.
Reddit also says that a separate use case for pimpl is to prevent inclusion of the
Windows.h header, or to act as a wrapper for a large library used in the implementation that
client code doesn't need to know about.
It is also used widely in Qt. However, if you are compiling everything together
or use static linking, pimpl doesn't add anything except an unnecessary level of indirection and heap
allocation overhead. The added indirection can harm optimization. In large codebases,
it can have the detrimental effect of complicating class hierarchies and relationships.
There is one other thing that will make pimpl obsolete, and that's modules.
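For reference, here is a minimal sketch of the idiom itself, with illustrative names: the public header exposes only an opaque pointer, and the implementation details live entirely in the source file.

    // widget.h - clients only ever see this declaration
    #include <memory>

    class Widget {
    public:
        Widget();
        ~Widget();          // defined out of line, where Impl is complete
        void draw();
    private:
        struct Impl;        // forward declaration only
        std::unique_ptr<Impl> pimpl_;
    };

    // widget.cpp - changing Impl does not force clients to recompile
    #include "widget.h"
    #include <iostream>

    struct Widget::Impl {
        int state = 0;
        void draw() const { std::cout << "drawing, state = " << state << '\n'; }
    };

    Widget::Widget() : pimpl_(std::make_unique<Impl>()) {}
    Widget::~Widget() = default;
    void Widget::draw() { pimpl_->draw(); }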
Bad C++ habits.
A redditor asks what bad habits developers have seen in C++ code.
The replies include
Abuse of std::shared_ptr, where every non-scalar function parameter is a shared_ptr. I'm actually working on several code bases like this. As they say, you can write Java in any language.
Stephan T. Lavavej says: working around a bug without reporting it, not commenting the workaround, not citing the bug database and issue number in the comment, and not using a uniform pattern for such commented workarounds so that they can be found and re-evaluated or removed later. Good habits:
in MSVC's STL we use a comment containing TRANSITION and the bug database number for this purpose.
Of course another redditor grepped the Microsoft STL code for this and found 667 instances.
Other bad habits in this thread included overcomplicated logical expressions,
and throwing exceptions on logic errors instead of terminating the program.
This one occurs often in our code, and I'm not sure what to do in case a library must
absolutely not crash even when the program is screwed up.
Unnecessarily complex template frameworks without documentation.
Unnecessarily complicated class hierarchies.
I can physically feel this one. Union-based type punning, which is legal in C but not in C++ (there is a small sketch of the well-defined C++ alternatives after this list). And two-step class initialization, hello Symbian OS.
Another redditor gives us some more. Using unique_ptr when simple composition would suffice. Using shared_ptr when a unique_ptr would suffice. Mocking everything and creating a maintenance nightmare. Taking test coverage to an extreme, letting it damage the design and clarity just to make the code unit-testable to an extreme degree, more than 98% coverage. There was a sad reply.
You just described the code base I work on every day.
And there was another good list.
C-strings and arrays with the attendant pointer arithmetic.
Macros everywhere instead of templates.
Disabling exceptions but then not being consistent in checking error return codes from every call.
Using pointers instead of references for out parameters. The new fashion of header-only libraries. Being too clever with SFINAE and template metaprogramming. Defining functions in headers. Writing everything as a template just because someone might one day want to customize something.
Using std::endl. This adds unnecessary overhead to the already slow stream I/O.
Not using the correct include format, quotes versus angle brackets. Fixing problem code while
failing to find and inform the author and reviewer,
which deprives them of the opportunity to improve.
Minimal or no use of const.
Speculatively or unnecessarily defensive code, instead of assertive code.
Including pointers in a typedef or macro. This is the so-called handle pattern.
It's used in CUDA and OpenCascade.
Using std::map for anything but huge runtime maps.
And uninitialized variables.
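Here is the promised sketch of the type punning point: the union trick is valid C but formally undefined behaviour in C++, and the portable replacements are std::memcpy or, since C++20, std::bit_cast.

    #include <bit>        // std::bit_cast, C++20
    #include <cstdint>
    #include <cstring>    // std::memcpy

    float pun_via_union(std::uint32_t bits) {
        union { std::uint32_t u; float f; } pun;
        pun.u = bits;
        return pun.f;     // fine in C, undefined behaviour in C++ (reads the inactive member)
    }

    float pun_via_memcpy(std::uint32_t bits) {
        float f;
        std::memcpy(&f, &bits, sizeof f);   // well-defined in every C++ standard
        return f;
    }

    float pun_via_bit_cast(std::uint32_t bits) {
        return std::bit_cast<float>(bits);  // well-defined and constexpr-friendly since C++20
    }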
I'm sure everyone can add to this list.
A large part of my work is dealing with technical debt, which is all the above and more.
Apple Metal C++ Bindings.
Apple published C++ bindings for Metal, which is Apple's low-level graphics and compute
acceleration API.
Vulkan anyone?
I guess if you control both your hardware and software stacks, you'd want to have something of your own
in the high-performance graphics department so that you can tune your silicon accordingly.
And oh boy did they tune it. Are you feeling okay, Intel? It's nice though that Apple now
allows using C++ to write Metal code, given that the main language is Swift. Objective-C is legacy tech now.
JUCE coding standards. For a sane set of coding standards you can refer to JUCE.
I only skimmed this article, but nothing jumps out as obviously wrong, weird, or outdated.
A good point of reference if you must come up with your own.
None of that Google or random game company weirdness.
How is constexpr implemented in the compiler?
This is what a Redditor asked.
I know that constexpr means that something can be evaluated at compile time.
But this obviously means that constexpr code must somehow be interpreted,
since it must be executed before
compilation. Do modern C++ compilers come equipped with full-blown C++ interpreters?
Or is the constexpr code compiled to LLVM and then the resulting bytecode is run on a virtual
machine? End quote. The replies shed some light on this. Quote, Not full-blown.
It's a significantly simplified subset of the runtime, and the AST can be reused.
There is an experiment going on with Clang to replace the AST-walking interpreter with a proper bytecode VM.
End quote.
Erich Keane of Intel says,
Quote,
It's not so much an interpreter as an AST evaluator. The constant evaluation happens after the code has been parsed, lexed, semantically analyzed, and formed into the abstract syntax tree.
When the compiler evaluates a constant expression, it goes through the AST and evaluates each node to get the answer.
There is an effort that is ongoing, though slowly, to replace this with an AST-to-bytecode
compilation, which can then just be evaluated immediately. My understanding is it is quite a bit
faster, particularly when the same code is evaluated multiple times." And Cling gets a mention.
CERN has developed a C++ interpreter with a REPL called Cling. It is amazing. It is built on top
of Clang and LLVM and JITs the code. There is even a Jupyter extension for it, so you can use it in a notebook.
It's kind of a solution in search of a problem outside of CERN.
But man, what a cool solution it is.
End quote.
MSVC didn't have an AST until at least 2015.
It used a token stream instead.
This allowed compilation on some really memory-restricted machines.
I'm wondering how that affected the processing of constexpr. Or did Microsoft switch to the new compiler front end by the time constexpr support was added?
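To make the discussion concrete, here is the kind of code a compiler's constant evaluator has to execute, whether by walking the AST or by running bytecode. This is just an illustration, not tied to any particular compiler.

    // The static_assert forces the compiler's constant evaluator to "run" the
    // recursion at compile time; the same function still works at run time.
    constexpr unsigned long long factorial(unsigned n) {
        return n <= 1 ? 1ULL : n * factorial(n - 1);
    }

    static_assert(factorial(10) == 3628800ULL);

    int main(int argc, char**) {
        return factorial(static_cast<unsigned>(argc)) > 0 ? 0 : 1;  // run-time evaluation
    }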
Colin Moon's C++ libraries
Colin Moon posted links to his libraries on Reddit.
These include a nice and clean metaprogramming library using modules,
an implementation of P2300 std::execution, and a testing library that doesn't require
macros but uses std::source_location instead. All libraries are well documented, and the
documentation looks really nice too.
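As an aside, here is a hypothetical sketch, not Colin Moon's actual API, of how std::source_location lets a test helper report the call site without any macros.

    #include <iostream>
    #include <source_location>

    // The default argument is evaluated at the call site, so the helper knows
    // exactly which file, line, and function invoked it, no macro needed.
    void expect(bool condition,
                std::source_location loc = std::source_location::current()) {
        if (!condition) {
            std::cerr << "check failed at " << loc.file_name() << ':' << loc.line()
                      << " in " << loc.function_name() << '\n';
        }
    }

    int main() {
        expect(1 + 1 == 2);   // passes silently
        expect(2 + 2 == 5);   // reports this exact line
    }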
Compile-time parser generator. Polish C++ developer Piotr Winter has released a new version of his Compile Time Parser Generator, or CTPG, library.
Quote, C++ single header library which takes a language description as a C++ code and turns it into an LR1 table parser with a deterministic finite automaton lexical
analyzer, all in compile time.
What's more, the generated parser is actually itself capable of parsing at compile time.
All it needs is a C++17 compiler, end quote.
The parsing rules are defined in a declarative functional way.
Patterns supported are character, string, and regex.
CTPG uses an LR(1) parser, where LR stands for a left-to-right scan producing a rightmost derivation in reverse, and the 1 means one symbol of lookahead.
The library offers optional verbose output and state machine diagnostics for debugging
purposes.
It requires C++17 and is distributed under the MIT license.
And now for some fun.
There was an operating system called BeOS in the 90s.
And its API had some really interesting functions.
One of them was is_computer_on.
It was documented like this.
int32 is_computer_on(void).
Returns 1 if the computer is on.
If the computer isn't on, the value returned by this function is undefined.
And another one.
is_computer_on_fire.
double is_computer_on_fire(void).
Returns the temperature of the motherboard if the computer is currently on fire.
Smoldering doesn't count.
If the computer isn't on fire, the function returns some other value.
That's it for today, and I'll leave you with this exchange on Twitter.
Victor Zverovich posted: Current status, writing C++ in Notepad on Windows.
Send help.
To which Corentin Jabot replied: Notepad is for writing C. For C++ you should use Notepad++.
That's it.
Thanks for joining me today and until next time.
Bye bye.