CppCast - Microsoft Announcements at CppCon 2020
Episode Date: September 17, 2020

Rob and Jason are joined by Julia Reid, Sy Brand and Augustin Popa from Microsoft. They talk about the virtual CppCon, favorite talks and the virtual conference experience. Then they talk about some of the announcements being made by the Microsoft Visual C++ team during the CppCon conference talks.

Links

Microsoft C++ Team at CppCon 2020
C++ in Visual Studio Code reaches version 1.0!
vcpkg: Accelerate your team development environment with binary caching and manifests
A Multitude of Updates in Visual Studio 2019 version 16.8 Preview 3
Standard C++20 Modules support with MSVC in Visual Studio 2019 version 16.8
C++ Coroutines in Visual Studio 2019 Version 16.8
Debug Linux core dumps in Visual Studio
Project OneFuzz: new open source developer tool to find and fix bugs at scale

Sponsors

PVS-Studio. Write #cppcast in the message field on the download page and get one month license
PVS-Studio is now in Compiler Explorer!
Free PVS-Studio for Students and Teachers
Transcript
Episode 265 of CppCast with guests Julia Reid, Sy Brand, and Augustin Popa, recorded September 17th, 2020.
Sponsor of this episode of CppCast is the PVS Studio team.
The team promotes regular usage of static code analysis and the PVS Studio Static Analysis Tool. In this episode, we talk about the first virtual CppCon.
Then we talk to members of the Microsoft Visual C++ team.
We go over some of the big announcements on Visual Studio, Visual Studio Code,
and vcpkg. Welcome to episode 265 of CppCast, the first podcast for C++ developers by C++ developers.
I'm your host, Rob Irving, joined by my co-host, Jason Turner.
Jason, how are you doing today?
I'm all right, Rob. How are you doing?
Doing okay. We are more than halfway through CppCon 2020, the first virtual CppCon. I think it's going pretty well.
It's the end of the day Thursday, to be specific. We do try to get a couple of in-person interviews when we attend CppCon.
Obviously, we can't do that.
No one's in person.
But we still wanted to get a couple interviews in this week.
So this first one we're going to have is with a bunch of the Microsoft team.
They put on several talks this week.
And I guess we'll go ahead and introduce everyone, right?
Go for it.
All right.
So first we have Julia Reid. Julia is a program manager on the C++ team at Microsoft, focusing specifically
on Visual Studio Code experiences. Hey Julia. Hello. Then we have Sy Brand. Sy is Microsoft's
C++ developer advocate. Their background is in compilers and debuggers for embedded accelerators
but they're also interested in generic library design,
metaprogramming, functional style C++, undefined behavior, and making our communities more welcoming and inclusive. Sy, welcome back to the show.
Hey, thanks very much.
Sy, could you quickly explain a monad for us since we have you back on?
I'm sorry, I'm not going to make the obvious joke there.
So yeah, don't even try.
I didn't know there was an obvious joke, honestly.
Oh, the obvious joke is...
No, you're just trying to make me say it.
I'm sorry, I honestly don't know, but don't worry about it. We'll continue on.
A monad is just a monoid in the category of endofunctors... I can't even remember.
Oh yeah, the technical definition there.
Yeah, the technical definition. No, sorry, I wasn't being ridiculous.
Okay.
And last, we have Augustin Popa,
who is a program manager at the C++ team at Microsoft,
currently working on the vcpkg library manager,
address sanitizer support for Windows,
and acquisition of build tools.
Augustin, welcome to the show.
Hey there.
Okay, so...
I'm not going to ask you to explain a monad just for the record.
Should make everyone explain it.
That can be the new intro.
Everyone will get a different explanation.
It'd be great.
Yeah, I know.
Okay, so before we start going over all the Microsoft news,
why don't we just talk a little bit about CppCon so far.
Julia, I think this is your first CppCon, maybe at least
your first with Microsoft. Is that right? Yeah, this is my first ever CppCon. So I can't really
compare it to what it's like to be there in person. I've really enjoyed it so far. I think
Remo, the platform that they've been using to virtualize expo hall tables and exhibitor rooms and stuff.
It's pretty cool.
I'm not sure if we have as many attendees this year because it's virtual, but it's been a fun first experience for sure.
Yeah, definitely.
Yeah, I like the Remo tables. I will say, I wish more people would turn their cameras on, because it seems like a lot of the time, you know, they have this virtual hall and you see all these little tables, and you can see everyone's little avatar around the table, and you usually get one or two tables where everyone's got their microphone and camera on, but then the rest of the hall is filled with people just sitting there silently.
It becomes a magnet. As soon as a couple of people are talking, then it's like that table is full.
Yeah, right. Yeah, exactly. So how about you? Are you enjoying the conference?
Yeah, I've had three talks and a lightning talk so far, so I had a talk every day of the conference. And I've also been doing the code of conduct team and the Microsoft booth, so yeah, it's been a lot. I think I've seen two other talks from other people so far, and it's day four of the conference. But yeah, it's been enjoyable. I've had some really, really great conversations. You know, like you said, there can be trouble finding people to chat to during the sessions.
But when you do find a good group, you can have some really good talks.
And you feel like you're back in that conference experience again, which is nice.
Definitely.
How about you, Augustin?
Are you enjoying the conference?
Yeah, I'm really happy that the conference is actually happening this year.
Because I think for a while, we were all wondering if there would even be a CppCon
and I'm really glad they found a solution through this remote platform and I'm glad we still have
so many talks to look at, and it's mostly the same. I mean, I guess the interactions with people
are very different, and as was alluded to earlier, the online format just changes everything there.
But still, it's nice that we can still have the conference.
We have so many.
The schedule is still really packed.
I was wondering if that would be impacted as well.
But there are lots and lots of talks out there.
And I look forward to catching up to a couple of them on YouTube afterwards as well, because I spent
a lot of time at the Microsoft
booth in the expo hall,
so I couldn't attend a lot of different
talks, but I hope I can catch up
with them afterwards.
Does anyone know the statistics on this?
My guess would be it's about two-thirds
of a normal schedule.
Half to two-thirds, something like that. It's still a lot.
Yeah, that feels about right. I think there's technically the same number of tracks,
but not all the tracks are active at the same time in the same way that they are when it's
a physical conference.
And we're ending like one session earlier than I think we would be.
Yeah.
Anyone have any highlights or favorite talks so far?
I loved watching the lightning talk that Sy just gave.
It was really funny.
It was about how many people we needed to slap a Google data center to power it for the rest of time.
Is that right, Sy?
Yeah, it turns out that it's exactly one googol,
which is 10 to the power of 100.
Yeah.
I missed that talk, so I was getting ready for this,
but I'll have to go watch it.
It was fun.
I practiced it literally 50 times today
because I had to speak through it so fast
because there were so many physics equations to get through.
Yeah, it was very impressive. Wow. Anyone else? Anyone else have any highlights?
I really liked JF's talk about JIT compilation. It was like a brief history of JIT compilers.
I have an interest in compilers and JIT compilation and things like that. So this was going through a bunch of different papers
and showing where JIT compilers came from
and how JIT compilation and ahead-of-time compilation
is kind of a spectrum, it's not binary,
and had a lot of really interesting references, things to read later.
And JF is always a really dynamic presenter
as well. So yeah, I really enjoyed that one. I feel like there's been a lot this year
that just reminds us that there's nothing new, really.
Right, yeah. Oh yeah, like Ben Deane gave a lightning talk on this programming language
book, which was written, I don't know, like 50 years ago or something.
And it was a survey of over 150 different programming languages.
And so many of the ideas contained in there are things that we still talk about at the conference today.
So yeah, that was really fascinating.
Yeah.
And I'll say, if you're interested in learning more about JITs, JF's talk was great.
And then Ben Deane gave one about, you know, JITs in C++ and the possible things you might be able to do with it.
I need to go back and watch that one still.
Yeah, I need to watch that one too.
How about you, Augustin? Any favorite talks?
I know you said you spent a lot of time at the Microsoft booth.
Yeah, there was a talk given
by one of my coworkers, Erica.
And I remember getting
spooked halfway through the talk because I was watching it.
And then she sent me an instant message
and then I went, wait a minute,
how are you sending me an instant message
while you're presenting your talk?
And then I remembered, oh yeah, this part's pre-recorded,
but it looked so real.
I couldn't tell.
I completely forgot in the moment that it was pre-recorded.
But yeah, it was an interesting talk
on cross-platform development,
talking about different pitfalls,
talking about package managers, like system package managers, language
package managers, CMake, and other build systems.
It's always an interesting subject, I think,
for us at Microsoft, since we've been working
a lot on cross-platform tooling recently, and it's a big change
from where we used to be some years ago.
Right. Yes, it is.
Yeah, definitely.
Well, I think that's probably a good segue
to start talking about all the announcements
that your team has made this week.
How many talks in total were put on by Microsoft people?
Yeah, we had 10 talks overall. Not including lightning talks.
And there's still one or two more tomorrow, probably?
There's still one tomorrow. It's on
the new
fuzzing tool.
That's tomorrow.
One of the last sessions, I think.
Okay. So I guess, do we want
to start going over some of the big news
made in the
first Visual Studio Talk?
Sure, yeah, I can give it a go.
So the
kind of major flagship announcements
were a lot to do with
conformance. The main
ones being C++20 modules,
which are now feature-complete
in MSVC, which we're
really proud of, and that's including all of
the most recent changes to modules and also some experimental tooling support in MSBuild. So, you know,
modules can have complex dependencies between them, and that might not just be in a single
project. You might have dependencies on modules which are produced in some static library,
and all of those things need to be built in some order. So the MSBuild
has a dependency checker so it will go through all of your modules, work out the dependencies
between them, make sure they're built in the right order for consumption. So we're really
happy to have that support and hope that people will go try it out.
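For reference, a minimal sketch of what this looks like; the module name and function here are made up for illustration, and both files would be built with /std:c++latest:

```cpp
// math.ixx -- a module interface unit (MSVC's conventional .ixx extension).
export module math;

export int add(int a, int b) {
    return a + b;
}
```

```cpp
// main.cpp -- a consumer. Build-system support like the MSBuild dependency
// scanning described above works out that math.ixx has to be compiled before
// this translation unit so the import can be satisfied.
import math;

int main() {
    return add(2, 3) == 5 ? 0 : 1;
}
```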
As well as modules, coroutines are now feature complete as well. We had support for the coroutines TS for quite a while under the /await switch, but this stuff is now in the /std:c++latest switch. We still do support the old TS style under /await and the experimental <experimental/coroutine> header, and then the new stuff is under /std:c++latest and the standard <coroutine> header. So we'd like people to upgrade when they
can and try out the new features. We also have the support in the debugger so you can set a break
point inside a coroutine and when that's hit then you'll see a reasonable call stack which actually gives you
information you might want rather than just being polluted with a bunch of extraneous coroutine
nonsense, which is pretty cool. Yeah, so give that a shot.
I've been playing around with coroutines a lot recently, and tooling support will definitely help, because there are so many complex callback mechanisms and things like that. It really helps people to reason about what their coroutines are doing.
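As a taste of the standard flavor, here is a small, illustrative generator written against <coroutine>; it's a sketch for experimenting with the feature and the coroutine debugging support, not a production-ready type, and the names are made up:

```cpp
#include <coroutine>
#include <exception>
#include <iostream>

struct IntGenerator {
    struct promise_type {
        int current = 0;
        IntGenerator get_return_object() {
            return IntGenerator{std::coroutine_handle<promise_type>::from_promise(*this)};
        }
        std::suspend_always initial_suspend() noexcept { return {}; }
        std::suspend_always final_suspend() noexcept { return {}; }
        std::suspend_always yield_value(int v) noexcept { current = v; return {}; }
        void return_void() noexcept {}
        void unhandled_exception() { std::terminate(); }
    };

    explicit IntGenerator(std::coroutine_handle<promise_type> h) : handle(h) {}
    IntGenerator(IntGenerator&& other) noexcept : handle(other.handle) { other.handle = {}; }
    IntGenerator(const IntGenerator&) = delete;
    ~IntGenerator() { if (handle) handle.destroy(); }

    bool next() { handle.resume(); return !handle.done(); }
    int value() const { return handle.promise().current; }

    std::coroutine_handle<promise_type> handle;
};

IntGenerator counter(int limit) {
    for (int i = 0; i < limit; ++i)
        co_yield i;   // a natural spot for a breakpoint to look at the coroutine call stack
}

int main() {
    auto gen = counter(3);
    while (gen.next())
        std::cout << gen.value() << '\n';   // prints 0, 1, 2
}
```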
As well as coroutines and modules, we announced C11 and C17 conformance. That's C11 and C17, not C++. This is something that people have been asking for for a while, and now we have it.
So this is all of the required features from C11, C17. There were some features in C99
which were required but then made optional for C11, like threads and atomics and things
like that. So we haven't implemented those. Some of those are on our
roadmaps. We do plan on implementing the threading and the atomics into our C support. So yeah,
that should be coming. Those are our major feature complete things. Then we've been doing a ton of
work on ranges. So we're about, let's say, 95% complete on ranges, so it's good enough that
you could go try it out and most of the things which you would expect to be there will be there,
and we're working hard on getting the last of them done.
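For a quick example of the sort of thing that should already work, composing lazy views (built with /std:c++latest and a 16.8-era STL):

```cpp
#include <iostream>
#include <ranges>
#include <vector>

int main() {
    std::vector<int> v{1, 2, 3, 4, 5, 6};

    // Lazily keep the even numbers and square them; nothing is computed
    // until the range is iterated.
    auto even_squares = v | std::views::filter([](int i) { return i % 2 == 0; })
                          | std::views::transform([](int i) { return i * i; });

    for (int i : even_squares)
        std::cout << i << ' ';   // prints: 4 16 36
}
```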
As far as the broader STL goes, I think we have implemented 84 of the features and have 24 left.
The thing we're really pleased about is that last year we announced the STL was open source,
which we did on this show as well.
And since then, more than 30 of our new C++20 features were actually contributed by people outside of Microsoft, which is really, really cool.
So we're so pleased with all of the community engagement it's got
and that other people are coming in and helping make the library better
and helping push the standard of C++ tooling forward for everyone.
So we're really, really pleased with that.
Please keep on submitting your pull requests and sending us your issues.
So those are the major...
Yeah, sorry, go ahead.
Out of curiosity, if there's someone listening
and says, I would love to get involved right now,
what are some of the things that you're looking
for contributors for? Do you know?
Off the top of my head, I don't,
but we do have on the...
So the GitHub repo is at github.com
slash Microsoft slash
STL, and we do have...
We use that as our issue tracker and we have our roadmap there. So if you go have a look at the roadmap, see what's coming,
see what needs to be implemented, and have a look at the issues. There might be some marked
as good first issues to try out, things like that. So yeah, I'd recommend going and having
a look at the GitHub repo as your first port of call.
Yeah, I was kind of wondering, have there been any cases where STL was going to go start working on a feature himself
and then a pull request came through to go and finish that work?
Or is there some work to split that up?
Yeah, I mean, we try and be as transparent as possible on what we're working on.
That's why we have a fairly detailed roadmap.
So we do try and minimize that. When we say it's an open source project, we really do mean it.
Like we're treating it as a fully open source transparent project. So yeah. So those are our
main conformance things. Some of the things we're working on: static analysis. There was a talk from Sunny Chatterjee, who's one of our static analysis developers,
on trying to close the gap between Rust and C++ for static analysis,
which we're mainly doing through multiple different avenues of analysis.
We have our static analysis, which is available in the IDE directly, and we have clang-tidy integration, and we have linters.
So there's a bunch of different areas we're looking at.
And then of course, more dynamic analysis like address sanitizer, which now supports both x86 and x64,
and also debug mode, which it didn't last year.
So we're still working on that.
We still are saying it's experimental. It's not quite finished yet,
but we're working hard on getting that done.
I don't think I've mentioned it on the show before,
but I just want to say I've used the ASAN support
since the x64 and debug was added
and it really helped me find a few problems in my project.
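For anyone who hasn't tried it, this is the kind of bug it catches; a tiny, deliberately broken example (compiled with MSVC's /fsanitize=address flag) rather than anything from a real project:

```cpp
#include <cstdlib>

int main() {
    // Allocate room for four ints, then write one element past the end.
    int* data = static_cast<int*>(std::malloc(4 * sizeof(int)));
    data[4] = 42;            // heap-buffer-overflow: AddressSanitizer reports this access
    std::free(data);
    return 0;
}
```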
So, great. Yeah, another thing is Control Flow Guard. So, you know, we've done a lot of work in integrating LLVM stuff
and clang into our tooling and now we've
actually contributed
some new analysis support
to LLVM itself. So that's
in upstream now. You can go try it out.
And it's just like a
thing for enforcing control flow
integrity. Nice.
Yeah. Definitely.
Yeah.
I think, as far as the language tooling goes, of course, we're always improving our build time performance. You know, we're really focused on trying to help people who have requirements on their iteration times, like game devs and things like that, who just can't be waiting around for a build for way too long after they make a tiny change. So the most recent change we've made is that we actually parallelized PDB creation, which should hopefully speed up link times by about two times on large projects. So we'd love to have data on that. If you upgrade the tools and you rebuild your projects,
we'd love to hear how your build times are affected.
And also we had last year, was it a year ago?
It was maybe six months ago that I was on here with the,
we were talking about C++ build insights.
Oh, yeah.
Right. Okay. yeah. Right.
Okay, yeah.
That was about six months ago, I think.
Sounds right.
Yeah, with Kevin Cadieux, who was the developer on that.
So Build Insights is a way of analyzing your build and generating traces.
So that's one of the ways that you can try and work out where the bottlenecks are.
For runtime performance, we've been doing work as well on our newest optimization, jump threading.
Which is, at the assembly level, if you have a bunch of jumps which are always going to be taken in the same order,
then you can eliminate some of those intermediate jumps, which can just save you some time because it's easier on things like the branch predictor. And then also we improved the code gen of our memset
and optimizations around intrinsics and auto-vectorization as well.
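A rough sketch of the kind of control flow that the jump threading mentioned above targets; step_one and step_two are just placeholder functions for the illustration:

```cpp
void step_one() {}
void step_two() {}

void process(bool flag) {
    if (flag)
        step_one();

    // ...work that does not modify `flag`...

    // On every path into this second branch the outcome is already known,
    // so the optimizer can thread the jump and skip re-testing `flag`.
    if (flag)
        step_two();
}
```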
So those are the main things in the compiler and standard library.
And then we've done a bunch of work in the IDE as well.
But maybe, Augustin, do you want to talk about some of the vcpkg stuff?
Well, I was going to say, before we move on to the other announcements,
I had a few questions.
I'm curious about the C++20 module support.
It's great to hear that we have one of the compilers fully conforming with that.
And I'm curious, has there been much work internally to use modules? Has it been used on any big projects that you know of?
Yeah, I mean, we've been developing modules for a long time, and I know we are utilizing them on large internal projects. I don't know if I can say which ones.
Okay.
But yeah, we are using modules internally, and it's been one of the things which has
driven both the standard of modules and our own implementation. The areas which we were
wanting to build modules in are in many ways to help aid our development efforts.
So we're applying them to real-world projects
and then feeding back that information into the standardization process.
Very cool. Jason, do you have any questions before we move on?
No, that was a ton. I don't think I have any questions from it, though.
Okay, so yeah, Augustin, let's talk about vcpkg.
Sure. So we've been doing a lot of work on the vcpkg package manager.
We have really been looking at extending the different development scenarios in which you can use the tool.
vcpkg has always been in this interesting state where, unlike most Microsoft products, it's very popular with hobbyists and
open source developers, but it hasn't made as much headway with professional developers,
large enterprises, that kind of thing. So we've really been refocusing a little bit to add some
functionality that perhaps would be more appreciated for professional developers,
but I think will also still be very useful for just anybody,
depending on how much flexibility you want in a package manager.
So we kind of made a list of big features that people were asking for
and that just make sense for a couple different scenarios.
And two of them are now generally available,
we have binary caching, and we have a new manifest format that we support. So basically, for
binary caching, the way vcpkg works traditionally is you acquire it, and you can use
it to acquire libraries from source and build them.
But sometimes that can take too long because you have to go through the whole build process.
And if you're building a lot of dependencies
or if you're building a particularly large set of libraries,
like if you're building Boost, it can take a while.
And you only really need to do this once per machine normally.
But if you have, say, your
own machine, and you have a CI machine, or maybe you're running things in multiple containers,
or maybe you have multiple developers on your team, and they each have to go through this process
once, that can take a while. So with binary caching, essentially what you can do is you can
cache those binaries after you've built them exactly once. You can put them in a place that all of the machines on your network can access
and then you can just acquire the pre-built binaries and then it just takes seconds
to get them in every environment from that point on.
Is it smart enough to make sure that all the build flags and everything are the same, and the platform and all that?
Yeah, so basically it just reuses the same logic vcpkg would normally use to install libraries on your local environment. We look at what your platform is, we look at what kind of build you're requesting, your build settings, and what it will do is basically check
if something compatible already exists in the cache.
If it doesn't exist, then it will have to do the build from source
and then it can populate the cache with it.
So for later, it'll be available.
But if it is in the cache, then it just downloads that
and you don't have to go through the whole build process for that.
So it's basically an optimization for not having to build as often
and hopefully only have to build once for any given build configuration
and platform for your dependencies.
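Purely as an illustrative sketch (not vcpkg's actual implementation), the flow being described looks roughly like this, keyed on the platform and build settings; build_from_source here is a made-up stand-in for the real source build:

```cpp
#include <filesystem>
#include <fstream>
#include <string>

namespace fs = std::filesystem;

// Hypothetical stand-in for the real source build; it just fabricates a file
// so the sketch is self-contained.
fs::path build_from_source(const std::string& library, const std::string& abi_key) {
    fs::path out = fs::temp_directory_path() / (library + "-" + abi_key + ".zip");
    std::ofstream{out}.put('\0');
    return out;
}

// Return a usable prebuilt package, building from source only when no
// compatible binary is already in the shared cache.
fs::path acquire(const std::string& library, const std::string& abi_key,
                 const fs::path& shared_cache) {
    const fs::path cached = shared_cache / (library + "-" + abi_key + ".zip");

    if (!fs::exists(cached)) {
        // Cache miss: do the one-time build, then publish the result so every
        // other machine (CI, containers, teammates) can skip this step.
        fs::create_directories(shared_cache);
        fs::copy_file(build_from_source(library, abi_key), cached);
    }
    return cached;   // cache hit, or freshly populated: ready in seconds
}
```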
So we got a lot of feedback that people really wanted this.
It's kind of like our solution to people asking,
can we install pre-built dependencies?
Because you can't realistically have pre-built libraries
for every single possible library,
every single possible version,
and resolving all the version conflicts
and all the possible issues
you can have with C++ dependencies, you end up with a very, very large catalog. But instead,
what we can do is we can make it easy to build that once and then just reuse that. And that's
basically binary caching. And then with manifests, so again, historically, with vcpackage, you have to run
commands in a terminal or write a script that does that for you. So you can say things like
vcpackage install, boost curl, open SSL, and then you can specify your build configuration
as desired. But with manifests, you can instead put your dependencies in a JSON file, and then they can get installed
without you having to run those commands.
So we have support for MSBuild and CMake projects,
and the way it essentially works is you just write that file,
and assuming, so let's use CMake as an example,
assuming that you're already pointing to the VC package
CMake toolchain file so that CMake knows about VC package.
Then when you run the CMake generation step,
it will automatically invoke VC package,
and it will automatically install your dependencies at that time.
So then when you go to build, it should just work.
And it will look at your build configurations and your CMake project,
and it will make sure that you get dependencies that are compatible with that
built on your machine,
and then you should basically be good to go at that point.
And that manifest file has a second use,
which is if you want to package a library for use
in the vcpkg ecosystem, that still uses that format.
So we have a file format called the control file today.
We're kind of deprecating that,
and we're moving towards this common manifest file.
Yeah, and we have some more features as well
that we're working on to make it even easier
for teams to be productive using vcpkg.
Right, so there's a few upcoming features too,
which we can talk about.
Yeah, so one feature that has been requested for a very
long time is versioning support. The way vcpkg works today, we have this very large
catalog of over 1,400 open source libraries, and we basically build that whole thing as part of our
CI process to make sure that all the libraries are compatible
with each other and there are no version conflicts and stuff like that. But the downside of that
is that we're essentially dictating what version of the library you get when you do vcpkg
install some library, which might not be desirable for some people, especially if you're only
installing some small subset of the libraries. If you know exactly where the version conflicts are,
you know you're not going to have any problems,
and you understand exactly what you're doing,
you might want to be able to say,
well, I want this version of this library
and this other version of another library,
or specify something like,
well, I'll accept a library at least at this version
or the next one that works if the minimum one doesn't work.
So we're trying to enable stuff like that with versioning support.
So we're going to take our entire catalog,
make the old versions
that have historically been available
in old versions of vcpkg
available just for install,
even with the latest version.
So you can pick and choose
exactly what versions you want.
If you want that level of control,
you can still get the default behavior
where we give you one
if you don't really care what version you want.
But you can pick and choose, and you can specify that in the manifest file as well.
But that feature is something we're still working on completing.
We're pretty close, but it's not available yet.
The other big upcoming feature is called registries, which is us trying to answer the question of how should you
bring your own libraries to the vcpkg ecosystem? So we have all these open source
libraries, but then people have their own private libraries, or maybe they're using other open
source libraries that aren't in the main catalog, or maybe they want to fork an open source library
and really control it within their organization
and make sure that they run through their own process for security reasons and stuff
like that.
And maybe they patch it once in a while.
So in that scenario, we want to support something called registries, which a registry is like
a catalog of libraries.
So you can define your own catalogs and define your own libraries to be in
that catalog. And you can reference libraries from multiple catalogs. You can get them from the
public catalog that everyone has access to, or you can get them from your own private catalogs.
And that way, we hope to kind of unify the story of just installing any libraries that you might
need, whether they're open source ones or not, and having one consistent
experience for doing that. That sounds very cool. So how exactly are the registries going to work?
Are you just kind of giving a URL and saying, like, I want to use OpenSSL from here instead of
the one on GitHub or whatever?
So, first of all, there are two
ways we're looking at this.
So we already actually have some very early,
maybe let's call it pre-alpha support
for registries in file system format
where you can basically pick some folder.
You can put a bunch of libraries there,
and the idea is that you'll be able to use
another JSON configuration file
to essentially define that registry, define the libraries that are part of that,
and just any metadata that's around that.
And then you can reference, and you can also define which libraries
from which registries take precedence.
So for example, let's say you have your own private fork of boost
where you're making some patches.
You can essentially define in that file
that when somebody requests boost
from the vcpkg.json manifest file,
actually give them my boost.
Don't give them the public boost
because ours has some hotfix we threw in recently.
So you can do stuff like that.
And we're also working on support
for Git registries, which brings it to the point where you can specify a Git repo of the library
you want, and even the specific commit ID, and also including a lock file, to be able to lock
in and say, okay, make sure that anytime somebody installs
the set of dependencies and runs vcpkg
against the manifest file,
don't update them or change their version on them
so that they're consistent across all the machines,
across all the environments,
whether you're on your local machine
or running builds in a continuous integration server,
and just keep it all consistent.
So we're still working on the implementation details,
but that's kind of where we're at right now with the design.
Okay, that sounds really helpful.
Yeah.
And is this coming out sometime next year,
or do you think it'll be in the next update or something like that, can you say?
Yeah, so for versioning and registries,
I think initial experimental support for registries
is pretty close, like next few months.
I would expect it this year.
Versioning, we're actually even closer.
We're just finishing some of the implementation there
and getting the catalog of older libraries versions populated.
So we're actually hoping within a month or two for versioning, but that's at least the
current timeline.
So we're actually pretty close with these.
We were actually hoping to get both of these features in time for CppCon.
Unfortunately, there wasn't enough time,
and we want to make sure we polish these features up and that we're happy with the quality before
we go and make a big announcement about it. So we're still considering them in progress,
but we have done a lot of development on them.
I'm kind of curious, how big is the team that Microsoft has devoted to vcpkg?
So we have around five full-time engineers on it.
There's myself as well as a program manager.
And we have a couple of people that work on mostly maintaining the PRs for library updates and stuff like that and tracking statistics and how the project is doing as well that we're working with.
And that's a partner team.
So it's not a very big team, but it is an open source project.
And we do get quite a few contributions as well.
And that's what's really been keeping things going.
That's why we've been growing as fast as we have.
And the catalog has been growing a lot over the years.
We only started in 2016.
So we're already at over 1,400 unique libraries.
And also went...
I think we started off as Windows only.
And now, of course, we're on Windows, Mac OS, and Linux.
So we're trying to support all the C++ developers out there and all their needs.
Awesome.
And I think with versioning as well, I think one of the things which people misunderstand about vcpkg is the versioning model which we currently use.
Like, it's not the case that the versioning support gives you
something which is always
better than what we currently have.
These two models have
upsides and downsides. What we currently
have enables
a fully tested suite of
libraries which have been tested against each other
and you know that if you check out a version
of a package that that version
and all of the things
which it depends on,
that whole chain of dependencies has been tested.
And you can trust that it's good, at least to the extent of how good the testing is.
Whereas you don't get that from specifying your own version.
So like these are two different use cases, which now we're trying to support both of.
Yeah.
And some of this is really unique for C++
compared to some other languages like
C Sharp.
Because of course we have
some people who also use something like
NuGet for example for package management.
But we really evolved
vcpkg out of a need to
solve some of these underlying problems for
C++ developers where you
have to deal with version conflicts and building a variety of different open source libraries.
There's not one process that you can follow for all of them; they're all different. If you go to
all their Git repos, they may have different build instructions, be supported for multiple
build systems, and some of them might not be set up as well for the build system you're using.
And it can just take some time to really figure out,
okay, how do I actually get all of these working?
And then even when you do build them,
maybe they don't all work together.
So with vcpkg, we tried to solve all of that
just from the beginning.
But now we're really trying to expand
because there's additional functionality
that people need in specific scenarios.
Yeah, I tried recently to compile Nginx, the HTTP server, for my NAS drive, which runs on ARM, on a 32-bit ARM.
And Nginx does not support cross-compilation.
I hate having to deal with these things.
I want to just be able to build things
and not have to end up at 1 a.m. crying
because I'm trying to build on a 64-bit machine
and building for a 32-bit machine
and it thinks my pointers are the wrong size.
I don't want to deal with this.
Speaking of figuring out
what compatibilities libraries have,
I guess it's also worth pointing out
we haven't really announced this anywhere yet,
but we're also working on a website for VC package.
And part of that is actually a search feature
for the different packages that are there.
And you'll basically be able to see,
okay, which operating systems is this compatible with, which targets are these compatible with.
And hopefully that's helpful, because of course not all libraries are completely
cross-platform and designed for every target out there. So hopefully that helps a little bit,
hopefully it helps people explore different libraries and see what their best options are for what they need.
That sounds like a great tool for the ecosystem.
Yeah, we hope so.
Yeah.
Today's sponsor is the PVS Studio team.
The company develops the PVS Studio Static Code Analyzer
designed to detect errors in the code of programs
written in C, C++, C Sharp, and Java.
Recently, the team has released a new
analyzer version. In addition to working under Windows, the C Sharp part of the analyzer can
also operate under Linux and Mac OS. However, for C++ programmers, it will be much more interesting
to find out that now you can experiment with the analyzer in online mode on the godbolt.org website.
The project is called Compiler Explorer and lets you easily try various compilers and code analyzers.
This is an indispensable tool for studying the capabilities of compilers. Besides that, it's a handy assistant
when it comes to demonstrating code examples. You'll find all the links in the description of
this episode.
Okay, Julia, let's start talking about VS Code. I know you had some really big news this week too.
We did. So the first announcement is that the C++ extension came out of preview.
So we released our first 1.0 release, which is really exciting.
The extension had been in preview for over four years, I believe.
And it's come a long way since it was first shipped with a tag parser intellisense
experience. Coming out of preview and marking this release as 1.0, you might be wondering
what that means. Basically, we believe the extension has the core language server functionality that
we've envisioned. It does not by any means mean that we're stopping there. We
have a lot more coming in the future that we're excited about. But we've been working
to stabilize the extension for the past few years and believe it's gotten to a point of high quality
and a large enough feature set where we can put that 1.0 stamp on it. Awesome.
And can you go over what exactly is included in the C++ extension?
I know we've talked about it before.
Yeah, yeah, of course.
So I kind of divide it into two categories.
We have editing and debugging.
And the team that I work most closely with focuses on the editing features, which includes IntelliSense, things like completion lists, parameter help,
quick info, semantic colorization, some code navigation features like find all references, go to definition, go to declaration, things like that. We also have code formatting. And
one of the new features of 1.0 is that we have the Visual C++ formatting
engine in VS Code now, which essentially is the formatting engine in Visual Studio. So
all of the C++ formatting options that you have in Visual Studio are now supported in VS Code.
There's also EditorConfig support for all these new settings, just to give customers some more
flexibility with how they want to format their code.
And the EditorConfig thing, that means if a team wants to use either
Visual Studio or Visual Studio Code, they could have that file checked into source and have
the same kind of experience between both?
Exactly, yes. So all of the naming conventions are the same
between VS Code and Visual Studio for that scenario. And then we have debugging support.
So Visual Studio Code has a debugger UI
and it uses whichever debugger you have installed
on your computer under the hood.
So you can customize your debug launch configurations,
but you get to use that intuitive user experience from VS Code.
And like on that same note, so the debugger uses the debugger that you have on your computer,
the C++ extension uses whichever compiler you have installed on your computer to set up
IntelliSense for your source files.
Okay. And is it able to detect any of that automatically? What kind of setup is involved?
Yeah, so it'll check a few default locations for a compiler and debugger on your system, but
if you don't have your compiler in one of those default locations, then you can manually specify
the path to the compiler that you want to use. You can also change your IntelliSense mode.
So let's say you are compiling for a different architecture
than the architecture of your host machine.
You can specify that as an IntelliSense setting
for the C++ extension so that it'll give you
the proper IntelliSense for the architecture
that you're compiling for.
And I think since last time we spoke about VS Code as well,
we improved the configuration of the debug launch settings,
which is something which always annoyed me.
Like, I would try and launch and debug something,
and I'd just stare at the settings page
and be like, what do I have to know?
Right, right, yeah.
It's much easier to use now.
It is much easier.
I think they're able to pre-populate many more things now.
And we also have acquired the CMake Tools extension within the past year.
And CMake Tools makes the configuration process very simple for CMake projects.
You don't need to look at any of those JSON settings files yourself at
all. So if you are not using the CMake tools extension, then you would be setting up your
build task and your launch configuration through JSON files. But with CMake tools, it completely
abstracts that and will set up that build and debug configuration under the hood for you.
Yeah, the CMake Tools extension used to be developed by vector-of-bool, who did an incredible job
of getting it to where it was.
It's still developed as an open source project.
It's just we're doing the primary maintenance of it.
Even if you're not a CMake user, one feature I like, it's not a new feature, but I think it was added in the past
year or year and a half, is the
build and debug active file. So you can start with just like one
CPP file, one Hello World project. Let's say you're a student
trying C++ for the first time, or you're somebody who's trying
to evaluate Visual Studio Code
to see if it's good for you,
you can just start with the one file.
You don't have to have a build system set up or anything.
You can just right-click and say build and debug active file,
assuming you have your compiler and debugger,
and then it'll just confirm,
okay, so you want to use this compiler and this debugger,
and then it will just set up some default configuration for you,
and you should be able to just test it out without any additional work.
So that's close to the Visual Studio IDE-level kind of experience,
which is really nice.
Yeah, I use that all the time if I just want to test out something
in a small project, just make a file in Visual Studio Code, hit build, go.
Yeah, it's less than 30 seconds to get everything set up,
and most of it is writing your C++ file.
I'd probably just have a Compiler Explorer instance running for those cases.
Yeah, I mean, that's what I do for most things.
I use the Compiler Explorer online for most things,
but then sometimes I'm trying out
new versions of MSVC and things like that which aren't on Compiler Explorer yet.
We are working on that, by the way.
People have been asking me when we're going to get the most recent versions of the compiler
on the upstream Compiler Explorer.
People are looking at it.
I do want to get them up there.
People should be able to play with the most recent toys and things like that without having to
pull down everything.
Are you just talking about trying to get, like, preview builds up there?
It's out of date of even release builds.
Oh, okay.
Yeah, it's out of date of release builds. I can't remember what the most recent one is, but it's like 0.24, I think.
Yeah, and compared to Clang and GCC
where they literally have trunk
on Compiler Explorer, it's miles away.
Nightly builds.
Yeah, I'd like to have that remedy.
That would be nice.
Julia, I've got a question I like to bring up
every time Visual Studio Code comes up
and I feel like it just becomes more and more relevant
each time because you keep adding
more and more C++ features to it. Where's the line between VS Code and Visual
Studio? Who are you targeting? It's a good question. So for VS Code, we're targeting Linux
and Mac customers. We're saying if you're on Linux and you're on Mac, then VS Code is the way to go.
And whereas Windows, we would recommend using Visual Studio
IDE for a more extensive experience. But yeah, that is a good question. And we actually are
beginning to do some integration work with Visual Studio, which will enable us to light up some of
the remaining features in VS Code that are not there that are in Visual Studio,
like such as call hierarchy, for example. That's another one of our top feature asks that will come
after we do some of this integration work. But that is a good question. And I think it's one
of those things where we don't have, you know, an exact line that we're drawing, like this feature VS Code, this feature only
Visual Studio. I think we will continue to listen to what our customers want in Visual Studio and
adapt the product based off their feedback. We don't actually have that much overlap in our
user base between VS Code and Visual Studio. Interesting.
Yeah.
That's surprising.
I have both.
I use both all the time.
Yeah, but you're an MVP, though.
That's true. Yeah, I think as far as the line goes, like Julia says,
there's no hard line, but we still do want to be able to say
that Visual Studio Code should give you a more lightweight editing experience,
which is what some people want,
and Visual Studio should be a more fully featured IDE.
So yeah, those are fuzzy definitions kind of by design,
but that's kind of the ballpark which we want to aim for.
And even on the Visual Studio side,
the lines are getting blurred in terms of,
yeah, Visual Studio is this big, comprehensive tool that comes with all this stuff built in.
But now we're having conversations about things like, okay, should we really be shipping
this much stuff in the MSVC tool set in Visual Studio? Can we decouple it further? Can we
break things down by targets? Can we maybe
start with x64 support? And then if you want x86, that can be an optional package, you have to check
it in the installer. Now we're thinking about things like that. We don't have that in right now.
But those are just the kinds of conversations we're having. Because we, we recognize that there
is value in having something that's smaller and lightweight and maybe gives you more
targeted things that you need and then you can extend that and add other packages as needed
And even if we look at the evolution of the Visual Studio installer, when 2017 came out,
the 2015 installer had way fewer options and checkboxes for things than 2017, because we were trying
to really break things down and give people
more flexibility in what exactly they want on their system. And I know that we've been doing
some work looking at both MSVC and the Windows 10 SDK, to think about how can we break
this down and install smaller stuff for people.
It gets better every year, but it's a little bit of a pain point when you're like, I just need that version
of the compiler on my CI.
That's all I want.
Yeah. We definitely want to
make that a bit easier going forward.
Okay. Is there anything
else? Any kind of big announcements
that happened this week that we haven't
gone over yet that you want to?
One thing we haven't talked about is
Codespaces.
Okay.
This is GitHub Codespaces, and the idea is it should be like an instant developer
environment. So, you know, you have a GitHub repo and you hit a button, and it opens a codespace, either in an online editor or on your own local machine.
And then you have your whole development environment set up and you can edit the project, you can
build it, you can run it without having to install anything on your own machine.
So the benefits there are like if you have a project which takes a bunch of resources,
then you can use... this is all built with cloud-hosted environments.
So you're not having to use any of your own system resources for building and running this stuff.
Also, it means that you can, you know, access your code from anywhere. Like if you're stuck
somewhere or traveling or whatever, if you're on a train, if you're ever on a train again, then
you could access your code and do development.
Also, one of the things which I think is most interesting is onboarding people.
We do a bunch of surveys of how long does it take you to set up your development environment
when you join a team?
Or if you just want to contribute some small feature to a
project, how long does it take you to get the build environment set up, install all the
dependencies, and things like that? And for some people it's literally days. And goodness, that's
just a huge pain point. So if you can just click a button and have a developer environment ready to go, then that's much better than a few days.
And to add to that, we've... oh, sorry, you go first.
No, I was just gonna say, I was playing around with the VS Code, Codespaces experience. And
I just am still blown away every time I open VS Code in the browser and you have the full VS Code UI, you can use the debugger,
you can use like every feature, all the feature sets of VS Code are there available in the
browser. It's just, it's amazing. You could open it from your iPad. Our coworker was able to
connect to Codespaces from his car, from his Tesla, from the screen of his car. It's really cool.
I feel like I'm totally missing something here.
How do you launch Codespaces in GitHub?
So it's currently
there's a sign up
so it's not like fully open yet.
If you go to
aka.ms
slash Codespaces
hyphen set sign up.
Okay, so it's preview.
So it's an amazing tool that we can't use just yet.
Is there a release date announced at all?
I think you can sign up for the preview for VS Code at least.
I believe that's public preview.
Yeah.
So you can definitely try it.
It's just not considered finished yet.
We're still doing some work on that.
But it's really an amazing thing
because we're really going in the direction
where you can just go on GitHub,
find some repo,
and then just open it in Visual Studio Code
in the same browser.
And you can just start playing around with it.
And it essentially just creates
like your own private fork of that
for you to be able to play around with.
And this works with C++, I assume.
Yeah, it's not just C++.
It goes beyond that even.
But yeah, we've been doing a ton of work
to get the C++ support there.
And yeah, we've come really far.
So we're quite happy with this.
Yeah, it's currently, we say it's
in limited beta access.
So you can request early access on GitHub
and it will be available to a smaller set of users.
So yeah, if you're interested, you can go and sign up
and see if you can get access, give it a shot.
I think Rob is signing up right now.
Yeah.
Check it out.
Okay, well, we went over a lot.
Anything else before we let all of you go?
I do have one more announcement,
which is that we have Makefile support
coming to VS Code very soon.
And so we're working on a new extension.
It'll be called Makefile Tools,
so very similar in nature to the CMake Tools extension,
basically providing configuration, build, and debug support for Makefile projects.
And we'll be releasing the beta version of this,
hopefully within the next few weeks.
And I think something we've been thinking about long-term
is will we combine our Makefile Tools extension, our CMake Tools extension, and the C++ extension, so that it just comes with this native build functionality out of the box.
But we're releasing it for now as a separate extension, hopefully in the next few weeks.
That's interesting.
I mean, CLion is also working on a Makefiles extension,
which, for me personally, I'm like, Make, who needs that? Except for the one
time every five years that I'm like, I need that right now. Suddenly I need to build
these 10 Linux tools and all their dependencies, and they all use Autotools.
Yeah. That's a good
point. That was one of our reasonings for shipping this as a separate extension
and just using it as a way to start that dialogue with the Makefile community
and see how they are using Makefile, why they're using it.
And if it's something that people express a lot of interest in,
then we know, okay, this is worth pursuing,
worth maintaining.
But yeah, we have those same questions as well.
And one other thing that we're working on, I guess,
and this also ties to Visual Studio Code,
we're working on better vcpkg integration
to the point that,
so Julia actually demoed this as well in her talk,
which was like a preview because it's not actually out yet.
But we are working on an experience
to manage dependencies in the VS Code UI,
like in the GUI, through vcpkg.
We're actually shipping the vcpkg tool itself
as a binary inside the C++ extension.
So when you install the C++ extension,
you can already start installing libraries right off the bat. And we're also exploring
a similar thing for Visual Studio, although we haven't done the work there, so it'll
be a bit farther out to get the Visual Studio integration. But yeah, we want people to be able to
have the package manager built in as part of their IDE or editor.
Yeah.
I mean, to be clear, like in Visual Studio, there's still some integration already where you can... like if you have a CMake project and you have a find_package call or something,
then if the package isn't installed and it is available in vcpkg, you get a little tooltip and you can install it right from there.
This would be something like, if you're familiar with some of the tooling for Python, where you can get a list of packages and install something from your sidebar?
Right. Yeah, we already have some integration with vcpkg. If you install vcpkg yourself separately
and have it installed, you can install libraries
when you do a
pound include and there's a squiggle because we don't know where
the file is, you can install
that way. But this way you don't
even have to do the step of installing VC
package. You can just
have that already. Sounds great.
Okay, well thank you all for
joining us today and
hope you enjoy the last day of the conference tomorrow.
Yeah, you too.
Awesome, thanks.
Thanks for having us.
Thanks.
Thanks so much for listening in as we chat about C++.
We'd love to hear what you think of the podcast.
Please let us know if we're discussing the stuff you're interested in,
or if you have a suggestion for a topic, we'd love to hear about that too.
You can email all your thoughts to feedback at cppcast.com.
We'd also
appreciate if you can like CppCast on Facebook and follow CppCast on Twitter. You can also follow me
at Rob W Irving and Jason at Lefticus on Twitter. We'd also like to thank all our patrons who help
support the show through Patreon. If you'd like to support us on Patreon, you can do so at
patreon.com slash cppcast. And of course you can find all that info
and the show notes on the podcast website
at cppcast.com
Theme music for this
episode was provided by podcastthemes.com