CppCast - Reflection and C++26, with Herb Sutter

Episode Date: October 10, 2025

Phil and Timur are joined by Herb Sutter to catch up on what's going into C++26 which, let's be honest, is dominated by reflection.
News: CLion now has a constexpr debugger. "CMake for complex projects" - tutorial: Part one, Part two. Safe C++ proposal is not being continued. Episode with Sean Baxter.
Links: Herb's Reflection talk at CppCon 2025. Herb's Contracts talk at CppCon 2025.

Transcript
Starting point is 00:00:00 Episode 402 of CppCast, recorded 2nd of October 2025. In this episode, we talk about constexpr debugging, using CMake on large projects, and the future of Safe C++. Then we are joined by Herb Sutter. Herb talks to us about reflection and other recent developments in the world of C++. Welcome to episode 402 of CppCast, the first podcast for C++ developers by C++ developers. I'm your host Timur Doumler, joined by my co-host Phil Nash. Phil, how are you doing today?
Starting point is 00:00:59 I'm all right, Timur. How are you doing? I'm not too bad. Thank you. It's good to be back. Yeah, it's been a while. Yeah, it's been a while. You had a break of, I think, about three months now. Yes, yeah, completely unplanned. But, you know, life happens. Things caught up with us. Yeah, so huge apologies to our listeners. We know you have been waiting all this time for a new episode.
Starting point is 00:01:21 There is more to say about this, but I think Phil and I want to do this properly, so we are probably going to devote a little bit more space to this topic. In the next episode, we're going to talk a little bit more about what happened to us in the last three months. This episode, though, we want to catch you up on all of the stuff that has happened since last time. All of it? Not all of it, but some of the high-level bits.
Starting point is 00:01:45 Before we get to that, at the top of every episode, we'd like to read a piece of feedback. This time, actually, we have not received any written feedback, in email form at least, since the last episode, which was in June. But a couple of weeks ago, I was at CppCon, and I did have a handful of people who approached me, you know, shook my hand and said, hey, thank you so much for the podcast. So I take that as positive feedback. So thank you very much, everybody. I love meeting people at conferences and talking to them. So thank you very much for appreciating what we do and talking to us. Yeah, we'd like to hear your thoughts about the show, and you can always email us at feedback at cppcast.com.
Starting point is 00:02:26 Joining us today is Herb Sutter. Herb Sutter is a technical fellow at Citadel Securities, designer of several standard C++ features and chair of the ISO C++ Committee and the Standard C++ Foundation. His current interest is simplifying C++. Herb, welcome to the show. Thanks for having me.
Starting point is 00:02:46 It's great to be here again. So your bio has changed a bit since the last time we spoke to you. And I think by the end of this show, we may actually change it again. But it's not nice to keep it updated. Well, just so nobody's alarmed, the bio change I think you're talking about is that I started a new job a year, almost a year ago. And no, I'm not changing jobs again. So it's not like, I don't want to imply the wrong thing or tease the wrong thing.
Starting point is 00:03:14 You've got to go to the end of the show to find out. All right. So we're going to talk more about this later in the show. but before we do that, we have a couple of news articles to talk about. So feel free to comment on any of these, if you like. So obviously, we've been off the air for three months. So I'm not going to talk about all the new libraries and all the new compiler updates, what we usually do because there's just been too much stuff going on,
Starting point is 00:03:36 same with blog posts. But there are a few things that stood out to me, which were kind of high-level, kind of noteworthy. So I'm just going to quickly go through those. The first thing that jumped out to me, completely out biased, of course, because this is totally not a product that I used to work on in the distant past. But we are of course talking about C-Line, which has a Const-Exper debugger. And I think this is not so much about C-Line specifically.
Starting point is 00:04:02 I suspect other IDEs will have similar features at some point. They're not too distant future as well. It's more about the fact that we do have now an IDE that offers a Const-Exper debugger because Const-Exper-Programming is becoming more and more prevalent, important and more and more important, you know, part of the development cycle. Constex for all the things. Yeah, so I had a quick look at this and it looks pretty great. So this is a preview that you can already check out.
Starting point is 00:04:27 It will, I assume, will be in the next release. I didn't actually talk to the C-Line people about this. I haven't had a chance to talk to any of them. But I just literally just read the blog post. You're external now. Yeah, I'm external now. Yeah, yeah. But it looks really, really good.
Starting point is 00:04:43 you kind of just go to any ConstExper declarator or static assert or any other like compile time thing there's like a new debug button there then yeah you are debugging cons expert code you can step in you can step over you can step out you can restart you can even step backward that's not something you can do at runtime unless you have a special debugger like undo is doing this kind of stuff but at compile time all the information is in the compiler right so you have the entire state there so you can do stuff like this and you see what the compiler sees. You see the cold stack at compile time. You see the local variable. You see the last returns. You see the template arguments of the, you know, the template you're trying to instantiate or
Starting point is 00:05:22 whatever you're trying to do right now. And it's like really, really cool. And I believe we're going to talk about this more. Once compiler support reflection, which is going to be a big topic later, I think this will be even more awesome because then you will actually be able to like debug how your compiler reflects and your code and, you know, but make, put together a new type or, you know, whatever. So I think I'm just very, very, very excited about that future. Yeah, this is one of those inevitable we need it features, but that yet has been slow to come. And ever since we added ConstExper in the very beginnings of it in C++11, there has started to be demand for this, because as we've been doing more and more and more execution of C++S code at compile time,
Starting point is 00:06:09 you write code, you got to debug it, right? And so the workaround that we've mostly been using is saying, oh, look, ConstExper code can run at compile time or runtime. Now, to debug the compile time problem I'm having, I'll just try to invoke the same function in a non-Const expert context and use my regular debugger. Fine. That gets you X percent far. I guess you maybe 50 percent far. But as we write more and more ConstExper and ConstEval, which is compile time. only code. And don't get me started because we're going to spend the whole hour talking about how reflection is going to create an even greater explosion in gold rush for writing that code. Then we have to debug it. I mean, the example I've been giving to IDE vendors for the last decade is, hey, if we call a constexper function to calculate the size of a stood array. So stood array of my type, comma, and then you invoke some context per expression,
Starting point is 00:07:12 and it's the wrong size, I got to debug that. That's part of the type. I may not be able to do that at runtime. So anyway, I was pleased as Punch, and I have not looked at it. Yeah, I'm not actually a C-Lion user. I hear great things about their products, and I've played a little bit. But this is the kind of thing that we need more of. And so I was super delighted to hear that ID vendors are now actually pushing into that space.
Starting point is 00:07:39 Because you write code, you write bugs. You got to debug it. Yeah, yeah, this is really exciting. It happens to be my favorite IDE, so I will actually get to use this properly once it's in the release. There's a few limitations. They don't support modules yet. Not dive it to Safe BuffFus, really.
Starting point is 00:07:58 Yeah. Why, is modules more harder than other features to support? Sorry, I couldn't resist. Yeah, a different topic. But yes. But also, there's a few compiler-time constructs that it doesn't quite understand yet. But I'm sure they're going to iron that out as time goes by. I'm also really curious to hear how it works. So I hope I can have a conversation about that if I bump into any of
Starting point is 00:08:23 the sea line people anytime soon. But yeah, really exciting. And I'm sure that modules and other features are on the roadmap because they're important. But of course, anytime you have a version one feature, you're going to do, I almost want to say a minimum viable product. You want to have like a feature set that gives you real value now, but that you can build on in the future. And so everybody understands that version one is good, and version two will be great. And it's just so nice to see it landing. Yeah, the one thing that I want that it doesn't quite do yet in this earlier version is debugging through if constexper cascades, because I've got lots of those in my code.
Starting point is 00:09:05 And I could really do with being able to debug those. But hopefully that will come in a future update. Right. Hey, by the way, it sounds like you have learned the lesson that many of us have learned as soon as we started to use the IFCONS expert in earnest is, oh, right, instead of specialization and function template overloading, you just write the general function template and put the if const expers inside, which among other things, and this was a discovery to me, which is where I'm mentioning it, I didn't realize, after I'd done it a few times, I realized, oh, not only does this replace overload resolution, it lets me control overload resolution because I see. the order in which I want the test done, mind-blown. That is, when I made that realization that helped me think about it that way, that was an eye-opener for me. Yeah, 100%.
Starting point is 00:09:52 Right. Okay, so we have another tooling-related news item, which I put on there because one thing I did earlier today was I scroll through the entire CPP Reddit for the last three months. And there was one article that had like way more upvotes than any other. It had like 452 upvotes, which I don't think I've ever seen on that Reddit, which is a Seamake tutorial about how to use Seamake for complex projects. And so that is a thing that you can now read.
Starting point is 00:10:25 It uses as its project. It uses like a complex game engine called the Columbia Engine, has a complex build system, has some web assembly, has some other interesting stuff going on. So it's like a real world project. And then basically the tutorial talks is where, how do you set that up with CMAIC? And I think that's really, really significant because a lot of these tutorials, they just kind of play with toy examples where kind of they just teach you the basics,
Starting point is 00:10:48 but they don't really show you like how do you apply this at scale. But that's kind of what most people have to do. And that can be very, very hard with CMAIC and very painful. And I know from like just going to local meetups talking to my friends is that most developers don't want to touch this, don't want to have anything to do with CMA because it's just too painful. They don't want to learn it. but like this kind of stuff really helps to actually do that right it actually walks you through
Starting point is 00:11:13 how do you as a as a civil source developer as a real world project how do you actually you know use a built system like CMAIC properly and it does that in an approachable way so I really like that I'm really happy that exists now yeah I only saw this because you you put it in the use item so thanks for that but I'd have read through it was actually really interesting and it does sort of go some way to solving that now draw the rest of owl problem. As long as you want a particular type of owl, so it is specific to this type of project. It's not going to cover everything, but it's a nice representative slice through a reasonably real-world, somewhat sophisticated project. And I learned a thing or two from it. So yeah,
Starting point is 00:11:53 definitely go and check that out. Well, there is a follow-up post as well that goes into even more detail, I saw. Oh, I haven't seen that one. Yeah. I'll put that link in as well. So the last news item is something that I think might be interesting to people who kind of follow the long-term development of C++ and particularly all the safety and security discussions that we have had in the last few years. So Sean Baxter had put a lot of work into creating something like something like a Rasparo checker for C++. It was called Safe C++. That's a proposal. There were like he gave this presentation about it. It was like very, very impressive work. I think that many people said
Starting point is 00:12:34 that's really interesting. We're really cool to see that we can do something like the Rust version of safety in CBS class. And it was encouraged in SG23, the safety group in the committee.
Starting point is 00:12:48 Yeah, so that is interesting because the blog post that I saw today basically says that Sean Baxter decided to not continue working on it. So that's the news item.
Starting point is 00:12:58 And I was a bit surprised by that. And there was also a claim there that the safety and security working group voted to prioritize profiles over this stuff, which was not entirely my perception of what happened, but I was not in a room. I don't know. I don't want to speculate. I don't want to like, you know, bring any politics into this, but I do find it interesting that Sean Baxter seems to have abandoned that project. I think that's quite sad, but I think it's also newsworthy potentially, which is why I bring it up. Yeah, I think there is a little bit
Starting point is 00:13:30 of some of his words been taken out of context, not necessarily deliberately, just that's how things work sometimes. And also a bit of, I think, Sean himself, he just wanted to move on. He's been doing circle for a long time now. Oh, he's been running in circles for a long time, yes. Oh, I saw that pun you did there.
Starting point is 00:13:50 Almost slipped it past year. Right. But I think as a proof of concept, the work that he did do, and we had him on the show. Last year we talked extensively about this. It stands on its own is a kind of proof of concept that shows this can be done. I think this is still helpful if he's not going to continue, like make it into a proper product or proposal, I think. Yeah, I believe there are some other takes in the works as well.
Starting point is 00:14:13 So maybe we'll talk about those at some point, but not this one. All right. So that concludes the news items. If I could inject one more news item, though, in the, today I learned at CPPCon, of course, now that's two weeks ago, department. but one of the cool things I learned was in the panel on Monday evening. So just to set up a little bit of backstory, after Bjarnes' stellar opening keynote, that's always the crown jewel of CPP on to launch things off Monday morning, and then I had a talk.
Starting point is 00:14:47 And then after lunch, I went and I did an interview that was adjacent to CPPCon. There were a couple of documentary camera crews at the conference, and so I sat for an interview with one of those. and they asked really great questions, one of which was a question I had got before and didn't really have a clear answer to, and that was, how does AI affect language evolution? And so I sort of said, well, AI is really important and it's a great tool. I'm not really sure that it directly affects language evolution any way I know of, but, okay, so then the interview moved on. fast forward to the evening to the panel that you were on as well Timor and during the panel in the middle of the panel Daisy Holman a longtime C++ committee person who's now at Anthropic doing drum roll
Starting point is 00:15:35 AI stuff gave an answer saying well you know I hadn't really paid a whole lot of attention to contracts until the last six months but now that I've been working at Anthropic on AI one of the things I learned there was that contracts, being able to write pre and post conditions on function signatures instead of burying them as assertions in the body, doesn't just help static analysis tools, which we knew that it did, but it also helps AI because AI now has a much easier time for the same reason as the human reader of the code, the same reason as the static analyzer to know what the function does because you've declared your intent up front. You've documented And, of course, preconditions are always more important than post conditions, but you've got both of them on the function signature, and that helps refactoring tools and other things like that as well.
Starting point is 00:16:28 And so that was a lightbulb moment for me. He's like, oh, well, yeah, that makes total sense. I just hadn't thought of that. And now that cloud code and tools like that are actually in the position where they can use that information and it lights up AI and AI does a better job, including I think you were mentioning a conversation we had recently that it looks at comments as well. Well, this all really helps. And now, if only I had known that, oh, something like eight hours earlier, I would have given a better answer in that interview and been able to say, oh, yes, AI does, in fact, affect language evolution. Because one thing, the more you declare your intent, including by putting pre and post conditions on function signatures, that really helps AI.
Starting point is 00:17:15 and Daisy did make the comment that, well, at least the current generation of AI, I can't make predictions about what's going to happen two to five years from now, but certainly for at least the current generation, it helps a lot. So that was a new thing I learned. It's always exciting to be learning new things, especially as the industry keeps moving forward. Yeah, thank you, Herb. This is a great news item.
Starting point is 00:17:37 Thank you for injecting that. That was also a light belt moment for me. I was on the same panel. I have two comments to this. like one is obviously I helped design and specify contracts and like it's obviously very cool like as somebody who has kind of participated a bit in and making this feature happen to then hear that people are using it in like new and different ways and not something I thought of that was that's really really cool but actually the other thing is that the very last episode we had we were
Starting point is 00:18:07 talking to Matt Colocundis from Brontosaurus about his refactoring tools which you use AI and He was saying the same stuff that, you know, if you can't give it more context, if it can look at comments, it just makes this much better job. And back then, I didn't connect the dots yet that, you know, if you have contact assertions on your functions, that's even better because then you're expressing in code, you know, here's what the function expects. Here's what it gives back. You know, here's how you use it correctly. Here's or use it incorrect. You express that in code. That the I can consume that and figure out what the code does much better.
Starting point is 00:18:41 And I haven't connected the dots at that point. but I did now, so thanks a lot to you and Daisy for bringing it up. Hey, we're all learning this together. Things are moving fast, but it is nice to see that features that we're working on for their own beneficial reasons now turn out to have applications and benefits in other areas that we didn't plan. And that always takes me back to Bjarna, who has always famously said, and it's an important thing I learned from him, that when he designs a feature, he tries very hard not to make
Starting point is 00:19:13 could do only the use cases he can think of today, but also to design it in such a way that people will find ways to use it, good ways to use it, that he hadn't thought of at the time, because nobody has a crystal ball and can think of everything. If we're going to wait for somebody to be omniscient and think of everything, we'll be waiting a long time for the first programming language. We'd still be waiting. Yeah, that's a good point. All right, so we talked about CPPCon. So we're kind of now in the main part of the episode with talking to you. So let's continue doing that. And maybe let's rewind a little bit further to the past from CPPCon to the committee meeting on Sophia, which we had back at the end of June.
Starting point is 00:19:57 Pivotal. Just, just pivotal. So what happened there and what was it pivotal? Why was it pivotal, Hart. We voted reflection, static reflection, into draft standard C++ for the very first time, and that is going to dominate the next decade or two of C++. It is just more impactful than any feature, any 10 features we've ever added to C++. And the way I said it to the committee, it was either to the committee in Sophia or two days later at your conference, Phil, at C++ on C, perhaps both. It's all a blur to me now. But if I had to look back in time as what is the most impactful single paper that we ever voted into the standard, in the history of the standard, until the Sophia meeting this past June, I would have said it was the Bionna Strostribson
Starting point is 00:20:51 and Gabriel Dasrace's paper on Constexper, adding Constexper to C++ X, which became C++ 11. That was in 2007, I think it was, at the Toronto meeting, because that was just the small edge of the wedge of the, of Constexper, just the very beginnings of Constexper, but it has clearly changed the trajectory of C++. In fact, at the time, I'll just say in a sentence or two, why it was important, everybody saw people were doing more compile time programming. We wanted it so badly we're going to use templates and angle brackets and abuse this template type system that was never designed to compute anything. And, oh, look, it's true and complete.
Starting point is 00:21:31 That's what Biana wanted. Well, I'm not sure he wanted it in this way, exactly, but I think he was okay with the result. But, you know, we're abusing this thing that was never designed to compute anything to do all this computation at compile time. Clearly, C++ needs a compile time language. The question was, what was that going to be? Because you could easily imagine creating a little simple scripting language that would run
Starting point is 00:21:55 at compile time. instead the insight which actually took a lot of heavy lifting to convince the committee and a lot of people still were not convinced at the time that we voted it in was that the compile time language for c++ should be c++ that was not obvious at the time and yet fast forward to today where you could run almost all a standard c++ language and library not all of it but almost all of it all the popular bits at compile time to compute things and we can see how impactful it is. Of course, it does mean that for the last decade plus, every C++ compiler has come with a C++ interpreter built in, which most people don't realize. But boy, is it worth it. And it makes our compilation faster
Starting point is 00:22:42 because when you replace the template meta program with constex per code, it's quite common to see it 10 times faster. Why? Because, again, it comes back to what we said about code contracts and about the range four loop. you're expressing your intent directly, instead of trying to indirectly abuse some other mechanism, a template type system with lots of angle brackets and implicit side effects of computation, but that wasn't designed to compute something, you're saying directly, hey, I want to run this code.
Starting point is 00:23:11 And as soon as you do that, tools, optimizers, they all say, oh, code, now you're talking my language. I know what to do with code. And the world is just a better place. So I would say that was the most impactful single proposal we ever added to C++ in the past 30 years. The single one was that first ConstExper paper until Sophia, and now we've done something even bigger, but that builds directly on that one, because you need ConstExper and ConstEval Code to use static reflection. So it's going to take us the next decade just to figure out all the things we can do with what we just voted into C++. plus us 26, and of course, we're going to build on that in 29 and on.
Starting point is 00:23:57 So let's talk about this a little bit more, because, like, when the, I think the comments you just as developer, things about reflection, they think about, like, all of these things we do today with ugly macros and, like, non-portable hacks, like, you know, enum to string or, you know, all of these things, like, if it's an aggregate, you can, like, write a hack to like iterate over the members and do stuff like this and there's like the whole library is built on this there's like a boost like library for this is other libraries for this and so the first thought is kind of oh it's going to make that easier because you can just iterate over members of a structure or whatever and that's just code but it seems like you're
Starting point is 00:24:37 saying it's actually way more profound than that and um i'm just kind of curious like what what am i missing if i'm just looking at the feature set that's there So first of all, the feature set that's there is we can reflect namespaces, classes, functions, and even parameters. But there's more to be done. We can't yet reflect default arguments for those parameters. We can't yet reflect annotations on those, nor can we go inside function bodies for statements and expressions.
Starting point is 00:25:07 All of those things must come. The end point, as I mentioned to SG7 last year and SG7, the Compelterme Programming subgroup in the committee, strong. I think unanimously, if I remember right, approved was, yes, we must be able to reflect everything a programmer could see. Why? Because if I can read it as a human, I will want to automate it. I will want to write programs that use that information. And so the one, I can't remember the exact phrasing I used for my CPP car abstract. Let me see if I can find it really quickly because I really like the one particular phrase. yes in june 2025 c++ crossed a rubicon it handed us the keys to its own machinery i really like that phrase because it c plus plus can now at least for the parts so far supported and then we'll generalize it it can read itself and generate itself and right now the generation is that you create another c plus plus file but we're going to work on in the future on generating c plus plus in the same
Starting point is 00:26:19 translation unit as you're compiling it, which is extremely powerful. So I've already implemented that in CPP front to a limited extent. But all of that's intended to be directly transferable to standard ISOC++ itself. And it just makes a whole bunch of sense. So I spent my entire keynote 90 minutes going, trying to just go through at CPPCon, nothing but reflection examples and the whole point was not to be wowed by any particular one but to see just how many examples and just be one after the other be wowed by oh we can do now and we can do that now and we're still just scratching the surface it'll take us a decade to figure out all the things that we'll be able to do but also in news just today a few hours before we're recording this
Starting point is 00:27:16 there was a Reddit post on automating PiBind 11 using reflection after somebody who had worked with Dan Katz's Clang prototype for a month. And so we're seeing this accelerate. I don't think you'll see any serious C++S conference for the next decade where reflection isn't a significant fraction of the talks. You mentioned Python, C Sharp, Java. They've all had reflection for a long time. And it doesn't seem to be setting their world alike. So is it really going to be different for C++? So I think it actually does. So Python doesn't have fully have the kind of reflection we're talking about,
Starting point is 00:27:57 nor Rust, but they have some subsets of it. C-sharp and Java do have reflection that enables libraries they couldn't have otherwise. That's runtime reflection. And so you make a virtual method call in Java using reflection versus doing the same method call directly, it's like something like four or five times slower. And even there, it's a huge advantage and it enables libraries and tools I couldn't have otherwise. But it's known to be, you know, scare quotes, slow and expensive. C++ reflection is not that. It is static. It is compile time. And if you want to compute something and keep it to runtime, you can then generate
Starting point is 00:28:36 runtime values. But by default, it's zero runtime cost at all, not just zero overhead, but zero cost at runtime by default, which is just a great place to be. And that's in the spirit of C++. And I think that, the fact that it's static with default zero runtime cost, number one, and that it's more flexible, even in its initial feature set, and that feature set's going to be expanded. Number two, as far as I know, it is already the most powerful reflection facility. Granted, it hasn't hit major compilers yet in commercial release.
Starting point is 00:29:14 pieces and trunk, but it's already the most powerful reflection feature I know of in any of the top dozen most popular commercial languages. And that's version one. It's, it is impossible to overstate how this is going to impact C++ for the better. So I was actually in the room for your keynote on reflection. So thank you at TPBCCon. So thank you very much again for doing this. It is actually online. You can't find it on YouTube yet. But if you have the link, you can access it. So if you're going to to actually put the link in our show notes so people can watch it. I highly recommend you do. And I had exactly the feeling that you described. Wait, we can do this. Oh, I had no idea. Like, that was really cool. Do you have a favorite use case? What's your personal favorite use case? I know, I'm putting you on the spot. It's a hard question. I, like, which is my favorite child? I'm not that I'm the father of, I'm not the father of reflection or anything, but it's like, it's that kind of question. Oh, my.
Starting point is 00:30:14 My goodness. Can I pick three? Okay. Yeah. And they're more general category. The first is a more general category. So I've been talking for the last eight years about writing classes using reflection. So being able to say, I'm writing a value class. I'm writing an interface class. I'm writing a proxy class. I'm writing an async class. Just declaratively up front. And then use reflection on the class on the class body to get the right defaults, the right generated functions. So you never need to write equals defaults or equals delete again if you use this style which i'm convinced um in the c plus is 29 onward will just be just be be common so that's your metaclassus proposal for that's the meta class's proposal which is all about using yeah which is all based on hey if we make the assumption we have reflection static reflection in the language this is one of the major things we can do with them so that that's one and i've talked about that before and i've implemented oh like a a couple of dozen of those in CPP front to show that, yeah, it actually works and it's great. But looking for, and some of that's now actually working in the, that you could actually write most of those
Starting point is 00:31:26 in the version 1, C++26 reflection, if you output the generated type that you generated from the prototype to a separate file. And of course, we're then on track to do it in the same file, which maybe it's seamless. But you can already do a lot of that. The example number two, which is going just slightly into the future, and I actually had the benefit of being able to bring Max Saga bow on stage from Germany, who happens to be an automatic differentiation expert. And so he was on stage, and he actually implemented auto diff in CPP front over the summer. And so we were able to go through automatic differentiation examples. If for those, please see the talk for that,
Starting point is 00:32:16 and for the examples that describe what it is. But basically, you can write a function to compute something. Like, say something as simple as, you start off with a simple example, like one input, one output, Y equals X squared. What's the derivative? Well, we all learned in school, it's 2x, first derivative. What's the second derivative? It's 2.
Starting point is 00:32:34 It's the flat line. It's the slope 2. So we can do differentiation symbolically. we can do it numerically using approximation and making the delta, the finer and finer the way numeric approaches do. What this does instead is it transforms the code. It actually takes the, reflects over the, in this case, using reflection, reflects over the code that computes the value, sees exactly what you're doing, materializes the temporaries, uses the chain
Starting point is 00:33:08 rule and differentiation for each individual sub-expression to create a function which in one pass calculates not only the original value, but also the first derivative, second derivative as many as you want, because you can just repeat this, with a constant factor overhead, usually two to two and a half times a factor overhead. This is hugely important for AI and ML, especially because it generalizes. You can do multiple inputs and outputs, and in fact, in AI and ML, it's very common to have billions. In some cases, we suspect that the companies have not confirmed yet trillions of inputs
Starting point is 00:33:49 to get an output, and then you just use forwarder or reverse mode auto-diff to do it efficiently in a single pass. forward is for if your outputs outnumber your inputs and and reverse is when the inputs outnumber the outputs I think I got those in the right order but it's a super accurate because it's it's not approximating it gives you exact results it can handle any function that can compute anything almost any function that can compute anything including with branches and loops inside it and if you can computed, even a discontinuous function that you couldn't model in MATLAB or Maple, you can auto-diff, and this is just hugely important. In fact, I didn't say this in the talk, but I'll take a step
Starting point is 00:34:37 further. I suspect proved, but I suspect that if C++ had had a static reflection for including in function bodies, which you don't have yet in C++ is 26, if it had had it 10 years ago, AI would be probably a couple of years further along today than it is. And the reason I say that is because when I was first looking at this, hey, AI is turning into a big thing. This is like even 10 years ago, way before chat GPT. And I talked to some researchers and like, okay, well, what languages are you using and why are you using them? And what I kept hearing was, oh, we're using Julia or we're using Python. We'd love to use C++. because it's faster.
Starting point is 00:35:24 It generates way faster code. We can actually do much better in the experimentation phase. We're using these other languages, like in particular, Julia, because it has auto diff, because we can write auto diff and Julia. And that's really important because it turns out that when you're optimizing models, you need to be able to find the gradients, the slopes of things, basically, am I getting closer to an optimum or not? It's really important, a really important question to ask.
Starting point is 00:35:52 And at the time, the one person from Cambridge who was mentioning, oh, that's why we're using Julia, and he said, and it sucks. I mean, Julia is great. He liked that. But the reason it sucks is because if only we could have done this in C++, we could be doing a thousand, a factor of a thousand more experiments. Now, so when you think about AI, we often hear about the two main phases being training and then inference. So training, you train it on data. and then we can have all sorts of ethical discussions on how much data and which data and all that,
Starting point is 00:36:26 but you train, right? And then inference was basically you run the model and you, like, chat GPT actually gives you an answer or whatever, right? But those are steps two and three. That's skip step one, which is experimentation, to come up with the model itself. And that, this is before you actually then train it on large volumes of data.
Starting point is 00:36:47 And it's that experimentation that was slowed down because it couldn't be done in C++. And so that's, I cannot prove, but it seems to me that we could have done orders of magnitude, more experimentation by running experiments faster than in slower interpreted languages in over the past like five years ago, 10 years ago, and gotten to the AI we're at faster if only we'd had reflection to have auto diff in C++ so that we could have used C++ to do experimentation. This is only going to accelerate now that we are going to have it. But it's interesting how a choice in one programming language can potentially affect an industry. And in this case, I suspect it did. We still got there. Don't get me wrong.
Starting point is 00:37:33 AI is still going fast enough, right? But we would have got there. We're always greedy. So I think we would have got there at least a couple of years sooner if C++ had had reflection. And that is a non-intuitive result or at least opinion, conjecture that I have. So that's really interesting. So but you mentioned a few things that we don't have at C++ 26, right, that we really need for this. One is to be able to reflect on expressions and statements and expressions.
Starting point is 00:38:00 Yeah. Function bodies, basically. And the other one that you mentioned in your first example was to be able to actually spit out code at compile time and like feed it back into the same compiler, right? Into the same translation unit. Yeah, yeah, yeah. So are we going to get those things in 29 and are there any other important things we need to wait for in 29? Oh, our focus in the next two meetings, so the next six months or so, is to ship C++-26. So I don't want to get out ahead of the skis.
Starting point is 00:38:30 It is premature to say what will be in C-plus-plus-29. Having said that, proposals to do these things already exist. So it is certainly possible that they may be in 29. But remember, we're going to take a decade to figure out all the things we can do with reflection. reflection. We have got a great running start. We have way more reflection in version one than we had ConstExper in version one. So there's a lot there already. But yes, generation in the same source file as you're compiling it is coming. And reflection on more of the language, including things like aliases, being able to know both the alias name and the alias type name, which we
Starting point is 00:39:18 don't currently reflect on, and things inside function bodies, like statements and expressions, those things will certainly come. It's probably premature to say whether C++29 or not, you know me, I'm an optimist. And hey, I cared about it enough that I went to implement it in a CPP front, which is not a C++ compiler, but it generates 100% pure C++ code. It is a proof of concept for what C++ itself can do. And I think we'll do. And that's all open source, right? Yes. Yeah, it's on GitHub.
Starting point is 00:39:53 So thanks, thank you again to Max Sagabaum, who wrote the two and a half thousand lines of code to do a, and he's still doing more. He thinks the total for a reverse mode automatic differentiation, the total will eventually be 5,000 lines. But he's already written two and a half thousand lines of compile time, CPP front, CPP2 code, which directly will work with as C++ code. You can pretty much do it line by line when C++ has reflection on function bodies. But he's using the CPP front support
Starting point is 00:40:29 for reflecting function bodies to do auto-diff. Thank you very much, Max, for that. So for C++-26 reflection, we do still have splices, don't we? To small amount of generation? We do, in fact, have a small amount of generation. So splicers, which are the square bracket colon and then colon square bracket syntax, we have that. We have that and use those usually in template four bodies, which is also a new feature in C++26,
Starting point is 00:40:57 to be able to talk about like my object dot, insert data member name here as we iterate over the data member names, like in a template four loop. That's indefinitely in C++6. and so we and we also have a very small amount of generation of runtime types so if we want to take data from compile time to runtime we have defined static array to find static string we have those and we have the ability to in just a little bit just enough and I show this in the talk in one of the first examples the ability to actually create new types anonymous mostly anonymous types, but at compile time in the same translation unit in C++26.
Starting point is 00:41:45 For example, the example that I show in the talk, which is by Dan Katz, so credit to him for coming up with the example, is using the hashtag embed to read a JSON file into a constexper data structure, and then compile time traversing the structure of the JSON and creating an actual new C++ type. that is exactly the same nested structs with the same member names and types as the JSON file. It's that much we can do. We just can't do general source code generation yet in C++26 in the same file. So that not only gives us much better serialization, de-serrealization kind of facilities,
Starting point is 00:42:30 but it kind of lets us stamp out any kind of bindings to at least simple C-Bos-S-types that we like, we want, right? So that seems very powerful. Which brings me to the third example. I said I would give you three examples. And this one's forward looking. It's like, gee whiz, if we can, if we can emit C++ code, we can emit Python code. We can emit JavaScript code. We can emit Go code. And I gave several examples live. And if we can, combining that with meta classes, being able to declare, hey, my class is a value, comma, Python accessible. or my class is at Interface, comma, Java accessible. Oh, let's say Java and Python.
Starting point is 00:43:15 You can say as many as you want. We might actually finally maybe be able to solve the problem we have today where the lingua franca of APIs operating system and other APIs is C. Today, if you want to write code that is callable from any major language, You have to provide a C interface. It is horrible that we haven't been able to uplevel that to at least talk about classes, encapsulated classes, and member functions.
Starting point is 00:43:50 There have been many attempts, com, corba, even the dot net CLR was intended to be a common language runtime. That dream fizzled. Nobody's been able to do it, which means that if, say, Rust and C Sharp want to talk to each other, they talk through C, unencapsulated raw data structs, raw pointers, explicit memory management, this is a terrible way to glom together a civilization. And yet it's our industrial strength duct tape that we use because it's the only,
Starting point is 00:44:19 C is the only thing that everybody can talk to. If we could get to a world where with C++ reflection, we can very easily at compile time just say, hey, make this C++ type, not every C++ type, but this one is an API type and just have it automatically run the metafunctions for exposing it to Python. to JavaScript, to go, to Rust, to all these others. And I live demoed this on stage. So they're prototype proof of concepts, but we live demoed this on stage, making something a C++S class available to Python
Starting point is 00:44:53 and using named arguments in Python at the call sites, and it works, making it available to JavaScript, doing the M-Script and binding, which also makes it available to TypeScript, by the way. So you get two languages, for the price of one there, and so forth, I think that is, if we are in the lifetime of my career, possibly in my own, my whole natural lifetime, if we are ever going to get past C
Starting point is 00:45:20 as the lingua franca, that is the only language that everybody can talk to, I think the only hope for that, and it may not work, but I think the only hope for that achieving that is C plus us with reflection, because only C plus us now has, and soon will have a reflection, capability powerful enough to mechanically expose its types to any language, and it's just code, right? So you can improve it and make it higher fidelity, and you don't need to have a special compiler or something proprietary. You can do it in portable standard C++S code. I think it would be one of the best things that C++ would have ever done for our civilization, even better than any of the safety work, which I think is also great in improvement, is...
Starting point is 00:46:07 If we can provide a way to up-level our civilization's lingua franca from C to anything that has encapsulated classes and non-raw pointers and non-manual memory management, anything is better than that. And I think C++ with reflection is the only hope I see. And I don't know if it'll work, but it's the only hope I see of getting there. And that would help every language, not C++. It's very rare that we standardize a feature in one language that can actually help the whole tech industry, like all programming languages. I think this is one of them.
Starting point is 00:46:46 And that's an example of why I think this is such a hugely impactful feature. Yeah. Thank you so much for this insight. It's really surprising and interesting and inspiring how deep this goes. When I think about reflection, you know, first I was thinking, you know, like probably most people about, you know, things like, yeah, enum to string or whatever. It's an array of struct of arrays, which is something you do a lot when you perform in code. And then if you think a little bit more, I think about, oh, wouldn't it be great that to like not have to do stuff anymore like the cute mock like compiler thing. But like what you're talking about just goes way beyond that.
Starting point is 00:47:25 And it's really, I'm really like just curious to hear like to see like I wish I had a time machine to see like what's going to happen five years, 10 years from now. Oh, I have good news for you, Timor. Yeah. You have a time machine. It's called wait. You are young enough that you will see it. Right. We are at a time machine right now that is going forward one day per day.
Starting point is 00:47:47 So, hey, congratulations. You're aboard. Yeah, yeah. I guess you're right. Let's just wait and see. It's going to be very exciting. Yeah, absolutely. Talking of very exciting.
Starting point is 00:48:00 We spent a lot of time on reflection, quite rightly so. it sounds like it is a real game-changing feature. Is that really all there is to C++ plus-plus-26? Is there anything else interesting that's going into it? Oh, sure. There's other important features. I don't mean to downplay those. It is a major release.
Starting point is 00:48:16 The two big ones, sender receiver, so a stood execution, which the way I summarized it, I think I don't know if I'm the only one who says it this way, but it helps me think about it is it's C++ as an as-sync model. to actually have a standard async model. Rust, other languages have been working on an async model. You know, it's important. Async is important these days, not just for cloud, but also for parallelism and compute.
Starting point is 00:48:45 One of the things that really impresses me, and I had time, my keynote at CPPCon was only reflection. So I had no time to talk about it there. But I did say this at your conference at C++LC, the thing that impresses me the most about the sender receiver model. once you get over the fact that you're writing you're writing pipe then a lot just think of it as dot then it's a very tall period just don't don't be put off by the functional notation you'll get used to it like whatever sorry i i i just find functional notation still non-intuitive but we'll get over it right we're adults we're professionals once you get over that the fact that there are two major sources of doing more than one thing at a time in a program. And I tend to use the terms concurrency and parallelism for them. Concurrency is where you're doing asynchronous stuff where you're hiding waiting and you're trying to be responsive and not block.
Starting point is 00:49:41 So you think you've got a, you're doing background printer, you've got a web service access, and you're trying to keep the UI responsive. So that whole category is about doing more than one thing at once and each can run at its own speed and they're not blocking for each other, right? So that's, the mantra there is don't block. and be concurrent. The other one is parallelism where you're using more cores or resources to get an answer faster.
Starting point is 00:50:08 And the difference is that in that case, the two things that are running at the same time are parts of the same computation. In the first case, they're independent things, doing might be computations, might be non-computations, just doing different things. In the second model, they are parts of the same computation. They're trying to quicksort the left and right subrange of the same data structure. And the quicksort isn't done until they're both done, right? So you're trying to use more resources to get the answer faster.
Starting point is 00:50:40 So the first example is more like threads and open MPI. The second one is more like CUDA. And the mind-blowing thing for me was that the people in the first category, concurrency, were coming to be saying things like, this is, this send-to-receiver model, execution is finally a great, a great model to use over co-routines, which is all about case one, pillar one, about concurrency. But at the same time, the Kuda folks are coming, which are the gold standard of parallelism, are saying, hey, this is a great model for Kuda and for data parallel GP-GPU computation. And in fact, we're going to do send-a-receiver
Starting point is 00:51:23 bindings over Kuda, and it's performance competitive. blown because I would not have expected to see a single framework be able to deal well with both of those two categories. Because, I mean, category one is like threads and co-routines. The category two is Kuda and I should have mentioned Open MP. They are in different worlds and that there is one model that so far so good looks like it will deal with them both well is great. In fact, one of my learnings, so I joined Citadel Securities, I'll probably say SITSEC for short, because it's a mouthful about a year ago. And one of my learnings, which I did not realize until I joined, was that they were already
Starting point is 00:52:08 using Sender Receiver, their own implementation, to trade an entire asset class. And we're currently working on expanding the use of send a receiver more broadly in SITSEC without waiting for the standard. And of course, it'll be nice when finally we can use one that's built into our compilators and stop maintaining our own version. But the fact that it's so useful and we're using it in production was great news for me. So I think that's one of the major features. The other one, of course, is contracts.
Starting point is 00:52:42 And we're going to have lots of discussion a few weeks from now in Kona. There are some who say, well, we should wait a little longer. There are some who say, no, we have consensus. There's no reason to revisit that to change our minds, rather. where we will talk about it, but to change our minds. So we'll have that discussion and see whether there's new information. As I already said at the beginning, the main new information that I know of for contracts, and this isn't to put down others' concerns.
Starting point is 00:53:08 We will absolutely listen to all the concerns, including ones we've heard before, to make sure we didn't miss anything. And maybe we shouldn't ship contracts yet in C++26. We'll discuss it. But to me, the major new information I know of is that it's like we've been to, at the top, it's so great for AI as well as for static analysis tools and for human beings to have contracts on function signatures. And that's independent of how you enforce them. Like, that's just declaring your intent is just always, if there's anything I've learned from language
Starting point is 00:53:42 evolution, it's giving people a way to declare their intent is just always the right thing to do. And then, of course, you need to do it in the right way and with the right knobs and syntax and so forth. but meta classes the range four loop contracts like over and over that's the path to simplification you can add a thing to the language so the language is now bigger technically more complex because there's another thing to learn but code is simpler the actually using the language is simpler because I can say what I mean instead of saying oh here's a rest of a how to recipe for how to get the effect please mr static analysis tool go figure that out and and figure out what I meant. No. Your code is much better for tools, for optimizers if you just declare your
Starting point is 00:54:30 intent. And of course, it turns out for humans. And for AI, as it turns out. So that's just a very important principle for language development. Couldn't agree more. Yeah. So fair to say, C++ plus 26 is going to be a bigish release. It is going to be an impactful in a good way, Not in the asteroid hitting the surface way. The other impactful release, yes. And again, this is not to take away from other features. There are many smaller features, many bug fixes and issue resolutions. There's a whole ton of work.
Starting point is 00:55:10 And so thank you again to everybody in the committee who's been working these last three years on this release. And longer, for many of the features that have been baking for longer, but that have landed in this three-year release. it is just a really great release and the fact that we're going to be able to do so much is good it's a major release on its own but not to take away from that but reflection dominates everything i mean if you if you combine c plus us 11 14 17 20 and 23 and 26 into one release i would make an argument that reflection would dominate the rest even then so that's not to say that all the other things we did aren't important they sure are but you just we just cannot
Starting point is 00:55:57 I think overstate the impact of reflection we we just we don't know it's going to take us 10 years to figure out what we can do with it in in a good way right so so we now have two more meetings to kind of do last minute buck fixes and things like that and then we can we will ship 26 right which will have reflection in it yes at the at the London meeting which Phil, you are hosting. Thank you very much. And we're working on sponsors as well, so there'll be co-sponsors to announce as well. And by the way, I was told, I'm calling it the London meeting. And I'm told I should call it the Croydon meeting. Well, that is in London, isn't it? It's a bar of London. But we tend to, London's very big. So I'm going to say the Croydon, common, London, UK meeting. I still need to update the website. Yeah, it's just outside of central London, but within greater London. So you can use either.
Starting point is 00:56:53 And so we'll be looking forward to that. So that's going to be in March, right? And that's when we're going to ship 26, hopefully. So another thing also on the committee will happen. Not only I be shipping another standard version, and we're going to start working on 29 afterwards. But also, Herb, you have been the convener of the CSS committee for many, many years. Now, how many years has it been, Herb? It's been 22 years. I started 23 years ago, and a PJ Plogger did it for a year in the middle of 2008, 2009, and then I resumed again, you know, when he retired again. So, yeah, it's been a while. Some people have been asking me, well, why are you retiring? And I'm like, it's like, why haven't I retired already?
Starting point is 00:57:40 this is a very long time. It's like eight terms already coming up on eight terms. One was a partial term. And like it's time for somebody else to do it. None of my other responsibilities are changing. I'm continuing to chair and be chair and CEO of the Standard C++ House Foundation. I'll still be running CPPCon. I'll still be in the committee.
Starting point is 00:58:03 In fact, one of the reasons I'm doing this is because it takes a fair amount of time to do all the administrative and organizational stuff. that it is associated with the helping a committee run. And I want to spend more time writing technical papers. So this is so I can spend more or not less time in the committee and on technical things. Oh, now we're talking. Do you have any particular technical papers coming up potentially that you can share? Yeah, reflection and safety are going to be two of my focus areas in the near term.
Starting point is 00:58:34 And so that's, I want to be to have more time to work on those. things. I don't plan to be diminishing my work in the C++ committee for the foreseeable future. I'm nowhere close to retiring. So it's like, I can't tell you enough. You might have noticed that I'm looking forward to a super exciting decade for C++. And it's going to be a great ride in the committee. So there doesn't actually seem to be much left, but we'll ask our question anyway, which is, is there anything else going on in the world of C++ right now that you find particularly interesting or exciting? Usually at this point, people say, oh, reflection, but we've got to cover that. Well, a lot of others say that, but let me, I'll just add a postcript to what I just
Starting point is 00:59:19 said, because the incoming chair to replace me, the new convener starting January 1st, for the London meeting will be their first meeting, is Guy Davidson from UK. And also the guy's intention he just announced a day or two ago in the committee is, because there's a lot of work for one person as convener. I can tell you that. A lot of organizational and administrative stuff. He's quite wisely. And I've already been sort of working with one or two folks informally, but he's going to do much more of that. So there's actually going to be sort of a group of three. So with Guy Davidson as being the main responsible person, but also working with Nina Rans and Jeff Garland to as vice conveners,
Starting point is 01:00:12 co-convenors, I forget the exact term he's using. It's one of those two. So I think that'll be great because it's good for succession planning. It's good for avoiding burnout because it's a lot of work for one person. And so I'm looking forward very much to enjoying being a participant, under their leadership and them doing all the administrative work starting in the new year. Yeah, good luck to Guy. He's a great guy, and that joke is just never going to get old.
Starting point is 01:00:46 I never even thought about that. All right, yeah. I know, I know, and the French speakers are going, it's ghee, it's ghee. That joke doesn't actually work, but it works in English. So we'll take the Anglo-Saxon view of it. All right, so we got new leadership for the SuperSoslav. committee and we got an exciting decade with amazing things you can do with reflection. So it sounds like a world I want to be part of. So that is that is
Starting point is 01:01:12 really good. Well, thank you so much, Herb, for taking the time to talk to us and share you ideas. I think this was amazing and quite inspiring, actually. So thank you very much for that. Well, thanks for having me on. It's been it's a it's a great time to talk about the bright future we have. Is there anything else you want to mention before we wrap up? Is there anything you want to tell people or maybe where they can reach you if they want to talk more about CPP Front or reflection or any of the other things? The CPP Front is on GitHub. We should put a link to the CPPCon talk in the show notes. So I'll send you that when we've wrapped up. Yes. And continue being excited about C++. When you do look at that keynote, I'll leave your listeners
Starting point is 01:01:55 with one thing. I hope that you are not only blown away by, oh, look at all these different things. we can do with reflection. Most of all, though, I want that to prime your imagination, to think of things we haven't thought of yet because we still need to discover. And we'll all be, including you, dear listener, will be discovering things we can do with reflection. And I look forward to seeing even better ideas that other people come up with. It's an exciting new frontier.
Starting point is 01:02:23 I already have a list. Yeah, so thanks again for having me on. And it's been a pleasure. And I'm sure we'll look forward to more news. your next podcast. Thank you hope. Thanks so much for listening in as we chat about C++. We'd love to hear what you think of the podcast.
Starting point is 01:02:39 Please let us know if we're discussing the stuff that you're interested in. Or if you have a suggestion for a topic, we'd love to hear about that too. You can email all your thoughts to feedback at CPPcast.com. We'd also appreciate it if you can follow CPPCast at CPPCast on X or at Mastodon at CPPCast.com on Mastod and leave us a review on iTunes. You can find all of that info and the show. notes on the podcast website at cppcast.com. The theme music for this episode was provided by podcastthems.com.
