Algorithms + Data Structures = Programs - Episode 119: Why APL & Haskell are AWESOME with Zach Laine! (Part 3)

Episode Date: March 3, 2023

In this episode, Conor and Bryce talk to Zach Laine about APL, Haskell, the problem Three Consecutive Odds, and why C++ developers should learn other languages.

Link to Episode 119 on Website
Discuss this episode, leave a comment, or ask a question (on GitHub)
Twitter: ADSP: The Podcast, Conor Hoekstra, Bryce Adelstein Lelbach

About the Guest
Zach Laine has been using C++ in industry for 15 years, focusing on data visualization, numeric computing, games, generic programming, and good library design. He finds the process of writing bio blurbs to be a little uncomfortable.

Show Notes
Date Recorded: 2023-02-16
Date Released: 2023-03-03
ADSP Episode 117: OOP, C++ Containers, APIs, EOP & More with Zach Laine!
ADSP Episode 118: C++ Allocators with Zach Laine! (Part 2)
APL
BQN
C++98 std::count_if
Anamorphisms
C++20 std::views::split
C++23 std::views::chunk
C++23 std::views::chunk_by
ADSP Episode 115: Max Gap in C++23
ADSP Episode 116: Max Gap Count in C++23
C++98 std::adjacent_difference
C++23 std::views::adjacent_transform
Three Consecutive Odds
C++98 std::transform
C++17 std::transform_reduce
C++23 std::views::adjacent
C++23 std::views::slide
Haskell fromEnum
ArrayCast Episode: Michael Higginson, 2022 Dyalog Contest Winner
Reverse Polish notation
P2672 Exploring the Design Space for a Pipeline Operator
Duolingo
Daniela Engert Duolingo Streak
Category Theory for Programmers - Bartosz Milewski
C++23 std::views::filter
Collection Oriented Programming

Transcript
Starting point is 00:00:00 I mean, clearly they should be called a scanduction. Scanduction. That is actually pretty good. And that's the thing is, I don't know, but this pattern... Scanduction is very good. That was 100% joking. Do not call that scanduction. Please do not call that scanduction. Welcome to ADSP: The Podcast, episode 119, recorded on February 16th, 2023. My name is Conor, and today with my co-host Bryce, we interview Zach Laine.
Starting point is 00:00:37 This is part three of a five-part interview. We talk about why APL and Haskell are awesome and why C++ developers should be interested in learning other languages. Let's switch to APL now, though. APL. So the question was, why am I so enamored or obsessed with the language? Yeah. So what in particular? Because I know you said it's your favorite language, but when you say something like that, what are the things you're thinking of that you love about it, that make it so
Starting point is 00:01:07 singular for you so i my top five languages is constantly evolving number two actually now is and there is a new apl quote unquote on the block called bqn oh hang on hang on for for for for all intents and purposes like for for for non-array programming, like for regular civilians like us, Connor, the distinction between APL and BQN is like if BQN is your favorite programming language versus APL, to us it's like it's all APL. I mean there's a few array language listeners that appreciate the difference, but you're correct. So is that like a nerdy gag that each letter is this sequence letter? Yeah, yeah. It actually was supposed to be, if it's plus one on each, it should have been BQM, but they messed it up.
Starting point is 00:01:53 So now the joke is that it's Fibonacci plus APL, which I think is very funny. So what do I, I mean... I can give my analysis as to why you like apl go ahead bryce um i think connor's one of connor's primary programming interests is the design of algorithms not the design of applications of like full applications or libraries connor is interested in like how do i how do i construct an algorithm to solve this particular type of problem like of like full applications or libraries. Connor is interested in like, how do I construct an algorithm to solve this particular type of problem?
Starting point is 00:02:29 Like what's the best algorithm for tackling this problem? And for a question like that, a language like APL offers a great deal of simplicity and elegance. And I think Connor has this, um, this desire for very elegant solutions to, to these problems. Um, and so when Connor says it's his favorite programming language, he doesn't mean like he's going to go like write, you know, like an application in APL, um, because that might be a little bit more challenging.
Starting point is 00:03:07 But what he means is that like, he loves solving these algorithmic problems and that APL is a great tool for that. And APL-like languages. Am I close, buddy? That's not entirely wrong. I would agree. I mean, at some point,
Starting point is 00:03:23 I do plan to start building larger projects in these languages to really find their sharp corners. But like, there's like a list of 10 reasons why I love these languages. And I've also started in the last couple weeks to appreciate that like these languages are really misunderstood. Like, everyone, sure, they make fun of the glyphs and whatnot but it's it is a very like the glyphs aside it is a notation and it doesn't even really need to be the notation it could just be a wordified version of it but like it's the best way to think about solving problems and honestly like i had this path from C++ to becoming really familiar with the C++ algorithms and falling in love with them. And there's a lot of, when you start to learn
Starting point is 00:04:12 about, you know, count if and partition, there is, you are, you unlock the ability to suppress unnecessary detail. And like, there's this subordination sort of, of, you know, looping and indexing and off by ones, you know, you have count if you pass it a unary predicate in the form of a lambda. And that's so much more elegant and expressive, in my opinion, than compared to writing a hand rolled for loop. And then I went from sort of C++ to C++ algorithms to Haskell, where, you know, the same expressing the same kind of idea with sure worse performance, but like, you know, writing a Lambda or composing functions, you know, especially compared to before ranges and C++ 20 is just so much more elegant. And there's a richer set of those
Starting point is 00:04:56 algorithms. You know, that's one of the things is that with the C++ algorithms, we have lots of reductions and lots of maps, you know, things that go from a list of values to a value as a reduction and things that go to a list of values to a list of values as a map. But there's this third category of algorithms in my kind of mental algorithm taxonomy called splits. You know, I used to refer to them as anamorphisms, but for reasons I'm not going to get into, it's actually a bad description. So splits go from like a list of elements to like a list of list of elements so like stood ranges views split is a great example but we're getting a bunch more with chunk by chunk uh you know i think there's a couple others coming but the point is is like
Starting point is 00:05:36 those exist in haskell and so like my mind was just blown away by haskell it like increased my ability to like solve problems and think about problems. Like it blew my mind by like an order of magnitude. And then like, so I went C++ to C++ algorithms to Haskell. And then I landed on APL. And APL, it just, it adds another order of magnitude for your ability to like play around with solutions. I can code five different solutions to the same problem. Like
Starting point is 00:06:05 I'm working on a talk idea or YouTube video idea, which is going to be the eighth problem. So in the last couple of episodes, we talked about problem number six and seven, max gap and max gap count. And now there's a new problem that I've discovered, which is- I was shocked that Bryce could not get adjacent different. I was just like yelling at the screen. I was like, adjacent difference! What are you doing? Adjacent difference with a custom comparator. What are you doing? The current wording in the standard for adjacent difference, I wrote all that word.
Starting point is 00:06:35 Oh, really? And you still didn't get it. That's great. And we have adjacent transform now, which is the better named because adjacent difference is one of the worst named algorithms because it encodes the semantics of the default binary operation. And to pause on before I get to problem number eight, I've like, so upon thinking about Bryce's solution to the max gap count, that is just a single reduction. I've come up with a word now for this pattern where you can use basically
Starting point is 00:07:01 a scan followed by a reduction, which doesn't actually fit into the max gap count one, but it is for Cadain's algorithm and for max consecutive ones. And it's called a double fold, which like there'll be a whole talk or YouTube video on this, but it's the idea where you have like a... I mean, clearly they should be called a scanduction. Scanduction. That is actually pretty good. And that's the thing is I don't know, but this pattern, it's... Scanduction is very good. That was 100% joking. Do not call that scanduction. Please do not call that. This clip is going to end up in a talk when I'm talking about the double fold that can be used.
Starting point is 00:07:39 But yeah, it's... Anyway, so I don't want to belabor this point, but the next problem eight is three consecutive odds. Does a list of integers contain three consecutive odds? And the beautiful thing about this problem is when it ties in beautifully to some of the other problems from one to seven in this list of problems I'm coming up with. But there's three separate like categories of ways to solve this. One is just by doing not an adjacent transform now. Actually, you can do an adjacent transform because you can pass the number of elements you look at as a template parameter. But to me, I would just do that as a straight up left fold, right? I mean, I would pass in a tuple that says, here's the current number and here's how many odds I've seen so far.
Starting point is 00:08:27 And at the beginning, it would just be 0, 0. And then you proceed through and you say, like, do I see an odd? Oh, great. Then I'm going to return that number and then increment the other number by one. Right. And so then the next iteration, I would say, do I see the same number? Is it also odd? Or do I see another odd, I should say, right? And then and so then the next iteration i would say do i see the same number is it also odd or do i see another odd i should say right um and then uh and so on right i i guess the first one
Starting point is 00:08:50 doesn't need to be the actual number just need to be a boolean like i just saw an odd right yeah um but but you know what i'm saying so so like i think and and one of the the i think problems with modern computer science education is there's not enough emphasis on like the name of this podcast, right? That the way to decompose problems is algorithmic data structures. And the algorithms part should be algorithmically like should be, should be thought of as an algorithmic decomposition problem as much as you can. And fold is right at the head of the list.
Starting point is 00:09:33 Like the number of things you can't do without fold that are a for loop is kind of small. Yeah. Like it's, it's not, it's not empty, but it's very small. It is like,
Starting point is 00:09:43 I have a, in my algorithm and taxonomy of like reduction, like folds, maps and splits, like reduce or fold is the like the most elementary of the elementary algorithms. Because you can technically implement like the maps, the splits, everything in terms of folds where like your binary operation is just sort of growing this list of state. But so exactly your solution is the double fold where you basically have two binary operations and like you can simplify three consecutive odds into not maximum consecutive ones. But if you basically, if you turn your list of integers into a list of ones and zeros where they represent whether it's an odd or not, you now basically have a different flavored maximum consecutive ones
Starting point is 00:10:30 where it's not max, it's like, is the maximum consecutive greater than or equal to 3? This is just a transform reduce. So what you described is a two pass and I'm saying you can do it with a fold Yeah, that's what I'm saying as well,
Starting point is 00:10:45 is that you can, a double fold is a double fold is maybe poorly named, but it actually is only one pass. Oh, it's a, it's a, it's a one pass algorithm that successively for each element in your sequence applies to different binary operations.
Starting point is 00:11:00 And that two pieces of state, that's the key point is that you're, you're carrying your current number of odds you've seen so far and the global maximum of odds in a row you've seen. So you're always trying to – you always – max is your second binary operation. And your first binary operation you can do with something called a phi combinator. But basically it's your current so far, and you always multiply it by whether it's an odd. So if it's not an odd, it resets it back to zero um so so it's just a the the way that i'm thinking of doing this in my head and i'm thinking and thinking in particular about the an implementation that you can paralyze
Starting point is 00:11:38 and in particular one where you can reorder the inputs um and what i have in mind is you do a transform where it's taking in three input sequences. The first is zero to n minus two, the second is one to n minus one, and the third is two to n. And then that returns you whether or not all three of those consecutive elements are odd, and then the second one is just a plus reduction. And the way to do it that way is that then you can just call like a, you know, a C++ standard transform reduce with a parallel algorithm policy, and it's fine if the input gets reordered because the input is essentially a pair of the three of the consecutive elements. Right. Yeah, that's interesting because, I mean, my goal is always for radical simplicity
Starting point is 00:12:37 wherever I can get it. So to me, it's just a fold. I don't want all this other stuff. It's just a fold. What you just described is interesting because then you make it embarrassingly parallel, and you can just go out and do it in chunks. That's really nice. That is what they pay me for.
Starting point is 00:12:52 In spirit, like this now covers kind of two of the solutions. So you encoded the second solution in a transform reduce. So this double fold or a scan followed by a fold, the one that basically Zach explained is the first way to solve this. One of the three categories of ways. The second way is what you solved it. And I categorize that as the slide. So like you could also do this,
Starting point is 00:13:15 wouldn't be as performant, but do a slide or an adjacent, which looks at three at a time, turns those into booleans or ones and just checks, are any of those, you know, sliding windows of threes true for all odd?
Starting point is 00:13:26 Then there's a third solution, which is basically turn everything once again into zeros and ones, one being an odd or a trues and falses, and then do a split or what they call like a chunk by in C++23. And you're looking at two adjacent elements and you want to split by them being not equal to each other. And then you're basically getting a list of lists, which are all consecutive ones, all consecutive zeros. And then you can either do reductions over each of those sublists or filter out the zeros and then just map the lengths over all the sequences of ones. And the point is not like which one of these are more performant. The point about why I love APL and BQN is like these languages enable me so quickly
Starting point is 00:14:08 to think about solution one, solution two, and solution three. And it gives you this ability to have a flexibility of thought in terms of algorithmic thinking or thinking about different solutions. And to a fault sometimes. I have such a strong array brain that I forget when I go to Haskell that like, I shouldn't be trying to turn things into lists of just like the way that I am able to think about problem solving, it's like there's this way, this way, this way. And that was, we had an interview on my other podcast with an individual by the name of Michael Higginson.
Starting point is 00:14:52 And he articulated something so well that like sometimes, like there's different reasons that you write code a certain way. Sometimes it's for performance, but sometimes it's for readability. Sometimes it's for the ability to refactor something in the future. Sometimes it's, you know, refactor something in the future. Sometimes it's, you know, you're putting more, like there are different reasons to write code different ways. And I think in C++-topia and Rust-topia, like we're always about perf, perf, perf, perf, perf. But like array languages, especially APL and BQN aren't necessarily like focused with like perf at like, at is number one. And, and it's so just just it just like changes the way you can think about
Starting point is 00:15:26 things and on top of that like there's a bunch of other reasons like array languages aren't at all known to be combinator languages like languages that provide you with primitives for composing functions and like when we think about composing functions in c++ or python or whatever people think about like the bash pipe model where you like you pipe one thing to another thing. And like that is like you composing unary functions is like a very, very simplified model of what's possible with like different types of function composition. What if I want to compose two functions? The first one takes two arguments, evaluates that, then you have a single argument, then you pass that to another function that's a unary function. How do you do that? Well, you're going to have to write some custom function in
Starting point is 00:16:09 C++. But like in languages like Haskell and BQN, they have primitives where you can just have a binary function on the right, a unary function on the left. You put your what's known as the B1 combinator in between, and then like it just automatically composes. So like what's really important in that composition is just the two functions and the pattern. The name of the arguments isn't important. Like the braces defined or like the parentheses to define a Lambda, all of that stuff is just like unnecessary. And BQN and APL are like the only two languages and maybe Haskell that like provide you with this ability to enhance the way you think about function composition, enhance the way you think about solving things algorithmically. And there's also just on top of it, like, especially with APL, like it's unbelievable
Starting point is 00:16:53 the amount of thought and beauty is put into the glyphs. Like people make fun of them all the time, but it's like making fun of Chinese because you don't speak Chinese. It's just like, you know, or like, you know, there's a quote that's like, would you say you don't, like, Russian poetry is not understandable because you don't speak Russian. It's like, no, you would just acknowledge that you don't speak Russian. Whereas no one does that with APL because every language for the most part looks similar enough, you know, Python to Java to C++ to Rust. Like they look similar enough that, you know, it's like, oh, well, this one looks so different. It must be unreadable. When in my opinion, like the amount of time that it takes to read, you know, it's like, oh, well, this one looks so different, it must be unreadable. When in my opinion, like, the amount of time that it takes to read,
Starting point is 00:17:33 you know, two or three lines of APL, which the equivalent in C++, you know, with like template arguments and, and just like, you know, everything. It's just like, I think, I think APL is actually much easier to read. Yeah, C++ has no great claim to readability. Although you and I have had this discussion in the past, though, I one of the reasons that I find it claim to readability although you and I have had this discussion in the past though one of the reasons that I find it hard to read your APL is because there's all these order of operation things that you prefer to leave implicit
Starting point is 00:17:54 in your APL code and I think would be a lot clearer if you just put the parens that you feel that the parens make it less elegant so in certain cases that is the case but like I also had. I mean, so in certain cases, that is the case. But like, I also had this, I mean, this has come up now once or twice across the two podcasts that I have that like one time at work, I pointed out that like in APL, there is not
Starting point is 00:18:17 really any precedence. There is some precedence between operators and functions. But like, if you just have a series of functions, it's just linear. It's just right to left, which you can argue is the wrong order, but it's just like, it's just, if you have like one plus two times three divided, you just process right to left. Whereas in school we teach bed mass or ped mass, whatever country, like, and it's just like, I brought this up in a meeting once and got like, got like, heck, I don't know what the word is, heckled or whatever. And people were just like, whoa, whoa, whoa. Like, how can you say that the APL model is better? And I was like, because there's no
Starting point is 00:18:46 model, like there's no rules. It's just right to left. Every function has the same precedence. Like you just read it in a line, whereas you have to memorize this like hierarchy of binary operations. And, and it's like, yeah, but my issue is that like, like if I'm a newcomer to APL or BQN, I don't know which things are binary operations versus unary operations versus input or something. Then it may not be clear to me that, oh, there's some nested operations being called here. Whereas with the parens, it would help me at least understand sort of the structure of the calls that you're composing. So, I mean, I think there's an argument to be made there for sure. I mean, I agree with what Connor is saying because having written – I mean, if you've ever written an interpreter, you know that as soon as you put stuff in like RPN, like reverse Polish notation.
Starting point is 00:19:42 I'm not sure why the word – why the country of Poland is in there. We can ask for more. Whenever you write stuff that way. Get back some time. Do you know why RPN is called RPN? Just real quick. Yeah, so as soon as you do that, you have operand, operand, operator, operand, operand, operator.
Starting point is 00:19:57 And it's just, it looks like the stack. It looks like exactly what the stack is doing all the time. There's never any ambiguity. And it's hard to read in the same way that all code is hard to read all all code idioms are hard to read the first time you encounter them but i have a feeling that like we all learn to write code that way it would be very simple to adjust to very quickly and you have no ambiguity i think there's something to that because i remember once i got used to it seeing it you know essentially i was at a point where i was like trying to figure out what's wrong with my interpreter. And so I was looking at lots of these strings of, of RPN values in their
Starting point is 00:20:31 operations. And I got used to looking at them and understanding exactly what was going to happen at all, at all steps. And then, you know, debugging based on that understanding, but it did take me a while to get there. But once I started doing it, it was like very natural. Yeah. It's the same thing with, I think LISPs. People, people People talk about how LISP is the lots of silly parentheses, et cetera. But like, there are no more parentheses in a LISP program, relatively speaking, compared to like Python or C++. It's the same number of parentheses to invoke a function. It's just they move it from right before the argument list to right before the name of the function. Yeah's still two parentheses and a lot of cases because of enclosure they've got a language feature or macro called uh thread first and thread last macros which is similar to the
Starting point is 00:21:16 pipe operator that's being proposed by barry for c++ 26 it's like you actually get you can avoid like the main thing that i proposal is proposal is actually DOA now I think. It's DOA? What is that? I don't think – I think he's dropping that proposal for the pipe operations. What? Yeah. No.
Starting point is 00:21:32 He has a better model he's pursuing instead. Oh, okay. That subsumes the same functionality you would get from the pipe thing. But the pipe is like too limited. Don't do that to me, Zach. I was like i know i was when he told me that like he told me this is the meeting like you know last week or whatever and i was like what but i love the pizza give me the pizza you know it was like the
Starting point is 00:21:53 paper was so cool and i loved all the semantics in it and he was like yeah yeah but we we can do better and he starts describing this other thing i'm like yeah yeah screw the pizza i want that like we can do better but the question is will we get better i i have an answer to the reverse polish notation question which is that it's named for this polish mathematician jan lucia i i'm not i i cannot pronounce i mean that's what's called you already offended the polish people right that's why it's called Polish and not that dude's name because it's impossible. My phone is going to do it. Jan Łukaszewicz. Of course.
Starting point is 00:22:31 Yeah, Polish. Like I said, Polish. That's why we call it Polish. I speak a little Polish. I've been learning Polish. Wait, wait. Oh, come on, Connor. Don't do this to me.
Starting point is 00:22:40 You speak a little Polish. I speak actually quite a bit of Polish. What would you like me to say? You know, I downloaded Duolingo. That is the extent Polish. I speak actually quite a bit of Polish. What would you like me to say? You know, I downloaded Duolingo. That is the extent to which I speak. I got to say that app, just a little tangent here, beautifully designed. It's so addictive. I saw that.
Starting point is 00:22:56 Was it Daniela? Was 1,700 days in a row in Twitter or something like that? I might have it wrong. But, I mean, I haven't missed a day since i've uh since i've started and uh yeah how are you learning polish oh you know i was in poland i'm going back to poland in june for uh the lambda days conference we'll come with you okay we're well we got to go to slovenia um i'm i'm planning on being in europe for all of june i'm gonna be in europe for pretty much all of june too yeah so we're hitting up slovenia june 2023 we promise listeners i'm not sure if you heard
Starting point is 00:23:29 that uh episode zach but um oh you should come to vienna with me i'm gonna be in vienna for two days on my return trip maybe i mean that's on the way to uh slovenia is? I mean, they're both in Europe, sure. No, I mean like between Poland and Slovenia. I'll be landing in Poland and then going to Slovenia. And apparently Slovenia is a huge running country. So I'm going to try and run a run while I'm there. You guys know Gajpar is from Slovenia too, right?
Starting point is 00:23:58 Yes. He's the only person I've ever known from Slovenia. I did not know that. He lives in London, does he not? Or outside London? Yeah, he does. I think he does now, yeah. Yeah they they're um it's got a house in london now or it part of the house there's remodeling or construction or something they just bought
Starting point is 00:24:13 half the house we didn't how do you knew you knew that gasper was from slovenia and you didn't reach out to him because we were number eight in slovenia we were trying to get the number one i i knew i just I hadn't put those facts together. Yeah. Well, Slovenia is really interesting. So Gajpar told me this one time that the guideline for, you know, many countries have different entrance requirements to becoming a citizen, right? In Slovenia, the entrance requirement is, do you speak Slovenian? Like, that's it. You take a standardized test. If you can speak Slovenian, you're a citizen. Like, because it's so important to them to keep it because, you know, like it's a small European country, not that many people speak it outside the country. Every year, the
Starting point is 00:24:53 number of people that speaks it in the world gets smaller. And so they're trying to preserve the language. And so that's their whole requirement. So anyone can come, they're taking all comers. If you can, if you can learn Slovenian after a certain amount of time in the country boom you're a citizen i thought that was really interesting i wonder if i can get dual passports maybe i should start learning slovenia too i will i will say the polish has been getting a bit difficult at whatever level i'm at so uh so okay but before we move back to bryce's thing i just want to say like based on what you're saying the apl i because this is something i want to put a pin in we we didn't have a chance to get back to it. So, you know, I started in basic,
Starting point is 00:25:27 like just like in grade school kind of thing. I had access to a computer very young. So that was what was around. Also basic came with early versions of DOS. It was just like kind of built into the shell. And then I went to C++. I basically have not done anything else seriously, but I will say like when I was in university
Starting point is 00:25:44 and I learned Haskell, there was that same thing you're describing with learning uh apl was having to do everything functionally when i've been doing everything procedurally from the get-go and and every step in between it expanded my understanding of what programs were and how i could write them and how i could structure how i could think about them. Like the, the, the, the expansion of reasoning capacity from that was dramatic. And so that's really interesting that you're getting the same thing out of APL. I find that really interesting. We call C++ a multi-paradigm language, but there are other paradigms outside of it that I think are also extremely useful. Yeah. And I think like, I think that's the other thing. A lot of people bounce off of Haskell as well. And like, I'm not a Haskell expert by any means, you know, I don't, I'm not, you know,
Starting point is 00:26:28 Monad is a monoid in the category of endofunctors. You know, I, I read category three for programmers, you know, I've maybe understood a third of it. And like, I don't know the, the more, I don't know, advanced topics in Haskell, but like, I think people should just go to Haskell just to learn how to write like the equivalent of, you know, account if, or like, you know, if you want to filter odd lists from a number, it's just like solution equals filter space odd. Like that's it, you know? And like, what's the equivalent of that in C++? First you need C++ 20, then you need to write a function, then it's return stood colon, colon views, colon, colon. If you know that views is a shortcut for stood colon, colon ranges, colon, colon views, colon, colon filter, paren your list comma, or you can pipe it in. You can go list and then pipe to that.
Starting point is 00:27:14 And then your Lambda bracket, bracket, paren auto E N paren brace return E percent to semicolon and brace N paren semicolonolon and then finish writing your function like how is rant brought to you by the python foundation and it's really not selling the language very well i just like don't get me wrong i'm so excited about real modern like c++ 23 and c++ 26 c++ because the kind of code that i can write is getting like very close to yeah you know the kind of functional things is can write is getting like very close to, you know, the kind of functional things. Is it as elegant? You know, is it as, does it subordinate as much detail? No, you know, are the namespaces a bit irritating? Yes. But like, it's, it's so far from what was possible in C++11. And I just think that like going to Haskell and just learning how, like how to compose
Starting point is 00:27:59 a few basic functions really gives you an appreciation for like what could be possible in C++ and why like C++ ranges is so important. And like, we talk about ranges all the time and like never talk about the fact that like it is, it falls into this category of a paradigm of programming, which is not really widely known called collection oriented programming. And it guarantees basically like single iteration over your sequence. So like when you compose a bunch of things, whereas if you were doing that in C++11 without composing things, you're mapping over, iterating over these sequences, you know, however many times you call these algorithms. And so you're going to get like a coefficient of however many algorithms you call versus when you pipe these things together in
Starting point is 00:28:38 ranges, you now have just like, oh, and no coefficient. It's just, yeah, I'm very excited. Yeah. I think I agree with everything you just said i think there were other things that i got out of haskell in addition to that i think the algorithmic stuff was like the elegance of which you could do like things with list comprehensions and just like fold algorithms like simplified algorithms in terms of of a kind of a fold that was great but then there was also just a generic programming part where i can say like you know type inference yeah the type inference and like uh using types as cases in inside of an algorithm, like, oh, I know if I, if I have a list of things, I want to do this. If I have a scaler, I'm going to do
Starting point is 00:29:13 this other thing. Right. And, and just the ability to treat the input data as like one of a family of things that are isomorphic in some way. And then I deal with that family of things different only at certain steps of my work. Right. That that's that was kind of a revelation to me because you know and I think at that point I really hadn't been introduced to generic programming as a real like fully baked concept that would come later but even then if I think even if I had that as a foundation and I went over to Haskell and saw that I'd be like you know wow I want this instead and I think there's lots of things where you where there's lots of sort of radical proposals that come in front of, especially the language part of the committee, where people say, that doesn't feel like C++. That's a little weird.
Starting point is 00:29:55 I don't like that. But I think there's a lot we can steal from other languages and make our languages that much better. And I think a lot of the things you're talking about are really important to me. I think having like a Rust traits-like system, which is the thing that, one of the things that Barry's working on, I think that would be incredible.
Starting point is 00:30:14 And I think probably having a safe mode that basically is isomorphic with the Rust borrow checker is, if it's not necessary, it's damn near necessary for the future health of the language, right? I think if we had something like that, then I don't know why I would use Rust unless I was just, you know, fluent in it in the way that I'm fluent in C++ and don't want to change. But I think that we can do a lot towards embracing a lot of these paradigms from other languages but i think that there's we have this installed base of code and semantics that are very specific to our language and it's hard to shoehorn things in so it becomes it becomes a constraint solving problem because
Starting point is 00:30:54 you know you introduce a new semantic and intersects with all these and other semantics we have already and that becomes difficult yeah but my concern is that the problem isn't that the problem isn't that people are writing insecure c++ code now the problem is that even if we give them the tools to write you know memory safe c++ code today there's still billions of lines of existing c++ code out there that aren't aren't going away yeah Yeah, that's always a problem. And the folks at Google, they would make claims, especially Titus would make claims like, well, you just write a rewriter that just fixes your code.
Starting point is 00:31:38 And I wish we had a better story for that in C++, but it's so hard to parse C++ code so that automation can understand what it parsed hard to parse C++ code so that automation can understand what it parsed and then fix things in a way that actually makes sense. And I think it's a lot harder than, I mean, the reason Google can get away with saying stuff like that is because they have like, you know, whole teams of people that do exactly this task. And so if you have that kind of scale, then often you can do these kind of interesting
Starting point is 00:32:03 transformations, but the rest of us usually can't. Be sure to check the show notes for links to everything we discussed in today's episode, as well as a link to a GitHub discussion if you have any questions or comments. Thanks for listening. We hope you enjoyed and have a great day.
