Algorithms + Data Structures = Programs - Episode 216: Programming Paradigms and Algorithmic Thinking

Episode Date: January 10, 2025

In this episode, Conor and Ben chat about programming paradigms, algorithms, and much more!

Link to Episode 216 on Website
Discuss this episode, leave a comment, or ask a question (on GitHub)

Socials
ADSP: The Podcast: Twitter
Conor Hoekstra: Twitter | BlueSky | Mastodon
Ben Deane: Twitter | BlueSky

Show Notes
Date Generated: 2024-12-16
Date Released: 2025-01-10
SICP - Structure and Interpretation of Computer Programs
C++Now 2019 - Algorithm Intuition
C++98 std::adjacent_difference
C++23 std::views::adjacent_transform
Haskell mapAdjacent
C++98 std::partial_sum
BQN ⌾ (under)
Design Patterns (Gang of Four)
"'tag_invoke' - An Actually Good Way to Do Customization Points" - Gašper Ažman
Dyalog APL Tatin
Dyalog APL Link

Intro Song Info
Miss You by Sarah Jansen https://soundcloud.com/sarahjansenmusic
Creative Commons — Attribution 3.0 Unported — CC BY 3.0
Free Download / Stream: http://bit.ly/l-miss-you
Music promoted by Audio Library https://youtu.be/iYYxnasvfx8

Transcript
Starting point is 00:00:00 You know, like in C++, we say no raw loops. There's an idea that if you're writing a for loop, if you're writing a while loop, you should scrutinize it. You should figure out if there's a standard algorithm that does this. Don't write the raw loop, which has a tendency to gather cruft. The difference, in a way, between Haskell and Lisp: similarly, a mantra in Haskell would be no raw recursion. In Lisp, you write the function and you're dealing with head and tail and you're recursing.
Starting point is 00:00:26 Great. In Haskell, if you write a function where you're manually dealing with head and tail, it's kind of like in C++ when we're writing a raw loop. One of the things that languages like Haskell and BQN, one of the sort of next steps up they've achieved over languages like C++, Python, whatever, is exactly what you were just saying. We keep making this mistake of, when we write algorithms, we don't recognize that we can decouple the traversal of whatever data structure from the actual operation that we're doing on the...
Welcome to ADSP: The Podcast, episode 216, recorded on December 16th, 2024. My name is Conor, and today with my co-host, Ben, we chat about programming paradigms,
Starting point is 00:01:21 algorithms, and more. So, my first job was programming C. C at that time, at least, and probably still was very procedural. You know, you write functions, you do, you do one thing after another, you keep track of state, right? That is one paradigm. Now, and there are plenty of C++ programs that do that today. Most of them, in fact.
Starting point is 00:01:49 You know, object orientation was another paradigm that came along, really had a big heyday in the late 90s through the 2000s. Functional has been taking over this past decade plus um array languages have always been like watching from the sidelines if you like you've really got into them um it seems like there are you know each one of these things teaches you a different way to think right like like yeah um and uh with languages with with with languages like lisp, Lisp family languages, the big way it teaches you to think is recursion, right? You're always thinking about head and tail, you know,
Starting point is 00:02:36 and you're coding up what to do for the head and then how to recurse on the tail. And I solved a couple of Advent code in Lisp, and it was very nice to do that, right? But then when you learn Haskell, you have the same things available, but, you know, like in C++, we say no raw loops, right? So we've reached, there's an idea that if you're writing a for loop, if you're writing a while loop, you should scrutinize it.
Starting point is 00:03:02 If there's not a standard algorithm, maybe there's a Boost algorithm. Maybe there's some kind of algorithm. It is unlikely that we are every day in our regular jobs inventing algorithms that are new to science. And so there's a recognition that we should call the thing by its right name, form an abstraction, don't write the raw loop, which has a tendency to gather cruft.
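For listeners who want the idea in code, here is a tiny sketch (my own example, not one from the episode) of replacing a raw loop with its rightly named algorithm:

```cpp
#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

int main() {
    std::string s = "no raw loops";

    // The raw-loop version, the kind that tends to gather cruft:
    // for (std::size_t i = 0; i < s.size(); ++i)
    //     s[i] = std::toupper(static_cast<unsigned char>(s[i]));

    // Calling the thing by its right name instead:
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::toupper(c); });
    std::cout << s << '\n';  // NO RAW LOOPS
}
```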
Starting point is 00:03:52 write a function where you're manually dealing with head and tail, it's kind of like in C++ when we're writing a raw loop, right? There's the idea that really I should be able to express this better using built-in recursion schemes in the language. And I think, you know, array languages maybe have that, but they take it to a different place. They, you know, mask-based programming is about, you know, in the Lisp way of thinking, it's very much minimizing data and data layout and it's bringing to the fore the idea of the evolution of the program right to the point where you can you can model famously in like lecture five of the sick p lectures um Hal Abelson i think shows shows how to model like pairs and therefore lists and all the underlying data like that's not actually you thought that was an array under the hood no that's that's not any kind of data at all that just functions right and in bqn it seems like that
Starting point is 00:04:59 sort of turns around gets put in his head It's like directly composing data, right? You have a data structure, you make another data structure that mirrors it like a mask and use it to select out. So you're just bringing the data to the forefront and using transformations on data. All of which is to say, you know, I think there are lots of different tools here and uh and all of them are very useful in their own way but i don't know if you have ideas about that those observations no no i completely agree with almost everything you said uh i mean i think it was this is goes back to like the very first time we met actually which i believe was at c plus plus now 2019 i was super super early on my we'll call it
Starting point is 00:05:49 algorithmic ascension journey and you i remember you being super excited just to meet other people who could talk at the level that you're already talking at about who could who could talk exchange these ideas of like this algorithm, this algorithm goes together this way. Well, it's, I mean, it's funny that you, you say that because I think at the time I don't, I didn't even realize really like what, what I was onto. And in fact,
Starting point is 00:06:20 when I first gave the meetup version of that talk in the same year in January, my goal was to try and give it at CPPCon. And I had been listening to CPPCast and knew C++ Now was like the expert conference. And I was not interested. Like I did not consider myself an expert. And I remember when I was talking to Bryce, he was like, you got to submit this to C++ Now. And I was like, no, no, no, no way.'m not i'm not going there like i don't have anything to contribute
Starting point is 00:06:49 he convinced me and then when i uh i showed up i was like pretty nervous but like my advantage was that i had almost met like half the people virtually in a way from watching everyone's talks like i had watched your talks i I had watched a bunch of folks talk. So whenever I'd bump into someone, I'd be like, oh, my favorite talk of this was yours, which is a good like ice breaker if, you know, if that's your style. Anyways, and I remember talking to you and just like, I remember one of the things you told me because I was just learning. I was at the phase of like transitioning to c++
Starting point is 00:07:26 algorithms are awesome but like man haskell makes it so much easier and i think you were the one that mentioned like oh have you heard of uh like bartaj maluski um and i was like no and and you were like oh yeah like he has been giving talks for years at this conference, basically implementing Haskell stuff at compile time and C++. And that was just like one of like several. You also like led to at least one slide directly in the 2019 edition of that talk where you pointed out that Fibonacci could be uh calculated with adjacent difference which like i think that that was like a a stone that you pushed off a mountain that like rolled into like a whole thesis of mine an algorithmic intuition lighting a fire that was part of that yeah like i remember i remember not believing you um and i think I went and like got my laptop and whipped out Gumball
Starting point is 00:08:25 because I could not believe that that was what adjacent difference did. And I think that led to the insight of like, holy smokes, like anytime you encode the default binary operation or operation in the name, it's so limiting. It's just putting a fence around the way you think about it yeah and like a intellectual box around what something can do and i was like how could you like fibonacci is
Starting point is 00:08:51 adding how could you possibly use an algorithm that's about subtracting and then you realize oh my goodness it's not about subtracting at all it's about combining adjacent elements together with a binary operation and they've accidentally and i think in is it um uh stepanov's defense i'm not sure if he was the one that named it it was to it was uh adjacent difference in partial sum were specifically named to i think elicit the fact that these were two round tripping functions to be kind of inverses yeah yeah which is a mistake uh even if that's your goal you succeeded in the goal but you've you've made the mistake of obfuscating what you can do with this. Anyways, the point being is that all the stuff that you're talking about, it's like I completely agree with.
Starting point is 00:09:34 In Haskell, I rarely ever write recursion. In the same way in C++, I rarely ever write raw loops. And that's not because no one ever said no raw recursion. It's because I don't think of haskell as a recursion language i think of it as every algorithm has already been written it's in it's in a library somewhere data.list data.list.ht data.split i'm not going to go write it because i don't like writing recursion i just like using the dot composition operator or the b1 composition operator in between things that other people wrote anyways you're going to mention something a couple times and uh
Starting point is 00:10:03 i was going to agree i was just i was going to say one of the things that other people wrote. Anyways, you were going to mention something a couple of times. I was going to agree. I was just going to say, one of the things that languages like Haskell and BQN, one of the sort of next step up they've achieved over languages like C++, Python, whatever, is exactly what you were just saying. We keep making this mistake of when we write algorithms, we don't recognize that we can decouple the traversal of whatever
Starting point is 00:10:27 data structure from the actual operation that we're doing on the loci, right? In C++, it looks like we do that less because we have this large library of algorithms. But in fact, it's a small library of algorithms. Those algorithms pretty much only work to a first approximation they work on linear sequences right as soon as you go into something that's a tree or a graph or something else you are back to rolling your own algorithm and like as not if you are used to programming in c++ and python you are not thinking in the way of let me separate the recursion from the operation or the traversal from the operation, right? Which in languages like
Starting point is 00:11:10 Haskell, BQN, anything like that, sort of the way they lead you to write the code is naturally to do that, I think. Yeah, no, I've heard Adam make this argument. He's one of the panelists on arraycast that that's one of the nice things about apl and bqn in these languages is because they're symbolic you don't actually need to name the symbols which like he's i think take an issue with the fact that in a lot of the documentation we have to try and find ways to name these things so we can speak about them but the naming of them you can fall into that trap again whereas like without the name at first it is conceptually and like intellectually free like uh until you put the name indices on the uh symbol that you know pulls out the corresponding index name without that
Starting point is 00:11:59 there you might be more likely to see that there's a generalization of this where it's really more of a copying thing than it is a a taking the index of thing and and yeah you know speaking of which adjacent difference in in haskell they named correctly uh adjacent map which we then got a version of in c++ 23 with adjacent transform anyways yeah there's uh i don't i don't know what the bqn version of no raw recursion and no raw loops is i'll have to think about that i'll get back to you and think if there's but i think uh maybe there isn't one but maybe there is something that you can identify yeah i think i think yeah and this is getting back to your very original question is you know what is bqn and these languages, if, and that was the thing at the beginning of, I keep on not answering this question because I could just have thoughts
Starting point is 00:12:48 that pop into my head is what was it? Problem three that you did with the command line and grep at the beginning of my video and regex, I think I said, you know, really, I want regex for this, but BQN doesn't have a regex library. APL does, but I was doing this in BQN and that's what I would have done if I, if I had that DSL available to me. Yeah. Well, you know, regardless of what languages that you know, I would highly recommend, not you personally, but that one knows, I would highly recommend getting familiar with the command line.
Starting point is 00:13:20 You know. Oh, yeah, yeah. Just string together, like, grep, tr x args find yeah and when you need them said an orc tons and tons of problems become become trivial you know i i've i've had situations where uh where i've been on a team and and um you know the day after this happened someone another program on the team came to me and, you know, we were chatting about what we did yesterday. And he said, oh, I wrote this Python script to find all these files and zip them up, right, for localization or whatever. And I spent half a day on it.
Starting point is 00:14:03 And I'm confused at this point confused this one like why why would you spend half a day writing python to do that when it's literally like 10 seconds on the command line yeah but they didn't make him feel bad over it i just you know you're one of today's lucky 10 000 yeah sort of thing yeah and it makes me uh think of like the times when i've learned a new like you've got your toolbox of command line tools yours is definitely larger than mine i did some said slash i don't actually think i've ever done any awk um but i've done some said stuff in the past but like definitely do learn a bit of awk do learn a bit of orc it's a fantastic swiss army knife it's come up before multiple times when when we've been chatting um but like one of the things that i
Starting point is 00:14:52 learned was the the back ticks for evaluation like that that is probably like a a small insight in the way that my brain works because when i learned that small trick because you there's two ways to spell that dollar parens yeah exactly dollar parens and i think because there's two ways to spell that. The dollar parens. Exactly. Dollar parens. And I think even there's like an eval thing, but like the back text was like a couple characters less. And it then, I was just, it made me so happy because it's a more elegant way. It's like, it's back to the whole partial application.
Starting point is 00:15:19 Like when there's a simpler, more elegant way to do things and you're able to leverage like a whole tool belt of these things, it makes it painful to go back to the way you had to do it before. And it's very much like programming. Like when you're on the command line, you're not like it's like programming in a REPL, right? You're not writing out this 100, 200 characters cold. You're building it up piecemeal. You're playing with it.
Starting point is 00:15:45 You know, it's real easy to do yeah it's uh and so yeah to circle back for the 10th time to your you know what is it great for it's best for things where reduction scans these kinds of things you need to reach for but my follow-up to that is that's like 98% of like if I plan to do... Of your day job? Well, just of all of my programming. Like even when you're going to read from some JSON file or do some kind of, you know, I do some geospatial stuff every once in a while. The code that I end up writing in Python or whatever, it's just like the kind of code that I would be writing in BQN. But it's just more painful and I'm reaching to a third-party library here oh a built-in function or whatever and it's
Starting point is 00:16:29 like bqn as i learn more about it and these array languages in general it's like it does the common stuff so much better than every other language i know like reduction scans etc and then it has the ability to do you know all the other stuff maybe not as elegantly but like it does the common path so much better and that's the thing is people think of it as like oh you need arbitrarily ranked arrays and blah blah blah no you don't think what what if i just restrict like you we said earlier that uh algorithms basically they just work on 1d sequences it's like that's a subset of what APL does. I think of like, yeah,
Starting point is 00:17:07 APL is just like a superset of the programming I was doing in C plus plus algorithm land. And then it turns out, not only do you get one D sequences, you get all the dimensions. And then on top of that, you're discovering like four or five different paradigms. This mask programming under programming is just,
Starting point is 00:17:24 it's mind blowing. Like this, like perform an-programming is just, it's mind-blowing, like this, like perform an operation and then do something and do the reverse of it. Oh, I see what you mean. Under, yeah, yeah, yeah. I get, you know, I was wondering what you meant by that. It's such a useful tool. So let's make that a little clearer for the listener.
Starting point is 00:17:39 Oh, okay, yeah. When you say under-programming, you are talking about transforming your data into some form where you run an algorithm and then transforming back. So, the algorithm is performed under some reversible transformation. Yeah. That's one way we can say it. And this ties into a quote that I heard once that – what is that?
Starting point is 00:18:00 The book of – the Gang of Four book that's patterns? Pattern matching. No, sorry That's patterns. Pattern matching. No, sorry. Design patterns, not pattern matching. Design patterns. And I can't remember who said the quote, but it was something to the effect of design patterns are just like missing. Do you know what I'm talking about? Yes.
Starting point is 00:18:17 I think it was Peter Norvig. Okay. Does that sound right to you? Yeah. okay does that sound right to you yeah and but it's this idea that design patterns are merely there to address flaws or like missing things in a certain language lacking things which like you go from language to language sometimes you have some language facility or library thing and now you no longer have a need for like for instance the visitor pattern like we need that until we have something like pattern matching you know then you no longer need the visitor pattern like we need that until we have something like pattern matching you know then you no longer need the visitor pattern is that correct well okay the visitor patterns i'm
Starting point is 00:18:49 going to take issue with that the visitor patterns are probably a poor a bad example of what i'm reaching for example because the visitor pattern is not really about visiting things the visitor pattern is a solution to the expression problem right right it's turning you know in an object in your in your standard object oriented languages it's and i'm including c++ even though i wouldn't really say c++ is not sure but it's that in that model of object orientation it's easy to create new types it's easy to subclass things it's easy to you know derive and uh and extend um but it's hard to create new things. It's easy to, you know, derive and extend. But it's hard to create new functionality.
Starting point is 00:19:29 If you have a class hierarchy and you want to add a function, add a member function, you need to thread it through every part of that class hierarchy, right? And in functional languages, to a certain extent, it's the other way around. It's very easy to add new functionality, but because they have pattern matching, right? If you have pattern matching, you have functions.
Starting point is 00:19:53 Every function you write matches the patterns of your classes. And so you've turned the problem on its head. It's hard to, it's easy to add functions, but it's hard to add new types because you would need to go and change all of the pattern matching places to get a new type. So that's a fundamental dichotomy, and it's known as the expression problem. And the visitor pattern changes the one form into the other form.
Starting point is 00:20:15 So for languages that have object orientation, and it's easy to add classes, types, what the visitor pattern does, it makes it easy to add functions. And I think Gasper Asman has a really good talk about the expression problem. I'll find a link and throw it in the show notes. I think that was him. I'll fix it in post if it wasn't, but I'm pretty sure it was. So anyway, I agree with you, but I just think the visitor pattern is a bad example. It's a can of worms when we open the visit pattern well the so the the reason i brought it up though was because uh under obfuscates or makes um a couple things in both c++ and haskell
Starting point is 00:20:57 as those are the languages we've been mentioning unnecessary or not unnecessary but there's a different way for solving it so one of my kind kind of irritations is that in Haskell, you have two different types of scans. You've got scan L and scan R. One's for the left, one's for the right. But in BQN, you only need one because a right scan is just a left scan under reverse which modulo performance concerns because obviously if you're going to be actually doing a copy uh and i'm actually not sure how under reverse i know a lot of them have like idiom recognition so you're not actually completely doing copies you're just once again doing some kind of view like thing but it's a very nice way to conceptualize, like I can do things.
Starting point is 00:21:46 And it's like, and some people might be thinking, well, now I like our scam better. What about all the other operations though, that don't have the right equivalence that you'd need to go and roll yourself? It's like,
Starting point is 00:21:57 well now we have a generic higher order thing that I can just apply this operation under this. And it's, what about under transpose? What about under some sort what actually i don't know if sort works because it has to be something that is reversible in a sense or invertible yeah yeah i think it again i'm going to kind of say that haskell l and r scan was maybe not the best example because uh it's not just performance concerns of putting a reverse in, it's laziness
Starting point is 00:22:25 as well, right? And so there are real reasons, thinking about laziness, like the canonical example is like, Haskell has what do they call reduce in Haskell?
Starting point is 00:22:40 Fold? Yeah. So Haskell has fold L and fold l prime yeah the prime meaning it evaluates strictly not lazily and and and fold r right which is right to left um and the rule of thumb is basically never use fold l to a first approximation right because if you're doing a left fold, you probably want strictness or you probably, it is performance-wise much better to be strict in the first argument. Conversely, if you're doing a right fold, the performance of right fold comes from the laziness. Yeah. These are things that I skate over a lot of the time because in haskell well in haskell i've never checked in any production haskell code in my life so
Starting point is 00:23:30 i play fast and loosed with fast and loose with my folds yeah i mean and that's fine but now i'm gonna hit you with the killer question like so so there's been a killer question in the wings this whole time just one more question. So I've known professional programmers who programmed in functional languages. And to a large extent, you know, they don't spend their days carefully crafting algorithms. They spend their days wondering about how to interface library A with library B and wrangling the build process. So, you know, using BQN in a professional sense, which I'm pretty sure you don't currently do, but maybe you know people who do.
Starting point is 00:24:15 You've certainly talked to people on Arraycast who do probably. What happens there? One of the great things about programming in, let's say, a mainstream language like Python or c++ is there are there are off-the-shelf solutions for just doing the same thing in ci as you do at your command line right what does ci look like in in bqn or one of these array languages ci i think across the board ci is non-existent i know that like so there's bqn which is a language in its infancy in that probably my biggest complaint right now is that there's no any form of like a package manager and even even the idea of libraries and packages in the array community is controversial
Starting point is 00:24:58 because some folks don't believe in dependencies which the more and more I use these languages, the more and more I disagree with that. Because at a certain point, you do not want to be writing the world, which is what some people advocate for. Right. A similar thing happens in Lisp, right? The more things change, the more they stay the same, right? same right um if we look at lisp the the ease of writing things in lisp to a certain extent for a long time meant that yeah the package ecosystem was very poor yeah because lisp is a language where you want to write something well you you build the language up to to make a dsl in which you can easily express your problem that is the lisp way if i could say that yeah professional lisp writer.
Starting point is 00:25:46 But then I don't think it's that way anymore. You have folks like Zach Bean who wrote Quick Lisp, I think, and Package Managers. I forget what it's called. It's got ASDF, is it called? But yeah, Lisp has a package ecosystem these days, thanks to efforts of people like that. And anyway, so where do you think this is going with array languages?
Starting point is 00:26:10 I can't predict, but I can hope. And what I hope for is that a language like BQN comes up with a package management and kind of like ecosystem solution for this stuff. And on top of that, a part of writing array code is that everyone has a different style, not dissimilar from Lisp and, you know, Racket, where you can very quickly, there's a proliferation of DSLs. And I want, I don't necessarily need to be the dictator that decides the style, but like something closer to Rust and Rust format.
Starting point is 00:26:46 And like, even as something I think that's configurable where we do get some, you know, we don't want all BQN code looking different, but like, so something towards package management, standardized linting, formatting tools and testing frameworks.
Starting point is 00:27:03 And then from there you get to ci um this is for bqn we'll see what happens there i know that at uh dialogue the company that makes uh the apl interpreter they're also working on a bunch of tooling so i know they have like a package manager called i'm gonna mess this up i think it's tartan or tatan, something like that. And they're all Apple based things. So I think one of those is like a tart. And I know that there's like a testing framework. So they're starting to build this stuff. But you have to remember or know if you if you didn't know the listener didn't know from before.
Starting point is 00:27:37 APL is is shares a similarity with small talk in that it's a workspace based language which is completely different than how a lot of ci systems you're not starting from this like workspace environment you're typically starting from code that's either compiled and then executed or already you've got some binary or executable thing so to do something where you're loading a workspace and loading like the act of running something from a script and dialogue involves with this kind you have to use something called dialogue link which i think in the background is running this workspace kind of box right it's like loading a default workspace image exactly exactly i think code into it i could be wrong uh because it would it would seem like why not just write an interpreter that executes it and it's like well they've got i think hundreds of thousands of lines of this interpreter code that works this way it's not it's not a small
Starting point is 00:28:29 thing to um have it be like a scripting language like python um and so uh it's a sad answer to your question is that yeah a lot of this stuff doesn't exist at all and my hope is that like if we get a language whether that's bqn or wewa or some version of apl that comes with this stuff i think the interest in the language would skyrocket because right now i love bqn so much that like i'm willing to go and write like the uh simple versions of this tooling for myself because like it's like you know there's some quote i heard someone saying is that like you know a language has to inspire people to a certain extent that they're willing to suffer um in its infancy and like okay i am i am that person right now like i
Starting point is 00:29:16 am willing to not only suffer but like write this stuff myself just because it it enables me to write more of this code on my free time. Like if I can make BQN the entirety of my command line scripting. So I, instead of going to grep and whatnot, I just, I hop down into a BQN like session. I can do that there. You know, it's got a super easy way to load files and whatnot in the command line. Be sure to check these show notes, either in your podcast app or at ADSP, the podcast.com for links to anything we mentioned in today's episode, as well as a link to a get up discussion where you can leave thoughts, comments and questions. Thanks for listening. We hope you enjoyed and have a great day.
Starting point is 00:29:52 I am the anti brace.
