Algorithms + Data Structures = Programs - Episode 5: Holiday Special - From China to APL

Episode Date: December 25, 2020

In this episode, Bryce and Conor have a "casual Christmas chat".

Date Recorded: 2020-12-20
Date Released: 2020-12-25

Links:
- 1979 Turing Award Paper - Notation as a Tool of Thought
- Schönfinkel's 1924 Paper on Combinatory Logic
- Schönfinkel & Curry Swans
- Maxwell Newman
- Dyalog APL
- APL Commute Glyph
- APL Reverse Glyph
- APL Transpose Glyph
- APL Reverse First Glyph
- Sean Parent's 2013 C++ Seasoning
- ALGOL 68
- Pharo / Smalltalk
- EDG C++ Frontend
- Scheme
- Sean Parent's Now What? A Vignette in 3 Parts

Intro Song Info:
Miss You by Sarah Jansen https://soundcloud.com/sarahjansenmusic
Creative Commons — Attribution 3.0 Unported — CC BY 3.0
Free Download / Stream: http://bit.ly/l-miss-you
Music promoted by Audio Library https://youtu.be/iYYxnasvfx8

Transcript
Starting point is 00:00:00 This will be released on Christmas. I don't even really know when Christmas is. Like, I have to be reminded. Everyone's eating ice cream and cookies. And that is a big problem for cheese manufacturers, because apparently dry ice is used in the manufacture of cheese. You're white as a ghost. I would really appreciate if you lied down, sir.
Starting point is 00:00:20 You're going to have to cut this into the beginning, but we have to remind our viewers about the APL drinking game. Every time Connor mentioned APL during this episode, you were supposed to take a shot of eggnog. Welcome to ADSP, the podcast. Episode 5, recorded on December 20th, 2020. My name is Connor, and today with my co-host Bryce, we bring you a holiday special, casual Christmas chat, where we talk about anything and everything we feel like. The short version of the story is I decided after coming back from my first co-op slash internship in 2011 that I wanted to double major in computer science because I realized computer science was the future. And in meeting with the representative or whatever, administrative person for the computer science department,
Starting point is 00:01:31 like I said, long story short, she was like, oh, we've got this dual degree program. It involves you going to China for two years. You should do it. And then I said, sure. And then I went to China. And I only made it a year because they kicked me out. Wait, wait.
Starting point is 00:01:45 You got kicked out of China? I mean, technically they gave me... Are you an enemy of the People's Republic of China? No, no, no. I think they'd have me back. I mean, the university I went to would definitely not have me back. Why did the university kick you out? They had a rule in their handbook that was written in Chinese.
Starting point is 00:02:13 And at the time, I couldn't read Chinese. So I'm not sure how they expected me to know this rule. But you had to attend at least 70% of your courses in order to write the final exam. And if you don't write the final exam obviously you can't pass um i was only attending uh like zero percent of my courses um so you so like me you two were about hey you also flunked out of a school perfect we have uh it makes me feel better about my academic uh failures just you flunked out of a school that was, you know, in China. You went to none of your classes? Well, so I started out, I showed up in like, it was like May of 2012, I think.
Starting point is 00:02:58 And I went to courses like really, I went to all of them. My Chinese like skyrocketed. And then that course sort of only went till from May till about like the beginning of July. And then I had two months off. And at that point in my life, since I was 15, um, I had always either been working or in school. I had never, I had never had any amount of time where I wasn't doing at least one of those. And so I started to party a lot cause white people drink for free in China, or at least to where I was.
Starting point is 00:03:36 And then, you know, we got back to class in September and our teachers at that point, like in the previous semester, they spoke English in the previous semester they spoke English in the new semester they did not speak any English um wait different teachers or just the same teachers but just not in English different teachers different classes yeah so completely in Chinese and like imagine taking a grammar course about new Chinese words that you don't know
Starting point is 00:04:02 in Chinese um it's not a very good teaching tactic because you're like, you know, here's a new word and now I'm going to explain what that word means in Chinese, which is already a language you're currently learning. Versus like, so everyone in that class would be on what, there was this like app called Pleco where you would type in the Chinese character and it would recognize the Chinese character
Starting point is 00:04:23 and then translate it to English. Because like no one had any idea what the teacher was saying i was a little bookworm so i would always read ahead and i knew what the teacher like i could make out the explanation but like i remember one class they were just telling the difference between like idea suggestion and like thinking or something and like one's like young g g young and like young fa and like thinking or something and like one's like young ji uh ji young and like young fa and like i just made that up like i'm that's probably all wrong um but uh like it's very subtle like the difference between a suggestion and an idea and like trying to explain that and everyone was just super confused the point being that like it was a waste of my time to go to class
Starting point is 00:04:59 so i stopped going kept partying and then signed up for uh salsa dance classes and basically just became a full-time salsa student i do remember i do remember that part of the story yeah and then i showed up to my elect my final exams in december of that year or january of that year and um i basically got into like an argument with like the professor because at that point i had like stopped going to class and started just like hanging out with my Chinese friends all day and like going to salsa classes that were taught in Chinese where no one spoke English. There was one guy that spoke English, but like basically everyone I'd like so I couldn't survive unless I was speaking Chinese. And like if you want to learn a language, that is the sure fastest way. Like it is basically go take a salsa dancing lesson
Starting point is 00:05:46 no it's just immersing yourself in a place where they don't speak your language like very quickly you learn uh a key phrase um how was it in chinese it was like you can explain uh thank you or something it's like but basically can you explain that for me um like well booming by which means i don't understand you can like can you explain um and like and Like, well, bu min bai, which means I don't understand. Ni kei jie ze, like, can you explain? And then, like, they're super happy. Like, all the people I met in China were, like, super, super nice and always were just like, oh, yeah, yeah, sure.
Starting point is 00:06:15 Because, like, if you're speaking in Chinese and explaining that you just don't understand what they're saying, your ability in that language, whether it's Chinese or another language, is clearly good enough that if they choose a different set of words, you can probably understand what they're saying.
Starting point is 00:06:30 Anyways, so like my Chinese got so super, super good. And then I showed up to this exam. And like, clearly, like my Chinese at that point was way better than any other of the students. And I just got into this full blown argument with the prof. And I was like, what do you mean I can't write? Was the argument in Chinese? Yeah, it was in Chinese. I was just because they don't speak English. So I was just was like what do you mean i can't write like was the argument in chinese yeah yeah it was in chinese i was just because they don't speak
Starting point is 00:06:46 english so i was just like what do you mean i can't write the exam like you know clearly and they're like well you haven't been coming to class and i was like why does that matter like in in where i come from you don't need to go to classes you can just show up and write the exam you're like well that's against the rules and then anyways the university was like you've got two options you can either go home or you can retake all the courses that you didn't attend and rewrite the exams after having attended the courses. And I was like, that does not seem like a good use of my time. So I just went back to Canada. That actually brings up an interesting topic.
Starting point is 00:07:18 So as far as I know, almost all programming languages are english-centric but um i was i was looking at some data the other day and there are a lot of you know engineers in china and in in not just in china but in countries that do not speak english as a primary language in fact i'd say probably the majority probably more than the majority of software engineers these days are like, don't speak English as their primary language. And there's probably a good chunk of those who don't speak English at all. Yet almost all programming languages are English centric. I think that that's true. Do you, do you know of any programming languages that are not English centric or that are not at least like
Starting point is 00:08:10 Latin centric? Um, I do not know of any, but, um, my response to your observation is that like, that doesn't surprise me at all. So like one thing that you will learn if you like travel abroad especially to um well i was in china so i can't i can't speak for other um asian countries but i've heard as much the same is that honestly like a lot of the graduate students that i met in china like their knowledge of the english language is uh superior to mine um, I many times would hang out in, like, the common areas of these sort of residences, and they would have these little exchange groups where, like, you would ask Chinese questions, they would ask English questions, and, like, the questions that I got asked about English, they'd be like, why is this not valid grammatically? And then, like, I would
Starting point is 00:09:01 read it and just be like, like i do not know how to answer that question i just know that it's not like the like the english language is phenomenally complicated and like the way you learn it as a kid half the time is just your parents saying like no you can't say that yes you can't say that and you just you develop like an intense understanding it's like a gut yeah for native speakers it's like a gut feeling where it's where you learn it you might think about it more academically it's there's an interesting analog. So my grandfather used to teach citizenship courses for people who were about to take their U.S. citizenship class. say, you know, like, he, a Native American, like, a native-born American citizen who had, you know, fought in World War II, was a veteran, he probably couldn't have passed one of those tests if he hadn't, you know, been one of the people teaching it. Because, you know, that was
Starting point is 00:10:00 just sort of, it was just sort of an inherent thing for him. So it's like an interesting property that sometimes people that, you know, learn a language might know it better than the native speakers or might know the, you know, the, not necessarily know how to speak it better, but understand the language and understand it sort of academically better than a native speaker whereas a native speaker might just be like no you got to say it this way because that's just that's what sounds right you know it's gut reaction yeah and and so like that just it brings me to the point that uh like so out of china out of india like if you graduate from uh basically like a one of the top universities like your grasp of the English language is phenomenal so like an English is the uh it is it is the not
Starting point is 00:10:52 it is not the most common uh language spoken as a first language but it is the most common language spoken overall um and and so like uh i'm not surprised that like when a programming language pops up even if it's even if it is popping up uh out of you know china and i'm sure there are like some i was just shared the other day a link to uh fledgling programming languages um yeah let me send this to you although at this point i feel like we should be reporting we should be recording the uh well i think this is this is the episode right this is the episode no we didn't we didn't do we no we got to start we got to do one two three clap no no this is going in the episode i want i want to drill i want to drill down on this well i actually i think that there is one example
Starting point is 00:11:37 um of a language that's i'm not going to say popular um it's that's that's going to offend you but there is an example of a language which is not English-centric, which would be APL, right? Because APL does not have keywords. It's all symbols. Yeah? Yes and no. I mean, technically, in the dialog APL implementation, they do have some keywords like if colon and else colon, et cetera. But, like, that stuff is not considered
Starting point is 00:12:07 idiomatic um if you're dropping down into that like you're dropping down into i think i don't actually know why it was added potentially as a bridge for people from uh you know von neumann languages um but i guess that's true i mean apl what did I, I've said many times in tweets which is, people that go to Waterloo like to call it the Harvard of Canada, which is not really true. It's a very good school, but I wouldn't say it's the Harvard of Canada. But it has a very, very good computer science program. And they used to teach APL back in the day. And I know that both Kate Gregory, who for those of you not in the C++ community is a very well-known C++ speaker and consultant. And also Herb Sutter, who is the chair of the C++ ISO committee.
Starting point is 00:13:12 They both went to Waterloo. And so I asked Kate if she had been taught APL or knew anything about APL and what her thoughts on it were. And she said, yeah, it's a very unreadable language. And I was hoping she would say that she loved it. But yeah, most people don't. And the point that I'm trying to make here is that people don't say Chinese is unreadable. People say that they can't read Chinese because they don't know the characters. The same way that when there's a quote about when you read a Russian poem, if you can't read Russian, you don't say
Starting point is 00:13:46 that that poem is terrible or that poem, you know, is unreadable. You just acknowledge the fact that you don't speak or read Russian. And so there's this like, there's this disconnect because most programming languages look the same. When you see something that looks different, people just automatically say, well, that's just line noise, and that's unreadable. But the truth is, is that APL is to C++ what Chinese is to English. You can't look at APL and just say, oh, that's awful. I can't read that. You first have to learn APL and then judge whether or not it's a beautiful or readable language. And I think it's probably one of the biggest mistakes
Starting point is 00:14:34 of the APL community is that they haven't been able to communicate this. When they hear, oh, it's unreadable, most of the response is just that, oh, well, you just don't know APL. Like I've never heard anyone in an APL talk or in a podcast articulate like this difference that like you can't appreciate APL until you learn it. And a lot of people say, well, you know, Barry, he's passed away, but there was an APL party, a celebration of Thought, which is that, you know, if you learn this APL notation, it was originally called Iverson notation, you can learn to think differently. And I haven't even read the paper. I've read bits of it. But Barry Revzin is alluding to this paper when he said, notation is a tool of thought, dot, dot, dot, not the way I think. And sort of, I think he was retweeting some of the APL code I tweeted.
Starting point is 00:15:48 Well, you can't say you don't think that way if you don't even know what the symbols mean. And that's like, at one point, I had thought that potentially the paper shouldn't have been called notation as a tool of thought. It should have been called algorithms as a tool of thought. Because really, each of the symbols represents an algorithm. And then it wasn't until later on that I realized that not all of the symbols are algorithms. Some of them are operators, which is basically not all of them, but a lot of the operators map to something called combinators, which exist in Haskell and they exist in something called the SKI combinator calculus
Starting point is 00:16:25 that was created by Sean Finkel in a 1924 paper and then later was developed by Haskell Curry. First name Haskell for the programming name Haskell, Curry for the common word that's used for currying functions. Anyways, I've got two swans. I tweeted them out the other day that I named Sean Finkel and Curry. We should clarify. These are not actual living swans. They're little stuffed swans that are sitting next to Connor's plant, whose name I don't remember. That's the name of the plant. Oh, yeah, yeah.
Starting point is 00:17:00 You don't remember Maxwell's name? Maxwell, that's right. He's insulted. Yes, I named Maxwell after Maxwell Newman, who was the computer science professor that inspired, or at least Maxwell, Professor Newman said that he inspired Alan Turing to go in the direction that he went with his career in basically developing the Turing machine. Anyways, that's a long-winded ramble on... Oh, yes, I thought at one point the paper should be algorithms as a tool of thought, not notation as a tool of thought, but the language is both algorithms and operators. Or you can read operators as combinators. Like, so for example, the Haskell flip
Starting point is 00:17:44 that we talked about, I believe, in the previous episode, there is a symbol in APL called commute, which is exactly that. And what's even more beautiful is that both, like every symbol is both monadic and dyadic. So if it is past a single argument, it has one behavior. And if it's past two arguments, then it has a different behavior. Isn't that a little bit confusing? Because the behaviors aren't always a natural extension of each other, correct? Oh, oh, like this is now, now we're getting into like...
Starting point is 00:18:20 This is now APL the podcast. Now we're getting into like secrets that i've been keeping um for future talks that i'd like to give um that like uh yes and no like many times there there is so much thought which is why i just i think ken iverson is such a genius like thought and care were put into um like the the monadic and dyadic meanings of these algorithms um what is uh an example that i wasn't planning on showing in it well you know you know what screw it if you're listening to this podcast you're getting a a a preview of what is going to be you know i've been thinking about this moment that I was going to give an opportunity. This is Connor's Christmas present to you.
Starting point is 00:19:08 Yeah, this is a merry, yeah. Well, so I thought we hadn't started the podcast, so I don't know how we're going to cut this. But, yeah, this podcast is coming out on. That is your problem, not mine. It's coming out on Christmas. So happy holidays to everyone. This is my Christmas gift
Starting point is 00:19:25 to all the listeners so there is a algorithm in APL that is a circle with a vertical line through it so imagine just like the pipe symbol with the letter O
Starting point is 00:19:40 overlaying each other and what is well there's so many directions I can go with this. A small tangent is that in the original APL back in the 60s, they had like, they didn't have Unicode and all this stuff, so they had like a set of characters that were simple, and that for certain characters, you would literally have to superimpose two of the characters next to each other. So a circle was pi. And the pipe, I believe the dyadic version was modulus,
Starting point is 00:20:13 or technically the opposite of modulus, but you can just read modulus. And the monadic version of the pipe was absolute value. But if you took the pipe and the circle and overlaid them, you get an algorithm called reverse. And this is, it's by, I'm just giving so much away. Merry Christmas. I'm probably going to get an APL tattoo saying, and I'll include this stuff about hating needles, maybe. Maybe if that stuff got cut earlier, Bryce and I were talking about how we hate needles, but we're obviously still both can get the vaccine. I decided in January of 2019, roughly a year ago that I wanted this tattoo, but I had always hated the idea of getting a tattoo.
Starting point is 00:20:56 So I figured, you know what, I'm going to wait a year. If I still want this tattoo in a year, then I'll probably go ahead and get it. And now that I'm saying it on the podcast is basically i basically have to get it but uh so reverse we will do a podcast episode while you're getting the tattoo sure um so reverse it's it's so beautiful for so many reasons so uh first of all it is visually neonomic think about a circle with a vertical line if you imagine that you could grab the bottom so like the line extends
Starting point is 00:21:33 it's not the circle picture it not as just like a diameter but picture the line extending past the circle if you imagine you can grab the bottom of that line and twirl it that visually is reversing i i think that's reaching i think like a little you know symbol that might be like like a good visual representation for reverse no so but here's the
Starting point is 00:22:03 thing here's the thing maybe by itself you might be like, ah, that's a stretch. Until you realize there's another algorithm called transpose, which is a circle with a slash through it. And what do you do when you transpose a matrix? You twirl it once again by where the line is. Yeah. And there's another algorithm, which is a horizontal reverse. So if you have a rank one matrix, which is just a vector, it doesn't do anything. You can't horizontally.
Starting point is 00:22:38 You can only sort of vertically reverse that. But if you have a matrix, you can both horizontally reverse it. So like row wise reverse, or you can column-wise reverse. Well, there are three symbols. Each of them is a circle with a line through it. The line represents basically the plane that you are mutating your two-rank matrix, which is so – When I started learning APL, I didn't realize that like they were visual representations of the mutation that you were making to your vector or matrix.
Starting point is 00:23:10 And then when I realized it, I was like, what? Like, this is so beautiful. It's so beautiful. And like once you see it, you are never going to forget it. You don't need to know the name of reverse or transpose. It is a visual representation of the way that you are mutating like your data. And this is just the start. Your original question was, it's probably, like, isn't it confusing to have an overloading based on the number of arguments? Because there's
Starting point is 00:23:38 probably not a relationship between the monadic version and the dyadic. So here we go. Here we go, Bryce. You're not going to know the answer to this question, but I'm going to ask it anyways, and our listeners can listen along. What do you think the dyadic version of the reverse glyph is? So like the same symbol that takes two arguments. Could you guess? And I'll give you a hint. It exists in the C++ algorithm header. The same way that reverse exists in the C++ algorithm header. All right, give me a second. So what is a natural extension of reverse? Well, it's an algorithm which takes two input sequences.
Starting point is 00:24:24 So that narrows it down. It actually, I'll give you a hint. It doesn't take two input sequences. So that narrows it down a little bit. Actually, I'll give you a hint. It doesn't take two input sequences. It's an algorithm that takes three iterators. Three iterators. On the same range. Wait, three begin iterators or begin, end, and begin? Begin, one in the middle, and one at the end.
Starting point is 00:24:42 And that should really narrow it down. Wait, partition? Partition? one in the middle and one at the end. And that should really narrow it down. Wait, partition? Partition takes two iterators plus a unary predicate. In fact, I can only think of two algorithms at the moment that take a begin, an end, and something in the middle. Hang on. I got to pull up the list at least. If you're listening along,
Starting point is 00:25:12 I hope some of you are thinking, some of the C++ developers at least are thinking they know it in their head because it's... Oh, man. This is the suspense. I don't know. All right. Tell me. It's a rotate.
Starting point is 00:25:27 Ah, rotate. Yes. Yes. That would make sense. Yeah, that is natural. But wait. So it is natural. Wait, what are the other algorithms that take three iterators, that take a sequence and
Starting point is 00:25:41 then an iterator into the sequence? So the two off of my top of my head were std rotate and std ent element. Yeah. I think those are the only two. Maybe rotate copy? Yeah, rotate copy. Well, no, technically it takes another sequence, but the output sequence. If there's any of the suffix prefix extensions, yeah, I don't really –
Starting point is 00:26:01 I'm thinking more the base ones. But, yeah, if you're listening to this podcast and you know an extra one, feel free to ping us on Twitter when this goes out. But so you think that, yes, Rotate is a kind of reverse-ish algorithm. But there's even more. And I have to imagine that this is intentional on uh ken iverson's um part that it's more than just it's a reverse a reverse ish algorithm rotate rotates implementation is extremely linked to reverse do you know why uh and here's the here is the beauty of apl because because I learned this from APL.
Starting point is 00:26:45 Like I learned this from APL. Rotate is implementable in terms of three reverses. Oh, yeah, that kind of makes sense. In fact, I would not be surprised if the c plus plus stood rotate implementation is implemented in terms of three reverses you're given your we'll call it uh you know f for the begin uh l for the end and m for the midpoint reverse f to m reverse m to l and then reverse the whole thing. F to L. You're done. Yeah. That is, that is a rotate. That's, that's quite a, I, I, I learned that like you could implement stood rotate one of, you know, it's gotta be one of, you know, everyone has a soft, uh, a place in their heart,
Starting point is 00:27:37 um, for stood rotate ever since, you know, Sean parent, uh, in his C plus plus seasoning 2013, um, talk, you know, uh, 2013 talk, you know, reduces his example down to a std rotate and a find if. But like I had never really thought about the implementation of std rotate. And then from messing around with APL and trying to like learn why are there, you know, why does one choose the dyadic and monadic names or sort of algorithms for each of the glyphs? And there's so much thought and meaning put into it.
Starting point is 00:28:14 And I don't know. Some people might be listening and thinking, okay, this just seems like a bunch of nonsense. That's not really that important but like so much of it like once once you once you learn uh the apl glyphs um honestly like so much of it just becomes like it's it's it's literally like the title of the the turing award speech notation is a tool of thought like you you learn the stood algorithms uh the algorithms in the stood algorithm header um and you you start to think in terms of you know uh rotates and reverse and and um you know ant elements and transforms and stuff like that and it's like going and learning apl is just an extension of that except instead
Starting point is 00:28:56 of having to spell out you know std colon colon each of the algorithm names it's a single it's a single click of a button on your key or a key on your keyboard anyways digression over i don't know how we started talk all right we were talking about languages not in english and apl is the closest one to that um and that yes there there is a lot there is a huge barrier to entry uh for anyone that sort of sees apl um and i yeah i'll keep i'll keep my my biggest reveal although maybe the rotate the stood rotate is the biggest reveal from from the talk that i wanted to give um but yeah it's there's just so much um in the language that i just find just absolutely like mesmerizing like take and drop are two algorithms that we got in C++20 with the range views.
Starting point is 00:29:48 And those exist in almost every functional programming language. Take is an up arrow in APL and drop is a down arrow. And for me, having like symmetry in, you can't really call it the name of the algorithm in APL because it's a glyph, but there is absolute symmetry vertically between the take and the drop. Whereas in functional programming languages and now C++20 range views, you have to know that like take and drop are sibling algorithms. Like they go together. Take with a number on a range basically takes the first n elements,
Starting point is 00:30:31 whereas drop removes the first n elements. So they are basically like the opposites of each other. But like take and drop, if you don't come from the functional programming land, it's not immediately obvious what those do. And I think in certain languages, they actually don't use take. If I'm not mistaken, some languages use skip for take. Whereas in APL, it's unmistakably like those two algorithms are related to each other. Anyways, I haven't let you speak for a while, so. There's nothing I love better than hearing you ramble about APL.
Starting point is 00:31:13 I have a question for you, though. Putting my programming language standards hat on, there is one of the ISO programming languages which has been translated into other languages, and there is an official version in other languages with keywords in other languages. Can you guess which language it is? I did once go through all the ISO languages. This is a long shot because I don't think this is actually like officially an iso language i think it went part way through the process and then froze is it ruby it is not ruby no
Starting point is 00:31:51 it's an it's an older language is it ada it's not ada you're you're close though alphabetically close to ada alphabetically close to ada alg Alphabetically close to ADA. ALGOL? Yes, ALGOL 68. So ISO, which is the International Organization for Standardization, and ISO is not an acronym. It's not the International Standards Organization. It's the International Organization for Standardization. ISO used to be, I think it used to be like the acronym in French,
Starting point is 00:32:26 but now it's just, it's just like the name. And I believe it comes from like a Latin word, which, oh boy, I'm being a bad, a bad ISO person that I don't remember it off the top of my head. Um, uh, but, uh, yeah, so it's the international organization for standardization and, um, uh, the main languages of ISO are English, French, and Russian. Um, oh yeah. The, the name, uh, uh, ISO is derived. It's not Latin. It's Greek. It's, it's derived from the Greek word isos, meaning equal. So it's not an acronym. Don't think it's an acronym.
Starting point is 00:33:14 But there's multiple different ISO official languages, English, French, and Russian. And standards are allowed to be translated into some other languages too. Now, it's very rare for sort of programming language or technical standards to be published in a language that's not English, which is sort of the primary language. But ALGO 68 was published in Russian, German, French, Bulgarian, and Japanese. And the standard was also available in Braille.
Starting point is 00:33:48 And it wasn't just the standard itself; the keywords were also translated. So there's a Russian version of ALGOL where all the keywords are, you know, in Russian, in Cyrillic. And I wonder whether it's actually been implemented in these other languages. But the fun fact is, we could technically have the C++ language standard in other languages. And I think that there's some ISO thing that says that, yeah, we're supposed to translate it into French. We never have, but it's a thing that could happen. You could have French C++. You could have Russian C++. That's interesting.
Starting point is 00:34:31 And there is another family of languages which I think lend themselves well to being internationalized. What would you think that would be? Let's see if we have the same thought here. A family of languages? Yeah. That lend themselves well to being internationalized.
Starting point is 00:34:50 Yeah. I don't... Well, by family, do you mean, like, you know, sort of the language families, like the ones that we mentioned? Erlang, Smalltalk, Forth, ALGOL. I consider it a language family. I don't know if it's a language family in whatever formal definitions you'd come up with.
Starting point is 00:35:10 But, yeah. I honestly don't know. I don't know what... I would assume if I was thinking a language that is best suited to be internationally translated, it would be one with a small set of keywords. Right. So how about like a Lisp language like Scheme? Lisp languages not only tend to have a small set of keywords, they give you a great degree of flexibility to define domain-specific languages, to define your own operations. And they have very powerful macro systems.
Starting point is 00:35:52 In some ways, I like to think of Lisp languages as not necessarily being languages, as being sort of like a language for expressing a language. So I think you could take Scheme and you could define, you could redefine macros or functions for, or aliases for all of the sort of core functionality using a lexicon from a different language.
Starting point is 00:36:28 And then that could be an internationalized version of Scheme. I think that's pretty true for most of the Lisp languages. It's probably also true for Haskell, I think. Well, so I guess the thought in the back of my head is when you say internationalize, does that include libraries? Because I think actually I'm not I'm not a schemer, but I am more familiar with Racket, which is basically an evolved scheme. And then Closure, which is a modern scheme or a modern lisp by Rich Hickey and both of those have quite large I don't know if they're standard libraries definitely Clojure's comes with like a core Clojure core which has
Starting point is 00:37:18 a ton of algorithms Clojure's a very beautiful language and so like internationalizing Cloj closure i would assume would would mean um also coming up with translations of the the yeah and that would be a lot more work i mean one of the nice things about the scheme is a very minimal language it doesn't really have anything like a library um which is why i which is why i picked it as a candidate there. Yeah. Have you ever in your career come across code comments, non-English code comments in a project you're working with? Yeah. Well, so not a project,
Starting point is 00:37:56 but if you do any amount of competitive coding at some point in your career, whether it's just for fun or not... on a lot of websites, there's a ton of international folks from all over the world that are competing. And so I've seen comments in Japanese, I've seen comments in Chinese, I believe I've seen comments in Russian. I don't believe I've ever seen variable names, but a lot of the competitive coding... like, in C++, you can just copy and paste some file with a ton of macros, like a "for i" macro that you just need to pass a single number to, that hard codes the start of your index at zero and then goes up to less than the number that you pass in. It's obviously a terrible coding practice. But when speed is the name of the game, it can help you write your solutions faster. And a lot of those copy-and-pasted things have comments in whatever language
Starting point is 00:39:06 the individual is familiar in speaking. I've seen it... back when I was working in high performance computing, I came across some code that I was supposed to port, where the comments were all in Chinese. And I remember being quite happy, because typically there's no comments in a lot of HPC codes, and there's single-letter variables for everything.
Starting point is 00:39:35 So I was like, okay, I can't read these comments, but I appreciate that they're here at least. And you translate them. I don't think I needed to. I don't remember what the outcome was, but I don't think i needed to i think i don't remember i don't remember what the outcome was but i don't think it translation was necessary i do remember one time seeing um uh identifiers that were like french like it was in c++ so um i think it still like ASCII'd but they were like they were French
Starting point is 00:40:08 and that reminds me of another amusing language related programming thing which is so I work a bit with the EDG C++ compiler front end which is
Starting point is 00:40:23 it's a proprietary front end that you can license from this company EDG C++ compiler front end, which is, it's a proprietary front end that you can license from this company, EDG. And it's a very, it's a very nice little code base. There's one interesting quirk of the code base, which is all of the data types, like the data type for a node type, it's not called struct node. It's called struct anode. Or the data type for an enum is called struct anenum. Because whoever originally structured this code base wanted all of the types to not just be the noun of what they were, but to have the, whatever you call A and an, um, and, uh, I, I think if it was like a singleton type, it would be called like struck the singleton. Um, and it was, it's just, I, I, it was so, it was so I've never seen anything like that. It's so unique. So actually I have, um, it's possible that the individual, um, is taking inspiration from small
Starting point is 00:41:29 talk. Um, so in my, in my small talk learnings, uh, it's very idiomatic for, in fact, like if you, that's small talk is it's a wonderful language to, to work in. Um, you basically are immersed into a small talk world. Currently, the world that I live in when I code Smalltalk is called Pharaoh, but there used to be ones called Squeak. And it's basically this like full-blown IDE with just amazing functionality. But like typically, you write your unit tests, you click like run, and then it'll fail and say, oh, this method doesn't exist. And so it gives you an option to create it. And so you click Create, and then it'll stamp out basically the signature
Starting point is 00:42:11 and any variables if it takes sort of arguments. And the name of the arguments, it can tell. Like if you're taking a string, it will say, you know, name of your method, colon, A string. And if you look at the standard – well, I'm not sure if it's standard library, but if you look at the libraries that comes with Pharaoh and the Smalltalk language, a lot of them use like A stream or A string or an integer to name the arguments or the parameters that are being being passed to um your method so that's the only other place i've seen it so potentially this person used to work in small
Starting point is 00:42:49 talk um well there's there's like there's like he's a small company there's like eight people there so uh we could just we could just ask them but i sort of like the the the myth of it i like the mystery of it you know one thing about i it. Yeah. One thing about, I wanted to mention about ALGOL too, because I didn't know this at one point. ALGOL actually stands for algorithm language, because it was initially created as a prototyping language for writing algorithms. It sort of went in a different direction. And I've always thought this was curious because for a while, I thought APL should be named, I think I mentioned this before, is that I think it should be named the algorithm programming language. Until, yeah, the remark that I made
Starting point is 00:43:36 earlier about operators and combinators. So it makes sense. But also, ALGOL... if you listen to CppCast, or if you've watched any of Kevlin Henney's talks... he's one of my favorite speakers... you probably know this already. But ALGOL is one of the most influential programming languages ever. I don't know the complete list of keywords that ALGOL was responsible for creating, but I believe struct and a ton of keywords directly come from ALGOL and can be traced all the way into C++, but also many other programming languages.
Starting point is 00:44:12 yeah i mean it it's um it uh uh it's really sort of the the the root of the family of imperative programming languages um like c the whole c um language of the whole c family of languages um owe a lot to algol um pretty much all a lot of people like the shell programming languages um ada python they all drew a lot of inspiration from algo um yeah it was really the first of the the imperative languages which is which is interesting given given that it that original intent of it being a um an algorithm language um because you don't necessarily think of imperative languages as being you know the ideal fit for developing pure algorithms. You'd want perhaps some sort of more functional language for that. So one thing, what was I just going to say?
Starting point is 00:45:17 Yes, a lot of people refer to the C family as the algal family. I think a lot more people are familiar with C, so that's why it's, I think, more colloquially used. But if you go into literature, a lot of times they read it as the Algol family because that's really, if you trace the root of the hierarchy back, it goes back to Algol.
Starting point is 00:45:37 Second thing I wanted to say is that, funny that we've mentioned now both Scheme and Algol. Scheme is actually the combination of both uh lisp and algol so the two creators of that language uh guy steel um and i believe gerald sussman sussman's one of the authors of the sick p textbook um they created that language and basically were combining algol and lisp uh so neat that we've mentioned those those two things um and then there was a third thing what was what did you just say at the tail end um i forgot it there's too many things oh god i can't hop too far up the stack um oh so here's another interesting
Starting point is 00:46:17 algol fact so um when they were creating algol um uh they formed a working group on ALGOL. It was part of the, one of the predecessors for, one of the predecessor standards organizations for ISO, which I think was called IFP or something like that here. Let me look it up. Yeah, International Federation for Information Processing. They formed a working group for ALGOL, and that working group was called WG2.1. And that's amusing because C++, the descendant of ALGOL, its standards committee in ISO is WG21. So that... Not confusing at all. Yeah, that cracked me up.
Starting point is 00:47:11 Yeah. Well, and I recalled the thing I was going to say. You said that it's odd that for sort of designing algorithms, you might expect a more functional language. If you look at almost like all academic literature with respect to algorithms and algorithm textbooks, it's all using to determine whether basically like a set of parentheses are valid, which is a very common sort of interview question, like given left and right parentheses, determine whether it's valid. And I'm not sure if we've talked about this on this
Starting point is 00:47:57 podcast before, but there's a very simple, basically convert it to one negative one, each of the left parentheses and right parentheses, do a plus scan, and then just check if at any point it goes below 0, and that it ends in a... So it shouldn't go below 0, and it should end in a 0. And if you can check those two properties on the plus scans of the 1, negative 1, you basically have a valid parentheses.
Starting point is 00:48:21 So, like, in a functional language where you have a map and a scan, it's very easy to code this up. And so I submitted it. And then the response that I got was: please do not code this in a functional style, we only accept imperative styles, or whatever. And I was just sort of like, this is ironic, because the most idiomatic, in my opinion, way to solve this is using maps and scans, which are entirely parallelizable. Which is, I think, a huge mistake. That's another thing about APL. Not to go back to APL, but when you are coding your code in all these primitives,
Starting point is 00:48:57 your code is basically like inherently parallelizable. Everything's a transform, everything's a scan, everything's, you know, something that is like, trivially parallelizable everything's a transform everything's a scan everything's uh you know something that is like trivially parallelizable and when you code things in for loops and if statements like okay good luck it's like go find the thrust algorithm that that it corresponds to switch it to then and then you can parallelize your code yeah that's, like loops that you've written by hand, we provide you with a rich library of algorithmic primitives, which we know how to parallelize, and you use those to construct the algorithms that you need. There's a lot of talk in C++ about this Sean Perrin idea of no raw loops. Don't use raw loops when you could use a standard algorithm instead
Starting point is 00:49:58 because it expresses your intent clearer. And yeah, that's true. But when we're thinking about that, we're usually thinking about expresses your intent clearer to other human programmers. But there's a hidden benefit, which is when you write for each instead of a for loop, or when you write reduce instead of writing a for loop that does a reduction, et cetera, et cetera. When you use the standard algorithm, you're not just making your intent clearer to the programmer, you're making your intent clearer to the compiler. And so when you
Starting point is 00:50:30 want to go to parallelize that code, it becomes a lot easier to parallelize when it's, oh, hey, this is just some known algorithm that has certain constraints, and I plug in these user-defined functions, and then I can parallelize it as I wish within the constraints of the algorithm, within the contracts of the algorithm. And so that's another very powerful aspect of the no raw loops idiom is that it makes your code easier to parallelize
Starting point is 00:51:04 and to vectorize and to just optimize in general. I think that's a point that like in one of Sean Perrin's talks, what now like C++ part in three vignettes, he explicitly points that out. The recorded version has very poor audio, but I think I highlighted it in one of my talks where he says when you use these abstractions um you can do later like a code review and figure out oh yeah like we would have a huge performance win if we parallelize this and then all you have to do is just swap in your serial algorithm for a parallel algorithm with the same name and you're done exactly um talk about a low low barrier to whatever performance improvement. Yeah.
Starting point is 00:51:49 Anyways, I feel like we're, we've been talking for like an hour now. Yeah, these were, oh, and you know, we, we, I'm going to, you're going to have to cut this into the beginning, but we have to remind our viewers about the APL drinking game that every time Connor mentioned APL during this episode, you were supposed to take a shot of eggnog. This was, This was rich. It's rich from coming from two people that don't drink. That our podcast has a drinking game associated with it.
Starting point is 00:52:14 Yeah. Remember when we started this and we said these episodes were going to be 20 to 30 minutes? Well, so up until now, every episode has been between 30 and 40 minutes. And I thought it was going to be a nice goal to try and always do it between 30 and 40. This one, I think, definitely is going to be over 40 minutes, probably closer to an hour. Yeah, I've got an hour and 20 here. Well, I think this has been good. This has been fun.
Starting point is 00:52:44 Oh, yeah. Well, this was supposed to happen at the beginning where we read feedback, but we've got a very nice, lovely... We're doing the beginning at the end. That's fine. Yeah, maybe I'll rearrange it. Maybe I won't. We got a very nice piece of feedback from...
Starting point is 00:53:01 I apologize once again for the pronunciation. David Yastremki, who says, really like how your and Bryce's flow has developed through the episode. This may have been my favorite episode yet, referring to episode four, especially hearing you both reason about why different algorithms would work and the implementations.
Starting point is 00:53:20 We'll take a little time this month to give Haskell a try. It'd probably help with the functional pieces of C++ as well. And I responded, thanks for the awesome feedback. It's awesome to hear. And I've heard from many non-Haskell devs that learning Haskell has improved the way that they code in their day-to-day programming language. And what was his name?
Starting point is 00:53:41 David Yastremski. Yeah, yeah, I chatted with him too. Yeah, so awesome that we're getting better, or at least we're not getting worse. That's the key thing, we're not getting worse. We make no promises to the sponsors that we don't have. Yeah. Well, I guess happy holidays and happy new year to everybody. And yeah, I guess we may do one more episode... wait, what's next week? I guess there might be another episode. It will be released on January 1st, so this is the last episode of 2020. Yeah, this is the last episode of 2020, but there will be an episode next week. Thank god 2020 is almost over. I'm done with this year.
Starting point is 00:54:29 hopefully 2021 fingers crossed yeah fingers crossed yeah it's gonna be a good year good year yeah hopefully we did we did publish a you know the larger the most uh substantial revision of c++ in a decade this year so there's that that's true maybe well we're supposed to do our parallel scan podcast episode but maybe maybe next episode we can uh do uh well we'll think about it but we can try and do a 2020 in review and find the highlights in the midst of the year that was 2020 and see if we can reflect and be positive about the things that did happen. You know, we had Prague. Prague was nice. Prague was a good committee meeting. Yeah, that was pretty good. Yeah, I got to see the Herb Sutter, Tony Vaniard, and Bjarnus Drusup, a vast meetup.
Starting point is 00:55:28 It was awesome. So, yeah, there were some good things. There were some good things that happened in 2020. Yeah, definitely. Thanks for listening. We hope you enjoyed, and Bryce and I wish you a safe and happy holidays and a Merry Christmas.
