Algorithms + Data Structures = Programs - Episode 71: APL, COBOL, BASIC & More
Episode Date: April 1, 2022

In this episode, Bryce and Conor talk about APL, COBOL, BASIC and cover the highlights of GTC and Functional Conf 2022!

Twitter
ADSP: The Podcast
Conor Hoekstra
Bryce Adelstein Lelbach

Show Notes
Date Recorded: 2022-03-27
Date Released: 2022-04-01
FunctionalConf
NVIDIA GTC 2022
NVIDIA GTC 2022 Trip Report
PLDI 2022
ARRAY 2022
FP Language
FL Language
John Backus
Michael Garland
Can programming be liberated from the von Neumann style?
Alex Aiken
The FL Project
COOL Compilers Course
COBOL
FORTRAN
Smalltalk
BASIC
Visual Basic
VBA (Visual Basic for Applications)
C++ Standard Parallelism by Bryce Adelstein Lelbach
Shifting through the Gears of GPU Programming by Jeff Hammond
Richard Feldman on Twitter
Roc Programming Language
Why Isn't Functional Programming the Norm? – Richard Feldman
The Essence of Functional Programming by Richard Feldman #FnConf 2022
Teaching Optics through Conspiracy Theories by Bartosz Milewski #FnConf 2022
Navigating the loop in water on land & programming models by Bruce Tate & Francesco Cesarini #FnConf
The Great American Loop

Intro Song Info
Miss You by Sarah Jansen https://soundcloud.com/sarahjansenmusic
Creative Commons — Attribution 3.0 Unported — CC BY 3.0
Free Download / Stream: http://bit.ly/l-miss-you
Music promoted by Audio Library https://youtu.be/iYYxnasvfx8
Transcript
I think that's entirely false. The metaverse, doesn't that date back to like Snow Crash or something from like the 70s or 80s?
Whatever, my point is that we were first.
Whatever, I mean, that's not a fact and my point is that I'm right.
Correct. Welcome to ADSP, the podcast episode 71, recorded on March 27th, 2022.
My name is Conor, and today with my co-host Bryce, we talk about APL, COBOL, BASIC, and we recap the 2022 GTC conference and Functional Conf.
This will be episode 71 here.
We just dropped episode 70,
which was the programming language quiz.
And so we're going to record two.
This one will be episode 71.
What we're going to talk about in this one
will be a conference recap
because GTC just happened all last week
and lots of talks happened. And then also Functional Conf just ended, which was a virtual conference that took place on India Standard Time, so I was asleep for most of it. But I did stay up till like one or two a couple nights and then woke up at like 6:38.
This is a conference that was held in India?
Well, it's held online,
but it historically takes place in India.
Yeah, it's like they call it
the premier functional programming conference in Asia.
But before we start doing our recap,
conference recap,
about all the exciting stuff that got announced
and the talks that took place.
Thesis, maybe not,
but I am submitting a paper to ARRAY 2022, which is co-located with PLDI.
I think I mentioned that last time,
but that might not have made it into episode 70.
For those who do not know, PLDI is like the major conference for programming language research in CS academia.
Yeah.
And so it's being held in San Diego in June.
And yeah, the paper is all about,
it's honestly, I think it's like,
it's a culmination of like two years
of me deep diving into combinators
and APL and other array languages.
And also, this is worth a whole episode in itself, but I've recently done a couple really big deep dives on two programming languages called FP and FL that are from John Backus. And I actually remember mentioning it back in, I think it was like episode 47 or something, the one where we were waiting for Dave Abrahams and you made me explain
combinatory logic and combinators to you. So there's an episode called Combinatory Logic.
And in that episode, I actually referred to this language FP and John Backus introduced it.
I want to clarify that we're talking about John Backus, the computer scientist with a CK,
not John Backus, the football player with a CH.
But yes, this is the John Backus behind ALGOL and Fortran, who won the Turing Award in 1977.
And in his Turing Award lecture, he introduced a language called FP, which stood for functional programming. And then that language evolved into FL,
which stood for function level.
And basically, I think it's widely not known
that J, the second language that Ken Iverson worked on
after creating APL and working on that for 30 years,
wasn't just an evolution of APL.
It was a combination of FL and APL.
And in FL, John Backus had these things called combining forms,
two of which were basically combinators.
And Ken Iverson had a paper in 1989 called Phrasal Forms, in which he actually references one of the combining forms, called the construction form, that was introduced in the '78 paper.
Anyways, so I've always had this view that like J was APL 2.0,
but really it's like, you know, we did the quiz last week.
J is APL plus FL, and no one knows about this language FL.
Thank you so much to Michael Garland, head of programming languages and models at NVIDIA.
He's a senior director at NVIDIA Research.
And yeah, programming languages and system software are within his purview.
Yeah, he was the individual
that put the FL project paper on my radar.
And he was like, you should go read this.
I just looked it up and, like, it's pretty obscure. Like, the syntax isn't even on the Wikipedia page, and I had to click through a few links to find anything.
Yeah, so this is great. So during Functional Conf the other day, I was actually having a conversation with Aaron, and I basically told this mini story of how, you know, FP, and in that '78 paper by Backus there's a whole paragraph basically thanking Iverson, because the name of the paper, and some of you will have already read it or be familiar with it, is Can Programming Be Liberated from the von Neumann Style? And it goes on to basically say that ALGOL and Fortran were a big mistake, imperative programming was a big mistake, and that we should all head in the direction of functional programming languages, which rely on these things called combining forms, which are basically ways of composing programs to form other programs. And combining forms is basically just another name for what would end up being called combinators and trains in the array languages. And so he calls out to Iverson, though, and says that APL is one of the few languages that is actually going in the right direction, as opposed to, like, ALGOL and Fortran.
And so FP is highly influenced by APL, but Backus says that Iverson made a bunch of mistakes.
And then, you know, fast forward, I guess, 12 years from 77, 78 to 1990.
And then Iverson ends up holding a bunch of...
Did you just say FP was influenced by FL?
So FL is basically the successor language to FP.
And they're very, very similar.
It's just a few things were added that didn't exist in FP.
And there was actually never an implementation of FP,
but FL was actually implemented at IBM.
Gotcha.
And then the FL project report
is basically like a 40-page paper
written by Alex Aiken,
who's the programming language professor at Stanford.
And some of you will be familiar with that name
because he has, I think it's a Udacity or Coursera course,
or maybe it's actually like a Stanford online course
called Cool,
where you implement a programming language called Cool,
and it's like the compilers course there.
I worked through part of that at one point. That's the first time...
He's one of the Legion people, yeah. He works part-time at NVIDIA; he works one day a week for us. And, well, I shouldn't say for us. By us I meant the greater company of NVIDIA.
And so, yeah, you know, APL influenced FP, then FP went on to become FL, IBM implemented FL, and then Professor Aiken and a couple others wrote this report on whether or not this sort of combinator, point-free programming was a good idea. And then Iverson... so I would love it, and I don't know if it's our listeners or there's someone that I should go and be reaching out to. My guess is that at some point, and I don't know, I'd have to go look at what Backus did with his career, but my guess is that Iverson and Backus were probably colleagues or, you know, at least acquaintances and met at conferences, and there was a ton of idea sharing between the two. And anyways, there's this huge sort of unknown history, and it only dawned on me in the last couple weeks that the name of my favorite paper, the 1989 Phrasal Forms by Iverson (and I always say Iverson, but he actually co-wrote it with Eugene McDonnell as well), which was the initial name for what they would call trains, the combinators in the array languages APL and J, is clearly inspired by the combining forms from Backus's '78 paper. It's combining forms, phrasal forms. And in J they had this pattern of basically taking things that already had decent names and renaming them into the English grammar lexicon. So functions were called verbs, arrays were called nouns, and variables were called pronouns. And so they were taking everything that already had a name and renaming it so that it paralleled English grammar.
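[Editor's note: the code below is not from the episode. As a rough sketch of what a "combining form" or APL/J fork does: a fork (f g h) applies f and h to the same argument and combines the results with g, so the classic APL train for the average, (+/ ÷ ≢), reads as "sum divided by count". Expressed in C++ under those assumptions (the names fork, sum, count, and average are purely illustrative), it might look like this:]

#include <cstdio>
#include <numeric>
#include <vector>

// A rough sketch of an APL/J-style fork (a 3-train):
// fork(f, g, h)(x) computes g(f(x), h(x)).
auto fork = [](auto f, auto g, auto h) {
    return [=](const auto& x) { return g(f(x), h(x)); };
};

int main() {
    auto sum   = [](const std::vector<double>& v) { return std::accumulate(v.begin(), v.end(), 0.0); };
    auto count = [](const std::vector<double>& v) { return static_cast<double>(v.size()); };
    auto div   = [](double a, double b) { return a / b; };

    // The APL train (+/ ÷ ≢), i.e. "sum divided by tally", is the average.
    auto average = fork(sum, div, count);

    std::vector<double> xs{1.0, 2.0, 3.0, 4.0};
    std::printf("%f\n", average(xs));  // prints 2.500000
}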
You know what other programming language tried to have syntactic elements that closely mirrored English?
I want to say Perl or Raku...
Think older. Think older.
Because I know in Perl, when you want to print something, the keyword is like "say".
COBOL.
COBOL, that's right. Because COBOL... so back in the day, you know, COBOL was a contemporary of Fortran and a few other languages, but I'll say Fortran because it was the one from that era that was most successful and still in use today. And Fortran was a language designed for mathematics, for computer scientists.
It was for solving numerical problems, or, I want to just broadly say, science. Or maybe math is a better way to put it. So Fortran was a language for math, and so it was natural for it to have a mathy syntax. But COBOL was a language for business, for programming your business logic. I mean, I think if we were to take COBOL and put it into contemporary terms, it was a data analytics language. It was a language for writing software that was going to process some data that you have. And one of the goals and intentions in designing COBOL was to make it accessible to non-math people, to people who were maybe not even engineers, but maybe just people who are on the business side of a company, to be able to write their business logic in COBOL.
And so that's why COBOL essentially doesn't really have any operators.
It's all just English.
And if you look at the COBOL syntax, you'll see that it strives, in some cases to a fault, to just appear to be like English sentences. And in fact, there are some COBOL programs that you could read out to somebody and it sounds like a sentence that a regular human might utter.
Yeah. So, well, I think there's a difference too; like, there's a spectrum. So COBOL, and I've looked at some COBOL code, and it used to be on my list of languages that I wanted to learn, but then after looking at it, I no longer...
We just published a new COBOL standard!
Come on. Yeah, but the verbosity of COBOL is immense. Like, a hello world program is like 40 lines long, because there's a bunch of stuff you have to set up for every program. And they wanted it almost to be like executable business language, so that non-programmers could really use it, which I think it probably succeeded in that goal. But for folks that are trying to do things, like for someone who really likes APL, like myself, you're able to express yourself very tersely. COBOL is almost the opposite of that. But the difference here is that COBOL actually reads like English,
whereas J, you know, with its ASCII digraphs, reads like hieroglyphics. But the terminology, like the names of things, they chose to sort of be like English grammar. And in between those two sits Smalltalk, where Smalltalk was also designed, sort of, not to be
aimed at business folks, but to also read very, very similarly to English. And so you read things
left to right. And like, if you compose a bunch of unary messages, you'll have something like,
you know, 10 times plus three print or something. Like that's a bad
example that probably doesn't actually compile. But the goal of Alan Kay was to have a language
that would be very, very easy to read without much knowledge of programming languages. And one of its design goals was also to be good for kids to learn.
You know what, you know what? There's another language that I don't think was actually directly influenced by COBOL, but I think it also has this property, and specifically it's a language that was designed originally, I believe, for beginners.
Am I supposed to guess which one it is?
Yeah, you are.
Well, I know Logo was designed for beginners, but I'm guessing that's not the language you're thinking of. Logo was like a Scheme; it had turtle graphics built into it. It was done by...
No, it's very much an imperative language.
So what was the description? Inspired by COBOL, aimed at...
I don't think directly inspired by COBOL, but I think its syntax is similar to COBOL in that it
strives to be English words and sort of have something close to an English sentence structure. And it's something that I believe was originally intended for non-specialists, for non-computer scientists and mathematicians.
Yeah. Small talk's the language that comes to mind,
but I can't think of a second one.
BASIC.
Which I believe the B in BASIC stands for "beginners".
Really? Yeah.
Hang on, let me...
Yes, I am correct.
Beginner's All-Purpose Symbolic Instruction Code.
Created at Dartmouth College in 1964.
The creators wanted to enable students in non-scientific fields to use computers.
At the time, nearly all computers required writing custom software,
which only scientists and mathematicians tended to learn.
I don't know whether it was influenced by COBOL. The original COBOL, you know, was designed in a very short period of time, and my best Twitter thread, my pinned Twitter thread, is about the history of COBOL: it was designed in, like, the summer of '59, and then the report was published in 1960. And so BASIC came out in '64. It is possible that it was influenced by it, but when I search for COBOL on the BASIC Wikipedia page, I don't find anything.
But I think, you know, I started off programming in BASIC, TI-BASIC on my TI-89 calculator. And a lot of people in C++ say that they started off programming in BASIC.
Are you one of them?
I can't remember.
Yeah, yeah.
I think at one point it ended up coming up on an episode that, yeah, we both have a very similar, like, didn't start programming when we were six.
And the first lines of code we wrote were in TI-BASIC on a TI-83 calculator in high school.
So, but then you get what I mean about the syntax,
how it's very, it's essentially just English words.
There aren't really operators.
Yeah.
BASIC's a language... I mean, I coded a ton in Visual Basic for Applications, which is the language that you want if you want to program macros in Excel. And back when I wanted to be a little finance quant, I wrote like three different versions of a technical analysis program that would plot OHLC (open, high, low, close) sort of stock charts. And the first one that I implemented was in Visual Basic. And it was so slow that I immediately switched to a different language, because when it would start to draw all the little candlestick bars, it would take like 10 seconds to render, you know, the 200 different bars for each day of the last couple months.
My initial musings with programming on my calculator were not for such noble intentions. I think one of the first things I did was, there was this game that I'd downloaded onto my calculator, but my teacher had caught me playing it in class. So I added a mode to the game where if you just pressed a key, it dropped you into something that appeared to be the regular TI-89 calculator prompt and did basic math. But if you did anything that added up to like seven, then it dropped you back into the game. And I thought that this was pretty damn clever.
That is pretty good. That is also very me. Yeah. All right. Well,
that was supposed to be, I mean, we're almost at the 25 minute mark and this was supposed to be an
episode about our conference recap. Anyways, we'll talk more about this paper that I was going to submit and
the history of APL, FP, FL, and J on some other episode or in the last five, ten minutes here, because the next episode that you're going to hear next week, which Bryce and I are going to record right after this, is going to be a C++ interview question: an actual question that I asked an intern, or a potential intern, in an interview the other day. I've interviewed many co-op students or interns in a past career,
but this was the first one at NVIDIA. And I actually didn't know if this question was too
difficult. And so I thought, you know, there's a good way to figure this out. I'll ask Bryce
and we will solve it live. And depending on how Bryce does, I think it's a great question
and we're going to solve it family feud style.
But stay tuned till next week.
I mean, Bryce can stay tuned for like six minutes
and then I'll be asking him it.
But the listener must wait until episode 72.
In the last five, 10 minutes though,
highlights, GTC 2022.
I mean, this episode's now going to be called
like COBOL, APL, and FL.
And they're just going to get a couple of highlights.
And BASIC.
Don't, don't.
Oh, yeah.
And BASIC.
And BASIC.
What do you want to say?
Top three things that you should mention about GTC 2022.
Talks people should go watch.
Well, obviously my talk.
Unless you've already seen my talk on C++ standard parallelism at another conference.
In which case, this is more or less the same talk. There is some new stuff. There's some new cool applications and porting
results. And we had a panel on the future of standard C++ with Daisy Hollman, Christian Trott, Eric Niebler, and Michael Garland. That was a really fun panel. I had a great time doing that. That was a good group of people to put in a room and ask questions about C++. Yeah, I think those were the two big things for me. We also had, you know, sort of a user Q&A session for
standard C++
and CUDA C++ and then another one
for all of our C++ core libraries.
Those are always fun. What about you?
What were your GTC highlights?
I can echo
what you just said. So I have a
quote-unquote trip report that
will be out by the time this
airs.
Oh man, I forgot about trip reports.
You know, it's been two years since we traveled anywhere.
I forgot about, you know, I used to write trip reports.
Yeah, and it's kind of, I mean, the one thing about it being virtual
is that it's easy to write it while you go through the week.
And I think I watched, I don't know, 15 or 20 talks slash Connect with Experts panels.
So yeah, Bryce's talk. And so
I have a little legend: a good talk has a star next to it, you know, worth watching. There were a couple that had really bad audio, so I put a little emoji next to those. But yeah,
your talk was great. I definitely agree that the panel that you had with Daisy, Christian, Michael, and Eric was my favorite of the panels slash CWEs. Tim Costa's talk was great. Jeff Larkin's talk was great. I won't go into details about what they're about; you can just go to the trip report, see the stars, and go watch them if you're interested. But Jeff Larkin's talk, your talk, and Tim Costa's talk all sort of covered similar things, you know, the future of C++, HPC compute.
There was also a fantastic talk from Jeff Hammond.
It's called Shifting through the Gears of GPU Programming.
And it gives a tiny, brief history of GPU programming, but then compares OpenACC, OpenMP, standard C++, CUDA C++, standard Fortran, CuPy, all these different models; the list goes on and on. And he takes three small problems: one of them's SAXPY, one of them's like a matrix transpose, and then one other operation. So three small sort of problems that, ideally, a GPU is going to eat for breakfast. And then he basically profiles them all against each other and has these nice little charts. It's a super short, like 20-25 minute talk, but it covers a lot of ground. Olivia, even, when I tweeted it out, was responding like, shouldn't these all be the exact same? And Jeff ended up replying like, didn't take long for you to forget about unified memory, or... I don't know, there's some thread, I'll add it in the show notes. But all in all, pretty exciting.
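[Editor's note: the code below is not from the talk. For listeners unfamiliar with SAXPY ("single-precision a times x plus y"), here is a rough sketch of what it might look like written with C++ standard parallelism, one of the programming models compared in Jeff's talk and the subject of Bryce's own talk. The function name saxpy is just illustrative.]

#include <algorithm>
#include <execution>
#include <vector>

// SAXPY: y[i] = a * x[i] + y[i], expressed as a standard parallel algorithm.
// With nvc++ -stdpar this kind of code can be offloaded to a GPU; with other
// compilers it runs on CPU threads.
void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
    std::transform(std::execution::par_unseq,
                   x.begin(), x.end(),  // first input range: x
                   y.begin(),           // second input range: y
                   y.begin(),           // output: overwrite y in place
                   [a](float xi, float yi) { return a * xi + yi; });
}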
Also, I mean, the keynote. Jensen's keynote's always pretty inspiring to watch. Like, you know, half the stuff that's going on at this company I'm not aware of, and the keynote's always a good way to... I probably shouldn't say that in a public forum.
Why?
NVIDIA is huge.
I'm supposed to know?
I mean, you're a software politician.
I've got my head down working on RAPIDS stuff.
I mean, I don't know what's happening in, like, Omniverse land.
Why would I know what's going on in Omniverse land?
Well, actually, I have a coworker who's in the city who works... I don't know whether he works on Omniverse still, he may be doing a different thing, but he's told me many cool things about Omniverse, and it is very cool. I mean, it's super cool, but the only time I learn about it is when I watch Jensen's...
I walk back, I walk back, I mean, I walk back what I said. I know, it's our metaverse and it's going to be the best, and yeah, it's the future. It's our Omniverse. Like, we were calling it that before people decided we're going to call it the metaverse. Like, we've been...
I think that's, I think that's entirely false. The metaverse, doesn't that date back to like Snow Crash or something from like the 70s or 80s?
Whatever, my point is that we were first.
Whatever, I mean, that's not a fact. And my point is that I'm right.
Correct.
Anyways, lots of great talks. Links in the show notes. And very quickly, yeah, Functional Conf. My two favorite talks... I might, well, I'd write a trip report, but I don't really have time. But my
two favorite talks, one was from Richard Feldman, who is pretty well known in the functional
programming community, works at NoRedInk, which is a company that uses Elm as a functional programming language.
And he has now recently started working on a new programming language called Roc, R-O-C.
I think that actually might be the first time we've mentioned Roc on this podcast.
And he's had a bunch of really, really popular, successful talks in the past,
like why isn't functional programming the norm,
things like that.
Anyways, his talk, he gave a keynote.
Of the three keynotes, the first was by Richard Feldman,
and it was entitled The Essence of Functional Programming.
The second was by Bartosz Milewski, Mr. Category Theory, and it was on optics and lenses, which, if you're familiar with Haskell, you've probably heard of. And then the closing keynote was by Bruce Tate, the author
of Seven Languages in Seven Weeks. And he actually gave a very entertaining talk. It wasn't super technical, but he gave the first half of the talk from a boat. He was in his boat, and he was doing something... Have you heard...
And we've got to wrap this episode up because we've got to get to episode 72.
Have you heard of?
Now we clearly have life goals, which is to record a podcast episode.
Give a keynote from a boat.
Have you heard of the Great American Tour or something like that?
Have you ever heard of this?
I probably just got it wrong.
You're Googling it right now, right?
I was shaking my head. I was not Googling it. Then I realized this is a podcast, so shaking my head does not really help the listener. Yeah, I mean, maybe it's the Great American Loop. But basically, it's this loop you can take by boat, and, I mean,
you can start anywhere on the loop, but it goes actually by Lake Ontario and Toronto through the Great Lakes,
down through like Chicago, Illinois, all the way down to the bottom of America,
out through some state down there around the, you know, Florida Gulf.
Which state, Conor?
I don't know.
I don't think it's Texas.
What's next to Texas?
Louisiana?
Conor, this is down the Mississippi, I presume. Which state famously does the Mississippi terminate at?
I don't know. I mean, I'm gonna say... I don't know, Georgia? Louisiana? I'm not American.
I hope you saved that material from, like, when we recorded, like, two or three weeks ago,
where you demonstrated, like, enormous ineptitude at knowledge of U.S. history.
It was like, there was, like, I asked Conor, like, did the War of 1812 happen before the Civil War?
And it was, like, a struggle to get him to the correct answer.
Man, was that right? It looks like it is Louisiana.
What do you give me such a hard time for, man?
The only reason I'm doubting myself is because you're like laughing at me like, man, you got to know this.
Well, like, also, I lived in Louisiana for, you know, four years.
So, yeah, I've been to New Orleans twice.
Look, I even say it right.
New Orleans.
That's the correct way to say it.
Anyways, the point is, Bruce Tate gives a keynote about this huge... he's going to be doing this tour until October, and it's March right now. And it goes all the way around Florida, up the coast, through, I don't know, some river in New York or Connecticut or something. Anyways, it's crazy. First of all, I didn't know you could do that. Second of all, I didn't know it was a thing. Third of all, giving a keynote from that boat trip, that's a pretty cool thing to do. So, I mean, most of the talk was about the boat trip, and it was to do with loops, because it's a loop that he's on right now. And in the Q&A, though, I felt like I was at a boat conference. It was just this one guy asking questions about, you know, the Garmin system that he had set up, and, you know, how he dealt with this and that, and alligators and stuff. Anyways, entertaining talk, not super technical though. But anything you want to say to close this episode out?
I'm just thinking about what type of boat you and I need to get, Conor.
Like, this is definitely happening.
That's actually, that's, wow, I didn't even think about that.
Like, we have a proverbial boat.
I mean, you have a proverbial boat.
Yes, I have a proverbial boat.
But how much better would it be if i had an actual boat
I mean, this is in our future. I now see us... Yeah, we've got the Balkans 2023 tour. Look, I am a white guy from Connecticut, so there's like some law that by the time I'm 40 I have to own a boat. Everybody, or in any case every, like, male in Connecticut, owns a boat.
I will never own a boat. I mean, I've never even owned a car. It gives me anxiety just thinking about having a boat, and like how much work that would be. I would rather just, like, Airbnb a boat. I'm sure there's, like, an Airbnb service...
Yeah, yeah, I mean, there's probably... there's an app for, like... you know, there's the equivalent of Airbnb for cars. There's an app called Turo, which I've used. I'm sure there's some equivalent for boats, or maybe Turo does boats too, if they do cars.
Anyways, well, stay tuned, listener.
I'm Googling Airbnb for boats.
We'll be boating. We'll be recording an episode from a boat sometime in the next 10 years, in the next decade.
That's an ADSP, the podcast promise.
Yes, it is.
All right.
Episode 71 is over.
Are you ready?
No, I am not ready.
Thanks for listening.
We hope you enjoyed and have a great day.