Advent of Computing - Episode 38 - JOVIAL, the Evolution of Programming

Episode Date: September 6, 2020

The creation of FORTRAN and early compilers set the stage to change computing forever. However, they were just the start of a much longer process. Just like a spoken language, programming languages have morphed and changed over time. Today we are looking at an interesting case of this slow evolution. JOVIAL was developed during the Cold War for use in the US Military, and it's been in constant small-scale use ever since. Its story gives us a wonderful insight into how programming languages change over time, and why some stick around while others die out. Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Transcript
Starting point is 00:00:00 In 2011, London's Heathrow Airport made a major upgrade. Their air traffic control systems had been running off some fairly old IBM mainframes. Now, the setup had worked for decades, but as of late, it had started to become quite a big problem. Over the preceding years, the computer had crashed a number of times. And without the machine running, no flights could safely come in or leave. So why had the airport been so slow to upgrade? Well, that comes down to a software problem. The entire system was running code written all the way back in the 1970s
Starting point is 00:00:36 and it was in an old enough language that it wasn't very easy to get up and running on a new computer. The upgrade would be expensive and it took Heathrow some time to save up for a flashy new air traffic control system. Now, that's a pretty common story. There are a lot of important systems that rely on downright archaic technology. But the specifics of this case, well, those take us down a bit of a rabbit hole. The vintage code that kept Heathrow running for so long was written in a language called Jovial. Now, you've probably never heard of it. I know that I hadn't until very recently, but I'm willing to bet that it's impacted your life in some way. Jovial was designed for military use, specifically to make it easier to program large-scale defense
Starting point is 00:01:22 networks in the United States. Since then, the language has spread into all corners of aviation. Air traffic control is just one example. Planes ranging from the B-52 to the U-2 and even some parts of a Boeing 757 all run jovial. Communication satellites, cruise missiles, and even rocket engines have all been controlled by this niche language. It's been in the background of geopolitics and air travel for well over 50 years. But what exactly is Jovial? And, maybe more importantly, why has it been so long-lived? As it turns out, that second one is a pretty tricky question to answer. There have been countless programming languages developed since the dawn of computers, and of those, most die out pretty quickly. Jovial success is an interesting case
Starting point is 00:02:11 of what makes a programming language stick around. But it also gives us a window into how the larger field of programming first started to form. Welcome back to Advent of Computing. I'm your host, Sean Haas, and this is episode 38, Jovial, the Evolution of Programming. This episode, we're getting back into one of my favorite genres, the more esoteric. But we aren't just going to talk about some obscure corner of computing for no reason. I have a larger goal in mind. A while back, I put together an episode that covered Fortran, the first compilers, and how programming languages were originally developed, and, most importantly, why they were developed in the first place. Now, that's really just the
Starting point is 00:03:02 start of a much larger story. Programming languages are a big part of the world today, even if most people don't come into contact with the source code directly. While languages like Grace Hopper's A0 or John Backus' Fortran were really important steps, they were just the beginning. They weren't the end of the process. I mean, most computers nowadays don't even run Fortran. Just like with human languages, the tongues that we use to talk to computers have changed a whole lot over time. Existing languages evolve, dialects are developed for different uses, and eventually, totally new languages appear. It's a pretty long process, and it's a complicated one,
Starting point is 00:03:42 but eventually it leads us into the modern day. Today, we're going to look at how languages evolved, but to do so, I'm going to focus in on a microcosm of this process. Specifically, we're going to be examining Jovial, a language developed for the US military in the latter years of the 1950s. Like Fortran, it continues to be used in very niche applications today, but unlike Fortran, it was developed after the first handful of programming languages came into use. Jovial was created early enough that nothing was set in stone, but was just late enough to the game that we can see how it adapted other languages for its own use. More importantly, I think that Jovial gives us a wonderful case study of why some technology
Starting point is 00:04:25 dies out while others thrive. So let's get into it. What exactly was Jovial, and why was it made in the first place? How does it fit into the larger story of computers? And why does it still show up in air traffic control systems today? To start things off, we need to travel all the way back to 1958, and the earliest dawn of programming languages. By this point, a handful of high-level languages already existed. Fortran, Lisp, Flomatic, and many others were already well under development.
Starting point is 00:04:57 But there's a difference between invention and adoption. There's some lag time. The fact was that most programmers didn't really care about these new languages on the block. Part of the reason for that comes down to the attitudes around computers in the era. The idea of a programmer didn't really exist. Well, unless you were Grace Hopper, but she's always the exception to every rule. Most folk who actually used computers were mathematicians or engineers, or some other stripe of scientist. Computers were a wonderful tool, and the field was growing quickly,
Starting point is 00:05:33 but it wasn't fully separated as a specialty in its own right. And how computers functioned made it really hard to specialize in just programming. To put things mildly, in the 1950s, computers were a lot more hands-on. You couldn't really get away with just programming them. You had to know the computer inside and out. Hardware and software went hand in hand, so much so that some systems still had to be programmed using jumper cables. So if you wanted to program a computer, you'd better also want to dig around in its hardware. Attitudes about software development mirrored this rough state of hardware. You'd usually end up programming in machine code, that's the actual binary data that
Starting point is 00:06:16 a computer can understand. If you're lucky, you might be able to use assembly language or something slightly more readable to us humans. You needed full control over what the computer was doing to get anything accomplished, and these primitive low-level languages offered that. The flip side was that they were really, really hard to program in. Fortran and its contemporaries offered an alternative. Instead of mucking around in bits and bytes, you can write out a program in something that's roughly more akin to English, and then you run it through a compiler and it's translated into machine code. It was a revolution in the making, but a pretty slow one. There was resistance to this change. Programmers were worried that Fortran wouldn't be able to control a computer as well as something like machine code. And in some cases, these fears were totally correct. Many early programming
Starting point is 00:07:10 languages had problems in this regard, so they offered the ability to write lines of assembly language in the middle of your program. In theory, this meant that a programmer could drop back down to low-level languages when needed, but mainly stay in the realm of Fortran. But in practice, that meant that you're now programming in two languages at once. And I shouldn't need to tell you this, but that's a recipe for disaster. For programming languages to take off, both the technical and the more socio-political issues would have to be corrected for. But there was one last little issue at play. You see, at this point, nearly every computer was unique. Mass-produced computers were first
Starting point is 00:07:52 starting to appear in the late 50s, but the mass in that term is somewhere in the tens of computers. Each of these slightly different computers spoke a different version of machine code, Each of these slightly different computers spoke a different version of machine code, and they had to be used in a slightly different way. So someone working on an IBM 701, for instance, couldn't really sit down and make the shift to programming for ENIAC. High-level languages just complicated the matter. Fortran ran on a few different machines, Lisp on some others,
Starting point is 00:08:24 Flomatic on another subset. In this utter lack of standards, it was possible that some machines might have access to a number of different programming languages, while others could just have none to offer. This patchwork meant that learning any one language, well, that could help you with one computer, maybe two. But outside of its specific platform, it wasn't very useful. You could spend the time learning, or you could just not. And a lot of people chose not to bother. These all formed big barriers to the spread of programming languages.
Starting point is 00:08:58 While adoption may have been a little slow, those who came over to the brighter side of programming became ardent supporters. To anyone who took the time to actually learn one of these early languages, the benefits were immediately clear. Fortran may pull you further away from the machine, but that opens up a lot of opportunities. But even early supporters knew that there were problems with the state of programming languages. Adoption would only be possible if at least some of these issues could be addressed. One of the first concerted efforts on this front was a project that went by a few different names. The International Algebraic Language, IIL,
Starting point is 00:09:36 the Algorithmic Language, or most commonly ALGOL. Proposals for the language started to circulate in the mid-1950s, with a formal description being published in 1958. Now, it broke from the established blueprint in quite a few ways. But most importantly, it was the first attempt to create a standardized and universal programming language. By creating a single unified language, the entire field could be revolutionized. unified language, the entire field could be revolutionized. If every computer spoke the same high-level programming language, then there'd be a much larger incentive to learn that language. And if every programmer spoke the same international language, then a more robust community could be formed. Maybe unintentionally, this would also go a long way towards creating
Starting point is 00:10:21 the modern idea of a programmer. If you didn't have to worry about the hardware as much, you could just program. You could specialize. You wouldn't have to be a hardware wizard on the side. So, who was behind Algol? Perhaps fittingly, Algol doesn't have any one creator. Instead, it was developed by committee. Like I mentioned, the idea of something like Algol had been brewing for a few years in computing circles. And while ink was being spilled for a number of years, something as big as a universal programming language, well, that would take a lot more coordinated effort to produce. The process would get going at the end of May in 1958, at the Zurich Joint Conference of ACM and
Starting point is 00:11:03 GAMM, where computer users from around the world gathered to tackle their collective problem. The first meeting only had eight members, but it was the start of something a whole lot bigger. After about a week of work, attendees at the Zurich conference were actually able to put forward the first draft of the ALGOL programming language. Now, this wasn't built from scratch exactly. Each of the contributing members put their own expertise and preliminary work into this draft. What the team came up with really was an international language. It was a collaboration between some of the best minds in the field, gathered from the US and Europe.
Starting point is 00:11:42 The earliest documentation that we have detailing the language was written by John Backus, a conference attendee and the creator of Fortran. Backus opened the paper with this, quote, The proposed language is intended to provide convenient and concise means for expressing virtually all procedures of numerical computation, while employing relatively few syntactical rules and statement types, end quote. Now, from the beginning, we can already see that ALGOL is going to be a
Starting point is 00:12:13 different type of programming language. Just the language used right there should give us a bit of a clue. Up to this point, programming languages have been developed as a tool, built up for getting work done on a computer. Take Fortran, for instance. Bacchus' original intent in creating Fortran was to offer an alternative to assembly language that was easier to use. But a lot of the final design was dictated by hardware considerations. A lot of time was spent making sure that Fortran could compile fast code in as little time as possible. In other words, there were real-world factors that shaped the language considerably. ALGOL set out to take a different route.
Starting point is 00:12:54 Its design was very abstracted away from the hardware. It took a much more academic approach, and that can be seen even down to the language used to talk about Algol. The way the committee saw it, Algol the language came first. The hardware side of things was unimportant, at least for the time being. And as such, the first paper describing Algol is a lot closer to an academic treatise than an actual reference manual. Going further into Bax's paper gives us the more meat and potato details, the formal spec for Algol. Now, there are actually a few different dialects of the language. Most commonly, they are referenced by the year in which they were first published. This earliest version breaks the convention a little. At this point, the US and European
Starting point is 00:13:44 contributors had slightly different ideas on what to call the new language. In Europe, the first draft was called ALGOL 58, since, well, it was the algorithmic language formalized in 1958. Stateside, most contributors called it IAL, or the International Arithmetic Language. The European name would win out, so in general, I'm going to try to stick to using that naming convention, but just bear in mind that some sources will bring it up and call it IAL. Anyway, the language outline that Backus gives keeps up this air of formality. But all in all, it's not really that big of a paper. Over just 24 pages, he describes the entirety of the language's syntax and operations. It's succinct, but it's also pretty dense. The ALGOL committee
Starting point is 00:14:33 was trying to create a whole new language at a time when programming languages were just starting to really show up. So they had a whole lot of factors that would need to be addressed in 24 pages. The paper even goes as far as describing what a programming language is and how a language should act in general. Quoting again from Backus, an IAL program is a sequence of statements which may be interspersed with certain declarations. Each statement describes a rule of computation and explicitly or implicitly specifies a successor statement. End quote. That may sound a little bit alien to most listeners, but it's actually a pretty complete theoretical framework that describes a programming language. You got lines of code.
Starting point is 00:15:22 Each line does something or declares something else. And from each line, you can figure out which line comes next. Also, let's just hammer home that all the work around ALGOL is written much more like an academic paper than anything else. I mean, just the language in that one quote should cue you in. It all sounds like a string of math theorems just applied to programming. From here, you can see the early birth of a new type of discipline and a new kind of thinking about programming. But what does the language actually look like? Well, that part gets a little bit strange. It's a mix of older and newer syntax. ALGOL looks like a more archaic language, but it has a smattering of modern touches that programmers today can still recognize.
Starting point is 00:16:11 The core of the language is, unsurprisingly, mathematics. Computers were really just very, very expensive calculators at this point, so math had to be a strong suit. In this regard, it takes a similar approach as its contemporaries. The expected operations are all present and accounted for. The 1958 paper even touts how easy it is to convert pen and paper mathematical expressions into lines of Algol code. Beyond the mundane, we get into variables. And this is where Algol starts to depart from similar languages. Variables are one of the most important parts of a program.
Starting point is 00:16:51 In short, they act like a tiny chunk of storage where a user can keep data. In high-level languages, you have named variables. Say, a number called x. When you assign a value to that variable, you're actually storing a number somewhere in the computer's memory. But a programmer can just say x equals 5. It's pretty simple, but deceptively so. The details of how a language deals with variables are actually really important, and it dictates how you can work with data. Getting a little bit more technical for a minute, dictates how you can work with data. Getting a little bit more technical for a minute,
Starting point is 00:17:30 ALGOL is a strongly typed language. That means that every variable has to be declared with some data type, and once declared, it can only work with that one type of data. Basically, by declaring a variable type, you're telling ALGOL how to treat the data stored in that variable. The variable system in ALGOL is roughly similar to FORTRAN. You have integer variables for whole numbers, and you have floating point variables for decimal numbers. But ALGOL takes it a little bit further. Boolean data types make an appearance in the 58 report. Now, this might be splitting hairs, but bear with me here for a second and I'll explain why this matters. Boolean variables are used to represent logical values, either true or false.
Starting point is 00:18:12 These types of variables are used to store the results of logical operations, like comparing two numbers or maybe comparing some strings. But here's where it gets interesting. That is, if we can call data typing that interesting. It's pretty easy to get a computer to work with numbers. Everything about computer hardware is built to handle integers. Floats are a little more complicated, but it's still in the realm of numbers, so pretty easy to work out.
Starting point is 00:18:39 But a boolean, well, that's a little bit different. You can't just store true or false into some memory location. Boolean types are one of the ways that we can see ALGOL starting to get away from the hardware more and more. And that's just the start. ALGOL is full of these abstractions to keep a programmer safe from the underlying machine. The ideology of a programming language being separate from a computer was a very new take at this point, and ALGOL would take that further than any other language did before. It did more than just make programming easier, it made programming much more flexible.
Starting point is 00:19:17 One of the places that we can see this again is how ALGOL handles flow control. The formal description that I quoted earlier says that each line of a program must either explicitly or implicitly point to the next line. Implicit is the most common case. In normal operation, a computer would run line 1 and then line 2, then line 3, and then keep going until all the code was gone. But that's also the boring case. ALGOL offers a few different ways to change that up, and in other words, explicitly tell the computer what needs to get run next. You have if statements, loops, and functions, which all work more or less the same as in other languages. But then you have the switch, which is another weird way that ALGOL separates itself from computers.
Starting point is 00:20:08 Switch statements still exist in a lot of modern languages today, but they bear little resemblance to ALGOL's original formulation. In ALGOL, a switch is a way to define a list of subroutines. It's basically an array where each element is a chunk of code. You can then run each of those chunks of code just by referencing its location within the list. This is the same kind of abstraction that we saw with Boolean data types, just kicked up to the next level. Altogether, this shows how different ALGOL was from other languages in use at the time. The committee that described the language was trying to make a universal language,
Starting point is 00:20:49 one that could describe any problem, not something tied down to a computer. And in that goal, well, they succeeded. While ALGOL 58 didn't look all that modern, its feature set really was. The ideas presented by this new language would change how programming was done, and it would help change how programming was viewed. Now's the part where you'd expect that we'd go over how ALGOL grew and evolved over time, how it spread and exacted change throughout the world. But that phase never really happened for Algol. The language would go through a series of updates in the 1960s and onward.
Starting point is 00:21:35 And while it did see use, it never became the universal language that its creators envisioned. There were many factors that prevented Algol from taking the world by storm. And not all of them were the fault of the language itself. In the late 50s and early 60s, if you had a mainframe, chances are it was branded with the letters IBM. Fortran was Big Blue's own in-house programming language at this point, which made the language very accessible to anyone who owned IBM hardware. By 1960, Fortran 3 had already been released, which addressed enough issues with
Starting point is 00:22:06 the language that Fortran was starting to become a better option for more programmers. ALGOL was definitely the technically superior language, but it didn't have backing from a huge company like IBM. ALGOL compilers would find their way to IBM hardware, but by that time, Fortran had already become an entrenched player. Another huge factor was that ALGOL may have just been too ambitious. The language had lofty ideals, and its design made a lot of those very attainable. The committee behind ALGOL was building language from a totally new standpoint, ALGOL was building language from a totally new standpoint. And while that made for a revolutionary approach to programming, it didn't really lend itself to popularity. That, and for all its new and innovative features, early versions of ALGOL lacked one very important thing. Those are input
Starting point is 00:22:59 output routines. If you followed one of ALGOL's early specifications, then there was no way for you to directly handle input or output. That's right, it was so far abstracted from the computer that Algol didn't even specify a way to interact with a screen, a keyboard, or disks. Now, this sounds like a really big omission, and it definitely was, but it wasn't done without reason. The committee behind ALGOL was only responsible for designing the language, so their work theoretically ended once the language specification was written. The rest was left as an exercise for the reader, so to speak. So the ALGOL committee didn't have to bother with
Starting point is 00:23:45 writing a practical compiler for the language. This freed them up to create a high-minded and abstracted language. But it also caused its own problems. Someone would still have to write a compiler for ALGOL. That's the software needed to turn it from source code into the binary data a computer can actually read. As far as the ALGOL committee was concerned, input-output could be handled by whoever took up that challenge. This further slowed adoption of ALGOL. The first compiler was written for the Z22 mainframe in Germany sometime in late 1958. That would be the only platform capable of using ALGOL until the early 1960s. Even once more compilers hit the scene, input-output routines still proved to be spotty
Starting point is 00:24:32 at best. The fact of the matter was that different computers handled data in slightly different ways, so I.O. had to be tailored to fit each machine. Some guidance and standards from the ALGOL committee could have really helped a whole lot, but none came for a number of years. So by the mid-1960s, you had a universal programming language, but each computer it ran on spoke a slightly different version of it. In that way, one of ALGOL's main goals was thwarted. Algol itself never hit the mainstream, but its ideas would spread a lot further than the language itself. Papers like Backus' 1958 language description circulated around universities, think tanks, and research labs. Ideas and features from those writings helped shape the coming generation of programming languages. and features from those writings helped shape the coming generation of programming languages.
Starting point is 00:25:31 Despite not sticking to landing exactly, ALGOL got a whole lot of the theory exactly right. Today, higher-profile languages like C, Java, Perl, and even Python can trace their lineages back to ALGOL. But those languages all came much later, and they'd filter Algol's influence through the lens of time and much better technology. That being said, there were earlier successors to Algol. And this brings us to the interesting story of Jovial, a strange little language that is still in use today, at least if you know exactly where to look. Jovial stands for Joule's own version of the International Algebraic Language. To get a clear picture of what we're dealing with, we must first start by looking at the eponymous Joules and what led him to create his own version of algol. Joule Schwartz probably wouldn't have called himself a computer scientist, at least not in 1958. But by the time the first
Starting point is 00:26:23 mumblings of a universal language were brewing, Schwartz was already very familiar with computers. In the early years of the 1940s, he earned a bachelor's degree in mathematics from Rutgers University, before being pulled out of academics and into World War II, and eventually the Korean War. Once he'd become familiar with the military-industrial complex, he'd actually stick with it for the rest of his life. Once back in the United States, he'd slip into studies at Columbia University briefly, but soon dropped his courses to go work at Rand Corp. Over the years, Schwartz would bounce around from lab to lab, but in general, he stayed within the influence of Rand. Now, Rand is a bit of an interesting beast to say
Starting point is 00:27:07 the least. We've talked about the company before on the show. They were instrumental in the early development of ARPANET. Strictly speaking, it's a non-profit think tank and research center. But that descriptor doesn't really tell the full story. RAND has very close ties to the US military and government. Most of its funding comes from the US government directly, and most of its contract work is for the US government or military. Their field of study is pretty wide. It ranges from socioeconomic problems to aerospace research. And in the 50s, one of the big areas of interest for the company was computing. Schwartz already had some exposure to computer from his time at Columbia. That, and his mathematics degree, made him a perfect fit to work in the fledgling field of computing. While at Rand,
Starting point is 00:27:56 he programmed extensively for Joniac, an in-house vacuum tube computer. Working with this computer, Jules got a better taste for the state of programming from a more practical side of the science. But Schwartz wasn't restricted to Janiak for long. In 1955, he moved from RAND to MIT's Lincoln Lab and became part of an ambitious project called the Semi-Automatic Ground Environment, or SAGE. If we were to say that his time at RAND gave him a firm grounding in real-world programming, then working on SAGE would solidify that with practical in-field experience. SAGE was an early air defense network built up for the U.S. Air Force, and for the time, it was cutting-edge technology.
Starting point is 00:28:43 To make a very complex topic short, SAGE was one of the first large-scale computer networks in the world, full stop. It connected over 20 mainframe sites scattered throughout the United States, and many, many more radar dishes. Each node, called a direction center, monitored its region of airspace for anything that showed up on radar. It would then stitch together multiple radar inputs to create a unified version of its chunk of US airspace. Any planes or missiles were tracked, and eventually they'd be handed off
Starting point is 00:29:16 to one mainframe or another as the object moved. In this way, SAGE was able to track and predict the movements of anything above U.S. soil. It was a huge and very complicated system. But it wasn't just big. SAGE had to manage a whole lot of different kinds of hardware all at once. The core of the operation was driven by a network of IBM ANFSQ-7 mainframes. These are computers that are built specifically for use in the project. There were also multiple radar display terminals at each installation, projection rooms for
Starting point is 00:29:52 displaying larger datasets, radar dishes, and eventually down the line, other remote mainframes. The fine details aren't super important for our purposes today, but here's what is. Sage was a very practical system. It was complicated, it was messy, and it was all about good input-output. We aren't dealing with an academic design. Sage had to be firmly grounded in the real world, and more than anything, it had to work. Now, all that flashy hardware would only go so far. The biggest challenge in getting Sage up and running was developing the software to run the network. This is where Schwartz comes in. Programming was still in its early ages. Sage did start in the 1950s, after all.
Starting point is 00:30:39 Not only was there a lack of tools in place to make programming easier, but there was just a lack of programmers. As Jules would later recall, quote, it was an environment that was interesting in the sense that at the time, of course, there were not many programmers, and we probably knew almost every programmer around the country until that project started. Then Sage hired, within a year or two, about a thousand or two thousand people. End quote. People like Schwartz made up a backbone of experienced programmers. But there just weren't enough of them. The solution was to train up a core of new programmers and then just throw them all en masse at the problem.
Starting point is 00:31:21 Eventually, that would do something. But even after an infusion of thousands of new recruits, Sage would still take years upon years to complete. This wasn't just due to the scale of the project, there was another reason at play. Sage was programmed mainly in machine code and assembly language. You're all probably sick of me saying this, but those languages are really hard to learn, they're really hard to use, and they take a lot of code to get anything done. Sage would have some huge impacts on the military-industrial complex and computing at large. But there's other results of Sage that I think are just as important. It was crafting a generation of new programmers,
Starting point is 00:32:01 one that had a different approach to computing than their more academic contemporaries. Schwartz is a prime example of this type of software nut. He didn't work in academia, or in the private sector. Instead, Jules was somewhere in between, spending his days in the gray area of the public-private partnership. He wasn't working on a product, or some high-minded academic treatise. He was designing and programming very practical systems that had to work, and they had to work very well. And this practical in-field experience at Sage would serve him very well in the years to come. Schwartz left Sage behind in 1957 and went back to working at RAND. More accurately, he was now working for the Systems Development
Starting point is 00:32:46 Corporation, but that was just a more focused company that was spun off from RAND. But even though he was no longer part of SAGE, the project would still linger in the air. SAGE was really the first step in a long process of digitizing America's defense systems, and as the Cold War continued, more large-scale projects would start to emerge. Experienced programmers like Schwartz were in high demand, so it wasn't long before he was working on one of these next-generation secret projects. In 1958, the Strategic Air Command Control System, or SACS, started. This was touted as something like a super-SAGE. The plan was to use knowledge gained from the development of SAGE, blend it with newer technologies, and create a more sophisticated command and control system. The entire project was to be
Starting point is 00:33:36 overseen by Strategic Air Command, but the software side of things was contracted out to SDC. This must have given Jules flashbacks. SACS was slated to have even more features and complications than SAGE. The network was planned to be larger, handle more types of communications, and even be hardened to nuclear attacks. In other words, Schwartz was facing down a repeat of all the problems that he experienced at SAGE. If SACS was going to be completed in a reasonable amount of time, then a radical solution would have to be figured out.
Starting point is 00:34:14 In this regard, timing was everything. Just as Schwartz and his co-workers were trying to find a way to make SACS feasible, a little paper was published. Of course, I'm talking about Backus' 1958 paper that laid out the first description of IAL, also known as ALGOL. The paper was circulated very widely among programmers, and eventually it would find its way to Schwartz's desk at SDC. He recognized right away that a language like IAL could be a solution, but it would need some changes. Quoting from Schwartz yet again, quote, based on the experience with Sage and the reading of the IAL description, I recommended to the SACS development managers prior to my transfer,
Starting point is 00:34:58 the use of a high-level language to program the system. To my and a number of other people's surprise, the recommendation was accepted. End quote. Honestly, it shouldn't have been that surprising. High-level programming languages were just starting to get popular, and with SACS touted as the next step in defense technology, it made sense to pair it with the next step in programming. Using a high-level language would be flashy, but it would also make programming much easier and, hopefully, much faster. So, out of the spirit of practicality, a new language was starting to form. In the latter half of 1958, Schwartz submitted a more formal request for this new programming language project. It was a short document, but it laid down the outline of this future programming language and plans for how it would be developed.
Starting point is 00:35:48 Quoting from Schwartz, The title of it was OVL, our version of the International Algebraic Language. It was a brief description of some of the language concepts which could be implemented in order to develop the SACS program. This paper was accepted as the beginning of the project, and the first work was begun. End quote. Sadly, by Jules' own admission, he threw out the original design doc. But the project would see a lot of changes during development, so maybe it's not that big of a loss. One of the first changes was the name. It seemed that no one really liked calling the language OVL. Early on, the project team suggested that the language be renamed Jules' Own Version of IAL,
Starting point is 00:36:33 and that would stick. But besides the name change, a lot of the rough ideas about Jovial would remain constant. It's just that the details would kind of be ironed out on the fly. The primary goal was to develop Jovial into a tool that could facilitate the development of SACS. Basically making a new tool before you get down to the real work. The rest of the language's goals were heavily informed by this. Jovial had to be easy to learn and easy to work with. It also had to be able to handle large and complicated tasks. From working on Sage, Jules also knew that Jovial would need to be really, really good at logical operations and flow control, while also being able to handle low-level hardware access. Algol fit the bill
Starting point is 00:37:18 for a lot of those needs, so it would serve as the template for the new language. But there would need to be a lot of tweaks and additions made to really get things to fit. These changes to Algol's established formula are where the more interesting bits lie. One of the big differences comes down to core methodology. Jovial had to be designed as a language, but a compiler would also need to be developed. So Jules and his team couldn't just write up a language doc and then be done with it. The compiler had to be just as important as the language itself. And you can clearly see this thinking at play in the design of the compiler. You see, here's where Schwartz did something that I think is really, really clever. From the
Starting point is 00:38:03 beginning, he decided that the Jovial compiler would have to be written in Jovial itself. Now, that should be a little bit mind-bending. Believe it or not, most modern compilers are built this way. For instance, the GNU C compiler is written in C. These types of compilers are called self-hosting because, well, they can compile themselves. As far as I can tell, Jovial was the first language to use this approach. An initial reference compiler was completed in assembly language, and then once there was a way to convert Jovial code into binary, the entire compiler was rewritten in Jovial. Schwartz didn't just do this to flex his team's programming prowess.
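The bootstrap sequence described here can be sketched as a toy model. To be clear, nothing below is real Jovial or its actual toolchain; it's just an illustration of the self-hosting "fixed point," where a compiler, once built, can rebuild itself from its own source:

```python
# Toy sketch of compiler bootstrapping (illustrative only, not real JOVIAL).
# A "compiler" here is just a function from source text to a binary blob.
import hashlib

def compile_with(compiler_name, source):
    # For a correct compiler, the produced binary depends only on the
    # source program, not on which (correct) compiler produced it.
    return hashlib.sha256(source.encode()).digest()

compiler_source = "BEGIN ''the JOVIAL compiler, written in JOVIAL'' END"

# Stage 0: a hand-written assembly compiler gets us off the ground,
# producing the first binary of the self-hosted compiler.
stage1_binary = compile_with("assembly bootstrap", compiler_source)

# Stage 1: the freshly built self-hosted compiler recompiles its own source.
stage2_binary = compile_with("self-hosted jovial", compiler_source)

# Fixed point: once self-hosting, rebuilding the compiler with itself
# reproduces the same binary. Only the tiny stage-0 bootstrap is
# machine-specific; everything after it ports for free.
assert stage1_binary == stage2_binary
```

That final assertion is the payoff: after the one-time bootstrap, any improvement to the compiler is written in the language itself and carried to every machine it already runs on.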
Starting point is 00:38:47 There were actually a few really good reasons for this departure. For one, it forced Jovial to be a true general-purpose programming language. Compilers are, perhaps unsurprisingly, some of the most complicated programs out there, so having self-hosting as a goal served as a really good guidepost for the language. If you can write a compiler in Jovial, then you can write just about anything. The other big reason was portability. Self-hosting means that there's almost no machine-specific code in the entire toolchain for the language. To bring Jovial to a new computer, an initial compiler would need
Starting point is 00:39:25 to be written in some version of that computer's programming language. But then the magic happens. Using the first compiler, you bootstrap into the self-hosting Jovial written compiler. Then any update to the compiler is very easy to port over with no new code. This type of planning and development was only possible because the Jovial team had to build the language and the compiler together. It's very in line with the same thinking that went into Algol, abstracting your code and your programmer away from hardware. But it's an approach that the Algol team wasn't able to take. When we get to the actual language itself, we can also see more of ALGOL's big impacts. The general structure of Jovial is almost identical to its inspiration. It has all the same core functions as ALGOL, right down to
Starting point is 00:40:16 if statements, loops, switches, and all the same math operations. And in general, the two languages use very similar syntax, which means just looking at code side by side, they look really similar. Pulling from Backus' language definition saved a whole lot of work for the team at SDC, but it was just a starting point for the new language. One easy place to spot the difference comes down to our good ol' friend variable typing. Now, a network defense grid like SACS has to process a whole lot of data. But more than that, it has to do it efficiently. This type of data handling is central to Jovial, right down to the types of variables that it offers. All the data types from Algol
Starting point is 00:40:58 are present, but the list was expanded quite a bit. One more niche example is Jovial's native data type for handling coordinate or vector data. They're called duals, and basically it's a specialized array that always has two elements. Duals are a really good example of how Jovial is tailor-made for real-world problems. XY coordinates are used constantly for geospatial data, so it's not surprising that they're a core element of Jovial. On a technical level, that's nothing groundbreaking. It's just an extension to what Algol offered, and you could do roughly the same thing with any other language of the time. But looking a little further gets us into more interesting and newer territory. So-called status variables are a more surprising inclusion. In more modern languages, a status variable would be called
Starting point is 00:41:52 an enumerated variable or in C just an enum. They allow a programmer to define a list of possible states for a variable and then reference those states by name. It's a convenience thing, but there's a lot of little intricacies to it that I think are really fascinating. For instance, if you're programming a missile tracker, you may have a status variable that can be set to either ready, launching, flying, or eventually exploding. As with booleans, you can't directly store any of these states in memory. You can't really tell a computer, oh yeah, I have this variable, set it to explosion. The compiler has to know how to handle that.
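A status variable maps almost directly onto what modern languages call an enumeration. Here's a quick sketch in Python, reusing the hypothetical missile-tracker states from the example above:

```python
from enum import Enum, auto

# The hypothetical missile-tracker states from above, as a modern enum.
# The programmer works with names; the machine stores numbers.
class MissileStatus(Enum):
    READY = auto()
    LAUNCHING = auto()
    FLYING = auto()
    EXPLODING = auto()

state = MissileStatus.FLYING
print(state.name)   # the name a human reads: FLYING
print(state.value)  # the number the machine actually stores: 3
```

The compiler (or, here, the interpreter) handles the translation both ways, which is exactly the convenience Jovial's status variables offered.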
Starting point is 00:42:34 Eventually each status gets mixed into some numbers for the computer to shift around, but the actual state names are much easier for a human to understand. It's an abstraction that makes life a whole lot easier for the programmer. And as far as I can tell, Jovial was the first language to add this feature, another one of many firsts to come out of this relatively obscure language. But there's one more data type that I want to get into that to me is mind-blowing. What Jovial calls tables and entries are another place where we can see this type of really high-level abstraction at work. Now, this is where the language takes a really modern twist. The reference manuals describe tables as a, quote, collection of data objects.
Starting point is 00:43:27 That's right. Jovial had objects before 1960. That's a really big deal. Objects are a complicated topic in themselves, but to simplify things considerably, think of an object as an abstract way to describe data. You define an object once by telling the computer what kind of data it holds. Then you can create a new instance of that object and fill it with new data as needed. Objects make handling complex datasets easy and pretty fail-safe. Generally speaking, object-oriented programming is seen as a really modern thing. It first gained steam in the 60s and 70s. But here we have Jovial doing a very, very similar thing in the late 1950s. To be totally fair, entries in Jovial aren't 100% what we would call objects today, but they're starting to get just the edges of the idea correct. Each entry has its own internal items that store data,
Starting point is 00:44:20 and you only define that entry once. You can create new instances of that entry by expanding the table that holds it. In practice, it can look and act a lot like a very primitive object. But the reasons Jovial has these entries are a lot different than the reasons objects became popular. Jules designed tables and entries as a way to handle data streams more nicely. If you have some incoming chunk of raw binary data, you can tell Jovial to treat it as a set of entries of some given format. Then you can treat it like a set of variables. It's simple, it's fast, it's effective, and it's a really good way for programmers to hand wave away complicated data. As long as you know, at least roughly, what the input or output data
Starting point is 00:45:06 looks like, you can just convert it into some nicely named variables. And really, I think that entirely sums up the spirit of Jovial. It's a high-level language that borrows a lot of the best ideas from Algol, but it also compromises and has low-level features where needed. Sure, you can stay up in object land, but sometimes you need to look at raw data. And while Schwartz and his team made ways for programmers to stay safe from the hardware, they also let them get down and dirty when needed. It's practical without being overly pretentious. And when the high-level language wasn't enough,
Starting point is 00:45:45 Jovial even had the ability to write inline assembly language. It really seems like Jovial was built so programmers had no reason not to use it. By the end of April 1959, most of the language design was completed, and Schwartz published the first full documentation of the project. The first prototype compilers were working by the beginning of 1960, and the self-hosted compiler was running by the winter of the same year. With a working compiler in hand, it was time to get down to the actual task. If we can remember that far back, it was time to actually program SACS. The process of creating a new language before doing the actual programming ended up being exactly the right decision.
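Before moving on, the tables-and-entries idea from a moment ago, declaring the layout of a raw binary record once and then reading a data stream as named fields, has a direct modern analogue. Here's a rough sketch in Python using the standard struct module; the track-record layout is invented purely for illustration:

```python
# Rough modern analogue of JOVIAL's tables and entries: describe a
# fixed-format binary record once, then treat a raw byte stream as a
# table of entries with named fields. (Layout is hypothetical.)
import struct
from collections import namedtuple

# Imaginary radar track record: 16-bit id, two signed 16-bit coordinates.
Entry = namedtuple("TrackEntry", "id x y")
LAYOUT = struct.Struct("<Hhh")  # little-endian: ushort, short, short

def entries(raw):
    """Treat a chunk of raw bytes as a table of fixed-format entries."""
    for fields in LAYOUT.iter_unpack(raw):
        yield Entry(*fields)

# Simulate an incoming binary data stream of two records.
raw = struct.pack("<Hhh", 1, 120, -45) + struct.pack("<Hhh", 2, -7, 300)
table = list(entries(raw))
print(table[0].x, table[1].y)  # → 120 300
```

Once the layout is declared, the programmer never touches raw bytes again, which is precisely the hand-waving-away of complicated data described above.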
Starting point is 00:46:25 In practice, Jovial was relatively easy for programmers to pick up, at least after a little bit of work. You see, the team had developed a respectable language, and a few different compilers even, but documentation would take a little bit to catch up with this state of the art. Quoting Schwartz, quote, The initial capability didn't provide much tutorial assistance, so the learning of the language was not easy and the documentation, although it was improved with time, wasn't the best for beginners. But those who were responsible for training and criticized this lack of good teaching aids also pointed out that once people got over the hurdle of learning Jovial, there were very few questions and very few problems
Starting point is 00:47:10 in using what they learned. Most people found the language relatively easy to retain and use for everyday work. End quote. I've been digging through a lot of the reference docs for Jovial, and it's true what they say, they are pretty dense. The most official versions span multiple volumes, each volume being somewhere around 300 pages long. It's not the most useful teaching material, that's for sure. But I think the SDC team can be forgiven a little bit in that regard. There weren't any languages at this point that were beginner-friendly. I mean, computers in the 50s and 60s weren't really beginner-friendly to start with. That being said, Jovial is a good deal more user-friendly than contemporary languages. And as Schwartz pointed out, once trained up, programmers found the language easy to use day-to-day. And really, that was the point.
Starting point is 00:48:02 All the initial programming for SACS was completed pretty soon after Jovial hit the scene. I've seen it estimated that the entire system had well over a million lines of code, and took a combined 1400 programmer years of work to complete. Now, that's a whole lot of code no matter how you cut it, but in context, it becomes impressive. Sadly, all these systems are pretty military, so we don't have access to all the gory details. We just have reported estimates. According to the book The Closed World by Paul Edwards,
Starting point is 00:48:35 the codebase for SACS was roughly four times larger than Sage. But it was completed in half the time. That is the power of a really good programming language. It lets programmers tackle more complex problems more easily. Over the early 1960s, SACS expanded out, and Jovial snuck along for the ride. It was during this decade that Jovial was further codified and eventually became the standard language of the U.S. Air Force. And yeah, you heard right. The Air Force actually has a preferred programming language,
Starting point is 00:49:09 and for decades, it was Jovial. The language also saw some limited use in the Navy. The Federal Aviation Administration, or FAA, also took to Jovial pretty early on. In practice, this meant that contractors and vendors that worked with the Air Force, FAA, or some DoD projects were forced to program their systems in Jovial. So companies like Lockheed, Boeing, McDonnell Douglas, and a lot of others were kind of pushed into the new language. For decades and decades, Jovial was a federal mandate, and that really helped entrench the language in a lot of strange places. But it wasn't chosen as a de facto standard without any reason. Jovial offered a feature set that no other language really had at that time. In a strange way, Jovial started out as
Starting point is 00:49:57 a really forward-thinking language. Schwartz's focus on portability made it easy to push Jovial over to any new computer, and its more low-level features made it really good at hardware control. Mainframes were just the first application. As computers got smaller, a new class of systems would appear, the microcontroller. These are small, cheap, and relatively self-contained systems that you can tack onto just about anything. The idea is that for a low cost, you can computerize any electronic device. For the military, that means something like a missile could now have an onboard computer, or a jet engine could have its own independent control system.
Starting point is 00:50:37 The actual code that a microcontroller runs is very concerned with low-level hardware control, but oftentimes, you don't want to use something as low-level as assembly language. It's just not something you often want to program in. Jovial fits perfectly into this use case. So as microcontrollers became more widely adopted by the Air Force and its contractors, so did this early language. At its core, Jovial is based off the high-minded ideals of Algol, but it's been reshaped into something more practical. And that's been its major path to success. It may not be very mainstream, but Jovial represents a language that adapted earlier work to suit a very unique environment.
Starting point is 00:51:23 All right, it's time to wrap this episode up. This one has been a bit of a journey, and hopefully not overly technical. Programming is really near and dear to me, so when I get the chance to talk about it, I tend to ramble on for a while. And the early roots of the discipline give us a lot of insight into modern languages and practice. ALGOL is a great example of a trend that we still see at work today. That's the drive towards standardization. Fractured and divided technology, well, turns out that's always existed. And attempts to clean things up always follow.
Starting point is 00:52:00 Algol was probably too ambitious for its own good. But its creators tapped into something really big. The idea of a standard language, independent of computer, that kept programmers isolated from the messy hardware world? Well, that's a really big deal. In the years to come, that core concept would reshape how computers were used by programmers. We still feel its effects today. But ultimately, ALGOL wouldn't bring about that change directly. The ideas laid out by the language would have to be filtered through its successors. Maybe ALGOL was just a little bit too strong of an idea for the time. Jovial was the earliest attempt to reshape ALGOL from theory into practice. It had a very real and very pressing goal,
Starting point is 00:52:47 namely to make programming command and control systems a lot easier. But more abstract features like portability were still retained. It even slid in a glimmer of the future. I'm not going to argue that Jovial was object-oriented, but you can see the shape of an idea forming there. But I think the best way to see the difference is how Schwartz and the SDC team described a programming language. Remember that the first Algol paper laid out a formal description of a language like it was a mathematical
Starting point is 00:53:16 theorem. C.J. Shaw, a programmer at SDC, opened an early paper on Jovial like this. From the very beginning, the effective utilization of the automatic digital computer has been hampered by the problem of man-machine communication. To the uninitiated, communicating with a computer is quite as esoteric as communicating with spirits, and many authorities believe fundamental similarities exist between the two occupations. After all, what is an algorithm but a practical incantation? End quote. Thanks for listening to Advent of Computing. I'll be back in two weeks time with another piece of the story of the computer. And hey, if you like the show,
Starting point is 00:54:00 there are now a few ways you can support it. If you know someone else who's interested in computer history, then why not take a minute to share the show with them? You can also rate and review on Apple Podcasts. And if you want to be a super fan and help support the show, then you can do that through Advent of Computing merch or signing up as a patron on Patreon. Patrons get early access to episodes, polls for the direction of the show, and other assorted perks. You can find links to everything on my website, adventofcomputing.com. If you have any comments or suggestions for a future episode, then go ahead and shoot me a tweet. I'm at Advent of Comp on Twitter.
Starting point is 00:54:34 And as always, have a great rest of your day.
