Advent of Computing - Episode 45 - Keeping Things BASIC
Episode Date: December 14, 2020

BASIC is a strange language. During the early days of home computing it was everywhere you looked; pretty much every microcomputer in the 70s and early 80s ran BASIC. For a time it filled a niche almost perfectly: it was a usable language that anyone could learn. That didn't happen by accident. Today we are looking at the development of BASIC, how two mathematicians started a quest to expose more students to computers, and how their creation got away from them.
Transcript
This is not a drill. This is for real. Stand by for urgent communication from ACT Central.
Drop whatever you're doing and go immediately to your computer terminal.
The urgent message on your micro snaps you to attention. By the time you get there,
the printer has already spewed out an eight-line message. It looks like gibberish, but you know
better. Unlocking the desk drawer, you bring out ACT's latest codebook,
the most recent issue of X-Men comics. To the untrained eye, it looks quite ordinary,
but once the special transparency has been slipped over the next last page,
the lines of a BASIC program leap into view. Sounds like fun, doesn't it? That's the opening
to Space Attack, Micro Adventure number one.
It's a sci-fi adventure book that was written to teach kids how to program.
But don't be tricked by my usual M.O.
Space Attack isn't some isolated and obscure text that I've dug up from the middle of
nowhere.
This was part of a much larger cultural phenomenon in the 70s and 80s.
BASIC, that is, BASIC the programming language, was synonymous
with computing. Pretty much every computer on the market ran BASIC as soon as it turned on.
It was one of the few things that IBM, Apple, RadioShack, and Commodore all agreed on.
But more than just being a popular option on home computers, the language itself was everywhere.
Micro Adventure is just one example.
Similar books in the genre were prevalent. Computing and gaming magazines invariably
came full of BASIC source code, just waiting to be typed in. TV shows taught the language.
The BBC even had a hand in the language's spread into classrooms and the home. Everywhere you turned, there was BASIC.
You could call it a cultural moment, that is, if a moment can last for years on end.
But then, almost as quickly as it appeared, BASIC was just gone.
Welcome back to Advent of Computing.
I'm your host, Sean Haas, and this is episode 45, Keeping Things Basic.
Now, this is one of those episodes that I've been meaning to do for quite some time.
BASIC, aka Beginner's All-Purpose Symbolic Instruction Code, well, it's made these repeated cameos throughout the show.
It's reared its head in my episodes on the Altair 8800, the Oregon Trail, Applesoft, and it's also
just been in the background of a lot of other episodes. I figured that it was finally time to
cover it in depth. But I guess that leads to the first question of many. Why was BASIC so prevalent?
Why does it show up in so many different places?
I think the biggest factor is that BASIC isn't really a programmer's language.
It's not an ALGOL, it's not a C, and it's not LISP.
It wasn't ever intended to be used for huge, complicated projects. A lot of more professional programmers don't even like
BASIC. That's because the language wasn't designed for professional programmers.
BASIC was built to give non-experts a way to use computers. This was back in
the day when using a computer meant you had to know how to program at least a
little. Not everyone needs to develop a high-performance particle simulation, but
a lot more people
do need to have access to just handy and quick calculations.
BASIC has a pretty long history.
The language was around well before home computers hit the scene.
Once laypeople start getting in on computing, BASIC becomes integral to their experience.
Then pretty suddenly, actually, BASIC disappears.
The IBM PC comes with BASIC in ROM, and so do a lot of early PC clones, but by the latter half
of the 1980s, that becomes less and less common. A perfect example is the Macintosh. Released in
1984, it had no BASIC to speak of. The only way to get your programming fix on that platform was
via third-party software. This episode, we're going to be tracing things back to the very start.
I want to examine how BASIC came into being, and then how it filled a very important niche in early
home computing. Finally, we'll close out by trying to understand why BASIC left center stage.
The year was 1955, and Thomas E. Kurtz was on the prowl for a new job. He was mere months away from earning his PhD in mathematics from Princeton, and with that phase of his life nearing an end,
Kurtz wanted to have something new lined up. And as it so happened, timing worked out really well.
Around the time that Kurtz was thinking about a new job, he heard that John Kemeny, a math
professor at Dartmouth, was in the area looking for fresh recruits.
Early the next year, Kurtz packed up his family and moved over to his swanky new research
position at Dartmouth College.
Kemeny and Kurtz ended up being kind of kindred souls, at least in two big ways.
They were both passionate about
education, that should go without saying if you have a position at a college, and perhaps more
importantly for us, they were very early digital devotees. In 1951, Kurtz attended a series of
lectures at UCLA, where he saw a computer for the very first time. That same year, he wrote his first
computer program. This early exposure was enough to convince Kurtz that computers were going to be
the future. Kemeny had a very similar form of conviction, but his came from a different source.
Born in Hungary, Kemeny immigrated to America with his family in 1940. This was done primarily
to escape the Holocaust.
While it was a matter of survival, this move would also lead to some important opportunities for the young Kemeny. In 1943, he entered Princeton's mathematics department,
but before completing his undergrad, he was recruited for the Manhattan Project.
Kemeny spent the next three years working at Los Alamos alongside the likes of John von Neumann.
While there, he was exposed to cutting-edge technology.
And eventually, he too would come face-to-face with early computers.
After the war, Kemeny returned to Princeton to complete his education.
During his grad studies, he got his first chance to actually program a computer for himself.
So, we have two mathematicians that each caught their own tantalizing glimpse of the decades ahead.
But despite the promise of a bright future,
it was plain to see that the road ahead was going to be really long.
Computers of the 1950s were primitive.
This is firmly in the era of vacuum tubes and punch cards.
As both educators and researchers,
Kemeny and Kurtz were some of the first to realize the importance of spreading the digital
message to students. Getting a little more specific, Kurtz described his vision this way,
quote, Dartmouth students are interested mainly in subjects outside the sciences.
Only about 25% major in science or engineering. While science students
will learn computing naturally and well, the non-science group produces most of the decision
makers of business and government. We wonder, how can sensible decisions about computing and
its use be made by persons essentially ignorant of it? This question begged the conclusion that non-science
students should be taught computing. The hard question was not whether, but how.
Remember now that at this point in history, computers were largely experimental. There
were some mass production models, but those weren't all that far removed from government-funded labs.
However, that was enough for the duo from Dartmouth.
They knew what a big deal computers were going to be, not just for the sciences, but for anyone.
To them, a computer wasn't just a number-crunching tool.
It was the start of something bigger, something that would affect everyone.
But the fact stood that in 1955,
there really wasn't a way to introduce students to computers. But in the coming years, that would
change, albeit very slowly. In 1956, the first promising development came down the grapevine.
MIT opened up the kind of auspiciously named New England Regional Computing Center.
It sounds pretty grandiose, but we're still in the 50s.
The center was actually a single room that housed a single IBM 704.
True, it's a top-of-the-line computer for the year, but that was it.
It was one computer in one room somewhere off in MIT.
And with machines being a really hot resource, access was heavily restricted.
MIT had only been able to get a computer thanks to an agreement with IBM.
The machine was heavily subsidized so long as IBM could get some computing time on it.
So yeah, computers were so scarce that IBM had to ask for help.
Access to the coveted IBM 704 worked a little bit like this.
Its time was broken up into shifts.
One day shift was dedicated to MIT's own researchers,
and the night shift was dedicated to IBM projects.
During a third spare shift,
researchers from around New England
could come and use the computer
or rather they could schedule time to use the computer.
In practice, this meant that each college in the region
had someone working as a digital liaison.
For Dartmouth, it was Kurtz.
He was in charge of getting other researchers
excited about computing
and the messy work of interfacing with the center at MIT. Quoting from Kurtz, quote,
So we had the key punches and people would write programs. Then I would carry them down in a steel
box, catch the 6:20 out of White River, get into Boston around 9:30, and take a cab to go to the
MIT campus or something like that.
Submit the cards and they would go in. At the end of the day, I would pick up a bunch of listings
that were the results, which were usually error reports. Then I would cart these back to Dartmouth.
Every two weeks I did this." Now, you could call this remote access, or more accurately, you could call it frustrating.
There was a computer within reach, but without direct access, no one could really get all that
much done. With a two-week turnaround time, debugging could take months. Crucially, that
meant teaching programming in a semester just wasn't possible.
Imagine trying to understand computing when your only experience is an error message that comes back twice a month.
That being said, this was a spark.
It may have been underwhelming, and it was definitely frustrating, but it was something to work with.
Before long, Kemeny was trying to find a way to integrate the New England Computing Center into his classes. The long turnaround wasn't really something he could address. He couldn't
schedule more time on the system because there just wasn't any to be had. Instead, Kemeny focused on an issue
that he could actually address. Programming has always been pretty hard, especially for those outside the sciences. The easiest way to program
the IBM 704 was via assembly language. If that's the best option, then it should be pretty clear that
things are bad. Any aspiring programmer would have to know their way around the mainframe
really well to get anything done. So, Kemeny set to work. His first attempt was called DARSIMCO, the Dartmouth
Simplified Code. It's an awful acronym, but bear with me here. It was a very preliminary step in
the right direction. Essentially, DARSIMCO was a set of assembly language templates for
common operations. The idea was to reduce the amount of code a
student needed to write and hopefully make the code more understandable. There isn't all that
much surviving information about the language because, well, it didn't really catch on and it
was an experiment to begin with. From the sample code we do have, some of it strewn throughout
later papers written by Kurtz and some of
it in contemporary sources, it's pretty clear that the language just didn't go far
enough.
By this, I mean it was still too close to the hardware.
DARSIMCO was still based around instructions.
Each line of code gave a very simple step.
These were steps like storing a number or adding a number to a stored value.
It's essentially how assembly language works.
It makes the language unwieldy unless you know exactly what you're doing.
In other words, there was a very steep learning curve.
Kemeny had identified one of the key problems with teaching students to program, but he hadn't yet reached a workable solution.
If anything, the work around DARSIMCO proved the
current methods just weren't viable. That's a result in itself, but it kind of sucks.
The way forward would have to come from a totally new angle. And luckily, the state of the art in
the late 50s changed really quickly. While Kemeny and Kurtz were struggling with assembly language, something
big was brewing. Tucked away at a Sperry-Rand lab, Grace Hopper had developed the very first
compilers. And in a nearby IBM facility, the first practical application of that technology
was taking shape. By 1957, Fortran compilers and manuals shipped out to every IBM installation in the country.
That included the New England Regional Computing Center.
For the Dartmouth team, this represented a possible new angle of attack.
But there were still some reservations.
Kurtz initially didn't want anything to do with Fortran.
There was this prevailing notion at the time that anything but assembly language had to be slow.
To program in high-level languages like Fortran, you have to give up a little bit of control.
Sure, it's a lot easier to use, but you have to trust that the compiler will turn your source
code into efficient machine code. Kurtz would come around eventually. After hours of trying
to get an assembly language program to work, he decided why not give Fortran a try.
And to his shock, the program was up and running in minutes.
Any remaining resistance didn't really hold up to those kinds of results.
I can only imagine there was a moment where Kurtz ran into Kemeny's office to throw
a Fortran manual at the older mathematician.
The next huge step was getting a computer. In 1959, Dartmouth finally agreed to purchase a machine after considerable coercion from Kemeny. The computer in question,
an LGP-30, was relatively small, but it was a big improvement over train-only access.
So everything should be good, right? Dartmouth finally has a computer, and with
Fortran around, there's an easy way to program the thing. The LGP-30 didn't have a native Fortran
compiler, but there were other high-level languages for the computer at this point.
Kemeny and Kurtz even had an ample supply of grad students to manage the system,
so how could there be a problem? Believe it or not, this wasn't
all that much of a step forward for the team. Over the next few years, they got a whole lot of use
out of their new computer. In 1960, the ALGOL specification was published, and Kurtz and
Kemeny were some of the first to develop a fully-fledged ALGOL compiler. But even with a glut of programming options, there was a
problem. All these new languages were meant for programmers, not for novices. It's kind of obvious,
but I think it really bears stating. Fortran was developed specifically for scientific applications.
It was easier to use, but it was meant to be used for the sciences.
ALGOL was designed as a quote-unquote universal programming language.
To a computer scientist, it looked and sounded fantastic.
It's a very mature, well-thought-out language.
It had a lot of features and design choices that experienced programmers marveled at.
But to a novice, it was just an impenetrable wall of text and symbols. New programming languages were making computers more accessible, but
only to a certain crowd. In 1962, Kemeny took a second crack at creating a new programming language.
With the help of a grad student, Sidney Marshall, he wrote the Dartmouth Oversimplified Programming Experiment, aka DOPE.
This would be the immediate predecessor of BASIC, but to say the languages are similar may be a bit of a stretch on the surface.
Rather, DOPE is an ideological ancestor. It was a proving ground for some of the notions that Kemeny and Kurtz had been forming.
The way the duo saw things, there were a handful of key barriers preventing beginners from
programming. Computers were too hard to use and too hard to access. That all came down to the
systems themselves. Before anything else, users needed a way to get their hands on an actual
computer, and it had to be done
in some human-friendly way. Even once the computer was accessed, the languages themselves
formed an impenetrable wall to the novice. If you'll excuse me for a minute here, the
best way I can put this is a little bit of programming theory. Languages are made up
of syntax and semantics. Syntax is what the language actually
looks like, what kinds of characters and words make up the source code. Semantics is what that
code actually means, in other words, how you can use that code, and generally just how it can be
understood. To program, you need to understand both of these parts of a language. You need to be able to write
valid code and sensical code. Let's take Fortran as an example, since I think it's especially
applicable here. Fortran has this reputation as being particularly confusing, especially the early
versions that Kemeny and Kurtz were using. Let's say you want to write a Fortran program that adds two numbers. Well, you can't just add two numbers and then print the result.
First off, you need to figure out if you're going to be adding floating-point numbers
or integers.
Then, you need to declare some variables with the appropriate data types.
Two for inputs and one for the eventual output.
Ah, but to declare variables of a specific type, you have to follow
naming conventions. If you're in the integer realm, then you need variables to start with I, J, K, L, M,
or N. Once you declare variables, set their values, and then do the math, how are you going to print
it out? You need to pick a print device, choose a format, and turn your resulting numbers into a
string.
And for each line of code, you need to know where to put all the proper commas, colons,
and equal signs to get it past the compiler.
If you're an experienced user, then this kind of work makes sense.
It's just how you program.
But for everyone else, it's nonsense.
Syntax rules especially tend to turn into something like Calvinball pretty quickly.
Oh, you want to add 1 to a floating point number?
Well, you better remember to put a trailing period on that line.
For programmers, it's become a bit of a joke to write
here be dragons around particularly confusing code.
But for someone outside the field, every line might as well contain dragons.
DOPE was the latest attempt to address these issues, and as the name suggests,
Kemeny designed it as a very simplified programming language,
one that diverges considerably from other languages.
Normally, this is where I'd start to say that sources are scarce,
but I ran into some really good luck here.
Thanks to the power of a little searching, a friendly email, and the wonderful archivists
at Dartmouth's library, I was actually able to track down a copy of the original paper
describing the language.
From this one document, it's clear to see how Kemeny's views on programming were starting
to develop.
And we can already see the bones of BASIC taking shape.
In DOPE, each line was a single statement.
This feature alone protected students from sticky assembly language.
That's all in line with contemporary languages, but here's the key difference.
Each line started with a line number.
And the program would be sorted by the line numbers before it was compiled and executed. This
is subtle but it actually makes a huge difference. For one, this gives a
convenient way to handle control flow. Each line had an explicit name to it,
which makes it easy to branch anywhere you want without extra syntax or wasted
labels. The rest of the language was built to be simple and reasonable to understand.
After the line number came an operation, so every line had a very predictable and simple structure.
But the operations were still a little bit terse. Most were only a single character.
Print was just P. J would ask the user for a number. And since operations had to be at the start of each line, math functions were a little annoying.
Adding two numbers and storing them in a variable looked something like plus ABC.
Maybe not the most intuitive way to handle math.
The rigid syntax structure makes things easy to understand, but only if you're familiar with the language. You
can always tell where operations and arguments are, but it can look kind of weird. Math operations
are just one example of the strangeness. Most operations are a single character, and with fixed
argument formats, that can make lines of code hard to read for a newcomer. Loops are another egregious example. The loop operation, for some inexplicable
reason I can't understand, is called Z. It takes arguments for iterator, starting value, and ending
value. The end of the loop is notated by an E for end loop. You just kind of have to know that Z and E need to come in pairs or your program will
break. Variables are another weird halfway solution. In DOPE, everything was represented
as a floating point number, at least somewhere along the line. On input or output, numbers were
converted to a reasonable format. But on the backend, everything was a float. This meant that there
weren't variable types to deal with. A student could just throw in a number and do some math.
But there was one caveat to this. One of the many rules of programming Calvinball.
DOPE had reserved variable names, but these were just used for arrays. E, F, G, and H were each used to denote 16-element
long lists. Variable names could be a single letter followed by up to one number, but you
had to keep in mind which names were present for arrays. Honestly, it's not that much of an
improvement over Fortran, but the combined variable typing really does make things a lot simpler. This simplicity is one of the amazing things about DOPE.
Once you learn the weird syntax and abbreviations, you're over the hump.
And there are only a dozen or so operations to learn,
so you can really learn the language in a few minutes.
Implementation-wise, things are also easy.
As you should know, dear listener, I don't take things in half measure.
As soon as I got my hands on the full language description,
courtesy of the good people over at Dartmouth's Archive, I set to work.
It took all of one afternoon to write a very quick and dirty interpreter.
And as near as I can tell, that makes me the current reigning world expert on DOPE.
And I can tell you that this
is truly a language designed for simplicity. It's easy to write, once you learn it that is,
and the actual interpretation or compilation of the language is relatively simple. The very
simplified and rigid syntax ensures that. DOPE would only be used for a single semester in 1962, but that was the point
of it. It was designed as an experiment. Kemeny's 1962 paper makes that much clear. The entire
language description is written as a lesson plan. The manual describes how to draw flowcharts and
then how to turn those directly into code. There are even handy worksheets to
help in the process. In this sense, DOPE was used to turn students into lab rats, and that went a
long way towards a bigger and better language. Kemeny and Kurtz were starting to see which
features and ideas were actually viable. With the next push, they would come to a full solution.
In 1964, after a substantial NSF grant, Dartmouth acquired a new computer.
This new and much larger machine, a GE-225, would become the focus of Kemeny and Kurtz's efforts.
Combined with some extra hardware and a smattering of teletype terminals, something really extraordinary started to form.
On a trip to MIT, the team had seen a demo of an amazing new technology,
timesharing. Now, we seem to come back to timesharing a lot on this show because,
well, it was a really important step in the development of more modern computing.
To keep things short, it's a technique where, by rapidly switching between multiple programs, a computer can be shared among multiple users. As long as the computer
can switch fast enough, each user just thinks that they're driving solo. In other words,
it lets one computer service many more users. In the earliest days of the 1960s, MIT was
prominently spreading the idea, but it appeared
organically in a few other labs.
And once Dartmouth upgraded to a bigger system, the idea spread to their campus.
For Kemeny and Kurtz, timesharing would allow them to spread around their limited resources.
Perhaps more importantly, it opened up the chance for more students to hop on the computer.
Their ultimate plan was underway,
to develop a time-sharing operating system and a new beginner's programming language.
Kurtz envisioned the system in shockingly modern terms. He put it this way, quote,
A. Students would have free access. B. There would be complete privacy. No one would know what the students were doing.
C. The system would be easy to learn. The computer should contain its own instructions.
D. The system would be designed to save the time of the user, even if it appeared that the computer time was being, quote, wasted.
E. Turnaround time should be sufficiently rapid that students could use the system for homework.
And F. The system would be pleasant and friendly.
Now, the new multi-million dollar mainframe would become a campus resource, just like a library or anything else.
It would be free, easy, and hopefully friendly to use.
It would be built as a safe space for novices to play and learn, and a tool to help them with homework. With the help of a rotating cast of grad students,
Kemeny and Kurtz would hurtle toward this goal very quickly. Dartmouth Time Sharing System,
DTSS, was developed concurrently with a new programming language. That language would be
known as BASIC, the Beginner's All-Purpose Symbolic Instruction Code. The idea being
that DTSS would serve as a framework to allow access to BASIC. And with a better computer,
more advanced software, and a lot of lessons learned, BASIC would shape up to be a really
good programming language.
While DTSS and BASIC were developed together, BASIC actually got off the ground a little bit early.
Instead of waiting for a full timesharing system to be up and running, Kemeny started
out by designing a quick-and-dirty punch-card-based compiler.
On May 1st, 1964, the first BASIC program was compiled and run.
And as DTSS came online, the language became
available for all students at Dartmouth. The early versions of BASIC the language were tied
very closely to BASIC the implementation. By that I mean BASIC wasn't built as some abstract
programming language. It was made as a practical tool paired with DTSS. And that's actually
led to some confusion over the years. You see, BASIC started out as a compiled language.
The team at Dartmouth wrote a fully-fledged compiler to turn BASIC into machine code,
but the interface that users actually worked with kept them safe from all of that.
Kurtz described the full package as
an illusion designed to make students think that the computer spoke a more friendly language.
And I think the ensuing confusion over interpreted versus compiled BASIC is a testament to how well
that illusion worked. So what did this friendly language actually look like? Most noticeably,
BASIC has explicit line numbers. It's sort of like DOPE, but with an important and very small difference. In DOPE,
each line number pretty much had to be sequential. You started at 1 and went up to 99. BASIC,
on the other hand, didn't have those kinds of restrictions. The program started with the lowest number that you put in, then ran the next one and so on.
You can start at 10 and have the next line of code over on line 20 with no issues at all.
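To make that concrete, a tiny program in this early style of BASIC might look something like this. This is a rough sketch, not lifted from any Dartmouth manual, with the line numbers spaced out in tens:

    10 LET A = 1
    20 LET B = 2
    30 LET C = A + B
    40 PRINT C
    50 END

Line 40 prints 3, and because the numbers jump by ten, a forgotten statement can later be slotted in as, say, line 25 without renumbering anything else.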
Dartmouth's mainframe was outfitted with paper feed terminals.
That was the primary input and output method.
That on its own imposed major limitations on how users interacted with
the computer. Text editing was one area that was severely impacted. The only option was to offer
programs that worked one line at a time, and that's not really conducive to editing a file. These
kinds of editors were called line editors, and they were usually controlled by a series of cryptic
keywords and keystrokes. If you've ever used Edlin on DOS, then you should have an idea how
cumbersome this is. In order to program, you'd usually need to know how to use the editor first.
Either that, or just punch everything up on cards and input it directly. Neither of those is very
good for someone who doesn't care that
much about computers. But by adding explicit line numbers and with a little bit of crafty work on
the interface, a lot of the more esoteric commands can just be eliminated. Need to edit line 10? Just
type that number followed by your new line of code and boom, the old line 10 is overwritten.
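As a rough sketch of how that plays out at the terminal, suppose you had typed a line with a typo. You'd just type the same line number again with the corrected code:

    10 PRINT "HELO WORLD"
    10 PRINT "HELLO WORLD"

After the second entry there's only one line 10, the corrected one. Typing LIST would show the new version, with the old line silently replaced.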
The other less technical upside about explicit
line numbers is that it makes you think about what you're writing. You don't have to use
sequential numbers, so for every line you pick a number yourself. At least for me, it helps chunks
of my code stick in my mind. You get to name each line yourself. BASIC also borrowed its overall
syntactic structure from DOPE, at least somewhat. After
each line number comes an operation, but these weren't the cryptic single-letter commands
used by DOPE. For the most part, BASIC keywords are all human-readable. Add in math operations
that look roughly similar to written equations, and you get a robust and easy-to-understand syntax. To sum
two numbers in BASIC, you just write LET A = 1 + 2. That's it. No variable definitions
needed, no finesse to deal with number types. It just works. One little thing about BASIC that
I don't really see addressed in the sources is that the language can be spoken out loud.
Well, without sounding like some kind of sinister incantation, that is. Everything is alphanumeric
plus mathematical operations and quotes. This is a language that you could easily use in a lecture.
A teacher can just read off a line of code and a student can jot it down as a note without
needing to know any of the language's weird rules. Keywords were also kept to a minimum. In all, the first version of BASIC
only has 15 keywords. The aforementioned LET is used for variable assignment. PRINT puts characters
onto the teletype printout. IF-THEN handles conditionals. FOR is used for loops. They're all just words. Coming
from assembly language or even Fortran, this would have almost felt too simple. There's no
weird extra characters to make everything work. I'm not going to list the full set of operations,
you'd be better served by a manual. But I do have to bring up one more. That's the much-maligned goto statement.
Line numbers in BASIC serve a very special purpose. Since each line has its own unique
identifier, you can easily jump to any part of the program. If statements use that feature,
if their conditional is true, then they jump to a given line. Subroutines are also called by line number, but GoTo offers a more
direct way to use line numbers. It's an unconditional jump. As in, GoTo this line.
This is a really simple statement, but I think GoTo itself explains a lot of the thinking that
went into BASIC. A lot of programmers think that GOTO is somewhere on
the spectrum of pure evil. It's not a feature that BASIC invented, it's actually been around
a lot longer. But BASIC became a prominent language that used GOTO. Programmers hate it,
and in some cases they actually believe that the mere use of goto can permanently damage the mind of a software developer.
Why?
Well, because for an expert, there are better ways to accomplish the same task.
A common use for goto is to construct loops.
Pairing it with an if statement lets you execute a chunk of code until some condition is met.
But there are better tools for the job.
Even in BASIC, there is a dedicated for loop.
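To picture the difference, here's a rough sketch of the same count-to-five loop written both ways, first pairing IF with GOTO, then using the dedicated FOR statement:

    10 LET I = 1
    20 PRINT I
    30 LET I = I + 1
    40 IF I > 5 THEN 60
    50 GOTO 20
    60 END

    10 FOR I = 1 TO 5
    20 PRINT I
    30 NEXT I
    40 END

Both print the numbers 1 through 5. The FOR version is tidier, but the GOTO version only asks a beginner to understand one idea: jump back to line 20 until you're done.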
GoTo can also be unpredictable.
If line numbers change, then any GoTo in the program needs to be updated.
And in a large enough program,
it can be hard to spot every instance of this one operation.
A well-trained developer knows to avoid these kinds of,
frankly, dangerous situations. But that's not who BASIC is for. Kemeny and Kurtz crafted the
language specifically for non-trained programmers. At every point, they chose to make the language
simple and easy to understand, even if that came at the cost of elegance or
power. Adding jumps with named labels would have been safer than GoTo, but for a novice,
GoTo is fine. The intent wasn't for people to write massive programs in BASIC. It was a way
to get novices to use a computer. GoTo is just simple, and for a non-expert, it's plain useful. You don't
need to plan ahead, you can just jump. Print is another fantastic example of this approach.
You just say print x and BASIC figures out everything else. The output goes to whatever
teletype terminal you're sitting at, and the variable will be formatted automatically.
If the number is too
long, it's printed in scientific notation. If it looks like an integer, it's printed without
decimal points. You don't need to think about what you're outputting or where to output it.
You just print. And I guess that brings us nicely around to variables. Usually this is where I get
into what kind of variable typing BASIC uses, but you can't really do that with BASIC, at least not early versions.
The initial versions of BASIC don't really have variable typing.
It's not implicit like DOPE, or even early versions of FORTRAN.
And it doesn't use explicit declarations like ALGOL.
The closest I can get to a label is maybe semi-explicit typing?
The first edition of BASIC only uses variables for numbers. That's mainly because it wasn't meant for
anything beyond math problems. To a programmer, you just have variables around, and you can put
any number you want in them. A equals zero works just as well as a equals 0.35.
But you can also turn variables into an array using the dim or dimension statement. Then you
have a variable that has 10 elements. But here's the trick that's going on inside BASIC, or at
least inside the earliest versions. Every variable is stored and treated as a floating
point number. Printing it may produce what looks like an integer, and you may set a variable to
what you think is an integer, but it's all decimal on the inside. That means you can't
get weird typing errors. Those just can't happen in this system. You can't accidentally divide a float by an integer.
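Here's a rough sketch of what that feels like in practice:

    10 LET A = 7
    20 LET B = 2
    30 PRINT A / B
    40 END

Line 30 happily prints 3.5, with no integer division rule to trip over, since both variables are stored as floating point behind the scenes.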
In fact, you can't even have a true integer. For me, a seasoned programmer, if I do say so myself,
this is an utter nightmare. But if you don't care about variable types, or if you don't even know
about variable types, then this is perfect. It saved students from another seemingly arbitrary rule, and in
doing so, it made programming that much more accessible to a wider audience. The other half
of BASIC, or rather what made it all possible, comes down to its implementation on DTSS. The
language spec was really well thought out, and as I hope I've shown, BASIC was designed as the perfect starting language.
But DTSS is where the magic actually came alive.
It's where Kemeny and Kurtz pulled off their biggest illusions.
There was a more general text interface for DTSS, but the real meat and potatoes was its BASIC interface.
Even logging into DTSS feels a lot different than contemporary
systems. While something like MIT's compatible timesharing system was built for programmers and
researchers, DTSS was intended to be used by students. The idea was to integrate it into
lessons and homework, so it had to be approachable. Walk up to a terminal, punch in your student ID number, which
everyone has, and you're greeted with a simple hello. Loading up BASIC is as simple as typing
out BASIC and hitting enter. DTSS then asks if you're working on an old or a new problem,
and from there it asks for a name for that problem. It's these little touches that make
the computer feel more personal. You're not
staring at a bank of switches, you're having a short conversation, and you don't really need to
remember all that much to get up and running. Just from starting up BASIC, we can already see
the smart choices that the crew made. When you run BASIC, you don't need to give it any arguments.
You don't need to pass in a file to run. Heck, you don't even need to know what a file
is. You can just say, yeah, I'm working on a new problem. It's Econ Homework 5. Or maybe I want to
pull up my stats project again. But here's where I personally run into some weird territory.
The DTSS implementation of BASIC is a strange beast. It's a compiler with an interactive front-end. To me, it almost feels
like an early just-in-time compiler. I know that's not a thing yet, but it has the hallmarks.
It's not an interpreter, but you type in commands and can run snippets of code.
The best word I can think of is an environment. It's a playground where a student can build a
program, debug it, and run it. And
it's all designed to maintain this illusion that the hulking machine buried in Dartmouth's basement
is actually just a friendly computer that speaks BASIC. If you've ever used BASIC on a home computer,
then DTSS's environment is instantly familiar. It even starts off with a little ready prompt.
To enter a program,
you type in a line number and your code, rinse and repeat until the entire program is ready
to run. The environment also offers the list command to list your current program. Save
will save your changes, and run will, well, run the program. To the user, it's all simple,
responsive, and pretty interactive. But on the mainframe
side, there are some interesting tricks. Like I keep bringing up, this BASIC was compiled.
When a user typed run, a call goes out to the mainframe. The user's current working
file is compiled and then executed. The other interesting piece here is that the compiled
program is only temporary. It's never actually saved.
Only the BASIC source code is ever saved down to disk for any period of time.
Why go this strange route?
On the surface, it seems inefficient.
Compiling code takes time, but this was actually an integral part of Kemeny and Kurtz's plan.
BASIC was designed in such a way that it only takes one pass to compile. That's just a fancy way of saying that the compiler is really simple.
That's possible thanks to BASIC's syntax structure. Each line is a single command.
This is one of the big influences from DOPE. Besides the math system,
a lot of BASIC's syntax follows the same pattern as DOPE.
After spending a little bit of time building my own interpreter for DOPE,
I can tell you that its simple syntax made my job a lot easier.
So BASIC's syntax didn't just make it easy for novices,
it also let Kemeny and Kurtz get away with some unusual stuff.
All this hard work really paid off in the coming years.
Almost as soon as terminals were installed on campus, students were lining up.
And this is where Kemeny and Kurtz made another really smart choice.
Instead of offering classes in BASIC, they decided to pitch BASIC to teachers.
Admin wouldn't really like adding another class to its catalog, and students would definitely get mad about a new requirement.
But by adding BASIC to existing classes, the language could reach many more people a lot more easily. And it only took a small initial exposure for students to fall in
love with BASIC. Kurtz recalled this, quote,
Students latch onto something like this no problem. They aren't afraid of making
mistakes. We had the teletype paper, it's yellow, it's on big rolls. Yellow paper in those days.
This yellow paper appeared all over campus. I remember at the engineering school, the faculty
member said, quote, the students keep turning in their homework with this yellow paper.
Something's going on here. I'd better figure out and learn what it is, end quote. Yellow paper would show up all over campus for years to come,
and over the latter half of the 60s, BASIC would meet and exceed every one of its goals.
The language went through a series of revisions and upgrades, and DTSS was expanded beyond Dartmouth's campus.
The university offered remote access to their mainframe to outside schools, and the code for DTSS and its BASIC compiler were shared pretty freely.
GE would even have their own install of it.
By 1968, there were over 8,000 active DTSS users, and by far the main use of the system was BASIC.
Kemeny and Kurtz had created the perfect formula, and really found the perfect vehicle to spread their digital gospel.
And it's during this escape from Dartmouth that BASIC takes a sudden turn.
In the book Back to BASIC, Kemeny and Kurtz describe it as the start of a long period
of corruption. This is the period when BASIC transitioned onto microcomputers. The main thesis
of Back to Basic is that as the language moved onto smaller machines, unscrupulous programmers
destroyed everything good about it. According to BASIC's creators, that led to the language's eventual
downfall. But is that true? Did BASIC's widespread adoption actually doom it?
Now, it didn't take long for BASIC to become ubiquitous on mainframes. These environments
were really close to DTSS's version. Similar hardware was being used, at least broadly speaking,
and the language was still compiled. I say broadly speaking because there were exceptions.
But as the 70s rolled along, a new technology hit the market. That's the microprocessor,
perhaps one of the most important technological shifts in the story of computing. This new
technology really opened the floodgates.
Small-scale personal computers were now actually looking possible, and researchers hopped on the
hype train pretty early on. The problem was, how could you practically make use of a microprocessor?
In the early 70s, chips were severely limited, to say the least. So sure, you could build a computer, but the machine could only deal with 8-bit numbers.
It wouldn't be able to address much memory, and perhaps the biggest issue, there was yet
to be any useful hardware built to work with microprocessors.
But that didn't stop progress.
This was a starting point, after all.
In 1974, BASIC would come to
the microprocessor for the first time. A group of computer scientists at the University of Illinois
were the first to take the plunge. Their target of choice was the then-new Intel 8008, quite
literally one of the first commercially available microprocessors. From the beginning, BASIC was a very logical choice as an interface
for a microcomputer. The language was designed to be easy to learn, and it was widely popular
on mainframes of the era. But getting BASIC up and running on a microcomputer wasn't actually
a trivial process. At this point, BASIC compilers were very much mainframe software. Things would have to get scaled down a lot.
The U of I team made the decision to write a BASIC interpreter rather than a compiler.
The rationale for this decision is simple.
A compiler just wasn't an option.
Like I alluded to earlier, this wasn't a totally unprecedented option.
Many of the non-Dartmouth versions of BASIC
had been written as interpreters, but on a microcomputer, they really didn't have a choice.
The reason a compiler worked so well for DTSS was thanks to hardware backing it up.
Dartmouth's GE mainframe was pretty fast for the time, and there was plenty of disk space to store
temporarily compiled code. There were just no
options for mass storage on a microcomputer in 1974, full stop. That, and microprocessors were
slow as molasses compared to their big iron counterparts. Interpreters also tend to be more
simple to develop, especially when you take BASIC's syntax design into account.
So developing a BASIC interpreter is fast, easy, and effective. It's a way to get an
interactive environment up and running on a new platform without having to move any mountains.
The simplicity also meant that a BASIC interpreter didn't take up all that much memory.
University of Illinois' interpreter was smaller than 16
kilobytes. And as more developers took up the task, interpreters would become even more lightweight.
The next big shift for BASIC came in 1975, with the release of Microsoft's Altair BASIC.
With the home computer market taking shape, BASIC became even more enticing as an option.
For one, it was public domain. The spec was widely known and widely shared,
and there weren't big businesses that could clamp down on developers for making their own versions. And the language was actually perfect for totally new computer users.
That was the intent from the beginning, and as home computer companies started to target their systems at novices, BASIC was there as the perfect interface.
The language would drive computers into the home for nearly a decade.
However, this new breed of BASIC had become pretty removed from the language that Kemeny
and Kurtz designed.
The duo even made a name for these derivative languages,
Street BASIC. Now, the swap from compilers to interpreters was just one of the changes.
That alone didn't affect users much. Like I said, there wasn't really another way to get
BASIC onto smaller computers. But over time, manufacturers kind of got pushed into a corner.
BASIC had to pull more and more weight on these home systems.
All the big changes between Dartmouth BASIC and so-called Street BASICs come down to hardware.
By the time Microsoft BASIC is released, interpreters can be shrunk down to around 4 kilobytes of code.
That saved room for the end user, but meant that corners had to be cut.
You can't really just tidy up code and get those kinds of results. You have to remove something.
Many versions dropped support for floating point math. Apple's early Integer BASIC is just one example. Wozniak chose to leave out floating point to save time and space. There were also changes made to allow users'
programs to take up less memory. Most home computer versions of BASIC supported assignments
without the use of the LET statement. Some even allowed multiple statements on a single line.
On the surface, that sounds fine. More flexibility is usually a good thing.
But this broke one of the core parts of BASIC's design.
Highly structured syntax wasn't meant as a restriction alone. It was meant as a way to
keep the language simple and understandable. The final big change isn't a deletion, but
rather an addition of new features. On home micros, BASIC wasn't just a programming environment.
It was your only environment.
A user needed a way to save and load files, play sounds, draw graphics, spin disks, read memory.
They needed to do more than BASIC alone offered.
So companies added in instructions for controlling hardware.
Once again, in general, this is a fine addition. But these kinds of functions
were never in BASIC's spec, so each company just kind of did their own thing. Drawing a circle in
Commodore BASIC was vastly different than in Applesoft BASIC. You may know how to load a file on a TI-99/4A,
but a TRS-80 takes different commands. Even though they all ran BASIC, the language
was diverging. According to Kemeny and Kurtz, it was this divergence that made BASIC lose
popularity. They claimed that as the language strayed further from its original path, it became
a mess. To them, Street BASIC was a corruption of their work. Dartmouth's version of BASIC was the
only way forward. And while there's definitely something to that, I think it's only part
of the reason that BASIC disappeared. True, the drift and change of the language made
BASIC feel inconsistent. Some of its core design faded to the background in favor of
just handy features. But as the 80s dragged on, I think we start to see that
BASIC didn't perfectly fit the home computing niche after all. The set of hardware control
commands are a great example of this. Look no further than the venerable Commodore 64.
Its version of BASIC came with functions for directly reading from and writing to memory locations, PEEK and POKE
respectively. That's not something a novice should ever need to do with a computer. But novices
weren't the only people buying computers. The microcomputer market started to mature.
A lot more consumers expected more out of their computers. BASIC had been a really good option for a long time, but especially once we get into the 80s, it was starting to show its age.
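To give a taste of those hardware-level commands mentioned a moment ago: on a Commodore 64, changing the color of the screen border meant poking a value directly into a memory-mapped register. Something like this rough sketch:

    10 POKE 53280, 0
    20 LET C = PEEK(53280)

Address 53280 is the C64's border color register, and 0 happens to mean black. Nothing about those lines carries over to an Apple II or a TRS-80, and nothing about them feels like the friendly BASIC that Kemeny and Kurtz set out to build.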
The other huge factor was software availability.
BASIC was developed back when you couldn't go out and buy software.
If you wanted to use a computer, you had to know how to program.
Even something as mundane as spreadsheet software
just didn't exist. So sure, you could do your taxes on a home computer, but you'd need to
write up a little BASIC program to get anywhere. And as strange as it may sound, I'm of the firm
opinion that commercial software actually doomed BASIC. Here's just one example. In 1979, VisiCalc, the first spreadsheet program for
microcomputers, hit shelves. A user can now buy a ready-made tool for automating, say, their tax
returns, or for running repetitive calculations. And what's more, it was easier to use than BASIC.
With fewer keystrokes and a smaller learning curve, you could be up and
running numbers. Almost overnight, BASIC's usefulness was starting to wane. This scenario
would play out countless times in the coming years. The release of the IBM PC, tricked out
with DOS, was another nail in BASIC's coffin. The PC did ship with BASIC in ROM, but come on, DOS was where it
was really at. With removable storage now a viable option, BASIC's jerry-rigged file management
didn't cut it anymore. DOS offered a better solution. It was easier to learn. Users could
spend less time trying to bang out lines of BASIC and more time getting stuff done.
Machines that exclusively ran BASIC weren't good enough anymore,
and by the 1984 release of the Macintosh, there's just no looking back.
Computers were powerful enough to do a whole lot more, and users wanted to get more done.
BASIC's niche just was no longer there.
Alright, that brings us to the end of this episode.
John Kemeny and Thomas Kurtz really struck a nerve when they developed BASIC. The general idea of getting more students exposed to computers was brilliant.
And for decades, the Dartmouth duo remained ahead of
their time. Their dedication to bringing computing to a wider audience would lead to a string of
projects culminating in BASIC. And once the language was off the ground, there was no stopping
it. Almost as soon as microprocessors hit the scene, BASIC came along for the ride. The language
was a seemingly perfect fit for early home computers.
BASIC would be a huge force in the industry for pretty much an entire decade.
In fact, it was almost exactly a decade of life, starting with the first microprocessor BASIC
implementation in 1974 and ending at some nebulous point around the release of the Mac.
It held a niche that nothing else could, but over the years issues started to show.
Kemeny and Kurtz called it a corruption, but I think it was more of an evolution.
Computers changed, and BASIC was just behind the times.
Its design was fantastic for beginners, and it was fantastic for an era with few options.
But once home computers grew past their infancy, BASIC wasn't the only show in town, and the niche it once dominated just disappeared.
That being said, BASIC's story doesn't really have a sad ending, at least not how I see it.
The language served its purpose beyond anyone's wildest imagination. Its design
clicked with a lot of people, not just Dartmouth's student body. As more users came into contact with
computing, BASIC was there to give them a friendly introduction. Thanks for listening to Advent of
Computing. I'll be back in two weeks time with another piece of the story of the computer.
And hey, if you like the show,
there are now a few ways you can support it. If you know anyone else who'd like to hear about
computing history, then why not take a minute to share the show with them? You can also rate and
review me on Apple Podcasts. And if you want to be a super fan, then you can now support the show
through advent of computing merch or signing up as a patron on Patreon. Patrons get access to
early episodes, polls for the direction of the show, and bonus content. In fact, right now,
there are two bonus episodes up, so why not sign up, put in a dollar, and get some extra content?
You can find links to everything on my website, adventofcomputing.com. If you have any comments
or suggestions for a future episode, then go ahead
and shoot me a tweet. I'm at adventofcomp on Twitter. And as always, have a great rest of your day.