Advent of Computing - Episode 138 - Type-It-Yourself
Episode Date: September 1, 2024
I'm finally back to my usual programming! This time we are taking one of my patent-pending rambles through a topic. Today's victim: the humble type-in program. Along the way we will see how traditions formed around early type-in software, and how the practice shifted over time. Was this just a handy way to distribute code? Was this just an educational trick? The answers are more complex than you may first imagine. Selected Sources: https://s3data.computerhistory.org/pdp-1/DEC.pdp_1.1964.102650371.pdf - LISP for the PDP-1 https://archive.org/details/DigiBarnPeoplesComputerCompanyVol1No1Oct1972 - PCC Issue #1 https://archive.org/details/Whattodoafteryouhitreturn - What To Do After You Hit Return
Transcript
If you lived through the early days of home computing, then it's likely you remember type-in programs.
They would show up in magazines and newsletters, just little columns with neat lines of code waiting to be typed into your very own computer.
Many early games spread around the world this way.
This was the best you could get in the absence of, you know, the information superhighway we know and love today.
I was a little bit too young to experience this firsthand. However, I did experience something
very similar. Way back in my misspent youth, I learned how to program. Unlike the kids these
days with their fancy online courseware and actual classroom learning, I taught myself the discipline from old water-damaged textbooks.
Perhaps not the most auspicious start, but I've made it work.
I still keep around a few of those old tomes.
The one that really got me going was called Common Lisp: A Gentle Introduction to Symbolic Computation.
Once again, perhaps not the best introduction to the arts, but
I made it work. If you've ever worked out of a language textbook, then you already know the
format. But if not, let me explain. There are a few different possible methods, but the usual go-to
is something like this. You introduce a new concept, give the student some example code to
type in, and then you hand out an exercise. Crucially, as the student, you're expected to
actually type in the example code and work with it. By doing so, you get to practice writing in
the language. You start to get a feel for what's going on and, perhaps through a little bit of osmosis, even learn something.
One of the classic exercises is to adapt that example program to do something different.
You're shown a program that can replace one element in a list, then you're asked to write
a program that can replace two elements in a list. Or maybe you're given some code that makes
a random number between 1 and 10, and then you're asked to change that to make a number between 1 and 100. It's gentle at first,
but it slowly works you up to writing your own software from scratch. And sure, in textbooks,
that can get pretty boring. You can only work up so many variations of a sorting algorithm before your eyes start to bleed a little around the edges.
You could, however, spice things up a little. You could change around the subject matter, maybe
even throw in a game or two. Perhaps you can tell where I'm going with this. Is it possible that
those fondly remembered type-in programs from back in the day, the ones that brought joy to generations of budding programmers, were actually just gussied-up textbooks?
Well, maybe.
Welcome back to Advent of Computing.
I'm your host, Sean Haas, and this is episode 138, Type It Yourself.
I'm finally back in the central office and back to my normal schedule.
It's been a while, and it feels very nice.
I'm a creature of habit, so when my habits break down, I start feeling a little off.
Today, we're opening up again with a pretty classic
advent of computing format. That is, a wild ramble over a topic that I think is kind of interesting.
Today we're going to be dipping into a subject that I've been curious about for quite a while.
That is, old-school software distribution. At least, kinda. We'll get to the details. I think this episode's gonna be
a lot of fun. Way back in episode 76, I brought up one Dave Ahl. You know, good old episode number
76? That one was about Star Trek, the computer game. It was a game written in BASIC that spread
via source code listings. One of its largest vectors was Dave Ahl's book 101 BASIC
Computer Games, and his newsletter Creative Computing. These publications had little
stories about software, descriptions of programs, and then full source code listings in BASIC.
This was in the very early 1970s, so software distribution was totally different than it is
today. This was especially true for the hobbyist.
Or at least, the budding hobbyist.
We're not quite fully into the hobby sphere yet.
Back in this period, you couldn't just pass around a link to GitHub or Sourceforge or
something.
During the 70s and 80s, these type-in programs were a huge part of the culture.
This became especially true as microcomputers
became more and more popular. But this didn't start in the 70s. At least, I don't think so.
Here's where we're getting the hook for the ramble. This is a mystery that I want to contend
with this episode. I've run into older sources that look and feel like these later type-in software books. And I'm talking
academic papers, articles in trade journals, all the usual suspects, all the usual places I hang
out. I found it's pretty common to run into articles that describe a program and then end
with that program, with the full source code printed out on paper. It's an easy way to get
all the details across, after all.
But I have a hunch this could go even deeper. Remember Algol? It's one of those touchstones
that I keep coming back to. Part of Algol's original goal was to provide a universal language
for publication, a standard tongue that could be used to pass around source code in academic papers.
We have minutes from committee meetings discussing this in 1958.
That, to me, makes it sound like the Algol committee was trying to fix an extant problem.
For that to be the case, there must have been issues with source code in some very old programming papers.
I also want to look at the actual goals and really the purpose
of type-in software. 101 BASIC Computer Games sets the mold for type-in programs in the micro-era.
At least, kind of. It's this mix of software distribution and education. On the one hand,
you have this interesting and cheap channel for sending around software. On the other,
you have this thing called learning. A novice who types in a program is shown the guts of the thing.
They're invited to make tweaks. Sometimes the program even comes with instructions or prompts to modify the code yourself. That's one kind of type-in program, but are there others? Is the same
kind of medium serving a different purpose in different contexts?
The goal for the day is to jump through the decades and see when and where type-in software
shows up. Is my theory about Algol correct? Does that lead to an actual lead? Is this all tied
to academics, or is this more a matter of practicality and perhaps a way to subvert
larger supply chains?
Let's try and sort this out together.
Let's start out with a funny fact, shall we? When discussing the history of type-in programs,
and especially type-in games, the most common example is going to be Dave Ahl's 101 BASIC Computer Games.
That's usually seen as the start of the phenomenon.
Most who are exposed to type-in programs would have used them on a home microcomputer,
which means, inevitably, they would have been using BASIC.
That was kind of the only choice in home computing during the 70s and 80s,
with some notable exceptions that kind of prove
the rule. So it may be tempting to just assume that Ahl was publishing games for home computers.
That, however, is simply not the case. 101 BASIC Computer Games was published in 1973.
The first practical home computer, the Altair 8800, wasn't released until 74.
Realistically, it's not until early 75 that anyone owns an Altair.
So, Ahl wasn't printing games for microcomputers or home computers.
These games were for minicomputers.
I happen to have a first edition of the book on my shelf.
It's a wild text because at the time, Ahl was
working for DEC. It was actually published by DEC themselves, so the first edition bindings
are styled like old DEC computer manuals. They even use the same blue and white color scheme.
The book doesn't mention anything about microcomputers or the home, because that's not a thing in 1973. It's meant to be a
somewhat platform-agnostic text, but all platforms it mentions are minicomputers. We're in this weird
period where BASIC exists, which very much becomes a language synonymous with home computing,
but home computers don't exist. BASIC's running on small and accessible machines,
but not in homes. It's running on these cheaper mini-computers.
Further, Ahl drops a hint for our search. In the preface, he describes 101 BASIC Computer Games like this.
This is not the first collection of computer games and simulations, nor will it by any means be the
last. However, in many ways, it is unique. It is the first collection of games all in BASIC.
End quote. Note the emphasis here is on BASIC, the language, and not the concept of a collection of
source code or a collection of games. That's enough to convince
me that there has to be more to the story. So where do we go from here? Well, I did email
Ahl and he didn't get back to me in time, so I'm not entirely sure what he considers
prior art. So I guess the only thing we can do is hit the books. Before we go further,
I think this is a good time to lay out a framework
for type-in software. It pays to be rigorous, after all. First off, we're talking specifically
about source code distributed as physical printouts for the use of other people. We
aren't going to consider things like source code listings kept in vaults or sent around to
colleagues, but actual public distribution.
The perfect candidate will be a program that was printed out and sent, given, taken, reviewed, and used by multiple folk.
I also want complete programs.
Snippets, well, interesting to the larger story, and we will discuss them, don't really cut it for full type-in software.
This may sound vague enough to be almost useless, but right away, this puts a lower boundary on our search. For this model to make any sense, we have to assume that there's more than one computer in
the world, and that it's possible for source code that worked on one computer to work on another.
This could be accomplished by a portable programming language
or just multiple identical computers.
Where does that land us in the timeline?
Well, we have to be outside the era of unique machines.
A programmer isn't writing down a program from ENIAC
and mailing it out to his colleague that works over on EDVAC.
That said, there may be a little bit of wiggle
room since there were some unique machines with similar architectures, but that's splitting hairs.
Let's just say the 1940s are out entirely. The earliest period where this scenario is possible
is probably the middle of the 1950s. At that point, there are a few mass-produced machines, even if the mass in that production is padded out by the sheer mass per unit. In that decade, we get IBM 600s and
700s popping up in multiple locations. We have Univacs, we have Burroughs, I think? Point is,
we have a few different types of machines that are made in quantities larger
than one. My gut feeling is that the first type-in programs may have come out of early user groups
that formed around these machines. That's just a guess, but I think it's enough to start our
search. The reason for this guess comes down to the early history of free software. Allow me to explain.
In the beginning, software was kind of an afterthought. The real economy was hardware
itself. Programs were either written by the end user or provided by the hardware manufacturer.
In the earliest epochs, this made good sense. End users were initially research outfits or big organizations.
While IBM would do some very accommodating things for their users, it wouldn't really have made a
lot of sense if IBM wrote very custom software packages for every single researcher using an
IBM mainframe or for every office that needs a very specific little micromanaging tool.
In that climate, it made a lot of sense for end users to write their software,
supplemented with a few tools from manufacturers.
However, this is where I reach a bit of a stall.
Allow me to introduce Share, the first computer user group.
Specifically, Share is an IBM user group. Now, note the phrasing there.
Share is, not Share was. They still exist, and they still charge dues. That makes researching
them a little tricky. I can find smatterings of information, but nothing like an easily
searchable archive. I've also reached out to them for comment and, at the time of press, crickets.
That said, there is another more well-documented user group that we can look at. That is DECUS,
the DEC user group. I actually ran across some of their publications just a few episodes ago.
L. Peter Deutsch wrote the first TRAC implementation. Prior to that gig, he implemented a version of
Lisp for the DEC PDP-1. That program is super well documented. It's perfectly preserved. We have
all of its source code. And that's because it was published in the DECUS program library in print. We have a type-in, and it's from 1964. So what exactly are we looking at
here? The document itself is pretty straightforward. We have a DECUS cover sheet, complete with the
user group's cool blue logo. Then we have a full description of the program, how to use it, and some examples
of Lisp code itself. Following that is the entire source code listing for the implementation.
This is the full source for the program, and it's written in macro assembler. So we aren't
looking at portable code, but code for a popular machine. Macro assembler is just
machine code with a few extra handy mnemonics and a little bit of find-and-replace magic.
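As a toy illustration of that find-and-replace idea, here is a minimal macro expander in Python. The mnemonics and their expansions below are invented for the sketch, not real PDP-1 assembly:

```python
# A macro table maps a made-up mnemonic to the lines of code it stands for.
MACROS = {
    "SAVEAC": ["dac temp", "lac zero"],  # invented expansion, for illustration
}

def expand(lines):
    """Replace each macro call with its body; pass plain lines through."""
    out = []
    for line in lines:
        out.extend(MACROS.get(line.strip(), [line]))
    return out
```

Feed it a listing and every macro call unfolds in place, which is essentially the "magic" a macro assembler performs before assembling.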
This is definitely a type-in program, but there are a few caveats and points of interest here.
The first is that you didn't have to type in the program. To quote from the paper itself,
Punched tapes for placing this LISP system on the PDP-1 computer are available through DECUS,
the Digital Equipment Corporation Users Organization, Maynard, Massachusetts.
End quote.
So you could send out for the program punched on paper tape and save some time at a keyboard.
But this doesn't make the printed code useless.
That's because the code was, in part, meant to explain how the program worked and how it could be expanded.
The source code is actually commented.
It's not really up to modern standards. I think if I was given this pull request, I would reject it. But there are comments there. These are notes to other programmers. The paper even explains that it
should be possible for another programmer to expand the interpreter. The comments are there
to help the next programmer along. In that sense, this isn't filling the same role as later type-in programs in the micro-era,
at least not entirely. There is an educational aspect to many later type-in programs. That's
partly the point of things like 101 BASIC Games. Type in these programs, mess around with BASIC,
and maybe you'll learn something about programming along the way. It's a call to education. The Lisp paper, on the other hand, is more of a call to action. It literally
says that code is included in order to help others create Lisp implementations and to make
it possible to expand and change this implementation. At the time, Lisp was very new. This is specifically for Lisp 1.5.
There were still growing pains, and there are still pains yet to be felt.
Deutsch presented the source code as a way to help fight those pains. If Lisp primitives changed
over the next few years, then some programmer, likely another DECUS member,
could go in and make their own tweaks. What we're looking at is a type-in program as part of a larger
culture of collaboration. This was the point of user groups way back in the day. They were a way
for users to pool resources and work on larger projects together. In the case of IBM Share,
that led to entire operating systems and programming tools.
Those became part of a larger library of code that members contributed to.
The same was true for DECUS.
Software distribution for DECUS was primarily done
through their so-called program library.
Every year, a new catalog would be compiled
and sent out to user group members.
For a small fee, users could write in and request software, which would be sent out on paper tape.
From what I've read, and from what the Lisp paper explains, those tapes were only binaries.
As in, the actual machine code for the computer.
Code for the program was also available, but not on tape. From what I understand, it sounds like DECUS initially distributed code as printouts, like we saw in this Lisp paper. Once again, this is
functioning in a different capacity than later type-in programs. This isn't necessarily aimed at
education, but at collaboration. The distribution aspect here is less important since there are
tapes you can send out for. This is also courting a different user base than later type-in games.
The DECUS library is meant for professionals and researchers. It's not so much meant for
novices or people wanting to play games on a home computer. There's another side of this user group story that I just have to discuss.
Program libraries are only part of the equation.
User groups also publish a lot of paper.
These come in all kinds of forms.
Newsletters, magazines, proceedings from conferences, the list goes on.
This presents another avenue of investigation because many later
type-in programs are printed in, well, newsletters and journals and magazines. DEC has published
proceedings from their regularly scheduled conferences. These give us a really cool
breakdown of what DEC users were working on in any given year. And they talk about software. Which, I guess, takes us to
an interesting sidebar. How do you talk about software? Like, seriously, how do you explain
in technical detail, in writing, what a program does? There are a few different approaches that
show up in these proceedings. One approach is simple description. You just explain
what your program does in English. I wrote a program that takes input over a serial line.
When a number is received into the buffer, I check its value against a known constant. If it exceeds
that constant, then I call a subroutine that makes the computer beep. That kind of description
certainly has its place. If written well,
it's a succinct way to explain your program without going into deeper details, and it gives
you a way to transition into commentary on your code. That's kind of the whole point with any
descriptive method. You want to explain what you did and get the reader up to speed so you can
discuss the program in more depth, so you can show off the finer details. Maybe you have a cool algorithm for how you deal with the serial buffer,
but you need to add context, so you have to work up a description in text. Another approach we see
is the functional diagram. This is a diagram that presents functional chunks of a program and shows how those chunks talk to each other.
You could draw out boxes for the serial interface, the buffering code, and the beeper.
Then you plot out how data flows from the serial line into your program, and then eventually how the beeper gets triggered.
That can be just as serviceable as a narrative description, and it works towards the same
purpose.
The reader becomes acquainted with the program, and then you can style on it.
Both those approaches, however, miss a certain level of detail.
You're showing what the program does, but not always how those goals are accomplished.
You have to add in a lot more ink to describe specific information here.
You need more commentary to back up that description.
For more detail, we can turn to something like a flowchart.
That's another tool that shows up in these very DECUS papers.
This is a lot closer to actual code.
It actually shows the flow of the program.
You have little boxes that show how data comes into the system, how it's entered into
the serial buffer, then you have boxes describing the checking logic that decides when you send out
the beep. It's more detailed, but it's still missing something. What sucks is all these
approaches are abstractions on top of the code. They're all attempts to describe the code without
using any code at all. It would be so
much easier to just print out your program and staple it to the article. In a perfect world,
we could all do that. Then the reader would have total access and complete, perfect understanding.
At least, that's the hope. There are practical limits to this more pure approach. The main limitation here
is, perhaps unsurprisingly, size. You can't really staple hundreds of pages of code together.
That's too big for a single staple to go through. And you can't really expect a reader to go through
hundreds of pages of source code just so they can understand your
discussion of the finer points of buffer management. In that way, the more abstract descriptions help
you get your point across. But in some cases, code is fully justified. It works best when you have a
small program or part of a program. And indeed, that's just what we see in the proceedings of DECUS.
The Spring 68 proceedings have some really good examples of type-in code. I'm going to be pulling
just one example from there, which I think will be illustrative of the larger trend here.
In this proceeding is a paper titled Extended Memory Fortran with an 8K PDP-7. It's a neat little piece about how to implement crude
virtual memory. Basically, the researchers wrote a program that added some circuitry to the PDP-7
that allowed you to swap memory onto magnetic tape. They then used this to soup up their Fortran
programs. It's honestly a pretty cool little trick to do with very few resources.
It all hinges on a bit of code that you link to your Fortran program.
When your program tries to access memory that's outside the normal 8K memory space,
that bit of code sees the error, intervenes, and does a sleight of hand.
It swaps out some memory with storage on tape.
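That swap-on-fault idea can be sketched in a few lines of modern Python. To be clear, this is a toy illustration of the concept, not the actual PDP-7 routine, and every name in it is invented:

```python
PAGE = 4                                    # words per page in this toy model
resident = {"page": 0, "data": [0] * PAGE}  # the one page that fits in "core"
tape = {}                                   # swapped-out pages, our stand-in for magnetic tape

def _page_in(addr):
    """Catch an access outside the resident page and swap pages with tape."""
    page = addr // PAGE
    if resident["page"] != page:
        tape[resident["page"]] = resident["data"]      # write the old page out
        resident["page"] = page
        resident["data"] = tape.get(page, [0] * PAGE)  # pull the new page in
    return resident["data"]

def store(addr, value):
    _page_in(addr)[addr % PAGE] = value

def load(addr):
    return _page_in(addr)[addr % PAGE]
```

Any load or store outside the resident page triggers a swap, much the way the linked routine intervened when a Fortran program reached past its 8K of core.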
The details aren't
important, but it means that as long as you have that chunk of code and the right circuit,
you can make any Fortran program swap memory with your tape drive. The proceedings paper
includes the source code for the entire swapping routine. It only takes up a single page. This is a wonderful candidate for this kind of distribution.
It's short, it's targeted at a specific platform, and it's the kind of program that requires a lot of detail to explain. The article even references the code itself. Any user with a PDP-7 setup
could take that code, type it in, and get it up and running with rough
virtual memory. This lines up with a lot of other user group articles I've read. The code is there
partly for descriptive purposes and partly for distribution. So that's one kind of type-in
program. It's one tradition that existed well before the type-in microgames that we're more familiar with.
This specific example is from 1968, but if you look deeper into archives, you're gonna find more.
I'd be willing to bet that Share would have very similar articles, and those could go back to the 1950s.
This type-in tradition is very specific to user groups, and it fits nicely with the larger
tradition of open-source software. It's part of this ongoing community conversation and
collaboration between users. In that sense, type-in code is more a byproduct of that larger
practice. Users are publishing articles with code because they were doing some interesting work, or because they want to collaborate with other users. So where do we go from here? Where else can we see type-in software,
and, I guess in a larger sense, how else did type-in code impact the discussion around software?
We're moving further up in the chronology.
In 1964, the first version of BASIC sparks to life at Dartmouth.
It's a big mainframe language meant to get a lot of students to share one single computer.
One fact that never ceases to amaze me is that BASIC doesn't really become a thing in home computing for another 10 years.
It doesn't really do what it's best known for until the middle of the 1970s. So we have this whole time period where the BASIC
landscape is very different from what micro-users would expect, but there are still many similarities.
This is the period where BASIC is running on timeshared machines. It's when early games like The Oregon Trail or The Sumerian Game are written.
Lesson plans are developed that revolve around getting students access to teletypes.
It's a strange time.
That said, we do run into real type-in programs in this period.
What I've found is that the use of BASIC makes for a fundamentally
different approach to type-ins. Allow me to explain by way of example. For this, I'm going
to be jumping towards the end of the macro BASIC period. In 1972, a newsletter called
The People's Computer Company starts publication. Let's just call it PCC to save a few bytes in my script.
At first, it's almost a zine. It's small, scrappy, bold, and brash. It describes itself like so,
quote, the people's computer company is a newspaper about having fun with computers
and learning how to use computers and how to buy a minicomputer for yourself or your school,
and books and films and tools of the future. End quote. That's from issue number one. To note,
this is still pre-microcomputer. It even tips its hand right at the start. It's targeted
initially at people that want to learn how to use mini-computers.
Those would have been the smallest systems you could get a hold of, outside of some really
new stuff.
It wouldn't be until 74 that the first somewhat accessible microcomputers hit the scene.
Maybe a little earlier if you were really good with a soldering iron.
So let's just set that expectation to start with.
Part of PCC was the People's Computer Center, also PCC, in Menlo Park, California. Housed there
was just such a mini-computer. PCC was actually a DEC outfit, using the same kind of machines
that Ahl would work with. If you didn't have a machine of your own, then, for a modest fee,
you could rent time on PCC's DEC EduSystem. PCC is broadly targeted at anyone who's interested
in these new computer things. Of course, there is a bit of an assumption here. PCC wouldn't really
mean much to someone in an institution who already had access to a computer or already
knew their way around a machine. Rather, this is a tool for the novice or for someone outside of
the industry. So, what would a novice be doing with a computer in the 70s? Why, BASIC of course.
Nearly half of this first issue is dedicated to the language.
It gives the reader a quick primer, some book recommendations if you want to learn more,
some example code, and then it gives a full program.
It actually gives two.
This follows essentially the same framework that we're going to see in the micro era.
Each program is presented as source code with some example outputs, a short description,
and some fun and flashy art.
The first here is just a number guessing game, one of the many classic programming demos.
You run the program, it generates a random number, then asks you to guess.
It says if your number is correct, if it's too large or too small, and it loops until you get just the right number.
It's really basic stuff, but that's the point.
It's presented as an example of a complete program.
A PCC reader was expected to sit down at a terminal, hammer in the program,
and by doing so, become a little more acquainted with BASIC.
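The game's logic, sketched here in modern Python rather than the original BASIC. The scripted-guess loop is a stand-in for the interactive teletype session:

```python
def check(secret, guess):
    """Tell the player how their guess compares to the hidden number."""
    if guess < secret:
        return "too small"
    if guess > secret:
        return "too large"
    return "correct"

def play(secret, guesses):
    """Feed a list of guesses to the game and collect its replies,
    stopping once the player hits the right number."""
    replies = []
    for guess in guesses:
        replies.append(check(secret, guess))
        if replies[-1] == "correct":
            break
    return replies
```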
And at the end of the program is the next didactic trick, a call to action, or rather,
to exercise. Quote,
Your turn. Modified versions of the number guessing game are suggested below. Pick one.
Write the program. End quote.
The article suggests finding a way to count how many guesses the human makes and then display that as a score. It's an exercise for the reader. You have the program, you have some understanding of language,
so solve a problem. So by 72, we have type-in programs that look and act very similar to later
examples and also pull a very similar didactic trick to a lot of programming textbooks. Notably, this is quite a
bit different from earlier user group programs. This isn't about collaboration or discussing new
and exciting code. This is an educational tool. What's more, this isn't a one-off thing. PCC even
announced that there are more games to come. To quote,
Beginning next issue, we will print listings of one or more Lawrence Hall of Science game-playing
programs for the HP2000C, along with other news from the Hall. LHS is one of our favorite sources
of computer games, and one of our favorite games is Bagels. End quote. Bagels, of course, does show up in issue number two. Type-in programs,
and especially games, would become a staple of PCC's issues. But this leads to one of those
weird things about the weird BASIC era. It took me a while to clock this. Let me lead you into it.
So, as mentioned, PCC started pulling games from other
sources as early as issue 2. One of those sources was Lawrence Hall of Science, the LHS. This was
another outfit where aspiring users could get computer time, amongst other things. Note the
type of machine is mentioned here, an HP-2000C. In fact, many early programs that show
up in PCC have instructions for how to convert between DEC and HP versions of BASIC, since,
well, street versions of BASIC were all a little bit different.
But there was one constant between the systems. That is, timesharing. PCC, LHS, and any of these locales where the usual
person off the street could get computer access were all running timesharing operating
systems. As in, they had one big computer that multiple users could log into and share.
See where I'm going here? Let's take the HP 2000 as an example, since that's better documented than
an EduSystem 20. At least, it's easier to find docs, since DEC's lineup can turn into a bit of a
morass. The 2000C was a small timesharing computer with, drumroll please, a hard disk. It saved and
loaded programs off that disk. In other words, when you logged in,
you had access to software. You could load up BASIC programs that were already waiting for you.
So when PCC talks about including programs from the Lawrence Hall of Science, well, those programs
were already on the hall's HP machine. If you stopped by the hall, you could load them directly without needing to type them in.
In this environment, distribution means something different.
We aren't talking about getting a game onto a thousand home computers.
The game is already on these systems.
This may actually change what I thought I knew about type-in games to begin
with. The core use of type-ins here in this specific context isn't really distribution,
it's education. Distribution comes second. That only really factors in for readers that aren't
physically close to PCC's machines or don't have a way to dial in over a
terminal. But that's something we'll get to in a minute. Before we move on, I want to discuss
BASIC itself. We've reached a case where the medium really informs the message. Remember how
I said that early type-in software was limited to short programs? That still matters when we reach the BASIC era, but the definition of
short starts to change. Comb the pages of DECUS and you'll find code in Fortran and assembly language.
Big old languages. Let's say that without doing any weird formatting, you can fit 50 lines of text
on a page. I know that in practice, some program listings would have really little
font, but let's just assume a normal page here. Now, we have to take some of those lines away
for formatting and code comments, which could knock us down to, say, 40 lines of actual code.
In assembly language, that makes for a very small program. You have just 40 machine instructions per page.
What we saw with DECUS was that small programs didn't do a whole lot. The fancy Fortran memory
pager doesn't actually do very much at all. It captures a certain type of exception error
and swaps out a chunk of memory. It's more glue than a full program.
Deutsch's Lisp implementation breaks this rule. It's a full 16 pages of assembly language,
but that was more meant as a supplement to an actual software distribution, so the exception kind of proves the point. Fortran helps with density, but just a little.
The same is true for any compiled language. In those types of languages,
a single line of code can turn into dozens of operations. So one line is worth 10 or maybe
100 lines of assembly. But many of these older languages are still super verbose. You have to
set up headers, format things in special ways, throw in extra lines for special incantations
and the like.
A full Fortran program wastes at least a few lines, and that's being generous.
And even then, the assembly to Fortran ratio isn't always super high.
There are some cases where it's actually pretty low.
BASIC is much less adorned.
A single snippet of BASIC will run just as well as a full program.
By the time we're seeing type-ins, BASIC isn't compiled. Most later BASICs are actually a mix of
just-in-time code and interpretation, but there's still a hand-wavy ratio here.
A single line of BASIC can encompass a lot of machine instructions. So 40 lines of BASIC can
do a lot more than 40 lines of Fortran or 40 lines of Assembly. Further, BASIC is actually portable.
At least, kinda. Roughly speaking. There are some arguments about the drift of the standard
ruining the language, but in general, any machine that was
running BASIC in the early 70s could run any BASIC program. When PCC talks about changes made to run
type-ins on a DEC versus an HP machine, the actual change is usually one or two lines. So when we're
looking at the distribution aspect of type-ins, the whole BASIC thing is especially powerful.
Or, to flip that a little bit, BASIC is especially predisposed to this kind of software.
It's a language that's meant to be easy to understand, and it's a standard that was widely adopted.
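To make that concrete, here's a little sketch of what a minimal type-in of the era could look like. This is a hypothetical listing written for illustration, not one pulled from PCC or EDU, and the RND call is exactly the kind of one-line dialect detail a DEC-to-HP conversion note would flag.

```basic
100 REM GUESS -- A MINIMAL TYPE-IN STYLE GAME (HYPOTHETICAL EXAMPLE)
110 REM THE RND SYNTAX VARIES BY DIALECT; THIS IS THE SORT OF LINE
120 REM A DEC-TO-HP CONVERSION NOTE WOULD ASK YOU TO CHANGE
130 LET N=INT(100*RND(0))+1
140 PRINT "I'M THINKING OF A NUMBER FROM 1 TO 100."
150 PRINT "YOUR GUESS";
160 INPUT G
170 IF G<N THEN 210
180 IF G>N THEN 230
190 PRINT "YOU GOT IT!"
200 GOTO 250
210 PRINT "TOO LOW."
220 GOTO 150
230 PRINT "TOO HIGH."
240 GOTO 150
250 END
```

Twenty-odd lines like these fit comfortably in a newsletter column, which is the density argument in a nutshell: the same game in assembly would blow well past a page.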
Alright, with that said, let's step back to the other question that I was setting up.
Should we even really think about distribution in this early period? Should we instead look at
type-ins as purely educational? To start with, let's stay in the land of PCC. In that very first
issue, the magazine recommends a few books for learning BASIC. The beginner's option is a text called, simply,
BASIC BASIC by James Coan.
You gotta love that name.
This book has a very simple didactic method.
You learn a new feature of the language,
then you're given some code to type in and execute that shows off the feature.
In other words, it's full of little type-in
exercises. But crucially, this isn't building up to some game. It's not distributing useful
software, but rather using small type-in examples to educate the budding programmer.
Many educational texts work this way. I used to work out of a book that teaches Lisp this way. Even TRAC's beginner guide uses this same method. The best way to learn to program is by programming, and at first you're
only going to be able to write very small, simple programs. This can make it seem that magazines
like PCC, with their type-in games, are purely educational tools, that they're just an outgrowth of a very
well-established tradition. Ah, but we can take it further. And for that matter, we can make it
more fun. One of the things I love about PCC is it styles itself as countercultural. It's
intentionally meant to be pushing up to the edge of what's considered educational.
The use of games is core to this. The magazine argues that there isn't really a reasonable difference between simulations and games, and that both can be educational and fun.
Allow me to introduce the precursor to the modern type-in. There's this book from 1968 titled Game Playing with Computers
by Donald Spencer. It's not the best title, and perhaps that clues you into how early this is
in the chronology. But I do have a caveat here. The first edition of this book was printed in 1968.
A second, revised edition came out in 1975.
I can only find scans of the second edition, and I don't have a first edition in my tomes.
I know, it's really a shame.
The seven-year gap may sound small, but it might as well divide oceans in this case.
Between 68 and 75, the Altair 8800 comes out.
Home computing becomes viable. The literal horizon line of computing possibility shifts.
So yeah, the second edition is coming out in a totally different world. As such,
we must be mindful here. The source may have shifted during revisions. Anyway, this is an educational text through and through, and it's full of type-in
programs. It's also a bit of a strange text compared to the larger canon. Game Playing uses
Fortran as its primary language, but it's not necessarily teaching Fortran. Rather,
it's teaching how to program games. Let me pull from the preface and let the book explain itself.
Quote, Game playing with computers is intended to introduce the reader to many games that may
be programmed for a digital computer. Since game playing is an excellent medium for one to learn computer
programming, it may be used by students and beginner programmers. On the other hand,
since many game playing programs are extremely complex, the book may also be used by senior
programming personnel, system analysts, and mathematicians. End quote. The book contains
over 50 games with full Fortran source code.
Outside of those games, it has exercises for the reader.
It gives explanations of some games and their logic, then leaves the programming up to you.
What's so interesting here is that Game Playing is an educational text,
but not in the mold of a BASIC BASIC.
It's teaching you how to write games using little
type-in programs as illustrative examples and as a way to get you into programming.
I think this helps to flesh out the picture. We have books and magazines that use type-in software
for educational purposes. The end goals here range from learning a language to learning to program
to learning how to write games. To me, that's evidence of a very rich tradition around type-in
education. Distribution, at least in this mini-computer era where there is shared infrastructure,
isn't really the only factor here. It's not even the primary factor for the medium.
The fact that Game Playing reaches for Fortran here,
well, I think that's the cherry on top, right?
This tradition isn't just limited to BASIC.
It existed prior to and outside of BASIC.
Therefore, I'd argue that BASIC type-in programs
are really a response to a changing environment.
We're seeing how an older tradition is being adopted and updated to fit new technologies and new ideas,
and it turns out that that fit was very good.
That's all well and good, but there's one other line I want to tease out.
101 Basic Games isn't the start of this trend, that much is clear,
but it does become a huge cultural touchstone.
So how do we get there?
I told part of this story way back in episode 76, when we discussed Star Trek the game.
That game's distribution pipeline had a lot to do with Dave Ahl,
but that was a while ago and it wasn't really told in
context, so I want to set the stage. In 1969, Dave Ahl started working for DEC. He was hired to help
DEC with their educational line of computers, the EduSystem machines. This included software,
documentation, and marketing. But the new gig wasn't all sunshine and rainbows. To quote from Ahl,
quote, early in my days as education marketing manager at Digital Equipment Corporation,
it became apparent that DEC was not communicating very well with its educational users and
communication among users was virtually non-existent. End quote. DEC was putting all these resources into educational
machines and software. I mean, they had a whole line of machines targeted at the education market.
From what Ahl describes, it sounds like the relationship with edu-system users was just
plain lacking. Or, perhaps put another way, DEC didn't really know what to do with educational users.
They knew corporate contacts.
They knew support and contract negotiation.
What else could a school want?
It turns out, quite a lot more.
In 1971, Ahl started a newsletter called EDU.
It was published by DEC themselves and targeted at edu-system users.
The idea was to make something similar to those older user group publications. Think DECUS, but for the education market.
In practice, we get something that's a lot closer to PCC. Or perhaps we should say that PCC is very
similar to EDU. It's a little messy. EDU issues informed users about new machines,
new software, and events. Readers could contribute articles or order out for software and documentation.
Crucially, it contained type-ins. At least, kind of. EDU didn't actually run a type-in until issue number two.
This is the so-called BASIC Program of the Month column.
Students were asked to send in programs, and one was chosen each month to appear in the magazine itself.
That makes things a little more complicated.
For this part, I went through the first handful of issues of EDU. The student programs of the month are type-ins printed in the magazine.
But none of these are presented as part of a larger lesson on programming.
Rather, it's more like a communication, if that makes sense.
It's like taking a cool program and posting it on a bulletin board.
I think that jibes with how Ahl explains EDU.
It was an attempt to build up a community around educational computer users, and part of that was
showing off what these students learning to use computers were actually doing.
EDU doesn't teach, but it points readers to where they can learn. In fact, by issue number 10,
EDU is actually pointing readers towards 101 Basic Games itself.
But this leads us back to the timeline, which, I must say, gets really weird in this period.
Remember how I said EDU started in 1971? Well, that comes from an article written years later by Dave Ahl himself.
I'd normally accept this as read, but I have my reservations in this case. I think Ahl may
misremember some dates in his later writings. This is frustrated by the fact that EDU number
one doesn't actually have a date written anywhere on it. The newsletter is quite
literally timeless. As near as I can tell, EDU starts either in late 71 or early 72. By issue
three, there's references to conferences in the summer of 72, so that gives us some kind of timeline. This matters because 72 is a crucial year. PCC's first issue, with
educational type-ins, is published in October of 72. The newer newsletter is very similar to EDU,
but with a few key differences. One of those being, PCC is much more of an educational text on its own. But there are clear similarities
to EDU. In fact, I think both newsletters use very similar stock images and text formatting.
There's at least one arrow that I keep seeing in both of them. It has a little curve at the end,
little cross-hatching. Thus, we reach the question of influence. PCC was much closer to 101 Basic Games than EDU
ever was. So, I ask, was 101 Basic Games influenced by PCC? Well, wouldn't it be cool if we had a nice
quote from Ahl explaining this for us? Well, we do. Kinda. This comes from an interview that Ahl did with
Antic, the Atari 8-bit podcast, which, by the way, if you're a fan of Atari, I can recommend.
Anyway, here's the quote. Manufacturers like DEC and HP were making smaller timesharing systems
for terminals on a computer. Specifically, Bob Albrecht opened up People's
Computer Company down in San Carlos, San Mateo, one of the sands. It was an open-to-the-public
place. What were people going to do with computers? Well, he wrote this book of games, What to Do
After You Hit Return. Then I wrote my book, not for his center, but for people in the East that had
access to the same type of thing on DEC computers. Those two books actually came out in 72,
so that was well before. There was an impetus for people to use computers.
Even though it was a minicomputer and they didn't really have their own, they did have access. End quote.
That's, uh, certainly interesting, right? That makes it sound as if there was some connection. But remember how I said I don't trust those dates? Allow me to
nitpick someone's old memories, and to be clear, this is no knock against Ahl. This happened a
long time ago. I just want to use this as a jumping off point for the larger mystery
and to point out that you have to double check your sources. 72 is just the wrong year here.
101 Basic Games is published in 73, so right off the bat, that doesn't line up. I could be convinced
that Ahl started writing 101 games in 72, but that's being generous. So what about the whole
PCC book? What's up with What to Do After You Hit Return? Would you believe it's super similar to
101 games? What to Do After You Hit Return is a collection of type-in programs from PCC and their community. It has the same crass edge that
PCC had since issue 1. It's also very much an educational text. It teaches you BASIC using
games as examples. You get code, a write-up of the game, and then a breakdown about how the code actually works. And very importantly here, the book was published in 1975.
So we're looking at text that came out years after 101 Basic Games.
And here's where things get weird. This is something for us to really think about.
What to do after you hit return is written to be used with an HP 2000 minicomputer.
It's still for timeshared BASIC.
Now, if you've been following along, that should seem strange.
The Altair 8800 hits shelves, at least in theory, in 74.
Really, it wasn't until, I think, January 75, or at least early 75,
that users had physical Altairs on their desks.
But that does mean that PCC is publishing on the microcomputer side of the divide.
By the time their book comes out, a home user can have a computer sitting on their desk
for $400.
This opens up the possibility that What to Do was actually written earlier in the
decade and only published in 75. That could make Ahl's timeline closer to correct. But that theory
quickly falls apart. The first mention of the upcoming book is actually in the summer of 74,
so the timeline works out such that PCC's book has to come out after the success
of 101 Basic Games. So it's pretty safe to assume that What to Do is inspired by Ahl's earlier work,
but there's definitely some back and forth going on here. Ahl even mentions the importance of what
PCC was doing and how that impacted his work.
This leads me to another realization. In preparing for this episode, I've been reading a lot of these newsletters, so I have a pretty solid understanding of the timeline right now.
This is one of those little details you may have already spotted, so check it out. The Altair 8800 is announced and starts to move in January of 75. That month, PCC runs a cover story on the announcement. They're caught so off guard that
they don't even have a photo of an Intel 8080 to show off, so they run a photo of an older 8008.
It's actually kind of funny. What's so neat is that PCC had been
prophesying the eventual home computer revolution. Reading that issue of the newsletter is wild
because their vision had just come true, but with a catch. This was the start of the dream.
BASIC still didn't exist for the Altair. We don't get BASIC until July of 75. That actually
puts a very specific upper bound on the pre-micro era of BASIC. It ranges from 1964 to July of 1975.
That means that What to Do After You Hit Return is technically obsolete basically as soon as it's published.
Once Altair BASIC hits the scene, and especially once it's pirated and cloned, we enter a new age
of BASIC programming. You can now have BASIC in the home, totally disconnected from a remote machine.
But that doesn't mean that these obsolete texts disappear.
Far from it.
This is where their use shifts.
In 77, Ahl publishes 101 Basic Computer Games Microcomputer Edition.
As Ahl describes in the book, quote,
Basic Computer Games Microcomputer Edition is a major revision of my first book. End quote. Ahl chose to standardize his revised
version around just Microsoft BASIC, the very dialect that shipped for the Altair in 75.
There's always a bit of an issue of dialect drift, and even this new book suffers here.
Ahl has a whole section on how to convert
between different dialects, so you can, in theory, get games running on your computer.
But I will point out, the changes here aren't very big. In practice, you could take an older
copy of 101 Games and adapt those programs on your own to work on your Altair or on your Apple
II or Commodore PET.
Now, there are some big obvious changes.
One of the first is that games no longer have save files.
Okay, maybe that's not super obvious.
I might be in my own head here.
When you're running in a timeshared system, you have to have access to disk space.
It's a prerequisite in most cases.
So those BASIC games can actually save down information for later. The best known game that
made use of this is probably Animal. It's a guessing game that quote-unquote learns as you
play it. The new micro version doesn't actually save files, it just keeps data in memory.
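As a rough sketch of how that learning works, here's a short, hypothetical Python model of Animal's core data structure: a binary tree of yes/no questions with animal names at the leaves. This is my own illustration of the mechanic, not Ahl's code; the timeshared version persisted a structure like this to disk, while the micro edition, as described, just holds it in memory.

```python
# A hypothetical model of Animal's "learning" mechanic: a binary tree
# of yes/no questions, with animal names sitting at the leaves.
class Node:
    def __init__(self, text, yes=None, no=None):
        self.text = text   # a question, or an animal name at a leaf
        self.yes = yes     # subtree for a "yes" answer
        self.no = no       # subtree for a "no" answer

    def is_leaf(self):
        return self.yes is None and self.no is None

def learn(leaf, new_animal, question, answer_for_new):
    """After a wrong guess, graft in a question that tells the new
    animal apart from the one the program guessed."""
    old = Node(leaf.text)
    new = Node(new_animal)
    leaf.text = question
    if answer_for_new:
        leaf.yes, leaf.no = new, old
    else:
        leaf.yes, leaf.no = old, new

# Start the way the original did: one known animal.
root = Node("fish")

# Simulate one round where the player was thinking of a bird.
learn(root, "bird", "does it fly", answer_for_new=True)

# The tree now asks a question before guessing.
assert root.text == "does it fly"
assert root.yes.text == "bird"
assert root.no.text == "fish"
```

Every wrong guess grows the tree by one question, so on a timesharing system the shared disk copy would accumulate every user's animals, which is exactly why the in-memory micro version forgets everything at power-off.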
There are also more metaphysical changes
at play here. The entire computer landscape has changed drastically after the microcomputer hits
the scene. There are more users than ever. Almost all micros drop users into a BASIC prompt, so
there are more people using BASIC than ever before. What's more, these computers are isolated.
Your Altair or Apple II isn't hooked into some big shared disk drive.
You aren't using an online system.
So you have to physically get access to software.
That could come as paper tape in the very early days,
or it could come as magnetic tape or even floppy disks. But to get
that data into the machine, you needed access to extra parts. When you first get a machine, you
just have the computer and a keyboard. This change in climate made the distribution aspect of type-in
games much more relevant. Suddenly, this tradition, which came directly from education, had a second use.
It was now reasonable to use a type-in book as an actual game library. When someone in 77
bought 101 Basic Games, they weren't just getting an educational text, but 101 games that they could
load into their computer. It may have been the only way for them to get
software onto their machine, seeing as disk drives were add-ons that cost extra.
We must also consider the change of audience here. During the mini-computer period, more folk were
getting access to machines, but the access was still controlled. Take PCC as the example. You
could go to their physical computer center and log into a machine.
For that, you needed physical access.
You needed to pay an hourly fee.
You could also dial into a computer center using a terminal and a modem, but you still
had to pay hourly for access and go through the process of finding a center to dial into.
The population that could get access in that period was larger than in the
earliest epochs, that's for sure, but it was still restricted. It was still limited, and it was still
centralized in these groupings of users. Users shared resources amongst their group, and thus
shared a certain amount of software and culture. If someone on your edu-system got the Animal guessing game working,
then everyone could run it.
Consequently, its save file would be filled in pretty quickly.
You don't have that same centrality with microcomputers,
at least not to the same degree.
In this new context, books like 101 Games became more powerful.
The audience was now bigger.
The distribution aspect could go hand-in-hand with education.
These texts could serve as a lifeline to tie a community together
that didn't have the same technological connection as a shared disk drive.
Sure, you can't all run the same copy of Animal from some big hard drive off in a data center,
but all your friends could type in the same source code. It's in this new era that type-in really hits its stride,
and it's only made possible by its firm rooting in the community.
Alright, that does it for our exploration of type-in programs, where they came from,
and what they were all about. What I've arrived at is, perhaps, a pretty obvious conclusion.
The use of type-in programs shifts as computing changes over the years. But hey, that's true
about basically anything digital, I think. Type-ins start, well, probably as soon as we have
programming. I intentionally avoided delving too deep here because, oh, this is one of those
cultural types of things. It's common when talking about code to drop some code into your article or
letter or memo. Going to the earliest papers on programming and looking for preserved notes
feels like a fool's errand. You're going to see code in there, and you won't be able to really
find a meaningful first post. Rather, we can look back and see a tradition of printing code.
It shows up in academics, and it shows up in user groups. In that incarnation, type-ins allow
authors to quickly share their code, and they
facilitate collaboration between peers. Once we get further along, we start seeing educational
type-ins. This is where the medium really takes shape, in newsletters such as PCC and EDU.
These type-ins are almost completely for educational purposes. When the microcomputer finally appears, type-ins shift,
partly in form, but mostly in meaning. As home computers rise in prominence, the type-in comes
along for the ride. And the reason that that medium was there is because of this earlier
educational tradition. Really, it was something waiting for a larger audience to grab onto it.
Thanks for listening to Advent of Computing.
I'll be back in two weeks with another piece of computing's past.
If you like the show, there are a few ways you can support it.
If you know someone else who'd be interested in the history of computing,
then please take a minute to share the podcast with them.
You can also rate and review the show on Apple Podcasts and Spotify.
If you want to be a super fan, you can support the show directly through Advent of Computing merch or signing up as a patron on Patreon.
Patrons get early access to episodes, polls for the direction of the show, and bonus content.
And just as, again, a word of warning, because of some changes on Apple's platform, I don't
recommend signing up for Patreon through an iPhone. They
will charge you an extra 30% that goes directly into Apple's accounts. So if you are considering
becoming a patron, please do it from your web browser on your computer. Save the 30%. Anyway,
you can find links to everything on my website, adventofcomputing.com.
If you have any comments or suggestions for a future episode, then please reach out and get in touch.
And as always, have a great rest of your day.