Advent of Computing - Episode 2 - The Demo
Episode Date: April 22, 2019
A lot of newer technology doesn't expressly say it's going to "revolutionize the human experience", but sometimes that line may actually be closer to the truth than you would expect. Today, I am going to tell you about a time when that was very much the case. Today we go back to 1968 to look at Doug Engelbart's "The Mother of All Demos". You can watch the entire archive of the demo here: http://www.dougengelbart.org/content/view/209/448/
Transcript
What if I told you that the foundation for the graphical interfaces that we're so familiar with today was first laid during a tech demo in 1968?
On its own, that sounds pretty shocking.
The fact that while the microprocessor was still a pipe dream, there were prototypes of the modern mouse running around still seems unreal to me.
But what if I also told you that at the same demo,
the technology that would become the basis for the modern day internet also made an appearance?
Well, then that must have been a strange and exciting chapter in the history of computers.
This is episode 2, the demo, and I'm your host, Sean Haas.
If you were to look back at the journey that technology has gone on to reach its current state,
you can start to see a certain shared
fabric that all new innovations are cut from. One of the recurring fibers here is the idea of
improving the human experience. Undoubtedly, the wheel was invented to allow humans to move
more weight in less time. So too, many advancements in computing are made with the intent of somehow augmenting human ability.
Whether that's connecting us with humans on distant continents, increasing our stores of knowledge, or even helping us solve more complex problems in less time.
Sure, a lot of new technology doesn't expressly say it's going to quote-unquote revolutionize the human experience.
Or rather, if it does, then someone's probably trying to sell you something.
But sometimes, on the rare occasion, that line may actually be closer to the truth than you would expect.
Today, I'm going to tell you about a time when that was very much the case.
I'm talking about Doug Engelbart's Mother of All Demos and the software that made it possible,
NLS, or the oN-Line System. Before we talk about the actual demo, I want to take a diversion to the 1940s. Any discussion of early computing will hit a point in history before computing
was a thing, but where researchers were still starting to lay out the idea of what would become
computing. Usually this is around the 20s, 30s, and 40s. This single topic alone would quickly
go into the weeds discussing analog calculating systems, or von Neumann's early papers on computing,
and a possibly never-ending debate on who has the earliest claims to creating binary adding machines.
Instead, I'd like to skip over all of that and focus on a single document from this era that predates NLS.
That document is As We May Think by Vannevar Bush.
Now, Bush was an MIT-educated engineer who was involved with a shocking number of technological
advances in the 30s and 40s, including presiding as one of the first chairmen over NACA, the
institution that would later become NASA, and working on the top policy group for the Manhattan Project, among many other accomplishments.
In 1945, Bush published As We May Think, a paper which describes the problem humanity faces as our combined knowledge starts to outpace what any one person could ever hope to know or understand.
To quote Bush directly,
There is a growing mountain of research,
but there is increased evidence that we are being bogged down today as specialization extends.
The investigator is staggered by the findings and conclusions of thousands of other workers,
conclusions which he cannot find time to grasp, much less to remember.
Bush goes on to outline one possible solution to this problem,
what he calls the Memex.
This machine would be able to store the sum total of all human knowledge
as a series of hyperlinked pages, a concept that we
are now familiar with from the internet. However, this was still the 1940s, so the design of the
Memex was pretty retrofuturistic. The device itself would be built into a large wooden desk
with two, quoting again, touch-sensitive translucent screens, or touch screens, for
accessing and editing data.
Information was to be stored on a set of microfilm reels, with a platen built into the desk for
photographing and adding pages to the rudimentary database.
Each of the Memex's entries would be linked to other entries in a scheme that was designed to mimic how humans are able to associate ideas and information with one another in our own minds.
Reels could be removed, duplicated, and shared with other Memex users to allow the slow creation of a type of universal encyclopedia using 1940s technology.
Basically, this was an analog version of what would become Wikipedia in the modern era.
While the Memex would never actually be built, the concepts laid out by Bush left a deep
impression on other digital pioneers.
As computers started to catch up, these ideas would once again come to the surface.
While other research and writing
on human-computer augmentation
existed concurrent to 1945 and after,
As We May Think served as
one of the most important germination points
for this concept.
As we go forward in time after As We May Think, we hit the era just leading up to the
digital revolution. One of the things I find most remarkable about this part of history is the huge
amount of convergent discoveries and designs being made in almost total isolation from one another.
For instance, the idea of using binary arithmetic for electric calculators
was being laid out in an MIT dissertation at nearly the same time as a project at Bell Labs
would construct the first electric calculator to use binary arithmetic, all without the Bell
Labs engineer or the MIT doctoral student having any knowledge of each other.
A large reason for this lack of communication between computing pioneers was a conceptual
barrier. There wasn't any one field devoted to the creation of computers. At this time,
you could just as easily have an engineer or a physicist working on the problem, or even a
psychologist or a mathematician. The word computer wasn't even used universally by most of these
early researchers. Another isolating factor was the basic fact that most of the ideas coming out
of these early days were not accepted by the scientific community at large, so many pioneers did their work in the
shadows, waiting for the climate to change. One of those trailblazers was named Doug Engelbart.
It was sometime in the early 50s that Engelbart started to formalize his thoughts on the matter
of computing. While working as a researcher at the Stanford Research Institute, or SRI, he slowly started to write Augmenting Human Intellect: A Conceptual Framework.
This publication would build on Bush's earlier writings, but in a way that was far more than just derivative.
Engelbart laid out a revolutionary vision.
Augmenting Human Intellect stated its goals as follows.
1. That it provide perspective for both long-range basic research and research that will yield practical results soon.
2. That it indicate what this augmentation will actually involve in the way of changes in working environment, in thinking, in skills, and in methods of working.
3. That it be a basis for evaluating the possible relevance of work and knowledge from existing fields and for assimilating whatever is relevant.
4. That it reveal areas where research is possible and ways to assess the research, be a basis for choosing starting points, and indicate how to develop appropriate methodologies for the needed research.
That's a little dense, as most academic papers are, but what Engelbart is promising is remarkable if he actually delivers upon it.
Essentially, the four points of the study boil down to this.
One is a long-term and short-term roadmap for human-computer symbiosis.
Two is to answer the question of how augmentation will change us as humans. Three,
what existing technology and research are needed? And four, what do we still need to research and
invent to make this human-computer symbiosis occur? So, over the course of the following 144 pages or so, Doug would deliver on this.
He laid out a map for improved human-computer interfaces from 1945 onward, with the how and when.
More importantly, he drew these ideas on increasing human-computer symbiosis from how humans think and work on our own, while also describing how having better access to and control of computers will change how humans will work and think in
the near and distant future. Now armed with a map, Engelbart needed funding to blaze out his new
trail. In 1962, he would get just that. After Engelbart had spent a few years being refused by
multiple institutions, DARPA would eventually agree to put forward the funds needed to start
what would become the Augmentation Research Center, aka ARC. So, how did Doug and ARC deliver on their promises?
Sadly, the state of computing in the early 60s was still extremely primitive.
Just to give you a sense, by the time Augmenting Human Intellect was published,
Fortran, one of the oldest high-level programming languages, was five years old.
Computers were really only in the realm of big businesses and a few research labs. I mean, the first patent for an integrated circuit chip wouldn't come around
until 1964. Computing still needed to come a long way to catch up with Doug's vision before
anyone could become augmented. Let's fast forward to the actual demo so we can start getting into some of the meat of this subject.
On December 9th, 1968, Doug Engelbart walked into the Civic Auditorium in San Francisco
and sat down at a new terminal on stage.
By the time he left, the computing industry was forever changed. In the six years since
receiving funding, ARC had developed Augmenting Human Intellect from an idea on paper into a suite
of software now called NLS. But what did NLS offer? Let me start on the most basic and most
visible level. NLS was used from a terminal that had a standard keyboard, a mouse, and a device called a chorded keyset.
The keyboard was nothing special.
Keyboards had been used with computers for decades at this point.
The exciting development here was the mouse.
Today, we largely take the mouse for granted, but in the 60s it came as a
total revelation. The original mouse designed at ARC was a hand-sized device with three buttons
on the top edge that was used by sliding it across the desk to point at items on the screen.
Essentially, with a few alterations, it would be the exact same mouse we're familiar with today.
The other new device that showed up at the demo was a chorded keyset.
This device was more of a miss. It never really caught on as well as the mouse.
Physically, just imagine a small five-key piano.
The chorded keyset was played by pressing one or more keys together.
Each key combination would run a different function.
Essentially, it operated somewhat similarly to a modern MacroPad.
It's hard to describe without having one in front of you in person, since there's not really an equivalent that we use with computers today.
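If it helps to see the idea rather than imagine it, here's a tiny, purely illustrative sketch in Python of how a five-key chord can be read as a single input. The key names and the chord-to-output table are invented for this example and are not taken from NLS.

```python
# Purely illustrative: five keys, pressed in combination, read as one 5-bit
# value, with each value mapped to an output. The mappings are made up.
KEY_BITS = {"key1": 1, "key2": 2, "key3": 4, "key4": 8, "key5": 16}

CHORD_TABLE = {
    1: "a",           # key1 alone
    2: "b",           # key2 alone
    3: "c",           # key1 + key2 pressed together
    31: "<command>",  # all five keys at once
}

def read_chord(pressed_keys):
    """Treat a set of simultaneously pressed keys as a single chord."""
    code = sum(KEY_BITS[k] for k in pressed_keys)
    return CHORD_TABLE.get(code, f"<unmapped chord {code}>")

print(read_chord({"key1", "key2"}))  # -> c
```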
That's basically it for the actual physical layer, at least
as far as new technology goes. The terminal itself also had a video display and a modem
for communicating with the mainframe, but neither of those were new technology at this point.
In this way, the NLS terminals weren't a total reinvention of computer-human interfaces. Rather, they were an
expansion. Up to this point, the only way of interactively using a computer was via a keyboard
and video monitor. Adding the mouse was like adding an entirely new axis to the equation.
Humans now had a new and more intuitive way of controlling a computer.
Sweet. So now we have a super cool new mouse.
On its own though, that's pretty useless no matter how cool it sounds.
You really need something to use the mouse with.
And in the case of NLS, there was plenty of new software to back this new and exciting hardware. The killer app here
was the Hypermedia Viewer and Editor. This software was the real backbone of the NLS experience.
Now, Hypermedia and Hypertext are something that we're all very familiar with in function,
if not in name. The current version of hypermedia that
is used the most is HTML, the language used mainly for websites. In HTML, a document is written up
as normal looking text with some added code to link between pages or embed objects or add
formatting. The hypermedia in NLS worked in the same way. On screen, you would use a mouse
pointer to navigate between pages using links. Another piece of the puzzle was a hypermedia
editor: basically, a fancy text editor that would be used for creating or editing hypertext pages
on the fly.
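To get a rough feel for the underlying idea without tying it to any particular markup language, here's a minimal Python sketch of hypertext as pages plus named links. This is my own illustration; the page names and structure are made up and don't reflect how NLS actually stored its documents.

```python
# A toy model of hypertext: each page is plain text plus a table of links
# that point at other pages. The page names and contents are invented.
pages = {
    "home": {
        "text": "Welcome. See the history page for background.",
        "links": {"history": "history"},
    },
    "history": {
        "text": "Background notes. Follow the home link to go back.",
        "links": {"home": "home"},
    },
}

def follow(page_name, link_label):
    """Follow a named link on a page, like clicking it with the mouse."""
    target = pages[page_name]["links"][link_label]
    return pages[target]["text"]

print(follow("home", "history"))  # jumps to the text of the "history" page
```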
There were a few other, less standard things about text editing in NLS that I want to talk about, and these features are innovations that make it an especially important stone in the road to modern computers.
Those features are version control and collaborative editing.
Version control is a feature that you might not know about
unless you program a lot. Simply put, version control is a way to keep track of changes made
to a file and, once you mess up the file, roll back those changes. Current-day version control
systems like Git are almost universally used by programmers, but everyday computer users
also use version control on the regular. If you've ever had to hit Ctrl-Z to undo an
accidental change in a Word document, then you can thank NLS for that.
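As a rough, purely illustrative sketch of that idea (not NLS's actual mechanism, and not how Git works internally), version control can be as simple as keeping a history of snapshots that you can roll back to:

```python
# A bare-bones sketch of the idea: every edit saves a snapshot, and
# "undo" rolls the file back to the previous snapshot.
class VersionedText:
    def __init__(self, text=""):
        self.history = [text]   # every saved version, oldest first

    def edit(self, new_text):
        self.history.append(new_text)

    def undo(self):
        if len(self.history) > 1:
            self.history.pop()  # discard the most recent change
        return self.history[-1]

doc = VersionedText("first draft")
doc.edit("first draft, plus an accidental change")
print(doc.undo())  # -> first draft
```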
The other feature I want to mention is collaborative editing. In the modern day, we see this when we use something
like Google Docs to share a file with a friend. Basically, this allows more than one person to
edit a file at the same time. Just as the name suggests, the whole point of this feature is to
enable collaboration on projects. So, if we're keeping track at home, NLS has a new, intuitive human interface, linked pages of information, and the ability to add new information and make new links that can be shared with other users.
So far, that's sounding an awful lot like Bush's Memex in the flesh, but that's just the surface.
There are still a lot of future tech ideas tucked away
deeper inside NLS. Most of the other features I'm going to mention are things that fade into
the background of computer use today, but at the time they were new and unheard of feats.
One of the flashier features introduced was multi-windowing. A user could break up their screen into spaces for multiple tasks, just like we do today with windows.
Another innovation that to us now seems like second nature is the idea of context sensitivity.
In NLS, the mouse cursor would change depending on what it was hovering over.
It would display as a normal dot if it was hovering over nothing, but would change to an arrow for a link or a text
cursor for editable text. This also carried over into NLS's help and documentation. When you pulled
up the help menu, it would pertain to where you were in the system. Another key to the design of NLS was a uniform user interface.
All the software that made up NLS was interacted with using the same controls and the same interface software.
Again, this is a feature that we think of as core to modern software, so much so that we totally take it for granted today.
While this isn't an exhaustive list, those are the more recognizable advances that NLS debuted.
Much of the other new technology created at ARC was on the more technical side, things that most computer
users, even most programmers, will rarely interact with. Those were innovations such as virtual terminal protocols
and remote procedure calls, just to name a few.
Alright, so that's the description of what NLS really offered to a user.
At this point, it should seem like some inexplicably modern computer system
that just happened to appear as a bolt from the blue in the 60s.
A few intrepid engineers and programmers designed and developed technology decades and decades ahead of their time.
Well, partly this is really true, but the fact of the matter is that those engineers and programmers were working with technology that was designed in the 60s, and also on a pretty tight budget.
During the demo, NLS was running on a single SDS-940 computer, a very large and early mainframe.
With this one mainframe, ARC was able to control about a dozen workstations at once.
A lot of tricks had to be employed to make NLS work with the limited resources available. The keyboards, mice, and keysets were all fed
directly into the shared mainframe, but the video displays were controlled in a totally different
way. The mainframe would output video to a series of small 5-inch CRTs that were
being watched by a set of CCTV cameras. The CCTV signal was then sent out to each workstation's
larger 17-inch display. This is a pretty weird and janky signal flow. Partly, though, this was done to save costs.
The mainframe's relatively small, higher-resolution display tubes were the only expensive or custom part in the whole system.
Since the rest of the signal flow was made using entirely off-the-shelf and readily available parts, this setup was relatively cheap.
It was also faster to set up this system instead of developing new hardware to transfer and receive digital video signals.
Since NLS was already connected up using CCTV, it was also relatively easy to add in video conferencing.
Another camera mounted like a modern webcam
was added to some workstations and then fed through the same video controls used for the
terminal displays. The second video feed was then superimposed over the main NLS video output so
remote collaborators could actually look at and talk to one another while working.
ARC and Doug Engelbart were able to create something that truly delivered on the promise of changing the human experience.
But if NLS was such a game changer, then why aren't we all using it today?
In reality, we kind of are.
While NLS didn't really ever become a product that sold and powered computers in the home,
the groundwork laid by Doug and his team seeded the creation of a lot of the personal computer hardware and software
that would take the world by storm in the coming decades.
The Apple Macintosh, for instance, can trace its lineage directly back to NLS,
but that's a topic for another episode.
But don't think this is done. I think one of the most special things about the mother of all demos
is how well it's documented. A lot of computer innovation happened over decades, but NLS is
special in that it had one central coming out moment. The entire 90-minute
demo was recorded and is archived and easy to find online. If this episode interests you,
then I highly recommend checking it out. I'll have a link to where you can watch it in the
description. Watching the demo 50 plus years later is one of the best ways I can think of
to fully understand
just how groundbreaking augmenting human intellect was.
Thanks for listening to Advent of Computing.
If you liked this episode, we should be back in two weeks with a new one.
Or you can find all our past shows on adventofcomputing.libsyn.com.