Advent of Computing - Episode 42 - IBM Gets Personal
Episode Date: November 2, 2020

This episode is not about the IBM PC. In 1981 the Personal Computer would change the world. Really, it's hard to talk about home computing without diving into it. But I've always had an issue with the... traditional story. The PC didn't come out of left field; IBM had actually been trying to make a home computer for years. In 1981 those efforts would pay off, but the PC wasn't revolutionary hardware for Big Blue, it was evolutionary. So today we are looking at that run up with SCAMP, the 5100, and the Datamaster. Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing
Transcript
It's hard to talk about the rise of the personal computer without mentioning, well, you know,
IBM's personal computer.
Once the machine hit shelves in August of 1981, the world really changed.
It changed how people looked at computers, and it changed how computers were accessible
to anyone.
The machine, and perhaps more importantly, the machine's design, was more successful
than anyone ever thought possible.
In 2020, most computers are still based off the IBM original.
I've even made this whole episode on PC-based hardware.
Needless to say, if we're looking for a computer that shaped the modern world more than any other, well, the PC is one of the clearest candidates. But for such an important computer, I think its story isn't always covered in the right way. The traditional narrative goes
something like this. IBM wanted in on the new home computer market, but they didn't want to spend all
that much money on a risky project. So a small and scrappy team was put up to the task. The final product
was a gamble. The machine was well outside IBM's area of expertise. However, it all paid off.
By breaking out of their comfort zone, IBM revolutionized computing and made a pretty
penny at the same time. Now, a lot of that is true, at least broadly speaking. The PC was vastly different
than IBM's usual fare, but this wasn't totally uncharted territory for the company. The PC was
their first successful home computer. But IBM had actually been working on home systems for quite
some time at this point. In fact, they were
one of the first companies to give the market a shot. Now, this shouldn't be a huge revelation.
The hints are out in the open. The PC's official name is the IBM 5150. So why did it have such a
long number if it was supposed to be a break from convention? That's an easy one. It wasn't.
Before the PC was ever a memo,
before the Apple II,
even before the Altair 8800,
IBM was trying to make a home computer.
The PC would never have made it to market
if IBM didn't have the space
and, more importantly, the time
to experiment and learn from their failures.
Welcome back to Advent of Computing. I'm your host, Sean Haas, and this is episode 42,
IBM Gets Personal. Now, before we actually dive into the content, I have a few announcements to make at the top.
It feels like forever since I've actually made announcements on the show.
So let's break that trend.
Firstly, I just launched a new website.
It's called oldcomputerpods.com.
And it's not really just for my podcast.
It's more for other shows.
Old Computer Pods is a directory of other retro computing and computer history related podcasts. Right now I have, I think, 22 shows up, and it
automatically updates, pulls in new episodes every day. So if you like this show, then chances are
you could find some more interesting content over at oldcomputerpods.com. Second announcement,
and one much more related to Advent of Computing
itself, I've been running Patreon for a while. And back in August, I put out a bonus episode
for patrons. It's about the MIL-STD-1750A microprocessor. It's a kind of interesting,
obscure processor design that was built by the US military for aviation and some other bizarre
applications. I'm planning to do another bonus episode in November, which if you're listening
when this comes out, that's this month. The idea with these special bonus episodes is they'll be
companions to the show or just shorter, more lighthearted takes on computing history. To that
end, I have a poll up on Patreon right now for
supporters. You can go and choose from three options for which bonus episode sounds the most
interesting to you. So if you want to participate in the poll, get the last bonus episode and the
upcoming one, then head over to Patreon. It's just $1 a month, and I'm hoping that there'll be a lot
more bonus content coming in the future.
So with that out of the way, let's get to the show. Today, we're not talking about the PC.
We're going to be looking at the almost personal computers that led up to the eventual PC. Now,
I sort of fell into this topic. My plan was for this episode to be the next step in my ongoing and perhaps unending Intel series.
But I ran into a bit of an issue.
The next chip to cover is the 8085.
Now, I had plans tucked away in spreadsheets to talk about the chip's development,
its eventual role as a microcontroller, and just microcontrollers in general.
The idea was for this to be a bridge between the 8080 episode and the eventual coverage of the 8086.
But hey, you know what they say about plans.
When I actually sat down to draft an outline and hopefully find some sources,
well, I kinda drew a blank.
Information on the 8085's development is pretty scant.
Which I guess makes some sense. It is sandwiched right between the two biggest chips that Intel ever made. And in a lot of ways, it's just a minor revision to
the 8080. But something did come up in my search. The IBM System 23 Datamaster. It's a computer
built using the Intel 8085 that directly preceded the PC. A lot of the technical details
from the Datamaster were actually carried over to the more successful system. Looking more into
the Datamaster, it's easy to fall down a rabbit hole. One that I think is pretty worthwhile.
You see, I wanted to cover the 8085 because it was the chip that first brought IBM and Intel
together. But IBM's home computing aspirations didn't actually start with Intel.
It goes back a good deal further.
So today's episode is going to be examining those early aspirations,
eventually leading up to the IBM-Intel alliance that's within the Datamaster.
It's a story that, at least for me, subverted a lot of my expectations.
IBM is often colloquially called Big Blue.
You've probably even heard that name come up on the show before.
The origins of that nickname are a little murky, but there's two main theories.
The big part is easy.
Historically, IBM has been the largest computer manufacturer in the world.
They dominated the punch card market and then dominated the mainframe market for decades.
Blue either came from the color of their logo or, according to some folklore,
the color of suits that employees wore when they went on sales trips.
Either way, I think the name Big Blue really encapsulates the feel around IBM.
It's a big, serious business. They make big,
expensive machines for big, powerful businesses. And they are anything but personal. And I think
that that really makes the narrative of this surprise PC easy to swallow. But the actual
story is a lot more complicated, as with a lot of things on the show, and I think that makes it a lot more important.
So let's look at how Big Blue got more personal.
What led them down the road that eventually ends up with the PC?
What kinds of trials and tribulations did they face?
And perhaps most importantly of all, how did they learn to let go and trust third parties? If you want to get really technical, then the roots of the PC don't actually go back to a computer, but instead a
programming language. That language went by the very uninspired name of A Programming Language.
I know, it's usually just called APL. It had been developed in the 60s at IBM as a replacement of sorts for Fortran,
and over the course of the decade, it had become a key feature on the company's mainframes.
But as with any product, adoption could always be wider, and sales could always go up.
In 1972, some functionary came to the conclusion that the real issue was just a lack of knowledge of APL,
and that the best remedy would be some entry-level product that showcased the language.
To the higher-ups, the details didn't super matter, they just wanted something.
So the call for a new product was shopped around inside IBM. I've seen it put that they were
looking for a, quote, pocket calculator or something that ran APL,
end quote. You know, maybe not the most specific request, but inadvertently, this would spark
wild change inside the company. The primary person to take up this call was one Dr. Paul
Freidel, a researcher at IBM's Los Gatos lab. Besides being just plain big, IBM in the 70s was a very diversified
company. They did a whole lot more than just manufacture mainframes. Freidel's career is a
perfect example of this. He held a PhD in chemical engineering. You know, nothing that related to
computers. But he had been exposed to an IBM computer at a vulnerable time during grad school.
After learning to program and eventually graduating, Freidel went off to work for Big Blue itself.
His initial assignment was at the Advanced System Development Division.
Instead of working directly with computers, his group focused primarily on process control.
In other words, systems designed to manage industrial machines,
factories, shipping, you know, machinery kind of stuff. By the time IBM's brass was looking for a
new APL device, Freidel had moved up to the Scientific Center. Now, Scientific Center is a
bit of a misleading name. Division or group may have been better. There were actually two centers that made up
the Scientific Center, one in Los Gatos and the other in Palo Alto. Now, funnily enough,
these operated in a similar fashion to the much more famous Xerox Palo Alto Research Center.
They were somewhat separated from IBM's larger corporate machinations, even going so far as to be geographically isolated from other IBM offices.
Employees at the Scientific Center had a lot more leeway to work on new projects.
And being affiliated with the largest computer company in the world,
well, that meant that these projects could be well-funded and well-staffed.
It was a perfect environment for real wild ideas to take shape. And as 1972
drew to a close, Freidel found a pretty wild idea floating around in his head. Now, Freidel's big
idea was that using current technology, he could actually create a small single-user computer.
From our standpoint in 2020, that seems like not much of a leap. But in this case, context is really everything.
The entire computer market was focused on large-scale, multi-user systems.
Massively expensive and powerful mainframes were the status quo.
The closest thing to a personal computer was either a fancy desktop calculator or some kind of smart terminal.
But those were really just hitting
the market at the time. To be fair, a self-contained computer that was owned and operated by a single
person wasn't a totally new idea, but at the time it was firmly in the realm of theory and science
fiction. This kind of idea was in the zeitgeist, and Freidel was starting to see that this type of machine
may actually be possible. He started drafting rough plans and drawings in late 1972,
and by January of 73, he had a proposal together. Freidel called this machine SCAMP,
the Special Computer APL Machine Portable, and from the outset, it was a huge departure from IBM's comfort zone.
His proposal laid out how, using only current IBM hardware, a machine small enough to carry
could be constructed. When Freidel presented his idea to management, all they could say was,
well, wouldn't that be something? And I think that's a pretty good reaction. It was a far-out goal, but Paul's rough draft made it seem like it could actually be attained. He was given just six months to work up a prototype.
The architecture of the SCAMP system was quite conventional.
Moreover, the short time available for development meant that the system had to be built from existing hardware and software components as much as possible.
The thing that was new about SCAMP as a device was its packaging.
Everything had to be based around technology that was already available.
It would just be combined for a new use.
You don't really need to reinvent the wheel, just find a new way to use a wheel.
Not only was this a very pragmatic approach, but it played directly into IBM's hand.
Big Blue had a level of vertical integration that let them do some really weird things,
to put it lightly. They didn't just make mainframes. They made processors. They could produce their own integrated circuits. Even something as mundane as a keyboard or plastic
housing could be produced in-house. This meant that Freidel had ready access to any hardware
that he could ever desire. However, that's not to say that Scamp's design would be very conventional
today. The overall bill of materials is somewhat familiar,
but the specifics get a little bit less so. Freidel's design called for the entire machine
to be one piece, with everything packed into a single chassis. Inside the case would be a small
display, power supply, keyboard, removable storage, and all the important processor and memory components.
Freidel's early sketches depict something that looks kind of like a suitcase. On the front,
there'd be a small CRT display mounted on the left, with a tape drive on the right side of
the unit. Under all that would be a keyboard, with the bulk of the machine in the bulky back
part of the box. Now, if we just look at that item for item, Freidel is using the same
components that, roughly speaking, show up in something like a laptop. I mean, you can detach
the keyboard and the overall description is even close to the future Macintosh. He also planned for
Scamp to be portable. I mean, it's right there in the name. However, that may be a bit of a misleading term in this
case. The machine was never planned to be used on batteries. It had to be plugged into a wall.
There would even be a lid used to protect the keyboard and CRT and a handle. But it wasn't the
kind of portable computer you want to carry very far. The final prototype would weigh somewhere along the lines of 90 pounds.
What makes Freidel's design so revolutionary is, as always, the context. In 1972, there was nothing
close to a personal computer, especially not one that you could buy. The best we had was theory.
The very same year that Freidel was drafting plans for his very real personal computer, Alan Kay published a paper that outlined his near-future plans for the Dynabook.
Alan Kay's theoretical machine is often pointed to as one of the first descriptions of a portable and personal single-user computer.
Key here is that Kay's paper stressed Dynabook was not quite possible yet.
But here we have a less well-known IBM project that was hitting a lot of the same beats as the Dynabook.
But for Freidel, there was nothing theoretical about his proposal.
A prototype was planned to be ready by the end of 1973, not some ethereal near-future date. The internals and all the finer details of Scamp would start to take shape soon after Freidel received funding from IBM's management.
He was able to assemble a crack team of 10 Big Blue veterans. It was split 50-50 between software
and hardware developers. Having ready access to such a large pool of talent, that really allowed
Freidel to get the project off the ground as quickly as possible.
This is where we really start to stray from the familiar into some of the stranger parts
of SCAMP. Components like the screen, tape drive, and keyboard were pretty much run of
the mill. You can't really get too fancy with the basics. But on the inside,
Scamp used a somewhat new technology, at least really new to this application.
The processor that made everything work was an IBM Palm, otherwise known as Put All Logic
in Microcode. Now, this is one of those things that makes some sense. Palm was a processor made entirely in-house by IBM.
You know, may as well use what you have around, right?
And Freidel most likely had some experience with the processor from working on industrial
control systems.
But, man, to put it lightly, Palm is strange.
It officially launched in 1975, so Freidel was working with a new and pretty secretive piece of technology.
In Freidel's writing and the technical information, it's referred to as a microprocessor, but that's not in the modern sense of the term.
Palm was just a small card, but it was composed of multiple integrated circuits.
So it's not just a processor on a single chip.
Big Blue called it a microprocessor because, you know, it executed microcode.
Basically, a lower-level machine code used to build up an instruction set.
But here's the twist.
Palm hadn't been used as the processor for a computer before.
Previously, it served as something closer to a microcontroller. It managed industrial hardware
and computer interface systems. IBM just wasn't interested in making small-scale computers,
but it did have a need for some smart hardware controllers. That meant that when Scamp needed a brain,
Freidel and his team had to get creative. And Palm ended up being a pretty creative solution.
It was compact enough to drive a small computer, but also offered the flexibility that Scamp would
need to get off the ground. The software side of the project also took an interesting tack.
With such a short timeline, remember that
a prototype had to be completed within six months, well, there had to be some smart planning. From
the outset, it was clear that APL, well, that could become a really big problem for the team.
Since Palm had never been used as a standalone computer before, there just wasn't an implementation
of APL for that processor.
Writing their own compiler for Palm would definitely take more than six months.
But this is actually a case where Palm really shined. As Freidel explained,
the solution was therefore to emulate a processor for which a suitable APL system already existed. With this as a strategy,
we chose to write an IBM 1130 emulator in Palm Microcode, and then plug in almost all of an
1130 APL system which was available. In this way, we were able to replace a several person-year
programming effort with one of several person-months, end quote. Now, this may require a
little bit of an explanation. Starting as far back as the 1960s, microcode had become something of a
specialty for IBM. It's an interesting and, frankly, confusing technology. So stay with me here and I'll
try to explain. It's probably easiest to think of microcode as a level somewhere between machine code and the actual hardware.
Microcode allows a hardware developer to define a processor instruction set.
And it's done at such a low level that a program running would never notice the difference.
A microcode-based processor like Palm can't really be used out of the
box. It only supports extremely basic micro-instructions. I mean, sure, you could program directly on
that, but there's a lot better things to do with microcode. By leveraging microcode,
you can build up a full instruction set totally separate from the underlying hardware. And,
best of all, it's really just a program,
so you can dream up whatever instructions you want. The trick that Scamp pulled was implementing
an existing instruction set on Palm, specifically that of the IBM 1130. Any software that could
run on that computer could run almost the same on Palm. The Scamp version would be a little less powerful,
but it was still able to use existing code.
And it turned out that writing up some microcode
took a lot less time than writing a whole new APL compiler.
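To make that microcode idea a little more concrete, here's a toy fetch-decode-execute emulator in Python. It's purely illustrative: the opcodes, registers, and routines are invented for this sketch and have no relation to the real Palm micro-instructions or the 1130's actual instruction set. The point is the shape of the technique: each "macro" instruction is just a small routine built from simpler primitive steps, so an existing binary for the emulated machine can run unchanged.

```python
# Toy illustration of microcoded emulation (invented opcodes, not real
# Palm or IBM 1130 code). Each instruction of the emulated machine is
# defined as a small routine built from simpler steps, the way microcode
# builds a full instruction set on top of bare hardware operations.

class ToyEmulator:
    def __init__(self):
        self.acc = 0              # accumulator register
        self.pc = 0               # program counter
        self.memory = [0] * 256

        # The "microcode store": maps each emulated opcode to the routine
        # that implements it in terms of lower-level operations.
        self.microcode = {
            0x01: self._load,     # LOAD addr  -> acc = memory[addr]
            0x02: self._add,      # ADD addr   -> acc += memory[addr]
            0x03: self._store,    # STORE addr -> memory[addr] = acc
        }

    def _load(self, addr):  self.acc = self.memory[addr]
    def _add(self, addr):   self.acc = (self.acc + self.memory[addr]) & 0xFFFF
    def _store(self, addr): self.memory[addr] = self.acc

    def run(self, program):
        """Fetch-decode-execute loop: a binary for the emulated machine
        runs unchanged, which is how SCAMP reused existing 1130 software."""
        self.memory[: len(program)] = program
        self.pc = 0
        while True:
            opcode, operand = self.memory[self.pc], self.memory[self.pc + 1]
            self.pc += 2
            if opcode == 0xFF:    # HALT
                break
            self.microcode[opcode](operand)

emu = ToyEmulator()
emu.memory[100], emu.memory[101] = 40, 2
# LOAD 100; ADD 101; STORE 102; HALT
emu.run([0x01, 100, 0x02, 101, 0x03, 102, 0xFF, 0])
print(emu.memory[102])  # -> 42
```

Swapping in a different dictionary of routines would turn the same underlying "hardware" into a different machine entirely, which is exactly the flexibility the SCAMP team exploited.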
The weird side effect here is that,
really for all intents and purposes,
this made Scamp a miniaturized, desktop-sized version of a much larger IBM system.
While not super important to the final product, I just think this is a neat detail. It shows that
this radically new computer was still very much cast in the mold of an IBM system. This compromise
approach of using existing components and software to make a new machine, it worked a lot better than
Freidel anticipated. The project was moving forward quickly, but the problem is, that gave
Paul some breathing room to think, and, well, he ran into a strange realization. Quote,
So you have a personal portable APL machine. What do you do with it? How should applications be presented to the user?
How could Scamp best be handled as a product? How could I get these messages across to IBM
management? End quote. In other words, he had been so busy asking if he could make a personal
computer that he never stopped to ask if he should. Now, I love this. For me, this is the part that really flips the
traditional PC story right on its head. Neither Freidel nor IBM were coming into this project
with a problem to fix. They weren't looking to get into the home market. Instead, Freidel was
realizing that a small computer, well, yeah, you could make one,
so why not give it a shot, see where it goes?
They were flying totally blind into the future, so much so that they didn't even really
know what a PC should be used for.
So it would be in the middle of 1973 that Freidel and his team of engineers, well, they
had to sit down and fix that.
They had to
figure out why someone would want a personal computer to begin with. And really, what good
could it do anyone? Now, Freidel had a few ideas, but not all were very compelling. In his mind,
the most simple answer was SCAMP could be used as a, quote, super desktop calculator, which, yeah, but that's kind of boring.
Alternatively, you could just write APL programs on the machine. Either of those ideas is useful for, say, Paul himself, but he wasn't really selling the idea to other programmers. Quoting
from Freidel again, the first sale had to be made to high-level company management who were not, in general, experienced APL users.
It seemed that my best shot was to demonstrate Scamp's capability in the third area, i.e. as an application machine for non-programmers.
Once again, that's a really simple concept for us today.
Of course you use a computer if you're not a programmer.
But in 1973, it was pretty radical, and I mean that in all senses of the term.
Freidel was using IBM's executives as a target audience.
People who had probably never touched a computer in their lives, or at the very least, hadn't
in quite some time.
At the time, computers were firmly in the technical realm. You didn't really come in contact with a computer unless you were a programmer, scientist, or someone maintaining
the system. There had been work towards making more user-friendly systems, but that line of
thinking was still relatively new. Once again, Alan Kay's Dynabook is a wonderful
contemporary example. Kay described his computer as being used by children, businessmen, and
programmers alike. In theory, that was revolutionary. But in practice, that was even more so.
But then the question becomes, how do you make a computer usable for a non-programmer?
Freidel's solution was actually simple. Make the machine boot up into a series of menus.
Each screen asks a simple question like, what program would you like to execute? From there,
it offers a set of options. Traversing through menus took you to various programs that Freidel's team had developed. You could drop out of the menus and program APL if you wanted to. Or instead, you could load up a simple spreadsheet program.
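That boot-into-menus flow is easy to picture as a simple dispatch table. The sketch below is hypothetical; the program names and layout are invented for illustration, not the actual software Freidel's team wrote.

```python
# Toy sketch of a SCAMP-style boot-to-menu interface. The program names
# here are hypothetical stand-ins, not the real SCAMP software. Each
# screen poses one question and offers numbered options; a valid choice
# dispatches to a program, and one of those options drops into APL.

PROGRAMS = {
    "1": "APL workspace",
    "2": "Spreadsheet",
    "3": "Day planner",
    "4": "Help file",
}

def render_menu():
    # Build the text of one menu screen, question first, options after.
    lines = ["What program would you like to execute?"]
    for key, name in sorted(PROGRAMS.items()):
        lines.append(f"  {key}. {name}")
    return "\n".join(lines)

def select(choice):
    # Return the chosen program's name, or None for an invalid entry
    # (a real menu would simply redraw the screen and ask again).
    return PROGRAMS.get(choice)

print(render_menu())
print(select("2"))  # -> Spreadsheet
```

The whole trick is that the user never has to type a command, only pick a number, which is what made the machine approachable for non-programmers.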
You could run a day planner just as easily as loading up the help file. In an era where
contemporary machines were accessed over glowing terminals using long and archaic incantations,
Scamp must have seemed downright futuristic. And come the fated
day of the demo, IBM's top brass seemed to totally agree. Freidel's final trick was to get IBM
executives to see why Scamp was such an important project. So he decided the best way was to really
get them to use the machine itself. Freidel put it like this, and I really like this description.
Quote,
Many of my victims had not had recent typing experience,
much less hands-on computing or programming experience.
When demonstrating Scamp to various IBM executives,
I had no notion of how much pressure I'd be putting them under
by asking them to execute a simple, to me, demo.
Here I was, insisting that they perform tasks which they had never done before, in front of their peers.
I hope all of these fine people have either forgotten or forgiven me.
The outcome was superb.
No system failures, and my new set of no longer inexperienced users performed beautifully.
I knew we had arrived when one of the execs said,
gee, anyone should be able to do this. Boy, was he right. End quote.
Freidel had really pulled off an insane feat. In the course of six months, he and his team of 10 had created a totally new type of computer.
Scamp lit a fire at IBM. The prototype convinced the company's higher-ups that it was possible to
create a computer that anyone could use, not just programmers. From there, Big Blue would enter into
a long-lasting chase after the personal computer. By the summer of 1973, the prototype
part of Scamp was done. The decision was made to fully fund the project and push it into production.
But that wasn't going to be a simple process. There were still details to iron out. Whatever
Scamp turned into was going to be a consumer product. And, you know, IBM didn't really do consumer products, unless you count typewriters.
The transition from lab to market is where Scamp gets a proper name, the IBM 5100. And now,
if it wasn't apparent before, we are firmly in the line of succession that leads to the 5150,
aka the personal computer. But despite the naming convention, the 5100 wasn't
anywhere close to the better-loved PC. Its design borrowed heavily from Scamp, but there were a few
minor changes that I want to talk about. The most obvious difference is the case. The prototype that
Freidel developed, well, it worked, but it wasn't really the prettiest machine. So the 5100
had a totally new exterior, molded from hard plastic and steel. The basic shape remained the
same, a front section with all the controls and a large rear section containing the actual guts of
the computer. But everything had been given a little more spit polish. The overall look is much more in line with other IBM products.
The colors and curves of the case matched the beige CRT terminals
the company was producing in that era.
The internals were also largely unchanged.
A new power supply was slotted in,
one that was still in development while Scamp was being built.
The same Palm processor card was the core of the operation,
and very similar I/O was present. But there were some pain points, and while Palm was the best tool
that IBM had at their disposal, it came with its own annoying issues. In general, Palm is a 16-bit
processor. Some of the inner workings are 8-bit, but the memory bus is 16 bits wide.
No matter what fancy mainframe instruction set the Palm was microcoded to run,
there was a hard cap on the amount of memory that it could physically access. That size bus lets
you address up to 64 kilobytes of RAM. For an early home computer, that's already very impressive.
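That 64-kilobyte cap falls straight out of the bus width: 16 address lines give 2^16 distinct addresses, one byte each. A quick back-of-the-envelope check:

```python
# A 16-bit address bus can name 2**16 distinct byte addresses.
address_lines = 16
addressable_bytes = 2 ** address_lines
print(addressable_bytes)           # 65536 bytes
print(addressable_bytes // 1024)   # 64 kilobytes
```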
But we aren't dealing with a run-of-the-mill company. For one, no one had really made a personal
computer before. I keep driving that point home, but it's important. It explains a lot of the
weirdness and a lot of the issues that IBM ran into. In this case, the parameters were kinda up in the air. Can a single user
get by with 1 kilobyte of RAM? Or do they need something closer to a megabyte? Systems
like the Apple II would eventually show that a user can get by with just 4 kilobytes no
problem, but commercially successful home computers were still years away.
The other consideration for the bus was ROM,
the read-only memory that was planned to store most of the 5100's software. Just a quick digression,
but in most modern computers, ROM is accessed the same as RAM. They're on a shared memory bus. To
the computer, the chips all bleed together. But ROM really just presented some more questions.
How much software would IBM need to bake into the machine to make it useful? What about if there
were updates to the default software? There were just a lot of unknowns. Add to that the fact that SCAMP,
and now the 5100, were very much miniaturized mainframes. Scamp ran the same software as much larger IBM computers,
just at a more diminutive scale.
In IBM's eyes, 64 kilobytes may not be enough for all use cases.
So they did what Big Blue does.
They over-engineered the machine.
Specifically, a method called bank switching was used.
It let Palm access more memory.
It allowed the computer to switch between multiple RAM and ROM chips,
more than the 16-bit bus could address on its own.
Bank switching adds complexity, and it can make read and write times a little slower,
but the upside was just too good to resist.
The final design maxed out at 64 kilobytes of RAM,
but could accommodate much more ROM, somewhere in the hundreds of kilobytes.
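Bank switching can be sketched in a few lines. This is a generic model of the technique, not the 5100's actual memory controller; the window size, bank count, and bank-select register are invented for illustration.

```python
# Toy model of bank switching (a generic sketch, not the 5100's real
# memory hardware). The CPU still sees only a 16-bit address space, but
# a bank-select register swaps which physical ROM chip appears in part
# of that space, so total storage can exceed the 64 KB bus limit.

class BankedMemory:
    BANK_WINDOW = 0x8000  # upper 32 KB of the address space is switchable

    def __init__(self, num_rom_banks):
        self.ram = bytearray(0x8000)                      # fixed lower 32 KB
        self.rom_banks = [bytearray(0x8000) for _ in range(num_rom_banks)]
        self.current_bank = 0                             # bank-select register

    def select_bank(self, n):
        """Write the bank-select register: swap which ROM chip the CPU
        sees in the upper half of its 16-bit address space."""
        self.current_bank = n

    def read(self, addr):
        if addr < self.BANK_WINDOW:
            return self.ram[addr]                         # always the same RAM
        return self.rom_banks[self.current_bank][addr - self.BANK_WINDOW]

# Four 32 KB ROM banks = 128 KB of ROM, all reachable through a 64 KB bus.
mem = BankedMemory(num_rom_banks=4)
mem.rom_banks[0][0] = 0xAA
mem.rom_banks[3][0] = 0xBB

mem.select_bank(0)
print(hex(mem.read(0x8000)))  # -> 0xaa
mem.select_bank(3)
print(hex(mem.read(0x8000)))  # -> 0xbb
```

The cost, as noted above, is the extra bank-select step on some accesses, which is where the added complexity and slightly slower reads and writes come from.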
With all that open space, what exactly did IBM do? Well, they did some kind of weird stuff.
Scamp had made very aggressive use of microcode and emulation to stay on schedule.
Conversely, the 5100 team, well,
they had a lot of time on their hands, a lot more than 6 months. But they decided to stick
with emulation. The new machine didn't use the same emulation layer as SCAMP. Instead,
IBM opted to make the 5100 emulate the newer System 370 mainframe. So, why stick
with emulation instead of native Palm code? An internal memo at IBM explained it like this, quote,
In addition to greatly reducing the time and manpower requirements, the technique of using
already existing code yielded a cleaner and more compatible product. The only differences between the 5100 version
of APL and APL on the System 370 were the portions dependent on the input-output devices,
display vs. typewriter, and tape vs. direct access device." So, partly, emulation was used
to save time and costs, but the compatibility aspect here is interesting. With some tweaking,
you could make most System 370 code run on your new desktop computer. The main change would be
for input and output routines, but you could carry over your Big Iron programs to some much
smaller iron. On the APL side of things, a program could be brought over almost verbatim.
This was handy for developers, but also gave a clear picture of IBM's outlook for the 5100.
This wasn't a totally standalone computer. It was still meant as a system that would work
alongside a mainframe. Sure, you don't strictly need a mainframe to get use out of this new, more personal machine, but the 5100 still very much existed in the same ecosystem as larger machines.
Expandability was another interesting addition that IBM borrowed from their mainframe work.
The 5100 had the basic expansion options.
You could increase the amount of RAM in the system, and you could also swap out optional ROM software. A large part of the actual
logic of the machine was also expandable, or more accurately, serviceable. If you
crack open a 5100, you'll find that most of the computer's circuits are on
removable cards. About half of the case is actually taken up by a cage full of
these little cards.
This includes the Palm processor itself, RAM and ROM, input-output cards, and other controller logic.
I wasn't able to find if there were ever any optional expansion cards, but the framework is certainly there.
In practice, this was used to make servicing the 5100 quick and simple. You could replace a bad component without needing to have
a soldering iron on hand. But there's actually a lot more to the 5100. This mainframe-mindedness
is visible in most of the computer's design. The SCAMP prototype had used a cassette tape drive for
long-term data storage, but the 5100 replaced that with a quarter-inch tape cartridge.
The cartridge was more reliable, it was faster, and just in general stored a lot more data.
But it was also a format that a mainframe would be able to more easily read and write.
I/O also helped the 5100 bleed into existing computer rooms.
Around back were new ports for connecting up a printer, an external disk drive, or even
directly to a mainframe. In the latter mode, the 5100 could be turned into just a really,
really fancy terminal. However, there were some less mainframe-oriented aspects of the machine.
You see, APL wasn't the only language offered on the platform. This is where we get back into the brass tacks of timing.
The 5100 was officially released in September of 1975. Development started somewhere during 73.
During that time period, the home computer market actually started to take shape.
Now, 75 was still very much in the earliest days of home computing. But there were some
options out there. In 74, the Altair 8800
was announced. The machine would ship early the next year. S100 systems derived from the Altair
would soon follow. Most of these early personal computers had one big thing in common.
They all ran BASIC. For Altair, Microsoft BASIC would become its killer app. And with good reason.
The language was designed to be easy to learn, even for an absolute beginner.
BASIC was already a popular choice for non-programmers on some larger computers,
so it really made good sense to adapt it for microcomputers.
When the 5100 came out, IBM offered a BASIC interpreter as an add-on.
The reasons for this aren't entirely
clear, but I have a pet theory I'd like to pitch. A lot of the information about the 5100's
development comes from one source, an internal IBM memo written in 1977 by one A.J. Coven.
And sadly, we don't have all that much more to go on. The memo is a few pages long, and it gives some enticing information, but it doesn't go into quite enough detail for my tastes.
It just says that BASIC was included because, quote, of its popularity as an interactive computer language in the United States.
Now, that's vague. That's so vague as to almost be useless.
But I have one final puzzle piece. The first BASIC manual for the 5100 has a copyright date of 1974.
This means that it was written while the computer was still in development.
And most importantly, that's prior to the announcement of the Altair 8800's version
of BASIC. That didn't come out until 1975. So IBM, they weren't taking cues from up-and-coming
microcomputers. Instead, they were presaging the rise of BASIC in the home market. It's more likely
the popularity was a reference to the language's use on larger computers.
To me, what makes this little detail really important is that IBM was a little bit ahead
of the curve.
The inclusion of BASIC on the 5100?
Well, that was one of the puzzle pieces of the PC that Big Blue got right before anyone
else did.
Putting this all together, though, we get a kind
of conflicted picture. The IBM 5100 is branded in no uncertain terms as a personal computer,
but the machine is really just a mainframe hiding in PC's clothing. The software is derived
directly from mainframe code, and the majority of its hardware is made up of mainframe components.
There are little touches that make it closer to the personal computers of the 80s, but the 5100 is still very much part of this big iron environment that IBM was selling.
It's a weird mix. IBM hits some ideas right on the nose, but others are lagging behind.
But things actually get stranger once we hit 1975.
IBM's plan was to sell direct to consumers.
So as the computer hit shelves, a slew of ads also went out.
A base model of the machine sold for just under $9,000.
Adjusting for inflation, that's $43,000. Fully loaded with
more RAM and optional software, the price climbed even higher. Now, that's not hobby computer
money, that's approaching brand new car money. So who exactly was this thing for? Well, according to IBM, anyone who has a business.
Here's a choice clipping from one of their amazing TV ad campaigns.
The 5100 is easy to learn and simple to use.
There are countless combinations of feed we can mix.
What is the most economical for any particular herd?
That's what I'm figuring out now.
The IBM 5100.
It's bringing the advantages of the computer to more and more people.
IBM.
Helping put information to work for people.
Easy.
It's so simple, even a rancher can use it.
And it's so useful that even a rancher needs it.
Now, all jokes aside, the signaling from IBM here is bizarre to me.
Sure, for a computer, the 5100 is actually pretty cheap.
But for someone who hasn't used a computer before, the price makes it a tough sell.
It's not the egalitarian vision that someone like Alan
Kay had. It's more of a utilitarian aid for a large company. However, this ad does show a promising
trajectory. You can't see it here since, well, you can't see anything on a podcast. But the very
beginning of the ad has a familiar video. It opens with someone picking up
the computer, placing it on the desk, and plugging it in, and then just turning it on. It's simple.
It's to the point. Almost a decade later, Steve Jobs would take a very similar angle with the
Macintosh. Within minutes, you can become a computer user. Just buy the machine,
open it up, plug it in, and go. Well, IBM did that first, and they did it almost 10 years before the
Macintosh ever existed. This is just one more example of how IBM was starting to close in on
what a PC should be. Now, the foreshadowing here may be a bit heavy-handed. I've never claimed to be a master
of suspense. The 5100 didn't take off all that much. Few people really bought into IBM's first
pass at a personal computer. But the 5100 series actually stuck around for some time.
An upgraded model, the 5110, came out two years later.
The 5120, a further upgrade complete with a new case and internal floppy drives,
arrived soon after. But all these incremental improvements stuck to the same underlying
PALM-based hardware. IBM was starting to pick away at a personal computer.
But they were still off the mark.
They needed to make another radical jump.
And this is where the story starts to sound closer to the traditional tale.
As the 5100 series was floundering in the market, something really big happened.
In 1977, the Commodore PET, Apple II, and TRS-80 home computers were all released.
This trifecta of microcomputers offered a different take than previously seen. Like the 5100, these systems were targeted at
the layperson. The Apple II, with all its polish, wasn't meant for the average hobbyist. It was an
appliance computer. It was something you could buy for the home, plop it on a desk, and pretty soon,
you too were a computer user. IBM probably wouldn't have cared all that much. But this
is where something weird starts to go on. Big Blue's customers started buying into
microcomputers. They weren't being used as a replacement for mainframes, they filled a
different niche. But Apple IIs and PETs were
starting to show up in offices alongside bigger IBM systems. That perceived threat would be the
spark needed for the next big phase of development. But this doesn't take us to the PC, at least not
for a while. In 1978, IBM put together a project aimed at competing in the changing home market, and hopefully taking
back some of the office market share also. This new computer would be known as the System 23
Datamaster, and this is the direct predecessor to the IBM PC. So, what was the Datamaster?
Once again, we can fall back to one of IBM's strange ads. Now, that already sounds really
similar to the 5100. Did IBM actually make a computer for anyone to use? Was it really a
personal computer? And was it really easy for anyone to understand? Well, that's kind of a
complicated question. The Datamaster is very much a case of IBM taking some direct cues from the market.
Home computers in this period were all built around 8-bit processors.
They sat on a stationary desk.
And they ran BASIC.
IBM would take a very similar approach.
But there was one more key aspect to these new machines that often goes unappreciated.
None of them used much custom hardware.
Apple didn't design and fabricate a proprietary processor for the Apple II.
They just bought the chips from a supplier.
Commodore didn't use custom-built RAM modules.
Instead, they just had some off-the-shelf chips.
For these home computers, it made a lot of sense.
A company like RadioShack was big, but not big enough to have their own chip fab plant
or integrated circuit design department.
The 5100 was far and away superior to these small-scale machines.
But that didn't really matter.
IBM's marketing said that their personal computer was meant for anyone,
but that just wasn't true.
For most users, the 5100 was way too much computer.
No one really wanted a tiny mainframe in a handy portable case.
A simple microcomputer like the Apple II,
well, that was more than enough power for new users.
And frankly, it was just cheaper.
To compete against these smaller and more agile companies, IBM needed to step out of their comfort
zone. To this end, the Datamaster was a radically new design, at least radical for Big Blue.
It ditched nearly all the hardware of the earlier 5100 series. In fact, it ditched
most IBM hardware altogether. The biggest change was to drop PALM and switch over to an actual
microprocessor, not another funky in-house IBM design. The change makes a whole lot
of sense, especially when you look at surrounding events. Intel released the 4004,
the first computer on a chip, in 1971. It was a massive technical achievement. But all in all,
the 4004 wasn't useful. It wouldn't be until the middle of the 70s that practical microprocessors
actually hit shelves, with the 8080 shipping in 1974. I think if the timing had lined up differently, IBM may have invested in microprocessors a lot sooner, but the 5100 was already designed by the time practical chips actually became available.
By 1978, microprocessors were becoming a very proven technology, and a popular one at that.
So if the Datamaster was going to be a
totally new computer, then a microprocessor was the clear best route. Specifically,
the Datamaster was built around Intel's newest chip, the 8085. It was, for sure, less powerful
than PALM. But it was much more in line with contemporary home computers. It was closer to what people actually wanted.
The rest of the design of the Datamaster was highly influenced by that choice of processor.
So, I guess it's time to meet the 8085.
And to be clear, this is a pretty obscure processor.
I'd understand if you hadn't heard about it before.
But it is an interesting one.
It was binary compatible with Intel's more successful
8080. So from day one, it had access to a large library of code. What makes the 8085 particularly
interesting is that it used a much higher level of integration than Intel's last processor.
There was just more printed on the circuit. There were more support chips built into one package.
It had so many components packed into a single chip that the 8085 has actually been more often used as a microcontroller than a fully-fledged processor. There were also little quality-of-life
improvements, such as a simplified power supply requirement. The level of integration here,
though, that was the key. The 8085 needed a lot fewer chips to run. Features like serial I/O and interrupt control were built right into the processor.
In general, an 8085-based computer could be made simpler than an 8080-based one.
That translates into an easier development cycle and, perhaps the biggest thing here, a cheaper computer.
The other big upside to the 85 was something that I've touched on in my Intel series, but
never in a whole lot of detail. You see, IBM wasn't just buying into a processor. The 8085,
as well as most Intel processors, came with a whole family of ancillary chips.
If IBM wanted some RAM, well, Intel had
some tailor-made and ready. A bus controller? Just tack it onto the order sheet. Of course,
for IBM, that meant giving up a whole lot of control over the data master's hardware.
It was a very un-IBM thing to do, but giving up that control let them be more competitive with other personal computer
manufacturers. The 8085 was the heart of the operation, with a slate of other Intel chips for
RAM, ROM, and I/O. But don't get me wrong here. This wasn't just a generic Intel-based machine.
We're still dealing with IBM after all. Like the 5100 before it, the Datamaster had a complicated scheme for handling
memory. The problem came down to the 8085's memory bus. While the chip was mainly 8-bit,
it actually had a 16-bit wide bus for addressing RAM and ROM. That works out to 64 kilobytes
of addressable memory, which, to IBM's engineers, wasn't going to be enough this time. So like the 5100, the Datamaster aggressively used bank switching,
but with more of a focus on RAM instead of ROM.
A fully loaded Datamaster came with a whopping 256 kilobytes of RAM
and 112 kilobytes of baked-in software in ROM.
For an 8-bit machine, that's a beastly amount of memory. That's approaching a
16-bit computer or even a PC. Bank switching is one example of how ideas from the 5100 were making
their way into IBM's latest machine. And this forms a pretty solid trend. But things aren't
exactly one-to-one. The Datamaster was pushing ideas further than IBM had before.
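To make the bank-switching trick concrete, here's a small illustrative sketch. The window size, bank count, and addresses are all invented for the example, and real bank switching happens in hardware via a latch or register, but the principle is the same: an 8-bit CPU with a 16-bit address bus can only see 64 kilobytes at once, so you map one of several physical banks into a window of that address space:

```python
# Illustrative bank-switching model. Numbers are made up for the example;
# the point is that total physical memory can exceed the CPU's 64 KB
# address space by swapping banks into a fixed window.

WINDOW_BASE = 0x8000      # hypothetical 32 KB window in the upper half
WINDOW_SIZE = 0x8000
NUM_BANKS = 8             # 8 banks x 32 KB = 256 KB of switchable RAM

class BankedMemory:
    def __init__(self):
        self.low = bytearray(WINDOW_BASE)   # fixed lower 32 KB, always visible
        self.banks = [bytearray(WINDOW_SIZE) for _ in range(NUM_BANKS)]
        self.current = 0                    # models a bank-select register

    def select_bank(self, n):
        self.current = n % NUM_BANKS

    def read(self, addr):
        if addr < WINDOW_BASE:
            return self.low[addr]
        return self.banks[self.current][addr - WINDOW_BASE]

    def write(self, addr, value):
        if addr < WINDOW_BASE:
            self.low[addr] = value
        else:
            self.banks[self.current][addr - WINDOW_BASE] = value

mem = BankedMemory()
mem.select_bank(0)
mem.write(0x9000, 0xAA)   # lands in bank 0
mem.select_bank(1)
mem.write(0x9000, 0xBB)   # same CPU address, different physical byte
mem.select_bank(0)
assert mem.read(0x9000) == 0xAA
```

The catch, as always with bank switching, is that software has to keep track of which bank is mapped in at any moment, which is part of why these schemes made programming these machines more complicated.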
One feature in this weird realm of part old, part new is actually one of the most important.
And that's expandability.
The Datamaster was already taking on a very familiar form for PC users.
Physically, it was composed of a motherboard that held the CPU,
I/O, and support chips, and it had plugs for expansion cards and RAM modules.
It's those expansion slots where things start to get a little wild,
at least for PC nerds like us.
On the motherboard were six slots designed to hold the edge connector of expansion cards.
On the back of the case were cutouts to accommodate any plugs on those cards. And for convenience, the whole motherboard assembly
could even slide out like a little drawer. Now, qualitatively, that should sound a lot like the
expansion slots in the IBM PC. But this isn't just a case of similar appearances. These were very early ISA slots.
The bus these cards used was nearly identical to the later 8-bit ISA bus that the PC would
make so common. And really, this just makes a lot of good sense. The ISA bus is an IBM original
design, but a lot of its layout and signaling is dictated by the underlying Intel chipset.
With future IBM personal computers using at least roughly similar Intel chips, you can start to see
why the bus would actually stick around for a long time. The Datamaster also borrowed a case
from the 5100 series. Well, specifically the 5120. That model had ditched the low, deep portable design, replacing it with
something more vertically aligned. The front of the case had all the usual fixings. You have a
good-sized CRT display mounted next to two internal floppy drives. Below that's a keyboard, and around
the back is access to the motherboard and assorted ports and plugs. The floppy drives in this case were 8-inch disks, so pretty big by
really any standard. But they provided faster and more convenient storage than any tape drive ever
could. The entire thing has much less depth to it. So unlike the 5100, it fills a lot more vertical
space. It's closer to the final footprint of something like the IBM PC. Once again,
we're looking at a single-unit kind of construction. In this formation, the Datamaster
looks even more strikingly similar to the Macintosh. Now, I'm not claiming that Jobs got
his idea for his computer from IBM, but there's some similarities in both form and function.
Both systems were built into a sleek, all-encompassing case.
Both were beige.
Well, I guess that's a given, but still.
They're the same color.
But perhaps most importantly, both use this type of design for the same reason.
You can plop the computer on a desk, plug it in, and you're done.
This idea was present even all the way back during SCAMP's development, but the Datamaster shows that they were really sold on the idea. And hey, it's one
of the few places where IBM and Apple actually agree, so I can't help but point this one out.
From everything I've described so far, the Datamaster should sound like a slam dunk,
right? The computer uses third-party hardware, and overall, the
internals are really close to the IBM PC. It's capable, it can support this insane amount of RAM,
and it even has some features and touches that wouldn't be seen in home computers for years to
come. So where did it go wrong, exactly? Why aren't we all using Datamaster-based machines today?
Well, to echo a common excuse, it was a software problem.
Unlike IBM's earliest attempts, the Datamaster was totally devoid of emulation.
You could call it legacy-free, or you could call it a disaster waiting to happen.
This was one of the big risks of using Intel hardware.
IBM didn't have existing software to ship with the Datamaster.
They didn't really write anything for the 8085 platform beforehand, so Big Blue was starting at a deficit.
And while IBM had already taken a big step towards switching to someone else's hardware, well, they weren't willing to outsource software. In other words, the Datamaster team was in for a really bad time. The plan was
to ship with BASIC built into the ROM. This would match all other popular home computers of the era.
Easy, right? Just write a new BASIC interpreter. Well, you see, IBM really wasn't a microprocessor kind of company.
They were still the mainframe guys.
As near as I can tell, this was the company's first time using this sort of hardware.
In this way, the Datamaster fit perfectly into Big Blue's big blue blind spot.
Development of just the BASIC interpreter ended up being a long process.
Now, what made matters worse was that partway through the development, IBM decided that
Datamaster's BASIC should be compatible with the new System 34 BASIC. This was an implementation
that ran on some of the company's smaller mainframes. Now, I can't find a good reason for this. Honestly, this was
a very bad choice. My best guess is that it was an attempt to relive some of the compatibility of
APL on the 5100 series. Whatever the reason, the choice would be a disaster. It delayed development
a lot more. And that's not to mention any other software.
The Datamaster was supposed to be a business machine.
With third-party software out of the question, that meant IBM had to write all their own
business software.
So they get to write a word processor, spreadsheet, accounting software, and, you know, all the
other assorted programs that a user could ever need.
For the development team, this must have been a grueling and very uncomfortable experience.
But for the company, there were bigger ramifications than some late nights at the office.
The delay was bad.
Like I mentioned, development started in 1978.
The Datamaster shipped on July 28th, 1981. So, you know, just three years of development. By that time, the Datamaster was already nearing obsolescence. 8-bit was on the
way out, and while the machine was really capable, it wasn't on the leading edge anymore. That alone
made for a lackluster release date. But there's another issue. An article in Dr.
Dobb's Journal put it this way, quote, And the price? Well, the price is genuine IBM. A complete
system is just under $10,000, we were told. If you can do without a printer, you can get it into
the $7,000 range. That rushing sound you hear is an entire industry sighing
in relief. Now, gripes about the slow development aside, the price was a major problem for the
Datamaster. For some context, the Apple II cost $1,200 in 1977, even less in 1981. If IBM,
a much more experienced company, had been able to put out a computer at
that price point, they would have burst the market wide open. Relative small fries like
the Apple of the 1970s would have been utterly destroyed. But $10,000? That's not even close
to the realm of competition. The Datamaster was starting to get the equation right,
but IBM was just off the mark. Now, I want to close this out by going back to release day.
With all the fun of development, schedules getting twisted, and general bureaucracy of a big company,
the Datamaster came to market really late. But the exact date is a little wild.
You see, just 15 days later, the IBM PC was announced,
starting at just $1,500.
That's right, the release day for the Datamaster
may have been the worst time possible.
And with the release of the PC,
well, we all know what happens next.
With that, our tale today is brought to a close.
There are many approaches to creating a personal computer.
And like a lot of long-sought goals, there were many attempts going on in total isolation.
IBM would eventually crack the code
with the PC, but it wasn't a bolt from the blue development. The company had been working towards
the personal computer for the better part of a decade, and they only struck it big when
they were able to give up all control. The PC was a very un-IBM computer. It used third-party hardware, and it shipped with a whole slate of
third-party software. But that was just the terminus of work that had been brewing since
the early 1970s. If we actually want to look towards IBM's first personal computer, then we
shouldn't be talking about the 5150. We should be looking at SCAMP, the 5100, the Datamaster,
and all the trials and errors that led to the PC we know today.
And while that story makes things more complicated, I think it's a better one.
IBM didn't get their act together all at once.
Instead, they slowly learned, they slowly adapted, and they were slowly able to let go of a little bit of control.
If it wasn't for this long evolutionary period,
we wouldn't have the same computers we have today.
And it was over this long evolution
that Friedl's SCAMP would turn into everyone's PC.
Thanks for listening to Advent of Computing.
I'll be back in two weeks' time
with another piece of the story of the computer.
And hey, if you like the show,
there are now a few ways you can support it. If you know anyone else who likes computer history,
then why not share the show with them? You can also rate and review on Apple Podcasts.
And if you want to be a super fan, you can now support the show through Advent of Computing merch
or signing up as a patron on Patreon. Patrons get access to early episodes,
polls for the direction of the show, and bonus
content. You can find links to everything on my website, adventofcomputing.com. If you have any
comments or suggestions for a future episode, then go ahead and shoot me a tweet. I'm at
adventofcomp on Twitter. And as always, have a great rest of your day.