Advent of Computing - Episode 51 - The IBM PC
Episode Date: March 8, 2021
Released in August 1981, the IBM PC is perhaps one of the most important computers in history. It originated the basic architecture computers still use today, it flung the doors open to a thriving clone market, and created an ad-hoc set of standards. The heart of the operation, Intel's 8088, solidified the x86 architecture as the computing platform of the future. IBM accomplished this runaway success by breaking all their own rules, heavily leveraging 3rd party hardware and software, and by cutting as many corners as possible. The PC was designed in less than a year, so how did it become the most enduring design in the industry?  Some ad clips this episode were from this fabulous PC ad compilation: https://www.youtube.com/watch?v=kQT_YCBb9ao  Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing
Transcript
With this tool for modern times, a person can quickly master such jobs as accounting or word processing.
Even use the IBM personal computer to forecast growth.
All helping the business person at home to wear many hats.
That's an ad for the one, the only, IBM personal computer.
Released in August of 1981, the PC was a watershed moment for computing.
Look no further than your own desk, or even the closest internet connection.
Most computers in 2021 are still based off IBM's original design. That ranges from the
cheapest laptops up to the bulky servers that keep the internet running. The newer M1 devices
that Apple started manufacturing wouldn't be interesting or new if they weren't trying to
break from long-established convention. In that sense, we're very much still seeing ripples of
1981 in the modern day. But was the PC really a tool for modern times? Was it really even modern in the time it was produced?
By today's standards, definitely not.
The heart of the PC, Intel's 8088 microprocessor,
was already starting to show its age when IBM's new computer hit the scene.
The chip came complete with nearly a decade's worth of legacy and baggage.
The PC would be marketed as a 16-bit computer,
but it came loaded out with a smattering of 8-bit parts.
Despite all that, this one system became the blueprint for computing for decades to come.
The PC broke from IBM's tradition.
It opened the doors to outside developers.
And by doing so, it changed computing
forever. But looking at the bill of parts, it's hard to see how that change was even possible.
Somehow, IBM was able to make magic happen, break all their own rules, and build a future
out of the most unexpected components.
Welcome back to Advent of Computing.
I'm your host, Sean Haas, and this is episode 51, the IBM PC.
This is going to be a companion to last episode,
which covered the development and release of the Intel 8086 processor. While episode 50 isn't required to enjoy today's episode,
I'd recommend giving it a listen. The 8086 and the PC are kind of two sides of the same coin.
It's hard to talk about one without addressing the other. This will also be a bit of an epilogue to my series on Intel's
processors, so it wouldn't hurt to be up to date. The series so far has covered the 4004, the 8008,
the 8080, IBM's first brush with Intel, and then finally arrived at the 8086. That's a lot of
eights and zeros and a lot of silicon, but we've made it to the payoff,
to where Intel's processors enter the mainstream in a massive way.
Now, if we were to make a list of the most important computers of all time, the IBM PC
would have to be pretty high up there.
It wasn't just a personal computer.
It was THE personal computer, from which modern systems draw their basic design.
The development, overall architecture, and administrative choices surrounding the computer
led to runaway success. After IBM blew up in the market, third parties started producing
compatible PC clones, and from there, a kind of unintended standard formed.
The PC became the de facto standard for home computing, right up into the current day.
At the core of the IBM PC was Intel's 8088 microprocessor.
That's a modified version of the 8086.
This is important because IBM's decision and the ensuing PC explosion solidified x86 processors as a core part of the PC standard.
That meant that all the baggage around Intel's stopgap chip became part of IBM's new baggage.
But outside of the weird legacy stuff, when we get down to it, the IBM PC was a really weird computer, especially by IBM's standards.
The Intel chip that made the computer tick was just one aspect of that strangeness.
So let's dive into my long overdue coverage of the original PC. How did IBM's earlier love affair
with small computers come to a head? What made the PC so different than other IBM systems that came
before it? How did Intel get wrapped up in all of this? And to round everything out, what would
have happened if IBM and Intel had never met? What were those other options that didn't come to pass?
Honestly, it's not all that easy to find a good starting point for the story of the IBM PC. Sometime in the early 70s is probably the best place, at least in my opinion.
This is where the actual product designation and numbers start to matter.
Officially, the IBM PC was called the IBM 5150, which puts it squarely as part of an earlier product line.
The story of that product line, the 5100 series, is a little convoluted.
I covered it back in the archives in an episode called IBM Gets Personal.
To quickly summarize the key points,
engineers inside IBM started to realize that they could build a small computer to sell to smaller businesses.
This led to a computer known as the 5100,
an all-in-house system developed, designed, and manufactured by IBM. At the time, that was just
how IBM did things. Control was a key part of the company's outlook, so the 5100 had no third-party
parts in it, save for a CRT tube and a tape drive. The 5100 didn't sell well.
Mainly because it was insanely expensive. IBM was still very much in the mindset of selling
multi-million dollar mainframes. Over the next few years, the computer was revised. The 5100
turned into the 5110 and then the 5120. Despite improvements, sales lagged behind.
The key problem was that IBM was chasing a market that wasn't necessarily there yet,
and they weren't chasing it very well.
They had created a solution in need of a problem.
The early 5100 series computers were impressive,
but ultimately it was too much power for a small
scale user. IBM was very much early to the small computer game, but they didn't have the game fully
figured out yet. Then we reach the middle point of the 1970s, and change occurs very rapidly.
In 1974, the Altair 8800 is announced.
The Apple II, TRS-80, and Commodore PET soon follow a few years after, and really this
cadre of home computers starts to home in on what users actually want.
The 5100 was basically a scaled-down mainframe, at least that's how IBM was looking at it.
IBM had a vastly more powerful product, but that wasn't what people wanted out of a personal computer.
Things were brought to a head when IBM technicians started to notice that Apple IIs were showing up in some of their clients' offices.
IBM had to do something radically different to realign to the market.
This led to the Datamaster, a successor to the 5100 series that used some third-party hardware and proprietary IBM software. Importantly for us,
the Datamaster was powered by an Intel 8085 processor. So we have the first IBM computer
with Intel inside. But even before release, the Datamaster was shaping up to
be a disappointment. It was a step towards IBM giving up control and thinking more like their
competition. But that process wasn't over. The software realm still needed to change.
That background out of the way, we're all caught up to 1980. Now, don't get me wrong here.
The background is important, but we don't have the most direct line from the 5100 to
the 5150, aka the PC.
IBM has this knack for compartmentalization, and we have to keep that in mind when we're
looking at their history.
If the path forward gets a little bit twisty and there's some walls,
that's because the history is also kind of walled up and a little twisting.
Bill Lowe is one of these little twists in the path.
Lowe was a career IBMer, but not on the engineering or programming side of things.
Lowe was in management. During the early
70s, he was a supervisor in the lab where the 5100 was designed, and he supported its development.
The sourcing there is a little weird. He's definitely mentioned in the story of the 5100,
but there's not excruciating detail. So he was involved, but we can't be sure precisely what he did in the
development process. In 1978, he made his way up to systems manager at the entry-level systems
division. That's IBM's lab in Boca Raton, Florida. Blue Magic, a fantastic book written by James
Chposky on the history of the PC, describes Lowe as unerringly ambitious, yet very conservative
in his efforts. For the time, he was the vision of an IBM executive, dead set on progress, but
not willing to take many big risks. However, in the coming years, Lowe would see a risk that was
too good to pass up. At the dawn of the 80s, computing's next
frontier was still the home, but it wasn't quite yet conquered. Smaller companies like Apple,
at least the Apple of the time was small, were getting close, but they weren't quite there.
The Apple II is a really good example of this awkward phase of personal computing.
It's a very functional device.
You can program BASIC on it, you can play games, you can run spreadsheets, and you can
run word processors.
You can use it to automate an office and have fun at the same time.
The details of its design, its carefully molded plastic case, even down to Apple's ad campaign,
made it a really attractive computer for consumers.
But all in all, the Apple II wasn't a very capable computer.
Broadly speaking, all home computers of this era suffered from a similar set of pitfalls.
The core issue really came down to technological restrictions.
Home micros of the late 70s were all, with a few notable exceptions, powered by 8-bit
microprocessors.
Zilog Z80s, MOS 6502s, and even a few Intel 8080s made for cheap and robust computers.
By the time we hit 1980, this 8-bit technology was a very tried and true path.
Perhaps most importantly, there was a lot of hardware and software support for these 8-bit
chips. It was easy for a company to build around them, but that came with a few big trade-offs.
8-bit processors have severe caps on how much memory they can access. The Apple II,
with its 6502 processor, could only access 64 kilobytes of memory, and that's at most.
Now, on their own, memory restrictions don't have to be a deal-breaker. With the right hardware and some tinkering, there are ways to work around
low memory limits. But home micros, especially in this era, didn't have the right hardware to pull
these kinds of tricks. One way around a memory cap was to leverage storage. In the realm of mainframes,
this kind of technique was common practice.
Chunks of memory were often swapped onto disk to free up some immediate space in
memory. Then, when that data was needed, it could be loaded back into RAM. This
worked really well for timesharing, but there are more mundane applications.
Let's say you're just trying to write a really long text document.
One that's, you know, more than 64 kilobytes of text. With some kind of relatively fast
random access storage and a little bit of smart programming, you can write a word processor
that only loads up the immediately relevant parts of the file. So the file may be larger than your memory space, but
you may only need to load in a few lines of text at any one time. That kind of flexibility was
possible with fast storage, like a hard drive.
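Just to make that trick concrete, here's a minimal sketch in C of the kind of windowed file access a word processor like that could use. Everything here, the file name, the 4 kilobyte window, is a made-up illustration, not any particular program's code.

```c
#include <stdio.h>

#define WINDOW_SIZE 4096  /* only 4 KB of the document lives in RAM */

/* Load one "window" of a large file into a small fixed buffer.
   The file can be far bigger than the machine's memory space,
   because we only ever read the chunk the user is editing. */
long load_window(FILE *doc, long offset, char *buffer)
{
    if (fseek(doc, offset, SEEK_SET) != 0)
        return -1;                        /* seek failed */
    return (long)fread(buffer, 1, WINDOW_SIZE, doc);
}

int main(void)
{
    char window[WINDOW_SIZE];
    FILE *doc = fopen("manuscript.txt", "rb");
    if (!doc) { perror("manuscript.txt"); return 1; }

    /* Jump straight to byte 100,000 of the document, then pull in
       just the slice around the user's cursor. */
    long n = load_window(doc, 100000L, window);
    if (n > 0)
        fwrite(window, 1, (size_t)n, stdout);

    fclose(doc);
    return 0;
}
```

The whole trick hinges on that seek being cheap, which, as we're about to see, was exactly what most home machines of the day couldn't offer.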
Most computers in the 1980s just flat out didn't have hard drives available. Out of the box, most home computers actually didn't have any kind of long-term storage. That may sound wild to us, but it's just how things were.
Storage options were add-ons, and often a cost barrier for consumers. One popular option was
the venerable cassette tape deck. This ranged in complexity depending on the vendor you went with.
Some, like Commodore, had custom tape decks, while others, like Apple for instance, well,
they just had audio jacks on their computers. No matter how this was implemented, all these
vendors' tape storage solutions suffered from the same flaw. It was very slow, sequential storage.
You can't really seek around a tape
without waiting for it to rewind or fast forward. Moving up the price scale, we get to these shiny
new things called 5.25 inch floppy disk drives. They were a lot more expensive, and they held
more data, but still not quite fast enough to pull really wild, seamless memory swapping tricks.
Just one more piece in this puzzle of inadequacy, and what I find the funniest to think about,
is text display resolution.
That is, how many characters of text can one of these computers fit on a screen?
Answer?
Actually, not that many.
The key here was width.
This early batch of systems all displayed 40 columns of text at a time.
That's not many cells of a spreadsheet.
It's also just not much space to type.
Complicated code is hard to read on one of these displays,
and when you can only fit a few words before it wraps, that can definitely interrupt your flow.
By 1980, it was becoming more and more apparent that consumers were in fact interested in more capable systems. More screen space for spreadsheets could move units. More RAM for better programs could become a selling point very easily.
That is, if someone was able to get to the market quickly enough.
Rumblings were going on inside IBM.
Upper management was asking around the company's many departments for a new home computing
project.
It's not entirely clear if this was spurred on by
any one factor. My guess would be that this was a kind of recurring request ever since the 5100
series began. Just like, oh look, it's January, we better look back at our roadmaps and yeah,
home computing's still on there, so let's go see if we can drum up a project.
Lowe offered to take on this new challenge, and here's where the risky play comes in.
Internally, IBMers assumed that this kind of project would take 3-4 years to complete.
The 5100 computer had taken around 2 years from prototype to market, but that system had been sticking pretty close to IBM's
wheelhouse. This new proposed computer would have to be built from the ground up for a market that
IBM was still unsuccessful at capturing. So doubling the 5100's timeline seemed like a fair
estimation, but in Lowe's approximation, four years was too long to wait. The way he saw it,
there was an opportunity in 1980. Maybe in 1981. But that wouldn't show up again. If IBM could
swoop in with a bigger, better personal computer before Apple or Commodore or any other manufacturer
could upgrade their systems, well, then the market could be won over.
So Lowe said he'd take the project,
and promised to deliver a product in just one year.
And this wasn't just bluster.
Lowe's gamble was based on some shrewd analysis.
In his words,
The only way we can get into the personal computer business
is to go out and buy
part of a computer company, or buy both the CPU and software from people like Apple or Atari,
because we can't do this within the culture of IBM. End quote. This culture of IBM was all about
tight control. This was a cornerstone of the company ever since it
was founded. Ideas weren't shared outside IBM. Everything had to be patented and trademarked
and wrapped up in tamper-proof packaging. Clients very rarely owned IBM hardware outright.
Everything from IBM's earliest tabulation machines all the way up to their new top-of-the-line
mainframes had to be leased from Big Blue. From processors to software, almost everything IBM
sold was made in-house. Even something as mundane as a terminal keyboard was built in a factory
owned and operated by IBM. That let IBM do things that other companies just couldn't.
The 5100 is actually a great example of where vertical integration led to innovation.
IBM put a powerful and portable home computer on the market before anyone else. The 5100 was
released years before the Apple II. It was even released before the Altair 8800,
and it was miles beyond what competitors could do for years to come. But the 5100 was too expensive,
possibly too complicated, and really just not what consumers wanted.
On the flip side, a company like Apple, that is, the 1970s Apple, was just a few scruffy computer nerds operating out of a garage. Eventually, after some smart work and savvy dealings,
they upgraded to a rented office space. They were small but agile. Apple was able to figure out what consumers wanted and then create a new product
relatively quickly. The key, the reason for their fast turnaround, was that Apple didn't have any
fancy factories to make keyboards and tape drives and silicon chips. Everything in an Apple II could
be found in an electronics catalog. Apple just put together readily available parts to create a new computer.
Sure, the Apple II was way less powerful than a top-of-the-line IBM 5100, but that didn't matter
to consumers. An Apple II was cheap, it got the job done, and it was available. You could just
buy one in a store. To compete on the smaller stage, IBM would have to
fully give up control over their new computer. Lowe would have to take a page from the playbook
of these smaller manufacturers. Or, you know, just go out and buy a small manufacturer to use.
Either way, taking a totally different angle of attack would save on development time,
and hopefully lead to a successful new personal computer. IBM was already tinkering with this
line of thinking. The Datamaster, still in development at this point, was using Intel
chips inside it. The software and overall design was still IBM, but third parties were creeping into
the picture. Lowe proposed a total completion of this process. Quote,
The key decisions were to go with an open architecture, non-IBM technology, non-IBM
software, non-IBM sales, and non-IBM service. And we probably spent a full half of
the presentation carrying the corporate management committee into this concept.
Because this was a new concept for IBM at this point. End quote. It took some convincing,
but 1980 saw the birth of a new project that was unlike anything IBM had done before.
Lowe was proposing an IBM computer that wasn't made with IBM technology.
And this brings us to something totally unexpected.
Well, at least something that I didn't expect at all.
In my head, I always just assumed the story of the PC would be
pretty cut and dry. A to B to C with nice sources stacked up in a neat little pile.
But, dear listener, that is not entirely the case. Primary sources on the PC's development
aren't exactly scarce, I've run into much more scantily documented stories,
but there aren't as many as I assumed. Making things all the more frustrating, there are
actually a lot of rumors and very poorly substantiated claims around the PC's early
development history. One of those rumors is that Bill Lowe tried to get Atari to manufacture a
personal computer for IBM. Now, I've also seen this phrased online as IBM tried to outright
buy Atari to form some new home computing division. This is a bit hard for me to address because,
like I mentioned, the story of the PC isn't paved with excruciatingly detailed primary sources.
The Atari connection gets addressed in the book Atari Inc. Business is Fun, but only very briefly.
According to that text, Lowe visited Ray Kassar, then CEO of Atari, to talk home computers.
Steve Mayer, another Atari employee, elaborates slightly on
this in the book. He says that there were meetings but no real headway. Quote,
We discussed our computer systems with them. I think the two problems were we had a closed
proprietary design, and also because our systems had to work on televisions, we only had 40 column displays.
End quote.
The implication here, and how Business is Fun frames this meeting,
is that IBM was looking to work with Atari but ultimately decided not to.
That would also go well with Lowe's mindset that IBM either had to buy part of a competitor
or outsource a large part of
the PC's hardware.
It's a fun story, but I have major reservations about it.
For one, there's a timeline issue.
Business is Fun claims that these IBM-Atari meetings happened sometime in 1979.
IBM sources say that the project that led to the PC started in 1980.
Crucially, Lowe made his one-year proposal to IBM's executives in 1980. That's already a little
bit off to me. The other factor is that Atari entered the home computing market in 1979.
That year they announced the Atari 400, another contender in the already crowded 8-bit microcomputer
space.
So, we have a pretty tight timeline around these supposed meetings.
My guess is the meetings were less to do with the PC itself and more to do with run-of-the-mill
corporate kinda operations.
IBM, and Lowe specifically, had been looking at home computers for a while.
Atari just launched a new home computer range, so it would be prudent to send out some folk
in blue suits to take a look.
Sans more sourcing, I think events here may be overblown.
If anything, a trip to Atari may
have helped Lowe focus in on his plans for the eventual PC. The ultimate plan that Lowe pitched
was to build a new computer inside IBM, but to use all third-party parts and software.
IBM would design the case, circuit boards, and overall computer using chips from other manufacturers.
Software would be sourced from third parties, and, crucially, the system would have an open architecture.
In other words, its design would be made totally public, nothing proprietary.
The execs bought the plan, and an official project formed. This new project was
called Project Chess, and would end in the creation of the IBM PC, and really making the computing
platform that we still use today. So how did those ideas go from plans to steel-clad reality?
The best way to approach Project Chess is by discussing
its design goals and how they were accomplished. Most importantly, at least most importantly in my
view, is why certain choices were made. A natural starting point is then the team behind Project
Chess. Initially, this was a task force of 12 IBM engineers, sometimes called
the Dirty Dozen. So, once again, we see how myth-making starts to seep into what I think
is a kind of unexpected place. This gets into how IBM worked back in the 70s and 80s. Project Chess functioned as a quote-unquote independent business unit.
Lowe was given broad discretion on planning and spending. Basically, he was building his own
little Apple in a corner of Boca Raton. This separation also meant Project Chess could be
carried out in absolute secrecy, even from other teams inside
IBM. Once paperwork was signed off, Lowe started forming the Chess Lab, pulling in a veritable
dream team of IBMers to crack the case. To those outside Chess, it must have looked like their
co-workers were just disappearing into some secretive rumored lab for a few months.
The team was composed of programmers, manufacturing engineers, and hardware designers, headed up by Don Estridge. One of these
early team members, David Bradley, is a really good example of the kind of talent that Lowe
pulled together. And specifically I want to focus on Bradley because he wrote a wonderful article in Byte
about the development process of the PC. So there's one of those primary sources that we
really need to cling on to. Bradley had previously been involved with the Datamaster project. In fact,
he had been pulled off that project to join the Chess team. The Datamaster was really close to what Lowe wanted
to create, so Bradley had the specific skill set that Lowe needed. Just as important, Bradley and
other developers pulled into chess had a similar outlook on home computing. And that outlook was
very compatible with Lowe's grand ambitions. When Bradley left the Datamaster team, he wasn't leaving just
any project. He was actually being saved from what we call in the biz, development hell. And,
yes, that is a very often used and very technical term, believe it or not.
So, the Datamaster did use third-party hardware, but all the software was still developed by
IBM itself.
That meant that any office software, spreadsheets, and even basic implementations had to be written
by Bradley and his co-workers.
To complicate matters, the Datamaster was supposed to be part of IBM's larger small office lineup, so management wanted some level of compatibility between this small computer and other mid-sized machines that they were
planning to sell. This had to be done in-house. Outside help was out of the question. Bradley
described this experience thusly, quote, During the Datamaster development, we quote-unquote converged our BASIC with the BASIC used on the IBM System/34. That change delayed the Datamaster by nearly a year. That experience taught us two things about getting a product to market rapidly. We need to use an... end quote.
The Datamaster is a good example of how IBM's mentality just wasn't applicable to the home computer market.
IBM couldn't function as an island unto itself.
It had to be part of a larger computer ecosystem. Bradley and
his colleagues experienced that firsthand, so they were easy converts to Project Chess.
The growing PC design actually took a lot of cues from the Datamaster. As I've been hammering,
that shouldn't be surprising. There was a lot of talent from that project being pulled onto the PC
team. Some of these choices were reactionary, like the shift in ideology, while others were more
direct and technical. So I guess this is as good a time as any to address the elephant in the room.
That's the Intel 8088 processor. Now, there are a few common answers as to why the PC was built using this specific chip.
Usually it's just pointed out that IBM engineers on the project were already familiar with
Intel processors, so they went with what they knew.
While that's partly the case (the Datamaster did use an Intel 8085), I think there's a lot more going on
here. And this is one of those twists that I mentioned that I want to spend a little time
inside. The core design strategy inside Project Chess was to make a cheap computer quickly.
To accomplish that, every part in the machine, from RAM to storage to the processor, had to be readily available from third-party providers. IBM wasn't buying into just a single chip. They needed all
the support that went with that. That meant things like interrupt and hardware controllers, but also
software. The availability factor here was also huge. Every part had to be easily sourced in large quantities.
Supply chain is a very boring topic, but it can make or break a release. The PC also had to be
technically superior to the competition. There are a lot of touches that we'll probably get into,
but the easiest one here is that the PC had to be 16-bit.
Adopting a 16-bit processor would give the PC access to more memory, and generally just make
for a more flexible and powerful machine. In 1980, there were quite a few processors that
would fit this overall bill. One early lead was the Motorola 68000. Completed in 1979,
it was by far the most modern and technically superior option. The chip was a mix between 16-bit
and 32-bit. The details get kind of complicated, I'm sure we'll talk about the Motorola 68000 in
detail one day, but the bottom line is it would be more powerful than any other choice
on the market at that time. But being such a new chip made for a problem. At the time,
the 68000 didn't have a second source. That's to say that only Motorola factories were producing
the chip. There wasn't some outside factory turning out 68000 chips. In a crunch, that could
limit IBM's production run sizes. Either that, or inflate the price of a personal computer.
Another big contender was the TMS9900 from Texas Instruments. Now, this is actually a really
fascinating processor that I didn't know about until recently,
and I need to learn a lot more about.
I don't doubt it'll have its own episode eventually.
In short, the TMS9900 was based off an earlier minicomputer manufactured by Texas Instruments.
Ideologically, this was pretty in line with where IBM was before Project Chess.
TI had taken some of their larger hardware and, using newer technology, condensed it
down to a single chip.
The TMS9900 was fully 16-bit, but had some strange quirks.
For the era, it may have been the chip most well-suited to multitasking.
It handles registers in this really different way that makes storing data fast and really
easy.
You can switch context with a single command.
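To give a rough flavor of that design: the TMS9900 kept its sixteen general registers in main RAM, at an address held in a workspace pointer, so a context switch (via its BLWP instruction) boiled down to loading a new pointer. Here's a loose C sketch of the idea; the names and layout are mine, and it glosses over the real chip's details.

```c
#include <stdint.h>
#include <stdio.h>

/* The TMS9900 had no on-chip register file. Its 16 "registers"
   were a block of RAM located by the workspace pointer (WP).
   A context switch is therefore one pointer change, not a
   16-register save/restore loop. */
static uint16_t memory[32768];      /* system RAM, in 16-bit words */
static uint16_t *wp;                /* the workspace pointer */

#define REG(n) (wp[(n)])            /* register n of the current task */

int main(void)
{
    uint16_t *task_a = &memory[0x100];  /* task A's workspace */
    uint16_t *task_b = &memory[0x200];  /* task B's workspace */

    wp = task_a;
    REG(0) = 42;                    /* write task A's R0 */

    wp = task_b;                    /* "context switch": one assignment */
    REG(0) = 7;                     /* task B's R0 is a different word of RAM */

    wp = task_a;                    /* switch back: task A's state is intact */
    printf("task A R0 = %u\n", REG(0));   /* prints 42 */
    return 0;
}
```

The trade-off, of course, is that every register access is really a memory access, which is part of why this approach faded as processors started to outrun RAM.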
But there are two big downsides to this chip.
Like the Motorola, TI had yet to find a second source for their shiny new processor.
Worse still, the TMS9900 had a 16-bit address bus.
It could only access 64 kilobytes of memory, and that wasn't nearly enough for IBM's plans.
The 8086, and by extension the 8088, met a lot of the requirements that these other chips failed to.
The biggest technological feature was the memory bus.
At 20 bits wide, the 8086 was able to access more than enough RAM.
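The arithmetic behind that is simple enough to spell out. An address bus n bits wide can name 2^n distinct bytes: 2^16 is 65,536 bytes, the 64 kilobyte ceiling that boxed in the 6502, the Z80, and the TMS9900, while 2^20 is 1,048,576 bytes, a full megabyte. Those extra four address lines bought sixteen times the memory space.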
It may not have been as powerful as some options, but it did have one thing going for it.
There were people within IBM who had
experience with Intel chips. Bradley was just one example. Project Chess ended up pulling multiple
engineers from the Datamaster team for this exact reason. The PC had to use third-party chips. It
would save a lot of time and money if IBM's team didn't have to learn a totally new skillset.
And thanks to backwards compatibility, the 8086 and the 8088 used basically the same
assembly language as Intel's earlier processors.
Another huge factor came down to software availability.
The PC was slated to use third-party software.
So you know, it would be pretty good if third-party software was already available for whatever chip they picked.
Specifically, IBM wanted to ship with BASIC, some type of disk operating system, and a few third-party software applications.
By 1980, there wasn't really a thriving x86 software scene.
But there was some software out there.
This is where we reach another kind of weird twist.
Seattle Computer Products, aka SCP, launched their very own 8086 processor board in 1979.
In the lead-up to this launch, SCP contracted with Microsoft to develop a version of BASIC
for x86 processors.
Soon after, an employee at SCP, Tim Patterson, wrote an in-house operating system called
86-DOS.
That was only one of the software offerings for the 8086 at the time.
Intel also offered their own software libraries for the chip, so first-party support was there. And, in another fun callback to an
earlier episode, the Mark Williams company was even in play. They already announced their intention
to port Coherent to x86-based systems, meaning a roughly Unix-compatible
operating system was just over the horizon. Other 16-bit processors did have software support,
but the 8086 already had options really close to what IBM was looking for.
The final key factor, and the reason that IBM chose the 8088 instead of the 8086,
has to do with support chips.
And this is where the difference between the 88 and the 86 actually matters a lot.
The 8086 has a 16-bit wide data bus.
That means that it can read and write 16 bits of data at a time.
The 8088 has an 8-bit data bus, hence the extra 8,
so it can only deal with external data in 8-bit wide chunks.
And that's it. That's the only difference.
The two chips execute the same code in the same ways except for that tweak to the external data bus.
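A quick worked example shows what that tweak means in practice. Say a program fetches one 16-bit number from memory. The 8086, with its 16-bit data bus, moves it in a single bus transfer. The 8088 needs two transfers, the low byte and then the high byte. Same instruction, same result, but at the same clock speed the 8088's peak memory bandwidth works out to roughly half of its sibling's.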
Last episode, I made a point of how that difference only matters on the hardware design level.
Well, surprise surprise, we've now reached the hardware design level, so it starts to matter.
The 8086 can read and write data much faster simply because it can blit numbers out in larger chunks. That means you can design faster and more flexible hardware. But that's only if you
are using support chips that work with 16-bit wide data. You need to pair the 8086 with RAM and ROM chips that have 16-bit data buses.
The same goes for interrupt and I/O controllers. In 1980, those chips were all new, which meant
that they were expensive and manufactured in smaller quantities than their older counterparts.
The 8088, on the other hand, prefers to speak with older 8-bit parts. That means you get cheaper and more plentiful chips. So IBM could get away with
using a newer processor, but still stick to older parts. Another upshot was that engineers who had
worked on the Datamaster already knew their way around Intel's 8-bit offerings.
Some parts from the Datamaster could be adapted for use with the PC, so the team could cut down even more on their learning curve.
That's the long answer as to why IBM went with Intel's 8088.
Most other design choices followed a similar logic and were heavily influenced by
this choice of processor. Another place that we can see this thinking in action is the PC's
expansion bus. One of the key features that any new computer had to have was expandability. That
was something that a lot of earlier home computers either lacked or weren't very good at.
If an end user wanted to upgrade their RAM, there should be a way to do that relatively cleanly.
Or if they wanted to get color graphics out of the computer, then there should be options to accommodate for that.
The PC provided this via its internal expansion bus, a series of slots on the motherboard designed to hold expansion cards.
Pop off the case, slot in the card, put the case back on for safety, and you're good to go.
The expansion bus used by the PC would eventually become a standard because it was simple, it worked well, it had a hassle-free design, and the PC just ends up being everywhere.
So where did this great idea come from?
Well, it's another one of the many corners that IBM cut to save development time and costs.
Quoting from Bradley again,
The IBM PC's bus architecture came from two sources, the Datamaster definition
and the new requirements of the 8088. We wanted to keep the bus very similar to the Datamaster's since we had developed several adapter cards for that bus. Keeping the bus similar would make the
adaptation very simple. Just add a new layout for the cards. End quote. To be clear, this isn't legacy compatibility that Bradley's talking about, per se.
The Datamaster would hit markets about one month prior to the PC,
so we're still dealing with contemporary computers.
Adapting the Datamaster's expansion bus saved time. It gave Project Chess
access to existing hardware that IBM had already developed internally. This all worked thanks to
the 8088's all-important 8-bit data bus. The 8-bit Datamaster's hardware was really easy to rig up to the new computer being built
inside Project Chess.
This expanded beyond, well, the expansion bus.
Notably, the keyboard that shipped with the IBM PC, the Model F, and might I say one of the best keyboards ever manufactured, was taken from a Datamaster.
They're the same keyboard with just a different case.
But don't get it twisted. The PC wasn't just a rehash of the Datamaster with a slightly better
processor. There were general hardware changes on the motherboard, upgraded components, and the like.
Instead of getting bogged down in a chip-for-chip comparison, I want to look at the bigger picture.
Project Chess redefined the Datamaster's design, mixed in some newer thinking, and created a very different platform.
Central to this new computer was its open architecture.
And we aren't talking open as in open source here exactly. That would come later.
This is a very specific type of openness that was backed up by the PC's hardware design.
One of Project Chess's huge goals was, as I keep stating and restating,
to avoid writing in-house software.
Programmers like to program, sure, but if you can get away with it, it's better just not to.
There would be some IBM code tucked away deep inside the PC, but that only accounted for a few dozen kilobytes of ROM. This methodology served a dual purpose. Initially,
it saved product development time. BASIC, a disk operating system, and some initial office productivity software were all contracted
out to third parties, some of those by the name of Microsoft.
That kept the PC on a very aggressive timeline.
But that was only the start.
The second goal, and by far the bigger and more important challenge, was to court additional third parties to develop for the PC.
One of Lowe's observations during the early phases of Project Chess was that IBM's smaller competition didn't write most of their own software.
The really big, killer applications like VisiCalc weren't written by Apple themselves,
but were developed by independent third parties for use on the Apple II.
This helped to extend the software library of the Apple II,
making it a more useful computer, and therefore more attractive to consumers.
Eventually, this forms a sort of adoption feedback loop.
The computer becomes more popular because there's more software on it.
More developers hop onto the platform, and more software helps to market the computer even better.
So the trick was to get those juicy, independent third-party developers interested in the IBM PC.
That's where an open architecture comes into play in a really big way.
In the ramp-up to release, the chess team worked up a beautifully poetic series of volumes called the IBM 5150 Personal Computer Hardware Reference Library. Really, this type of work is a bit of a
love letter to external developers, believe it or not.
And while technical references weren't new to IBM, the PC would break from convention even here.
The reference library lays out every last detail about the PC's design.
From circuit schematics to the expansion bus interface, from CPU timing to memory maps,
there weren't any secrets or gotchas about the PC's design. Pick up the reference manual and
you know every single gory detail. Even to this day, they're a great case study in good
documentation. This went deeper than just hardware, though, which brings us to the
BIOS, IBM's basic input-output system. Now, don't get confused by the name. BIOS has nothing to do
with the language BASIC. In short, the PC BIOS is a chunk of code that's resident in the computer's
memory when it boots up. It handles
bringing the system to life, and it provides an interface for programmers to control the PC's
hardware. Bradley described it as a, quote, buffer between the hardware and the programmer, which,
in my personal experience, is pretty spot on. It gives you a way to, say, access the floppy drive without knowing the
correct arcane bits of commands to make a motor spin.
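As a concrete taste of what that buffer looked like from the programmer's side, here's a small C sketch of a BIOS call. It leans on the int86 helper that DOS-era compilers like Turbo C provided (a later convenience; early on you'd load the registers in assembly yourself), and it calls the BIOS video service rather than the floppy, but the pattern is the same: fill in some registers, fire an interrupt, and let IBM's code worry about the hardware.

```c
#include <dos.h>    /* int86() and union REGS, from DOS-era compilers */

/* Print one character through the BIOS "teletype output" service:
   interrupt 10h, function 0Eh. The BIOS handles the video hardware;
   the program just sets registers and triggers the interrupt.
   Floppy disk access worked the same way through interrupt 13h. */
void bios_putchar(char c)
{
    union REGS regs;
    regs.h.ah = 0x0E;      /* function: teletype output */
    regs.h.al = c;         /* the character to print */
    regs.h.bh = 0;         /* display page 0 */
    int86(0x10, &regs, &regs);
}

int main(void)
{
    bios_putchar('H');
    bios_putchar('i');
    return 0;
}
```

Because every one of those BIOS entry points was spelled out in IBM's documentation, a third-party programmer could write this kind of code from day one.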
BIOS wasn't exactly a new idea on the PC. CP/M systems implemented a similar feature by actually the exact same name, and ROM-resident code was already common on other home computers.
However, for IBM, the PC BIOS was a really big deal. It was the only chunk of code that IBM
released with the PC, and every line of its code was documented in the PC's reference manual.
If you leaf over to Appendix A of the IBM 5150 Technical Reference Manual,
then you'll find a full listing of the source code for the PC BIOS. Every single line is commented.
There are blocks of comments explaining each function. There's even a table of contents
for the code listing. So not only do programmers have access to documentation
on how to use the BIOS, they also have access to how the BIOS works, line by line. For IBM,
this was an unprecedented level of disclosure. The benefits from this break from tradition were
huge. The PC's immaculate documentation made it very attractive
as a platform for programmers. As an occasional third-party programmer myself, I can tell you
just how important good documentation is. A lot of issues can be solved by just finding a full
explanation of what you're dealing with. Well, for the PC, there were hundreds upon
hundreds of pages explaining exactly what you were dealing with as a programmer. The barrier
to programming on the PC was just the cost of a computer plus a little more cash for the set of
manuals. The other huge benefit came on the hardware side. Since the PC didn't have a
proprietary bone in its body, it became an attractive platform for third-party hardware
developers. IBM would manufacture a lot of the expansion cards used by the machine. But it didn't
take very long for other manufacturers to throw their hats into the arena. IBM was almost making a trap for
developers of all stripes. All this work happened at a breakneck pace. The initial motherboard took
a little over a month to complete. That's just one month to go from idea to a working circuit.
By April of 1981, the computer was out of prototype and entering production.
Software, contracted out to third parties like Microsoft, would soon reach completion. Long hours, careful design work,
and some cut corners made the process fly by. Lowe met his year deadline with room to spare.
But the final trick was to see if the IBM 5150, the personal computer,
would be as promising in the market as it was on paper. With all the behind-the-scenes stuff
out of the way, I think we're in a good place to depart IBM, at least for the time being.
While Lowe and Project Chess burned the silicon candle at both ends, the computing world continued
on. The upcoming PC was a secret until its announcement in August of 1981. But rumor was
already in the air. The fact was that Lowe had identified the right time to pounce on the home
computing market. Microcomputers, home computers, personal computers, whatever you want to call
them, small systems were on everyone's minds. IBM had already shown that they were very interested
in this emerging market. That much was public knowledge. Everyone was waiting to see what IBM
would do next. Their move could dictate the outcome of the microcomputing revolution.
But Project Chess remained a closely guarded secret.
So in the absence of any information, speculation ran pretty wild.
And from this speculation, I think we can start to read the feel of the room.
The January 1981 edition of Byte had one such wild rumor tucked away in it.
A small note in that magazine started with this juicy tidbit. Quote, with many industry analysts predicting advances in semiconductor technology
that will allow the instruction set of the IBM 370 computer to be executed by a single chip of
silicon, some pioneering enthusiasts are anticipating the announcement of the IBM 380, possibly a personal computer with the full capabilities of, perhaps, the System 370-135. End quote.
This was one major school of thought, that IBM would, very soon, release a scaled-down mainframe on a chip
as a personal computer. In fact, this may have been the most reasonable guess to make.
The IBM 5100 was exactly that, a personal computer powered by a scaled-down mainframe.
It was even compatible with some mainframe software. So if IBM was going to
give home computing another go, it stood to reason that they would stick to their old tricks.
When this article was written in early 1981, the Datamaster still hadn't hit the market. IBM wasn't attacking the home market with third-party hardware, at least, not quite yet. The other really interesting aspect here comes
when we look at the broader context. That is, when we look at what Intel had brewing in the
background. This part isn't strictly connected, but I think it's too interesting to pass up.
Back in episode 50, I covered the Intel 8800, aka the iAPX 432, an ambitious chip that Intel had been developing since around 1976.
A chip that they called the quote-unquote micro mainframe. It really makes me wonder if the dream
of a desktop-sized mainframe was being talked about behind closed doors, or if the concept
was just in the air at the time.
Anyway, that's something for me to look into another time.
The other major school of thought that kept showing up in my research was that IBM wasn't
making a home computer at all. Instead, they were contracting out. Here's another passage
from the June 81 edition of Byte.
Quote, Matsushita, the giant Japanese electronics conglomerate that markets Panasonic and Quasar products in the U.S., recently admitted that it had been approached by IBM in regards to
manufacturing a personal computer for the U.S. market.
End quote.
The rest of that clipping is scant on details.
These are rumors we're talking about.
But this thinking makes up the second camp I mentioned.
IBM was too big to compete on the small scale,
so they were going to hire someone to do it for them.
I found nothing backing up this rumor, so let's just take it as a rumor for now.
No matter which side of the fence you sat on, one thing was clear.
IBM had to be up to something.
And once that something was made public, something big would happen.
Big Blue, after all, didn't do things small.
Everything fell into place on August 12th, 1981.
We start with a simple press release.
Quote,
IBM Corporation today announced its smallest, lowest-priced computer system, the IBM Personal
Computer. Designed for businesses, school, and home, the easy-to-use system sells for as little as $1,565. End quote.
And the public had a name for this mysterious something. The IBM PC was now public knowledge. A home and small business system for just $300 more than an Apple II. Overall, the announcement subverted expectations. The PC wasn't some scaled
down mainframe, and it wasn't entirely outsourced hardware. It was an IBM original, but rendered in borrowed silicon chips.
The base configuration of the PC, that low $1500 model, was pretty bare bones.
You got the IBM PC itself with only 16 kilobytes of RAM.
No disk drives, no printer, no screen.
That sounds terrible, but in context was a somewhat reasonable offering.
A bare-bones PC could use a TV as a screen, just like its 8-bit competitors.
It could use a cassette deck for storage and ran Microsoft BASIC in ROM.
That's essentially the same base-level offering as Apple or Commodore,
just a faster 16-bit processor was tucked inside.
Very few people actually ran such a pared-down PC. This is where the computer's expandability
comes into play. On release day, IBM already offered a set of expansion cards and peripherals,
no waiting around for later support. Just upgrade to a monochrome display adapter card,
grab a new IBM CRT display, and you had access to crisp 80-column text. That meant more space
for spreadsheets, code, or even just a wider text editor. Bump that up again and you could
get a color graphics adapter card, giving you up to 16 color graphics
in addition to 80 column text. Another upgrade could move you from 16 kilobytes all the way
up to a total of 640 kilobytes of RAM. Even the PC's chassis was designed with expansion
in mind. The case itself is a large rectangular slab made from folded steel. Around back are cutouts
for the back brackets of expansion cards, the idea being that most cards would add some sort
of ports that you'd want to access. Up front were two drive bays, space to install optional floppy
disk drives. IBM went to all this trouble so that they could offer an easy and seamless upgrade path.
You could start off with a cheap unit and buy your way into a more powerful PC.
But importantly, at each step, maintain compatibility with your existing software.
It's a really smart tactic.
The other important touch is that the PC launched with third-party software lined up.
IBM makes this clear even in the initial press release.
The PC, in all its configurations, shipped with Microsoft BASIC.
But that was just the start.
From the press release again, quote,
Program packages available for the IBM personal computer cover popular business and home applications.
For example, EasyWriter will store letters, manuscripts, and other text for editing or
rapid reproduction on the printer.
Businesses can use General Ledger, Accounts Payable, and Accounts Receivable by Peachtree
Software Inc. to generate balance sheets, track accounts, and automatically print checks.
VisiCalc is available for applications ranging from financial analysis to budget planning.
Microsoft Adventure brings players into a fantasy world of caves and treasure.
So yeah, you could do your taxes on a PC, or you could play a Microsoft-licensed version of Colossal Cave Adventure.
IBM wasn't taking half-steps when it came to third-party software.
This was a huge part of Lowe's plan from the beginning.
On the day the PC hit market, it was a viable product.
An office could go out, buy up a new computer, get all the software they needed to run, and get to work.
No waiting around for third-party support.
IBM had that lined up for you.
The net result of all of this, the third-party support and easy upgrade paths, was that the PC had a very low barrier to entry.
If you had the cash, you could digitize your home or office right away.
This brings us to the final twist in our wild ride through the world of Big Blue's new computer.
And that's how IBM marketed and sold the PC. How the PC was sold was another key departure
from normal IBM business. Usually to get some new hardware, you got in
touch with IBM directly. You'd go to a dealer or your designated IBM representative to sign
some contracts and get yourself a lease on a new IBM system. The PC was still sold through these
old channels, but you could also get a PC from one of these new things called a computer store.
Initially, IBM started selling PCs through a series of shops called Computerland.
This was a consumer-facing brick-and-mortar store that already existed. In theory,
any normal person could walk off the street, buy a big box, and head home a computer user.
In the coming months and years, IBM would expand out to
more shops in an organized drive to make owning a PC as easy as possible. And, as with everything
else, marketing for the PC took a distinctively new flavor for IBM. I've actually addressed this
a little bit in the past when I did my episode on the 5100 series.
I devoted some time to looking at how IBM advertised their supposed home computer.
Those ads were a bit, well, strange, and they offered some bizarre mixed messaging.
The pitch was that the 5100 and its successors were the perfect home system,
so easy anyone could use them.
But the guts of the ad focused on businesses,
showing people using these small computers in a series of increasingly bizarre business roles.
Just as a prep for your palate, here's a clip from a 1977 commercial for the IBM 5100.
The 5100 is easy to learn and simple to use.
There are countless combinations of feed we can mix. What is the most economical for any
particular herd? That's what I'm figuring out now. The cost of the 5100 is reasonable.
It's a reasonably priced computer. You can use it for accounting. You can use it for managing feed-mix ratios or something.
The Datamaster took a very similar tack,
showing the computer in use for accounting and explaining how much it streamlines business.
IBM was leaning really heavily on the B in its name.
The PC broke from that formula in a pretty big way.
If anything, IBM had to shift harder towards their M.
The first commercial for the IBM PC featured an actor playing Charlie Chaplin's Little Tramp character,
using the new computer, with a more gentle voiceover track.
IBM put a lot of what it knows about computers into the new IBM Personal Computer.
Not to make it complicated, but to make it simple,
so it's easy to understand and easy to use.
Frankly, this is a really weird ad campaign.
But just under the surface, there's something interesting going on.
The focus is that the PC is a personal computer.
It's designed for ease of use, to be personable, and as a general tool to aid in daily life.
The ad doesn't mention accounting, calculating feed mix ratios, or anything specific.
Just that the PC is a new computer, and it's designed for people like you.
As for the Chaplin connection, that came from outside Big Blue. In keeping with the outsourcing
mantra of Project Chess, IBM hired an outside ad agency called Lord, Geller, Federico, and Einstein
to handle marketing the PC. The initial commercial with the faux Charlie
Chaplin sitting at a table with the PC was followed up by a series of similar ads. Some
saw the silent actor tottering through traffic to purchase a new PC. Others depicted the PC
helping Chaplin on a slapstick assembly line. Still more, like the clip I started with,
showed the actor running a hat-of-the-month club
and the PC helping with shipping and finance logistics. The central goal here was to make
the PC look approachable. It's not just a business machine, it's a home computer. It's a tool to make
your life better. And it's friendly. The contrast to earlier IBM commercials and really earlier home computing commercials
couldn't be more clear.
The PC ads are sparse on details, not very technical, and they're funny enough to be
memorable.
I know it sounds weird to call an ad funny, but seriously, if you haven't seen one of
these, then check them out.
I'll link to some in this episode's description.
As to why Charlie Chaplin and specifically why his Tramp character, that gets a little more nebulous.
In a 1983 article in Advertising Age, Tom Mabley, one of the people on the original team behind the ads, wrote,
quote,
We knew we wanted a single friendly person who would
represent every man, but we didn't really see a need for on-camera dialogue. That pointed to mime,
end quote. Chaplin's Tramp character fit that role really nicely. In Chaplin's films, the character
is portrayed as a likable, bumbling, and approachable person,
a friendly face that IBM could put in front of their new, supposedly friendly, computer.
The final connection is one that I haven't seen explicitly stated in the sourcing,
but is definitely winked at in the ad campaign itself.
One of these ads, the one I started with, calls the PC the quote-unquote tool for modern
times. The reference here, or at least the apparent reference, is to the Chaplin film Modern Times.
The movie is about the silent tramp working in a factory, eventually being consumed and
irrevocably altered by the very machines he works with.
IBM's ad campaign is recontextualizing that vision of the near future.
There aren't any oversized gears, no dangerous conveyor belts, and no cold walls made out of metal.
They offer a more personalized tool for modern times, something that can fit into your life
without changing you.
It's not a tiny mainframe, and it's not a rebranded system. And really, it's not even revolutionarily new technology.
The IBM PC was a computer that used older parts and some smart planning to revolutionize the
home and the office. They built a tool that could help in modern times.
Alright, that brings us to the end of today's episode. And, I think at least for the time
being, this closes out my discussion of Intel's rise to dominance. Starting with the 4004,
leading up to more capable processors and then entering
ubiquity with the 8086 and its direct family. IBM would prove integral to the success of the
x86 architecture in general. As we wrap up, it's important to remember why IBM went with x86 processors and the 8088
in particular. The 8088 wasn't the most powerful chip on the market. It wasn't the fastest,
and it wasn't the newest technology. IBM chose the 8088 because it made the PC faster and easier to develop.
It was an important puzzle piece in a larger corporate plan.
The PC wasn't groundbreaking technology.
It outpaced contemporary competition, sure, but didn't offer any resoundingly new tech.
IBM's smash hit was well-planned, carefully built, and cunningly marketed to be the best contender possible in 1981. It was designed to be a success,
but not necessarily built to be the best computer in history. In the coming years,
the PC would gain dominance and near-ubiquity in home computing.
Then, once legally clean BIOSes were built, 100% PC-compatible clones hit the market.
The PC's open architecture made a perfect storm for third parties.
It was easy and probably pretty fun to start developing software for the platform.
It was simple to get
in on hardware development too. And eventually, it was just as easy for third parties to design
their own PCs from scratch. Today, we still benefit massively from this market proliferation.
The PC's basic design has become the foundation for modern computers. The 8086's architecture has become the default heart for these systems.
That combo of IBM and Lowe's savvy corner cutting and Intel's stopgap processor has defined computing ever since 1981.
For better or for worse, they formed an unexpected and, frankly, an unplanned future.
Thanks so much for listening to Advent of Computing.
I'll be back in two weeks' time with another piece of the story of the computer.
I'm thinking about going for a little bit of a shorter one since these last few episodes
have run a little bit long, and I'm thinking about breaking back into some networking. So stay tuned. And if you like the show, there are now quite a few ways
you can help support it. If you know someone else who'd be interested in the story of computing,
then why not take a minute to share the show with them? You can rate and review on Apple Podcasts.
And if you want to be a super fan, then you can support the show directly
through Advent of Computing merch or signing up as a patron on Patreon. Patrons get early access
to episodes, polls for the direction of the show, and bonus content. You can find links to everything
on my website, adventofcomputing.com. If you have any comments or suggestions for a future episode,
then go ahead and shoot me a tweet.
I'm at AdventOfComp on Twitter.
And as always, have a great rest of your day.