Advent of Computing - Episode 21 - 8008: Intel's Second Shot
Episode Date: January 13, 2020

It's time to continue our deep dive into the legacy of Intel's processors. This episode we will be looking at the 8008, the second microprocessor produced by Intel and the progenitor of the x86 family.... Along the way we will see how an innovative terminal from 1969 inspired the chip, how Intel lost a contract, and discuss some of the first personal computers. Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important dates in this episode:
1969: CTC Develops First 'Glass-Teletype' Terminal
1972: 8008 CPU Released by Intel
Transcript
When was the last time you booted up a PC? Maybe it was a desktop, or a laptop, or even a Macintosh or a server would fit the bill.
Something remarkable about the modern personal computer is the fact that cross-compatibility has become so darn good.
In 2020, most software will run the same, or at least close enough to the same, on most any computer.
Now, there are a lot of factors that make this possible, but one of the big ones is the fact that by and large, all PCs use the same
chips under the hood. That may be a little simplistic. It's more accurate to say that
all modern PCs have processors in the same family of chips. Each year, flashy new CPUs come out,
each one faster and with more features than the last.
But each new model retains compatibility with older chips.
The family that all these processors belong to is called x86,
so named after the Intel 8086.
It's the chip that powered the original IBM PC,
and all later models were designed to emulate that processor.
That means that even the fastest, fanciest, and newest computer
can still run the same software that the IBM machine came with way back in 1981.
That fact on its own should be pretty shocking.
But as it turns out, this family of processors actually has earlier roots than the 80s.
In fact, the design that would eventually morph into powering modern PCs dates all the
way back to 1969.
That's a full two years before the first microprocessor was ever released.
In fact, that's back when Intel itself was barely a year old.
On its face, this may sound paradoxical, but as it turns out, there's a really interesting
history buried here. So let's take a look at the story of the Intel 8008 microprocessor.
Welcome back to Advent of Computing.
I'm your host, Sean Haas, and this is episode 21, 8008, Intel's Second Shot.
This is going to be the second in my still-ongoing series on the history of Intel's microprocessors.
Now, when you think about computer companies, Intel might not be the first that comes to your mind. But they've been a key player
in making the modern PC a real possibility. There's definitely an argument to be made
that without Intel, the personal computing revolution that we enjoy the spoils from today
may not have happened. Or at least, it may have been greatly delayed. The first episode in this series covered the 4004, Intel's first microprocessor and really
THE first microprocessor ever produced. If you haven't listened to that episode already,
then I'd suggest checking it out, but it's not strictly necessary material for today.
To quickly simplify things, the 4004 was first released in 1971 as an outgrowth of an
Intel contract. A Japanese calculator company called Busicom had hired Intel to design and
produce chips for an upcoming desktop calculator. Engineers inside of Intel got a little carried
away, and over the course of a few years, they ended up creating a computer on a chip as a counteroffer to Busicom's design.
This technology is what we would now call today a microprocessor.
The 4004 is totally disconnected from modern chips.
It's kind of a family of one.
That being said, creating the 4004 gave Intel the know-how to make other microprocessors and it really positioned the company as an early leader in the field. When the chance came to make another
chip, Intel was ready to take up the challenge. So let's take a look at the next chip Intel made,
the 8008. How did Intel turn microprocessors into their big business, and how did the new chip
take design cues from a 1969 terminal? Along the way, we'll see Intel once again violating their
mission statement to attempt to court a lucrative contract. To tell the story properly, we need to
start back before Intel was founded. We have to go back to 1968 and
the formation of Computer Terminal Corporation, otherwise known as CTC. It was a small San
Antonio, Texas-based startup. And as the name suggests, CTC was a company primarily concerned
with computer terminals. Now, these types of machines have come up on the show a lot
before, but I think it bears
a little explanation here, since outside of some specific applications, dedicated terminals are
no longer in use today. A terminal was essentially how a user would interact with a mainframe back
in the days before personal computers came to be. The very earliest examples of terminals were
called teletypes, and these
particular machines were essentially a glorified electric typewriter. You'd hammer out a request
on a typewriter keyboard, which got printed onto a paper feed, then the computer would send
its response to be typed onto that same paper. Teletypes were inflexible. They were loud and they wasted huge amounts of
paper. Any program logic was on the computer side, so you couldn't do much besides send and
receive data. But for a time, basically the 50s and 60s, they were really the only option.
Where some users were annoyed or frustrated with teletypes, the founders of CTC,
Gus Roche and Phil Ray, saw an opportunity. Outside of some special-purpose hardware,
there really wasn't any replacement for the venerated teletype, so CTC set about designing
one. Their machine, eventually called the Datapoint 3300, was completely compatible with a few popular terminals, but it had some key advantages.
The biggest was that instead of printing data out onto paper, the Datapoint 3300 used a small cathode ray display.
In the parlance of the day, this new type of terminal was called a glass teletype, since, well, everything was displayed on
glass instead of paper. Now, I've seen some allegations that CTC, and especially Roche,
were actually trying to create an early personal computer and market it as a terminal to
kind of sneak into offices. I can't say for certain if those claims are valid, but a look
at the first product shows some hints that that theory might have weight behind it.
Announced almost as soon as the company was founded and first sold in 1969, the 3300 terminal was a very complex piece of equipment.
Managing communications with the mainframe, keyboard input, CRT output, plus optional peripherals wasn't a small feat.
The machine was managed by logic circuitry that approached the complexity of a simple computer. But we have to keep
the time in mind here. Remember that this is a few years before the first microprocessor
would ever hit shelves, so the 3300 was instead built using discrete logic chips. It had a processor, roughly speaking, but that
processor was composed of many, many integrated circuits. Once on the market, this
fancy new glass teletype really became a sensation. Not only could it directly replace the old and
now outdated teletypes, but it was smaller than the venerated terminals. CTC had designed the 3300 to be,
at least as much as possible, a small and sleek machine. The screen, keyboard, and all the
circuitry was built into a single unit. It even went beyond old teletypes, allowing users to move
the text cursor around on the screen. You really can't do that with paper. This early success gave CTC the capital and momentum to move on to a more ambitious project,
the Datapoint 2200.
The earlier model had already been a huge disruption to the market,
but the 2200 was slated to be an utter revolution.
That is, if they could make the design work.
You see, if things went to spec,
then this new machine would really be a personal computer. But there were some problems.
As I've said earlier, there is some debate as to whether CTC intended the 2200 to be a personal
computer out of the gate, or if the final product was actually a convenient shortcut.
But the specs of this quote-unquote terminal definitely raised my eyebrows. The biggest point here is that the 2200 actually had its own
software. Terminals of the time were getting more and more complex, that's true, but they were
purpose-built. A normal terminal's function was defined on the hardware level, plus a little bit of
code and ROM here and there. Compare that to the 2200 and we see a different picture.
It was designed to have a much more complex processor than the early 3300. One a lot closer
to an actual computer. It also had two tape drives for loading data and programs, plus an optional
disk drive expansion. The machine itself didn't actually function as a terminal. You would
first have to turn it on and then load up a tape with terminal emulation software on
it. Then, once that was running, you could connect the 2200 up to a mainframe. The reason
for all this complication was to address a key issue,
compatibility. In the 60s and 70s, the mainframe market was exceedingly diverse. Many manufacturers
all designed computers to competing standards. That meant that most terminals were only compatible
with a subset of the mainframe market. The 2200 addressed this with a very tried-and-true method, and that's going
way over the top with engineering. It was able to load different terminal software as needed so that
you could just adjust the CTC machine to work with most mainframes. But to get the flexibility
needed to accomplish this feat, the team at CTC essentially designed a computer in miniature.
And since the 2200 could run software to communicate with mainframes, it was just a step up to
run any kind of software.
Like with the earlier 3300, CTC designed their new machine using discrete components.
The processor for the Datapoint 2200 used around 100 separate chips.
With that, plus screen, keyboard, and circuits to drive all those components,
you get a self-contained computer that can also function as a terminal.
All in all, it's a pretty capable little machine for the time.
But all was not well in San Antonio, despite how good everything looked on paper.
In my experience, anything involving computers is inherently cursed, and the 2200 is no exception.
Early on, it had been decided that this new terminal should be pretty compact,
with the goal that it should match the footprint of a common electric typewriter.
The logic behind this was that it would make it an easier fit in a new office. That compact case would lead to problems with heat dissipation, and early models were prone
to overheat, which was a major problem. The combination of small case and the multitude of
chips made for a perfect storm, or rather a perfect oven, so to speak. One key issue with complex logic circuits
like the ones used in the 2200 is the amount of power they require. Each chip has some set wattage
that it draws, and no matter how well you make a chip, some of that power gets turned into waste
heat. Multiply that small amount of heat coming off each chip by the large chip count in the 2200,
and you get a pretty big temperature increase. The other issue, albeit a smaller one, was the cost of manufacturing
the terminal. The sheer number of chips it used, combined with how complicated the wiring for those
chips was, added up. Simplifying the machine's design and reducing the overall chip count would
go a long way
towards making it a more profitable venture.
Reducing the number of chips used would also address the overheating problem, since less
power would be required to drive everything.
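Just to put rough numbers on that multiplication, here's a quick back-of-envelope sketch in Python. The per-chip figure is purely an assumption on my part, in the ballpark of typical TTL-era logic packages, and not a measured number for the 2200:

    # Rough back-of-envelope: waste heat from a board full of discrete logic chips.
    # The per-chip draw is an assumed ballpark figure, not a Datapoint 2200 spec.
    chip_count = 100        # roughly the chip count of the 2200's processor alone
    watts_per_chip = 0.25   # assumed average draw per package, in watts

    total_watts = chip_count * watts_per_chip
    print(f"~{total_watts:.0f} watts of heat in a typewriter-sized case")

Even under those assumed numbers you end up with tens of watts of heat trapped in a small case with very little airflow, and that's before counting the rest of the terminal's circuitry.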
And as good as that sounded, the fact of the matter was that in 1969, there weren't that
many options available.
The only way to dramatically reduce the complexity of the 2200
would be to have custom integrated circuits made. This could condense the sprawl of CTC's logic
circuits into maybe a handful of purpose-built chips. CTC, though, didn't have the experience
or the equipment to make that in-house, but they did have designs for what they wanted.
So they went shopping for someone who could make those designs into reality baked into silicon.
In this early period, semiconductor companies, or SIMCOs for short if you want to sound cool,
worked a lot differently than they do today. There weren't many off-the-shelf ICs. Instead, a company like CTC would submit a custom design
to a SIMCO for production. Most everything was on an as-needed contractual basis.
And that's where Intel comes into the picture. Their business model was designed to be
fundamentally different. Instead of working contracts, Intel's plan was to make generic
ICs that could be sold to
really anyone, thus making operations for the company much more simple and much more profitable.
Their first product line would be memory chips, something that any computer needed. The problem
was that even with such a good business plan, one that would eventually revolutionize the industry,
Intel was still a startup. And in the early years especially, the company was chronically low on funds. So, Intel ended up, despite their
best efforts, taking a number of contracts to make ends meet. That was how the 4004 project started.
Intel took a contract from a Japanese calculator company called Busicom in order to
get cash flowing. It helped them get off the ground, but it was a practice that ran counter
to their goals, and a lot of employees at Intel were none too pleased about it. Near the same time
that Busicom came to Intel seeking a new calculator contract, delegates from CTC would make the same
trip. Intel had already had dealings with this
terminal company. CTC had ordered a batch of what amounted to early memory chips from Intel.
In December of 1969, a group of CTC employees came by Intel headquarters in Mountain View,
California to check on the status of their chips. From first-hand accounts, it sounds like there
was some small talk between the two groups of engineers.
And during the conversation, someone from CTC let slip that they were working on a computer,
the aforementioned 2200, and they were having some issues fixing it.
That very well could have stayed as office gossip, just some idle complaining between colleagues.
But the timing here conspired to make it a lot bigger.
One of the Intel engineers in the office that day was Stan Mazor. Just a few months earlier,
he had started working on the 4004 processor project. When he heard that CTC was working
on a computer, he saw an opportunity. Mazor caught the delegation before they returned to Texas
and pitched the idea that
Intel may be able to help them with their processor problem. That is, if they could
furnish some more information. For CTC, this must have seemed like an opportunity just dropped in
their lap. Keep in mind that the 2200 was already designed, so by early 1970, CTC sent over a full
technical specification for their computer.
And after some back and forth, an agreement was reached to have Intel develop the processor
for the Datapoint 2200. It looked, almost by coincidence, like Intel had another contract
on their hands. But what did the spec look like? What were the engineers at Intel actually facing?
In a lot of ways, not really that much more than they were facing with the 4-bit 4004 processor.
Just a little bigger.
CTC's spec called for an 8-bit processor,
basically meaning that it was designed to operate on 8-bit numbers.
Like the 4004, this new computer would be a register-based machine.
That meant that the processor itself would have small chunks of memory on board for storing and
operating on temporary data. Normally, these registers are used to store things like the
result of a math operation or the address of some important bit in memory. The 4004 only had one general-purpose register,
while the Datapoint machine was planned to have seven.
It was also slated to use a call stack,
a very rudimentary way to implement something like functions.
The earlier 4004 used the same method.
Qualitatively, each feature of the new chip would have an analog
on the currently in development
4004.
But quantitatively, everything would just be expanded.
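To make the idea of registers plus a call stack a little more concrete, here's a minimal sketch in Python. The seven register names and the stack behavior here are illustrative only; they follow the general description above rather than the actual 2200 or 8008 spec:

    # A toy model of a register-based machine: a few named 8-bit registers
    # plus a call stack holding return addresses. Names are illustrative.
    registers = {name: 0 for name in ["A", "B", "C", "D", "E", "H", "L"]}
    call_stack = []  # return addresses pushed on CALL, popped on RETURN

    def call(current_pc, target):
        call_stack.append(current_pc + 1)  # remember where to come back to
        return target                      # jump into the subroutine

    def ret():
        return call_stack.pop()            # resume right after the original CALL

    registers["A"] = (registers["A"] + 42) & 0xFF  # 8-bit math wraps at 255
    pc = call(current_pc=10, target=200)
    pc = ret()
    print(registers["A"], pc)  # -> 42 11

The point is just that registers are a tiny scratchpad living on the chip itself, and the call stack is what lets a program jump into a routine and find its way back.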
The largest change by far was the instruction set, and this may be a good time to delve
into a quick discussion of how a processor actually works.
The actual program that a computer runs, when you get down to the silicon level,
is a set of instructions. Each of these tells the computer to do some very basic task, like
add these two numbers, or put this data at this spot in memory, or jump to another part of this
program. When you string enough of those instructions together, you can create something
that's actually useful, a program.
For a computer to understand these instructions, they have to be translated into machine code, the actual binary data that the processor knows how to read.
The full set of instructions that a computer knows is called the instruction set.
And, while theoretically you can program a computer to do anything, the more fully featured an instruction set is, the easier it becomes to program for.
The 4004 had a reasonable enough instruction set, at least for something like a calculator, but in a fully fledged computer, there would have been some real issues.
If you wanted a microprocessor that you could use to power a more complicated machine, then you'd need a larger and more fully-featured instruction set.
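If that still feels abstract, here's a tiny sketch of the fetch-decode-execute idea in Python. The three opcodes and their encodings are invented for illustration; they aren't the real 2200 or 8008 instruction set, which was far larger:

    # A toy processor with a made-up three-instruction set.
    LOAD_A, ADD_A, HALT = 0x01, 0x02, 0xFF

    program = [LOAD_A, 5, ADD_A, 7, HALT]  # "machine code": opcodes plus operands

    a = 0   # a single accumulator register
    pc = 0  # program counter: which byte of the program we're on
    while True:
        op = program[pc]          # fetch
        if op == LOAD_A:          # decode and execute
            a = program[pc + 1]
            pc += 2
        elif op == ADD_A:
            a = (a + program[pc + 1]) & 0xFF
            pc += 2
        elif op == HALT:
            break
    print(a)  # -> 12

String enough of these together and you get a program; the richer the menu of opcodes the chip understands, the less work each program has to do by hand.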
Anyway, Intel wouldn't have to worry about designing another instruction set themselves.
CTC already had one for the 2200, so Intel just needed to adhere to that existing spec.
The specifics of how that was accomplished on the silicon level were all up to Intel.
This instruction set, developed by CTC at the very end of the 1960s and then passed on to Intel,
is one of the earliest roots of the future x86 architecture. By the middle of 1970,
Intel had started the new project in earnest. The chip they designed was originally called the 1201,
but it was renamed the 8008, so I'm gonna just keep calling it that for clarity's sake.
The project was headed up by two familiar faces, Stan Mazor and Ted Hoff. In 1970,
they had just come off designing the 4004, so they were probably the best people in the world,
at the time at least,
to design the next microprocessor. The two would figure out how to take the CTC spec and create it on a single chip, and in doing so, they took inspiration from the closest source they could
find. The task of transferring designs for the new chip onto silicon fell to one Hal Feeney,
a new hire at Intel. Feeney took the spec that Hoff and
Mazor had written and started the task of converting it to a workable chip in March of 1970.
And this is a spot where the timeline starts to really matter. As I keep alluding to,
the 4004 chip was still in development. In 1970, Federico Faggin, the engineer working on the silicon level for that chip, was still hard at work.
Many of the design choices made for the newer 8008 were drawn from Faggin's work, incomplete as it may have been.
There weren't commonly used ways to test something as complex as a microprocessor, or to make dense enough integrated circuits.
All the specifics needed for creating something like this were totally new.
Faggin and his colleagues would blaze this new trail for the 4004,
which made the development of the 8008 right behind it very possible.
After the 4004 wrapped, Faggin would move over to help with the last touches on the 8008.
Samples would roll in by fall of 1971.
But even with a completed microprocessor, there were still some issues, and the Datapoint 2200
would actually end up shipping without the new chip inside. So how did Intel lose the contract
with CTC? Well, there's a few factors. The least likely reason, but still an interesting one nonetheless,
has to do with Texas Instruments. TI is another SIMCO that started up a little earlier than Intel,
and at this point, they were a dangerous competitor. Remember back to early 1971,
when Intel sent over a preliminary design for their CPU to CTC. Well, it turns out that the terminal manufacturer
wasn't just courting Intel. CTC would turn around and pass that design doc on to Texas Instruments.
A chip would be produced, with TI sending samples to CTC well before Intel finished
their implementation. But this new completed microprocessor was too buggy and too unstable to use.
Thus, TI was out of the running for the contract.
I've seen it said that the original Intel designs had a few major flaws that got into
the TI chip, but I haven't seen evidence that confirms that.
It may just be an urban legend.
It may be the case that receiving a failed chip from TI made CTC look for other solutions.
But once again, there's no confirmation of that.
Ultimately, CTC would find its own way to fix the Datapoint 2200.
Fed up with waiting for Intel to send in a new chip, CTC had decided to just redesign their terminal, still using discrete logic chips.
In the end, it turned out that some
careful engineering was able to avoid needing a totally new technology. The final product was
released in spring of 1971, well ahead of Intel's chip ever being completed. No longer needing
outside services, CTC decided to sever ties with Intel. This left Intel in a weird limbo kind of position.
They now had another microprocessor without a client. After much internal debate, it was decided
the best solution would be to release the chip alongside their other offerings, to try to recoup
some costs. So, with a few modifications, the 8008 was announced, first showing up in catalogs in
April of 72. It would sell for $120, or about $720 once adjusted for inflation.
So that's how the 8008 came to be. But why is a somewhat obscure chip from the early 70s so
important? It wasn't a revolution like the first microprocessor, but it was an evolution.
In a lot of ways, the 8008 would end up being a refinement of earlier designs, one that
would pave the way for more complicated chips down the road. And there is one key feature
that's easy to miss when looking at the spec sheet. The 8008 worked with generic support
chips. Let's say you're trying to build a computer based on
the newest Intel CPU. What RAM do you need? Or ROM? What about chips for controlling I/O devices?
In the case of the earlier 4004, you had to use custom ICs designed just for use with that chip.
But starting with this new 8-bit chip, the field was open. You could pair just about any other chips with it.
That may seem like a small change, and in a lot of ways, it was.
But it helped set the stage for the personal computer revolution to come.
So this new chip had the flexibility to use different ROM and RAM.
But it also was just plain more flexible as a system than the earlier 4004.
The instruction set for the 8008 was much more fully featured, meaning the programmers had a
lot more room to create useful systems. It was a lot more than just a really fancy calculator,
it was a fully-fledged computer. And as such, the 8008 would find its way into a relatively new class of systems.
There were a handful of very early personal computers that would use this Intel chip.
The 8008 was in an interesting place for the time.
It was a cheap way to get a computer up and running, far cheaper than any other options.
It was flexible enough to build complex computers around it, and it was powerful enough to do some amount of work.
But at the same time, it wasn't powerful enough to compete with larger computers, so it didn't ever have a shot at the mainframe market.
It was in a league of its own, and it arrived just as some of the earliest personal computers were starting to appear.
This wave of new systems would start in 1973, with a lot of them powered by Intel chips. These started off as kits, where you'd have to
buy plans for the computer and then order out for the parts and assemble the machine yourself.
This was more of a hobbyist type of product, since it required a lot of time and know-how
to get up and running. Not all that personal for the general public, but it was a step in the right direction.
The next step beyond that would also start to appear the same year.
The Micral, designed and produced by R2E in France,
is often cited as the first true personal computer.
And there is some good reason for that.
On its release, it was the only small computer that a consumer could order fully assembled and ready to use.
It was originally designed for use in the French National Institute for Agronomic Research as a system control unit for factories or farms.
The logic at the time was that most factories, or anywhere that needed automation, didn't need an expensive mainframe.
In 1973, one of the cheapest machines that you could get was a DEC PDP-8, and that would cost about $18,000.
That made it a hard sell when it came to automating small to medium-sized businesses.
Once on the market, the Micral base unit sold for $1,700, and that drop in price was in large part thanks to the cheap and flexible 8008. There were an estimated 90,000 Micral
computers sold during its lifespan. The exterior of the system is a far cry from what we'd be used
to today. It came in a small squat box with a series of lights and
switches on the front. But when you look inside, you find something very similar to modern computers.
It was powered by a microprocessor, as I've said before, that was connected to memory chips with
support for expansion cards. You could add storage in the form of floppy disks. You could hook up
outputs like terminals and printers. You could even load
up and run software on it. And all of this is cheap and easy enough, roughly speaking,
that it can be used by one person in the comfort of their own home.
Another early contender in this space was the MCM-70. This computer was built by Microcomputer
Machines in Canada, with the first units shipping in 1974. This was another
computer based around the Intel 8008, but there were a few big differences between this system
and the Micral. The most striking thing about the MCM-70 has to be its appearance. The computer
is shaped like a massive keyboard. In front is the keyboard itself, then two cassette drives are positioned
above that with a single-line plasma display above the drives. It's a strange sight to say the least.
This oddly-shaped computer served as another stepping stone towards a truly personal computer
that would arrive in the 80s and 90s. The Micral was a big step because you no longer had to assemble a computer yourself,
but it still came as just a computer. To get much use out of it, you needed to get a terminal,
disk drives, and so on. The MCM-70 was a totally different beast. The system had everything you
needed to get up and running built into one box. The keyboard served as input, with the dual tape drives being used for storage,
and the small screen on top as output. All this was packed into a metal box alongside a fully
functioning 8-bit computer. It was a little more expensive than the Micral, coming in at about
$5,000 for a bare-bones model, but it was still a real computer, and it was becoming more and more
accessible. Ultimately, the 1970s wouldn't
be the decade where a computer reached every desk. Systems were still too expensive and
too hard to use. That being said, this would be the decade when the seeds of the coming
microcomputer revolution were planted, and Intel's 8008 was a big step along that path.
Okay, I think it's about time to wrap this episode up.
The trend should be starting to become clear already.
The story of the microprocessor quickly becomes the story of the personal computer.
Ultimately, the 8008 itself would only be one more step along the way to modern PCs,
but it was a big one.
The 8008 is kinda hard to talk about because its legacy doesn't become immediately apparent until we get to the future generation of Intel chips.
That being said, I think it's an important story because it shows how the company's
early work with the 4004, plus some outside intervention
and good timing, positioned them to take control of the microprocessor market. A lot of what would
become the x86 architecture that we use today starts with the 8008, and we'll start to see
more of that as we venture further along Intel's path. This is just the second part of the overall
series I'm planning on the legacy of Intel's processors. As a reminder from the last part of the series, these aren't
going to be back-to-back episodes on the topic. I don't really want to bog down my feed with months
and months of talking nothing but chips. Instead, I'll pick this back up at a later date to talk
about the next big leap, the Intel 8080.
Until then, thanks for listening to Advent of Computing.
I'll be back in two weeks' time with a new episode, and this time I'm thinking about looking at some of the more fun applications of the computer.
If you like the show, then please take a minute to share it with your friends.
You can also rate and review me on Apple Podcasts.
If you have any comments or suggestions for a future topic, then go ahead and shoot me a tweet.
I'm at Advent of Comp on Twitter.
And as always, have a great rest of your day.