Advent of Computing - Episode 34 - 8080 VS Z80
Episode Date: July 12, 2020

In 1974 Intel released the 8080 processor, a chip long in the making. It was the first microprocessor that had the right combination of power and price to make personal computers viable. But that same year a small group of employees defected and formed their own company called Zilog. Among this group were Masatoshi Shima and Federico Faggin, two of the principal architects behind the 8080 as well as Intel's other processors. Zilog would go on to release a better chip, the Z80, that blew Intel out of the water. Today we continue our Intel series with a look into this twisting story.

Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and stickers: https://www.patreon.com/adventofcomputing

Important Dates:
1974: Intel 8080 hits shelves
1976: Zilog Z80 goes on sale
Transcript
As long-time listeners will surely be aware, I'm a bit of a sucker for a nice descriptive word.
For whatever reason, the field of computing is crowded with really lame and really
poorly chosen names for things. So when you come across a better name, even if it's not
the most commonly used one, it's something you want to keep track of for later.
Microcomputer is one of the more evocative words that's fallen out of
fashion in more recent years. From the early 1970s, the term was used to describe a small
but inexpensive computer, something that someone could buy and use in their own home. I think the
term is so great because it bundles up a lot of information. Microcomputers are the smallest end
of the computing spectrum, shrunk down in all regards and all dimensions. But they have another micro-thing inside of them. That's
the microprocessor. Now, it may seem a little strange to ruminate on such a small detail,
but there was a time where that distinction would matter a great deal. The first microprocessors
started being manufactured at the beginning of the 1970s,
but widespread adoption would take many more years. So in that transitionary period,
you needed a way to describe what exactly you were working with. Was it a big system that was
built out of piles of transistors and wires? Or was it a more micro-sized machine? Today,
almost no one uses the term microcomputer. That's because the
distinction doesn't really exist anymore. The microprocessor won out. Every computer today,
without exception, is powered by these little silicon wonder chips. But this future, well,
it wasn't always a given. When microprocessors first hit the scene, everyone knew that they
would change the world. But no one really knew how. No one was quite sure what to do with them. Early
chips were extremely limited, they were hard to sell, and they weren't quite ready to
be packed up into microcomputers. At least not yet.
That would all change with Intel's introduction of the 8080. The new processor would take
the world by storm, and it would solidify Intel's place in the computing world. But the 8080's story is much
more than the chip itself. Shortly after release, Intel would have to contend with some very
dangerous competition. It came in the form of a new semiconductor company. And this was a company
that was built by some of the same people that were responsible for every Intel processor to date. Welcome back to Advent of Computing. I'm your host,
Sean Haas, and this is episode 34, 8080 vs Z80.
Today, we're finally picking back up my Intel series with a look at the next chip in the lineage,
the Intel 8080 microprocessor.
For those just joining in, I've been slowly working my way through the development of microprocessors at Intel,
starting with the 4004 and one day ending with the 8086.
As of this episode, we should be just about in the middle of that series,
give or take some wiggle room for any tangents that I run into during my research.
You shouldn't need to listen to the earlier parts of the series to enjoy today's episode,
but if you want to learn more about Intel's past,
then I'd highly recommend checking out episodes 16 and 21 in the archive.
The 8080 is really a turning point for Intel in a number of ways. The company's first two chips,
the 4004 and the 8008, had relatively limited mass market appeal. The 4004 in particular is a pretty interesting case. It was the first microprocessor ever produced. A computer on a chip had been a
dream for decades, and Intel was finally the first to achieve it. But the final product ended up
lacking a lot. It was a totally new technology, so I think that can be forgiven. The 8008 falls
into a similar strange spot. It was the next big leap forward, and it was a much more capable
chip than Intel's first shot. But it also didn't hit it big in the market. These early
chips were much more likely to find their way into calculators or industrial control
systems than fully-fledged computers. With the 8080, things would really start to change.
This chip would help create the personal computing boom
that we see in the latter half of the 1970s. It brought a combination of power and economy
to the market at just the right time. This new chip would turn out to be a huge success for Intel,
but it would also lead to some problems for the company down the road.
Soon after the 8080 was released, Federico Faggin, one of the key designers behind all of Intel's processors up to this point, left the company.
Even worse, he started a competing firm, Zilog, with the express purpose of making a better 8080 than Intel could.
So let's dive into how the 8080 came to be.
How did Intel's success lead to the birth of its first major competition?
And how does this all fit into the larger story of the personal computer?
To start things off, we need to look at the tail end of the 8008's development.
And just as a warning, there's going to be a whole lot of 8s in this episode.
I'll do my best to keep things from getting too confusing, but all these chips have really
similar names and they're all just numbers with 8s and 0s, so bear with me here.
The 8008 was Intel's first step into the world of 8-bit computing, with the development wrapping in 1971.
The company had previously developed the 4004, but that chip was aimed primarily at desktop calculators. The newer
8008 was designed as a computer first, an approach that made the chip a lot more flexible and a lot
more powerful in general. But no matter how capable the 8008 may have been, the growing problem was
figuring out how to sell the thing. A microprocessor was still a totally new idea,
and as such, there wasn't really a market for this type of product. It was obvious to Intel
that people would want a microprocessor, but getting customers and manufacturers to agree
was going to be a little bit of a feat. The other big reality to keep in mind is that for Intel,
the microprocessor was almost like
this weird side hustle that they had fallen into.
Intel's underlying business plan was to make standardized semiconductor components and
sell them en masse.
The 4004 had started out as a side contract to kind of keep the lights on at Intel.
The same was true for the 8008. As flashy as new processors were,
the technology just wasn't the core business for Intel. At least not yet. Part of drumming up
interest in these processors was sending engineers around to give talks and some
demonstrations. One of those touring engineers was none other than Federico Faggin, often described as the father
of the microprocessor. Faggin, along with some help from other engineers, primarily Masatoshi
Shima, had implemented the 4004 from early designs into real-life silicon. He then led
the development of the 8008 project. Just as that second project was wrapping, Faggin found himself
in Europe showing
off his groundbreaking work to prospective clients. But not everything went as expected.
To quote Faggin, quote, late in the summer of 1971, I went to Europe to give a series of technical
seminars on the MCS-4 and the 8008 and to visit customers. It was an important experience. I
received a fair amount of criticism,
some of it valid, about the architecture and performance of the microprocessors.
The more computer-oriented the company I visited was, the nastier people's comments were."
Just to note there, MCS-4 is the name for the family of chips supporting the 4004 processor. So on the surface, it may seem strange that computer manufacturers would be antagonistic about a microprocessor, but they
had some really good reason for concern. Part of it was probably a mixture of pride and self-interest.
Intel wasn't a computer company. They made semiconductor chips for computers, but they didn't make
computers. So to see a new company swaggering into the computer market must have ruffled quite
a few feathers to say the least. But besides worries about competition, there was a slate of
technical flaws in Faggin's early designs. While Intel may have had the most experience in integrated
circuit design, they lacked
experience in computer design.
Computer manufacturers of the time had to build up systems from scratch, designing processors
from discrete transistors or logic chips.
These types of companies just had way more real-world understanding than Intel did.
One of the biggest complaints was that these early microprocessors were just way too slow to be useful. The 8008 ran at 500kHz, which is pretty sluggish compared to its non-integrated
contemporaries. Another problem came down to how the 8008 was packaged. Now, let me
take a second to explain what that actually means for those not very familiar with integrated circuits. The actual IC is just a tiny flake of silicon. On its own, you can't really use that. It's too
small and way too delicate to handle. To make a chip more usable, it's packed into a housing,
usually plastic or sometimes ceramic. That's just the little black box part of the microchip.
But when people talk about packaging, they're usually referring more to the pins that line the perimeter of that enclosure.
Each pin connects directly to the tiny silicon flake so you can actually interface with the
circuit. How those pins are assigned to different functions ends up being really important to how the final microchip is used. And in the specific case of the 8008, those pins were assigned pretty poorly. The 8008 came
in an 18-pin package. That means you only had 18 wires to interface with the outside world.
Two of those pins were used for power. Eight were allocated for signal and state control.
That left just eight pins for data and addressing memory, just enough to represent a single 8-bit number. That's not a whole lot to
work with. With so few pins, the 8008 had to use a method called multiplexing, which essentially
made it possible to share those eight free pins between sending out an address in memory or
sending or receiving a data value.
This was done by using a few state pins to tell the peripherals if the CPU was about
to send out a memory address or send out some data.
And while this worked, it was slow.
It required supporting chips to decode the multiplexed pins, and it just wasn't very fun to use.
In the opinions of the computer manufacturers that Faggin talked with, the packaging made the 8008 way too hard to work with.
It needed more pins to make it usable.
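To make that multiplexing idea a bit more concrete, here's a rough sketch in C of what the external support logic had to do. The phase names and byte values are made up for illustration; this is not the 8008's actual bus timing, just the general shape of the problem:

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model of a multiplexed 8-bit bus: the same eight lines carry
   either part of an address or a data byte, and separate "state"
   signals tell external logic how to interpret them. Names and
   phases here are illustrative, not the 8008's real protocol. */

enum bus_phase { ADDR_LOW, ADDR_HIGH, DATA };

struct latched_bus {
    uint8_t addr_low;
    uint8_t addr_high;
    uint8_t data;
};

/* External support logic: latch the shared lines into the right
   register based on the current phase signal. */
void demultiplex(struct latched_bus *b, enum bus_phase phase, uint8_t lines)
{
    switch (phase) {
    case ADDR_LOW:  b->addr_low  = lines; break;
    case ADDR_HIGH: b->addr_high = lines; break;
    case DATA:      b->data      = lines; break;
    }
}

int main(void)
{
    struct latched_bus b = {0};
    /* A single memory access takes several bus cycles on the same pins. */
    demultiplex(&b, ADDR_LOW,  0x34);
    demultiplex(&b, ADDR_HIGH, 0x12);
    demultiplex(&b, DATA,      0xAB);
    printf("address 0x%02X%02X -> data 0x%02X\n",
           b.addr_high, b.addr_low, b.data);
    return 0;
}
```

With dedicated address and data pins, like a larger package allows, none of that external latching is needed in the first place.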
So, why did this processor have such bad packaging?
Why didn't Faggin just design the chip to use 40 pins or so?
Well, it comes down to corporate
policy at Intel at the time. Intel wasn't prioritizing processors, far from it. They
primarily made memory chips. And those worked fine with only 18 pins. I haven't seen it explicitly
stated anywhere, but I'd be willing to bet that another reason was upfront cost. Intel probably didn't have the
machinery to make larger packages, and buying that kind of industrial equipment was not cheap.
The net result was that Faggin was severely limited in what he could actually produce.
On the inside, things weren't that much better. One of the other big complaints was that the 8008
was very primitive. Computer manufacturers may not have
known much about working with raw silicon, but they knew what a computer needed to be able to do.
They knew how to design a processor that was practically useful. Part of the issue was the
same one that I touched on before. Intel just didn't have the computer know-how needed. But
a compounding factor was that integrated circuits were still a relatively new
technology. An entire processor on a chip was an amazing feat. The 8008 pushed the boundaries of
complexity, but it still only had 3,500 transistors. In the grand scheme of things,
that's not all that much to work with. So there was really a hard set limit on how powerful a
microprocessor could be. For a lot of people, hearing their work torn apart by those more
experienced in the industry would have been devastating. But not for Federico. Instead of
despairing, he listened. And by the time his microprocessor tour was over, he had a solid
plan in mind. Federico knew that fixing the 8008 was not
really possible, so he started to design a new chip. After all, the 8008 had only been
Intel's second microprocessor. With any luck, the third time would be the charm.
Shortly after the 8008 officially launched, sometime in early 1972, Faggin started the long process of convincing Intel to
create another chip. But he faced quite a bit of internal resistance. The microprocessor market
was still untested. Intel had yet to see if they could even sell enough chips outside of their
existing contracts. It also didn't help that Faggin insisted that the new chip had to have a 40-pin
package, use a new type of silicon technology, and be far more complex than any existing ICs
that Intel manufactured. All of this was too much change too fast. Faggin found his proposal
sidelined, at least for the time being. This is one of those instances where a little bit of delay ended up helping the project, actually.
At the same time that Faggin was lobbying for his new chip, he was also busy arranging a job for an old colleague.
Masatoshi Shima had played an important role in the completion of the 4004, but he wasn't actually an Intel employee.
At the time, he was working for Busicom, a Japanese calculator company that contracted with Intel to develop calculator
ICs. Through a series of delays and contract dealings, Shima ended up traveling to the
United States to help with the 4004 project. Once development finished, he headed back
to Japan, but he and Faggin had kept in touch.
Back in Japan, Shima had kept himself busy as well.
He continued working at Busicom for a bit before moving on to design ICs for Ricoh.
While Shima wasn't building computers at Ricoh, he did do the next best thing.
His main projects were developing integrated circuit chips for interfacing with mini computers.
Now, these are a class of computers that no longer exist, but at the time,
they were an interesting advancement. Mini computers were basically the smallest computers
you could buy. They were cheaper and less powerful than mainframes, but still sturdy
enough to get a lot of work done. Like their larger counterparts, they still had yet to
embrace the microprocessor. Instead, they were built from discrete chips.
The core of these systems were built from circuit boards populated with relatively generic
TTL logic integrated circuits.
But that only went so far.
When it came to building out interfaces to other computers or peripherals, it became
really more practical to develop custom integrated circuits.
So Shima got a great chance to sharpen his skills in IC design while working just on the periphery
of computing. This new experience made Shima even better
equipped for the job that Faggin had in mind. In November of 1972, Masatoshi Shima made
the move to California and started his new job with Intel. The timing ended up working out
perfectly. Faggin had finally gotten the management to agree to his new chip in October. This gave him
a month head start to get the project up and ready for Shima to take over upon his arrival.
During that window, Faggin, along with Stan Mazor and Ted Hoff, finished up the initial design of
the 8080. They were turning the basic
idea of a processor into a more workable form, ironing out details, and getting a design document
roughed up. Hoff and Mazor had been major contributors to the 4004 and 8008 on the design
side, and with their inputs, the 8080 would come full circle and reunite the original team behind the first microprocessor.
The overall design for the 8080 ended up being pretty conservative, all things considered.
The team decided early on that an enhanced 8008 chip instead of a totally new processor would be the way to go.
A lot of their choices in the design stemmed from enhancements and extensions that existing clients wanted.
By this point, 8008s had actually started to sell.
So between real-world users and feedback that Faggin received during his tours,
there was a glut of proposed upgrades and fixes.
But the new chip wouldn't be a direct expansion.
A lot of the underlying design would have to be retooled.
The easiest place this can
be seen is in the instruction set for the 8080. The new chip was not binary compatible with its
predecessor, but it was roughly code compatible. That means that you couldn't just take an already
compiled program for the 8008 and run it on the new chip, but you could take the source code for
that same program and rebuild it for the 8080.
In theory, this meant that any code a programmer wrote for the older chip could be moved over
once the 8080 came out. But in practice, things got a little bit more complicated.
The new chip's instruction set was designed in such a way that each instruction had a direct
equivalent on the older processor. These equivalent instructions
worked in the same way, but the versions on the 8080 were often called by a different
name in the assembler code. This made it so that older programs could be converted over
into the new assembly language. Think of it as compatibility with an extra needed step.
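As a concrete illustration of that extra step, here's a small C sketch of a source-level mnemonic translation. The specific 8008-to-8080 pairs shown are just examples of the general idea, so treat the exact pairings as an assumption rather than a complete or authoritative mapping:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative source-level translation: the operation is the same,
   only the mnemonic changes between the 8008 and 8080 assemblers.
   The pairs below are examples, not a complete or authoritative map. */
struct mnemonic_map { const char *old8008; const char *new8080; };

static const struct mnemonic_map table[] = {
    { "LAB", "MOV A,B" },  /* copy register B into A */
    { "ADB", "ADD B"   },  /* add register B to A    */
    { "INB", "INR B"   },  /* increment register B   */
};

const char *translate(const char *op)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(op, table[i].old8008) == 0)
            return table[i].new8080;
    return op; /* many mnemonics carried over unchanged */
}

int main(void)
{
    printf("%s -> %s\n", "LAB", translate("LAB"));
    printf("%s -> %s\n", "JMP", translate("JMP"));
    return 0;
}
```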
A big reason to keep this compatibility was because the 8008 was actually getting some
market share. The plan was to pitch the 8080 to existing customers as a better chip that they
were already using. As Ted Hoff later recalled, quote, Stan and I were really having lots of
difficulty dealing with the customers who were looking at the 8008 and trying to push it beyond
its capabilities.
They'd come to us asking for help, and there was just a limit to what we could do.
I don't know when we made the decision to try to keep a degree of compatibility,
but that seemed to really help in the long run. Because then, we could sort of tell the customer,
try working with the 8008. If it doesn't solve the problem, then the next generation will be coming.
End quote. Plans also called for a 40-pin package. This did away with the frustrating multiplexing of Intel's earlier processors. And with more pins, the 8080 would be able to address
a good deal more memory. There was also a slate of other more technical changes. Interrupt handling
was vastly improved,
new memory addressing modes were added, and new instructions were added to back everything up.
One of the big ones is that the 8080 was Intel's first chip to implement
an in-memory stack. This was a big step towards making the chip more flexible and able to handle
more complex programs.
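To get a feel for why an in-memory stack mattered, here's a toy C model of the idea. The 8008 kept return addresses in a small, fixed-depth stack on the chip itself, while a stack pointer into RAM lets call depth grow as far as memory allows. Everything below is purely illustrative, not the 8080's actual internals:

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model of an in-memory call stack managed by a stack pointer.
   Moving the stack into RAM means call depth is limited only by
   available memory. Details here are illustrative, not real microcode. */

static uint8_t  ram[65536];
static uint16_t sp = 0xFFFF;   /* stack grows downward in memory */

void push16(uint16_t value)
{
    ram[--sp] = (uint8_t)(value >> 8);   /* high byte */
    ram[--sp] = (uint8_t)(value & 0xFF); /* low byte  */
}

uint16_t pop16(void)
{
    uint16_t lo = ram[sp++];
    uint16_t hi = ram[sp++];
    return (uint16_t)((hi << 8) | lo);
}

int main(void)
{
    uint16_t pc = 0x0100;      /* pretend program counter       */
    push16(pc + 3);            /* CALL: save the return address */
    pc = 0x2000;               /* ...jump into the subroutine   */
    pc = pop16();              /* RET: restore and continue     */
    printf("resumed at 0x%04X, sp back to 0x%04X\n", pc, sp);
    return 0;
}
```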
But those were all just plans. The next step was to make those into a silicon-ish reality.
And that task fell to the newly hired Shima. Once settled in his new offices at Intel, he set to work converting the rough design into a workable prototype. There was some direction
from Faggin, but the vast majority of the layout and IC design was done by Shima alone. But this task wouldn't be smooth sailing.
The new chip was planned to be released in just over a year. From the time Shima started the
project, it was clear that he had his work cut out for him. To quote Shima,
There were so many challenges and problems in the 8080 development. At first,
as I said, there were no recent product detailed manuals when I first came here.
Secondly, it was unbelievable, but only I was assigned to the 8080 microprocessor project,
end quote. The actual designs for the 8080 were contained in a stack of memos and notes.
It hadn't all been collated and compiled into a single overarching design document.
And with Intel still not 100% on selling CPUs, there were limited resources available.
Shima had to manage most of the project solo, but that wouldn't slow him down.
He had all the tools and expertise he needed to get the job done.
Over the next eight months, the 8080 took shape. But why the time crunch? Part of that came down to
marketing decisions. There were already other microprocessors entering the market, and pretty
soon there would be stiff competition. Intel had the lead, and losing that could spell disaster
for the fledgling processor division.
Faggin worried that if they went to all the trouble of making a better processor and then got beat to the punch,
Intel may not gamble on the technology again.
He already had to fight for the project in the first place.
If it failed in the market, then that would leave him in a pretty awkward position.
Despite Intel not going all-in on the project quite yet,
Shima was still able to benefit greatly from his new position within the company.
One of the key factors was that Intel had institutional knowledge when it came to
building microprocessors. It may have only been a handful of years of experience,
but that was more than anyone else in the industry at the time. While the whole 4004 team was back together,
Shima had quick access to any help when needed. But that was just one side benefit. Intel also
had all the designs for the 8008 and the 4004 on file, right down to the silicon layouts. So,
there was a pool of working circuit designs that Shima could dip into if needed. For instance, a good deal of the
arithmetic circuits were borrowed from the 4004's design. With some tweaks, Shima was able to adapt
these earlier components to work in the new CPU, and thus keep the project on schedule.
Another huge help was that, well, Intel wasn't actually that full of computer people.
Like any respectable company in the 1970s, Intel had a mainframe, specifically a DEC-10 computer.
But not that many employees actually bothered to use it.
A few years previously, Hoff and Mazor had written up a program for running circuit simulations.
This allowed engineers to build a virtual circuit inside the mainframe and test it,
without ever having to etch out an actual silicon wafer. This was a wonderful tool for designing ICs, but at the time, it wasn't being
used all that often. Shima only had to share the mainframe with the occasional memory chip project,
so he had plenty of power to work with. By testing parts of the 8080 circuitry,
he was able to work out bugs before the chip ever went to production.
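The principle behind that kind of pre-silicon testing is simple enough to show in a few lines. Here's a toy C example that "simulates" a one-bit full adder and checks every input combination against plain arithmetic. This is only an illustration of the idea, not the simulator Hoff and Mazor actually wrote:

```c
#include <stdio.h>

/* The idea behind pre-silicon simulation: describe a circuit as
   logic operations, then exercise every input combination in
   software and check the outputs before committing to a wafer. */

void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = a ^ b ^ cin;
    *cout = (a & b) | (cin & (a ^ b));
}

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int cin = 0; cin <= 1; cin++) {
                int sum, cout;
                full_adder(a, b, cin, &sum, &cout);
                if (cout * 2 + sum != a + b + cin)
                    printf("bug at a=%d b=%d cin=%d\n", a, b, cin);
            }
    printf("all input combinations checked\n");
    return 0;
}
```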
After months and months of grinding away and bashing out a lot of simulations on Intel's mainframe,
Shima finally approached the finish line.
Once the IC design was done, it was just a matter of production.
Masks and dies were made, silicon wafers were pressed,
and a few months later, a prototype 8080 wafer was delivered
to Shima's desk. After some initial struggle, he had a processor up and running. All tests
looked good, only a few minor bugs were ferreted out. And in these trials, it became clear
just how much more powerful the new chip was. The 8080 ended up running up to 20 times faster
than its predecessor. After a few rounds of debugging
and more testing, the die was finalized, and the 8080 went into production. In total, it took 16
grueling months from idea to finished chip. By the beginning of 1974, the chip was ready to send out.
Now, all that was left to do was sell the thing. Faggin's push to get the chip done quickly,
combined with Shima's hard work and determination, would really pay off.
At the time of release, there was nothing quite like the 8080 available.
It was respectably fast.
Its feature set was much more in line with a fully-fledged computer.
And it was remarkably cheap. In '74, an 8080 cost $360.
That inflates up to about $1800 in 2020 money. That may sound a little expensive, but it was a great deal for the time. For just $360,
you had a nearly complete computer. With a little bit of RAM, some ROM chips, you were
well on your way to a fully functioning machine. Thanks to the larger package, it was super easy to wire up this new processor to memory
chips.
Data and address lines were separate, so there were no supporting chips necessary.
Intel had a very strong offering at just the right time.
In the first few years of production, the 8080 would explode onto the scene.
Existing 8008 customers were eager to upgrade, and new consumers
came streaming in. One of the most influential of the pack of new users was MITS, the company
behind the Altair 8800. Now, we've talked about the 8800 on the show before. It's often considered
one of the earliest personal computers. At least, it was the first really accessible computer. The machine, released in 1975, was the
first microcomputer an end user could buy pre-assembled. Up to that point, consumer computers
came in the form of kits. And while this was great for hobbyists, it made for a steep barrier to
entry. The Altair 8800 was relatively simple to get up and running. And perhaps more importantly, it was cheap. A stock model Altair cost just over $600. If you wanted it in kit form, then that dropped
closer to $400. That was simply unheard of at the time, and it was only possible thanks to a
powerful and cheap microprocessor like the 8080. MITS was able to work out a deal with Intel to
buy bulk 8080s for just $75. And from there it was just a skip and a hop to making a viable personal computer.
But not all was well at Intel, and not all was well with the computer industry as a whole.
In 1973, the world started slipping into a recession.
Companies like Intel were hit pretty hard, as there were fewer and fewer customers for chips in general.
Some semiconductor companies would fold completely, while others had to adapt to survive.
For Intel, that meant laying off workers and restructuring.
Faggin found himself being pushed into management, further away from his beloved processor work.
He was getting further away from the job that he wanted to have.
Federico wanted to make processors, and he was realizing that at Intel he wouldn't be able to do that.
So in 1974, he quit his job at Intel to create a new company.
One where he wouldn't be stifled by an increasingly corporate business,
where he wouldn't have to spend months trying to get a new processor project authorized.
At first, the whole company had two employees,
Federico Faggin and Ralph Ungermann, another Intel engineer who was involved with processor development. The two dropped their jobs with only a faint plan. Faggin put it as,
I asked Ralph to go out for a drink and I said, Ralph, I would like to start a company to build
microprocessors. Are you interested?
And he said, yeah, let's do it. And that was it. I mean, basically, there was no discussion of what we were going to do, how we were going to do it, and so on. It was just an immediate response.
End quote. The new company would just make microprocessors. The details, those would come
later. Everything would be okay, right? But this left the duo in a bit of a bind.
They were starting a new company at basically the worst possible time.
One of the driving forces behind tech, especially in Silicon Valley, is venture capital.
These are essentially seed investments that come in before a company really gets off the ground.
If you want to get your startup actually started up,
you need to build a business plan,
maybe some prototypes,
and then go around pitching to investors.
The hope is that you find someone,
or usually a group of someones,
willing to put in some upfront cash to your idea
in return for later return on investment.
It's just part of the life cycle in Silicon Valley.
Usually, venture capital investment isn't that hard to come by, especially for a good business plan coming from someone with
a history in the industry. But in 1974, the economy was pretty dry, and so too were Faggin's
prospects of securing funding. But this is where Faggin lucked out in a really big way. He and Ungermann leaving Intel was a big deal, especially so soon after the launch of the 8080. Rumors and at least one
article in a trade journal were circulating. People were talking about the new company that
was forming. Even before they named the company, they were getting attention. To quote Faggin again,
Somebody called me up from Exxon and wanted to visit, and I told them that we were not ready yet. We had not yet decided what we were going to do, but if they would come around later on,
we'd gladly see them. So they called me soon after saying that they were going to be in the area 10
days later. That's when I got busy trying to figure out what to do. End quote.
Exxon, the oil company Exxon, was interested in investing. With the lack of venture capital,
this was probably the best and maybe the only shot that this new company had.
But interest doesn't directly correlate to money. The key would be building a solid plan and getting
a solid pitch out to convince Exxon to invest.
They simply couldn't afford to mess up this chance.
Just going with, hey guys, we used to work in Intel, but now we have our own office space.
We're going to make processors.
Well, that wasn't going to cut it.
The duo started to draft an actual business plan and some initial designs.
The idea being that they could start off producing a
single-chip microcontroller. This is essentially a small and simplified computer chip that's used
for processes like industrial control. After 10 days, there was enough of a plan to present,
and the group from Exxon paid a visit to the now-named Zilog office. By the end of the visit,
Zilog was able to secure initial funding. Once money was in,
Faggin started hiring on more employees, and his first choice would be to bring on Masatoshi Shima
from his old job at Intel. As it turns out, the business plan that Faggin and Ungermann threw
together may not have been the most well thought out. I mean, they did only have 10 days to make
it. Immediately, the company ran into a
supply chain issue. Zilog had two employees in a small office and a stack of investment funding,
but they didn't have a manufacturing plant to produce chips. To get anything made, they needed
to contract with a foundry, and that turned out to be a lot more expensive than they realized
initially. Microcontrollers are meant to be pretty cheap devices.
They get crammed into machines that are too cheap or too simple to warrant a full computer.
After talking to some manufacturers and running some numbers, Faggin realized that there was
simply no way they could make a cheap enough microcontroller to compete in the market.
So, Zilog would have to find a way to shift this initial strategy.
If it was going to
cost so much to make a cheap chip, then why not go all out? Intel had started slow and eased into
processors, but there was no reason that Zilog would have to do the same thing. The first real
project was initially called the Super 80, an 8080 clone that would be faster, more powerful,
and easier to work with. Processors were also a more expensive product, so Faggin figured it would be possible to compete on cost.
Perhaps more importantly, an improved clone of the 8080 would be really achievable for Zilog.
The company had two out of the four principal creators of Intel's chip. Faggin and Shima knew
every detail of the design, they knew
every problem with the chip, and they knew exactly what customers were complaining about.
In a lot of ways, the 8080 would end up being a first draft for Zilog. Using the experience
gained at Intel, plus new fabrication methods that were now becoming possible, a vastly
revised processor could be built. The name for this chip was pretty quickly changed from the Super 80 to the better known Z80,
and work would start on the Intel killer right away.
Zilog didn't have the money or the connections that Intel had,
but they had a lot more flexibility and a lot more time to get their chip off the ground.
The 8080 had to be built quick, and that led to a lot of issues.
At Intel, Shima had less than a year to go from rough plans to a working chip.
And that just made for cut corners.
Quoting Shima,
Sometimes it's said that a big success is obtained when developing the challenging new product three times.
And Z80 became the third microprocessor for me. So I was able to concentrate on the
functional specification, the hardware architecture design, and all of the design activities. I was
able to design whatever I wanted. And personally, I wanted to develop the best and the most wonderful
8-bit microprocessor in the world. Also, sparing enough time for the hardware architecture design.
End quote.
The layout of Intel's chip wasn't up to Shima's standards. Everything just had to be crammed onto the silicon. There wasn't enough time to spend planning how the 8080 would work with other chips.
There wasn't enough time planning in general. So with the luxury of time, Zilog could make a much
more fully realized processor. They could make the most
wonderful 8-bit microprocessor in the world. In February of 1975, work would start in full.
Over the next year and some change, the Z80 was slowly built. Early on, it was decided to make
the new processor 100% binary compatible with the 8080. This meant that any program that worked on the
8080 would run exactly the same on the Z80, no changes needed. It also gave the team at
Zilog a starting point for the instruction set.
But from there, the new chip diverges greatly. One of the major improvements that Faggin contributed
came down to power consumption. The 8080 had kind of weird power requirements.
It needed plus 5, minus 5, and plus 12 volt inputs to run. While that worked, it was far less
than ideal. For the Z80, Faggin was able to retool some of the underlying design to get the chip
running on just plus 5 volts. But that was only the start.
A number of simplifications were also introduced.
But most of the biggest changes were expansions to what the 8080 was capable of.
From the start, the Zilog crew knew that their chip was going to be more complicated.
So they didn't hold back on features.
They took a no-compromises approach, and that would pay off.
This is most easily seen in the registers used by the Z80.
Registers are essentially little chunks of memory that are internal to the processor.
They're used to store numbers and keep track of the current state of things.
Think of it as the data a processor is currently working on.
The 8080 has only a handful of registers.
Four general-purpose registers, a stack pointer to keep track of where the stack is in memory,
and a program counter to track where the current program is in memory.
Zilog's chip is a little bit different.
In this regard, it basically doubles down on everything.
The Z80 has eight general purpose registers, two index pointers, a stack pointer, and a
program counter.
With this many registers,
you get a lot more flexibility when it comes to programming the chip. But there's something
cooler at play here. The Z80's general registers were arranged as two matching banks, so in effect,
you can swap out the processor's current program state. This made complex things like multitasking a whole lot easier on Zilog's chip.
The extended register space alone made the Z80 much more flexible and much more powerful.
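A small C model helps show what that register swap buys you. The Z80 really does expose its alternate registers through the EXX and EX AF,AF' instructions; the struct-and-swap below is only a conceptual sketch of that behavior, not how the silicon actually does it:

```c
#include <stdint.h>
#include <stdio.h>

/* Conceptual model of the Z80's alternate register set. The real
   chip swaps its working registers with a shadow bank in a single
   instruction, which is handy for fast interrupt handling or
   flipping between two threads of work. This is only a sketch. */

struct reg_bank {
    uint8_t a, f, b, c, d, e, h, l;
};

struct cpu_state {
    struct reg_bank main;   /* registers the program is using now */
    struct reg_bank alt;    /* shadow bank waiting in the wings   */
};

/* Swap the whole working state, roughly what EXX plus EX AF,AF' buys you. */
void exchange_banks(struct cpu_state *cpu)
{
    struct reg_bank tmp = cpu->main;
    cpu->main = cpu->alt;
    cpu->alt  = tmp;
}

int main(void)
{
    struct cpu_state cpu = { { .a = 0x11 }, { .a = 0x22 } };
    exchange_banks(&cpu);   /* e.g. on entering an interrupt handler */
    printf("working A register is now 0x%02X\n", cpu.main.a);
    exchange_banks(&cpu);   /* restore on the way out */
    printf("back to 0x%02X\n", cpu.main.a);
    return 0;
}
```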
Other major changes included improvements to how memory was accessed, a slate of new instructions,
new interrupt handling, and a faster maximum clock rate. Looking over the laundry list of
improvements, the Z80 was just clearly better.
It could do everything that the 8080 could do,
plus a whole lot more.
It had less complicated power requirements,
and it was just plain faster.
The outlook got even better on release day.
In 1976, the Zilog Z80 processor was completed,
with samples shipping out to customers in March.
The retail price was just $200, cheaper than a release day 8080. By this time, the 8080's price
had dropped a little bit to around $150, so the two chips were of comparable price. An upgrade
wouldn't break the bank. The Z80 was a masterpiece made by the best minds in the field. But how did it fare in the market?
Did it actually beat back the 8080?
In 1976, Intel had accumulated quite a few loyal customers.
MITS was just one.
The 8080 was becoming the go-to for microcomputers.
There was already a lot of software and hardware built to work with the chip.
That was a massive boon for Intel,
but that existing market was also a boon for Zilog. While the Z80 didn't have the same
pin assignment as the 8080, that wasn't all that much of a problem. A lot of early
systems like the Altair 8800 were designed using a backplane. This meant that each component
was built on a separate board that plugged into a big bus. You could
very easily make a new CPU board and just upgrade from your old 8080 to a new Z80.
New systems would also be built for the Z80, the first big break being the Tandy/RadioShack TRS-80
line of computers. These would become one of the big three microcomputers in this first
wave of personal systems. But that would just be the start.
In the coming years, the Z80 would overtake Intel and become the dominant 8-bit chip in the market.
It was just a better chip all around. Since the same designers worked on the 8080 and the Z80,
it's easier to look at the latter chip as a proper successor. That's the short version,
but by the end of the 1970s, things started to get a
little bit strange. I want to close out with this. You've probably used a Z80 and not even realized
it. Microprocessors wouldn't keep selling at $200 or $300 for very long. At the end of 1977,
a Z80 cost just $30. By 1980, you could get a chip for $9, with the 8080 priced just a little
bit lower at about $6, give or take some change. Now, there's a whole lot of reasons for this
drop in price. Economies of scale made manufacturing chips just cheaper in general. The same goes for
changes in silicon technology. Semiconductor companies were getting better and better at turning out chips.
Another big factor was that these two chips weren't state-of-the-art anymore.
Intel would release their newest 16-bit processor in 1978.
Motorola and other players were entering the market with similarly advanced chips around this time.
8-bit processors weren't the biggest and best anymore.
To be clear,
a lot of consumer computers still used these older chips, but they just weren't the newest wave.
Zilog was also trying to move to the new, more modern processor designs, but they never hit it
as big as their first shot. The Z80 had the exact right team backing its development, it had the exact right features,
and it came out at the perfect time.
A follow-up, the Z8000, was released in 1979, but it was too little and too late.
The market for processors was a lot more saturated, and Intel was no longer their only competition.
The stars had aligned once for Zilog, but that wouldn't happen again.
And here's where things take a really weird twist. The Z80 has actually never gone out
of production. You can still buy a brand new Z80 processor for $4 today. And if you're
a little bit savvy, you can even get free shipping. The same is definitely not true
of the 8080. Intel discontinued that chip
in 1990. So what's going on here? Why is this chip from the mid-70s still being mass-produced
nearly 50 years later? As it turns out, not everything needs to run on state-of-the-art
hardware. One of the early adopters of the Z80 was the fledgling video game industry.
At first, arcade cabinets, Pac-Man is one example,
started using Zilog's chip. It was the right combination of price and performance to be used
for early games. You don't really need a full computer to play Pac-Man. You just need a little
bit of logic that can handle events relatively fast. Over the years, this would evolve to the
home market. In 1986, the Sega Master System was released, and it sported a Z80 processor under the hood.
Its follow-up, the Sega Genesis, even had a Z80 onboard for backwards compatibility and a little audio processing.
And this is where we really get to the real legacy of the Z80.
That's embedded systems.
If you aren't familiar, then think of it as a single-purpose computer.
These usually show up inside machines that need to be made smarter,
but don't need a full general-purpose computer built in.
Think of something like a fancy CD player, a router, or maybe a really nice kitchen appliance.
In some cases, embedded systems are built into these kinds of devices to make them do more,
but other times, using a tiny computer and a bit of programming is just easier than designing custom circuits.
While the applications of these types of systems are highly varied, there are a few underlying factors.
Across the board, embedded systems tend to be cheap and simple.
No one wants to put hundreds of dollars of silicon into a really nice alarm clock,
but you may be willing to drop $4 to make your life a little easier. And by that same token,
you aren't going to need a very powerful processor for most embedded systems.
And that's exactly the niche that the Z80 shines in. Only a few years after release,
you could get one of Zilog's chips for next to nothing. But it was still a capable and powerful processor.
And with incremental improvements to manufacturing chugging along,
the Z80 would only become a more viable option.
Sure, you can get an 8080 for about the same price.
But Zilog, they just made a better chip.
The other huge factor comes from third parties.
A lot of different manufacturers
actually build Z80s, even to this day. And ironically enough, the processor has actually
been cloned quite a bit, so you can get generic processors that are just renamed Z80s for even
cheaper than the authentic article. The ready availability of this archaic processor, combined
with its cost and functionality,
have kept it alive outside the computer market proper.
I think that Shima said it best.
He wanted to make the most wonderful 8-bit processor in the world.
And he did it.
Today, the Z80 lives on inside everything from remote controls to toys,
toaster ovens to modems, and even in a few old computers if
you know where to look.
Alright, that does it for today's episode.
The 8080 didn't start out as ambitious.
In fact, it started off as an upgrade to older hardware.
Intel would create a great processor at the right time to entangle themselves in the
oncoming microcomputer revolution. But just as Intel was taking a dominant position in the
processor market, it lost some of its best assets. In the grand scheme of things, competition from
Zilog wouldn't spell doom for Intel. But it must have hurt to lose a big chunk of the team that
invented the microprocessor. And while Zilog didn't ever rise
to the scale of Intel, its processors live on in some unexpected places. Moving forward, Intel
would have a new team of developers. Faggin and Shima weren't going to return, but their mark had
already been made. Intel has this habit of reworking its ideas into new products, and we can
already see that as early
as the 8080. Despite all its improvements, at the end of the day, it's still just an
upgrade to the 8008, which itself is just a clone of the Datapoint 2200 terminal. Even
the 4004, a thoroughly unrelated processor, had an effect on the design of the 8080. We
can see a chain starting to form,
one that will grow to link modern-day systems
all the way back to the late 1960s.
The 8080 is just going to solidify a lot of those early links,
and moving forward, we'll see that chain grow.
Thanks for listening to Advent of Computing.
I'll be back in two weeks' time
with another piece of the story of computing's past.
And hey, if you like the show, then there's now a few ways you can support it.
If you know someone else who's interested in computer history,
then why not take a minute to share the show with them?
You can also rate and review the podcast on Apple Podcasts.
And if you want to be a super fan,
you can now support the show through Advent of Computing merch
or signing up as a patron on Patreon.
Patrons get access to early episodes, polls for the direction of the show, and other assorted perks.
You can find links to everything on my website, adventofcomputing.com.
If you have any comments or suggestions for a future episode, then go ahead and shoot me a tweet.
I'm at Advent of Comp on Twitter.
And as always, have a great rest of your day.