Advent of Computing - Episode 105 - Apple's Growing Pains
Episode Date: April 2, 2023

The Apple III was a pretty slick machine... in theory. From a lack of launch software to straight-up hardware failures, Apple's third computer didn't really win in the marketplace. Why was that? Was the machine set up for failure from the start? Was its case really designed before its motherboard? When it comes to the III there's a surprising amount of folklore to untangle.

Selected Sources:

https://archive.org/details/sim_byte_1985-01_10_1/page/166/mode/1up?view=theater - Interview with Wozniak that covers the III

https://www.digibarn.com/collections/systems/appleIII/sandersinterview.html - Sander discussing the project

https://archive.org/details/apple-design/page/n14/mode/1up?view=theater - AppleDesign

http://www.applelogic.org/AIIIDesignBugs.html - AppleLogic
Transcript
When I was growing up, I didn't really have access to good internet at home.
It's not because I'm all that old, more that I just grew up out in the sticks.
We had dial-up basically up until my last year of high school.
Because of this limitation, I had somewhat of a setup going on.
When I was at school, I did have access to fast internet.
At least, it seemed fast compared to my old modem.
So whenever I could get my hands on a computer at school,
I'd just download piles and piles of files.
I'd get all the hits, you know, disk images, nice Linux ISOs, MP3s,
and my most coveted target, emulators.
I've always been enamored with the idea of virtualization and emulation. You know, a program that can pretend to be another computer.
That's just cool, there's no other way to put it. This obsession, plus the whole internet situation, created a perfect storm. Turns out I wasn't a very attentive student in this period of my life. The best example of this is my high school English class.
One semester, we had class in this room where you could check out MacBooks. And there was Wi-Fi.
As I'm sure you can guess, I didn't learn very much that semester. I spent
most of that period surfing the net and messing around with old computers. I had this thumb drive
that was loaded up with all of this stuff. It had various PC emulators, Mini vMac, LisaEm,
VICE, some random Apple II stuff, enough to keep me occupied while my classmates, well,
learned the finer points of the English language. To my English teacher, if you're out there,
I am very sorry. I have gotten better. I really just like messing around with these old machines.
At least, most of them. There were some stinkers in the mix.
One that I kept trying to understand was this weird machine called the Apple III.
At the time, there was only one way to use this machine, at least outside of actual hardware.
It was this emulator called Sara.
And it only worked on Macs.
So this period of class was the only time I could actually run the program.
The issue was, well, the 3 didn't seem to have any interesting software.
It had this weird operating system with menus.
That was kind of neat, but it didn't really hold my attention that long.
It also had a spreadsheet, I think.
So what's the deal here?
Usually Apple stuff is pretty interesting.
Was I just missing the point?
Welcome back to Advent of Computing.
I'm your host, Sean Haas, and this is episode 105,
Apple's Growing Pains.
In classic fashion, I've decided to be
just a little too late and just slightly off topic
for MARCHintosh.
But hey, Apple's Apple, right?
It's all the same hardware, at least I assume so.
Today's episode is about the Apple III, the attempted
successor to the Apple II. Now, Apple may have had some hugely impactful machines, but mixed in with
the wins are some profound losses. The III represents this to a T. I really like examining
the flops of computer history, especially when
those flops come from well-established players. That said, going into this, I don't know all that
much about the 3. So let me start with the common rendition of the story. The Apple II launched a
fledgling company straight into space. That was a good start, but you can't just rest on your laurels,
especially when the computing industry is moving ahead without you.
There has to be a follow-up.
You have to have constant improvement.
From this spirit, the Apple III was born.
This was a bigger, more powerful, and more professional version of the Apple II.
However, there was a fatal flaw. Steve Jobs' hatred of fans. Jobs insisted that the three
should rely on passive cooling and should be totally silent during operation. As such,
there was no fan built into the computer. This led to crippling overheating
issues that, during prolonged use, would toast the machine. Sales would reflect the design flaw.
At least, that's the story that I'm most familiar with. Like I mentioned back in the hook,
at one point I tried to get into the Apple III. At least, I tried emulating it.
But there's kind of an issue when you're going back to these more obscure machines.
There's this profound lack of software.
There weren't really any unique games on the system, just business software and file management stuff.
While cool back in the day, that's not really fun to mess around with
on an emulator. I don't think anyone's psyched to open up an emulator and, you know, do their taxes.
So is this lack of software due to the machine's failure in the marketplace, or is it part of some
deeper issue surrounding the 3? That's a lot of negative stuff.
There has to be something good in there, right?
The Apple 3 did have some interesting features.
The new machine had a new operating system that was a lot more complicated than anything
Apple had previously developed.
It was faster than the Apple II. The III even had hard drive support before the IBM PC line did. So there are things to love about the III, but can they outweigh the
issues? Now, before we jump into the episode, I want to give my usual plug here for Notes on Computer History.
The ongoing project does go on, but we're getting really close. Another article just
made it past editing, so this weekend I'm going to be throwing that into the nice
typeset outline. If you haven't heard already, Notes on Computer History is my attempt to create
a publication for discussing the history of computing. I'm trying to
turn this into a community project that's accessible to anyone who wants to write about
the topic, not just academics or people who've published before. If that sounds like you,
and I'm guessing it does if you're listening to Advent of Computing, then I would love to receive
a draft. Right now, I'm looking for one or two
more articles to round out the first issue, so this is really a good time to get in and
kind of save the day for me. If you want to submit, then you can get information on that process
on the website for the project. It's history.computer. Anyway, let's get on with the show and discuss the mysterious Apple III.
At the end of the 1970s, the Apple II was Apple.
They were a company with only one computer in production.
This may sound modest, but Apple was doing quite well for themselves.
The II had been a huge success, and it was continuing to move numbers,
especially among education and hobbyists.
However, there was another market segment that was interested in the 2.
Offices.
You see, the Apple 2 was very much a hobby machine.
It was an evolution of the Apple 1, a computer built by Steve Wozniak in his
garage. That's probably about as hobbyist as you can get. Despite all the polish, despite all the
improvements, the 2 was still meant for home use by early adopters. That's why it was a bit of a
surprise when it started showing up in businesses.
Part of this was, at least ostensibly, due to VisiCalc.
This was the first spreadsheet program for personal computers.
Released in 1979, VisiCalc could turn an Apple II into, essentially, a whole accounting department.
Or at least it could streamline that type of work. Apple was starting to encroach on the
traditional grounds of companies like IBM, but it would go deeper than just spreadsheets.
In the PBS documentary, Triumph of the Nerds, we get a scant mention of an interesting phenomenon.
Jack Sams, an engineer at IBM, explains this interesting trend.
When speaking on the Apple II, he recalls,
There were suddenly tens of thousands of people buying machines of that class, and they loved them.
They were very happy, and they were showing up in the engineering departments of our clients
as machines that were brought in because you can't do the job on your mainframe.
Businesses were using Apple IIs, and probably other personal microcomputers, in conjunction with more traditional business machines. They were augmenting the functionality of a big mainframe
with the convenience of a little desktop computer. If Big Blue was noticing this,
then you can be darn sure Apple had already finalized a few internal memos on the phenomenon.
This represented a growing market segment,
one that Apple really wanted to exploit.
The problem was, they were a one-computer company,
and that one computer was only incidentally useful in an office.
A successor for the two was in order, and this machine would be designed from the ground up
to fill this new niche. How would Apple go about upgrading the two for the business market
specifically? Well, that would get really complicated. Around this time, Apple was
changing in a profound way. Gone were the days of Wozniak hacking together the next cool project
in his garage. Serious money had started flowing into the company. And that would lead to a new
outlook on things. An interview with Wozniak appeared in the January
1985 edition of Byte. In this interview, he discussed this transitionary period. To quote,
We had become a huge business success in 1979. We had really made it with our floppy disks and
VisiCalc, and it looked like we were going a long way. So we decided it was time to start
putting together a real company, a big company. We needed to start staffing up and hire a lot
more engineers. So we set the Apple III project in motion. End quote. The successor to the Apple II,
the company's golden goose, was planned and developed during this period of
growth. It would be the first project by the new Big Apple. Woz goes on to explain that Apple was
straight up hiring new people to work specifically on the three. That included everyone from
programmers to managers. There would, of course, be some serious implications to this type of growth.
We'll get to that.
But first, on a purely technical level, how did the 3 seek to update the 2?
Some choices here are really obvious.
One of the most popular peripherals for the Apple II was the Disk 2.
It's a first-party external 5.25-inch floppy disk drive and a matching expansion card.
As we discussed back in the Canon Cat episode, there were some issues with how expansion was handled on this early computer.
Jef Raskin, one of Apple's first employees, just didn't like the fact that end users were expected to slot in the expansion card for the 2 directly onto the computer's
motherboard. He thought it was messy and exposed an end user to just too much of the machine's
internals. I'm fairly certain that Raskin had no direct involvement with the Apple
3. That said, someone involved with the project shared his opinion here. The new machine was
designed with an internal floppy disk drive. This would come pre-installed. That served the dual
purpose of adding a feature that the 2 sorely lacked, while also removing the need for expansion muckery by the end user.
This is just an easy win right out the gate.
Out of the box, the Apple III booted into an 80-column text mode.
This was another really easy win.
The Apple II natively supported 40 columns of big blocky text. That
was just the status quo for home micros in the era, but that's also not much space to work with.
This gets even worse when taken in the context of business. If an office had a computer,
then it was most likely accessed via a set of terminals.
So for most serious users, their lived experience was life on a terminal, which usually meant
80 columns of text.
That was the standard in that specific market.
If you wanted to compete in the realm of business, 80 columns was expected.
This was a known issue with the Apple II, and yet another example of the hobbyist nature of
the machine. You see, this computer had a lot of space for expansion cards. There were 8 internal
expansion slots, and the interface was very well documented.
As early as 78 or 79, third parties were selling expansion cards that added 80-column text modes.
The issue, well, was that software had to support these new text modes for them to be useful.
The first card, the so-called Sup'R'Terminal,
would get a lot of attention because it was eventually supported by AppleWriter, the first-party text editor for the computer. But that support wouldn't come until 1981.
Until then, you had to rely on kind of weird special-purpose software written for specific 80-column cards.
The Sup'R'Terminal also supported something that a stock Apple II just couldn't handle.
Lowercase text.
So, the point here is you could have a much more professional experience on an Apple II, you
just had to be very careful about how you picked your hardware and software.
It was not a turnkey solution.
But despite that, businesses still seemed interested in the Apple II.
These micros started showing up in the hands of IBM customers,
and even helped contribute to the drive towards Big Blue's own PC project. These
office-employed Apple IIs were, very much, suboptimal hardware. I think that's really
a testament to how badly businesses wanted smaller computers.
Now, there are also the usual gripes about the 2 that would be addressed. These are complaints that aren't special to this machine. It was a
relatively solid computer. The 2 ran off a stock MOS 6502 processor clocked at 1 MHz. That was roughly comparable with the pack of machines released in 1977.
The TRS-80 was technically a little faster, but it used a different type of processor, so
comparisons are a bit rough. A faster processor, or at least a processor with a higher clock rate,
would go a long way towards giving the new 3 an edge. The chip that Apple landed on was the Synertek 6502B, which clocked in at 1.8 MHz.
The B part just means that it was an updated version of the 6502.
The chip was fully compatible with the original.
Plus, it supported a range of faster clock speeds.
The B revision could handle up to 3 MHz. So why is it operating below that threshold? We can answer the how question much
more easily. Most microprocessors rely on external clock signals to set their speed.
That clock circuit just provides evenly timed
pulses to some pin on the chip. This can give engineers a bit of wiggle room. Usually,
a processor can handle a number of different clock frequencies. So why? Why make the Apple
3 slower than it could have been? My guess is that Apple was matching the CPU's frequency up to some internal bus frequency.
This is wild speculation, I will admit, but I have seen this pop up with other micros of the era.
Basically, all communication lines inside a computer function at some speed.
They require a series of pulses to tell them when to send or receive bits.
That's a requirement shared by the processor.
Most things in a computer have some frequency.
That ranges from memory chips to floppy disk interfaces up to serial ports.
If you match up all those frequencies, or at least have them as some
multiple of each other, you can greatly simplify your design. Software won't have to wait for slow
interfaces, for instance. You can also just tie in one central clock circuit that way.
Not only does that simplify your physical design, but it also saves a little bit of scratch.
But once again, this is my speculation.
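To make that speculation concrete, here's a tiny Python sketch of the clock-matching idea: every subsystem frequency derived as an integer divisor of one master oscillator. The 14.31818 MHz value is a common NTSC-derived crystal frequency, not a documented Apple III spec, and the subsystem list is made up for illustration.

```python
# Illustrative only: derive subsystem clocks from a single master
# oscillator, the design simplification speculated about above.
# 14.31818 MHz is a common NTSC-derived crystal; the divisors and
# subsystem names are made-up examples, not Apple III specs.
MASTER_HZ = 14_318_180

DERIVED = {
    "cpu":    MASTER_HZ // 8,    # ~1.79 MHz, close to the III's 1.8 MHz
    "video":  MASTER_HZ // 2,
    "serial": MASTER_HZ // 128,
}

for name, hz in DERIVED.items():
    print(f"{name:6s}: {hz / 1e6:6.3f} MHz (master / {MASTER_HZ // hz})")
```

With everything running at a multiple of one clock, a single cheap crystal and some divider chips can drive the whole board, which is exactly the cost and simplicity argument made above.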
The other usual upgrade was memory.
This is yet another place where we see the hobbyist spirit of Apple II users.
The machine had shipped with just 4 kilobytes of RAM.
That's spacious if we're talking 1950s standards.
Now, you could upgrade your 2 all the way up to 48k by adding chips directly to the motherboard.
In fact, Apple would sell you a maxed out computer if you had some extra money. But what if you wanted to go further? It was possible to bump
up the limit using expansion cards. I've been trying to track down the first expansion card
to do this, and as near as I can tell, it was Apple's own 16K language card. This card was
released in 1979. It added, as the name suggests, 16 kilobytes of RAM to the system. It also included
some software in ROM for programming in Pascal. That said, there may well have been earlier
third-party or homebrew memory expanders. The point here is that RAM is yet another area where
Apple II users were more than willing to cobble things together
to improve their computers.
Accordingly, the Apple III would incorporate expanded memory.
The base model was slated to start with 128 kilobytes of RAM.
That is a vast improvement over, well, four.
But that introduces a new problem. The 3 was still using a 6502,
albeit a slightly faster one. That processor has a 16-bit address bus. This is a lot of numbers
back to back, but stick with me for the payoff. When you have an address bus that size, you can only access
64 kilobytes of memory. That's just a limitation caused by the size of the bus. With 16 address lines you can only form 2^16, or 65,536, distinct addresses, which works out to exactly 64 kilobytes. So how did this new computer handle so much more RAM? The trick is something called
bank switching. In general terms, bank switching lets you swap between accessing different physical
banks of memory. You might have two different banks of 64 kilobytes. When you boot up, only the first bank is accessible. Then, via some dark and evil machinations, you call in a switch. Now, the second bank is accessible, while the
first bank is tucked away. As you swap around, you're still dealing with the same addresses,
at least your code is still dealing with the same addresses. It's just that you're now accessing different parts of your overall physical memory stores.
This can be done with a physical switch.
You could, in theory, wire up a very fancy switch to move your address and data bus from
one bank of RAM to another.
That, I mean, honestly, that'd be pretty
cool. I would buy one of those, but it's not really that useful. In fact, you usually don't
even want to switch all your RAM, just parts of it. The most common method is to have some chips
between your processor and your RAM that can
switch around parts of the address bus.
This lets you keep critical code in memory while also accessing extra RAM.
In effect, you get this region of RAM that can be swapped around while other regions
remain constant.
The Apple III's implementation of bank switching is, well, it's a little complex.
If you just run down the list of Apple III's features, then you see, oh yeah, it did bank
switching to support more RAM. But that hand waves away all of the details. Most of the III's memory
was set up statically as those constant regions
that don't change between switches. Only one 16 kilobyte chunk of RAM actually swaps around.
The switch is controlled by writing to a specific memory location. That's $FFEF if you're keeping
track at home. That's near the upper bounds of the 6502's address space.
By putting a value in that specific address, that magic number, you're telling the Apple III
which bank of memory you want to wire up. There are 16 possible banks, although not all are used.
That will depend on how much physical memory is installed in the machine.
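As a concrete illustration, here's a minimal Python sketch of that scheme: one 16-kilobyte window mapped onto one of 16 physical banks, selected by a write to $FFEF. The register address and bank count come from the episode; the window's base address is a made-up placeholder, not the III's real memory map.

```python
# A minimal sketch of the bank switching described above. WINDOW_BASE
# is a hypothetical placeholder; only $FFEF and the 16-bank count come
# from the episode.
WINDOW_BASE = 0x2000   # hypothetical start of the switchable window
WINDOW_SIZE = 0x4000   # 16 KB swaps around; everything else stays fixed
BANK_REG    = 0xFFEF   # the magic address from the episode

class BankedMemory:
    def __init__(self, num_banks: int = 16):
        self.fixed = bytearray(0x10000)  # the static regions
        self.banks = [bytearray(WINDOW_SIZE) for _ in range(num_banks)]
        self.current = 0

    def write(self, addr: int, value: int) -> None:
        if addr == BANK_REG:
            self.current = value & 0x0F  # 16 possible banks
        elif WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
            self.banks[self.current][addr - WINDOW_BASE] = value
        else:
            self.fixed[addr] = value

    def read(self, addr: int) -> int:
        if WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
            return self.banks[self.current][addr - WINDOW_BASE]
        return self.fixed[addr]

mem = BankedMemory()
mem.write(0x2000, 0xAA)       # lands in bank 0
mem.write(BANK_REG, 1)        # the "dark and evil machinations"
print(hex(mem.read(0x2000)))  # 0x0: same address, different physical RAM
```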
The switch is orchestrated by a so-called switchbox. Its specifics are described pretty
well in the article Bank Switch Razzle Dazzle by John Jeppson. It's from the August 1982 edition of Softalk. I'm pulling heavily from this because, well, personally, I'm trying to not read circuit
schematics this episode.
It's a self-care kind of thing.
Now, the switchbox is a chip that's wired between the 6502B and the Apple III's RAM
chips.
It intercepts requests from the processor,
grabs the bank value that's set in memory, then decides which bank to expose to the CPU.
That should sound a little complicated because, dear listener, it is.
One of the beauties of old microcomputers is their resounding simplicity.
You have a processor.
It's wired directly to memory chips, maybe to some I.O. stuff.
You might only have a few different types of chips on a board.
Your programming manual is basically a reference doc for the processor,
plus some documentation on how memory is laid out.
It's all pretty easy to grok. The 3, however, is approaching an interesting level of complexity.
Jeppson describes a handful of caveats that come with this switchbox. There are details about
remapping the 6502's zero page to different regions of RAM or intercepting different types of
memory operations. The bottom line is that the Apple III had a pretty complex way of handling
memory. For me, this evokes some of the first passes at multitasking. Now, bear with me a minute while I try to put this into a cogent thought.
CTSS, the Compatible Time Sharing System, was one of the first multitasking operating systems developed.
It was technically timesharing, but that's just a type of multitasking.
Anyway, one of the early stumbling blocks that was encountered had to do with memory.
How do you protect one process's memory from being stepped on by another process?
Part of the eventual solution was a special box that sat between the computer's processor and its memory.
This box was called a memory management unit. It handled things like protecting regions of memory and changing how physical memory mapped into the address space that was exposed to the processor. The 3's switch box is serving a
broadly similar purpose. You configure it to map physical memory into your address space.
That 16k region could be mapped to different physical chips.
I'm not saying that Apple had timesharing in mind here, or that there's even a comparable
level of complexity, more that their new computer was pretty different than their last machine.
The 3 had some complications and some quirks to its design. There are two main stories that get shared when it comes to
folkloric traditions around the Apple III. The first is what I recounted in the introduction.
Turns out Steve Jobs just didn't like fans, and the machine suffered because of it.
The second is the tale of the case. This intersects with the fan-free design. Supposedly, the case for the Apple III
was designed before the rest of the computer. This forced Apple's engineers to build a computer
that fit in this fancy case. This choice, as with the lack of fans, hamstrung the computer.
The III's case is a little strange. It's an almost all-in-one kind of design.
The case was constructed from cast aluminum, which, at least in theory, would serve as a
passive heatsink. If you've seen an Apple II, then just kind of imagine putting a bump on the top
and you get a 3. That bump is there
mainly to hold the floppy disk drive. The top of the bump was even divoted out to accommodate an
Apple-branded CRT display. I'd kind of recommend looking up the three to see an image because I
can't do it justice on the podcast. Despite the jokes, it does actually look pretty good.
Now, when the case story gets repeated, there's usually this air of incompetence or arrogance around it. Bumbling
managers had the case designed before the rest of the computer, and by the time it became a problem,
the project was too far along. Or, alternatively, some higher-ups, maybe Jobs himself, just insisted that the case
had to be just so. Is any of this actually true? The initial person in charge of the three project
was Wendell Sander, a long-time Apple employee, at least as long-time as you could be at a young
company. There's a central theme that comes up again and
again in interviews and retellings of Sander's time on the 3. Constraints.
Sander is often called the father of the Apple 3. This goes so far that the computer shared a
codename with Sander's daughter, Sarah. But he wasn't the key decision maker. The higher-ups at Apple
chose what the three needed to do. They worked up a rough design for the new product. Which means,
the machine was built by a committee of non-engineers. Sander then had to implement
all of these ideas into a new machine.
The laundry list of features that we've already covered didn't come from Sander himself.
Apple, as a corporate entity, decided that Apple III needed more memory,
an internal disk drive, 80 columns of text, and so on.
Sander's job wasn't to figure out what this new computer should be capable of,
rather just to figure out how to implement a list of requirements.
So the question becomes, was the case part of those requirements?
Tracking this down gets tricky.
The Three's case was designed by Jerry Manock and...
others.
There's actually a patent on the specific case design. According to legal documents, the design of, quote, personal computer was filed by Steve P. Jobs, Gerald C. Manock,
Dean Hovey, and David Kelley. The other names were, well, Steve Jobs plus some outside contractors.
The actual patent was filed in November of 1980.
That puts it a few months after the release of the 3.
In other words, this does not give us useful information as far as the timeline goes.
However, it does give us a route to an answer. We can use this
to find some more documents. The best evidence for the case story, and the best documentation about
all of this, is the book Apple Design, the work of the Apple Industrial Design Group.
It was actually published by Apple themselves. This text covers the industrial design
side of Apple's products. We're talking cases, basically. We get a whole section on the Apple
3's physical design, which gives us dates for the project. According to the text, the case was
designed between October 1978 and July of 1979. Was this early enough for
the story to ring true? Well, let me read you this passage from the book.
Quote,
Long before the circuit board had been laid out and components such as the floppy drive and power
supply selected, the Apple III's industrial designers,
Jerry Manock and Dean Hovey, had already designed the internal chassis and case,
without being certain that the components would fit inside. The engineers had already given Manock
the maximum board size they thought they would need, and Manock designed accordingly. But as the project evolved,
the engineers needed more space than Manock could give them. End quote.
This gives us, I think, a more interesting story than just incompetence or arrogance.
The engineers referenced here are Sander and his team. So there was some back and forth, but things broke bad.
Initial estimates were made, but they were wrong. The case was completed before the rest of the computer was finalized. Thus, we get into the situation where the case was already set in stone,
or in this case, aluminum, but the final motherboard just doesn't fit.
This could have been dealt with. In fact, it should have been dealt with. But there was an
issue. In a 1986 interview in The Three magazine, Sander explains that there was a time crunch.
The execs at Apple were worried that the Apple II was going
to stop selling, that their golden goose was set to croak. Management wanted to get out ahead of
this by rushing the 3 to market. Sander had around a year to get the computer up and running.
Once the case issue was discovered, there just wasn't enough time to revise the design.
The motherboard was still being worked on, it hadn't gone into production yet.
It was decided to tweak the internals to fit the metal case.
The die for the case was, quite literally, already cast.
We also have our friends at certain three-letter agencies to thank for the Apple III.
One was IBM.
Project Chess had been conducted with relative secrecy.
But, I mean, come on.
This is IBM we're talking about.
Suddenly, there's these massive orders of off-the-shelf chips.
Paper trails start showing up.
The industry has a way of talking.
There were at least rumors that IBM was planning some type of personal computer,
a business machine.
Apple, it seems, had heard these rumors.
IBM was about to enter the microcomputer market.
Imagine having a literal giant breathing down your neck.
IBM was called Big Blue for a reason.
It was huge.
It was the biggest computer company in the world.
There was this saying back in the day,
no one ever got fired for buying IBM.
To a great extent, that was true.
The B does stand for business, after all.
So if IBM was going to hit the small market, well, there was nothing Apple could do.
Their only hope was to get to market before the giant breathing down their neck.
The Apple III couldn't be postponed.
It had to be ready before the PC launched.
There are still more issues at play here.
The second three-letter antagonist is the FCC,
aka the Federal Communications Commission, or the Friendly Candy Company,
or Fred Charlie Charlie, if you like that name more.
According to Apple Design, there were concerns over the codification of electromagnetic emission guidelines.
Electronics have this fundamental issue where, no matter what,
there will be these spurious electromagnetic emissions.
It just comes with the territory.
Whenever you have a current passing down a wire, it can create this type of EM radiation.
It gets worse when you have changing current.
In this early era, home computers were notorious for interference issues.
The go-to story here, and the fun easy one,
has to do with the Altair 8800. One of the first practical programs written for the Altair
harnessed this excess radiation to play music on nearby radios. As the 80s dawned,
well, the FCC decided to step in and regulate this problem.
There are ways to lessen interference. The primary tool is shielding. That sounds a lot
cooler than it actually is. You can pretty easily block EM radiation using conductive metal.
The impact of this shielding can be minimal. You basically just put offending circuits in these
little metal cans. If you crack open smaller home micros, then you can usually see a can
wrapped around their RF modulators, for instance. Or you can just put a metal mesh around any wires that are sending out radiation. Or, you know, you could go the
Apple III route, I guess. Apparently, Manock just decided to over-engineer the case to provide
super-shielding. At the time, the FCC's EMI guidelines were still preliminary. So,
Manock went with overkill shielding, just in case the guidelines were
bumped up more than expected. This gave more incentive to leave the case alone.
So, when something had to give, it turned out that the computer looked pretty flexible.
The whole story around the case is much more interesting than just incompetence.
That's a part of it, sure, but there's all these external factors.
Mistakes were made for, I'd say, understandable reasons.
How does this end up altering the design of the computer itself?
Well, the Apple III basically had to get small pretty quickly.
To start with, the chips had to be packed in closer together.
On its own, that's not awful.
In fact, it could have benefits besides size reduction.
Pulling chips closer together means you can route shorter traces between pins.
That can reduce EM emissions.
This could also help with overall costs.
When you get a printed circuit board manufactured, you're usually charged by the area of the board.
Larger boards are, I think understandably, more expensive. Drop the size and you can save cash,
especially when you're operating at a large scale. That's under normal circumstances, though, when you have time to test things and work out the kinks in your design.
Apple was racing IBM. They didn't have time for any of those niceties. So we get to see
some downsides. One of the largest was heat. Recall that the 3 was passively cooled. There
was no fan to annoy Steve Jobs. Packing chips closer together meant that it was harder for
heat to dissipate. If there was a fan, then that would have been much less of an issue.
Fresh, cool air would waft in and hot air would be pulled out. Instead, the chips were left to stew in a cramped metal cavity.
The stagnant air would heat up with each clock tick.
The tight packing just aggravated the heat issues.
In general, heat would be the Apple III's biggest enemy.
This brings us up to the next part of the folklore.
The drop. Supposedly, the Apple III
would get so hot that it would straight up stop working. As the motherboard roasted, components
would warp, which eventually broke connections. Thus, the computer suffered the equivalent of a digital heat stroke.
The solution, according to Apple support, was to drop the computer from six inches in the air.
The sudden jolt would reseat the machine's chips, thus reviving it.
Personally, I just love this story.
Part of that is due to my experience with tech support. I don't think anyone likes calling tech support. You end up getting lost in a phone tree, wasting your precious
time only to be given useless advice. Have you tried turning it off and on again? Are you sure
it's an issue with our company's product? Have you regularly lubricated the main bearing with bear oil?
Needless to say, calling tech support is an exercise in frustration. So just imagine this
scenario. You buy a $7,800 machine for your small business. It works for a day, maybe even a week. You're halfway through payroll, and it dies.
After fiddling around, you give up and you call Apple support. Their response? Well, did you try
dropping it from six inches above a sturdy desk? Maybe that's when you start remembering that jingle about how no one ever got fired for buying IBM.
So I must ask, once again, is this true?
The story is told and retold all over the place.
The idea is that as the machine heated and cooled, either the motherboard or the chips or maybe some sockets would expand and contract. This would warp and
wiggle the chips out of place. Either solder joints would get stressed or chips would come
out of their sockets. But did this occur? There are definitely reports of Apple 3s failing,
but is chip wiggle to blame? Did Apple really recommend percussive maintenance?
After a bit of searching, I'm convinced that this is actually folklore, at least mostly. I haven't
been able to find any early reviews that claim Apple told them to drop their machines. We know
there were issues with the first batch of 3s,
and if a support tech told you to get physical, well, that would make for a really good article
in Byte or any other computer magazine of the time. The earliest mention of this phenomenon,
or at least the one that's cited that I can find, comes from a 2006 Low End Mac article.
This gives us some more details to search. The article specifies that Dan Kottke discovered the
solution when he, quote, picked up the machine a couple of inches in frustration and slammed it on his desk. That seemed to fix the thing. The article also claims
that Kottke kept his solution a secret. So that's nice. That's usually a sign that something's not
going to add up. I haven't been able to turn up any primary sourcing on the story. No accounts from Kottke, no mentions in interviews, no listings in technical documents.
Nada.
The best I can do is a post on Stack Exchange.
In it, a user who claims to be Ronald Nicholson, another early Apple employee,
says he worked next to Kottke.
He apparently witnessed the slamming event,
and Kottke even recommended the solution to him. This, I think, is a likely story,
but this isn't the kind of nice validated source I like to use.
So the best evidence we have points towards an in-house solution to the problem that,
as near as I can tell, was circulated around Kottke's workbench.
That's a little different than Apple support.
This all makes me a little skeptical of the drop story.
So let's back off and ask a more fundamental question.
Would repeated heat cycling actually wiggle components out of their sockets?
Would it break connections?
There's a really neat article on the website applelogic.org that examines the three's issues.
It even goes in-depth about the whole heat warping theory.
AppleLogic's take is that heat cycling alone
shouldn't be enough to pop chips. The logic behind this makes a lot of sense to me.
So here's the root of the issue. Many of the Apple III's chips are held in sockets.
These are little receptacles that allow chips to be swapped in and out. The socket itself is soldered into the
motherboard directly. Then the chip slides into the socket. Each of the chip's pins is held in
place by a piece of bent metal. It's a friction fit, but there's some springiness to it, so in
general you get these really nice solid connections. If we're assuming the whole heatwarp theory is accurate, this is
where the wiggle would occur. The chips or their sockets would flex just enough that pins would
slip out of their holsters. However, at least according to Apple logic, that exact behavior
hasn't been observed or documented. They make the point that the chips and sockets are both
made of the same materials, plastic and tin-coated metal. So even if there was heat-induced wiggling,
everything should wiggle the same. You get a net of nothing. The only issue I see with this theory
would come down to chip packaging.
Some chips' bodies are made from plastic, but others are made from ceramic material.
That doesn't warp under heat, at least not the heat that an Apple III could ever hope to generate.
So maybe a solid ceramic chip in a wiggly plastic socket could start to misalign,
but that still seems unrealistic.
That leads us to an interesting juncture. It seems like heat-induced wiggle isn't a very viable theory. At least, I don't buy it. Overheating can cause other problems, but
wiggle? That just doesn't seem like the primary mode of failure. Let's assume that the story about
Kottke's slam is true, that he was able to beat an Apple III back to life. Why would that work?
If there wasn't a wiggle, then why would that do anything? That Apple logic article is very much
the gift that keeps on giving here. It provides an alternative hypothesis that
I kind of like, but I also have my problems with. The idea is that there were poor assembly
standards. This takes a bit of hand-waving and guessing, so just keep that in mind as I present
this. That said, I do think it's worth considering. AppleLogic speculates that the Apple III was manufactured using an automatic integrated circuit inserter.
This is a fancy press that throws chips into sockets for you automatically.
We also know that Apple was rushing the III into production, so they didn't have time to test each unit.
Now, what if this insertion machine
wasn't set up properly? Maybe it wasn't dialed into the proper board thickness, or it wasn't
calibrated correctly, or maybe it was just plain broken. Depending on the scale of the operation,
there may have been multiple machines running at once. Maybe only one or two were on the fritz.
The result would be that some chips were improperly seated to begin with.
That would make the three fail to operate,
but it could be fixed by a strategic slam.
At least, maybe it could be fixed by a nice hit.
Let's take this a step further,
because I'd like to try and reconcile these two
theories. I think both issues could be at play. Imagine that the insertion machines are just
a little off, such that some chips are inserted enough that they make good contact, but
not far enough to be completely secure. We have tenuous connections. In that case,
a little heat wiggle might be enough to slip a pin just a little bit out of place. We could even say,
for the sake of argument, that the ceramic-packed chips were inserted improperly. Maybe that was
the calibration mistake in the automated insertion machines.
As the 3 heat cycled, the ceramic chips remained solid,
while their plastic sockets warped slightly.
This is a perfect storm kind of theory.
It's approaching Apple fanfiction, I do admit,
but I think it could offer a possible solution to the myth.
This failure mode could account for reports of Apple IIIs dying within hours of being flipped on,
and units that were dead on arrival.
Was there a perfect storm brewing around the Apple III?
It may sound far-fetched at first,
at first, but there were just a multitude of issues with the machine's production.
Maybe there were enough that weird failure modes were just destined to occur.
Another one of these problems was the real-time clock chip. The 3 was slated to use a National Semiconductor chip, the MM58167B. A classic. This is a real-time clock, or an RTC. It's a powerful addition to any
computer. Think of it like a wristwatch that goes on a motherboard. You can set the time
programmatically, then read out that time in a similar fashion. The real-time component means
that it keeps time even if the computer is turned off and unplugged.
I think it's clear to see how this is useful on a business machine.
Take VisiCalc as an example, the killer spreadsheet that every company wanted to use.
If you have a table of orders, it would be really nice to calculate how old each order is.
Or, maybe if you want to be fancier, set up a column that just says if an order is from the current quarter or from the last quarter. In order to do that,
or at least to do that automatically, you have to have some way to tell time. You need a way to keep
track of when now is. Normally, when you switch off a computer, that's it. The machine is dead
to the world. You can't have some special program that's always running in the background, since
the computer just won't always be running. The RTC solves this specific problem. The chip has
its own internal battery that's usually good for 10 or 20 years.
It will happily tick away even when the computer is packed up in some closet.
This turns out to be a very practical solution to a very specific issue.
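Here's a small Python sketch of the kind of bookkeeping just described, with the host system's clock standing in for reading the MM58167. On a real Apple III the date would come from the clock chip via the operating system; everything else here is illustrative.

```python
from datetime import date

# Python's date stands in for reading the MM58167 RTC. On a real
# Apple III this value would come from the clock chip.
def read_rtc_date() -> date:
    return date.today()

def order_age_days(order_date: date) -> int:
    return (read_rtc_date() - order_date).days

def in_current_quarter(order_date: date) -> bool:
    quarter = lambda d: (d.month - 1) // 3 + 1
    now = read_rtc_date()
    return (now.year, quarter(now)) == (order_date.year, quarter(order_date))

# A tiny stand-in for the spreadsheet's table of orders.
for order in (date(2023, 1, 15), date(2023, 3, 30)):
    print(order, order_age_days(order), "days old,",
          "current quarter" if in_current_quarter(order) else "older")
```

None of this arithmetic works without a trustworthy notion of "now", which is the entire value proposition of the RTC on an unplugged, intermittently used office machine.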
How could this possibly break bad, you may ask?
How could this RTC chip become an issue for the Apple III?
Its battery wasn't blowing up from heat cycling, I can tell you that much. The issue was, well, more dumb than anything.
It seems that some of the chips sourced from National Semiconductor were broken. They just
didn't work right. This can sometimes happen with batches of integrated circuits.
It's a notorious issue with microprocessors.
Sometimes complex chips will have production yields measured in tens of percents.
Stuff happens, but it can usually be worked around.
However, there's a whole timeline issue. The three had to be ready for market as
soon as possible. Testing was one of the many things that went by the wayside. From Sander again,
quote, we were growing, becoming a large manufacturer, and there were supportive quality
activities, component selection, and
things like that that had not grown up with the company when it should have. Therefore,
we were not in a position to do adequate component qualification and things of that sort,
and it caught up with us. I think if we had had another six to nine months,
those problems would never have appeared in the marketplace. End quote.
A little lag time for testing and debugging would have gone a long way. There just wasn't enough time to do things right and do them quickly. The RTC became a problem due to this rush.
If testing had been carried out, Apple would have noticed that they had some faulty chips.
They could have just ordered a new batch. There just wasn't time for the back-and-forth required to find and address the
issue. That said, there is some contention even around the RTC story. This gets, well, it gets
weird. Once again, AppleLogic presents an alternative theory that is worth at
least examining. In the push to get the Apple III's motherboard to fit into its case, extreme
measures were taken. We've already talked about chip density. That, on its own, was a bit of a
sore spot for the machine. The other compromise was circuit traces. On a printed circuit board, the little
lines that connect things up are called traces. These serve the same purpose as wire, just
with less dimensionality. When you're designing a PCB, there are certain rules you have to follow,
usually called tolerances, but sometimes just called rules or routing rules. This dictates
how close traces can be to one another, how thick or thin traces have to be, what kind of turns
traces can take, that sort of stuff. Now, these tolerances are dictated by manufacturing processes.
Something like trace clearances might be limited by really esoteric
things like the resolution of photo equipment used to make resist masks. Why does this matter
for the three? One way to make a board smaller is just to cram traces closer together and make
the very traces themselves thinner. You can't actually just smush everything down, because the chips
remain the same size. The actual holes for the chip's pins keep their layout. Thus, you have to
reroute everything. Using small traces gives you some cool tricks to work with. You can, for instance,
run a trace between a chip's pins. You can build up these really dense patterns of traces that, if done carefully, will shrink
your board down.
Of course, the episode here is all about trade-offs and bad compromises.
So what's the trade-off?
What happens if you aren't careful?
Well, that's up for debate.
This would, of course, lead to fragile motherboards, at least more fragile in comparison to boards with bigger, bolder traces. This could
render the board more susceptible to damage if, say, it was being rushed around a factory floor
or crammed into a case that's a little too small. How does this relate to the
clock chip? Well, the hypothesis has to do with our friend EMI, electromagnetic interference.
A trace can actually generate EM radiation just on its own. If you have a changing electric field
on the line, it will create a magnetic field. If that magnetic field is wiggling
around, say from variance in the changing current on the trace, then it can induce a new electric
field in a nearby conductor. That's exactly why the FCC wanted to beef up regulation around home
computers and other consumer electronics. The 3's case protected the outside world from these EM
wiggles, but it did nothing to protect itself. That could, in theory, have an impact on the
computer. The clock circuit held in the RTC was already a little fiddly, so maybe it was susceptible
to this kind of interference. Once again, this is one of those
perfect storm type of stories. Poor management and planning led to the PCB crunch, which led to
strange electromagnetic properties, which led to failing RTCs. Or National Semiconductor just
straight up sent Apple a bad batch of chips,
and they were never properly tested.
We can drop all the speculation because, at the end of the day,
the Apple III had problems on arrival.
The machine just had reliability issues, which may have been caused by any number of faults.
These zero-day failures gave the machine a bit of a reputation.
For a product to shine, you have to nail the first impression.
The 3, it missed that mark.
But hey, second impressions are good too, right?
Once you get past all the launch issues, the poor manufacturing and poor choices, what did users actually get
out of their 3? Once they had a functioning computer, whether that be through support or
getting lucky, then what did they actually get out of it? Software is kind of a big deal.
I like to think that's become a theme in Advent of Computing. A computer on its own
isn't useful. You need software to back that up. I probably need a little sign to tap whenever this
comes up. The point is, computers just need software. IBM was really, really smart about
this when they launched the PC. It was a new platform, so it
had to come out the gate with software to make it useful. On August 12th, 1981, IBM announced the PC
with a press release. This included a list of launch day software. We get all the hits. A text
editor, accounting software, some weird thing called The Source,
and VisiCalc, the spreadsheet of the decade. The PC would even launch with multiple operating
systems. At least, kinda. You got IBM PC DOS, CP/M-86, and the UCSD p-System. On day one, you could pick up a copy of DOS. The others would
ship within a few months of release. You could even buy a copy of IBM Microsoft Adventure,
a rendition of Colossal Cave Adventure. IBM really did it right. As soon as the PC was on shelves, it was ready to
go. You could go down and buy a PC with all the fixings and all the software you needed.
You could even grab a game to play when you got sick of running the numbers. No one ever got fired
for buying IBM because, frankly, they sold the whole package on day one. They sold
you a solution. You weren't in a situation where you had to wait around for a text editor to be
released. You didn't unbox a pallet of PCs that inexplicably broke within the week.
If Apple was trying to beat IBM on their home turf, well, they had to get the software part right. They could at least do that much, couldn't they? I usually avoid
extended quotes, but I have to slip this in. This is from Steve Wozniak discussing the failing of the Apple III project in that Byte article I mentioned earlier.
Quote,
Originally, we planned to deliver four applications with the Apple III.
Word processing, a spreadsheet, business graphics, and a database program.
Steve Jobs' thinking at the time was people don't really want to buy a computer.
They don't want to know about microprocessors or cards or buses.
They want to buy VisiCalc.
They want to buy a solution.
So we were going to provide the four major solutions.
But because we were having problems managing the Apple III project while we were building our management structure,
we were only
able to deliver our operating system, SOS, and VisiCalc, which was done by Personal Software.
End quote. If the III had launched with that slate of software, then it could have done pretty well
for itself. Apple would have still had to deal with some of the hardware issues,
but they could get a handle on that eventually. With a solid lineup of launch software,
the Apple III could have been held up as a standalone solution to every office problem.
But, you know, that didn't happen. They only launched with a spreadsheet and an operating system. And don't get me wrong, VisiCalc is a great program,
but you can't use it to write a strongly worded letter.
As a result, on day one, this new computer just wasn't much of a solution.
It was just a computer with a cool spreadsheet program.
That's nice, but the Apple II already did that.
Why get the more expensive business machine when you can get by with last year's model, so to speak?
There is a possible backstop here that may have helped.
At least, there's the chance it could have helped.
The Apple III is actually backwards compatible with
the Apple II. At least, partly. The III can be booted up into a compatibility mode with certain
restrictions. This mode only emulates a stock-ish Apple II. You get a 40-column display, limited RAM,
and a disk drive. But that's about it.
On the surface, that might seem fine.
It's limited, but that's what you get with the older machine, right?
Well, recall all the upgrades the two users were buying up.
If you were seriously using an Apple II, then you probably didn't own a stock machine.
So sure, you could run old
software on an Apple III. That means that on release day, there were things like text editors
and games that would work on the machine. The problem was, you didn't get any nice features.
You could run a text editor, but only in 40 columns. You could play games, as long as they didn't need extra RAM.
The 2 emulation was nice to have, but it wasn't executed in a smart way.
There was a purpose to this hamstringing, though.
At the time, Apple believed that the 2 was about to die.
They were trying to discourage programmers from writing new software
for the Apple 2. So they decided to limit the Apple 3's emulation capabilities. Some old software
would work, but no new features could be accessed. The 3 was just a better computer, after all.
Apple was trying to push developers towards this new platform instead of giving them
a way to remain in the past. With that, I want us to turn to what software was ready in time
for the 3's release. VisiCalc and SOS. We all know VisiCalc. It is just a spreadsheet. I really
do mean that. A modern spreadsheet program is a pretty close
analog to VisiCalc. Simple, effective, boring. The more slick offering was SOS,
internally pronounced as sauce. You know, Apple sauce, a little bit of a joke. If we believe Wozniak, then SOS, or sauce, was the only part
of the entire 3 project that was done right. Woz's own words speak volumes here. When asked
by Byte, quote, is SOS really that good? Here's what he had to say, quote, I think it's the finest
operating system on any microcomputer ever. It's the
greatest thing in the world, but I wish we gave out listings of it, end quote. So why don't we
close out this episode of negatives with something more upbeat. To start with, what really is SOS
and how does it differ from prior Apple creations? SOS stands for Sophisticated Operating System,
and that's really what Apple was going for.
At the time, the Apple II didn't really have an operating system.
If you booted up a II, then you were dropped into a BASIC shell.
From there, you could program or load data off tape.
That's technically an operating system, just a really simple one. Once the Disk 2 was released, Apple users were
introduced to Apple DOS. That's a very simple disk operating system. You'd load DOS off a floppy
disk, then you could do all the usual BASIC stuff, plus a handful of disk
operations. A little nicer, but still pretty simple. The file system used here was also flat,
so no directories or anything. Simple is really the name of the game.
The DOS to SOS jump was a huge one. This newer operating system was, just like the name says, far more sophisticated.
For one, it was built from separate components that were loaded at boot time.
DOS kinda did that, but not in a very smart way.
Loading DOS just loaded some data from the boot sector on the disk.
You got a chunk of that disk into memory, and then that just handled everything.
That's fine, but it's not very sophisticated.
SOS was separated into three special parts.
The first part to load was the kernel, the main program that controls everything.
After that, SOS loaded up an interpreter and device drivers. The trick here is that each of those components was just known by
name. The Apple III had a small ROM program that, on boot, checked for a disk with an SOS signature.
If one was found, then the three started looking for sos.kernel, then sos.interp, and sos.driver. Since you're just looking for names, well, that adds some major flexibility.
There's also the difference between the DOS boot sector and separate named components. Boot sectors are, you know, you can deal with them,
it's just not as easy to deal with as files. Anyway, that's more an aside than anything.
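The boot flow just described can be sketched in a few lines of Python. The component names come from the episode; the signature check and the directory-as-disk layout are made-up stand-ins for however SOS actually marks a bootable disk.

```python
from pathlib import Path

# A sketch of the SOS boot sequence described above: check the disk for
# an SOS signature, then pull in components purely by name, in order.
# The "sos.signature" marker file is hypothetical.
BOOT_ORDER = ["sos.kernel", "sos.interp", "sos.driver"]

def boot(disk: Path) -> None:
    if not (disk / "sos.signature").exists():  # hypothetical marker
        raise RuntimeError("not a bootable SOS disk")
    for name in BOOT_ORDER:
        component = disk / name
        if not component.exists():
            raise RuntimeError(f"boot failed: missing {name}")
        print(f"loading {name} ({component.stat().st_size} bytes)")
    # Because sos.interp is located purely by name, any program shipped
    # under that name becomes a self-booting application.

# boot(Path("sos_disk"))  # point at a directory standing in for a disk
```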
The most obvious use of this flexibility is the ability to ship upgraded versions of SOS.
You throw a new kernel file on a disk and you're done. But that's not super cool.
You could already kind of do the same thing with Apple DOS.
To get a little cooler, you can take the plunge into the whole interp file thing.
This is just an executable file.
It just happens to have a certain name.
This means that you can have the three load up any program you want.
In this way, you can make a self-booting program that has access to the full SOS kernel.
You get all the fancy hardware management code you could ever want,
from RTC to disk stuff to memory management.
On its own, that's pretty swanky.
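And just to drive home how simple the self-booting trick is, here's the same toy model again, reusing the boot function and SOS_SIGNATURE from the sketch above. Using VisiCalc as the example payload is my own illustration, not a claim about how Apple actually shipped it.

```python
# Continuing the toy boot model: a self-booting application disk is just a
# disk whose sos.interp happens to be the application itself.
visicalc_disk = {"signature": SOS_SIGNATURE,
                 "sos.kernel": b"kernel code",
                 "sos.interp": b"visicalc",  # any executable can sit here
                 "sos.driver": b"console and disk drivers"}
print(boot(visicalc_disk))  # the program comes up with full kernel services
```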
That said, we can get cooler, and I'm talking ice cold. What's the deal with sos.driver? Simply put, it's the file containing all the hardware drivers for your
Apple III. And just like everything else, it's simply a file with a special name. It gets loaded
in on boot time. So here's the cool part about this setup. You can actually
install new device drivers. When you add some new hardware to your Apple III, be that an external
device or an internal expansion card, you can install drivers for that hardware to your boot
disk. Let's compare this to the Apple II, since that's our point of reference for the
digital progress in this episode. Drivers for the Apple II were, by and large, handled in hardware.
That might sound a little strange at first, but I assure you it does make sense. If an expansion
card needed special software to interface with the machine, then it would have that software stored in a ROM chip on the physical card itself.
Plugging in the expansion card essentially mapped that chunk of ROM into the 2's memory space.
For a simple example, let's look at the AppleSoft BASIC expansion card.
That was released pretty early into the 2's lifespan. It's a really
simple card that is basically all ROM. It just upgrades you from the machine's early Integer
BASIC to something fancier. That card is just a pile of ROM chips that gets mapped directly over
the region of memory that's usually reserved for the Apple II's internal ROM BASIC. But this card's
kind of a special case. The Apple II actually had special slots. Cards like the AppleSoft
expansion had to be placed in slot 0, which allowed for this type of memory overlay.
Other slots worked in a slightly different way. Each slot had its own chunk of memory that could be wired up to its own ROM chips.
Essentially, each card just had its own space in memory to store its driver.
Once plugged in, that code was instantly accessible.
It's a pretty slick solution, as long as things stay relatively small.
You see, each card could only bring along 256 bytes of code.
That's a workable amount, but it's still an annoying limit to deal with.
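For anyone wondering where that 256 comes from: on the Apple II, each slot n gets the single page of addresses from $Cn00 through $CnFF, and one page is 256 bytes. The hex layout is standard Apple II architecture; the Python below is just my way of showing the address math.

```python
# Address math for the Apple II's per-slot ROM windows. Slots 1 through 7
# each get one 256-byte page at $Cn00-$CnFF; slot 0 is the special case
# covered above and doesn't follow this scheme.

def slot_rom_window(slot: int) -> tuple[int, int]:
    if not 1 <= slot <= 7:
        raise ValueError("only slots 1-7 get a $Cn00 ROM window")
    base = 0xC000 + slot * 0x100  # the slot number becomes the middle hex digit
    return base, base + 0xFF

for slot in range(1, 8):
    lo, hi = slot_rom_window(slot)
    print(f"slot {slot}: ${lo:04X}-${hi:04X} ({hi - lo + 1} bytes)")
```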
What happens if your driver is just too complicated?
You could spill your driver over to a floppy disk,
but then you're outside standard operating procedure.
You get to a point where one floppy drive would always have to have your driver disk sitting in it.
That's just not very reasonable.
The same thing would happen if, say, you have a peripheral that doesn't use an expansion card. You'd have to keep the driver disk handy or just always have
one of your drives dedicated to a driver disk. That's annoying in some cases, but could also
be a deal-breaker for certain workflows. So while the ROM route works, it's not very scalable.
It doesn't work for all cases. SOS gets around this issue via driver files. Each
peripheral on the computer, from the keyboard all the way up to the disk drive, was controlled via
a driver. Each of those drivers was loaded at boot time from your boot disk. When you got new
hardware for your 3, be that an expansion card or some external device,
you'd also get a floppy disk containing driver files.
Simply throw in the disk, follow some instructions, and the drivers would be copied over to your boot disk. After that, your new hardware was ready to go.
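One last visit to the toy disk model, because the install step really is about that simple. Here I'm loosely modeling sos.driver as a dictionary of named driver blobs; the driver name fancy.printer is made up for the example.

```python
# Toy model of installing a new SOS driver. The transcript's point: sos.driver
# holds all the drivers, so adding hardware just means copying one more named
# driver into it on the boot disk.

def install_driver(boot_disk: dict, name: str, code: bytes) -> None:
    drivers = boot_disk.setdefault("sos.driver", {})
    drivers[name] = code  # copy the vendor's driver onto the boot disk

boot_disk = {"sos.driver": {"console": b"...", "disk": b"..."}}
install_driver(boot_disk, "fancy.printer", b"code from the vendor's floppy")
print(sorted(boot_disk["sos.driver"]))  # picked up by name on the next boot
```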
This provides some pretty obvious benefits, I think. Not only is there
a consistent interface for driver software, but you could also ship updates. Drivers could be
larger, more complex. The only downside is that you have to fiddle with a disk, and even then you
only have to do it once. The boot structure, the loading of special files, would carry over into other Apple products.
Look no further than 1984's Macintosh.
Maybe you've heard of it.
On boot, that computer looks for a system folder and loads its operating system from there.
It comes with all the same benefits of SOS's setup.
Even folders, a mainstay of many later Apple products, got their start in SOS.
So maybe Wozniak was onto something.
Perhaps this was the good part of the Apple III.
Alright, that brings us to the end of this episode,
closing out my slightly late and slightly off-topic pass at Marchintosh.
On paper, the Apple III sounds like a fantastic machine.
At least, the highlights sound fantastic.
It's a faster Apple II with more memory, better graphics, an integrated disk drive, and just a more polished design. It takes all the expanded features that Apple users
wanted and throws in some of the company's distinctive flourish. But look a little deeper
and things start to fall apart. There are reasons for the 3's shortcomings.
Most of the problems come down to growing pains.
The Apple that designed the 3 was a totally different company than the Apple that designed the 2.
They also chose to play a dangerous game by squaring up against IBM.
Sander was very right when he said that, given more time, this new computer could
have been a big success. But things were rushed, corners were cut, and that led to problems.
Throughout this episode, I've been getting flashbacks to another topic I covered,
Viatron. I actually like the whole Viatron story so much that I have one of their old stock certificates
hanging above my desk right now. I look at it from time to time as I record.
Viatron tried to go up against IBM around the turn of the 1970s. They were a smaller company
trying to break into the business space with a small computer. Their machine, the System 21, sounded
great on paper. It sounded like a real IBM killer, something that every small business in the world
should have at an affordable price. But things fell apart. In the case of Viatron, there may
have been some shady financial dealings involved. It turns out that fighting IBM, especially in the
latter parts of the 20th century, was a battle that just couldn't be won.
There's more to say about the Apple III. Soon after launch, it was revised. The RTC chip was
straight up pulled from the design. The motherboard was rerouted with different types of traces.
The internal power supply,
something that I didn't even have time to talk about, was swapped out for one that generated
less heat. Apple was seriously trying to fix their new machine. In 1983, a revised version,
the 3 Plus, was released. They were with the 3 for the long haul. But the disastrous launch
made adoption an uphill battle. Combine
that with a little thing called the IBM PC, and I think it's understandable why the 3 has
become a footnote. Besides, Apple had bigger hits in the future.
Thanks for listening to Advent of Computing. I'll be back in two weeks' time with another piece of
computing's past.
And hey, if you like the show, there are now a few ways you can support it. If you know someone
else who'd be interested in the history of computing, then please take a second to share
the show with them. You can also leave ratings and reviews on Apple Podcasts. If you want to
be a super fan, then you can support the show directly through Advent of Computing merch or by signing up as a patron on Patreon.
Patrons get early access to episodes, polls for the direction of the show, and bonus content.
You can find links to everything on my website, adventofcomputing.com.
If you have any comments or suggestions for a future episode, then go ahead and shoot me a tweet.
I'm at Advent of Comp on Twitter, at least for the time being.
And as always, have a great rest of your day.