Advent of Computing - Episode 43 - ENIAC, Part I
Episode Date: November 16, 2020

Completed in 1945, ENIAC was one of the first electronic digital computers. The machine was archaic, but highly influential. But it wasn't a totally new take on computing. Today we are taking a look at the slow birth of ENIAC, how analog computers started to fall apart, and how earlier ideas transitioned into the digital future. Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing
Transcript
Why would anyone ever want a computer? Or perhaps put another way, why bother even making a computer
in the first place? For us, that's a really easy question to answer. Computers are just plain
useful. They're great at crunching numbers, they're really good at getting work done,
and you can also use them to waste a lot of time. But I don't think a citizen of the 21st century
can really answer that question without a heavy dose of bias. We're living in the wake of a digital revolution. If you want to get
nitpicky, probably more like a few digital revolutions. Our world has been so fundamentally
shaped by computers that it's hard to imagine a world without these handy machines. Instead,
I think we should look to the past to try and
make sense of this question. But once again, we reach a problem. When was the first computer built?
Who first came up with the idea of a computer at all? In the course of this show, I've tried to
shy away from that specific topic, mainly because it turns into an exercise in futility. It gets into the strange territory
of defining what a computer is and isn't, which, believe it or not, isn't always that cut and dry.
Machines similar to computers have existed since antiquity. Even concepts like binary
number systems show up in ancient sources. All this is to say that we run into murky waters, and pretty quickly
we get into eras where there just aren't very good records or accounts of events. But I don't think
we need to go back all that far to find a good reason to build a digital brain. At least, we
shouldn't need to dive into ancient Greek or Chinese sources. We only need to go back as far as the 1940s. World War II,
for all its atrocities and horrors, actually made the perfect incubator for digital technology.
Wartime requirements made cracks apparent in existing machines, and wartime governments were
all too eager to throw money at any problem. This made a perfect storm for a flurry of development,
and in this storm, I think we can find a good reason to create early computers.
Welcome back to Advent of Computing. I'm your host, Sean Haas, and this is episode 43,
ENIAC part 1.
Now, the name should clue you in.
We're heading into another series.
This time, I'm going to be devoting two episodes to ENIAC, one of the first electronic digital computers.
And, I gotta say, I'm actually pretty excited to be getting back into this early era of computing.
As fun as the 70s and 80s were for chips and silicon, it lacks a
certain gravity, at least for me. Once we get back to the first half of the 20th century, we get to
examine more fundamental questions, and I think we really get to cut to the root of why computers are
the way they are. In this series, I'm going to try to answer the question from the top.
Why bother making a computer in the first place? But here's why this is a two-parter. The story of ENIAC is a long one, and I don't just mean that as in there's
a lot of information. The books don't really close on the machine until the middle of the 1970s,
and yeah, you heard right. People were still dealing with the direct repercussions of ENIAC
while the first microprocessors were starting to go on sale.
The reason for this long reach comes down to patent law. ENIAC was a groundbreaking machine.
It was crucial to the development of the field, but it actually came pretty close to destroying
the computer industry. Patents of the technology inside ENIAC end up leading to a very strange
lawsuit, one that would eventually put the basics
of digital computing into the public domain. Now, I'm going to stop before I get too ahead of myself.
To give a quick roadmap, this episode we're going to be looking at ENIAC, why it was made,
its development, and then talk about how it was actually used. The next episode, we'll dive
headlong into the legal battle that was sparked by the computer,
and the court case that very much decided the future of the computer industry at large.
So, without further preamble, let's take a trip back in time, to an era before bits and bytes
really mattered. Let's see why anyone thought that these computer things would be a good idea
in the first place. Let's start off with just some straight facts.
Math really sucks.
Now, let me be 100% clear here.
Mathematics is one of the most important inventions in human history.
As far as I'm concerned, it's up there with agriculture and maybe indoor plumbing.
But math also just sucks.
I don't mean this in an abstract sense.
I'm very much speaking from personal experience.
During undergrad, I managed to take an unhealthy number of mathematics courses.
By the time I graduated, I was actually only a few units away from double majoring in mathematics
and physics.
Once you get beyond algebra and into the realm of calculus or linear algebra,
things get pretty out of hand pretty quick. Theory-wise, higher-level mathematics can be fine.
A page full of symbols, variables, and unrecognizable notation may seem daunting at
first, but it's actually usually somewhat manageable. Years of education, practice,
and building up a certain gut-level intuition
make this kind of work doable. Paraphrasing from Grace Hopper, humans function as fantastic
symbol processors. But there are two big places where this all falls to
pieces. One is speed. It should go without saying that a complex problem, while doable, is usually only
completed with a lot of patience. The other issue is a little more surprising if you're not familiar
with the field. You see, people aren't really good at arithmetic. In the physics department at my old
alma mater, we always used to joke that once you take calculus, you suddenly lose any
ability to calculate tip, or even to do simple sums. It's a pretty common occurrence to see
someone up at a blackboard doing a derivation, and then they'll stop, look down at their hand,
and start counting on their fingers. All jokes and anecdotes aside, this is a very real problem.
Now, for most of human history, these issues haven't been a big
deal. Usually, day to day, you don't need to do massively complex calculations, and there's rarely
any time crunch to it. As the 20th century got going, things changed. And for our story,
the specific change came in the form of some annoying little books of tables.
Despite all outward appearances, waging war is actually a very technical and math-intensive discipline.
Especially so when it comes to artillery.
In the modern day, artillery is usually tricked out with computers and targeting systems, but that's a pretty new development.
During World War II, some guns on
ships and some on submarines started to get a little bit automated. But for the most part,
gunners had to rely on books of range tables. Now, in principle, a range table, sometimes called a
ballistic table or a firing table, is pretty simple. Each table shows the different ranges
and flight time estimates for a given
angle of fire. Some tables go a lot further in depth. They'll list things like drag,
adjustments for wind conditions, terminal velocities, you know, anything you need to
know to get a projectile to do what you want. In the absence of automated artillery, you kind of
need these tables to hit anything. And, well, the issue here is
already starting to come to the surface. These tables are pretty big. They often drill down to
the tenths or hundredths of degrees, and each fraction of a degree has a lot of columns.
You can find hundred plus page books full of these tables. And since each gun, charge,
and projectile acts differently,
you kind of need a table for each one of these possible combinations.
What makes this all the worse is that the math itself is pretty involved. If you have a passing
familiarity with physics, say high school class or something similar, then you should know that
usually problems are solved in a vacuum.
That is, physicists really don't like to work with drag, wind, friction, you know, any of the mess of
the real world. But you know who does care about those things? Anyone who wants to fire a big gun
at a target. In the absence of air, you can calculate most parameters for a firing table with pretty
simple algebra.
But adding in drag, and especially with the speeds at play here, things get gross pretty
quick.
You move from solving a simple equation, or sometimes really just plugging in numbers,
to solving differential equations.
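To make the jump concrete, here's a little sketch in Python. Obviously nobody in the 1940s had Python, and the muzzle velocity and drag constant here are made-up numbers, but it shows the difference between a one-line vacuum formula and the step-by-step grind that drag forces on you.

```python
# A toy comparison, with made-up numbers: range in a vacuum is one
# formula, while range with drag means stepping a differential
# equation forward in tiny increments, over and over.
import math

def range_in_vacuum(v0, angle_deg, g=9.81):
    # closed form: just algebra and one sine
    theta = math.radians(angle_deg)
    return v0 ** 2 * math.sin(2 * theta) / g

def range_with_drag(v0, angle_deg, k=1e-4, g=9.81, dt=0.01):
    # no closed form here: integrate step by step, which is the
    # kind of grinding arithmetic human computers did by hand
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= k * speed * vx * dt          # drag opposes motion
        vy -= (g + k * speed * vy) * dt    # drag plus gravity
        x += vx * dt
        y += vy * dt
    return x

print(round(range_in_vacuum(800, 45)))   # one quick calculation
print(round(range_with_drag(800, 45)))   # thousands of tiny steps
```

Every row in a firing table meant a run like that second function, done with pencil and paper.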
So you get all the fun and complexity of calculus, and then plenty of
error-prone arithmetic once you're done. The people who computed these tables were called,
fittingly enough, computers. Initially, tables had to be done by hand, which was slow and not
all that accurate. Humans are pretty error-prone devices, all things considered. As mechanical
calculators started to appear,
things got a little better, but the process was still largely manual. You can't really ask a
simple box of gears to solve an equation or run an integral for you. I think this brings us up
neatly to the 1930s and the introduction of some early mechanical computers. Now,
here's all the usual caveats. These aren't computers that we'd
really recognize today. There's nothing digital about them. Programming them would be a totally
foreign concept. And to put it a little more technically, they're not Turing-complete computers.
This is where we hit the very early preamble of ENIAC. And as far as I'm concerned, the best place
to start is in the middle of the 1930s, with John Mauchly, then a professor at Ursinus College.
Sometime around 1934, Mauchly started research into weather phenomena. Specifically, he was
trying to investigate the effects of solar conditions on weather. The calculations involved
weren't the same thing as firing tables,
but Mauchly's work suffered from a similar slate of issues. Kathleen Mauchly, John's colleague and
eventual wife, put it this way, quote, he observed two things about hand computation that started him
on a path for better, faster ways. One, the labor one does in using a desk calculator is mainly in the entry of numbers
in the keyboard. And two, any errors made are usually the result of transcribing numbers
incorrectly from the dials to paper, or from paper to the keyboard. Therefore, Mauchly's
first objective was to devise some equipment that would store any number once they had been
entered on the keyboard a single time, so that they could be used as often as desired thereafter. End quote. Just like with firing tables, a calculator helped.
But Mauchly was still bottlenecked by the human factor.
These issues would lead Mauchly towards early mechanical computers.
But here's where things get weird in a wonderful sort of way.
As I mentioned, mechanical computers existed, and they were already seeing some use.
While they did make complicated calculations more reasonable, they suffered from their own
unique problems. One was that they didn't have the same sort of memory that computers have today.
Numbers were encoded in terms of rotation on drums, or location on a plane, or, you know, some other physical sense.
The operative fact here is that these machines operated on analog data. Now, let me explain the issue with an example.
Let's say that you encode a number between 0 and 9 on a rotating shaft.
Each 36 degrees of rotation on the shaft would represent a new value, such that a rotation of
36 degrees would translate to 1, 72 degrees would be 2, and so on down the line. But this leads to
an interesting conundrum. Assuming you're dealing only with whole numbers, how would you interpret a 10 degree rotation?
Is that a zero?
Or is it a one?
The whole system breaks down because this analog nature just doesn't mesh well with
how we do math.
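Here's a quick way to picture that in code. This is just my own toy illustration of the decoding problem, not how any real machine read its shafts.

```python
# Digits encoded as 36-degree steps on a shaft: decoding means
# snapping a measured angle to the nearest step.
def decode_shaft(angle_degrees):
    return round(angle_degrees / 36.0) % 10

print(decode_shaft(36))  # 1, as intended
print(decode_shaft(72))  # 2, as intended
print(decode_shaft(10))  # 0 -- but maybe a slipped gear meant it as 1
print(decode_shaft(18))  # dead between 0 and 1; any answer is a guess
```

A ten-degree slip in a digital machine means nothing at all. In an analog one, it's a wrong answer.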
Mathematics is pretty discrete.
I mean, sure, you can have things like fractions and numbers between numbers, but at a certain
point you have to deal with discrete
steps. Analog systems function more on a spectrum. You can interpret an analog value as a discrete
number, but the two systems aren't really the best match. This becomes painfully obvious as
machines start to actually break down. Gears and shafts can slip, and that causes errors. Parts of your mechanical
beast can wear out. Friction between rollers can become inconsistent with time. You get to a place
where any carefully crafted designs may as well just be thrown out the window. But in the 1930s,
there weren't really any other alternatives, so Mauchly had to go down a relatively new path.
Eventually, he would land on a promising lead. That would be the vacuum tube. Reliable tubes
were pretty new, all things considered. They offered a largely untapped potential. During
the latter part of the 1930s, Mauchly started working up vacuum tube circuits for counting
and storing numbers. This was a step in the right direction, but ultimately, an incomplete solution.
It wouldn't be until 1941 that Mauchly was able to move forward to the next big step.
That year, he started working at the Moore School at the University of Pennsylvania,
and he met the second John in our story.
That's John Presper Eckert.
Now, Eckert was the younger of
the two Johns. By 1941, he had just graduated from the Moore School, but by that time, he was already
pretty accomplished. Eckert had spent time working on radar systems, so he was already well acquainted
with the practical use of vacuum tubes in the field. Specifically, Eckert was working on a
device to accurately and quickly
measure response times of radar pulses. The issue with existing technology was its relatively high
margin of error. Most radar systems of the 30s were analog. To get more accurate measurements
and reduce any errors, Eckert designed a digital replacement for the traditional analog circuits.
This brought him deeper into the fold of vacuum tubes and, eventually, digital numbers.
Eckert and Mauchly actually met in a summer class at the Moore School.
Eckert was teaching a series on electronics.
Mauchly enrolled because he wanted to learn more about electronics.
From there, the two entered a very long period of collaboration.
But Eckert wouldn't be working with radar systems for all that long,
and Mauchly wouldn't go down in history as a meteorologist.
As World War II expanded and more resources were poured into the war effort,
something unexpected happened.
Eckert recalled, quote,
What happened is that during World War II, a large number of guns, field pieces, were sent over to Africa.
Tables were sent with them, telling them where to set the dials on the gun to allow for the amount of wind blowing and the height of the elevation of the target,
with the expectation that the shell would land where you wanted it to when you followed the prescription in the book.
The shells did not land where you wanted them to.
End quote. For some as-yet-undetermined
reason, firing tables were starting to fail. Conditions on the African front weren't what
artillery expected. Winds were different, vegetation acted differently, and adding this
change in terrain to already-known error-prone math was a recipe for disaster.
Tables needed to be updated, that much was clear, and the process needed to be as quick
and accurate as possible.
Before I continue, a quick note about the sources I'm going to be using.
There were a lot of people involved with the creation of ENIAC, but here I'm going to
primarily be focusing on just one of them.
Mauchly died in 1980, and as near as I can tell, he didn't leave behind that big of a paper trail.
There is a collection of his papers at the University of Pennsylvania's archive,
but it's not entirely digitized. So, if there seems to be a profound lack of primary sourcing from Mauchly, well, there's the big reason. There's a lot more interviews
and primary sources associated with Eckert, so a lot of the quotes and initial sourcing is going
to be from his side of the story. Anyway, this is where the Johns really got deep into the world of
computing. The Moore School was one of the few sites in the states that actually had a computer.
Well, sort of a computer, at least
what passed for a computer in the 1930s. They had a differential analyzer. It's a very primitive
machine. Just as an interesting aside, the machine at the Moore School was modeled after work done
by Vannevar Bush, a frequent topic on the podcast himself. Now, a differential analyzer was a purely mechanical beast.
By setting up proper gear ratios, pulley placements, and frictional coefficients,
you could model an equation, rotate the shaft the right number of turns, pull that lever,
and input the numbers, and they'd run through the device and you'd get an answer on the other side.
It worked fine, for the most part, but the breakdown of parts, slippage, and errors that I've already covered were a big problem for the machine.
But for this specific case of firing tables, speed ended up becoming just as large of an issue as accuracy.
This showed up in two big issues.
Mechanical devices have a hard limit on how fast they can run, and it's a very physical
limitation. If you spin some gearboxes too fast, well, it's going to break. If you put too much
force on the chain, it snaps. So the Moore School's computer had a hard set upper limit for computation
speed. But the setup time was just as big of an issue. You see, you can't really program a mechanical computer in any modern sense of the term.
To set up a new problem, you essentially rebuild the machine.
Gear ratios have to be updated, shafts and pulleys moved around.
The whole process could take weeks or more to complete.
So yeah, you have a computer.
But it's slow.
It's inaccurate, it takes weeks to program, and
it can actually physically break down in front of your very eyes while you're using it.
But hey, it's the best you got.
The most important part of the differential analyzer, and one of the more finicky components,
was the integrator.
This was a device that was able to calculate integrals, that's the area under a
curve. For differential equations used in firing tables, you run a whole lot of integrals, so
this mechanical integrator was a really important component. But as a mechanical device, it was
unreliable. It works by rotating a disc and ball mechanism. In other words, friction matters. And it matters a lot.
In the real world, friction doesn't like to play along. The net result was that mechanical
integrators run pretty slow. They have a pretty high margin of error. And if you don't maintain
them properly, they can just stop working. With fast and precise calculations being needed,
Eckert got called in.
One of his professors recommended that he adapt work on digital radar circuitry to the problem of making an integrator.
In short order, the Moore School differential analyzer actually started to transform.
It would become a slightly more digital machine.
Now's probably a good time to cut you in on a little secret.
I don't really understand how a mechanical integrator works. I know, I'm publicly outing myself as a hack and a fraud, but you probably already knew that.
As best as I can understand, a gear turns this platter.
Then a metal ball that's pressed against the platter rotates another gear.
By positioning the ball and platter in the right way, somehow you calculate
an integral. To me, it's a bit of a black box, but that's okay for now. What matters is it takes
input and it gives output, and those are encoded in terms of rotation. But data didn't necessarily
have to always be stored as rotation on gears and shafts.
What Eckert came to realize was that you can actually ditch a lot of the weird mechanical
parts of an integrator, and by doing so, you can make a much better computer.
Over the course of 1941, Eckert, working with the team responsible for the Moore School Analyzer,
started to make some upgrades. This was initially pretty simple.
Eckert was able to grab up some surplus from the military and other projects and use that to
electrify the computer. Quote, By the time we got done, I had put in a couple of hundred tubes into
the otherwise mechanical machine. Not only a couple of hundred tubes, but 13 amplidynes, and 15 to 20 servo motors,
and a bunch of photocells and various other power supplies and contraptions.
So the machine was no longer a mechanical computer.
The original machine could have been driven with a steam engine.
It was just mechanical.
You just had to turn it with a couple of motors.
End quote.
More motors were used to move input gears. Amplidynes were
used to turn rotation into electrical impulses. Photocells and vacuum tubes were installed to
read and move around rotational data as binary impulses. Everything was still analog, but slowly
the differential analyzer was starting to become a more modern device.
It was faster, and it was starting to become a little bit more accurate.
It was during this process that Mauchly became involved.
He and Eckert were already acquainted, but hadn't really worked together before.
Seeing the direction things were going, Mauchly made a recommendation.
Why not get rid of all the mechanical components?
And why not start first with the integrator? This was a massive step forward. You could call it a small revolution in itself.
However, I think it was a reasonable step to make. Mauchly suggested that an integrator could
actually be made using counting circuits. This was an arrangement of vacuum tube flip-flops that
increment in value every time a pulse was sent in. With a little bit of extra circuitry, mainly to control the
counting rate, you can actually use this to compute an integral. What makes this such a big deal was
that the vacuum tube integrator was a digital device. There's nothing analog or mechanical
about it. It's not the kind of technology that you can
power with a steam engine alone. It needs discrete electrical pulses. Eckert and Mauchly may have
also been the best team to attempt this. Both had a good deal of experience with vacuum tubes,
Eckert the most out of the pair, and both were familiar enough with analog computers to reshape
them. Dating back to his weather analysis work, Mauchly had built simple counting circuits.
He had even seen demonstrations of similar computing devices built by other researchers.
Mauchly knew that scaling up should be possible, and Eckert was eager to give it a try.
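For a rough feel of what counting your way to an integral looks like, here's a sketch. I want to stress this is a modern caricature of the idea, not Eckert and Mauchly's actual circuit design.

```python
# Approximating an integral with discrete steps: each "pulse" adds a
# small slice of the input to a running total, which is conceptually
# what a counting circuit driven at a controlled rate was doing.
def pulse_integrator(f, t_end, pulses_per_second=1000):
    dt = 1.0 / pulses_per_second
    total = 0.0
    for i in range(int(t_end * pulses_per_second)):
        total += f(i * dt) * dt  # one pulse, one small slice
    return total

# the integral of 2t from 0 to 3 should be 9
print(pulse_integrator(lambda t: 2 * t, 3.0))  # roughly 9.0
```

No discs, no balls, no friction. Just pulses and a counter.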
After a little bit of doing, and some strange work to connect the digital integrators up to the analog computer,
things actually started running.
This Frankenstein analyzer was a lot less prone to make errors.
It could be driven a lot harder before it broke down,
and it could turn out firing tables faster than any competition.
But for all that, it was still mechanical, and still subject to the laws
of mechanics. In other words, there was still a hard limit to how far the Johns could push this
machine. To go further, a shift was needed. The outline of a new computer was starting to form.
Whatever this turned into, the crew would need more than just military surplus and spare parts.
They would need support and they'd need funding.
So in 1942, Mauchly started drumming up interest.
Initially with a short memo titled,
The Use of High-Speed Vacuum Tube Devices for Calculating.
This was one of the earliest papers to suggest the creation of a fully electronic digital computer. That on its
own is groundbreaking. But Mauchly wasn't taking a massive leap forward here. If it seems like I've
been kind of using a slow stride to approach ENIAC, well, that's purposeful. No one knew what
an electronic computer would actually look like. So the researchers of the era were also
carefully moving in the right direction. Mauchly's 1942 memo is a perfect example of this.
It doesn't describe a modern computer. But it's getting there. It's really close. Primarily,
Mauchly described the speed and accuracy benefits that could be realized by dropping all the
mechanical components. He and Eckert had seen how promising vacuum tubes were firsthand,
and expanding from just a few electronic components to a fully electrical device,
that would kick things all the way up to the next level. But once again, this is not a full leap.
Mauchly describes his theoretical machine in starkly mechanical terms. From the memo, the device, quote, is then, in every sense, the electric
analog of the mechanical adding, multiplying, and dividing machines which are now manufactured for
ordinary arithmetic purposes. However, the design of the electronic computer allows easy interconnection
of a number of simple component devices and provides for a cycle of operation which will
yield a step-by-step solution of any difference equations within its scope. End quote. He calls
it an electronic computer, but Mauchly describes it using existing analog technology. Just like how
the new upgraded integrators that John and John were building replaced a mechanical component, Mauchly was describing a part-for-part replacement of existing mechanical computers.
Numbers would be digital, but stored as decimal, not binary.
There would be vacuum tube logic, but math operations would be built up separately.
To me, this memo sounds like really uncanny-valley kind of stuff. It looks
like a computer, it even sounds like a computer, but it's not exactly a computer.
Modern notions aside, Mauchly was putting forward a pretty big idea. There were other
electronic computer projects circulating, but it was still really early days in that process.
None had been completed, and most were still kept secret by various government organizations.
Mauchly was putting himself and Eckert on the bleeding edge of something really new.
The memo circulated around the Moore School and eventually found its way to the Aberdeen
Proving Ground. That was the U.S. military facility in charge
of testing ordnance, and it was the proving ground that had been dealing with all the issues around firing
tables. Needless to say, a futuristic electronic machine that could solve their math problems
sounded interesting. Aberdeen requested that the Johns get a proposal for the machine together,
and soon funding was flowing into the Moore School.
Project PX started to take shape. Today, this is better known as ENIAC. Now, this is where I ran
into another sourcing snag that really kind of frustrates me. I can't find a copy of the initial
proposal that becomes Project PX. Eckert mentions the proposal in multiple interviews, and to get
military funding, there would have to be some kind of proposal. But everywhere I've looked,
I just run into a blank. As far as I can tell, it's either never been digitized, or it's most
likely tucked away in some special collection somewhere where I can't find it. And after a
dive into some possible
resting places, mainly at the University of Pennsylvania's archive and a few collections
at the National Archives, I also hit a wall. If the proposal does exist, then it's probably in a
poorly labeled file folder. So if someone listening works at one of these archives or knows the
whereabouts of the missing proposal, hit me up.
I would very much love to see if this exists.
Anyway, the first draft would be interesting, but I think we can pull a more useful source here.
In 1945, Mauchly, Eckert, and a few other researchers wrote Description of the ENIAC and Comments on Electronic Digital Computing Machines.
This has the advantage of being completed after ENIAC was built, so the crew had the benefit of
hindsight. It means that we're kind of missing a little bit of a chunk of time there, but it's the
best we have to go on right now. The paper describes ENIAC itself, while also going into detail about
why the machine was
built in the first place. In other words, it gives an answer to our initial question.
Between this paper and some comments by Eckert, we can start to put together a pretty complete
picture. The first thing to note is that ENIAC, or at this stage Project PX, was a firmly military
research project. There was a level of secrecy around the whole
thing, a lot of the writing on the machine was initially restricted or classified, and there
were guidelines and timetables to be followed also. Some of that meant that everything moved
forward pretty quickly. As Eckert recalled, development started as soon as the proposal
was accepted. Quote, we were told that day to go ahead.
I came back and started having sockets screwed onto chassis and making sketches,
mostly writing a book of standards that we were to adhere to,
tolerances and putting some parts on life tests and doing various preparatory things on this.
The government then proceeded to send around a series of experts who questioned
all kinds of phases that we were doing. We were able to set them straight, so to speak,
or to quiet their worries." It makes good sense to send government goons around to keep tabs on
the Johns' work. ENIAC was a big investment for the military. Adjusting for inflation,
the initial funding was well over $7 million. But beyond just money, ENIAC was going to be a weapon of war. It wasn't out on
the battlefield, but its calculations were planned to drive the war effort forward.
It had to work, and it had to work well. Standards had to be adhered to, and it had to be completed
as soon as humanly possible. The slate of requirements led to a
machine that was one part radical and one part very traditional. Early on, it was decided that
certain features like input and output would make use of existing technology. Data I/O, for instance,
would rely on IBM punch card tabulators in order to just save time. But outside of certain reused components, ENIAC was a fairly new take on computing.
The overall design was modular, with each puzzle piece of the computer crammed into
a refrigerator-sized rack.
These racks housed a total of twenty 10-digit accumulators.
There were circuits for multiplication, division, square roots, and a printer, three function tables, and a random smattering of racks for control signals.
Everything was built using vacuum tubes and hand-soldered circuits.
In total, the finished machine weighed some 27 tons, it consumed over 150 kilowatts of power, and it contained somewhere in the neighborhood of 18,000 tubes.
All details aside,
it's a beast, but once we get over the sheer scale, the details actually become a little bit more interesting than anything else. Most notably, the laundry list of components has no mention of
memory, random access or otherwise. That's because, well, ENIAC didn't really have any
kind of memory. It was a different kind of computer than we're used to. The various
accumulators and buffers scattered around the system served as a type of memory. There was
also the series of function tables, which were kind of like a read-only memory. It's essentially a store of numbers.
Each discrete accumulator could hold up to a 10-digit decimal number, and certain inputs and
outputs had some tubes for buffering and latching to handle signals. But there was no addressable
data store inside ENIAC. Compared to later machines, ENIAC was simplistic enough that
it could get away without it.
The dedicated multiplication, division, and square root circuits are another kind of strange thing.
In more modern computers, the base unit of work comes down to logical operations.
You got AND, OR, XOR, and so on.
More complex operations like addition or multiplication are made up of
chains of these logical operations. This means that there's some lowest common denominator for
these machines. Think of it as shared building blocks. While this makes for an elegant solution,
it's not the only way to build a computer. In ENIAC's case, there isn't any one core building block.
Accumulators are their own special circuit. So are multipliers and dividers.
So why did John and John build their computer like this? I think it was partly to keep the
project on schedule and partly due to the experience with analog computers that everyone
had at the time.
This goes back to the strange combination of new technology and old ideas inside this machine.
Some information moved around as binary numbers, but the machine's accumulators stored decimal.
Math operations were also carried out in decimal.
There are glints of more powerful machines, but ENIAC is still very much in the mold of something older.
I think the accumulators are the best example of this.
These were fully digital devices. There are no moving parts, and they work off pulses of data.
But they were based off earlier analog accumulators.
Each was composed of 10 ring counters.
Each is a vacuum tube circuit that
can count from 0 to 9 and then send a signal when it overflows. These counters were then wired
together so that once one overflowed, the next significant counter would increment by one.
The differential analyzer had counters that worked much the same way, but just instead of pulses of
data, they ran off a rotating gear.
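Here's a little software caricature of that chain of ring counters. Again, this is my own sketch to show the logic; the real thing was racks of tubes.

```python
# One ENIAC-style accumulator as ten chained decade counters:
# each holds 0 through 9, and overflowing sends a carry pulse
# to the next counter up the line.
class Accumulator:
    def __init__(self, digits=10):
        self.digits = [0] * digits  # least significant digit first

    def pulse(self, position=0):
        self.digits[position] += 1
        if self.digits[position] == 10:   # the ring counter wraps...
            self.digits[position] = 0
            if position + 1 < len(self.digits):
                self.pulse(position + 1)  # ...and carries one pulse onward

    def value(self):
        return int("".join(str(d) for d in reversed(self.digits)))

acc = Accumulator()
for _ in range(1234):
    acc.pulse()
print(acc.value())  # 1234, built out of nothing but pulses and carries
```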
The other strange component of note is ENIAC's function tables. Now, I know what it sounds like,
but these aren't some physical store of functions. Instead, function tables were just banks of dials
used to input constant values. It could be addressed somewhat like a table, so in that sense it approaches
read-only memory, but the function table only stored constant values. There's no code and there's
no variables, so I guess you could call it ROM. I don't know if I would. Once we leave the realm
of arithmetic and counters behind, that's where we actually reach what made ENIAC different.
These are the somewhat obtusely named master programmer and cycling unit. This is what made
ENIAC more than just a really expensive calculator. Every operation on the machine was tied to a clock
signal that's generated by the cycling unit. In other words, everything on ENIAC had to happen in a discrete
series of steps. On each cycle, a pulse of a square wave was sent throughout the machine.
It told each unit to do its operation. This meant the problems on ENIAC could be tackled in a
totally different way. On an analog system, an entire equation has to be modeled by gears and pulleys. But on ENIAC, you actually build up a problem as a series of discrete steps.
Add these numbers together.
Move them over here.
Divide the result by two.
It may seem really basic, but this change meant that new types of problems could be
tackled by ENIAC, ones that a differential analyzer just couldn't handle.
The final piece of the puzzle is the master programmer, and this is what ties everything
together.
And it's also where stuff goes back to being really complicated.
Inside this unit was a series of counters and switches and, depending on the value of
those counters, data signals could be redirected around ENIAC.
It's a bit of nasty technology, but with some proper application, the master programmer could
be used to build conditional statements. And, most importantly, it could execute conditional loops.
For instance, you could tell it to subtract from a variable until its value reached zero.
In more technical terms,
the master programmer was the part that made ENIAC a Turing-complete computer. It's the reason we can
call ENIAC one of the first computers and not just one of the biggest calculators ever made.
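If you want the flavor of what the master programmer made possible, here's the subtract-until-zero example written out as code. ENIAC had no instructions or code like this, so read the loop as a stand-in for counters and switches steering pulses around.

```python
# The master programmer's trick, paraphrased: repeat an operation
# and route control based on whether a value has hit zero yet.
def subtract_until_zero(value, step):
    iterations = 0
    while value > 0:      # the conditional test
        value -= step     # the repeated operation
        iterations += 1
    return iterations

print(subtract_until_zero(91, 7))  # 13 passes before the value drains
```

No differential analyzer could make that kind of decision on its own.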
That's the basic layout of the system. It was hulking and complex, so it should be no surprise that its development was also a bit of an awful ordeal.
The project took off immediately after Eckert and Mauchly's proposal was accepted, but it wouldn't be completed until December of 1945.
The team at the Moore School was treading on new ground, so things went a little slow. But they also didn't cut themselves any
slack. Feature creep ended up being a big issue for the project. From Eckert, quote,
We originally expected to build the machine with 5,000 tubes. As it turned out, we built it with
over 18,000. But this isn't because we misestimated how many tubes it took to do something. It's
because the government changed from wanting one function table to three. End quote. And this is precisely why I want to see that initial proposal.
ENIAC grew as a result of Eckert, Mauchly, and all their co-workers finding problems with their earlier plans.
It also grew as a result of government intervention.
So by the end of 1945, a vastly more complex machine had been created.
But this leaves us in a strange place.
The astute among you may realize what's wrong with the timeline. ENIAC was meant as another weapon in America's quiver.
It was meant to aid in the war effort. But World War II ended in September of 1945.
That's just scant months before ENIAC was operational.
So with the war over, what exactly were they going to do with ENIAC?
Despite the war being over, ENIAC was still inherently tied to World War II.
You see, by the end of 1945, the U.S. was at peace.
The war machine was still grinding on, though.
Most importantly for our story,
the Manhattan Project was still in full swing. Even after the nuclear bombs were dropped,
scientists in New Mexico were still working to develop more deadly weapons.
This led to the first actual use of ENIAC. The computer was put to the task of running calculations for the hydrogen bomb project. The scientists from Los Alamos
actually took interest in ENIAC even before the computer was complete. Herman Goldstine, one of the researchers who had been working on ENIAC since the early days, provides a great description
of this period of development. In this section, I'm going to be pulling a lot of details from his
book, The Computer from Pascal to von Neumann. In those pages, Goldstine describes
the final months of Project PX as hectic. In addition to being over budget and off schedule,
visitors from other labs were starting to trickle in. Ostensibly, ENIAC was a secret project,
but news spread within the scientific community. There were other researchers working on similar
devices. Folk like Howard Aiken were really eager to take a look at the competition.
But of all the soon-to-be computer nerds coming through the Moore School,
John von Neumann was probably the biggest fish. And that's right, we are actually coming up on
the third John in this story. Goldstine recalls it this way, quote,
Very early, von Neumann realized the supreme importance that machine could have for Los
Alamos, especially for studying the feasibility of various ideas arising in that laboratory.
Accordingly, he urged the theoreticians there to examine the ENIAC with a view to using it
for their calculations.
They agreed with him, and plans were initiated to do a very large-scale calculation as a test
of the feasibility of a process that was then felt to be of great importance to Los Alamos.
End quote. The book's a little wordy, but it gets the point across. While ENIAC had become
a relatively well-known project, whatever was going
on at Los Alamos was still highly secretive. Goldstine didn't know what exactly Johnny von
Neumann wanted to do with ENIAC, but Goldstine did get to see the equations that von Neumann
wanted to run. There had already been plans floating around the Moore School to find a
project that would serve as a sort of burn-in test for the new computer. And from what Goldstine could see, John von Neumann's project
looked like a really good test. So it was agreed that a group from the Manhattan Project would get
the first crack at using ENIAC. We know now that this project would be calculations needed to
construct the first hydrogen bomb. By the end of 1945, ENIAC was
going to be a weapon of war. It was to become Death, Destroyer of Worlds, just tucked away
inside a laboratory instead of out in the field. With the setup out of the way, I think it's about
time to get to the actual brass tacks of how ENIAC was used. I've been skirting around how the machine was programmed because, well,
it gets really weird. You couldn't really write code for ENIAC. Instead, its collection of modules
had to be wired up with patch cables to slowly build up something like a program. Now, that
should make you a little uncomfortable, especially if you know anything about programming in general.
Modern computers are known as stored program computers.
This means exactly what it sounds like.
A computer usually has a program stored away somewhere in memory and then it executes that.
Not so for ENIAC.
Remember, it had no memory, at least not in the conventional sense.
Instead, it had modules and a big basket full of cables.
Each module in the computer had a series of input and output terminals.
For instance, each accumulator had terminals for setting each digit and then reading each digit off.
Once connected, data was sent around as a series of digital pulses.
So if you wanted to divide two numbers, you first had to wire up two inputs to the divider,
then wire the output of the divider into an accumulator to store the result.
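As a loose mental model, you can think of each module as an object with inputs and outputs, and a program as nothing but the connections between them. The module names and behavior here are my own shorthand, not ENIAC's actual units.

```python
# A toy model of patch-cable programming: no instructions, just
# data routed from one module's output terminal to another's input.
class Divider:
    def run(self, numerator, denominator):
        return numerator // denominator

class AccumulatorModule:
    def __init__(self):
        self.value = 0
    def receive(self, pulses):
        self.value = pulses  # latch the incoming result

# the "program" is just these few lines of wiring
divider = Divider()
result_store = AccumulatorModule()
result_store.receive(divider.run(144, 12))
print(result_store.value)  # 12
```

Rewiring for a new problem meant redoing every one of those connections by hand.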
Building up a complex program could take weeks of carefully designing wiring diagrams, and then maybe weeks more of debugging once you're actually in the lab.
Why would someone design a computer in this way?
How can a computer even exist without code?
Eckert described it as a compromise of convenience.
Quote,
During the war, speed was the main object, for reasons of timeliness rather than reduction of cost.
Not only was emphasis put on computing speed, but also on constructing the machine as soon as possible.
For this reason, extended research and development were curtailed in favor of employing ready methods which could be put into production quickly.
In this case, the ready method was to just not bother with something as complicated
as an instruction set. Analog computers were already being used in a similar way.
To run an operation, you connected a series of gears or hooked up some shafts.
Straying too far off the beaten path would have taken a lot more work.
But something that Eckert and Goldstine don't really mention all that much is the human
side of ENIAC. Since ENIAC was built so much in the image of the differential analyzer,
it was a good deal easier to mint new programmers. Project PX could get away with leveraging existing
human resources. This is where we reach another one of the strange and tragic parts of ENIAC's story.
A lot of the planning for programs like the hydrogen bomb simulations was carried out by researchers outside the Moore School, but the actual programming was done primarily by a dedicated staff, initially of six.
This first staff consisted of Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Fran Bilas, and Ruth Lichterman.
Prior to ENIAC, these women had worked as computers, that is, the people who crunched numbers.
Most of the first crew had also worked with the school's differential analyzer, translating
problems into gear and pulley systems that the analog computer used.
When ENIAC became operational, they turned into some of the first computer programmers in the world.
This group of six women were the ideal candidates for the role.
As computers themselves, they were already very familiar with the by-hand process
that was involved in the kind of problems ENIAC was working with.
And as technicians working with the differential analyzer, they knew how these problems could be mapped onto a machine. ENIAC
was just a new step in the process for them. In the lead-up to completion, the programming crew
received training both at the Moore School and at the Aberdeen Proving Ground. Once Los Alamos
dropped off completed plans, this group of programmers did the most difficult task
of converting equations into actual wiring diagrams for ENIAC. And once diagrams were
completed, the same group physically patched up the computer and operated the machine.
Ultimately, they were responsible for making ENIAC do anything useful. And to put it frankly,
they probably knew ENIAC better than anyone. So far, that all sounds great, right? Well, there's a bit of a problem here. At the time, computer work, that is,
running numbers, was considered a clerical job. To many, it was seen on the same level as
secretarial work. The prevailing notion was that it was just grunt work,
it's unimportant, or, as certain records labeled it, sub-professional.
I work as a programmer, so trust me when I say it's definitely professional work. It's not just
hard to be a good programmer. It's the kind of work that takes a certain passion and a certain flair to get it right.
In a lot of ways, it's more of an art than a science.
And these women were trailblazing that art.
But to the staff at the Moore School, and to government supervisors,
even the engineers on Project PX, programming was just, quote, women's work.
That's not just belittling,
it's absurd. The women programming ENIAC all held degrees in mathematics. Most of them had tried to pursue grad school, but were unable to. It was common for universities to just not allow women
to enroll in graduate studies. Despite roadblocks, this crew of six would be trailblazers
in computer science. McNulty invented the idea of subroutines while working on ENIAC. That's
something that's very core to how we program today. Snyder, later known as Betty Holberton, is an especially interesting case.
case. She would later work on UNIVAC, where she not only created the first debuggers,
but her work inspired Grace Hopper to create the first compiler.
More often than not, their work has just kind of been overlooked.
They were never given the same credit that their male counterparts enjoyed.
When ENIAC was publicly announced, photos of the computer started to circulate.
A popular shot shows Holberton and Bilas at a control panel,
wires in hand. Newspaper articles just noted them as models, if they noted them at all.
Still worse, due to the classified nature of Project PX and the biases of the time,
ENIAC's programmers just didn't get public credit for decades. I think that historically, programmers have kind of been seen as a second act to the hardware folk. And for the women who actually
made ENIAC run, this is doubly true. Anyway, by the beginning of December, the first practical
test of ENIAC was shaping up. The programming crew had transferred notes from Los Alamos into
actual wiring diagrams.
They'd wired up the machine and everything was ready to run.
The actual program is somewhat shrouded in mystery.
Goldstine admits that he's not 100% sure what the first test program was,
but from subsequent runs, the best guess is it was some kind of Monte Carlo simulation.
This is where ENIAC really gets a chance to
set itself apart from the differential analyzer. For our purposes, I'm going to give a truncated
explanation of the Monte Carlo method. All you need to know is that it's a way of simulating
something using pseudo-random sampling. You set up a model for some physical phenomenon,
you pick a random set of conditions
to sample, and you run the simulation. Usually to make things more efficient, the next set
of starting parameters will be based off previous outputs. It's something like a smart random
walk over a parameter space. By running this over and over again, a picture starts to build
up. With the simulation sampling possible
outcomes, you can zero in on favorable starting conditions. For the Manhattan Project, that meant
figuring out the conditions needed to detonate a hydrogen bomb. The takeaway here is that the
Monte Carlo method isn't straight mathematics. There's an element of randomness and an element of choice involved.
I mean, sure, you could sit down and run a simulation using some dice, a pen, and a stack of paper.
But that would be miserable work. I would wish that on no one.
An analog computer also doesn't actually help.
Analog systems don't really do random numbers.
And conditional branching isn't something that can be accomplished on analog computers. The Monte Carlo method only becomes viable if you have
access to a digital computer. In other words, ENIAC's the secret sauce that makes it all possible.
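To make the method a bit more concrete, here's about the smallest Monte Carlo program I can write. It estimates pi instead of bomb physics, obviously, but the shape is the same: random conditions, a check on each outcome, repeat until a picture forms.

```python
# Bare-bones Monte Carlo: sample random points, test each one,
# and let the aggregate converge on an answer.
import random

def estimate_pi(samples=100_000):
    hits = 0
    for _ in range(samples):
        x, y = random.random(), random.random()  # random starting condition
        if x * x + y * y <= 1.0:                 # check the outcome
            hits += 1
    return 4 * hits / samples

print(estimate_pi())  # hovers around 3.14
```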
With some really clever wiring, the machine can execute a series of steps, run an equation,
check the results,
make a conditional branch, and then do it all again. The Monte Carlo method is a fantastic
example of how ENIAC, and computers in general, really changed the world. ENIAC allowed von Neumann
and his colleagues to do something totally new. It let them ask new questions and get answers in new ways. ENIAC was really close
to early analog computing, but the changes at play made all the difference in the world.
By 1946, the first slate of calculations was complete. Dr. Norris Bradbury, one of the
physicists involved at the Los Alamos side of things, had this to say about the computer.
Quote, the calculations which have already been performed in ENIAC, as well as those now being performed,
are of very great value to us. The complexity of these problems is so great that it would have been almost impossible to arrive at any solution without the aid of ENIAC. We are extremely
fortunate in having had the use of ENIAC. End quote. The world was about to feel a massive shift.
Computers, like ENIAC, enabled this great change.
These early computers didn't exactly look like modern machines,
but the ideas and techniques they pioneered,
they set us down a path to the modern day. However, this path wouldn't always be a clear
one. And it wouldn't always be free of bad actors.
Alright, that rounds up the first half of our dive into ENIAC. As we've seen, the machine came from the same background as earlier analog computers.
ENIAC wasn't an entirely new beast.
More like a radical step in an evolutionary process.
But at the same time, this computer has a complicated history.
It was intended to be a weapon of war, and despite being completed
after World War II, ENIAC became the perfect tool for making a more deadly bomb. And to make things
all the worse, the crew of female programmers that actually made ENIAC run didn't get the credit
they deserved until decades later. So where do we stand on our question? Why bother making a computer?
Well, if we look at ENIAC, there are a few different answers. To Eckert and Mauchly,
ENIAC was created to push forward the state of the art. It was built to make complex math easier
to handle. But to someone like von Neumann or the government officials that funded Project PX,
ENIAC was built to help win World War II and any future wars.
It would take time, but those reasons would eventually change into something more recognizable.
I want to close this out by getting back to ENIAC's complicated legacy.
In 1947, Eckert and Mauchly filed a patent on ENIAC. This was done because the two Johns had decided to take computing to the private sector. They founded a company called the Eckert-Mauchly Computer Corporation. And armed with an expansive and broad patent, plus some of the best minds in
the field, something strange and interesting happened. A monopoly started to form.
Next episode, we're going to be taking a look at the fallout of ENIAC's patent,
and the ensuing court case that eventually made the computing industry possible and safe for the
rest of time. Thanks for listening to Advent of Computing. I'll be back with the conclusion to
the ENIAC series in two weeks' time.
If you like the show, there are now a few ways you can support it. If you know someone else who
would also like to listen to stories of computing's past, then why not take a minute to share the show
with them? You can also rate and review on Apple Podcasts. If you want to be a super fan, you can
support the show through merch or signing up as a patron. Right now, I'm about to
close up the poll on Patreon for the November bonus episode. So if you want to get in on that
poll, support the show, and get some cool bonus content, then go over and sign up. You also get
early access to new episodes and some other assorted goodies. You can find links to everything
on my website, adventofcomputing.com. If you have
any comments or suggestions for a future episode, then go ahead and shoot me a tweet. I'm at
Advent of Comp on Twitter. And as always, have a great rest of your day.