Advent of Computing - Episode 161 - The IAS Machine
Episode Date: July 13, 2025

The first batch of digital computers emerged directly following WWII. The hallmark of this generation is uniqueness: no two computers are the same. However, there is a machine that bucks that trend. The IAS Machine, built in Princeton in the late 1940s, served as the inspiration for at least a dozen later computers. But how similar were these Princeton-class computers? What exactly was so special about the IAS Machine? And how does good ol' Johnny von Neumann get tied up in all of this?

The Eastern Border map fundraiser

Selected Sources:

Bigelow Oral History - https://www.si.edu/media/NMAH/NMAH-AC0196_bige710120.pdf

Prelim IAS Machine Report - https://www.ias.edu/sites/default/files/library/Prelim_Disc_Logical_Design.pdf
Transcript
In 1953, the RAND Corporation turned on a brand new computer.
This was RAND's first real, honest to goodness, computing machine.
Up to that point, they'd been living out of a pile of tabulators and some IBM 603s.
Now the 603 was a pretty good machine, but it was just an electric calculator. That's
plain not on the same level as a stored program
computer.
This new computer would end up serving RAND very well.
It actually operated all the way up to 1967.
It saw expansions for everything from high-level programming to time-sharing.
It was a real beast of a machine, definitely more powerful than an IBM 603.
The computer's name?
JOHNNIAC.
The John von Neumann Numerical Integrator and Automatic Computer.
Now I know, bit of a strange name for a computer, right?
We don't really see a lot of machines named after people.
But the reason behind this name is pretty straightforward.
The JOHNNIAC was based off of designs for another computer called the IAS machine.
That computer was built by a team led by one John von Neumann.
Hence, the researchers at RAND chose to name their new machine in honor of von Neumann.
Despite his renown, Johnny von Neumann was actually a pretty humble guy.
It turns out he did not like the idea of a computer named after himself.
That complaint would spread over the grapevine from researcher to researcher until it reached
John Williams, one of the subgroup heads at RAND. Williams would send
this note over to von Neumann, quote,
Ed Paxson has relayed your blushing disavowals regarding the appropriateness of the term
JOHNNIAC. In this matter, your view just represents dispersion. If it helps any, recall that there
are lots of Johns in the world."
And so, the John von Neumann Numerical Integrator and Automatic Computer would keep its name.
How did RAND get here?
Most early computers were totally unique one-off machines, right?
Well, there is a computer that would break that pattern, and it was
built in Princeton.
Welcome back to Advent of Computing. I'm your host, Sean Haas, and this is episode 161,
the IAS Machine. Today, we're going to be looking at one of the more
influential computers of the early era, and perhaps I'll be ruffling a few feathers.
If you ask anyone to name the most important figures in the development of computing,
you'll probably get Turing, Hopper, Babbage, and von Neumann. Those are kind of the brand name pillars of the field.
If you're a long time listener, then you know my stance on all of this.
I firmly believe in the idea that history is shaped by trends and forces, not so much
great figures.
That's not to say that individuals don't have agency in how history unfolds.
Rather, I think that larger trends exert much more power on the flow of things than any
individual ever could.
We can see that in all the wild cases of independent invention that crop up.
I mean, I'm just coming off that whole JOSS and BASIC episode.
I think that's a perfect example of trends and forces at play.
Turns out that the trends were such that two nearly identical languages are developed at
the same time.
It can be very common and very easy to look at history as a series of great people that
do great things.
But that really doesn't explain the full picture.
And I just don't find it as interesting
as examining this larger picture.
That's kind of a long-winded explanation
as to why I don't often do episodes
that focus on a single historical figure.
You won't catch me doing an episode
just on the life and times of Alan Turing.
Again, not to say I don't like these larger than life figures.
I'm a huge fan of Grace Hopper, for instance, but she didn't create a compiler in isolation.
There were forces in the larger field that pushed and pulled and made it possible for
her to create a practical compiler.
That brings us to, well, I guess the figure at hand today, John von Neumann, or, as he preferred to be called, Johnny von Neumann. Have you ever heard the term hagiography? A hagiography is an
account of a saint's life. They tend to be very glowing. They talk up their subject.
There's nothing negative, just the good sides.
It's a way to confirm that yes, Saint such-and-such really does deserve such a lofty title.
When it comes to von Neumann, I think many do view him as something of a digital saint.
Most things you can read, most biographies and histories that talk about von Neumann,
are a lot closer to hagiographies than academic histories.
He's often cast as a pivotal, maybe the central figure in the development of modern computing.
So, why don't I talk about him more often? Well,
to put it simply, I don't really get the draw. Most of his work isn't directly
related to computing. He's what's known as a polymath. He does a lot of research
in a lot of fields. Computing is just part of it. When he does get involved in
computing, it's most often on the sidelines. He consulted
on ENIAC. He consulted on the design of EDVAC. The von Neumann architecture, his big named claim
to fame, is only named after him because he wrote a draft of a report that described the system.
It wasn't a Johnny original, it was part of a larger
project.
All that said, there is one computer that is an honest to goodness Johnny von Neumann
original. That's the IAS machine, built at the Institute for Advanced Study in Princeton.
This is a machine that's already featured on the show before. It was the setting for Nils Aall Barricelli's truly wild work on early genetic algorithms,
for one.
The IAS machine was also the inspiration for many, many other computers.
In the early 1950s, there was a whole pile of machines that were based off the designs
of the IAS machine.
They range from ILLIAC to JOHNNIAC to even some IBM computers.
But how closely do those imitations follow the original?
Today, we'll be giving Johnny some airtime, but with a little scrutiny, as always.
As I said, a lot of later sources speak very glowingly about his contributions, which makes
me want to tread with caution.
Just what was the IAS machine?
How did von Neumann's earlier consulting work impact the design of this new computer?
And how did its design spread?
Before we get started, I have my announcement corner.
First of all, I wanna plug a fundraiser
that friend of the show, Kristaps,
over on The Eastern Border podcast is doing.
Right now, Kristaps is selling
a bunch of old Soviet topographic maps.
They have maps of all over the world,
and they look pretty cool.
These are pre-satellite maps
so a lot of them are drafted and drawn. I'm gonna throw a link to the fundraiser
in the show's description. It explains all the details how you can get maps of
your own hometown taken by the Soviets. It's a neat prospect and it's going to a
good cause. Now second, I'm going to VCF West this year. That's actually coming
up pretty soon, so I had to get a few things in order. As such, I'm going to be skipping
out on next episode. Just going to take a little bit of extra time to get ready for
VCF West this year. The event itself is going on in Mountain View, California, August 1st and 2nd.
I'm going to be talking, one of those days, the schedule's not up yet, about my work on an
LGP30 emulator that's web native. It should be a pretty exciting time.
And as I always say, these vintage computer fests are wonderful events.
So if you're in the area, I highly recommend stopping by and saying hi.
Now with that out of the way, let's get into the IAS machine.
So, the stage dressing.
Digital computing as we know it isn't exactly born during World War II, but it gets really
close.
The US government is directly responsible for making a lot of this happen.
A combination of funding and concentration of human power around the war effort make
a lot of things get done very quickly.
And the biggie out of all of those was the Manhattan Project. The long shadow of the Manhattan Project is forever cast on the early days of computing.
They didn't have electronic digital computers out in the deserts of New Mexico.
However, researchers at Los Alamos could get ready access to some of the first digital
computers.
Advanced computational methods dreamt up at Los Alamos would
find their way onto these early machines, and as for funding, well, that would flow
pretty freely. As the war ended and the veil of secrecy lifted, we started to see
these knock-on effects of the Manhattan Project. One that may not have been
expected was that many of the country's
top scientists had been exposed to new ideas around computing. These very researchers were
then leaving the Manhattan Project and heading back to their old jobs. That would include,
of course, von Neumann. During the later stages of the project, he would even run one of the first problems on ENIAC.
Now, prior to the war, von Neumann worked at the Institute for Advanced Study, the IAS, in Princeton.
The IAS is an interesting organization in its own right.
It was founded as a mix between research center and a kind of non-conformist school in the early 30s.
The plan was to house all these heavy-hitting researchers,
give them a place to work, and then expose students to them.
Kind of a learn by osmosis practice.
The key here was the fact that the IAS
wanted to focus on theoretical work.
This was something of a think tank,
not really an inventor's
shop. As such, there was very little practical tooling around the place.
It was also founded at a highly opportune time. Scientists fleeing Europe just prior
to World War II were easily recruited to the IAS. This led to their roster including giants like
Einstein, Pauli, Gödel, and of course, our man Johnny. With the Manhattan Project
winding down, he would eventually find his way back to the IAS. Upon his return,
von Neumann had most recently been involved with the design of a computer
called EDVAC. And this is where we get into a classic controversy. I've told this story a number of times,
I'm going to recount it again in brief here because it's important for the
larger context. Now, EDVAC was designed at the Moore School. That's the same outfit
that built ENIAC. It was envisioned as the next generation of computers,
a machine that wasn't restricted by the rush of wartime constraints.
It would be programmable and have a shared memory space for both code and data.
It was designed by the same team that built ENIAC. The two leaders of that group,
John Eckert and John Mauchly, planned to go into business
building EDVACs, or at least something similar to the EDVAC.
Their plan was to get a patent on the design of EDVAC and use that as the foundation for
a new business, an Eckert-Mauchly Computer Corporation, if you will.
Of course, there were more than two Johns working on the design of EDVAC.
Many of the researchers who worked on ENIAC were involved, as was von Neumann.
The story goes that on the way back to Los Alamos, von Neumann drafted a report on EDVAC, which described its current design
that was being developed at the Moore School. That was then sent back to the school. Herman Goldstine, another contributor to EDVAC, typed up the
report. It was then distributed to a handful of colleagues and then spread
widely around the world. This led to three immediate effects. First, it upset
the two Johns to no end. The distribution of
the EDVAC report made the matter of a patent tricky, if not impossible. Second,
it popularized a pretty practical design for a computer, specifically registers,
stored programs, and a single memory space for code and data. And third, it wed von Neumann's name to computers.
Forevermore, this architecture,
specifically shared memory space,
became known as the von Neumann architecture.
The reason for that is simple.
Von Neumann's is the only name
on the draft of the EDVAC report.
This is because it was intended as an internal document. Johnny wrote the memo, he put his name on it, and it stayed on it.
But this wasn't really a von Neumann design. He had contributed, but it was a team effort.
It was just a quirk that named the von Neumann architecture.
So by 1946, Johnny's back at the IAS and my man wants a computer.
But that immediately led to an issue.
The IAS was a theoretical outfit.
A computer was bleeding-edge technology, but it was still a machine.
Here and for a lot of the episode, I'm working out of an oral history interview with Julian Bigelow,
who would become chief engineer on von Neumann's machine. As Bigelow put it, quote,
In fact, there is nothing vaguely resembling laboratory work here at the Institute,
nor has been since that time, and there was nothing of that sort before that time, and there was almost, I would say,
a loathing on the part of the ivory
tower component of the academic community that anything should take place
end quote. From the outside, this can sound ridiculous: the IAS is a
theoretical institution, so they will never do anything practical. But let me assure you, there is a huge rift in between application
and theory when it comes to academics. Here I can actually speak from experience. I'm not just
guessing. When I was in my degree program for astrophysics, I was very much on the theory side.
I worked with a lot of data from radio telescopes.
I would even write software that would pick apart
and analyze that data.
However, I would never touch a radio scope.
The folk that worked with radio telescopes
and would acquire the data that I used
were from a totally different discipline.
If you sat me down in front of
a radio telescope, I wouldn't know where to start. My skills and education are just non-transferable.
Even though I've worked very closely with the data that comes out of those scopes, and
I could tell you how those scopes work. The same is true for an astronomer who works with
those scopes and my theoretical work.
If I handed them my analysis software, they wouldn't likely know what to do.
But that arrangement works out because neither of us would ever be expected to be put in
those positions.
In academics, you kind of pick a side and stick to it.
There are very few that work on both sides.
In this early period, computers were dangerous to established norms because they straddled
that divide.
It's a machine, which makes it an application thing, but it's used to crunch numbers, which
is a theory thing.
To use one of these early computers, you need to be intimately familiar with its inner workings, which is all about electrical engineering and
the like. But you use that knowledge to write and test theoretical models and
complex mathematics. So where is it supposed to go? Where does it land? To the
larger community at IAS, the answer was clear. It's just not theory. But for von Neumann,
he didn't care. He saw computers for what they were, tools. That was true at the
Moore School and it could be true at the IAS. A computer lets you supercharge
your theory work. Von Neumann had been involved in just such an effort on
ENIAC. Near the end of the war, right as ENIAC came online, it was used to model nuclear fusion.
This was done using a totally new approach called the Monte Carlo method.
The program itself was assembled by programmers on the ENIAC team and a group of scientists from Los Alamos,
which included von Neumann. He saw, before his eyes, how a computer
could be used for pure theory, how a machine of pure application could be used to answer theoretical
questions unlike anything else in human history. It's not just that the argument of theory versus
application is kind of dumb. It's that for a computer, you really can have it both ways.
A computer is a superb tool for theoretical work,
and it's a superb tool for practical work.
You can have both embodied in this one device.
But then, how did von Neumann convince the IAS
that they needed to build a computer?
That's the cool thing about Johnny. He's a scientific genius and good at handling people. That is a rare combination.
After the war, he didn't have to come back to the IAS. He actually received a number
of competing offers. As George Dyson explains in Turing's Cathedral,
at least IBM, Harvard, and MIT were after von Neumann.
If the IAS wanted to keep one of their brightest researchers,
then they would have to let him build a computer,
and he knew that.
That didn't mean, however,
that everyone at the institute liked the idea.
The project would be greenlit, given space and funding.
It was called the Electronic Computer Project, and it was carried out in the IAS's basement.
This is, perhaps, one of the early origins of the proud tradition of programmers living underground.
And with that, von Neumann started doing what he did best, building a team.
However, even that would be a little rocky.
Johnny had a basic design for the machine, but he wasn't going to build it on his own.
As Bigelow recalls, quote,
Originally he intended to have the computer built by Presper Eckert of Eckert and Mauchly.
The original preliminary contract negotiations to get support for the computer had, in fact,
planned to do it with people of Eckert's group, including Eckert himself.
Subsequently, because the field was then growing with almost explosive rapidity, I suppose,
Eckert became difficult to get a hold of and did not show
up at the agreed-upon time, which made von Neumann quite irritated." That would have honestly been a
really good setup. Von Neumann had worked with Eckert before, so he knew his chops. Besides,
there just weren't that many folk in the world who had ever even
used a computer. Eckert and his cohort were some of the best and only experts in the field.
However, that dream did not come to pass. This was likely the result of a larger split between
Johnny and John and John. Eckert and Mauchly had commercial ambitions
and von Neumann was an academic.
In fact, von Neumann wanted computing
to be a purely open and academic field.
Eckert and Mauchly wanted to make money.
ENIAC and EDVAC would be the only time
the three Johns would ever work together.
Johnny, however, was undeterred. He always had a backup plan.
The team he assembled would be headed by Bigelow. Now, Bigelow himself was an electrical engineer
and a cyberneticist. In other words, the nightmare of the IAS. The man had all the tools needed
for application.
Despite the falling out with Eckert, von Neumann did manage to snag some talent from the ENIAC
team.
Arthur Burks and Herman Goldstein would come into the IAS fold to work on this new machine.
As with Eckert, these two were some of the few in the world who were experienced with
building computers, so they would prove critical to the project. Then we get George Brown, another
nightmare for the IAS. He was an electrical engineer from RCA, a company
that made and sold devices that they invented in labs. Talk about practice. The roster was filled out with more illustrious
engineers and mathematicians. In many ways, this mirrors the larger purpose of
the IAS, to provide a home for talented researchers. It just so happens that
von Neumann had very practical plans with his new group. What exactly did the
dream team come up with? Well,
that's the complicated part. The first thing to address is what this machine
was even called. We know it as the IAS machine, but during its construction it
didn't have an official name. Its first report calls the device an
electronic computing instrument. Eventually it's called MANIAC, but that's in reference to a duplicate machine built
years and years after the fact.
I'm going to be calling it the IAS machine, since that's what we usually call it today.
The other thing to address is originality.
Some will describe the IAS machine as a grand first, either as the first stored program
computer or the first machine to use the von Neumann architecture, or something like the
first practical computer.
None of those claims are true.
The IAS machine enters serious development in 1946, but isn't fully operational until
1951.
In that time, to use Bigelow's own words, quote,
there was an enormous amount of activity in at least a dozen or possibly
eighteen centers aimed at producing computing machines, end quote. Computers
are springing up everywhere in this period, from Whirlwind to the ACE, the Manchester Baby to MESM and BESM.
By 1948, we get the first stored program computer, which uses the von Neumann architecture.
This is due, in large part, to the spread of the EdVac report. There's a blueprint for a
computer in the wild. Even better, there's a blueprint for a reasonably good and cheap-ish
computer in the wild.
To me, this gives us two immediate lines of inquiry. First of all, given the context of
all of these EDVAC-like computers, was the IAS machine similar to EDVAC? I mean, von
Neumann did consult on that project after all.
And second, why did it take so long for the IAS machine to become operational?
To answer those questions, we need to get into the details of this new computer.
The machine von Neumann planned ends up being a little different than the actual machine
that's running in 1951.
Those differences will help explain a little bit of why it took five years for the computer
to become operational.
As such, I want to start us off with the earliest reports on the machine.
Things begin in 1946 with a paper titled, Preliminary Discussion of the Logical Design of an Electronic Computing Instrument.
It's authored by Burks, Goldstine, and von Neumann.
And do note here that all three of these authors worked on ENIAC and the EDVAC design.
The report describes in some detail the architecture of the proposed IAS machine.
This is where we get to immediately answer one question.
The new computer isn't just a dressed up EDVAC.
We're dealing with a substantially different machine.
It may be better to think of it as a refinement.
We should do the usual here and start with memory, but to
even get to that, we need to address some of the language used. 1946 puts us at the
very early era of computing. One hallmark of these early papers is the phrase
computing organ. As near as I can tell, this is a Johnnyism, or at least the earliest paper I've seen
that used the term organ is the draft report on EDVAC.
Von Neumann calls each distinct part of a computer an organ. Memory is called the memory
organ. Some logic circuits are called the organ of control. It sounds antiquated, especially
since we don't really have an equivalent phrase today. When we talk about memory, we
just say a computer's memory. I think technically you could say that organ is close to system
or subsystem, but I rarely see later papers use the term memory system. Quirks aside, what kind of memory
are we dealing with? The IAS machine was originally designed to use random access
memory, as in the computer would ask its memory organ for an address and the
organ would be able to serve that address immediately. That's how things work today, but for the period, this is pretty
new stuff.
EDVAC's design called for sequential memory. Specifically, the report recommended recirculating
delay lines. That was the more common approach in the 40s. Bits would circulate through this
buffer that was refreshed with each cycle. Only one address would be accessible at any one time.
So memory access was time domain.
It had to be timed to match that circulation cycle.
Random access memory, by contrast, doesn't have that restriction.
That makes it strictly better if you can get it to work at all.
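To make that difference concrete, here's a minimal sketch in Python. This isn't period code, and all the names are invented for illustration; it just models why a delay line can make you wait for your word to come around, while random access serves any address immediately.

```python
# Toy models of the two memory styles described above.

class DelayLineMemory:
    """Serial memory: words circulate past a read head, so only one
    address is visible per cycle. You may have to wait for yours."""
    def __init__(self, words):
        self.words = list(words)
        self.head = 0  # the address currently passing the read head

    def read(self, address):
        cycles_waited = 0
        while self.head != address:  # spin until the word comes around
            self.head = (self.head + 1) % len(self.words)
            cycles_waited += 1
        return self.words[address], cycles_waited

class RandomAccessMemory:
    """Random access: any address is served right away."""
    def __init__(self, words):
        self.words = list(words)

    def read(self, address):
        return self.words[address], 0  # no waiting, by construction

serial = DelayLineMemory(range(100))
value, waited = serial.read(99)  # worst case: wait nearly a full cycle
print(value, waited)             # 99 99
```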
A painful reality is that memory is probably the hardest part of building a computer.
Electrons don't want to stay in one place.
The IAS machine planned to use this new technology called the Selectron tube,
which would, in theory, solve that problem.
The Selectron was actually a totally new device that would be created
just for the IAS machine. RCA, who had recently set up shop in Princeton, agreed to design,
develop, and manufacture these devices. This was going to be a huge boon for the IAS team,
since they wouldn't even have to worry about the memory organ.
Some poor engineers over at RCA would do all of that for them and then deliver the working
organ, hopefully wrapped and packed with ice.
And the Selectron was a pretty slick device.
It stores data as charges on the inside of a vacuum tube.
Inside a Selectron is a grid of these little metal washers.
An electron beam deposits charge on those washers.
That charge can then be read off.
So each tiny washer is a single bit.
A Selectron tube looks like something straight out of a sci-fi film too.
It doesn't have a face like a CRT display.
Instead, the Selectron is a long tube with this grid positioned sideways near the middle.
During use, you can actually see which bits are on and which are off.
A charged washer will actually glow.
So if your computer is using Selectrons for memory, you could, in theory, directly read
the contents of the organ with your eyes.
But sadly, it just wasn't to be.
As the project spun up, the ECP team made really rapid progress, but RCA did not.
The issue came down to complexity.
Each Selectron was a very delicate and fiddly device.
It would have been one thing if the IAS just needed one working Selectron, but that wasn't
the case.
The computer needed 40.
The IAS machine was designed to use a 40-bit word.
That means that every location in memory would store 40 binary digits of data.
Further, the machine would have 4096 words of memory.
So in theory, you could just have one massive Selectron tube with around 163,000 bits of
storage.
That would mean 163,000 tiny washers on a giant pane of glass set in a monstrous vacuum
tube.
That doesn't work for, I hope, obvious reasons.
But let's just say it did.
That would lead to another issue, performance.
One of these tubes can only access one bit at a time.
You have to physically steer the electron beam over to the correct location and fire
it.
It can't read 40 bits at once.
So to read a full word from your mega tube, you'd have to make 40 sequential operations.
That's a tad slow, and that gets us back to the older sequential memory design.
The solution chosen for the IAS machine was what's called parallel memory.
In this scheme, each bit of a word is stored in a separate device.
You still have to do 40 different operations,
but they happen at the same time. Selectron number 1 can grab a bit, while Selectrons number 2
through 40 are also grabbing some bits for you. In that way, accessing a single address is done
much faster, since you're parallelizing down at the bit level, or you're doing it bit parallel.
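Here's a toy model of that bit-parallel arrangement, again in Python with invented names. Each of the 40 "tubes" holds one bit position of every word; a read touches all of them, but in hardware those 40 one-bit reads happen in a single step.

```python
# Hypothetical sketch of bit-parallel storage: 40 one-bit devices,
# where tubes[i][addr] is bit i of the 40-bit word at addr.

WORD_BITS = 40
WORDS = 4096

tubes = [[0] * WORDS for _ in range(WORD_BITS)]

def write_word(addr, value):
    for i in range(WORD_BITS):
        tubes[i][addr] = (value >> i) & 1  # each tube stores its own bit

def read_word(addr):
    # In hardware these 40 reads happen simultaneously, one per tube;
    # this Python loop just stands in for that single parallel step.
    return sum(tubes[i][addr] << i for i in range(WORD_BITS))

write_word(7, 0xDEADBEEF)
assert read_word(7) == 0xDEADBEEF
```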
The downside here is the change in scale.
You have to have 40 storage devices.
In the world of mass production, that's easy. A chip is a chip is a chip.
But Selectrons were all handmade at RCA's Princeton factory.
They were very delicate and intricate instruments,
and worst of all, the actual tube was still an analog device. At least half of it was.
Steering the electron beam was handled by sending in analog voltages. Those voltages
energized a set of electromagnets that moved the beam in the x- and y-axis.
That's the normal setup for all cathode ray tubes, no matter how exotic they are.
With the Selectron, there was, in theory, the issue of calibration. There are
physical locations where each bit must be stored. It's those little washers in
that little grid. The electron beam has to be adjusted to hit
those exact targets. It would be pretty easy for one or two tubes to be out of calibration,
or for one tube to require a different voltage to reach the correct locations for a bit. The
whole setup would be, well, fiddly as all get out. As Bigelow recalls, in 1947 the ECP team hit a turning point.
Quote, we had built and essentially solved the problem of building the arithmetic part of the
machine, inasmuch as we had a setup in which there were essentially 10 digits of the adder-subtractor
multiplier-divider unit operating. The whole thing was to be 40, and this 10-digit array was breadboarded.
By summer of 1947, it was clear that this was a successful piece of componentry, and
we turned more seriously to the question of how to realize a connection with a memory
system."
I realize I just said later people don't say memory system and Bigelow just says memory system, so
maybe I'm just not thinking right, but whatever. We get this combination of factors here. The
computing part of the machine was going better than expected. The memory part was going worse
than expected. It was time to make some hard decisions. During that year of development,
a great many things had changed in the field.
We're in this weird period where some things happened fast and others...
kind of took ages to develop.
Memory was one thing that was changing fast.
Over in England, a new computer called the Manchester Baby was under development.
It would win the race and be the first operational stored program computer.
And it used a totally new device for memory.
We know it today as the Williams-Kilburn tube,
or just the Williams tube.
A Williams tube operates on a very similar principle
to the Selectron.
Bits are stored as regions of electrical charge
inside a vacuum tube.
Everything is read and written using an electron beam. The key difference is a Williams tube can
use a normal CRT. In fact, early examples were made using surplus radar scope screens.
Now, crucially, this is the big distinction. The Williams tube didn't need the special washers
to store charge.
Data was stored as a pattern of dots and dashes projected onto the screen by an electron beam.
Those dots could be anywhere on the screen as long as the beam would always come back
to the same spot.
It just needed way less precision
engineering than the Selectron. So it was decided to drop RCA's tube and switch
to this newer design. This was early enough in design that it didn't really
cause a disruption. It was more that the IAS team was just taking up this new
phase of development. The actual process for that was, well, very period.
From Bigelow, quote, I subsequently went abroad. I took a two-week vacation and went abroad and
visited those various centers and learned a lot at each one, and learned also how different the
approaches would be at each one. Then at Manchester, I stayed with Williams and Kilburn for about four or five days
studying what they had done and watching it operate. Williams was a most amusing man. As he stood
there and watched his machine, part of it started to burn up because it was built in such a jerry-rigged
fashion, but it didn't bother him at all. He just took some clip leads off and said, that's no good."
Papers are good and all, but that's no substitute for face-to-face communication.
While Bigelow was actually seeing Williams' tubes in action and checking out the Manchester
Baby, researchers back at the IAS were getting a tube up and running.
This really speaks to the closeness of the community in this period.
Information isn't just passing around in academic papers and notes, but also in letters,
in phone calls, by word of mouth, and by taking vacations to go look at other labs.
These guys are kind of just hanging out together. There's a type of openness and closeness here that we can't have in the industry
anymore just due to the size of things. In fact, I bet you could track down a dinner party in this
period that had over 50% of the world's programmers in attendance. If anyone can, I don't know,
I'll send you a mug or a thank you note. I'll invite you to a dinner party, perhaps.
All right, so that's memory.
And it should be clear here that the memory of the IAS
is very different than that of EDVAC.
It's also somewhat unique for the period.
The Manchester Baby, where Williams tubes are first used,
is actually a serial computer.
It's based off EDVAC itself.
It doesn't do the whole parallel trick
that the IAS machine does.
Instead, it reads data one bit at a time.
This parallel design would extend out
from the IAS's memory to the rest of its organs.
All math and logic circuits are parallel.
This is, again, in direct contrast to the draft EDVAC report.
EDVAC called for serial circuits.
This meant that EDVAC's addition circuit, for instance, would only be a 1-bit adder.
To add two 40-bit numbers, you would run 80 bits through that one circuit, one bit at
a time.
The IAS machine used a parallel adder.
That means 40 adding circuits all hooked together. To add two 40-bit numbers,
you just need to make one pass. You feed in both numbers and then a new 40-bit number comes out.
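As a rough sketch of what 40 one-bit adders buy you, here's the classic full adder wired up 40 times in Python. The loop below stands in for 40 physical circuits sitting side by side; it's an illustration of the idea, not the IAS machine's actual logic.

```python
# A sketch of the parallel-adder idea: one full adder per bit position.
# (A ripple-carry adder still chains the carry between stages, but every
# bit has its own circuit, so one pass adds two whole words.)

WORD_BITS = 40

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry to the next stage
    return s, carry_out

def add_40bit(x, y):
    result, carry = 0, 0
    for i in range(WORD_BITS):                  # one adder per bit position
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result & ((1 << WORD_BITS) - 1)      # wrap around at 40 bits

assert add_40bit(123456789, 987654321) == 1111111110
```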
Again, we get the same trade-offs. This is a much faster machine than a serial computer at the cost of complexity and scale.
You see this everywhere you look with the IAS machine.
It's basically 40 times bigger than the designs for EDVAC.
But, and here's the kicker, that's not too bad.
These are all duplicated systems, so in practice,
having 40 adding circuits isn't actually 40 times as complex as one adding circuit.
It's just 40 times bigger.
The real complexity comes more in how everything is wired up,
and that's a solvable problem.
That brings us to another little bit of a question.
Why 40?
That sounds like a strange number, right?
Well, we actually get an explanation.
The preliminary description has this to say,
quote,
We believe that this exceeds the capacities required for problems that one deals with at present by a factor of 10."
Again, this machine's about scale.
4K of 40-bit words would be, for the time, a lot of space to work with.
If we believe the report, about 10 times what you'd actually want to be working with.
And 40 bits gives you space to express both large and
pretty small numbers depending on how you encode things. You can very easily fit a 10
digit decimal number in that space, for instance. But what about instructions? Well, that's
another spot where the IAS machine is strange. It kind of has a fingerprint to it. We can actually get to this weirdness
just by looking at numbers. 4,000 words of memory and each word is 40 bits. So let's do the math.
To address 4K of memory, you only need a 12-bit address. We arrive at that because 2 to the 12 is 4096. But each location is 40 bits wide.
That's a lot of space. So how would you format an instruction? When you get down to it,
basically every instruction a computer runs references some location in memory. So when
you design an instruction set, you usually build it around that
fact. Each instruction will need space for an address. But for the IAS machine, that only takes
up 12 bits. On a normal machine, each instruction will take up one whole word in memory. Sometimes
you even have computers that can use multiple words per instruction. But either way, we're still talking about an instruction being one
or more words of data. The other piece you need is an instruction or operation code, usually just
called the opcode. This is a number that tells the computer, well, what to do. It can range from
halting to adding two numbers to jumping.
But crucially, it's a number.
In general, most computers only have a few opcodes.
Theoretically, you only need one, or, well, actually, you technically can have a computer
with zero operation codes, but that's a different conversation.
In practice, you probably need like 8, maybe 16 to be useful, but I've seen some machines
with as many as 100.
The IAS machine has 30 possible operations, including all their modifiers.
You only need 5 bits to encode that number.
The IAS machine has these massive 40-bit words. So how would you format one
of these instructions for that word size? Well, the team chose to do something kind
of weird. A full instruction is only 20 bits long. It's the 12 bits needed for the address
plus 8 bits for the opcode. That gives you space for those 30 operations, plus plenty of room to grow.
That sounds, at first, kind of wasteful, right?
A 20-bit instruction should mean that only half of each word is used.
But here's the trick.
Instructions are packed two to a word.
So a program puts two instructions in every
word of memory. Now, I'm not the most well-versed guy in the world, I haven't used every computer.
But I don't think I've ever seen such a scheme before. It's kind of weird to say the least,
but it is efficient.
This would let the IAS machine, in theory, load a program with up to 8,000 instructions.
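Here's a small sketch of that packing scheme as described: an 8-bit opcode plus a 12-bit address makes a 20-bit instruction, and two of those share one 40-bit word. The exact field order inside a word is my guess for illustration, not something I'm asserting about the real machine.

```python
# Illustrative packing of two 20-bit instructions into a 40-bit word.

def make_instruction(opcode, address):
    assert 0 <= opcode < 256 and 0 <= address < 4096
    return (opcode << 12) | address             # 8-bit opcode + 12-bit address

def pack_word(first_instr, second_instr):
    return (first_instr << 20) | second_instr   # two instructions per word

def unpack_word(word):
    halves = (word >> 20, word & 0xFFFFF)       # split the 40-bit word
    return [(i >> 12, i & 0xFFF) for i in halves]

word = pack_word(make_instruction(0x0A, 100), make_instruction(0x0D, 101))
print(unpack_word(word))  # [(10, 100), (13, 101)]
```

Note that 4,096 words times two instructions each is where that 8,000-odd instruction figure comes from.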
One final note about that code before we move on.
The IAS machine, as with many early machines, has support for self-modifying code.
In fact, you have to write that way to create effective software.
There are instructions that let you modify the arguments of other instructions. They
let you rewrite that 12-bit address part of another instruction in memory. The manual
even explains that this is required to implement things like lists. Basically, to do an index, you have to rewrite your code
that's accessing memory.
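To show what indexing-by-rewriting looks like, here's a toy Python simulation of a machine summing a five-element list. The opcode and encoding here are invented for the example; the point is the program rewriting the 12-bit address field of its own load instruction on each pass.

```python
# A toy illustration of index-by-rewriting, not the real IAS instruction set.

memory = [0] * 32
memory[10:15] = [3, 1, 4, 1, 5]      # the "list" we want to sum

LOAD = 1
memory[0] = (LOAD << 12) | 10        # instruction: load from address 10

total = 0
for _ in range(5):
    opcode, address = memory[0] >> 12, memory[0] & 0xFFF
    total += memory[address]                     # execute the load
    memory[0] = (opcode << 12) | (address + 1)   # rewrite our own operand

assert total == 14                   # 3 + 1 + 4 + 1 + 5
```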
This may have been one of the first computers to do this,
but I'm not entirely sure.
At least the Manchester Baby, the Manchester Mark 1,
and the EDSAC don't implement this trick,
and those were all heavily influenced by the EdVac report.
Perhaps this is a topic for another day. There might be a lineage that could be worked out here.
The IAS machine first runs calculations in 1951 and becomes fully operational the next year.
As soon as the computer was operational, it was, well, hammered.
But why did it take so long for the IAS machine to be completed?
Well, there wasn't exactly some grand delay.
At most, it might have been delayed by about a year because the RCA deal fell through,
but even then, I think that's dubious.
Rather, it seems this was just the order of the day. Remember,
we are early into the digital era. Every computer is breaking new ground.
Take EDSAC, for instance, as a contemporary machine. That computer entered development in
1945 and wasn't operating until 1949. That's some four years of work. The IAS machine took five.
ENIAC took four.
BINAC took four. It was just the timeline back then.
Any new computer was a Herculean force of effort.
The institutions that took up this huge task did so because computers were darn useful.
From Bigelow, quote, the first computation done on our machine was really the thermonuclear bomb,
he continues. A team of people were sent here from the AEC, the Atomic Energy Commission,
and the problem was programmed and put on the machine, and then the machine ran for 60 days, 24 hours
a day."
Like I said, the machine was hammered.
There were about a dozen digital computers in the entire world at this point.
These were, quite literally, wonder machines.
Nothing else could do the calculations a computer could do.
The fusion simulations Bigelow talks about were,
more than likely, Monte Carlo simulations.
That's a method that can only be done using a computer,
but it's critical for simulating certain types of physics
like nuclear fusion.
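The fusion code itself is long gone, but the flavor of the Monte Carlo method is easy to show with the textbook example: answering a deterministic question by counting random trials. A minimal Python sketch, estimating pi:

```python
# Monte Carlo in miniature: sample random points in the unit square and
# count how many land inside the quarter circle of radius 1.

import random

def estimate_pi(samples=100_000):
    hits = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:   # inside the quarter circle?
            hits += 1
    return 4 * hits / samples      # area ratio approximates pi / 4

print(estimate_pi())               # ~3.14, improving with more samples
```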
That right there should give you an idea of the sheer value of the IAS machine.
When you have a handful of instruments in the world that can do this very unique type
of calculation, each one of those instruments will be of immeasurable value.
As a result, they will be in constant demand and constantly running. This is where
we get to the computer's real claim to fame, its clones, or its influence. Supposedly,
the IAS machine served as the basis for many other computers of the era. But is that true?
How many machines are we talking? And if this is true, how similar are these derived works?
We've already seen a number of machines that were based off the EDVAC report.
That includes, well, the Manchester computers and EDSAC.
The list also includes later machines like the eventually completed EDVAC itself.
It's a weird story.
Let's talk about that another time.
But we do have to be careful here.
There's a difference between influence and derivation.
Case in point, SEAC.
The Standard Eastern Automatic Computer.
It was built by the National Bureau of Standards
and became operational in 1950.
Note, this is actually before the IAS machine was up and running,
but according to Bigelow, the machine was basically finished in 1949.
So, suffice to say, the design was already there.
Besides, designs as early as 1946 basically match the final machine,
with the exception of the specific memory technology used.
One of the chief architects of SEAC was Ralph Slutz.
Now, Slutz was on the IAS machine team, but he left the project before completion to go and work on SEAC.
So that's a slam dunk, right? One of the creators of the
IAS machine went to a new job and created another computer. The two must be
twins. Not so fast there. SEAC was a serial machine. It used Mercury delay
lines for memory. It had an even larger 45-bit word. It stored one instruction
in each word. It does support self-modifying code, but that's common in the period. In
all, I don't see much of a family resemblance.
SEAC looks a lot more like EDVAC than anything. The core similarities are that SEAC and the IAS machine are both von Neumann computers
and that Ralph Slutz worked on both machines, but that's it.
So then why will you see SEAC listed as a descendant of the IAS machine?
Well I think it comes down to Turing's Cathedral. In that book, author George Dyson calls the SEAC, quote,
the first of the IAS class designs to become operational, end quote.
From a technical standpoint, I cannot find a reason
that you'd call SEAC a quote IAS class design.
It's just not. So why is there this mistake? Well I
think it's kind of obvious. Dyson isn't a computer scientist, he's a historian. The
technical details in Turing's Cathedral are shaky at the best of times. He also
doesn't cite any technical papers on SEAC. If you just look at the broad historical picture
and follow who worked on what, then sure, SEAC looks like it's obviously connected to
the IAS machine. And it is, just not in a technical manner. The field of computing is
just so small in this era that everyone knows or works with everyone else. People go on vacations overseas
to hang out and look at other people's computers. There will always be some kind of influence going
on. That's why it's important to pair history with concrete technical details when you're looking at
these kinds of machines. Did Slutz take experience gained at the IAS and use it on SEAC?
Sure, I'm sure he did.
But is SEAC a clone of the IAS machine?
Is it the same design class?
No, not at all, it's just not.
But Turing's Cathedral is a very accessible and very popular book, so it's cited all over
the place even when some of the technical details
aren't all that solid.
Now what about all the other computers that are supposed to be descendants of the IAS
machine?
There are quite a few that are very, very close copies.
Notably, JOHNNIAC, the setting of the last episode, is almost identical to the IAS machine. JOHNNIAC
uses a 40-bit word, stores two instructions per word, it's fully parallel,
it even ends up using selectron tubes for memory, at least once RCA works out
the kinks. That I'd be much more comfortable calling even a clone because
it's a very, very similar
machine.
And we even have a paper trail to prove JOHNNIAC's pedigree.
A 1953 report titled RAND's Digital Computing Effort explains how RAND fell into building
their own machine.
They'd been living out of a few IBM electronic calculators, but they needed something with
more oomph.
So RAND went to market looking for options. To quote,
about two years ago, a national tour was made
and it was decided to go along
with several other organizations and produce a copy
or rather a modified version as they've all turned out
of the IAS machine.
I think almost everybody here
is aware of the principal features of that machine. It will only be necessary to describe
the deviations as they developed in the last year and a half or so." End quote.
So a few things to note here, right? First of all, this makes it very clear that JOHNNIAC
was made as an intentional derivation of the IAS machine. It's so similar that
the paper only describes how the newer machine differs from the original. It's that clear.
Second is the fact that the report acknowledges a number of other labs have done the same thing.
And crucially, these descendants aren't exact copies.
Code from the IAS machine couldn't just run on JOHNNIAC.
And the final point is how this spread is happening.
It's person to person.
Dyson explains this pretty well.
Von Neumann and other IAS researchers
toured around sharing their work with other researchers.
Outside researchers came to the IAS to see the new machine.
This is that network I keep talking about.
After the war, very few in the field,
maybe with the exception of IBM and EMCC,
were keeping their research private. The community was very open
because it was formed around these academic circles. And there just wasn't a massive community.
Everyone knew each other. So let's turn back to the period sources for a minute. In 1968,
RAND publishes The History of the JOHNNIAC. 68 is a little after the fact, but that's close enough to primary for me.
This is actually just about two years after JOHNNIAC is decommissioned.
The introduction gives us something really nice to work off of.
It says that in the early 50s, a number of machines were built that it calls Princeton
class, as in, based off this one machine built in Princeton,
you know, the IAS machine.
This period list includes 10 computers.
It's a bunch of things that end in IAC
and some of their friends.
That, I think, should be seen as the definitive starting point
for any list of derivatives, with one reservation.
What's also nice is the paper gives a clearer explanation as to why RAND chose to work off the IAS's designs.
In short, commercial computers just weren't ready yet.
In talking with von Neumann, who was consulting for RAND at the time, and Bigelow,
it was decided that it would be possible to build an IAS-style machine, but improve upon it. RAND
was looking for a computer and an incremental improvement on the state of the art. Specifically,
RAND just wanted a more reliable machine. A working design was out there, it was running, so RAND
took it and, well, they ran with it.
Then what's the reservation I put on this list? Well, it turns out that even this period
list written by a lab that had a clone of the IAS machine has a problem, namely, CSIRAC.
Or at least that's how I pronounce it.
I have no idea how that's supposed to be pronounced.
It's the Commonwealth Scientific and Industrial Research Automatic Computer.
I think "Sysarac" is close enough.
This is a computer that was constructed in Australia and entered operation in 1949.
It, like SEAC, is its own computer.
CSIRAC used serial memory, had a 20-bit word, and its instruction formatting and operating principles are
just plain different than the IAS's.
I don't know why this is on the RAND list, and oddly enough,
CSIRAC isn't on Dyson's list of machines. My point is, there seems to just be some weird
confusion around some of these early computers.
As far as honest-to-goodness clones go, well, I think the best possible example here has to be
from Illinois. Specifically, the University of Illinois. In 1952, two
twin computers sparked to life at the University. ILLIAC-1 and ORDVAC. These
were both Princeton class machines to a T. We're talking 40 parallel tubes for memory, 40-bit
words, parallel math and logic circuits, even the doubled up instructions. What makes these
two machines so special is that they came in a pair. IAS-derived machines were all similar,
but as a rule, they weren't compatible. That wasn't even a consideration yet. Why
would you think about having compatible machines when there's 12 machines in the
whole world? But ILLIAC and ORDVAC WERE compatible. The story goes that the
University of Illinois was commissioned by the army to build a computer for use
at the Ballistics Research Laboratory.
And the university, well, they could use their own computer, so they decided to just build
two at the same time.
The twins would operate for many years and inspire more clones.
One was, and get this, SILLIAC, the Sydney version of, well, ILLIAC. It was constructed in Sydney, Australia in
1953. Another ILLIAC clone, MISTIC, was built at Michigan State University in 1957, and
as late as 59, another clone, Cyclone, was built at Iowa State University.
We get this whole weird cluster of Illinois themed computers.
This was all possible because the University of Illinois was happy to share their designs.
They had been handed designs themselves after all. All right, that does it for the story of the IAS machine.
It was a neat computer for the time, but how influential was it?
As I hope I've explained, that's complicated.
That's very complicated.
There were a number of computers built using the IAS machine's plans.
That happened because of the openness around the IAS's work, which was partly a result of
von Neumann's insistence, and partly just how academia functions.
From those first descendants sprang even more, until we get a family tree of these Princeton
class computers.
In that sense, yes, the IAS machine was very influential.
But there's also this weird phenomenon of folk claiming that unrelated computers were
inspired by the IAS machine.
At least SEAC and CSIRAC are victims, and there may be more if you go looking through
all the lists of quote-unquote IAS clones that are kicking around online.
That gets us to the complicated part of the IAS machine's legacy.
The machine is very exciting for the time, but it's not entirely groundbreaking.
It uses a shared memory space for code and data,
the so-called von Neumann architecture.
That wasn't new.
That first appeared in the EDVAC report,
and it was implemented on computers before the IAS machine.
It used Williams tubes for memory,
but that was actually borrowed from another computer.
It was a fully bit parallel machine,
but there are earlier machines that had that same design.
The importance of the IAS machine is, in my estimation,
wrapped up in that larger community
and how information traveled in this period.
None of these ideas were entirely new,
but the designs for the IAS machine were very
open and very shared.
Von Neumann, Bigelow, and the rest of the team were happy to show their machine to anyone
interested and give advice on how you could build your own.
In that sense, the legacy of the IAS machine is less technical and more practical.
It goes back to how I feel about von Neumann in general.
I think it's wrong to call von Neumann the father of computing.
In reality, there's no one person that invents the field.
It's a community of researchers in constant dialogue that slowly figure out this whole
computer thing.
Von Neumann's huge contribution is the fact that he's
not just bright, but he's people smart. He was able to bring people together to
work on grand projects. He could convince institutions that yes, they do
actually need to get one of these computer things, and he could spread the
new digital gospel. That's all vitally important work, even if it's not
specifically technical.
Thanks for listening to Advent of Computing. I'll be back, actually, in a month this time.
Remember, I'm skipping next episode because I'm getting ready for VCF West.
If you want to hear me before next episode, then come out to Mountain View, to the Computer History Museum, August 1st
and 2nd.
I'll be giving a talk there, but after that it'll be back to regularly scheduled broadcasting.
And until then, as always, have a great rest of your day.