Advent of Computing - Episode 133 - LIVE from Intelligent Speech 2023
Episode Date: June 9, 2024
I'm currently out traveling. Due to my poor planning I managed to score back-to-back trips, for both business and leisure. While I'm not able to get an episode out on time, I do have a replacement! In... 2023 I was invited to speak at the Intelligent Speech conference. So, today, I present the audio of that talk. The topic is, of course, the wild path of the Intel 8086's creation and rise to power! If you prefer to watch, here's the video of the same talk: https://www.youtube.com/watch?v=6ud8LK3-eAM
Transcript
Welcome to a somewhat impromptu surprise episode of Advent of Computing.
So, while you're listening to this, I'm on the road.
I'm actually going to be traveling for about two weeks, and due to some earlier pushbacks
in my schedule, I thought I wasn't going to be able to put out an episode for this two-week
period.
But I found a way around that.
In 2023, I was invited to speak at the Intelligent Speech conference. It's this really fun yearly event where podcasters get to
talk and have a nice little slide deck, if you like, about some shared topic. That year,
the theme was contingency, so I worked up a story about my favorite backup plan of all time, that being,
of course, the Intel 8086. That talk was recorded, and initially those recordings were only available
to people who bought tickets, but they've since been released. Actually, the recordings were
released to the public last week, which lines up perfectly with my failure of planning and scheduling. So this episode, I'm going to
be presenting the audio from that talk I gave at Intelligent Speech 2023. If you remember from the
last live episode I shared on the feed, I do like to use slide decks, but it's more as a flavor
enhancer than the focus of the presentation. You can enjoy this fully as audio alone, since
audio really is my medium of choice after all. If you want to watch the slides and even a little
video of my face, then I'll have a link to the whole video over on YouTube in the description.
After this, I'm going to be dark for my usual two weeks and then come back with an actually produced episode of Advent of Computing.
So until then, I hope this live episode of Advent of Computing serves as a little tide-over while I'm boarding a couple more cross-country flights.
My name is Rick and I'm the admin for this talk.
So I'm going to hand over to Sean Haas from the Advent of Computing.
He's going to have a really cool talk, which I'm really looking forward to.
Thanks, Rick.
So like Rick was saying, my name is Sean Haas. I'm the host of the Advent of Computing podcast, where I cover the history of computing.
Specifically, I try to take pretty inaccessible stories that I think are interesting and put them
in a lens that a general audience can appreciate. I think that the history of computing is not only
really important to the modern day, it has these just really, really interesting and really,
I think, frankly, funny stories in it. Today,
I'm going to be sharing what is my favorite story from computing history and kind of what got me
into the field and into the show. That's the story of the 8086. It's this funny little chip
that was intended as a stopgap. Intel designed it as a backup plan, and that didn't really go as planned. It ended
up being a little bit too good of a backup plan. So to start off with, we need to talk about what
a microprocessor is. This is just quick background. A microprocessor or a CPU or just a processor is
the beating heart of the computer. This is where computation actually happens.
A computer has a bunch of different components, but this is really the root of everything.
This defines what a machine is.
So when someone's talking about a computer, they might say, oh, I have an AMD Ryzen or
I have an Intel Xeon.
And that, full stop, to someone like me who works with machines and writes software,
tells me everything I need to know about the computer.
So this is the most important bit.
This specific processor that I'm showing you is the 8086,
which I will argue is the most important processor in the whole world and probably in the history of computing.
This chip is so important that it actually is the progenitor of a whole family
called x86. It's called that because the x notation just means these are all chips that have something and then 86 in their product number. This actually is a pretty old line of processors. The 8086, the first chip in this line, was designed and delivered in 1979. Now all processors in this family, which includes
processors we use today, can run code that can be shared between all of them. So a desktop that's
sitting right next to me can run code that was written in 1979 for an 8086. As far as prevalence,
well, this is really the whole show. About 90% of computers use x86 architecture chips.
So this is the most important family.
So the progenitor of that family, very important.
It's a big deal.
So that leads to the interesting question of where this chip came from.
And to answer that, we have to go way back.
Now,
back in the day, there was really only one way you interfaced with a computer. You didn't have nice keyboards and screens. You had the teletype terminal. And these things just kind of sucked.
They were these souped-up electric typewriters that connected to big mainframe computers.
You could type into them like a typewriter, which would type whatever you
clicked onto a paper tape reel. And then when the computer responded, it would also type that out
onto a reel of paper tape. Now, these were just known to suck. I don't think anyone ever liked
these. They wasted paper. They were slow. We're talking like a couple dozen characters per second.
That's not really very useful. They were lame. That's the big
one. But they also only supported one computer. These are electromechanical. They're very simple
devices. So if you have an IBM machine, you need an IBM teletype or a compatible. If you have a
DEC machine, you need a DEC compatible teletype and never the twain shall meet.
In 1969, there's a big innovation that starts to break up the market, kind of.
It's still like a lame office tech innovation.
It's this machine called the Datapoint 2200.
Now, the company Datapoint had been doing some cool stuff with terminals.
They made the first glass teletype, which, like in the lovely product photo here, it's just a terminal that has
a little screen instead of a paper tape feed, which already is a big plus. But those earlier
glass teletypes could still only speak with one machine. So in 69 they developed the 2200, which
is a smart terminal. That just means that you can actually give it a small program and it will
execute that. That kind of means it's its own little computer, but in practice, that's only
used so that it can talk to multiple different mainframes. So an IBM shop could use a Datapoint 2200. So could a DEC or a GE or anyone. If you have a mainframe, a 2200 will get you set up real nice. It's a neat design, but there's a stupid little issue: it overheats. It kind of got way over-engineered. To make it programmable, it has to have a little computer in it, which in 1969 took a few hundred little integrated circuit chips. Each of those chips dissipates heat as it runs, and there were
too many of them in too small a box. So the terminal would run for maybe an hour or so,
and then it would be kaput. It would fry itself. Datapoint wants to find some way to reduce this
chip count so they can actually ship this new machine. But they're a terminal company. They
don't know anything about making computers or making little chips.
So they go looking for a contractor.
Intel ends up putting in a bid for this contract.
Their bid is they'll take all of these hundreds of chips in the Datapoint 2200 and reduce it down to a single chip.
Just one.
Uses much less power, and it won't overheat.
And it'll also be cheaper to boot.
So they end up designing this one-chip 2200 terminal, and then they lose the contract.
Datapoint actually just kind of goes back to the drawing board and figures out how to reduce the
chip count in their terminal. And if memory serves, they also add like a fan. So they don't need to work with a contractor.
They're fine.
But that leaves Intel with this.
It's a limited chip, but they do have a microprocessor now.
And they're not under contract anymore.
So they decide, you know, why don't we just spiff it up a little bit?
It'll still be based off these earlier designs.
But, you know, it works well enough.
We can sell it. So in 1972, they released the 8008. And this is the point where I have to apologize.
Intel's entire naming scheme for chips in this period is just eights and zeros. So
I'll do my best to keep this clear. Once this hits shelves in 72, it actually sells
modestly well. It makes enough money that Intel decides they're going to make a bigger chip.
The specifics here, though, are crucial.
The clients they have that are using the 8008 like the chip, but they want something more
powerful.
They want it to be bigger.
They want it faster.
They want more features.
And Intel also gets into talks with prospective clients that say, you know, I would
use the 8008, but it's weak. I need something stronger. So Intel hits the drawing board and they
make this chip called the 8080, which is kind of just a bigger 8008. It's based off the same design.
They add new features. They make it faster. They make it physically bigger.
And they make it compatible with the older chip.
That means that there's an upgrade path.
So software that was written on Intel's 8008 can be converted to run on their new chip.
This actually does amazingly well in the market.
They're able to retain their existing clients and get a lot of new ones.
They also hit the market at just the
right time. The 8080 comes out in 1974, just in the nick of time for this machine called the Altair
8800 to use it. And that's the first home computer in a sense. It's the first machine that a consumer
can buy. It's cheap enough to own. You can just throw it on a desk and you have a machine. There's a bunch of caveats, but that's usually the high-water mark for when machines start becoming home machines.
So this is a huge win for Intel. They're doing great. They just broke the market wide open.
But there's a bit of a problem with this overall path they're taking. They're starting to build up technical
debt or technical baggage. So this beautiful chip from 1974, it's just based off a design from 1972.
And that's actually based off the Datapoint 2200 from 1969, a random terminal made by a random company that no one knows about anymore by 74.
But we can simplify this chart.
The 8080 is just a Datapoint.
It's literally just a bigger terminal.
That's all Intel's done because they can't lose their client base.
The result here is that Intel has cursed themselves with something very special.
They're actually selling five-year-old technology. I mean, sure, it's bigger, it's nicer,
they have these nice ceramic pads with gold pins. Looks fantastic. But it's still based off
a design from 1969. There are fundamental flaws with that design because it's older. And five years might
not sound like a lot today, but in this period, the pace of innovation with microprocessors
is staggering. And Intel is stuck using a five-year-old design during this period.
But, you know, customers want backwards compatibility, so Intel is just kind of stuck
with this. On the flip side,
competitors like Motorola, for instance, they don't have legacy customers. They can do whatever
they want. They are free to innovate while Intel is stuck. There has to be a way out of this. That's
the only way Intel will be able to truly compete. So they start off this project, this radical new thing called the IAPX432.
And right off the bat, that's an awful name.
No one at Intel was good at naming things during the 70s.
But the point of the 432 is it would be a radical break from tradition.
The project starts in 75.
The plan is just to throw away everything and make a chip unlike anything else ever designed.
It's a huge risk, but this is the kind of risk that could have really big rewards. They could
actually change the field of computing. And on paper, this is a really cool chip.
On paper, though, things don't really go as planned. Now, I'm going to let you in on a little secret here.
The short story of why the 432 is so cool is that programmers are lazy.
There's this well-known law in the field that for every line of code you write, you introduce a new bug.
And as you introduce new bugs and errors, that makes it harder to write and harder to maintain software. So if you really want to simplify things, you could look at the long arc
of progress in computing as trying to make programmers write less code, because that's
the only way you're going to protect yourself. The long story of why the 432 is innovative
comes down to the different ways we can program a computer. On a fundamental level,
we have to talk in machine code. That's also known as machine language. It's the numbers that a
computer understands. You can look at them as ones and zeros. You can look at them in hexadecimal or
decimal numbers. It's evil. It's inhospitable, but the machine understands it. Historically,
some people have known this language, but there's no reason to do that, primarily because pretty soon after
we develop computers, we develop this thing called assembly language. Now, it's still kind
of inhospitable. It's just a way to add fun mnemonics to the horrifying machine code.
So you can pick out words. You can see there's English phrases.
It's more readable, more reasonable,
and you can write less code to do more.
Pretty soon after that, though,
we throw all that out
and we start using these high-level languages
like C or C++ or Java, HTML even, kind of.
Any language you've ever heard a programmer complain about is a high-level
language. And these are great because you can have one line that turns into a whole mess of
machine code. So in effect, programmers are writing less code. That's better for everyone.
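To make that layering concrete, here is a rough sketch (an illustration added alongside the talk, not something from it) of one tiny operation at all three levels, using C as the high-level language and the 8086 as the target. The assembly and byte values in the comments follow the standard 8086 encoding, but the exact output of any given compiler will differ.

```c
#include <stdio.h>

int main(void) {
    int total = 10;

    /* High-level language: one readable line of C. */
    total = total + 5;

    /* A compiler targeting the 8086 might turn that line into
     * assembly language, something roughly like:
     *
     *     MOV AX, [total]   ; load the variable into a register
     *     ADD AX, 5         ; add the constant 5
     *     MOV [total], AX   ; store the result back
     *
     * And the assembler turns "ADD AX, 5" into raw machine code,
     * the bytes the 8086 actually reads: 05 05 00
     * (opcode 0x05 = ADD AX, immediate; then 0x0005 little-endian).
     */

    printf("%d\n", total);  /* prints 15 */
    return 0;
}
```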
If that went in one side of your skull and out the other, here's the summary. High-level languages are the best. Those are the only ones you should use. You can use assembly language. You shouldn't, but you can.
And machine language is just evil. It's a demonic tongue that should not be in the mouths of
mortals. But the unfortunate reality is we kind of need machine code. That's the only thing a
computer knows. So everything has to get turned into machine code eventually, which sucks. You know, we're just stuck with these
evil numeric incantations. But, you know, maybe we don't have to be. What if we got rid of
everything? What if we had a chip that could speak that high-level language? We wouldn't need machine code or assembly language. We wouldn't
need to massage code around or do any conversions. It would just be great. Programming would be so
easy. You could talk to a computer in a normal programming language. Just imagine how many
human hours that would save. That's the point of the 432. Intel designs this as a chip that can speak a high-level language,
which is revolutionary. There were only a handful of attempts throughout the entire arc of computer
history to do something like this. Now, the specifics get weird. I'm going to gloss over
this. If you want more information, hit me up. I can throw a lot of documents at you.
Essentially, it's built to be able to speak this language called Ada.
And there's this direct equivalence between every feature in the nice high-level language
and every instruction the 432 can run.
So you can kind of shove a high-level language onto this chip, and it figures it out, which
is revolutionary. Full stop. This is
an amazing new technology. But Intel's kind of losing the point. They're kind of,
they've hit this point where they're about to jump the shark. As we recall, a microprocessor,
it's just a little computer. It's just supposed to be, you know, you take a small, simple system and you cram it down into one chip.
You do some magic to get that done, but, you know, it's just a little guy.
It just does a little bit of work.
It doesn't have to be huge.
It doesn't have to have millions of features.
It just has to be a little computer.
But Intel does this really, really concerning thing
as the 432 project goes on. They start calling it not a microcomputer, but a micro mainframe.
Now, for those not versed, a mainframe is the big room-sized computer that you might see in the back
of a sci-fi film. It's second only to a supercomputer
in terms of complexity and power.
There are features on mainframes
that normal computer users, myself included,
can't comprehend and don't even know exist.
And Intel just says, yeah, you know,
we can put that on a chip.
We'll take all the power of a whole machine room
and just put it on a little
wafer of silicon. This, my friends, is called feature creep. This not only kills a project,
but it kills it very slowly. To make matters worse, their competitors aren't doing this.
They're not trying to jump the shark. For instance, there's this great chip called the Z80.
The whole saga around that processor is fascinating.
The two primary designers of the 8080 at Intel,
they just up and quit and start their own company.
And they end up making this chip that's just like Intel's 8080, but better.
And it gets all of Intel's market share.
It kind of wrecks them overnight.
There's this other company up in New England called MOS Technology that makes this 6502
processor, which is what powers the first Apple computers.
A lot of early programmers just love that chip because it's just a little dude.
It's simple, it's cheap, and it works.
There's also this wild company called Texas Instruments. They make calculators nowadays, but back in the 70s, they were doing truly incomprehensible things with silicon.
It's another wild saga that if you ever get me alone in a room, I will go on and on about.
But the point is, Intel's trying to break out of this
rut by doing something that's a little too ambitious. Which brings us to a poor, unfortunate
programmer named Stephen Morse. Now, in the 70s, he was one of the first people to actually really
get into programming a microprocessor. There are actually some differences between programming a little computer versus a big one, and Morse was one of the few
people that was familiar with that. So in 75, he was a really easy hire for Intel. Now he gets
brought into the company and pretty quickly he sees the 432 project going on and he is very
concerned. He does a full audit and writes up this great internal report.
And the finding really is that the 432 not only sucks, but it's going to kill the company. He is
very concerned about the chip. One of the main things is he thinks it's going to be slow,
which is a problem because it's also going to be very expensive. Around this time, Intel had also
been, as the 432 project
dragged on, thinking about making a stopgap measure, something just to put into the market so they had
something that was new that could sell and maybe edge out some of their competition just until the
432 was done. Now, this is a direct quote from one of Morse's recollections of this period.
This is after he submits his report.
He writes, because of my report, management decided that I would be the ideal person to
design the architecture for the stopgap measure.
So he stuck his head up and he kind of had to suffer for doing that.
He becomes the primary designer, the single primary designer of this new chip called the 8086.
But luckily, you know, for a one-man team, the project was simple. Just make a chip that's fine.
They just want something that's okay, that makes their existing customers happy so they can keep
market share. They want it to work with older hardware, and it has to have some kind of
compatibility. They want old 8080 code to run on the 8086.
You know, keep legacy customers around.
They also want it to be 16-bit.
That's just making it bigger so it looks competitive.
This chip will be designed pretty quickly over just a few years, and it will be released in 1979.
But this is making me need to tap the sign again.
There's this whole lineage that Intel
is building up, and they added
another one to it. This was
the whole thing they're trying to get away from,
and they just can't.
They're
kind of double-cursed.
This is a worse
situation now.
The 8086, excuse me, is actually based off a 10-year-old design.
They are feeding right into their greatest fears.
It's actually only adding features to the 8080.
So code from the older chip can run on the 8086.
Actually, you could probably make code that works on the Datapoint 2200 run on the 8086. It's just not a new design, and it's not even like an obsolete design now. It's a downright archaic one. But, you know, it's just a contingency. It's a stopgap. Who cares?
the 432 will be out in just a few months. That's going
to solve every problem we've had. Computing will change forever. The 8086, it's a flash in the pan.
We'll throw out the dies. We'll burn the manuals. It'll be great.
And then we hit 1981, and it's not great. That's the year that the 432 comes out, and it just dies on the vine. It's slow, it's expensive, it has all the problems that Stephen Morse predicted, and it's also not really a microprocessor. This is something I lied about on a previous slide. It's two chips. It's not one, it's two. They put so many features into this processor that they have to have a separate processor for it.
Once it hits market, like, no one buys it.
These aren't used in any computers that anyone knows of.
Intel also doesn't support any of the programming, like, tools for the 432.
It's just a failed project.
It's not able to fix any of the problems.
It's not able to dispel the curse.
And this gets worse. Also in 81, IBM releases a little thing called the PC, the IBM Personal
Computer. It is, full stop, the most successful home computer of all time. If the Altair launched the home
computing revolution, this is the pinnacle of home computing in the era. Initial numbers are
a little hard to come by, but in 83 alone, about 750,000 units are shipped, which is totally
unprecedented. This is a success the likes of which computing has never seen before.
The PC uses as its processor a chip made by Intel called the 8088, which is actually a cost-reduced
version of the 8086. So if we wanted to make an even bigger diagram, this would go at the very end of the chain.
Now, once again, this could have just been a flash in the pan. The PC could have disappeared and been replaced by a bigger, better computer next year. But that doesn't end up happening.
There's this little quirk that IBM builds into the PC. This is an open architecture computer,
which means there's no trade secrets in
it. You can actually buy a full technical reference from IBM. They come in these delightful
hardbound tomes. They're lovely. They have full schematics, full parts lists, and all the
information you need to put together an IBM PC. It also uses all off-the-shelf parts, which means that you could actually walk down to
a local RadioShack and buy everything needed to build a PC. The only thing you'd need would be
a circuit board, which you have the schematics for. You can just do that in your garage.
The only trade secret, well, there's two things that are copyrighted, I guess. One is the IBM logo on the case,
and the other is about eight kilobytes of code, which, you know, for scale, that's just not a lot.
It's reverse engineered within like two years. People just figure out that they can write their
own version of that. And all that code does is turn the machine on and get it up and running.
So the result here is you can just make your own PC. There is literally nothing stopping you but the IBM logo. And people do that. By 1983,
Compaq comes out with the first fully compatible PC clone called the Portable. It is literally just
a PC. In the ensuing years, almost every company under the sun, and even some that didn't exist,
start producing IBM PC clones, which is a big problem for Intel.
Because each of these machines uses an 8086 processor.
They're all x86-based computers.
They're all using that chip that was supposed to
just be out for a few months to maintain market share and keep their legacy customers happy,
and now they're everywhere. By 1985, you can buy a home computer for a few hundred bucks,
and they all run x86 processors.
In this way, the PC becomes the most popular platform in the market. There's nothing that can compete against it. And with the 432 totally failing, the x86 becomes the most
popular platform on the planet. There just wasn't any competition anymore, especially from within Intel. This is a case where the stopgap not only won, but it has cursed Intel to this very day.
As something of a stinger, sitting right next to me is my home computer.
I use it for recording.
It has an i5 in it, which is an x86 processor.
Chances are, if you're listening to this on a laptop or
a desktop, you too are currently using an x86 processor. It was this fun little backup plan that
went a little better than Intel ever hoped. And that is the presentation.
It's one of my favorite stories in the history of computing, because it's kind of one of those cases where Intel messed around and really found out. They made awful decisions, and they couldn't stop making them no matter how hard they tried.
Do you think, I guess, it was the market forces that were pushing it in that direction there? Because they just couldn't bring this new, the, wasn't it the 432, to market?
They suffered from their own success, and they tried really hard to get out of that rut, but they really pushed too hard, and it didn't work.
And so, what I was saying about Texas Instruments that's really interesting is they designed this one chip that comes out the same year as the 8086 that has a lot of features that are also on
mainframes. They designed this chip that's kind of a micro mainframe, but they just do it quietly
and they do it in a very conservative way. And it's a very cheap chip. It's not very fast, but
it can do a lot like multiprocessing and protection things that mainframes do.
And Intel just, they went the totally opposite direction
and tried to throw everything into this chip.
And they just kind of got wrecked.
We've got a question here from Yuga.
They say, do you think that the success of 8086
made innovation in this field slower?
Yes, it did. We're kind of still suffering from that
today. Currently, there is some movement to new platforms. Apple's been pushing towards
in-house ARM chips, which is a totally different architecture. There's also
cell phones and tablets, which often use ARM or RISC-based processors. And those have been actually under development since
the late 80s, early 90s, but they have failed to get market share until very recently,
partly because of economies of scale. x86 processors are just insanely cheap to make, because every fabricating factory has the tooling for it, and the tooling's old, it's cheap, and it's well known. Whereas with switching to totally new tooling to make, like, an ARM processor, we've seen how Apple has been struggling, if you follow the news. That's taken them years to get the fabrication capabilities set up for making a chip that's new.
So yeah, the success of the 86 has definitely slowed innovation.
Yeah, it's been a bit of a curse for them, hasn't it?
Do you think, well, you said they can't name things.
I thought Ada is quite a good name for the language.
That wasn't named by Intel.
That was named by federal contractors.
Okay.
That makes sense then.
Yeah.
I have a two-part series on Ada that I did last year, actually.
It was this language that was developed by a federal contractor committee, and that was
like picked out of a hat of names.
If it was an Intel name, it would have been all numbers, I'm sure.
Do you think, though, that might have been a... well, it seems the 432 was over-engineered, but if it can run this higher-level language directly on the chip, would that limit the scope for actually being able to do other things with the chip and other languages?
Yes, it would.
So, one of the things that kind of wrecked the project is that Ada was very popular at the tail end of the 1970s, but by the 1980s federal support for the language, because it was designed by a government committee, kind of faltered, and it ended up only being used on, like, very specific federal contracting jobs. Intel kind of bet that it was going to be a big general-purpose language, and then it wasn't. So when the 432 came out, Intel had said, oh, we're going to have these great compilers, we're going to have all these programming tools for Ada, and they just never wrote them, because the language really wasn't used by non-federal employees anymore. So Intel just lost all purpose for the chip.
Do you think there'd be anything similar to that again, or do you think it's too diverse now with the different languages?
So, what's been
interesting is watching people trying to claw away from x86 chips, with, like, chips, the M1, M2, and M3 that Apple's developing, or the RISC processors that we use in cell phones now. They're all very simplified processors. So the x86, and this is getting in the weeds, is what's known as a complex instruction set computer. It has these big instructions that
do like a dozen different things. There's actually this joke: there's this one
instruction called move that lets you move a little bit of data somewhere else.
And it's so complex now that you can actually write an entire program only using move instructions, which is concerning.
That's kind of a problem with legacy chips that only add things.
Whereas RISC processors, they're reduced instruction set computers.
So a lot of them don't even have a multiply instruction.
So they're more just simple chips
that are actually now finally cheaper to produce
now that the tooling's caught up,
but they take less silicon, they're a simpler design.
And so most of the work is pushed onto the programmer.
And that's actually ended up being a better way to design chips in the long term, it turns out, since, you know, software is more flexible than silicon.
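To give a feel for what pushing the work onto the programmer looks like in practice, here is a small hedged sketch (an illustration added to the transcript, not from the talk): multiplication done purely with shifts and adds, the kind of routine a compiler or programmer has to supply on a chip that has no hardware multiply instruction.

```c
#include <stdint.h>
#include <stdio.h>

/* Multiply two unsigned 16-bit values using only shifts and adds,
 * the way you would on a small chip with no MUL instruction.
 * For each set bit in b, add a shifted copy of a to the result. */
uint32_t soft_multiply(uint16_t a, uint16_t b) {
    uint32_t result = 0;
    uint32_t addend = a;

    while (b != 0) {
        if (b & 1) {          /* lowest bit of b set? */
            result += addend; /* then this shifted copy of a contributes */
        }
        addend <<= 1;         /* next bit position is worth twice as much */
        b >>= 1;              /* move on to the next bit of b */
    }
    return result;
}

int main(void) {
    printf("%u\n", soft_multiply(123, 45)); /* prints 5535 */
    return 0;
}
```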
Yeah. So this reminds me of my first PC, actually, which my dad's got, which is a 486. It's quite amazing, actually, listening to this. I loved that computer. But actually, yeah, you're saying that these machines are so big.
So I remember when you'd have like MMX
was a big thing for a while on the Intel chip.
So it's a case of they kept having to shoehorn
kind of more things into it just to stay competitive.
Partly to stay competitive
and partly to try and find ways to innovate
because they have to,
like Intel did with the 8080 and then the 8086. They could only add, because they wanted to keep legacy. So, add MMX. Oh no, people are using MMX now. The next chip now has to have MMX. We just have to. MMX didn't work very well, but now we always have to support it until the end of time. So now the chips are a little bigger and a little more complex.
Uh-huh. It's like the legacy, a curse and a blessing, that backwards compatibility, isn't it?
Yeah, exactly. And it is wild that I have floppy disks over there that have code for 8086s that I can use on my home computer.
It's neat, like in a preservation kind of way.
But when you get down to it, it's very concerning for the longevity of a platform.
As someone that works in software, I work professionally as a software developer.
So when I see legacy systems like that, I've had to maintain code that was written like 10 years ago.
I've had to maintain Fortran code before that was written like 30 years ago.
And it's horrifying because you get into these situations where it's like, why do we still have all of these features?
We don't have the people or the know-how to support this. Oh, but we just have to. We literally can't get rid of
this because our clients will leave us and we won't have money anymore. Yeah. That massive
technical debt that gets built up. I think it's something that's incredibly difficult for people
to understand, because it's not a physical thing that you can see or grasp. So, trying to educate people, which, I don't know at that level, I'm a web developer, so I've got some appreciation of working with legacy codebases, but to try and get people to realize the potential long-term issues that are being caused, and that you have to have that investment and that innovation. Because I remember, because I've always been a PC user, and it's always been pretty solid for me and reliable. Then the designers where I previously worked, they'd be on Mac, which obviously meant PowerPC, Intel, and then on to ARM. Every time, they didn't think it, it seemed to break, and you could understand, for big business, that's terrible, but it's also allowed Apple to do that innovation.
Yeah, it's a really tough balancing act, and
it's something that, if you get wrong, like Intel, you end up cursed until the end of time, or you just cease to exist. I think Intel is such an interesting case because they've hit this point,
even in the modern day,
where with the x86,
they're making good money.
They have a great product base,
but they're just kind of stuck.
They're just like in this limbo
where it didn't blow up in their face,
but it just won't go away.
Yeah, do you think there's any exit strategy
for them to get out of this?
Could it be maybe with virtual machines that you could then virtualize the x86 in some new processor?
That's... man, I'm not versed enough on the low level of newer Intel chips.
I know I was reading an article on Hackaday a while back about how some Intel processors now have a tiny second processor on their die that's an ARM processor.
So like the chips we have in phones, it's just used for like managing hardware as the computer turns on, because that was easier than dealing with an x86 chip.
I think there are probably ways that Intel could break out of this kind of legacy trap, but I think they're all really risky. And, with the 432, I don't know how much of an institutional memory they have of that.
Because they did take a big risk and it failed very, very badly.
Those kinds of risks, especially in a business context, are very, very hard to make.
Actually, I knew you guys were saying in the chat that, as far as they understand, the Intel chips sort of run a virtual machine of x86 at the moment.
I think you're talking about microcode, right? Please say yes. Yes, yes.
So, modern chips use this thing called microcode, which, it's like galaxy brain stuff. They implement this very basic processor that then is programmed in this, like, secret, super low-level language that they use to define how the chip responds to machine code instructions.
I've been really wanting to do an episode on the show about microcode, but it is so hard to explain in a way
that's not like, let me break out the graphs.
Let's look at some technical documentation.
Yeah, it is kind of like a virtual x86.
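As a very loose sketch of the idea (a toy illustration added to the transcript; real Intel microcode is proprietary and far more involved, and the opcode names and table layout here are entirely made up), here is a tiny interpreter where each visible "machine code" opcode is expanded, via a lookup table, into a sequence of simpler micro-operations that the underlying hardware actually performs.

```c
#include <stdio.h>

/* The tiny internal operations the hardware actually performs. */
typedef enum { UOP_LOAD_A, UOP_LOAD_B, UOP_ADD, UOP_STORE, UOP_DONE } MicroOp;

/* Hypothetical "machine code" opcode the outside world sees. */
enum { OP_ADD_MEM = 0x01 };

/* The microcode ROM: each visible opcode maps to a canned
 * sequence of micro-ops ending in UOP_DONE. */
static const MicroOp microcode_rom[][8] = {
    [OP_ADD_MEM] = { UOP_LOAD_A, UOP_LOAD_B, UOP_ADD, UOP_STORE, UOP_DONE },
};

static int reg_a, reg_b, memory[2] = { 40, 2 };

/* Execute one visible instruction by stepping through its micro-ops. */
void execute(int opcode) {
    for (const MicroOp *u = microcode_rom[opcode]; *u != UOP_DONE; u++) {
        switch (*u) {
            case UOP_LOAD_A: reg_a = memory[0];        break;
            case UOP_LOAD_B: reg_b = memory[1];        break;
            case UOP_ADD:    reg_a = reg_a + reg_b;    break;
            case UOP_STORE:  memory[0] = reg_a;        break;
            default:                                   break;
        }
    }
}

int main(void) {
    execute(OP_ADD_MEM);        /* one "instruction", several micro-steps */
    printf("%d\n", memory[0]);  /* prints 42 */
    return 0;
}
```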
I'm trying to think of, not transcend.
There's a chip in the 80s
that was like an early, really high-profile abuse of microcode kind of thing.
It might have been called like the Transputer or something.
They used microcode where it was a commercial chip that could be programmed in the field to be different kinds of chips.
And that crashed and burned.
But yeah, it is kind of like virtualization, which is cool.
Oh yeah, and there's totally Fortran 77 code out there. There's some running on the Pleiades cluster at NASA Ames that I could direct you towards if you want to cry a little bit.
Right, I'm afraid we're going to have to probably finish this up now. But if you want to continue the chat, you can do in the lobby, because it's going to be moving on to the next talk soon. But before we go, is there anything you want to plug or promote?
Just listen to the podcast. It's called Advent of Computing, my website's adventofcomputing.com, and I show up in every podcast player there is.
Brilliant. That was really interesting.
And yeah, I didn't know that history of x86.
So thanks very much.
Yeah, of course.