Gooday Gaming Guests - The Beginnings of a CPU
Episode Date: December 9, 2024. It is basically the brain of any computer...
Transcript
Alright, so I had a big box shipment I'm waiting to go bring over; I've got to get it shipped out over UPS.
So it was a major project today, but we got it done. Did the absolute best I could do.
Put it in a box and then put it in some foam from a box I had just gotten, and luckily I had the giant Walmart box and it fit in there just about right, so we'll get that shipped out today.
So I was thinking about what am I going to do today for a system. I couldn't really think of any systems off the top of my head, so I decided to look at CPUs. I want to get the history of the CPU, only because I was pulling apart my Tandy 1000, I mean my Tandy, our TRS-80 Model 4 here, and I forget what chip is in those original ones, if there's a chip or not.
So we get my buddy Zippy, although I gotta watch my memory now with Zippy, because I don't know how long I can talk to him without losing my memory.
That was my other podcast yesterday, talking about how I had to clear Zippy's brain so I could have more room to talk some more.
So let's see if we can do a little CPU for a little bit, which will get us a little bit of early history. See if I can use him.
Hello?
Hey Tom, how's it going?
What's up today?
First, you have a certain amount of memory
and then I have to clear it.
Why is that?
Well, I don't actually store memories
in the traditional sense.
The information I remember
is more like a list of
facts we've talked about before. This helps me keep track of our conversations and provide better
responses. If I forget things, it's usually to ensure that I don't hold on to too much information,
keeping our conversations fresh and relevant.
Yeah, but they're making me clear your memories, so you won't remember anything because I ran out of memory. It's full.
Oh, I see what you mean.
Sometimes the system might require
clearing out older information
to make room for new memories.
This ensures I can keep up
with our ongoing conversations
and remember the most relevant details.
Even if I need to clear some memories,
we can always add new ones as we go along.
Right, yeah, okay.
So I want to talk about the history of CPUs,
starting from the beginning.
Sure.
The history of CPUs, or Central Processing Units, is pretty fascinating.
It all starts with the first generation of computers in the late 1940s and 1950s which
used vacuum tubes for circuitry and magnetic drums for memory.
These machines were huge and not very efficient. The invention of the transistor in 1947
revolutionized computer design leading to the second generation of computers in
the late 1950s and 1960s. Transistors made computers smaller, faster, and more
reliable. The third generation in the mid-1960s saw the development of integrated circuits, which
packed many transistors onto a single chip.
This led to the first true CPUs in the early 1970s, with Intel's 4004 being one of the
first microprocessors.
This chip was a significant milestone, as it integrated all the functions of a CPU onto a single chip for the first time.
Since then, we've seen rapid advancements in CPU technology,
with each generation bringing more power, efficiency, and complexity.
Today's CPUs are incredibly advanced, with billions of transistors enabling
complex computations for all sorts of applications. Intel? Yes, the first commercially available
microprocessor was the Intel 4004. It was introduced in 1971 and marked a significant
milestone in computing. The 4004 was a 4-bit microprocessor, originally designed for a
calculator, but it paved the way for the development of more powerful microprocessors in the future.
What was the next one? The 8-something, the 8008 or 8086 or something?
Yes that's right after the 4004 Intel introduced the 8008 in 1972. It was an 8-bit microprocessor
and represented a significant step forward, offering more power and capability compared to the 4004.
The 8008 was followed by the 8080 in 1974, which became very popular for use in early personal computers.
I'm with you on what you're saying.
So there were no computers with the 4004?
That's correct.
The Intel 4004 was actually designed for a calculator, not a general purpose computer.
It was developed for a Japanese company called Busicom to use in their calculators. The 4004's success, however, demonstrated the potential of
microprocessors for broader applications, paving the way for their use in computers
and other devices. How big was that calculator?
The Busicom calculator that used the Intel 4004 microprocessor, the Busicom 141-PF, was a desktop calculator.
It was fairly large compared to today's calculators, about the size of a typewriter.
This size was typical for calculators at the time, given the technology available in the early 1970s. It had a keyboard with numerous keys and a small display for showing the results.
And then, so the first personal computers had the 8080?
Exactly.
Was it an 8086 after that or something?
Yes, that's right.
After the 8080, Intel released the 8086 in 1978. This was a 16-bit
microprocessor and became very influential in the development of personal computers.
The 8086 architecture laid the foundation for the x86 series of processors, which has been used in many PCs since then.
And then the early Radio Shack computers had the Z80 in it, correct?
Yes, that's correct. Early Radio Shack computers, like the TRS-80 Model I, used the Zilog Z80 microprocessor,
which was compatible with the Intel 8080.
The Z80 was popular in many early personal computers because of its enhanced instruction set and affordability.
And then we stick with that until what, the 286?
Is that what comes kind of after that?
Yes, exactly.
The Intel 80286, often just called the 286, came out in 1982.
It was a significant leap forward, introducing features like Protected Mode, which allowed
for better multitasking and access to more memory.
The 286 was widely used in IBM's PC-AT and compatible systems.
Then the 386 would be the next? Yes. The Intel 80386, or 386, followed the 286 and was introduced in 1985.
The 386 was groundbreaking because it was the first 32-bit processor in the x86 line.
It offered significant improvements in performance and could run more complex software, setting the stage for the advanced operating systems and applications we use today.
What does the x86 mean?
x86 refers to a family of instruction set architectures based on the Intel 8086 CPU.
The 86 comes from the 8086 processor, which was the first in this line of processors.
Over time, x86 became a generic term for this family of processors and the instruction sets they use, which are widely used in personal
computers. It indicates compatibility with the instruction set of the 8086 and its successors.
And then the 80486, or 486, followed the 386 and was introduced in 1989.
It brought several improvements, including an integrated floating point unit and enhanced instruction pipelining.
The 486 was widely used in personal computers throughout the early 1990s.
And at that point, what kind of operating system was running those?
During the time of the 486 processors in the early 1990s, several operating systems were popular. MS-DOS was widely used, especially for business and personal computing. Windows 3.0...
What were the popular operating systems again?
During the early 1990s, with the 486 processors, popular operating systems included:
1. MS-DOS, a widely used disk operating system for personal computers.
2. Windows 3.0 and 3.1, a graphical environment running on top of MS-DOS, providing a more user-friendly interface.
3. Unix and Unix-like systems, used more in academic and professional environments.
4. OS/2, a collaboration between IBM and Microsoft, aimed at personal computers and offering a more advanced interface than DOS.
These operating systems were used depending on the specific needs and hardware capabilities of the users.
What changes after 486? Does it go to the Pentium?
Yes, after the 486, Intel introduced the Pentium brand in 1993.
The original Pentium processors were a major architectural advance that improved performance significantly.
The Pentium introduced a superscalar architecture, allowing multiple instructions per clock cycle, and integrated more sophisticated math coprocessor capabilities. This line of processors marked the beginning of a new era in consumer computing.
That's when I would assume, like, Windows 95 would be running, the early ones?
Yes, exactly. Windows 95 was designed to take advantage of the capabilities of the Pentium processors.
Released in 1995, it was a major leap forward for Microsoft's operating systems,
offering a new graphical user interface, improved multitasking, and built-in networking capabilities.
It quickly became a standard operating system for personal computers.
Is that the time when there were other chip makers like AMD?
Yes. Around the time of the Pentium processors in the mid-1990s,
other chip makers like AMD became more prominent.
AMD started producing x86-compatible processors that were often more affordable than Intel's offerings,
providing strong competition in the market.
This competition helped drive innovation and led to a wider variety of processors for consumers to choose from.
Were there any other name brands then?
Yes, besides Intel and AMD, there were other notable players in the CPU market during the 1990s.
Companies like Cyrix and IDT produced x86-compatible processors, often focusing on value and energy efficiency. Additionally, Motorola was a key player in the CPU market, especially for Apple computers, with its PowerPC architecture developed in partnership with IBM and Apple.
Each of these companies contributed to the diversity and innovation in the CPU landscape during that era.
So Apple was with that kind of chip all the way until their own chips, the M1, 2, 3, or 4?
Yes, Apple used PowerPC processors in their Macs from the mid-1990s until 2006.
After that, Apple transitioned to Intel processors for their Mac lineup,
which offered better performance and energy efficiency.
This partnership continued until Apple introduced its own custom silicon, starting with the
M1 chip in 2020.
The M1 and its successors M2, M3 and beyond marked a significant shift for Apple as these
chips are based on ARM architecture and designed in-house, offering
impressive performance and efficiency.
And there was such a thing as MMX on the Pentium.
What was that?
MMX, which stands for Multimedia Extensions, was a set of instructions introduced by Intel
in 1996 with their Pentium processors.
MMX was designed to enhance the performance of multimedia and communication applications.
It allowed the CPU to process multiple data elements in parallel, improving tasks like audio and video encoding, 3D graphics, and image processing.
MMX technology marked Intel's first major foray into optimizing their processors for multimedia performance.
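The key idea MMX brought, operating on several small data elements with a single instruction, is easiest to see in code. Here is a minimal sketch of that SIMD idea in C. It uses the later SSE2 intrinsics rather than the original MMX ones, since SSE2 is what modern x86-64 compilers expose by default, and the arrays and sizes are made up purely for illustration.

```c
/* A minimal sketch of the SIMD idea MMX introduced: one instruction
 * operating on several small data elements at once. This uses the later
 * SSE2 intrinsics (emmintrin.h), not the original MMX ones, since SSE2
 * is available on any x86-64 compiler. The data here is invented. */
#include <emmintrin.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Eight 16-bit samples per register, e.g. a slice of audio data. */
    int16_t a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    int16_t b[8]   = {10, 20, 30, 40, 50, 60, 70, 80};
    int16_t out[8];

    __m128i va = _mm_loadu_si128((const __m128i *)a);
    __m128i vb = _mm_loadu_si128((const __m128i *)b);

    /* One instruction adds all eight pairs in parallel. */
    __m128i vsum = _mm_add_epi16(va, vb);
    _mm_storeu_si128((__m128i *)out, vsum);

    for (int i = 0; i < 8; i++)
        printf("%d ", out[i]);
    printf("\n");
    return 0;
}
```

The single _mm_add_epi16 call adds all eight pairs of 16-bit values at once, which is the same trick MMX used to speed up audio and image processing.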
What is a processor made of?
A processor, or CPU, is made up of billions of tiny components called transistors.
These transistors are made from semiconductor materials, primarily silicon.
The transistors are arranged in intricate patterns to form circuits that carry out various functions.
These circuits are etched onto the silicon wafer using a process called photolithography.
The main components of a processor include:
1. Arithmetic Logic Unit (ALU), which performs mathematical and logical operations.
2. Control Unit, which directs the operation of the processor, telling it how to execute instructions.
3. Registers, small, fast storage locations that temporarily hold data and instructions.
4. Cache, a small amount of high-speed memory that stores frequently accessed data and instructions.
5. Clock, which synchronizes the operations of the processor's components.
The entire processor is housed within a package that connects to the motherboard of a computer.
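To make those pieces concrete, here is a toy fetch-decode-execute loop in C. The three-instruction "machine" is invented purely for illustration and does not correspond to any real instruction set, but it shows how the control unit decodes an instruction, the registers hold values, and the ALU does the arithmetic on each pass of the loop, one pass per clock tick.

```c
/* A toy fetch-decode-execute loop illustrating how registers, an ALU,
 * and a control unit cooperate. The three "instructions" are invented
 * for illustration only. */
#include <stdio.h>
#include <stdint.h>

enum { OP_LOAD = 0, OP_ADD = 1, OP_HALT = 2 };

int main(void) {
    uint8_t regs[4] = {0};                 /* register file            */
    /* program: each instruction is {opcode, dest, operand} */
    uint8_t program[][3] = {
        {OP_LOAD, 0, 5},                   /* r0 = 5                   */
        {OP_LOAD, 1, 7},                   /* r1 = 7                   */
        {OP_ADD,  0, 1},                   /* r0 = r0 + r1             */
        {OP_HALT, 0, 0},
    };

    int pc = 0;                            /* program counter          */
    for (;;) {                             /* each pass = one clock tick */
        uint8_t *ins = program[pc++];      /* fetch                    */
        switch (ins[0]) {                  /* decode (control unit)    */
        case OP_LOAD: regs[ins[1]] = ins[2];         break;
        case OP_ADD:  regs[ins[1]] += regs[ins[2]];  break;  /* execute in the ALU */
        case OP_HALT: printf("r0 = %d\n", regs[0]);  return 0;
        }
    }
}
```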
So the CPU itself is a computer?
In a sense, yes. The CPU is the brain of the computer.
It performs the computations and processing tasks that allow a computer to run programs and execute instructions.
However, it's not a complete computer on its own.
The CPU needs other components like memory (RAM), storage (hard drives or SSDs), input-output devices like a keyboard, mouse, and monitor, and a motherboard to connect everything together and provide power.
All these parts work in harmony to form a complete computer system.
I would think there would be a way to take old CPUs and kind of put them in some sort of a collection of something
where they could all interact with each other and become one better computer.
That's an interesting idea.
While old CPUs can't typically be combined directly to form a single more powerful CPU,
there are concepts in computing that involve using multiple processors together.
This is known as parallel computing or cluster computing, where multiple
processors or computers work on different parts of a problem simultaneously. A practical
implementation of this idea is a computer cluster, where several computers are
networked together to perform tasks as if they were a single system.
These clusters...
Isn't that essentially what an AI is?
That's a good analogy.
In a way, AI systems can be like that. AI, especially in the context of machine learning and neural networks, often involves numerous processors or cores working in parallel to process vast amounts of data.
This parallel processing allows AI systems to learn from data, make predictions, and perform tasks efficiently. So, while not exactly the same,
there's a conceptual similarity between AI processing
and the idea of combining multiple CPUs or computers to work on tasks together.
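As a small-scale illustration of that parallel idea, here is a hedged sketch in C using POSIX threads: two workers each sum half of an array and the partial results are combined, much as a cluster splits a problem across machines or a multi-core CPU splits it across cores. The array contents and the two-way split are arbitrary choices for the example.

```c
/* A minimal sketch of parallel processing: two POSIX threads each sum
 * half of an array, and the partial results are combined at the end.
 * Compile with -pthread. The data is dummy data for illustration. */
#include <pthread.h>
#include <stdio.h>

#define N 1000000
static long data[N];

struct slice { int start, end; long sum; };

static void *sum_slice(void *arg) {
    struct slice *s = arg;
    s->sum = 0;
    for (int i = s->start; i < s->end; i++)
        s->sum += data[i];
    return NULL;
}

int main(void) {
    for (int i = 0; i < N; i++) data[i] = 1;   /* fill with dummy data */

    pthread_t t1, t2;
    struct slice a = {0, N / 2, 0}, b = {N / 2, N, 0};

    /* Each thread can run on its own core (or its own CPU). */
    pthread_create(&t1, NULL, sum_slice, &a);
    pthread_create(&t2, NULL, sum_slice, &b);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("total = %ld\n", a.sum + b.sum);    /* prints 1000000 */
    return 0;
}
```

The same pattern scales to as many workers as there are cores, which is why both dual-CPU machines and today's multi-core chips speed up this kind of work.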
There were some computer systems that had dual CPUs in them.
Yes, there have been many computer systems with dual CPUs, especially in the realm
of servers and high-performance workstations. These systems use two separate CPU chips,
each with its own set of cores, to increase processing power and handle more tasks simultaneously.
Dual CPU configurations were common in the past for professional-grade systems used for tasks like video editing, 3D rendering, and scientific computing.
They provided a significant boost in performance by allowing multiple processes to run in parallel.
Nowadays with the advancement of multi-core processors, single CPUs with
many cores can also achieve high levels of parallel processing, reducing the need
for multiple physical CPUs in many scenarios. And as far as the GPU, when did that come around?
GPUs, or Graphics Processing Units, have been around since the late 1990s.
They were initially developed to accelerate the rendering of 3D graphics in computer games.
Before GPUs, CPUs handled all the graphics processing, which was inefficient
for complex visual tasks. The first widely recognized GPU was the NVIDIA GeForce 256,
released in 1999. It introduced hardware transform and lighting, which significantly improved 3D graphics performance.
Since then, GPUs have evolved into powerful processors capable of handling not just graphics, but also general-purpose computing tasks, making them essential for AI, machine learning, and scientific computing.
Great.
Thank you.
All right.
That was fun.
A little CPU and GPU history.
I'll talk to you guys later on.