Advent of Computing - Episode 92 - Copy Protection
Episode Date: October 2, 2022
It's Spook Month 2022! To kick things off we are diving into the frustrating depth of copy protection, piracy, and the origins of commercial software. In 1969 the Great Unbundling made the software market viable for the first time. Ever since then pirates and software vendors have been locked in a battle over bits. This episode traces the early days of copy protection, and how spite played an important role.

Selected Sources:
https://fadden.com/apple2/cassette-protect.html - In depth analysis of Apple II copy protection
https://www.princeton.edu/~rblee/ELE572Papers/Fall04Readings/CryptoProc_Best.pdf - The crypto-microprocessor
https://sci-hub.se/10.1109/85.988583 - A personal recollection of the Unbundling
Transcript
It's time for a traditional reading from the jargon file.
Quote,
Copy protection.
A class of methods for preventing incompetent pirates from stealing software
and legitimate customers from using it.
Considered silly.
End quote.
Now, I'm sure we all have our own stories about running afoul of copy protection.
Sometimes by accident, sometimes purposefully. Or perhaps,
and I say this clutching my pearls already, acts of piracy. I'd like to throw one of mine into the
mix, just to spice things up. Back in high school, my friends and I had particular tastes in music.
We all liked metal, and none of us had any money. Usually, we'd find
music by browsing used CDs at thrift stores, browsing the net, and passing around mp3 files.
This was back in the good old days when people were commonly ripping CDs. I personally built up
a pretty respectable collection of Creative Commons-licensed music. I prefer to
think that my tastes were underground, that's a better way to put it, but hey, whatever works,
you know? Somehow, one of my friends found the band Alestorm, a pirate-themed metal band. Soon,
files were propagated, and we all had a new favorite band for the week.
At the time, nothing seemed out of the ordinary.
This was our usual MO.
Years later, I got a new car, and a few years later, I got a new stereo, one of those cool ones that has a USB plug for loading up files.
Figured I'd throw on some of my old tunes to keep me comfy on my daily commute.
Inevitably, I found that old Alestorm folder,
and it was there on my boring daily commute,
listening to songs about crimes on the high sea,
that I was caught by a clumsy form of copy protection.
You see, at some point, Alestorm had put out some demo tracks for one of their upcoming albums.
The tracks were free to download, which explains why my friends would have been passing them around.
But there was a catch.
I guess they were trying to get people to buy the full album.
Weird, I know.
Each track had an audio watermark of sorts
Partway through, they had overlaid the vocalist in full piratical affectation, saying,
Arr, piracy is a crime.
It took me literal years to realize that that wasn't just part of the song,
but instead a plea for high school Sean to go buy a CD that I couldn't afford.
Was this especially effective? No, it really wasn't. It didn't stop some dumb high schoolers
from sharing files and listening to those songs for years. And I think that sums up my feelings
on most copy protection. It's an annoying and possibly damaging attempt at stopping piracy.
It's frustrating, but not super effective or truly dangerous.
It's the perfect opener for Spook Month here on Advent of Computing.
Welcome back to the show.
I'm your host, Sean Haas, and we are finally entering Spook Month 2022. In the spirit of Halloween, my favorite holiday, I try to set aside each October as a special time to examine the scary side of computing.
The thing is, computers aren't actually that scary.
So I tend to cover the parts of the digital experience that are frustrating, confounding,
or, you know, just a touch spooky, a little scary, but not truly frightening.
In past years, we've looked at computer viruses,
spooky text-based adventure games, spam email, and debugging. To kick things off this year,
we're looking at the history of copy protection and the corporate arms race that has been ruining
software right up to the present day. I think that's suitably spooky for us.
This actually presents a bit of
an interesting challenge. In preparing this episode, one of my big questions has been,
when was the first copy-protected software sold? That's a fine starting point because it actually
breaks down into a pile of pretty difficult-to-answer questions. When did software first become commercialized?
Have programs always been that way? When was it even legal to claim ownership over code? And,
perhaps my favorite here, who was the first software pirate?
The problem with approaching these questions is something that I run into pretty often here at Advent of Computing.
There's a lot of known unknowns on the show. It's relatively easy to answer technical questions.
Worst case, you just need to find an old reference manual and do some reading.
But most of my questions surrounding copy protection are a little different. These are
questions about people and their motives. They're all pretty
soft, you know, they don't have hard edges, they have kind of squishy membranes. This gets us into
the realm of folklore, speculation, and unconfirmed accounts. I think that's a fine place to be as
long as you can accept a little academic risk, so to speak. But hey, I'm not peer-reviewed, so no one can stop me.
The history of copy protection is hand-in-hand with the origins of open source, the commodification
of code, and the birth of the digital pirate. All things are centered here, so where do we even
start? Today, I'm going to take us on a wide-sweeping journey. We'll leave port and first stop to take a look at the origins of software distribution in general.
From there, we'll sail to the shores of antitrust lawsuits and state-mandated free software.
This will prepare us for the perilous waters of the first commercial code
and how the industry made its first heel turn against its own customers.
Maybe by the end of the journey we'll all be honest-to-goodness pirates ourselves, or
at least find where X really marks the spot.
But before we start, I have a quick disclaimer and announcement.
The disclaimer is, of course, piracy is a crime and crime doesn't pay, so don't actually become a software pirate because of this episode.
In this episode, we're going to be discussing some techniques used to commit acts of digital piracy.
But this is all purely educational.
Advent of Computing may be subversive media, but we're not criminals around here.
So, don't become a pirate.
I'm not encouraging actual piracy.
Now, the announcement is, as I've been saying on most episodes lately,
I'm still looking for submissions for Notes on Computer History,
my possibly revolutionary journal on the history of computing. Now,
I do say that with my tongue firmly in my cheek, but I do believe that Notes on Computer History is going to provide a very good place for authors of all experience levels to write about the
history of computing. Now, I've been talking to quite a few people about this, and we do have
our first four articles in the can right now.
We're looking for more, but anyway.
One of the common threads is that a lot of people I discuss Notes on Computer History with say they're on the fence about writing.
If that's you, then I'm speaking directly into your ears right now.
Get off the fence.
Anyone can submit.
Anyone who wants to write can.
Doesn't matter if you have experience.
Doesn't matter if you're a fantastic writer or a total newbie.
Just find something about computer history you're interested in or that you're familiar with,
write about it, and send in a draft.
You can find how to submit, actual email addresses, and so forth over on the project's website.
That's history.computer.
So don't wait around.
Submit today.
So with that out of the way, let's get into the actual episode.
It's a bit hacky, not in the computer sense, but here's a good question to start with.
What's the deal with open source software? Or maybe we should think about it as code that's free as in freedom, not free as in
beer. When we call something open source, it can have a few meanings. In the most simple sense,
this just means software that's distributed as source code. That's open source
in the sense that you can, you know, open a file and see the source code. We can also be talking
about software that doesn't have any specific license attached to it. That kind of code is
more specifically called public domain. In the modern context, open source usually refers to software that comes
with some type of open source license, like the GPL or MIT licenses. Those licenses put
different stipulations on the use and distribution of the code, but the core facet is always freedom
to read and modify the software alongside freedom to distribute it in certain specific ways.
Historically speaking, most software has fallen into the public domain category.
This was partly due to custom and partly due to necessity. The era of commercial computing starts
around the production of the first Univacs. These were some of the first computers that were sold.
It wasn't so much the introduction of money that reshaped things, although that was a factor.
The bigger change was the fact that, for the first time, there were computer users outside
research labs. Within a few years, there would be more users outside of research labs than inside.
But these early adopters weren't like the users of today.
Computers weren't exactly user-friendly.
So even folks outside research institutions were still pretty adept at handling machines.
You basically had to know how to program in order to use a computer.
And this is where we can pull in some fun detail from the timeline.
The first UNIVAC shipped in 1951.
Fortran, the first programming language to enter any actual use, hit version 1 in 1957.
It's also crucial to remember here that just because Fortran existed didn't mean that
high-level languages were actually in common use. Adoption across the industry took quite a while.
In this early wave, most users would have known either machine code or assembly language. That
was really the only feasible way to program a computer for years, maybe a decade.
In this era, the concept of proprietary or closed-source software just didn't exist.
It simply was not possible.
A program written in assembly language gets translated more or less directly into machine
code.
That's the native tongue of a computer.
Early users knew how to read that language. They knew how to decipher the dialect. Thus, it wasn't really possible to hide
away the workings of your program. Everything was, by the very nature of the industry, open source.
The introduction of compiled languages like Fortran planted the
seeds of a change. A compiler doesn't directly translate each line of code into machine code.
Instead, the compiler goes through some dark inner machinations to generate an efficient
final program. This means that a single line might translate into a pile of instructions.
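To see this for yourself, here's a small modern illustration, using Python's bytecode as a stand-in for compiled machine code (the episode doesn't tie this point to any one language, so this is just a convenient demonstration of the same idea):

```python
import dis

# One line of high-level code...
def average(a, b):
    return (a + b) / 2

# ...expands into a pile of lower-level instructions.
instructions = list(dis.Bytecode(average))
for ins in instructions:
    print(ins.opname)
```

Running that prints a handful of opcodes for a single source line, and none of them carry the variable names or intent a reader of the original code would see.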
The process isn't always clear, so you can't always run things in reverse. This makes it
hard to understand what a compiled program is actually doing by looking at the machine code
alone. This isn't a problem so much as a change. It meant that, for instance, there was now a separation between the code a
programmer wrote and the actual program the computer executed. Machine code and actual
readable high-level language code could now be distributed separately. There is now a meaning
to that separation. This will be a factor in how the software market eventually becomes possible,
but for the time
being, this didn't lead directly to the commercialization of software. This is something
in the backdrop. As far as software distribution went, the adoption of high-level languages
didn't change all that much. The earlier era had instilled a sense of openness about code.
Plus, there were simple practical matters at play.
IBM's initial business model was to not charge for software. This was called bundling. You'd get a computer plus whatever software and services you needed to run it. IBM would throw
in software for free. So even the revolutionary Fortran compiler was free software
back in the day. The other reality is that most software was distributed as source code. I'm not
sure if Fortran itself was, but the usual practice was to pass around code instead of compiled
programs. This was really the most convenient way to deal with software.
The computer market was full of a diverse cast of machines. Compatibility wasn't really a thing,
even within a vendor's own catalog. Passing around binaries wouldn't provide much coverage for
that wide cast, so it was more common to hand off code since source code could be modified or compiled
to fit more machines. But let me be 100% clear, we aren't talking about third-party vendors here.
The software market still doesn't exist. Instead, a lot of software was provided by
independent user groups or by first parties. One of the earliest
of these independent groups was Share. Started in 1955, Share was a software library and user group
that specialized in IBM hardware. It was an organization that helped IBM users share software
that they had developed. It also facilitated larger collaboration on more complex
projects. The other factor that made Share specifically deal in source code came down
to the community. As I mentioned, Share was initially founded as a way for multiple organizations
to collaborate on large programming projects. Source code had to be shared in order to make that possible.
So we end up with a good number of reasons why source code was the primary mode for software
distribution.
This was the status quo for decades, basically from the inception of commercial computers
up to 1969.
We actually have an exact date for the end of this era because
we have a big defining event. You might call it the Great Unbundling, or the day IBM created the
software market. If you take a broad view of things, then IBM's biggest rival hasn't been
a computer company, per se. Big Blue has always been
fighting with the US Department of Justice. You see, back in the day, IBM wasn't just the
biggest computer company in the market. IBM kind of was the computer market. They also,
prior to that, had been the tabulating and punch card market. This was partly thanks to
timing, and partly thanks to some aggressive choices made in how IBM sold their product.
It was those aggressive choices that would lead to multiple antitrust cases.
Today we're going to focus on the 69 antitrust case.
This all started with a series of lawsuits brought by IBM's smaller competitors.
The main allegation was that Big Blue's practice of bundling software and support with hardware was anti-competitive.
The logic used here is kind of interesting, so let me give an example that's cited in A Personal Recollection,
IBM's Unbundling of Software and Services, by Burton Grad, an ex-IBMer. To quote, In April 1969, IBM was sued by Applied Data Research, ADR, in regard to ADR's Autoflow
program. ADR claimed that IBM was giving away a somewhat comparable flowchart
program, thereby preventing ADR from realizing reasonable economic benefits from its Autoflow
investment. End quote. The bundling policy meant that, in practice, IBM was giving away free
software to their customers. This had started small with
things like operating systems and compilers and assemblers. But over the years, IBM had expanded
their in-house catalog considerably. Customers would often contract out to IBM for custom
software. In certain contracts, IBM would stipulate that resultant programs could be used by other IBM customers.
Somewhere along the line, a flowchart program was added, hence ADR's complaint.
It might sound like something of an annoying lawsuit, but think about this for a second.
With IBM providing most of the computers in the world and freely providing any software customers wanted, why would you buy any software at all?
For that matter, why would anyone choose non-IBM hardware?
Bundling, while a practical thing at first, had become a way for Big Blue to keep an iron grip on the computing
market. There was no practical way to compete unless you were on the same level as IBM. And
no one was as big as IBM. A number of similar lawsuits were eventually bundled up and expanded
upon by the DOJ. IBM was skirting up against antitrust laws.
The primary law in question stems from the Sherman Antitrust Act. This act allows the
US government to bust trusts and monopolies. In other words, to prevent companies from conspiring
and to prevent companies from unilaterally controlling markets. Something to note is that this law isn't
necessarily in place for the good of smaller companies. That's a small distinction here.
The Sherman Antitrust Act is instead concerned with protecting consumers. The idea being that
trusts and monopolies can be used to hurt consumers in the name of profit.
Hence, there's an extra layer to the legal arguments that companies like ADR had to make against IBM.
But, you see, Big Blue has always been a bit of a canny operator.
The suits deep in the company's bowels could smell feds on the wind, a skill that I know I wish I could acquire.
IBM had started to plan the Great Unbundling in 1968 in an attempt to get out ahead of the DOJ. This wasn't to try to break up
the monopoly, just to maybe abate the lawsuit. A task force was assembled to determine what a
software market would even look like. It was decided that certain
core software, like operating systems in some programming languages, would remain bundled.
However, some services and code would be spun off. The unbundling article I mentioned gives
a list of new policies, and here are the two that matter most for us.
Quote,
Custom programming was initially to be priced on a cost-plus basis, with the future option to bid fixed-priced contracts, and
17 language, utility, and application software products were announced on a monthly lease
pricing basis, which included telephone support, error correction, and some future enhancements.
End quote.
Just like that, the good times of free IBM software were over.
Or maybe you'd call them the bad times, depending on who exactly you were.
Something to note here is that software wasn't being sold on a one-time cost basis.
Instead, IBM was leasing code on a per-month cost.
I'd say we aren't used to that model, but recently we've been slipping back into that realm.
All I can say here is, if you're a software developer thinking of selling a program,
please don't do this.
I don't care about the arguments around the topic. It's just evil to make someone pay
a subscription fee for software they run on their own computer if they don't want support.
Just stop it. The reason that IBM went with this specific leasing model, as near as I
can tell, is because that's just how they
sold hardware.
You couldn't buy an IBM mainframe, you leased it.
Back in the day, the only line item on your bill would have been for hardware, but after
1969 that changed.
Customers were now on the hook for software fees and support fees as well. The IBM catalog started with just 15 programs,
which ranged in price. On the low end, $25 a month got you the rigid-frame
selection program. Or you could splurge and spend $1,500 a month on the Generalized Information System. That's quite a range. Now, in order to be fair,
IBM did reduce hardware fees, so customers weren't just suddenly being billed more without some
compensation. But the unbundling didn't really go over that well with Big Blue's clients.
Once again, from Grad,
quote,
Customer's reaction to the IBM announcement was decidedly negative,
with many customers arguing that the hardware price reductions in no way compensated for the projected additional costs of separate charges for all various unbundled services.
End quote.
The unbundling had a largely negative impact on the overall computer industry. Grad explains that tucked in amongst the services
was an interesting item. Customer education. This included free training for programmers.
That was no longer free to clients, so most didn't bother.
According to Grad, this led to a shortage of trained programmers. For IBM, the unbundling hurt their bottom line. They had decided to slash all hardware lease fees by 3%.
That applied to customers new and old, so IBM immediately took a hit,
followed by the sales complications caused by the unbundling. Maybe this was a bad move on IBM's
part, or maybe these issues were only short-term setbacks. If we're looking for the first software
pirates, I think 1969 is the year we should be investigating.
This would be the first time that someone had the chance to steal commercial software.
There were some commercial products before that, but with the unbundling, IBM was opening
the floodgates.
Commercial software was now actually viable.
So do we see thievery occurring? This actually turns out to be a really
hard question to answer. In June of 1969, IBM announced the unbundling to clients in a series
of letters. Tucked in one, we get a stipulation that, I think, may help us narrow down our search.
Each program product customer signs a single license agreement.
Under its terms, as many program product licenses as desired may be obtained.
Each license authorizes the customer to use the program on a CPU at a mutually designated customer location.
So, my working theory here is that the first pirates were probably criminals on technicality.
We're looking for someone who was an IBM customer who decided to run a new program product on a CPU that didn't have an official license in place.
One other interesting avenue to go down is listed in a frequently asked questions section of one of these letters.
There are two questions that I find particularly interesting here. Quote,
Question 17.
Can a user of an IBM system sell machine time and include the use of his licensed program product?
Yes.
However, no copies of the program product are to be removed from the designated location
and restrictions on disclosure must be observed.
Question 18.
When can a program product be transmitted over a communication line? Transmission of a program product to another location is permitted only when the CPU at the receiving location is being used
as an emergency backup system to the licensed CPU system. End quote. Very interesting indeed. Here,
IBM is adding stipulations on how software can be handled, how an end user can share and use
a program. So, we have another way that clients could break contract. Our hypothetical pirate
may be sending a program down a phone line to a computer that
didn't have proper backup licensing. Or perhaps they were a user renting time on a mainframe that
just decided to copy a program. They wouldn't have been copying a floppy, but maybe copying a tape.
Once again, this gives us a lead, but only in a very oblique way. One thing to note is that,
at least at this point, IBM wasn't using anything like copy protection to prevent
a license violation, at least not in practice. The same letter I quoted from earlier makes it
clear that IBM believed legal action and the copyright system would be sufficient to prevent the theft
of software. We aren't dealing with policy built into a program, but rather legal agreements not
to break the rules. So once again, the big question is, was IBM's contract violated?
Were users using software in such a way that it broke contracts? I think the
answer is definitely yes, but we can't really track down the finer details. Think about it this way.
Why would anyone record and then disclose an instance of a user copying a program from one
computer to another? The computer might be in
the same room, or even connected to each other over a short cable. The violations of IBM's
agreements are so mundane as to be everyday events. When I went looking, it was hard to
find specific instances of single users illegally copying software. However, I do have
their shadow. The term pirate actually breaks into the digital lexicon pretty early. A March 1969
issue of Computer World uses the term to describe someone running software on an unlicensed machine.
That conforms well to IBM guidelines. That said, I'm pretty certain this isn't the first time the term was used in relation
to computers. The author doesn't make a point of coining the term, they just drop it in as a
shorthand, so I'd wager that pirates are at least as old as 1969.
This also fits the timeline pretty nicely.
The unbundling was one event that contributed to software theft being possible, but it wasn't
an entirely clean break.
There were third parties writing proprietary software around this time, they just weren't
that successful. So there was
the possibility of stealing software, it's just that there wasn't much software to steal yet.
I don't have rock-solid evidence, like I said, this is a speculation-heavy episode,
but I'm willing to wager that piracy, at least digital piracy as we know it, was born in 1969. There is, however, a slight complication
here. The P word seems to have initially held a slightly different meaning, at least in relation
to software. I've kind of gotten into the weeds on this because there seems to be a profound lack of research on the early history of software piracy.
Like, I'm at combing through newspaper levels here.
What I found are a scant number of mentions of piracy prior to 1969,
but these stories fit a slightly different mold.
The best example I found is from a piece in the LA Times from August
1968. Listen to this. As near as I can tell, this is the earliest printed reference to piracy
involving a computer. Quote, the British government has been asked to set up an office that would register computer programs
in much the same way that music and plays are copyrighted now.
Opponents argue that computer programs are mathematics,
and say you can't copyright or patent mathematics.
The aim is to protect programs, which are the script used by computers to solve problems and store information from piracy.
End quote.
Here we have a government looking to codify legal protections for software.
And it's just before the unbundling.
The article goes on to explain the reasoning behind this change in policy.
It wasn't from users misusing software or breaking contracts, oh no.
It came down to corporate espionage.
The specific case has to do with the theft of a program written at the British Overseas Aircraft Corp.
This theft was perpetrated by two former employees. These conspirators
physically stole a set of magnetic tape that contained source code. That source code was then
given to a competitor company. While interesting, this isn't really the software piracy that we all
know and love. It's something altogether different. It does have to do with the
theft of software, but on a different scale. I think this actually bolsters my claim that the
modern software pirate is born in 1969. We're seeing a bit of flux in what the term means that's
indicative of the newness of the act itself. We've established when piracy starts,
roughly speaking. We've also established the initial countermeasures. Corporations were
relying on the legal system to have their back. However, that was a pretty blunt instrument,
and it turns out not one that worked very well. Corporations were using legal action to deter both the
corporate-espionage type of piracy and the client kind of piracy. Maybe call them grand piracy and
petty piracy, respectively. As discussed, these kinds of piracy aren't really the same. Plus,
legal proceedings aren't known for their speed or
efficiency. What's IBM to do if it discovers that hundreds of clients have been committing
acts of petty piracy? File 100 lawsuits to recoup, what, thousands of dollars in damages?
It just doesn't make financial sense. A better method of control was needed.
This, dear listener, brings us up to copy protection.
That is, systems dedicated to the prevention of petty piracy.
Once again, this is a place where I'm going to try and push the conventional timeline back a bit.
When I was preparing for this episode, I kept running
into articles that claimed copy protection started to be implemented in the home computing era.
I believe that the birth of copy protection is, by necessity, intertwined with the birth of piracy.
That means it predates home computing considerably. The earliest example of copy protection I can find
comes from, surprise surprise, Big Blue themselves. In 1968, IBM filed a patent for what they called a
quote, program security device. This was a physical addition that could be added to a new computer,
preventing the execution of a program on an unauthorized machine.
In other words, IBM was implementing program product contracts in hardware.
But to be fair, this is a patent, so it wasn't necessarily ever in actual use.
This was all accomplished using a secret code. The patent
is kind of obtuse about this, but I think it would have been something like a license code or a
client's serial number. The core of this patent is a device IBM called the code generator. This was
the new hardware that made protection possible. It worked like this.
At various points in execution, a program would issue a challenge to the code generator.
The program would supply a number to the generator, which would then be compared to some
secret value stored inside the generator. If everything lined up, then the generator would respond that you were
good to proceed. Otherwise, the program would know it was being run illegally and take countermeasures.
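The patent doesn't spell out the secret transformation in modern terms, so here's a minimal hypothetical sketch of the challenge-and-response idea in Python. The HMAC stands in for whatever hidden function the real hardware would have computed, and every name here is mine, not IBM's:

```python
import hmac
import hashlib

class CodeGenerator:
    """Hypothetical stand-in for IBM's hardware 'code generator'."""

    def __init__(self, secret: bytes):
        self._secret = secret  # burned into the hardware, never read directly

    def respond(self, challenge: bytes) -> bytes:
        # Combine the program's challenge with the hidden secret.
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

def program_check(generator: CodeGenerator, challenge: bytes,
                  expected: bytes) -> bool:
    # At various points, the program issues a challenge and compares
    # the response against a value baked into this licensed copy.
    return hmac.compare_digest(generator.respond(challenge), expected)

# The licensed machine's generator produces the expected response...
licensed = CodeGenerator(b"customer-1234-secret")
expected = licensed.respond(b"checkpoint-1")
print(program_check(licensed, b"checkpoint-1", expected))   # True

# ...while a copy run on an unauthorized machine fails the check.
pirate = CodeGenerator(b"different-machine-secret")
print(program_check(pirate, b"checkpoint-1", expected))     # False
```

On a failed check, the real program would then "take countermeasures," whatever those were.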
What I find interesting about this approach is that we can see IBM trying to work out a sneaky
way to handle copy protection. You don't want some instruction that gives the computer's serial number and then an actual program check against that serial number. That turns out to be very easy to circumvent.
Instead, you have this hardware-based check on a suspected number. Sure, this isn't the most
secure solution, but it does show some forethought.
You could build an attack to either get the number a program is using or brute force the value.
Those are some major weaknesses, but I gotta hand it to IBM. Their head was in the right place, even at this early date.
A similar approach was being spearheaded by another company I've already mentioned this episode,
ADR, or Applied Data Research. ADR is a pretty neat case. They are a rare example of a software
company that existed prior to the unbundling. For their part, a suit from ADR against IBM
was one of the cases that helped the unbundling occur. Applied data started down the
copy protection path with a patent. Actually, the first software patent. As we've seen, this was a
controversial idea. The legality of patenting a program was muddy at best. That kind of intellectual property right wasn't explicitly protected under U.S. law
at the time. Partly because, yes, programs were synonymous with math at this point,
and partly because there was just no legal precedent. Computers were too new for much
case law to exist. Nonetheless, ADR filed a patent in 1965 for a sorting algorithm. The patent's contents are
kind of unremarkable. It's basically just flowcharts and data structures plus a description
of the two. What makes this interesting is that ADR was the first to leverage the patent system to protect their software.
The patent was finally issued in 1968, which puts us right in the center of the piracy timeline.
ADR's step towards active protection is, well, a little under-documented.
I've only been able to find a mention of the technology in a Computer World article,
but this is the earliest conversation about software copy protection I've yet seen.
So, despite the lack of detail, I think this is an important step for us to examine.
This is, of course, the March 1969 article I mentioned earlier.
I can now unveil the article's spicy title, Hardware System Proposed to Prevent
Software Thefts. The article describes a proposal from ADR to create a physical system to enforce
software licensing contracts. This form of copy protection was implemented using a combination
of software and hardware. The idea was that a CPU
would have to be built in a way to report a serial number. That's just some small register for
tracking a unique identifier. Then software would be internally registered with that serial number.
The program would, at various points in its operation, ensure that it was running on a
CPU that reported the proper number.
This article recommends that the magic number would be stored in some sort of register,
but there aren't a lot of details.
What's neat is that even at this early juncture, even before there is any implementation, people
are poking holes in the idea of copy protection.
To quote,
How could the contents of the register be protected from a would-be pirate?
Software dumps are easy to get.
ADR Vice President Martin A. Goetz suggests that the identifier appear in a number of places scattered throughout the program.
End quote.
In other words, there were already concerns that copy protection could be broken, that
software could be cracked.
We even see this little back and forth of, hey, couldn't that be easily circumvented,
followed by the classic, no way, I'm going to use lots of numbers and lots of checks.
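To make that back-and-forth concrete, here's a minimal sketch, in Python, of what ADR's proposed scheme amounts to. Everything here is invented for illustration: the serial value, the register read, and the sort routine standing in for a protected program.

```python
# Sketch of ADR's proposed scheme: software "registered" to one machine's
# serial number, with the check repeated at several points so a pirate
# can't just patch out a single comparison. All values here are invented.

LICENSED_SERIAL = 0x1F2E3D  # baked into this copy of the program at sale time

def read_cpu_serial():
    # Stand-in for the hardware register ADR wanted CPUs to expose.
    return 0x1F2E3D

def check_license():
    if read_cpu_serial() != LICENSED_SERIAL:
        raise SystemExit("not licensed for this machine")

def sort_records(records):
    check_license()          # check #1: on entry
    result = sorted(records)
    check_license()          # check #2: scattered mid-operation, per Goetz
    return result

print(sort_records([3, 1, 2]))  # prints [1, 2, 3] on the licensed machine
```

The weakness the Computer World article pokes at is visible even here: every `check_license()` call site is plainly there in the binary, waiting to be patched out, which is why Goetz wanted many of them.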
It's also important to note that this type of protection jibes pretty well with IBM's earlier patent,
it's just more simplistic and, frankly, not as good.
While we may not have had actual implementations of protection at this point,
it was at least on manufacturers' minds. One other short trail I do want to mention is Dylakor. That's D-Y-L-A-K-O-R. I really like that name for a software house.
In the early 70s, Dylakor started offering an innovative sales model to customers. The company sold programs for as little as $1 a day,
with the option to terminate a contract on 30-day notice.
Radical stuff, I know.
For just $5 more than IBM's famous Rigid Frame Selection program,
you could run a copy of DYL-250.
As near as I can tell, this was a series of programming aids. In a 1971 Computer World article, it's claimed that Dylakor had also developed a new means of copy protection. Quote, the $1 a day software supplied on tape to its users by mail presents the possibility of unauthorized
duplication by users. But apparently the firm is not worried about such unauthorized copying.
Quote, for every man who would do that, 99 wouldn't, end quote, said Dylakor president
Jim Case. Furthermore, he believes that there is no way today of adequately protecting software.
To limit the effects of unauthorized copying of the program, however, DYL-250 does, quote,
self-destruct after a number of uses. A legitimate user can get another copy from Dylakor, but the pirate is stuck. End quote. That's it.
That's all the detail we get on this new and exciting way to stop pirates dead in their tracks.
Now, I have my own speculation here.
Dylakor was a mail-order kind of operation.
That much is made clear through a series of articles in contemporary computer
magazines. They also had this cool software distribution tech. They figured out how to make
these flexible tape holders so you could get 100 feet of tape into a tiny plastic tube. This made
software much cheaper to send in the mail. So here's my pet theory. Just like the article claims, Dylakor had some kind of
self-destruct timer in their software. It sounds like it was hard-coded to track the number of
times it had been executed, and stopped running if a certain threshold was reached. How would this
be tracked? My guess is self-modifying code. Back in the day, self-modification was pretty common practice,
partly due to the limitation of old computers.
This would also make it hard, at least initially,
to track down where the run counter was stored.
There's not some file sitting around with a single number in it.
The number is tucked into the program itself and constantly changing.
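Here's a toy sketch of what a self-destruct counter like that could look like. To be clear, this matches the speculation above, not any documented Dylakor internals: the "program" is just a byte array with the run counter hidden at a known offset, standing in for self-modifying code on tape.

```python
# Speculative sketch of a "self-destruct after N runs" scheme: the run
# counter lives inside the program image itself, so there's no separate
# counter file for a user to find and reset. All details are invented.

RUN_LIMIT = 30
COUNTER_OFFSET = 0  # where the counter byte hides inside the program image

def run(program_image: bytearray) -> int:
    count = program_image[COUNTER_OFFSET]
    if count >= RUN_LIMIT:
        raise SystemExit("program has expired -- order a fresh tape")
    program_image[COUNTER_OFFSET] = count + 1  # modify the image in place
    # ... the actual program logic would execute here ...
    return RUN_LIMIT - program_image[COUNTER_OFFSET]  # runs remaining

image = bytearray([0, 0xA9, 0x00, 0x60])  # counter byte + fake machine code
print(run(image))  # 29 runs remaining after the first execution
```

Note that a pirated copy of `image` carries its used-up counter along with it, which is exactly why a copied tape would hit the run cap just like the original.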
I'm guessing that Dylakor was sending out new tapes to its users on a monthly basis, hence the 30-day cancellation policy. You'd pay for a month of use and receive a tape that would be
good for some number of runs. If you kept using the tape for too long, you'd hit the run cap,
and then you'd have to just order a new tape.
If someone copied your tape, well, they'd also hit the run limit, eventually.
The pirate in this case would be forced to pay for a new tape or somehow get more creative.
A key problem for the software industry, or at least what they saw as a problem, was the fact that laws were moving slower than innovation.
Back during the unbundling, IBM had claimed that their new program products would be protected
by patent and copyright law. Well, turns out they were kind of wrong about that.
These legal frameworks offered some protection from grand piracy. That's theft from other companies,
but almost no protection from consumer-level acts of petty piracy. The first lawsuits and
arrests over piracy wouldn't happen until the 1990s, so there's a little lag time here to say
the least. What's kind of funny is that folk in the computer industry reached a level of near mania about piracy.
I've read a number of articles and papers from the 1970s and 80s that claim piracy is
on the verge of destroying the entire computer industry.
Their logic was that developing software was expensive.
The only way to recoup those costs was to get customers to pay for that software.
Piracy removed the second part of the equation. If you could pirate software, you had no reason
to pay for it. Thus, software development firms lost money, and thus it was no longer financially
viable to develop software. There was no reason to stay in the market. Pirates are destroying the entire industry.
Now, there are some huge flaws in this argument. There's a certain short-sightedness present that
I can't help but point out. When suits are scared of piracy destroying the software industry,
they really mean that piracy could harm their specific business model.
They don't care about the industry, they care about them.
Piracy can, in some cases, harm the bottom line for commercial software houses.
But there are other modes for software production.
Take open source, for instance.
Some of the most powerful and complex programs ever written are open source.
People don't pay for Linux, for instance, but it's used everywhere and it's a thriving environment.
There's still incentive to develop that operating system.
There are other revenue streams possible and other reasons to write software outside of pure profit for a commercial entity. There's also the fact that
the computer industry chugged along fine prior to commercialized software. The entire creation of
the software market was artificial to begin with. It was a preemptive attempt by IBM
to avoid the long arm of the DOJ. Then software companies complain when the legal system doesn't
give them more help to make their very specific business plan viable. It's all a little pathetic,
I think. Someone getting a free copy of Microsoft Word isn't going to tank the market.
It isn't going to destroy the incentive to innovate on computers.
Someone stealing the code for Word and selling it to another company, that would have a bigger impact.
Petty acts of piracy, even if a lot of people are engaged in them, aren't the death blow that suits frame them
as. It's grander acts of corporate espionage that have always been the actual problem. And those
have always been illegal. I think the height of this mania is best expressed by a patent filed
in 1977. This is just eight years after the software market
emerges, less than a decade after corporate software business models become viable. The
patent, submitted by Robert Best, and yes, that's his real name, is for a, quote,
crypto-microprocessor for executing enciphered programs. From the abstract, quote, A microprocessor for executing computer programs which are stored in cipher to prevent software piracy. Such a crypto-microprocessor deciphers the enciphered program piecemeal as it executes it, so that a large enciphered program can be securely executed without disclosing the deciphered program or associated data. End quote.
Now, last episode we were discussing Jay Forrester's 3D memory, and I made a comment that, at first, his premise seemed shallow.
His logic was that he had seen a jump from 1D delay line memory to 2D storage tubes.
Thus, he figured three dimensions would be even better. It's the kind of logic that might seem
shallow at first, but on examination, there are actually huge gains to be had. Forrester was drawing from his experience to craft a good
solution and an enduring solution. The Best patent also seems shallow at first. Don't like piracy? It's destroying the market? Then any measure is justified, so let's make an entire system
dedicated to the eradication of the piratical way of life.
And that's it. The idea doesn't have any hidden depth to it. It's just spite. Luckily,
Best published a paper in IEEE a few years after his patent application. So between that and the
patent, we can build an interesting view of this spiteful attempt at software protection.
Here's the general overview of Best's crypto chip.
The microprocessor would contain a dedicated circuit for decoding incoming data.
When the processor went to fetch instructions or data,
that information would first be passed through this decoder before the processor proper
set to work. A decryption key was to be stored on the chip as well. Crucially, this would all happen
on the same die as the CPU. This way, decoded data would never exit that silicon wafer itself. Everything's internal. That, on its own,
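The fetch-then-decipher loop can be modeled in a few lines. This is a toy sketch, with XOR standing in for whatever real cipher Best had in mind, and a made-up two-instruction ISA; the point is only the shape of the design: the key stays inside the "chip" object, and every word is decrypted on fetch.

```python
# A toy model of Best's crypto-microprocessor: every instruction fetch
# passes through an on-die decoder before execution, and the key never
# leaves the chip. The cipher (XOR) and instruction set are invented.

class CryptoCPU:
    def __init__(self, key: int):
        self._key = key          # stored on-die; never exposed off-chip
        self.acc = 0             # accumulator

    def execute(self, enciphered: list) -> int:
        for word in enciphered:
            opcode, operand = divmod(word ^ self._key, 256)  # decrypt on fetch
            if opcode == 1:      # LOAD immediate
                self.acc = operand
            elif opcode == 2:    # ADD immediate
                self.acc += operand
        return self.acc

KEY = 0xBEEF
# The vendor pre-encrypts the program for this one chip's unique key:
plain = [(1 << 8) | 40, (2 << 8) | 2]     # LOAD 40; ADD 2
program = [w ^ KEY for w in plain]

cpu = CryptoCPU(KEY)
print(cpu.execute(program))  # 42
```

Even this toy makes the distribution problem visible: `program` only runs on a chip holding `KEY`, so every customer needs a binary enciphered for their specific processor.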
is already frustrating. Best makes a point to explain that a rapid encryption scheme will have to be selected, but it doesn't matter how fast you can bash those bits. A crypto microprocessor
will always be slower than a normal processor.
Every single instruction requires an extra step.
I don't care how fast that is.
You're adding steps.
You're making things slower.
You can't just hand wave that away.
You can't just say, oh, well, the decrypt process will be so quick.
I don't care.
If you add this decryption circuit to any normal processor, the result is a worse product.
Best is proposing to hinder performance in exchange for security against petty piracy,
which we've already established isn't going to destroy the market.
Ah, dear listener, but it gets worse.
The general theme here is that piracy will destroy the drive
for software development. It'll crash the market and destroy the entire computer industry.
So any countermeasures are justified. To use a crypto processor, companies must distribute their
code pre-encrypted. That code must be encrypted such that it can be read
by the special processor. And you don't want to repeat keys. That defeats the purpose. That's not
cryptographically secure. So each processor has to have a unique decryption key. The result here
is that each binary program is unique to the processor it's intended to run on, the physical processor.
Let's think about how absurd that is, how that ruins any kind of software distribution setup, how that would in fact crash the industry. A client has to order direct from the OEM, from the vendor, or from a company with a very close relationship to the manufacturer. The client does not know the decryption key,
so anyone they order from has to have some list of which customers have which keys on which
processors. There has to be centralized inventory tracking
in order to send out the proper software.
In practice, I don't think a consumer would put up with this.
And what happens if a manufacturer goes under?
If they stop supporting your chip or upgrade to a new encryption algorithm?
Well, as the customer, you get to go pound sand.
You are no longer able to get new software, not even updates.
Your computer is useless.
Thus, the crypto microprocessor is inherently disposable.
That's not really a good look.
There's a final feature that makes Best's chip all the worse.
It has a kill switch.
Best explains in his IEEE paper that pirates, well, they're a crafty bunch.
Left to their own devices, they will find a way to break any protection scheme.
One method that Best speculates about is reverse engineering of the cryptography. To quote,
Pirates will also try to trick the processor into disclosing its instructions by altering bits in the enciphered program.
This can be thwarted by including one or more self-disabling opcodes in the instruction set,
which erases the keys if they are executed.
End quote.
The plan here is to add random sprinklings of opcodes that will erase the decryption key, thus reducing the chip to inert
silicon. There are two reasons to implement this kind of kill switch. First, it disincentivizes
hacking. If a prospective pirate knows there's a possibility of breaking their computer,
then they'll tread lightly, or not tread at all.
Second, it serves as a time trap.
By using strong encryption methods and deterrents such as deadly instructions,
a pirate could sink hours, days, or even months into a fruitless effort.
Spite is on full display here.
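The kill switch is easy to sketch as well. Again, this is an invented illustration, not Best's actual circuit: a trap opcode that, if it ever decodes out of a tampered program, wipes the on-die key for good.

```python
# A sketch of a "self-disabling opcode": if a tampered (bit-flipped)
# program ever decodes to the trap instruction, the chip erases its own
# key, turning itself into inert silicon. Opcode values are invented.

TRAP_OPCODE = 0xFF

class SelfDisablingCPU:
    def __init__(self, key: int):
        self._key = key

    def execute(self, enciphered: list):
        for word in enciphered:
            if self._key is None:
                raise RuntimeError("key erased; chip is dead")
            opcode = (word ^ self._key) >> 8
            if opcode == TRAP_OPCODE:
                self._key = None     # erase the key: a permanent kill
                raise RuntimeError("tamper trap sprung; key erased")
            # ... normal instruction dispatch would go here ...

KEY = 0x5A5A
cpu = SelfDisablingCPU(KEY)
tampered = [(TRAP_OPCODE << 8) ^ KEY]    # a flipped bit lands on the trap
try:
    cpu.execute(tampered)
except RuntimeError as err:
    print(err)  # tamper trap sprung; key erased
```

Notice there is no way to distinguish a pirate flipping bits from an honest customer who loaded the wrong tape: either one can spring the trap and kill the chip.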
But this also inadvertently targets honest customers.
What happens if someone executes the wrong program?
Or if they're accidentally shipped a program that's encrypted for a different processor?
Maybe the inventory management system messed up and the customer ended up with a program encrypted for a totally different processor. Or maybe they tried
to execute a text file. End users do weird things that you would never suspect. With Best's chip,
any of these simple problems can result in the destruction of a piece of hardware.
And it's good for us to remember that, at this point in time,
piracy isn't a crime.
There's no law against software piracy.
Why would you do this for any other reason besides spite or some bizarre monomania?
To close out this section, let's highlight something.
The solutions used in this early era were decidedly high-octane. We're talking about dedicated hardware solutions or very specific forms of software distribution. These solutions
were expensive, complicated, and only worked because
of the current makeup of the computer market. Relationships between vendors and customers
were a lot closer in the mainframe and mini-market. If you bought from IBM, you had a rep you worked
with, for instance. Companies had tighter controls on, well, everything. This changed with the shift to
microcomputers. Suddenly, some Joe Nobody could go out and buy a machine. They could buy software
off the shelf, off the physical shelf. There wasn't the same kind of company-client relationship.
In most cases, that connection ended once you got your receipt and then probably
threw it out. That change in dynamic meant a change in piracy and a change in copy protection.
The first case of microcomputer piracy was probably when Microsoft Basic was stolen from a
truck in the summer of 1975. Once again, this comes with the caveat that piracy was probably occurring
just as a matter of course at all times and all places.
Perhaps it's better to say that this was the first high-profile instance
of software piracy in the home.
Plus, it's just one of my favorite stories.
I did a bonus episode forever ago that covered this debacle if you want a more in-depth story.
It's episode 17.5, aka Bill's Problem with Piracy.
Here's the relevant detail.
In 1975, MITS was doing a tour circuit with this truck called the MITS Mobile.
They would show up to schools and clubs with a handful of Altair 8800s and run demonstrations, make sales, and generally spread
the good word. The big product at the time was Altair Basic, also known as Microsoft Basic,
aka the first product ever sold by Microsoft. At least it was kinda sold by
Microsoft. There was a contractual agreement here. MITS actually sold Basic to customers,
and Microsoft received a royalty on each tape sold. In June, the MITS mobile hit Silicon Valley.
One of the stops was the Homebrew Computer Club, a den of crime and villainy that was at the center of the microcomputing scene.
During the stop, someone stole a copy of BASIC.
I don't mean steal in the sense that all piracy is technically theft.
I mean someone physically picked up a paper tape that contained Altair Basic, put it in their pocket, and walked
off with it. Within the week, copies were being run off. Soon, Basic on the Altair was basically
free. Sales of the software actually slowed down, presumably due to the prevalence of bootleg tapes.
Good ol' Billy Gates was enraged by this turn of events.
Remember, the root of copy protection comes back to this mix of a feeling of mania and powerlessness.
Microsoft didn't have a way to actually retaliate,
so Gates wrote an angry letter, and I'm not even joking.
The man wrote this indignant letter that he called the Open Letter to Hobbyists.
To quote from one of my favorite passages,
As the majority of hobbyists must be aware, most of you steal your software.
Hardware must be paid for, but software is something to share.
Who cares if the people who worked on it get paid? End quote. Why so mad? Well, as we've covered, there wasn't a legal mechanism in place
to deal with petty piracy. Even if there was, Gates couldn't really do anything about it.
The original pirate was unknown at the time and will remain a mystery.
So, what, would Gates sue every Altair user who didn't have a receipt for BASIC? I think that
would ruin the computer market. That would actually destroy innovation. From the beginning, the very first years, piracy was a feature of home
computing. And as we know, that meant that copy protection was bound to follow. Some of the
earliest attempts were adaptations of larger-scale copy protection schemes. I present, for your
consideration, WordCraft and its strange dongle.
WordCraft was a text editor for 8-bit microcomputers.
It was released in 1978 for the Commodore PET.
In practice, that meant it could run on a number of broadly similar Commodore machines.
I only point this out because some publicity photos show WordCraft loaded onto a CBM-2. The photo in question has been reproduced in an article written by Mike Lake, one of the creators of WordCraft. That
article serves as the backbone for this section. According to Lake, a copy of WordCraft sold for £425 on release. Running that through some calculations, we get $2,200 in modern-day
money. That presents a bit of a problem, one that a lot of early software had. The market
didn't really, well, make sense yet. A Commodore PET would cost you in the neighborhood of $700.
Lake points out that this meant a copy of WordCraft would be about half as expensive as the computer it ran on.
To Lake, that was a justification for copy protection, but in my eyes, it may be an example of how the price of software was out of whack.
Their solution was interesting for the time, and it has given us an enduring legacy. The scheme was dreamed up by a team of three
conspirators. First was Lake, of course. Second was Pete Dowson, an ex-IBMer and co-creator of
WordCraft. The third man was Graham Heggie, who seems to have been a friend the other
two knew. Their solution was a small hardware device. The trio built a tiny board that connected
to the PET's cassette port. The port itself was usually used for connecting up to a tape drive.
Most computers of the time had a similar interface, but the PET was
a little special. Machines like the Apple II and TRS-80 just connected up to any old tape deck.
Commodore chose to use a custom deck and a custom connector. The upshot here was that the tape port
provided power as well as serial data in and out lines. To read tapes, the PET would just wait for data
to come in over the serial interface. However, there was no way to tell if the plug was really
connected to an official Commodore-branded tape drive. The computer only wants some device that
could handle serial communications. The trick that the team came up with,
mostly credited to Heggie since he was the one good with a soldering iron, was to jack into this
serial line. They built what's called a shift register. This is a device that stores a small
amount of data and, given a series of pulses, will send that data out in a serial fashion.
Each pulse gives you one bit of the number stored inside.
The rest was easy.
The serial output pin was used to send proper pulses, while the serial-in pin was used to read data from the shift register.
The register itself was configured to hold a secret key.
Then, on startup, WordCraft would try to grab that key
off the tape port. If the copy protection board was plugged in, then WordCraft would get back
its precious number. Otherwise, it would fail to run. The entire package was completed with,
of course, packaging. The small board was housed in a plastic shell stuffed full of epoxy. That
way, it was hard to tamper with. And the team called this the dongle. As Lake explains, the
name came from the fact that the prototype board had a way of dangling out the back of the PET
computer. Quote, dangle, dongle. Not exactly rocket science. All those urban myths about the dongle being
invented by Don Gall made me smile. In fact, I am Don Gall. Well, Pete, Graham, and I are Don Gall.
Maybe I should get a t-shirt with I am Don Gall printed on it. That should impress no one, end quote. The core idea here was
that each legitimate copy of WordCraft shipped with a copy protection dongle. This was a way for
customers to prove that they were really customers, that they paid the toll. Like I said, this is a
page straight from early copy protection, just adapted for home computers.
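The dongle's handshake is simple enough to model in a few lines. This is a hedged sketch, not Lake and Dowson's actual code: the key value, bit width, and function names are all invented, but the shape is the one described above: pulse the serial-out line, collect one bit per pulse on serial-in, and compare against the expected key.

```python
# A toy model of the WordCraft dongle: a shift register on the cassette
# port holds a secret key; the program clocks the serial line and reads
# one bit per pulse. All values and names here are invented.

SECRET_KEY = 0b10110010  # burned into the dongle's shift register

class Dongle:
    def __init__(self, key: int, width: int = 8):
        # Store the key as a list of bits, most significant bit first.
        self.bits = [(key >> i) & 1 for i in reversed(range(width))]
        self.pos = 0

    def pulse(self) -> int:
        # Each clock pulse shifts out the next bit of the stored key.
        bit = self.bits[self.pos]
        self.pos += 1
        return bit

def startup_check(port) -> bool:
    # WordCraft-style check at launch: clock out 8 bits and compare.
    if port is None:
        return False          # nothing plugged into the tape port
    value = 0
    for _ in range(8):
        value = (value << 1) | port.pulse()
    return value == SECRET_KEY

print(startup_check(Dongle(SECRET_KEY)))   # True: legitimate copy runs
print(startup_check(None))                 # False: no dongle, refuse to run
```

The crack described later in this section amounts to patching `startup_check` to always return true, which is exactly why the comparison being a single choke point is the scheme's weak spot.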
The main difference, the change here that I want to focus on, was the company-consumer relationship.
A Commodore PET owner didn't have any dedicated sales rep.
That much should go without saying.
By the same token, a customer didn't necessarily have a continued relationship with WordCraft's developers.
But we can actually complete this triangle.
The developer of WordCraft didn't have any kind of relationship with Commodore.
This meant that certain opportunities just weren't possible.
Lake couldn't get Commodore to put a special
protection circuit in the PET, so third-party vendors were on their own when it came to copy
protection. In this sense, the dongle is a pretty smart idea. Most users weren't savvy enough to
make their own dongles or even figure out how the mystical plug worked. In theory, someone could
figure out how the dongle worked and retrieve the secure code. In theory, someone could build their
own dongle. In theory, someone could find the code that checked for a dongle and overwrite it.
But that would take a lot of skill and a lot of time. Any avenue of attack is a time sink. It's a kind of trap.
The WordCraft team is betting on the fact that few will have the skill and time to break their
copy protection, even if their solution isn't that secure. This is, classically, what you would call
security through obscurity. The plug-in dongle is just complex and annoying enough that casual
users can't get around it. Someone with talent and time could crack the program, but those types
of folk are few and far between. This tactic was especially useful in the pre-internet days.
Let's say I crack WordCraft, I remove the copy protection. I find just the right spot where it checks for the dongle, and I make a patch that just
jumps over it.
I can now run the program on a PET without the dongle plugged in.
It's pretty cool, but what's step two?
Usually the next step would be to share the now-free program, or maybe sell it illegally. With the internet,
that's simple. I'd just upload the cracked program to some file-sharing website. Back in the day,
when the PET was new, you could send in the program to a BBS. Those were close to websites,
they just operated over phone lines, but download speeds were slow. Maybe I could mail it out to interested parties,
or I could pass disks around at a local meetup. But on a fundamental level, my cracked text editor
couldn't get very far. The damage was isolated. If you think about it, that's one of the reasons
why the piracy of Altair Basic was such a big deal. The market for the Altair was
small and pretty well connected. There was no copy protection, so anyone with a paper tape
puncher could make copies of Basic. Those copies could be further duplicated and easily passed
around the tight-knit community. Someone that buys a PET in California would, most likely, have no contact with someone who bought a PET in Massachusetts.
The size of the market plus the barrier of copy protection was able to abate piracy, at least slightly.
Going back to WordCraft, there is, of course, the spite aspect.
Using a dongle does, in fact, worsen your computer experience. In general,
copy protection always finds a way to hurt legitimate customers. In the case of the PET,
the damage was slight. That computer actually had two cassette ports, so you could still use
a tape drive while running WordCraft. However, WordCraft didn't stay relegated to the PET for long.
There were versions for the Commodore VIC-20, the C64, IBM PC, and its clones, as well as a slew of
other machines. All these word processor variants were protected with physical dongles. The VIC-20
and Commodore 64 each had a single tape port,
so you can't use WordCraft and a tape drive at the same time.
You'd probably want to use a floppy disk at this point in history anyway,
but it's still an inconvenience.
If you needed to use the tape port for something else,
you'd have to unplug the dongle,
then plug it back in when you needed to edit a document.
That's a hassle, and it could have caused lasting damage to your computer.
These tape ports were edge connectors.
They're sets of contacts that are bonded to the surface of a circuit board.
These kinds of connectors aren't very robust.
I've seen the lifespan measured in hundreds of cycles.
So, at least theoretically,
dongle swapping often enough could wear out a machine. Now, there are a whole slate of other forms of copy protection that we could talk about. These are very deep waters to be sure.
I want to finish up our roundup with what I think presents an interesting case. Copy protection
on cassette tapes. What can I say? WordCraft has left me in a bit of a serial mood, which
has been helped along by an article written by Andrew McFadden titled Early Copy Protection
on the Apple II. I think this is an interesting angle to look at because in tape software, we can see protection that's 100% software-based.
It's also relatively simple since, well, it was intended to run on simple computers.
We can also see something of an arms race between vendors and pirates.
Copying a tape is trivial, at least in theory.
You just need one of those fancy stereos with two tape decks or two separate cassette recorders that you can plug into each other.
What prevents this from being a surefire solution is the fact that cassettes suck at storing digital
data. This is an analog format that was hijacked for digital storage. Running a copy will result
in some signal degradation. That can make bits unreadable. Unless the copy is perfect at just
the right levels without any weirdness going on, your attempt at piracy will be stopped dead in
its tracks. There were better ways to copy tape software.
The most simple fix was to use a computer.
You know, the thing that reads the tape.
Normally, when you load software from tape,
you just tell the computer to read the contents of the cassette into some location in memory.
To make a copy, all you had to do was dump that memory location to some clean tape.
That's the basic attack here.
Read, then write.
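The read-then-write attack is worth seeing in miniature. Here's a hedged sketch in Python, with tapes modeled as byte strings and RAM as a byte array; the addresses and the sample machine code are invented, but the two steps are exactly the ones described above.

```python
# The "read, then write" attack in miniature: load the tape image into
# memory exactly as the computer would, then dump that same memory
# region back out to a blank tape. Addresses and data are invented.

def load_from_tape(tape: bytes, memory: bytearray, addr: int):
    memory[addr:addr + len(tape)] = tape      # what a normal LOAD does

def dump_to_tape(memory: bytearray, addr: int, length: int) -> bytes:
    return bytes(memory[addr:addr + length])  # the pirate's WRITE step

original_tape = b"\xa9\x00\x8d\x00\x04"       # some machine code on tape
ram = bytearray(0x1000)

load_from_tape(original_tape, ram, 0x800)
copy = dump_to_tape(ram, 0x800, len(original_tape))
print(copy == original_tape)  # True: a perfect digital copy, no signal loss
```

Unlike a deck-to-deck dub, this copy is bit-for-bit identical: the analog degradation that foiled casual tape copying never enters the picture.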
So how did vendors fight back?
McFadden's article gives us a great primer on this part of the saga.
I've looked for primary sources, but those aren't forthcoming.
No one was outing their acts of piracy in public venues back in the day, and vendors weren't spilling their copy protection secrets. So, later source it is.
McFadden focuses on the Apple II, so that's also where we'll be pointing our attention.
Just keep in mind that, in general, the Apple II was a typical 8-bit micro of the era, so what works on the Apple II
would also work on other systems, at least given some tweaking. Talking broad brushes here.
There were two main tactics used to thwart tape pirates. The most basic was a rough kind of
encryption. This is similar to the approach championed by Robert Best, except implemented in software.
Many tape programs used a so-called two-stage loader.
Users would be instructed to load a small program off the beginning of the tape.
That's the first stage.
Once that stage was running, it would handle loading up the rest of the system.
This allowed for more complicated loading logic, fancy splash screens, and the possibility of copy protection. The encryption schemes here were simple but could
pose a problem. The examples I've seen used XOR ciphers. This isn't so much real encryption as it
is obfuscation. Basically, you run an exclusive OR operation on every bit of the program against some
predefined number. The process is reversible by running the program through the same process,
just XORing it against the same number. This isn't very secure, but it will render the program stored
on tape unintelligible. That is, unless you know the number that's used as the encryption key,
if we can call it that. You'd assume this might be hard to determine, but it's pretty easy.
You could brute force the encryption, or you could just look at the first stage loader. You see,
XOR is a built-in operation on most processors. It's in the instruction set.
You just need to scan the loader for an XOR operation
and then see what number is being used as the key.
Easy.
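Here's what that XOR obfuscation looks like in practice, as a short sketch. The key byte and the pretend machine code are invented; the important property is real, though: XOR is its own inverse, so the exact same routine both scrambles the program for tape and restores it in the stage-one loader.

```python
# The XOR "encryption" described above. XOR is its own inverse, so one
# routine both obfuscates and restores the program. Key value invented.

KEY = 0x5C

def xor_cipher(data: bytes, key: int) -> bytes:
    # Works in both directions: xor_cipher(xor_cipher(x)) == x.
    return bytes(b ^ key for b in data)

program = b"LDA #$00"                  # pretend machine code
on_tape = xor_cipher(program, KEY)     # what actually gets stored on tape
restored = xor_cipher(on_tape, KEY)    # what the stage-one loader does

print(on_tape != program)   # True: unintelligible as stored
print(restored == program)  # True: readable again once loaded
```

And since a one-byte key has only 256 possible values, even a pirate who can't find the XOR instruction in the loader can simply try every key until the output looks like valid code.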
The Apple II provides everything needed to break this kind of encryption.
The machine came stock with a feature called the Machine Language Monitor.
From this interface, a user can disassemble code in memory, inspect the contents of memory, and handle tape I/O.
In other words, everything needed to crack copy protection.
So while most users would probably get stuck,
those with a little skill and creativity could get the unencrypted program back out.
However, this still doesn't make this weak encryption method a good form of copy protection.
In general, tape just wasn't a very conducive format for copy-protected software.
That's kind of what I like about this. In all cases, you have to load the data off tape into memory.
That means that a malicious user will have access to the program, at least in some way.
Say what you will about Best's crypto chip, but at least it ensured that a user was never exposed
to the raw, unaltered source code. However, it was possible to make it pretty hard to crack a tape using a computer.
The usual attack pattern for a pirate involved loading the program, then either preventing it from executing or exiting the program.
It was crucial that the pirate be able to drop back to the computer's normal operating system, or just never leave the operating system to begin with.
Thus, there was a possible opening
for vendors to attack. The most complex copy protection scheme that McFadden explains
was employed by Module 6, a relatively simple blackjack game. Its approach was to overwrite
large swaths of memory as it was loaded. This had a twofold benefit.
First, it forced automatic execution of the game. You see, the Apple II wasn't a very smart computer.
No computers in this era were. Every time you pressed a key, the Apple II would store that
keystroke in a buffer at a set location in memory. When part of the operating system needed to get
inputs from the keyboard, it would check this buffer. This is a pretty sound design overall.
Buffering inputs like this can prevent certain annoying resource problems, and it makes getting
keystrokes pretty easy and pretty fast. It's also easily exploitable. Some simpler forms of copy protection, or just convenience features, would overwrite this buffer
so the Apple II would automatically run a command for them.
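That trick can be sketched in miniature. This is a toy simulation, not real Apple II code; the memory size, buffer address, and the RUN command are all invented stand-ins.

```python
# Toy model of a machine whose OS reads keystrokes from a fixed buffer in
# memory. Addresses here are invented, not the Apple II's real layout.
MEMORY_SIZE = 256
KEY_BUFFER = 0x80          # pretend location of the keyboard input buffer

memory = bytearray(MEMORY_SIZE)

def read_command() -> str:
    # The OS trusts whatever bytes happen to sit in the keyboard buffer,
    # reading up to a NUL terminator.
    raw = bytes(memory[KEY_BUFFER:]).split(b"\x00", 1)[0]
    return raw.decode()

def load_from_tape(image: bytes, start: int) -> None:
    # A tape load is just a bulk copy into memory, with no restriction
    # on which regions it touches.
    memory[start:start + len(image)] = image

# A tape image positioned so it covers the keyboard buffer, planting a
# command as if the user had typed it themselves.
load_from_tape(b"RUN\x00", KEY_BUFFER)

assert read_command() == "RUN"   # the OS now executes the command unprompted
```

The essence is that the loader and the keyboard handler share one flat memory space, so data from tape can masquerade as user input.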
But Module 6 was a little cooler.
You see, relying on that buffer meant you were assuming that the Apple II's ROM would play nice,
that no one would try to break execution.
Module 6 on load would basically fill all of the Apple II's available memory.
Instead of just overwriting that buffer,
Module 6 would attack by overwriting the contents of the computer's stack.
This is important because the stack is where return addresses are stored when functions are called. The Apple II has to call up a function when a user asks the computer to load data from
tape. So a return address is, in fact, on the top of the stack. Module 6 just replaces that with its
own return address. Thus, when the computer finishes loading from tape,
it tries to return to whatever's on the stack, and it jumps directly into Module 6. Thus,
you're trapped in the blackjack zone. But the games aren't done. Module 6 also prevents users
from dropping back to the monitor. This is a little more complex than I
can fully understand. McFadden gives a great explanation, but I'm no Apple II whiz. As I
understand it, Module 6 rewrites an interrupt vector table in such a way that certain system
calls are replaced with calls into Module 6's own code. Specifically, calls to print text on the screen
or read keystrokes get redirected to custom handlers inside Module 6. Those handlers,
in fact, do nothing. That prevents the monitor from functioning even if a user gets back to it.
This means that if a user somehow escapes from the blackjack dimension,
they can't inspect memory and they can't dump the contents of RAM. Piracy prevented.
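Both of Module 6's tricks can be modeled in a few lines. This is a loose Python analogy with invented names, not the real 6502 mechanics: the stack trick swaps the pending return address for the game's entry point, and the vector trick swaps the system's I/O handlers for do-nothing stubs.

```python
# Toy model of Module 6's two tricks; all names and behavior are invented
# analogies, not actual Apple II internals.
call_stack = []

def run_tape_loader(payload_entry):
    # The ROM pushes a return address before entering the tape loader...
    call_stack.append("rom_continue")
    # ...but the loaded image overwrites the top of the stack with the
    # game's own entry point.
    call_stack[-1] = payload_entry
    # When the loader "returns", control lands in the game, not the ROM.
    return call_stack.pop()

def game_entry():
    return "trapped in the blackjack zone"

assert run_tape_loader(game_entry)() == "trapped in the blackjack zone"

# Second trick: redirect the system's print/read vectors to handlers that
# do nothing, so the monitor can't display memory even if the user escapes.
vectors = {"print_char": print, "read_key": input}
vectors["print_char"] = lambda *args: None   # custom handler: does nothing
vectors["read_key"] = lambda *args: None
assert vectors["print_char"]("anything") is None
```

The design insight is the same in both cases: the game doesn't block the user's actions, it quietly replaces the destinations those actions would reach.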
Here's why I find this interesting. Module 6 is pulling a pretty classic buffer overflow attack.
This is almost exactly what a hacker would do to force a system to execute malicious code.
You profile the system's memory, find where it stores return addresses or important data,
then you find a way to inject data into those key locations. Apple already provides all that
information in the programmer's manual. The traditional form of this attack relies on buffers, locations in
memory used to store temporary data. A hacker looks for a program that isn't very smart about
their buffers, programs that will accept any data and any amount of data to throw into a buffer.
Then it's just a matter of crafting a payload and throwing it to the target machine. Here, Module 6 is doing the same thing. The Apple II's tape loading routines aren't smart about how they handle data, so you
can launch a very similar attack. Now, ultimately, the copy protection used by Module 6 is easy to
break. You just load half the tape, then you copy that half to tape, and then do the same with the other half.
Care has to be taken to load it into a safer region of memory, but you can just do that.
That's not the important part.
Rather, I find it fitting that in order to suppress piracy, vendors had to increasingly turn to malicious techniques.
It's a fitting evolution of a spiteful monomania,
and it continues on into the present day.
Alright, that brings us to the end of this episode. We've only scratched the surface
of copy protection, but I think we've already seen the basic pattern. In the
commercialized mode of software production, everything has to be paid for by the end user.
Prices are chosen in order to generate profit larger than actual costs. The implied agreement
here that makes this mode of production work is that users will pay for their software, that no one will make any duplicate copies. Not everyone follows that plan. Prior to the late 70s, there was no legal recourse against
consumer-level pirates. Plus, there was a veil of secrecy over piracy in general. People stealing
software weren't exactly shouting to the heavens about their free copy of IBM's Rigid Frame Selector program. Software vendors could only come up
with one way to fight back. Copy protection. But no scheme is ever strong enough to stop all pirates.
You see, copy protection is kind of like a fence with a big no trespassing sign.
That'll stop most people from passing through,
but if you have some bolt cutters or are just good at climbing,
then no fence can stop you.
Pirates and software vendors enter into this arms race almost immediately.
There's a back and forth as copy protection gets more elaborate.
But at every step, there's only one loser, the average consumer.
Copy protection schemes only serve to complicate the use of software or degrade its usefulness.
That, in turn, can be reason enough for you to take to the seas and become a pirate yourself.
It can also, in some cases, destroy the computer market.
The final joke to all of this is the relative newness of this specific mode of software production.
The unbundling only happens in 1969.
It takes scant years for companies to complain about how pirates are ruining the industry.
So from the beginning, this is something of an artificial arrangement.
IBM unbundled software in order to avoid the long arm of the DOJ. Then these new software companies turn around and complain that the government needs to do more to make their
specific business model viable. They don't care about changing, they want the world to change
so they can keep doing business how they have been. I'm not going to make a big sweeping judgment call, I just want to point out that
maybe commercial software isn't the natural state of the industry. So if you're thinking about
committing piracy, I got a tip for you. Just don't. Go contribute to an open source project or find a better open source alternative to the
software you want to pirate.
I guarantee for basically everything, there are better open source solutions.
That's better than trying to strike back against commercial software.
Just pull yourself out of that situation entirely.
Thanks again for listening to Advent of Computing.
I'll be back in two weeks' time with another piece of Computing's past,
and since it is October, it'll be a spooky story.
If you like the show, there are a few ways you can support it and help me grow.
If you know someone else who'd be interested in the story of Computing's past,
then take a minute to share the show with them.
You can also rate and review the podcast on Apple Podcasts, Spotify, or wherever you listen. If you want to be a super fan, you can
support the show directly through Advent of Computing merch or signing up as a patron on Patreon.
Patrons get early access to episodes, polls for the direction of the show, and a few bonus episodes.
I think there's six now. There's going to be a new one relatively soon.
You can find links to everything on my website, adventofcomputing.com.
If you have any comments or suggestions for a future episode, then go ahead and drop me a line.
I'm at Advent of Comp on Twitter, and I really love hearing from listeners.
As always, have a great rest of your day.