
Life Before Windows

Before We Knew What An “Operating System” Was

Gather ’round the campfire, children, as we talk about The Time Before Operating Systems. That was when all computers came with their own built-in software, indistinguishable from the hardware. You didn’t have an IBM computer and an IBM operating system, for example. You had just an IBM system. Yeah, it included software, but nobody thought much about where it came from or what it was called. It was simply part of the machine.

This was also the time of early microprocessors, home computers, and build-it-yourself kits. Again, all of these machines came with their own bundled software. Each one was different, of course, because they were developed by the computer makers alongside the hardware. Amiga, Commodore, Altair, IMSAI, Apple, and other machines all had their own personalities as determined by the bit stream that made them go.

Then came a guy named Gary Kildall. Gary died in 1994, but not before changing the entire microprocessor and computer world.

I never met Mr. Kildall, but it so happens that I live just a few doors down from his old house. Coincidentally, his company’s former office building is directly across the street from me; I look out my window every day at Gary’s old office. It’s a pizza place now.

Gary Kildall figured that even early little microprocessors like Intel’s 8080 could run real software, just like the commercial IBM, Data General, or DEC minicomputers of the time. He created something he called a “control program and monitor,” or CP/M for short. The company he formed around it was named Digital Research. 

Computer scientist John Wharton says of his friend, “[Gary] offered the complete package to Intel, along with an editor, assembler, linker, and loader, for $20,000. Intel turned him down, and CP/M went on to sell a quarter of a million copies and become by far the highest-volume operating system of its time.”

“Back before the introduction of the IBM PC, CP/M supported protocols for memory allocation, file sharing, process switching, and peripheral management. When Microsoft bought the rights to an unauthorized quick-and-dirty knockoff of CP/M from Seattle Computer Products and renamed it MS-DOS, these protocols were removed, since Microsoft programmers didn’t understand why they were needed.”

CP/M was arguably the first operating system to separate the software from the hardware. Up until then, the OS (such as it was) was just whatever software came with the system. With CP/M, the operating system, programming APIs, user interface, disk format, communication protocols, BIOS, and other features were now independent of the hardware, and independent of the company making that hardware. No longer did DEC equipment have to use DEC’s proprietary protocols for everything, or IBM equipment do everything the IBM way. CP/M was portable across processor architectures and instruction sets. It laid the groundwork for the hardware/software division of labor we have today. And it was phenomenally successful.
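To see why that separation mattered, here’s a minimal conceptual sketch in C of CP/M’s layering idea. It is not CP/M’s actual code (the real thing was reportedly written in Kildall’s own PL/M plus assembly), and the names here are invented for illustration: a portable layer, standing in for CP/M’s BDOS, touches the hardware only through a small table of vendor-supplied routines standing in for the BIOS.

```c
#include <stdio.h>

/* Hypothetical sketch of CP/M's layering idea, not its real source.
 * Each hardware vendor fills in this table with routines for its own
 * machine; everything above the table is identical on every machine. */
typedef struct {
    void (*console_out)(char c);  /* write one character to the console */
    int  (*console_in)(void);     /* read one character from the keyboard */
} bios_t;

/* Portable layer ("BDOS" stand-in): mirrors the spirit of CP/M's
 * print-string call (BDOS function 9), which writes characters up to
 * a '$' terminator. Note that it never touches hardware directly. */
void bdos_print_string(const bios_t *bios, const char *s)
{
    while (*s && *s != '$')
        bios->console_out(*s++);
}

/* One vendor's "BIOS": here it just wraps the host's stdio, but on a
 * real machine it would poll a UART or a memory-mapped terminal. */
static void host_console_out(char c) { putchar(c); }
static int  host_console_in(void)    { return getchar(); }

int main(void)
{
    bios_t host_bios = { host_console_out, host_console_in };

    /* Application code targets only the portable layer, so it runs
     * unchanged on any machine whose maker supplies a working table. */
    bdos_print_string(&host_bios, "Hello from the portable layer.\n$");
    return 0;
}
```

In CP/M itself, the machine-specific half took the form of a short jump table at the start of the BIOS, so moving the entire operating system to a new computer meant rewriting only those few routines.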

So why didn’t CP/M become the world’s dominant operating system? The popular myth is that IBM came calling on Digital Research, but Gary Kildall blew them off, preferring to go flying that day rather than meet with a bunch of East Coast guys in white shirts and blue ties.

In reality, Kildall did meet with IBM, and he successfully licensed CP/M for their newfangled IBM Personal Computer Model 5150 (a.k.a. the PC). But IBM ultimately offered the PC with a choice of two operating systems: Digital Research’s CP/M or Microsoft’s MS-DOS, which was considerably cheaper. You can guess how that turned out. And thus was born Microsoft’s dominance of the personal computer world for the next 30-odd years.

I’ve often wondered, as I look at Digital Research’s former headquarters, how things might have played out differently. If the company had agreed to different licensing terms, perhaps with different pricing, would we all be using CP/M 8.1 today? Would we curse it as much as we do Windows? And what would have happened to Microsoft, and the whole Redmond tech scene? If Digital Research had become Microsoft, and vice versa, would the area around the company have boomed the same way Seattle’s tech corridor did? Oh, what might have been…

Gary Kildall himself didn’t care, according to his friends. While outsiders tended to tiptoe around the subject of Microsoft’s success or Bill Gates’s phenomenal wealth (both presumably at the expense of Digital Research) out of polite concern that Gary might be a bit sensitive on the subject, the man himself didn’t give a rat’s ass. It was never about the money, success, or fame. Gary Kildall worked on CP/M and other projects because he liked to, not because he wanted to get rich at it. Reportedly, the whole reason he developed CP/M in the first place was that he didn’t want to commute into Silicon Valley to use a “real” timesharing minicomputer. With him, it was all about the product. A real engineer, in other words.

Wharton goes on to say, “At a time when Intel was positioning microprocessors as a replacement for random logic in fixed-function desk calculators, postage scales, and traffic-light controllers, it was Gary who advised Intel that these same chips were flexible enough to be programmed as general-purpose computers. At a time when microcomputer software developers were debating the merits of machine-language programming in octal vs. hex, Gary defined the first programming language and developed the first compiler specifically for microprocessors.”

In any endeavor, somebody has to be first. In microcomputer operating systems, that was Gary Kildall and Digital Research.

Last week our little town placed a commemorative plaque outside Gary Kildall’s house. If you’re in the area and are a fan of the GPS game geocaching, there’s a GZ called “Life Before Windows” (GC10PG1) that will test your hexadecimal skills. I think Gary would have aced it. 
