
Life Before Windows

Before We Knew What An “Operating System” Was

Gather ’round the campfire, children, as we talk about The Time Before Operating Systems. That was when all computers came with their own built-in software, indistinguishable from the hardware. You didn’t have an IBM computer and an IBM operating system, for example. You had just an IBM system. Yeah, it included software, but nobody thought much about where it came from or what it was called. It was simply part of the machine.

This was also the time of early microprocessors, home computers, and build-it-yourself kits. Again, all of these machines came with their own bundled software. Each one was different, of course, because they were developed by the computer makers alongside the hardware. Commodore, Altair, IMSAI, Apple, and other machines all had their own personalities as determined by the bit stream that made them go.

Then came a guy named Gary Kildall. Gary died 20 years ago, but not before changing the entire microprocessor and computer world.

I never met Mr. Kildall, but it so happens that I live just a few doors down from his old house. Coincidentally, his company’s former office building is directly across the street from me; I look out my window every day at Gary’s old office. It’s a pizza place now.

Gary Kildall figured that even early little microprocessors like Intel’s 8080 could run real software, just like the commercial IBM, Data General, or DEC minicomputers of the time. He created something he called a “control program and monitor,” or CP/M for short. The company he formed around it was named Digital Research. 

Computer scientist John Wharton says of his friend, “[Gary] offered the complete package to Intel, along with an editor, assembler, linker, and loader, for $20,000. Intel turned him down, and CP/M went on to sell a quarter of a million copies and become by far the highest-volume operating system of its time.”

“Back before the introduction of the IBM PC, CP/M supported protocols for memory allocation, file sharing, process switching, and peripheral management. When Microsoft bought the rights to an unauthorized quick-and-dirty knockoff of CP/M from Seattle Computer Products and renamed it MS-DOS, these protocols were removed, since Microsoft programmers didn’t understand why they were needed.”

CP/M was arguably the first operating system to separate the software from the hardware. Until then, the OS (such as it was) was just whatever software came with the system. With CP/M, the operating system, programming APIs, user interface, disk format, communication protocols, BIOS, and other features were now independent of the hardware, and independent of the company making that hardware. No longer did DEC equipment have to use DEC’s proprietary protocols for everything, or IBM equipment do everything the IBM way. CP/M was portable across processor architectures and instruction sets. It laid the groundwork for the hardware/software division of labor we have today. And it was phenomenally successful.
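The cleanest illustration of that separation is CP/M’s calling convention: an application asked for OS services through one fixed entry point (a CALL to address 0005h with a function number in a register), the machine-independent BDOS handled the request, and only the BIOS underneath knew the actual hardware. Here’s a toy sketch of that layering in Python. The class and method names are my own illustrative inventions, not real CP/M code, though the convention it mimics is real: BDOS function 9 really was “print string terminated by $”.

```python
class Bios:
    """Machine-specific layer: each hardware vendor rewrote only this part."""
    def conout(self, ch: str) -> None:
        raise NotImplementedError

class TeletypeBios(Bios):
    """One hypothetical machine's console driver."""
    def __init__(self):
        self.output = []          # stand-in for a real serial port
    def conout(self, ch: str) -> None:
        self.output.append(ch)

class Bdos:
    """Hardware-independent layer: identical on every CP/M machine."""
    def __init__(self, bios: Bios):
        self.bios = bios
    def call(self, func: int, arg=None):
        # The single entry point: applications pass a function number,
        # just as real CP/M programs did via CALL 0005h.
        if func == 9:             # BDOS function 9: print '$'-terminated string
            for ch in arg:
                if ch == "$":
                    break
                self.bios.conout(ch)
        else:
            raise ValueError(f"unimplemented BDOS function {func}")

# An "application" sees only the BDOS API, never the hardware:
bios = TeletypeBios()
bdos = Bdos(bios)
bdos.call(9, "Hello from CP/M$")
print("".join(bios.output))       # -> Hello from CP/M
```

Porting CP/M to a new machine meant rewriting only the BIOS layer; the BDOS and the applications shipped unchanged, which is why the same program binary could run on hundreds of different CP/M machines.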

So why didn’t CP/M become the world’s dominant operating system? The popular myth is that IBM came calling to Digital Research but that Gary Kildall blew them off, preferring to go flying that day rather than meet with a bunch of East Coast guys in white shirts and blue ties.

In reality, Kildall did meet with IBM, and he successfully licensed CP/M for their newfangled IBM Personal Computer Model 5150 (a.k.a. the PC). But IBM ultimately offered the PC with a choice of operating systems: Digital Research’s CP/M-86 or Microsoft’s MS-DOS (sold by IBM as PC DOS), which was far cheaper. You can guess how that turned out. And thus was born Microsoft’s dominance of the personal computer world for the next 30-odd years.

I’ve often wondered, as I look at Digital Research’s former headquarters, how things might have played out differently. If the company had agreed to different licensing terms, perhaps with different pricing, would we all be using CP/M 8.1 today? Would we curse it as much as we do Windows? And what would have happened to Microsoft, and the whole Redmond tech scene? If Digital Research had become Microsoft, and vice versa, would the area around the company have boomed the same way Seattle’s tech corridor did? Oh, what might have been…

Gary Kildall himself didn’t care, according to his friends. Outsiders tended to tiptoe around the subject of Microsoft’s success or Bill Gates’s phenomenal wealth (both presumably at the expense of Digital Research), out of polite concern that Gary might be a bit sensitive on the subject, but the man himself didn’t give a rat’s ass. It was never about the money, success, or fame. Gary Kildall worked on CP/M and other projects because he liked to, not because he wanted to get rich at it. Reportedly, the whole reason he developed CP/M in the first place was that he didn’t want to commute into Silicon Valley to use a “real” timesharing minicomputer. With him, it was all about the product. A real engineer, in other words.

Wharton goes on to say, “At a time when Intel was positioning microprocessors as a replacement for random logic in fixed-function desk calculators, postage scales, and traffic-light controllers, it was Gary who advised Intel that these same chips were flexible enough to be programmed as general-purpose computers. At a time when microcomputer software developers were debating the merits of machine-language programming in octal vs. hex, Gary defined the first programming language and developed the first compiler specifically for microprocessors.”

In any endeavor, somebody has to be first. In microcomputer operating systems, that was Gary Kildall and Digital Research.

Last week our little town placed a commemorative plaque outside Gary Kildall’s house. If you’re in the area and are a fan of the GPS game geocaching, there’s a GZ called “Life Before Windows” (GC10PG1) that will test your hexadecimal skills. I think Gary would have aced it. 
