
Life Before Windows

Before We Knew What An “Operating System” Was

Gather ’round the campfire, children, as we talk about The Time Before Operating Systems. That was when all computers came with their own built-in software, indistinguishable from the hardware. You didn’t have an IBM computer and an IBM operating system, for example. You had just an IBM system. Yeah, it included software, but nobody thought much about where it came from or what it was called. It was simply part of the machine.

This was also the time of early microprocessors, home computers, and build-it-yourself kits. Again, all of these machines came with their own bundled software. Each one was different, of course, because they were developed by the computer makers alongside the hardware. Amiga, Commodore, Altair, IMSAI, Apple, and other machines all had their own personalities as determined by the bit stream that made them go.

Then came a guy named Gary Kildall. Gary died in 1994, but not before changing the entire microprocessor and computer world.

I never met Mr. Kildall, but it so happens that I live just a few doors down from his old house. Coincidentally, his company’s former office building is directly across the street from me; I look out my window every day at Gary’s old office. It’s a pizza place now.

Gary Kildall figured that even early little microprocessors like Intel’s 8080 could run real software, just like the commercial IBM, Data General, or DEC minicomputers of the time. He created something he called a “control program and monitor,” or CP/M for short. The company he formed around it was named Digital Research. 

Computer scientist John Wharton says of his friend, “[Gary] offered the complete package to Intel, along with an editor, assembler, linker, and loader, for $20,000. Intel turned him down, and CP/M went on to sell a quarter of a million copies and become by far the highest-volume operating system of its time.”

“Back before the introduction of the IBM PC, CP/M supported protocols for memory allocation, file sharing, process switching, and peripheral management. When Microsoft bought the rights to an unauthorized quick-and-dirty knockoff of CP/M from Seattle Computer Products and renamed it MS-DOS, these protocols were removed, since Microsoft programmers didn’t understand why they were needed.”

CP/M was arguably the first operating system to separate the software from the hardware. Up until then, the OS (such as it was) was just whatever software came with the system. With CP/M, the operating system, programming APIs, user interface, disk format, communication protocols, BIOS, and other features were now independent of the hardware, and independent of the company making that hardware. No longer did DEC equipment have to use DEC’s proprietary protocols for everything, or IBM equipment do everything the IBM way. CP/M was portable across processor architectures and instruction sets. It laid the groundwork for the hardware/software division of labor we have today. And it was phenomenally successful.
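To make that split concrete, here is a minimal sketch in C (not actual CP/M code; every name in it is hypothetical, and real CP/M was written in PL/M and assembly) of the BIOS idea: the portable part of the OS touches the hardware only through a fixed table of entry points, and each machine supplies its own table.

#include <stdio.h>

/* A toy illustration of CP/M-style BIOS layering, not real CP/M code.
   The machine-specific layer: one entry point per primitive operation,
   loosely modeled on CP/M BIOS entries like CONOUT and CONIN. */
struct bios {
    void (*conout)(char c);   /* write one character to the console */
    char (*conin)(void);      /* read one character from the console */
};

/* One hypothetical machine's implementation; a different machine would
   supply different functions behind the same table. */
static void host_conout(char c) { putchar(c); }
static char host_conin(void)    { return (char)getchar(); }

static const struct bios host_bios = { host_conout, host_conin };

/* The portable "OS" side: written once against the table, with no
   knowledge of the hardware underneath. */
static void print_string(const struct bios *b, const char *s)
{
    while (*s)
        b->conout(*s++);
}

int main(void)
{
    /* Retargeting to new hardware means swapping the table,
       not rewriting this code. */
    print_string(&host_bios, "Hello from the portable layer\r\n");
    return 0;
}

Swap in a different table and the portable layer runs unchanged, which is, in spirit, how the same CP/M core could run on hundreds of different machines once each vendor wrote its own BIOS.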

So why didn’t CP/M become the world’s dominant operating system? The popular myth is that IBM came calling on Digital Research but that Gary Kildall blew them off, preferring to go flying that day rather than meet with a bunch of East Coast guys in white shirts and blue ties.

In reality, Kildall did meet with IBM, and he successfully licensed CP/M for their newfangled IBM Personal Computer Model 5150 (a.k.a. the PC). But IBM ultimately offered the PC with a choice of operating systems: Digital Research’s CP/M-86 or Microsoft’s MS-DOS, which was far cheaper ($40 versus $240 for CP/M-86). You can guess how that turned out. And thus was born Microsoft’s dominance of the personal computer world for the next 30-odd years.

I’ve often wondered, as I look at Digital Research’s former headquarters, how things might have played out differently. If the company had agreed to different licensing terms, perhaps with different pricing, would we all be using CP/M 8.1 today? Would we curse it as much as we do Windows? And what would have happened to Microsoft, and the whole Redmond tech scene? If Digital Research had become Microsoft, and vice versa, would the area around the company have boomed the same way Seattle’s tech corridor did? Oh, what might have been…

Gary Kildall himself didn’t care, according to his friends. While outsiders tended to tiptoe around the subject of Microsoft’s success or Bill Gates’s phenomenal wealth (both presumably at the expense of Digital Research) out of polite concern that Gary might be a bit sensitive on the subject, the man himself didn’t give a rat’s ass. It was never about the money, success, or fame. Gary Kildall worked on CP/M and other projects because he liked to, not because he wanted to get rich at it. Reportedly, the whole reason he developed CP/M in the first place was that he didn’t want to commute into Silicon Valley to use a “real” timesharing minicomputer. With him, it was all about the product. A real engineer, in other words.

Wharton goes on to say, “At a time when Intel was positioning microprocessors as a replacement for random logic in fixed-function desk calculators, postage scales, and traffic-light controllers, it was Gary who advised Intel that these same chips were flexible enough to be programmed as general-purpose computers. At a time when microcomputer software developers were debating the merits of machine-language programming in octal vs. hex, Gary defined the first programming language and developed the first compiler specifically for microprocessors.”

In any endeavor, somebody has to be first. In microcomputer operating systems, that was Gary Kildall and Digital Research.

Last week our little town placed a commemorative plaque outside Gary Kildall’s house. If you’re in the area and are a fan of the GPS game geocaching, there’s a cache called “Life Before Windows” (GC10PG1) that will test your hexadecimal skills. I think Gary would have aced it.
