
Life Before Windows

Before We Knew What An “Operating System” Was

Gather ’round the campfire, children, as we talk about The Time Before Operating Systems. That was when all computers came with their own built-in software, indistinguishable from the hardware. You didn’t have an IBM computer and an IBM operating system, for example. You had just an IBM system. Yeah, it included software, but nobody thought much about where it came from or what it was called. It was simply part of the machine.

This was also the time of early microprocessors, home computers, and build-it-yourself kits. Again, all of these machines came with their own bundled software. Each one was different, of course, because they were developed by the computer makers alongside the hardware. Amiga, Commodore, Altair, IMSAI, Apple, and other machines all had their own personalities as determined by the bit stream that made them go.

Then came a guy named Gary Kildall. Gary died 20 years ago, but not before changing the entire microprocessor and computer world.

I never met Mr. Kildall, but it so happens that I live just a few doors down from his old house. Coincidentally, his company’s former office building is directly across the street from me; I look out my window every day at Gary’s old office. It’s a pizza place now.

Gary Kildall figured that even early little microprocessors like Intel’s 8080 could run real software, just like the commercial IBM, Data General, or DEC minicomputers of the time. He created something he called a “Control Program/Monitor” (later restyled “Control Program for Microcomputers”), or CP/M for short. The company he formed around it was named Digital Research.

Computer scientist John Wharton says of his friend, “[Gary] offered the complete package to Intel, along with an editor, assembler, linker, and loader, for $20,000. Intel turned him down, and CP/M went on to sell a quarter of a million copies and become by far the highest-volume operating system of its time.”

“Back before the introduction of the IBM PC, CP/M supported protocols for memory allocation, file sharing, process switching, and peripheral management. When Microsoft bought the rights to an unauthorized quick-and-dirty knockoff of CP/M from Seattle Computer Products and renamed it MS-DOS, these protocols were removed, since Microsoft programmers didn’t understand why they were needed.”

CP/M was arguably the first operating system to separate the software from the hardware. Up until then, the OS (such as it was) was just whatever software came with the system. With CP/M, the operating system, programming APIs, user interface, disk format, communication protocols, BIOS, and other features were now independent of the hardware, and independent of the company making that hardware. No longer did DEC equipment have to use DEC’s proprietary protocols for everything, or IBM equipment do everything the IBM way. CP/M was portable across processor architectures and instruction sets. It laid the groundwork for the hardware/software division of labor we have today. And it was phenomenally successful.

So why didn’t CP/M become the world’s dominant operating system? The popular myth is that IBM came calling to Digital Research but that Gary Kildall blew them off, preferring to go flying that day rather than meet with a bunch of East Coast guys in white shirts and blue ties.

In reality, Kildall did meet with IBM, and he successfully licensed CP/M for their newfangled IBM Personal Computer Model 5150 (a.k.a. the PC). But IBM ultimately offered the PC with a choice of two operating systems: Digital Research’s CP/M or Microsoft’s MS-DOS, which was cheaper. You can guess how that turned out. And thus was born Microsoft’s dominance of the personal computer world for the next 30-odd years.

I’ve often wondered, as I look at Digital Research’s former headquarters, how things might have played out differently. If the company had agreed to different licensing terms, perhaps with different pricing, would we all be using CP/M 8.1 today? Would we curse it as much as we do Windows? And what would have happened to Microsoft, and the whole Redmond tech scene? If Digital Research had become Microsoft, and vice versa, would the area around the company have boomed the same way Seattle’s tech corridor did? Oh, what might have been…

Gary Kildall himself didn’t care, according to his friends. While outsiders tended to tiptoe around the subject of Microsoft’s success or Bill Gates’s phenomenal wealth (both presumably at the expense of Digital Research) out of polite concern that Gary might be a bit sensitive on the subject, the man himself didn’t give a rat’s ass. It was never about the money, success, or fame. Gary Kildall worked on CP/M and other projects because he liked to, not because he wanted to get rich at it. Reportedly, the whole reason he developed CP/M in the first place was that he didn’t want to commute into Silicon Valley to use a “real” timesharing minicomputer. With him, it was all about the product. A real engineer, in other words.

Wharton goes on to say, “At a time when Intel was positioning microprocessors as a replacement for random logic in fixed-function desk calculators, postage scales, and traffic-light controllers, it was Gary who advised Intel that these same chips were flexible enough to be programmed as general-purpose computers. At a time when microcomputer software developers were debating the merits of machine-language programming in octal vs. hex, Gary defined the first programming language and developed the first compiler specifically for microprocessors.”

In any endeavor, somebody has to be first. In microcomputer operating systems, that was Gary Kildall and Digital Research.

Last week our little town placed a commemorative plaque outside Gary Kildall’s house. If you’re in the area and are a fan of the GPS game geocaching, there’s a GZ called “Life Before Windows” (GC10PG1) that will test your hexadecimal skills. I think Gary would have aced it. 
