
Intel’s x86 Turns 40: Full of Vigor or Doddering Retiree?

Teleology and the Effects of Randomness on Computer Architecture

Forty years ago, the calendar on the kitchen wall said it was 1978. People were taping All in the Family on their Betamax VCRs, the Nobel Prize for Literature went to Isaac Bashevis Singer, Animal House and Jaws 2 were filling movie theaters, and the Western world was, to its everlasting shame, in the grip of disco music.

Meanwhile, in Santa Clara, Intel produced its first 8086 microprocessor chip.

One of these events would change the world.

The x86 microprocessor architecture turns 40 years old this month, and, although it has developed a bit of a middle-aged paunch, it’s still strong and powerful and at the top of its game. The x86 family has amassed unbelievable wealth and power – unbelievable because its origins were so humble and unremarkable. It was never designed to take over the world. It just sort of fell into the role.

It’s easy to fall into the trap of teleology here: to believe that the success of the x86 was inevitable and unavoidable, that it had to happen, and that the personal computer revolution that followed was a direct consequence of its creation.

Nope. It didn’t have to be this way, and probably wouldn’t have turned out this way, had not random chance fingered the little chip. The success of that first 8086, and later of the whole x86 family, came as a big surprise to nearly everyone. Even the guy who invented it thinks it wasn’t a big deal. “Any bright engineer could have designed the processor. It would probably have had a radically different instruction set, but it would have had Intel’s backing behind it and all PCs today would be based on that architecture instead. I was just lucky enough to have been at the right place at the right time.”

That’s according to Steve Morse, a programmer – not a hardware designer – who developed the now-ubiquitous x86 instruction set more or less single-handedly. When he’s introduced at conferences, “I usually cringe a bit,” he says, “because I really don’t think it was that great an accomplishment.” His own website barely mentions it.

It’s not as though the 8086 was the only game in town. Far from it; Intel wasn’t even a particularly large CPU maker at the time. Sure, it had the 8080, and the 8008 before that, but they were just run-of-the-mill 8-bit microprocessors of no particular repute. There was also Zilog’s Z80, Data General’s mN601 chip, the TMS9900 from Texas Instruments, and Motorola’s new 68000 (a 16/32-bit design!) was on the horizon. Not to mention Intel’s own iAPX 432 processor (confusingly called the 8800 at the time), which was the company’s “real” upgrade from the 8080 generation. The 8086 was a quick fix, a hack, a kludge intended to fill the hole until the 8800 was ready for production.

We know how that turned out.

That such a modest design became so immensely popular says a lot about dumb luck, flukes, and serendipity in our industry. It’s also a bit of a Rorschach test: cynical engineers see it as proof positive that market success is inversely proportional to technical superiority (aka the worst product always wins). An optimist might say that it’s evidence of a level and fair playing field. After all, Intel was just another player in a crowded market. It didn’t become a behemoth until after the x86 became a de facto industry standard. Its success was entirely earned, not preordained.

Fate intervened when corporate giant IBM chose the 8088 (a “castrated” version of the 8086, according to Morse, with the same internal architecture but with an 8-bit external bus) for its personal computer. IBM’s decision wasn’t even made for technical reasons: the 8088 was simply cheaper than the alternatives, including the 8086. The slower, cheaper 8-bit-bus version won out over Motorola’s 68K architecture and the 16-bit 8086.
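A rough sketch of why that narrower bus mattered (this is my illustration in Python, not anything from the article, and the counting is not cycle-accurate): every 16-bit memory operand needs two bus transfers on the 8088 where the 8086 needs only one.

    # A minimal sketch (illustrative only, not cycle-accurate) of why the
    # 8088's 8-bit external bus made it slower than the 8086: moving a
    # 16-bit operand takes two bus transfers instead of one.

    def bus_transfers(operand_bytes: int, bus_width_bytes: int) -> int:
        """Bus transfers needed to move an operand, rounding up."""
        return -(-operand_bytes // bus_width_bytes)  # ceiling division

    for name, width_bytes in (("8086 (16-bit bus)", 2), ("8088 (8-bit bus)", 1)):
        print(f"{name}: {bus_transfers(2, width_bytes)} transfer(s) per 16-bit word")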

In what feels like a subtle jab from the computer gods, IBM called its personal computer the Model 5150, a number that police-scanner aficionados and Van Halen fans will recognize as the section of California’s Welfare and Institutions Code covering involuntary psychiatric holds. Just another reminder of how capricious our industry can be.

The IBM PC was successful precisely because it was IBM’s PC. The brand recognition, rather than its componentry, catapulted it to the head of the class. The Model 5150 could have been powered by a twisted rubber band or a squirrel on a treadmill; it would have sold just as well, and we’d all be using seventh-generation rubber bands today.

Now, 40 years after the introduction of the x86, Intel’s name is nearly synonymous with microprocessors. Ask a random person to name a semiconductor company of any type, and they’ll likely blurt “Intel.” How many other chipmakers run primetime TV ads? How many microprocessor companies are household names among people who don’t know what a microprocessor is? For years, major PC manufacturers like Dell, HP, and Acer have fought the perception that they’re simply making “Intel PCs.”

But by age 40, many of us start to slow down a bit and lose some youthful agility. Intel’s dominance in CPUs is also undeniably waning. Indeed, in terms of units shipped, Intel hasn’t had the lead for many years (or ever, depending on what chips you count). Mobile devices are overwhelmingly Arm-based, and they ship in larger volumes than PCs ever did. And embedded devices, which ship in even greater quantities, are rarely x86-based, either. Big servers are where Intel still holds roughly 99% market share, and those expensive chips prop up Intel’s corporate margins. But even those earnings aren’t what they used to be.

Intel’s combined design-and-manufacturing strategy ran like clockwork. With each “tick” of the clock, the company would roll out an upgraded manufacturing process. Each “tock” brought a new microarchitecture. Thus, Intel’s processors got better all the time, either through design improvements or manufacturing improvements, or both. It was an inexorable freight train of upgrades and one that competitors (mostly AMD) had a hard time matching.

Until Intel stumbled. Like now. It’s hard to keep upgrading the same old x86 microarchitecture in ways that (a) don’t break backward compatibility, and (b) deliver performance improvements that people want. Having more cores helps, but only to a point – and we’re already there. More threads can be better (depending on your application), but they’re hard to implement. Energy efficiency is in demand everywhere, but have you tried designing a low-power x86? That’s almost an oxymoron.
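The “more cores helps, but only to a point” argument is essentially Amdahl’s law: the serial fraction of a workload caps the speedup no matter how many cores you add. Here is a minimal sketch in Python (the 90% parallel fraction is an assumption for illustration, not a figure from the article):

    # Amdahl's law: ideal speedup is limited by the fraction of work that
    # cannot be parallelized, which is why piling on cores eventually
    # stops paying off.

    def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
        """Ideal speedup on `cores` cores when `parallel_fraction` (0..1)
        of the work can be spread across them."""
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    if __name__ == "__main__":
        # Assume 90% of the workload parallelizes -- an optimistic figure.
        for cores in (1, 2, 4, 8, 16, 64, 1024):
            print(f"{cores:5d} cores -> {amdahl_speedup(cores, 0.90):5.2f}x speedup")
        # The curve climbs quickly at first, then flattens near the 10x
        # ceiling imposed by the 10% serial portion.

Even with a thousand cores, the 10% serial slice holds the speedup under 10x, which is the sense in which we’re “already there.”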

At the same time, Intel’s in-house manufacturing technology is slipping. Its 10nm process node has been delayed – again – just as other vendors have started catching up. Once the undisputed leader in semiconductor fabrication, Intel is now falling behind “commodity” foundries like Samsung and TSMC. And, since Intel’s competitors – think AMD, Qualcomm, Nvidia, and others – use third-party foundries like TSMC almost exclusively, they now find themselves in an unfamiliarly strong manufacturing position. Plus, they don’t have to pay the eye-watering costs of maintaining a proprietary, in-house fab line. Things are looking up for the little guys.

The trouble with Intel’s tick-tock strategy is that it needs to run like clockwork or the whole mechanism fails. A stumble in the fab department compounds troubles in the design department. Intel has made a fortune building ludicrously complex chips using absurdly advanced manufacturing technology. But demand for those chips is declining, there are fewer ludicrous tricks left to throw at the designs, and the manufacturing is no longer absurd enough to make up the shortfall. After a long and unimaginably profitable run, Intel’s time may finally be up. Not just the x86 – the company as a whole. Unless Intel performs a massive pivot and becomes (for example) a third-party foundry for hire, there’s nothing but a long, slow decline toward the retirement home.

Sometimes when people win the lottery (as Intel did) they imprudently burn through all the money in record time. They think they’ll be rich forever, but instead they land right back in their middle-class lives with an embarrassed whump. It was nice while it lasted, etc. Intel won the big prize and has done a remarkable job parlaying that into a long-lasting dynasty and a solid corporation that provided real value to its customers, vendors, and employees. A pillar of the community. But all pillars crumble, and even shrewdly invested lottery winnings run out eventually. Luck struck Intel once in its youth, but now it’s time to plan for that long, slow retirement.

2 thoughts on “Intel’s x86 Turns 40: Full of Vigor or Doddering Retiree?”

  1. Not only x86, but the whole RISC approach is doddering. Especially for embedded systems and compute intensive systems where accelerators are popular.

    RISC is primarily justified by the ability of compilers to generate native code, BUT the reality is that there is a thing called intermediate language (CIL, MSIL, etc.) and a JIT compiler back in the bushes, yet no one even mentions that.

    There is a new compiler, “Roslyn,” that really gets down to what is important: if/else, for, while, do, switch, and assignment expressions. It also has arrays, lists, queues, and stacks, among other things that enable applications to focus on function rather than on how many cores there are and whether out-of-order execution is possible (as if these things are truly worth the cost). Of course, the cost of cache coherency in performance, power, and resources is not insignificant.

    Professor Dijkstra summed it up: “Complexity sells.”

    Then there is “Mono” and Roslyn running on Unix so the Microsoft haters do not have to touch the evil Windows thing. I think Roslyn also runs on Mac.

    Hopefully others will see “the light”. Yes, Virginia, there is a better way to design a computing engine.
    I have a primitive prototype running and am working on an improved version using the Roslyn/CSharp Syntax API.
