
Could You Make Your Own Processor?

Could you design your own microprocessor or microcontroller? Would you want to?

Designing a microprocessor is one of those EE-student daydreams, like mechanical engineers doodling cars in their notebooks or art students who sketch, well, sketches. It’s a cool idea, and any EE undergrad worth his salt knows, just knows, he could do it better than anyone else.

What would you put in yours? And what would you leave out?

You’d probably make it fast, and power-efficient, and easy to program. You might add a couple of interesting and clever instructions that are curiously absent from all the mainstream chips. What would yours do? Would it include floating-point math, graphics functions, cryptographic primitives, ASCII string handling, BCD arithmetic, or something else?

And what would you jettison, if anything? RISC dogma says it’s important to leave out anything not absolutely necessary. Save every opcode and transistor you can. But we’ve seen that doctrine watered down over the years until it’s hardly recognizable. Maybe less is really less after all.

Designing your own custom microprocessor used to be nothing more than a fun thought experiment, but with today’s big FPGAs you could actually build a working CPU of your own. It might not use the FPGA’s resources very efficiently or run very fast, but it would probably work. Serious 32-bit processors including ARM, PowerPC, MIPS, ARC, Tensilica, and others have all been implemented in standard FPGAs, so it’s obviously doable.

So why don’t we all design our own processors? Two of the companies mentioned above, Tensilica and ARC (now part of Virage Logic), encourage us to do exactly that: design our own customized 32-bit CPU using their design tools and their framework as a starting point. It’s a charming idea: you get exactly the processor you’ve been dreaming of since Computer Science 101, but with the backing and support of a real commercial entity.

The benefits (apart from delayed gratification) are that you can dial in the performance and capabilities you want, and you get a monopoly on that design. For example, if your application requires a lot of oddball bit-twiddling that mainstream processors don’t do well, you could design a CPU with a special instruction just for that. Or an instruction that calculates checksums, or one that branches on weird conditions – whatever you want to make up. If you’re clever, you might be able to create a chip that runs much faster than any other processor out there, at least on your code.
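To make that concrete, here’s a sketch in plain C of the kind of instruction you might add – a single-cycle step of a one’s-complement checksum, the operation at the heart of the Internet checksum (RFC 1071). Vendors like Tensilica and ARC have their own description languages for this; the function names and operand choices below are purely illustrative, a software model of what the hardware would do in one cycle.

```c
#include <stdint.h>

/* Model of a hypothetical custom instruction:
 *   CHKSUM rd, rs1, rs2
 * Adds a 16-bit word into a running one's-complement sum,
 * folding the carry back into the low 16 bits. In hardware
 * this is one cycle; in plain C it takes several operations
 * per word -- which is exactly why you'd want the instruction. */
static uint32_t chksum_step(uint32_t acc, uint16_t word)
{
    acc += word;
    /* Fold any carry out of the low 16 bits back in. */
    acc = (acc & 0xFFFF) + (acc >> 16);
    return acc;
}

/* Accumulate over a buffer, as a loop around the instruction would. */
uint16_t chksum(const uint16_t *buf, int n)
{
    uint32_t acc = 0;
    for (int i = 0; i < n; i++)
        acc = chksum_step(acc, buf[i]);
    return (uint16_t)~acc;  /* final one's complement */
}
```

Collapsing those four or five operations per word into one opcode is the whole pitch: on checksum-heavy code, the custom chip wins easily; on everything else, it’s just another CPU.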

You also own the design, in the sense that it’s your private architecture and no one else can copy it. For starters, that’ll pretty comprehensively obfuscate your code, since no one can disassemble the object code from a processor they’ve never seen before. In fact, all it would take to confuse hackers is one custom opcode that performs an unknown function, even if the rest of your code is bone-stock 8051.

But a custom instruction set also means custom tools: you bear the burden of your own software development. There’s no GNU compiler for a processor you just made up last Thursday. The DIY processor companies like ARC and Tensilica supply a C compiler that’s customized along with the processor, which is enough to get you started, but you’ll never enjoy the third-party tool support of a mainstream chip. And that’s the major rub.

Could You Sell It?

Even if you did design your own super-processor – and there are plenty of cases where engineers have done just that – could you sell the idea to your boss? Would management go for it or would they run screaming for the exits?

In many cases, it’s the latter. Better isn’t always better, in the sense that better engineering solutions aren’t always better commercial solutions. This is just one reason why engineers are suspicious of management, and vice versa. Neither side is stupid (steady, now…); it’s just that they’re paid to optimize different results. Whenever you have players with different incentives, you’re going to see them behave differently, even if they’re on the same team. You move the cheese, you change the path through the maze.

At any rate, your super-duper CPU may be all sorts of better in terms of hardware efficiency, but what about software? It’ll be hard to hire and train programmers if they’ve never seen your processor before. And where do you go for technical support? Any hardware bugs would almost certainly be your own fault. An emulator? Who’s got one of those? Interface logic… peripherals… simulation models… probes… the list of missing items is long and daunting. Your custom processor had better be a whole lot better than anything else available to make up for those shortcomings.

In the end, most of us knuckle under and use a mainstream processor that’s marginally okay for the task. It’s not that we’re in love with the chip or its instruction set; it’s that we love the idea of someone else creating the support infrastructure. Mediocre tools are better than no tools. And creating all your own tools is time-consuming, error-prone, and not what most of us get paid for. This ain’t “New Yankee Workshop.” Like most things in engineering, it’s a reasonable compromise.

The evidence suggests that inertia outweighs innovation. Intel’s x86 family of processors is the longest-running dynasty in chipdom, precisely because Intel never overhauled it by “improving” the miserable instruction set or the spectacularly awkward memory model. The x86 family accumulates ghastly features the way a ship accumulates barnacles. And yet it keeps steaming ahead with nary an iceberg in sight. The same goes for the 8051, and pretty much all of the most popular processors. This is not a coincidence. Processor design is one area where we don’t really want progress or innovation. I’ll have more of the same, please. Keep ’em coming, barkeeper. 
