
Got MILF?

Microprocessors I’d Like to Find, 2013 Edition

It’s springtime, the groundhog has seen his shadow, Easter is behind us, and a young man’s thoughts turn to… microprocessors.

Every embedded system includes a microprocessor or two, and few things will affect the performance of your system as much as that chip. We all have our favorite CPUs (and a few least-favorite CPUs), but rarely do we find that perfect chip: the one that has absolutely everything we want, just the way we want it. Maybe if we create an imaginary checklist of all the things we’d like to see, one of the microprocessor vendors out there will surprise us come Christmastime.

For starters, I’d want a multicore processor. Surveys show that more than half of embedded systems in design right now will contain multiple processor chips. That’s counting 8-bit, 16-bit, and 32-bit CPUs. Sometimes those are identical chips, sometimes different chips but from the same vendor (mixing different Freescale parts, for example), and sometimes they’re completely different processors from different vendors (an ARM and an x86, for instance). So we’re all over the map as designers. 

Instead, I’d want one chip with multiple CPU cores inside. Separate physical chips take up board space and complicate my PCB routing. One mega-chip would make that easier while still giving me the power and flexibility of multiple CPUs. Multicore microprocessors are pretty common nowadays, but you don’t often see chips mixing different CPU architectures. I’d want mine to combine x86, ARM, 68K, MIPS, and maybe Z-80 just for fun. That way, I could write (or reuse) whatever code I wanted without worrying about software compatibility. Naturally, all the unused cores would power down and consume zero energy. Hey, a guy can dream.

Next, the perfect processor would be fabricated either by Intel on its 22nm FinFET production line, or by TSMC in 28nm bulk CMOS. Why the difference? If my system is power- or performance-critical, I’d want the leading-edge Intel silicon. But if it’s not, I’d rather go for the run-of-the-mill TSMC process and save money. No point putting a Ferrari engine in a grocery-getter.

For peripherals, I’d have none. Well, no fixed peripherals, anyway. Instead, I’d want virtual peripherals implemented either in programmable logic or simulated entirely in software. We’ve seen various companies try both approaches, but never very successfully. Ubicom, for example, did “virtual peripherals” using software emulation, which is a technique I very much like. You can add, remove, or tweak your peripherals simply by changing code. Pinouts are simply a matter of programming, and peripheral throughput is largely up to you. Want another Ethernet port? No problem, just code one up. Need a USB 3.0 or Thunderbolt port to keep up with the engineers next door? Coming right up.
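To make the software-peripheral idea concrete, here’s a minimal sketch of a bit-banged UART transmitter — the “peripheral” is just code that drives a virtual pin level once per bit time. This is not Ubicom’s actual implementation; the function and the sample-buffer framing are invented for illustration.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Software-emulated UART transmitter (illustrative sketch).
 * Instead of toggling real hardware, we capture the pin levels
 * in a buffer, one sample per bit time. */

/* Encode one 8N1 frame (start bit, 8 data bits LSB-first, stop bit)
 * as a sequence of pin levels. Returns the number of samples written. */
static size_t uart_tx_frame(uint8_t byte, uint8_t *pin_samples)
{
    size_t n = 0;
    pin_samples[n++] = 0;                      /* start bit: line low  */
    for (int bit = 0; bit < 8; bit++)
        pin_samples[n++] = (byte >> bit) & 1;  /* data bits, LSB first */
    pin_samples[n++] = 1;                      /* stop bit: line high  */
    return n;
}
```

Want a second UART? Call the routine from another timer context. That’s the whole appeal: the peripheral count is a software decision, not a silicon one.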

Instead of the software-emulation route, the chip could use programmable logic for its peripherals; I’m okay with that. A long time ago, Triscend (now part of Xilinx) had a microcontroller with “soft” peripherals that you could create in just a few seconds using a drag-and-drop software tool. Very slick, and very easy to reconfigure to suit your board layout or your interface needs. I’ll take some more of that, please.

As far as the package for this ideal chip, I’d lean toward an old-school PGA. I know the pin density isn’t very good, but it has the advantage of easy socket access, and it’s easy to get oscilloscope or logic analyzer probes on it. Maybe in production I’d switch to a denser surface-mount package, but for development, give me that old time pin-grid array.

Debug is a big deal. Research into developers’ habits shows that embedded programmers typically spend more time debugging code than writing it, so throwing resources at debug tools is money well spent. The perfect embedded processor would have its own on-chip debug processor: a real 32-bit beast with private on-chip memory that does nothing but handle debugging. I’d have it run diagnostics, monitor breakpoints, manage stack over- and under-runs, check memory bounds, snoop bus cycles, and execute all sorts of elaborate conditional monitors that I’d create on the fly. Ideally, there’d be a big repository of open-source debug routines for this thing so we’d all benefit from each other’s debug experiences.
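For a taste of what those open-source debug routines might look like, here’s a sketch of one classic job — catching stack over- and under-runs with guard words painted at both ends of a task’s stack. All the names here are invented; a real on-chip debug engine would check these in hardware, behind the application’s back.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Stack overrun/underrun monitor (illustrative sketch).
 * Guard words are written at both ends of the stack region;
 * if either gets clobbered, the stack has escaped its bounds. */

#define GUARD_WORD 0xDEADBEEFu

typedef struct {
    uint32_t *base;   /* lowest address of the stack region */
    size_t    words;  /* region size in 32-bit words        */
} stack_region;

static void stack_paint_guards(stack_region *s)
{
    s->base[0]            = GUARD_WORD;  /* overflow sentinel  */
    s->base[s->words - 1] = GUARD_WORD;  /* underflow sentinel */
}

/* Return nonzero if either guard word has been clobbered. */
static int stack_guard_tripped(const stack_region *s)
{
    return s->base[0] != GUARD_WORD || s->base[s->words - 1] != GUARD_WORD;
}
```

The debug processor would poll (or bus-snoop) checks like this continuously, without stealing a single cycle from the application cores.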

Oh, and while we’re at it, the chip should have a few green LEDs on top. You know, so the debug routines can blink diagnostic codes. If the package is big enough, a seven-segment or LCD display on top would be even better, but let’s not get crazy.
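Silly as it sounds, blink codes are a real diagnostic technique, and the encoder is trivial. Here’s a hypothetical sketch: error code N becomes N short flashes followed by a long gap — the classic “count the blinks” scheme. Output is a string of on/off intervals rather than real LED toggles.

```c
#include <assert.h>
#include <stddef.h>

/* Blink-code encoder (illustrative sketch).
 * '1' = LED on for one interval, '0' = LED off for one interval.
 * Error code N produces N short flashes, then a long gap. */
static size_t blink_pattern(int code, char *out)
{
    size_t n = 0;
    for (int i = 0; i < code; i++) {
        out[n++] = '1';   /* short flash */
        out[n++] = '0';   /* short pause */
    }
    out[n++] = '0';       /* extra off-time = long gap between repeats */
    out[n]   = '\0';
    return n;
}
```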

The on-chip cache – and there would be lots of cache – would have to be configurable. I’d want to be able to dial in the cache size to see what it does to performance or predictability. A smaller cache might trim performance a little bit but save power, and maybe that’s a tradeoff I’d want to make. Sometimes I’d want to disable the cache entirely to get deterministic software behavior.
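How would you actually “dial in” the right cache size? One way is offline: replay an address trace through a toy cache model and count hits at each candidate size. Here’s a sketch assuming a direct-mapped cache with 32-byte lines — both assumptions invented for illustration, not a claim about any real part.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Direct-mapped cache model (illustrative sketch).
 * Replays an address trace and counts hits for a given line count. */

#define MAX_LINES 1024

static int count_hits(const uint32_t *trace, size_t n, size_t lines)
{
    uint32_t tags[MAX_LINES];
    int      valid[MAX_LINES];
    memset(valid, 0, sizeof valid);
    int hits = 0;
    for (size_t i = 0; i < n; i++) {
        uint32_t line = trace[i] / 32;   /* 32-byte cache line      */
        size_t   set  = line % lines;    /* direct-mapped index     */
        uint32_t tag  = line / lines;
        if (valid[set] && tags[set] == tag) {
            hits++;
        } else {                         /* miss: fill the line     */
            tags[set]  = tag;
            valid[set] = 1;
        }
    }
    return hits;
}
```

Sweep `lines` over a real trace and you can see exactly where the hit-rate curve flattens out — that knee is where the extra cache stops paying for its power.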

A built-in memory controller would be really handy. I hate designing memory controllers. The ideal one would handle pretty much any type of memory interface ever created, from dumb DRAMs with RAS/CAS timing to several generations of Rambus interface, Hybrid Memory Cube, and modern DDR3/DDR4 chips. SRAM, flash, and NVRAMs would have to be supported, too. The MMU would let me map the memories to any address range I want, of course.
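The “map the memories anywhere” wish boils down to a programmable address decoder. Here’s a minimal sketch of that half of the MMU: a small table maps address ranges onto memory types. The region layout and names are invented for illustration.

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/* Table-driven address decoder (illustrative sketch). */

typedef enum { MEM_NONE, MEM_SRAM, MEM_DRAM, MEM_FLASH } mem_type;

typedef struct {
    uint32_t base, size;
    mem_type type;
} mem_region;

static const mem_region region_map[] = {
    { 0x00000000u, 0x00010000u, MEM_FLASH },  /* 64 KiB boot flash */
    { 0x20000000u, 0x00008000u, MEM_SRAM  },  /* 32 KiB SRAM       */
    { 0x80000000u, 0x10000000u, MEM_DRAM  },  /* 256 MiB DDR       */
};

/* Decode an address to the memory type that backs it.
 * (addr - base) underflows harmlessly for addr < base: the huge
 * unsigned result always fails the < size test. */
static mem_type decode_addr(uint32_t addr)
{
    for (size_t i = 0; i < sizeof region_map / sizeof region_map[0]; i++)
        if (addr - region_map[i].base < region_map[i].size)
            return region_map[i].type;
    return MEM_NONE;
}
```

On the ideal chip, that table would live in writable registers, so remapping your memories is one firmware update, not a board respin.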

Operating systems? I’ll take all of ’em. The ideal chip should come with ready-made ports of Linux, Windows Embedded, µCOS, ThreadX, Android, VxWorks, Integrity, and MQX, plus a few others to be named later. One big DVD with all the images on it would be nice, thank you. Naturally, each OS would come with drivers for all the soft peripherals I’ll be creating.

Also, make sure there’s a firmware setting I can adjust to cap the chip’s maximum power consumption: something that would automatically throttle back the clock speed to stay within my preset power envelope. Separate settings for instantaneous power and average power (say, over 10 seconds) would be handy, thanks.
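The throttle policy itself is simple enough to sketch. Here’s a hypothetical version with both knobs: a sliding-window average for the 10-second cap, and a crude clock back-off when either the instantaneous or average reading exceeds its limit. All names and numbers are invented.

```c
#include <assert.h>

/* Power-envelope throttle (illustrative sketch). */

#define WINDOW 10  /* samples, e.g. one per second over 10 s */

typedef struct {
    double samples[WINDOW];
    int    idx, count;
} power_monitor;

/* Record a power sample; return the average over the window so far. */
static double record_power(power_monitor *m, double watts)
{
    m->samples[m->idx] = watts;
    m->idx = (m->idx + 1) % WINDOW;
    if (m->count < WINDOW) m->count++;
    double sum = 0.0;
    for (int i = 0; i < m->count; i++) sum += m->samples[i];
    return sum / m->count;
}

/* Pick a clock in MHz: full speed unless a cap is exceeded. */
static int throttle_mhz(double inst_w, double avg_w,
                        double inst_cap, double avg_cap, int full_mhz)
{
    if (inst_w > inst_cap || avg_w > avg_cap)
        return full_mhz / 2;  /* crude back-off: halve the clock */
    return full_mhz;
}
```

A real implementation would step through several frequency/voltage points rather than halving outright, but the shape of the policy is the same.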

This should all come on a $99 evaluation board with a bunch of USB connections, a Wi-Fi daughterboard, AA batteries, LCD and LED displays, PCI Express, and a TV tuner. And be waterproof. Complete circuit schematics, PCB layout, and firmware source code included.

That should about cover it for now. Got that, all you embedded hardware vendors? We’ve just given you the key. All you have to do is follow this simple recipe and you’ll sell a million of ’em. Every engineer, programmer, and development manager in the world will beat a path to your door. Of course, you’ll probably lose a few thousand dollars on every unit you sell, but so what? You’ll make it up in volume. 
