
Whither Embedded

Part Deux

In this second installment of our big embedded-executive brain dump, we delve a bit more into the pros and cons of multicore processors and multiprocessor systems, but also take a look at programming languages and customer trends. Here’s what’s happening. 

All four participants agreed that the first multicore project is the hardest. “Moving from one core to two cores is the biggest step,” said CriticalBlue’s CEO David Stewart. “After that, four, eight, even thirty-two cores is a smaller step.”

Ramana Jampala of CebaTech expressed a similar sentiment. Design engineers “have to prepare; it’s a planning process. Programmers have to plan the migration of their software.” Educating – or reeducating – programmers won’t happen overnight, but it’s mandatory for effective multicore programming. Or even to keep one’s head above water.

One member felt that “each industry segment will get there at its own rate. Networking is already there. Handsets are almost there. Other markets, like automotive and industrial, will take longer.”  

Because of the training and inertia involved, programmers may sidestep multicore programming and instead throw hardware at the problem. Ramana gave an example: “If my customer says to me, ‘Make it go faster, make it more efficient,’ would I risk my sequential software and existing hardware tools for parallelization, or would I try to push the boulder uphill by trying to make this software multithreaded?” In his view, it’s easier to take advantage of hardware accelerators and other specialized offloads than to try to integrate and program more processors.
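To make that trade-off concrete, here is a minimal, hypothetical sketch (not any panelist’s actual code) of what the “first step” into multithreading looks like in plain C with POSIX threads: a serial byte-sum split across two worker threads. Even at this toy scale, the programmer has to decide how to partition the data and how to combine the partial results, which is exactly the kind of planning Ramana describes.

/* Hypothetical example: splitting a serial byte-sum across two POSIX threads.
 * Illustrative only; the partitioning and result-combining choices are the
 * "planning" the panelists describe. Build with: cc -pthread sum2.c */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define N (1u << 20)

struct slice {
    const uint8_t *data;   /* start of this thread's chunk */
    size_t         len;    /* number of bytes in the chunk */
    uint64_t       sum;    /* partial result written back  */
};

static void *sum_worker(void *arg)
{
    struct slice *s = arg;
    uint64_t total = 0;
    for (size_t i = 0; i < s->len; i++)
        total += s->data[i];
    s->sum = total;        /* no sharing: each thread owns its own slice */
    return NULL;
}

int main(void)
{
    uint8_t *buf = malloc(N);
    for (size_t i = 0; i < N; i++)
        buf[i] = (uint8_t)i;

    /* Serial version: one loop, one core. */
    uint64_t serial = 0;
    for (size_t i = 0; i < N; i++)
        serial += buf[i];

    /* Two-core version: partition the buffer, spawn, join, combine. */
    struct slice half[2] = {
        { buf,         N / 2,     0 },
        { buf + N / 2, N - N / 2, 0 },
    };
    pthread_t tid[2];
    for (int t = 0; t < 2; t++)
        pthread_create(&tid[t], NULL, sum_worker, &half[t]);
    for (int t = 0; t < 2; t++)
        pthread_join(tid[t], NULL);

    uint64_t parallel = half[0].sum + half[1].sum;
    printf("serial=%llu parallel=%llu\n",
           (unsigned long long)serial, (unsigned long long)parallel);
    free(buf);
    return 0;
}

Going from two threads to four or eight reuses the same structure, which is one reason the panel saw the later steps as smaller ones.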

He gave another example of a networking customer struggling with a high-end Intel “Nehalem” (Core i7) processor. “He could barely get 500–600 Mbps. But teamed with hardware offload in the form of an FPGA, we were able to identify parallelization using automated tools and produce hardware at one-third to one-fourth the cost of the Intel CPU, and 5x better performance. I’m not saying multicore or multithreaded software couldn’t have done the job,” he added, “but the amount of effort involved wouldn’t have been worth it.”

Complex as they are, multicore processors are in ever-growing demand. ARM’s Ian Ferguson mentioned one customer (also in the networking business) who is “putting down eight Cortex-A9s in a base station, in two clusters of four.”

Customers want it, he insists, and ARM is accommodating their wishes. “We made some basic assumptions about how SMP [symmetric multiprocessing] would be instantiated… we’ve never really been in network infrastructure.” In designing the Cortex-A9, ARM assumed that a four-way cluster would be sufficient and designed the core with this in mind. Now that at least one customer is clustering double that number, the company may need to rethink that limitation.
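From the software side, an eight-core SMP part mostly shows up as eight schedulable CPUs. As a rough, hypothetical illustration (assuming Linux with GNU extensions, and not anything ARM or its customer actually runs), an application might discover the core count and start one pinned worker per core like this:

/* Hypothetical sketch: discover how many cores an SMP Linux system exposes
 * and start one worker thread pinned to each core.
 * Assumes Linux with GNU extensions (pthread_attr_setaffinity_np). */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <unistd.h>

static void *worker(void *arg)
{
    long cpu = (long)arg;
    printf("worker started on CPU %ld\n", cpu);
    /* ... per-core processing loop would go here ... */
    return NULL;
}

int main(void)
{
    long ncpus = sysconf(_SC_NPROCESSORS_ONLN);  /* e.g. 8 on two clusters of four */
    if (ncpus > 64)
        ncpus = 64;
    pthread_t tid[64];

    for (long cpu = 0; cpu < ncpus; cpu++) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET((int)cpu, &set);                 /* pin this worker to one core */

        pthread_attr_t attr;
        pthread_attr_init(&attr);
        pthread_attr_setaffinity_np(&attr, sizeof(set), &set);
        pthread_create(&tid[cpu], &attr, worker, (void *)cpu);
        pthread_attr_destroy(&attr);
    }
    for (long cpu = 0; cpu < ncpus; cpu++)
        pthread_join(tid[cpu], NULL);
    return 0;
}

Whether those eight cores then behave as one coherent SMP machine or as two more loosely coupled four-core clusters is the kind of assumption ARM says it now has to revisit.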

Are we as humans multicore capable?

There’s a school of thought that the problem with multicore programming isn’t the chips or the compilers – it’s us. As humans, we’re simply not built to grok multicore programming. Our thought processes are inherently serial, not parallel (so the reasoning goes), so we’re congenitally ill-suited to the task. If that’s so, is there anything we can do about it?

Perhaps a new programming language is required. Or even a whole new programming paradigm that uses symbols (for example) or flowcharts or schematics. There’s been no lack of effort in this area, all the way from academic research teams to unemployed college students with time on their hands. Why has (almost) no one adopted these new languages?

David Stewart replied, “That’s not how people work. We’ve always been through evolutions, and probably always will.” The rest of the group agreed. There’s never a good point when everyone can change. Working engineers have to get tomorrow’s product out the door, so there’s never a convenient two-year gap to relearn everything. We instead make incremental improvements, precluding any wholesale changes in our programming habits or mindsets. As academically attractive as a new programming paradigm might be, it’s impractical. Like any language, a new one is useful only if people actually adopt it and use it.

It’s ten years later; do you know where your customers are?

Toward the end of our discussion we talked about changes in the customer base. Are embedded-systems suppliers selling to the same customers they were ten years ago? And do they expect to be selling to the same customers ten years from now?

Atmel’s Jay Johnson predictably replied, “yes and no.” In his case, companies that laid off ASIC designers are now gravitating toward FPGAs and other programmable products like Atmel’s. “Kids out of [engineering] school are taught that FPGAs are the way to customize.”

He also noted a shift in the customer base over the last decade. Most of Atmel’s programmable chips used to go into automotive and industrial applications. Now it’s mostly consumer electronics. The switch happened about three years ago and seems permanent. Short consumer design cycles and an urgent need to differentiate products have led designers to rely on programmable logic for an edge.

CebaTech’s Ramana said his customer demographics haven’t changed over time. “We appeal mainly to software companies without a lot of hardware-design expertise, but who need a lot of acceleration.”

ARM is still doing pretty well in mobile applications, said the company’s Ian Ferguson. But he also noted that power consumption is more important to more people than before, and not just in battery-powered products. That’s partly what led ARM to break out its processor product line into three separate Cortex-A, Cortex-R, and Cortex-M families.

And in the end…

Although there was a lot of agreement among our group of experts, they often agreed to disagree. There appears to be no one solution to the “multicore problem,” and it’s been interesting to watch customers feel their own way through the darkness. Some embrace it wholeheartedly; others are in denial. Many take the pragmatic course of modifying as little of their hardware and software as possible to eke out an incremental improvement in performance. Everyone looks to everyone else for the “right” approach, or waits for the industry at large to deliver a breakthrough that’ll make all this complexity go away.

As much as we might like the fairy tale of the white knight riding to our rescue, that’s not likely to happen. Engineers will do what engineers have always done: slog through the muddy waters of design decisions, picking the path that’s best (or least miserable) for their particular problem. 

Even if there were an ideal engineering solution – even if we could somehow prove that it was the best, most power-efficient, cheapest, fastest way – we’d still have different engineers following completely different approaches. That’s just the perversity of engineering. And if history is any indication, the worst one will be the most successful.

Which is cool; that’s what makes engineering fun.
