
Whither Embedded?

Picking the Brains at ESC

At last month’s Embedded Systems Conference (ESC) in San Jose, I sat down with executives from four embedded-systems companies to pick their brains. We had what the political pundits call a roundtable discussion, although the table was rectangular, the chairs were hard, and there were no TV cameras. Together we spent an hour jawboning about the embedded-systems business, where we’re all headed, and what’s really good and bad about current technology.

Much of what they (and I) said isn’t printable, but there were some good nuggets of original insight, which is to be expected when you get four bright people in a room (plus me).

The brave participants were Ian Ferguson of ARM, David Stewart of CriticalBlue, Ramana Jampala of CebaTech, and Jay Johnson of Atmel. We had adult supervision in the comely form of Leslie Cumming of Skye Marketing Communications in Portland.

Boutique or Consolidation?

The first topic we discussed was consolidation. Is our industry headed toward massive corporate consolidation, so that 10–20 years from now there will be only 3–5 big companies controlling all of embedded hardware and software?

There’s plenty of evidence to suggest this might happen. There used to be more than 100 automobile companies in the U.S. before General Motors started acquiring and consolidating them. Now you could argue there are only one or two American car companies. Railroads have also consolidated over the years. And we used to have lots of engineering workstations from Daisy, Apollo, Mentor Graphics, Sun, Sony, and others. No longer.

The alternative view is that we’ll fragment (or more accurately, stay fragmented), with thousands of “boutique” embedded companies, each with its own particular specialty niche. In marketing speak, we’ll be disaggregating. Which way do our experts see it going?

ARM’s Ian Ferguson pointed out that the silicon business must consolidate. With the cost of new silicon foundries now topping $6 billion, there’s no way small startups can get into that business. Even large chip companies like NEC and Toshiba are consolidating to share the staggering costs. Semiconductor manufacturing is no longer a differentiating factor because no one can afford to make it one.

Jay Johnson from Atmel echoed that sentiment, pointing out that chip manufacturing “must be the worst business ever. Everything in life gets more expensive except the stuff we sell: chips.” Ounce for ounce, transistors are more expensive than gold, yet more plentiful than grains of rice. It’s a paradox of our industry, and it makes it tough to make money building chips.

If chip makers are consolidating, what about software? Is it just the icing on the silicon cake, or a real standalone industry? Jay Johnson says it depends on the customer. Some use standard or semi-standard parts (such as FPGAs) and innovate through software only. That software becomes the customer’s “value-added,” although that’s sometimes tough to do. “Some customers don’t have any intellectual property,” he adds wryly. I’ve met customers like that, too.

David Stewart of CriticalBlue has observed that his customers don’t really design chips much anymore. Instead, they prefer to use “standard hardware platforms from the semiconductor vendors and add value through software. But if the software comes from the semi vendors, too, how do they differentiate? By the shape of the box?”

CebaTech’s Ramana Jampala compared embedded design to the modern automobile industry. Today’s carmakers don’t really invent anything, he says. They’re more like system integrators, combining innovative products and technologies from third-party suppliers. Likewise, he sees embedded-systems companies becoming overarching “brands” that market innovation from others. In that sense, the industry is headed for both consolidation and fragmentation.

Chip makers have been acquiring software firms for years. Will this continue? Freescale acquired tool-chain supplier Metrowerks more than a decade ago. More recently, Intel bought Wind River and Cavium acquired MontaVista. Is acquisition the new exit strategy for software companies, and, if so, what happens when each of the processor makers has acquired its favorite software firm?

David Stewart pointed out that there’s nothing to stop hardware vendors from acquiring more than one software company, so there’s not necessarily a limit to how long this can go on. But all the participants agreed that perhaps there’s now an upper limit to how big a software company can grow before it becomes acquisition bait. That may be what the founders/owners want, but it does prevent software companies from becoming very large. There may never be another Wind River.

Is Multicore Just a Fad?

Moving on to our next topic, we took a hard look at multicore. Is it just a fad, the latest technology du jour? Or are multicore chips here to stay, and do they solve a real problem?

Jay Johnson was clear: “You don’t do multicore just because you can. You need to have a specific reason. You need a well-defined problem and a well-defined core to handle it.”

A few in the group felt that there’s no good reason for standardized multicore processors, meaning chips with multiple identical cores. The reasoning was that, unlike the PC business, embedded designers are always solving specific problems, not merely throwing CPU cycles at standardized office applications. “Integration for the sake of integration,” as one participant put it, “is just a waste of transistors.”

CriticalBlue’s Stewart echoed that concern, with a difference. In his experience with networking customers, those working on data-plane applications can easily partition the work and apply multiple cores and/or accelerators to the problem. Those working on control-plane applications, on the other hand, tend to use SMP (multiple identical cores) instead. They have a tougher time figuring out how to apply the compute resources.

Ramana Jampala pointed out that multicore programmers “have to think like hardware engineers, handling control planes, buffer management, and so on.” Working with multicore processors isn’t just programming single-core processors, only faster. It’s a fundamentally different discipline that almost no one has learned.

Ian Ferguson’s view is that multicore is definitely not a technology fad; it’s here to stay. Too many customers have licensed ARM’s (multicore) Cortex-A9 processor design instead of the less-expensive (and single-core) A8 for it to be a whim. ARM’s initial forays into multicore design with the older ARM9 and ARM11 showed there’s real demand for the technology.

Reinforcing the belief that multicore software is tougher than multicore hardware, someone remarked that, with so many multicore chips out there, making the chips themselves “can’t be that hard to do.”

Atmel’s Johnson pointed out that using nothing but standard processors, whether multicore or not, and adding value through software has its own pitfalls. “You can’t always protect your software [from piracy] adequately enough to defend your product. You have to add something in hardware.”

In general, the group preferred mixed multicore processors over multiple identical cores, à la PC processors, and not just because they don’t work at Intel or AMD. Multiple identical cores don’t solve a specific problem. They just waste space and power, which is mostly irrelevant for PCs but can be a big problem in embedded systems.

Finding ways to exploit those cores is still an inexact science, however. Says David Stewart: “Engineers do pragmatic things. They make the minimum number of changes to get the job done. They ask themselves: ‘What’s the minimum set of code changes I can make to meet my performance target?’” They’re not interested in the academic exercise of finding the optimal parallelization strategy. “They’ve already gone home,” he says.

Ramana Jampala foresees multicore being most popular in new applications where there’s little or no existing code that needs to somehow get “multicore-ized.” Existing software running on traditional single-core processors can’t be easily upgraded, and most programmers aren’t interested in the exercise.

Is a lack of multicore software holding back multicore processors? ARM’s Ian Ferguson says it’s getting better. “We’re seeing more SMP-ready operating systems.”

More to come next week.
