
Watching the Weightless

Integration Transcends Moore’s Law

We engineers like things we can count and measure. Our professional comfort zone is in the metrics, multipliers, and deltas – hard data on which we can rest the reassurance that we’ve progressed, that we’ve solved the world’s problems one bit at a time. Moore’s Law, of course, is a solid long-arc barometer for the engineering collective – a sequence of numbers on which we can hang our collective hats, carving notches in the stock as we toil our way toward infinity. And, at some level, it is the basis on which we’ve built our businesses, seeking remuneration for our engineering achievements.

Quietly, though, that framework has shifted. While our hypnotic stare has been affixed to the exponentially growing number of transistors, our currency has changed. The limit, it seems, as our principal metric approaches infinity is a new term – the result of a different function. For decades, the engineers at the foundation of the technology pyramid created components – integrated arrays of transistors that combined in new and interesting ways to provide higher-level functions, which in turn were arrayed to produce boards, devices, and (when overlaid with software) systems. These systems solved end-user problems.

As IoT has taken center stage, however, a different tide is moving us forward. No longer is the tally of transistors a valid measure of the technological milestone we’ve achieved. Integrated stacks of silicon and software embody a collective experience that has snowballed through the decades in a sum-is-exponentially-greater-than-the-parts equation that defies any attempt at measurement.

Whoa! What does that even mean?

A press release crossed my desk recently from Intel, announcing that the company had collaborated with NEC on the creation of NEC’s NeoFace facial recognition engine. The release explains that NeoFace, on the hardware side, combines FPGA acceleration with conventional processing elements to deliver impressive performance in facial recognition benchmarks conducted by the U.S. National Institute of Standards and Technology (NIST). The release says:

“The NIST tests evaluated the accuracy of the technology in two real-life test scenarios including a test for entry-exit management at an airport passenger gate. It determined whether and how well the engine could recognize people as they walked through an area one at a time without stopping or looking at the camera. NEC’s face recognition technology won first place with a matching accuracy of 99.2 percent. The error rate of 0.8 percent is less than one-fourth the second-place error rate.”
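It’s worth pausing on the arithmetic behind that claim, because it’s easy to read past. A quick back-of-the-envelope calculation (mine, not NIST’s) shows how wide the gap to second place actually was:

# Implied bound on the runner-up, derived only from the figures quoted above.
first_accuracy = 0.992                  # NEC's reported matching accuracy
first_error = 1.0 - first_accuracy      # 0.8 percent error rate
second_error_floor = 4 * first_error    # "less than one-fourth" => runner-up erred on more than this
print(f"second place missed more than {second_error_floor:.1%} of faces,")
print(f"for a matching accuracy below {1.0 - second_error_floor:.1%}")

In other words, NEC missed fewer than one person in a hundred; the runner-up missed more than three.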

Obviously, this is a tremendous technological achievement. Identifying 99.2 percent of people in real time from a 4K video stream, as they walk through a complex environment without stopping or pausing, and with diverse lighting and other environmental challenges, is shockingly close to “magic”. The technology behind NeoFace is daunting. It packs extreme computational power through heterogeneous computing, with FPGAs doing the heavy lifting as hardware accelerators for the recognition algorithms. It fuses machine learning, heterogeneous computing, big data, video processing, and a host of other esoteric technologies into a vision solution that achieves what seems to be the impossible. And, even though it might outperform a human in real-time identification of a random collection of people, it is managing only a tiny sliver of what true human vision can accomplish.
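For the curious, the division of labor in such a system is easier to see in code than in prose. What follows is a minimal, purely illustrative sketch in Python, in no way NEC’s actual implementation: a CPU host pipeline that offloads its hottest stage, descriptor matching, to an accelerator backend when one is present. The fpga_kernels module named here is hypothetical, a stand-in for a vendor accelerator runtime.

import numpy as np

def cpu_match(probe: np.ndarray, gallery: np.ndarray) -> int:
    """Baseline matcher: cosine similarity of one probe descriptor
    against every enrolled gallery descriptor."""
    scores = gallery @ probe / (
        np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe) + 1e-9
    )
    return int(np.argmax(scores))

def pick_matcher():
    """Prefer an accelerator kernel when available; fall back to the CPU.
    'fpga_kernels' is hypothetical, standing in for a vendor runtime."""
    try:
        import fpga_kernels   # hypothetical accelerator runtime
        return fpga_kernels.match
    except ImportError:
        return cpu_match

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = rng.normal(size=(10_000, 128))                 # 10k enrolled 128-d descriptors
    probe = gallery[42] + rng.normal(scale=0.05, size=128)   # noisy re-capture of identity 42
    print("best match:", pick_matcher()(probe, gallery))     # expect 42

The structure makes the article’s point in miniature: the matcher’s caller neither knows nor cares whether silicon or software did the work.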

Put another way, NeoFace represents but one narrow slice of the rapidly exploding spectrum of embedded vision. As we work our way up the capability chain, we will undoubtedly see strong trends emerge in the hardware, software, and methodologies used to achieve machine vision, but for now each real-world application will most likely be uniquely focused. The “what’s going on in this scene” test for intelligent vision demands a far greater awareness of context than anything we’ve seen thus far.

But, NeoFace is relevant to our discussion in another way. It’s the shape of solutions to come. It represents an emerging era where the number of transistors or the number of lines of code embodied in a solution conveys little relevant information about how advanced or sophisticated the solution is. NeoFace definitely contains billions of transistors, but so does your average RAM array. NeoFace definitely embodies millions of lines of source code, but so do any number of dull, pedestrian software systems. No metric we have yet devised characterizes the sophistication of solutions that integrate the state of the art in transistor density, compute acceleration, software development, machine learning, big data, and on and on.

As the number of transistors and the number of lines of code approach infinity, we definitely hit a vanishing point in their usefulness as metrics. Something else defines the sophistication of our solution. As applications start to develop and improve themselves as a result of their deployment history, even the number of hours of engineering investment in developing a system becomes almost moot. We lose the ability to measure what we’ve created in the usual ways.

This loss of metrics is frightening because it affects the nature of what we sell. IoT hasn’t just moved us from selling stand-alone systems to selling distributed, networked systems. It fundamentally changes our value proposition. There really is no meaningful, sustainable way to sell IoT as “devices” or “systems”. The thing we are actually selling is “capability”. With NeoFace, NEC is selling the capability to recognize human faces. Yes, there are Xeon processors and FPGAs and software and boards and modules and memory and data – but none of that has any real meaning outside the context of the capability.
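To see what “selling capability” might look like in practice, imagine the entire NeoFace stack hidden behind a single service call. The sketch below is hypothetical (the endpoint, fields, and response shape are all invented for illustration, and this is not NEC’s interface), but it captures the shape of the transaction:

import requests  # third-party HTTP client: pip install requests

RECOGNIZE_URL = "https://api.example.com/v1/recognize"  # hypothetical endpoint

def identify(image_bytes: bytes, api_key: str) -> dict:
    """Submit one camera frame; receive candidate identities with scores.
    Everything behind the URL -- Xeons, FPGAs, models, data -- is invisible."""
    response = requests.post(
        RECOGNIZE_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": ("frame.jpg", image_bytes, "image/jpeg")},
        timeout=5.0,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"candidates": [{"id": "a1b2", "score": 0.97}]}

The buyer pays for recognitions, not for transistors or lines of code, which is precisely why those metrics stop mattering.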

As system integrators, we will move more and more to stitching systems together by acquiring and integrating “capabilities” rather than components. And, capabilities-as-services may well be a dominant business model in the future. But, as we create technology that can perform increasingly sophisticated tasks previously reserved only for humans, and as we begin to endow that technology with super-human capabilities, it will become increasingly difficult to put a fence of ownership around a particular piece of hardware, software, data, or IP. The capability itself will have transcended decomposition, and therefore ownership. While the practice of engineering may not change, the economics and the product of it most certainly will.
