
Watching the Weightless

Integration Transcends Moore’s Law

We engineers like things we can count and measure. Our professional comfort zone is in the metrics, multipliers, and deltas – hard data on which we can rest the reassurance that we’ve progressed, that we’ve solved the world’s problems one bit at a time. Moore’s Law, of course, is a solid long-arc barometer for the engineering collective – a sequence of numbers on which we can hang our collective hats, carving notches in the stock as we toil our way toward infinity. And, at some level, it is the basis on which we’ve built our businesses, seeking remuneration for our engineering achievements.

Quietly, though, that framework has shifted. While our hypnotic stare has been fixed on the exponentially growing number of transistors, our currency has changed. The limit, it seems, as our principal metric approaches infinity, is a new term – the result of a different function. For decades, the engineers at the foundation of the technology pyramid created components – integrated arrays of transistors that combined in new and interesting ways to provide higher-level functions, which in turn were arrayed to produce boards, devices, and (when overlaid with software) systems. These systems solved end-user problems.

As IoT has taken center stage, however, a different tide is moving us forward. No longer is the tally of transistors a valid measure of the technological milestone we’ve achieved. Integrated stacks of silicon and software embody a collective experience that has snowballed through the decades in a sum-is-exponentially-greater-than-the-parts equation that defies any attempt at measurement.

Whoa! What does that even mean?

A press release from Intel crossed my desk recently, announcing that the company had collaborated with NEC in the creation of NEC’s NeoFace facial recognition engine. The release explains that NeoFace, on the hardware side, combines FPGA acceleration with conventional processing elements to deliver impressive performance in facial recognition benchmarks conducted by the U.S. National Institute of Standards and Technology (NIST). The release says:

“The NIST tests evaluated the accuracy of the technology in two real-life test scenarios including a test for entry-exit management at an airport passenger gate. It determined whether and how well the engine could recognize people as they walked through an area one at a time without stopping or looking at the camera. NEC’s face recognition technology won first place with a matching accuracy of 99.2 percent. The error rate of 0.8 percent is less than one-fourth the second-place error rate.”
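Those two numbers carry an implication worth spelling out. Here is a quick back-of-the-envelope check, assuming nothing beyond the release’s own figures:

```python
# Implication of the press release's figures: if NEC's error rate of
# 0.8 percent is "less than one-fourth the second-place error rate,"
# the runner-up must have misidentified more than 3.2 percent of subjects.

nec_error = 1.0 - 0.992           # 0.8% error, from 99.2% matching accuracy
runner_up_floor = 4 * nec_error   # implied lower bound on second place

print(f"Second-place error rate > {runner_up_floor:.1%}")      # > 3.2%
print(f"Second-place accuracy   < {1 - runner_up_floor:.1%}")  # < 96.8%
```

In other words, the engine makes fewer than one error for every four made by its nearest rival, a gap the headline accuracy figure alone understates.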

Obviously, this is a tremendous technological achievement. Identifying 99.2 percent of people in real time from a 4K video stream, as they walk through a complex environment without stopping or pausing, under diverse lighting and other environmental challenges, is shockingly close to “magic”. The technology behind NeoFace is daunting. It packs extreme computational power via heterogeneous computing, with FPGAs doing the heavy lifting as hardware accelerators for the recognition algorithms. It fuses machine learning, heterogeneous computing, big data, video processing, and a host of other esoteric technologies into a vision solution that achieves what seems to be impossible. And, even though it might outperform a human at real-time identification of a random collection of people, it manages only a tiny sliver of what true human vision can accomplish.
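For a sense of the host-side shape such a system might take, here is a minimal sketch – emphatically not NEC’s actual design – of a pipeline that dispatches its matching kernel to an accelerator when one is present and falls back to a CPU reference path otherwise. Every name in it is hypothetical:

```python
from typing import Callable, List, Sequence

Embedding = List[float]

def cpu_match(probe: Embedding, gallery: Sequence[Embedding]) -> int:
    """CPU reference matcher: return the index of the nearest gallery embedding."""
    def sqdist(a: Embedding, b: Embedding) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(gallery)), key=lambda i: sqdist(probe, gallery[i]))

def make_fpga_match() -> Callable[[Embedding, Sequence[Embedding]], int]:
    """Stand-in for binding a hardware kernel (e.g., an OpenCL program compiled
    to an FPGA bitstream). It wraps the CPU path so the sketch runs anywhere."""
    return cpu_match

def build_pipeline(have_fpga: bool) -> Callable[[Embedding, Sequence[Embedding]], int]:
    """Choose the matching engine once, up front, as a heterogeneous host would."""
    match = make_fpga_match() if have_fpga else cpu_match

    def recognize(probe: Embedding, gallery: Sequence[Embedding]) -> int:
        # Detection, alignment, and embedding extraction would all precede
        # this step in a real pipeline; only the matching kernel is shown.
        return match(probe, gallery)

    return recognize

if __name__ == "__main__":
    gallery = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]  # enrolled identities
    recognize = build_pipeline(have_fpga=False)
    print(recognize([0.9, 1.1], gallery))           # -> 1, the nearest identity
```

The point of the pattern is that the application logic never knows which silicon answered; the accelerator is an interchangeable implementation detail behind a stable interface.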

Put another way, NeoFace represents but one narrow slice of the rapidly exploding spectrum of embedded vision. As we work our way up the capability chain, we will undoubtedly see strong trends emerge in the hardware, software, and methodologies used to achieve machine vision, but for now each real-world application will most likely be uniquely focused. The “what’s going on in this scene” test for intelligent vision demands a far greater awareness of context than anything we’ve seen thus far.

But, NeoFace is relevant to our discussion in another way. It’s the shape of solutions to come. It represents an emerging era in which the number of transistors or the number of lines of code embodied in a solution conveys little relevant information about how advanced or sophisticated that solution is. NeoFace definitely contains billions of transistors, but so does your average RAM array. NeoFace definitely embodies millions of lines of source code, but so do any number of dull, pedestrian software systems. No metric we have yet devised characterizes the sophistication of solutions that integrate the state of the art in transistor density, compute acceleration, software development, machine learning, big data, and on and on.

As the number of transistors and the number of lines of code approach infinity, we definitely hit a vanishing point in their usefulness as metrics. Something else defines the sophistication of our solution. As applications start to develop and improve themselves as a result of their deployment history, even the number of hours of engineering investment in developing a system becomes almost moot. We lose the ability to measure what we’ve created in the usual ways.

This loss of metrics is frightening because it affects the nature of what we sell. IoT hasn’t just moved us from selling stand-alone systems to selling distributed, networked systems. It fundamentally changes our value proposition. There really is no meaningful, sustainable way to sell IoT as “devices” or “systems”. The thing we are actually selling is “capability”. With NeoFace, NEC is selling the capability to recognize human faces. Yes, there are Xeon processors and FPGAs and software and boards and modules and memory and data – but none of that has any real meaning outside the context of the capability.

As system integrators, we will move more and more to stitching systems together by acquiring and integrating “capabilities” rather than components. And, capabilities-as-services may well be a dominant business model in the future. But, as we create technology that can perform increasingly sophisticated tasks previously reserved only for humans, and as we begin to endow that technology with super-human capabilities, it will become increasingly difficult to put a fence of ownership around a particular piece of hardware, software, data, or IP. The capability itself will have transcended decomposition, and therefore ownership. While the practice of engineering may not change, the economics and the product of it most certainly will.
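What might buying a “capability” look like in code? Here is a sketch, with an invented endpoint and an invented response schema, of the integrator’s entire view of a NeoFace-class system:

```python
# A sketch of "capability as a service" from the integrator's side: the whole
# recognition stack (Xeons, FPGAs, models, training data) collapses behind a
# single call. The endpoint and response fields are invented for illustration;
# no real API is implied.

import json
import urllib.request

CAPABILITY_URL = "https://faces.example.com/v1/recognize"  # hypothetical

def recognize(image_bytes: bytes) -> dict:
    """Submit an image to the recognition capability and return its verdict,
    e.g. {"identity": "...", "confidence": 0.99} in this invented schema."""
    request = urllib.request.Request(
        CAPABILITY_URL,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Note what is absent: no transistor counts, no lines of code, no bitstreams.
# The unit the integrator reasons about, and pays for, is the answer itself.
```

Everything this article has been describing – the silicon, the software, the learned models – lives on the far side of that URL, and none of it is individually visible, countable, or ownable from where the integrator stands.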

