
The Hardware Vanishing Point

Someday, Will It All Be Software?

The disciplines of hardware and software engineering have always been intertwined and symbiotic – like the yin and yang of some bizarre abstract beast. Software cannot exist without hardware to execute it, of course, and most hardware today is designed in the service of software. The vast majority of systems being designed today involve a mix of both elements working together, with software steadily inheriting more and more of the complexity load.

Let’s think about that for a minute. 

On the one hand, we have digital hardware technology that has rocketed up the Moore’s Law curve for five solid decades, exploding in complexity like nothing ever seen by humans. One might expect, based on that fact alone, that hardware would bear the brunt of system complexity. After all, we have gone from tens of transistors on a chip to billions, and from scant kilobytes of memory and storage to terabytes. A system-on-chip today may combine billions of transistors into a device a few millimeters on a side with capability that would have filled large buildings a few decades ago. Multiple processors, peripherals, programmable logic fabric, multi-gigabit I/Os, and massive amounts of memory can all be crammed onto a single silicon die.

But, on the other hand, if we look at the actual systems being designed, it is software that accounts for most of the complexity. In fact, the largest software systems today are thousands of times more complex than the most complicated hardware ever designed. So, while we’ve been able to measure the exponential rise in hardware complexity, software has been quietly exploding at something like a Moore’s Law squared rate for just about the same timeframe.

While the battle for the Moore’s Law explosion in hardware complexity has been fought largely at the process level, finding ways to do lithography at ever-finer resolution, there has been a corresponding challenge in keeping engineering methodologies up with that pace. Schematic methods that were valid for dozens to hundreds of components had to be replaced by register-transfer-level abstractions for designs with thousands to millions of components, and still higher levels of abstraction must be applied as the component count in a typical design rockets toward the billions. This challenge of complexity management has been bounded by the rate at which designs expand: as hardware engineers, we always need a methodology capable of creating circuits as complex as our fabs can build.

But, on the software side, there is only complexity. In fact, management of complexity is the only thing that truly limits what we can do in software. In response to that challenge, software engineering has built an impressive array of processes, tools, and techniques for breaking down incredibly complex problems into chunks that humans can comprehend. Programming languages have continued to climb in abstraction, libraries of pre-engineered functions have become vast, and processes for managing collaboration across large teams of engineers have been deployed. Many of these software engineering techniques have also found their way into the hardware discipline, as hardware engineers have encountered problems complicated enough to require them.

Now there is a lot of talk about Moore’s Law grinding to a halt. We know that the next few process nodes – 14nm, 10nm, 7nm – are probably all feasible to manufacture, but the incremental return on investment shrinks dramatically with each successive step. It is very unlikely that a node beyond 7nm will be economically feasible on anything like the two-year cadence Moore’s Law has demanded.

But the hardware we already have will be capable of sustaining a continued dramatic explosion in the complexity of software for a long time to come.

If we extrapolate those trend lines into the future, we reach an interesting vanishing point. Software becomes completely dominant. The craft of hardware engineering becomes a tiny fraction of the enormous code juggernaut. The vast majority of the people creating electronic systems and technologies will be software engineers of some sort. Future archeologists may unearth hardware engineer fossils and wonder what strange beasts must have inhabited these modest bones.

As hardware becomes infinitely powerful and infinitesimally cheap, the value contribution must shift. Revenues cannot continue to be measured in shipments of physical units. Manufacturing itself will become a commodity. Value returned will come almost exclusively from software and other intellectual property, and that’s an economic reality our society has not yet demonstrated the maturity to manage.

Consider the lessons of the smartphone economy. A single hardware platform is sold at almost zero margin in enormous quantities. The value of the device is derived mostly from software apps, but the economy of the smartphone app is hyper-deflationary. “What?!? That application only monitors and manages all my fitness activities and financial transactions, walks the dog, organizes my music, and keeps my kids out of danger? – no WAY that’s worth 99 cents!”

While this may sound like doom and gloom for hardware engineers, we should not despair. There is plenty of work for the current generation of multimeter-toting kids, and probably for another round or two after that. And, if you haven’t noticed, more and more of the craft of hardware engineering is already actually software engineering. It’s been insidiously taking us over for years already. That’s right – the call is coming from INSIDE THE HOUSE! 

Notice the makeup of most system design teams today. We hear that the ratio of software engineers to hardware engineers is already 5:1 to 10:1, and the trend seems to be for those ratios to only increase. So, hardware folks, wander over to the next row of cubicles sometime and see what those people are up to. Chances are your job will look more and more like theirs as time goes on.

2 thoughts on “The Hardware Vanishing Point”

  1. I don’t think it’s anywhere near that gloomy for Moore’s Law and new innovation.

    Consider that most of the devices we have today were not imagined 50 years ago in the form and utility they have taken on. Sure, Dick Tracy had a watch phone, but few actually imagined that more computing power than existed on the planet in 1961 would exist on the watch-pads that are likely to become a new norm in the next decade … possibly fueled by the UI merging into something like Google Glass.

    Sure, we have beaten half of the computing performance bottlenecks, probably into near death, with GPUs and multi-cores in everything … that was the easy, low-hanging fruit. But the other half of Amdahl’s law is that, sure, you can speed up the parallel side of the problem … but the serial part just becomes the next bottleneck (see the sketch after these comments).

    That is where compiling the hard, slow, large, critical serial algorithms into logic (FPGAs initially, ASICs or hard cores later), with highly distributed memory, will enable new applications and devices that are currently just barely feasible due to the huge computational delays. This is most likely the next five to seven cycles of Moore’s Law playing out … moving the algorithms to gates in even larger 3D low-power dies.

    There are lots of tasks that are computationally infeasible today … tasks that probably just need a hard-coded logic ASIC that completely abandons everything we think of today as computing, because we are stuck with the legacy anchor of general-purpose computing.

    It’s why serial critical-path computing compiled into FPGA fabrics and ASICs is the next enabling technology to chip away at Amdahl’s law. I still vote for Intel rolling their own reconfigurable logic fabric, tightly integrated into servers, desktops, mobile, and embedded products.

    But maybe AMD, MIPS, ARM, and the other guys will innovate and get their act together first.

  2. It will never be “all software.”

    You can talk all you want and code all you want, but hardware is a necessity if you want to DO anything!

    Since hardware is a rock-bottom necessity, I am heading the movement back to hardware. In Smart Small Systems, why use a microprocessor and software when a bit more sophisticated hardware can do the trick?

    I have developed an alternative technology that is mostly hardware. See “Natural Logic of Space and Time” (NL), available on Amazon.
    http://www.amazon.com/Natural-Logic-Space-Time-Real-time/dp/1508580359/
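
For readers who want to see the Amdahl’s law arithmetic the first comment refers to, here is a minimal sketch in Python. The 95 percent parallel fraction and the worker counts are assumed figures chosen purely for illustration, not numbers from the article or the comments.

```python
# Amdahl's law: overall speedup when only a fraction p of the work parallelizes
# and the remaining (1 - p) must still run serially.
# All figures below are illustrative assumptions.

def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Speedup over a single worker when 'workers' attack the parallel part."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

if __name__ == "__main__":
    p = 0.95  # assumed: 95% of the workload parallelizes
    for n in (2, 8, 64, 1024):
        print(f"{n:5d} workers -> {amdahl_speedup(p, n):6.2f}x speedup")
    # No matter how many workers are added, speedup is capped at 1 / (1 - p) = 20x,
    # which is why the serial portion becomes the next bottleneck.
```

Moving that serial critical path into dedicated gates, as the comment suggests, is an attack on the 1 / (1 - p) ceiling itself rather than on the already well-served parallel term.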

