Death of the Hardware Engineer

A Dirge for the Digital Designer

Exactly two hundred years ago this June, Augustus De Morgan was born. Arguably, before that time, there were no logic designers in the world. For the next two hundred years, however, logic designers steadily increased in number until today, when we walk the earth in six- or seven-digit numbers. In the big picture, however, the time for our species may be drawing to a close. Self-made storm clouds have been gathering on the horizon for a while now, the engineer-extincting meteors are headed for Earth, and the distant dirge of death for the digital design profession as we know it grows ever louder.

Any engineering discipline done well should ultimately be self-eradicating. The key problems should be solved from the bottom up, and the creative genius of each generation should be absorbed into the collective tooling, IP, and best-practice methodologies of the next. Today, digital design bears little resemblance to what I learned in school twenty-something years ago. For many of today’s bright young engineers, De Morgan equivalents are something they learned in an introductory logic design class, not anything they apply in their day-to-day work. They’re much more likely to be worried about whether the Ethernet stack they are dropping into the software side of their system is compatible with the version of the MAC they bought from their silicon IP supplier, whether the design will meet timing without manual tweaking of the chip layout, and whether electromigration will cause a reliability problem in their 90nm process technology at the junction temperatures they’re likely to be running.

The abstraction level of digital engineers’ thinking has gone from transistor to subsystem over the course of four decades. Sure, there are people still optimizing the design of the common transistor today, but the vast majority of the engineering world takes their work for granted – including the doubling of density and frequency roughly every two years. Atop that transistorized foundation, a framework of ever-higher-level structures has been designed, refined, repeated, and commoditized so that future re-design is mostly unnecessary. While we engineers may be propelled by the “not invented here” syndrome to re-invent the wheel a few times, eventually we tire of the exercise and want to move on to the rest of the car.

In our case, the “rest of the car” has moved from gates to multipliers to arithmetic logic units, and now to subsystems like processors, memory, and I/O modules connected by standardized plug-and-play interconnect fabrics. Once we conquer a level, we rarely go back to visit except for occasional tuning. The result? Our concerns have moved gradually from the core to the periphery, and more and more often the thing we’re designing is an environment in which software can encapsulate most of the true complexity of our application. A growing number of electronic systems today amount to “design a high-performance computing system that fits in this form factor, attaches to these peripherals, and burns this much power.”

Given the state of platform assembly tools, even today, the complexity of that task is rapidly approaching zero. I’ve sat through a number of demonstrations in which marketers, company executives, and even supremely unqualified technical editors were used to show that complex computing systems can be created by relative neophytes in just a few minutes and with just a few mouse clicks. What’s left for the real, trained hardware engineer to do?

Clearly, our work here is not yet done. There are newer, faster interfaces to invent, better buses to build, and more powerful processors to pursue. However, the true frontiers of electronic technology exploration are increasingly moving to where the electronics touch the real world – mechanical interfaces – and to where the new complexity resides – software. Think about one of today’s challenging system design problems – hardware/software partitioning. Why do we do this? Obviously, anything that we can handle effectively in software is best done that way. What’s left is what we should put into hardware. Usually, that means functions for which we can’t get the desired performance in software. For many of these, there are existing IP blocks that we can grab and plug. For the diminishing number of functions that remain, we need to custom-design some hardware – using ASIC, FPGA, or another implementation technology, with hardware description languages like VHDL or Verilog. This is the main domain of the endangered EE.

Now, we have new tools (somewhere between primitive and semi-sophisticated) that allow even these parts of many designs to be created by a software engineer and then compiled into custom hardware accelerators. Over time, these tools, combined with improved programmable hardware fabrics like FPGAs, will allow software engineers to take over many of these acceleration tasks. The realm where true custom logic design is actually needed will be rarefied even more.

There are analogies to this type of revolution in many other disciplines. Assembly programming, for example, is not dead – there are still a few applications that require the bit- and register-level discipline that can be achieved only with such detailed coding – but the mainstream has moved on to higher levels of abstraction. What is happening today is that the majority of the functionality of any embedded system is moving up the food chain to the software engineer. Over the coming years, specialized digital hardware design skills will be required less and less frequently. We’ll all gradually follow in the footsteps of the one-hour photo lab.

So – what are we to do? Should we hardware types all give up, turn in our soldering irons, take Java classes online, and join the unwashed masses of keyboard-bound pizza gobblers pounding out programs for peanuts? (Oh man! I can hear that comments box filling up already.) Is “digital system designer” joining the ranks of “typist,” “telegraph operator,” “keypunch technician,” and countless others in the Smithsonian Museum of Obsolete Occupations? Should we lobby our governments for protection of the profession – levying taxes on software-based functionality that would make it more attractive to implement new things in hardware? Do we set about creating a subversive plan to upset the foundation – moving to ternary logic or attempting some other stunt that would guarantee us all work for the next few decades re-inventing our last century’s work?

Probably such drastic measures are not required. Our profession will most likely evolve rather than die. As more of digital system design moves into software, more of the burden of designing to power, form-factor, and interface standards will fall on a single class of super-systems-engineers working with both hardware and software concepts and supported by highly sophisticated tools and technology. This new professional will not be the same software developer who creates applications for desktop computers. This person will also need a new type of education that our system does not yet provide. This category of designer will be born in industry, and will have their discipline later formalized by academia. In fact, their forefathers are out there already – creating the latest versions of consumer and industrial embedded systems using newer, more advanced design techniques – pioneering new methodologies for getting more complex designs to market faster. Their creative breakthroughs will be tomorrow’s pedagogy. It always works that way. Bright minds cannot help but find challenging problems to solve.
