
Pyramid or House of Cards

Will EDA Endure for the Ages?

Building a pyramid takes a very long time and a huge number of people. The first stones are the foundation, of course, and, as the layers slowly increase, those seminal pieces are often buried and forgotten – deep inside the ever-evolving structure. Ever so quietly, they continue their duties, even as they and those who placed them fade from memory.

Our technological pyramid isn’t quite so predictably behaved. We haven’t obscured the technique for creating a PN junction. MOSFET design is not yet a lost art. Today’s engineers do not wave wands and burn incense atop ancient stones engraved with De Morgan’s Laws to ensure the continued accurate minimization of their logic structures. By and large, we can trace most of today’s high-level hardware technology back to fundamental principles and laws of mathematics and physics. Like stones in a pyramid, a modern semiconductor device may contain billions of transistors, but each one still operates on the same fundamental concepts as the originals did many decades ago.
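For the record, those engravings still check out. Here is a minimal sketch in Python (purely illustrative, and certainly not how a synthesis tool does it) that exhaustively verifies both De Morgan identities over all Boolean inputs:

    from itertools import product

    # Check both De Morgan identities for every combination of inputs.
    for a, b in product((False, True), repeat=2):
        assert (not (a and b)) == ((not a) or (not b))  # NOT(A AND B) == NOT A OR NOT B
        assert (not (a or b)) == ((not a) and (not b))  # NOT(A OR B)  == NOT A AND NOT B
    print("De Morgan's Laws hold for all input combinations.")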

In software, however, things are not quite so rosy.

Today, there is practically no such thing as a bare-metal, clean-sheet piece of software. It would be a rare, expensive, and comparatively simple application that could be developed in a truly clean room. We rely on a complex stack of BIOS, drivers, operating systems, libraries, and other not-invented-here piles of code to get even the simplest application to the “hello world” level. On top of that, most serious applications have evolved over years or decades, with constantly changing teams of engineers contributing to the effort. Piles of ancient functions are buried deep within labyrinthine structures and seldom see the light of day. If they never manifest a bug, cause a performance problem, or trip up a port, their lines of source code are unlikely to be seen by any living human.

Is this a problem?

Like the founding stones in a pyramid, will these fundamental blocks of binary continue to carry their weight, allowing modern-day developers to pursue glory by pushing the peak of the structure ever farther into the azure sky? Or will we slowly begin to bury logic whose subtle underpinnings are difficult enough to unravel that the hapless future adventurer won’t be able to divine their inner workings well enough to keep them in working order? If cracks start to form in the foundation, how much engineering will it take to keep the structure standing?

Now consider the complex interplay between hardware and software engineering that happens in electronic design automation (EDA). Next month, the Design Automation Conference (DAC) will gather for the 54th time in as many years. Over that half-century, a small community of engineers has created an amazing stack of software that is absolutely fundamental to every electronic system. Without EDA software, we could design practically nothing on current semiconductor processes. Our technology pyramid would be more like a house of cards, and it would crash to the ground in a most unpleasant fashion.

Each generation of EDA software has given birth to a new era of computing power – which has in turn enabled the next generation of EDA software. This virtuous cycle has continued for decades. In 1984, I worked on a place-and-route system that required 24-48 hours to perform layout on 10,000 gates of logic. Today’s tools crunch through five orders of magnitude more gates in that amount of time, and they solve almost incomprehensibly more complex problems while they’re at it – including timing, power consumption, and manufacturability. The old tools were thrilled when they could get a wire from point A to point B without shorting. But the chips designed by those old tools led to the computers on which today’s tools were born.
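To put “five orders of magnitude” in concrete terms, here is a quick back-of-the-envelope sketch in Python, using the 1984 figure cited above:

    gates_1984 = 10_000               # gates placed and routed in 24-48 hours, circa 1984
    gates_today = gates_1984 * 10**5  # five orders of magnitude more
    print(f"{gates_today:,} gates")   # 1,000,000,000 -- roughly a billion gates in the same time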

The first generation of EDA engineers has now retired. Their contributions live on, however, in algorithms, strategies, design flows, and – of course – enduring chunks of code. As the second and third generations of EDA talent make their way out of the industry, how will those who inherit the incredible responsibility of moving EDA forward manage the immense complexity with which they’ll be saddled? Will their feet ever touch conceptual ground? Or will they live in fear of the day when they’ll have to crack open some fossilized function and try to sort out the subtle elegance that has kept it ticking all these years?

EDA has historically been a slow-growth industry. Even as electronic technology has exploded across society, taking over key roles in nearly everything we touch in our day-to-day lives, the tribe of priests with the knowledge required to keep all those circuits behaving more-or-less correctly has remained about the same size. New engineers with EDA skills grow ever scarcer as competition for software talent intensifies and a vast gamut of lucrative career options opens before those with promise. It’s increasingly difficult for EDA to compete for talent, and there is no question that the industry has aged significantly as a result. This combination of a shrinking talent pool and an exploding legacy of complexity converges on a disturbing vanishing point.

In fact, there could theoretically be something of a race – will Moore’s Law run out of steam before EDA loses the ability to help us design more complex systems?

It’s unlikely that EDA will lose control of any major application any time soon. Many tools have died of code-rot over the years, but they’ve generally been replaced by more capable rewrites. As long as we have engineering teams who can fully comprehend the problem from top to bottom, we aren’t at risk of the pyramid collapsing in on itself. We’ll still be able to design new chips.

Perhaps a bigger risk to the industry is the current anti-globalization, anti-immigration climate in many countries. Most of the world’s EDA technology development happens in the US. Many of the world’s most talented EDA engineers are not from the US. The technology produced by that industry is essential to the entire world, and it cannot thrive if the talent and the opportunities are kept separate from each other. With the entire world quietly depending on an efficient EDA ecosystem, interrupting the flow of talent could be disastrous.
