
The EDA Enigma

55 Years of Identity Crisis

The fifty-fifth Design Automation Conference (DAC) is underway in San Francisco this week. Just let that number settle in for a minute or two. That’s right, EDA existed before Moore’s Law. Back when folks could cram only a handful of transistors on an integrated circuit, there was already a conference dedicated to the development of software tools to design them.

That’s how important design automation is.

The Moore’s Law miracle – five decades of exponential progress in semiconductor technology that has taken us from single-digit transistor counts to double-digit billions of transistors on a chip – is normally viewed as a triumph of lithography. Yes, it’s an amazing feat that we’ve learned to print features on chips that are smaller than the wavelength of the light used in the lithography. Today’s fabrication equipment and techniques are miracles of modern engineering. But I’ll argue that the true enabler of Moore’s Law is not lithography. It’s design automation.

The innovation achieved by the spectacularly small EDA industry over the past fifty years outstrips just about any other aspect of semiconductors you can name. If you factor in the number of engineers involved, the innovation per engineer, integrated over five decades, is unmatched in any field I’m aware of. The algorithmic sophistication of EDA tools is incredible, and the skill set required to develop them – a deep understanding of electronics and electronic design coupled with mastery of large-scale software engineering – is vanishingly rare.

In every generation of Moore’s Law, the big question is not “how will we fabricate it?” but “how will we design it?” For EDA, this has meant doubling the capacity of its software every two years, along with its performance. At the same time, the industry has had to constantly add new features, and often completely change the design flow or paradigm for using its tools.

Schematic capture for chip design gave way to RTL, then to RTL with IP integration, then to a wide range of capture methods including high-level synthesis, model-based design, and other domain-specific design techniques. Automation went from simple netlist extraction to gate-level optimization to register-transfer-level synthesis to algorithmic synthesis. At each step, the capabilities of the “old” downstream stages had to be preserved and enhanced – with high-level synthesis, for example, RTL, netlists, and all the other downstream processes still have to be intact and stable to get to the end of the design flow.
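To make the shift concrete, here is a minimal sketch (not taken from any particular vendor’s flow) of the kind of algorithmic C source a high-level synthesis tool consumes; the function, its name, and its parameters are purely illustrative. The tool would unroll or pipeline the loop and emit RTL that the existing downstream synthesis, place-and-route, and verification steps consume exactly as if it had been written by hand.

    /* Illustrative HLS-style source. The fir() function and its parameters
       are hypothetical, not drawn from any vendor's tool flow. */
    #include <stdint.h>

    #define TAPS 8

    /* 8-tap FIR filter: an HLS tool would unroll or pipeline this loop,
       map the arrays to registers or memories, and generate the RTL that
       the rest of the flow then synthesizes, places, routes, and verifies. */
    int32_t fir(const int16_t coeff[TAPS], const int16_t sample[TAPS])
    {
        int32_t acc = 0;
        for (int i = 0; i < TAPS; i++) {
            acc += (int32_t)coeff[i] * sample[i];
        }
        return acc;
    }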

Design verification has been perhaps the biggest miracle of all. The sophistication required to verify today’s designs – functionally and physically – is mind-bending. Considering that many core EDA algorithms are order n-squared in the number of objects being considered, each biennial doubling of transistor count roughly quadruples the work involved. This seemingly intractable, compounding demand has somehow been met by verification tools. Design engineers today can verify multi-billion-transistor designs about as effectively and efficiently as they verified the much smaller designs of the past several decades.
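To put rough numbers on that compounding, here is a back-of-the-envelope sketch, assuming an n-squared core algorithm and an object count n that doubles every two years:

    % Assumptions: workload W scales as n^2; design size n doubles every
    % two years (Moore's Law), starting from n_0 at time t = 0 (in years).
    \[
      n(t) = n_0 \cdot 2^{t/2}
      \quad\Longrightarrow\quad
      W(t) \;\propto\; n(t)^2 = n_0^2 \cdot 2^{t}
    \]
    % Under these assumptions the verification workload doubles every single
    % year (roughly a factor of 1,000 per decade), even though the design
    % itself only doubles every two years.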

Even now, in the twilight of Moore’s Law, EDA has had a crucial role in extending the run. Technologies such as optical proximity correction and multi-patterning that are absolutely essential to single-digit-nanometer lithography came from the EDA industry. In fact, EDA has played an essential role in just about every major challenge that has threatened to end Moore’s Law since the early 1980s, when sub-micron fabrication was widely believed to be impossible.

With the outsized role that EDA has played throughout the history of semiconductors, the lack of industry growth for EDA is shocking. Only a tiny fraction of the worldwide electronics business that EDA enables makes its way back to the companies that generate all that innovation. There are no EDA success/growth stories to rival Facebook, Google, Amazon, or Microsoft – yet none of those companies could even exist without a strong contribution from EDA, because the very technologies upon which their businesses are built would never have existed without it.

Why has EDA failed to extract its fair share of loot from the semiconductor mother lode? Much of it probably has to do with the challenge of establishing value for software. When the EDA industry was in its infancy, so was the software market. The world hadn’t yet come to grips with the idea of selling a product whose unit cost was zero. (In many ways, it still hasn’t.) Marketing philosophy was built around the bricks-and-mortar ideas of manufacturing, and the divide-by-zero error that software injects into all of those formulas was just too much for business types to handle. If your competitor dropped their price, you dropped yours. Since both of you had zero incremental unit cost, competition became a race to the bottom with no winners.

EDA tools are not commodity items, however. In many cases, there is only one tool available that can do the required job, so runaway price competition shouldn’t become an issue. Still, it was hard for EDA companies to sell the idea of an enormous price difference between tools for which there was competition and those for which there was not. It was hard to look your customer in the face and say, “Everybody’s got a simulator, so ours are a dime a dozen. We’ve got the only synthesis tool that can handle your design, though. That’ll be six figures per seat, please.”

Over the years, EDA has experimented with perpetual licenses, annual maintenance fees, term licenses, “all you can eat” multi-year corporate deals, and myriad more obscure schemes for extracting value from its wares. While some have worked better than others, nothing has brought EDA anywhere near the point of extracting value proportional to its contribution. In fact, numerous semiconductor companies have annual revenues that far exceed the market cap of the entire EDA industry. Their investment in EDA tools amounts to a rounding error in their P&L.

This raises the question: why is there an independent EDA industry at all? Wouldn’t it be just as efficient for large semiconductor companies, or even merchant fabs such as TSMC, to simply develop their own EDA tools? With the small number of companies doing leading-edge semiconductor design, it’s hard to make an economy-of-scale argument for third-party EDA. But the EDA industry has a significant defensive perimeter – its pool of engineering talent and its enormous repository of legacy technology. A new player could invest vast sums of money trying to develop EDA from scratch and wouldn’t have a prayer of matching what the EDA industry can deliver today.

The only way to break the lock EDA has on talent and technology would be to buy an EDA company. And, of course, that’s what Siemens did with its acquisition of Mentor Graphics. For now, Siemens seems to be making a go of running Mentor as a for-profit EDA business. But if the competitive landscape changed, it would be easy to imagine Siemens repurposing the technology for its own use and blocking access for competitors. As long as the market price of EDA products is low compared with the value delivered, the industry is vulnerable.

This year, the Design Automation Conference finds itself in a bit of an existential crisis. Earlier this year, the Electronic System Design Alliance (a consortium of EDA companies that has been one of the key sponsors of DAC for decades) announced that it will be merging with SEMI (the industry association for semiconductor manufacturing). The combined organization has said it wants to include EDA as part of its SEMICON series of events. There are also credible rumors that SEMI-ESDA may part ways with DAC and put its energy exclusively into SEMICON as an EDA venue, and further rumors that some of the big three EDA companies might follow and abandon DAC as well. If so, that would be a huge blow to the venerable event, and a challenge for EDA companies, which would then face a difficult decision: exhibit at one show or the other, or split resources between the two.

As usual, EDA is struggling with its identity and with its business model.

Early in this year’s show, Cadence rolled out a series of announcements about a new cloud-based offering for its tools. In the past, cloud-based EDA has struggled to get off the ground, but the current climate may be friendlier to the idea – particularly with the promise of peak-demand relief for design projects with enormous compute requirements. This EDA-as-a-service would be yet another experiment in the industry’s ongoing challenge of extracting the value it deserves from the semiconductor pie. It will be interesting to watch.

One thought on “The EDA Enigma”

  1. From the reports over at EETimes, DARPA has a $100M budget for advanced open-source EDA tools, in hopes of taking a lot of the cost out of smaller, fast-turn, complex SoC projects.

    https://www.eetimes.com/document.asp?doc_id=1333440

    https://www.eetimes.com/document.asp?doc_id=1333422

    “DARPA Unveils $100M EDA Project: Two programs aim to craft ‘silicon compiler’,” by Rick Merritt, who reported:

    If successful, the programs “will change the economics of the industry,” enabling companies to design in relatively low-volume chips that would be prohibitive today. It could also open a door for designers working under secure regimes in the government to make their own SoCs targeting nanosecond latencies that are not commercially viable, said Olofsson.

    “Most importantly, we have to change the culture of hardware design. Today, we don’t have open sharing … but in software, it’s already happened with Linux. Sharing software costs was the best option for the industry, and we can share some hardware components, too.”

