feature article

Why Are Design Tools So Bad?

Or… What? Another bug?

As much as the EDA industry would like us to believe otherwise, it’s almost impossible to find an engineering team that is satisfied with their design tools. More often than not, when chatting with designers about their tools, we get sentiments ranging from “survivable” to “horrible.” The “survivable” end of the spectrum usually amounts to something on the order of, “We managed to get the job done in spite of numerous issues we ran into with tools along the way.” “Horrible” usually leans more toward, “I lost my job because bad design tools caused my project to fail.”

The unicorn we’ve yet to locate is the team who feels that their design tools truly empower them – making their design better, their jobs easier, their schedules shorter, and their professional lives less stressful. Granted, that’s a lot to ask from a pile of software, but the kind of rabid enthusiasm many people show for a wide range of commodity products and services on which they depend is virtually nonexistent in the world of engineers and their tools.

It would be easy to dismiss a lot of the sentiment as just “engineers being engineers.” After all, we are trained to be constantly evaluating solutions – looking for better, faster, more efficient ways to get things done – tweaking and tuning and refining ad infinitum. It makes sense that we’d never be quite content with the status quo of anything, let alone the technology on which our professional survival so desperately depends.

But there’s more going on here than simple engineering grumpiness.

First, there is an unfortunate economy-of-scale effect. In consumer products, the vast size of the potential market justifies enormous investments in product development. If you work in consumer software or hardware, you feel this every day. Making and selling millions of copies of a thing profoundly affects the amount of time, money, and energy you can (and should) spend developing it. For electronic design tools, however, the economy of scale is orders of magnitude smaller. I’ve seen EDA companies producing high-end verification tools where a dozen customers constituted a major success. How would your company get by if your product sold twelve copies a year? Chances are, not well.

Producing a product for a small run of tens to hundreds (rarely thousands) of copies demands a completely different economy. If you have even a dozen engineers working on a product that sells a dozen copies a year – well, you can do the math. It’s gonna cost some serious coin.

Of course, EDA tools do cost more than just about any other type of software. The simple sum of development costs, sales commissions, marketing expenses, support costs, and administrative expenses makes profitability a challenge. Sure, many important EDA tools have been written by two engineers in a garage, but, given EDA’s sales model, even a tool with zero development cost would be expensive to deploy. Even with prices jacked up to the six-figures-per-year range for a term license (which many EDA tools enjoy), the funds that find their way back to engineering development are meager at best.

On top of all this, EDA tools are insanely complex. You may think that the media center, connected wearable, avionics system, or automotive ADAS system you’re working on is complicated, but it’s just peanuts compared with the pile of software required to get an application-specific SoC from high-level code down to verified silicon. Most EDA tool code bases number in the millions of lines – sometimes the tens of millions. The amount of manpower required just to maintain, build, test, and deploy applications of that size and complexity is sobering.

Because EDA tools are driven by Moore’s Law, they also go through a regular re-invention cycle. Most products have the luxury of improving and evolving slowly and steadily – with features, performance, and reliability all climbing up and to the right over the product’s lifespan. EDA tools, however, live in a more Sisyphean universe where the engineering team exhausts themselves pushing the product to a minimal level of functionality, only to have the whole thing be thrown up in the air two years later by the demands of the next wave of Moore’s Law complexity. EDA engineers repeatedly struggle to get products from “new” to “marginally acceptable” and then start over again.

Compounding the problem is the highly specialized skill set required for the development of EDA tools. Since most EDA tools are primarily highly-complex software applications, EDA engineers must be top-flight software developers. However, since they are building tools that literally do the job of hardware engineers, EDA tool developers need an intimate knowledge of electronic design as well. This dual expertise in both hardware and software is rare, and competition for those skills is high. Throw in the other attractive options available to gifted engineers these days, and talent recruiting for EDA companies becomes a major challenge.

What we have, then, is a situation where development costs and tool complexity are exponentially increasing, the audience consuming them is mostly shrinking or staying constant (far fewer teams are designing custom chips these days), and the budgets for tools are roughly static. This is a difficult environment in which to build a profitable software business. If you took this to a logical vanishing point, you’d be building the most complex tool in the world for a single user. Of course, long before that, there would cease to be a third-party EDA industry. Large system houses would simply develop their own tools.

The EDA industry divides into two primary disciplines – chip design and system design. For decades, chip design has been the primary driver and has generated most of the revenue for the industry. System/PCB design has remained a steady, low-growth, profitable business. The difference is that new chip design software is absolutely essential for each new process node. The tools you used two years ago will absolutely not allow you to complete a design today, one tick of the Moore’s Law clock later. EDA was required to constantly innovate or die, and the semiconductor industry was strapped onto that roller coaster with them. If EDA couldn’t solve issues like optical proximity correction, multi-patterning, billion-transistor simulation, multi-million-gate place-and-route, and on and on, the next generation of chips simply would not happen.

System (or PCB) design is not like that. It would be perfectly reasonable to chug along with a single set of system design tools for a decade or more. Sure, PCB design has evolved, and challenges such as signal and power integrity have leapt to the forefront in recent years, but, overall, the requirements for putting a network of components onto a PCB – placing, routing, verifying, and manufacturing the board – haven’t changed that much over the decades. The industry would never come to a screeching halt if the next generation of PCB tools didn’t make it out on time.

Finally, EDA as an industry has repeatedly shot itself in the foot over capturing its fair share of value from the vast electronics market. While EDA has enabled and empowered just about every innovation for the last four decades, only a minuscule amount of the revenue generated by all that innovation has found its way back to the industry. The challenge of deploying a business model that was both competitive and effective in capturing that value has historically been more than the small industry could muster.

So, when you’re hitting crunch time in your project and you run up against a “show stopper” bug in your design tool that brings things to a crashing halt, understand the engineering environment from which those tools come. It’s not the same as the one delivering the other technology products and services that we all use in our daily lives. EDA is a unique world that has evolved over the past forty years to where it can keep the electronics industry functioning while taking away an extremely modest overall share of the pie. For that, we should all be grateful.

7 thoughts on “Why Are Design Tools So Bad?”

  1. The tools suck because they are old (and badly designed), and companies like Intel don’t do anything to make the processors work better on EDA tasks (i.e. there’s minimal bootstrapping).

    Because the tools suck everybody has a huge stack of workarounds (and slack) in their design flows, and that makes things very fragile, so they all resist change. Anything that requires change (that isn’t automatic) will be rejected if possible. The general denial about the lack of support for DVFS, body-biasing and power handling in digital SoC design flows is a prime example.

    The bulk of the tools are licensed proprietary software that views standards as a high bar rather than a starting point, and standards like VHDL have never worked properly, but the IEEE committees refuse to fix them. Not to mention the lack of a good AMS standard for verifying IoT ICs.

    Customers seem to think EDA companies are working in their best interest, but the motivation for the EDA companies is to maximize revenue by selling more licenses, and more complex flows with dysfunctional tools are a good vehicle for selling patch-up and add-on tools, and more simulation licenses to see if they got it right.

    On the up-side there is at least one good open-source simulator – and all the AIs need is that and a good extraction tool to verify their efforts, and a lot of the existing EDA stack may become obsolete quite quickly.

  2. Kevin wrote, “Since most EDA tools are primarily highly-complex software applications, EDA engineers must be top-flight software developers. However, since they are building tools that literally do the job of hardware engineers, EDA tool developers need an intimate knowledge of electronic design as well.”
    And this whole mess started when Verilog made it possible to simulate a (physical) “HDL,” which meant that the logic design had already been done and converted to a physical hardware description. Then, without batting an eyelash, Verilog magically became a logic design language, although it was a physical description.
    The Verilog HDL fiasco was ill conceived, and both hardware interconnect and logic design itself were ignored.
    Meanwhile, OOP compilers and debuggers have matured, so I am parsing a simple logic syntax – Boolean algebra for the logic, classes for the functional/physical modules – so the compiler makes sure all interconnections are valid and the Boolean expressions can be evaluated during debug. (just like we used to do manually, except easier)
    On top of that there is source control to support a design team. I will have a way to design and debug the logic and then parse and format the source to Verilog for build.
    Oh no, this is Boolean to HDL, not C to HDL.
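The flow the comment describes can be sketched in a few lines of Python. This is purely illustrative – the `Module` class, its methods, and the half-adder example are hypothetical stand-ins, not the commenter’s actual tool – but it shows the idea: classes for functional/physical modules, Boolean algebra for the logic, interconnection checking, expression evaluation during debug, and a Verilog back-end for the build.

```python
class Module:
    """A functional/physical module with named ports (illustrative sketch)."""

    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = set(inputs)
        self.outputs = set(outputs)
        self.logic = {}  # output port -> Boolean expression (Python syntax)

    def assign(self, port, expr):
        # The "compiler" check: only declared outputs may be driven.
        if port not in self.outputs:
            raise ValueError(f"{port} is not an output of {self.name}")
        self.logic[port] = expr

    def evaluate(self, port, env):
        # Evaluate the Boolean expression during debug -- just like we
        # used to do manually with Boolean algebra, except easier.
        values = {k: bool(v) for k, v in env.items()}
        missing = self.inputs - values.keys()
        if missing:
            raise ValueError(f"unconnected inputs: {missing}")
        return eval(self.logic[port], {"__builtins__": {}}, values)

    def to_verilog(self):
        # Format the parsed source back out as Verilog for the build.
        ports = ", ".join(sorted(self.inputs | self.outputs))
        body = "\n".join(
            "  assign {} = {};".format(
                o,
                e.replace(" and ", " & ").replace(" or ", " | ").replace("not ", "~"))
            for o, e in sorted(self.logic.items()))
        return f"module {self.name}({ports});\n{body}\nendmodule"


# A half-adder written as Boolean algebra:
ha = Module("half_adder", inputs=["a", "b"], outputs=["s", "c"])
ha.assign("s", "(a and not b) or (not a and b)")
ha.assign("c", "a and b")
print(ha.evaluate("s", {"a": 1, "b": 0}))  # True
print(ha.to_verilog())
```

The attraction, per the comment, is that the same expressions you evaluate during debug are the ones that get formatted out to Verilog for the build.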

  3. A few decades ago, all the same things were said about operating systems, since fully re-entrant multi-processor support (multi-core, clusters, NUMA) is pretty nasty stuff: even a small race condition can crash the entire system or leave tasks fatally hung.

    There is a reason UNIX stole the show, as standards were fought and won, ultimately leading to nearly every vendor joining the game as equal participants in the shared development of Linux.

    The labor costs far exceeded what any single vendor could afford going it alone. The few that tried are pretty much forgotten history.

    Someday, there might be hope for this industry too.
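The race-condition point above is easy to make concrete with a few lines of illustrative Python (a user-space toy, not kernel code): `count += 1` is a read-modify-write sequence, so without a lock two threads can both read the same old value and silently drop an increment.

```python
import threading

count = 0
lock = threading.Lock()

def unsafe_inc(n):
    # Hazard: LOAD count, ADD 1, STORE count -- a thread switch between
    # the load and the store silently loses an increment.
    global count
    for _ in range(n):
        count += 1

def safe_inc(n):
    # The lock makes the read-modify-write sequence atomic.
    global count
    for _ in range(n):
        with lock:
            count += 1

threads = [threading.Thread(target=safe_inc, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(count)  # 400000 with the lock; with unsafe_inc the total can come up short
```

In a kernel the same bug doesn’t just miscount – it corrupts shared state, which is why the comment’s “crash the entire system” framing is apt.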

