
Why Are Design Tools So Bad?

Or… What? Another bug?

As much as the EDA industry would like us to believe otherwise, it’s almost impossible to find an engineering team who is satisfied with their design tools. More often than not, when chatting with designers about their tools, we get sentiments ranging from “survivable” to “horrible.” The “survivable” end of the spectrum usually amounts to something on the order of, “We managed to get the job done in spite of numerous issues we ran into with tools along the way.” “Horrible” usually leans more toward, “I lost my job because bad design tools caused my project to fail.”

The unicorn we’ve yet to locate is the team who feels that their design tools truly empower them – making their design better, their jobs easier, their schedules shorter, and their professional lives less stressful. Granted, that’s a lot to ask from a pile of software, but the kind of rabid enthusiasm many people show for a wide range of commodity products and services on which they depend is virtually nonexistent in the world of engineers and their tools.

It would be easy to dismiss a lot of the sentiment as just “engineers being engineers.” After all, we are trained to be constantly evaluating solutions – looking for better, faster, more efficient ways to get things done – tweaking and tuning and refining ad infinitum. It makes sense that we’d never be quite content with the status quo of anything, let alone the technology on which our professional survival so desperately depends.

But there’s more going on here than simple engineering grumpiness.

First, there is an unfortunate effect of economies of scale. In consumer products, the vast size of the potential market justifies enormous investments in product development. If you work in consumer software or hardware, you feel this every day. Making and selling millions of copies of a thing profoundly affects the amount of time, money, and energy you can (and should) spend developing it. For electronic design tools, however, the economy of scale is orders of magnitude smaller. I’ve seen EDA companies producing high-end verification tools where a dozen customers constituted a major success. How would your company get by if your product sold twelve copies a year? Chances are, not well.

Producing a product for a small run of tens to hundreds (rarely thousands) of copies demands a completely different economy. If you have even a dozen engineers working on a product that sells a dozen copies a year – well, you can do the math. It’s gonna cost some serious coin.

Of course, EDA tools do cost more than just about any other type of software. The simple sum of development costs, sales commissions, marketing expenses, support costs, and administrative expenses makes profitability a challenge. Sure, many important EDA tools have been written by two engineers in a garage, but, given EDA’s sales model, even a tool with zero development cost would be expensive to deploy. Even with prices jacked up to the six-figures-per-year range for a term license (which many EDA tools enjoy), the funds that find their way back to engineering development are meager at best.

On top of all this, EDA tools are insanely complex. You may think that the media center, connected wearable, avionics system, or automotive ADAS system you’re working on is complicated, but it’s just peanuts compared with the pile of software required to get an application-specific SoC from high-level code down to verified silicon. Most EDA tool code bases number in the millions of lines – sometimes the tens of millions. The amount of manpower required just to maintain, build, test, and deploy applications of that size and complexity is sobering.

Since EDA tools are driven by Moore’s Law, they also go through a regular re-invention cycle. Most products have the luxury of slowly and steadily improving and evolving – with features, performance, and reliability all climbing steadily up and to the right over the product’s lifespan. EDA tools, however, live in a more Sisyphean universe where the engineering team exhausts itself pushing the product to a minimal level of functionality, only to have the whole thing thrown up in the air two years later by the demands of the next wave of Moore’s Law complexity. EDA engineers repeatedly struggle to get products from “new” to “marginally acceptable” and then start over again.

Compounding the problem is the highly specialized skill set required for the development of EDA tools. Since most EDA tools are primarily highly-complex software applications, EDA engineers must be top-flight software developers. However, since they are building tools that literally do the job of hardware engineers, EDA tool developers need an intimate knowledge of electronic design as well. This dual expertise in both hardware and software is rare, and competition for those skills is high. Throw in the other attractive options available to gifted engineers these days, and talent recruiting for EDA companies becomes a major challenge.

What we have, then, is a situation where development costs and tool complexity are exponentially increasing, the audience consuming them is mostly shrinking or staying constant (far fewer teams are designing custom chips these days), and the budgets for tools are roughly static. This is a difficult environment in which to build a profitable software business. If you took this to a logical vanishing point, you’d be building the most complex tool in the world for a single user. Of course, long before that, there would cease to be a third-party EDA industry. Large system houses would simply develop their own tools.

The EDA industry divides into two primary disciplines – chip design and system design. For decades, chip design has been the primary driver and has generated most of the revenue for the industry. System/PCB design has remained a steady, low-growth, profitable business. The difference is that new chip design software is absolutely essential for each new process node. The tools you used two years ago will absolutely not allow you to complete a design today, one tick of the Moore’s Law clock later. EDA was required to constantly innovate or die, and the semiconductor industry was strapped onto that roller coaster with them. If EDA couldn’t solve issues like optical proximity correction, multi-patterning, billion-transistor simulation, multi-million-gate place-and-route, and on and on, the next generation of chips simply would not happen.

System (or PCB) design is not like that. It would be perfectly reasonable to chug along with a single set of system design tools for a decade or more. Sure, PCB design has evolved, and challenges such as signal and power integrity have leapt to the forefront in recent years, but, overall, the requirements for putting a network of components onto a PCB – placing, routing, verifying, and manufacturing the board – haven’t changed that much over the decades. The industry would never come to a screeching halt if the next generation of PCB tools didn’t make it out on time.

Finally, EDA as an industry has repeatedly shot itself in the foot when it comes to capturing its fair share of value from the vast electronics market. While EDA has enabled and empowered just about every innovation of the last four decades, only a minuscule amount of the revenue generated by all that innovation has found its way back to the industry. Deploying a business model that was both competitive and effective at capturing that value has historically been more than the small industry could manage.

So, when you’re hitting crunch time in your project and you run up against a “show stopper” bug in your design tool that brings things to a crashing halt, understand the engineering environment from which those tools come. It’s not the same as the one delivering the other technology products and services that we all use in our daily lives. EDA is a unique world that has evolved over the past forty years to a point where it can keep the electronics industry functioning while taking home only an extremely modest share of the pie. For that, we should all be grateful.

7 thoughts on “Why Are Design Tools So Bad?”

  1. The tools suck because they are old (and badly designed), and companies like Intel don’t do anything to make their processors work better on EDA tasks (i.e., there’s minimal bootstrapping).

    Because the tools suck, everybody has a huge stack of workarounds (and slack) in their design flows, and that makes things very fragile, so they all resist change. Anything that requires change (that isn’t automatic) will be rejected if possible. The general denial about the lack of support for DVFS, body biasing, and power handling in digital SoC design flows is a prime example.

    The bulk of the tools are licensed proprietary software that views standards as a high bar rather than a starting point, and standards like VHDL have never worked properly, but the IEEE committees refuse to fix them. Not to mention the lack of a good AMS standard for verifying IoT ICs.

    Customers seem to think EDA companies are working in their best interest, but the motivation for the EDA companies is to maximize revenue by selling more licenses. More complex flows with dysfunctional tools are a good vehicle for selling patch-up and add-on tools, and more simulation licenses to see if they got it right.

    On the upside, there is at least one good open-source simulator – https://xyce.sandia.gov/ – and all the AIs need is that and a good extraction tool to verify their efforts; then a lot of the existing EDA stack may become obsolete quite quickly.

  2. Kevin wrote: “Since most EDA tools are primarily highly-complex software applications, EDA engineers must be top-flight software developers. However, since they are building tools that literally do the job of hardware engineers, EDA tool developers need an intimate knowledge of electronic design as well.”
    This whole mess started when Verilog made it possible to simulate a (physical) “HDL,” which meant that the logic design had already been done and converted to a physical hardware description. Then, without batting an eyelash, Verilog magically became a logic design language even though it was a physical description.
    The Verilog HDL fiasco was ill conceived, and both logic and hardware interconnect, as well as logic design, were ignored.
    Meanwhile, OOP compilers and debuggers have been developed, so I am parsing a simple logic syntax using Boolean algebra for the logic and classes for the functional/physical modules. The compiler makes sure all interconnections are valid, and the Boolean expressions are evaluated during debug (just like we used to do manually, except easier) – see the sketch after these comments.
    On top of that, there is source control to support a design team. I will have a way to design and debug the logic and then parse and format the source to Verilog for the build.
    Oh no, this is Boolean to HDL, not C to HDL.

  3. A few decades ago, all the same things were said about operating systems, as writing fully re-entrant multi-processor (multi-core, cluster, NUMA) code is pretty nasty stuff, when even a small race condition will crash the entire system or leave tasks fatally hung.

    There is a reason UNIX stole the show, as standards were fought and won, ultimately leading to nearly every vendor joining the game as equal participants in the shared development of Linux.

    The labor costs far exceeded what any given vendor could spend going it alone. The few that tried are pretty much forgotten history.

    Someday, there might be hope for this industry too.
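
A minimal sketch of the idea in comment 2 above – purely illustrative, with hypothetical class and signal names rather than the commenter’s actual tool, and with Python assumed only for brevity – showing Boolean expressions built from classes, evaluated during debug, and then formatted out as Verilog:

    # Illustrative sketch only: hypothetical names, not the commenter's actual tool.
    class Signal:
        def __init__(self, name, value=0):
            self.name, self.value = name, value
        def eval(self):                      # evaluate during debug, like a manual truth table
            return self.value
        def verilog(self):
            return self.name

    class And:
        def __init__(self, a, b):
            self.a, self.b = a, b
        def eval(self):
            return self.a.eval() & self.b.eval()
        def verilog(self):
            return f"({self.a.verilog()} & {self.b.verilog()})"

    class Or:
        def __init__(self, a, b):
            self.a, self.b = a, b
        def eval(self):
            return self.a.eval() | self.b.eval()
        def verilog(self):
            return f"({self.a.verilog()} | {self.b.verilog()})"

    # Check the logic in ordinary code first, then emit Verilog for the build.
    a, b, c = Signal("a", 1), Signal("b", 0), Signal("c", 1)
    y = Or(And(a, b), c)
    print(y.eval())                          # 1
    print(f"assign y = {y.verilog()};")      # assign y = ((a & b) | c);

The point is only that checking Boolean logic in an ordinary programming language and emitting HDL afterward is straightforward to sketch; whether it scales to a real design flow is a separate question.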

