
Electronic Elitism

DAC Divulges Design Tool Dilemmas

The 43rd annual Design Automation Conference (DAC) got underway yesterday in San Francisco, California. The technical sessions have begun, the exhibits are open, and the parties, PowerPoints and pejoratives have now commenced. Last night, at a media, analyst, and customer briefing dinner at the San Francisco Museum of Modern Art, Walden Rhines, Mentor Graphics Chairman and CEO, hosted a customer presentation explaining how the company’s new Calibre nmDRC accelerates nanometer design rule checking (DRC) by “hyperscaling” – a technique that takes efficient advantage of multiple processing elements to deliver many times the previous performance in giant DRC runs.
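To make the "hyperscaling" idea concrete, here is a toy sketch of parallelized design rule checking: partition the layout into tiles and check each tile on a separate processing element. This illustrates the general tiling technique only – the spacing rule, data layout, and tile scheme below are invented for the example and are not Mentor's algorithm.

```python
# Toy "hyperscaled" DRC: fan layout tiles out across worker processes.
# The rule and data layout here are invented purely for illustration.
from concurrent.futures import ProcessPoolExecutor

MIN_SPACING = 3  # hypothetical minimum-spacing rule, in layout units

def check_tile(shapes):
    """Return spacing violations among rectangles in one tile.

    Each shape is (x1, y1, x2, y2); two shapes violate the rule when the
    horizontal gap between them is positive but below MIN_SPACING.
    """
    violations = []
    for i, a in enumerate(shapes):
        for b in shapes[i + 1:]:
            gap = max(a[0], b[0]) - min(a[2], b[2])
            if 0 < gap < MIN_SPACING:
                violations.append((a, b))
    return violations

def run_drc(tiles, workers=4):
    """Check all tiles in parallel and merge the per-tile results."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(check_tile, tiles)
    return [v for tile_violations in results for v in tile_violations]

if __name__ == "__main__":
    tiles = [
        [(0, 0, 5, 5), (7, 0, 12, 5)],   # gap of 2 -> violation
        [(0, 0, 5, 5), (20, 0, 25, 5)],  # gap of 15 -> clean
    ]
    print(len(run_drc(tiles)))  # -> 1
```

Because tiles are independent (real tools must also handle rule interactions across tile boundaries, which this sketch ignores), throughput scales with the number of processing elements thrown at the job.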

Beyond promoting the polygon-pushing power of parallel processing, Mentor’s presentation also highlighted an interesting reality of today’s Electronic Design Automation (EDA) industry – nanometer-class design tools are getting bigger, faster, more sophisticated, more expensive, and consequently, more exclusive. Among the customers heaping praise on Calibre nmDRC for its parallelizing prowess were AMD and Intel – hardly the novice class in semiconductor design.

It is no secret that the number of ASIC and COT (customer-owned tooling) design starts has declined steadily over the past several years and is forecast to continue declining for the foreseeable future. Along with the decline in the number of design starts, the number of companies and designers engaged in ASIC/COT design has declined as well. At the same time, the cost and complexity of ASIC/COT design and the sophistication of the tools and methods required to complete a current-generation chip have risen almost exponentially.

For the companies that develop tools for these designers, this is both an opportunity and a curse. The opportunity side is obvious – designers need extremely sophisticated tools to complete their projects, and they’re willing to pay top dollar to get them. For the company that can build the fastest, best-est, biggest super-deep-submicron development and verification tools, the negotiation process is practically blank-check based. If you are in charge of a $15M-$30M SoC development project, how much would you pay for the software tools you need to help ensure that you will actually have working silicon at the end of your project? Well, I’ll wager that you’re not going to be choosing a weaker tool just to save a few thousand dollars. Spending a few extra thousand of the company’s money to help ensure your personal job security tends to be an easy decision.

The curse side of the equation is probably more important, but less self-evident. While the promise of a big payday is obvious for developing “the world’s best” super-critical-something-or-other (SCSOO) that makes the difference between success and failure on a high-budget SoC, the payday for developing an “almost good enough” or “second best” SCSOO is also easy to predict – near zero. Simply put, nobody wants to buy anything but the proven best technology when they’re gambling with their careers on a high-stakes SoC design.

All these trends make for an interesting game. For any given high-end EDA technology, we tend to end up with one of two scenarios: a monopoly with a single “winner” that develops and markets a clearly superior technology and several “losers” that attempt and fail to bring similar solutions to market, or a duopoly where two companies succeed in developing mostly indistinguishable solutions and thus share the resulting market. In the first case – to the victor go the spoils. A monopoly on a critical technology is worth tens to hundreds of millions of dollars in EDA. In the second case, however, an unfortunate variation on the famous “prisoner’s dilemma” kicks in. With almost equivalent solutions, software companies tend to immediately jump into competing on price. Since the unit cost of their product is essentially zero and the development costs have already been borne, there is no practical limit to the price reduction a company can endure in order to win market share. With more than one competitor battling for the pie, the size of the pie diminishes dramatically. EDA companies engaged in such markets are lucky to even recoup the development and marketing costs of these unfortunate products.
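The duopoly price-war dynamic can be sketched as a tiny payoff table. The dollar figures below are invented purely to make the structure concrete: with near-zero marginal cost, undercutting is each vendor's dominant strategy, so both cut price and each ends up far worse off than under mutual restraint.

```python
# Toy prisoner's-dilemma payoff table for a two-vendor EDA price war.
# All dollar figures are hypothetical, chosen only to show the structure.

# payoffs[(a_move, b_move)] -> (vendor A's profit, vendor B's profit), in $M
payoffs = {
    ("hold", "hold"): (50, 50),   # split the market at full price
    ("hold", "cut"):  (5, 80),    # the undercutter takes the volume
    ("cut", "hold"):  (80, 5),
    ("cut", "cut"):   (15, 15),   # both cut: the pie itself shrinks
}

def best_response(moves, opponent_move, me):
    """Pick the move maximizing my payoff against a fixed opponent move."""
    def payoff(move):
        pair = (move, opponent_move) if me == 0 else (opponent_move, move)
        return payoffs[pair][me]
    return max(moves, key=payoff)

# Whatever B does, A's best response is to cut -- and symmetrically for B --
# yet mutual cutting pays each vendor 15 instead of the 50 from restraint.
for b_move in ("hold", "cut"):
    assert best_response(("hold", "cut"), b_move, me=0) == "cut"
```

The equilibrium is exactly the article's point: both vendors rationally race to the bottom, and the market ends up too small to repay either one's development and marketing costs.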

For EDA management, this presents a dilemma. For any given new EDA product development effort there are three possible outcomes, and two of them are bad. EDA companies realized this long ago, however, and did some thinking of their own. It turns out that the cost of developing new EDA tools is almost negligible when compared with the cost of marketing, selling, and supporting them. EDA companies can start as many projects as they like with comparatively little risk, as long as they’re extremely careful which products they actually bring to market with full force.

In many instances, EDA has given up on even these low-investment speculative internal development projects. Instead, they keep a watchful eye on the packs of startup ventures regularly jumping into the design tool pool. You can start an EDA company with a laptop and a dollar, so there are typically droves of them to choose from. DAC, in fact, provides a veritable showcase of these new entries each year – creating an acquisition flea-market of sorts for big EDA. EDA companies bet on the struggling contenders like gambling addicts at a dog track, and often end up in bidding wars, paying undeserved premiums for perceived leaders in obscure technology areas.

In some critical technology areas, however, the startup farm is less fruitful. Some EDA capabilities are built on top of rich foundations of previous technology, and most startups don’t have the IP or the wherewithal to build the entire technological pyramid required to tackle issues like, for example, deep submicron design verification. In those cases, the EDA companies slug it out with generation after generation of internally developed tools, far from the protection of startup-based development independence, and with egos and inertia preventing them from backing down on long-fought competitive battles.

This game has proven the most dangerous of all, as it results in EDA companies spending ever-increasing sums developing ever-riskier products for smaller and smaller (but well-funded) audiences. Increasingly, this part of the industry is evolving into an exclusive service for the electronics elite. Unfortunately, when you build products for a smaller and smaller audience, you begin to lose the advantage of economy of scale. Such a service does not necessarily make good business sense, and we’re beginning to see the technical tide changing as a result.

As the price and complexity of EDA products increase, EDA’s customers start to see advantages in developing technology themselves, either because they can create a competitive advantage by building a superior tool capability, or because EDA tools can no longer handle the complexity of their problems. In fact, an increasing number of the most advanced SoC developers have begun to do just that, and internally developed tools for high-end SoC design are on the increase. Note to Mentor – when you’re developing products for companies like AMD and Intel, and the key new capability you’re providing them is multi-processor acceleration of a computationally intense algorithm, the line between customer and competitor can become perilously thin.

If we take this trend to its logical extension, the high-end tool zone will have made a complete (albeit three-decade) circle. ASIC tools began as internally developed technology at ASIC vendors in the early 1980s, then became a third-party industry as economy of scale gave rise to commercial EDA, and now may return to internal development as the economy of scale evaporates, leaving EDA to pursue greener pastures.

If so, where might those greener pastures lie? Many have placed bets on FPGA tools. For the past ten years or so, FPGA companies have played Robin Hood for design tool technology. They have taken from the wealth of tools and techniques originally created for well-funded ASIC design and distributed those tool technologies (re-spun for FPGA design) at a very low cost to the multitudes of design teams who sought refuge from the high-stakes game of ASIC by migrating to FPGA-based development. This “free-tool” model has long been a decisive obstacle blocking commercial EDA from investing heavily in FPGA tool development. With a few notable exceptions like Synplicity, Mentor Graphics, Aldec, and Australia-based Altium, the EDA industry has been reluctant to risk diving into the FPGA tank only to have their profits eaten by comparable tools supplied by FPGA vendors.

With the FPGA market itself being very close to a duopoly, the economy of scale has never materialized as it did in ASIC design tools, where there were a multitude of viable silicon suppliers competing. Unless the FPGA market diversifies further, it is unlikely that a highly competitive and lucrative third-party EDA market for FPGA-specific tools will emerge. In fact, today Xilinx and Altera, the two largest FPGA companies, would realistically need to be counted among the world’s largest EDA companies. If measurement were based on the number of installed seats of design tools, they might very well be the biggest.

With FPGA companies grabbing the designs and design tool budgets of all but the largest, most complex product development efforts, EDA will need to find a new avenue for itself in case the current nanometer boom dries up. It is likely that attacking the design process and productivity at the system level, where combined hardware and software design has exploded in complexity in recent years, is the best direction for long-term, sustainable, technology-independent value creation. If so, watch for electronic system level design (ESL) to take an increasing share of the EDA market in coming years and an increasing share of air time in coming DACs.

