Seen but not Heard

Despite some popular speculation, “DAC” does not stand for “Declining ASIC Commiseration.” The Design Automation Conference is a robust and lively gathering of companies, customers and comrades sharing a common interest in the progressive automation of the electronic design process. As we’ve discussed the past couple of weeks, however, the historical focus of DAC has been on the custom and semi-custom digital IC design process. Sure, there have always been other disciplines and interests represented, but the money, the momentum, and the fear have always been centered around the completion of complex IC designs and the avoidance of career-limiting re-spins in the semiconductor design cycle.

FPGA design is certainly not excluded from DAC. There are special sessions to discuss FPGA design tools (and even FPGA company logos wandering the show floor on polo shirts), but DAC has never attracted the mainstream FPGA designer, and consequently it has never become a venue of choice for those discussing and deploying tool technology for FPGA design.

Of course, there are exceptions to this rule. Synplicity, the company built on FPGA design tools, focuses its entire DAC presence on the design and application of FPGA technology. Mentor Graphics, the only one of the “big three” EDA companies with a serious stake in FPGA design, dedicates a notable portion of its presence to FPGA-related technology as well. Specialized EDA companies like Celoxica, with a tight dependence on FPGA-based development, usually soft-pedal the FPGA connection a bit at DAC, focusing instead on their overall ESL design methodology, of which FPGAs are but one important component.

Walk the floor of DAC, however, and look at the various circuit boards professionally lit inside acrylic showcases, and you’ll find a plethora of programmable logic. There are possibly more FPGAs on display at DAC than any other type of major electronic component. If not for our policy against over-used terminology, I might venture the word “ubiquitous.” (Ouch, now I have to write the sentence “Our programmable platform is synergistic with the proliferation of an engineering ecosystem paradigm.” twenty times as punishment.)

Hidden beneath the skin of this surface treatment of FPGA design, however, is a startling reality. Depending on how you measure, the “not appearing at DAC” Xilinx and Altera may be the two largest EDA companies in the world. If not, they’re at least two of the largest. Given the explosion in the number of FPGA designers worldwide and the extremely low cost and high availability of the FPGA design tools provided by these two companies alone, “X and A” probably each account for more seats of more design tools than any mainstream EDA vendor. If you could get accurate numbers on the software engineers engaged in the actual development of EDA technology, the FPGA companies would probably make an impressive showing as well. Each year, more of the FPGA vendors’ engineering budgets seem to swing toward software development, even as keeping the hardware side apace with Moore’s Law grows almost exponentially more difficult.

Even the smaller FPGA companies that depend on OEM relationships with EDA vendors for their design tool technology are somewhere between scarce and absent at DAC. While tools are a critical part of their product offering, they don’t see enough interest from the “push the button and go” mainstream of FPGA designers to justify joining a forum where design tool technology is the focus. Closer to the mainstream of EDA, Altium, a company that has focused much of its energy on the future of FPGA-based system design, also declines to participate in DAC. If the future of mainstream electronic design tools is shrink-wrapped software mated to pre-fabricated development boards and integrated with pushbutton ease-of-use, we may not need a conference where such products are discussed. I, for one, haven’t been to any conferences recently where Microsoft Outlook was dissected, even though I use it perhaps more than any other single piece of application software in my day-to-day job.

As ASIC starts continue to decline, more applications and more design teams are moving to FPGA-based design – at least for prototyping and early production, if not for final implementation. At DAC there are droves of boards on display crammed with dozens of the biggest, most expensive FPGAs – all dedicated to the task of ASIC prototyping. From Dini Group’s huge arrays of big FPGAs to complete verification solutions from companies like Eve, DAC has always had a place for FPGAs in the emulation space. It’s when you cross over that line into higher-volume FPGA deployment that the interest and investment of the EDA community wanes. However, if FPGA-based design continues to move up the food chain toward higher-volume production, it paints an interesting picture for the future of EDA.

There are likely to be two disjoint EDA markets before long (if there aren’t already). One market will focus on the design of the whole electronic system in a way that is independent of final implementation technology (although programmable logic will probably increasingly become a target technology of choice). In this space, EDA needs to learn about software, and it needs to accept the challenge of true system-level design instead of just larger-scale hardware design. For years, EDA’s interests have been hardware-centric, but today’s systems are complex combinations of hardware- and software-based functionality that interact at such an intimate level that segregated system-level tool sets will no longer suffice. At the system level, EDA also needs to once again move beyond the boundaries of the chip. True system-level functionality (even for a digital-dominated system) now includes critical components at the board level and non-trivial analog behaviors (such as signal integrity in high-speed serial I/O channels). True system-level design must account for these aspects of the system as well.

The other EDA market will likely remain focused on silicon-specific details at the deep technology level. As Moore’s Law plunders ahead, the polygon-level challenges of IC design become more daunting with each generation. Unfathomably complicated tools with equally indigestible price tags will be required to sustain our progress toward smaller features and the benefits they bring. For now, EDA will continue the fight against these technological villains on a completely different battle front from system level and FPGA-based design.

Through the years, there have been almost incessant comparisons of the emerging FPGA design tool situation to the history of ASIC EDA. For every FPGA problem, there was an ASIC metaphor describing the cause, effect, and solution. One thing we now know, however, is that the evolution of FPGA tools will follow almost anything but the ASIC model. While similar software technologies are involved, the dynamics of the market and the motivators driving engineering behavior are completely different from those seen in ASIC and high-end IC. FPGA is truly a horse of a different color. (Dang, now I have to go back for that anti-cliché therapy again.)

In the future, will we go to DAC to find out about the evolution of those tools and technologies? For me, the jury is still out on that one. If the RTL-to-implementation portion of the FPGA design flow continues to be dominated by the FPGA vendors’ internal tools, and the interesting third-party tool development focuses on higher-level system design, then yes, DAC could well be the forum for such exploration and exposition. If not, it would be a sad day. DAC brings a sense of history to the electronics industry that would not be easy to recapture. Although our design culture spans well less than a century, the chronicling of our methodologies in a respected conference has become a part of our personal histories as well.
