
EDA FPGA DOA

Recently, Cadence announced they were acquiring startup EDA company Taray.  It was one of the brighter and smarter (albeit most predictable) moves in the sad, sordid history of commercial EDA’s disastrous dance with programmable logic tool technology.  EDA’s future in the FPGA-related tools business was once bright and promising.  Today, however, it is all but settled that EDA has mostly sat on the sidewalk while the FPGA parade marched by, occasionally running into the street in a frenetic dance that would have made the Three Stooges green with envy.

Taking the recent news first: Cadence has steered almost completely clear of the FPGA nonsense for the past decade.  The company has held fast to their role as a hard-core ASIC/SoC/ASSP tool provider, tempering it only with a steady presence in the PCB design business, where they’ve long been a major player, battling Mentor at the top end, Altium/Protel at the bottom end, and numerous other players in the middle for the better part of two decades.

Now, however, FPGA design has intruded into the PCB world.  First, just about every PCB being designed these days contains at least one FPGA.  Handling them adequately is no longer a minor differentiator; it’s a fundamental requirement.  Second, FPGAs – particularly those with large pin counts – pose a unique challenge for PCB design.  The fact that the I/Os can be assigned and re-assigned willy-nilly makes major trouble for PCB layout.  Just when you think you’ve got the routes figured out – nothing touches, the signals that aren’t supposed to run parallel don’t, and the traces that are supposed to be the same length are – the FPGA guy comes in and says he’s moved this bus to that side of the chip, that one to this side, and switched the order of this bundle all around.  Your job now is to start over from scratch and hope you didn’t drop a bit somewhere and leave an old trace connected to an old pin.
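The bookkeeping at the heart of that problem is simple to illustrate.  Here is a minimal sketch (the net names and pin IDs are purely hypothetical, not any vendor’s actual format) that diffs two net-to-pin assignment maps and flags every trace the layout engineer must revisit:

```python
# Hypothetical sketch: compare an old and a new FPGA pin assignment
# (net name -> FPGA pin) and report which nets moved, appeared, or
# disappeared -- i.e., which PCB traces need rework.

def pin_assignment_diff(old, new):
    """Return (moved, added, dropped) between two net->pin maps."""
    # Nets present in both maps whose pin assignment changed.
    moved = {net: (old[net], new[net])
             for net in old.keys() & new.keys()
             if old[net] != new[net]}
    added = new.keys() - old.keys()      # new nets to route
    dropped = old.keys() - new.keys()    # stale traces to rip up
    return moved, added, dropped

old_pins = {"DDR_DQ0": "A12", "DDR_DQ1": "B12", "SPI_CLK": "C7"}
new_pins = {"DDR_DQ0": "B12", "DDR_DQ1": "A12",
            "SPI_CLK": "C7", "SPI_CS": "C8"}

moved, added, dropped = pin_assignment_diff(old_pins, new_pins)
# moved -> {'DDR_DQ0': ('A12', 'B12'), 'DDR_DQ1': ('B12', 'A12')}
# added -> {'SPI_CS'}; dropped -> set()
```

Real tools like Taray’s go much further – they reason about banks, I/O standards, and routability – but even this trivial diff shows why automated tracking beats re-checking a thousand-ball BGA by hand.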

Cadence has been partnering with Taray, who provides intelligent software for FPGA/PCB pin assignment and tracking, for over a year now.  This acquisition makes major sense – as Cadence has the customers, the sales channel, the support infrastructure, and the complementary tools to make Taray’s technology really practical and useful in the market.  

While the FPGA-on-PCB game is a fairly safe bet, many of the previous attempts at EDA participation in the FPGA phenomenon have been riskier and ultimately unsuccessful.  It would seem, at first blush, that any ASIC tool could be re-tooled to work for FPGA.  After all, FPGA and ASIC share much of the same design methodology – HDL-based design starting with VHDL or Verilog, RTL IP integration, simulation, synthesis, and place-and-route.  

The EDA industry has always existed because of an economy of scale related to developing design tool technologies that could be re-used across a broad range of problems and with a broad range of technologies from different vendors.  In ASIC design, most of the design tools were identical (or very close) regardless of what libraries or semiconductor process they were targeting.  This made it economical for the EDA industry to invest in developing the general-purpose tool technology and selling it broadly.  ASIC companies quickly stopped developing their own tools, gave up on design tools as a basis for differentiation, and allowed commercial EDA to take over the role of supplying tools to designers.  EDA built a profitable and high-growth industry based on that one conceit.

When FPGAs came along, however, things took a different course.  First, the vagaries of FPGA architecture made many of the off-the-shelf EDA technologies – like place-and-route – ill-suited for production use.  The effort to convert an ASIC-grade place-and-route tool for FPGA was almost as big as writing a tool from scratch.  Second, in FPGA there were two primary vendors – Xilinx and Altera.  That didn’t give EDA the same economy-of-scale leverage they had with the plethora of ASIC suppliers.  Third, with only two FPGA vendors, neither company was willing to concede the strategy of using tools as a differentiator.  FPGA companies invested heavily in tools – and rode that horse hard.  If their silicon was comparable on a particular process node, they touted their tools – “our synthesis gives better QoR,” “our P&R runs faster” – claims of tool supremacy that were easy, powerful, and almost impossible to validate or disprove.  When tool capabilities and performance weren’t favorable differentiators, the companies fell back on price – essentially giving sophisticated tools away for free in order to win silicon business.

In fact, if you consider the massive investment that FPGA companies have poured into tool technology over the years, it would be easy to argue that FPGA companies are actually EDA companies with a different model for extracting value from their technology.  Since no FPGA company actually manufactures silicon, they’re really selling IP, tools, and services — a lot like a typical EDA company of today.

This didn’t stop EDA companies from trying.  First and most famous: Synplicity built a public company on the premise of supplying FPGA-specific RTL synthesis technology for FPGA designers who weren’t satisfied with the vendor offerings.  It was a bold strategy that worked well for a long time.  Ultimately, however, technology trends began dictating that synthesis and place-and-route be tightly integrated, if not combined into one tool.  Since the FPGA vendors had a death grip on FPGA layout technology, the only reasonable way for synthesis to tip was toward the FPGA vendors.  When Synopsys acquired Synplicity, the big question was whether the new company would advance or kill the FPGA synthesis offering.  Still today, Synplify and its derivatives are favored by many design teams that want the most performance out of their FPGAs.  The masses, however, get by with the constantly-improving tools offered by the vendors themselves.  

Mentor Graphics was lucky enough to have an industry-dominant HDL simulator before FPGAs went to HDL-based design, and they were doubly lucky that simulation technology didn’t have to be heavily customized for FPGA use.  Their simulator worked great right out of the box.  This allowed Mentor to capture a dominant share of the FPGA simulation market and to take advantage of the economy-of-scale of developing that technology across both FPGA and ASIC markets.  FPGA vendors had no choice but to capitulate and let Mentor (and to a lesser degree Aldec, Cadence, and Synopsys) own the simulation part of the tool chain.  

Mentor was far less lucky when it came to synthesis.  First, they acquired Exemplar Logic and their Leonardo product – at the time the number one competitor to Synplicity in third-party FPGA synthesis.  Then, they developed their own derivative of Leonardo: Precision Synthesis.  Precision, and its derivatives and successors, has never quite caught on as a broadly-used FPGA tool.  In the numerous surveys FPGA Journal has done of the FPGA design community, Precision has never shown up with more than a tiny share of the reported use as “primary synthesis tool.”  Despite a considerable investment, Mentor has never been able to translate their success with ModelSim into other wins in the FPGA tool market.  In fact, they’ve done a better job surrounding FPGAs – from the top with ESL and embedded tools used in designs that also have FPGAs, and at the back end with PCB tools integrating support for FPGAs – in much the same fashion as Cadence has just announced.

Prior to acquiring Synplicity, Synopsys had a horrid track record of fits and starts in FPGA.  Several times, the company came out with variants and derivatives of its highly-successful Design Compiler ASIC synthesis tool aimed at the FPGA market.  Each time, the effort failed dismally and the company quietly retreated to the sidelines.  With the acquisition of Synplicity, Synopsys is back in the FPGA game again – although there is reasonable evidence that the company’s main interest in the acquisition was actually the FPGA-based prototyping products from HARDI, who had themselves recently been acquired by Synplicity.  

Not to be left out – Magma has touted FPGA tools and technology many times as well – perhaps most prominently when they acquired and marketed the PALACE physical synthesis tools from APlus Design Technology.  Again, however, the product failed to gain commercial traction and quietly faded into obscurity.

In addition to these efforts by the major EDA companies, a host of startups have come and gone on the battlefield of FPGA design tools.  Each time, in addition to facing the rigors of competition and the challenges of a start-up, they had to face the very real possibility that the technology they poured their souls into developing might become the latest “free” addition to the FPGA vendors’ tool suites within a year or two – taking their commercial ambitions down with it.

As a result of all this, we have the world we live in today.  EDA companies still pour most of their energy into developing tools to support semiconductor development.  At the same time, the number of people and teams engaged in semiconductor design shrinks each year, while their tool requirements grow constantly more extreme.  The business that got EDA to where they are today has become a game of selling more complex and expensive tools each year to a smaller and smaller customer base – and that’s not a model that builds sustained growth, or even sustained profitability.  EDA is an industry in peril.

FPGA companies are left with the legacy of going it alone.  On the one hand, it might seem bad that their strategy has blocked a third-party design-tool ecosystem from emerging.  On the other hand, the surviving FPGA companies can now use design tools both as differentiators and as barriers to entry against startups trying to gain traction in the FPGA space.

It’s not clear that they’ve made a bad trade.

