
FPGA Synthesis Showdown

The Big Game that Never Ends

I'll be the first to admit that I don't understand cricket, but I'm told that cricket matches can sometimes run for days on end, and that it's often unclear who's winning until, suddenly and unexpectedly, one team meets some arbitrary victory conditions and then goes home to celebrate with a few pints. The FPGA synthesis game is a lot like that. It's a high-stakes game with vast amounts of revenue on the line, not to mention some serious technology bragging rights, but it's never really clear who's ahead, who's behind, or even what the rules were in the first place. It's been going on that way for something like twenty years, and despite fierce competition and incredible technological advances in the tools, the scoreboard is still basically a bunch of gibberish. Nonetheless, the audience is stuck on the edge of their seats (maybe they've been Super Glued there?) watching every move in this weird and wacky contest between tool nerds who have never met.

Before we can understand the FPGA synthesis scenario today, we should probably review a bit of history. In digital design, no implementation step is as critical or has as much influence over the final outcome of a design as synthesis. When we engineers write the source code for our design (HDL or some higher-level description language), we suffer from the illusion that we are somehow in control - that if we write really clean and clever code, the resulting design will be better, faster, and more compact, and will use less energy. In reality, the code we write is merely a suggestion for our synthesis tool. The real outcome of our efforts depends on how well that tool interprets our code, divines our intent, and infers some circuitry that will honor our requests while adhering to a set of design rules we may not even know, in order to do the best job it can of achieving a set of optimization goals that we probably wouldn't understand.

Two decades ago, when designs got so large that schematic-based entry of digital designs became unwieldy, the world switched to language-based design. But, in order for language-based design to be real, we needed logic synthesis. Synopsys, the world’s largest EDA company today, was arguably built by logic synthesis. The company’s explosive growth and impressive success record began in the 1990s with Design Compiler, a synthesis tool that became the de facto industry standard for years. Perhaps the name “Design Compiler” belied the complexity and sophistication of the algorithms involved in synthesis. By choosing the word “compiler,” Synopsys sent the message that what was going on here was at a similar level to the software compilers that turned our C code into machine instructions. Nothing could be further from the truth, and as the years passed and the technology advanced, the “compiler” metaphor only got worse. Today, virtually no logic design is done by humans. We could safely argue that 99.99% of the logic circuits in the world were created by algorithms, or – if we’re generous – only indirectly created by the handful of humans whose heuristics guide the algorithms as they interpret our code and infer massive blocks of circuitry that we hope will do what we want.
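
To make "created by algorithms" concrete, here is a deliberately naive sketch, in C++, of the kind of problem that sits at the heart of FPGA synthesis: decomposing a Boolean function into the k-input lookup tables (LUTs) that FPGAs are built from. This is a toy based on simple Shannon expansion, with names like countLuts invented for illustration; it is not any vendor's actual algorithm, and production tools use far more sophisticated techniques (cut enumeration, Boolean matching, timing-driven mapping).

    // Toy illustration only: decompose a Boolean function (a truth table
    // over n <= 6 variables) into 4-input LUTs via Shannon expansion.
    #include <cstdint>
    #include <iostream>

    constexpr int LUT_INPUTS = 4; // a typical FPGA LUT size

    // Count the LUTs needed for a function of `nvars` inputs whose truth
    // table occupies the low 2^nvars bits of `tt`.
    int countLuts(std::uint64_t tt, int nvars) {
        if (nvars <= LUT_INPUTS) return 1; // small enough for one LUT
        // Shannon expansion on the top variable x:  f = x'*f0 + x*f1
        int half = 1 << (nvars - 1);
        std::uint64_t mask = (1ULL << half) - 1;   // safe: half <= 32 here
        std::uint64_t f0 = tt & mask;              // cofactor with x = 0
        std::uint64_t f1 = (tt >> half) & mask;    // cofactor with x = 1
        // Map both cofactors, then recombine them with a 2:1 mux, which
        // itself fits in one 3-input LUT.
        return countLuts(f0, nvars - 1) + countLuts(f1, nvars - 1) + 1;
    }

    int main() {
        std::uint64_t and6 = 1ULL << 63; // 6-input AND: true only when all inputs are 1
        std::cout << countLuts(and6, 6) << " LUTs (naive)\n"; // prints 7
        // A real mapper would notice that two chained 4-LUTs suffice.
    }

Even on this trivial example, the naive recursion burns seven LUTs where a competent mapper would use two, which hints at how much of the "design" in your design is really the tool's optimization heuristics rather than your code.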

The business success Synopsys achieved with Design Compiler was the envy of the EDA industry. Every EDA executive and entrepreneur on Earth began a desperate quest to recreate that magic – to find that one franchise-maker tool that could propel their five-person project team to Fortune 500 status on the coattails of one mission-critical solution for the entire electronics industry. Design Compiler’s success was in ASIC design, so when FPGAs arrived on the scene and became the heirs apparent to the custom chip crown, the answer seemed obvious. Whoever could create the best FPGA synthesis tool would win. Players large and small threw their hats into the ring. Synopsys looked like a favorite as they created the FPGA equivalent of the wildly successful Design Compiler. Mentor Graphics set out to save their struggling synthesis program by creating an FPGA version of their AutoLogic tool, and a whole wave of startups appeared on the scene, all aiming to claim the prize, for one lesson the world had learned from Design Compiler was, “There is no second place.” Once a critical tool achieves a winning position, the rest is like the end of a game of Monopoly: the rich get steadily richer, and the poor gradually fail, until finally only one player is left standing.

Unfortunately, in studying the road map of Design Compiler's voyage to success, there were a few key landmarks that the industry experts missed. In the ASIC market, third-party synthesis (tools supplied by an independent EDA company rather than by the chip makers themselves) succeeded largely because of economy of scale. The EDA company could invest the resources to create a world-class synthesis tool, and those engineering costs could be easily amortized over a large constellation of ASIC vendors and technologies. No single ASIC vendor had the resources to create its own tool that could match the sophistication of those created by the EDA experts. And there was a knock-on benefit: by standardizing the industry behind one synthesis tool, your design could become largely technology- and vendor-independent. If you needed to switch to a different ASIC supplier, or wanted to target your design so that multiple suppliers could build it, the synthesis tool provided the firewall that kept any chip vendor from locking you in. Furthermore (and this will become important later), it had already been established that place-and-route, the next step in ASIC design after synthesis, was the domain of third-party EDA suppliers rather than of the ASIC vendors themselves.

In the nascent FPGA ecosystem, some of those things were different in subtle but staggeringly important ways. First, there was no economy of scale to match the ASIC tool market. Where there had been dozens of competitive ASIC vendors, in FPGA there really were only two: Xilinx and Altera. And where many ASIC companies had been happy that the vendor-independent design facilitated by third-party tools gave them an opportunity to steal business from their competitors, the ferociously feuding FPGA duopoly wanted nothing to do with vendor-independent design. They grabbed any and every opportunity to lock their customers into their products and technology. Finally, because the EDA industry had largely ignored FPGA technology when the original place-and-route tools were being developed, the two major FPGA companies already supplied their own place-and-route tools, effectively locking the EDA industry out of the back end of the design flow.

As often happens, the bright-eyed startups crushed the big established tool companies in the innovation race to capture the emerging FPGA synthesis market. Three notable startups, Synplicity and Exemplar Logic in the Bay Area, and ISI from Grenoble, France, produced FPGA synthesis tools that handily beat those from the big EDA companies as well as the first efforts at tools from the FPGA vendors themselves. Of those, Synplicity quickly established market leadership. Exemplar Logic and ISI were acquired by Mentor Graphics and Xilinx, respectively, to bolster their floundering FPGA synthesis efforts. Soon, as the pundits had predicted, Synplicity bubbled to the top and essentially took over the game. They became not only the dominant third-party synthesis supplier, but also the supplier to the FPGA vendors themselves, as the FPGA companies offered detuned OEM versions of Synplicity's tools as part of their standard design tool packages.

Unlike the Design Compiler story, however, the Synplicity situation was neither stable nor sustainable, for several reasons. The two major FPGA companies were not content to allow a third party to control such a critical piece of differentiating technology. If you had a better synthesis tool, your chips performed better, and the opportunity to gain or lose competitive advantage because of synthesis was enormous. From one year to the next, if Synplicity happened to do a better job supporting either Xilinx's or Altera's new technology, it could have catastrophic effects on the other vendor. So both vendors set about developing their own synthesis technologies with the long-range goal of unseating Synplicity.

The FPGA vendors had two big advantages. First, they could give their tools away essentially for free in order to make money on the silicon sale. Synplicity had to make a business out of tools alone, so they were forced to sell synthesis tools with five-digit price tags against zero-cost alternatives from the FPGA vendors. Second, because the FPGA vendors controlled the place-and-route technology, they were in a better position to do the physical optimizations that would become more and more important as FPGA technology advanced and routing delay took over a larger and larger share of the performance picture.

Nonetheless, Synplicity stayed on top for years with the simple strategy of outrunning the competition. In customer benchmarks, the company could consistently show that their tools outperformed the “free” vendor tools by a significant margin. The marketing message was simple — if you cared about the quality of your design, you bought Synplicity. The FPGA vendors were patient and determined, however. Supported by multi-billion-dollar chip revenue streams, Xilinx and Altera staffed their tool organizations into the hundreds. Synplicity, on the other hand, had gone public and was now accountable to shareholders for posting quarterly profits and showing aggressive growth, an environment that does not lend itself well to a long-term engineering investment in a single product. In order to be seen as a viable public corporation, Synplicity needed to expand and diversify their product portfolio, and their engineering talent was stretched thin and defocused as a result. The cash flow from Synplicity’s synthesis products was pumped into other emerging product lines, and while that was happening, the FPGA vendors’ synthesis technology inched ever closer to parity. 

Finally, Synplicity was acquired by Synopsys, most likely because of the advanced HAPS prototyping technology Synplicity had recently acquired from Hardi. Folded into the big purple mass of Synopsys, the Synplicity identity slowly melted into obscurity.

For the next few years, the FPGA companies continued to gain ground. More and more customers saw the FPGA vendor-supplied tools as “good enough,” and smaller and smaller percentages relied on third-party EDA solutions. During this time, a new opportunity emerged and died almost as quickly. Just like schematic-based design had done, RTL-based synthesis began to run out of gas in its ability to support the complexity of the largest new designs. One of the key technologies for overcoming and managing this complexity is high-level synthesis (HLS), and the combination of the productivity of HLS with the fast, low-risk implementation of FPGA was truly serendipitous. Once again, there was an opportunity for the EDA industry to own a key, lucrative component of the FPGA design flow. But even though EDA companies had an incredible lead in HLS technology, they had clearly had enough of the FPGA scene. EDA gave only a passing shrug to the enormous potential of HLS with FPGAs, and, as EDA casually watched, the FPGA companies quickly acquired and built high-level design technologies of their own, assimilating them into their already rapidly improving tool suites.
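
For readers who never caught the HLS wave, the pitch went something like this: instead of describing registers and wires in RTL, you write ordinary sequential code and let the tool infer the parallel hardware. The sketch below is a minimal, illustrative example in the style of Xilinx/AMD's HLS tools; the function name is invented, and the pragma should be treated as an assumption about directive syntax rather than a verified vendor example, since every HLS tool has its own directives.

    // HLS-style source: an 8-tap FIR-like dot product. The synthesis tool,
    // not the designer, decides how many multipliers and adders to build.
    #include <cstdint>

    void fir8(const std::int16_t sample[8], const std::int16_t coeff[8],
              std::int32_t *out) {
        std::int32_t acc = 0;
        for (int i = 0; i < 8; ++i) {
    #pragma HLS UNROLL // hint: instantiate all 8 taps as parallel hardware
            acc += static_cast<std::int32_t>(sample[i]) * coeff[i];
        }
        *out = acc;
    }

The equivalent hand-written RTL is dramatically longer and riskier, which was exactly the productivity argument that made the HLS-plus-FPGA combination look so serendipitous.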

Today, Xilinx and Altera both offer newly upgraded implementation flows that include their own HLS, synthesis, and place-and-route technologies integrated into sophisticated and productive design flows. At the same time, Synopsys continues to develop, market, and sell the former Synplicity tools to a segment of the market that appreciates their vendor-independence, special capabilities in areas like high-reliability design, and sometimes-better quality of results and performance. 

It's virtually impossible to run any test that would say with certainty which of these solutions is superior. The companies themselves all run benchmark suites consisting of hundreds of customer designs, and they usually measure their development success in single-percentage-point improvements on some subset of that test suite. Just about every new optimization that improves the results on twenty designs hurts the results on five others. It's truly a game of inches, and no one even understands how to build a reliable ruler.
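
To see why the ruler is so hard to build, consider how a composite benchmark score is commonly computed: a geometric mean of per-design ratios (new result over old). The toy sketch below, with invented numbers and no connection to any vendor's actual methodology, shows how a headline "win" can coexist with real regressions on individual designs.

    // Toy illustration of benchmark scoring: the geometric mean of
    // per-design Fmax ratios can show a "win" even when several designs
    // regress. All numbers are invented for illustration.
    #include <cmath>
    #include <iostream>
    #include <vector>

    int main() {
        // ratio = new_fmax / old_fmax for each benchmark design
        std::vector<double> ratios = {1.05, 1.12, 0.93, 1.02, 0.97,
                                      1.08, 1.01, 0.95, 1.10, 1.04};
        double logSum = 0.0;
        int regressions = 0;
        for (double r : ratios) {
            logSum += std::log(r); // geomean via mean of logs
            if (r < 1.0) ++regressions;
        }
        double geomean = std::exp(logSum / ratios.size()); // ~1.025
        std::cout << "geomean speedup: " << geomean
                  << " (" << regressions << " of " << ratios.size()
                  << " designs got worse)\n";
    }

A 2.5% geometric-mean gain sounds like progress, but the customers whose designs got slower won't be celebrating, which is why every "20% better" claim depends entirely on whose designs were in the suite.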

Meanwhile, we all sit here in the bleachers glued to our seats, watching the claims and counterclaims of “20% better this” and “4x more of that,” with no real way to tell if our favorite team is winning or losing. We just hope that when we get home and fire up our latest design, we aren’t left with too many timing violations.

14 thoughts on “FPGA Synthesis Showdown”

  1. Cricket is simple:

    “You have two sides, one out in the field and one in. Each man that’s in the side that’s in goes out, and when he’s out he comes in and the next man goes in until he’s out. When they are all out, the side that’s out comes in and the side that’s been in goes out and tries to get those coming in, out. Sometimes you get men still in and not out.

    When a man goes out to go in, the men who are out try to get him out, and when he is out he goes in and the next man in goes out and goes in. There are two men called umpires who stay out all the time and they decide when the men who are in are out.

    When both sides have been in and all the men have been out, and both sides have been out twice after all the men have been in, including those who are not out, that is the end of the game!”

    Your view that ” … it’s often unclear who’s winning until, suddenly and unexpectedly, one team meets some arbitrary victory conditions …” is not the case if you actually understand the game. A true cricket aficionado also sees that the way in which the game is played is more important than the outcome.

  2. This statement is wishful thinking on your part.

    “the Synplicity identity slowly melted into obscurity”

    I believe they sit together and are a team still. It is still the best 3rd party FPGA Synthesis tool.

    The truth is that Synplicity totally kicked your butt when you were in charge of Mentor FPGA synthesis.

    You got fired from Mentor for total failure is what I heard. You never even came close to taking on Synplicity.

    The team that melted to obscurity is the Precision team. That whole team got canned. They moved the development to morons in India who don’t even know what an FPGA is.

    That’s your team that died into obscurity. Maybe if you would have developed something competitive this would not have been the case.

    Get your facts straight on here. You got dominated Kevin.

  3. Kevin,

    I’m curious. Can you actually be honest with your EE Journal readers?

    Can you admit that you were essentially fired at Mentor Graphics?

    Isn’t that the truth?

    Actually what I heard is you were put in charge of nobody. That is typically the nice way to fire an R&D leader.

    Instead of outright firing them. Then you obviously leave on your own.

    You landed ok. Now you just get to comment on everything instead of actually doing something real.

  4. @gobeavs,

    If “essentially fired” is where someone walks into your office and hands you a blue folder that explains your services are no longer required, then yes, I was “essentially fired.” However, I always thought it was just plain-old “fired” or, to use the euphemism the company employed back in 2003, “laid off.” I’ve got no hard feelings. Mentor is a great company, and I always wanted to start my own business. I love what I’m doing.

    I won’t disrespect the engineering team we had back then. Some very bright and talented engineers poured their hearts and souls into Exemplar/Mentor’s Leonardo Spectrum and Precision Synthesis. Synplicity had a huge head start. Exemplar was going to go public, and was gaining ground fast. Then, Mentor decided to cancel the IPO and bring Exemplar in as a part of a Mentor division. At that point, the wheels came off. Most of the engineering and sales teams left, and Mentor had to rebuild engineering, sales, and marketing basically from scratch. That’s when I joined. We were supposed to build a new engineering team, support and update the Leonardo product, and build the new Precision product all at once.

    For about three years there, we rammed ourselves repeatedly into the Synplicity Juggernaut with all the strength we could muster. I don’t think we even scratched the paint. To say they “kicked our butt” would imply that Synplicity actually believed they were in a fight. But after the spin-in, I imagine they hardly even noticed we were there. I didn’t say very much about Mentor’s effort in the article above, because I don’t think we were ever much more than a footnote in FPGA synthesis history.

    Synplicity has always had not just the best third-party FPGA synthesis tool, but the best FPGA synthesis tool in the world, period. I don’t feel too bad that we weren’t able to beat them. It took both Xilinx and Altera another ten years to get their respective tools to be competitive with Synplicity, and they had triple-digit numbers of engineers, unfair competitive advantages over Synplicity, and the requirement to support only a single FPGA vendor. Synplicity was formidable and capable and defended the fort extremely well. They simply outperformed all competitors.

    However, time and market forces have taken their toll. Xilinx and Altera now both have very capable FPGA implementation tools that are well integrated into solid design flows. Although the Synplicity identity did, in fact, fade into the great purple fog, Synopsys continues to market updated versions of the former Synplicity tools, and they are also solid, vendor-independent, high-performance tools that are used successfully by a number of companies. They are nowhere near the “must have” solutions they were ten years ago, however, and I think you’ll find that much of Synopsys’s effort in that space is now focused on FPGA prototyping (at which they excel). They are no longer the de facto standard for FPGA synthesis. Most Xilinx and Altera customers today use the vendors’ own tools.

    While I am not sure how Kevin left the building makes a bit of difference to the accuracy of this article…

    There are a couple of points that need to be made..

    Yeah, Kevin was fired.. Yeah, he kept his corner office, out of respect for the job he had done, and maybe there need to be some details as to the job he had done.

    First, he took over the engineering team after the re-acquisition by Mentor. I was there. I was the ONLY engineering manager who had been a manager for more than 6 months before the acquisition and stayed on. You see, here are some FACTs that, gobeavs, you probably are not aware of.

    First, in the year leading up to the change, Exemplar had 25M dollars in sales and a market cap of over 110M. (This is what triggered the IPO, according to pre-existing contracts.)

    We were KICKING THE LIVING SHIT out of Synplicity.
    Sure we were feeling the pain of having competition.. But we were winning. We had a kick ass sales force. We had a kick ass, amazing engineering team. We had amazing distributors. We had an amazing CEO. We were rolling.

    We were making 8-10M from the MGC channel, from AutoLogic in FPGA and ASIC sales (mostly ASIC), and the rest from the Exemplar channel and the distributor channel in FPGA sales.

    However, when MGC took us in, MAJOR things happened.

    First and foremost, and the most painful: the VP we were put under decided to MERGE the MTI and Exemplar sales channels, shutting down the Exemplar sales force completely. Both the Exemplar sales team and the Exemplar distributors were gone; the MGC sales force still existed, but that was it.

    This caused our sales to drop over the next 12 months from 25 million to less than 10M, with almost 90% of that coming from MGC. The MTI sales team sold ZERO for the first 3 quarters.

    Second, the majority of the engineering team called in rich. Their stock, for the most part, was fully vested, and while the bonus system put in place was very profitable, it wasn’t enough. We went from a team of 40 to a team of 10. Of the 5 managers, 2 remained, 1 leaving soon after. I was actually moved into another position, and later moved back. Eventually some were promoted up.

    The VP of that group left for truly personal reasons, and a new one came in, who basically said: keep the group alive while we figure out what to do.

    Kevin rebuilt the team…

    And as part of rebuilding it, it was decided it was time to really fix some things.. We had some great new people onboard who were NOT as tied to some old ways of thinking..

    We used Mentor’s Common Timing Engine; redoing the timing analysis system of any tool is a major undertaking.

    We redid many of the area optimization routines.

    We redid the SDC front end.

    We redid the UI, to make it an award-winning platform (awards from Synopsys, btw).

    All the while increasing market share and revenue, including OEM deals with Altera and Xilinx.

    Did the revamped project come out on time? Nope…

    Was that the fault of any one person? Nope…

    Did the VP/GM who told Kevin to keep it alive care when the new GM he put in place started putting his own people in place? Nope…

    Did the new Head of R&D fuck things up even worse? Yep.

    Did the new Head of Marketing totally make Precision irrelevant? Yep.

    Did the new UI team start using technology from the 1980s? Yep.

    Did they start using a parser from India? Yep.

    Did the “new” head of R&D last 28 months and then move on? Yep.

    Did anyone miss him? Nope.

    So before you throw barbs at Kevin, you may want to be more aware of some of the facts, from someone who was there.. Someone who still looks at Kevin as one of the BEST second-level managers he has ever worked with, who rebuilt an engineering team working on a highly complicated codebase while continuing to maintain customer satisfaction.

    @gobeavs … I don’t think your anonymously attacking Kevin directly and personally in this or any other industry forum has cause, and it is certainly way below the degree of professionalism other readers in this forum expect and deserve.

    Whatever personal beef you have with Kevin should be taken elsewhere.


