Feature Article

Bundling Performance

Lessons from Xilinx ISE 8.1i

Every year, FPGA and Structured ASIC Journal conducts a survey of design teams that have recently completed projects using FPGAs. We collect and analyze a large volume of reader responses about those projects, and we publish and sell a detailed report to companies with a vested interest in data on the current behavior of FPGA design teams. There is nothing unusual in that; many media companies perform similar research and offer similar studies to their customers. This time, however, we noticed something that was unusual: an accelerating shift appears to be underway in FPGA design. More and more design teams seem to be completing their projects using only tools they obtained directly from their FPGA vendor.

This trend toward one-stop shopping for design tools and silicon is not new. It has actually followed something like a single cycle of a sine wave over the past decade or so. Many early FPGAs were simple enough to design with basic schematic capture techniques, and the vendor-provided tools were more than adequate for the limited challenge of getting one of those small, seminal programmable logic devices up and running.

When FPGAs got more complicated, however, and HDL-based design became necessary, the FPGA industry piggybacked on the extensive development already done by the EDA industry in support of ASIC design. HDL simulators, synthesis tools, and other design aids were adapted and modified for FPGA use, and many customers switched to third-party EDA-supplied design tools for much of their FPGA development. Companies like Synplicity defined an industry by developing high-capability design tools aimed specifically at FPGA design, and at the peak, the majority of FPGA design teams were using third-party tools to supplement the FPGA-vendor-supplied suites.

For the past couple of years, however, that trend has apparently been reversing, at least in terms of the share of designs that use third-party tools. Third-party EDA for FPGA has continued to grow, but the FPGA market has grown faster, so EDA’s market share has declined even as its sales increased. Two possible reasons for this shift are easy to understand. First, while newer, high-end devices have been added to the top of the FPGA pile, the low-end devices are still almost as simple as ever. Customers using CPLDs and simple FPGAs can almost always make do with the FPGA-vendor-supplied tool offering. Second, as FPGAs expand into new markets, the newer, less-experienced users are likely to attempt their first projects using only the default tool suites. It often isn’t until they’ve had some practice that design teams begin to appreciate the additional benefits third-party tools can offer.

A third factor in this transition bears further investigation, though. The capabilities of FPGA-vendor tools are improving at a dramatic rate, forcing third-party tool vendors to race to stay ahead with their extra-cost offerings. As a case in point, Xilinx announced a new release of its ISE design software bundle this week. Version 8.1i is said to offer 10-37% better timing performance than the previous version of the tools, a staggering improvement claim for a software release. In most cases, a 10-37% improvement would give us a one-speed-grade difference in the FPGA required for a given set of timing constraints. Roughly translated, that could also mean a 30% reduction in device cost.
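The speed-grade arithmetic can be sketched with a few invented numbers. Everything below is hypothetical (the grade names, achievable Fmax figures, and relative prices are illustrative, not real Xilinx pricing); the point is only the mechanism by which a tool-driven timing gain becomes a cheaper part:

```python
# Hypothetical illustration: a timing improvement from better tools can
# drop a design into a cheaper speed grade of the same device.
# All figures below are invented for illustration.

speed_grades = [
    # (grade, achievable_fmax_mhz, relative_price)
    ("-12 (fastest)", 200.0, 1.00),
    ("-11",           170.0, 0.70),
    ("-10 (slowest)", 145.0, 0.50),
]

def cheapest_grade(required_fmax):
    """Pick the cheapest grade whose achievable Fmax meets the requirement."""
    candidates = [g for g in speed_grades if g[1] >= required_fmax]
    return min(candidates, key=lambda g: g[2])

required = 160.0  # MHz the design must close

before = cheapest_grade(required)      # with the old tools
# A 15% tool improvement lets every grade close ~15% more Fmax, which is
# equivalent to the requirement shrinking by a factor of 1/1.15.
after = cheapest_grade(required / 1.15)

print("before:", before[0], "relative cost", before[2])
print("after: ", after[0], "relative cost", after[2])
```

With these invented figures, the 15% timing gain moves the design from the -11 to the -10 grade, a cost reduction of just under 30%, in line with the rough translation above.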

In a technology economy where process-driven improvements of 30% typically involve investing tens of millions of dollars to surf the crest of the Moore’s Law wave, a 30% improvement from synthesis and layout algorithm improvements is an absolute steal. In this accomplishment, Xilinx, like other FPGA vendors, has continued down the path of increasing the raw horsepower in their default synthesis and layout software. Now Xilinx, Altera, Actel, and QuickLogic all have some form of physical synthesis available in their tool suite that improves timing performance significantly. In the case of Xilinx and Altera, the physical synthesis is proprietary, and in the case of Actel and QuickLogic, the capability is provided by special versions of Magma’s “Palace” FPGA optimization tool.

Integrating physical optimization makes strong sense technically. With each process generation, a greater percentage of the total delay in our FPGA designs is due to routing. This means that timing estimates that don’t take placement and routing into account are increasingly inaccurate. When logic synthesis tries to do timing optimization based on inaccurate estimates, the results can be catastrophic, often making timing worse instead of better. The logical solution to the problem is to expand beyond the logical and to include physical optimization in the timing phase.
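A toy example makes the failure mode concrete (the delay figures are invented): a synthesis tool that optimizes on logic delay alone can prefer the implementation that turns out slower once routing delay, which it cannot see before placement, is counted:

```python
# Toy illustration (numbers invented) of why timing optimization on
# pre-layout estimates can backfire once routing dominates total delay.

# Two candidate implementations of the same function:
#   (name, logic_delay_ns, routing_delay_ns_after_placement)
candidates = [
    ("deep-logic, short-wires",   4.0, 2.0),
    ("shallow-logic, long-wires", 3.0, 4.5),
]

# Logic synthesis without placement knowledge sees only the logic delay...
estimated_best = min(candidates, key=lambda c: c[1])

# ...but the real critical path includes routing as well.
actual_best = min(candidates, key=lambda c: c[1] + c[2])

print("synthesis would pick:", estimated_best[0])
print("actually faster:     ", actual_best[0])
```

Here the pre-layout estimate picks the shallow-logic version (3.0 ns of logic), but its long routes make its true path 7.5 ns against 6.0 ns for the alternative, which is why physical optimization belongs in the timing phase.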

From a market and business perspective, however, moving more of the synthesis process under the umbrella of the vendor’s place-and-route process cuts away opportunity that third-party EDA companies could use to differentiate their technology. Companies like Magma, Mentor, and Synplicity offer highly capable physical synthesis tools for FPGAs that can make a significant improvement in design performance and timing. As the FPGA vendors’ physical synthesis capabilities improve, the incremental advantage from using the third-party tools decreases. This attacks the bottom line of the EDA suppliers and ultimately reduces the money available for them to invest in future technology development as well.

Xilinx’s new release includes physical synthesis improvements such as register retiming (where logic elements are moved past register boundaries to better balance the delay on each side), timing-driven mapping (where packing and placement are done simultaneously), and global and local logic restructuring, replacement, and pin-swapping to improve delays. These are the same types of optimizations that one would normally expect in a high-end physical synthesis/physical optimization tool. Moving them into the default product offering raises the bar for competitors in both the FPGA vendor and EDA vendor camps.
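The register-retiming idea can be shown with a minimal sketch (the stage delays are invented): the achievable clock period is set by the slowest pipeline stage, so moving a combinational element across the register boundary to balance the stages shortens the critical one:

```python
# Minimal sketch (invented delays) of register retiming: moving logic
# across a register boundary to balance the delay on each side.

# A two-stage pipeline: delays (ns) of combinational elements per stage.
stage1 = [2.0, 3.0, 2.5]   # 7.5 ns of logic before the register
stage2 = [1.0, 1.5]        # 2.5 ns of logic after it

def min_period(s1, s2):
    """Clock period is limited by the slower of the two stages."""
    return max(sum(s1), sum(s2))

print("before retiming:", min_period(stage1, stage2), "ns")

# Retiming: move the last element of stage 1 past the register into stage 2.
stage1_rt = stage1[:-1]            # [2.0, 3.0]      -> 5.0 ns
stage2_rt = [stage1[-1]] + stage2  # [2.5, 1.0, 1.5] -> 5.0 ns

print("after retiming: ", min_period(stage1_rt, stage2_rt), "ns")
```

In this sketch the period drops from 7.5 ns to 5.0 ns without touching the logic itself, which is the kind of gain a physical synthesis pass harvests automatically.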

Xilinx has also attacked the problem of design constraints in creative ways with this release. If the designer provides no timing constraints (which is often the case in early phases of design), the new ISE tools use what Xilinx calls “Performance Evaluation Mode,” which evaluates the performance of all the clocks in a design in a single pass. This gives both a reasonable estimate of the performance that can be expected on the design and starting guidance for setting timing constraints for further optimization. Xilinx also offers what they call an “Xplorer” script, which does a multi-pass optimization with increasingly tight timing constraints in an attempt to find the absolute best performance available for a given design.
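Xilinx has not published the algorithm behind the Xplorer script, but the multi-pass idea the article describes might look something like the loop below. The tool model and the 5% tightening step are assumptions made purely for illustration:

```python
# Hedged sketch of a multi-pass constraint-tightening loop in the spirit
# of a script like "Xplorer". The real script's algorithm is not public;
# the tool model and 5% step below are assumptions for illustration.

ACHIEVABLE_FLOOR_NS = 6.0  # best period this toy "design" can reach

def place_and_route(target_period_ns):
    """Stand-in for a tool run: tools typically work just hard enough to
    meet the constraint, so a tighter target yields a faster result --
    down to a floor the design cannot beat."""
    return max(ACHIEVABLE_FLOOR_NS, target_period_ns)

best = place_and_route(10.0)   # first pass with a loose constraint
while True:
    target = best * 0.95       # tighten the period constraint by 5%
    achieved = place_and_route(target)
    if achieved > target:      # this pass failed timing: stop searching
        break
    best = achieved

print(f"best achieved period: {best:.2f} ns")
```

Each pass asks the tools for slightly more than the last one delivered, stopping at the first failure; the loop converges on roughly the best performance the design can reach.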

With this release, Xilinx also bundles enhanced power estimation in its “XPower” tool and its own simulator, “ISE Simulator Lite,” alongside the ModelSim simulator used by most FPGA design teams. Xilinx also offers a wide variety of add-on tools, such as the ChipScope Pro embedded debugger, the PlanAhead hierarchical floorplanner, the System Generator for DSP, and the Platform Studio for embedded system-on-chip assembly with the MicroBlaze soft processor core. The breadth of these add-ons (and of those offered by competitors) indicates even more clearly the FPGA industry’s determination to provide a comprehensive set of tools that will make its silicon useful in the widest possible range of applications.

The evolution of the FPGA tools market will be interesting to watch. While FPGA vendors invest increasingly large sums in tool development, they must be careful not to drive away the EDA industry, which is still actively developing technology that improves the viability of FPGA products. At the same time, the EDA industry as a whole has never demonstrated a commitment to FPGA technology solid enough to let the FPGA vendors rest easily, secure in the notion that EDA will provide all the tool capabilities they need. Caught between these conflicting forces, FPGA vendors must continue to develop their own tools while defending some profitable turf for EDA, keeping those companies engaged and actively engineering.

If the FPGA vendors can manage to maintain that balance, both the industry and its customers will benefit from higher capability tools. Those tools will make FPGAs better and more competitive products for a wide variety of applications and will fuel further growth in the technology and in the industry. If EDA loses interest and turns away, FPGA vendors will be left with the expensive burden of driving the tools side of the technology as well as the silicon. Progress in performance and other important metrics of FPGA usefulness will surely slow, and market growth will lag. Whichever way the balance tips, we’ll be there to watch.

