FPGA Simulation

Forget what you learned in ASIC design

When someone uses the words “verification” and “FPGA” in the same sentence, I’m always suspicious. In the ASIC design world, where risk avoidance is everything, “verification” is a sacred term. “Verification” is the long pole in the tent, the most time-consuming phase of the design cycle. “Verification” is what you do to protect your job so you’re not blamed for an expensive and time-consuming re-spin of an ASIC design. “Verification” is what EDA companies have learned to trust as their bread-and-butter. Capitalizing on the risk inherent in the ASIC design flow, they produce premium-priced solutions that seek to siphon off the inevitable human failures that find their way into complex engineering projects.

It stands to reason, then, that when FPGA designs began to reach the same complexity as ASIC designs, EDA companies and ASIC designers would have a Pavlovian response, driving them immediately to the question of “verification”. Without so much as a pause to ponder, they mimic the mantras of the ASIC world: “Well, if your FPGA designs are reaching five million gates, you’re going to need some sophisticated verification tools.”

“Verification” is what you do to make sure that your design is correct before you commit it to hardware. It is the final “let’s be sure” step that catches any last-minute problems that crept into your code. It represents an investment of time and resources to reduce overall risk. The sensible investment in verification tools, then, is proportional to the likelihood of an error making it through to hardware multiplied by the cost of correcting that error. The more we invest in verification, the smaller the first term (likelihood of error) gets. The second term, cost of correction, however, is where the problem occurs. For ASIC design, the second term is measured in hundreds of thousands of dollars and weeks of schedule time. In addition to those penalties, each re-spin to correct an error carries immeasurable impact in missed market opportunity. For FPGA design, there is no such penalty. FPGAs, in fact, are one of the leading verification technologies for ASIC design. People use FPGAs every day as a platform for prototyping ASIC designs in hardware for accelerated verification. There simply is no way to justify significant investment in software verification tools for the design of FPGAs.
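
To put some rough numbers on that argument, here is a back-of-the-envelope sketch; the figures are purely illustrative assumptions, not data from any survey:

    expected cost of an escaped bug = P(bug reaches hardware) x cost of fixing it in hardware

    ASIC (illustrative): 5% x ($500,000 re-spin + weeks of schedule) = tens of thousands of dollars per design, plus the lost market window
    FPGA (illustrative): 5% x (a recompile and a reprogram)          = a few hours of engineering time

Even with identical odds of a bug escaping, the right-hand sides are so far apart that the economics of verification tooling simply don’t transfer from one world to the other.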

This is not to say that production FPGA designs don’t need to be verified before they’re shipped. They do, of course, but given the state of modern development boards, reprogrammable architectures, and on-chip hardware/software debugging with high-bandwidth data transfer and robust user interfaces, the only sensible way to verify FPGA designs is using an HIL (Hardware-In-the-Loop) approach. Spending tens of thousands of dollars on software tools that are more than an order of magnitude slower and don’t model the hardware as accurately just doesn’t make sense.

The keystone of all modern ASIC verification tools is the venerable HDL simulator. Thanks to ASIC design, the concepts of HDL simulation and “verification” are so tightly connected that few are able to entertain the idea that HDL simulation could be used for any other purpose. HDL simulation is an integral part of FPGA design — just not in the role of verification. Most modern FPGAs are developed primarily in VHDL and Verilog, and the HDL simulator is as important to the development of that code as a debugger is to a software developer.
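
To make the debugger analogy concrete, here is a minimal sketch of that day-to-day use in Verilog; the count8 counter and its ports are hypothetical stand-ins for whatever block you happen to be debugging, not part of any particular vendor flow:

    // Module-level smoke test: run it in the simulator, watch the
    // waveforms, fix the RTL, and re-run, much as a software developer
    // single-steps code in a debugger. The device under test (count8)
    // is a hypothetical 8-bit counter used here only for illustration.
    `timescale 1ns / 1ns
    module count8_tb;
      reg        clk = 0;
      reg        rst = 1;
      wire [7:0] count;

      count8 dut (.clk(clk), .rst(rst), .count(count));  // device under test

      always #5 clk = ~clk;                              // 100 MHz clock

      initial begin
        #20  rst = 0;                                    // release reset
        #200;                                            // let it run a while
        $display("count after run = %0d", count);        // quick sanity check
        $finish;
      end
    endmodule

No testbench frameworks and no coverage databases, just enough stimulus to watch the block misbehave and a good debug environment in which to figure out why.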

For multi-purpose marketers at EDA companies, it’s hard to break the habit. When you’re used to selling your product to ASIC designers as a verification tool, it’s difficult to turn around and sell the same product to an FPGA audience as a development and debug aid. The requirements, however, are really quite different. As a designer you should be aware of what you need and have patience with the product vendors, who are understandably confused.

For ASIC design, the key attributes of an HDL simulator are (in order) speed, speed, speed, and capacity. The number of events that can be evaluated on a code-crunching farm of five hundred air-conditioned Linux boxes during a 48-hour run is the most meaningful metric to the hard-core ASIC verification crowd.

In FPGA design, however, the heavy lifting is done in hardware. In less time than it takes to upload the simulation kernel and test vectors to a simulation farm, you can be happily probing your actual FPGA hardware implementation with virtual instrumentation, running the design at speeds that would make the simulation savants green with envy.

The important features of FPGA simulation are ease of use, robustness of the debug environment, and enough capacity to handle your most ambitious VHDL or Verilog models. Following in close formation are the ability to handle various forms of IP integration and general smoothness of integration with the rest of the FPGA tool flow. Since FPGA design is also a melting pot of designers coming from different disciplines, support for easy integration of models written in other languages is also valuable.

Most FPGA development and debug is done at the module level, so capacity and performance are secondary considerations for a change. Yes, you’ll want enough capacity to load your whole design (which regardless of “system gate” counts won’t be the size of a large ASIC), but you’ll really only use RTL simulation. Gate-level simulation performance, though used as a bragging right by simulation vendors, really never comes into play, as you’ll be in actual hardware long before that stage.

There are a number of other ASIC-related features that can add a lot of hype (and price) to HDL simulation environments that aren’t really of much use in the FPGA area. Features like assertions (which are finding increasing use in ASIC verification) have little utility in the typical FPGA design flow.

While most EDA suppliers are off chasing the ASIC bus, there are at least a couple who understand FPGA and are quite successful with tools aimed at the FPGA market. Foremost of these is Mentor Graphics’ ModelSim, which has been the leading FPGA simulator for a number of years. According to our FPGA project survey, over 60% of FPGA designers use ModelSim to debug their HDL designs. Part of the secret of this is the distribution channel. Most FPGA vendors have agreements to re-distribute ModelSim as part of their standard development kits, so accessibility is high.

“We make a conscious effort to address FPGA designers,” said Anna Leef, Product Marketing Manager for the ModelSim product line at Mentor Graphics. “Many of our FPGA customers value ease of use and debug functionality more than performance. We’ve spent a lot of resources over the past few years increasing our capabilities in these areas. For ease of use, we’ve added language templates and better project management, and on the debug side we offer source, waveform and graphical debug capabilities for single- and mixed-language designs. Through Mentor Graphics we also offer FPGA Advantage products that feature ModelSim integrated with the HDL Designer Series family of design entry tools and Precision synthesis.”

“We also package and distribute our product in specific ways to address a variety of user needs. Our OEM products offered directly by the FPGA vendors include enough performance to handle small to medium-sized FPGAs, and they also include the same ModelSim user interface available in our higher-priced products. This makes it easy to migrate to our more powerful products as a user’s designs get larger. ModelSim PE, our industry-leading Windows-based simulator, is specifically targeted at FPGA designers and offers entry-level price points of less than $5,000. Our high-end product, ModelSim SE, provides a simulation, verification, and debug environment for users that want more performance, capacity, and verification capabilities.”

Also aggressively pursuing the FPGA design business is Aldec with their FPGA vendor-independent simulator, Active-HDL. In addition to providing full-featured HDL entry, verification, and debugging, Aldec has also focused on providing solutions for new-generation FPGAs. Active-HDL offers such features as integrated C debugging for embedded processors and DSP co-verification with Matlab. Active-HDL also includes built-in interfaces and library support for SystemC and SWIFT Smart Models.

So don’t be biased by ASIC requirements in choosing an FPGA simulator. Think about what your team really needs and buy accordingly. Remember that the best acceleration technology available is probably your target FPGA, and thanks to the current generation of embedded debugging aids, you’ll likely have a robust environment that will have you shipping products long before your ASIC counterparts.
