
The Value of a Complete FPGA Design Flow

If the chain of tools comprising your design flow works flawlessly in getting your hardware ideas to silicon quickly, then that flow is priceless. However, if one or more links in the chain is broken or corrupted, the value of the flow plummets. Whether you are assembling a new design flow or auditing your existing one, this paper covers methods for improving the effectiveness of that chain of tools.

Introduction

Design flows tend to grow in complexity over time. These flows consist of commercial tools bound together with scripts, home-brewed tools, and freeware to form a design environment. The flow works well until a new design breaks it; at that point, a frenzy of activity takes place to track down the fault and fix the problem. A design flow that has evolved over time is difficult to understand in all its elements. The seemingly seamless flow, in fact, tends to be held together with a lot of script putty that grows old and cracks with age. Breakage is especially likely when a company shifts to modern design methodologies such as advanced verification, or to new languages such as SystemC and SystemVerilog.

It is rare that you will be asked to create a brand new design flow from scratch. This typically only occurs when one or more of the following events happen:

  • A change in management or personnel.

  • The last project failed or was incredibly late to market.

  • One of your key tool vendors discontinued support of a tool or exited the market.

  • The company is switching to a more modern design methodology.

More likely, you are auditing the design flow to improve it for the next project. But, in either scenario, there is an important set of questions to be asked to determine areas of improvement.

Based on over twenty years of experience examining customers’ FPGA design flows, Mentor Graphics is in a unique position to understand what works well and what does not. The questions listed below are based upon this experience and will give you a reasonable idea of how to craft a new design flow or simply improve an existing one.

The Assessment

Examine your design flow and answer the following questions in order to assess your next steps. The possible answers are: Yes, Sometimes, and No.

  1. Is the flow standards-based? Avoid proprietary formats that can lock you into particular tools and do not interface well with other tools. Try to keep your design in pure VHDL or Verilog.

  2. Is your RTL technology-independent? As soon as you instantiate a technology-specific object into the RTL description, you are locked into a particular technology, making reuse or retargeting difficult.

  3. Does the flow avoid the use of module generators? Counting on a vendor module generator instead of its RTL equivalent can again lock you into a particular vendor, preventing easy reuse of the code.

  4. Does your flow avoid language transformations? The more transformations from one language or format to another that exist in your flow, the more problems you will have in tracking down errors. Stay with RTL for as long as you can and avoid transformations to proprietary data structures or to formats such as EDIF.

  5. Does the team use a defined directory structure? If the team standardizes the method for storing projects on disk, finding problems and understanding the data becomes easier.

  6. Do you use a naming convention? Establishing naming standards and conventions within your file structure and RTL helps teams debug issues.

  7. Does the flow use only core RTL statements? Using every construct in a new language (such as SystemVerilog) can cause flow problems, because tool support typically grows from a subset of constructs over time. An exotic construct that no one else uses might seem impressive, right up until it breaks a tool in the flow.

  8. Do you enforce coding rules? Use a linting tool to enforce best practices as you write RTL, preventing downstream surprises and costly iteration loops.

  9. Does your flow minimize the use of scripts? While scripting is great for adding functionality and automation to your flow, scripts are often the weakest link in the chain: they are usually poorly documented, and the author may no longer be with the organization. Keep any scripts simple and well documented.

  10. Do you minimize API calls in scripts? API calls are great for customizing tool functionality. However, not only are API calls tool-specific, they can also change from one tool release to the next.

  11. Do you use version management? You only need to lose a build of a project or require a back-out of a design change once before realizing the value of version management. If possible, stay away from the exotic tools and stick with the industry-standards such as RCS or CVS.

  12. Do you perform regular design reviews? As you move through the design flow, set design review milestones to ensure code quality.

  13. Do you document your design? Documenting code with text, graphics, and tables helps in design reviews, assists management, and allows the reuse of code in another project.

  14. Are you using advanced verification techniques? Using assertions, modern testbench technologies, monitors, and similar techniques helps you attain working silicon faster.

  15. Does the flow support the use of an RTL repository? Design flows typically can move faster if you are not designing a module that has been completed elsewhere in the company. Publish your designs via the company’s intranet to avoid this problem.

  16. Do you have an archiving method? At the end of the project, all the data required to replicate the design and environment should be collected and stored for later reuse.

  17. Does your design chain contain only commercial tools? Commercial tools are well maintained and supported in contrast to home-brew tools and scripts.

  18. Does your flow contain the minimum set of commercial tools? Obviously, the more elaborate the tool chain, the more prone to problems the flow is. Start with tools to specify/reuse RTL, synthesize, simulate, and place and route the design. Add tools as the designs become more complex, with the goal of minimizing the number of tool vendors. Look at FPGA vendor OEM deals to understand who the market leaders are.

  19. Do you have flow tests? Each time a new version of a tool comes out, an automated method is needed to test the design flow to ensure it still works.

  20. Do you account for the PCB? Changes at the RTL interface and the FPGA interface during PCB routing can cause flow iterations. A bi-directional method of accounting for these changes is required.
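Several of the questions above (a defined directory structure, naming conventions, coding rules) lend themselves to simple automated checks. The sketch below shows one hypothetical rule — Verilog input and output ports must carry `i_`/`o_` prefixes — enforced with a small Python scan. The prefix convention and the regular expression are illustrative assumptions, not a published standard; a real linting tool enforces far richer rule sets.

```python
import re

# Hypothetical naming rule: input ports start with i_, output ports with o_.
# These prefixes are an illustrative assumption, not a vendor recommendation.
PORT_RE = re.compile(
    r"^\s*(input|output)\s+(?:wire\s+|reg\s+)?(?:\[[^\]]+\]\s*)?(\w+)"
)
PREFIXES = {"input": "i_", "output": "o_"}

def lint_ports(rtl_text):
    """Return a list of (line_number, port_name) naming violations."""
    violations = []
    for lineno, line in enumerate(rtl_text.splitlines(), start=1):
        m = PORT_RE.match(line)
        if m:
            direction, name = m.group(1), m.group(2)
            if not name.startswith(PREFIXES[direction]):
                violations.append((lineno, name))
    return violations
```

Run over every RTL file in the team's standard directory tree, a check like this catches convention drift long before a design review does.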
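Question 19 asks for automated flow tests. A minimal sketch of one: run each flow stage on a known-good design as a subprocess, and stop at the first failure. The stage commands below are placeholders (they invoke the Python interpreter so the sketch is self-contained); substitute your actual synthesis, simulation, and place-and-route invocations.

```python
import subprocess
import sys

def run_flow(stages):
    """Run (name, command) pairs in order; return the name of the
    first failing stage, or None if the whole flow passes."""
    for name, cmd in stages:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            print(f"flow test FAILED at stage '{name}':\n{result.stderr}")
            return name
        print(f"stage '{name}' passed")
    return None

# Placeholder stages: replace these commands with your real tool
# invocations (synthesis, simulation, place and route) on a golden design.
stages = [
    ("lint", [sys.executable, "-c", "pass"]),
    ("simulate", [sys.executable, "-c", "print('ok')"]),
]
```

Schedule a script like this to run whenever a new tool version is installed, so a broken link in the chain surfaces before a project depends on it.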

The Scoring

One method for assessing the flow is to give it a score. Give 10 points for each question you answered “Yes,” 5 points for each “Sometimes,” and 0 points for each “No.” Add up the score and refer to the following chart:

Score       Interpretation
180 – 200   A very solid design flow; few, if any, changes are required.
160 – 175   Not bad, but could use a few adjustments for a better flow.
140 – 155   Average, but could use considerable adjustment.
120 – 135   Needs major work to improve the flow.
Below 120   Consider re-architecting the flow.

In this paper, all questions carry equal weight. Of course, you may consider some aspects more important than others and develop a weighted scoring system to evaluate a flow.
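Both the flat scheme and the weighted variant are simple to automate. The sketch below is one possible implementation: each answer maps to its point value, and an optional per-question weight list (defaulting to 1 for every question) implements the weighted scoring the paper suggests.

```python
# Point values from the paper: "Yes" = 10, "Sometimes" = 5, "No" = 0.
POINTS = {"Yes": 10, "Sometimes": 5, "No": 0}

def score_flow(answers, weights=None):
    """Sum the point value of each answer; optional per-question
    weights (defaulting to 1.0 each) give a weighted score."""
    if weights is None:
        weights = [1.0] * len(answers)
    return sum(POINTS[a] * w for a, w in zip(answers, weights))
```

For example, eighteen “Yes” answers plus one “Sometimes” and one “No” score 185, landing in the top band of the chart.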

Conclusion

Taking a peek behind the curtain of your design flow can be scary. However, periodic review and updates to the flow can improve design efficiency. Consulting with a broad-range EDA supplier can be a good start in building a new flow or improving an existing one. Design automation is the key to successful implementation of ideas into FPGAs. Make sure your design flow remains a priceless possession.

 


Author’s Bio: Tom Dewey is a Technical Marketing Engineer for the HDL Designer Series product line at Mentor Graphics. Tom has over 19 years of EDA experience, contributing to ASIC and FPGA software products used for creation, synthesis, verification, and test.

 

 
