
The Value of a Complete FPGA Design Flow

If the chain of tools comprising your design flow works flawlessly in getting your hardware ideas to silicon quickly, then the value of that flow is priceless. However, if one or more of the links in that chain is broken or corrupted, then the value of that flow plummets. Whether you are thinking about assembling a new design flow or auditing your existing flow, this paper covers methods for improving the effectiveness of that chain of tools.

Introduction

Design flows tend to grow in complexity over time. These flows consist of commercial tools bound together with scripts, home-brewed tools, and freeware to form a design environment. Such a flow works well until a new design breaks it. At that point, a mad frenzy of activity takes place to track down the fault and fix the problem. In examining a design flow that has evolved over time, it is difficult to understand all of its elements. This seemingly seamless flow, in fact, tends to be held together with a lot of script putty that grows old and cracks with age. Breakage is especially likely when a company shifts to modern design methodologies such as advanced verification, or to new languages such as SystemC and SystemVerilog.

It is rare that you will be asked to create a brand new design flow from scratch. This typically only occurs when one or more of the following events happen:

  • A change in management or personnel.

  • The last project failed or was incredibly late to market.

  • One of your key tool vendors discontinued support of a tool or exited the market.

  • The company is switching to a more modern design methodology.

More likely, you are auditing the design flow to improve it for the next project. But in either scenario, there is an important set of questions to ask to determine areas for improvement.

Based on over twenty years of experience examining customers’ FPGA design flows, Mentor Graphics is in a unique position to understand what works well and what does not. The questions listed below are based on this experience and will give you a reasonable idea of how to craft a new design flow or improve an existing one.

The Assessment

Examine your design flow and answer the following questions in order to assess your next steps. The possible answers are: Yes, Sometimes, and No.

  1. Is the flow standards-based? Avoid proprietary formats that can lock you into particular tools and do not interface well with other tools. Try to keep your design in pure VHDL or Verilog.

  2. Is your RTL technology-independent? As soon as you instantiate a technology-specific object into the RTL description, you are locked into a particular technology, making reuse or retargeting difficult. A sketch following this checklist illustrates the inference-based alternative for this question and the next.

  3. Does the flow avoid the use of module generators? Counting on a vendor module generator instead of its RTL equivalent can again lock you into a particular vendor, preventing easy reuse of the code.

  4. Does your flow avoid design-language transformations? The more transformations from one language or format to another that exist in your flow, the more problems you will have in tracking down errors. Stay with RTL for as long as you can and avoid transformations to proprietary data structures or formats such as EDIF.

  5. Does the team use a defined directory structure? If the team standardizes the method for storing projects on disk, finding problems and understanding the data become easier. An example layout follows this checklist.

  6. Do you use a naming convention? Establishing naming standards and conventions within your file structure and RTL helps teams debug issues. An example convention follows this checklist.

  7. Does the flow use only core RTL statements? Using every construct in a new language (such as SystemVerilog) can cause flow problems, because tool support for a new language typically starts with a subset of constructs and grows over time. An exotic construct that no one else uses might seem impressive, until it breaks a tool in the flow.

  8. Do you enforce coding rules? Use a linting tool to enforce best practices as you write RTL, preventing downstream surprises and costly iteration loops. A typical lint finding is sketched after this checklist.

  9. Does your flow minimize the use of scripts? While scripting is great for adding functionality and automation to your flow, scripts are the weakest link in the chain: they are usually poorly documented, and the author may no longer be with the organization. Keep any scripts you do use simple and well documented.

  10. Do you minimize API calls in scripts? API calls are great for customizing tool functionality. However, not only are API calls tool-specific, they can also change from one tool release to the next.

  11. Do you use version management? You only need to lose a build of a project or require a back-out of a design change once before realizing the value of version management. If possible, stay away from exotic tools and stick with industry standards such as RCS or CVS.

  12. Do you perform regular design reviews? As you move through the design flow, set design review milestones to ensure code quality.

  13. Do you document your design? Documenting code with text, graphics, and tables helps in design reviews, assists management, and allows the reuse of code in another project.

  14. Are you using advanced verification techniques? Using assertions, new testbench technologies, monitors, and similar techniques helps attain working silicon faster. An assertion example follows this checklist.

  15. Does the flow support the use of an RTL repository? Design flows typically move faster if you are not redesigning a module that has already been completed elsewhere in the company. Publish your designs via the company’s intranet to avoid this problem.

  16. Do you have an archiving method? At the end of the project, all the data required to replicate the design and environment should be collected and stored for later reuse.

  17. Does your design chain contain only commercial tools? Commercial tools are well maintained and supported, in contrast to home-brewed tools and scripts.

  18. Does your flow contain the minimum set of commercial tools? Obviously, the more elaborate the tool chain, the more prone to problems the flow is. Start with tools to specify/reuse RTL, synthesize, simulate, and place and route the design. Add tools as the designs become more complex, with the goal of minimizing the number of tool vendors. Look at FPGA vendor OEM deals to understand who the market leaders are.

  19. Do you have flow tests? Each time a new version of a tool comes out, an automated method is needed to test the design flow and ensure it still works. A minimal smoke-test sketch follows this checklist.

  20. Do you account for the PCB? Changes at the RTL interface and the FPGA interface during PCB routing can cause flow iterations. A bi-directional method of accounting for these changes is required.
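
The following sketches illustrate several of the questions above. They are examples only: every module, signal, and directory name in them is invented for illustration rather than taken from any particular flow or tool, and your team’s own standards should take precedence.

For questions 2 and 3, the Verilog sketch below infers a memory from generic RTL instead of instantiating a vendor RAM primitive or a generated memory core, so the same source can be retargeted to any FPGA family:

    // Technology-independent single-port RAM, inferred from generic RTL.
    // No vendor primitives or generated cores are instantiated.
    module generic_ram #(
      parameter ADDR_WIDTH = 8,
      parameter DATA_WIDTH = 16
    ) (
      input                       clk,
      input                       we,      // write enable
      input  [ADDR_WIDTH-1:0]     addr,
      input  [DATA_WIDTH-1:0]     wdata,
      output reg [DATA_WIDTH-1:0] rdata
    );

      // Synthesis maps this array to block or distributed RAM as appropriate.
      reg [DATA_WIDTH-1:0] mem [0:(1<<ADDR_WIDTH)-1];

      always @(posedge clk) begin
        if (we)
          mem[addr] <= wdata;
        rdata <= mem[addr];    // synchronous read
      end

    endmodule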
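
For question 5, one possible project layout is shown below; the directory names are only an example of the kind of structure a team might standardize on:

    project_x/
      rtl/          -- synthesizable VHDL/Verilog source
      tb/           -- testbenches and verification code
      constraints/  -- timing and pin constraints
      scripts/      -- documented build and flow scripts
      syn/          -- synthesis results
      par/          -- place-and-route results
      doc/          -- specifications and design review notes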
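
For question 6, a convention might encode port direction, polarity, and register status directly in signal names. The suffixes used below are one possibility, not a standard; the value comes from the team agreeing on a single set and applying it everywhere:

    // Illustrative convention:
    //   *_i / *_o : module input / output ports
    //   *_n       : active-low signal
    //   *_r       : registered (flip-flop) signal
    module ctrl_unit (
      input            clk_i,      // clock
      input            rst_n_i,    // active-low asynchronous reset
      input            load_i,     // load enable
      input      [7:0] data_i,     // parallel data in
      output reg [7:0] data_r_o    // registered data out
    );

      always @(posedge clk_i or negedge rst_n_i) begin
        if (!rst_n_i)
          data_r_o <= 8'h00;
        else if (load_i)
          data_r_o <= data_i;
      end

    endmodule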
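
For question 8, a typical rule that a lint tool enforces is “no unintended latches.” The fragment below shows a combinational block that most linters would flag, followed by a corrected version:

    module arb_fragment (
      input      req,
      output reg grant
    );

      // Flagged by most lint tools: with no else branch, 'grant' holds
      // its previous value when 'req' is low, so synthesis infers an
      // unintended latch.
      //
      //   always @(*) begin
      //     if (req)
      //       grant = 1'b1;
      //   end

      // Corrected: every path through the block assigns 'grant', so
      // pure combinational logic is inferred.
      always @(*) begin
        if (req)
          grant = 1'b1;
        else
          grant = 1'b0;
      end

    endmodule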
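
For question 14, a concurrent SystemVerilog assertion both documents an interface rule and checks it continuously in simulation. The request/acknowledge protocol below is invented for illustration:

    module handshake_checker (
      input logic clk,
      input logic rst_n,
      input logic req,
      input logic ack
    );

      // Hypothetical protocol rule: every request must be acknowledged
      // within 1 to 4 clock cycles.
      property p_req_ack;
        @(posedge clk) disable iff (!rst_n)
          req |-> ##[1:4] ack;
      endproperty

      assert property (p_req_ack)
        else $error("req was not acknowledged within 4 cycles");

    endmodule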
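
For question 19, one lightweight building block of a flow test is a small self-checking design that the flow compiles, simulates, synthesizes, and places-and-routes end to end after every tool change. A pass/fail message in the simulation log makes the run easy to automate. The counter and testbench below are a minimal sketch of that idea:

    // Smoke-test design: a 4-bit counter.
    module counter4 (
      input            clk,
      input            rst,
      output reg [3:0] count
    );
      always @(posedge clk)
        if (rst) count <= 4'd0;
        else     count <= count + 4'd1;
    endmodule

    // Self-checking testbench: prints PASS or FAIL so an automated
    // flow test can grep the log after each tool-version change.
    module tb_counter4;
      reg        clk = 1'b0;
      reg        rst = 1'b1;
      wire [3:0] count;

      counter4 dut (.clk(clk), .rst(rst), .count(count));

      always #5 clk = ~clk;              // free-running clock

      initial begin
        repeat (2) @(posedge clk);
        rst <= 1'b0;                     // release reset (nonblocking avoids a race)
        repeat (10) @(posedge clk);      // expect ten increments
        #1;                              // let the final update settle
        if (count === 4'd10)
          $display("FLOW TEST PASS");
        else
          $display("FLOW TEST FAIL: count = %0d", count);
        $finish;
      end
    endmodule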

The Scoring

One method for assessing the flow is to give it a score. Give 10 points for each question you answered “Yes,” 5 points for each “Sometimes,” and 0 points for each “No.” Add up the score and refer to the following chart:

Score        Interpretation
180 – 200    A very solid design flow; few, if any, changes are required.
160 – 175    Not bad, but could use a few adjustments for a better flow.
140 – 155    Average, but could use considerable adjustment.
120 – 135    Needs major work to improve the flow.
Below 120    Consider re-architecting the flow.

In this paper, all the questions are of equal weight. Of course, you may consider some aspects more important than others and want to develop a weighted scoring system to evaluate a flow.

Conclusion

Taking a peek behind the curtain of your design flow can be scary. However, periodic review and updates to the flow can improve design efficiency. Consulting with a broad-range EDA supplier can be a good start in building a new flow or improving an existing one. Design automation is the key to successful implementation of ideas into FPGAs. Make sure your design flow remains a priceless possession.

 


Author’s Bio: Tom Dewey is a Technical Marketing Engineer for the HDL Designer Series product line at Mentor Graphics. Tom has over 19 years of EDA experience, contributing to ASIC and FPGA software products used for creation, synthesis, verification, and test.

 

 
