
Minimizing the Pain of RTL Design Reviews

Design reviews conjure images of engineers carrying reams of code printouts, filing single file and head down into a room to be judged by others. The positive impact of design reviews has been proven through many studies, but does the preparation and process of the review have to be so painful? This paper presents a practical approach to design reviews that smooths the process and actually results in a positive experience.

The key to a painless design review process is to use techniques and tools as code is developed that contribute to a quality solution, minimizing the time spent in the review. Review time can then be spent on important project debates, such as algorithmic or architectural implementation decisions, instead of drowning in the minutiae of line-by-line code discussions. In an ideal process, automated design review techniques take the place of most manual techniques. In reality, there is always a mix of automated and manual techniques, as Figure 1 shows.

Figure 1: Automated techniques provide the most efficient design review.

The Source Code

Upfront techniques applied to source code provide an excellent platform for painless design reviews. Consider the following ideas:

  • Version management. If a version management tool is in use, make sure the correct version of the code is checked out for review and create a difference report on the code base between the last reviewed version and the latest. If version management is not in place, consider using it on this project before the review takes place.

  • Comments. Designers tend to ignore comments in code, but they can help facilitate a smooth design review. Comments should take a narrative approach, explaining what is going on with the code and why certain choices were made. This approach has the amazing effect of the author finding logic flaws and defects before the review takes place. It also provides a guided conversation during the code review.

  • Layout. The team should agree to a consistent layout of code modules. For example: one module per file, order of statements, and use labels on constructs. A standard layout cuts down on review time because the code flows in a consistent manner from module to module.

  • Checklists. Consider creating a checklist for each code developer that contains a set of common errors to check as the code is written. These items can include particular style guidelines, typical mistakes seen in the past, and known best practices. Each designer can add his/her own items to watch out for, based on past experience.

  • Linting. Consider using a lint tool to automate the checklist process. A lint tool can catch many of the coding mistakes as the code is written, report on code complexity and other metrics, and enforce code layout preferences. Some lint tools can score the code based on violations found. Teams can set a minimum score required before the review takes place. During the review the author can present the lint report and concentrate only on why he/she believes an exception should be made for any remaining violations.
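The difference report mentioned under Version management above can be sketched in a few lines of Python using the standard difflib module. The RTL fragment and version labels below are purely illustrative; a real flow would pull both versions of each file from the version management tool:

```python
import difflib

def diff_report(old_lines, new_lines, old_label="last-reviewed", new_label="latest"):
    """Produce a unified-diff report between two versions of a source file."""
    return "".join(difflib.unified_diff(old_lines, new_lines,
                                        fromfile=old_label, tofile=new_label))

# Illustrative RTL fragment: one line changed since the last reviewed version.
old = ["module counter(input clk);\n", "  reg [7:0] count;\n", "endmodule\n"]
new = ["module counter(input clk);\n", "  reg [15:0] count;\n", "endmodule\n"]
print(diff_report(old, new))
```

Reviewers can then focus the discussion on the changed lines rather than rereading code that was already approved.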
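Even without a commercial lint tool, parts of a checklist can be automated with a small script. The sketch below shows the idea with a few hypothetical rules and a simple scoring scheme (100 minus five points per violation); the rules and penalties are assumptions for illustration, not any tool's standard:

```python
# Hypothetical checklist rules; real lint tools ship far richer rule sets.
RULES = [
    ("line over 100 chars", lambda line: len(line) > 100),
    ("TODO left in code",   lambda line: "TODO" in line),
    ("tab character used",  lambda line: "\t" in line),
]

def lint(source):
    """Return (line_number, rule_name) pairs for every rule violation."""
    violations = []
    for num, line in enumerate(source.splitlines(), start=1):
        for name, check in RULES:
            if check(line):
                violations.append((num, name))
    return violations

def score(source, start=100, penalty=5):
    """Illustrative scoring: start at 100, subtract a penalty per violation."""
    return max(0, start - penalty * len(lint(source)))

rtl = "module top;\n\treg r; // TODO: rename\nendmodule\n"
print(lint(rtl))
print(score(rtl))
```

A team could run such a script in a pre-review hook and require a minimum score before scheduling the review, mirroring the score-gating idea described above.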

The Visuals

In some cases, a picture of the code provides a better basis for high-level review discussions. Some EDA tools can provide these views automatically or they can be created manually. Consider creating the following visual aids:

  • Reference map. Create a map of all the references to files external to the code under review. References can be: include files, import files, packages, and macros. If the code is object oriented, create a class tree in order to track down inherited methods. The maps often take the form of a spreadsheet or a bubble diagram and they are used to quickly trace back to the source during the review.

  • File map. If the code under review is spread out into multiple files, a simple file map can be useful in order to understand the topography. This map is often combined with the reference map.

  • Requirement trace map. This map shows the relationships between functional and verification requirements and the source code being reviewed. Sophisticated maps also trace to the testbench and simulation results. During the review process, missing requirements can quickly be identified and incorrect implementation of requirements at the code level is evident.

  • Visualizations. A high-level block diagram is useful for hierarchical structural code. Complex algorithmic code is easier to review using state machine or flowchart visualizations.
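The reference map described above can be generated rather than drawn by hand. Here is a minimal sketch that scans Verilog-style `include directives; the file names and sources are illustrative, and patterns for packages or macros would be added the same way:

```python
import re
from collections import defaultdict

# Assumes Verilog-style `include directives, e.g. `include "opcodes.vh".
INCLUDE_RE = re.compile(r'`include\s+"([^"]+)"')

def reference_map(files):
    """Map each file name to the external files it references.

    `files` maps file name -> source text; a real script would read from disk.
    """
    refs = defaultdict(list)
    for name, text in files.items():
        for match in INCLUDE_RE.finditer(text):
            refs[name].append(match.group(1))
    return dict(refs)

sources = {
    "alu.v": '`include "opcodes.vh"\nmodule alu; endmodule\n',
    "top.v": '`include "opcodes.vh"\n`include "params.vh"\nmodule top; endmodule\n',
}
print(reference_map(sources))
```

The resulting map can be dumped to a spreadsheet or rendered as a bubble diagram, so reviewers can trace any reference back to its source quickly.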
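A requirement trace map can likewise be built automatically if requirements are tagged in code comments. The sketch below assumes an illustrative REQ-<n> tagging convention (not a standard) and reports both where each requirement is implemented and which requirements are missing:

```python
import re

# Assumes requirements are tagged in comments as REQ-<number>.
REQ_RE = re.compile(r"REQ-(\d+)")

def trace_requirements(req_ids, files):
    """Return (coverage map, missing list) for a set of requirement IDs."""
    found = {rid: [] for rid in req_ids}
    for name, text in files.items():
        for num, line in enumerate(text.splitlines(), start=1):
            for match in REQ_RE.finditer(line):
                rid = "REQ-" + match.group(1)
                if rid in found:
                    found[rid].append((name, num))
    missing = [rid for rid, hits in found.items() if not hits]
    return found, missing

files = {"fifo.v": "// REQ-101: depth is configurable\nmodule fifo; endmodule\n"}
covered, missing = trace_requirements(["REQ-101", "REQ-102"], files)
print(missing)  # REQ-102 has no implementation yet
```

During the review, the missing list surfaces unimplemented requirements immediately, as described above.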
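For the block-diagram visualization, one lightweight approach is to emit Graphviz DOT text from the module-instantiation hierarchy. The hierarchy below is illustrative; a real script would extract it from the parsed RTL:

```python
def hierarchy_to_dot(instances):
    """Render a module-instantiation hierarchy as Graphviz DOT text.

    `instances` maps a parent module to the modules it instantiates.
    """
    lines = ["digraph hierarchy {"]
    for parent, children in instances.items():
        for child in children:
            lines.append('  "%s" -> "%s";' % (parent, child))
    lines.append("}")
    return "\n".join(lines)

dot = hierarchy_to_dot({"top": ["alu", "fifo"], "fifo": ["ram"]})
print(dot)  # feed this text to the Graphviz `dot` tool to render a diagram
```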

Review Process Considerations

The team chooses the review process. It can be a manual code review or an interactive collaboration over the web. Regardless of the process, consider the following concepts:

  • Website. Consider assembling the code, associated documents, maps, and visualizations on a website. This allows easy access to the material via a web browser. Tools exist that automate this process, or the website can be created manually.

  • Scope. Studies have consistently shown that the most effective review is performed in an hour on 200 lines of code or less. Keep this in mind when targeting the code for review.

  • Version Management. Manage code changes using the version management tool.

  • Bug tracking. Record any defects found into a bug tracking system. If the team does not currently use a tool, consider an open source tool such as Bugzilla. Finding defects is not useful if they do not get fixed. Bug tracking tools manage this process.

  • Note template. Establish a template for recording the review feedback. It should minimally contain what was reviewed and discovered, who was involved, and any defect resolutions. Tie these notes into a bug tracking system and the version management system.

  • Review. Review the review process itself. Were defects found? What worked well? What needs improvement? Address these issues before the next review, and update the checklists and lint rules based on the defects that were found.
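The scope guideline above (about an hour on 200 lines of code or less) can be turned into a simple planning script. This sketch greedily packs files into review sessions under a line budget; the file names and line counts are illustrative:

```python
def plan_reviews(files, max_lines=200):
    """Group files into review sessions of at most max_lines lines each.

    `files` maps file name -> line count. Greedy first-fit in dict order;
    a file larger than max_lines gets a session of its own.
    """
    sessions, current, used = [], [], 0
    for name, line_count in files.items():
        if current and used + line_count > max_lines:
            sessions.append(current)
            current, used = [], 0
        current.append(name)
        used += line_count
    if current:
        sessions.append(current)
    return sessions

print(plan_reviews({"alu.v": 120, "fifo.v": 90, "top.v": 60}))
```

A smarter planner might group related modules together, but even this naive split keeps each session within the effective-review budget.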
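The note template can be as simple as a small record type whose fields tie back to the bug tracker and the version management system. The field names and sample values below are illustrative, not a standard:

```python
from dataclasses import dataclass, field

@dataclass
class ReviewNote:
    """Minimal review record: what was reviewed, who attended, what was found."""
    reviewed: str                                   # files plus version/tag
    attendees: list
    findings: list = field(default_factory=list)    # (description, ticket id)
    resolutions: list = field(default_factory=list)

note = ReviewNote(reviewed="fifo.v @ v1.3", attendees=["ana", "raj"])
note.findings.append(("pointer wrap-around off by one", "BUG-214"))
print(note)
```

Each finding carries a ticket identifier, so the note links directly into the bug tracking system described above.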


Implement a few of the techniques presented before the next design review and determine what works. Then, explore ways to automate the process. There are many tools out there that help remove the drudgery of preparing for and performing design reviews. These tools have the interesting side effect of improving the overall design process.
