
Deliver Products On-Time with RTL Hardware Debug

Crunch time on projects always seems to come during lab debug. That’s when the FPGA, software and PCB all come together for the first time. It’s also the last, and frequently the most difficult, phase of the project. Any slack time in the schedule has long since been eaten up by unanticipated delays of one sort or another. The entire team has to work together on the same thing and in the same place, possibly for the first time.

Many developers put off thinking seriously about the latter stages of the project and what tools they might need once they get there. There’s so much to do initially in specifying the design, partitioning it and keeping all the parallel efforts on track and in sync that there is little time left to consider what you’ll do when you get to the lab.

But when you’re budgeting the project, it’s important to consider what tools you will need on the back end as well as the front to ensure success. Simulation is fine for logic verification in a test bench environment. But verification in an actual system running at speed is another matter. Most projects require hardware verification in the lab with the system software operating on an embedded processor and interacting with other logic on the FPGA. You use hardware verification tools to debug that system.

Hardware Verification

For lab debug you will need to gather information about what’s going on inside your FPGA and relate it back to the source RTL to implement corrections. If, for example, you do not see the expected outputs from a state machine, you need to know what state you’re in and which inputs are failing to move you through the states. When you detect a bug, you have to iterate the design swiftly so that the team does not lose focus waiting.
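To make the state-machine scenario concrete, here is a minimal sketch in Python (not RTL) of a hypothetical three-state handshake FSM; the state names, inputs and trace are all invented for illustration. Replaying a captured trace against the next-state logic shows how knowing the current state and inputs pinpoints the failing transition.

```python
# Hypothetical 3-state handshake FSM, modeled in Python purely to
# illustrate the debug question: "which state am I stuck in, and
# which input is failing to advance me?" All names are illustrative.

IDLE, WAIT_ACK, DONE = "IDLE", "WAIT_ACK", "DONE"

def next_state(state, req, ack):
    """Combinational next-state logic of the hypothetical FSM."""
    if state == IDLE and req:
        return WAIT_ACK
    if state == WAIT_ACK and ack:
        return DONE
    if state == DONE:
        return IDLE
    return state  # hold current state

# Replay a trace captured in the lab: one (req, ack) pair per clock.
trace = [(1, 0), (0, 0), (0, 0), (0, 0)]  # ack never asserts
state = IDLE
history = []
for req, ack in trace:
    state = next_state(state, req, ack)
    history.append(state)

# The probe history shows the machine entering WAIT_ACK and never
# leaving it -- pointing at the missing 'ack' input as the culprit.
print(history)  # ['WAIT_ACK', 'WAIT_ACK', 'WAIT_ACK', 'WAIT_ACK']
```

The same reasoning applies in hardware: probing the state register alone tells you where you are stuck, but probing the inputs as well tells you why.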

Probing Logic

You gather that information by probing the logic and storing the results. To probe you need a way to quickly isolate logic and attach sensors and triggers.

One probing solution is to use a logic analyzer to monitor internal signals. You do that by bringing the signals to pins on the device that are connected to PCB headers, and the logic analyzer pods plug into those headers. Typically you then select one of the signals as a clock and one as a trigger to begin storage. One set of vectors is stored each clock.

This method does provide a window into the device in operation without any cost in logic resources. In addition, the analyzer may be familiar to the design team, and its use avoids the delay of learning a new tool. At the same time it has a number of shortcomings. One big issue is that you have to connect signals in your design to the device pins manually. Any node you want to view that is not already at the top level of the design must be routed up to the top level by editing the design, and that edit must be repeated for each debug iteration.

The probing capacity of the analyzers is limited by the number of free pins available on the device and the number of pins placed on the board. The names of the signals have to be entered into the logic analyzer viewer in order to track which node in the design is displayed on which line. The entire process has to be performed every time the probes are moved. Routing nodes in the design to the pins may interfere with device operation or timing.

A few programmable logic vendors offer tools for hardware debugging using the programming channel to access device operation information. These tools use the programming port on the device to connect to the internal nodes so that external pins are not required.

Connections are made using a two-tiered component. One component connects to nodes in the design and transfers the results to a second component that forwards them to the JTAG port. The tools support more sampling channels than would be practical using an analyzer and can sample at varying ranges around a trigger event.

The vendor tools offer some advantages over logic analyzers in that they do not require external package pins. The tools also support multiple clock domain sampling and more levels of triggering than a logic analyzer. Unlike logic analyzers, the vendor tools use logic resources, however few, that are then unavailable for use in the application.

The vendor tools share some shortcomings with analyzers such as manual text editing to insert probes, limited results reporting and lengthy iteration cycles.

Another solution is the Identify RTL debugger from Synplicity. This product is made up of two tools – an Instrumentor and a Debugger. The Instrumentor uses a component system similar to those offered by the device vendors. But instead of editing your source files to add nets to connect pins or add probing components to connect nodes, you use the Instrumentor to display the design hierarchy in one window and select the module you wish to view, as shown in Figure 1. When you complete the instrumentation, the logic is added automatically. Like the vendor tools, adding probes requires the use of some logic resources, but the overhead is minimal.

Figure 1. Navigate the Design Hierarchy

Once you see the section you wish to probe you select those nets in the design for sampling, triggering or both. Values can be logic levels or enumerated states. All signals are displayed with eyeglass icons and the lenses show the mode you select for the signals.

The hierarchical design display lets you find lines of code fast because it exactly reflects the structure you used to create them. When you navigate to a module or architecture, you’ll see the branching statements displayed along with their line numbers.

All the code branch statements such as IF and ELSE are marked as potential breakpoints, and you activate them merely by clicking on the circle. Examples of probes and breakpoints are shown in Figure 2.

Figure 2: Instrumentor Sets Breakpoints and Triggers

After you have instrumented the design you then compile it to add the probes. The tool makes a copy of the design with all the probes and JTAG port communicator included. The additional logic is implemented using logic resources and consumes only a small percentage of even small devices. Their inclusion has no effect on design timing.

The tool supports multiple instrumentations of a single design and you can switch between them simply by clicking a tab and compiling. That feature allows you to leverage the same resources over several sets of probes. Different engineers can use the tool to instrument the same version of the design without interfering with each other.

Regardless of the tool you use the next steps after inserting probes are to synthesize the design, route it on your device and then program the chip in the lab.

Debugging the Design

When you debug you are operating your design in a laboratory and monitoring the behavior of internal nodes. You will be looking for logic transitions at different points in time and storing a series of events for display. Storage begins on a triggering event. The debug process involves iterations of the connection and implementation flow.

When debugging with a logic analyzer you set one signal value as a trigger. When the analyzer sees the value, it captures data in a buffer and displays it on its screen. Logic analyzers do not easily support complex triggering on multiple events, although such mechanisms could be designed by the user.
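The trigger-then-capture behavior described above, including sampling at a range around the trigger event, can be sketched in a few lines of Python. This is a hypothetical software model of what the capture buffer does, not any vendor's implementation; the function name and window sizes are invented.

```python
from collections import deque

def capture(samples, trigger, pre=2, post=3):
    """Sketch of trigger-centred capture: keep a rolling window of
    `pre` samples, and once `trigger` matches, record `post` more.
    A hypothetical model of an on-chip capture buffer."""
    window = deque(maxlen=pre)   # pre-trigger history
    captured = None
    remaining = 0
    for s in samples:
        if captured is not None:
            if remaining:        # still filling post-trigger samples
                captured.append(s)
                remaining -= 1
            continue
        if trigger(s):
            captured = list(window) + [s]
            remaining = post
        else:
            window.append(s)
    return captured              # None if the trigger never fired

# Trigger when the sampled value equals 0xA5.
data = [0, 1, 2, 0xA5, 7, 8, 9, 10]
print(capture(data, lambda s: s == 0xA5))  # [1, 2, 165, 7, 8, 9]
```

The captured window shows the samples just before and just after the trigger, which is what lets you see both the cause and the effect of an event.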

The vendor tools and Synplicity’s Identify tool perform all the triggering and storage of results internally. After you have programmed the device and it is operating in the system, the components sample and store behavior of nodes using logic and memory resources on the device.

The vendor tools include a viewer that runs on a PC, accesses the information over the programming port and displays it as signal waveforms.

The second tool in the Identify suite is the Debugger that offers complex and user-defined triggering for data capture to trap exactly those events that resolve design malfunctions. It also supports multiple sets of probes and the means to seamlessly toggle between sets on a single version of the design.

Controlling Debug

You use the Identify Debugger to control when data is collected and how it is displayed. The Debugger lets you design complex trigger mechanisms that look for a single event lasting one clock or multiple clocks, or for multiple instances of the event. The triggers can operate on RTL control-flow statements such as IF, WHEN, CASE and others. There is also an editor that lets you create state machines that control triggering on a series of events caused by a series of conditions.
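A sequence-based trigger state machine of the kind described above can be modeled in a few lines. This is an illustrative Python sketch, assuming a simple "match each condition in order" scheme; the function and behavior are invented for explanation, not taken from the Identify product.

```python
def make_sequence_trigger(conditions):
    """Hypothetical model of a triggering state machine: fire only
    after each condition in `conditions` has matched, in order."""
    state = 0
    def step(sample):
        nonlocal state
        if state < len(conditions) and conditions[state](sample):
            state += 1               # advance on a matching sample
        return state == len(conditions)  # True once sequence completes
    return step

# Arm on: first see the value 1, then later see the value 3.
trig = make_sequence_trigger([lambda s: s == 1, lambda s: s == 3])
fired = [trig(s) for s in [0, 1, 2, 3, 4]]
print(fired)  # [False, False, False, True, True]
```

The point of a sequential trigger is exactly this: a simple single-value trigger would fire on the first occurrence of 3, but the sequence trigger waits until the events arrive in the order that characterizes the bug.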

Reporting Results

Debugging requires a means of displaying the results. You analyze the results to understand behavior and correct faults. The probed values are stored in a buffer and displayed on a waveform viewer. You want to be able to use the results to pinpoint bugs in your design and correct them.

Logic analyzers contain their own displays, and one benefit of using the analyzer to show FPGA signals is that it can display PCB signals alongside those from the FPGA. The analyzer’s screen, however, is smaller than a typical PC monitor, and the instrument itself is a cumbersome object in the lab.

The vendor tools have their own waveform viewers that run on a PC. Waveform signals can be labeled and the text displayed between iterations. Waveform viewing shows signal timing relationships, but does not directly relate values back to the code. You still have to relate the observed signal behavior back to the source code to understand what caused the transitions. This is a time-consuming and error-prone task without an RTL debugger like the Identify product.

Like the vendor tools, the Identify Debugger supports waveform viewing, but it can also annotate the logic values directly back into the source code. The values may be binary or enumerated data types. You can step backward and forward in time to view the results over a series of clocks and watch the code being updated each cycle.

You use the icons, shown in Figure 3, and menus to set the various trigger options, start and stop debugging and view the instrumentation. You use the cycle tabs to move back and forth through the results on individual clocks.

Figure 3: Debugger Icons

The tool includes a waveform viewer and supports exports to other viewers through standard VCD format.
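VCD (Value Change Dump) is the standard text format defined in the Verilog standard for recording signal transitions. As a rough illustration of why it works well for export, here is a minimal Python sketch that writes a single 1-bit signal in VCD form; real tools emit far richer files (scopes, multiple variables, wider vectors), and the file name and signal name here are invented.

```python
def write_vcd(path, name, values, timescale="1 ns"):
    """Minimal sketch of the VCD format for one 1-bit signal.
    '!' is the identifier code assigned to the signal."""
    with open(path, "w") as f:
        f.write(f"$timescale {timescale} $end\n")
        f.write(f"$var wire 1 ! {name} $end\n")
        f.write("$enddefinitions $end\n")
        prev = None
        for t, v in enumerate(values):
            if v != prev:              # VCD records only value changes
                f.write(f"#{t}\n{v}!\n")
                prev = v

# One sample per clock; only the transitions at t=0, t=2, t=4 are written.
write_vcd("probe.vcd", "state_bit", [0, 0, 1, 1, 0])
```

Because only changes are recorded, VCD files stay compact even for long captures of slowly changing signals, which is one reason it became the common interchange format between capture tools and waveform viewers.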

Iterating the Design

Logic analyzers are completely outside the tool flow of design iterations. Iterations are performed by editing the source code to move probes, then re-synthesizing and re-routing the entire design.

Vendor tools also require you to re-synthesize unless you do post-synthesis insertion of the probe cores. In that case, however, you must have synthesized with an option to retain all net names, and you must also re-specify the probe connections with every iteration because they are not retained through synthesis.

The Identify tool allows you to perform very fast iterations by directly calling the vendor place and route in incremental mode so that only those connections required to move the probes are routed and the remainder of the design is frozen. That allows debug compilations to complete in a fraction of the time required for the entire design. That means the team can see the results of changes right away.

Summary

Hardware verification is an essential phase of the development cycle of systems using FPGA devices. Several methods are available. Synplicity’s Identify RTL Debugger tool supports most popular families of devices and uses their programming cables to transfer commands and data. This allows you to use a single software environment for debugging devices from different FPGA vendors.

The Identify product is the only tool that offers you a complete solution to hardware debug and helps you to quickly find out what you need to complete your design and move it out of the lab.
