
Debug Doppelgänger

InPA Aims to Simplify FPGA-based Prototypes

FPGA-based prototyping is probably the most popular and effective method for debugging and verifying complex designs at a reasonable speed.  FPGAs can emulate your hardware design – usually at very near actual operating frequencies.  As a result, you can run tests that you could not even contemplate with methods like simulation – including gathering real-world stimulus from sensors, cameras, or other high-bandwidth hardware. 

None of this is new.  Unfortunately, another thing that isn’t new is that, despite a decade or two of refinement, FPGA-based prototypes are still unbelievably difficult and complex to get working in the first place, and they can be exceedingly tricky to use and understand properly.  If you start with a nice, clean, HDL-based design that seems to work well (albeit slowly) in your favorite simulator, you might expect it to be a semi-trivial matter to get that same design up and running on that nice new FPGA prototyping board (or box) that you bought (or built). 

Wrong-O

First, as we all know from years of working with FPGAs, plain-old HDL doesn’t quite cut the mustard.  Depending on the FPGA you’re using, you’ll need to make some modifications to account for things like limited clock resources, register-to-logic ratios, logic depth between registers, device-specific IP blocks, quirky FPGA memory architectures, and several others.  You’ll find out which ones apply to you by the way your design doesn’t work – or doesn’t even complete synthesis and place-and-route successfully.  After a few overnight iterations and a number of bouts with error messages ranging from slightly helpful to arcane, you’ll have your design optimized to the point where it fails place-and-route for only one reason:  Not enough resources. 
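To make that concrete, here is the sort of rewrite those iterations tend to force.  The sketch below is purely illustrative (the module and signal names are invented): an ASIC-style gated clock is turned into a clock enable so that it maps onto the FPGA's global clock network instead of creating a new, skew-prone clock.

// ASIC-style gated clock: the AND gate becomes a new, skew-prone clock in an FPGA.
//   assign gated_clk = clk & enable;
//   always @(posedge gated_clk) q <= d;

// FPGA-friendly rewrite: one global clock, with the enable moved into the datapath.
module ce_register (
  input  wire       clk,
  input  wire       enable,
  input  wire [7:0] d,
  output reg  [7:0] q
);
  always @(posedge clk)
    if (enable)
      q <= d;   // same behavior, but now uses the flip-flop's dedicated clock-enable pin
endmodule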

Now, you need to split your nice, monolithic design up among a few FPGAs.  There are two ways to go about this – manually (using your intimate knowledge of your design and your finely-tuned sense of engineering intuition) or with one of the many automatic partitioning tools available.  The advantage of doing it manually is that you can spend a few days to a few weeks of your valuable time changing your design in a way that makes it more complicated to understand and debug – probably introducing a few new errors along the way. 

The advantage of using an automatic tool is that you can watch your design magically transform from a nicely-organized structure that you know and love to an incomprehensible mish-mash of badly auto-generated code with a structure that resembles a molecular diagram of orangutan DNA. 

Whichever one you choose, you’re practically guaranteed weeks of fun for the whole family.  Amaze your engineering friends as you struggle to figure out the new timing for your design, now that signals that were never on the critical path blow timing as they run through new and exciting pairs of IO buffers and across the prototyping board.  Yesterday’s timing problems are a distant memory as you struggle with timing and logic issues you never thought would be part of your real, final design, with the comforting knowledge that they still won’t. 

Once your design seems to work properly on the FPGA board, you can seem to test it.  While seeming to test it, you may notice things that don’t appear to be working correctly.  Don’t panic – this is why you built the prototype in the first place.  You now have several problems, however.  First, you don’t really know if the bug you just found is in your real design, or if it’s just a problem with the prototype.  This is easy to resolve.  Just spend a few days to a few weeks on diagnosis and the answer will become clear. 

Second, the problem you’ve encountered may not be debuggable with your prototype in its current state.  Before you build your prototype design, you’ll want to decide what signals to make visible for debug.  This is an important step, as it guarantees that any bug you find will not involve these signals.  Instead, your first bug will probably involve signals that you didn’t make visible in the first place.  This, too, is easy to solve.  Simply go back and re-build your design several times until the signals you want are visible.  A couple weeks of random trial-and-error will usually do the trick.
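For what it’s worth, nominating signals for visibility typically looks something like the sketch below.  This assumes a Xilinx Vivado flow, where the mark_debug attribute keeps a net alive through synthesis so it can later be wired to an integrated logic analyzer; the module itself is a made-up example.  Anything you don’t tag is fair game for the optimizer, which is exactly why your first bug tends to involve the untagged signals.

module fifo_ctrl (
  input  wire clk,
  input  wire push,
  input  wire pop,
  output wire full
);
  (* mark_debug = "true" *) reg [3:0] count;   // tagged: preserved and visible for debug
  reg [3:0] last_count;                        // untagged: may be optimized away entirely

  always @(posedge clk) begin
    case ({push, pop})
      2'b10:   count <= count + 1'b1;
      2'b01:   count <= count - 1'b1;
      default: count <= count;
    endcase
    last_count <= count;
  end

  assign full = (count == 4'd15);
endmodule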

Now that you have isolated the badly behaving part of your design, you may need to figure out whether the problem is in hardware or software.  You’re on your own here, bucko.  Good luck!

As you can see, FPGA-based prototyping is both easy and fun!

Startup InPA is working to make this whole process easier.  InPA is made up of some folks with a whole bunch of experience in FPGA-based prototyping, and they know what they’re talking about.  So far, they aren’t telling us much, however.  Their technology, dubbed “Active Debug”, is designed to give “full visibility” into your FPGA-based prototype, dramatically reducing the number of iterations of synthesis and place-and-route required to get the signals you want to be visible.  The company claims that Active Debug will allow you to capture complex fault vectors and easily gain full visibility to the signals involved in the fault.  If this can get us out of the “guess, pray, synthesize, place-and-route, analyze, repeat” cycle that we currently enjoy when trying to locate a fault and get visibility into the signals involved, we’d be happy enough to pay good money.  If the company has a few more tricks up its sleeve as a bonus, that would be great too.

On the surface, InPA seems to be aimed at a similar spot to the “TotalRecall” technology announced by Synplicity (now part of Synopsys) several years ago.  That technology is now part of the “IdentifyPro” software from Synopsys.  In that solution, two copies of the design run simultaneously in separate FPGAs, with one copy offset in time behind the other.  When the leading copy hits a fault, exception, or assertion, the system halts, allowing the trailing copy of the design to single-step through the fault scenario. 
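For the curious, the general idea can be pictured with a rough sketch like the one below.  To be clear, this is our own back-of-the-envelope illustration of a time-offset replica, not Synopsys’s implementation and certainly not whatever InPA has cooked up; the dut module, its ports, and the delay mechanism are all hypothetical.  The same stimulus feeds a leading copy immediately and a trailing copy DELAY cycles later, and when the leading copy flags a fault, both copies freeze and the trailing copy can be single-stepped up to the failure.

// Conceptual sketch only.  DELAY is assumed to be a power of two so the write
// pointer wraps naturally, and "dut" is a hypothetical design-under-test with
// a clock enable and a fault/assertion flag.
module replica_debug #(parameter WIDTH = 16, DELAY = 256) (
  input  wire             clk,
  input  wire [WIDTH-1:0] stimulus,
  input  wire             step           // manual single-step control after the halt
);
  reg  [WIDTH-1:0]         delay_line [0:DELAY-1];                 // circular buffer of live stimulus
  reg  [$clog2(DELAY)-1:0] wr_ptr = 0;
  wire [WIDTH-1:0]         delayed_stimulus = delay_line[wr_ptr];  // entry written DELAY cycles ago
  wire                     lead_fault;
  reg                      halted = 1'b0;

  // Leading copy runs on live stimulus; trailing copy sees the same stream DELAY cycles later.
  dut lead_copy  (.clk(clk), .en(~halted),        .in(stimulus),         .fault(lead_fault));
  dut trail_copy (.clk(clk), .en(~halted | step), .in(delayed_stimulus), .fault());

  always @(posedge clk) begin
    if (!halted) begin
      delay_line[wr_ptr] <= stimulus;
      wr_ptr             <= wr_ptr + 1'b1;    // wraps, so reads lag writes by DELAY entries
      if (lead_fault)
        halted <= 1'b1;                       // freeze free-running operation at the fault
    end
  end
endmodule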

InPA has apparently devised a strategy for achieving similar results with a different approach, but we’ll have to wait for the actual product announcement to learn more.

InPA claims that they are partnering with third-party EDA and prototyping board vendors, and that their product will be compatible with a number of commercially-available prototyping boards.  Their first product is scheduled to be released in Q4 of this year.
