
Verdi: Not Just for Debug Anymore

There’s a bit of repositioning going on in EDA-land. It involves Synopsys’s popular Verdi tool, acquired through the SpringSoft purchase. Conceived as a flexible debug tool, it also has an open scripting environment that gives engineers access to the data in the fast signal database (FSDB) file. With that capability, folks have been bolting ad hoc analysis utilities onto Verdi for a while.

This hasn’t gone unnoticed at Synopsys, and they’re now in the process of repositioning Verdi: it’s not just for debug anymore. While it obviously still includes debug in its expanded portfolio, Synopsys is adding features that don’t necessarily fit the debug profile.

One of those features is Verdi Coverage, which is intended to help you build and track a verification plan that stays tightly synchronized with the design requirements. This concept might be familiar to any of you who have seen similar tools in the software space from companies like LDRA.

The assumption here is that verification tests spring from requirements. (If it’s not required, then why are you testing it?) And all requirements should be documented in a requirements document. Verdi Coverage lets you tie tests to requirements and tick off coverage at the requirements level.
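To make the idea concrete, here’s a minimal sketch of what a requirements-to-tests mapping with a requirements-level coverage roll-up might look like. This is purely illustrative, not Verdi’s data model or API; the class names, requirement IDs, and status labels are all hypothetical.

```python
# Hypothetical illustration of tying tests to requirements; not Verdi's actual API.
from dataclasses import dataclass, field

@dataclass
class Test:
    name: str
    passed: bool

@dataclass
class Requirement:
    req_id: str                                # e.g. "REQ-1.1" from the requirements doc
    text: str
    tests: list = field(default_factory=list)  # tests linked to this requirement

def requirements_coverage(reqs):
    """Roll individual test results up to the requirements level."""
    report = {}
    for r in reqs:
        if not r.tests:
            report[r.req_id] = "uncovered"     # nothing in the plan maps to this requirement
        elif all(t.passed for t in r.tests):
            report[r.req_id] = "covered"
        else:
            report[r.req_id] = "failing"
    return report

reqs = [
    Requirement("REQ-1.1", "FIFO shall assert full when count equals depth",
                [Test("fifo_full_test", True)]),
    Requirement("REQ-1.2", "FIFO shall flag overflow on write-when-full", []),
]
print(requirements_coverage(reqs))  # {'REQ-1.1': 'covered', 'REQ-1.2': 'uncovered'}
```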

Where this can be particularly helpful is when requirements change. Yeah, it’s a thing; it happens. Who knew. Verdi Coverage tracks the requirements documents and can detect when changes occur, so you can go in and modify the verification plan accordingly, if needed.

How do they do that? They rely on the requirements document being in PDF format; best practice is to give that document an outline structure. Verdi Coverage captures the text, along with some metadata about where in the document each piece of text is found.
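Verdi’s extraction machinery isn’t public, but the general approach is easy to picture. Here’s a rough sketch, assuming an outline-numbered document and using the open-source pypdf library as a stand-in; the heading regular expression and the numbering convention are assumptions on my part, not anything Synopsys has described.

```python
# Rough sketch: pull outline headings plus location metadata out of a requirements PDF.
# Uses the open-source pypdf library as a stand-in for Verdi's internal extraction.
import re
from pypdf import PdfReader

HEADING = re.compile(r"^\s*(\d+(?:\.\d+)*)\s+(.+)$")  # assumed outline style, e.g. "3.2.1 Title"

def extract_requirements(pdf_path):
    """Return (section number, heading text, page number) tuples from an outlined PDF."""
    reader = PdfReader(pdf_path)
    found = []
    for page_no, page in enumerate(reader.pages, start=1):
        for line in (page.extract_text() or "").splitlines():
            m = HEADING.match(line)
            if m:
                found.append((m.group(1), m.group(2).strip(), page_no))
    return found

# Each entry carries where the text was found, so later changes can be localized:
# for sec, title, page in extract_requirements("requirements.pdf"):
#     print(f"{sec}  {title}  (p.{page})")
```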

And when the document changes? How can they pinpoint the changes? Diff technology. Off the shelf, actually. Apparently the ability to diff two files has gotten pretty good these days. (It’s not as easy as you might think: as soon as one thing changes, everything after it can look different unless you can identify the type and scope of the change and then re-synchronize on unchanged text.) The important thing is this: there’s no special formatting required to make this work. Write a well-organized, well-structured document (one that a human can process easily) and Verdi Coverage will be able to handle it.
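That re-synchronizing behavior is exactly what general-purpose diff engines provide. As a small illustration, here’s a sketch using Python’s standard difflib rather than whatever diff engine Synopsys actually uses, with made-up requirement text; the matcher localizes one edited requirement and one added requirement instead of flagging everything after them as changed.

```python
# Minimal sketch of off-the-shelf diffing to localize requirement changes.
import difflib

old = ["3.1 FIFO shall assert full when count equals depth",
       "3.2 FIFO shall flag overflow on write-when-full"]
new = ["3.1 FIFO shall assert full when count equals depth",
       "3.2 FIFO shall flag overflow and raise an interrupt on write-when-full",
       "3.3 FIFO shall support configurable depth"]

# SequenceMatcher re-synchronizes on unchanged lines, so one edit
# doesn't make everything after it look different.
sm = difflib.SequenceMatcher(None, old, new)
for tag, i1, i2, j1, j2 in sm.get_opcodes():
    if tag != "equal":
        print(tag, "old:", old[i1:i2], "new:", new[j1:j2])
# replace old: ['3.2 FIFO shall flag overflow on write-when-full'] new: ['3.2 FIFO shall flag overflow and raise an interrupt on write-when-full']
# insert  old: [] new: ['3.3 FIFO shall support configurable depth']
```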

Far from being a debug thing, this becomes an up-front planning tool, creating a specific link between requirements and the elements of the verification plan. It applies across verification technologies (formal, simulation, etc.). As long as the requirements document is kept up to date, there’s no reason for the verification plan to get out of sync with it.

You can find out more in their release.
