
Combining Formal Verification With Simulation

You Can Have Your Cake and Eat it Too!

It is well known that formal verification offers exhaustive proofs of correctness and can help find deep corner-case bugs. Verification engineers have typically relegated its use to block-level designs, preferring simulation, the tried and tested workhorse, for chip-level and full-chip verification. The very strength of formal verification, namely its exhaustiveness, is also its Achilles' heel: as designs grow, formal engines run out of steam due to state-space explosion. Simulation, on the other hand, scales well with design size and does not suffer the same fate.

Why Simulation Alone Isn’t Enough

With ever-increasing design complexity and the desire to verify a design as exhaustively as possible, chip designers are increasingly interested in adopting formal verification methodologies. Despite being ubiquitous, simulation doesn't always provide the confidence that the mathematical proofs delivered by formal tools do. With simulation, the incremental effort required for each percentage point of coverage beyond the first 90 percent grows exponentially, extending the verification cycle and becoming the long pole in the tent for getting the product out on time. Formal verification promises exactly what verification engineers need – exhaustive coverage of those hard-to-reach points of the design. The question to ask, then, is "what is the catch – why not just combine the two technologies and get the best of both worlds?"

Barriers to Formal Verification Adoption

In a predominantly simulation-friendly world, expertise in handling formal verification tools is in relatively short supply. Formal tools have also had the reputation of belonging to the verification "nerd" domain, a myth EDA vendors have been working hard to dispel. Then there is the real problem: a user cannot give up simulation to adopt formal – it is not a replacement technology but a complementary one, with the ground rule that the user is unwilling to trade simulation away in exchange for using formal. Nor should it take as long to set up a formal environment as it does a simulation environment; the inability to reuse applicable parts of the simulation infrastructure is an all-important bottleneck to formal adoption. Another equally important but completely different barrier is the difficulty of sharing coverage information between formal and simulation. While both methodologies report coverage, they frequently don't follow the same reporting semantics, making the process of correlating coverage manual and tedious.
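To illustrate what correlating coverage involves, here is a minimal sketch of merging per-goal results from the two engines. The status names, merge policy, and data shapes are assumptions for illustration only, not any tool's coverage-database format:

```python
def merge_coverage(sim_hits, formal_status):
    """Combine simulation hit counts with formal results, per coverage goal.

    sim_hits:      dict goal -> hit count from simulation
    formal_status: dict goal -> "covered" | "unreachable" | "inconclusive"
                   (hypothetical status names)
    Returns dict goal -> merged status.
    """
    merged = {}
    for goal in set(sim_hits) | set(formal_status):
        if sim_hits.get(goal, 0) > 0:
            merged[goal] = "covered"      # a simulation trace reached it
        elif formal_status.get(goal) == "covered":
            merged[goal] = "covered"      # formal produced a witness
        elif formal_status.get(goal) == "unreachable":
            merged[goal] = "excluded"     # proven unreachable: drop from the denominator
        else:
            merged[goal] = "uncovered"
    return merged

def coverage_percent(merged):
    """Coverage over goals still in the denominator (excluded goals removed)."""
    active = [s for s in merged.values() if s != "excluded"]
    return 100.0 * sum(s == "covered" for s in active) / len(active)
```

Note the payoff of merging: goals the formal engine proves unreachable leave the denominator entirely, so neither engine wastes effort chasing them.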

Hybrid Approach

A hybrid solution offers the best of both worlds by combining the strengths of formal methodologies and simulation to deliver a powerful one-two verification punch. Formal tools may suffer from non-convergence and state-space explosion, while simulation tools require extensive testbench creation to stimulate bugs deep in the design state space, making bug hunting a hit-or-miss proposition. The combination of the two overcomes the limitations of each individual technology. However, a hybrid approach that merely uses testbench stimuli to seed the formal searches suffers from the same drawback as a pure simulation approach – it is still hit or miss, because the quality of the stimuli governs the formal searches.

Alternatively, a formal-assisted simulation technique can be immensely powerful. In contrast to the approach just mentioned, there is no dependence on testbench stimuli. The idea is to start a formal search and exhaustively explore a limited state space. Before running into state-space explosion, an intelligent switch to deep simulation, using the formal engine as a guide, forms the basis of a very powerful solution. Once the simulation has run for a while and explored parts of the design state space that would have exhausted the capacity of the formal engine, the approach switches to another localized formal search. By alternately using simulation to move deep into the design state space and deploying exhaustive formal proofs or searches for bugs, the hybrid solution holds the promise of delivering much better results than either engine alone can ever hope to achieve. The quality of the results depends to a large extent on the heuristics used to switch between simulation and formal: when to switch between the two, and how long to let each engine run before switching.
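The alternating loop just described can be sketched as follows. The engine callbacks, the round limit, and the way state is modeled are hypothetical stand-ins, not any real tool's API; the point is only the control flow of switching between a bounded formal search and guided simulation:

```python
def hybrid_search(initial_state, goals, bounded_formal, guided_sim, max_rounds=10):
    """Alternate localized formal search with deep guided simulation.

    bounded_formal(state, remaining) -> set of goals covered within a
        limited radius of `state` (exhaustive, but depth-bounded).
    guided_sim(state, remaining) -> a new, deeper state to search from,
        chosen using the formal engine's guidance.
    Both callbacks are placeholders for real engines.
    """
    state = initial_state
    covered = set()
    for _ in range(max_rounds):
        # Exhaustively search a limited neighborhood before
        # state-space explosion sets in.
        covered |= bounded_formal(state, goals - covered)
        if covered == goals:
            break
        # Let simulation drive deep into the state space, seeding
        # the next localized formal search from where it lands.
        state = guided_sim(state, goals - covered)
    return covered
```

A toy usage, with states and goals modeled as integers: a "formal engine" that covers goals within distance 2 of the current state, and a "simulator" that jumps to the nearest uncovered goal, together cover goals a single bounded search never could.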

A measure of the effectiveness of this approach can be gleaned from the following example. Consider a block of a graphics processing engine with about 300,000 gates and 15,000 sequential elements in the design. About 4,000 coverage goals were specified for this design. After a three-day run time, 58-percent coverage was achieved using a pure constrained random simulation approach. The hybrid-formal simulation approach increased coverage to 92 percent for the same design in the same amount of time. This represents a 59 percent increase in coverage as compared to the simulation only approach. In some cases, coverage improvements as great as 100 percent have been achieved using a hybrid-formal simulation approach.
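The relative improvement quoted above can be checked with simple arithmetic (the two coverage figures come from the example in this article):

```python
# Coverage after a three-day run: simulation-only vs. hybrid.
sim_cov, hybrid_cov = 58.0, 92.0

# Relative gain over the simulation-only baseline, in percent.
relative_gain = (hybrid_cov - sim_cov) / sim_cov * 100
print(round(relative_gain))  # prints 59
```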

Challenges in Combining Formal Methodologies With Simulation

The hybrid approach is not without challenges. As with a purely formal approach, convergence can be an issue, and the switching heuristics must be constantly monitored and improved to get the best of both worlds. A second challenge is integrating the simulation infrastructure with that of the formal engine. Every applicable part of the testbench must be reused; leveraging the existing testbench infrastructure is a guaranteed way to overcome objections from hardcore simulation experts who don't want to relinquish any of the benefits of their tried and tested approach. Finally, an integrated view of coverage information is required to assess the effectiveness of the verification runs and to gauge whether the desired coverage targets have been achieved.

Conclusion

A hybrid approach combining the best of formal and simulation technologies leads to more complete verification. If architected properly, such a solution can lead to broader adoption of formal technologies even with die-hard simulation engineers. The hybrid solution can extend the range of a formal tool and increase the chances of finding bugs deep in the design. Leveraging components of the existing testbench infrastructure and merging coverage information with a simulation environment can help usher in the benefits of formal technology in a predominantly simulation-friendly environment. With the hybrid approach, it is indeed possible to have the simulation cake and eat the formal cake too!

About the author: Krishna Balachandran is currently director of product marketing for low power verification products at Synopsys. He holds a master’s degree in computer engineering from the University of Louisiana and completed the executive MBA certification program from Stanford University.
