
Combining Formal Verification With Simulation

You Can Have Your Cake and Eat It Too!

It is well known that formal verification offers exhaustive proofs of correctness and can help find deep corner-case bugs. Verification engineers have typically relegated its use to block-level designs, preferring simulation, the tried-and-tested workhorse, for full-chip verification. The very strength of formal verification, namely its exhaustiveness, is also its Achilles’ heel: as designs grow, formal engines run out of steam due to state-space explosion. Simulation, on the other hand, scales well with design size and doesn’t suffer the same fate.

Why Simulation Alone Isn’t Enough

With ever-increasing design complexity and the desire to verify a design as exhaustively as possible, chip designers are increasingly interested in adopting formal verification methodologies. Despite being ubiquitous, simulation doesn’t always provide the measure of confidence that the mathematical proofs delivered by formal tools do. With simulation, the incremental effort required to achieve each percentage point of coverage beyond the first 90 percent grows steeply, extending the verification cycle and becoming the long pole in the tent for getting the product out on time. Formal verification promises to deliver what verification engineers need: exhaustive coverage of those hard-to-reach points of the design. The question to ask, then, is “what is the catch – why not just combine the two technologies to get the best of both worlds?”

Barriers to Formal Verification Adoption

In a predominantly simulation-friendly world, expertise in formal verification tools is in relatively short supply. Formal tools have also had the reputation of belonging to the verification “nerd” domain, a myth EDA vendors have been working hard to dispel. Then there is the real problem: formal is not a replacement technology. It is a complementary technology, and the ground rule is that users are not willing to give up simulation in exchange for using formal. Nor should setting up a formal environment take as long as setting up a simulation environment; the inability to reuse applicable parts of the simulation infrastructure is an all-important bottleneck to formal adoption. Another equally important but completely different barrier is the difficulty of sharing coverage information between formal and simulation. While both methodologies report coverage, they frequently don’t follow the same reporting semantics, making the task of correlating coverage manual and tedious.

Hybrid Approach

A hybrid solution offers the best of both worlds by combining the strengths of formal methodologies and simulation into a powerful one-two verification punch. Formal tools may suffer from non-convergence and state-space explosion, while simulation requires extensive testbench creation to stimulate bugs deep in the design state space, making it a hit-or-miss proposition. Combining the two overcomes the limitations of each individual technology. However, a hybrid approach that merely uses testbench stimuli to seed the formal searches suffers from the same drawback as a pure simulation approach – it is still hit or miss, because the formal searches are governed by the quality of the stimuli.

Alternatively, a formal-assisted simulation technique can be immensely powerful. In contrast to the approach just mentioned, there is no dependence on testbench stimuli. The idea is to start with a formal engine and exhaustively search a limited state space. Before state-space explosion sets in, an intelligent switch to deep simulation, with the formal engine acting as a guide, forms the basis of a very powerful solution. Once the simulation has run for a while and explored parts of the design state space that would have exhausted the capacity of the formal engine, the approach switches to another localized formal search. By alternately using simulation to move deep into the design state space and deploying exhaustive formal proofs or searches for bugs, the hybrid solution holds the promise of delivering much better results than either engine alone can ever hope to achieve. The quality of the results depends to a large extent on the heuristics used to switch between simulation and formal: the important decisions are when to switch between the two and how long to let each engine run before switching.
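The alternating loop described above can be sketched in a few lines of Python. Everything here is illustrative: run_formal, run_sim, and the depth and cycle thresholds are hypothetical adapters and knobs, not any particular vendor’s API.

```python
import time

def hybrid_search(run_formal, run_sim, reset_state, goals,
                  budget_s=72 * 3600, formal_depth=30, sim_cycles=10_000):
    """Alternate bounded formal search with formally guided deep simulation.

    run_formal(state, targets, depth)      -> (reached_goals, hints)
    run_sim(state, targets, cycles, hints) -> (reached_goals, final_state)
    Both are user-supplied adapters to a formal engine and a simulator.
    """
    deadline = time.monotonic() + budget_s
    state, covered = reset_state, set()

    while (goals - covered) and time.monotonic() < deadline:
        # Phase 1: exhaustive but shallow -- a bounded formal search from the
        # current state, cut off before state-space explosion sets in.
        reached, hints = run_formal(state, goals - covered, formal_depth)
        covered |= reached

        # Phase 2: deep but sampled -- simulation pushes far into the design
        # state space, steered by hints (e.g., partial traces) from the formal run.
        reached, state = run_sim(state, goals - covered, sim_cycles, hints)
        covered |= reached
        # The next localized formal search resumes from wherever simulation landed.

    return covered
```

In this sketch, formal_depth, sim_cycles, and the overall time budget are exactly the switching heuristics the paragraph above warns about; tuning them largely determines how much of the state space the combination actually reaches.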

A measure of the effectiveness of this approach can be gleaned from the following example. Consider a block of a graphics processing engine with about 300,000 gates and 15,000 sequential elements. About 4,000 coverage goals were specified for this design. After a three-day run, a pure constrained-random simulation approach achieved 58 percent coverage. The hybrid formal-simulation approach increased coverage to 92 percent for the same design in the same amount of time – a 34-point gain, or a 59 percent increase relative to the simulation-only result. In some cases, coverage improvements as great as 100 percent have been achieved using the hybrid formal-simulation approach.
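For readers who want to check the quoted figure, the 59 percent is relative to the simulation-only baseline rather than an absolute gain in coverage points:

```python
# Relative improvement over the simulation-only baseline (numbers from the
# example above): a 34-point gain works out to roughly 59 percent.
sim_only, hybrid = 0.58, 0.92
print(f"{(hybrid - sim_only) / sim_only:.0%}")   # -> 59%
```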

Challenges in Combining Formal Methodologies With Simulation

The hybrid approach is not without challenges. As with a purely formal approach, convergence can be an issue, and the switching heuristics must be constantly monitored and improved to get the best of both worlds. A second challenge is integrating the simulation infrastructure with that of the formal engine. Every applicable part of the testbench that can be reused must be reused; leveraging the existing testbench infrastructure is a guaranteed way to overcome objections from hardcore simulation experts who don’t want to relinquish any of the benefits of their tried-and-tested approach. Finally, an integrated view of coverage information is required to assess the effectiveness of the verification runs. Monitoring this information is essential to gauge whether the desired coverage targets have been achieved.
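As a rough illustration of what an integrated coverage view could look like, the sketch below merges per-goal results from the two engines into a single status. The status names, and the assumption that both tools can export results keyed by the same hierarchical goal name, are simplifications for illustration rather than a description of any shipping coverage database.

```python
def merge_coverage(sim_hits, formal_results):
    """Combine coverage from both engines into one report.

    sim_hits:        set of goal names hit during simulation.
    formal_results:  dict mapping goal name -> 'covered' | 'unreachable' | 'inconclusive'.
    Returns a unified status per goal, so one report answers "is this goal closed?"
    """
    merged = {}
    for goal in sim_hits | set(formal_results):
        if goal in sim_hits:
            merged[goal] = "covered_by_simulation"
        elif formal_results.get(goal) == "covered":
            merged[goal] = "covered_by_formal"
        elif formal_results.get(goal) == "unreachable":
            merged[goal] = "excluded_unreachable"   # proven dead, so drop from the target list
        else:
            merged[goal] = "open"                   # still needs work by either engine
    return merged
```

One practical payoff of such a merge is that goals formal proves unreachable can be excluded from the target list rather than chased endlessly in simulation.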

Conclusion

A hybrid approach combining the best of formal and simulation technologies leads to more complete verification. If architected properly, such a solution can lead to broader adoption of formal technologies, even among die-hard simulation engineers. The hybrid solution can extend the reach of a formal tool and increase the chances of finding bugs deep in the design. Leveraging components of the existing testbench infrastructure and merging coverage information with the simulation environment can help usher in the benefits of formal technology in a predominantly simulation-friendly world. With the hybrid approach, it is indeed possible to have the simulation cake and eat the formal cake too!

About the author: Krishna Balachandran is currently director of product marketing for low power verification products at Synopsys. He holds a master’s degree in computer engineering from the University of Louisiana and completed the executive MBA certification program at Stanford University.

 
