
Mentor Unifies Verification

Seems like verification unification is in the air. We saw it recently with Synopsys, and now we have a move from Mentor.

While Synopsys’ version looked like an effort to unify acquired technology, Mentor’s efforts seem more internal. The big picture involves the unification of simulation, formal, emulation, and virtual prototyping under one umbrella, one interface. In that scheme, Mentor presents each of the technologies as an engine serving the higher-level verification goal; no longer is each one of these things a separate tool.

But a big part of what’s happening here is about conjoining emulation and simulation more seamlessly – trying to unify them to a greater extent. We actually looked at Cadence’s version of this some time back, when the verification environment was even more fractured and confusing. The high-level goal is to make the distinction between simulation and emulation effectively transparent: you shouldn’t have to care which engine is doing the work.

In concept, emulation should just be a faster simulation engine, and you should be able to push the pieces of your design around among the simulator, the emulator, and the virtual prototype based on what needs to be verified in the greatest detail and how many cycles are required either to run the tests themselves or to support the tests (like getting past the boot-up sequence quickly).
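
To make that partitioning idea concrete, here’s a rough sketch in plain Python. This is not any actual Mentor API; the classes, names, and thresholds are all invented purely for illustration of how one might decide which engine gets which block:

```python
# Hypothetical illustration only: these classes and names are invented for this
# sketch and are not Mentor's actual interface.

from dataclasses import dataclass
from enum import Enum, auto


class Engine(Enum):
    SIMULATOR = auto()          # full signal-level detail, slow
    EMULATOR = auto()           # synthesizable blocks, orders of magnitude faster
    VIRTUAL_PROTOTYPE = auto()  # fastest; good for software bring-up, least detail


@dataclass
class Block:
    name: str
    needs_full_detail: bool   # must we observe every signal transition?
    cycles_required: int      # how long until the interesting behavior shows up


def assign_engine(block: Block, detail_cycle_budget: int = 1_000_000) -> Engine:
    """Pick an engine per block: detail goes to the simulator, long-running but
    synthesizable work goes to the emulator, and everything else (like booting
    an OS) goes to the virtual prototype."""
    if block.needs_full_detail and block.cycles_required <= detail_cycle_budget:
        return Engine.SIMULATOR
    if block.needs_full_detail:
        return Engine.EMULATOR  # too many cycles to simulate; trade visibility for speed
    return Engine.VIRTUAL_PROTOTYPE


design = [
    Block("ddr_controller", needs_full_detail=True, cycles_required=500_000),
    Block("gpu_cluster", needs_full_detail=True, cycles_required=2_000_000_000),
    Block("boot_rom_plus_cpu", needs_full_detail=False, cycles_required=10_000_000_000),
]

for b in design:
    print(f"{b.name:20s} -> {assign_engine(b).name}")
```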

In practice, of course, emulators are hardware, and so only so much of the testbench can move from a virtual environment into real hardware – the so-called synthesizable subset. That requires care on the part of verification engineers to support a flexible testbench, and it’s also the point of new verification IP (VIP) that Mentor has included as a part of this announcement – VIP that transitions more easily between simulation and emulation.
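
The usual way to keep a testbench portable is to keep stimulus and checking at the transaction level and hide the engine behind a thin transactor layer, so that only the transactor’s hardware half has to fit the synthesizable subset. Here’s a loose sketch of that split, again with invented names and nothing Mentor-specific (a real flow would use SystemVerilog/UVM transactors rather than Python):

```python
# Conceptual sketch of a "dual-domain" testbench split. Everything here is
# invented for illustration.

from abc import ABC, abstractmethod


class Transactor(ABC):
    """The only piece that must exist on both sides of the fence.
    Its hardware half has to fit the synthesizable subset."""

    @abstractmethod
    def send(self, transaction: dict) -> dict:
        ...


class SimTransactor(Transactor):
    def send(self, transaction: dict) -> dict:
        # In simulation, signals can be poked directly; full visibility, slow.
        return {"status": "ok", "engine": "simulator", **transaction}


class EmuTransactor(Transactor):
    def send(self, transaction: dict) -> dict:
        # In emulation, the same transaction crosses over to a synthesized BFM
        # running in hardware; less visibility, much faster.
        return {"status": "ok", "engine": "emulator", **transaction}


def run_test(xactor: Transactor) -> None:
    """The test itself never changes: it only speaks transactions,
    so it runs unmodified against either engine."""
    for addr in (0x1000, 0x2000, 0x3000):
        resp = xactor.send({"op": "write", "addr": addr, "data": 0xA5})
        assert resp["status"] == "ok"


run_test(SimTransactor())   # early bring-up, full debug detail
run_test(EmuTransactor())   # same test, long regressions at emulation speed
```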

So a big part of what’s new here is in Mentor’s Veloce emulator: their new OS3. And there are several new Veloce capabilities that are important for supporting this unification:

  • It supports a more simulation-like interaction.
  • Assertions can now be synthesized.
  • It now tracks coverage.
  • It supports Mentor’s push to move emulators out of the lab and into the data center for more effective sharing and better machine utilization, including multi-tenanting on a single machine.

Two supporting tools help with this. One is VirtuaLAB, which, somewhat surprisingly, was presented as new, but which we actually saw almost exactly two years ago. This is about eliminating rate matchers when generating “real-world” stimulus for verification. The VirtuaLAB boxes can also go into the data center as general stimulus generators, eliminating the need for someone to be physically present in a lab connecting wires to get data.

The other supporting tool is CodeLink, which we saw quite some time ago. While it has supported offline simulation debug all along, it now supports offline emulation debugging and review as well.

There’s actually a subtle consideration here for designs underway on existing Veloce machines. If you migrate your emulators into a data center and start sharing them under the new OS version, it’s likely this will happen in the middle of some design project (it’s impossible to imagine a big company where all projects magically finish at the same time, creating an opportune window for change). The thing is, making changes in the middle of a design project is generally not great for schedule confidence. But Mentor assures us of full backwards compatibility, so verification plans put in place under the older version should work just as if nothing had changed.

Meanwhile, they’ve announced a new unified debugger called Visualizer that supports all of the engines, removing the need to move between debuggers when moving between engines.

And, in another trend, verification results are all stored in a single database, regardless of which engine generated them.
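
The practical payoff is that coverage closure doesn’t care where a test ran. A toy illustration of what merging results from different engines into one view might look like (the structures are invented for this sketch, not Mentor’s actual database format):

```python
# Toy model of a unified results database: coverage from any engine merges
# into one view. Structures are invented for illustration.

from collections import defaultdict

# (engine, test name, coverage bins hit by that run)
runs = [
    ("simulator", "pcie_link_training", {"bin_reset", "bin_l0"}),
    ("emulator",  "pcie_long_soak",     {"bin_l0", "bin_l1", "bin_recovery"}),
    ("emulator",  "boot_then_dma",      {"bin_dma_burst"}),
]

merged = defaultdict(set)
for engine, test, bins in runs:
    merged["total"] |= bins   # closure looks at the union...
    merged[engine] |= bins    # ...but we can still slice per engine

print("overall bins hit:", sorted(merged["total"]))
print("emulator-only contribution:", sorted(merged["emulator"] - merged["simulator"]))
```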

This whole unification movement reflects what’s happening on chips themselves: SoCs now integrate pieces that, in earlier times, would have been created, verified, and debugged separately. And with smaller chunks, you could use separate tools for separate parts of the verification plan. But that’s just not feasible now that every aspect of every circuit has to be known to work properly before cutting an outrageously expensive mask set.

You can get more info in their announcement.
