Why Verify?

Musings from the Design Automation Conference

I just returned from attending the 52nd annual Design Automation Conference (DAC) in San Francisco, CA. It was the 30th time I've attended the event, so I've had more than a little time to contemplate what the whole thing is about. It is fascinating to me that DAC, which celebrates electronic design automation (EDA) – one of the most important enabling technology sectors in the advancement of Moore's Law – began even before Gordon Moore's prophetic 1965 article laid out the roadmap for the last 50 years of exponential progress in electronic technology.

Yep, DAC pre-dates Moore's Law. Chew on that one for a little bit.

Without EDA, there would certainly be no Moore’s Law. Progress would have petered out sometime in the early 1970s, and we’d all still be wire-wrapping logic circuits together out of TTL parts. The IC design and verification technologies developed by the EDA industry over the last 50 years have enabled us to do amazing and interesting things with chips containing north of a billion gates. Design automation software is essential to every step of the process from concept through creation and – most importantly – through Verification.

It is worth noting that EDA is almost 100% funded by fear. While “productivity” tools get a lot of airtime, the big bucks in design automation have always been in the job preservation sphere. This is because IC design has the ominous “tapeout” step. Tapeout is where careers are lost, where millions are squandered, where schedules are made or blown, where winners become winners and losers fade from view. In IC design, tapeout is judgment day – the day when engineers put it all on the line and either take a celebratory bow, collect their bonus, and head off for a well-deserved vacation, or hang their heads in shame and prepare for the agonizing, stress-inducing, midnight-oil-burning, job-threatening respin.

EDA companies are always touting wonderful technologies that will “shorten design cycles by XYZ%”, or “increase productivity by 3x!” While these may seem like compelling benefits, they’ve never had the same show-stopping, budget-busting power as “avoid respins.” If an IC project leader goes to a manager and says, “I want to buy a tool that will help us increase our productivity,” he can expect a tepid response at best. But if he goes in and says, “We need this tool to be sure we don’t have a respin,” he’ll generally walk out with a PO approval. 

This year at DAC, I chaired a session presented by Harry Foster – Chief Scientist at Mentor Graphics. Harry presented the results of an extensive study of design verification across a large number of IC design projects. The presentation offered data to replace commonly repeated myths about design verification, and, while it didn’t bust any important ones, it did give higher resolution to those of us who want real data in order to understand design-team behavior.

The 2014 study that Foster presented found that only about 30% of engineering projects achieve first-pass silicon success. That means 70% have to do at least one respin, and “logic and functional flaws” are the leading cause of those respins. The study also found that 61% of projects finish behind schedule. These sobering facts help explain why an average of 57% of engineering resources is spent on design verification.

Interestingly, none of these numbers have changed dramatically since the previous studies in 2010 and 2012. Even though design complexity has risen exponentially, design teams are performing about as well as ever at avoiding respins. The study also showed that larger designs actually suffer fewer respins than smaller ones – probably because the teams doing larger designs have a greater fear of failure and employ more robust verification strategies.

There is a common refrain in the verification world that FPGA designers play too fast and loose, and that they don’t have proper respect for the verification process. IC design teams, after all, have enormous verification budgets, capable engineers, and sophisticated tools. But the simple truth is that fear, not design complexity, drives the adoption of advanced verification methodologies. Like software development, FPGA design does not have the Sword of Damocles hanging over it that looms over those well-oiled and well-heeled IC design verification teams, so it is unlikely that FPGA designers will ever employ the same rigor in design verification that we see in IC design – even for designs of equal or greater complexity.

As Moore’s Law crawls toward its end, we are seeing fewer and fewer projects with the volume, budget, and demanding requirements to justify the incredible risk of custom IC design. We are also seeing standard parts such as FPGAs and SoCs grow in capability and flexibility to pick up much of the slack in system design. Relying on the forgiveness of programmability – both in software and in hardware – more and more system designs are escaping the tyranny of the respin that has been the driving factor in modern verification methodologies.

This does not mean that we will see an end to advancement in verification. It does, however, suggest that a different kind of verification will gain traction. Custom IC and ASIC design is, by its very nature, “waterfall” style – where tapeout is the point where the stream drops irrevocably over the edge. But programmable design paradigms lend themselves more naturally to “agile” development practices, which look at verification in a completely different way. No longer will we have to be 100% certain that we’ve found every bug before a “big event.” Instead, our designs will evolve rapidly, and verification methodologies will need to emerge that flow more naturally with that design evolution, rather than acting as gatekeepers at the point of sign-off.

Every year, people ask me what I saw as the “main theme” of DAC. Interestingly, in the thirty years I’ve attended, the main theme has never once changed. DAC is primarily about IC design, and IC design is primarily driven by fear of failure – which makes verification the perpetual main theme of DAC. If verification takes a right turn in the face of the end of Moore’s Law, DAC’s theme may finally change – as the very nature of the EDA industry will be forced to shift as well.

 
