
Why Verify?

Musings from the Design Automation Conference

I just returned from attending the 52nd annual Design Automation Conference (DAC) in San Francisco, CA. This was my 30th time attending the event, so I’ve had more than a little time to contemplate what the whole thing is about. It is fascinating to me that DAC, which celebrates electronic design automation (EDA) – one of the most important enabling technology sectors in the advancement of Moore’s Law – began even before Gordon Moore’s prophetic article laid out the roadmap for the last 50 years of exponential progress in electronic technology.

Yep, DAC pre-dates Moore’s Law. Chew on that one for a little bit.

Without EDA, there would certainly be no Moore’s Law. Progress would have petered out sometime in the early 1970s, and we’d all still be wire-wrapping logic circuits together out of TTL parts. The IC design and verification technologies developed by the EDA industry over the last 50 years have enabled us to do amazing and interesting things with chips containing north of a billion gates. Design automation software is essential to every step of the process from concept through creation and – most importantly – through Verification.

It is worth noting that EDA is almost 100% funded by fear. While “productivity” tools get a lot of airtime, the big bucks in design automation have always been in the job preservation sphere. This is because IC design has the ominous “tapeout” step. Tapeout is where careers are lost, where millions are squandered, where schedules are made or blown, where winners become winners and losers fade from view. In IC design, tapeout is judgment day – the day when engineers put it all on the line and either take a celebratory bow, collect their bonus, and head off for a well-deserved vacation, or hang their heads in shame and prepare for the agonizing, stress-inducing, midnight-oil-burning, job-threatening re-spin.

EDA companies are always touting wonderful technologies that will “shorten design cycles by XYZ%”, or “increase productivity by 3x!” While these may seem like compelling benefits, they’ve never had the same show-stopping, budget-busting power as “avoid respins.” If an IC project leader goes to a manager and says, “I want to buy a tool that will help us increase our productivity,” he can expect a tepid response at best. But if he goes in and says, “We need this tool to be sure we don’t have a respin,” he’ll generally walk out with a PO approval. 

This year at DAC, I chaired a session presented by Harry Foster – Chief Scientist at Mentor Graphics. Harry was sharing the results of an extensive study of design verification across a large number of IC design projects. The presentation replaced commonly repeated myths about design verification with actual data and, while it didn’t bust any important ones, it did give higher resolution to those of us who want real numbers on design-team behavior.

The 2014 study that Foster was presenting said that about 30% of engineering projects are able to achieve first-pass silicon success. That means that 70% have to do at least one respin, and that “logic and functional flaws” are the leading cause of respins. It also said that 61% of projects finish behind schedule. These sobering facts lead us to understand why an average of 57% of engineering resources are spent on design verification.

Interestingly, none of these numbers have changed dramatically since the previous studies in 2010 and 2012. Even though design complexity has risen exponentially, design teams are still performing about as well at avoiding respins. The study also showed that larger designs actually suffer fewer respins than smaller ones – probably because the teams doing larger designs have greater fear of failure and employ more robust verification strategies.

There is a common refrain in the verification world that FPGA designers play too fast and loose, and that they don’t have proper respect for the verification process. IC designers, by contrast, have enormous verification budgets, capable teams, and sophisticated tools. But the simple truth is that fear, not design complexity, drives the adoption of advanced verification methodologies. Like creating software, creating designs for FPGAs does not involve the same Sword of Damocles that hangs over the heads of those well-oiled and well-heeled IC design verification teams, so it is unlikely that FPGA designers will ever employ the same rigor in design verification that we see in IC design – even for designs of equal or greater complexity.

As Moore’s Law crawls toward its end, we are seeing fewer and fewer projects with the volume, budget, and demanding requirements to justify the incredible risk of custom IC design. We are also seeing standard parts such as FPGAs and SoCs grow in capability and flexibility to pick up much of the slack in system design. Relying on the forgiveness of programmability – both in software and in hardware – more and more system designs are escaping the tyranny of the respin that has been the driving factor in modern verification methodologies.

This does not mean that we will see an end to advancement in verification. It does, however, suggest that a different kind of verification will gain traction. Custom IC and ASIC design is, by its very nature, “waterfall” style – with tapeout as the point where the stream drops irrevocably over the edge. But programmable design paradigms lend themselves more naturally to “agile” development practices, which look at verification in a completely different way. No longer will we have to be 100% certain we have found every bug before a “big event.” Instead, our designs will evolve rapidly, and verification methodologies will need to emerge that flow more naturally with that evolution, rather than acting as gatekeepers at the point of sign-off.
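
To make the contrast concrete, here is a minimal sketch, in Python with pytest, of what verification that flows with design evolution might look like: a small randomized regression that runs automatically on every change to a programmable design, instead of one exhaustive campaign gating an irrevocable tapeout. Everything here is hypothetical for illustration; `alu_model` and `alu_dut` are assumed stand-ins for a golden reference and the design under test, not any particular tool’s API.

```python
# Hypothetical illustration: a lightweight regression that runs on every
# commit of a programmable design, rather than as a one-time sign-off gate.
# Both functions below are assumed stand-ins, not a real tool's interface.

import random

import pytest


def alu_model(op: str, a: int, b: int) -> int:
    """Golden-reference model of a toy 16-bit ALU (assumed for illustration)."""
    mask = 0xFFFF
    if op == "add":
        return (a + b) & mask
    if op == "sub":
        return (a - b) & mask
    if op == "and":
        return a & b
    raise ValueError(f"unknown op: {op}")


def alu_dut(op: str, a: int, b: int) -> int:
    """Placeholder for the design under test (e.g., a call into an RTL
    simulator). Wired to the reference model here so the sketch runs."""
    return alu_model(op, a, b)


@pytest.mark.parametrize("op", ["add", "sub", "and"])
def test_alu_random_regression(op):
    # One hundred randomized vectors per operation: cheap, fast, always on.
    for _ in range(100):
        a, b = random.randrange(1 << 16), random.randrange(1 << 16)
        assert alu_dut(op, a, b) == alu_model(op, a, b)
```

Wired into a continuous-integration trigger, a check like this gives feedback within minutes of each change, trading the single terrifying sign-off event for a steady pulse of small verifications.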

Every year, people ask me what I saw as the “main theme” of DAC. Interestingly, in the thirty years I’ve attended, the main theme has never once changed. DAC is primarily about IC design, and IC design is primarily driven by fear of failure – which makes verification the perpetual main theme of DAC. If verification takes a right turn in the face of the end of Moore’s Law, DAC’s theme may finally change – as the very nature of the EDA industry will be forced to shift as well.

 
