
Why Verify?

Musings from the Design Automation Conference

I just returned from the 52nd annual Design Automation Conference (DAC) in San Francisco, CA. This was my 30th time attending the event, so I’ve had more than a little time to contemplate what the whole thing is about. It is fascinating to me that DAC, which celebrates electronic design automation (EDA) – one of the most important enabling technology sectors in the advancement of Moore’s Law – began even before Gordon Moore’s prophetic article laid out the roadmap for the last 50 years of exponential progress in electronic technology.

Yep, DAC pre-dates Moore’s Law. Chew around on that one for a little bit.

Without EDA, there would certainly be no Moore’s Law. Progress would have petered out sometime in the early 1970s, and we’d all still be wire-wrapping logic circuits together out of TTL parts. The IC design and verification technologies developed by the EDA industry over the last 50 years have enabled us to do amazing and interesting things with chips containing north of a billion gates. Design automation software is essential to every step of the process from concept through creation and – most importantly – through Verification.

It is worth noting that EDA is almost 100% funded by fear. While “productivity” tools get a lot of airtime, the big bucks in design automation have always been in the job preservation sphere. This is because IC design has the ominous “tapeout” step. Tapeout is where careers are lost, where millions are squandered, where schedules are made or blown, where winners become winners and losers fade from view. In IC design, tapeout is judgment day – the day when engineers put it all on the line and either take a celebratory bow, collect their bonus, and head off for a well-deserved vacation, or hang their heads in shame and prepare for the agonizing, stress-inducing, midnight-oil-burning, job-threatening re-spin.

EDA companies are always touting wonderful technologies that will “shorten design cycles by XYZ%”, or “increase productivity by 3x!” While these may seem like compelling benefits, they’ve never had the same show-stopping, budget-busting power as “avoid respins.” If an IC project leader goes to a manager and says, “I want to buy a tool that will help us increase our productivity,” he can expect a tepid response at best. But if he goes in and says, “We need this tool to be sure we don’t have a respin,” he’ll generally walk out with a PO approval. 

This year at DAC, I chaired a session presented by Harry Foster – Chief Scientist at Mentor Graphics. Harry presented the results of an extensive study on design verification across a large number of IC design projects. The presentation offered data to replace commonly repeated myths about design verification, and, while it didn’t bust any important ones, it did bring higher resolution to those of us who want real data in order to understand design team behavior.

The 2014 study that Foster was presenting said that about 30% of engineering projects are able to achieve first-pass silicon success. That means that 70% have to do at least one respin, and that “logic and functional flaws” are the leading cause of respins. It also said that 61% of projects finish behind schedule. These sobering facts lead us to understand why an average of 57% of engineering resources are spent on design verification.
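Those percentages imply a planning consequence worth making explicit. As a back-of-the-envelope sketch (my own illustrative model, not part of Foster's study), if we assume each tapeout attempt independently succeeds with some fixed probability, the expected number of respins follows a simple geometric model:

```python
def expected_respins(p_success: float) -> float:
    """Expected number of respins before a successful tapeout,
    assuming each attempt independently succeeds with probability
    p_success (a simplistic geometric model, for illustration only)."""
    return (1.0 - p_success) / p_success

# With the study's ~30% first-pass silicon success rate, a team would
# plan for roughly 2.3 respins on average under this naive assumption.
print(round(expected_respins(0.30), 2))
```

Real projects aren't independent coin flips, of course – a team that fails once usually fixes the specific bug – but even this crude model shows why "avoid respins" opens budgets in a way "increase productivity" never does.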

Interestingly, none of these numbers have changed dramatically since previous studies from 2010 and 2012. Even though design complexity has risen exponentially, design teams are still performing about as well in avoiding respins. Notably, the study showed that larger designs actually suffer fewer respins than smaller designs, probably because the teams doing larger designs have greater fear of failure and employ more robust verification strategies.

There is a common refrain in the verification world that FPGA designers play too fast and loose, and that they don’t have proper respect for the verification process. IC designers, by contrast, have enormous verification budgets, capable teams, and sophisticated tools. The simple truth is that fear, not design complexity, drives the adoption of advanced verification methodologies. Like creating software, creating designs for FPGAs does not involve the same Sword of Damocles that hangs over the heads of those well-oiled and well-heeled IC design verification teams, and so it is unlikely that FPGA designers will ever employ the same rigor in design verification that we see in IC design – even for designs of equal or greater complexity.

As Moore’s Law crawls toward its end, we are seeing fewer and fewer projects with the volume, budget, and demanding requirements to justify the incredible risk of custom IC design. We are also seeing standard parts such as FPGAs and SoCs grow in capability and flexibility to pick up much of the slack in system design. Relying on the forgiveness of programmability – both in software and in hardware – more and more system designs are escaping the tyranny of the respin that has been the driving factor in modern verification methodologies.

This does not mean that we will see an end to advancement in verification. It does, however, suggest that a different kind of verification will gain traction. Custom IC and ASIC design is, by its very nature, “waterfall” style – where tapeout is the point where the stream drops irrevocably over the edge. But programmable design paradigms lend themselves more naturally to “agile” development practices, which look at verification in a completely different way. No longer will we have to be 100% certain we find every bug before a “big event.” Instead, our designs will evolve rapidly, and verification methodologies will need to emerge that will flow more naturally with that design evolution, rather than acting as gatekeepers at the point of sign-off.

Every year, people ask me what I saw as the “main theme” of DAC. Interestingly, in the thirty years I’ve attended, the main theme has never once changed. DAC is primarily about IC design, and IC design is primarily driven by fear of failure – which makes verification the perpetual main theme of DAC. If verification takes a right turn in the face of the end of Moore’s Law, DAC’s theme may finally change – as the very nature of the EDA industry will be forced to shift as well.

 
