
Why Verify?

Musings from the Design Automation Conference

I just returned from attending the 52nd annual Design Automation Conference (DAC) in San Francisco, CA. This was my 30th time attending this event, so I’ve had more than a little time to contemplate what the whole thing is about. It is fascinating to me that DAC, which celebrates electronic design automation (EDA) – one of the most important enabling technology sectors in the advancement of Moore’s Law – began even before Gordon Moore’s prophetic article laid out the roadmap for the last 50 years of exponential progress in electronic technology.

Yep, DAC pre-dates Moore’s Law. Chew around on that one for a little bit.

Without EDA, there would certainly be no Moore’s Law. Progress would have petered out sometime in the early 1970s, and we’d all still be wire-wrapping logic circuits together out of TTL parts. The IC design and verification technologies developed by the EDA industry over the last 50 years have enabled us to do amazing and interesting things with chips containing north of a billion gates. Design automation software is essential to every step of the process from concept through creation and – most importantly – through Verification.

It is worth noting that EDA is almost 100% funded by fear. While “productivity” tools get a lot of airtime, the big bucks in design automation have always been in the job preservation sphere. This is because IC design has the ominous “tapeout” step. Tapeout is where careers are lost, where millions are squandered, where schedules are made or blown, where winners become winners and losers fade from view. In IC design, tapeout is judgment day – the day when engineers put it all on the line and either take a celebratory bow, collect their bonus, and head off for a well-deserved vacation, or hang their heads in shame and prepare for the agonizing, stress-inducing, midnight-oil-burning, job-threatening re-spin.

EDA companies are always touting wonderful technologies that will “shorten design cycles by XYZ%”, or “increase productivity by 3x!” While these may seem like compelling benefits, they’ve never had the same show-stopping, budget-busting power as “avoid respins.” If an IC project leader goes to a manager and says, “I want to buy a tool that will help us increase our productivity,” he can expect a tepid response at best. But if he goes in and says, “We need this tool to be sure we don’t have a respin,” he’ll generally walk out with a PO approval. 

This year at DAC, I chaired a session presented by Harry Foster – Chief Scientist at Mentor Graphics. Harry was giving the results of an extensive study on design verification across a large number of IC design projects. The presentation offered data in place of commonly repeated myths about design verification, and, while it didn’t bust any important ones, it did give higher-resolution insight to those of us who want real data on how design teams actually behave.

The 2014 study that Foster presented found that only about 30% of engineering projects achieve first-pass silicon success. That means that 70% have to do at least one respin, and that “logic and functional flaws” are the leading cause of those respins. It also found that 61% of projects finish behind schedule. These sobering facts help explain why an average of 57% of engineering resources is spent on design verification.

Interestingly, none of these numbers have changed dramatically since the previous studies in 2010 and 2012. Even though design complexity has risen exponentially, design teams are still performing about as well at avoiding respins. The study also showed that larger designs actually suffer fewer respins than smaller ones, probably because the teams doing larger designs have a greater fear of failure and employ more robust verification strategies.

There is a common theme in the verification world that FPGA designers play too fast and loose, and that they don’t have proper respect for the verification process. IC designers, by contrast, have enormous verification budgets, capable teams, and sophisticated tools. The simple truth is that fear, not design complexity, drives the adoption of advanced verification methodologies. Like creating software, creating designs for FPGAs does not involve the same Sword of Damocles that hangs over the heads of those well-oiled and well-heeled IC design verification teams, and so it is unlikely that FPGA designers will ever employ the same rigor in design verification that we see in IC design – even for designs of equal or greater complexity.

As Moore’s Law crawls toward its end, we are seeing fewer and fewer projects with the volume, budget, and demanding requirements to justify the incredible risk of custom IC design. We are also seeing standard parts such as FPGAs and SoCs grow in capability and flexibility to pick up much of the slack in system design. Relying on the forgiveness of programmability – both in software and in hardware – more and more system designs are escaping the tyranny of the respin that has been the driving factor in modern verification methodologies.

This does not mean that we will see an end to advancement in verification. It does, however, suggest that a different kind of verification will gain traction. Custom IC and ASIC design is, by its very nature, “waterfall” style – where tapeout is the point where the stream drops irrevocably over the edge. But programmable design paradigms lend themselves more naturally to “agile” development practices, which look at verification in a completely different way. No longer will we have to be 100% certain we find every bug before a “big event.” Instead, our designs will evolve rapidly, and verification methodologies will need to emerge that will flow more naturally with that design evolution, rather than acting as gatekeepers at the point of sign-off.

Every year, people ask me what I saw as the “main theme” of DAC. Interestingly, in the thirty years I’ve attended, the main theme has never once changed. DAC is primarily about IC design, and IC design is primarily driven by fear of failure – which makes verification the perpetual main theme of DAC. If verification takes a right turn in the face of the end of Moore’s Law, DAC’s theme may finally change – as the very nature of the EDA industry will be forced to shift as well.

 
