
Why Verify?

Musings from the Design Automation Conference

I just returned from attending the 52nd annual Design Automation Conference (DAC) in San Francisco, CA. This was my 30th time attending the event, so I’ve had more than a little time to contemplate what the whole thing is about. It is fascinating to me that DAC, which celebrates electronic design automation (EDA) – one of the most important enabling technology sectors in the advancement of Moore’s Law – began even before Gordon Moore’s prophetic article laid out the roadmap for the last 50 years of exponential progress in electronic technology.

Yep, DAC pre-dates Moore’s Law. Chew around on that one for a little bit.

Without EDA, there would certainly be no Moore’s Law. Progress would have petered out sometime in the early 1970s, and we’d all still be wire-wrapping logic circuits together out of TTL parts. The IC design and verification technologies developed by the EDA industry over the last 50 years have enabled us to do amazing and interesting things with chips containing north of a billion gates. Design automation software is essential to every step of the process from concept through creation and – most importantly – through Verification.

It is worth noting that EDA is almost 100% funded by fear. While “productivity” tools get a lot of airtime, the big bucks in design automation have always been in the job preservation sphere. This is because IC design has the ominous “tapeout” step. Tapeout is where careers are lost, where millions are squandered, where schedules are made or blown, where winners become winners and losers fade from view. In IC design, tapeout is judgment day – the day when engineers put it all on the line and either take a celebratory bow, collect their bonus, and head off for a well-deserved vacation, or hang their heads in shame and prepare for the agonizing, stress-inducing, midnight-oil-burning, job-threatening re-spin.

EDA companies are always touting wonderful technologies that will “shorten design cycles by XYZ%”, or “increase productivity by 3x!” While these may seem like compelling benefits, they’ve never had the same show-stopping, budget-busting power as “avoid respins.” If an IC project leader goes to a manager and says, “I want to buy a tool that will help us increase our productivity,” he can expect a tepid response at best. But if he goes in and says, “We need this tool to be sure we don’t have a respin,” he’ll generally walk out with a PO approval. 

This year at DAC, I chaired a session presented by Harry Foster – Chief Scientist at Mentor Graphics. Harry was giving the results of an extensive study of design verification across a large number of IC design projects. The presentation replaced commonly repeated myths about design verification with actual data, and, while it didn’t bust any important ones, it did give higher resolution to those of us who want real numbers on design team behavior.

The 2014 study that Foster was presenting said that about 30% of engineering projects are able to achieve first-pass silicon success. That means that 70% have to do at least one respin, and that “logic and functional flaws” are the leading cause of respins. It also said that 61% of projects finish behind schedule. These sobering facts lead us to understand why an average of 57% of engineering resources are spent on design verification.

Interestingly, none of these numbers have changed dramatically since the previous studies in 2010 and 2012. Even though design complexity has risen exponentially, design teams are still performing about as well at avoiding respins. The study also showed that larger designs actually suffer fewer respins than smaller ones, probably because the teams doing larger designs have greater fear of failure and employ more robust verification strategies.

A common theme in the verification world is that FPGA designers play too fast and loose – that they don’t have proper respect for the verification process – while IC designers have enormous verification budgets, capable teams, and sophisticated tools. The simple truth is that fear, not design complexity, drives the adoption of advanced verification methodologies. Like creating software, creating designs for FPGAs does not involve the Sword of Damocles that hangs over the heads of those well-oiled and well-heeled IC design verification teams, so it is unlikely that FPGA designers will ever employ the same rigor in design verification that we see in IC design – even for designs of equal or greater complexity.

As Moore’s Law crawls toward its end, we are seeing fewer and fewer projects with the volume, budget, and demanding requirements to justify the incredible risk of custom IC design. We are also seeing standard parts such as FPGAs and SoCs grow in capability and flexibility to pick up much of the slack in system design. Relying on the forgiveness of programmability – both in software and in hardware – more and more system designs are escaping the tyranny of the respin that has been the driving factor in modern verification methodologies.

This does not mean that we will see an end to advancement in verification. It does, however, suggest that a different kind of verification will gain traction. Custom IC and ASIC design is, by its very nature, “waterfall” style – where tapeout is the point where the stream drops irrevocably over the edge. But programmable design paradigms lend themselves more naturally to “agile” development practices, which look at verification in a completely different way. No longer will we have to be 100% certain we’ve found every bug before a “big event.” Instead, our designs will evolve rapidly, and verification methodologies will need to emerge that flow naturally with that design evolution rather than acting as gatekeepers at the point of sign-off.
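To make that contrast concrete, here is a minimal sketch of what “agile-style” verification can look like, written in Python with the hypothesis property-based testing library. The saturating_add golden model and the properties below are hypothetical illustrations, not any team’s actual flow – the point is that small, cheap, always-on checks re-run with every design change instead of one monolithic campaign before sign-off:

    # Toy sketch: lightweight property checks that run on every change.
    # Assumes Python with pytest and hypothesis installed.
    from hypothesis import given, strategies as st

    WIDTH = 8
    MAX_VAL = (1 << WIDTH) - 1  # 255 for an 8-bit datapath

    def saturating_add(a: int, b: int) -> int:
        """Hypothetical golden model of an 8-bit saturating adder."""
        return min(a + b, MAX_VAL)

    operand = st.integers(min_value=0, max_value=MAX_VAL)

    @given(operand, operand)
    def test_result_stays_in_range(a, b):
        # Property: the output never overflows the datapath width.
        assert 0 <= saturating_add(a, b) <= MAX_VAL

    @given(operand)
    def test_zero_is_identity(a):
        # Property: adding zero changes nothing.
        assert saturating_add(a, 0) == a

Run under pytest, each property fires off hundreds of randomized trials in seconds – cheap enough to execute on every commit, which is exactly the opposite of betting everything on one judgment day.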

Every year, people ask me what I saw as the “main theme” of DAC. Interestingly, in the thirty years I’ve attended, the main theme has never once changed. DAC is primarily about IC design, and IC design is primarily driven by fear of failure – which makes verification the perpetual main theme of DAC. If verification takes a right turn in the face of the end of Moore’s Law, DAC’s theme may finally change – as the very nature of the EDA industry will be forced to shift as well.

 
