
Does Noise Analysis Accuracy Really Matter?

There have been a lot of new faces springing up in the timing and signal integrity (SI) analysis market over the past few years, and the trend appears to point toward products that deliver quick and reasonably good timing signoff, with some signal integrity analysis tacked on as an afterthought. This prompted us to ask: Just how important is noise analysis accuracy and quality?

To answer this question, we first looked back at the history of noise analysis and how it evolved from being a nice-to-have security blanket to an integral part of design closure. To make a long story short, signal integrity analysis began to catch on in 2000 as 180nm to 130nm design starts increased. By 2002, new 90nm design starts began to ramp, and by 2004 signal integrity analysis went mainstream, becoming a ‘must-have’ to ensure reliability while maintaining design margins. Today, SI fixing is a regular part of the design closure loop and a standard offering of all signoff timing solutions.

Signal integrity analysis has proved its worth many times over, saving numerous designs from failing in silicon. This is no small feat and is accomplished by finding, and then fixing, functional and timing failures induced by crosstalk noise during design implementation. In today’s world of cutting-edge low-power design, this means that SI analysis must account for the complex signal integrity problems that can come from mixing multi-Vt cells, creating domains with different supply voltages, or adding new low-power cells such as level shifters and power gates. Your noise analysis solution must also consider that smaller process nodes and enormous net counts mean an explosion in the number and complexity of signal integrity issues, which could potentially slow the overall design effort.

So will any SI solution do?

Digging deeper, we found that the quality and completeness of the noise analysis solution can make the difference between first-silicon success and a costly re-spin. We recently observed four instances in customer evaluations where major noise-induced problems were not found by simpler, less accurate noise analysis solutions.

When all of these issues were checked against the golden SPICE simulation, they proved to be issues that, if left unresolved, would very likely have caused failures in silicon. Issues such as “Double Clocking”, unpredictable accuracy, and delay pushout optimism were the most common.

In one of these cases, the customer found out through testing and physical measurement that they had a functional failure related to noise-induced “double clocking” in their recent tapeout. “Double clocking” is a situation that most noise-on-delay analyses cannot catch, and it happens when signals adjacent to the clock net switch in the direction opposite to the clock transition. If this causes a bump during the clock’s transition, then double clocking can occur. This can only be detected by looking at the worst opposite slope on clock nets and determining whether this non-monotonicity causes an extra clocking event (see Figure 1). The SI analysis solution the customer used for this tapeout did not detect this failure, so they went looking for a solution that could. By using a more accurate and complete noise analysis solution, they were able to detect the exact double-clocking events causing the functional failures, allowing them to avoid this costly situation on future projects.

Figure 1. Noise-induced double clocking caused by a non-monotonic clock transition.
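To make the check concrete, here is a minimal, hypothetical sketch of the threshold-crossing idea described above: given voltage samples along one clock transition (from a circuit simulation, for example) and the receiver’s switching threshold, count how many times the waveform crosses the threshold in the intended direction. More than one crossing means the noise bump is large enough to register as an extra clock edge. The function name, threshold value, and sample waveforms are illustrative assumptions, not taken from any commercial noise-analysis tool.

    # Hypothetical sketch only: a simple threshold-crossing check for double clocking.
    # Inputs are voltage samples along one clock transition and the receiver's
    # switching threshold; both are illustrative assumptions.

    def has_double_clocking(volts, vth, rising=True):
        """Return True if the waveform crosses vth in the same direction more
        than once, i.e. a crosstalk bump during the transition is large enough
        to be seen as a second clock edge."""
        crossings = 0
        for v_prev, v_next in zip(volts, volts[1:]):
            if rising and v_prev < vth <= v_next:
                crossings += 1
            elif not rising and v_prev > vth >= v_next:
                crossings += 1
        return crossings > 1

    # Example: a rising edge that dips back below the 0.6 V threshold mid-transition
    # (the non-monotonic bump described above), then rises again.
    clean_edge = [0.0, 0.2, 0.5, 0.8, 1.0]
    noisy_edge = [0.0, 0.3, 0.7, 0.4, 0.8, 1.0]

    print(has_double_clocking(clean_edge, vth=0.6))  # False: one rising crossing
    print(has_double_clocking(noisy_edge, vth=0.6))  # True: two rising crossings

A production noise-on-delay tool would, of course, operate on the worst-case aggressor alignment and full coupled-RC waveforms rather than a pre-sampled edge, but the pass/fail criterion is the same: does the bump cross the switching threshold a second time during the clock’s transition?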

While some solutions offer impressive turnaround time at the expense of accuracy, these customers decided that they weren’t ready to take the risk of a re-spin, which could cost over $1M at smaller process nodes.

Net-Net

If the cost of silicon re-spins is not reason enough to consider SI accuracy and quality, consider also that better accuracy means less pessimism, which translates into less margin needed during implementation to ensure signoff and fewer violations to resolve in the final stages of design closure. From our perspective, noise analysis accuracy and quality really do matter, and in a really big way! Shortcuts, by contrast, are just not worth the risk of extra design iterations or, even worse, failing silicon.

Michael Jacobs joined Cadence Design Systems in 1998 and has served in design services, technical field applications, and product marketing management roles supporting digital verification and RTL-to-GDSII solutions. Mr. Jacobs holds a BSEE from the University of Central Florida as well as a degree in Management from the University of Texas at Austin.

Trisha Kristof is a Product Engineer for Encounter Timing System, specializing in Signal Integrity. She received her MSEE from Santa Clara University and her BSEE from UC Davis.
