
Faster Space Exploration

Infiniscale Uses Behavioral Modeling to Design for Analog Yield

In yet another installment of how life has gotten complicated in the design-for-manufacturing (or design-for-yield) world, we re-enter the world of the modern designer as contrasted with the designers of yore. Those erstwhile designers followed rules, and, assuming a design passed its tests on the way from design to manufacturing, the designer could give him- or herself a well-deserved pat on the back, release a satisfied sigh, and move on to the next project. What happened to the design at that point was no concern of the design engineer, because the design had escaped the realm of design. Testing was handled by test engineers, and yield enhancement was handled by, you guessed it, yield enhancement engineers.

Things aren’t quite so cozy anymore, and designers are being held responsible for designs that actually yield. The issue is no longer one of making sure that the peak of the distribution lies within the specified window; the trick now is trying to cram as much of the distribution as possible, hopefully all of it, inside that window. It’s bad enough with digital, where your basic parametrics are speed and power, but with analog, any number of parameters may matter: gain, phase, stability, and so on. Keeping the distributions of all of those parameters tightly within spec is no trivial task, and the number of parameters affecting the optimization is growing way out of control.
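
To put a number on “cramming the distribution inside the window”: for a single, normally distributed parameter, yield is simply the slice of the bell curve that falls between the spec limits. Here is a minimal sketch, with all values invented for illustration:

```python
from math import erf, sqrt

def yield_in_window(mu, sigma, lo, hi):
    """Fraction of a normal(mu, sigma) distribution landing inside [lo, hi]."""
    cdf = lambda x: 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))
    return cdf(hi) - cdf(lo)

# A hypothetical gain spec of 40 +/- 1 dB with the distribution off-center:
print(yield_in_window(mu=40.2, sigma=0.4, lo=39.0, hi=41.0))  # ~0.976
# Re-centering the same distribution at 40.0 dB recovers some yield:
print(yield_in_window(mu=40.0, sigma=0.4, lo=39.0, hi=41.0))  # ~0.988
```

And with several independent parameters, the per-parameter yields multiply: a dozen specs at 97% each compound to roughly 69% overall, which is why the growing parameter count hurts so much.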

What we have here is some serious multi-variable mathematics to optimize: multiple inputs and multiple outputs. And you thought scheduling airplanes was complicated. (Of course, airlines tolerate much less robustness in their schedules, but I digress.) Conceptually, it’s a multi-dimensional surface, and you’re trying to find simultaneous global maxima and minima (optima?) without getting lulled into some local optimum that pushes other parameters out of whack.
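
The local-optimum trap is easy to demonstrate in one dimension: a naive downhill walk on a toy surface settles into whichever valley it starts near and never sees the better one. This sketch is purely illustrative and implies nothing about Infiniscale’s algorithms:

```python
def local_descent(f, x, step=0.01, iters=2000):
    """Naive downhill walk: move to a neighboring point only if it improves f."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
    return x

# An invented one-dimensional "surface" with two valleys:
# a local minimum near x = +1.13 and the global minimum near x = -1.30.
f = lambda x: x**4 - 3 * x**2 + x

print(local_descent(f, x=1.0))   # stuck near +1.13: the local-optimum trap
print(local_descent(f, x=-1.0))  # lands near -1.30, the true global minimum
```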

What typically happens is that simulation is used in an iterative process to locate an optimal solution on this mathematical surface. The surface has a huge number of points (OK, technically infinite, but huge from a practical standpoint), and the entire space can be explored to decide which point is best. Each point can be determined by running a simulation with a given set of values for each of the dimensions. Of course, these simulations can take a while to run, so while theoretically you could map every point on the surface to some satisfactory degree of precision, in practice that would take far too long. The other issue is that each time you want an answer to a question about some as-yet-unexplored point on the surface, you have to wait for a simulation to complete before you know the answer, limiting the speed with which you can run “what if” scenarios.
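
As a back-of-the-envelope illustration of why exhaustive exploration is hopeless, here is a brute-force grid sweep in which every point costs one simulator call. The simulate stub below is a pure stand-in for a SPICE run, with invented variables and numbers:

```python
import itertools
import time

def simulate(point):
    """Stand-in for a real SPICE run, which could take seconds to hours."""
    time.sleep(0.001)  # token cost per point; real runs are vastly slower
    w, l, bias = point
    return w / l + 0.1 * bias  # an invented performance figure to minimize

points_per_dim = 10
axis = [i / points_per_dim for i in range(1, points_per_dim + 1)]

runs, best = 0, None
for point in itertools.product(axis, axis, axis):  # 3 design variables
    perf = simulate(point)
    runs += 1
    if best is None or perf < best[0]:
        best = (perf, point)

print(runs)  # 1,000 simulations for just 3 dimensions at coarse resolution
print(best)
```

At ten points per dimension, three design variables already cost 1,000 simulations; ten variables would cost ten billion.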

Infiniscale has taken a different approach, one they call behavioral modeling. Their intent is to provide a more efficient means of optimizing analog, RF, and mixed-signal designs to improve yield. They still use simulation, but in a more limited manner, and in a way that allows the actual exploration of the design space to happen much more quickly.

First the range of each dimension is specified; the tool then automatically runs simulations at some number of points and does a curve fit to create a mathematical model of the design-space surface. They don’t say exactly how they create the curves, but it’s apparently neither polynomial nor posynomial (I’m not even going to pretend I didn’t have to look that one up). Obviously, the more points simulated, the more accurate the model, but the longer it takes to generate. Users can then play with design scenarios, and no simulation has to be run for each point being explored: the point is simply calculated on the surface using the mathematical, or, as they call it, behavioral model.
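
Infiniscale doesn’t disclose the fitting method, so the sketch below substitutes a Gaussian radial-basis-function interpolant purely as a stand-in. What it does show is the flow: a limited batch of real simulations, a curve fit, and then near-instant queries against the fitted surface:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x):
    """Stand-in for a slow SPICE run over two design variables (invented)."""
    return np.sin(3 * x[..., 0]) * np.cos(2 * x[..., 1]) + 0.5 * x[..., 0]

# 1) Run a limited number of real simulations across the design space.
train_x = rng.uniform(0.0, 1.0, size=(50, 2))
train_y = simulate(train_x)

# 2) Curve-fit a model to those samples (an RBF interpolant here,
#    standing in for whatever Infiniscale actually fits).
def rbf_fit(X, y, gamma=10.0):
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)

def rbf_eval(X, weights, query, gamma=10.0):
    k = np.exp(-gamma * ((query[None, :] - X) ** 2).sum(-1))
    return k @ weights

w = rbf_fit(train_x, train_y)

# 3) "What if" queries are now cheap algebra instead of fresh simulations.
q = np.array([0.3, 0.7])
print(rbf_eval(train_x, w, q), simulate(q[None, :])[0])  # model vs. truth
```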

This technology is part of their overall Lysis tool suite. The TechModeler tool is the one that builds the model. It can use the output of any SPICE-like simulation tool that provides a comma-separated-values output. It automates the process of setting up the parameter values for each simulation, launching the simulation, recording the results, and then building the model.
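
The automation loop is straightforward to picture. In this rough sketch, the myspice command line, netlist.sp, and results.csv are all hypothetical names standing in for whatever simulator and file layout a real installation would use; none of them come from Infiniscale’s documentation:

```python
import csv
import itertools
import subprocess

# Hypothetical design parameters and sweep values:
param_grid = {"w_um": [0.5, 1.0, 2.0], "vbias_v": [0.6, 0.8]}

rows = []
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid, values))
    args = [f"{k}={v}" for k, v in params.items()]
    # Launch the SPICE-like simulator; assume it writes results.csv.
    subprocess.run(["myspice", "netlist.sp", *args], check=True)
    with open("results.csv", newline="") as f:
        result = next(csv.DictReader(f))  # one row of measured outputs
    rows.append({**params, **result})

# 'rows' now pairs every parameter set with its simulated outputs,
# ready to be curve-fit into the behavioral model.
print(rows[0])
```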

Once built, this model can then be used by the TechAnalyzer tool, which provides Monte Carlo and sensitivity analysis. They claim that the equivalent of 100,000 runs can be handled in five seconds. A TechSizer tool is available to play with the sizing of components to identify corners and optimize performance. These tools have been on the market for up to two years.
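
That speed claim is theirs, but the mechanics are easy to see: once evaluating a point is mere arithmetic rather than a simulation, 100,000 Monte Carlo draws take well under a second on ordinary hardware. The model function, process spreads, and spec limits below are all invented:

```python
import time
import numpy as np

rng = np.random.default_rng(1)

def model(w, vbias):
    """Cheap behavioral model standing in for the fitted surface (invented)."""
    return 20.0 + 4.0 * np.log(w) - 3.0 * (vbias - 0.8) ** 2

# Monte Carlo: 100,000 process draws pushed through the model, no simulator.
w = rng.normal(1.0, 0.05, 100_000)
vbias = rng.normal(0.8, 0.02, 100_000)
t0 = time.perf_counter()
gain = model(w, vbias)
print(f"{time.perf_counter() - t0:.4f} s for 100,000 evaluations")
print("yield:", np.mean((gain > 19.7) & (gain < 20.3)))

# A crude sensitivity read-out: correlation of each input with the output.
for name, x in (("w", w), ("vbias", vbias)):
    print(name, round(float(np.corrcoef(x, gain)[0, 1]), 3))
```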

They have just released a new tool, TechYielder, which optimizes the distributions of multiple parameters to provide the highest yield. While individual design-space points can be queried, with results in a few minutes, the tool can also automatically determine a global optimum that will, if possible, place the entire process distribution within the yielding range. Even if the distribution is already high-yielding, it will attempt to find a solution that better centers, narrows, and normalizes the distribution.
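
TechYielder’s actual algorithm isn’t public, so as a stand-in, this sketch brute-forces candidate nominal sizings and keeps the one whose simulated process spread yields best. It captures the centering idea, if certainly not the method (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(2)

def model(w, vbias):
    """Cheap behavioral model standing in for the fitted surface (invented)."""
    return 20.0 + 4.0 * np.log(w) - 3.0 * (vbias - 0.8) ** 2

def mc_yield(w_nom, vbias_nom, n=20_000):
    """Yield at a nominal design point, assuming fixed process spreads."""
    gain = model(rng.normal(w_nom, 0.05, n), rng.normal(vbias_nom, 0.02, n))
    return np.mean((gain > 19.7) & (gain < 20.3))

# Sweep nominal sizings; every candidate costs only model evaluations,
# so the whole search runs in moments rather than days of simulation.
best = max(((mc_yield(w, v), w, v)
            for w in np.linspace(0.9, 1.1, 11)
            for v in np.linspace(0.7, 0.9, 11)), key=lambda t: t[0])
print("best yield %.3f at w_nom=%.2f, vbias_nom=%.2f" % best)
```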

Examples of the typical size range and content of blocks that can be optimized are analog-to-digital converters, voltage-controlled oscillators, bandgap regulators, filters, SerDes circuits, and the like. The approach isn’t generally suited to full-chip optimization, since a full SoC will have multiple blocks, each with its own set of performance specs. Instead, each block of the SoC, or perhaps each library element, is optimized and then assembled into the SoC.

With increased ability to model the effects of process variations, designers can optimize their circuits before ever signing them off. Rather than yield enhancement engineers trying to tweak things here and there to improve yields – a process that would seem somewhat dicey with sensitive analog circuitry – the design can be centered by the person most familiar with the circuit. And if the block is a cell or library module, then the benefits of that optimization can accrue to every circuit using that block, leveraging the optimization work much further.

Reviewing some of the materials from the earlier product, it feels like the behavioral modeling technology originally started as a means of looking at different data points without having to do a lot of new simulations, with progressively more automation being added since then. Looking at today’s big picture with the current offering, the TechYielder tool allows the design to be robust with respect to all the variation anticipated in the manufacturing process, and the TechAnalyzer and TechSizer tools allow robustness with respect to the variation in the use environment. If they deliver as promised, it certainly simplifies the process of trying to design around the increasing number of variables that must be considered.
