
Faster Space Exploration

Infiniscale Uses Behavioral Modeling to Design for Analog Yield

In yet another installment of how life has gotten complicated in the design-for-manufacturing (or design-for-yield) world, we re-enter the world of the modern designer as contrasted with the designer of yore. Designers of old followed rules, and, assuming the design passed its tests on the way from design to manufacturing, the designer could give him- or herself a well-deserved pat on the back, release a satisfied sigh, and move on to the next project. What happened to the design at that point was of no concern to the design engineer, because the design had escaped the realm of design. Testing was handled by test engineers, and yield enhancement was handled by (you guessed it) yield enhancement engineers.

Things aren’t quite so cozy anymore, and designers are being held responsible for designs that actually yield. The issue is no longer making sure that the peak of the distribution lies within the specified window; the trick now is to cram as much of the distribution as possible (hopefully all of it) inside that window. It’s bad enough with digital, where your basic parametrics are speed and power, but with analog you may have any number of parameters that matter, relating to gain, phase, stability, and so on. Keeping all of those parameter distributions tightly within spec is no trivial task, and the number of parameters affecting the optimization is growing out of control.
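To make the "distribution inside the window" idea concrete, here's a minimal sketch for a single parameter, assuming a normal distribution; the numbers are invented for illustration and don't come from Infiniscale:

```python
from scipy.stats import norm

# Hypothetical gain parameter: mean 40 dB, sigma 1.5 dB (invented numbers)
mu, sigma = 40.0, 1.5
spec_lo, spec_hi = 38.0, 43.0  # spec window for this parameter

# Parametric yield = probability mass of the distribution inside the window
yield_frac = norm.cdf(spec_hi, mu, sigma) - norm.cdf(spec_lo, mu, sigma)
print(f"yield as designed: {yield_frac:.1%}")   # ~88.6%

# Re-centering the mean in the window recovers some of the lost tail
mu_centered = (spec_lo + spec_hi) / 2
yield_centered = norm.cdf(spec_hi, mu_centered, sigma) - norm.cdf(spec_lo, mu_centered, sigma)
print(f"after centering: {yield_centered:.1%}")  # ~90.4%
```

With many such parameters, the per-parameter yields compound (for independent parameters they multiply), which is why every distribution has to sit inside its window at once.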

What we have here is some serious multi-variable mathematics that must be optimized: multiple inputs and multiple outputs. And you thought scheduling airplanes was complicated. (Of course, they manage to tolerate much less robustness in their schedules, but I digress.) Conceptually, it’s a multi-dimensional surface, and you’re trying to find simultaneous global maxima/minima (optima?) without getting lulled into some local optimum that pushes other parameters out of whack.

What typically happens is that simulation is used in an iterative process to locate an optimal solution on this mathematical surface. The surface has a huge number of points (ok, technically infinite, but huge from a practical standpoint), and the entire space can be explored to decide which point is best. Each point can be determined by running a simulation with a given set of values for each of the dimensions. Of course, these simulations can take a while to run, so while theoretically you can map every point on the surface to some satisfactory degree of precision, in practice that would take far too long. The other issue is that each time you want an answer to a question regarding some as-yet unexplored point on the surface, you have to wait for the simulation to complete before you know the answer, limiting the speed with which you can run “what if” scenarios.
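A back-of-the-envelope sketch shows why brute-force mapping of the surface fails; the parameter names and counts below are invented for illustration:

```python
# Even a coarse grid of 10 trial values per design variable explodes
# combinatorially (all names and ranges invented for illustration)
grid = {
    "w1":    [1.0 + 0.1 * i for i in range(10)],
    "w2":    [2.0 + 0.1 * i for i in range(10)],
    "ibias": [10e-6 * (i + 1) for i in range(10)],
    "cload": [1e-12 * (i + 1) for i in range(10)],
    "rfb":   [1e3 * (i + 1) for i in range(10)],
    "vdd":   [1.0 + 0.05 * i for i in range(10)],
}
n_points = 1
for values in grid.values():
    n_points *= len(values)
print(f"{n_points:,} SPICE runs for a coarse grid")               # 1,000,000
print(f"{n_points / (60 * 24):,.0f} days at one minute per run")  # ~694 days
```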

Infiniscale has taken a different approach, one they call behavioral modeling. Their intent is to provide a more efficient means of optimizing analog, RF, and mixed-signal designs to improve yield. They still use simulation, but in a more limited manner, and in a way that allows the actual exploration of the design space to happen much more quickly.

First the range of each dimension is specified; the tool then automatically runs simulations at some number of points across the space and does a curve fit to create a mathematical model of the design-space surface. They don’t say exactly how they create the curves, but it’s apparently neither polynomial nor posynomial (I’m not even going to pretend I didn’t have to look that one up). Obviously, the more points simulated, the more accurate the model, but the longer it takes to generate. Users can then play with design scenarios, and no simulation has to be run for each point being explored: the point is simply calculated on the surface using the mathematical, or, as they call it, behavioral model.
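Since Infiniscale doesn't disclose the fitting method, the sketch below substitutes a generic Gaussian radial-basis-function interpolant over invented data; it illustrates the sample-simulate-fit-query workflow, not their actual math:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x):
    """Stand-in for a SPICE run returning one performance metric (invented)."""
    return np.sin(3 * x[..., 0]) + 0.5 * x[..., 1] ** 2

# 1) Sample the design space and "simulate" each point (the slow step)
X = rng.uniform(-1, 1, size=(200, 2))
y = simulate(X)

# 2) Curve-fit a surrogate to the samples (here: Gaussian RBF with a ridge term)
eps = 3.0  # kernel width, a tuning choice
K = np.exp(-eps * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)

def model(xq):
    """Query the fitted surface: microseconds per point instead of minutes."""
    Kq = np.exp(-eps * np.sum((xq[:, None, :] - X[None, :, :]) ** 2, axis=-1))
    return Kq @ w

# "What if" queries no longer need the simulator at all
probe = np.array([[0.2, -0.4]])
print("model:", model(probe), "simulator:", simulate(probe))
```

The more densely the space is sampled in step 1, the better the fit, which matches the accuracy-versus-setup-time tradeoff described above.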

This technology is part of their overall Lysis tool suite. The TechModeler tool is the one that builds the model. It can use the output of any SPICE-like simulator that provides comma-separated-values (CSV) output. It automates the process of setting up the parameter values for each simulation, launching the simulation, recording the results, and then building the model.
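In spirit, that automation is a loop like the one below; the simulator command line, netlist template, and file names are hypothetical stand-ins, not Infiniscale's actual interface:

```python
import csv
import itertools
import subprocess

# Hypothetical netlist template and parameter sweep (all names invented)
netlist_template = open("dut_template.sp").read()
sweep = {"w1": [1.0, 1.5, 2.0], "ibias": [10e-6, 20e-6]}

rows = []
for values in itertools.product(*sweep.values()):
    params = dict(zip(sweep, values))
    # 1) Set up the parameter values for this simulation
    with open("dut.sp", "w") as f:
        f.write(netlist_template.format(**params))
    # 2) Launch the SPICE-like simulator (hypothetical command line)
    subprocess.run(["spicesim", "dut.sp", "-o", "out.csv"], check=True)
    # 3) Record the results from the simulator's CSV output
    with open("out.csv") as f:
        result = next(csv.DictReader(f))
    rows.append({**params, **result})

# 'rows' now pairs parameter settings with measured performances,
# ready for the curve fit that builds the behavioral model.
```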

Once built, this model can be used by the TechAnalyzer tool, which provides Monte Carlo and sensitivity analysis. They claim that the equivalent of 100,000 runs can be handled in five seconds. A TechSizer tool is available to play with the sizing of components to identify corners and optimize performance. These tools have been on the market for up to two years.
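That speed claim is plausible because Monte Carlo against a behavioral model is just fast function evaluation. A minimal sketch, with an invented closed-form stand-in for the fitted model and invented process statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented closed-form stand-in for the fitted behavioral model
def model(x):
    return np.sin(3 * x[..., 0]) + 0.5 * x[..., 1] ** 2

# Draw 100,000 process samples around a nominal point (statistics invented)
nominal = np.array([0.2, -0.4])
sigma = np.array([0.05, 0.08])
samples = rng.normal(nominal, sigma, size=(100_000, 2))

perf = model(samples)            # all 100,000 "runs" in well under a second
spec_lo, spec_hi = 0.0, 1.2      # invented spec window
print(f"estimated yield: {np.mean((perf >= spec_lo) & (perf <= spec_hi)):.2%}")

# Crude sensitivity measure: correlation of each parameter with performance
for i, name in enumerate(["w1", "ibias"]):
    print(f"{name}: r = {np.corrcoef(samples[:, i], perf)[0, 1]:+.2f}")
```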

They have just released a new tool, TechYielder, which optimizes the distributions of multiple parameters to provide the highest yield. While individual design space points can be queried, with results in a few minutes, the tool can also automatically determine a global optimum that will place, if possible, the entire process distribution within the yielding range. Even if the distribution is already high-yielding, it will attempt to find a solution that better centers, narrows, and normalizes the distribution.
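Conceptually, that search looks for the nominal point whose surrounding process distribution keeps the most mass inside the spec window. The toy version below uses a generic random search over the same invented stand-ins as the previous sketch; it shows the shape of the problem, not Infiniscale's algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Same invented stand-ins as in the Monte Carlo sketch above
def model(x):
    return np.sin(3 * x[..., 0]) + 0.5 * x[..., 1] ** 2

sigma = np.array([0.05, 0.08])   # invented process spread
spec_lo, spec_hi = 0.0, 1.2      # invented spec window

def estimated_yield(nominal, n=20_000):
    """Monte Carlo yield estimate around a candidate nominal design point."""
    samples = rng.normal(nominal, sigma, size=(n, 2))
    perf = model(samples)
    return np.mean((perf >= spec_lo) & (perf <= spec_hi))

# Generic random search over candidate nominals (not Infiniscale's method):
# the winner is the point that keeps the most of the distribution in spec.
candidates = rng.uniform(-1, 1, size=(500, 2))
yields = np.array([estimated_yield(c) for c in candidates])
best = candidates[np.argmax(yields)]
print(f"best nominal: {best}, estimated yield: {yields.max():.2%}")
```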

Typical examples of blocks that can be optimized, in both size and content, are analog-to-digital converters, voltage-controlled oscillators, bandgap regulators, filters, SerDes circuits, and the like. The approach isn’t generally suited to full-chip optimization, since a full SoC will have multiple blocks, each with its own set of performance specs. Instead, each block of the SoC, or perhaps each library element, is optimized and then assembled into the SoC.

With increased ability to model the effects of process variations, designers can optimize their circuits before ever signing them off. Rather than yield enhancement engineers trying to tweak things here and there to improve yields – a process that would seem somewhat dicey with sensitive analog circuitry – the design can be centered by the person most familiar with the circuit. And if the block is a cell or library module, then the benefits of that optimization can accrue to every circuit using that block, leveraging the optimization work much further.

Reviewing some of the materials from the earlier products, it feels like the behavioral modeling technology started as a means of looking at different data points without having to run a lot of new simulations, with progressively more automation added since then. In today’s big picture, the TechYielder tool allows the design to be robust with respect to all the variation anticipated in the manufacturing process, while the TechAnalyzer and TechSizer tools provide robustness with respect to variation in the use environment. If they deliver as promised, that certainly simplifies the process of designing around the increasing number of variables that must be considered.
