
Faster Space Exploration

Infiniscale Uses Behavioral Modeling to Design for Analog Yield

In yet another installment of how life has gotten complicated in the design-for-manufacturing (or design-for-yield) world, we re-enter the world of the modern designer as contrasted with the designer of yore. Designers of old followed rules, and, assuming the design passed its checks on the way from design to manufacturing, the designer could give him- or herself a well-deserved pat on the back, release a satisfied sigh, and move on to the next project. What happened to the design after that was not the design engineer’s concern, because the design had escaped the realm of design. Testing was handled by test engineers, and yield enhancement was handled by – you guessed it – yield enhancement engineers.

Things aren’t quite so cozy anymore, and designers are being held responsible for designs that actually yield. The issue is no longer one of making sure that the peak of the distribution lies within the specified window; the trick now is trying to cram as much of the distribution as possible – hopefully all of it – inside that window. It’s bad enough with digital, where your basic parametrics are speed and power, but with analog, you may have any number of parameters that matter, relating to gain, phase, stability, and so on. Keeping the distributions of all those parameters tight and within spec is no trivial task, and the number of parameters affecting the optimization is growing out of control.
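To make that concrete, here’s a minimal sketch, in Python, of what “yield” means in this context: the fraction of the process distribution whose performance metrics all land inside their spec windows at the same time. The toy amplifier model, the amount of process variation, and the spec limits below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_amplifier(vth, beta):
    """Hypothetical stand-in for a simulated analog block: two metrics."""
    gain_db = 40.0 - 25.0 * (vth - 0.45) + 3.0 * (beta - 1.0)
    phase_margin = 60.0 - 80.0 * abs(beta - 1.0)
    return gain_db, phase_margin

# Process variation: each sample is one manufactured instance of the block.
vth = rng.normal(0.45, 0.02, 100_000)
beta = rng.normal(1.00, 0.05, 100_000)
gain, pm = toy_amplifier(vth, beta)

# Yield is the fraction of instances meeting *every* spec simultaneously.
passes = (gain > 39.0) & (gain < 41.0) & (pm > 55.0)
print(f"estimated yield: {passes.mean():.1%}")
```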

What we have here is a serious multi-variable optimization problem – multiple inputs and multiple outputs. And you thought scheduling airplanes was complicated. (Of course, they manage to tolerate much less robustness in their schedules, but I digress.) Conceptually, it’s a multi-dimensional surface, and you’re trying to find simultaneous global maxima/minima (optima?) without getting lulled into some local optimum that pushes other parameters out of whack.

What typically happens is that simulation is used in an iterative process to locate an optimal solution on this mathematical surface. The surface has a huge number of points (ok, technically infinite, but huge from a practical standpoint), and the entire space can be explored to decide which point is best. Each point can be determined by running a simulation with a given set of values for each of the dimensions. Of course, these simulations can take a while to run, so while theoretically you can map every point on the surface to some satisfactory degree of precision, in practice that would take far too long. The other issue is that each time you want an answer to a question regarding some as-yet unexplored point on the surface, you have to wait for the simulation to complete before you know the answer, limiting the speed with which you can run “what if” scenarios.
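A minimal sketch of that brute-force flow might look like the following, where every candidate point costs one full simulator run. The run_spice() stub, its per-run cost, and the three design parameters are invented; the point is simply how the number of simulations multiplies with grid resolution and dimensionality.

```python
import itertools
import time

def run_spice(point):
    """Stand-in for launching a real simulator: assume each run takes a while."""
    time.sleep(0.001)  # pretend simulation cost, scaled way down for the demo
    w, l, ibias = point
    return -(w - 2.0) ** 2 - (l - 0.5) ** 2 - (ibias - 10.0) ** 2  # toy figure of merit

# A coarse grid over three design parameters: 5 * 4 * 3 = 60 simulations.
grid = itertools.product(
    [1.0, 1.5, 2.0, 2.5, 3.0],  # transistor width (um)
    [0.3, 0.4, 0.5, 0.6],       # transistor length (um)
    [5.0, 10.0, 15.0],          # bias current (uA)
)
best = max(grid, key=run_spice)
print("best point on this coarse grid:", best)
```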

Infiniscale has taken a different approach, one they call behavioral modeling. Their intent is to provide a more efficient means of optimizing analog, RF, and mixed-signal designs to improve yield. They still use simulation, but in a more limited manner, and in a way that allows the actual exploration of the design space to happen much more quickly.

First, the range of each dimension is specified; the tool then automatically runs simulations at some number of points across those ranges and does a curve fit to create a mathematical model of the design space surface. They don’t say exactly how they create the curves, but it’s apparently not polynomial or posynomial (I’m not even going to pretend I didn’t have to look that one up). Obviously the more points simulated, the more accurate the model, but the longer it takes to generate. Users can then play with design scenarios, and for each point being explored, no simulation has to be run: the point is simply calculated from the surface using the mathematical, or as they call it, behavioral model.
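Conceptually, the flow looks something like the sketch below: sample the specified ranges, “simulate” each sample, fit a surrogate to those points, and then answer “what if” questions by evaluating the fitted surface instead of re-simulating. The radial-basis-function fit here is just one generic curve-fitting choice for illustration; as noted, Infiniscale doesn’t say what form their model actually takes.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

def expensive_simulation(x):
    """Stand-in for a SPICE run over two design parameters."""
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.1 * x[:, 0]

# 1. Sample the specified ranges and "simulate" each sample point.
samples = rng.uniform([0, 0], [2, 2], size=(40, 2))
responses = expensive_simulation(samples)

# 2. Curve-fit a behavioral model of the response surface.
model = RBFInterpolator(samples, responses)

# 3. Explore "what if" points by evaluating the model, not the simulator.
queries = np.array([[0.5, 1.0], [1.3, 0.2]])
print(model(queries))
```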

This technology is part of their overall Lysis tool suite. The TechModeler tool is the one that builds the model. It can use the output of any SPICE-like simulation tool that provides a comma-separated-values output. It automates the process of setting up the parameter values for each simulation, launching the simulation, recording the results, and then building the model.
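In spirit, that automation loop looks something like this sketch: write out a deck with the parameter values for one point, launch the simulator, parse its CSV output, and append a row to the table the curve fit is trained on. The simulator command line, netlist template, and output column names here are all hypothetical, not TechModeler’s actual interface.

```python
import csv
import subprocess
from pathlib import Path

def run_one_point(params: dict) -> dict:
    """Write a deck for one parameter set, run the simulator, parse its CSV."""
    deck = Path("point_deck.sp")
    deck.write_text(
        "* auto-generated deck\n"
        + "".join(f".param {name}={value}\n" for name, value in params.items())
        + ".include amplifier.sp\n.end\n"
    )
    out_csv = Path("point_results.csv")
    # Hypothetical command line; substitute the real simulator invocation here.
    subprocess.run(["spice_sim", str(deck), "-o", str(out_csv)], check=True)
    with open(out_csv, newline="") as fh:
        row = next(csv.DictReader(fh))  # e.g. columns: gain_db, pm_deg
    return {**params, **{k: float(v) for k, v in row.items()}}

# Sweep two parameters; the resulting table is what the curve fit is trained on.
table = [run_one_point({"wn": w, "ib": ib}) for w in (1.0, 2.0) for ib in (5.0, 10.0)]
```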

Once built, this model can then be used by the TechAnalyzer tool, which provides Monte Carlo and sensitivity analysis. They claim that the equivalent of 100,000 runs can be handled in five seconds. A TechSizer tool is available to play with the sizing of components to identify corners and optimize performance. These tools have been on the market for up to two years.
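The speed claim makes sense once the simulator is out of the loop: 100,000 evaluations of a fitted model are just array arithmetic. The sketch below illustrates the idea with an invented stand-in for the fitted model, made-up parameter distributions and spec, and a crude one-at-a-time sensitivity estimate; none of this reflects how TechAnalyzer actually works internally.

```python
import numpy as np

rng = np.random.default_rng(2)

def behavioral_model(X):
    """Cheap stand-in for a fitted behavioral model of one performance metric."""
    return np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.1 * X[:, 0]

# 100,000 Monte Carlo samples of two varying parameters (distributions made up).
X = np.column_stack([
    rng.normal(1.0, 0.05, 100_000),
    rng.normal(1.0, 0.10, 100_000),
])
y = behavioral_model(X)  # just array math, not 100,000 simulator runs
print("fraction meeting an invented spec (y > 0):", np.mean(y > 0.0))

# Crude one-at-a-time sensitivity: output change per unit change of each input.
nominal = np.array([[1.0, 1.0]])
for i in range(2):
    step = np.zeros((1, 2))
    step[0, i] = 0.01
    sens = (behavioral_model(nominal + step) - behavioral_model(nominal - step)) / 0.02
    print(f"sensitivity to parameter {i}: {sens[0]:+.3f}")
```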

They have just released a new tool, TechYielder, which optimizes the distributions of multiple parameters to provide the highest yield. While individual design space points can be queried, with results in a few minutes, the tool can also automatically determine a global optimum that will place, if possible, the entire process distribution within the yielding range. Even if the distribution is already high-yielding, it will attempt to find a solution that better centers, narrows, and normalizes the distribution.
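In the same spirit, design centering can be sketched as moving the nominal design point so that as much of the process distribution as possible lands inside the spec window. The toy model, spec limits, amount of variation, and coarse search below are invented for illustration; a real yield optimizer would be far more sophisticated than this.

```python
import numpy as np

rng = np.random.default_rng(3)

def behavioral_model(X):
    """Cheap stand-in for a fitted behavioral model of one performance metric."""
    return np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.1 * X[:, 0]

def estimated_yield(nominal, sigma=(0.05, 0.10), spec=(0.0, 0.5), n=20_000):
    """Fraction of the process spread around `nominal` that lands in the spec window."""
    X = rng.normal(nominal, sigma, size=(n, 2))
    y = behavioral_model(X)
    return np.mean((y > spec[0]) & (y < spec[1]))

# Coarse search over candidate nominal design points; a real tool would use
# something much smarter than exhaustive search over a grid.
candidates = [(a, b) for a in np.linspace(0.2, 1.8, 9) for b in np.linspace(0.2, 1.8, 9)]
best = max(candidates, key=estimated_yield)
print("recentered nominal point:", best, "estimated yield:", estimated_yield(best))
```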

Examples of the typical size and content of blocks that can be optimized include analog-to-digital converters, voltage-controlled oscillators, bandgap references, filters, SerDes circuits, and the like. The approach isn’t generally suited to full-chip optimization, since a full SoC will have multiple blocks, each with its own set of performance specs. Instead, each block of the SoC, or perhaps each library element, is optimized and then assembled into the SoC.

With increased ability to model the effects of process variations, designers can optimize their circuits before ever signing them off. Rather than yield enhancement engineers trying to tweak things here and there to improve yields – a process that would seem somewhat dicey with sensitive analog circuitry – the design can be centered by the person most familiar with the circuit. And if the block is a cell or library module, then the benefits of that optimization can accrue to every circuit using that block, leveraging the optimization work much further.

Reviewing some of the materials from the earlier product, it feels like the behavioral modeling technology originally started as a means of looking at different data points without having to do a lot of new simulations, with progressively more automation being added since then. Looking at today’s big picture with the current offering, the TechYielder tool allows the design to be robust with respect to all the variation anticipated in the manufacturing process, and the TechAnalyzer and TechSizer tools allow robustness with respect to the variation in the use environment. If they deliver as promised, it certainly simplifies the process of trying to design around the increasing number of variables that must be considered.
