
Analog at 20

Cadence’s Virtuoso for Advanced Nodes

Analog designers bear a heavier burden than many other designers. If you’re a digital guy, someone is out there creating cells for you, abstracting away the nasty bits so that you can operate unsullied in a land of make-believe that, magically, seems to work.

No such luck for the analog engineer. They have to do a lot of the heavy lifting on their own. And this is largely of their own choosing, since “trust” doesn’t come easily to analog folks: it’s too easy for someone else to make their design look bad.

This has significant implications when major change happens. Like what’s happening in the move to 20nm and beyond. We’ve got new transistors, we’ve got double patterning, and we’ve got new rules. OK, I guess that last one isn’t new; we get new rules every node, but you put all of these together, and it can be a mess. Digital designers have lots of people scurrying around to clean up that mess before they get involved; analog engineers less so.

Three topics in particular affect analog engineers, and they are areas of focus in Cadence’s latest Virtuoso release. We’ll address two of the topics here, while reserving the third – local interconnect – for its own discussion in the future.

An evolutionary change is the increased importance of proximity effects. You’ve probably dealt with them before, since this isn’t the first node requiring their consideration. “Well proximity” and “length of diffusion” effects have been known for a long time; poly spacing and active area spacing started becoming important at the 40nm node, according to Cadence.

If you haven’t dealt with proximity effects before, they affect layout spacing. Actually – that’s not true: it’s the reverse, because these aren’t DRC rules. DRC rules affect layout spacing: if you violate the rule, you have to change the spacing. With proximity effects, however, the spacing affects the design.

The idea is that, with certain items, placing them in proximity – even if they’re unrelated in any functional way – will have an impact on performance. They won’t necessarily cause things to fail outright, but you might notice that gain or speed or something isn’t what you expected it to be. In other words, proximity has to be figured into the simulation in order to correctly model the circuit.

This can be a problem, since most circuit simulation is done prior to layout. You tune your circuit, simulating conscientiously, before handing it to the layout guy. The layout guy then does things that can now change the results of that simulation, creating an extra design loop.
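To make that feedback loop concrete, here’s a toy model of how a well-proximity effect might be folded into simulation: a threshold-voltage shift that decays with the device’s distance from the well edge. The exponential form and the coefficients are invented for illustration; real values come from the foundry’s models.

```python
import math

# Toy model: well-proximity effect (WPE) as a threshold-voltage shift
# that decays with the device's distance from the well edge.
# The coefficients are made up for illustration, not foundry data.

def wpe_vt_shift_mv(distance_um, k_mv=5.0, decay_um=1.0):
    """Approximate Vt shift (mV) for a device distance_um from the well edge."""
    return k_mv * math.exp(-distance_um / decay_um)

# A device 0.2 um from the well edge sees a much larger shift than one
# 2 um away, so moving it during layout changes the behavior the
# circuit was originally simulated and tuned against.
near_shift = wpe_vt_shift_mv(0.2)
far_shift = wpe_vt_shift_mv(2.0)
```

The point isn’t the particular formula; it’s that a purely geometric layout decision shows up as a parameter change inside the simulation models.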

But, even for the layout guy, the design isn’t completed all at once – it might not even be done entirely by the same person. Given that layout can affect design performance, it helps to get an early read on portions of the layout rather than waiting until the entire thing is done. So Cadence has enabled what they call “partial layout” simulation.

Rather than functional simulation – which is run from the schematic – or post-layout simulation – which uses a layout description of the design, partial layout simulation allows a netlist that contains both schematic and layout elements. This means you can lay out a particular chunk of the circuit and then re-simulate to identify any effects caused by the layout and address them early (either in the design or by changing the layout).
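In spirit, building such a mixed netlist is just a per-block choice: use the extracted (post-layout) view where layout exists, and fall back to the schematic view everywhere else. A minimal sketch of that selection logic, with hypothetical block and view names rather than Virtuoso’s actual netlisting machinery:

```python
# Sketch of partial-layout netlisting: for each block, prefer the
# extracted (post-layout) view if one exists, otherwise fall back to
# the schematic view. Block and view names here are hypothetical.

schematic_views = {"bias": "bias.sch", "ota": "ota.sch", "out_stage": "out.sch"}
extracted_views = {"ota": "ota.extracted"}  # only the OTA is laid out so far

def partial_netlist(schematic, extracted):
    """Build a mixed netlist: layout data where available, schematic elsewhere."""
    return {blk: extracted.get(blk, view) for blk, view in schematic.items()}

netlist = partial_netlist(schematic_views, extracted_views)
# The OTA now simulates with layout effects; the rest is still pre-layout.
```

This is why laying out the critical blocks first pays off: their layout-induced effects show up in simulation while the rest of the circuit is still easy to change.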

Why is this an analog issue? Well, it’s more accurately a custom issue. Cell designers will run into this too, but they’re filling out a library. An analog designer is creating a final circuit and, unlike their digital counterparts, will need to take this into account.

The other bit that’s new and important is coloring. At 20nm, double patterning is now a reality, for Metal 1 at least. This creates a couple of issues. The first is the need to decompose a single layer into two partial layers. This has to be done in a way that resolves all of the critical spacings that can’t be handled in a single exposure.

We’ve talked about coloring before (although I haven’t seen stitches involved in any of the more current solutions). Early on, the foundries wanted control of the coloring. So they would take your single metal mask and decompose it into two. But for analog designs, you might want to have some input, since it makes a difference for matching, for example. Likewise, cell designers will do much of their own coloring, which will be incorporated into digital circuits.

The idea is that you constrain only as much color information as necessary and then let the foundry do the rest. That might mean coloring specific nets a specific color, or it might simply mean stating that two nets should be the same color (without caring which color that is). The latter might be useful in cells: it ensures that the cell can be decomposed properly while preserving flexibility for placement next to other cells. That placement might create color conflicts, and swapping all of the colors in one of the cells may fix them.
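As a sketch of what those constraints mean, here’s a brute-force enumeration over two masks (call them A and B) with one fixed-color constraint and one same-color constraint. The net names are hypothetical; a real tool would propagate constraints rather than enumerate.

```python
from itertools import product

# Hypothetical nets: 'clk' is pinned to mask A; the differential pair
# 'inp'/'inn' must share a mask for matching, but either mask will do.
fixed = {"clk": "A"}
same = [("inp", "inn")]

def assignments(nets, fixed, same):
    """Enumerate two-mask assignments honoring the constraints (brute force)."""
    for combo in product("AB", repeat=len(nets)):
        asgn = dict(zip(nets, combo))
        if all(asgn[n] == m for n, m in fixed.items()) and \
           all(asgn[a] == asgn[b] for a, b in same):
            yield asgn

nets = ["clk", "inp", "inn"]
solutions = list(assignments(nets, fixed, same))
# 'clk' is pinned, and the pair can sit together on A or on B,
# so two valid decompositions remain for the foundry to choose between.
```

The leftover freedom is exactly what the foundry (or the placer) uses to resolve conflicts you didn’t constrain.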

The practical issue with this is where cycles (or loops) are created. These are unresolvable coloring problems (which in theory could be fixed by stitching but are realistically handled by tweaking the layout to remove the problem). The challenge is that, when coloring a particular net, you may end up inadvertently creating a cycle that you would never have suspected – and it can be pretty long. So real-time checks ensure that you can see such cycles right when you create them.

Of course, when a cycle is present, the tool can’t tell you what to fix. A cycle is a collection of coloring conflicts; fix any one of them and the problem goes away. So if you create a new cycle with a new layout change, it doesn’t mean that the new change is the problem: you may elect to leave that in place and change some other part of the layout to eliminate the cycle.
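Underneath, mask assignment is graph two-coloring: shapes (or nets) are nodes, and each spacing too tight for a single mask is an edge forcing different colors. With two masks, the unresolvable cycles are the odd-length ones, and deleting any single edge of such a cycle (by re-spacing that one pair) restores colorability, which matches the fix-any-one-conflict observation above. A minimal bipartiteness check along those lines, with made-up node names:

```python
from collections import deque

def two_colorable(nodes, edges):
    """Return a color map if the conflict graph is 2-colorable, else None."""
    adj = {n: [] for n in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    color = {}
    for start in nodes:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]  # neighbors must take the other mask
                    queue.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle: unresolvable with two masks

    return color

# An even cycle of conflicts colors fine; an odd cycle does not,
# and removing any one of its edges makes it colorable again.
even_ok = two_colorable("abcd", [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
odd_bad = two_colorable("abc", [("a", "b"), ("b", "c"), ("c", "a")])
odd_fixed = two_colorable("abc", [("a", "b"), ("b", "c")])
```

This is also why a cycle can be “pretty long” and surprising: the conflicting edges can snake through shapes that have nothing to do with each other functionally.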

The other problem that double patterning creates is that it makes it harder to account for variation. Let’s say you have double patterning for Metal 1. Ordinarily, each metal layer is printed with a single exposure, and you account for that exposure’s variation. But now Metal 1 has two exposures, and two chances to introduce errors (or variation). So you have variation within Metal 1, between its two exposures, and then from each of the Metal 1 exposures to Metal 2. It gets worse if Metal 2 is also double-patterned. And if you’re letting the foundry do the coloring, you don’t know what will end up on each color, so you can’t explicitly account for the variation.

Cadence takes a conservative approach to this, which more or less means assuming that you could have the full double-patterning error anywhere. The more you know about where the colors will be, the more you can tighten things down without over-designing. Here, the burden of doing it yourself as an analog designer works in your favor: you know more accurately how things will work, and so you can dial things in better.
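A back-of-the-envelope budget shows the payoff of knowing the colors: if two Metal 1 wires might land on different masks, the mask-to-mask overlay term has to be included in their spacing variation; if you know they share a mask, it drops out. The numbers below are invented for illustration (nm, 1-sigma), not foundry data:

```python
import math

# Illustrative 1-sigma numbers, not real process data.
within_mask_nm = 1.0  # variation within a single exposure
overlay_nm = 2.0      # overlay error between the two Metal 1 exposures

def spacing_sigma_nm(same_mask):
    """Root-sum-square variation of the spacing between two wires."""
    terms = [within_mask_nm, within_mask_nm]
    if not same_mask:
        terms.append(overlay_nm)  # conservative: overlay error might apply
    return math.sqrt(sum(t * t for t in terms))

conservative = spacing_sigma_nm(same_mask=False)  # coloring unknown
matched = spacing_sigma_nm(same_mask=True)        # pair pinned to one mask
```

Pinning a matched pair to one mask buys a visibly tighter budget, which is the “dial things in” advantage of doing your own coloring.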

 

More info:

Cadence’s Virtuoso release

Cadence Virtuoso

