A Merger of Unequals

Magma Announces the Union of Analog and Digital in Titan

As the orks circled the tower in growing numbers, efforts to finish the weapon became increasingly frantic. The mechanical portion was almost complete: all of the strength and stress tests had passed, so the structure was ready to go. They had done practice shots with weights equivalent to the final payload, and distance and accuracy looked good. They fiddled a bit more with the pivots and joints to make sure that wear wouldn’t be excessive. But the real thing they were waiting for was the payload itself. This was a mystery concoction brewed up by some tall mysterious guy with a long beard and a pointy hat in one of the secure rooms near the top of the tower. They had no idea what it was or how he made it. They knew only that it would be poured into the carrier through some system of tubes. They had tested the tubes with water, but because they didn’t know the chemistry of the actual payload, they couldn’t be sure that viscosity wouldn’t be an issue, or even that the liquid wouldn’t react with the tubing or the carrier. And once the payload was ready, they had no time to run a whole new set of tests; the advancing hordes weren’t of the genteel sort that would put their attack on hold while T’s were being crossed. No, they simply had to assume that mystery dude knew how to brew the goo, put it in the shell, and hope it worked.

As we go through our engineering courses in school, certain patterns emerge. There’s a clean world of ones and zeros, Karnaugh maps, state machines. You decide what you want to do, create a digital model, and do it. In this highly deterministic world, the only thing separating you from certain success is how complex a problem you want to solve and how many all-nighters you want to pull solving it. Then there’s a shadowier world. One of two-ports and eigenvectors, Smith charts and convolutions, Laplace transforms and Nyquist criteria. Purple robes and misty rooms. All is not so simple here. You can’t just march out and create new things; you must first master the ancient lore of those who have come before. A blithe design, likely as not, will harbor disaster, as the wrong resonances may push it into instability or interfere with other parts of the circuit, or perhaps an evil green light will emanate, destroying all within its reach, if the wrong words are spoken. This is not for the faint of heart: anyone can figure out digital design, but those who master analog design, they are the respected, the venerated, the slightly suspect.

Digital designers have had to incorporate more analog concepts into their designs as speeds have increased, but for the most part digital and analog circuits are created by separate people in separate universes. The problem is, today’s system-on-chip (SoC) designs need both analog and digital, so now they have to come together on a single chip and play nicely together. There’s no opportunity for old-school integration, where the wizened guru touches different parts of the circuit to locate the problem and magically cures it by taping a newt to one of the power transistors. No, this has to work at 45 nm, hopefully the first time it’s built.

I would probably invite the wrath of a dozen EDA companies were I to suggest that designing tools for digital logic is easy. OK, so maybe it isn’t easy, but it’s, well, largely deterministic. There are rules. Yes, when you get into layout issues things get a bit messier, but with enough rules, you can pretty much ensure success. Tools for analog, on the other hand, must incorporate the accumulated wisdom of the Great Initiates. They must automate the ancient dark incantations. SPICE simulations figure much more heavily here, as third- and fourth-order effects may have an impact.

Then comes the challenge of marrying and verifying the digital and analog parts. Historically, there has been no way to verify both the analog and digital portions simultaneously: one side or the other has to be “black-boxed” during simulation. And because each side is created in a different environment, small changes on either side have to be re-integrated, a task that can take a couple of days at best.
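To make the black-boxing problem concrete, here is a toy sketch (invented for illustration, not any vendor’s API): in a traditional flow, the analog block is replaced by a crude behavioral model so the digital simulator can run without SPICE in the loop, and all of the real analog behavior disappears from view.

```python
# Toy illustration of "black-boxing" an analog block for digital simulation.
# The ADC below is reduced to ideal quantization; real analog effects
# (noise, nonlinearity, settling) are invisible to the digital testbench,
# which is exactly the verification gap the article describes.

def adc_behavioral(vin, vref=1.0, bits=8):
    """Behavioral stand-in for an analog ADC: ideal quantization only."""
    clamped = max(0.0, min(vin, vref))
    return int(clamped / vref * (2**bits - 1))

# The digital side sees only integer codes, never the analog waveform.
samples = [0.0, 0.25, 0.5, 1.0]
codes = [adc_behavioral(v) for v in samples]
print(codes)  # -> [0, 63, 127, 255]
```

Any mismatch between this idealized model and the real circuit only shows up in silicon, which is why simultaneous mixed-level simulation matters.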

The other challenge with analog circuits is that, because they are generally highly hand-crafted, they are hard to migrate between technologies. There’s no quick optical shrink, there’s no Tcl script to port some tried-and-true piece of analog IP down to the next technology node. While digital logic is diving down to the 45-nm node, analog is trudging through mud back at the 90- and 130-nm nodes. Chip finishing activities – those last details required to integrate and unify the final elements of the chip – may happen in a different environment, and so there’s a disconnect between the analog circuit as it originally looked and as it looked after being brought into the full SoC. This disconnect makes it hard to bring the analog circuits forward as the underlying technology advances; the analog circuit largely ends up being redone from scratch.

Magma is trying to improve this scenario by unifying the design environment in what they call Titan. The first element of this is the creation of a single database containing all elements of the entire chip, whether analog or digital. Any changes made to either portion of the circuit are visible to all the tools immediately – no conversion between domains is required. That single database, accessible at any moment, transforms the ability to coordinate digital and analog activities and to correlate events and artifacts across any part of the circuit.
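The single-database idea can be sketched as a simple observer pattern. This is a hedged conceptual sketch – all of the names here are invented, not Magma’s actual API – showing the key property: one store holds both analog and digital objects, and every registered tool is notified of an edit at the moment it happens, with no export/import step between domains.

```python
# Conceptual sketch of a unified mixed-signal design database.
# Every tool registers a callback; any edit, analog or digital,
# is pushed to all tools immediately -- no domain conversion.

class UnifiedDB:
    def __init__(self):
        self.objects = {}     # name -> {"domain": ..., "props": {...}}
        self.listeners = []   # tools observing the database

    def register(self, tool_callback):
        self.listeners.append(tool_callback)

    def update(self, name, domain, **props):
        obj = self.objects.setdefault(name, {"domain": domain, "props": {}})
        obj["props"].update(props)
        for notify in self.listeners:  # all tools see the change at once
            notify(name, domain, props)

db = UnifiedDB()
events = []
db.register(lambda name, dom, p: events.append(("router", name, dom)))
db.register(lambda name, dom, p: events.append(("simulator", name, dom)))

# A single analog edit is instantly visible to both tools:
db.update("bias_gen", "analog", width_um=2.0)
print(events)  # -> [('router', 'bias_gen', 'analog'), ('simulator', 'bias_gen', 'analog')]
```

Contrast this with the traditional flow, where the same edit would be exported from the analog environment and re-imported into the digital one before any other tool could see it.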

Another element of Titan is an initial focus on improving chip finishing, an often manual process, where the loop has historically been very long and small last-minute changes have been painful. The idea with Titan is that there is really no separate “integration” step, since the design pretty much starts out integrated from the beginning, and any finishing activities are applied with equal visibility to all portions of the circuit.

The routing tool also benefits – it can perform a global routing step, which gives a rough routing of the entire SoC – analog and digital. This is followed by a fine routing step, where separate tools are used for digital and analog, but they operate on the same database. To accommodate the novel dimensions common in analog circuits, the system is shape-based, not grid-oriented. Finally, but significantly, full-chip simulations can be performed with SPICE-level accuracy for the analog portions and Fast SPICE-level accuracy for the digital portion. Because of the unified database, elements in both analog and digital domains can be viewed at the same time.
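The shape-based distinction is worth a small illustration. In a grid-based router, every wire must snap to routing-track cells; in a shape-based one, geometry carries real coordinates, so a wide, oddly sized analog trace and a minimum-width digital route can be checked against each other directly. The sketch below (values and names are illustrative, not from any real rule deck) computes edge-to-edge spacing between two arbitrary rectangles:

```python
# Why shape-based helps analog: shapes have real coordinates, so a
# spacing check is pure geometry -- no snapping odd analog widths
# onto a fixed routing grid.

def spacing(r1, r2):
    """Minimum edge-to-edge distance between two axis-aligned rectangles,
    each given as (x1, y1, x2, y2); returns 0.0 if they touch or overlap."""
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0.0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0.0)
    return (dx**2 + dy**2) ** 0.5

digital_wire = (0.0, 0.0, 10.0, 0.07)   # minimum-width digital route (um)
analog_wire = (0.0, 0.21, 10.0, 1.36)   # wide, off-grid analog trace (um)
print(round(spacing(digital_wire, analog_wire), 3))  # -> 0.14
```

The spacing result would then be compared against the technology’s minimum-spacing rule; the point is that nothing forced the 1.15-µm-wide analog trace onto a digital routing grid.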

The ultimate test of inviting analog to the same party as digital will be whether analog IP can capture the knowledge of the analog gurus through parameters and constraints. Reuse of analog IP is rare, and successful reuse will be a true indication that the mist over analog design has lifted somewhat. The use of parameterized cells is an area that has community support in the IPL (Interoperable P-Cell Libraries) effort for digital cells, but analog becomes a new player in that effort. The extent to which analog IP can be successfully parameterized has yet to be proven. For the time being, while analog and digital realms can now coexist in the same universe, the analog magi are still critical in the process as tools try to turn black art into science.
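What “capturing guru knowledge through parameters and constraints” might look like can be sketched as a hypothetical parameterized cell. Everything below – the function name, the constraint values, the netlist style – is invented for illustration; the point is that retargeting becomes a parameter change with the expert’s rules enforced automatically, rather than a hand redraw.

```python
# Hypothetical analog p-cell: a differential pair whose generator
# encodes the guru's constraints (minimum length for the node,
# maximum W/L before matching suffers). All values are illustrative.

def diffpair_pcell(w_um, l_um, l_min_um=0.13, max_ratio=200.0):
    """Generate a SPICE-style netlist for a differential pair,
    refusing parameter combinations that violate the captured rules."""
    if l_um < l_min_um:
        raise ValueError(f"L={l_um}um is below the node minimum {l_min_um}um")
    if w_um / l_um > max_ratio:
        raise ValueError("W/L too large: device matching will suffer")
    return (f"M1 outp inp tail 0 nmos W={w_um}u L={l_um}u\n"
            f"M2 outn inn tail 0 nmos W={w_um}u L={l_um}u")

# Retargeting to a finer node becomes a parameter change, not a redraw:
print(diffpair_pcell(w_um=10.0, l_um=0.5))
```

Whether real analog IP can be parameterized this cleanly – across corners, parasitics, and layout-dependent effects – is exactly the open question the article raises.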
