feature article

A Merger of Unequals

Magma Announces the Union of Analog and Digital in Titan

As the orks circled the tower in growing numbers, efforts to finish the weapon became increasingly frantic. The mechanical portion was almost complete: all of the strength and stress tests had passed, so the structure was ready to go. They had done practice shots with weights equivalent to the final payload, and distance and accuracy looked good. They fiddled a bit more with the pivots and joints to make sure that wear wouldn’t be excessive. But the real thing they were waiting for was the payload itself. This was a mystery concoction brewed up by some tall mysterious guy with a long beard and a pointy hat in one of the secure rooms near the top of the tower. They had no idea what it was or how he made it. They knew only that it would be poured into the carrier through some system of tubes. They had tested the tubes with water, but because they didn’t know the chemistry of the actual payload, they couldn’t be sure that viscosity wouldn’t be an issue, or even that the liquid wouldn’t react with the tubing or the carrier. And once the payload was ready, they had no time to run a whole new set of tests; the advancing hordes weren’t of the genteel sort that would put their attack on hold while T’s were being crossed. No, they simply had to assume that mystery dude knew how to brew the goo, put it in the shell, and hope it worked.

As we go through our engineering courses in school, certain patterns emerge. There's a clean world of ones and zeros, Karnaugh maps, state machines. You decide what you want to do, create a digital model, and do it. In this highly deterministic world, the only thing separating you from certain success is how complex a problem you want to solve and how many all-nighters you want to pull solving it. Then there's a shadowier world. One of two-ports and eigenvectors, Smith charts and convolutions, Laplace transforms and Nyquist criteria. Purple robes and misty rooms. All is not so simple here. You can't just march out and create new things; you must first master the ancient lore of those who have come before. A blithe design, likely as not, will harbor disaster, as the wrong resonances may push it into instability or interfere with other parts of the circuit, or perhaps an evil green light will emanate, destroying all within its reach, if the wrong words are spoken. This is not for the faint of heart: anyone can figure out digital design, but those who master analog design, they are the respected, the venerated, the slightly suspect.

Digital designers have had to incorporate more analog concepts into their designs as speeds have increased, but for the most part digital and analog circuits are created by separate people in separate universes. The problem is, today’s system-on-chip (SoC) designs need both analog and digital, so now they have to come together on a single chip and play nicely together. There’s no opportunity for old-school integration, where the wizened guru touches different parts of the circuit to locate the problem and magically cures it by taping a newt to one of the power transistors. No, this has to work at 45 nm, hopefully the first time it’s built.

I would probably invite the wrath of a dozen EDA companies were I to suggest that designing tools for digital logic was easy. OK, so maybe it isn’t easy, but it’s, well, largely deterministic. There are rules. Yes, when you get into layout issues things get a bit messier, but with enough rules, you can pretty much ensure success. Tools for analog, on the other hand, must incorporate the accumulated wisdom of the Great Initiates. They must automate the ancient dark incantations. SPICE simulations figure much more heavily here, as third- and fourth-order effects may have an impact.

Then comes the challenge of marrying and verifying the digital and analog parts. Historically, there has been no way to verify the analog and digital portions simultaneously: one side or the other has to be "black-boxed" during simulation. And because each side is created in a different environment, small changes on either side have to be re-integrated, a task that can take a couple of days at best.

The other challenge with analog circuits is that, because they are generally highly hand-crafted, they are hard to migrate between technologies. There's no quick optical shrink, and no Tcl script to port some tried-and-true piece of analog IP down to the next technology node. While digital logic is diving down to the 45-nm node, analog is trudging through mud back at the 90- and 130-nm nodes. Chip finishing activities – those last details required to integrate and unify the final elements of the chip – may happen in a different environment, and so there's a disconnect between the analog circuit as it originally looked and as it looked after being brought into the full SoC. This disconnect makes it hard to bring the analog circuits forward as the underlying technology advances; the analog circuit largely ends up being redone from scratch.

Magma is trying to improve this scenario by unifying the design environment in what they call Titan. The first element of this is the creation of a single database containing all elements of the entire chip, whether analog or digital. Any changes made to either portion of the circuit will be visible to all the tools immediately – no conversion between domains is required. That single database, accessible at any moment, transforms the ability to coordinate digital and analog activities and to correlate events and artifacts in any part of the circuit.
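Titan's internals are not public, but the single-database idea can be illustrated with a minimal, purely hypothetical sketch: all chip elements, analog and digital alike, live in one store, and any tool registered against it sees a change the moment it is made, with no export/re-import step between domains. All class and field names here are invented for illustration.

```python
# Hypothetical sketch of a unified design database: analog and digital
# elements share one store, so every tool sees changes immediately.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    domain: str                              # "analog" or "digital"
    properties: dict = field(default_factory=dict)

class DesignDB:
    """Single database holding the entire chip; no per-domain copies."""
    def __init__(self):
        self._elements = {}
        self._listeners = []                 # tools wanting change notifications

    def register_tool(self, callback):
        self._listeners.append(callback)

    def update(self, element):
        # One write updates the shared store; every registered tool
        # (router, simulator, finishing) is notified at once -- there is
        # no conversion step between analog and digital views.
        self._elements[element.name] = element
        for notify in self._listeners:
            notify(element)

    def view(self, domain=None):
        # Any tool can see either domain, or the whole chip, at any moment.
        return [e for e in self._elements.values()
                if domain is None or e.domain == domain]

db = DesignDB()
seen = []
db.register_tool(lambda e: seen.append(e.name))
db.update(Element("bias_gen", "analog", {"w_um": 2.0}))
db.update(Element("ctrl_fsm", "digital", {"cells": 1200}))
```

The point of the sketch is the absence of a translation layer: a change to `bias_gen` is not exported from an analog environment and re-imported into a digital one; it is simply written once and observed everywhere.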

Another element of Titan is an initial focus on improving chip finishing, an often manual process, where the loop has historically been very long and small last-minute changes have been painful. The idea with Titan is that there is really no separate “integration” step, since the design pretty much starts out integrated from the beginning, and any finishing activities are applied with equal visibility to all portions of the circuit.

The routing tool also benefits: it can perform a global routing step that gives a rough routing of the entire SoC, analog and digital. A fine routing step follows, with separate tools for digital and analog, but both operate on the same database. To accommodate the fact that analog circuits can have rather novel dimensions, the system is shape-based, not grid-oriented. Finally, but significantly, full-chip simulations can be performed with SPICE-level accuracy for the analog portions and Fast SPICE-level accuracy for the digital portion. Because of the unified database, elements in both the analog and digital domains can be viewed at the same time.
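The grid-based/shape-based distinction can be made concrete with a small hypothetical sketch (not Magma's algorithm): a grid router forces every coordinate onto a fixed pitch, distorting geometry that doesn't land on the grid, while a shape-based router reasons about arbitrary rectangles directly.

```python
# Hypothetical sketch contrasting grid-based and shape-based routing checks.
# Analog geometry rarely lands on a uniform grid, so a shape-based router
# works with arbitrary rectangles instead of grid cells.

def snap_to_grid(x, pitch):
    """Grid router: coordinates must land on multiples of the routing pitch."""
    return round(x / pitch) * pitch

def rects_overlap(a, b):
    """Shape-based check: rectangles (x0, y0, x1, y1) with any dimensions."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# An analog resistor body with non-grid dimensions...
analog_shape = (0.0, 0.0, 1.37, 0.61)
# ...and a candidate digital route nearby.
route = (1.50, 0.0, 1.60, 2.0)

snapped = snap_to_grid(1.37, pitch=0.5)         # grid view moves the edge to 1.5
clear = not rects_overlap(analog_shape, route)  # shape view keeps true geometry
```

In the grid view, the resistor's 1.37 µm edge snaps out to 1.5 µm and appears to abut the route; the shape-based check sees the true geometry and correctly reports clearance.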

The ultimate test of inviting analog to the same party as digital will be whether analog IP can capture the knowledge of the analog gurus through parameters and constraints. Reuse of analog IP is rare, and successful reuse will be a true indication that the mist over analog design has lifted somewhat. Parameterized cells already have community support in the IPL (Interoperable P-Cell Libraries) effort for digital cells, but analog becomes a new player in that effort. The extent to which analog IP can be successfully parameterized has yet to be proven. For the time being, while the analog and digital realms can now coexist in the same universe, the analog magi remain critical to the process as tools try to turn black art into science.
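What "capturing guru knowledge in parameters and constraints" might look like can be sketched with a toy p-cell generator; this is an invented illustration, not any vendor's p-cell API, and the device sizes and constraint values are made up. The idea is that sizing rules and matching lore are encoded once, so retargeting to a new node becomes a parameter change rather than a redraw.

```python
# Hypothetical sketch of a parameterized analog cell (p-cell): the designer's
# sizing knowledge is captured as parameters and constraints so the cell can
# be regenerated at a new technology node instead of redrawn by hand.

def diffpair_pcell(w_um, l_um, min_l_um=0.13, max_ratio=200.0):
    """Generate a differential-pair description, enforcing designer constraints.

    w_um, l_um -- requested transistor width/length in microns
    min_l_um   -- process minimum length (changes per node)
    max_ratio  -- W/L ceiling encoding matching/headroom lore
    """
    if l_um < min_l_um:
        raise ValueError(f"L={l_um}um below process minimum {min_l_um}um")
    if w_um / l_um > max_ratio:
        raise ValueError("W/L exceeds constraint for acceptable matching")
    # Two identical devices; matching intent is baked into the generator.
    return {
        "devices": [{"name": n, "w_um": w_um, "l_um": l_um}
                    for n in ("M1", "M2")],
        "constraints": {"match": ["M1", "M2"], "symmetry": "common-centroid"},
    }

# Retargeting to a finer node is a parameter change, not a redraw:
cell_130nm = diffpair_pcell(w_um=10.0, l_um=0.26, min_l_um=0.13)
cell_45nm = diffpair_pcell(w_um=4.0, l_um=0.10, min_l_um=0.045)
```

The generator refuses to emit a cell that violates its embedded constraints, which is one plausible way the "ancient lore" could survive the guru's absence.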
