
A Merger of Unequals

Magma Announces the Union of Analog and Digital in Titan

As the orks circled the tower in growing numbers, efforts to finish the weapon became increasingly frantic. The mechanical portion was almost complete: all of the strength and stress tests had passed, so the structure was ready to go. They had done practice shots with weights equivalent to the final payload, and distance and accuracy looked good. They fiddled a bit more with the pivots and joints to make sure that wear wouldn’t be excessive. But the real thing they were waiting for was the payload itself. This was a mystery concoction brewed up by some tall mysterious guy with a long beard and a pointy hat in one of the secure rooms near the top of the tower. They had no idea what it was or how he made it. They knew only that it would be poured into the carrier through some system of tubes. They had tested the tubes with water, but because they didn’t know the chemistry of the actual payload, they couldn’t be sure that viscosity wouldn’t be an issue, or even that the liquid wouldn’t react with the tubing or the carrier. And once the payload was ready, they had no time to run a whole new set of tests; the advancing hordes weren’t of the genteel sort that would put their attack on hold while T’s were being crossed. No, they simply had to assume that mystery dude knew how to brew the goo, put it in the shell, and hope it worked.

As we go through our engineering courses in school, certain patterns emerge. There’s a clean world of ones and zeros, Karnaugh maps, state machines. You decide what you want to do, create a digital model, and do it. In this highly deterministic world, the only thing separating you from certain success is how complex a problem you want to solve and how many all-nighters you want to pull solving it. Then there’s a shadowier world. One of two-ports and eigenvectors, Smith charts and convolutions, Laplace transforms and Nyquist criteria. Purple robes and misty rooms. All is not so simple here. You can’t just march out and create new things; you must first master the ancient lore of those who have come before. A blithe design, likely as not, will harbor disaster, as the wrong resonances may push it into instability or interfere with other parts of the circuit, or perhaps an evil green light will emanate, destroying all within its reach, if the wrong words are spoken. This is not for the faint of heart: anyone can figure out digital design, but those who master analog design, they are the respected, the venerated, the slightly suspect.

Digital designers have had to incorporate more analog concepts into their designs as speeds have increased, but for the most part digital and analog circuits are created by separate people in separate universes. The problem is, today’s system-on-chip (SoC) designs need both analog and digital, so now they have to come together on a single chip and play nicely together. There’s no opportunity for old-school integration, where the wizened guru touches different parts of the circuit to locate the problem and magically cures it by taping a newt to one of the power transistors. No, this has to work at 45 nm, hopefully the first time it’s built.

I would probably invite the wrath of a dozen EDA companies were I to suggest that designing tools for digital logic were easy. OK, so maybe it isn’t easy, but it’s, well, largely deterministic. There are rules. Yes, when you get into layout issues things get a bit messier, but with enough rules, you can pretty much ensure success. Tools for analog, on the other hand, must incorporate the accumulated wisdom of the Great Initiates. They must automate the ancient dark incantations. SPICE simulations figure much more heavily here, as third- and fourth-order effects may have an impact.

Then comes the challenge of marrying and verifying the digital and analog parts. There has been no way of verifying both the analog and digital portions simultaneously; one side or the other has to be “black-boxed” during simulation. And because each side is created in a different environment, small changes on either side have to be re-integrated, a task that can take a couple of days at best.

The other challenge with analog circuits is that, because they are generally highly hand-crafted, they are hard to migrate between technologies. There’s no quick optical shrink, there’s no TCL script to port some tried-and-true piece of analog IP down to the next technology node. While digital logic is diving down to the 45-nm node, analog is trudging through mud back at the 90- and 130-nm nodes. Chip finishing activities – those last details required to integrate and unify the final elements of the chip – may happen in a different environment, so there’s a disconnect between the analog circuit as it originally looked and as it looked after being brought into the full SoC. This disconnect makes it hard to bring the analog circuits forward as the underlying technology advances; the analog circuit ends up being redone largely from scratch.

Magma is trying to improve this scenario by unifying the design environment in what they call Titan. The first element of this is the creation of a single database containing all elements of the entire chip, whether analog or digital. Any changes made to either portion of the circuit will be visible to all the tools immediately – no conversion between domains is required. That single database, all of it accessible at any moment, can completely transform the ability to coordinate digital and analog activities and to correlate events and artifacts in any part of the circuit.
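To make the single-database idea concrete, here’s a minimal sketch of what “both domains share one store” means in principle. All of the names here (DesignDB, Net, the capacitance field) are hypothetical illustrations, not Magma’s actual data model or API:

```python
# Toy illustration of a unified design database: analog and digital tools
# operate on the same objects, so there is no export/import step between
# domains. Hypothetical names; not Magma's actual API.

class Net:
    """A net visible to both analog and digital tools."""
    def __init__(self, name):
        self.name = name
        self.capacitance_ff = 0.0  # written by extraction, read by all tools


class DesignDB:
    """Single store shared by every tool in the flow."""
    def __init__(self):
        self.nets = {}

    def net(self, name):
        # Returns the one shared object for this net, creating it on first use.
        return self.nets.setdefault(name, Net(name))


db = DesignDB()

# An "analog" tool annotates parasitics on a net...
db.net("vref").capacitance_ff = 12.5

# ...and a "digital" timing tool sees the change immediately, because both
# sides hold the same object, not a translated copy.
print(db.net("vref").capacitance_ff)
```

The point of the sketch is the identity of the objects: contrast this with the traditional flow, where each side works on its own converted copy and every small change triggers a re-integration pass.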

Another element of Titan is an initial focus on improving chip finishing, an often manual process, where the loop has historically been very long and small last-minute changes have been painful. The idea with Titan is that there is really no separate “integration” step, since the design pretty much starts out integrated from the beginning, and any finishing activities are applied with equal visibility to all portions of the circuit.

The routing tool also benefits – it can perform a global routing step, which gives a rough routing of the entire SoC – analog and digital. This is followed by a fine routing step, and here separate tools are used for digital and analog, but they operate on the same database. In order to accommodate the fact that analog circuits can have rather novel dimensions, the system is shape-based, not grid-oriented. Finally, but significantly, full-chip simulations can be performed with SPICE-level accuracy for the analog portions and Fast SPICE-level accuracy for the digital portion. Because of the unified database, elements in both analog and digital domains can be viewed at the same time.
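The two-phase flow described above can be sketched as follows. This is a hedged toy model under assumed names (global_route, Rect, and the fine-router functions are all illustrative, not Magma’s tools): one coarse pass over the whole SoC, then domain-specific fine routers that both emit shape-based geometry, with arbitrary off-grid coordinates, into one shared layout store:

```python
# Illustrative two-phase routing sketch: a single global router spans the
# whole SoC, then digital and analog fine routers write shapes (arbitrary
# rectangles, not grid cells) into the same database. Hypothetical names.

from dataclasses import dataclass


@dataclass
class Rect:
    """A shape-based route segment: coordinates need not sit on any grid."""
    layer: str
    x0: float
    y0: float
    x1: float
    y1: float


def global_route(nets):
    """Coarse, full-chip pass: every net, analog or digital, gets a region path."""
    return {net: ["R1", "R2"] for net in nets}


def fine_route_digital(net, layout):
    # Digital fine router: standard track widths, but still emits shapes.
    layout.setdefault(net, []).append(Rect("M2", 0.0, 0.0, 5.0, 0.07))


def fine_route_analog(net, layout):
    # Analog fine router: novel widths allowed, which is why the store
    # must be shape-based rather than grid-oriented.
    layout.setdefault(net, []).append(Rect("M2", 0.0, 1.0, 5.0, 1.35))


layout = {}  # the single shared database both fine routers write into
plan = global_route(["clk", "vtune"])
fine_route_digital("clk", layout)
fine_route_analog("vtune", layout)
```

The design choice worth noticing is that the two fine routers stay specialized, but neither owns a private copy of the layout; both read the same global plan and write the same store.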

The ultimate test in inviting analog to the same party as digital will be in allowing analog IP to capture the knowledge of the analog gurus through parameters and constraints. Reuse of analog IP is rare, and successful reuse will be a true indication that the mist over analog design will have lifted somewhat. The use of parameterized cells is an area that has community support in the IPL (Interoperable P-Cell Libraries) effort for digital cells, but analog becomes a new player in that effort. The extent to which analog IP can be successfully parameterized has yet to be proven. For the time being, while analog and digital realms can now coexist in the same universe, the analog magi are still critical in the process as tools try to turn black art into science.
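For a rough sense of what “capturing guru knowledge through parameters and constraints” might look like, here is a toy parameterized cell: a generator that turns designer intent into matched geometry, with an encoded analog rule of thumb that refuses to produce a bad instance. Everything here (the function, the constraint, the field names) is an invented illustration, not the IPL standard’s actual interface:

```python
# Hypothetical p-cell sketch: a differential-pair generator parameterized
# by W and L, with an analog matching constraint baked in. Illustrative
# only; not the IPL (Interoperable P-Cell Libraries) API.

def diff_pair_pcell(w_um, l_um, min_l_um=0.5):
    """Generate two matched transistors from parameters, enforcing a constraint."""
    if l_um < min_l_um:
        # Encoded "guru knowledge": short channels ruin matching in this cell.
        raise ValueError(f"L={l_um}um violates matching constraint (min {min_l_um}um)")
    # Identical device parameters, mirrored placement, for matching.
    return [
        {"name": "M1", "w_um": w_um, "l_um": l_um, "x_um": -w_um / 2},
        {"name": "M2", "w_um": w_um, "l_um": l_um, "x_um": +w_um / 2},
    ]


pair = diff_pair_pcell(w_um=10.0, l_um=1.0)
```

Whether real analog IP can be captured this way at scale is exactly the open question the article raises; the sketch only shows the shape of the idea: expertise moves out of the designer’s head and into the cell’s parameters and checks.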
