
Power’s Progress

Cadence Looks at Migration; Baum Looks at Emulation

Speed must be feeling pretty unloved these days. I mean, let’s face it: there was a time when the mettle of any decent chip was gauged by how fast it could go. Having a performance issue? Aw, just juice it up a little – there’s plenty more current where that came from!

Then it became harder to sacrifice power for anything – it was no longer free. It wasn’t expensive, but it had to be considered, much to timing’s annoyance. Now timing couldn’t get what it wanted anytime just because it wanted it. It had to prove that the extra power was really worth it. Or perhaps some redesign was necessary. Or re-synthesis with new constraints.

Timing was still first fiddle, but there was a second-fiddle upstart that folks were starting to listen to. Timing suspected that it was but a matter of time before power challenged timing for first position.

And then cooling got to be an issue. And batteries happened more than ever before. And data centers started needing their own personal nuclear power plants (figuratively speaking – or is it?) in order to remain viable. Now power wasn’t just threatening first position; it was woodshedding like a demon, honing its skills so that, come challenge time, there would be no question.

The Great Migration

And now? Geez, yeah, performance is still a goal, but power is now practically a solo performer, with everyone else relegated to the role of backup, doing the electronic equivalent of the doo-ops (not to be confused with the no-ops). Exhibit A as evidence? The fact that, according to Cadence in a conversation at DAC, power sign-off at the 7-nm node is now taking longer than timing sign-off. Which also suggests that a burst of energy is being put into power-analysis tools.

Granted, few people are designing at 7 nm (and, as we’ll discuss in the future, at least one foundry is placing 7 nm on hold), but it’s not as if it’s never going to happen. And, if the lurch to the next node is, in fact, taking a breather, perhaps it’s a chance for tools to catch up.

Now, straight extrapolation would suggest that the reason it’s so hard to achieve power sign-off is simply that power analysis takes too long or that fixes are tedious to do. Underlying that, the obvious problem might seem to be that power is just too high. But, again according to Cadence, that’s not the issue. I mean, OK, maybe it’s an issue, but it’s not the main one. The main one is: electromigration.

It’s turning out that, at 7 nm, even signal wires are subject to electromigration. As a result, the electromigration rules are incredibly complex. As a further result, analysis is backing off from the full-chip level and focusing more on individual blocks. Analysis can be run more quickly (Cadence was speaking here of its Voltus (digital) and Voltus-Fi (analog) power-analysis tools), with fixes applied automatically during place-and-route (leaving only a few manual chip-level fixes).
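To give a flavor of what the most basic of those checks boils down to, here’s a minimal sketch in Python. The wire dimensions, current, and limit are all made up for illustration, and the real 7-nm rule decks are vastly more involved; the point is just the shape of the per-wire check: current density versus an electromigration limit for the wire’s metal layer.

```python
# Illustrative sketch only -- not Cadence's rule set. It checks a wire's RMS
# current density against an assumed electromigration (EM) limit, the kind of
# per-net check that proliferates when even signal wires need EM sign-off.

def current_density(i_rms_amps: float, width_nm: float, thickness_nm: float) -> float:
    """Current density in A/cm^2 for a rectangular wire cross-section."""
    area_cm2 = (width_nm * 1e-7) * (thickness_nm * 1e-7)  # nm -> cm
    return i_rms_amps / area_cm2

# Assumed numbers, purely for illustration: a narrow 7-nm-class signal wire
# and an invented EM limit for its metal layer.
EM_LIMIT_A_PER_CM2 = 1.5e6

j = current_density(i_rms_amps=1e-6, width_nm=20, thickness_nm=40)
print(f"J = {j:.3g} A/cm^2 -> {'violation' if j > EM_LIMIT_A_PER_CM2 else 'ok'}")
```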

The Great Emulation

One way of accelerating simulation is through emulation. According to Baum in a separate DAC conversation, customers are keenly interested in running power analysis alongside not just simulation, but also emulation. And emulation is becoming more popular because it allows designers to see what the power profile of their chip will be when running specific software. Simulators can’t run software (OK, technically speaking, they can – if you have a few years to await the results); emulators can (if not application software, then, at the very least, low-level driver software).

Running power analysis with simulation is easy (as such things go) because there’s a simulation API that allows the simulator and the power analyzer to tag-team the problem in real time. As the simulator runs, it can feed info to the power analysis tool. Alas, says Baum, there is no corresponding API in an emulator, and so they must work on a file-exchange basis.
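As a rough picture of what that tag-teaming looks like, here’s a toy sketch. The callback hook and the PowerModel class are hypothetical, not any simulator’s actual interface; real couplings go through mechanisms such as VPI/VHPI value-change callbacks or a vendor power API. The idea is simply that the simulator reports each transition as it happens, and the power model accumulates energy on the fly.

```python
# A toy sketch of the tag-team idea, not any particular simulator's API.
# The register_value_change_callback() hook and PowerModel class are hypothetical;
# real couplings use interfaces such as VPI/VHPI callbacks or a vendor power API.

class PowerModel:
    """Accumulates switching energy as the simulator reports signal toggles."""

    def __init__(self, cap_per_net_f, vdd=0.75):
        self.cap = cap_per_net_f          # net name -> effective capacitance (F)
        self.vdd = vdd                    # supply voltage (V), assumed value
        self.energy_j = 0.0               # running switching energy (J)

    def on_toggle(self, net, time_ns):
        # Dynamic switching energy per transition: 1/2 * C * Vdd^2
        self.energy_j += 0.5 * self.cap.get(net, 0.0) * self.vdd ** 2

model = PowerModel({"core/alu/out[3]": 2e-15})

# Hypothetical hook: the simulator would call model.on_toggle() on every transition
# of the named signal, so power accumulates while the simulation is still running.
# register_value_change_callback("core/alu/out[3]", model.on_toggle)
```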

So, in this first phase of emulation-assisted power analysis, you run emulation first to generate a file with the signal transition information and then feed that file to the power-analysis engine. The second phase involves what Baum characterizes as interest on the part of emulator makers in providing the necessary API that would enable concurrent emulation and power analysis, much the same way it’s doable in simulation today.
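A bare-bones sketch of that first, file-based phase might look like the following. The (net, toggle-count) format and the single effective capacitance per net are inventions for illustration; real flows exchange formats like SAIF or FSDB and use far richer models. The point is simply that the power estimate is computed offline, after the emulation run, from whatever switching activity got dumped.

```python
# A minimal sketch of the phase-one, file-exchange flow: the emulator dumps
# switching activity to a file, and power is estimated offline from it afterward.
# The (net, toggle-count) format and the single capacitance value are invented.

ACTIVITY_FILE = """net,toggles
core/alu/out[0],120000
core/alu/out[1],64000
core/regfile/we,8000
"""

def dynamic_power_watts(activity: str, cap_f: float, vdd: float, duration_s: float) -> float:
    """Average dynamic power from per-net toggle counts over one emulation run."""
    total_energy_j = 0.0
    for line in activity.strip().splitlines()[1:]:
        _net, toggles = line.rsplit(",", 1)
        # Each transition dissipates roughly 1/2 * C * Vdd^2.
        total_energy_j += int(toggles) * 0.5 * cap_f * vdd ** 2
    return total_energy_j / duration_s

print(dynamic_power_watts(ACTIVITY_FILE, cap_f=2e-15, vdd=0.75, duration_s=1e-3))
```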

There’s a potential bottleneck awaiting that approach, however, and it probably needs to be addressed alongside the addition of the API: bandwidth for data exchange. The emulator sends over data on every cycle – whether or not that particular cycle has any relevance for power. Got a signal that stays low for 10,000 cycles? You’ll get 10,000 cycles of data confirming that, yup, it’s still low. And you have to transport all of that data and put it somewhere before you can figure out that, yeah, you didn’t really need it after all.

So having the emulator send data over only when specified signals change – and perhaps only for those signals that did change – would reduce the amount of data exchanged to only that which matters for power. This would likely remove the bottleneck lurking in the shadows at the moment.
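A quick back-of-the-envelope sketch shows why change-only reporting matters. The trace and encoding below are invented, but the arithmetic is the whole story: a mostly idle signal costs one sample per cycle in a cycle-based dump and almost nothing when only transitions are reported.

```python
# Back-of-the-envelope comparison of the two reporting schemes. The trace is
# invented: a signal idle for 10,000 cycles followed by a short burst of activity.

def per_cycle_samples(trace):
    # Cycle-based dump: one sample every cycle, relevant for power or not.
    return len(trace)

def change_only_events(trace):
    # Event-based dump: report (cycle, value) only when the value actually changes.
    return sum(1 for prev, cur in zip(trace, trace[1:]) if prev != cur)

trace = [0] * 10_000 + [1, 0, 1, 0]
print(per_cycle_samples(trace), "cycle samples vs", change_only_events(trace), "events")
```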

While having an emulator feed the power analysis tool – hosted not in the emulator but in an accompanying… um… host computer – would help speed things up, there’s yet another possible speed-up that would lead to a third phase of power emulation. And that opportunity comes from literally synthesizing the power model and putting it onto the emulator. Now there’s no data exchange to the host at all; power is calculated at emulation speed, all within the emulator.
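Conceptually, the thing being synthesized would be something like the little bookkeeping engine below: logic that, every emulated cycle, adds a per-net energy weight for each monitored net that toggled, keeping a running total inside the emulator with nothing streamed to the host. This is a Python stand-in for the idea, not Baum’s actual model; the weights, nets, and structure are all illustrative.

```python
# A conceptual Python stand-in for a synthesizable on-emulator power counter:
# every emulated cycle it adds a per-net energy weight for each monitored net
# that toggled, so the running total stays inside the emulator. Weights, nets,
# and structure are illustrative only.

class OnEmulatorPowerCounter:
    def __init__(self, weights):
        self.weights = weights            # net id -> energy weight (arbitrary units)
        self.prev = {}                    # last sampled value of each monitored net
        self.accum = 0                    # running total (a hardware counter, in concept)

    def clock(self, sampled_values):
        """Called once per emulated cycle with the current value of each monitored net."""
        for net, value in sampled_values.items():
            if net in self.prev and self.prev[net] != value:
                self.accum += self.weights[net]
            self.prev[net] = value

counter = OnEmulatorPowerCounter(weights={"alu_out": 3, "reg_we": 1})
for cycle in [{"alu_out": 0, "reg_we": 0}, {"alu_out": 1, "reg_we": 0}, {"alu_out": 1, "reg_we": 1}]:
    counter.clock(cycle)
print(counter.accum)  # weighted toggle total accumulated "on the emulator"
```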

Yeah, I know what you might be thinking: power models are analog, right? Cuz current is analog? How the heck do you synthesize something like that into an emulator? I went back to last year’s coverage and noted that Baum is all about doing power analysis at the RTL level, which pretty much rules out analog. But, even if a model is written in an RTL-level language, that doesn’t mean it’s synthesizable. I did confirm with Baum, however, that their models are, in fact, synthesizable.

But, to be clear, this vision of a synthesized model isn’t a specific product plan. It’s simply an idea of what could be farther down the pipeline as power analysis continues its rapid evolution.

 

More info:

Baum power analysis

Cadence Voltus

Cadence Voltus-Fi
