
Monotonic Convergence

An Anti-Engineering Concept

Synopsys recently announced the results of a flow collaboration with Fujitsu. Modestly buried in the discussion was a mention of a 33% improvement in logic per area.

33%.

We’ve been at this game for a long time, and you’d think that the low-hanging fruit had long ago been picked. Which would leave us with the occasional 5-10% improvement in this and that after lots of algorithmic tweakage.

And yet here we are, in 2014, with a 33% improvement. Maybe I’m naïve, but that seems significant.

I asked what it was that got them there, and they credited three things:

  • Early design exploration and the impact of such tools as DC Graphical, which can explore the physical implications of various layouts, looking for congestion and other issues and passing placement seed suggestions to IC Compiler;
  • Integration of Fujitsu’s floorplanning tools into the flow; and
  • Improvements in Design Compiler, via their 13.12 release.

That last item, the latest Design Compiler release (13.12), itself claims a 10% reduction in area and leakage. And they make reference to a mysterious-sounding feature called “monotonic convergence.” “What in tarnation,” you might ask, “is that?” Almost sounds like a torture mechanism whereby a bunch of bad technical presenters deliver conference paper presentations to you all at the same time.

But no, that’s not what it is. In fact, it’s a concept that, when you think about it, is the antithesis of engineering.

Engineering, as we all know, is about trade-offs. Everything is give-and-take. You want a little of this, well, you’re gonna have to give up some of that to get it. Nothing is free. (If it were free, it wouldn’t be engineering; it would be science.)

If you take that away and make life too easy, you might think that you’re going to get put out of a job. But, then again, when your primary design goal is high level, you really want to stay at that level. If optimizing your design constantly means trading off low-level area, performance, and power, well, after a few decades of that, it starts to lose its novelty. You’d rather be making higher-level system tradeoffs than constantly fretting over the layout.

Synopsys describes the low-level optimization algorithms they’ve used before this release as “greedy.” You say you want more performance? OK, baby – here we go. We’re going to get you all the performance we can find.

But here’s the deal: they’ve discovered a concept that’s kind of obvious when you think about it (even though it has yet to find its way into financial circles): “enough.” Those older algorithms are busy finding every last bit of performance they can, even though they may have gone far beyond the performance you need. That’s the greedy bit.

The thing is, after a certain point, these algorithms are getting you, well, more than you could ever spend in a lifetime. You can’t will those extra picoseconds of slack time to your progeny, so they’re really not doing anyone any good.

And, through this process, they bring lots of tradeoffs into the picture. Each performance improvement comes with fine print: leakage or active power consumption may be excessive. You may be using far more area than necessary. And so you have to play with various optimization scenarios, balancing the need for one parameter (slack, power, area, throughput, bandwidth, latency…) against all the other ones.

This exercise can cause some turbulence in what has otherwise become a smoother design process. Improvements in tools, with sign-off quality engines used early in the flows, for example, have meant fewer redos, fewer loops back into the design cycle. Convergence has become more, well, monotonic. More feed forward, less feed back.

You take your design from high-level abstraction to the point of layout, moving forward most of the way, and suddenly you hit the point where you want to do more optimizations to achieve the right balance. And you go round and round trading things off, exploring the impact of this change and that change, trying for two steps forward with possibly one back rather than the opposite. You’ve lost monotonicity.

And that’s what Synopsys attacked in their latest Design Compiler release. They did something that, at first blush, sounds like one of those ridiculous marketing demands: they told the developers that they wanted to have optimizations come with no tradeoffs. You can almost hear the team groaning, “Oh jeez, they hired more marketing MBAs with no engineering background!”

But it’s not quite what it sounds like. It’s simply a realization that greed, for lack of a better word, isn’t always good. Those algorithms of the past have gone off and found way more of whatever than was needed. Instead, now they will go off and get as much as they can – as long as they don’t compromise the other parameters.
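To make the distinction concrete, here’s a toy sketch of the two philosophies. This is not Synopsys’s actual algorithm – the moves, costs, and numbers are all invented for illustration – but it captures the difference between a greedy optimizer and an “enough” optimizer that stops once the timing target is met:

```python
# Toy model: each candidate move improves slack but may cost area/leakage.
# All values are invented; units are arbitrary.
moves = [
    {"slack": +30, "area": +2, "leakage": +1},
    {"slack": +20, "area": +8, "leakage": +4},
    {"slack": +10, "area": +0, "leakage": +0},
    {"slack": +5,  "area": +6, "leakage": +9},
]

def greedy(start_slack):
    """Take every slack-improving move, regardless of what it costs."""
    slack, area, leakage = start_slack, 0, 0
    for m in moves:
        slack += m["slack"]
        area += m["area"]
        leakage += m["leakage"]
    return slack, area, leakage

def enough(start_slack, target_slack):
    """Take the cheapest moves first, and stop as soon as timing is met."""
    slack, area, leakage = start_slack, 0, 0
    # Prefer moves with the least area+leakage cost per unit of slack gained.
    for m in sorted(moves, key=lambda m: (m["area"] + m["leakage"]) / m["slack"]):
        if slack >= target_slack:
            break  # "enough" - timing already met; don't spend more
        slack += m["slack"]
        area += m["area"]
        leakage += m["leakage"]
    return slack, area, leakage
```

Starting 35 units short of timing, the greedy version buys far more slack than needed and pays heavily in area and leakage; the “enough” version meets the target and stops.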

This is actually a cool concept. As a designer, you can get your design to a point and then push a button to tighten down area or reduce leakage or something, and you know that all of the other characteristics that you’ve worked hard to set up will remain intact. From a design flow standpoint, this can be a big deal. Up to now, the older greed has meant that you had to save a goodly chunk of time for optimization so that you could push and pull and explore and make sure that you’re not giving up something important with your optimizations.

With monotonic convergence, the intent is that the algorithms will go find ways to tighten everything up in one go, guaranteeing that nothing will be degraded in the process. If you’re a cautious engineer, you’re probably going to want to go verify that this did indeed happen. But if it works as promised a few times, then you’ll probably relax a little as what was once an unpredictable optimization cycle becomes a simple push of the button.
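For the cautious engineer, that verification step is simple in principle: snapshot your metrics before and after the push-button step and confirm nothing moved the wrong way. Here’s a hypothetical sketch (the metric names and values are invented, and lower is taken to be better for every metric):

```python
# Hypothetical post-optimization sanity check: did anything degrade?
# Convention here: lower is better for every metric.
def degraded_metrics(before, after):
    """Return the metrics that got worse; an empty list means the
    optimization really was tradeoff-free."""
    return [name for name in before if after[name] > before[name]]

before = {"area_um2": 1200.0, "leakage_uW": 45.0, "wns_ps": 12.0}
after  = {"area_um2": 1150.0, "leakage_uW": 45.0, "wns_ps": 10.0}

print(degraded_metrics(before, after))  # empty list: nothing got worse
```

If the list comes back empty a few runs in a row, that’s when the relaxing can begin.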

This whole “getting something for nothing” idea sounds almost too good to be true, and it definitely seems to go against what engineering is supposed to be about. But in reality, it’s yet another submission of an engineering task to automation, hopefully leaving you to apply your engineering talents to something higher-level and more interesting.

 

More info:

Synopsys Design Compiler

3 thoughts on “Monotonic Convergence”

  1. It looks like ASIC design is just catching up to the push-button methodology of FPGA design. “Just enough” has been the norm for FPGA place & route tools for a decade now.

  2. Ah gabor, you don’t understand. For years the ASIC guys have been the cutting edge, demanding as much of everything that they can get, while regarding FPGA guys as knuckle dragging peasants who aren’t really designers. So how could they learn from them that “the best is oft the enemy of the good”?

