feature article

Monotonic Convergence

An Anti-Engineering Concept

Synopsys recently announced the results of a flow collaboration with Fujitsu. Modestly buried in the discussion was a mention of a 33% improvement in logic per unit area.

33%.

We’ve been at this game for a long time, and you’d think that the low-hanging fruit had long ago been picked. Which would leave us with the occasional 5-10% improvement in this and that after lots of algorithmic tweakage.

And yet here we are, in 2014, with a 33% improvement. Maybe I’m naïve, but that seems significant.

I asked what it was that got them there, and they credited three things:

  • Early design exploration and the impact of such tools as DC Graphical, which can explore the physical implications of various layouts, looking for congestion and other issues and passing placement seed suggestions to IC Compiler;
  • Integration of Fujitsu’s floorplanning tools into the flow; and
  • Improvements in Design Compiler, via their 13.12 release.

That last item there, their latest Design Compiler thing, itself claims a 10% reduction in area and leakage. And they make reference to a mysterious-sounding feature called “monotonic convergence.” “What in tarnation,” you might ask, “is that?” Almost sounds like a torture mechanism whereby a bunch of bad technical presenters deliver conference paper presentations to you all at the same time.

But no, that’s not what it is. In fact, it’s a concept that, when you think about it, is the antithesis of engineering.

Engineering, as we all know, is about trade-offs. Everything is give-and-take. You want a little of this, well, you’re gonna have to give up some of that to get it. Nothing is free. (If it were free, it wouldn’t be engineering; it would be science.)

If you take that away and make life too easy, you might think that you’re going to get put out of a job. But, then again, when your primary design goal is high level, you really want to stay at that level. If optimizing your design constantly means that you’re trading off low-level area and performance and power, well, after a few decades of that, it starts to lose its novelty. You’d rather be making higher-level system tradeoffs than constantly fretting with the layout.

Synopsys describes the low-level optimization algorithms they’ve used before this release as “greedy.” You say you want more performance? OK, baby – here we go. We’re going to get you all the performance we can find.

But here’s the deal: they’ve discovered a concept that’s kind of obvious when you think about it (even though it has yet to find its way into financial circles): “enough.” Those older algorithms are busy finding every last bit of performance they can, even though they may have gone far beyond the performance you need. That’s the greedy bit.

The thing is, after a certain point, these algorithms are getting you, well, more than you could ever spend in a lifetime. You can’t will those extra picoseconds of slack time to your progeny, so they’re really not doing anyone any good.

And, through this process, they bring lots of tradeoffs into the picture. Each performance improvement comes with fine print: leakage or active power consumption may be excessive. You may be using far more area than necessary. And so you have to play with various optimization scenarios, balancing the need for one parameter (slack, power, area, throughput, bandwidth, latency…) against all the other ones.
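To make the “greedy” behavior concrete, here’s a toy sketch in Python. It’s purely illustrative – the moves and cost numbers are invented, and this is not Synopsys’s actual algorithm – but it shows an optimizer that takes every delay improvement it can find, no matter what it costs elsewhere:

```python
# Toy illustration of "greedy" optimization (invented numbers, not
# Synopsys's algorithm): take every move that improves delay, no matter
# what it costs in area or leakage.

def greedy_optimize(delay, area, leakage, moves):
    """Apply every move that reduces delay, ignoring all other costs."""
    for delay_gain, area_cost, leakage_cost in moves:
        if delay_gain > 0:          # any speedup at all is accepted
            delay -= delay_gain
            area += area_cost
            leakage += leakage_cost
    return delay, area, leakage

# Hypothetical moves: (delay gained, area cost, leakage cost)
moves = [(2.0, 5.0, 1.0), (1.5, 8.0, 2.0), (0.5, 12.0, 4.0)]
print(greedy_optimize(delay=10.0, area=100.0, leakage=10.0, moves=moves))
# -> (6.0, 125.0, 17.0)
```

If the timing target were, say, a delay of 8.0, the first move alone would meet it; the last two moves buy slack nobody asked for, at a cost of 20 extra units of area and 6 of leakage.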

This exercise can cause some turbulence in what has otherwise become a smoother design process. Improvements in tools, with sign-off quality engines used early in the flows, for example, have meant fewer redos, fewer loops back into the design cycle. Convergence has become more, well, monotonic. More feed forward, less feed back.

You take your design from high-level abstraction to the point of layout, moving forward most of the way, and suddenly you hit the point where you want to do more optimizations to achieve the right balance. And you go round and round trading things off, exploring the impact of this change and that change, trying for two steps forward with possibly one back rather than the opposite. You’ve lost monotonicity.

And that’s what Synopsys attacked in their latest Design Compiler release. They did something that, at first blush, sounds like one of those ridiculous marketing demands: they told the developers that they wanted to have optimizations come with no tradeoffs. You can almost hear the team groaning, “Oh jeez, they hired more marketing MBAs with no engineering background!”

But it’s not quite what it sounds like. It’s simply a realization that greed, for lack of a better word, isn’t always good. Those algorithms of the past have gone off and found way more of whatever than was needed. Instead, now they will go off and get as much as they can – as long as they don’t compromise the other parameters.

This is actually a cool concept. As a designer, you can get your design to a point and then push a button to tighten down area or reduce leakage or something, and you know that all of the other characteristics that you’ve worked hard to set up will remain intact. From a design flow standpoint, this can be a big deal. Up to now, the older greed has meant that you had to save a goodly chunk of time for optimization so that you could push and pull and explore and make sure that you’re not giving up something important with your optimizations.

With monotonic convergence, the intent is that the algorithms will go find ways to tighten everything up in one go, guaranteeing that nothing will be degraded in the process. If you’re a cautious engineer, you’re probably going to want to go verify that this did indeed happen. But if it works as promised a few times, then you’ll probably relax a little as what was once an unpredictable optimization cycle becomes a simple push of the button.
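As a sketch of what such a “no regressions” rule might look like – my own guess at the general idea, not Synopsys’s implementation – consider an acceptance test that takes a move only if it improves the chosen metric and leaves every other metric no worse:

```python
# Toy sketch of a "no regressions" acceptance rule (an assumption about
# the general idea, not Synopsys's implementation): accept a move only
# if it improves the target metric AND degrades nothing else.

def monotonic_optimize(metrics, target, moves):
    """metrics: dict of current values (lower is better for all).
    Each move is a dict of deltas. A move is accepted only if it
    reduces `target` and does not increase any other metric."""
    for move in moves:
        improves = move.get(target, 0) < 0
        no_regression = all(delta <= 0 for key, delta in move.items()
                            if key != target)
        if improves and no_regression:
            for key, delta in move.items():
                metrics[key] += delta
    return metrics

metrics = {"area": 100.0, "leakage": 10.0}
moves = [
    {"area": -4.0, "leakage": 0.0},   # accepted: pure win
    {"area": -6.0, "leakage": 1.5},   # rejected: leakage would regress
]
print(monotonic_optimize(metrics, "area", moves))
# -> {'area': 96.0, 'leakage': 10.0}
```

The result is monotonic by construction: every accepted move leaves you at least as well off on every axis, so there’s no push-and-pull cycle to verify afterward.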

This whole “getting something for nothing” idea sounds almost too good to be true, and it definitely seems to go against what engineering is supposed to be about. But in reality, it’s yet another engineering task handed over to automation, hopefully leaving you to apply your engineering talents to something higher-level and more interesting.


More info:

Synopsys Design Compiler

3 thoughts on “Monotonic Convergence”

  1. It looks like ASIC design is just catching up to the push-button methodology of FPGA design. “Just enough” has been the norm for FPGA place & route tools for a decade now.

  2. Ah, Gabor, you don’t understand. For years the ASIC guys have been the cutting edge, demanding as much of everything as they can get, while regarding FPGA guys as knuckle-dragging peasants who aren’t really designers. So how could they learn from them that “the best is oft the enemy of the good”?
