
Bringing it Together – Some of It Anyway

Synopsys Announces Lynx and Discovery


The EDA world is rife with point solutions. No sooner do you think it's time to stitch together a unified flow than some new requirement of some new technology makes some new point tool necessary for effective design.

And so it goes; an IC design environment might have a dozen or two (or more) tools that must be invoked at one time or another. It's not a flow; it's more like an artist's palette, with all the capabilities laid out in a more or less unstructured fashion, and you dip your brush in one or the other as needed, ad hoc, to accomplish the design goals of a given project.

Synopsys and ARM have addressed this to some extent with their manuals. There's the Low Power Methodology Manual. There's the Verification Methodology Manual. There are the Reference Methodologies (RMs) for various aspects of IC design. Some of these have been jointly published as non-trivial books. The idea is to capture best practices and document them, providing something of a roadmap of the landscape to help engineers navigate.

Which is fine. But to me, when a methodology manual is required, it means the process being described isn't clear, isn't intuitive, isn't obvious. If it really takes a map to get from here to there, the turns along the way won't be clear, the signage will be poor, and it's easy to get caught in suburban cul-de-sac traps that offer no easy escape.

This is not specifically to say that Synopsys has done a bad job and must therefore patch things up with manuals. Rather, it’s to some extent just the nature of EDA. No one company has the best everything, and when first silicon failure means millions in new mask charges (not to mention delays), designers are going to be careful to hand pick tools and use them in a manner that they feel will help to ensure success.

Designers patch these tools together by constructing elaborate scripts, flows, and build routines. These are updated, massaged, tweaked, and mutated over time, to the point that it can take multiple bodies just to grapple with the idiosyncrasies of all the tools in all the possible flows for all the possible chips that company might design, not to mention the legacy held over from all the projects they've already done.
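The kind of glue that accumulates around these flows can be pictured, in miniature, as a script that knows which tool runs after which. The sketch below is purely illustrative: the tool commands (`syn_tool`, `pnr_tool`, `sta_tool`) and file names are invented placeholders, not any vendor's actual CLI.

```python
# Minimal sketch of an ad-hoc flow runner. Each step records the tool
# command it would shell out to and the steps it depends on.
# All tool names and files here are hypothetical placeholders.

STEPS = {
    "synthesize":  {"cmd": "syn_tool -f rtl.v -o netlist.v",    "deps": []},
    "place_route": {"cmd": "pnr_tool netlist.v -o layout.def",  "deps": ["synthesize"]},
    "signoff":     {"cmd": "sta_tool layout.def",               "deps": ["place_route"]},
}

def run_order(steps):
    """Topologically order steps so each runs after its dependencies."""
    order, done = [], set()

    def visit(name):
        if name in done:
            return
        for dep in steps[name]["deps"]:
            visit(dep)
        done.add(name)
        order.append(name)

    for name in steps:
        visit(name)
    return order

if __name__ == "__main__":
    for name in run_order(STEPS):
        print("would run:", STEPS[name]["cmd"])
```

Real flow scripts, of course, also carry per-project constraint files, tool-version workarounds, and retry logic, which is exactly the accumulated weight a managed environment like Lynx is meant to take on.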

It is, therefore, a testament to this complexity that Synopsys is just now announcing a unified flow product: their Lynx environment. It attempts to cast a net around the unruly pieces of the flow and manage them in a single place. It follows the RMs that they and ARM have written and adds management visibility for improved project tracking.

It is further testament to this complexity that this platform doesn't (yet) encompass the entire chip design flow: it is at present limited to the RTL-to-GDSII portion of the flow. "Limited" being a relative term, since the scope includes such thorny areas as low-power design and design-for-manufacturing. There's a lot going on here, so they've focused to ensure that the problem remains tractable.

It is yet further testament to this complexity that setting up the flow is something they can help with, and their target is five days to get it up and running. This sounds crazy to someone used to getting Excel going out of the box, or installing FPGA tools and doing their first build, or getting a simple C compiler running. The fact that five days is a huge improvement gives you a sense of what it might be without that.


There are two other nested bringing-togethers that Synopsys recently announced. At the center of this nest is CustomSim, their AMS verification tool. It unites the capabilities of several previously separate verification engines: NanoSim, for custom digital; HSIM, for memory; and XA, for "digitally-controlled analog." CustomSim can auto-detect the analog and digital portions of a design and dynamically adapt the simulation time-scale for faster simulation.

Coupling this CustomSim product with HSPICE provides a continuum from the highest-accuracy to the fastest simulation. Synopsys currently perceives a gap between today’s accurate SPICE and less-accurate-but-can-get-results-sometime-this-month FastSPICE; their intent is that CustomSim eliminate that gap.

CustomSim, being an analog/mixed-signal (AMS) verification tool, is then being integrated with VCS, a digital verification tool, under the aegis of the Discovery brand, the outside of the nest. Thereby proving that if there’s one thing EDA companies are good at, it’s generating brands that incorporate constituent brands, each of which covers other lower-level brands, and none of which sound anything like the others. And I’m not picking on Synopsys here; all the big guys have this. So it goes something like this, using substitution:

Discovery = VCS + CustomSim
          = VCS + (NanoSim + HSIM + XA)

Although a higher-fidelity representation of their message would be:

Discovery ≥ VCS + (NanoSim + HSIM + XA)

The juxtaposition of the AMS and digital verification platforms is analogous to the prior bringing-together of their digital and AMS design environments with their Custom Designer announcement last year. While analog and digital have historically been considered separate activities, increasingly they’re both important on the same chip. So if you can design them both more or less together, then you should be able to verify them more or less together.

Of course there are still separate engines handling the specifics of the verification, but including them in a single environment and letting the tools talk to each other means that you can do a single unified verification much more quickly – and, just as importantly, more easily.

So, one by one, some of the gaps between tools that have to be manually bridged are being closed. Even so, the closure can be overridden by tweaking scripts to modify things or add new tools or capabilities. But at least now the default is making some moves towards unification and automation and the notion of an actual flow.

While the bringing together of digital and analog is something of a one-time event, the integration of point tools into a flow is a work that could forever be in progress. Once the RTL-and-below portion is under control (assuming it achieves that status), Synopsys will look around at remaining portions of the flow – in particular, portions earlier in the process with higher levels of design abstraction, to see if they have settled down enough to be included within the confines of a flow. At which point there will likely be plenty of new renegade point tools that will remain outside the taming grasp of a domesticated flow.

Links:

Synopsys Lynx

Synopsys Discovery
