
New In the Cloud

EDA and CAE

7 years ago! Almost to the day! Astounding!

We think of cloud computing as a relatively recent phenomenon, but it’s been seven years since we covered the first EDA forays into cloud computing. To state the obvious, in our world, seven years ago does not qualify as recent.

This was the time when both Synopsys and Cadence were debuting different ways of handling cloud computing. It sounded good in principle, but the perennial bugaboo questions – largely about security – persisted, and the cloud offerings never really went anywhere.

The motivation was largely about providing capacity for peak usage. That would let customers buy fewer licenses – sized to their normal needs – and turn to the cloud for those rush periods when they needed more than their licenses would allow.

It was also a way for smaller companies to get tools with funding more on a startup scale. That certainly expands the customer base – if you assume that the little guys can’t afford the full-up tools anyway.

Since then, other companies have dived in – some more successfully than others. And all have tried to work around that big bugaboo, which, summarized succinctly, says something like, “No one is going to trust sending their most precious family jewels up into some unknown computer.”

And, in fact, that’s how it was for a long time. Successful launches tended to be selective about which elements got sent to the cloud in the hopes that potential customers could be convinced that the cloudy parts, if compromised, couldn’t be traced back to critical information. Rather than eliminating the need to buy local tools, the goal here was more about accelerating the computationally intensive bits in the cloud in an anonymous or obfuscated way and doing the rest at home.

Well, today we’re here to talk about a couple of new full-on cloud offerings. So… what’s changed?

EDA Metrics

The first company we’ll talk about is closer to our EDA origins. It’s a company called Metrics, and they’ve launched a cloud simulator (yes, newly written). Which is the same thing Synopsys launched lo those many years ago. Which naturally prompted me to ask why this should work now.

They responded that they see things differently today because the high-level decision-makers also see things differently today. Basically, execs realized that they were hosting all of their critical financial information in the cloud… which made the EDA stuff much less threatening. To be clear, Metrics said that their offering probably wouldn’t have worked two or three years ago. So yes, this coming-around does qualify as recent.

Metrics uses Google for their hosting, and they say that their model fits the NIST definition of true cloud computing, which comprises five “essential characteristics”:

  • On-demand self-service
  • Broad network access
  • Resource pooling
  • Rapid elasticity
  • Measured service

While many cloud-computing models involve so-called multi-tenanting, where one server hosts several jobs within multiple virtual machines that are supposed to provide isolation between jobs, Metrics allocates one server per job (on top of a basic cluster for management). Data is encrypted – both when stored and in transit. If VMs provide perfect isolation, then a dedicated server might not really provide better security – but it sure feels like better security. And if there are any holes in the VM isolation, then it’s definitely locked down better in a dedicated server. In addition, all CPU cycles are yours – none are “wasted” on jobs belonging to pesky co-tenants.
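
If you like seeing ideas in code, here's a toy Python sketch of that one-server-per-job notion. To be clear, this is my own illustration, not Metrics' or Google's actual machinery; ComputeBackend and PerJobDispatcher are names invented for the example.

```python
# Illustrative only: a toy dispatcher mimicking the "one server per job"
# model described above. ComputeBackend and PerJobDispatcher are invented
# names for this sketch, not Metrics' or Google's actual machinery.
from dataclasses import dataclass
import uuid

@dataclass
class Server:
    server_id: str
    job_id: str | None = None  # at most one job -- no co-tenants

class ComputeBackend:
    """Stand-in for a cloud provider that can provision dedicated VMs."""
    def provision(self) -> Server:
        return Server(server_id=str(uuid.uuid4()))

class PerJobDispatcher:
    def __init__(self, backend: ComputeBackend):
        self.backend = backend
        self.active: dict[str, Server] = {}

    def submit(self, job_id: str) -> Server:
        # Dedicated server per job: never reuse a machine that is
        # currently running someone else's work.
        server = self.backend.provision()
        server.job_id = job_id
        self.active[job_id] = server
        return server

    def finish(self, job_id: str) -> None:
        # Tear the machine down when the job completes, so nothing
        # lingers for a later tenant to find.
        self.active.pop(job_id, None)

dispatcher = PerJobDispatcher(ComputeBackend())
server = dispatcher.submit("sim-0001")
print(f"job sim-0001 -> dedicated server {server.server_id}")
dispatcher.finish("sim-0001")
```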

When needed, additional machines spin up in about a minute (an improvement from the old days). Metrics may also keep a few machines “warm” for even faster scaling. Upgrades can happen automatically – but you can pin a specific version for the duration of a project so that you’re not chasing moving results with every upgrade.
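
That "warm machines" trick is a classic pooling pattern; here's a minimal sketch of how such a pool might behave. Again, the names, sizes, and timings are illustrative, not Metrics' actual mechanism.

```python
# Sketch of a warm pool: keep a few provisioned-but-idle servers so a new
# job starts in seconds instead of waiting about a minute for a cold boot.
# Names, sizes, and timings are illustrative, not Metrics' mechanism.
import collections
import uuid

def provision() -> str:
    """Stand-in for a roughly one-minute cloud VM cold boot."""
    return str(uuid.uuid4())

class WarmPool:
    def __init__(self, warm_target: int = 2):
        self.warm_target = warm_target
        self.idle: collections.deque[str] = collections.deque()
        self._refill()

    def _refill(self) -> None:
        # Keep a small buffer of already-booted machines waiting for work.
        while len(self.idle) < self.warm_target:
            self.idle.append(provision())

    def checkout(self) -> str:
        # Fast path: hand out a warm machine. Slow path: cold boot.
        server = self.idle.popleft() if self.idle else provision()
        self._refill()
        return server

pool = WarmPool(warm_target=2)
print("job assigned to server", pool.checkout())
```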

While security has been the big concern, Metrics discusses another historical limiter for EDA-in-the-cloud: lack of integration with other tools. They have addressed this by providing an API that allows other tools to get in and access key simulation data.
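
What might that integration look like from the tool side? Something like the following REST-style sketch, with the caveat that the base URL, endpoint shape, fields, and authentication scheme are placeholders of my own; Metrics hasn't published those details here.

```python
# A sketch of pulling simulation results over a REST-style API. The base
# URL, endpoint shape, fields, and auth scheme are placeholders of my own;
# they are not Metrics' published interface.
import requests

API_BASE = "https://api.example-eda-cloud.com/v1"  # placeholder URL
API_KEY = "YOUR_API_KEY"                           # placeholder credential

def fetch_run_results(project: str, run_id: str) -> dict:
    """Fetch pass/fail and coverage data for one simulation run."""
    resp = requests.get(
        f"{API_BASE}/projects/{project}/runs/{run_id}/results",
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example (would require a real endpoint and key):
# results = fetch_run_results("my-soc", "regression-42")
# print(results["passed"], "tests passed")
```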

But this comes in addition to another element of the offering that has the long-term goal of making Metrics the go-to site for EDA – at least for small- to medium-sized companies. They have GitHub-style facilities for hosting and managing code and revisions. That seems ambitious, given that, at present, only simulation (and verification management) is hosted up there. I asked about that future, and they see more tools up there, but through collaboration with partners. Doesn’t sound like they’re ready to roll their own versions of the rest of the flow.

If customers have their own management dashboards, then Metrics results can be downloaded (or, presumably, grabbed via the API) so that simulation jobs show up in those dashboards. But Metrics has its own dashboard as well – consistent with trying to become the home for the entire project.

Their pricing model is simple (and consistent with NIST’s cloud view): $0.04 per minute of job time. I confirmed with them that browsing time isn’t charged, and there is no charge for other tools reaching in and getting data via the API. An amount of storage typical for a project comes for free; excess storage will result in Google storage charges being passed through. Likewise, Google has extra charges for data download; that charge is also passed through. Metrics’s hope is that most data can stay resident in the cloud, with minimal need for downloading.
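
For the metering-minded, here's a quick back-of-the-envelope cost check using the quoted rate. The storage and download pass-throughs are Google's and vary, so they appear as plain placeholder inputs.

```python
# Back-of-the-envelope check of the quoted $0.04-per-minute rate. Storage
# and download pass-throughs are Google's and vary, so they're plain
# placeholder inputs here.
RATE_PER_MINUTE = 0.04  # USD, per the article

def job_cost(job_minutes: float,
             excess_storage_usd: float = 0.0,
             download_usd: float = 0.0) -> float:
    """Only active job time is metered; browsing and API access are free."""
    return job_minutes * RATE_PER_MINUTE + excess_storage_usd + download_usd

# A six-hour overnight regression:
print(f"${job_cost(6 * 60):.2f}")  # -> $14.40
```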

CAE OnScale

Our next story moves from EDA to CAE. It’s somewhat outside the realm of circuit design, but it’s definitely relevant for designing physical devices like sensors or other MEMS structures. (And for those of us who spend little or no time with mechanical tools, CAD refers only to the graphic modeling of a device. Solving all of the behavioral properties is CAE.) The company name is OnScale, and they’ve launched their own cloud-based finite-element solver – used for analyzing all manner of physical – and multi-physical – phenomena.

The focus of this story is less on the utilization narrative and more on the cost of acquisition of solver tools. Which suggests it’s a story better suited to new companies trying to get into this space, since well-established incumbents may well have already completed their acquisitions (albeit with ongoing licensing charges). That said, they’re claiming significant performance gains over incumbents – which we’ll return to.

While many application spaces make use of solvers, OnScale is focusing on a few markets in particular. Targets include biomedical, automotive (specifically, ADAS), MEMS for the IoT, and 5G RF (picture filters).

These tools have heavier computing requirements than traditional EDA tools. EDA is about abstraction and flow and modeling accuracy (at the risk of dramatic over-simplification). Solvers, by contrast, are all about computing intensity. Much work goes into making them more efficient – faster for the same accuracy.
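
To get a feel for why solvers chew through cycles, consider even a toy one-dimensional heat-diffusion problem solved with explicit finite differences: the work scales as grid points times time steps, and real 3D multiphysics problems multiply both enormously. (This is a generic textbook scheme, to be clear, not OnScale's solver.)

```python
# A toy 1-D heat-diffusion solve via explicit finite differences -- a
# generic textbook scheme, not OnScale's solver. The point: cost scales
# with grid points times time steps, and 3-D multiphysics multiplies both.
import numpy as np

nx, nt = 1_000, 50_000            # grid points, time steps
alpha, dx, dt = 1.0, 1e-3, 4e-7   # diffusivity and step sizes
r = alpha * dt / dx**2            # must stay <= 0.5 for stability
assert r <= 0.5, "explicit scheme would be unstable"

u = np.zeros(nx)
u[nx // 2] = 1.0                  # initial heat spike in the middle

for _ in range(nt):
    # each step touches every interior grid point -> O(nx * nt) work
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])

print(f"peak temperature after {nt} steps: {u.max():.4f}")
```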

According to OnScale, this means that acquiring a solver is about more than simply purchasing tools. Companies typically have to purchase high-performance computing (HPC) servers as well. The outlay for both – six digits over three years, according to their case study – also means a lengthy evaluation cycle of 6 to 12 months.

Their answer to this is OnScale Cloud, a solver hosted in the cloud. They refer to this as a “solver-as-a-service,” which could be conveniently abbreviated as SaaS, except that’s already taken (and, frankly, could also stand for “simulation-as-a-service” for Metrics). Buzzphrases aside, the idea is simple: jobs can be launched into the cloud, leveraging the massive parallelism that the cloud affords.

That potential for massive scaling could mean a faster run than can be done in-house. Or it could mean the ability to run more scenarios or sweep more parameters in an attempt to better optimize a design. They’re claiming a 100x performance gain over incumbent tools. I checked in with them to see exactly what this means, and they say that it’s an apples-to-apples comparison between the same jobs using different tools on similar hardware, averaged (and rounded) over hundreds of jobs. Parallelism boosts this gain even further.
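
The sweep pattern itself is easy to sketch: fan a design variable out across many concurrent cloud jobs so that wall-clock time looks like one job rather than twenty. The run_solver_job function below is a stand-in of my own invention (with fake physics), not OnScale's API.

```python
# The sweep pattern: fan one design parameter out across many concurrent
# jobs. run_solver_job is my own stand-in (with fake physics), not
# OnScale's API; a real job would upload a model and poll for results.
from concurrent.futures import ThreadPoolExecutor

def run_solver_job(thickness_um: float) -> tuple[float, float]:
    """Placeholder for one remote solve; returns (param, figure of merit)."""
    resonance_ghz = 5.0 / thickness_um  # fake physics, for illustration
    return thickness_um, resonance_ghz

sweep = [0.8 + 0.05 * i for i in range(20)]  # 20 candidate thicknesses

# With a machine per job, all 20 solves can run at once, so wall-clock
# time is roughly one job rather than twenty.
with ThreadPoolExecutor(max_workers=len(sweep)) as pool:
    results = list(pool.map(run_solver_job, sweep))

best = max(results, key=lambda r: r[1])
print(f"best thickness: {best[0]:.2f} um -> {best[1]:.2f} GHz")
```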

As for the solver itself, you might wonder about a new proprietary one, given all the history behind the current incumbents. Well, it turns out to be new to them, but not new. They got it through an acquisition, and there’s 30 years behind the technology.

Pricing is again on a per-use basis, although it’s measured and billed through a subscription – differently from how Metrics does it. Instead of metering minutes, they count core-hours (abbreviated “CH”), and their three-tiered pricing structure comes with bundles of CHs plus charges for overage; a quick cost comparison follows the list.

  • Free: 10 CH/month, with overage at $10/CH
  • Professional ($300/mo): 50 CH/month, with overage at $9/CH
  • Team ($1000/mo): 200 CH/month, with overage at $7/CH
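
And here's the quick comparison promised above: a few lines of Python that pick the cheapest tier for a given monthly core-hour load, using the quoted numbers.

```python
# Pick the cheapest tier for a given monthly core-hour load, using the
# numbers quoted above.
TIERS = {
    "Free":         {"base": 0,    "included": 10,  "overage": 10},
    "Professional": {"base": 300,  "included": 50,  "overage": 9},
    "Team":         {"base": 1000, "included": 200, "overage": 7},
}

def monthly_cost(tier: str, core_hours: float) -> float:
    t = TIERS[tier]
    extra = max(0.0, core_hours - t["included"])
    return t["base"] + extra * t["overage"]

for usage in (40, 150, 400):
    best = min(TIERS, key=lambda name: monthly_cost(name, usage))
    print(f"{usage} CH/month -> {best} at ${monthly_cost(best, usage):.0f}")
```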

They address different markets through plug-ins. For instance, there’s an EDA plug-in for connecting with EDA tools and delivering results into the EDA flow.


So both Metrics and OnScale are tying their futures to the cloud. Let’s see if times have truly changed.

More info:

Metrics

OnScale

8 thoughts on “New In the Cloud”

  1. I once was managing a software team working on a giant EDA project. We were trying to sort out an almost incomprehensible mess with our code, and the company had put serious “lock down” security measures in place – making it almost impossible for our team to get our work done. We joked that possibly the most damaging thing we could do to our competitors would be to just put all our source code on an open server. We figured it would set them back years…

  2. GitHub hosts one of the largest repositories of closed- and open-source code. I believe half of their revenue comes from enterprise customers. I don’t remember any security incident involving a massive breach (at least not so far). The problem is not the infrastructure. The problem is the people who are managing the infrastructure and security.

  3. When your project is finished, how do you reliably archive it? I have seen archives where the project files, the CAD software to use them, and the machines to run the software on all went into the vault. I don’t see an easy way to achieve this when someone else owns the software and the hardware to run it. Can you really trust a cloud company that may be gone tomorrow?

  4. I’m glad, and not at all surprised, to see the emergence of these offerings from Metrics and OnScale. In my view, widespread use of EDA tools in the public cloud is inevitable. I work for a company delivering cloud data management infrastructure and, amongst other things, our platform is being leveraged for production EDA workloads. Based on what I’ve seen (both from IC manufacturers and fabless IP design firms), use of the public cloud for EDA workflows is gaining momentum…and that momentum is only increasing. The benefits of the cloud (e.g. IT resource elasticity, consumption-based payment models, access to cutting-edge HW and services) are simply too compelling and they are outweighing the legacy concerns over cloud security. Ultimately, no environment (on-prem, in-cloud, wherever) can ever be completely secure from attack…but GCP, AWS, and Azure service many sensitive workloads and they have far more security expertise than a typical enterprise. I’d argue that most enterprises are likely much less secure than the cloud…they simply benefit from anonymity.

