feature article

The Year in EDA

Did Anything Happen?

2013 is coming to a close, and this is usually a time for reflecting on what’s happened in the past year and what’s going to happen in the coming year. The thing is, though, when I sit back and reflect, well, I don’t know; it just seems like 2013 was a quiet year for EDA.

So I took a couple of approaches to reviewing the year. One was to see what the Big Guys did; the other was to solicit some other opinions as to what’s in and what’s out.

Why focus on the Big Guys? Well, after all, they are pretty much the only exit strategy in EDA (he said on a day when Mentor announced they’d gobbled up Oasys). They’re the big Black Hole that sweeps up anything interesting that gets within reach. Unlike with our stellar brethren, it’s typically a failure if you don’t manage to hit their event horizons.

So I did something very unscientific: I attempted to add up the press releases coming from Synopsys, Cadence, and Mentor. Now… just in case this encourages some PR person to think that more press releases automatically equals winning… um, not so much. I wanted to focus on releases that announced new products. That means leaving off announcements about going to conferences, bragging that someone used their product, inclusion in reference flows (everyone ultimately gets that), collaborations, and business announcements.

I also wanted to focus on semiconductor design, so I omitted software and firmware as well as PCB design. Mentor had a number of announcements in these categories; Synopsys also had some. So those wouldn’t show up in the numbers.

I specifically did include product upgrades and extensions as well as new announcements for tools and IP. This includes verification IP. And here’s what I tallied up, splitting up tools and IP:

[Table: 2013 new-product press-release tallies for Synopsys, Cadence, and Mentor, split into tools and IP.]
This actually surprised me. Somehow I had the sense that Cadence would dominate in the tools side, with Synopsys winning the IP side. Looking back, I think that my impressions were swayed by the fact that Cadence had two major new product releases: Tempus and Voltus. Synopsys had a larger number of less dramatic releases.

One trend did emerge that probably comes as no surprise, in retrospect. Yet I double-checked it by expanding my search beyond just the Big Three. That trend was the overwhelming dominance of ARM. Everywhere. There were more ARMs appearing than on a Hydra after a blender accident. I looked to see which companies had announced something that related to ARM: Cadence, Synopsys, Altium, ASSET, Xilinx, Altera, element14, IAR, Imperas, Ambedded Technology, [deep breath] Green Hills, NAG, Atollic, SuVolta, Micro Digital, Hillcrest Labs, Forte, Aldec, PLS, STMicroelectronics, and, after scouring six months’ worth of releases, I got bored and figured I’d pretty much made my point.

SPICE was another topic that got a lot of attention, with news from Tanner, Agilent, ProPlus, Berkeley Design Automation, Cadence, Linear (!), and Synopsys. This gets to the increasing importance of analog in SoCs, with more and more stuff that used to be on a separate chip joining the party on an SoC.

The second thing I did was to solicit some input from companies themselves, both small and large: what did they think the big events of the industry were, and what are the upcoming issues for 2014? Now… first of all, when I sit down to send a note to a list of folks, I’m not going to hit everyone. It doesn’t necessarily mean I like some folks better; it mostly means that either I was too lazy to Google around if I didn’t know whom to contact, or my brain did its usual thing about remembering only so much and omitting others for no particularly good reason. So I may stir up some angst from the folks I didn’t contact; if it makes you feel any better, not everyone I contacted responded. Other than that, well, did I mention unscientific? Apologies to anyone feeling left out. Seriously.

I honestly didn’t get a good sense of any big “aha” moments in EDA. Trends developed that will continue into 2014, and so I’ll discuss them in that context. Granted, some companies looked only at their own accomplishments or issues while others were broader in their assessments. So it fell to me to referee and filter (and I have no problem with that). So, in no particular order…

FinFETs and 14 nm

Yup, they’re coming folks. Next year everyone is going to have to start taking FinFETs seriously as 14-nm designs (yes, and 16-nm designs) commence. And it’s clear that the attendant complexity has folks quivering in their boots a little. Uniquify put it simply: FinFETs will dominate the agenda.

Real Intent opined that this will pose a particular challenge for tools that don’t have quality-of-results (QoR) settings. That would suggest that all-or-nothing algorithms may be frustrated by the extra difficulty of managing the complexity, while tools with knobs can have the effort level dialed back to merely Herculean. That said, if dialing down QoR means poor results, it won’t be an acceptable solution.

Mixed-signal verification

This is something that Synopsys specifically brought up, but it ties to the whole SPICE thing above. The analog/digital boundary is hard to verify. And the tools used to do the job can take a long time to run, especially as mixed-signal blocks get larger. So this will continue to be an area that demands attention and improvement from tools.

SystemC

Forte articulated specifically what others also noted: SystemC is coming into its own in the US. It has particular benefit for driving both high-level synthesis (HLS) and verification. In other words, a single spec can be used both to tell the designers what to build and to let the testers decide whether it got built right. That not only saves time but also eliminates a potential source of errors, since no separately generated verification spec is needed. The verification infrastructure can also be further leveraged to drive validation: post-silicon checkout.

Which leads us to the Big One:

System and full-chip verification

This seems to be the one that’s got everyone’s panties in a bit of a twist. A small part of it might be attributed to the fact that analog now constitutes part of the full chip, but other than Synopsys, this wasn’t the part that folks were concerned about.

One of the obvious main drivers here is scale: we’re getting to the billion-gate mark. (By companies other than Intel.) Both OneSpin and Breker noted that UVM is breaking down at the full-chip level: it actually adds complexity there. So methods other than simulating-until-you-drop are needed.

Another contributing factor is that SoC design is no longer merely a team effort within a company: it’s the result of the inputs of multiple companies. That’s obviously due to the dramatic importance of IP, much of which comes from outside, but it also reflects the use of contractors; they’re essentially design departments that don’t belong to the company that will own the final design. So all of the pieces have to be validated – even those that have protected design details.

Several folks mentioned that the big winners in this are formal technology and emulation. More than one said that formal has gone mainstream this past year. I was wary of that conclusion, since many of those saying it were formal companies, and such a conclusion might be seen as self-serving. But none of them was an emulation company, so on that count, at least, they weren’t simply tooting their own horns.

That said, Synopsys specifically said that formal and static technology are wavering under the weight of the amount of verification they need to handle. That could be interpreted as a call for improvements in those tools, or it could be a suggestion that simulation is the answer. If the former, then this becomes a point of differentiation between different formal folks. If the latter, it could spark a lively debate. Which is always good fun.

In addition, we’ve seen before that formal technology is still hard to use outside of a few very specific applications. So if formal is destined to take on more responsibility, then ease of use is likely to raise its head as an issue to be solved.

The other impact here comes from software. OneSpin in particular noted that companies are trying to standardize on as few hardware platforms as possible. They then use software to differentiate. That brings in a whole new set of headaches. Breker, of course, is trying to leverage that in the verification space by auto-generating C-level tests and moving the verification – and design – focus to the creation of well-defined usage scenarios.

This reinforces the challenges of full-chip design, since the effort moves from circuit verification to system integration, making sure that everything hooks up well together. After all, at the risk of inviting politics (which I am explicitly NOT doing), the original fingers pointing when the HealthCare.gov website fiasco became apparent suggested that the pieces had all been tested – they just hadn’t been tested as a combined unit, working together. Whether or not the pieces were indeed fine, they certainly didn’t work well together. No SoC design team wants to see a similar cluster… thingy… happen on their watch.

It also suggests a growing acquaintance between EDA and embedded software design. After all, at the system level, certain things have to work. Whether they work due to great hardware or great software is a lower-level detail that shouldn’t be burdened by the fact that these have, traditionally, been very distinct fields. EDA, meet Embedded. Embedded, meet EDA. You guys are going to be roommates for a long time. You might even be considering a tighter commitment sometime in the future.

Low power

Finally, I was surprised that, while mentioned, power didn’t figure large and in charge. Only Jasper mentioned it as a top-line item. Maybe that’s because it’s obvious: everything these days seems to be about power, and there’s no sign that that’s going to change. 

So, in summary, it does seem to have been a quiet year. But big trends have been building nonetheless, and it appears that some of those will spill out into the coming year. Hopefully we’ll have lots of exciting things to talk about.

2 thoughts on “The Year in EDA”

  1. In tandem with Forte’s observations on SystemC for HLS and verification, design creation at the ESL level will come back around, as the productivity gap created by text-based editing will need to be addressed.

