
The Year in EDA

Did Anything Happen?

2013 is coming to a close, and this is usually a time for reflecting on what’s happened in the past year and what’s going to happen in the coming year. The thing is, though, when I sit back and reflect, well, I don’t know; it just seems like 2013 was a quiet year for EDA.

So I took a couple of approaches to reviewing the year. One was to see what the Big Guys did; the other was to solicit some other opinions as to what’s in and what’s out.

Why focus on the Big Guys? Well, after all, they are pretty much the only exit strategy in EDA (he said on a day when Mentor announced they’d gobbled up Oasys). They’re the big Black Hole that sweeps up anything interesting that gets within reach. Unlike with our stellar brethren, it’s typically a failure if you don’t manage to hit their event horizons.

So I did something very unscientific: I attempted to add up the press releases coming from Synopsys, Cadence, and Mentor. Now… just in case this encourages some PR person to think that more press releases automatically equals winning… um, not so much. I wanted to focus on releases that announced new products. That means leaving off announcements about going to conferences, bragging that someone used their product, inclusion in reference flows (everyone ultimately gets that), collaborations, and business announcements.

I also wanted to focus on semiconductor design, so I omitted software and firmware as well as PCB design. Mentor had a number of announcements in these categories; Synopsys also had some. So those wouldn’t show up in the numbers.

I specifically did include product upgrades and extensions as well as new announcements for tools and IP. This includes verification IP. And here’s what I tallied up, splitting up tools and IP:

[Table: 2013 new-product press-release counts for Synopsys, Cadence, and Mentor, split between tools and IP]
This actually surprised me. Somehow I had the sense that Cadence would dominate on the tools side, with Synopsys winning the IP side. Looking back, I think that my impressions were swayed by the fact that Cadence had two major new product releases: Tempus and Voltus. Synopsys had a larger number of less dramatic releases.

One trend did emerge that probably comes as no surprise, in retrospect. Yet I double-checked it by expanding my search beyond just the Big Three. That trend was the overwhelming dominance of ARM. Everywhere. There were more ARMs appearing than on a Hydra after a blender accident. I looked to see which companies had announced something that related to ARM: Cadence, Synopsys, Altium, ASSET, Xilinx, Altera, element14, IAR, Imperas, Ambedded Technology, [deep breath] Green Hills, NAG, Atollic, SuVolta, Micro Digital, Hillcrest Labs, Forte, Aldec, PLS, STMicroelectronics, and, after scouring six months’ worth of releases, I got bored and figured I’d pretty much made my point.

SPICE was another topic that got a lot of attention, with news from Tanner, Agilent, ProPlus, Berkeley Design Automation, Cadence, Linear (!), and Synopsys. This gets to the increasing importance of analog in SoCs, with more and more stuff that used to be on a separate chip joining the party on an SoC.

The second thing I did was to solicit some input from companies themselves, both small and large: what did they think the big events of the industry were, and what are the upcoming issues for 2014? Now… first of all, when I sit down to send a note to a list of folks, I’m not going to hit everyone. It doesn’t necessarily mean I like some folks better; it mostly means that either I was too lazy to Google around if I didn’t know whom to contact, or my brain did its usual thing about remembering only so much and omitting others for no particularly good reason. So I may stir up some angst from the folks I didn’t contact; if it makes you feel any better, not everyone I contacted responded. Other than that, well, did I mention unscientific? Apologies to anyone feeling left out. Seriously.

I honestly didn’t get a good sense of any big “aha” moments in EDA. Trends developed that will continue into 2014, and so I’ll discuss them in that context. Granted, some companies looked only at their own accomplishments or issues while others were broader in their assessments. So it fell to me to referee and filter (and I have no problem with that). So, in no particular order…

FinFETs and 14 nm

Yup, they’re coming, folks. Next year everyone is going to have to start taking FinFETs seriously as 14-nm designs (yes, and 16-nm designs) commence. And it’s clear that the attendant complexity has folks quivering in their boots a little. Uniquify put it simply: FinFETs will dominate the agenda.

Real Intent opined that this will pose a particular challenge for tools that don’t have quality-of-results (QoR) settings. That would suggest that all-or-nothing algorithms may be frustrated by the extra difficulty of managing the complexity, while tools with knobs can have the effort level dialed back to merely Herculean. That said, if dialing down QoR means poor results, it won’t be an acceptable solution.

Mixed-signal verification

This is something that Synopsys specifically brought up, but it ties to the whole SPICE thing above. The analog/digital boundary is hard to verify. And the tools used to do the job can take a long time to run, especially as mixed-signal blocks get larger. So this will continue to be an area that demands attention and improvement from tools.

SystemC

Forte articulated specifically what others also noted: SystemC is coming into its own in the US. It has particular benefit for driving both high-level synthesis (HLS) and verification. In other words, a single spec can be used both to tell the designers what to build and to let the testers decide whether it got built right. That not only saves time but also eliminates a potential source of errors, since a separately generated verification spec is not needed. The verification infrastructure can also be further leveraged to drive validation: post-silicon checkout.

Which leads us to the Big One:

System and full-chip verification

This seems to be the one that’s got everyone’s panties in a bit of a twist. A small part of it might be attributed to the fact that analog now constitutes part of the full chip, but other than Synopsys, this wasn’t the part that folks were concerned about.

One of the obvious main drivers here is scale: we’re getting to the billion-gate mark (at companies other than Intel, even). Both OneSpin and Breker noted that UVM is breaking down at the full-chip level: it actually adds complexity. So methods other than simulating-until-you-drop are needed.

Another contributing factor is that SoC design is no longer merely a team effort within a company: it’s the result of the inputs of multiple companies. That’s obviously due to the dramatic importance of IP, much of which comes from outside, but it also reflects the use of contractors; they’re essentially design departments that don’t belong to the company that will own the final design. So all of the pieces have to be validated – even those that have protected design details.

Several folks mentioned that the big winners in this are formal technology and emulation. More than one said that formal has gone mainstream this past year. I was cautious about this conclusion, since many of those making it were formal companies, and such a conclusion might be seen as self-serving. But none of them was an emulation company, so on that front, at least, they weren’t simply tooting their own horns.

That said, Synopsys specifically said that formal and static technology are wavering under the weight of the amount of verification they need to handle. That could be interpreted as a call for improvements in those tools, or it could be a suggestion that simulation is the answer. If the former, then this becomes a point of differentiation between different formal folks. If the latter, it could spark a lively debate. Which is always good fun.

In addition, we’ve seen before that formal technology is still hard to use outside of a few very specific applications. So if formal is destined to take on more responsibility, then ease of use is likely to raise its head as an issue to be solved.

The other impact here comes from software. OneSpin in particular noted that companies are trying to standardize on as few hardware platforms as possible. They then use software to differentiate. That brings in a whole new set of headaches. Breker, of course, is trying to leverage that in the verification space by auto-generating C-level tests and moving the verification – and design – focus to the creation of well-defined usage scenarios.

This reinforces the challenges of full-chip design, since the effort moves from circuit verification to system integration, making sure that everything hooks up well together. After all, at the risk of inviting politics (which I am explicitly NOT doing), the original finger-pointing when the HealthCare.gov website fiasco became apparent suggested that the pieces had all been tested – they just hadn’t been tested as a combined unit, working together. Whether or not the pieces were indeed fine, they certainly didn’t work well together. No SoC design team wants to see a similar cluster… thingy… happen on their watch.

It also suggests a growing acquaintance between EDA and embedded software design. After all, at the system level, certain things have to work. Whether they work due to great hardware or great software is a lower-level detail that shouldn’t be burdened by the fact that these have, traditionally, been very distinct fields. EDA, meet Embedded. Embedded, meet EDA. You guys are going to be roommates for a long time. You might even be considering a tighter commitment sometime in the future.

Low power

Finally, I was surprised that, while mentioned, power didn’t figure large and in charge. Only Jasper mentioned it as a top-line item. Maybe that’s because it’s obvious: everything these days seems to be about power, and there’s no sign that that’s going to change. 

So, in summary, it does seem to have been a quiet year. But big trends have been building nonetheless, and it appears that some of those will spill out into the coming year. Hopefully we’ll have lots of exciting things to talk about.

