A Few Rounds on EDA360

Several months ago, Cadence announced a strategy called EDA360. It was accompanied by a whitepaper that was eventually made public. It was, and is, billed as the brainchild of Cadence CMO John Bruggeman and was issued as a call to arms – a manifesto, even – to the EDA industry.

It can be summarized as, “It’s the software, stupid.”

And it caused a bit of a stir, some of it the kind Cadence would like, some less so.

Well, now that some time has passed and everyone has calmed down a bit, it seems like a good moment to come back to this topic in the hope that rational heads can prevail. What does this mean for Cadence and for the industry?

OK, that sounds way too generic. Here’s the real question people ask: is this just marketing hype, or is there some substance behind it? The question comes up because most of the examples Cadence gives of EDA360 being put into play point to products that existed long before EDA360. Which almost makes it look like the strategy is really a repositioning of existing products.

In fact, to hear Cadence describe what’s happened since then, there’s both substance and marketing going on here. But… I’m getting ahead of things. First, let’s back up and review what the hoo-hah is all about.

Motivated by a so-called “glass ceiling” where EDA can’t seem to raise its functionality above synthesis (not to mention raise its collective revenues), the concept is that creating chips shouldn’t be the primary focus of EDA. The industry should travel further upstream to that which executes on the chips: the software application. Or applications (a system architect’s “application engine” is a chip designer’s “mode”).

There was a time when chips did things. They no longer do so much. Software does things, and the chips let the software do its thing. So you could say that designing a chip without knowing the software is like filling a spice rack without knowing which type of cuisine you’re going to cook. It’s not all just paprika and garlic salt anymore.

Cadence divides the effort needed to bridge between application and IC into three layers. On the inside is the IC layer, which covers most of what is done in EDA today. It’s how you make a chip.

Today’s bigger chips are monolithic embedded systems – the so-called System on Chip (SoC). The next layer that Cadence defines is SoC realization. This takes the SoC itself and slathers over it the basic software required to connect the hardware with other software. It’s a patina of software, a kind of passivation layer (to pacify software guys) that makes hardware look like software. Basically, drivers. And debug support. Low-level stuff. They also include integration of hard silicon IP here.

At the top is system realization. This pretty much includes everything else: operating system, virtualization, middleware, application.

Pretty simple. Software on top, hardware on the bottom, and an adaptation layer in between.

And with something like this, there’s always more than one schema that can represent the order of things, so that, while people can quibble about whether it’s right, it’s definitely not wrong.

And here’s where the questions start. Cadence and the rest of EDA have most visibly occupied the lower (or inner) layer: the silicon one. Actually… that’s not quite true. Mentor, for example, has been doing embedded stuff and hardware/software co-design since before “seamless” was trendy. But, in fairness, if you look at the amount of focus on what it takes to implement designs at 28 nm, well, yeah, there’s a lot going into silicon, no doubt.

And, in fact, when you look at Cadence’s products and their recent “holistic approach” announcement, it focuses on the silicon side of things. (That’s not my interpretation; they’re very up front about it.)

A discussion with Cadence’s EDA360 Evangelist Steve Leibson sheds some further light. (As an aside, being an evangelist isn’t always an enviable job… it can carry the connotation of a message that must be delivered to ears that are not always receptive…). He points out that Cadence has completely restructured along the lines of the three layers, and that people are busily executing away against the vision. Not a small matter for a large company.

So is it substance or marketing? Yes. Both. It’s clear they’re tying existing products into the vision. That’s all they have for now. And, frankly, if you have a vision in which your current products have no place, you’ve got a problem. It also sounds like they’re working on a lot of stuff they’re not talking about yet. So, until they announce something new, it’s all going to be about the old stuff.

So if they are doing more, are they tackling the entire vision or just parts of it? Here things get a bit more vague. Leibson claims that Bruggeman’s intention is to “create cooperation where competition doesn’t do any good.” Again, it’s hard to point to specific new initiatives since the two examples mentioned (CPF and UVM) predate the strategy.

The whole concept of standards and openness and such seems to hit a bit of a sore spot. After all, CPF does co-exist with UPF, a less-than-desirable situation. And some of that dispute was about openness. So rather than open all of that up again, let’s just say that, to paraphrase Steve, human nature has to be factored into all of this. Bottom line: Cadence isn’t necessarily taking on the entire thing; they’ll be looking for willing collaborators. Exactly how much they take on themselves remains to be seen.

Exactly what Cadence plans to do on its own is further clouded by their position on having a full flow. Steve correctly points to the countless IC flows out there and to the number of people employed not in creating ICs, but in managing the flows. These flows comprise numerous tools (each presumably considered “best of breed”) stitched and scripted together into a tottering, wobbly, amorphous mass that demands silence in its proximity lest the whole thing crumble in the acoustic wake.

So Cadence’s solution is simple: you should use a full-flow provider. (Us.) Don’t mess with all those point tools out there. One flow can easily provide designs that customers “will be happy with.” Which sounds like “good enough is good enough.” Which is often true. What about in this case?

When asked about the “best of breed” concept, Steve points out that the notion is actually very difficult to define since most tools provide a range of results depending on the design, and that the ranges from competing tools tend to overlap a lot. So every company can point to the particular set of benchmarks that prove its tool is the best. That’s not so much “good enough is good enough” as much as it is “there’s no such thing as ‘best of breed’; the tools are more or less all the same.”

When asked about the start-ups that tend to bring focused new innovation and then get gobbled up by some big guy, he just doesn’t see the little guys being able to out-innovate the well-funded large companies to the point where they could create that much differentiation. Yeah, you might be 10% better, but that’s not going to turn many heads if you’re a small unknown company.

That doesn’t necessarily square with what’s happened in the industry over the last while, but, then again, this whole full-flow discussion goes back and forth, and each company has to re-hash the issue at some point. Magma was focused on full flow and then decided to limit their efforts to areas where they could provide unique value; they’ll argue that full-flow provides no value. (Ooo, wait, if they changed their mind again, then they’d have come full circle and could have a clever new campaign about 360 deg… oh… wait… never mind.)

Something tells me we won’t be settling the flow issue any time soon.

So, for all the Sturm und Drang associated with this initiative, we’ve uncovered very few areas that would drive more than perhaps some animated discussion. So what is it about this that got everyone’s goat? Do people think the strategy is fundamentally wrong?

The answer to that last question is a vigorous “No.” No one is saying that the concept of software taking primacy is wrong. In fact, plenty of people have been saying it for quite a while. And that’s where people get annoyed: Cadence has been pointing to recent keynotes and other speeches (and only recent ones) in which Synopsys’s Aart de Geus and Mentor’s Wally Rhines speak of the importance of software, as evidence that Synopsys and Mentor are coming on board and endorsing the Cadence message.

And the hackles rise.

Both Synopsys and Mentor say they’ve been articulating this message and executing against it for some time now.

Synopsys’s John Chilton says, “As software has consumed a larger part of the design cycle, the complexity problem increasingly has become a software problem. In fact, this has been a key plank in Aart’s messaging for several years.” He points to their efforts and acquisitions in the IP and systems space as examples of execution both organically and by going outside for technology.

Mentor’s Ry Schwark notes that, “We’ve been in the embedded software market… for about 15 years because we believe in the importance of software in design.  Our embedded software is in more than half of all cellphones on the market.  Odds are our embedded software is in your handset.”

Feathers clearly remain ruffled. Said one person who declined to be named, “[It’s] like crashing a party then telling the host you’ve decided it’s your house.”

So, beating a discreet retreat, let’s summarize what this all amounts to.

It comes back to this: Cadence is saying, “It’s the software, stupid,” and re-orienting their business around it.

(And everyone else is saying, “Yeah, we knew that.”)
