
DAC, Death, and the Future

Who Will Design Your SoC?

This year’s Design Automation Conference in San Diego was, as has become tradition, kicked off by Gary Smith’s presentation on the state of the EDA industry and EDA’s role in the design of the coming generations of electronic devices.  As usual, the presentation was a mix of data collected from numerous industry sources, interesting charts and graphs projecting market behaviors into the distant future, and educated analysis mixed with wild speculation on the complex dynamics of technology and money that is the semiconductor industry.

The next day, Cadence hosted a panel on the challenges of 20nm design, where there were discussions of the staggering issues that will be faced by design teams creating semiconductor devices at that node.  The combination of technical challenges for delivering on the next round or two of Moore’s law is daunting indeed, and design at those tiny geometries will be an activity exclusively reserved for the most knowledgeable and best funded design teams in the world.

According to the conventional wisdom floating around at this year’s DAC, designing an SoC on the current leading-edge processes requires a team of 100 to 200 engineers and a budget somewhere in the range of $25M-$50M.  That’s for a single chip design.  If we do some quick math on that, there are several things we can infer.  First, if you assume that a fully-burdened engineer costs a company an average of $200,000 per year (counting salary, benefits, office, equipment, espresso, pizza, etc.), and the average custom SoC design takes 18 months to two years to complete, almost all of that $25M-$50M design cost is accounted for by the size of the engineering team required.  While we hear a lot about mask costs and other non-recurring engineering (NRE) charges associated with the fabrication process itself, those costs are still dwarfed by the engineering payroll: it simply takes too many engineers to design a chip.
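That quick math can be sketched directly.  This is a back-of-the-envelope check using the round numbers quoted above (the $200,000 fully-burdened rate, team sizes, and schedules are the article’s estimates, not measured data):

```python
# Back-of-the-envelope check on the engineering share of SoC design cost.
# All figures are the article's round numbers, not measured data.
FULLY_BURDENED_COST = 200_000   # $/engineer/year (salary, benefits, espresso, pizza...)

def design_cost(engineers: int, years: float) -> float:
    """Engineering labor cost of one SoC design, in dollars."""
    return engineers * FULLY_BURDENED_COST * years

low = design_cost(engineers=100, years=1.5)
high = design_cost(engineers=200, years=2.0)
print(f"engineering cost: ${low/1e6:.0f}M - ${high/1e6:.0f}M")
# prints "engineering cost: $30M - $80M"
```

Even the low end lands at $30M, which already covers the bottom of the quoted $25M-$50M budget range before a single mask is cut.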

Like any task that demands too much human labor, the solution to the SoC engineering burden is better tools and more automation of engineering.  This is the responsibility of the EDA industry – which is represented here at DAC.  At the conference, we have also heard that the EDA industry currently spends somewhere in the realm of $1B developing tools to support each new process node, and that cost is increasing as the complexity of the required tools continues to rise.

That number is problematic on both ends. 

First, in a process with a huge deficit of automation, the world is investing only the equivalent of twenty to forty chip designs ($1B at $25M-$50M per design) in the tools that address the single biggest problem – skyrocketing costs.  EDA can’t afford to invest more, however.  They don’t make enough.  With competition in the tools business, the first thing to go is pricing discipline.  EDA companies lower prices to compete with each other, and the result is a significant reduction in the total money coming into EDA.

To follow through this logic circle a bit:  Almost nobody can afford to design custom SoCs today because it’s too expensive.  It’s too expensive because the EDA tools aren’t good enough.  The EDA tools aren’t good enough because the EDA industry doesn’t earn enough money to invest sufficiently in R&D to make better tools.  The EDA industry doesn’t earn enough money because there aren’t enough people designing custom SoCs.  Loop back to step one.

The EDA industry was born because there was an economy of scale in creating design tools that worked across a broad set of semiconductor fabs.  Today, with the incredible consolidation that has happened in the fab world, it’s not clear that economy of scale still exists.  We might well have better tools if each semiconductor fabrication company owned its own tools – and distributed them – just as in the old days.  Then, fabs could compete on tools as well as process.  If independent EDA companies could make better tools, they could try to OEM them to the semiconductor companies.  This model would allow the cost of the tools to be amortized over silicon revenues, and each semiconductor company could decide how much to invest in tools to keep its offering competitive.

Before you go calling this idea crazy, it’s exactly what is working today in the FPGA business.  The majority of the world’s electronic designers DO get tools from the same place they buy chips – from their FPGA supplier.  Those designers pay almost nothing for their tools, yet the tool development efforts thrive.  In the FPGA business, tools are a huge part of the competitive picture.  If an FPGA company wants to beat their competitors in performance, they can optimize their hardware architecture, or they can tune their synthesis and place-and-route software to give better results.  Either approach gives them a competitive edge. 

With the current and coming generations of FPGAs, enormous capabilities are on tap.  Complex SoCs will be dropped on our desks for a relative pittance, and the whole embedded subsystem will already work.  If your SoC is something like a few ARM processors with memory and peripherals connected via standard protocols – along with some commercially available IP blocks for standard functions, plus a little bit of magic your team adds itself – you can probably do it in an FPGA, and most of the work is done before you buy the chip.  Compare that with the ASIC/COT/SoC design flow, where a reported 70% of the engineering effort is spent on design verification, and the FPGA folks are almost done before they start.  Walk the aisles at DAC and look at the tough problems being tackled by the tool suppliers.  For 90% of those problems, one could say, “Not applicable to FPGA design.”

As a result, with the FPGA option, designing a complex SoC takes one to five engineers instead of 100-200.  Amortize those cost differences over the expected volume run of your product and see how much each of those custom SoCs really costs you.  Oh, and don’t forget that, a couple of years from now, when you have a better idea for your product or it has to support some new standard to stay current, your ASIC/SoC-based product will need a do-over.  Your FPGA-based system may be fine with a simple update in the field.
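That amortization is simple to sketch.  The NRE figures and per-chip prices below are illustrative assumptions, not numbers from the article (real volumes, die costs, and FPGA prices vary enormously); only the shape of the comparison matters:

```python
# Hypothetical amortization sketch: spread the design (NRE) cost over unit volume.
# All dollar figures and the volume are illustrative assumptions.
def per_unit_cost(nre: float, unit_cost: float, volume: int) -> float:
    """Effective cost of each shipped chip once design NRE is amortized."""
    return unit_cost + nre / volume

VOLUME = 100_000  # assumed lifetime unit volume

# Custom SoC: cheap die, but a $35M design effort (mid-range of the quoted budgets).
soc = per_unit_cost(nre=35_000_000, unit_cost=15.0, volume=VOLUME)

# FPGA: pricier chip, but ~5 engineers for 1.5 years is roughly $1.5M of NRE.
fpga = per_unit_cost(nre=1_500_000, unit_cost=60.0, volume=VOLUME)

print(f"per-unit: SoC ${soc:.2f}, FPGA ${fpga:.2f}")
# prints "per-unit: SoC $365.00, FPGA $75.00"
```

At high enough volumes the custom SoC’s lower unit cost eventually pays back the NRE, but at the moderate volumes most products ship, the engineering bill dominates and the FPGA wins.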

The combination of the EDA problem and the FPGA solution paints a pretty compelling picture of the potential future of custom SoC design.  If your business card doesn’t say something like “Apple” or one of about ten other mega-titles, you probably won’t be doing any custom SoC design, and you may also not be doing much business with the main companies represented here at DAC.  That’s a problem that EDA needs to solve even more urgently than the 20nm process node, double patterning, or through-silicon vias.  It’s a problem of the survival of their industry.  I hope, for their sake, that they can sort it out.
