EDA is Not Quite Dead After All
When companies become zombies, it’s not quite as obvious as with humans. Sure, the symptoms are similar - being dead but walking around as if still alive, no capacity for rational thought, pursuit of a single-minded hunger - all while the inside is rapidly decomposing. Oh, and then there’s the smell.
The Moore’s Law apocalypse is taking place in the world of custom chip design right now, and by all rights EDA companies should be among the walking dead - mindlessly scouring the engineering countryside for leftover morsels of brains.
Packet Plus Brings Debugging to Networking Engineers
Networking engineers are some of the best and brightest among us. There are good reasons for this. Designing networking equipment is a demanding discipline, spanning a wide gamut of areas from analog and signal integrity to digital design to software - and integrating all of these elements at something near their maximum performance potential. In order to get a competitive piece of network hardware out the door, you are designing at the bleeding edge of everything.
You are designing a new product as an SoC and need some processing power - not a huge amount - and you have tight power and real estate budgets. So you drop in an 8051 core. Job done? Well, not according to the folks at Cortus. These guys, a multinational mix of people based in the southern French town of Montpellier, whose backgrounds include working on processors for Intel, Bosch, Infineon, Siemens, and Synopsys, are likely to say that you may have made a poor move. Your real estate and power budgets can be achieved with a processor that will also give you a great deal more processing horsepower and a lower overall cost of ownership - their APS3 32-bit core.
During the Christmas break, I took time out from roasting an ox on the open fire, distributing presents to the assembled multitude of staff, chasing foxes across the rolling acres of Selwood Towers and feasting, wassailing and carousing to think about the past year and embedded technology stuff. I managed to overcome the urge and went back to roasting the ox, etc., but, now that the break is over, it seems worth having another think.
Where Green Pastures End
Just about every electronic technology on the market today has alternatives. Between custom chips, ASSPs, pre-built modules, embedded processors, microcontrollers, FPGAs, and a host of other silicon-based goodies, there are always numerous ways to solve any given problem. As engineers, we make our choice based on any number of criteria - cost, power, size, reliability, our familiarity and experience with the technology, our company’s preferences... all of them factor into our decision.
Perhaps you’re new to the US and you’re investigating some recipes to make. You’ve resigned yourself to the fact that, here, things are measured more by volume than by weight, and the measurement units have that peculiar non-metric feel about them that forces you to prove you can still do mental arithmetic. But the terms for things are sometimes different too, even if you come from another purportedly English-speaking country. We don’t do aubergines and courgettes; we do eggplants and zucchini.
A Look Back at 2011
Yawn! Another boring year of exponential improvement in capability, cost, and power consumption. Bo-o-oring. When will something truly exciting happen in electronics? It’s just the same old grind, year after year, with nothing all that interesting going on.
Moore’s Law is a harsh mistress. It sets the bar for our industry at an incredibly high level. If you manage a 2x improvement in everything you do every two years, there’s not really anything of interest to report. You met the standard - status quo - move on along - nothing to see here. Furthermore, if you try to brag that you’ve “doubled” this or “ten-times’d” that, you get thrown into the bin of “marketing-hypers” and your credibility plummets.
Horrendous quote of the day – “27% of the industry requires 3 or more spins.” This is the headline on a slide from Harry Foster, of Mentor, based on a large worldwide survey of silicon and FPGA implementers and their verification problems, conducted by Wilson Research in 2010. OK, the positive side is that 73% get it right by the second spin, and a further 23% by the fourth time round. But with spins costing multiple millions of dollars, you have to have a huge market for a chip to justify that number of spins, and a market that is prepared to wait for the chip to arrive, since, by most estimates, a re-spin is going to take three or more months. And re-spins are only part of the reason why 66% of projects are delivered late.
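As a quick sanity check, the quoted figures fit together like this. The sketch below only rearranges the percentages stated above; the residual share needing five or more spins is inferred arithmetic, not a number from the survey itself.

```python
# Rearranging the spin statistics quoted from the Wilson Research survey.
# Only the 73% / 23% figures come from the article; the rest is arithmetic.
by_second_spin = 73   # % of projects working within two spins
by_fourth_spin = 23   # additional % working by the fourth spin

three_or_more = 100 - by_second_spin                   # the headline 27%
five_or_more = 100 - by_second_spin - by_fourth_spin   # inferred residual

print(three_or_more)  # 27
print(five_or_more)   # 4
```

In other words, the headline 27% is just the complement of the two-spin success rate, and the two quoted figures leave roughly 4% of projects still spinning after four attempts.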
How EDA and Disneyland May Have More In Common Than You Think
Did you ever consider that EDA is a lot like Disneyland? I know it may sound a little kooky but check out this week’s Fish Fry to find out more! I interview Shawn McCloud (Vice President of Marketing - Calypto Design Services) about Catapult-C’s transition to Calypto, the ambiguous nature of the term ESL, and what Calypto brings to the EDA party. Also this week, I interview Shishpal Rawat (Accellera Chair) about why it’s important to have IP standards and where he sees IP and system design standards headed.
How do you get from an idea to an ASIC? Classically you shell out multiple dollars for a design tool-chain, and many more dollars for people to drive it. After a lengthy period of definition, design, and verification, you send data to a foundry, pay many more millions of dollars for a mask set, and get back, weeks later, a box of wafers. Now you spend even more money on probing the wafers, and, if you are lucky/skilled, you have a working device and can move into production – that is, after putting together a logistics chain for running the devices from the foundry through a test facility and then an assembly house and finally back to where you need to use the devices. If you are not lucky/skilled, you pay even more dollars for the design to be re-worked and more millions for a new mask set, and then you still have to put together your logistics chain.
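The economics of that flow can be sketched as a toy cost model. Every dollar figure below is a hypothetical placeholder for illustration - the article only says “multiple millions” - and `total_nre` is an invented helper, not anything from a real tool-chain.

```python
# A back-of-the-envelope NRE sketch for the classic ASIC flow described
# above. All figures are hypothetical placeholders, not article data.
def total_nre(tools_and_people, mask_set, respins, rework_per_respin):
    """Each re-spin adds a rework effort plus a fresh mask set."""
    return tools_and_people + mask_set + respins * (rework_per_respin + mask_set)

# e.g. $5M for tools and people, $2M per mask set, one re-spin at $1M rework:
print(total_nre(5_000_000, 2_000_000, 1, 1_000_000))  # 10000000
```

The point the model makes plain is that each re-spin costs not just the rework but a whole new mask set - which is why the lucky/skilled distinction matters so much.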
A Closer Look at Cell-Aware Modeling
Chip testing is always a delicate balance between testing enough and not testing too much. In reality, you want to find the “necessary and sufficient” set of tests for best quality at the lowest test cost. That’s a tough thing to get right.
Throw on top of that goal the fact that SoCs and other modern digital chips require automation to generate test vectors. Even if you find that perfect test balance, if you can’t figure out how to craft an algorithm to implement that balance automatically, it becomes an academic exercise.
Xilinx Premieres Premier
In the years we’ve been covering FPGAs, the technology and the market have been expanding in all dimensions. The devices themselves have grown exponentially bigger, faster, more capable, and more complex. The number and variety of applications have expanded too - with new application areas annexed into the FPGA arena on a regular basis. FPGAs have branched out from simple glue logic to complex system-on-chip integration devices in a wide gamut of markets and systems.
Combining Emulation and Offline Debugging
Today’s system-on-chip (SoC) designs are increasingly dependent on firmware and device drivers, given the challenges of controlling various components (including microcontroller, microprocessor, or DSP cores, as well as peripherals and interfaces). Accordingly, leading semiconductor companies are working to integrate software development and validation with silicon design and verification. One obstacle to such integration is the difficulty of effectively debugging early-stage embedded software. In this article we describe a way around this obstacle: a new debugging methodology for software and system-level integration.
Zuken Redesigns their Board Tools from Scratch
Anyone who’s ever done any serious remodeling of their home knows the big question: at some point, wouldn’t it really be easier just to mow down the existing structure and start over?
Little by little, as you add new ideas – “while you guys are at it” – the costs mount, and that’s even without considering the surprises that are inevitably encountered. And if you go from a two-dimensional home – one story – and add a third dimension, it gets crazier. Most single-story homes aren’t built strong enough to support a second story. So you end up doing things like building a separate support framework to hold up the new top floor or, even more crazily, hoisting the original house up to make it the top floor and then building a new first floor under it. (Yes, people do this.)
In 1999, DAC (the Design Automation Conference) was in New Orleans. The industry was at the height of its growth, and, when you got off the plane, it looked as though at least a third of the cabs had illuminated Synopsys advertisements on their roofs. There were almost 250 exhibitors, many of them recent start-ups, and it took forever to get from the show booths to the demo booths. In the evening, DAC vendor parties were everywhere, and, despite the humidity and heat, it was a wonderful time to hear about new ideas.