Don’t Give Up on Simulation Yet!
It has been widely documented that the complexity of system-on-chip (SoC) designs is increasing exponentially, with most SoCs now including multi-threaded processors and many memories with multiple clock domains.
The ITRS report of 2010 shows that the number of processors in portable consumer devices is expected to increase ten-fold between 2009 and 2016, with the performance of each processor leaping 50x over the same time period. ITRS prognosticates that consumer SoC devices could embed 75 processors 10 years from now. Of course, this will be achieved under the constraint of a constant power budget. And, to top it all, design and verification schedules are shrinking. Somehow, it is not surprising that a study conducted by Mentor Graphics reported that more than 70% of designs need at least two respins.
Altium Alters Course
Altium has long been a standout in the EDA industry. The company got its start as Protel - a supplier of affordable desktop PCB layout solutions. When the big EDA suppliers were exclusively selling expensive, workstation-based, enterprise-level board design systems, Altium (Protel) provided a strong, usable desktop solution for everybody else - the people who didn’t have a huge design tool budget.
As Protel evolved into Altium, its differences from the pack became more pronounced. Altium became the company of vision, with the goal of supplying the masses with a comprehensive tool suite including design capture, FPGA design, embedded software development, and trusty old board layout. They wanted to give you an affordable desktop solution that could take your product design from soup to nuts, as long as you weren’t doing custom IC design.
A Look at Coventor’s Element Library
“I need a brush.”
What would you do given such an instruction by someone to whom the response, “Can you be more specific, please?” would be considered inappropriate? It’s a hard request (or demand) to satisfy if you know absolutely nothing about his or her intent. It’s almost as bad as the “Bring me a rock” theory of management, except that that’s simply a way of ensuring that your employees are never quite sure if they’re doing the right thing, and so they remain nervous and stressed; putty in your hands. No, in this case, we’re just assuming poor communication skills, nothing Machiavellian.
The Future Belongs to Programmers
Three thousand dollars is a lot to pay for a radio.
A friend of mine recently bought herself a nice new car. Not Rolls-Royce or Ferrari nice, but more in the Mercedes/Jaguar/Lexus category. And one of the optional upgrades she decided to spring for was a $3000 “Premium Comfort” package. Being both an engineering nerd and a car nut, I was curious about what actually went into this $3K bundle of goodies.
From what I could tell, it was mostly just firmware upgrades for little things like the keyless entry, cruise control, or GPS features. She wasn’t paying for any actual hardware, just for bits. The only tangible item in the whole option package was an upgraded radio, which probably cost the automaker about $75 in extra hardware. So by implication, the firmware upgrades cost my friend about $2925.
The Lighter Side of EE in 2012
Here at EE Journal, we have always believed that engineering is fun. As engineers ourselves, we know that there is a special kind of reward in solving problems and creating new and interesting things with technology. We have always believed that one of the things that really differentiate EE Journal from other trade publications is our sense of humor and fun. We don’t think engineering has to be a humdrum drone of microwatts and gigabits, and we know you don’t either.
Our Best of 2012
It was an awesome year here at EE Journal. Technology continued to evolve at a breakneck pace, and the EE Journal editorial team continued to drink beer. Oh, and (luckily, prior to those beers) we wrote down our thoughts and observations about what was going on in the world of electronic design. Apparently some of what we wrote was either interesting or inflammatory enough that bazillions of you found your way to the pages of EEJournal.com to read, comment, critique, disagree, or just stare in amazement at the pictures.
It's the slow decline of December. We’re wrapping up our projects, toiling away at expense reports, and lining up our ducks for 2013. Speaking of lining up ducks, we’re looking into the future this week - the future of EDA. Where in the heck is the tool market headed next year? How will recent major mergers and acquisitions affect the design tool landscape? My guest is Mike Gianfagna from Atrenta and we’re gonna talk about all of this and more.
In the last decade we have seen the process of timing signoff become increasingly complex. Initial timing analyses at larger process nodes such as 180nm and 130nm were concerned mostly with operation at worst-case and best-case conditions. The distance between adjacent routing tracks was such that coupling capacitances were dwarfed by ground and pin capacitance. Hence, engineers seldom looked at the potential issues associated with cross-coupling and noise effects. It was simply easier to add a small amount of margin than to analyze crosstalk.
Starting at 90nm, and even more prominently at 65nm, an increase in coupling capacitance due to narrower routing pitches and taller metal segment profiles resulted in crosstalk effects becoming a significant concern.
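The trend described above can be illustrated with a first-order capacitive-divider estimate: the noise coupled onto a quiet victim net is roughly the aggressor's swing scaled by Cc / (Cc + Cg). The capacitance values below are illustrative assumptions for the sake of the sketch, not foundry data:

```python
# First-order crosstalk estimate: a quiet victim net capacitively
# coupled to a switching aggressor sees a noise bump of roughly
#     V_noise ~ V_dd * Cc / (Cc + Cg)
# where Cc is the aggressor-victim coupling capacitance and Cg is
# the victim's total capacitance to ground (wire plus pin loading).

def crosstalk_ratio(cc_ff: float, cg_ff: float) -> float:
    """Fraction of the aggressor's voltage swing coupled onto the victim."""
    return cc_ff / (cc_ff + cg_ff)

# Illustrative (made-up) values in femtofarads per unit wire length.
# At older nodes, ground capacitance dominates; at 90nm/65nm, the
# narrower pitch and taller wire profiles push the coupling term up.
nodes = {
    "180nm": (10.0, 90.0),   # (Cc, Cg): coupling is a small fraction
    "130nm": (15.0, 75.0),
    "90nm":  (35.0, 55.0),
    "65nm":  (50.0, 45.0),   # coupling now exceeds ground capacitance
}

for node, (cc, cg) in nodes.items():
    pct = 100 * crosstalk_ratio(cc, cg)
    print(f"{node}: ~{pct:.0f}% of the aggressor swing couples onto the victim")
```

With the assumed numbers, the coupled fraction grows from about 10% at 180nm to over 50% at 65nm, which is why adding a blanket timing margin stopped being a viable substitute for real crosstalk analysis.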
MEMS is 20 years behind ICs.
So says MEMS consultant Alyssa Fitzgerald of AMFitzgerald. A lot can happen in 20 years – and it could well be argued that MEMS doesn’t have 20 years to catch up. If it has a choice. And if it can even catch up completely.
The issue is the “one device, one process” component of Yole’s MEMS Law. This is something you would never see with ICs, especially in today’s fabless/foundry world. With ICs, the foundry has a process, it works a particular way, it has been thoroughly characterized twelve ways from Sunday, and those results have been incorporated into increasingly sophisticated models that EDA tools can use to predict with reasonable accuracy how a particular circuit will behave.
Synopsys Upgrades HAPS
Verification and test have always faced a tricky paradox: How do you build equipment to test and verify the biggest, fastest devices ever created?
After all, it stands to reason that the tester has to be faster than the thing it’s testing, and the prototype has to be bigger than the thing it’s prototyping. It means those folks always have to run ahead of the fastest runners just to keep up with the problem.
When prototyping large SoC designs, this issue has always been handled by throwing a wall of FPGAs at the problem. Even though this poses significant challenges with issues like design partitioning and mapping the design to an FPGA-friendly format, it has been the most effective method available for getting a usable prototype up and working.
The military has dealt with this for years. And first responders ran headlong into the issue with 9/11.
You have a localized entity – a police department, a platoon, Red Cross folks on the front line – and it has its way of communicating internally. But when it has to work with another group – the fire department or the police from another town or perhaps a platoon from a different branch of the service – then suddenly they have to figure out how to patch all of these things that work fine on their own into a cohesive whole, getting messages from one to the other without any of it getting lost in the jumble at the boundaries.
MIPS Technologies Acquired by Imagination Technologies
It’s the circle of life. The great wheel of existence. One door closes; another opens. The end of a chapter, the beginning of another. Pick your favorite metaphor—MIPS Technologies has been packed up and sold.
That’s actually pretty good news for MIPS’s 160-some employees, but it still feels like the end of an era to me. One of the darlings of the RISC computer era, and an innovator in computers, microprocessors, and business models, is now just a division within the larger company of Imagination Technologies.
Xilinx Discusses 20nm
The two big FPGA companies want to be sure that you know they’re ahead.
They always have. It isn’t because you really needed to know, or because one or the other of them being ahead at any given time had any long-term industry-shaping ramifications. It’s just that this myopic, tit-for-tat, red vs blue, Hatfield and McCoy, be-the-first-to-blink behavior is, according to recent economic research, the optimal solution for members of a symmetric pre-emptive duopoly.
Or, maybe both sides just really hate those other guys.
A few weeks ago, Altera announced their vision for FPGA technology on the upcoming 20nm node. Now, it’s Xilinx’s turn. Does this mean that Altera is 2 months ahead of Xilinx in the all-important “next process node”?
The next process node is coming faster and faster with every passing press release. This week we’re taking a closer look at the brand new 14nm test chip rolled out by Cadence, ARM, and IBM, and we’re looking into the new nanotube memory technology being developed by IMEC and Nantero. Speaking of breaking new ground, my guest this week is Brad Quinton (Tektronix) and we’re going to chat about the most recent developments in FPGA prototyping, what Brad sees as the biggest problems for FPGA prototyping today, and why embedded instrumentation can be more effective than physical instruments.
Embedded Instrumentation Boosts Boards to Emulator Status
FPGAs are clearly the go-to technology for prototyping large ASIC/SoC designs. Whether you’re custom-designing your own prototype, using an off-the-shelf prototyping board, or plunking down the really big bucks for a full-blown emulator, FPGAs are at the heart of the prototyping system. Their reprogrammability lets your prototype run at hardware speed, orders of magnitude faster than simulation-based methods. If you’re trying to verify a complex SoC or write and debug software before the hardware is ready, there is really no option but an FPGA-based hardware prototype.
There are basically two options for FPGA-based prototyping - simple prototyping boards and emulators.