It was the best of tools, it was the worst of tools, it was the age of verification, it was the age of RTL power estimation. As we navigate through these tumultuous times of electronic design, we need a design tool compass to help show us the way, and Fish Fry is here to set the course. Frank Schirrmeister (Cadence Design Systems) and I hunt for the deep dark secrets of ARM-based design verification and examine how Cadence’s Palladium Emulation system fits into the verification landscape. Norman Chang (VP - Ansys) and I dip into the murky waters of ESD and together we swim toward a new solution that can make the most troublesome paths in our designs a little easier to navigate.
SNL’s “Weekend Update” Has Nothing on the Software Industry
Programmers, call off your drug pushers.
I know, I know… you think you’re helping. But really, you’re not. You think you’ve got my best interests at heart. You don’t. Your marketing people have convinced you that you’re “providing a service” to your customers. That you’re “ensuring quality.” Here’s where you can stick your quality, fellah.
I’m talking, of course, about automatic forced updates to my software. That’s right, I said “my software.” Not “your software,” and not “the software you created.” Once I lay my money down and get a copy of (excuse me: a license for) the software your company sold me, it’s not yours anymore. And the fundamental rules of property ownership that our society has observed for millennia make me responsible
Could the Little ESL Company be the Next Synopsys?
Back in the 1980s, chips were designed with schematics. There was a comprehensive design tool flow to support the schematic-based methodology, and the world had three big EDA companies - Daisy, Mentor, and Valid - which most folks simply referred to as “DMV”. Those three companies thrived on the tools that defined that schematic-based flow - schematic capture, gate-level simulation, timing analysis, place-and-route, and design-rule checking. Life was good, the world was stable, and folks made some decent chips.
Unfortunately, Moore’s Law kept going. Designs got bigger and schematics got unwieldy. We needed a new thing, and that new thing was language-based design.
While DMV were off trying to invent the next generation of schematic-based tools, a new company called Synopsys brought logic synthesis technology out of the lab and commercialized it. That product, Design Compiler, revolutionized chip design by raising the fundamental design abstraction level. It also shook the EDA industry at its roots. By the time the dust settled, we still had a “big 3” EDA landscape, but now the players were Synopsys, Mentor, and Cadence.
Uniquify and the Bitcoin Boom
Get out your pickaxes, canaries, and a high-powered ASIC or two - we're going mining! In this week's Fish Fry, we venture deep into the Bitcoin caves with Bob Smith (VP - Uniquify). Bob and I chat about how the Bitcoin mining race is heating up (literally) and how Uniquify is using their ASIC expertise to create super-powered machines mining today's hottest (and most controversial) virtual commodity. Also this week, I unveil a new unique Amelia-alternative to the current hardware-biased Bitcoin race. I've got two words for you: Bitcoin MMORPG. So strap on your headlamps ladies and gentlemen, we're going in.
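Since the episode turns on why ASICs dominate mining, it may help to show what the miners are actually computing. Bitcoin’s proof-of-work is a brute-force nonce search over a double SHA-256 hash. Here is a toy sketch in Python - the header bytes and difficulty below are made up for illustration and do not follow the real 80-byte Bitcoin header format:

```python
import hashlib

def mine(block_header: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce such that the double SHA-256
    hash of (header + nonce) has at least `difficulty_bits` leading
    zero bits, i.e. falls below the target."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~65,536 hashes on average at 16 difficulty bits
nonce = mine(b"toy header", 16)
```

At a toy difficulty of 16 bits, this takes tens of thousands of hashes on average; the real network difficulty is astronomically higher, which is exactly why purpose-built SHA-256 ASICs (and their waste heat) win the race.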
450mm Wafers are Still Some Way Away
Years ago I saw a television wildlife programme about penguins. One image that has remained in my mind was that of the hungry penguins clustering on the edge of the ice, needing to go in to catch fish, but each frightened to be the first in, as there might be an equally hungry leopard seal wanting a meal of penguin. Eventually, a penguin gets pushed in by his friends. If he survives, the rest then jump in after him.
This image has always recurred to me as chip manufacturers approach the next wafer size increase. They all want to get the benefits of a larger size wafer, but they are frightened to be the first to use the new equipment that will be needed. Eventually, someone makes the leap, and then the rest pour in.
Rolling the Dice and Spinning the Wheel
Take two steps forward and three steps back. Not all parts of our design process are created equal. In this week's Fish Fry, we examine one of the most painful, frightening, and frustrating parts of our design process - verification. My first guest is Tom Anderson (Breker Verification Systems), and we chat about formal verification, what Breker’s new verification technology TrekSoC-Si is all about, and where you can find the best vinyl in Silicon Valley. Then, continuing the formal V theme, we go to Vigyan Singhal, CEO of Oski Technology. Vigyan and I dive into the details of the "Decoding Formal Club." The first rule of "Decoding Formal Club"? Well, we're gonna break that one right here. Vigyan also reveals the secret behind the name "Oski". Also this week, I investigate how Netflix is looking to read your thoughts with a little help from Amazon's Cloud services. Better put on that foil hat!
The Mistake of Marketing Market Share
We humans are a competitive bunch. Our competitive instinct inspires us to many of our greatest accomplishments. It’s not enough to simply do a thing. We need to do that thing better than the other guy, or the other team. Engineering is no different. We may pretend that we are simply “problem solvers,” but the truth is - we don’t just want to trap mice, or even trap mice efficiently. We want to design the “better” mousetrap.
After all, that is the primary purpose of modern technology - to WIN!
Of course, before we can have a proper competition, we need a way to keep score. Unfortunately, since most of us are in technology as a business, we inherit from the business world one of the world’s worst scoreboards - the market share meter.
An Anti-Engineering Concept
Synopsys recently announced the results of a flow collaboration with Fujitsu. Modestly buried in the discussion was a mention of a 33% improvement in logic per area.
We’ve been at this game for a long time, and you’d think that the low-hanging fruit had long ago been picked. Which would leave us with the occasional 5-10% improvement in this and that after lots of algorithmic tweakage.
And yet here we are, in 2014, with a 33% improvement. Maybe I’m naïve, but that seems significant.
Cadence Acquires Forte High-Level Synthesis
High-level synthesis has always been the “personal jet pack” of electronic design automation. We all know that someday, “in the future,” we won’t need all these cars and roads and stuff. We’ll each have our own personal jet pack to take us quickly and directly wherever we want to go. And, when we get there, we’ll do all of our electronic designs in abstract, powerful, high-level languages and synthesize them with high-level synthesis (HLS) technology. Hunger and war will be things of the past, disease will no longer exist, and billion-gate semiconductor designs will be automagically conjured up from a few simple lines of easy-to-understand algorithmic code.
Timing analysis and RTL debugging? Bah! Those will be problems of the past - like repairing broken wagon wheels. In the future, our designs will be correct-by-construction masterpieces,
Cadence Acquires Forte
Here at Fish Fry, when big news breaks, we swim to it. When the EE landscape changes, we evolve, lose our gills, begin breathing air, and start walking on land. This is one of those weeks. When we got wind that Cadence was in the process of acquiring Forte, we jumped on the chance to get the goods on this groundbreaking news story. My guest this week is Craig Cochran (Vice President, Corporate Marketing at Cadence Design Systems), and we discuss the growing adoption of high-level synthesis and how Forte plays into the Cadence system-level design flow. Also this week, we check out how the Hybrid Memory Cube aims to take out DDR, and how FPGAs can help you with your HMC heavy lifting.
Fish Fry Takes on DesignCon 2014
The lights: Fluorescent. The carpet: Padded. The lanes: Routed. Where in the world could Fish Fry be? DesignCon 2014, of course. In this special DesignCon episode of Fish Fry, we launch ourselves into the multi-faceted world of electronic design with a couple of interesting interviews. My first guest is none other than Kilopass CEO Charlie Cheng. Charlie and I get down to the nuts and bolts of non-volatile memory, and Charlie gives me his take on where he thinks the technology is headed over the next few years. Next, I chat with Mark Toth (CadSoft) about CadSoft's ubiquitous EAGLE PCB Design Software, and I get the inside scoop on the results of their recent PCB design survey.
FPGA Design Starts with You
In the early days of FPGAs, we did our work with schematics. FPGAs were small, and you could stitch together a little gate-level schematic pretty easily. Then, it was just a matter of running the FPGA tool flow, and a few seconds later you had a bitstream ready to program your cute little programmable logic device. It was all pretty easy, and - with schematics being the universal language of EE - there wasn’t a lot of special skill required.
About fifteen years ago, FPGAs outgrew gate-level schematic entry. We moved on to hardware description languages like VHDL and Verilog and began fighting with logic synthesis tools to try to get our designs to behave. This actually narrowed the field of FPGA designers somewhat. The universe of “people who knew HDL and synthesis” was quite a bit smaller than “people who could draw a decent schematic.” The FPGA companies didn’t care. They were chasing the lucrative communications and networking infrastructure market, and the folks writing the big checks had plenty of HDL experts.
A New Step, Championed by Atrenta
The concept of “high risk, high reward” doesn’t hold with semiconductors. Heck, we have an entire industry built on the notion of risk reduction: EDA. EDA tools are enormously expensive to develop and are therefore expensive for users to acquire.
There’s only one reason someone would spend that much money on tools: because it’s going to prevent even more gargantuan losses if something goes wrong with the chip.
OK, granted, much of what EDA is about is productivity – just try cutting a few billion transistors out of rubylith and see what that does for your carpal tunnels. So, really, it’s the 70% of EDA that’s about verification that we’re talking about here.
Forecasting the FPGA Future
The ball has dropped, the bubbly sipped, and the resolutions resolved. 2013 has ended, and before us we have a new year, a new universe of opportunity, and a crazy cauldron of activity in our beloved world of programmable logic. It’s time to throw down, gaze into the crystal ball, read the tea leaves, interpret the Tarot, and extrapolate the trend lines. Here, then, is our unflinching forecast for FPGAs in the months and years to come.
Before we fire up our forecast fest, we should nail down what we mean by “FPGA.” After all, the definition has been morphing, expanding, and shifting over the years, and even the companies with thousands of employees dedicated to nothing but making and selling FPGAs don’t seem to agree on the current meaning of the acronym. Ours will be simple - if it has a look-up table (LUT) cell, it is an FPGA. (Yes, we hear the screams out there. Bear with us. It will all come out in the wash.)
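To make that definition concrete: a k-input LUT is nothing more than a 2^k-bit truth table indexed by the input signals, which is why one LUT can implement any Boolean function of its inputs. Here is a minimal Python sketch - the function and table names are ours, purely for illustration:

```python
def lut4(truth_table: int, a: int, b: int, c: int, d: int) -> int:
    """A 4-input LUT: a 16-bit truth table indexed by the four inputs.
    Any Boolean function of four variables fits in one such table."""
    index = (d << 3) | (c << 2) | (b << 1) | a
    return (truth_table >> index) & 1

# Example: 4-input AND is the table with only bit 15 set.
AND4 = 0x8000
assert lut4(AND4, 1, 1, 1, 1) == 1
assert lut4(AND4, 1, 0, 1, 1) == 0
```

The FPGA tool flow’s job, in essence, is to carve a design up into functions small enough to fit these tables and then wire the tables together.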
Did Anything Happen?
2013 is coming to a close, and this is usually a time for reflecting on what’s happened in the past year and what’s going to happen in the coming year. The thing is, though, when I sit back and reflect, well, I don’t know; it just seems like 2013 was a quiet year for EDA.
So I took a couple of approaches to reviewing the year. One was to see what the Big Guys did, and the other was to solicit some other opinions as to what’s in and what’s out.