Cadence Acquires Forte High-Level Synthesis
High-level synthesis has always been the “personal jet pack” of electronic design automation. We all know that someday, “in the future,” we won’t need all these cars and roads and stuff. We’ll each have our own personal jet pack to take us quickly and directly wherever we want to go. And, when we get there, we’ll do all of our electronic designs in abstract, powerful, high-level languages and synthesize them with high-level synthesis (HLS) technology. Hunger and war will be things of the past, disease will no longer exist, and billion-gate semiconductor designs will be automagically conjured up from a few simple lines of easy-to-understand algorithmic code.
Timing analysis and RTL debugging? Bah! Those will be problems of the past - like repairing broken wagon wheels. In the future, our designs will be correct-by-construction masterpieces.
The Case For
There is some debate these days about teaching our kids to “code” and integrating software development into the mandatory curriculum along with reading, writing, and arithmetic. Many in our industry feel that this trend is misguided - including our own Dick Selwood, who recently wrote a piece making the case that programming is the wrong thing to be teaching. The arguments range from pointing to some of the very visible recent failures of software-based systems to worrying that programming is a passing fad that could leave our youth prepared for an obsolete profession.
I strongly disagree.
Software-based systems fail because modern software is the most complex creation ever attempted by humans. Programming is a multi-disciplinary, massively-collaborative art that stresses every known attempt at complexity management, cooperation, verification, and human understanding. Software fails because nobody has yet figured out a way to make it never fail. And, the problem just keeps getting worse because, every single day, software becomes more complex and more difficult to create and verify.
Bitcoin Mining for Fun and Profit
Every ten minutes, the green flag drops. Billions of dollars’ worth of exotic hardware strain the very limits of physics and engineering wherewithal. Elegant efficiency squares off against brute force barbarism in a contest of skill, cunning, and nerve. Enormous quantities of coulombs course through countless traces, giving off enough waste heat to warm a city. This is balls-to-the-wall, heavy-iron, high-stakes racing at its best.
Strapped into the competitive cockpits are the unlikeliest adrenaline junkies - computer and software engineers. This sport isn’t soaked in sweat and combustibles. There is no burning rubber or white-hot exhaust. There are no grandstands or cheering crowds. This race is about the pure ego food of getting to the finish before anyone else and claiming a modest prize.
Welcome to the weird world of bitcoin mining.
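The “race” that restarts every ten minutes is Bitcoin’s proof-of-work: miners brute-force nonces, hunting for a block header whose double-SHA-256 hash falls below a network-set target, and whoever finds one first claims the prize. Here’s a toy Python sketch of the idea - the header bytes and difficulty are made up for illustration, and real miners hash 80-byte headers at astronomically higher difficulty:

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Try nonces until the double-SHA-256 of header+nonce falls
    below the target. More difficulty bits = a smaller target = a
    harder race."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # the checkered flag: a winning nonce
        nonce += 1

# 16 difficulty bits succeeds after ~65,000 tries on average;
# the real network grinds through quintillions of hashes per second.
winning_nonce = mine(b"toy block header", 16)
```

The same brute-force loop is exactly what the exotic hardware above implements - first in software, then in FPGAs, and eventually in dedicated ASICs, each squeezing out more hashes per joule than the last.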
FPGA Design Starts with You
In the early days of FPGAs, we did our work with schematics. FPGAs were small, and you could stitch together a little gate-level schematic pretty easily. Then, it was just a matter of running the FPGA tool flow, and a few seconds later you had a bitstream ready to program your cute little programmable logic device. It was all pretty easy, and - with schematics being the universal language of EE - there wasn’t a lot of special skill required.
About fifteen years ago, FPGAs outgrew gate-level schematic entry. We moved on to hardware description languages like VHDL and Verilog and began fighting with logic synthesis tools to try to get our designs to behave. This actually narrowed the field of FPGA designers somewhat. The universe of “people who knew HDL and synthesis” was quite a bit smaller than “people who could draw a decent schematic.” The FPGA companies didn’t care. They were chasing the lucrative communications and networking infrastructure market, and the folks writing the big checks had plenty of HDL experts.
Pandora Walks Into a Bar...
John was up earlier than usual. The body monitor he wore on his wrist had awakened him during the perfect phase of his sleep cycle and told him that his heart rate and respiration were slightly elevated. It was almost certain - even though he exhibited no symptoms - that he was coming down with the flu. John’s smart phone also gave him the news, telling him additionally that he should not go into the office today in order to avoid infecting others. He was further informed that he had likely been exposed to the flu two days earlier while riding a commuter train - by the ten-year-old boy he sat next to. That boy was now mid-stage flu with a temperature of 102.1 and he was the 1,035th person confirmed with this particular strain. Antiviral medications had been ordered automatically for John, and they would be delivered by courier to his home within a few minutes. John’s wife and children had already been informed of the situation and had donned face masks for the remainder of their time at home this morning.
With predictions of over a trillion sensors deployed worldwide within the next few years, a scenario like this is not hard to imagine. In the near future, it is highly likely that most of us in the civilized world will be monitored by sophisticated arrays of cheap sensors, cameras, and other electronic devices during just about every aspect of our mostly-mundane lives.
Forecasting the FPGA Future
The ball has dropped, the bubbly sipped, and the resolutions resolved. 2013 has ended, and before us we have a new year, a new universe of opportunity, and a crazy cauldron of activity in our beloved world of programmable logic. It’s time to throw down, gaze into the crystal ball, read the tea leaves, interpret the Tarot, and extrapolate the trend lines. Here, then, is our unflinching forecast for FPGAs in the months and years to come.
Before we fire up our forecast fest, we should nail down what we mean by “FPGA.” After all, the definition has been morphing, expanding, and shifting over the years, and even the companies with thousands of employees dedicated to nothing but making and selling FPGAs don’t seem to agree on the current meaning of the acronym. Ours will be simple - if it has a look-up table (LUT) cell, it is an FPGA. (Yes, we hear the screams out there. Bear with us. It will all come out in the wash.)
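Since our whole definition hangs on the LUT, a concrete picture helps: a k-input LUT is just a 2^k-bit memory whose stored bits spell out the truth table of any Boolean function of its inputs. Here’s a hypothetical Python model of a 4-input LUT - purely illustrative, not any vendor’s actual architecture:

```python
class LUT4:
    """A 4-input look-up table: 16 stored bits implement any Boolean
    function of 4 inputs. An FPGA bitstream is, in large part, just
    the contents of many thousands of these little memories."""

    def __init__(self, truth_table: int):
        self.bits = truth_table  # 16-bit integer, one bit per input combo

    def __call__(self, a: int, b: int, c: int, d: int) -> int:
        index = (d << 3) | (c << 2) | (b << 1) | a
        return (self.bits >> index) & 1

# Program the same cell as an AND gate or an XOR gate just by
# changing the stored bits - no rewiring required.
and4 = LUT4(0x8000)    # output 1 only when a=b=c=d=1 (index 15)
xor_ab = LUT4(0x6666)  # a XOR b, independent of c and d
```

That reprogrammability is the whole trick: the function of the fabric is data, not wiring, which is why the same silicon can be a network switch one day and a bitcoin miner the next.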
Not Just Software vs. Hardware
We recently took a look at Lattice’s approach to sensor hubs. We’ve seen many other ways of implementing sensor hubs in the past, but all of those were software-based; it was just a question of where the software executes. Lattice’s approach is hardware, and that raises all kinds of new questions.
The biggest red flag it raises for me is that moving a task from software to hardware in the design phase is not trivial. (Keeping the task with the software guys by using tools that automatically generate hardware is, for the most part, a quixotic goal that seems largely to have been lovingly placed back on the shelf.) In my quest to figure this part out, I found that there’s more to the sensor hub world than all-software and all-hardware. And that makes the design question even more complex.
Xilinx Rolls UltraScale
The FPGA world has a unique obsession with semiconductor process nodes. Every two years or so we witness an epic battle between the two major market-share holders, centered mostly on who gets their devices working first on the next new semiconductor process. Historically, the stakes were very high. With FPGAs being among the first devices to go to production on a new node, and with the high-margin spoils of victory going largely to the winner - the biennial financial fates of the two big FPGA companies rode heavily on winning the next-generation derby.
Now, Xilinx is starting to ship samples of their new UltraScale family, based on TSMC’s 20nm planar CMOS process. This kind of “first to ship” announcement is usually a sign of impending victory, as the first to sample is usually the first to ship in volume and the first to collect the bulk of eager early adopters just chomping at the bit to design the biggest, baddest silicon into their next system.
Obsolescence Comes in Assorted Flavors
I am an intense follower of technological trends.
But, I can’t tell you when the 8-track tape died - probably not even within a decade. I don’t know when the last dial telephone left our abode. I really have no idea when the points-and-condenser distributor went the way of the dodo, or during what year the final line of FORTRAN was punched into the last card. I couldn’t even make a good estimate of when the last non-novelty VCR was sold, or when the laserdisc finally went forever out of production.
New technology arrives in our lives with a bang and departs with a whimper. The long tail of obsolescence lulls us to sleep as once-innovative bright ideas fade slowly to black. One day, a device seems an indispensable part of our day-to-day lives, and, another day, we notice that we haven’t seen or thought about one in a very long time. There is no press release, advertising campaign, or breaking newscast announcing the departure of doomed inventions. People do not camp out and line up around the block in hopes of being the first to experience the end of an era.
Altium Plays to its Base with AD14
The headline new feature for Altium’s newly released Altium Designer 14 (AD 14) is “Rigid-Flex Support.” True, rigid-flex is there, and it’s cool, but the headline might lead the casual reader to miss some very important changes that are happening at Altium. Altium has a new focus and a new mission these days. The Altium folks are going back to their roots, playing to their base, and trying to re-establish a strong partnership with the engineers the company was created to serve - the common, hard-working, in-the-trenches, everyday designers who are trying to create cool stuff but who don’t have the resources for the fantastically-expensive, enterprise-oriented PCB solutions from the likes of Mentor and Cadence.
For the past several years, Altium has been a bit like that genius ADD kid in the back of the classroom - full of brilliant ideas, but not at all focused on what is going on in class at the time. Altium has suffered from, if anything, an excess of forward-thinking vision - leading their customers with fascinating new design paradigm ideas and features, but failing them somewhat in delivering rock-solid implementation of the day-to-day, pedestrian PCB design capabilities needed for plain-old place-and-route. The rap on Altium was that they were too focused on the flashy and not enough on fixing old bugs.