
The Valley of FPGA

Where Green Pastures End

Just about every electronic technology on the market today has alternatives.  Between custom chips, ASSPs, pre-built modules, embedded processors, microcontrollers, FPGAs, and a host of other silicon-based goodies, there are always numerous ways to solve any given problem.  As engineers, we make our choice based on any number of criteria – cost, power, size, reliability, our familiarity and experience with the technology, our company’s preferences… all of them weigh into our decision.

Some solutions – like full custom chips – occupy an “end” of the solution space.  When you have enough budget, a talented and experienced design team, a large production volume to amortize development costs, and hard technical demands in areas like system performance, power consumption, integration, or unit cost – custom ICs are the limit of what can be done.  On the other end of the spectrum are solutions like pre-manufactured boards and modules with embedded processors.  One guy in a garage can write a little code and practically ship a new product by himself. 

Between these extremes lies the Valley.  The Valley is a rich and fertile marketing ground, inhabited by all manner of furry-friendly woodland creatures with big welcoming eyes and even happier datasheets.  FPGAs live here in the Valley.  They are cheaper to design than custom ICs, yet more capable than fixed-hardware boards.  They require far less expertise and risk to design than ASICs, but they are much more challenging to use than simply programming an embedded processor.  They can offer tremendous power advantages over a dedicated DSP chip, but they don’t quite have the efficiency needed for many battery-powered applications.

FPGAs are bounded by alternative solutions in just about every dimension and direction.  Choose anything you want to measure – cost, power, size, performance – and FPGAs will fall somewhere between two other viable alternatives.  Unfortunately, this means that FPGAs are always under attack, always defending and challenging borders.  For years, FPGAs have billed themselves as “ASIC replacements,” trying to shave off the bottom end of the ASIC market – those who can no longer afford to do an ASIC on the latest process node.  In return, low-cost ASIC companies built a business around “FPGA conversion” – taking FPGA designs and reducing unit cost while improving power and performance by converting them to less-than-the-latest-node ASIC processes.

FPGAs also attack the processor space – and get attacked in return.  FPGAs offer an integration story – your CPU and any combination of peripherals can be integrated into a single programmable logic device.  Processors retaliate by offering the CPU with most of the peripherals you could want – all integrated into a lower-cost, higher-performance device.  FPGAs offer the ability to accelerate performance-critical parts of your application, and application processors retaliate by getting faster and faster, obviating the need for hardware acceleration. 

Every direction you look, FPGA technology is under attack from alternative solutions.  And each FPGA company is under attack as well – from other FPGA competitors.  Often, the companies lose sight of the external boundaries and focus their energy on wresting market share points from like-minded competition.  These are expensive revenues to earn.  It’s an attractive battle to fight because you know your enemy, and the score is easily divined from market share reports.  What is more difficult to see is the state of the battle against the outside alternatives.  As long as the field where you’re grazing at the moment is rich and green, you may not notice that the entire Valley is quietly closing in around you.

The real key to the battle for these borders is tools.  As we have mentioned many times, most of the cost of an FPGA is margin.  FPGA companies charge an enormous premium over the raw cost of the silicon they sell.  Where does all that money go?  Into private jets and limos for FPGA company executives, of course.  No, just kidding.  Most of that money goes into the development, maintenance, and support of the tool suites required for FPGA design.  The two largest FPGA companies have giant engineering teams working on tools and IP – certainly larger than the teams who work on the development of the FPGAs themselves.  At stake in the tool battle is nothing less than the viability of the FPGA companies themselves, for without winning tools, engineers will flee to alternative solutions – whether those are the other company’s FPGAs or something else entirely.  

The challenges of producing high-quality tools for FPGA design just keep getting tougher, too.  Every two years, like clockwork, the FPGA companies roll out a new device family on a new process node with double the LUTs, more than double the routing, more memory, more hard-wired macros, and longer clock lines.  Each one of these improvements breaks the old tool chain in new and different ways.  Synthesis runs that used to take minutes now take hours.  Layout runs that used to take hours now take days.  Timing optimization becomes an endless iterative loop that never seems to converge.  Management of routing resources becomes untenable.  Just about every aspect of our friendly FPGA tools gets broken with each new family and has to be re-engineered and repaired.  If the tools aren’t working, and working well, it doesn’t matter how good a new FPGA family is.  Nobody will be able to use it.

Unlike in the ASIC and COT markets, the playing field is not level.  For Altera and Xilinx, if their own tools cannot handle the design challenges posed by their chips, they are absolutely dead in the water.  No EDA company will come riding over the horizon with third-party technology to save their bacon.  If one competitor noses ahead of the other, it can be catastrophic for the loser.  In the past couple of decades, we’ve seen this twice.  Once, when Altera released their first “Quartus” tools, the problems were so severe they almost wiped out the company.  However, Altera recovered and took a sustained lead with their “Quartus II” suite, and they have kept Xilinx playing catch-up ever since.  Now, Xilinx is rumored to be re-engineering their entire tool suite from the ground up – a make-or-break play with enormous implications.  It could literally determine the future of the company and its technology.

We are reaching a season of great change in the Valley of FPGAs.  With the field of 28nm devices now coming to market, new suites of tools slated to arrive, and fierce competitors attacking the fertile FPGA Valley from every side, we are at a pivotal juncture.  FPGAs may begin a rapid expansion and join the ranks of processors and memories as assumed major components of every electronic system made, or they may retreat into niche applications where no more robust and usable alternatives exist.  We’ll see what’s revealed when spring melts away the winter’s frost.
