feature article

Altera Looks Forward

Insight from an Industry Leader

In the rough-and-tumble, day-to-day, my-chip’s-bigger-than-your-chip schoolyard scrap that characterizes the top tier of the FPGA industry, a glimpse of vision, long-term insight, and strategy is a rare breath of fresh air. We often feel that the two toughest competitors in the business spend too much time staring each other down and not enough time strategizing on how to conquer more of the vast landscape of logic design opportunity waiting patiently at the frontiers of their fiefdoms. However, when we sat down this week with Danny Biran, Altera’s Vice President of Product and Corporate Marketing, vision and insight were exactly what we got.

It seems that Altera has donned some panoramic goggles and is surveying the silicon situation with a calm and rational head, looking at the emerging role of programmable logic in the broader electronics industry. Altera’s vision starts at the very root of our core assumptions about modern electronics systems – the distinction between hardware and software. “We have all been programmed to believe that hardware is difficult to change and software is easy to change,” observes Biran, “but programmable logic breaks those stereotypes.” Of course, we’ve all become accustomed to the assumption that hardware is expensive to develop, has long development times, and is difficult to change once designed. Software is easier and faster to change, simpler to develop, and comparatively easy to deploy. We also generally assume and accept that software can be updated after the product is in the customer’s hands while hardware cannot.

FPGAs change those well-established rules, however. Hardware can be modified rather easily, even in the field. The death-defying challenge of ASIC verification melts away, and the choice of implementing algorithms as processor instructions or logic elements becomes a more complex tradeoff of performance, power consumption, silicon area, and design time dedicated to each module. The line between hardware and software becomes ever so fuzzy. If the concepts of hardware and software are no longer sacrosanct, what exactly does Altera see in the new digital design domain?
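Before answering that, it helps to make the tradeoff concrete. Consider a simple FIR filter, our illustration rather than Biran’s: written as ordinary C, it runs serially on a processor; synthesized into logic elements, its multiplies can all execute at once.

```c
/* Minimal sketch (our example, not Altera's): an 8-tap FIR filter
 * written as ordinary C. On a processor, the loop executes one
 * multiply-accumulate per iteration. Synthesized into FPGA fabric,
 * the fixed-bound loop can be fully unrolled so all eight multiplies
 * happen in parallel: more silicon area, far higher throughput. */
#include <stdint.h>

#define TAPS 8

int32_t fir(const int16_t coeff[TAPS], const int16_t sample[TAPS])
{
    int32_t acc = 0;
    for (int i = 0; i < TAPS; i++) {
        acc += (int32_t)coeff[i] * sample[i];  /* serial on a CPU,
                                                  parallel in fabric */
    }
    return acc;
}
```

The same function, unchanged, can be profiled as software or handed to a synthesis flow, which is precisely the performance/power/area/design-time tradeoff described above.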

Altera sees that systems that formerly were almost completely defined in hardware have grown to include large software components. Operating systems were added to insulate ever-larger software stacks from changes in the underlying hardware platform and to manage memory and interrupts. Hardware architectures grew more complex as well, and performance expectations increased almost exponentially. At the same time, design cycles and product life cycles decreased dramatically. Systems companies were faced with the challenge of creating and verifying increasingly complex systems on shorter schedules in order to hit ever-diminishing market windows.

At the same time, the cost of deploying the predominant custom silicon technology, the cell-based ASIC, has been increasing with each new process generation. This has pushed the ASIC option into the purview of the few, available only to companies with the resources, experience, and product volume to justify the astronomical development costs and risks. The rest of the industry has been left to come up with other options, such as combinations of ASSPs, FPGAs, and functions implemented in embedded software running on standard processing platforms. Over time, FPGAs have grown in capacity, performance, and capability, subsuming more and more of the functionality of the system and providing software-like flexibility to the portion of the system they encompass.

Designers soon began to view FPGAs as something more than a simple substitute for ASIC in lower-volume or lower-budget applications. By leveraging reprogrammability, teams are designing products that get to market faster and stay in the field longer. Altera sees field programmability as migrating from a “nice to have” to a “must” for companies wanting return on their system development investments. “I don’t know of a market that isn’t going through this,” continues Biran. “No one has the luxury anymore in any established market to wring out the profit from a hardware design for five years. No sooner is a design frozen than cost reduction and feature enhancement programs are underway.”

At the same time, Altera observes that globalization hasn’t conquered the standards dilemma. If you’re introducing a new product for the global market, you’ll find that what seems like it should be a single silicon implementation may multiply into a mess of variants for different geographic regions. Even if your overall production volume might justify an ASIC, by the time you segment into a different version for each networking standard and each geography, you may have created ten mid-volume products from a single high-volume one. With programmable logic, however, we have the opportunity to create a single hardware platform that can be configured differently for each variant. Ironically, by using what is normally considered a lower-volume technology (the FPGA), we can push the hardware portion of our design back into the high-volume arena.
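As a rough sketch of how that single-platform strategy might look in practice (with hypothetical bitstream names and an assumed load_bitstream() helper, not any real vendor API):

```c
/* Hypothetical sketch: one hardware platform, configured per region
 * at boot by loading a different FPGA bitstream. The file names and
 * the load_bitstream() helper are illustrative assumptions, not a
 * real vendor API. */
#include <stdio.h>

typedef enum { REGION_NA, REGION_EU, REGION_APAC } region_t;

static const char *bitstream_for(region_t r)
{
    switch (r) {
    case REGION_NA:   return "na_variant.rbf";
    case REGION_EU:   return "eu_variant.rbf";
    case REGION_APAC: return "apac_variant.rbf";
    default:          return NULL;
    }
}

int main(void)
{
    region_t target = REGION_EU;  /* selected at manufacture or boot */
    printf("Configuring FPGA with %s\n", bitstream_for(target));
    /* load_bitstream(bitstream_for(target));  assumed helper */
    return 0;
}
```

The board and its bill of materials stay identical across every variant; only the configuration data changes.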

This product fragmentation goes far beyond simple geographic lines, however. One of the hottest new markets for FPGAs is computer and television displays. These products are offered in a daunting array of sizes and configurations, taking into account screen size and resolution, performance, HDTV and broadcast standards, and geographic target. FPGAs can bring an almost unmanageable problem back into the realm of sanity. Without programmable logic, many product variations simply could not exist due to the economics of creating custom silicon for lower-demand models.

The final points of Biran’s programmable logic prognostication relate to the topic of performance. Altera observes that processors have hit a wall in their performance progression, and that processor companies are rapidly migrating toward dual-core and multi-core solutions to circumnavigate the barrier. The same trend is emerging whether we look at conventional general-purpose processor architectures or more domain-specific processors such as DSPs. There is widespread industry agreement that parallelism and not increased clock frequency will account for the next several rounds of performance improvement in computing.
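One standard way to quantify that claim (our addition, not Altera’s) is Amdahl’s law, which bounds the speedup S from running a fraction p of a workload in parallel across N cores or parallel logic blocks:

$$ S(N) = \frac{1}{(1 - p) + p/N} $$

With p = 0.95 and N = 8, S is roughly 5.9 rather than 8: the remaining serial fraction, not clock frequency, becomes the limiting factor, which is why exposing as much of a workload as possible to parallel hardware matters.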

If that is true, Altera argues, the processing platform of the future will include billions of transistors on a single chip, reconfigurable circuit blocks, optimized special-purpose blocks for real-time signal processing, on-chip memory, high-speed interconnect, and compatibility with existing software. Altera points out that this hardware configuration is exactly what we already have in leading FPGAs today. Altera makes a good case. While Moore’s Law may take us up and to the right in both density and performance, the basic modern FPGA hardware architecture is a logical evolution of the track that conventional processors, DSPs, and programmable logic have been following for the past few years.

Altera’s conclusion matches that of many other groups with whom we’ve consulted about the future of computing. All roads ultimately lead to a reconfigurable hardware platform with the hardware/software partitioning decision based on performance, power, and real-estate factors. If true, we interpret this trend to mean that the key enabling technology in electronics for the next couple of decades will be design tools, from conventional compilers and electronic design automation aids to next-generation high-level language synthesis and hardware/software partitioning technology.

Today, the biggest single barrier to widespread adoption of FPGAs as algorithm accelerators is the immaturity of the design methodology. The average system designer just doesn’t want to incur the time and expense of producing a register-transfer level VHDL or Verilog design in order to get key components of their system implemented in FPGA fabric. EDA and FPGA vendors are chipping away at that problem with IP-based design methodologies, C/C++ synthesis, Matlab/Simulink integration, and a number of novel and innovative approaches that will make the power of the FPGA fabric available to a wider audience. Ultimately, though, these tools and methods still have a long way to go.
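To see the appeal of the C/C++ synthesis route, consider the style of source such flows typically accept. This is a generic sketch, not any particular vendor’s tool, and the restrictions noted in the comments are common but not universal:

```c
/* Rough sketch of C written for a hardware synthesis flow (generic,
 * not any specific vendor's tool). Typical restrictions: statically
 * bounded loops, no recursion, no dynamic allocation. The same
 * source runs on a workstation for functional verification, sparing
 * the designer a hand-written RTL model. */
#include <stdint.h>

void scale_and_clip(const int16_t in[64], int16_t out[64], int16_t gain)
{
    for (int i = 0; i < 64; i++) {          /* static bound: unrollable */
        int32_t v = (int32_t)in[i] * gain;  /* maps to a multiplier     */
        if (v >  32767) v =  32767;         /* saturation logic         */
        if (v < -32768) v = -32768;
        out[i] = (int16_t)v;
    }
}
```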

Both tools and engineers must evolve to meet the challenges of the coming decade. As the barriers between hardware and software blur and ultimately disappear, so must the distinction between hardware and software engineers. Lest any of us get too comfortable in our roles, we should be mindful of the history-proven fact that the only constant in our industry is change. That change knows no barriers and observes no established definitions of technology, tools, or job functions. As long as we understand that engineers’ true expertise must always be creating and adapting to that change, we’ll survive and flourish. Evidently, Altera agrees.
