
Advancing FPGA Design Efficiency: A Proven Standard Solution

For decades the SoC design community has steadily lost ground in the battle to match advances in design technology productivity with the growth of available silicon technology. The silicon evolution roadmap has long been chronicled via Moore's law, so how could the design community allow the well-known "Design Gap" to persist? Understanding how we got to this point makes it easier to answer that question and to make reasonable adjustments for the future, especially where there are obvious lessons to be learned from the evolution of analogous industries.

Even in the advanced design field of FPGAs today, most FPGA IP is vendor-supplied and locks the purchasing company into that vendor's ecosystem. This paradigm limits how much the industry can grow. For the market to grow as a whole, FPGA IP vendors must support and adopt industry standards. By embracing more standards in FPGAs, we can ensure that the time-to-market and cost advantages of standard FPGA IP are realized.

Despite management initiatives, marketing programs, and training, FPGA development remains "product driven" in its thinking and in its implementation of design methodology, tools, and flows. This is largely unavoidable, since the entire practice of automation has evolved from, and been built around, the immediate gratification of "just automating what the designer is doing": simply take the activity, implement it in code or hardware, and string it to the next activity in the design chain.

That worked well in the beginning and did make the design process faster, though it initially left tool integration as a largely manual activity, patched into place as so-called integrated tool flows evolved. New tools are individually grafted in as older versions break under the demands of increasing design complexity, and so tool "integration" has progressed as best it can. "Best of breed" tool alternatives, often with incompatible interfaces, are sometimes brute-forced into flows, especially when the alternative is a prolonged wait for a proprietary solution from a vendor to which you are often inevitably tied.

Hence the existence of the design gap is entirely predictable. Today's design technologies are operationally segmented and compartmentalized for design-function specialists, clearly lagging what the FPGA community needs to create products with the right features, on time, and with full use of the available silicon technology. Had we done better with the design process, could we have demanded, and achieved, faster evolution in silicon (and other) implementation technology?

Nevertheless, there are signs of industry maturation on the horizon. The last five years have seen a clear acknowledgement among major design teams and corporate visionaries that things need to change. We have moved (see Figure 1) from an "integrate or die" mentality of brute-force building and management of massive SoC design teams, through "collaborate or die," where one or two major players pool resources on critical, high-profile programs, and now to "standardize or die." In this more mature phase, groups of companies are establishing nimble consortiums to broadly address design infrastructure needs that have grown too large for any one company to meet alone.

These consortiums are led by the rapid rise of groups such as OCP-IP, SPIRIT, and ECSI out of Europe, and their precedent is evident in the much earlier appearance of VSIA in the U.S. and STARC in Japan. The new arrivals are up, running, and building structure, while the older protagonists have evolved and continue to refine and tune their roles for the memberships they serve.
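As a concrete example of what such standards enable, consider SPIRIT's best-known output: IP-XACT, an XML format (later standardized as IEEE 1685) for describing IP blocks in a vendor-neutral, machine-readable form. The sketch below is a minimal illustration in Python; the XML structure is deliberately simplified and hypothetical, not the real IP-XACT schema, but it shows the core idea that once a block's identity and interface live in a standard description, any tool from any vendor can consume them.

    # Minimal sketch (illustrative only): parse a simplified,
    # IP-XACT-style component description. The element names below are
    # hypothetical stand-ins for the real IEEE 1685 schema.
    import xml.etree.ElementTree as ET

    COMPONENT_XML = """
    <component>
      <vendor>example.org</vendor>
      <library>fifo_lib</library>
      <name>sync_fifo</name>
      <version>1.0</version>
      <ports>
        <port name="clk"   direction="in"  width="1"/>
        <port name="wdata" direction="in"  width="32"/>
        <port name="rdata" direction="out" width="32"/>
      </ports>
    </component>
    """

    def describe_component(xml_text):
        """Return a component's identifier and its port list."""
        root = ET.fromstring(xml_text)
        # Vendor:Library:Name:Version -- the identity a catalog or
        # integration tool would use to look the block up.
        vlnv = ":".join(root.findtext(tag) for tag in
                        ("vendor", "library", "name", "version"))
        ports = [(p.get("name"), p.get("direction"), int(p.get("width")))
                 for p in root.iter("port")]
        return vlnv, ports

    vlnv, ports = describe_component(COMPONENT_XML)
    print(vlnv)  # example.org:fifo_lib:sync_fifo:1.0
    for name, direction, width in ports:
        print(f"  {name} ({direction}, {width} bits)")

The point is not the parsing but the contract: the description, rather than any one vendor's tool, defines how the IP is discovered and integrated, so a vendor can still supply the block while the integration flow stays portable.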

So what does the model we have followed look like, and what might we do in the future to better advance SoC design? Consider the "Industry Development Model" (see Figure 2). Here we plainly see how products (P) progress. Most companies start with a well-defined, native product and make as much headway as possible selling it in its minimum form, until competition dictates they must bring more to the table to survive and thrive.

Thus, products next evolve to a value-added form, and competition drops off as competitors fail to keep up or refocus their efforts on ground more viable for their circumstances. As competition stiffens further, another evolution takes place in which "whole products" emerge (as popularized by Geoffrey Moore). A whole product is an even more compelling offering: the purchaser can often replace an entire set of products and internal operations or development with a superior external solution. The purchaser is given a simpler "make or buy" choice, provided they trust an outside provider with a once-internal solution. Again, per our model, the number of competitors drops off and a few providers remain, but in a much stronger and more prominent industry position than when they entered that product market with a simple, standalone native product.

But what does the industry get as a result? Generally, we are left with a few strong players competing to recruit other related providers into their own infrastructures and to make their proprietary product the only real business alternative. Herein lies the problem: what is considered an ideal infrastructure is determined by a commercial, proprietary entity. Great solutions can and will emerge, but if they are not well suited to the diverse needs of users, this is a real setback.

A natural evolution of the whole product occurs when the product itself supports a standard (S). When this transition happens, the total number of infrastructures needing support shrinks dramatically; indeed, the whole products themselves can become part of the "standard" infrastructure. Suddenly we can center efforts on a single infrastructure and bring clarity and focus to a community of providers. A shared infrastructure can dramatically focus industry efforts and even help proprietary whole-product providers, letting them concentrate on their product rather than fighting for an industry following to build what is often a token "complete" solution.

Thus, everyone can focus on known needs, and all players can expand their individual core offerings with the diversity a much larger market can embrace. So how do we evolve to the infrastructure model? In general, following the product-driven corporate path yields good technical progress and corporate growth, but if we wish to truly accelerate the growth of an industry, we must evolve with specific direction and shared effort. The bottom line: an industry grows and matures faster when there are shared, common resources and practices, and these can ONLY come through the common ground of standards.

The inevitable question is: "How do we accelerate this process?" In many ways this is an evolutionary event. As discussed above, there is already progress afoot with real cross-company coordination. Lessons for this maturation process are all around us in nearly every adjacent technological space. As industries mature, the major players serve segments of those industries, so for a new generation of technology to emerge, it is normal to see major players get together and define standards for exchanging work. We see this in aerospace, telecommunications, shipbuilding, and indeed in every major industry where problems grow so large that shared needs ultimately outweigh individual goals. Major players seek competitive advantage by sharing in order to grow, and they find it in the larger total markets created as a result. If the FPGA community bands together to support and adopt open standards, pressure is put on the larger companies to integrate FPGAs and bring them together in a uniform way.

What does this mean for the design industry? Expect to see more collaboration and standardization between major users and providers, and hope to see real collaboration and coordination between the groups. As we inevitably embrace the need to advance design productivity more rapidly and get onto a different design integration path, we will abandon the "product first" approach and build on standards, capitalizing on the massive shared infrastructure they inherently produce.
