
Taking the FPGA Pulse

Who Does What and With What?

For years now, various people have been tracking the EDA process for ASICs and SoCs, and we have a pretty good idea of what the problems are and where the bottlenecks lie. (Although, for some reason, we don’t seem able to take corrective action to solve the problems and remove, or at least ease, the bottlenecks.)

FPGAs are now generally big beasts, comparable in complexity to the custom products of only a few years ago. To some extent, their development process has not been tracked, in part because the developers usually use the tool chain supplied by the FPGA company, so there is less commercial interest from the big EDA companies, who are not making much of a mark in this area. Management loves free tools although, as the FPGA companies spend a lot of money on developing and maintaining these tools, the cost of each chip includes a significant chunk covering the cost of the software.

A couple of years ago, the Wilson Group Functional Verification Study of how designers of SoCs and ASICs work world-wide was extended to include FPGAs – but, as its title suggests, it focuses mainly on verification. I will mention this occasionally, as I have yet to see the full survey. However, last year in Britain, FPGA industry veteran Doug Amos (Altera, Synplicity and Synopsys) undertook a survey of FPGA usage in the UK for NMI. (NMI describes itself as the champion and official trade association representing the UK Electronic Systems, Microelectronics, and Semiconductor Communities.)

In early February, Doug presented some of the findings of the NMI FPGA Usage Survey 2014 at the annual Verification Futures conference organised by TVS. The event has previously been purely ASIC/SoC but this year had a session on FPGA verification. While Doug’s presentation naturally concentrated on his findings on verification, we sat down afterwards and discussed the wider implications.

Doug worked with Aldec, Altera, FirstEDA, Mentor Graphics, Synopsys, Xilinx, XMOS, and UK publication New Electronics to devise and publicise the questionnaire. It was carried out on-line in October and attracted 174 respondents. As estimates suggest that there are around 2,000 designers working with FPGAs in the UK, this is a statistically sound sample. Unlike most surveys, the questions were often open-ended. This meant that Doug had to do some hard work to complete the analysis, but he felt that the results would be a richer source if he did so.

Obviously the first question was: “Which FPGAs or other programmable devices do you typically use?” A subsidiary question asked for a summary of why that device was chosen. This was not designed to be a market share analysis but to get a grasp of how the remaining questions were answered.

Some respondents just named the supplier, others named specific device families, and, to be honest, there were no real surprises, although Doug felt pleased to see two Achronix users alongside Altera, Xilinx, Microsemi and Lattice. From a personal point of view, it was interesting to see the name Actel regularly used, even though it is over four years since the company became part of Microsemi. The named devices included Zynq and Arria, as well as Cyclone, Spartan, etc. An interesting point in the Wilson report was that the percentage of products using the FPGA SoCs (Zynq, Arria, and SmartFusion) had nearly doubled, from 14% in 2012 to 27% in 2014.

Equally obviously, the next question was, “For which of the following application areas do you design FPGAs?” The three biggest areas were aerospace and defence, industrial, and video and image processing, with wired and wireless communication, ASIC prototyping, and scientific instruments close behind. In all, Doug identified 17 application areas.

They were also asked about CPUs/MCUs in their designs. Around a third of those answering don’t use any processors on their FPGAs. In response to a multiple-choice question, 45% use soft cores from the FPGA vendor, 38% use hard IP, 15% use open-source or third-party processor cores, and 16% design their own processor cores (wow). Yes – it adds up to more than 100%, as some respondents use more than one approach.

OK, that is the groundwork out of the way. Now comes some of the interesting stuff – things we would like to ask our colleagues but usually don’t – so it should give you an insight into how other people are developing their FPGAs.

I think an important question was, “How critical is your company’s FPGA capability to your product’s success?” A staggering 78% said it was essential. This, rather than figures about design starts and other analyst information, shows how the FPGA is replacing the SoC/ASIC at the heart of designs.

“How well do your typical FPGA projects run to schedule?” I must confess the result was better than anecdotal evidence had led me to expect. Of 130 responses to this question, 4% said “Early” and 29% said “On time.” 43% were “Slightly over schedule” and 11% “Well over schedule.” The remainder made no comment. Depending on what each respondent regarded as slightly over schedule, this is a very encouraging result. Some of the comments were interesting. One respondent, who did not answer the question, said that it was more important to get the design right than to incur the expense of later applying a fix. My favourite response has to be, “What is a schedule?”

“What percentage of FPGA project time do you estimate is spent on the following tasks?” Initial design spec (14%), RTL entry (20%), and RTL verification (21%) made up most of the project time, although when Doug broke out the figures for respondents developing to meet standards like DO-254, IEC 61508, and ISO 26262, RTL verification jumped to 29% (nearly 40% growth), and time spent on sign-off and documentation increased by 30%.

How verification is carried out is interesting. Vanishingly small numbers use the formal equivalence checking or assertion-based methods that are increasingly standard techniques in the ASIC/SoC world. Instead, they rely on in-system testing and various forms of simulation, as well as manual RTL code reviews.

The next question asked for the three most challenging tasks. Verification was up there alongside “timing analysis and timing closure”, both ahead of initial design specification, configuration and debug, and integration of software in the device. When the question is “Are these tasks challenging or just time consuming?”, the very challenging timing analysis and closure is seen as not very time consuming, compared to RTL verification and even RTL entry.

There is an interesting spread in the answers to “How much total engineering resource is required on a typical FPGA project?” Around 7% of respondents say less than an engineer-month, 33% between 1 and 6 engineer-months, 30% between 6 and 12 engineer-months, 20% between 12 and 24 engineer-months, 7% between 24 and 36 engineer-months, and 5% take more than 36 engineer-months (my rounding). All that can really be read into this is surely, “It depends on what you mean by an FPGA.” A project for a Virtex UltraScale or a Stratix 10 is going to require just a little more engineering effort than one for a Lattice iCE40.

The availability of appropriate skills is interesting. 27% of respondents thought that their knowledge of FPGAs put them in the category of experts, 46% rated their knowledge as good, 13% as fair but needing brushing up, and 9% as not as good as it could be. The remainder rated themselves as beginners. Half of the most recent projects ploughed on with their existing team, 22% made time to give staff training, 17% added new staff, 9% brought in consultants (although that word covers a wide range of skills), and 9% felt unable to comment.

The survey covered hardware debug, software development and debug, and a range of other aspects, most of which produced a wide scatter of results, showing that developers are deploying a very wide range of tools.

There are some caveats. These results are only a snapshot, and from only one country. It is hoped to repeat the survey in Britain in future years, and it would be interesting to see similar surveys in other geographical areas.

