
Toward Self-Driving FPGA Tools

Leaving the Experts Behind

Most “real” drivers are not enthusiastic about self-driving cars. Sure, it’s likely that self-driving auto technology will save thousands of lives per year, reduce congestion on our roads, improve our utilization of natural resources, and make transportation much simpler than today. But, for those who have invested considerable time and energy in developing their driving skill, it represents a loss of personal value and threatens to take away an activity that they enjoy. 

Most real FPGA designers are not enthusiastic about self-driving FPGA tools, either. For decades now, FPGA design tools have been crafted specifically for the needs of FPGA experts. When the feedback loop between the customer base and the tool developers stays locked tight year after year, an echo-chamber effect sets in. The tools and their audience evolve together, often to the exclusion of the outside world.

As with most electronic design automation solutions, FPGA tools started out as a hodgepodge of bare-bones software from various sources. Fledgling FPGA companies pieced together working design flows in any way they could – OEMing some commercial EDA tools (like synthesis and simulation), buying some from other third parties, and building the rest themselves. Most flows weren’t very good, and they required a lot of hand-holding to get through successfully (where “successfully” is defined as getting your FPGA to actually do what you intended). That hand-holding was generally supplied by some of the most capable field application engineering (FAE) teams in the industry – teams who bear much of the responsibility for the early success of FPGAs as a technology. Without those FAE forces, we might well not have FPGAs today, or at least the landscape would look much different than it does now.

Time passed, and those FAEs and their customers reported the problems they encountered. The tool development teams responded. The FAEs and customers requested new features and new levels of control, and the tool development teams responded again. The customers, the tool developers, and the tools themselves became expert and powerful together – learning from each other. The further this ecosystem evolved, the more closed it became. Penetrating the labyrinth of options, optimizations, best practices, and tribal knowledge became a daunting task for the newbie, the outsider, or the occasional user.

Of course, FPGA companies wanted to attract new users and penetrate new markets, but, at the same time, they needed to defend their turf in the highly competitive arena of FPGA expert users. Badly designed wizards were liberally deployed in an attempt to automate the middle-of-the-road options for the most common tasks. Graphical user interfaces were introduced in an effort to make FPGA design seem easy. After all, you just point and click, right?

As we all know, you didn’t just point and click. In fact, before any pointing or clicking, you most likely had to learn a hardware description language like Verilog or VHDL. Then, you had to master a very specific FPGA-esque dialect of that language. Then, when you ran your carefully crafted design through the tools the first time – they probably crashed. Maybe the second time too. Finally, you figured out the magic incantations required to get your design all the way through and you got out your first report – with 10,000 timing violations and hundreds and hundreds of cryptic errors and warnings. Welcome to the cutting edge!
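To give a sense of what that “FPGA-esque dialect” means in practice, consider the kind of coding template synthesis tools typically want to see before they will map a memory onto a block RAM primitive. The fragment below is purely illustrative – the module and signal names are invented for this example – and whether a particular vendor’s tool actually infers a block RAM from it depends on that tool’s own templates:

    // Illustrative only: a common inference-friendly template for a
    // 256 x 8 synchronous-read memory. Many FPGA synthesis tools will
    // map a register array written and read in a single clocked block
    // like this onto a block RAM primitive.
    module example_ram (
        input  wire       clk,
        input  wire       we,
        input  wire [7:0] addr,
        input  wire [7:0] din,
        output reg  [7:0] dout
    );
        reg [7:0] mem [0:255];

        always @(posedge clk) begin
            if (we)
                mem[addr] <= din;   // synchronous write
            dout <= mem[addr];      // registered read (returns old data during a write)
        end
    endmodule

Describe the same memory with an asynchronous read, or structure the always blocks in a way the tool doesn’t recognize, and it may quietly fall back to building the memory out of general-purpose logic and registers instead – functionally identical, but a very different result in area and speed.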

Since then, FPGA tools have undergone tremendous change in four dimensions. First, they have primarily been consolidated into integrated suites of tools developed and supported by the FPGA companies themselves. Second, they have matured in their stability, speed, usability, and quality of results. Third, they have accumulated massive numbers of “expert” features to allow FPGA savants to get every last ounce of performance and functionality out of their devices. And fourth, they have expanded their front-end capabilities to include multiple entry points and styles for various types of embedded, system-level, and high-level design.

Today’s tools are incredibly sophisticated pieces of software with almost incomprehensible numbers of features. There are countless paths from design entry through integration, debug, optimization, and implementation – to the point that FPGA companies can no longer rely exclusively on “generalist” FAEs. They now need expert support staff in areas like DSP, embedded computing, automotive – the list goes on and on.

Approaching a modern-day FPGA design tool suite can be intimidating. I recently talked with an engineer from a team that uses FPGAs only occasionally. They had completed a single design several years ago and were now attempting to redo that same design, with some new capabilities, using today’s FPGAs. Their design was not complex. They were not pushing the edge of performance, power consumption, or any other dimension that would make their task difficult. It seemed like their project should be a straightforward one. Their description of the process spoke volumes: “It’s like we’re trying to get from here to a spot just on the other side of the parking lot. It’s flat. It’s close. There is nothing in the way. But, to get there, we are required to take the space shuttle. We don’t even know how to get through the first checklist of steps to start the thing up.”

Designers of FPGA tools should take heed. A vast range of user types is entering the FPGA domain, and the majority are not FPGA experts. If FPGAs are to expand into the numerous new and exciting markets for which they’re suitable, the primary battleground will be tools, not chips. New users should not have to learn FPGA-ese in order to get an FPGA to work in their system. At some point, people with little or no hardware expertise will need to be able to customize the function of FPGAs.

We have certainly taken steps down this path, but there are miles to go. Higher levels of abstraction in design creation need to replace HDL. System-level design tools need to take into account both the hardware and software components of an application. Tools – particularly lower-level implementation tools such as synthesis and place-and-route – need to move ever closer to full automation. We actually need to put the synthesis and place-and-route expert user out of a job. Only then can we be positioned for a better, self-driving, programmable future. 
