
Toward Self-Driving FPGA Tools

Leaving the Experts Behind

Most “real” drivers are not enthusiastic about self-driving cars. Sure, it’s likely that self-driving auto technology will save thousands of lives per year, reduce congestion on our roads, improve our utilization of natural resources, and make transportation much simpler than it is today. But, for those who have invested considerable time and energy in developing their driving skills, it represents a loss of personal value and threatens to take away an activity they enjoy.

Most real FPGA designers are not enthusiastic about self-driving FPGA tools, either. For decades now, FPGA design tools have been crafted specifically for the needs of FPGA experts. When the feedback loop between the customer base and the tool developers stays locked tight year after year, there can be a bit of an echo-chamber effect. The tools and their audience evolve together, often to the exclusion of the outside world.

As with most electronic design automation solutions, FPGA tools started out as a hodgepodge of bare-bones software from various sources. Fledgling FPGA companies pieced together working design flows in any way they could – OEMing some commercial EDA tools (like synthesis and simulation), buying some from other third parties, and building the rest themselves. Most flows weren’t very good, and they required a lot of hand-holding to get through successfully (where “successfully” is defined as getting your FPGA to actually do what you intended). That hand-holding was generally supplied by some of the most capable field applications engineering (FAE) teams in the industry – teams that bear much of the responsibility for the early success of FPGAs as a technology. Without those FAE forces, we might well not have FPGAs today, or at least the landscape would look much different than it does now.

Time passed, and those FAEs and their customers reported the problems they encountered. The tool development teams responded. The FAEs and customers requested new features and new levels of control, and the tool development teams responded again. The customers, the tool developers, and the tools themselves became expert and powerful together – learning from each other. The further this ecosystem evolved, the more of a closed system it became. Penetrating the labyrinth of options, optimizations, best practices, and tribal knowledge became a daunting task for the newbie, the outsider, or the occasional user.

Of course, FPGA companies wanted to attract new users and penetrate new markets, but, at the same time, they needed to defend their turf in the highly competitive arena of FPGA expert users. Badly designed wizards were liberally deployed in an attempt to automate middle-of-the-road options for the most common tasks. Graphical user interfaces were introduced in an effort to make FPGA design seem easy. After all, you just point and click, right?

As we all know, you didn’t just point and click. In fact, before any pointing or clicking, you most likely had to learn a hardware description language like Verilog or VHDL. Then, you had to master a very specific FPGA-esque dialect of that language. Then, when you ran your carefully crafted design through the tools for the first time, they probably crashed. Maybe the second time, too. Finally, you figured out the magic incantations required to get your design all the way through, and you got your first report back – with 10,000 timing violations and hundreds and hundreds of cryptic errors and warnings. Welcome to the cutting edge!

Since then, FPGA tools have undergone tremendous change in four dimensions. First, they have primarily been consolidated into integrated suites of tools developed and supported by the FPGA companies themselves. Second, they have matured in their stability, speed, usability, and quality of results. Third, they have accumulated massive numbers of “expert” features to allow FPGA savants to get every last ounce of performance and functionality out of their devices. And fourth, they have expanded their front-end capabilities to include multiple entry points and styles for various types of embedded, system-level, and high-level design.

Today’s tools are incredibly sophisticated pieces of software with almost incomprehensible numbers of features. There are countless paths from design entry through integration, debug, optimization, and implementation – to the point that FPGA companies can no longer rely exclusively on “generalist” FAEs. They now need expert support staff in areas like DSP, embedded computing, automotive – the list goes on and on.

Approaching a modern-day FPGA design tool suite can be intimidating. I recently talked with an engineer from a team of only occasional FPGA users. They had completed a single design several years ago and were now attempting to redo that same design with some new capabilities using today’s FPGAs. Their design was not complex. They were not pushing the edge of performance, power consumption, or any other dimension that would make their task difficult. It seemed like their project should be a straightforward one. Their description of the process spoke volumes: “It’s like we’re trying to get from here to a spot just on the other side of the parking lot. It’s flat. It’s close. There is nothing in the way. But, to get there, we are required to take the space shuttle. We don’t even know how to get through the first checklist of steps to start the thing up.”

Designers of FPGA tools should take heed. A vast range of new user types is entering the FPGA domain, and the majority of them are not FPGA experts. If FPGAs are to expand into the numerous new and exciting markets for which they’re suitable, the primary battleground will be tools, not chips. New users should not have to learn FPGA-ese in order to get an FPGA to work in their system. At some point, people with little or no hardware expertise will need to be able to customize the function of FPGAs.

We have certainly taken steps down this path, but there are miles to go. Higher levels of abstraction in design creation need to replace HDL. System-level design tools need to take into account both the hardware and software components of an application. Tools – particularly lower-level implementation tools such as synthesis and place-and-route – need to move ever closer to full automation. We actually need to put the synthesis and place-and-route expert user out of a job. Only then can we be positioned for a better, self-driving, programmable future. 
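To make the abstraction argument concrete, here is a minimal, hypothetical sketch (the function name, coefficients, and types are invented purely for illustration): a 4-tap FIR filter expressed as ordinary C++, the kind of algorithm-level description a high-level synthesis flow can compile into hardware. In a traditional HDL flow, a designer would hand-write the shift registers, multipliers, adder tree, and pipeline control that produce this same behavior.

```cpp
#include <array>
#include <cstddef>
#include <cstdint>

// Hypothetical 4-tap filter coefficients, chosen only for illustration.
constexpr std::array<int16_t, 4> kCoeffs = {3, 7, 7, 3};

// Process one input sample and return one filtered output sample.
int32_t fir4(int16_t sample, std::array<int16_t, 4>& delay_line) {
    // Shift the delay line and insert the newest sample at the front.
    for (std::size_t i = delay_line.size() - 1; i > 0; --i) {
        delay_line[i] = delay_line[i - 1];
    }
    delay_line[0] = sample;

    // Multiply-accumulate across all taps.
    int32_t acc = 0;
    for (std::size_t i = 0; i < delay_line.size(); ++i) {
        acc += static_cast<int32_t>(delay_line[i]) * kCoeffs[i];
    }
    return acc;
}
```

Called in a loop over an input stream, a dozen lines like these capture the intended behavior; in the self-driving future argued for here, the tool, not the user, would decide how to pipeline that behavior and map it onto DSP blocks and registers.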
