
Legacy of Languages

Culture in Code

Using VHDL or Verilog to design FPGAs is just plain wrong.

Talk with any expert in languages, in logic synthesis, in hardware architecture.  If you get past the “but that’s how we do it” layer of defenses, you’ll pretty quickly uncover a vast ocean of oddity that will make you wonder just why anyone ever considered the idea of doing FPGA design with HDLs, let alone how HDL-based design became the de-facto standard.

First, taking VHDL as an example: most of the things you can write in VHDL cannot be synthesized into hardware with any degree of efficiency.  In fact, to paraphrase one of my favorite college math professors, if you wrote all the possible VHDL on a big wall and threw tomatoes at the wall all day long, you’d probably never hit any code that would synthesize well.  (This is a variant of what my professor called the “tomato theorem.”)  Those of us who successfully write VHDL for synthesis have learned through experience how to step on a select few hidden rocks just under the water’s surface in order to keep our footing through synthesis.  Deviate from those rocks, however, and you’re likely to take a dunk in code that won’t synthesize at all, or that will create hardware that’s impractically inefficient if not completely unbuildable.

Of course, VHDL wasn’t intended for designing hardware.  Yes, the D stands for “Description,” but in its original context, that “Description” was more like “Documentation.”  VHDL was intended to solve a documentation problem for the US Department of Defense.  The advent of application-specific chips had created a daunting documentation problem for both the DoD and its suppliers, and VHDL was developed to address that challenge.  Now, ASIC designs could be described in a standard, machine-readable language instead of through piles of ad-hoc paper documentation.

Once we had VHDL as a documentation language, people wanted to be able to execute it.  If you could execute your documentation, you could verify that it was functionally correct.  The first simulators were crude, clunky, and slow, and didn’t support a lot of the language.  If you wanted to simulate your documentation, you had to stay within the boundaries of what could be simulated using the technology of the time.  Eventually, of course, compiled code simulators came along and dramatically improved execution speeds, and the simulator capabilities expanded so that almost all of the language could be simulated.

Verilog didn’t start with quite the handicap of VHDL.  It was originally designed to be simulatable.  The Verilog simulator from Gateway Design Automation was a big hit with designers who didn’t want to mess with the mass of VHDL and who wanted clean, fast, language-based simulation of their logic designs.  Driven in part by both mandate and momentum, however, VHDL continued full-steam as well.  The designer population bifurcated, the lines were drawn, and by sometime in the late 1980s, the cultural seeds were planted that landed us where we are today.

Once you have a language in which you can write a machine-readable description of your ASIC design and simulate it, the natural next step is to press that language into service as an implementation language, so you can do design work in it directly.  This is where logic synthesis comes in.  The first logic synthesis tools for VHDL and Verilog could take advantage of only a tiny subset of each language and, even then, only in a very specific structure.  Over time, logic synthesis tools expanded the allowable subset of the languages, the variation in structure they could accept, and the quality of the resulting logic.

Using an HDL as an implementation language locks down the semantics of the description itself.  Although both VHDL and Verilog are capable of capturing design intent or behavior to some degree, synthesizable descriptions are still relegated to little more than the description of structure.  Register transfer level (RTL) has been the de-facto abstraction level for the description of synthesizable logic for close to two decades now.

What does all this have to do with FPGAs?  Well, FPGAs have a completely different logic structure from ASICs.  In using HDLs to synthesize ASIC designs, we at least have a pretty open field for using “normal” logic elements to complete our design.  In FPGAs, however, the fabric is based upon fixed-width look-up tables (LUTs).  When we start to synthesize HDLs for FPGAs, we’re shoe-horning the mapping of logic gates into some strange combination of LUTs and other elements.  Many of the assumptions that work for ASIC mapping have to be compromised to get reasonable results in FPGA fabric.

What we have, then, is a documentation language shoe-horned into a simulation language, shoe-horned into a synthesis language, and finally shoe-horned into an FPGA technology target.  Does that sound like a lot of shoe-horning?  It is.  This Rube-Goldbergian schema has been our default design methodology for FPGAs for years now.  It is neither optimal nor justifiable for any reason other than that it takes advantage of technology we already had lying around the house – leftover from our ASIC design projects.

Now, ESL comes in and promises to revolutionize our design methodologies with — what else?  — behavioral description in languages like C and C++.  Instead of laying down LUTs using languages designed for the documentation of ASIC designs, we will use a language that is a thin layer of abstraction over a von Neumann processor architecture – intended for ease-of-use in portably implementing operating system features in software.  Perhaps we are not headed in the right direction here?

Certainly, there are good reasons to do hardware design in a language like C.  The number of people in the world already accomplished in C is far greater than the combined ranks of VHDL and Verilog masters, so on the strength of the trained-designer population alone, it has quite a compelling start.  On the other hand, hardware is parallel in nature and C is sequential, so we have the ultra-non-trivial problem of generating parallel structures from sequential code.  People tend to think and develop algorithms sequentially most of the time anyway, however, so even if your compiler isn’t yet smart enough to do that conversion, you’ll be able to.  Still, if we were starting from scratch to come up with the perfect language to concisely describe the desired behavior of our hardware, I doubt any of us would re-invent C.

The thing is, design language choice, like spoken and written language choice, is far more cultural than logical.  We don’t analyze the various languages available for communication and choose to learn and speak the one that is able to most efficiently or effectively capture and express our ideas.  Certainly if that were the case, I’d never have chosen English.  Our choice of language is based on our culture and our history.  Design language is no different.  Engineering is a culture.  We’ll never stop and re-invent ourselves with a blank slate, crafting the perfect dialect for designing our masterpieces.  Instead, we’ll follow in the footsteps of history and convention and use the quaint and quirky communications tools that we’ve inherited from our peers — and have fun doing it.

