feature article

HDL is Dead

Long Live HDL

The mid-term exam was straightforward.  We were given five specifications, and we had to write a program for each.  Each program would be worth 20 points, for a total of 100 on the exam.  The exam was 40% of our final grade in the class.  Oh, I almost forgot – the language was Motorola 6809 assembly, and we had to hand-assemble the programs and submit machine code.  The exam was on paper (in a blue book), and no computers or calculators could be used.


As I was taking the exam, it occurred to me that the only thing harder than taking this test would be grading it.  Seriously!  Would a graduate student who was working as a teaching assistant stay up for several days straight, staring bleary-eyed at rows and columns of op codes, trying to figure out whether each program did what the specification asked, on exam after exam after exam – with five programs on each?


It turns out the answer was “no.”


I was one of the best students in the class.  I had done a lot of microprocessor programming on my own, and I was confident in my skills.  The exam programs were straightforward, and I crafted (modestly speaking, of course) very elegant solutions for all five of them.  I finished first, though it still took me five minutes more than the allotted time, and I couldn’t imagine how some of the less experienced students could possibly have finished.  Still, I was very confident when I handed in my exam.


Two weeks later, the tests came back – the professor handed them out unceremoniously and without comment.  I opened the blue book and found my grade… “20%.”


What?


Looking more closely, I saw that four of my five programs were marked with a big red “X” and a “-20.”  One was marked with a check mark and a “20.”  Well, that’s straightforward enough.  There was no other notation on the exam – no hint of why four of the five programs were marked wrong, no partial credit, nothing.


I was upset, to say the least, but I wanted to have my facts straight first.  I walked down to the lab and entered each of my five programs (even the correct one) on a development system.  They all worked perfectly.  I carefully re-read the questions/specifications and couldn’t find anything I’d missed.  I decided I was ready for the trip to the professor’s office.


I made an appointment to see the professor.  I arrived early and waited outside his office.  He arrived about twenty minutes late.  He seemed distracted as I explained that I’d like him to review my exam.  “Well, give it here…” he sounded annoyed.  He looked at the first page and traced through my answer… “Ok, ok, ok – good, that’s more compact than my example solution…  yeah.  You’re right.  This is good.”  He scratched out the -20 and wrote “40” on the front cover, then quickly handed me back the book and started to look at other work on his desk…  


“Um… actually there are several more I had the same question about.  Could you look at those as well?”


Now he really seemed annoyed.  “OK, ok, give it back.”


He paged through the rest of the exam and, in each case, decided that my solution was correct.  He added a “+20” note to each of the remaining three questions.  At the end, he went through and totaled them up, scratched out the “40” on the front cover and wrote in “100” and handed me back the exam.  “There ya go.  Thanks.”


I asked if he maybe needed to change the grade in his grade book, and not just on my copy of the exam.  Apparently, this had never crossed his mind.  With no emotion other than an apparent annoyance at my continued presence in his office, he pulled out a pad of paper and made a note – presumably to remind himself to tell the appropriate teaching assistant to change the grade of… “What was your name again?” from a “Let’s see – 20% to 100%, hmm.”


He showed no apparent concern that his grading system had under-rated my performance by a factor of five.  In the grading criteria for the exam, we were reminded that our solution to each problem had to execute within 20% of the speed of the “example” solution, and that our memory footprint had to be within 20% of that of the example solution.  It seems the same 20% tolerance was not required of the grading system itself.  Also unspoken was the grading process, which must have consisted of comparing each student’s program with the example solution – giving full credit if the op codes matched 1:1, and no credit if they did not.  Four of my five programs were both smaller and faster than the example solution.  The remaining program was the one originally marked “correct.”


I left the office both encouraged and dismayed.  I was encouraged that my grade had jumped from the lowest echelons of failure to the maximum possible score, and dismayed by the utter indifference shown by the professor about the dramatic failure of his grading system.   In speaking with my peers, I determined that the class was divided into two groups – those who failed the exam miserably, and those who initially failed the exam miserably, but went to the professor to have their grade raised and thus passed.  


In order to process these events, I had to back away and survey the bigger picture.  Maybe the real lesson of this part of the course was not microprocessor programming.  Perhaps it taught others what it taught me – always question authority, never trust the system completely, always do your best work – and always be prepared to defend it.  These lessons have proven far more valuable in my career than the 6809 op code for CMPA – 81 Hexadecimal.  


A second lesson from this course was that, although hand assembly was the purview of “real” programmers (and was still the “official” way to program microprocessors), these methods did not represent the future of software development.  Higher-level languages and methodologies would evolve that favored productivity over +/- 20% optimization.  Eventually, even the optimization argument would fail, as compilers came to do a better job optimizing large, complex algorithms than a typical human could manage with manual machine-coding techniques.


Today, HDLs are the official languages of FPGA development.  There is no doubt that “real” FPGA designers write HDL and will continue to do so for some time.  However, just as my hand assembly skills became obsolete, so may the entities and architectures of outrageous fortune soon abandon our daily design lives.  Already, most designs use some amount of encapsulated IP – HDL snippets squirreled away where the end-user’s eyes will never venture.  Every day, more designers enter the FPGA arena via alternative entry points like graphical block diagram editors, high-level language compilers, pre-optimized design blocks, and any number of other means that bypass the signals and registers of conventional RTL design.  


Instead of lamenting the loss of the +/- 20% optimality falsely conveyed by our current design styles, we need to back away and survey the bigger picture.  Productivity will always eventually trump optimality.  Design will move from the realm of the highly-skilled bit-by-bit programmers to the passing interest of the slap-it-together-from-pre-defined-blocks mentality of the system-level tourist.  The real lessons of our design experience will remain with us long after we forget the proper way to infer block RAM.  We just need to open our minds to them.

