
Passing the Test

Vennsa Tries to Figure Out Who Screwed Up

Several years ago, while I was renting a vehicle for an event, the rental agent pointed out that my driver’s license had expired a couple of months prior. So he couldn’t rent me the vehicle. My wife at the time bailed me out, but I decided to postpone my departure by a day to avoid the risk of getting pulled over with no license to show. Which meant an emergency trip to the DMV.

I found that I could get licensed quickly, but I had to take the written test to do so – something I hadn’t taken (or studied for) since I was a teenager. So, with no preparation, I went off and answered the questions, knowing that these tests have a history of being flaky, and nervous that my ability to drive legally hung in the balance.

I don’t remember what the numbers are, but you aren’t allowed to miss many. And I missed one too many. When I asked about the questions, one in particular stood out:

“In which of the following places is it not OK to park?”

I don’t remember two of the answers (they were obviously not it), but the other two were:

“In a parking space with striped lines in it”

“In a bike lane”

Now, I couldn’t figure out what a parking spot with striped lines in it was – I couldn’t remember ever having seen one, and, since I hadn’t read the manual in decades, that choice was lost on me. But I was pretty damn sure you can’t park in a bike lane, which was good, since that meant that it was the answer and I didn’t have to worry about the other one.

Wrong.

It turns out that the parking space with the stripes in it is the area next to a handicapped spot that’s marked off so a wheelchair has room to maneuver. To me, that wasn’t “a parking space with stripes” at all – it was a non-parking area, striped off precisely so it wouldn’t be confused with a parking space. Well, that’s not how the DMV saw it.

Fortunately, as I talked to the grader, I said, “But you can’t park in a bike lane, can you?” And she emphatically agreed, “Oh, no.” And so, bless her heart, she gave me that one. And I drove away. Legally.

But it raises the point that tests are complicated things. When something fails, it’s not always obvious what the problem is. The only thing you really know is that there’s a problem. In this case, there were three possible sources of the problem:

  • The question could have been faulty or ambiguous
  • The answer key could have been incorrect
  • I may have simply gotten the wrong answer

Your chip designs get tested in much the same way during the verification cycle. You pose questions through the testbench by stimulating the design to see what answer it gives. You use assertions as the answer key, flagging when an answer is wrong. And, presumably, if there’s a mistake in the design, it will be identified.

So, just like the DMV’s test, when an answer is wrong, it could actually be an indication of one of three things:

  • A testbench problem
  • An assertion problem
  • A problem with the design

Debugging each instance of an assertion firing can get tiresome. And it can be downright mind-boggling if complex assertions and/or logic are involved. So to address this challenge, a new company named Vennsa has launched a tool called OnPoint that is supposed to do a lot of that debugging work for you.

Theoretically, automating this kind of debugging is easy. You take the cone of influence of the failing assertion and perturb each contributor to see what happens. You try all combinations (and permutations, if order matters) and check the results. Any perturbation that makes the observed failure go away becomes a candidate for the root cause of the problem – understanding that the problem could be in the testbench or assertion as well as the design.
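A minimal Python sketch of that brute-force idea, assuming a trace is just a dictionary of signal values and an assertion is a function over it (all names here are illustrative, not Vennsa’s API):

```python
from itertools import combinations

def find_suspects(trace, assertion, cone_of_influence, max_size=2):
    """Return the sets of signals whose perturbation makes the failing
    assertion pass again -- candidates for the root cause."""
    suspects = []
    for size in range(1, max_size + 1):
        for signals in combinations(cone_of_influence, size):
            perturbed = dict(trace)
            for s in signals:
                perturbed[s] = not perturbed[s]  # flip the suspect signal
            if assertion(perturbed):             # failure gone? candidate.
                suspects.append(signals)
    return suspects

# Toy example: out should be (a AND b), but the trace shows out == False
# while a and b are both True, so the assertion fails on the raw trace.
trace = {"a": True, "b": True, "out": False}
check = lambda t: t["out"] == (t["a"] and t["b"])
print(find_suspects(trace, check, ["a", "b", "out"], max_size=1))
# → [('a',), ('b',), ('out',)]
```

Even this toy shows the catch: the number of candidate subsets grows combinatorially with the cone of influence, which is exactly the exploding solution space described next.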

And this works fine if you’ve got all the time in the world to explore this exploding solution space. Which most of us don’t. What Vennsa has brought to the party is apparently a clever approach to keeping that solution space tractable. The result is typically a dozen or so root cause suspects per issue.

These suspects are then ranked. Vennsa has a number of considerations that go into the rankings (and they can’t resist the temptation to compare themselves to Google – something I’m sure the VCs like, since you have to say you’re “the Google of …” or “the eBay of …” or something like that to get their attention). For example, in an intuitive reversal of Occam’s Razor, they rank more complicated suspects higher than simple ones. In other words, if the area in question is complex, it’s more likely that there’s something wrong there.
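As a toy illustration of that reverse-Occam ranking, here is a hypothetical sketch that uses fan-in count as a stand-in for complexity (the metric is my assumption, not Vennsa’s):

```python
def rank_suspects(suspects, fan_in):
    """Order root-cause suspects so that those touching more complex
    logic (approximated here by fan-in count) come first."""
    return sorted(suspects, key=lambda s: -fan_in.get(s, 0))

# The dense ALU block outranks a mux select or a lone flip-flop.
print(rank_suspects(["mux_sel", "alu_core", "sync_ff"],
                    {"alu_core": 40, "mux_sel": 5, "sync_ff": 2}))
# → ['alu_core', 'mux_sel', 'sync_ff']
```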

Then, along with each suspect, comes a suggested fix, proffered via waveform. These fixes have already been vetted to guarantee that applying any of them won’t cause any other assertion along that simulation trajectory to fail. This goes partway toward avoiding a whack-a-mole problem, where one fix causes another. But it doesn’t eliminate the need to re-verify the design as a whole after the fix is in place, to ensure that it didn’t screw something up further afield.
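That vetting step can be sketched in hypothetical Python, where a fix is just a map of signal overrides applied to the failing trace (again, names and data shapes are illustrative, not the product’s internals):

```python
def fix_is_safe(trace, fix, assertions):
    """Apply a candidate fix (signal -> new value) to the failing trace,
    then confirm every assertion along that trajectory still holds."""
    patched = {**trace, **fix}
    return all(check(patched) for check in assertions)

trace = {"a": True, "b": False, "out": True}
checks = [lambda t: t["out"] == (t["a"] and t["b"]),  # the firing assertion
          lambda t: t["a"]]                           # another assertion on the trace
print(fix_is_safe(trace, {"b": True}, checks))              # True: both pass
print(fix_is_safe(trace, {"a": False, "out": False}, checks))  # False: breaks the second
```

The second candidate repairs the firing assertion but trips a different one – precisely the whack-a-mole case the vetting is meant to screen out.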

The fixes suggested could involve the testbench, the assertion, or the design. It isn’t simply assumed that, just because the assertion fired, there is indeed a problem with the design.

They accomplish all of this with a combination of technologies, including, by their description, formal techniques (which make up about 80% of what goes on), along with SAT solvers, binary decision diagrams, and other computational arcana. They can work with a variety of simulators and formal tools in the verification path. Those are the tools testing the design; OnPoint is the tool testing the errors.

In a perfect world, tests are always clear and unambiguous, the answer keys are always correct, and the test-taker is the only unknown. Actually, in a truly perfect world, the test-taker also has perfect knowledge and would never fail a test. But we don’t live in that world (well, Chuck Norris does, but none of us do). Given that unfortunate reality, Vennsa is hoping to help manage the challenge of figuring out what went wrong when something goes wrong.

Now…  whether they’d be able to bring order to the DMV, well, that’s quite a different question…

More info:  Vennsa

