
Does the World Need a New Verification Technology?

Verification is one of those perennial problems. I long ago lost count of the white papers I have read and the presentations I have sat through that discuss the issues of testing (roughly translated as “does this work?”) and verification (roughly translated as “does this do what the system architect, or whoever, intended?”). These are always accompanied by graphs showing the increasing percentage of the project lifecycle taken up by test and verification. And the same arguments are rolled out for chip, software, and system development.

Now a new British company, Coveritas, has come up with another approach to the problem of verifying embedded systems and software. In fact, to say that Coveritas is new is not strictly true; the company has been around for over three years. While in stealth mode, it shipped product to some big names, including some working on low-power wireless and the internet of things. The team is small, and the three senior guys (Sean Redmond, Giles Hall and Paul Gee) each have a quarter century or more of experience in electronics. They have bootstrapped the company financially, and only now does the website reveal anything about what they are doing.

What they have done is to take spec-based verification, an approach in use in the chip-design industry for some time, and transfer it to software. The first use of the term that I can find was by Verisity nearly fifteen years ago, so it is no surprise that both Redmond and Hall spent time with Verisity, both before and after its acquisition by Cadence.

Spec-based verification has several useful features. It takes the verification activity to a higher level of abstraction: the test team should not be getting involved in the nitty-gritty of code when that code runs to millions of lines, nor should the tests themselves require writing many hundreds or thousands of lines of code or scripts. It tests against the specifications – functionality, functional test plan, interface specification, and design specification – capturing the rules for the system that are embedded in these documents and then generating tests. If the specification changes, the tests can evolve to match. And the tests remain available for later use in other, similar projects.

How does this differ from what most people are doing today? (And in this context, when we talk about most people, we are talking about those operating in a controlled development environment. Not Harry the Hairy Hacker.)

Today, the standard method is to manually create a test environment, developing tests for scenarios that the test team can imagine. This is time-consuming, inflexible (a change in the specification may require hours of work in changing the tests), doesn’t cope with complex issues like corner cases, and still lets bugs through to the end product. The time taken to develop and run the tests can have an impact on the overall project, particularly on the integration between hardware and software.

An attempt to get around the problem of testing only what the testers can think of testing is fuzzing – the automatic generation of random test inputs within certain defined limits. Redmond compares this to testing by rolling an orange across the keyboard, and he points out that, since the randomisation stays within defined limits, fuzzing is still testing only what the tester thinks needs testing.
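To make the point concrete, a minimal fuzzer of this kind can be sketched in a few lines of C++. The packet fields, their ranges, and the function names here are entirely invented for illustration; they are not from Coveritas or any real protocol stack:

```cpp
#include <cstddef>
#include <cstdint>
#include <random>
#include <vector>

// Hypothetical frame for a low-power wireless stack under test.
struct PacketConfig {
    std::uint8_t length;   // tester-chosen legal range: 1..255 bytes
    std::uint8_t channel;  // tester-chosen legal range: 11..26
};

// A plain fuzzer: draw each field uniformly within the limits the
// tester defined. It never strays outside those limits, so it only
// ever explores what the tester thought to randomise.
std::vector<PacketConfig> fuzz(std::size_t n, unsigned seed = 42) {
    std::mt19937 rng(seed);
    std::uniform_int_distribution<int> len(1, 255);
    std::uniform_int_distribution<int> chan(11, 26);
    std::vector<PacketConfig> out;
    out.reserve(n);
    for (std::size_t i = 0; i < n; ++i) {
        out.push_back({static_cast<std::uint8_t>(len(rng)),
                       static_cast<std::uint8_t>(chan(rng))});
    }
    return out;
}
```

The weakness Redmond describes is visible in the code itself: the two `uniform_int_distribution` objects are the tester's imagination encoded as ranges, and nothing outside them is ever generated.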

Coveritas’s approach, in the Xcetera tool suite, starts with writing a functional test plan as executable C++ case definitions and system scenarios. This is based on the system specification, and, when testing is carried out, the functional coverage analyser can display, in the same way as the specification defines functionality, what functionality has and has not been tested. In a standards-based project, for example in developing stacks for communication using standard protocols, an expert in the protocol can quickly review the coverage and recommend further work.
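As an illustration of what a functional-coverage model does, here is a toy sketch in C++. The class, its methods, and the spec items are invented for this article; this is not the Xcetera API, merely the general shape of the idea – declare the behaviours the specification promises, record which ones tests have exercised, and report the gaps:

```cpp
#include <map>
#include <string>
#include <vector>

// Toy functional-coverage model (illustrative only): each coverage
// point names a piece of specified behaviour; hits are recorded as
// tests run; the report lists spec items no test has yet exercised.
class CoverageModel {
public:
    // Register a behaviour taken from the specification.
    void declare(const std::string& spec_item) { hits_[spec_item] = 0; }

    // Record that a test exercised this behaviour.
    void hit(const std::string& spec_item) { ++hits_[spec_item]; }

    // Behaviours the specification promises but no test has touched.
    std::vector<std::string> uncovered() const {
        std::vector<std::string> out;
        for (const auto& [item, count] : hits_)
            if (count == 0) out.push_back(item);
        return out;
    }

private:
    std::map<std::string, int> hits_;  // spec item -> times exercised
};
```

Because the coverage points are named in the specification's own terms, a protocol expert can read the `uncovered()` report directly, without digging into test scripts – which is exactly the review step described above.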

From this starting point, the next step is the use of constraint-driven automatic test generation. Coveritas considers two types: specification constraints and test constraints.

Specification constraints are about the legal parameters, such as input definitions, protocol definitions, I/O, relationships, and dependencies. These are the boundaries within which the system has to operate, and they therefore provide the boundaries for the test generator.

Test constraints, such as use case scenarios, target areas, probabilities and weights, corner case tests, and bug bypasses, are drawn from the functional test plan, and they give the test generator the information needed for specific tests.

The generator then creates random tests except where the constraints specifically tell it that it shouldn’t – exactly the opposite of fuzzing. This approach, claims Coveritas, can find bugs that were not considered in the functional test plan, such as ambiguities in the original specification or misinterpretations of it. It can also pick up behaviour that the specifier, the developer, or the test-case writer had not considered.
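A minimal sketch of constrained-random generation in C++ may help show the contrast with fuzzing. Everything here – the frame fields, the "encryption required above 50 bytes" rule, and the 30% corner-case weighting – is invented for illustration, standing in for a real specification constraint and a real test constraint respectively:

```cpp
#include <cstdint>
#include <random>

// Illustrative frame under test (not a real protocol).
struct Frame {
    std::uint16_t payload_len;  // spec constraint: 0..100 bytes
    bool          encrypted;    // spec constraint: must be true when len > 50
};

// Constrained-random generator: anything not forbidden by a
// constraint is fair game, so unforeseen combinations get generated.
class FrameGenerator {
public:
    explicit FrameGenerator(unsigned seed) : rng_(seed) {}

    Frame next() {
        // Test constraint (a weighting from the functional test plan):
        // 30% of frames probe the boundary region around 50 bytes.
        std::bernoulli_distribution corner(0.3);
        std::uniform_int_distribution<int> full(0, 100);
        std::uniform_int_distribution<int> edge(49, 51);

        Frame f;
        f.payload_len = static_cast<std::uint16_t>(
            corner(rng_) ? edge(rng_) : full(rng_));

        // Specification constraint: long frames must be encrypted;
        // short frames may legally be either, so leave that random.
        f.encrypted = (f.payload_len > 50)
                          ? true
                          : std::bernoulli_distribution(0.5)(rng_);
        return f;
    }

private:
    std::mt19937 rng_;
};
```

Note the inversion relative to the fuzzer: the distributions cover everything, and the constraints only subtract illegal cases, so combinations nobody listed in a test plan still come out of the generator.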

The generator creates the tests on the fly, and it can exercise software running within a software model, a virtual prototype, an FPGA prototype, or even the production system. As the testing gets closer to the actual system, the complex issues that arise only when the code executes on real hardware, such as some classes of corner cases, will appear, and the test generator can reveal them.

The Xcetera software tool suite is now available – visit www.coveritas.com.

As I was writing this, I discovered that Coveritas had been selected as one of the best British technology start-ups and was present at the British Business Embassy, a government venture running alongside the Olympic Games in London. The idea is that among the many high-level visitors to the Olympics are potential customers for British companies. Certainly Coveritas found it useful, but more relevant to this piece is that they persuaded one of their customers to go on video for the event and to publicly endorse them. NXP has been developing a low-power wireless solution for the internet of things, called JenNet. (The underlying technology was acquired when NXP bought Jennic, a UK-based fabless semiconductor company, two years ago.) Coveritas was used widely in the development of the JenNet-IP software, and NXP says on the video that the environment enabled the “engineers to find difficult-to-detect bugs” and “reduced the risk, time and cost of development.”

So that’s a good start, then.

But, to return to a theme that is a regular concern of mine, the Coveritas approach can work only within a well-defined development environment, essential to creating an appropriate specification. This also means it is likely that static code analysis and code reviews will have removed many of the minor bugs, leaving the heavyweight solution to cope with the complex and hard-to-find issues for which it has been designed. Having said this, there is ample evidence that, even with the best of environments and the best developers, the complexity of the demands made upon modern software is such that a tool like Xcetera is essential if the end product is going to be affordable, delivered on time, and actually work well in service. And, while the approach that Coveritas is taking is new to software, it draws on a heritage of proven solutions.
