
Does the World Need a New Verification Technology?

Verification is one of those perennial problems. I long ago lost count of the white papers I have read and the presentations I have sat through that discuss the issues of testing (roughly translated as “does this work?”) and verification (roughly translated as “does this do what the system architect, or whoever, intended?”). These are always accompanied by graphs showing the ever-increasing percentage of the project lifecycle taken up by test and verification. And the same arguments are rolled out for chip, software, and system development.

Now a new British company, Coveritas, has come up with another approach to the problem of verifying embedded systems and software. In fact, to say that Coveritas is new is not strictly true; the company has been around for over three years. While in stealth mode, it shipped product to some big names, including some working on low-power wireless and the internet of things. The team is small, and the three senior guys (Sean Redmond, Giles Hall and Paul Gee) each have a quarter century or more of experience in electronics. They have bootstrapped the company financially, and only now does the website reveal anything about what they are doing.

What they have done is to take spec-based verification, an approach that has been in use in the chip-design industry for some time, and transfer it to software. The first use of the term spec-based verification that I can find was by Verisity nearly fifteen years ago, so it is no surprise that both Redmond and Hall spent time with Verisity, both before and after its acquisition by Cadence.

Spec-based verification has several useful features. It takes the verification activity to a higher level of abstraction: the test team should not be getting involved in the nitty-gritty of code when the code runs to millions of lines, nor should the tests themselves require writing many hundreds or thousands of lines of code or scripts. It tests against the specifications – the functional specification, functional test plan, interface specification, and design specification – capturing the rules for the system that are embedded in these documents and then generating tests. If the specification is changed, then the tests can evolve to meet the changes. And the tests are available for later use in other, similar projects.
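To make that concrete, here is a minimal sketch of the core idea in generic C++ (nothing to do with Coveritas's actual code): a rule from the specification is captured once as data, and tests are generated from it rather than written by hand, so a change to the specification becomes a one-line edit.

```cpp
#include <iostream>
#include <random>

// Captured specification rule (illustrative): legal payload lengths
// are 1..127 bytes. If the specification changes, only this changes.
struct LengthRule {
    int min = 1;
    int max = 127;
};

int main() {
    LengthRule rule;
    std::mt19937 rng(42);  // fixed seed so runs are repeatable
    std::uniform_int_distribution<int> legal(rule.min, rule.max);

    // Every generated test obeys the captured rule automatically.
    for (int i = 0; i < 5; ++i)
        std::cout << "test payload length: " << legal(rng) << '\n';
}
```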

How does this differ from what most people are doing today? (And in this context, when we talk about most people, we are talking about those operating in a controlled development environment. Not Harry the Hairy Hacker.)

Today, the standard method is to manually create a test environment, developing tests for scenarios that the test team can imagine. This is time-consuming, inflexible (a change in the specification may require hours of work in changing the tests), doesn’t cope with complex issues like corner cases, and still lets bugs through to the end product. The time taken to develop and run the tests can have an impact on the overall project, particularly on the integration between hardware and software.
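For contrast, here is a generic sketch, not drawn from any particular project, of what that manual approach typically looks like: each scenario the team can imagine is coded by hand, with the specification's limits duplicated as literals in every test (the packet format and ranges below are invented for illustration).

```cpp
#include <cassert>
#include <cstdint>

struct Packet {
    std::uint16_t length;
    std::uint8_t channel;
};

// Hypothetical function under test: accepts only packets that are
// legal under the (imagined) specification.
bool is_legal(const Packet& p) {
    return p.length >= 1 && p.length <= 127 &&
           p.channel >= 11 && p.channel <= 26;
}

int main() {
    // Hand-written scenarios with hard-coded values. A change to the
    // specification means hunting down and editing every literal.
    const Packet min_ok{1, 11};      // minimum length, lowest channel
    const Packet max_ok{127, 26};    // maximum length, highest channel
    const Packet too_long{128, 11};  // just over the length limit
    const Packet bad_chan{64, 10};   // channel below the legal set

    assert(is_legal(min_ok));
    assert(is_legal(max_ok));
    assert(!is_legal(too_long));
    assert(!is_legal(bad_chan));
}
```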

An attempt to get around the problem of testing only what the testers can think of testing is fuzzing — the automatic generation of tests from random inputs within certain defined limits. Redmond compares this to testing by rolling an orange across the keyboard, and he explains that, since the randomisation is within defined limits, fuzzing is still testing only what the tester thinks needs testing.
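A minimal sketch of the technique, again generic rather than anything of Coveritas's, makes the limitation plain: the inputs are random, but the shape of the test is still fixed by whoever wrote the harness.

```cpp
#include <cstddef>
#include <cstdint>
#include <random>

// Hypothetical function under test; a real fuzzing run would target
// genuine parsing code and watch for crashes or misbehaviour.
bool parse_packet(const std::uint8_t* data, std::size_t len) {
    return len > 0 && data[0] == 0x7E;  // stand-in for real logic
}

int main() {
    std::mt19937 rng(12345);  // fixed seed for repeatable runs
    std::uniform_int_distribution<int> byte(0, 255);           // the "defined limits"
    std::uniform_int_distribution<std::size_t> length(1, 64);  // ditto

    for (int i = 0; i < 100000; ++i) {
        std::uint8_t buf[64];
        const std::size_t len = length(rng);
        for (std::size_t j = 0; j < len; ++j)
            buf[j] = static_cast<std::uint8_t>(byte(rng));
        parse_packet(buf, len);  // the orange rolls across the keyboard
    }
}
```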

Coveritas’s approach, in the Xcetera tool suite, starts with writing a functional test plan as executable C++ case definitions and system scenarios. This plan is based on the system specification, and, when testing is carried out, the functional coverage analyser can display which functionality has and has not been tested, organised in the same way that the specification defines it. In a standards-based project, for example in developing stacks for communication using standard protocols, an expert in the protocol can quickly review the coverage and recommend further work.
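Coveritas has not published its API in this piece, so the following is a purely hypothetical sketch of the general shape such an executable test plan might take: coverage points that map back to clauses of the specification, so that the coverage report reads the way the specification does (the class, method names, and clauses are all invented).

```cpp
#include <iostream>
#include <map>
#include <string>

// Hypothetical sketch only; the real Xcetera definitions will differ.
class FunctionalTestPlan {
    std::map<std::string, long> hits_;  // spec clause -> times exercised
public:
    void define(const std::string& clause) { hits_[clause]; }
    void hit(const std::string& clause)    { ++hits_.at(clause); }

    // The report is organised the way the specification is, so a
    // protocol expert can review it clause by clause.
    void report() const {
        for (const auto& [clause, hits] : hits_)
            std::cout << clause << ": "
                      << (hits ? "covered" : "NOT covered")
                      << " (" << hits << " hits)\n";
    }
};

int main() {
    FunctionalTestPlan plan;
    plan.define("5.1 join request accepted");
    plan.define("5.2 join request rejected when network is full");
    plan.hit("5.1 join request accepted");  // exercised by a test run
    plan.report();                          // 5.2 shows as NOT covered
}
```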

From this starting point, the next step is the use of constraint-driven automatic test generation. Coveritas considers two types: specification constraints and test constraints.

Specification constraints are about the legal parameters, such as input definitions, protocol definitions, I/O, relationships, and dependencies. These are the boundaries within which the system has to operate and therefore provide the boundaries for the test generator.

Test constraints, such as use case scenarios, target areas, probabilities and weights, corner case tests, and bug bypasses, are drawn from the functional test plan, and they give the test generator the information needed for specific tests.
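Putting the two together, a generic sketch of constraint-driven generation might look like the following, with specification constraints as hard legal bounds and a test constraint weighting generation toward the boundaries; the packet fields, ranges, and weights are all invented for illustration.

```cpp
#include <cstdint>
#include <iostream>
#include <random>

struct Packet {
    std::uint16_t length;
    std::uint8_t channel;
};

// Generic sketch, not the Xcetera API: specification constraints set
// the hard legal bounds; test constraints bias where, inside those
// bounds, the generator spends its effort.
Packet generate(std::mt19937& rng) {
    // Specification constraint: length must be 1..127.
    std::uniform_int_distribution<int> any_length(1, 127);

    // Test constraint: weight generation toward the boundary values,
    // where corner-case bugs tend to hide (weights are illustrative).
    std::discrete_distribution<int> mode({3, 4, 3});  // min / middle / max
    const int m = mode(rng);
    const int len = (m == 0) ? 1 : (m == 2) ? 127 : any_length(rng);

    // Specification constraint: channel drawn from the legal set
    // (11..26 here, echoing IEEE 802.15.4 purely as an example).
    std::uniform_int_distribution<int> chan(11, 26);
    return Packet{static_cast<std::uint16_t>(len),
                  static_cast<std::uint8_t>(chan(rng))};
}

int main() {
    std::mt19937 rng(7);
    for (int i = 0; i < 10; ++i) {
        const Packet p = generate(rng);
        std::cout << "length=" << p.length
                  << " channel=" << int(p.channel) << '\n';
    }
}
```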

The generator then creates random tests except in those cases where the constraints specifically tell it that it shouldn’t — exactly the opposite of fuzzing. This approach, claims Coveritas, can find bugs that were not considered in the functional test plan, such as ambiguities in the original specification or misinterpretations of it. It can also pick up behaviour that the specifier, the developer, or the test case writer had not considered.

The generator creates the tests on the fly, and it can exercise software running within a software model, a virtual prototype, an FPGA prototype, or even the production system. As the testing moves closer to the actual system, the complex issues that arise only when the code executes on real hardware, such as some classes of corner case, begin to appear, and the test generator can reveal them.

The Xcetera software tool suite is now available – visit www.coveritas.com.

As I was writing this, I discovered that Coveritas had been selected as one of the best British technology start-ups and was present at the British Business Embassy, a government venture running alongside the Olympic Games in London. The idea is that among the many high-level visitors to the Olympics are potential customers for British companies. Certainly Coveritas found it useful, but more relevant to this piece is that it persuaded one of its customers to go on video for the event and publicly endorse it. NXP has been developing a low-power wireless solution for the internet of things, called JenNet. (The underlying technology was acquired when NXP bought Jennic, a UK-based fabless semiconductor company, two years ago.) The Coveritas environment was used widely in the development of the JenNet-IP software, and NXP says on the video that it enabled the “engineers to find difficult-to-detect bugs” and “reduced the risk, time and cost of development.”

So that’s a good start, then.

But, to return to a theme that is a regular concern of mine, the Coveritas approach can work only within a well-defined development environment, which is essential to creating an appropriate specification. This also means it is likely that static code analysis and code reviews will have removed many of the minor bugs, leaving the heavyweight solution to cope with the complex and hard-to-find issues for which it was designed. Having said this, there is ample evidence that, even with the best of environments and the best of developers, the complexity of the demands made on modern software is such that a tool like Xcetera is essential if the end product is to be affordable, delivered on time, and able to work well in service. And, while the approach Coveritas is taking is new to software, it draws on a heritage of proven solutions.
