
Does the World Need a New Verification Technology?

Verification is one of those perennial problems. I long ago lost count of the white papers I have read and the presentations I have sat through that discuss the issues of testing (roughly translated as “does this work?”) and verification (roughly translated as “does this do what the system architect, or whoever, intended?”). These are always accompanied by graphs showing the increasing percentage of the project lifecycle taken up by test and verification. And the same arguments are rolled out for chip, software, and system development.

Now a new British company, Coveritas, has come up with another approach to the problem of verifying embedded systems and software. In fact, to say that Coveritas is new is not strictly true; the company has been around for over three years. While in stealth mode, it shipped product to some big names, including some working on low-power wireless and the internet of things. The team is small, and the three senior guys (Sean Redmond, Giles Hall and Paul Gee) each have a quarter century or more of experience in electronics. They have bootstrapped the company financially, and only now does the website reveal anything about what they are doing.

What they have done is to take spec-based verification, an approach that has been in use in the chip-design industry for some time, and transfer it to software. The first use of the term spec-based verification that I can find was by Verisity nearly fifteen years ago, so it is no surprise that both Redmond and Hall spent time at Verisity, both before and after its acquisition by Cadence.

Spec-based verification has several useful features. It takes the verification activity to a higher level of abstraction: the test team should not be getting involved in the nitty-gritty of code when that code runs to millions of lines, nor should the tests themselves require writing many hundreds or thousands of lines of code or scripts. It tests against the specifications – functional specification, functional test plan, interface specification, and design specification – capturing the rules for the system that are embedded in those specifications and then generating tests. If a specification is changed, the tests can evolve to meet the changes. And the tests are available for later reuse in other, similar projects.

How does this differ from what most people are doing today? (And in this context, when we talk about most people, we are talking about those operating in a controlled development environment. Not Harry the Hairy Hacker.)

Today, the standard method is to manually create a test environment, developing tests for the scenarios that the test team can imagine. This is time-consuming, inflexible (a change in the specification may require hours of work to change the tests), doesn’t cope with complex issues like corner cases, and still lets bugs through to the end product. The time taken to develop and run the tests can have an impact on the overall project, particularly on the integration between hardware and software.

An attempt to get around the problem of testing only what the testers can think of testing is fuzzing – the automatic generation of random test inputs within certain defined limits. Redmond compares this to testing by rolling an orange across the keyboard, and he points out that, since the randomisation stays within limits the tester has defined, fuzzing still tests only what the tester thinks needs testing.
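To make that contrast concrete, here is a minimal fuzzing sketch in C++. Everything in it is invented for illustration – the parse_packet stub stands in for whatever code is under test – but the length and value limits are exactly the kind of tester-chosen boundaries Redmond is describing.

#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

// Hypothetical stand-in for the code under test; a real target would be the
// stack or application being verified.
bool parse_packet(const std::vector<uint8_t>& bytes) {
    return !bytes.empty() && bytes[0] == 0x7E;    // placeholder framing check
}

int main() {
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> len(1, 64);    // tester-chosen length limit
    std::uniform_int_distribution<int> val(0, 255);   // tester-chosen value range

    int accepted = 0;
    for (int i = 0; i < 10000; ++i) {
        std::vector<uint8_t> pkt(len(rng));
        for (auto& b : pkt) b = static_cast<uint8_t>(val(rng));
        if (parse_packet(pkt)) ++accepted;            // crashes and asserts are the usual failure signal
    }
    std::printf("accepted %d of 10000 random packets\n", accepted);
}

The randomness only ever explores the space the tester thought to define up front, which is Redmond’s point.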

Coveritas’s approach, in the Xcetera tool suite, starts with writing a functional test plan as executable C++ test case definitions and system scenarios. The plan is based on the system specification, and, when testing is carried out, the functional coverage analyser can display, in the same terms in which the specification defines functionality, what has and has not been tested. In a standards-based project, for example in developing stacks for communication using standard protocols, an expert in the protocol can quickly review the coverage and recommend further work.
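Coveritas has not published the Xcetera API, but a functional test plan written as executable C++ might look roughly like the following sketch, in which every scenario name is invented and maps back to a clause of a hypothetical wireless-stack specification.

#include <cstdio>
#include <string>
#include <vector>

// Each scenario maps back to a clause in the specification, so the coverage
// report reads in the same terms as the specification itself.
struct Scenario {
    std::string name;
    int hits;          // how many generated tests exercised this scenario
};

int main() {
    std::vector<Scenario> plan = {
        {"join: node joins the network with a valid key", 0},
        {"join: node is rejected with an invalid key", 0},
        {"route: packet forwarded across three hops", 0},
        {"route: packet dropped when the hop limit is exceeded", 0},
    };

    // In a real run the generator and checkers would update these counts as
    // tests execute; two are faked here just to show the shape of the report.
    plan[0].hits = 12;
    plan[2].hits = 3;

    std::puts("functional coverage:");
    for (const auto& s : plan)
        std::printf("  [%c] %-55s %d hits\n", s.hits ? 'x' : ' ', s.name.c_str(), s.hits);
}

A protocol expert reviewing a report in this form can see at a glance which parts of the specification have not yet been exercised.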

From this starting point, the next step is the use of constraint-driven automatic test generation. Coveritas considers two types: specification constraints and test constraints.

Specification constraints cover the legal parameters, such as input definitions, protocol definitions, I/O, relationships, and dependencies. These are the boundaries within which the system has to operate, and they therefore provide the boundaries for the test generator.
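As a rough illustration only – the field names, ranges, and dependency are invented, not drawn from Xcetera – specification constraints can be thought of as the legal envelope captured as data:

#include <cassert>

// Inclusive legal range for a parameter.
struct Range { int lo, hi; };

// The envelope the system must operate within, captured so a generator can stay inside it.
struct PacketSpec {
    Range payload_len {0, 127};   // protocol-defined payload size, in bytes
    Range channel     {11, 26};   // legal radio channels
    Range retries     {0, 7};     // allowed retransmission count

    // Dependency between fields: a broadcast packet never requests an acknowledgement.
    static bool legal(bool broadcast, bool ack_requested) {
        return !(broadcast && ack_requested);
    }
};

int main() {
    PacketSpec spec;
    assert(spec.payload_len.hi == 127);
    assert(!PacketSpec::legal(/*broadcast=*/true, /*ack_requested=*/true));
    return 0;
}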

Test constraints, such as use case scenarios, target areas, probabilities and weights, corner case tests, and bug bypasses, are drawn from the functional test plan, and they give the test generator the information needed for specific tests.
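Test constraints might then look something like this sketch – again with invented names – where weights steer the generator towards areas of interest and named corner cases must be hit at least once:

#include <string>
#include <vector>

// Relative weight: how strongly the generator is steered towards an area.
struct Weighted { std::string area; int weight; };

struct TestConstraints {
    // Areas of interest drawn from the functional test plan, with weights.
    std::vector<Weighted> focus = {
        {"join/leave sequences", 5},
        {"routing under load", 3},
        {"ordinary traffic", 1},
    };
    // Named corner cases the generator must hit at least once.
    std::vector<std::string> must_hit = {
        "maximum-length payload on the highest legal channel",
        "retry count at its upper bound",
    };
};

int main() {
    TestConstraints tc;
    return (tc.focus.size() == 3 && tc.must_hit.size() == 2) ? 0 : 1;
}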

The generator then creates random tests except in those cases where the constraints specifically tell it that it shouldn’t – exactly the opposite of fuzzing. This approach, Coveritas claims, can find bugs that were not considered in the functional test plan, such as those arising from ambiguities in the original specification or from misinterpretation of it. It can also pick up behaviour that the specifier, the developer, or the test case writer had not considered.
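The inversion can be sketched in a few lines of C++. This illustrates constrained-random generation in general, not Coveritas’s implementation, and the packet fields and constraint are invented: every value is drawn freely at random, and a candidate is rejected only when it violates a constraint.

#include <cstdio>
#include <random>

struct Packet { int payload_len; int channel; bool broadcast; bool ack_requested; };

// Specification constraint: a broadcast packet never requests an acknowledgement.
bool legal(const Packet& p) { return !(p.broadcast && p.ack_requested); }

int main() {
    std::mt19937 rng(7);
    std::uniform_int_distribution<int> len(0, 127);   // legal payload sizes
    std::uniform_int_distribution<int> chan(11, 26);  // legal channels
    std::bernoulli_distribution flag(0.5);

    for (int i = 0; i < 5; ++i) {
        Packet p;
        do {                                          // draw freely, re-draw only when a constraint is violated
            p = {len(rng), chan(rng), flag(rng), flag(rng)};
        } while (!legal(p));
        std::printf("len=%3d chan=%2d broadcast=%d ack=%d\n",
                    p.payload_len, p.channel, p.broadcast, p.ack_requested);
    }
}

Where the fuzzer above explored only the space the tester defined, this generator explores everything that is not explicitly forbidden.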

The generator creates the tests on the fly, and it can exercise software running within a software model, a virtual prototype, an FPGA prototype, or even the production system. As the testing gets closer to the actual system, the complex issues that arise only when the code is executed on real hardware, such as some classes of corner case, will surface, and the test generator can reveal them.

The Xcetera software tool suite is now available – visit www.coveritas.com.

As I was writing this, I discovered that Coveritas had been selected as one of the best British technology start-ups and was present at the British Business Embassy, a government venture running alongside the Olympic Games in London. The idea is that among the many high-level visitors to the Olympics are potential customers for British companies. Certainly Coveritas found it useful, but more relevant to this piece is that the company persuaded one of its customers to go on video for the event and publicly endorse it. NXP has been developing a low-power wireless solution for the internet of things, called JenNet. (The underlying technology was acquired when NXP bought Jennic, a UK-based fabless semiconductor company, two years ago.) The Coveritas environment was used widely in the development of the JenNet-IP software, and NXP says on the video that it enabled the “engineers to find difficult-to-detect bugs” and “reduced the risk, time and cost of development.”

So that’s a good start, then.

But, to return to a theme that is a regular concern of mine, the Coveritas approach can work only within a well-defined development environment, which is essential to creating an appropriate specification. This also means it is likely that static code analysis and code reviews will have removed many of the minor bugs, leaving the heavyweight solution to cope with the complex and hard-to-find issues for which it has been designed. Having said this, there is ample evidence that, even with the best of environments and the best developers, the complexity of the demands made upon modern software is such that a tool like Xcetera is essential if the end product is going to be affordable, delivered on time, and actually work well in service. And, while the approach that Coveritas is taking is new to software, it draws on a heritage of proven solutions.
