
Are You Covered?

Software Test Coverage Isn’t as Straightforward as You Might Hope

LDRA issued a press release some time back on their work with The MathWorks; I addressed the overall topic at the time, but there was something else that caught my eye.

In the press release was the statement, “The LDRA tool suite provides full code coverage whether statement, branch or decision, linear code sequence and jump (LCSAJ), or modified condition/decision coverage (MC/DC) of code created from Simulink models and hand-written code.”

That got me thinking: if there are these various criteria for considering software covered, then is there a matrix of who, or which standard, requires what level of coverage?

As you might expect, the answer isn’t simple. Actually, the specific answer to that specific question is simple: it’s “No.” There is no such matrix.

Before going into the nuance, I thought it useful to present the Wikipedia definitions of the various coverage criteria as a baseline (combining the contents of a couple of entries)*.

Condition: A condition is a leaf-level Boolean expression (it cannot be broken down into a simpler Boolean expression).

Decision: A Boolean expression composed of conditions and zero or more Boolean operators. A decision without a Boolean operator is a condition.

Condition Coverage: Every condition in a decision in the program has taken all possible outcomes at least once.

Decision Coverage:  Every point of entry and exit in the program has been invoked at least once, and every decision in the program has taken all possible outcomes at least once.
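To make the difference between condition coverage and decision coverage concrete, here is a minimal sketch in C – my own hypothetical example, not anything from LDRA or the standards – showing how condition coverage can be satisfied while decision coverage is not.

#include <assert.h>
#include <stdbool.h>

/* Hypothetical decision built from two leaf-level conditions. */
static bool valve_open(bool pressure_ok, bool manual_override)
{
    return pressure_ok || manual_override;   /* the decision: (pressure_ok || manual_override) */
}

int main(void)
{
    /* Condition coverage: each condition takes both TRUE and FALSE...        */
    assert(valve_open(true,  false) == true);    /* pressure_ok T, manual_override F */
    assert(valve_open(false, true)  == true);    /* pressure_ok F, manual_override T */

    /* ...yet the decision itself has only ever evaluated to TRUE, so decision
       coverage is still missing.  This third case supplies the FALSE outcome. */
    assert(valve_open(false, false) == false);
    return 0;
}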

LCSAJ: An LCSAJ is a software code path fragment consisting of a linear sequence of code (a Linear Code Sequence, or basic block) followed by a control-flow Jump, and it is identified by the following three items:

    the start of the linear sequence (basic block) of executable statements
    the end of the linear sequence
    the target line to which control flow is transferred at the end of the linear sequence.
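As a rough, hypothetical illustration of those three items (my own sketch, not LDRA's), consider the loop below: one LCSAJ runs through the loop body and ends at the jump back to the loop test. Real tools identify LCSAJs at the basic-block level; this is only meant to show the start / end / jump-target structure.

#include <stdio.h>

int count_negatives(const int *values, int n)
{
    int count = 0;
    for (int i = 0; i < n; i++) {     /* loop test: target of the backward jump  */
        if (values[i] < 0) {          /* start of a linear sequence               */
            count++;
        }
    }                                 /* end of the sequence: jump back to test   */
    return count;
}

int main(void)
{
    int samples[] = { 3, -1, 4, -1, -5 };
    printf("%d\n", count_negatives(samples, 5));   /* prints 3 */
    return 0;
}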

Condition/Decision Coverage:  Every point of entry and exit in the program has been invoked at least once, every condition in a decision in the program has taken all possible outcomes at least once, and every decision in the program has taken all possible outcomes at least once. 

Modified Condition/Decision Coverage:  Every point of entry and exit in the program has been invoked at least once, every condition in a decision in the program has taken on all possible outcomes at least once, and each condition has been shown to affect that decision outcome independently. A condition is shown to affect a decision’s outcome independently by varying just that condition while holding fixed all other possible conditions. The condition/decision criterion does not guarantee the coverage of all conditions in the module because in many test cases, some conditions of a decision are masked by the other conditions. Using the modified condition/decision criterion, each condition must be shown to be able to act on the decision outcome by itself, everything else being held fixed. The MC/DC criterion is thus much stronger than the condition/decision coverage.
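To make the independence requirement concrete, here is a hypothetical sketch (my example, not taken from any standard) of a minimal MC/DC test set for a three-condition decision. Four tests suffice for three conditions (n + 1), versus the 2^3 = 8 needed to exercise every combination.

#include <assert.h>
#include <stdbool.h>

/* Hypothetical decision with three conditions: (a && b) || c */
static bool fire_actuator(bool a, bool b, bool c)
{
    return (a && b) || c;
}

int main(void)
{
    /* One possible minimal MC/DC test set for (a && b) || c. */
    assert(fire_actuator(true,  true,  false) == true);   /* T1 */
    assert(fire_actuator(false, true,  false) == false);  /* T2 */
    assert(fire_actuator(true,  false, false) == false);  /* T3 */
    assert(fire_actuator(false, true,  true)  == true);   /* T4 */

    /* Independence:
       T1 vs T2: only a changes and the outcome flips -> a shown independent.
       T1 vs T3: only b changes and the outcome flips -> b shown independent.
       T2 vs T4: only c changes and the outcome flips -> c shown independent. */
    return 0;
}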

I had a conversation with LDRA’s Mark Pitchford to get some more clarity (or at least color) on how this stacks up in real life.

While some standards may have their own matrices for coverage according to safety integrity level (SIL), much is left to the dude who adjudicates whether or not you’ve met your duty under whichever standard you’re working towards; this is the so-called designated engineering representative (DER).

In particular, MC/DC has some surprising subtleties associated with it, depending on how the requirements are interpreted – and some interpretations are, of course, much more conservative than others.

One subtlety involves the “short-circuit” evaluation feature found in some languages. In languages like C or C++, if a decision involves multiple conditions ANDed or ORed together, then evaluation goes only as far as necessary to determine the overall outcome. If A, B, and C are ANDed together and A is FALSE, then B and C don’t have to be evaluated; you already know the result is FALSE. Ada’s standard operators are more conservative: all conditions are evaluated regardless of outcome (unless the explicit short-circuit forms “and then” and “or else” are used).
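Here is a small sketch of what that looks like in C, using hypothetical condition functions with visible side effects so the evaluation order is apparent.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical conditions that print when evaluated. */
static bool check_a(void) { puts("A evaluated"); return false; }
static bool check_b(void) { puts("B evaluated"); return true;  }
static bool check_c(void) { puts("C evaluated"); return true;  }

int main(void)
{
    /* With C's short-circuit &&, once check_a() returns false the overall
       result is known, so check_b() and check_c() are never called; only
       "A evaluated" is printed.  A coverage tool therefore cannot observe
       B or C taking a value on this run, which is exactly the subtlety the
       MC/DC interpretations have to deal with.                             */
    if (check_a() && check_b() && check_c()) {
        puts("all checks passed");
    }
    return 0;
}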

This issue was addressed by the Certification Authorities Software Team (CAST) for DO-178B, which defined DC and MC/DC in terms of the following series of increasingly demanding requirements:

a)  Every statement in the program has been invoked at least once;
b)  Every point of entry and exit in the program has been invoked at least once;
c)  Every control statement (i.e., branch point) in the program has taken all possible outcomes (i.e., branches) at least once;
d)  Every non-constant Boolean expression in the program has evaluated to both a True and a False result;
e)  Every non-constant condition in a Boolean expression in the program has evaluated to both a True and a False result;
f)  Every non-constant condition in a Boolean expression in the program has been shown to independently affect that expression’s outcome.

Based upon these definitions:

  • Statement Coverage requires (a) only
  • DC requires (b, c, d)
  • MC/DC requires (b, c, d, e, f)

There’s yet another nuance when it comes to hardware registers: there’s disagreement as to whether assigning a value to a register should constitute a decision. Which, on its face, sounds bizarre: how could an assignment be considered a decision? But here’s the reasoning, which applies in particular to writes to memory-mapped registers that hand off some result to a peripheral.

A conservative approach would assume that the peripheral will be making some decision based on the written value, and that decision is outside the scope of the program being tested – and yet it depends on the value created by the program being tested. Therefore, some DERs will require that all values for that register be covered as a proxy for covering the decisions the peripheral will make.

Less conservative DERs will require condition coverage only when (and if) the hardware register value is actually involved in a decision somewhere later in the code being tested (which is the more intuitive approach).
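To see why the question even arises, here is a hedged sketch of the kind of write at issue; the register addresses and bit definitions below are invented purely for illustration (and the fragment is not meant to be run on a desktop host).

#include <stdint.h>

/* Hypothetical memory-mapped peripheral registers. */
#define DMA_CTRL  (*(volatile uint32_t *)0x40001000u)
#define DMA_LEN   (*(volatile uint32_t *)0x40001004u)

#define DMA_MODE_BURST  0x02u
#define DMA_START       0x80u

void start_burst_transfer(uint32_t length)
{
    /* There is no branch in this C source; the code simply stores values.
       The peripheral, however, will decide what to do based on the bits
       written to DMA_CTRL.  A conservative DER may therefore require tests
       that cover the values written here, as a proxy for the decisions the
       hardware makes on the other side of the register.                    */
    DMA_LEN  = length;
    DMA_CTRL = DMA_MODE_BURST | DMA_START;
}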

One of the key take-aways here is that DER discretion can make a difference. The conservative interpretations take more work (although LDRA does support them with switches in their tools). If two competing companies have two different DERs, one of which is nice and one of which is harsh, then the unlucky company will have higher costs due to the greater rigor required of them, even though both companies are working to the same standard.

DO-178C is supposed to close many of the loopholes and make more concrete some of the vaguer aspects of DO-178B, but for now, at least, these “clear-cut” test coverage definitions are anything but.

 

*At least, this is what Wikipedia says as of 11/2/11…
