
Requirements Modeling and Simulation

So you’re working on a design… Are you sure you’re building what was intended? Yes, you’re building what they asked for… or, at least, what you think they asked for, but is that what they wanted?

Requirements can be dicey; they’re based on natural language, which, as we know all too well, is subject to interpretation. According to Argosim, many companies have institutionalized styles – sentence templates, for instance – to ensure consistent, clear, unambiguous articulation of requirements. That’s not a guarantee of clarity, but it certainly helps.
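For a flavor of what such a template looks like, one widely used pattern (not necessarily the one Argosim has in mind) reads: “When <trigger condition>, the <system> shall <response> within <time bound>.” Filling in the blanks forces the author to state the trigger, the actor, and the deadline explicitly.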

What it doesn’t necessarily help with, however, is the following question: do you know whether, out of all the requirements, some conflict or are mutually inconsistent? If one requirement calls for a dense material to protect against radiation and another requires that the material float, those two might not work together.

There has, until recently, been no way to formally check requirements for consistency and overall correctness. Those two requirements above (dense + floating) are relatively vague – you wouldn’t be able to test them without extracting the semantics and then simulating the physics. But many requirements are functional, and those can be expressed in a more formal manner that can then be tested.
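To make that distinction concrete, here’s a deliberately simplified sketch – plain Python, not STIMULUS’s formal language – that recasts the dense-versus-floating example as two machine-checkable constraints and searches for any design that satisfies both. The density threshold is an assumption chosen purely for illustration.

```python
import random

# Hypothetical illustration (not STIMULUS's formal language): the article's
# dense-shielding-versus-must-float example recast as two machine-checkable
# constraints on a candidate material. The density threshold is assumed.

def req_shielding(material):
    """R1: material must be dense enough for radiation shielding (assumed threshold)."""
    return material["density_kg_m3"] >= 7000

def req_floats(material):
    """R2: material must float on water (density below ~1000 kg/m^3)."""
    return material["density_kg_m3"] < 1000

def random_material():
    """Sample a candidate design from a broad range of densities."""
    return {"density_kg_m3": random.uniform(100, 20000)}

# Search for any single design satisfying both requirements; finding none
# is a strong hint that the requirements are mutually inconsistent.
found = any(req_shielding(m) and req_floats(m)
            for m in (random_material() for _ in range(100_000)))
print("Design satisfying both requirements found:", found)  # expected: False
```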

This might sound like a nice-to-have for many electronic products. We see too many cheap consumer goods that clearly haven’t been well thought out, or that don’t even have all of their features working properly. Such products would certainly benefit, but their cost and schedule constraints rarely allow for a more deliberate process.

Safety-critical equipment, however, is a completely different matter, with strict demands for requirements traceability. And, while some product requirements will always carry a level of vagueness, functional requirements in such systems must explicitly be verifiable. That means they should be specific enough for their mutual consistency and other properties to be tested.

This is what Argosim has introduced in their STIMULUS offering. It’s a mechanism for specifying requirements, testing them for correctness and consistency, and then generating tests from them that can be used in future validation testing.

At present, the workflow is somewhat disconnected from existing flows; ideally, requirements would be specified using STIMULUS – that’s Argosim’s long-term vision. For the moment, it’s more of a collaborative process.

First, a requirements engineer will create natural-language requirements in the same way as is done today. He then hands them off to a verification engineer, who will create a model in STIMULUS using a formal language that can then be simulated. Both requirements and assumptions are included in the description. Through simulation, problems might be identified – and the original requirements engineer is then consulted for discussion and correction.
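As an illustration of what such a simulation can surface – again in plain Python rather than STIMULUS, using a made-up fan-control example – the sketch below checks, for each simulated environment state allowed by the assumptions, whether any output can satisfy all of the requirements at once; a state where none can is a conflict to take back to the requirements engineer.

```python
import random

# Hypothetical sketch (plain Python, not STIMULUS): requirements plus an
# environment assumption, simulated to look for states in which no output
# can satisfy every requirement -- i.e., the requirements conflict.

def assumption(env):
    """Assumed environment: temperature between 0 and 50 C; door may open at any time."""
    return 0.0 <= env["temp"] <= 50.0

def req_cooling(env, out):
    """R1: if the temperature exceeds 30 C, the fan must be on."""
    return out["fan"] if env["temp"] > 30 else True

def req_door_safety(env, out):
    """R2: the fan must be off whenever the door is open."""
    return (not out["fan"]) if env["door_open"] else True

def random_env():
    """Sample one environment state allowed by the assumption."""
    return {"temp": random.uniform(0.0, 50.0), "door_open": random.random() < 0.3}

for _ in range(10_000):
    env = random_env()
    if not assumption(env):
        continue
    # Can any fan setting satisfy both requirements in this state?
    feasible = any(req_cooling(env, {"fan": fan}) and req_door_safety(env, {"fan": fan})
                   for fan in (True, False))
    if not feasible:
        print("Requirements conflict in reachable state:", env)
        break
```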

If no issues are identified, then the verification engineer can generate tests that will be used downstream to close the loop and ensure that the implementation of the requirements matches the intent. If tests aren’t generated, or if the test set is incomplete, then this is still something of an open-loop process.
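Continuing the hypothetical fan-control example, and assuming the conflict above has been resolved (say, by deciding the fan should run only when the door is closed), test generation might then look like this: derive concrete input/expected-output vectors from the formalized requirements for a downstream test bench to replay against the implementation.

```python
import json
import random

# Hypothetical continuation: with the fan requirements made consistent
# (fan on iff temperature > 30 C AND the door is closed), derive concrete
# test vectors from the formalized requirements for downstream validation.

def expected_fan(env):
    """Output implied by the revised, consistent requirements."""
    return env["temp"] > 30 and not env["door_open"]

def generate_tests(count=20, seed=42):
    """Produce input/expected-output pairs a test bench could replay."""
    rng = random.Random(seed)
    tests = []
    for _ in range(count):
        env = {"temp": round(rng.uniform(0.0, 50.0), 1),
               "door_open": rng.random() < 0.3}
        tests.append({"input": env, "expected": {"fan": expected_fan(env)}})
    return tests

print(json.dumps(generate_tests(), indent=2))
```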

You can find out more about STIMULUS in their announcement.
