
Attacking Constraint Complexity

Part 2 – E Soft and SystemVerilog Default Constraints

The growth in both the size and complexity of chips is driving greater use of constrained-random testing.  As a direct result, the size and complexity of constraint problems are also growing, and with them the consequences not just of constraint-related mistakes, but of employing less-than-optimum strategies.  All of this drives the need to understand constraints in order to get the most out of their capacity and overall performance potential.  Part 1 of this article series focused on verification IP reuse, examining how a solver typically interprets constraints and providing a constraint case study around a networking ASIC.  Part 2 focuses on the SystemVerilog[1], E[2], and OpenVera[3] constructs that allow constraints to be disabled or overridden using a soft or default keyword.

This article explores the similarities and differences, including subtle semantic differences, between E soft constraints and OpenVera default constraints, in the interest of optimizing constraint performance and speeding validation.  Test writers must be aware of the constraint semantics and of the constraints present in the VIP and testbench environment, whether written in SystemVerilog, OpenVera, or E.  Without such knowledge, confusing constraint failures or incorrect test stimulus are likely to occur.  This leads to significant debug effort to find the root cause, and with it the likelihood that a design team will incur greater development costs and miss critical deadlines.  For this reason, understanding the differences between the soft and default constraint types is critical to successful validation.

E Soft Constraints

The E language contains a ‘keep soft’ construct, allowing a VIP creator to specify constraints that may be automatically overridden by the test layer.  A soft constraint is disabled when it conflicts with another constraint; otherwise it behaves like a regular constraint.  This can lead to unexpected side effects that impact performance, so it is important to understand.  Consider the example shown in Figure 1, which has the following characteristics:

  • The packet base class has a soft constraint to set the length of the packet to the legal range of [32..1500].
  • The short_packet constraint does not conflict, resulting in packets of length [32..63]. 
  • The long_packet constraint does not conflict, resulting in packets of length [65..1500]. 
  • The err1_packet constraints conflict with the soft constraint, so the soft constraint is disabled, resulting in packets of length [1501..2000].
  • The err2_packet constraints do not conflict and the soft constraint is active, resulting in all packets of length 1500, probably not what was desired.


Figure 1 – E Soft Constraints
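A minimal e sketch of the classes described above might look like the following; the ranges follow the bullets, while the field name len and the use of like inheritance for the subtypes are assumptions:

    <'
    struct packet {
        len : uint;
        -- VIP-level soft constraint: legal packet length
        keep soft len in [32..1500];
    };

    struct short_packet like packet {
        keep len in [32..63];       -- no conflict: len in [32..63]
    };

    struct long_packet like packet {
        keep len in [65..1500];     -- no conflict: len in [65..1500]
    };

    struct err1_packet like packet {
        keep len in [1501..2000];   -- conflicts: soft constraint disabled, len in [1501..2000]
    };

    struct err2_packet like packet {
        keep len >= 1500;           -- no conflict: soft constraint stays active, len is always 1500
    };
    '>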

With the example in Figure 1 in mind, there are three key items to remember when writing soft constraints:

  1. Soft constraints allow a VIP creator to specify rules that may be overridden by the test writer without causing a solver failure.
  2. Soft constraints are disabled if a conflict occurs, otherwise they will remain active and add to the constraint set for the randomization.
  3. Soft constraints that do not conflict may result in an unintended small range or fixed value. In these situations, the soft constraint should be disabled with an explicit call.

Soft and regular constraints are typically not present in the same file.  Often the test writer did not write the VIP constraints and is not familiar with them, which further complicates the situation.  For these reasons, a clear understanding of how soft constraints are disabled is critical when writing random tests.

SystemVerilog Constraints

In SystemVerilog, each constraint block is defined with a name, allowing derived classes to replace parent-class constraints or add constraints not already in the parent class.  The semantics are similar to adding variables or defining virtual functions in object-oriented methodologies.  If a constraint block in a derived class has the same name as one in the parent, it replaces the parent definition; if it has a different name, it adds to the existing constraints.  Several best practices should be observed when using constraints.  One is to use additive constraints when you want to layer extra constraints onto a class, for example limiting the length to create a test with small packets.  Another is to use a named replacement when you want to violate the protocol and disable one of the valid constraints; the replacement's name should match the name of the base-class constraint being disabled.

SystemVerilog code for a simple packet class with a random length is shown in Figure 2 with the following characteristics:

  • A constraint ensures the packet length is in the range [32..1500].
  • The short_packet constraint name is not present in the parent class, so valid_data_len and short_data_len are both applied, resulting in length [32..64].
  • The error_packet constraint in the derived class has the same name as the parent constraint.  Only one constraint remains, resulting in length [1501..5000], intentionally violating the protocol.


Figure 2 – SystemVerilog Ethernet Example
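A minimal SystemVerilog sketch consistent with these characteristics might look like the following; the constraint names valid_data_len and short_data_len follow the text, while the class names and the length field are assumptions:

    class packet;
      rand int unsigned length;
      // VIP-level rule: keep the length within the legal protocol range
      constraint valid_data_len { length inside {[32:1500]}; }
    endclass

    class short_packet extends packet;
      // New constraint name: adds to the parent rule, so length falls in [32:64]
      constraint short_data_len { length inside {[32:64]}; }
    endclass

    class error_packet extends packet;
      // Same name as the parent block: replaces it, so length falls in [1501:5000]
      constraint valid_data_len { length inside {[1501:5000]}; }
    endclass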

SystemVerilog allows individual constraint blocks to be enabled or disabled at runtime before the randomize() call occurs.  This control allows a test to enable or disable a constraint within a single object instance, as shown in Figure 3.


Figure 3 – Constraint Mode Control
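A minimal sketch of this per-instance control, reusing the packet class sketched above, might look like the following; the enclosing module and the failure check are assumptions:

    module test;
      initial begin
        packet p = new();

        // Disable the VIP length rule on this instance only, then randomize
        p.valid_data_len.constraint_mode(0);
        if (!p.randomize()) $error("randomize() failed");

        // Re-enable the rule for subsequent randomize() calls
        p.valid_data_len.constraint_mode(1);
      end
    endmodule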

SystemVerilog does not currently have a soft construct, so there is no direct equivalent to E.  However, a dist operator with a very heavily weighted range has similar semantics to a soft constraint, as shown in Figure 4.  When packet p1 is randomized, the length is almost always 32, matching the soft constraint semantics.  When p2 is randomized, the with {…} construct forces the length to be 100.  This is legal and does not conflict with the dist operator in the base class.


Figure 4 – Soft and Dist Constraints
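A minimal SystemVerilog sketch of this dist-based approximation might look like the following; the class name, constraint name, and weights are assumptions, while p1, p2, and the inline length of 100 follow the text:

    class soft_packet;
      rand int unsigned length;
      // Heavily weighted dist: length is almost always 32 unless a test overrides it
      constraint soft_len { length dist { 32 := 1000, [33:1500] :/ 1 }; }
    endclass

    module test_dist;
      initial begin
        soft_packet p1 = new();
        soft_packet p2 = new();

        void'(p1.randomize());                         // length is almost always 32
        void'(p2.randomize() with { length == 100; }); // legal: no conflict with the dist
      end
    endmodule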

OpenVera Default Constraints

One disadvantage of soft constraints is that the decision to disable a constraint occurs at runtime, instead of compile time.  If state variables are present, the soft constraints may be disabled on some randomization calls to the same object and not others, depending on the state variable values.  This variation in the active constraint set may degrade solver performance.  The OpenVera Language Reference Manual[3] defines a default constraint construct that is an alternative to soft constraints.  Synopsys VCS supports default constraints in OpenVera and SystemVerilog; however, this construct is not part of the P1800 SystemVerilog standard at this time.

A default constraint on a variable is disabled if the same variable is constrained elsewhere by another, non-default constraint.  The replacement is deterministic and independent of any values or conflicts that may occur, which potentially enables additional compile-time or runtime optimizations.  In Figure 5, the short_packet and medium_packet classes both override the default constraint, as each contains a constraint on the same variable.  These behave as expected, resulting in packets of length less than 64 and of length [100..500], respectively.  The long_packet class also overrides the default constraint, but in this case the length is now >64 with no upper bound.  If the packet code contains a dynamic array whose size is specified by len, a simulation memory failure can occur.  While default constraints allow a test writer to easily replace constraints in VIP code, care should be taken to ensure that the replacement constraints on the same variables form a complete set, in order to avoid unintended results.


Figure 5 – OpenVera Default Constraints
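An OpenVera-style sketch of the classes described above might look like the following; the class and constraint names are assumptions, and the exact declaration form of a default constraint (written here as "constraint default") is also an assumption based on this description and may differ from the OpenVera LRM:

    class packet {
        rand integer len;
        // VIP-level default constraint (declaration form assumed)
        constraint default valid_len { len in { 32:1500 }; }
    }

    class short_packet extends packet {
        constraint c_short  { len < 64; }           // len constrained here: default dropped, len < 64
    }

    class medium_packet extends packet {
        constraint c_medium { len in { 100:500 }; } // default dropped: len in [100..500]
    }

    class long_packet extends packet {
        constraint c_long   { len > 64; }           // default dropped: len > 64, with NO upper bound
    }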

Comparison of Soft and Default Constraints

Default constraints and soft constraints are used for similar reasons: to enable a VIP creator to provide a complete working environment, while allowing a test writer to easily extend and override the VIP class constraints.  However, the mechanism for replacing or disabling a constraint is quite different.  Soft constraints are disabled only when a constraint conflict occurs, whereas default constraints are overridden whenever the variable names match.  The problem with the soft constraint semantics is that the solver must first solve the constraints to find a contradiction, then override the soft constraint and try again, degrading performance.

Despite the performance penalty, this flawed methodology can be made to work.  However, it severely limits solver implementation choices and will likely involve complex backtracking algorithms.  This is one of the key reasons why Vera's default constraints use a deterministic override scheme that allows a default constraint to be disabled prior to solving the constraint network.  In addition, E users tend to create many soft constraints simply to initialize values, and these are never intended to be overridden by a conflict.  This matters when porting E code to OpenVera or SystemVerilog, as the soft construct is often used to set default values for all random variables in a class.  When used in this manner, a soft constraint and a default constraint are often interchangeable, as shown in Figures 6 and 7, although the soft constraint retains the performance penalty relative to the default constraint.


Figure 6 – E Soft Constraint
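A minimal e sketch of such a value-setting soft constraint might look like this; the struct and field names are assumptions:

    <'
    struct config_s {
        mode : uint;
        -- Default value; a test may override it with a stronger constraint
        keep soft mode == 0;
    };
    '>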


Figure 7 – OpenVera Default Constraint
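A corresponding OpenVera-style sketch might look like this, with the same caveat as in Figure 5 that the default constraint declaration form is an assumption:

    class config_c {
        rand integer mode;
        // Default value; overridden by any other constraint on mode
        constraint default d_mode { mode == 0; }
    }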

Closing Thoughts

Soft and default constraints can both have unintended consequences if not used correctly.  In particular, soft constraints remain enabled unless a conflict occurs, potentially resulting in a small range or fixed values.  Default constraint expressions, in contrast, are replaced entirely, and care must be taken to ensure a complete set of replacement constraints is specified, otherwise unbounded values may occur.  Understanding these and other differences between soft and default constraints is critical to enhancing validation performance and enabling design teams to stay on schedule.  Seemingly minor decisions in the creation and usage of constraints can impact the entire design process.  As verification engineers look to leverage constrained-random testing more aggressively in the face of rising challenges, the need to understand and optimize constraints will keep rising.  Fortunately, with a proper understanding of the nature of constraints and their use in a scalable constraint methodology, the validation obstacles around new generations of complex chips can be overcome.

AUTHOR BIOS

Benjamin Chen, benxchen@cisco.com

Benjamin Chen is a design and verification engineer at Cisco Systems.  Over the past eight years, he has contributed significantly to the successful tape-out of multiple silicon-proven ASICs ranging from 3M to 22M gates, in process technologies from 0.65µm to 0.18µm.  He holds patents on technologies used in switching platforms and network storage systems.

Harish Krishnamoorthy, hariskri@cisco.com

Harish Krishnamoorthy is a hardware engineer at Cisco Systems.  Over the past five years, Harish has specialized in embedded systems, software engineering and networking.  Prior to Cisco, Harish was a systems engineer at Solidus Networks.

Srinath Atluri, atluri@cisco.com

Srinath Atluri is an engineering manager at Cisco Systems, responsible for design and verification. Mr. Atluri manages the verification of multiple ASIC and SoC projects for the next generation network storage and switching platform, Nexus 7000, one of the most strategically important product lines at Cisco.

Nimalan Siva, nimalan@cisco.com

Nimalan Siva is a design and verification lead at Cisco.  He manages all aspects of the verification efforts of multi-million gate ASICs for network storage and switching platforms.  Prior to Cisco, Nimalan was a software and verification engineer at several leading companies including Force 10 Networks, Xerox PARC, Hewlett Packard and Nortel. 

Alexander Wakefield, alexw@synopsys.com

Alexander Wakefield is a Principal Corporate Applications Engineer at Synopsys. During the past 11 years at Synopsys, he has worked as a verification consultant on various processor and SoC projects. His primary focus is on constrained-random validation and testbench methodology.

Balamurugan Veluchamy, bmurugan@synopsys.com

Balamurugan Veluchamy is a Corporate Applications Engineer at Synopsys.  Over the past six years, Bala has guided key semiconductor companies on verification with SystemVerilog using VMM, optimal constrained random verification techniques, verification planning and management, and low power verification. 

REFERENCES

[1] IEEE P1800 SystemVerilog Language Reference Manual, www.ieee.org

[2] IEEE P1647 E Language Reference Manual, www.ieee.org

[3] OpenVera Language Reference Manual, www.open-vera.com
