Category: SOCcentral Feature Articles & Columns: Feature Articles
Simulation Coverage and Formal Verification: Unlikely Collaborators?  
Contributor: Synopsys, Inc.

January 13, 2012 -- Closing the coverage gap has been a long-standing challenge in simulation-based verification. Unlike simulation, formal verification set-up methodologies lack a completion measure such as coverage. This article describes a methodology that uses formal tools to automate coverage closure and proposes coverage as the completion measure for formal-verification set-ups.

A measure of completeness is a key factor in any verification project. Simulation, the workhorse of ASIC and SOC verification, relies on mature metrics such as code (line and condition) coverage and functional coverage. In a typical simulation-based verification environment, the bulk of the coverage is obtained relatively quickly, but the remaining few percent takes most of the work. Figure 1 illustrates this point.

Figure 1. Code coverage saturation as verification progresses.

One of the reasons for this phenomenon is the nature of simulation technology itself. Simulation is inherently non-exhaustive and dependent on the testbench and stimulus.

As estimated by a leading graphics chip company, this coverage-closure phase takes more than two to three person-months per unit per project. Tools are evolving that attempt to boost coverage by tweaking the existing simulation testbench, but they are far from mainstream deployment.

Formal technologies, on the other hand, are exhaustive in nature and independent of testbench- or user-specified stimuli. They can explore the entire state space of a design and determine which states are reachable and which cannot be reached under any circumstances. This makes formal tools a candidate for solving our coverage-closure problem.

Formal tools, even though they do not need a testbench, must be told what the legal stimulus is for a given design. This is accomplished with constraints written in assertion languages such as SystemVerilog Assertions (SVA) or Property Specification Language (PSL). The quality and accuracy of the results obtained from formal tools is directly correlated with the accuracy of the constraints.

If the constraints are too tight (referred to as over-constraint), some states that should be reachable become unreachable, masking any bugs in that part of the state space. Conversely, if the constraints are too loose (referred to as under-constraint), some states that should never be reached become reachable, making the engineer chase problems that do not exist in real life (false failures). The question then becomes: how do we determine whether the constraints are over- or under-constrained? This is where we need a measure of completeness like the one that exists on the simulation side. What if we could use the same measures, such as code and functional coverage, as metrics? The following sections describe methodologies that address these situations.
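To make the over- and under-constraint effects concrete, here is a minimal Python sketch. The two-input design, the spec rule ("a and b are never both 1"), and the three constraint functions are all invented for illustration; enumerating the input scenarios each constraint permits shows what each mistake hides or adds.

```python
from itertools import product

def reachable(constraint):
    """Enumerate the input scenarios permitted by a constraint."""
    return {(a, b) for a, b in product([0, 1], repeat=2) if constraint(a, b)}

# Assumed spec for this toy design: a and b are never both 1.
correct = reachable(lambda a, b: not (a and b))  # spec-accurate constraint
over    = reachable(lambda a, b: a == 0)         # too tight (over-constraint)
under   = reachable(lambda a, b: True)           # too loose (under-constraint)

# Over-constraint: a legal scenario disappears, masking any bug there.
print(correct - over)   # {(1, 0)} is legal but never explored
# Under-constraint: an illegal scenario appears, producing false failures.
print(under - correct)  # {(1, 1)} is illegal but gets explored
```

The same set-difference idea scales conceptually: anything in `correct - over` is state space a false proof could hide in, and anything in `under - correct` is where spurious counterexamples come from.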

Closing the coverage gap

Simulation can monitor and evaluate the coverage metrics of a design under test, during simulation, to determine which portions of the design have not been tested. The results of the analysis are reported so that users can see the shortcomings in their testbenches and improve tests as needed to obtain complete coverage. Code coverage falls into two categories:
  • Control-flow coverage, which tracks which lines and branches the flow of execution has taken through the code. Control-flow metrics include line coverage, condition coverage and branch coverage.
  • Value coverage, which monitors what values signals and expressions take during simulation. Value coverage metrics include toggle coverage and finite state machine (FSM) coverage.

Let's look at condition coverage in more detail with the following code snippet:

The expression (a&&b)||(c&&d) controls which branch of the "if" statement is executed. The expression itself has two sub-expressions, (a&&b) and (c&&d). Based on the simulation test run, here is a sample of the coverage report that is generated.

As can be seen, out of a possible 9 scenarios for this condition, only 3 are covered, giving 33.33% coverage. The "Not Covered" scenarios indicate that certain combinations of the variables a, b, c and d were never exercised during the simulation tests, but they do not tell us whether those combinations can ever occur. Consider the scenario a=1 and b=1, which is "Not Covered." It is possible that, by design, this scenario can never occur, or it could be that the testbench is preventing it from happening. If it is the former, the user can exclude the scenario from the coverage analysis and effectively boost the coverage.

As an example, see the revised coverage report below:

If we eliminate the "Unreachable" scenarios in the above report, 3 of the remaining 7 scenarios are covered, effectively increasing the coverage to 42.86%. In practice, making this determination takes a significant amount of manual labor, depending on the complexity of the design.
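The coverage arithmetic above can be checked with a few lines of Python. The scenario counts come from the text; the per-scenario report rows themselves are not reproduced here.

```python
# Condition-coverage bookkeeping from the article's example:
# 9 scenarios total, 3 covered, 2 later proven unreachable.
total, covered, unreachable = 9, 3, 2

raw = 100.0 * covered / total                       # before exclusions
adjusted = 100.0 * covered / (total - unreachable)  # after excluding unreachable

print(f"{raw:.2f}%")       # 33.33%
print(f"{adjusted:.2f}%")  # 42.86%
```

Note that no new scenario was hit; the percentage rises only because the denominator now counts just the scenarios that are actually possible.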

For a leading graphics chip company, the coverage-closure effort consumes hundreds of person-weeks per chip. The first step is to manually analyze reachability, often involving tracing through the design code and creating waivers or exclusions. This task is tedious and demotivating for engineers. The next task is writing new tests for the coverage points determined to be reachable in the first step. Together, these steps typically take 15 person-weeks for one block. A typical graphics chip has 30 such blocks, so by a conservative estimate, 450 person-weeks are spent on the coverage-closure phase of a typical verification cycle for a single chip.

What if we could determine this automatically? Not only would we have better coverage completion, but we would also save valuable resources that would otherwise be spent doing this analysis manually.

Formal-verification tools can perform exactly this kind of analysis. Consider the scenario where a=1 and b=1 in the above coverage report: simulation was unable to execute that path, so it was marked "Not Covered."

Let's say we expressed that scenario as a property stating that "a && b" can never be "1."

We then ask the formal tool whether that property can be proven or falsified, and say the tool comes back with the answer "proven."

We can then conclude that the scenario that is "Not Covered" is indeed "Unreachable." As a result, we can exclude it from the coverage analysis and improve the coverage.
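The reachability question the formal tool answers can be illustrated with exhaustive state-space search. The saturating counter below is a made-up toy design, and breadth-first search stands in for the formal engine; a real tool uses far more sophisticated algorithms, but the verdict has the same exhaustive character.

```python
from collections import deque

def next_states(s):
    """Toy sequential design: a counter that saturates at 2,
    so state 3 is unreachable by construction."""
    return {min(s + 1, 2), s}

def is_reachable(target, init=0):
    """Breadth-first search over the full state space."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if s == target:
            return True
        for n in next_states(s) - seen:
            seen.add(n)
            frontier.append(n)
    return False  # search was exhaustive: target is proven unreachable

print(is_reachable(2))  # True  -> the scenario can be covered
print(is_reachable(3))  # False -> proven "Unreachable"
```

Because the search visits every reachable state, a `False` answer is a proof, not a failure to find a test; this is exactly the property that simulation lacks.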

The focus now shifts to how one comes up with properties for each of the "Not Covered" scenarios. Practically, it is impossible to write such a huge number of properties by hand. What we need is some kind of link that can read in the coverage database, extract the "Not Covered" scenarios, formulate properties that a formal tool can understand, and report whether each is proven. If a property is proven, we want to annotate the original "Not Covered" scenario as "Unreachable" in the coverage database. Figure 2 shows this flow:

Figure 2. Block diagram of the formal-assisted coverage-closure flow.

To make this work, both the simulation and formal tools must use the same semantics for instrumenting the code coverage. The coverage database also must be common to both the simulation and formal tools with robust read-write APIs.
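Assuming such a shared coverage database with read-write access, the annotation loop of Figure 2 might be sketched as follows. Everything here is hypothetical for illustration: `prove_unreachable` stands in for the formal engine, and the dictionary-of-scenarios encoding is invented, not any real coverage API.

```python
def prove_unreachable(scenario):
    # Placeholder for the formal engine: a real flow would hand the
    # tool a property stating the scenario can never occur and check
    # for a "proven" result. Here one scenario is assumed unreachable.
    return scenario == ("a", 1, "b", 1)

# Invented stand-in for a coverage database read via its API.
coverage_db = {
    ("a", 1, "b", 1): "Not Covered",
    ("c", 1, "d", 0): "Not Covered",
    ("a", 0, "b", 0): "Covered",
}

# Extract "Not Covered" scenarios, query the engine, write back results.
for scenario, status in coverage_db.items():
    if status == "Not Covered" and prove_unreachable(scenario):
        coverage_db[scenario] = "Unreachable"

print(coverage_db[("a", 1, "b", 1)])  # Unreachable
print(coverage_db[("c", 1, "d", 0)])  # Not Covered
```

Scenarios the engine cannot prove unreachable stay "Not Covered" and remain targets for new tests, which is the division of labor the flow relies on.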

This flow was used for coverage closure on some of the blocks at the same graphics chip company referenced earlier.

For one of the blocks, with a complexity of 400,000 sequential elements and 10 million gates, 25,000 condition-coverage objects were "Not Covered" after running the full regressions. Using the formal-assisted coverage-closure flow, 5,000 of these were determined to be "Unreachable." The analysis took 92 hours, mostly machine time.

On another block with a complexity of 23 million gates, 14,000 condition coverage objects were "Not Covered" by the full regression suite. The formal-based coverage-closure flow was able to determine 4,000 of those as "Unreachable" in 120 hours of run time.

By utilizing a formal-assisted coverage-closure flow, the time spent in coverage closure was dramatically reduced from 15 person-weeks per block to a couple of hundred hours of compute time.

This is not the only usage of linking the simulation coverage to a formal tool. Let's look at how coverage helps increase confidence in the formal set-up.

Using coverage as a completion measure for formal verification setup

A measure of completion is an important aspect of any verification methodology. For simulation, proven methodologies such as the Verification Methodology Manual (VMM) or the Universal Verification Methodology (UVM) provide well-defined metrics to measure completeness. On the formal-verification side, no standard methodology provides a notion of completion metrics. Constraint development is a critical part of formal-verification set-up, and improper constraints can lead to a disastrous formal-verification outcome. The lack of a completion measure leaves the user unaware of this situation, creating a barrier to widespread deployment of formal tools. Typical concerns for formal constraint development include:
  • Correctness - Are the constraints over-constraining, limiting the legal state space explored by the formal tool? A side effect of over-constraining is false proofs, which mask bugs in the design being verified.
  • Completeness - Are the constraints under-constraining, meaning the formal tool explores state space beyond what the design supports? A side effect of under-constraining is false bugs, causing verification engineers to spend resources debugging something that is not a true bug.
  • Conflict - When constraints conflict with each other, a formal tool typically resolves them by inferring additional constraints, which can cause over-constraint. The results cannot be trusted in this case.

What we need here is some kind of completion metric that we can use to validate our constraint set-up. What better metric than proven code coverage? This only works if the formal tool can understand the code-coverage instrumentation.

We can adapt the technology described in the earlier section to collect coverage metrics during the formal run. Analysis of the coverage status can then expose whether the constraints suffer from any of the concerns mentioned above.

Let's say a condition expression that describes a "fifo full" condition is reported as "Unreachable." This may indicate a potential "over-constraint," as one would expect the design to handle the "fifo full" condition and activate the back pressure logic on the input interface.

Conversely, if the expression indicating "fifo overflow" is reached, it may indicate a potential under-constraint, as the constraints that model the response to "fifo full" back pressure may be missing.
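The two fifo examples above suggest a simple triage rule, sketched here in Python. The expression names and status strings are illustrative stand-ins, not actual tool output, and a real methodology would choose its sentinel expressions per design.

```python
def diagnose(status):
    """Map the coverage status of sentinel expressions to
    constraint-health warnings, per the fifo examples."""
    warnings = []
    # A condition the design must handle should be reachable;
    # if formal analysis says it is not, suspect over-constraint.
    if status.get("fifo_full") == "Unreachable":
        warnings.append("possible over-constraint: fifo_full never reached")
    # A condition the spec forbids should be unreachable;
    # if it is reached, suspect under-constraint.
    if status.get("fifo_overflow") == "Covered":
        warnings.append("possible under-constraint: fifo_overflow reached")
    return warnings

print(diagnose({"fifo_full": "Unreachable", "fifo_overflow": "Covered"}))
```

An empty warning list does not prove the constraints are perfect, but a non-empty one gives the engineer a concrete place to start reviewing them.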

In this way, simulation coverage linked to the formal verification can provide a measure of completeness for the formal-verification set-up.

In summary

Closing the gap between coverage numbers obtained through simulation and the desired coverage goals is a time-consuming but vital task for verification completion. Formal technology can automate this process if it can understand the coverage instrumentation. Having a common coverage database with robust APIs is a key enabler for this methodology.

Having completion metrics for a formal-verification set-up is a key requirement for having confidence in the results obtained. Coverage, being a proven and well-understood completion metric, seems a natural fit. Analysis of the coverage numbers obtained through a formal tool can provide vital insight into whether the formal-verification constraints are correct, complete and free of conflicts.

Mandar Munishwar

Mandar Munishwar has been working at Synopsys for more than nine years in the role of corporate application engineer and is currently focused on providing support for formal-verification technologies. He has 19 years of experience verifying complex SOC designs using state-of-the-art verification technologies such as constrained-random simulation, assertions and formal. Prior to joining Synopsys, he had brief stints at Cadence Design Systems, Inc. and Mentor Graphics Corp. He has a bachelor's degree in computer engineering from Walchand College of Engineering (Shivaji University, Kolhapur, India) and has co-authored a patent on the integration of coverage and formal technologies.

Go to the Synopsys, Inc. website to learn more.
