April 15, 2005 -- Formal analysis has come a long way since its early days. Formal tools rely on sophisticated algorithms that validate whether a design operates correctly according to a specification captured in the form of assertions. One of the key advantages of formal analysis is that it does not require testbenches to run and therefore can be used to verify RTL source very early in the design process. This enables designers to find bugs quickly without the need to develop large sets of test vectors.
Formal analysis has evolved significantly, both in its capability and adoption, from the early days of "model checking." Figure 1 illustrates the evolution of the formal analysis tools and their adoption in commercial design projects over the past several decades.
Figure 1. Evolution and Adoption of Formal Analysis
As seen in Figure 1, while developments such as engine technology, efficient algorithms, and standard assertion-based languages improved the usability of formal analysis tools, mainstream adoption was not achieved until recently. In this article we will discuss the evolution of formal analysis tools and how the different stages of development, combined with the coherent methodology available today, have increased customer adoption.
Evolution of Formal Engines
Academic research and development in the area of formal analysis have been around for a long time. The formal techniques and tools of the 1970s were rudimentary, based on solvers that were used to automate the otherwise manual formal verification process. As the size of designs and the complexity of the verification process grew, interest in utilizing formal analysis tools within industrial settings increased. Tremendous research activity, mainly within academia and a few companies, yielded constant improvements in formal algorithms and tools.
The first formal analysis tools were explicit state model checkers. These tools used state enumeration techniques that required explicit representation of each state of the design and therefore were able to handle only limited design sizes. A big technology leap came in the late 1980s with the introduction of symbolic techniques. BDDs (Binary Decision Diagrams) were used to symbolically represent the states and the transitions of the design. Symbolic representation enabled model checkers to handle much larger design sizes. Symbolic Model Verifier (SMV), the first industrial strength model checker, was developed at Carnegie Mellon University and was the basis for several early adopters' tools used in a few leading semiconductor companies.
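The explicit state enumeration idea can be sketched in a few lines: visit every reachable state of a (tiny) model one at a time and test a safety property on each. This is only an illustrative sketch -- the model, function names, and property below are invented for this example, and real model checkers of that era handled temporal-logic properties over much richer models.

```python
from collections import deque

def check_safety(initial_states, next_states, is_safe):
    """Explicit-state checking of a safety property: enumerate each
    reachable state individually and test the property on it."""
    visited = set(initial_states)
    frontier = deque(initial_states)
    while frontier:
        state = frontier.popleft()
        if not is_safe(state):
            return False, state          # counterexample state
        for succ in next_states(state):
            if succ not in visited:
                visited.add(succ)
                frontier.append(succ)
    return True, None                    # property holds on every reachable state

# Toy model: a mod-8 counter starting at 0; property: value 5 is unreachable.
ok, cex = check_safety([0], lambda s: [(s + 2) % 8], lambda s: s != 5)
# Stepping by 2 reaches only even values, so the property holds (ok is True).
ok2, cex2 = check_safety([0], lambda s: [(s + 1) % 8, (s + 2) % 8], lambda s: s != 5)
# Allowing +1 steps makes 5 reachable: ok2 is False, with counterexample cex2 == 5.
```

The `visited` set here is exactly the explicit state list whose size explodes on real designs; the symbolic BDD techniques described above replace it with a compact symbolic representation of whole sets of states.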
Subsequently there was a period of active research that explored various permutations of the BDD algorithms. The next big technology leap occurred with the introduction of satisfiability (SAT) solvers and automatic test pattern generation (ATPG) engines. SAT-based tools appeared in the mid-1990s, leading to interpolation-based methods in the early 2000s. The first commercial adoption of ATPG in formal analysis appeared in the latter part of the 1990s. Various related techniques were also developed, including BMC (Bounded Model Checking) and the incorporation of more advanced abstraction techniques, further increasing the size of designs to which formal engines could be applied.
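Bounded model checking, mentioned above, can be illustrated with a small sketch: unroll the design's transition function for up to k cycles and search for an input sequence that drives it into a bad state. The model, names, and bound below are invented for this example, and a brute-force search over input sequences stands in for the SAT solver that a real BMC engine would use on the unrolled Boolean formula.

```python
from itertools import product

def bmc(init, step, bad, inputs, k):
    """Bounded model checking sketch: unroll the design up to k cycles
    and look for an input sequence that reaches a 'bad' state.
    (A real BMC tool encodes the unrolling as a Boolean formula and
    hands it to a SAT solver; brute force stands in for that here.)"""
    for depth in range(1, k + 1):
        for seq in product(inputs, repeat=depth):
            state = init
            for inp in seq:
                state = step(state, inp)
            if bad(state):
                return seq               # counterexample input sequence
    return None                          # no violation within the bound k

# Toy model: a mod-8 counter that adds 1 or 3 each cycle.
# Is the value 6 reachable within 4 cycles?
cex = bmc(0, lambda s, i: (s + i) % 8, lambda s: s == 6, inputs=(1, 3), k=4)
# Yes: the input sequence (3, 3) drives the counter to 6 in two cycles.
```

Note that a `None` result only shows the property holds up to the bound k, which is why BMC is primarily a bug-finding technique rather than a full proof method.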
Early adopters of formal analysis tools used their expertise and intimate knowledge of the design to manually apply abstraction techniques. Today, state-of-the-art commercial formal analysis tools incorporate all the above technologies and more. Since each technology has its own strengths and weaknesses, some state-of-the-art commercial tools include components that automate the process of choosing the right engine for a specific verification task.
While the improvements in the formal analysis engine technology were necessary to facilitate adoption, they were not sufficient to provide enough incentive for companies to adopt formal analysis into their design flows. However, this reluctance to adopt formal analysis tools diminished with the advent of property-based (assertion) languages and with the advancements in methodology.
Standardization of Property-Based Languages
The use of formal analysis tools requires specification for the design under test using a formal property specification language. Until recently, languages used to specify design properties had largely been proprietary, and supported only in a few large companies. Much of the work on applying formal analysis to electronic design occurred within internal research groups, and each group developed its own notation for specifying properties. One notable exception was the temporal logic languages LTL (Linear Time Logic) and CTL (Computation Tree Logic) which were used by early adopters of model checking tools. These languages served as the basis for some of the proprietary languages that were developed later.
The lack of well-defined, expressive, and usable standard property languages had been a great obstacle to the wide adoption of formal analysis. Companies were reluctant to invest in a new methodology that used a proprietary language, and designers were intimidated by the scientific notation and limited usability of the academic languages.
In the late 1990s, an effort to develop a standard language for formal verification was underway in Accellera's Formal Verification Technical Committee (FVTC). The FVTC was chartered to develop a standard language for formal verification, but the group wisely decided to expand the scope to include simulation as well, so that the language it developed could be used in a variety of verification tools and flows. Various proprietary languages were donated to the verification committee and eventually IBM's "Sugar" language was selected as the basis for what has become known as the Accellera Property Specification Language, or PSL.
In addition to PSL, Accellera developed the SystemVerilog standard, which has its own assertion language, referred to as SystemVerilog Assertions, or SVA. SVA is very similar to PSL in many ways; the two languages share a common semantic core and many syntactic features. In addition to these two assertion languages, Accellera also developed the Open Verification Library (OVL), a different approach to property specification that uses library elements. This standardization of property/assertion languages and libraries for Assertion-Based Verification (ABV) has been a major driver for the development and adoption of ABV methodologies in design flows.
Advancements in Methodology
The advent of assertion-based verification methodology and the creation of standards for assertion-based specification enabled designers to create assertions easily. Almost all tools offered today, including formal analysis, simulation, emulation, and acceleration tools, support PSL, SVA, and OVL. The use of assertions in simulation, starting with OVL and more recently with PSL and SVA, is accelerating rapidly as users realize the tremendous benefits of having assertions actively monitor their RTL code for errors.
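As a rough illustration of what an assertion monitor does during simulation, the sketch below checks one common property shape over a recorded trace: every request must be acknowledged within a fixed window, in the spirit of an SVA implication such as `req |-> ##[1:3] ack`. The trace format and function name are invented for this example; real simulators evaluate compiled assertions on live signals rather than recorded lists.

```python
def check_req_ack(trace, window=3):
    """Trace monitor for the property 'every req is followed by an ack
    within `window` cycles'. `trace` is a list of (req, ack) samples,
    one pair per clock cycle."""
    failures = []
    for t, (req, _) in enumerate(trace):
        if req:
            # Look for an ack somewhere in the next `window` cycles.
            if not any(ack for _, ack in trace[t + 1:t + 1 + window]):
                failures.append(t)       # assertion fired at cycle t
    return failures

good = [(1, 0), (0, 0), (0, 1), (1, 0), (0, 1)]
bad  = [(1, 0), (0, 0), (0, 0), (0, 0)]
check_req_ack(good)   # -> []  (property holds on this trace)
check_req_ack(bad)    # -> [0] (the req at cycle 0 is never acknowledged)
```

The same property, written once in a standard language, is what lets simulation flag the `bad` trace at runtime while a formal tool exhaustively proves that no such trace exists.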
With increased usage of assertions, companies have more incentive to consider an additional step of investment in formal analysis. Having crossed this initial hurdle to consider investment in formal analysis, designers are now asking themselves questions such as:
- Who should write the assertions and run the formal analysis tools?
- When in the design flow should the formal analysis tool be applied?
- What type and size of designs are most suitable for formal analysis?
These and similar questions are all attempts to answer two fundamental questions: what are the right use models for formal analysis, and what is its value proposition?
For years, the lack of definitive answers to these types of questions prevented the wide adoption of formal analysis. In the early days most of the focus was on expert users working on difficult problems and trying to fully prove the correctness of some parts of the design. While this mode of operation has been important for some companies in some critical design areas, for most of the industry this type of usage is not cost effective and cannot be proliferated. While the technology showed some promise, simulation remained the predominant verification method in most applications. The lack of a coherent methodology that demonstrated consistently good results at low investment prevented formal analysis from gaining large-scale acceptance in design flows.
After experimenting with a variety of formal analysis use models, the industry has converged on some characteristics of a productive formal analysis methodology. The first element of this methodology is that formal analysis must be used in the context of a comprehensive Assertion-Based Verification methodology which includes formal analysis and simulation. This approach can be extended to acceleration and to emulation as well. Using different verification techniques together in a flow allows them to complement each other. The second element is that formal analysis should be used on parts of the design that are best suited for it. Knowing where formal analysis can provide the most value is therefore an important consideration for adopting it.
Formal analysis applied at the module and block level by designers is an extremely powerful design and verification methodology. In this methodology, formal analysis is applied early, before random and directed test suites are available, with designers writing assertions and quickly verifying their own RTL. With this approach, bugs are detected much earlier in the design cycle, when they are easier and significantly cheaper to fix. In addition, corner-case bugs that are difficult to find using simulation are discovered early by formal analysis, reducing the risk of re-spins. The assertions written by the designer also serve as good documentation for the verification engineers, who can leverage the same assertions for cluster-level and chip-level verification using simulation, acceleration, and emulation.
Last, but not least, the current generation of formal analysis tools incorporates many usability and productivity features. The tools provide excellent debug capabilities and a user interface which closely resembles the debug capability of simulation tools. This aspect has helped designers to easily adopt formal analysis tools without making sacrifices on the usability front. In fact, the modern formal analysis tools look and feel like an integrated design environment that combines both formal analysis and simulation capabilities. To further increase productivity, most tools incorporate a rich set of automatically extracted structural and functional checks. This allows the users to extract significant value even without writing assertions.
The use of assertions in standard languages and the ability to leverage these assertions in the well established simulation methodologies opened the door for a wider adoption of formal analysis. At the same time, increased focus on usability and productivity yielded methodological approaches that are suitable for industrial settings.
Formal analysis has undergone a major metamorphosis since its early days. Not only has the technology matured, but so have the use models, evolving from a set of niche uses by a group of experts to mainstream adoption where such tools are used in standard design flows. Besides the advances in technology and the development of standard languages, a comprehensive methodology has played a key role in enabling successful adoption and deployment of formal analysis products. Moreover, the assertion languages that apply to both formal analysis and simulation have enabled formal analysis to become a natural extension of existing simulation-based verification flows. Design teams in leading-edge companies now use formal analysis tools to improve both quality and productivity. Formal analysis today has gained recognition as one of the best ways to address functional verification challenges faced by most companies, and is actively deployed into production flows. Formal analysis has indeed come a long way.
By Avner Landver.
This article was co-authored by Alok Jain, Erich Marschner, and Bob Kurshan.
Landver is the Incisive Formal Group Director, Jain is the Incisive Formal Sr. Architect, Marschner is the Incisive platform Sr. Architect, and Kurshan is a Fellow at Cadence Design Systems, Inc.
Go to the Cadence Design Systems, Inc. website to learn more.