ACM Computing Surveys 28A(4), December 1996, http://www.acm.org/surveys/1996/Formatting/. Copyright © 1996 by the Association for Computing Machinery, Inc. See the permissions statement below.
Abstract: Significant advances have been made since people first started building automated verification tools in the mid-seventies. This note briefly summarizes some of the steps that were taken, and considers what we need to do to ensure that the tools we have today can become a permanent part of the software designer's toolkit.
Categories and Subject Descriptors: D.2.4 [Software Engineering]: Program Verification - Validation; F.3.1 [Logics and Meanings of Programs]: Specifying and Verifying and Reasoning about Programs - Specification Techniques;
General Terms: Verification, Model Checking, Automata Theory
Additional Key Words and Phrases: Software design verification, Formal models of software designs, Reachability analysis
The first tools for the automated verification of asynchronous process systems were written almost two decades ago. The best known of these tools are Jan Hajek's system Approver [Hajek 1978] and Colin West's implementations of his perturbation analysis technique [West 1978]. Both Hajek's and West's systems allowed for automated proofs of simple predefined safety properties of relatively small protocol specifications. Hajek's tool also supported a small set of standard liveness properties. It is, for instance, little known that the Approver tool was used to prove the correctness of all protocol descriptions in the first edition of Tanenbaum's well-known Computer Networks book before it went to press. [HajekTanenbaum 1995] It is regrettable that the Approver system did not influence the developments in formal verification more strongly; the algorithms on which it was based have never been published.
Without yet knowing about the earlier work of Hajek and West, I wrote a first on-the-fly verification system, Pan, which we used to prove safety properties for an experimental version of a telephone switch called Tpc (the phone company) and for a range of protocol specifications at Bell Labs in mid and late 1980. [Holzmann 1981] All of these early tools were based on the search of a reachability graph, using either depth-first or breadth-first search; the Pan system was, for reasons of efficiency, based from the start on on-the-fly verification techniques. In retrospect it can be seen that earlier attempts to perform manual reachability analyses of abstract state machine models, pioneered by Gregor Bochmann and Andre Danthine, were based on comparable techniques. Even the attempts to construct so-called token machines that could simulate the execution of Petri Net models can be seen as comparable. It was quickly pointed out at the time, though, most notably by Hajek in his early papers, that the manual techniques were unreliable, and that the published results obtained with them contained embarrassing errors that had escaped the scrutiny of authors and reviewers alike. We also now know that the correctness of vanilla Petri Net models is undecidable, so that they would need further simplification to become eligible as a formal description technique for automated verification. To realize the ideal of automated verification, a number of problems had to be solved first: the development of an effective notation, or underlying model, for both systems and correctness properties, and the development of efficient search algorithms.
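The kind of search these early tools performed can be sketched in a few lines. The following is a minimal illustration, not the code of any of the tools mentioned: an on-the-fly depth-first search of the reachability graph that checks a safety property as states are generated, returning a counterexample path if an error state is reachable. The transition system here (a bounded counter) is an invented toy example.

```python
def check_safety(init, successors, is_error):
    """On-the-fly depth-first search of the reachability graph.
    States are generated only as they are visited, and the search
    stops at the first reachable error state."""
    seen = {init}
    stack = [(init, [init])]          # (state, path from the initial state)
    while stack:
        state, path = stack.pop()
        if is_error(state):
            return path               # counterexample: a path to an error state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                stack.append((nxt, path + [nxt]))
    return None                       # no reachable error state: property holds

# Toy example: a counter that may step up to 5, but must never exceed 3.
trace = check_safety(
    init=0,
    successors=lambda s: [s + 1] if s < 5 else [],
    is_error=lambda s: s > 3,
)
print(trace)  # -> [0, 1, 2, 3, 4]
```

Storing the set of visited states (`seen`) is what bounds the search; the memory this set consumes is exactly the resource that the later algorithmic and memory management improvements mentioned below attack.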
Between then and now lie two decades of sharpening our insight into the theory of automated verification, and of designing and building efficient verification algorithms and tools. Seminal steps toward the on-the-fly model checking techniques that are today available in tools such as Spin [Holzmann 1991] were the introduction of logic model checking in the early eighties, independently by Allen Emerson and Ed Clarke [ClarkeEmerson 1981] and by Jean-Pierre Queille and Joseph Sifakis [QuielleSifakis 1981], the development of the automata-theoretic framework for verification, jointly by Moshe Vardi and Pierre Wolper [VardiWolper 1986], and great improvements in efficient graph search algorithms and memory management techniques by many others. The more recent introduction of BDDs in hardware model checking, and of aggressive partial order reduction techniques in software verification, has broadened the practical applicability of automated verification techniques still further.
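In the automata-theoretic framework, a violation of a liveness property shows up as a reachable accepting cycle in the product of the system with an automaton for the negated property. One well-known on-the-fly algorithm for detecting such cycles is nested depth-first search. The sketch below is an illustration under assumed inputs, not the implementation of any particular tool: the product automaton is given as a successor function and an acceptance predicate, and the four-state graph is an invented toy example.

```python
def has_accepting_cycle(init, successors, accepting):
    """Nested depth-first search: the outer DFS explores the product
    automaton; when it backtracks from an accepting state, an inner
    DFS checks whether that state can reach itself (a lasso)."""
    outer_seen, inner_seen = set(), set()

    def inner(s, seed):
        # Inner DFS: look for a path leading back to the accepting seed state.
        for t in successors(s):
            if t == seed:
                return True
            if t not in inner_seen:
                inner_seen.add(t)
                if inner(t, seed):
                    return True
        return False

    def outer(s):
        outer_seen.add(s)
        for t in successors(s):
            if t not in outer_seen and outer(t):
                return True
        # Postorder: start the inner search only when backtracking
        # from an accepting state.
        return accepting(s) and inner(s, s)

    return outer(init)

# Toy product automaton: state 2 is accepting and lies on the cycle 2 -> 3 -> 2.
succ = {0: [1], 1: [2], 2: [3], 3: [2]}
print(has_accepting_cycle(0, lambda s: succ[s], lambda s: s == 2))  # -> True
```

The appeal of the nested scheme is that both searches share the same hash table discipline as a plain safety check, which is one reason the automata-theoretic view fits on-the-fly tools so well.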
The practical importance of the speedup of the basic hardware that our tools run on should also not be underestimated. The software that existed two decades ago would today run at least three orders of magnitude faster, with access to at least two orders of magnitude more memory for the verification task. Even without all the other notable improvements that have occurred, this alone would have made a significant difference to the practical applicability of our tools.
A wide range of sophisticated verification tools is now available. There are several variants of logic model checking tools, traditionally focused on hardware verification tasks. Relatively little attention has so far been given to the verification of asynchronous systems as a separate problem. Spin and its predecessors, going back to its earliest ancestor Pan, represent attempts to build a highly efficient on-the-fly model checking tool for asynchronous systems. These tools are powerful enough to perform full LTL (linear temporal logic) model checking for very substantial industrial applications, such as communications protocol design and distributed systems design. [GregHolPel 1997]
In the last two decades, a sound formal framework for the construction of effective tools has been put in place. The remarkable improvements in hardware speed and memory size are predicted to continue for the foreseeable future. It is therefore likely that well-designed formal verification tools will earn their place among the central tools of every designer, whether of hardware or software, and that the proper use of these tools will become a standard part of every college curriculum. All this presumes that the tools we build or refine are designed with the ultimate user in mind. To increase the impact that our verification tools can have in practice, they must be usable by non-experts, not just by experts in formal verification. This places high demands on user-interface design, on the feedback about verification results given to the user, and on the choice of specification languages and notations for expressing correctness requirements.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Publications Dept, ACM Inc., fax +1 (212) 869-0481, or firstname.lastname@example.org.