Rigorous Software Development CSCI-GA 3033-011 Instructor: Thomas Wies Spring 2012 Lecture 11.

Semantics of Programming Languages
Denotational Semantics
– Meaning of a program is defined as the mathematical object it computes (e.g., partial functions).
– Example: Abstract Interpretation
Axiomatic Semantics
– Meaning of a program is defined in terms of its effect on the truth of logical assertions.
– Example: Hoare Logic
(Structural) Operational Semantics
– Meaning of a program is defined by formalizing the individual computation steps of the program.
– Example: Labeled Transition Systems

IMP: A Simple Imperative Language
An IMP program:
p := 0;
x := 0;
while x < n do
  x := x + 1;
  p := p + m

Syntax of IMP Commands
Commands (Com):
c ::= skip | x := e | c1 ; c2 | if b then c1 else c2 | while b do c
Notes:
– The typing rules have been embedded in the syntax definition.
– Other parts are not context-free and need to be checked separately (e.g., all variables are declared).
– Commands contain all the side-effects in the language.
– Missing: references, function calls, …

Labeled Transition Systems
A labeled transition system (LTS) is a structure LTS = (Q, Act, →) where
– Q is a set of states,
– Act is a set of actions,
– → ⊆ Q × Act × Q is a transition relation.
We write q →a q' for (q, a, q') ∈ →.

Operational Semantics of IMP
The transition relation q →c q' (executing command c in state q ends in state q') is defined by the following rules:
– If ⟨e, q⟩ ⇓ n, then q →(x := e) q ++ {x ↦ n}.
– q →skip q.
– If q →c1 q' and q' →c2 q'', then q →(c1 ; c2) q''.
– If ⟨b, q⟩ ⇓ true and q →c1 q', then q →(if b then c1 else c2) q'.
– If ⟨b, q⟩ ⇓ false and q →c2 q', then q →(if b then c1 else c2) q'.
– If ⟨b, q⟩ ⇓ true, q →c q', and q' →(while b do c) q'', then q →(while b do c) q''.
– If ⟨b, q⟩ ⇓ false, then q →(while b do c) q.
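To make these rules concrete, here is a small illustrative Python sketch of a big-step interpreter (not part of the original slides): states are dicts from variable names to integers, commands are nested tuples, and expressions are Python callables from states to values; all names are illustrative.

# States are dicts mapping variable names to integers; commands are tuples:
#   ('skip',), ('assign', x, e), ('seq', c1, c2), ('if', b, c1, c2), ('while', b, c)
# where e and b are Python callables taking a state and returning a value.

def run(c, q):
    """Return the final state q' such that q evolves to q' under command c (big-step)."""
    tag = c[0]
    if tag == 'skip':
        return q
    if tag == 'assign':
        _, x, e = c
        return {**q, x: e(q)}                  # q ++ {x |-> n} where <e, q> evaluates to n
    if tag == 'seq':
        _, c1, c2 = c
        return run(c2, run(c1, q))
    if tag == 'if':
        _, b, c1, c2 = c
        return run(c1 if b(q) else c2, q)
    if tag == 'while':
        _, b, body = c
        while b(q):                            # unfolds the two while rules
            q = run(body, q)
        return q
    raise ValueError(f"unknown command {tag}")

# The running example: p := 0; x := 0; while x < n do (x := x + 1; p := p + m)
prog = ('seq', ('assign', 'p', lambda q: 0),
        ('seq', ('assign', 'x', lambda q: 0),
         ('while', lambda q: q['x'] < q['n'],
          ('seq', ('assign', 'x', lambda q: q['x'] + 1),
                  ('assign', 'p', lambda q: q['p'] + q['m'])))))
print(run(prog, {'n': 3, 'm': 4, 'p': 0, 'x': 0}))     # p ends up as 12 = 3 * 4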

Axiomatic Semantics
An axiomatic semantics consists of:
– a language for stating assertions about programs;
– rules for establishing the truth of assertions.
Some typical kinds of assertions:
– This program terminates.
– If this program terminates, the variables x and y have the same value throughout the execution of the program.
– The array accesses are within the array bounds.
Some typical languages of assertions:
– First-order logic
– Other logics (temporal, linear)
– Special-purpose specification languages (Z, Larch, JML)

Assertions for IMP
The assertions we make about IMP programs are of the form {A} c {B}, with the meaning that:
– if A holds in state q and q →c q',
– then B holds in q'.
A is the pre-condition and B is the post-condition.
For example, { y ≤ x } z := x; z := z + 1 { y < z } is a valid assertion.
These are called Hoare triples or Hoare assertions.

Assertions for IMP
{A} c {B} is a partial correctness assertion. It does not imply termination of c.
[A] c [B] is a total correctness assertion, meaning that
– if A holds in state q,
– then there exists q' such that q →c q' and B holds in state q'.
Now let's be more formal:
– formalize the language of assertions, A and B;
– say when an assertion holds in a state;
– give rules for deriving Hoare triples.

The Assertion Language
We use first-order predicate logic with IMP expressions:
A ::= true | false | e1 = e2 | e1 ≥ e2 | A1 ∧ A2 | A1 ∨ A2 | A1 ⇒ A2 | ∀x.A | ∃x.A
Note that we are somewhat sloppy and mix the logical variables and the program variables.
Implicitly, for us all IMP variables range over integers.
All IMP boolean expressions are also assertions.

Semantics of Assertions
We introduced a language of assertions; we now need to assign meanings to assertions.
We use the notation q ⊨ A to say that an assertion holds in a given state.
– This is well-defined when q is defined on all variables occurring in A.
The ⊨ judgment is defined inductively on the structure of assertions.
It relies on the semantics of arithmetic expressions from IMP.

Semantics of Assertions q ² truealways q ² e 1 = e 2 iff ⇓ = ⇓ q ² e 1 ≥ e 2 iff ⇓ ≥ ⇓ q ² A 1 Æ A 2 iff q ² A 1 and q ² A 2 q ² A 1 Ç A 2 iff q ² A 1 or q ² A 2 q ² A 1 ) A 2 iff q ² A 1 implies q ² A 2 q ² ∀ x.A iff 8 n 2 Z. q[x:=n] ² A q ² ∃ x.A iff 9 n 2 Z. q[x:=n] ² A

Semantics of Hoare Triples
Now we can define formally the meaning of a partial correctness assertion:
⊨ {A} c {B} iff ∀q ∈ Q. ∀q' ∈ Q. q ⊨ A ∧ q →c q' ⇒ q' ⊨ B
and the meaning of a total correctness assertion:
⊨ [A] c [B] iff ∀q ∈ Q. q ⊨ A ⇒ ∃q' ∈ Q. q →c q' ∧ q' ⊨ B
or even better:
(∀q ∈ Q. ∀q' ∈ Q. q ⊨ A ∧ q →c q' ⇒ q' ⊨ B) ∧ (∀q ∈ Q. q ⊨ A ⇒ ∃q' ∈ Q. q →c q' ∧ q' ⊨ B)

Inferring Validity of Assertions
Now we have the formal mechanism to decide when ⊨ {A} c {B} holds.
– But it is not satisfactory,
– because ⊨ {A} c {B} is defined in terms of the operational semantics.
– We practically have to run the program to verify an assertion.
– Also, it is impossible to effectively verify the truth of a ∀x.A assertion (by using the definition of validity).
So we define a symbolic technique for deriving valid assertions from others that are known to be valid.
– We start with validity of first-order formulas.

Inference Rules
We write ⊢ A when A can be inferred from basic axioms. The inference rules for ⊢ A are the usual ones from first-order logic with arithmetic. Natural deduction style axioms:
– from ⊢ A and ⊢ B infer ⊢ A ∧ B
– from ⊢ A infer ⊢ A ∨ B; from ⊢ B infer ⊢ A ∨ B
– from ⊢ A[a/x], where a is fresh, infer ⊢ ∀x. A
– from ⊢ ∀x. A infer ⊢ A[e/x]
– from ⊢ A[e/x] infer ⊢ ∃x. A
– from ⊢ ∃x. A and a derivation of ⊢ B from ⊢ A[a/x], where a is fresh, infer ⊢ B
– from ⊢ A ⇒ B and ⊢ A infer ⊢ B
– from a derivation of ⊢ B under the assumption ⊢ A infer ⊢ A ⇒ B

Inference Rules for Hoare Triples
Similarly, we write ⊢ {A} c {B} when we can derive the triple using inference rules.
There is one inference rule for each command in the language.
Plus, the rule of consequence:
from ⊢ A' ⇒ A, ⊢ {A} c {B}, and ⊢ B ⇒ B' infer ⊢ {A'} c {B'}

Inference Rules for Hoare Logic
One rule for each syntactic construct:
– ⊢ {A} skip {A}
– ⊢ {A[e/x]} x := e {A}
– from ⊢ {A} c1 {B} and ⊢ {B} c2 {C} infer ⊢ {A} c1 ; c2 {C}
– from ⊢ {A ∧ b} c1 {B} and ⊢ {A ∧ ¬b} c2 {B} infer ⊢ {A} if b then c1 else c2 {B}
– from ⊢ {I ∧ b} c {I} infer ⊢ {I} while b do c {I ∧ ¬b}

Hoare Rules
For some constructs, multiple rules are possible.
Alternative "forward axiom" for assignment:
⊢ {A} x := e {∃x0. x = e[x0/x] ∧ A[x0/x]}
Alternative rule for while loops:
from ⊢ I ∧ b ⇒ C, ⊢ {C} c {I}, and ⊢ I ∧ ¬b ⇒ B infer ⊢ {I} while b do c {B}
These alternative rules are derivable from the previous rules, plus the rule of consequence.

Exercise: Hoare Rules
Is the following alternative rule for assignment still correct?
⊢ {true} x := e {x = e}

Example: Conditional
From
D1 :: ⊢ {true ∧ y ≤ 0} x := 1 {x > 0}
D2 :: ⊢ {true ∧ y > 0} x := y {x > 0}
we infer ⊢ {true} if y ≤ 0 then x := 1 else x := y {x > 0}.
D1 is obtained by consequence and assignment:
from ⊢ true ∧ y ≤ 0 ⇒ 1 > 0 and ⊢ {1 > 0} x := 1 {x > 0} infer ⊢ {true ∧ y ≤ 0} x := 1 {x > 0}
D2 is also obtained by consequence and assignment:
from ⊢ true ∧ y > 0 ⇒ y > 0 and ⊢ {y > 0} x := y {x > 0} infer ⊢ {true ∧ y > 0} x := y {x > 0}

Example: a simple loop
We want to infer that ⊢ {x ≤ 0} while x ≤ 5 do x := x + 1 {x = 6}.
Use the rule for while with invariant I ≡ x ≤ 6:
from ⊢ x ≤ 6 ∧ x ≤ 5 ⇒ x + 1 ≤ 6 and ⊢ {x + 1 ≤ 6} x := x + 1 {x ≤ 6}
infer ⊢ {x ≤ 6 ∧ x ≤ 5} x := x + 1 {x ≤ 6},
and hence ⊢ {x ≤ 6} while x ≤ 5 do x := x + 1 {x ≤ 6 ∧ x > 5}.
Then finish off with the rule of consequence:
from ⊢ x ≤ 0 ⇒ x ≤ 6, ⊢ x ≤ 6 ∧ x > 5 ⇒ x = 6, and ⊢ {x ≤ 6} while... {x ≤ 6 ∧ x > 5}
infer ⊢ {x ≤ 0} while... {x = 6}.

Example: a more interesting program
We want to derive that
{n ≥ 0}
p := 0; x := 0;
while x < n do
  x := x + 1; p := p + m
{p = n * m}

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
The only applicable rule (except for the rule of consequence) is the one for sequential composition:
from ⊢ {A} c1 {C} and ⊢ {C} c2 {B} infer ⊢ {A} c1 ; c2 {B}
Here c1 is p:=0; x:=0 and c2 is the while loop, so the goal splits into:
⊢ {n ≥ 0} p:=0; x:=0 {C}
⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Subgoals: ⊢ {n ≥ 0} p:=0; x:=0 {C} and ⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is C? Look at the next possible matching rules for c2!
The only applicable rule (except for the rule of consequence):
from ⊢ {I ∧ b} c {I} infer ⊢ {I} while b do c {I ∧ ¬b}
We can match {I} with {C}, but we cannot match {I ∧ ¬b} and {p = n * m} directly.
We need to apply the rule of consequence first!

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Subgoals: ⊢ {n ≥ 0} p:=0; x:=0 {C} and ⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is C? Look at the next possible matching rules for c2!
The only applicable rule (except for the rule of consequence):
from ⊢ {I ∧ b} c {I} infer ⊢ {I} while b do c {I ∧ ¬b}
Rule of consequence:
from ⊢ A' ⇒ A, ⊢ {A} c' {B}, and ⊢ B ⇒ B' infer ⊢ {A'} c' {B'}
Here we take I = A = A' = C.

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is I? Let's keep it as a placeholder for now!
Subgoals:
⊢ {n ≥ 0} p:=0; x:=0 {I}
⊢ I ∧ x ≥ n ⇒ p = n * m
⊢ {I} while x < n do (x:=x+1; p:=p+m) {I ∧ x ≥ n}
The latter follows, by the while rule, from ⊢ {I ∧ x < n} x := x+1; p := p+m {I}.
Next applicable rule: sequential composition,
from ⊢ {A} c1 {C} and ⊢ {C} c2 {B} infer ⊢ {A} c1 ; c2 {B}

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Remaining subgoal: ⊢ {I ∧ x < n} x := x+1; p := p+m {I}
Splitting the sequence gives ⊢ {I ∧ x < n} x := x+1 {C} and ⊢ {C} p := p+m {I}.
What is C? Look at the next possible matching rules for c2!
The only applicable rule (except for the rule of consequence): ⊢ {A[e/x]} x := e {A}

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Applying the assignment axiom ⊢ {A[e/x]} x := e {A} to p := p+m yields
⊢ {I[p+m/p]} p := p+m {I},
so the remaining subgoal for the loop body is
⊢ {I ∧ x < n} x := x+1 {I[p+m/p]}

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
For ⊢ {I ∧ x < n} x := x+1 {I[p+m/p]}, the only applicable rule (except for the rule of consequence)
is again the assignment axiom ⊢ {A[e/x]} x := e {A}, which gives ⊢ {I[x+1/x, p+m/p]} x := x+1 {I[p+m/p]}.
We need the rule of consequence to match {I ∧ x < n} and {I[x+1/x, p+m/p]}.

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
From ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p] and ⊢ {I[x+1/x, p+m/p]} x := x+1 {I[p+m/p]}
we get ⊢ {I ∧ x < n} x := x+1 {I[p+m/p]}, and with ⊢ {I[p+m/p]} p := p+m {I}
also ⊢ {I ∧ x < n} x := x+1; p := p+m {I}.
Let's just remember the open proof obligations:
⊢ I ∧ x ≥ n ⇒ p = n * m
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]

Example: a more interesting program
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Open proof obligations so far:
⊢ I ∧ x ≥ n ⇒ p = n * m
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
Continue with the remaining part of the proof tree, as before:
⊢ {I[0/x]} x := 0 {I}
⊢ {I[0/p, 0/x]} p := 0 {I[0/x]}, and by consequence ⊢ {n ≥ 0} p := 0 {I[0/x]}, provided
⊢ n ≥ 0 ⇒ I[0/p, 0/x]
Now we only need to solve the remaining constraints!

Example: a more interesting program
Find I such that all constraints are simultaneously valid:
⊢ n ≥ 0 ⇒ I[0/p, 0/x]
⊢ I ∧ x ≥ n ⇒ p = n * m
⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
Take I ≡ p = x * m ∧ x ≤ n. The constraints become:
⊢ n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
⊢ p = x * m ∧ x ≤ n ∧ x ≥ n ⇒ p = n * m
⊢ p = x * m ∧ x ≤ n ∧ x < n ⇒ p + m = (x+1) * m ∧ x+1 ≤ n
All constraints are valid!
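These three implications can also be discharged mechanically. The following sketch (not part of the lecture; it assumes the z3-solver Python package) checks the validity of each constraint by asking Z3 whether its negation is unsatisfiable:

from z3 import Ints, And, Implies, Not, Solver, unsat

n, m, p, x = Ints('n m p x')
inv = And(p == x * m, x <= n)                  # candidate invariant I

constraints = [
    Implies(n >= 0, And(0 == 0 * m, 0 <= n)),                         # initiation
    Implies(And(inv, x >= n), p == n * m),                            # exit
    Implies(And(inv, x < n), And(p + m == (x + 1) * m, x + 1 <= n)),  # preservation
]

for c in constraints:
    s = Solver()
    s.add(Not(c))                # c is valid iff its negation is unsatisfiable
    print(s.check() == unsat)    # expected to print True for all three constraints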

Using Hoare Rules
Hoare rules are mostly syntax directed.
There are three obstacles to automation of Hoare logic proofs:
– When to apply the rule of consequence?
– What invariant to use for while?
– How do you prove the implications involved in the rule of consequence?
The last one is how theorem proving gets in the picture.
– This turns out to be doable!
– The loop invariants turn out to be the hardest problem!
– Should the programmer give them?

Hoare Logic: Summary
We have a language for asserting properties of programs: assertions A and Hoare triples {A} P {B}.
We know when such an assertion is true: the semantic judgments ⊨ A and ⊨ {A} P {B}.
We also have a symbolic method for deriving assertions: the judgments ⊢ A and ⊢ {A} P {B}.
Soundness and completeness relate ⊢ to ⊨, and establishing ⊢ A is where theorem proving comes in.

Verification Conditions
Goal: given a Hoare triple {A} P {B}, derive a single assertion VC(A,P,B) such that
⊨ VC(A,P,B) iff ⊨ {A} P {B}
VC(A,P,B) is called the verification condition.
Verification condition generation factors out the hard work:
– finding loop invariants,
– finding function specifications.
Assume programs are annotated with such specifications.
– We will assume that the new form of the while construct includes an invariant: {I} while b do c
– The invariant formula I must hold every time before b is evaluated.

Verification Condition Generation
Idea for VC generation: propagate the post-condition backwards through the program:
– from {A} P {B}
– generate A ⇒ F(P, B).
This backwards propagation F(P, B) can be formalized in terms of weakest preconditions.

Weakest Preconditions
The weakest precondition WP(c,B) holds for any state q whose c-successor states all satisfy B:
q ⊨ WP(c,B) iff ∀q' ∈ Q. q →c q' ⇒ q' ⊨ B
Compute WP(P,B) recursively according to the structure of the program P.

Loop-Free Guarded Commands
Introduce loop-free guarded commands as an intermediate representation of the verification condition:
c ::= assume b | assert b | havoc x | c1 ; c2 | c1 □ c2
(□ is non-deterministic choice between two guarded commands.)
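In the sketches below, one possible lightweight Python encoding of loop-free guarded commands is as nested tuples, with the choice operator □ written as the tag 'choice'; this encoding is purely illustrative:

# Loop-free guarded commands as nested tuples (conditions b are kept as
# formula strings so that later passes can manipulate them symbolically):
#   ('assume', b) | ('assert', b) | ('havoc', x) | ('seq', c1, c2) | ('choice', c1, c2)

example_gc = ('seq', ('assume', 'x < n'),
              ('seq', ('havoc', 'x'),
                      ('assert', 'x <= n')))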

Operational Semantics of GCs
States of guarded commands are variable assignments plus a flow component:
Q = (L → Z) × Flow
Flow ::= Norm | Error
Extend satisfiability of assertions to GC states:
(s, flow) ⊨ A iff flow = Norm ∧ s ⊨ A

Operational Semantics of GCs
– If n ∈ Z, then (s, Norm) →(havoc x) (s[x := n], Norm).
– If s ⊨ b, then (s, Norm) →(assume b) (s, Norm).
– If s ⊨ b, then (s, Norm) →(assert b) (s, Norm).
– If s ⊨ ¬b, then (s, Norm) →(assert b) (s, Error).

Operational Semantics of GCs
– If (s, Norm) →c1 q' and q' →c2 q'', then (s, Norm) →(c1 ; c2) q''.
– If (s, Norm) →c1 q', then (s, Norm) →(c1 □ c2) q'.
– If (s, Norm) →c2 q', then (s, Norm) →(c1 □ c2) q'.
– (s, Error) →c (s, Error) for every guarded command c (error states propagate).

From Programs to Guarded Commands
GC(skip) = assume true
GC(x := e) = assume tmp = x; havoc x; assume (x = e[tmp/x])   (where tmp is fresh)
GC(c1 ; c2) = GC(c1) ; GC(c2)
GC(if b then c1 else c2) = (assume b; GC(c1)) □ (assume ¬b; GC(c2))
GC({I} while b do c) = ?

Guarded Commands for Loops
GC({I} while b do c) =
  assert I;
  havoc x1; ...; havoc xn;
  assume I;
  ( (assume b; GC(c); assert I; assume false) □ assume ¬b )
where x1, ..., xn are the variables modified in c.
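Combining the last two slides, here is a hedged Python sketch of the GC translation over that tuple encoding (expressions and conditions are plain strings here, unlike the interpreter sketch earlier, and substitution is faked by naive string replacement, so treat this purely as an illustration):

import itertools

# Annotated IMP commands (expressions and conditions as strings):
#   ('skip',), ('assign', x, e), ('seq', c1, c2), ('if', b, c1, c2),
#   ('while', I, b, c)      -- while loop annotated with its invariant I
# Guarded commands use the tuple encoding introduced earlier.

fresh = itertools.count()

def seq(*cs):
    out = cs[0]
    for c in cs[1:]:
        out = ('seq', out, c)
    return out

def modified(c):
    """Over-approximate the set of variables assigned in command c."""
    tag = c[0]
    if tag == 'assign': return {c[1]}
    if tag == 'seq':    return modified(c[1]) | modified(c[2])
    if tag == 'if':     return modified(c[2]) | modified(c[3])
    if tag == 'while':  return modified(c[3])
    return set()

def gc(c):
    tag = c[0]
    if tag == 'skip':
        return ('assume', 'true')
    if tag == 'assign':
        x, e = c[1], c[2]
        tmp = f'tmp{next(fresh)}'
        return seq(('assume', f'{tmp} = {x}'), ('havoc', x),
                   ('assume', f'{x} = {e.replace(x, tmp)}'))      # naive e[tmp/x]
    if tag == 'seq':
        return ('seq', gc(c[1]), gc(c[2]))
    if tag == 'if':
        b, c1, c2 = c[1], c[2], c[3]
        return ('choice', ('seq', ('assume', b), gc(c1)),
                          ('seq', ('assume', f'!({b})'), gc(c2)))
    if tag == 'while':
        inv, b, body = c[1], c[2], c[3]
        mods = sorted(modified(body))
        havocs = seq(*[('havoc', x) for x in mods]) if mods else ('assume', 'true')
        return seq(('assert', inv), havocs, ('assume', inv),
                   ('choice',
                    seq(('assume', b), gc(body), ('assert', inv), ('assume', 'false')),
                    ('assume', f'!({b})')))
    raise ValueError(f"unknown command {tag}")

# GC of the annotated loop from the running example:
loop = ('while', 'p = x * m && x <= n', 'x < n',
        ('seq', ('assign', 'x', 'x + 1'), ('assign', 'p', 'p + m')))
print(gc(loop))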

Computing Weakest Preconditions
WP(assume b, B) = b ⇒ B
WP(assert b, B) = b ∧ B
WP(havoc x, B) = B[a/x]   (a fresh in B)
WP(c1 ; c2, B) = WP(c1, WP(c2, B))
WP(c1 □ c2, B) = WP(c1, B) ∧ WP(c2, B)
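The WP equations translate almost one-to-one into Python over the same tuple encoding; in this sketch (mine, not the lecture's) formulas are strings and the havoc case performs a naive textual renaming instead of proper capture-avoiding substitution:

import itertools

fresh = itertools.count()

def wp(c, B):
    """Weakest precondition of guarded command c (tuple encoding) w.r.t. postcondition string B."""
    tag = c[0]
    if tag == 'assume':
        return f'({c[1]}) ==> ({B})'
    if tag == 'assert':
        return f'({c[1]}) && ({B})'
    if tag == 'havoc':
        a = f'{c[1]}_{next(fresh)}'            # fresh name playing the role of a in B[a/x]
        return B.replace(c[1], a)              # naive substitution, for illustration only
    if tag == 'seq':
        return wp(c[1], wp(c[2], B))
    if tag == 'choice':
        return f'({wp(c[1], B)}) && ({wp(c[2], B)})'
    raise ValueError(f"unknown guarded command {tag}")

# e.g. WP of (assume x < n; havoc x; assert x <= n) w.r.t. true:
print(wp(('seq', ('assume', 'x < n'),
          ('seq', ('havoc', 'x'), ('assert', 'x <= n'))), 'true'))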

Putting Everything Together
Given a Hoare triple H ≡ {A} P {B}:
– compute cH = assume A; GC(P); assert B
– compute VCH = WP(cH, true)
– infer ⊢ VCH using a theorem prover.

Example: VC Generation
{n ≥ 0}
p := 0; x := 0;
{p = x * m ∧ x ≤ n}
while x < n do
  x := x + 1; p := p + m
{p = n * m}

Example: VC Generation
Computing the guarded command: desugaring the annotated program step by step yields

assume n ≥ 0;
assume p0 = p; havoc p; assume p = 0;
assume x0 = x; havoc x; assume x = 0;
assert p = x * m ∧ x ≤ n;
havoc x; havoc p;
assume p = x * m ∧ x ≤ n;
( (assume x < n;
   assume x1 = x; havoc x; assume x = x1 + 1;
   assume p1 = p; havoc p; assume p = p1 + m;
   assert p = x * m ∧ x ≤ n;
   assume false)
  □ assume x ≥ n );
assert p = n * m

Example: VC Generation
Computing the weakest precondition: unfolding WP(cH, true) over this guarded command, innermost commands first, eventually yields the verification condition

n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ⇒
  pa3 = xa3 * m ∧ xa3 ≤ n ∧
  (pa2 = xa2 * m ∧ xa2 ≤ n ⇒
    ((xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m) ⇒ pa1 = xa1 * m ∧ xa1 ≤ n) ∧
    (xa2 ≥ n ⇒ pa2 = n * m))

(the names pa_i and xa_i are the fresh variables introduced for p and x by the havoc rule)

Example: VC Generation
The resulting VC is equivalent to the conjunction of the following implications:
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ⇒ pa3 = xa3 * m ∧ xa3 ≤ n
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 ≤ n ∧ xa2 ≥ n ⇒ pa2 = n * m
n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m ⇒ pa1 = xa1 * m ∧ xa1 ≤ n

Example: VC Generation
Simplifying the constraints yields:
n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
xa2 ≤ n ∧ xa2 ≥ n ⇒ xa2 * m = n * m
xa2 < n ⇒ xa2 * m + m = (xa2 + 1) * m ∧ xa2 + 1 ≤ n
All of these implications are valid, which proves that the original Hoare triple was valid, too.

The Diamond Problem
Consider the guarded command assume A; (c □ d); (c' □ d'); assert B. Its weakest precondition is
A ⇒ WP(c, WP(c', B) ∧ WP(d', B)) ∧ WP(d, WP(c', B) ∧ WP(d', B))
The number of paths through the program can be exponential in the size of the program.
The size of the weakest precondition can be exponential in the size of the program.

Avoiding the Exponential Explosion
Defer the work of exploring all paths to the theorem prover:
WP'(assume b, B, C) = (b ⇒ B, C)
WP'(assert b, B, C) = (b ∧ B, C)
WP'(havoc x, B, C) = (B[a/x], C)   (a fresh in B)
WP'(c1 ; c2, B, C) = let (F2, C2) = WP'(c2, B, C) in WP'(c1, F2, C2)
WP'(c1 □ c2, B, C) = let X = fresh propositional variable in
                     let (F1, C1) = WP'(c1, X, true) and (F2, C2) = WP'(c2, X, true) in
                     (F1 ∧ F2, C ∧ C1 ∧ C2 ∧ (X ⇔ B))
WP(P, B) = let (F, C) = WP'(P, B, true) in C ⇒ F
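A sketch of this linear-size variant over the same tuple encoding (again illustrative, with formulas as strings and naive renaming for havoc); WP' returns a pair (F, C), and each choice introduces a fresh propositional name so that shared suffixes are written out only once:

import itertools

fresh = itertools.count()

def wp2(c, B, C):
    """Return a pair (F, C') of formula strings; C' ==> F plays the role of WP(c, B)."""
    tag = c[0]
    if tag == 'assume':
        return f'({c[1]}) ==> ({B})', C
    if tag == 'assert':
        return f'({c[1]}) && ({B})', C
    if tag == 'havoc':
        a = f'{c[1]}_{next(fresh)}'
        return B.replace(c[1], a), C                     # naive B[a/x]
    if tag == 'seq':
        F2, C2 = wp2(c[2], B, C)
        return wp2(c[1], F2, C2)
    if tag == 'choice':
        X = f'X{next(fresh)}'                            # fresh propositional variable
        F1, C1 = wp2(c[1], X, 'true')
        F2, C2 = wp2(c[2], X, 'true')
        return f'({F1}) && ({F2})', f'({C}) && ({C1}) && ({C2}) && ({X} <==> ({B}))'
    raise ValueError(f"unknown guarded command {tag}")

def wp_linear(P, B):
    F, C = wp2(P, B, 'true')
    return f'({C}) ==> ({F})'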

Translating Method Calls to GCs
Consider a method with contract
  requires P; assignable x1, ..., xn; ensures Q;
  T m(T1 p1, ..., Tk pk) { ... }
A method call y = x.m(y1, ..., yk); is desugared into the guarded command
  assert P[x/this, y1/p1, ..., yk/pk];
  havoc x1; ...; havoc xn; havoc y;
  assume Q[x/this, y/\result]

Handling More Complex Program State
When is the following Hoare triple valid?
{A} x.f = 5 {x.f + y.f = 10}
A ought to imply "y.f = 5 ∨ x = y".
The IMP Hoare rule for assignment would give us:
(x.f + y.f = 10)[5/x.f] ≡ 5 + y.f = 10 ≡ y.f = 5   (we lost one case)
How come the rule does not work?

Modeling the Heap
We cannot have side-effects in assertions.
– While generating the VC we must remove side-effects!
– But how to do that when lacking precise aliasing information?
Important technique: postpone alias analysis to the theorem prover.
Model the state of the heap as a symbolic mapping from addresses to values:
– if e denotes an address and h a heap state, then
– sel(h,e) denotes the contents of the memory cell, and
– upd(h,e,v) denotes a new heap state obtained from h by writing v at address e.

Heap Models
We allow variables to range over heap states,
– so we can quantify over all possible heap states.
Model 1:
– one "heap" for each object,
– one index constant for each field; we postulate f1 ≠ f2,
– r.f1 is sel(r,f1) and r.f1 = e is r := upd(r,f1,e).
Model 2 (Burstall-Bornat):
– one "heap" for each field,
– the object address is the index,
– r.f1 is sel(f1,r) and r.f1 = e is f1 := upd(f1,r,e).

Hoare Rule for Field Writes
To model writes correctly, we use heap expressions.
– A field write changes the heap of that field:
{ B[upd(f, e1, e2)/f] } e1.f = e2 {B}
Important technique: model the heap as a semantic object and defer reasoning about heap expressions to the theorem prover, with inference rules such as (McCarthy):
sel(upd(h, e1, e2), e3) = e2          if e1 = e3
sel(upd(h, e1, e2), e3) = sel(h, e3)  if e1 ≠ e3

Example: Hoare Rule for Field Writes
Consider again: { A } x.f = 5 { x.f + y.f = 10 }
We obtain:
A ≡ (x.f + y.f = 10)[upd(f, x, 5)/f]
  ≡ (sel(f, x) + sel(f, y) = 10)[upd(f, x, 5)/f]
  ≡ sel(upd(f, x, 5), x) + sel(upd(f, x, 5), y) = 10    (*)
  ≡ 5 + sel(upd(f, x, 5), y) = 10
  ≡ if x = y then 5 + 5 = 10 else 5 + sel(f, y) = 10
  ≡ x = y ∨ y.f = 5    (**)
Up to (*) is theorem generation. From (*) to (**) is theorem proving.
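The step from (*) to (**) is exactly the kind of reasoning a prover performs with the select/update axioms. As an illustration (assuming the z3-solver package; not part of the lecture), Z3's built-in array theory can confirm the equivalence directly:

from z3 import Array, IntSort, Ints, Select, Store, Or, prove

f = Array('f', IntSort(), IntSort())     # the heap of field f: addresses -> values
x, y = Ints('x y')

# (*): the precondition obtained by substituting upd(f, x, 5) for f
star = Select(Store(f, x, 5), x) + Select(Store(f, x, 5), y) == 10
# (**): the expected simplified form
starstar = Or(x == y, Select(f, y) == 5)

prove(star == starstar)                  # Z3 should report "proved"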

Modeling new Statements
Introduce
– a new predicate isAllocated(e, t) denoting that object e is allocated at allocation time t,
– and a new variable allocTime denoting the current allocation time.
Add background axioms:
allocTime = 0
∀x t. isAllocated(x, t) ⇒ isAllocated(x, t+1)
isAllocated(null, 0)
Translate new x.T() to
havoc x;
assume ¬isAllocated(x, allocTime);
assume Type(x) = T;
assume isAllocated(x, allocTime + 1);
allocTime := allocTime + 1;
followed by the translation of the call to the constructor x.T().
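For completeness, a tiny helper (illustrative only) that spells out this desugaring textually; as on the slide, the translation of the constructor call itself is omitted:

def gc_new(x, T):
    """Textual guarded-command desugaring of `new x.T()` (constructor call translation omitted)."""
    return '; '.join([
        f'havoc {x}',
        f'assume !isAllocated({x}, allocTime)',
        f'assume Type({x}) = {T}',
        f'assume isAllocated({x}, allocTime + 1)',
        'allocTime := allocTime + 1',
    ])

print(gc_new('x', 'T'))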