Rigorous Software Development CSCI-GA 3033-011 Instructor: Thomas Wies Spring 2012 Lecture 11.

1 Rigorous Software Development CSCI-GA 3033-011 Instructor: Thomas Wies Spring 2012 Lecture 11

2 Semantics of Programming Languages

Denotational Semantics
– The meaning of a program is the mathematical object it computes (e.g., a partial function).
– Example: Abstract Interpretation

Axiomatic Semantics
– The meaning of a program is defined in terms of its effect on the truth of logical assertions.
– Example: Hoare Logic

(Structural) Operational Semantics
– The meaning of a program is defined by formalizing the individual computation steps of the program.
– Example: Labeled Transition Systems

3 IMP: A Simple Imperative Language

An IMP program:

  p := 0;
  x := 0;
  while x < n do
    x := x + 1;
    p := p + m
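Transcribed directly into Python (a stand-in for IMP; the function name and test harness are ours), the program computes the product n * m by repeated addition:

```python
def imp_product(n, m):
    """Direct transcription of the IMP example program."""
    p = 0
    x = 0
    while x < n:
        x = x + 1
        p = p + m
    return p

assert imp_product(4, 7) == 28
assert imp_product(0, 5) == 0   # the loop body never runs when n = 0
```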

4 Syntax of IMP Commands

Commands (Com)

  c ::= skip | x := e | c1; c2 | if b then c1 else c2 | while b do c

Notes:
– The typing rules have been embedded in the syntax definition.
– Other requirements are not context-free and need to be checked separately (e.g., that all variables are declared).
– Commands contain all the side effects in the language.
– Missing: references, function calls, …

5 Labeled Transition Systems

A labeled transition system (LTS) is a structure LTS = (Q, Act, →) where
– Q is a set of states,
– Act is a set of actions,
– → ⊆ Q × Act × Q is a transition relation.

We write q –a→ q' for (q, a, q') ∈ →.

6 Operational Semantics of IMP

Big-step rules; q –c→ q' means that executing c in state q terminates in state q', and ⟨e⟩q ⇓ n means that expression e evaluates to n in state q:

– Assignment: if ⟨e⟩q ⇓ n, then q –x := e→ q[x := n].
– Skip: q –skip→ q.
– Sequence: if q –c1→ q' and q' –c2→ q'', then q –c1; c2→ q''.
– Conditional (true): if ⟨b⟩q ⇓ true and q –c1→ q', then q –if b then c1 else c2→ q'.
– Conditional (false): if ⟨b⟩q ⇓ false and q –c2→ q', then q –if b then c1 else c2→ q'.
– Loop (true): if ⟨b⟩q ⇓ true, q –c→ q', and q' –while b do c→ q'', then q –while b do c→ q''.
– Loop (false): if ⟨b⟩q ⇓ false, then q –while b do c→ q.
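The big-step rules can be animated with a small interpreter sketch. The tuple encoding of commands and expressions is our own, and states are plain dictionaries mapping variable names to integers:

```python
def eval_expr(e, q):
    """<e>q ⇓ n: e is an int literal, a variable name, or ('+'|'-'|'*'|'<', e1, e2)."""
    if isinstance(e, int):
        return e
    if isinstance(e, str):
        return q[e]
    op, e1, e2 = e
    a, b = eval_expr(e1, q), eval_expr(e2, q)
    return {'+': a + b, '-': a - b, '*': a * b, '<': a < b}[op]

def run(c, q):
    """Big-step execution: returns the final state q' with q -c-> q'."""
    kind = c[0]
    if kind == 'skip':
        return q
    if kind == 'assign':                 # q -x := e-> q[x := n] where <e>q ⇓ n
        _, x, e = c
        return {**q, x: eval_expr(e, q)}
    if kind == 'seq':                    # run c1, then c2 from the intermediate state
        _, c1, c2 = c
        return run(c2, run(c1, q))
    if kind == 'if':
        _, b, c1, c2 = c
        return run(c1 if eval_expr(b, q) else c2, q)
    if kind == 'while':                  # unfold the loop one iteration at a time
        _, b, body = c
        while eval_expr(b, q):
            q = run(body, q)
        return q
    raise ValueError(kind)

# The example program from slide 3:
prog = ('seq', ('assign', 'p', 0),
        ('seq', ('assign', 'x', 0),
         ('while', ('<', 'x', 'n'),
          ('seq', ('assign', 'x', ('+', 'x', 1)),
                  ('assign', 'p', ('+', 'p', 'm'))))))
assert run(prog, {'n': 3, 'm': 5, 'p': 0, 'x': 0})['p'] == 15
```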

7 Axiomatic Semantics

An axiomatic semantics consists of:
– a language for stating assertions about programs;
– rules for establishing the truth of assertions.

Some typical kinds of assertions:
– This program terminates.
– If this program terminates, the variables x and y have the same value throughout the execution of the program.
– The array accesses are within the array bounds.

Some typical languages of assertions:
– first-order logic;
– other logics (temporal, linear);
– special-purpose specification languages (Z, Larch, JML).

8 Assertions for IMP

The assertions we make about IMP programs are of the form

  {A} c {B}

with the meaning that:
– if A holds in state q and q –c→ q',
– then B holds in q'.

A is the pre-condition and B is the post-condition. For example,

  { y ≤ x } z := x; z := z + 1 { y < z }

is a valid assertion. These are called Hoare triples or Hoare assertions.
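Validity of a triple can be spot-checked by brute force over a small grid of initial states (a demo harness of ours, not a proof; the real definition quantifies over all states):

```python
from itertools import product

def check_triple(pre, run_c, post, domain=range(-3, 4)):
    """Brute-force ⊨ {pre} c {post} over a finite grid of initial states."""
    for x, y, z in product(domain, repeat=3):
        q = {'x': x, 'y': y, 'z': z}
        if pre(q):
            q2 = run_c(dict(q))
            if not post(q2):
                return False
    return True

def c(q):               # z := x; z := z + 1
    q['z'] = q['x']
    q['z'] = q['z'] + 1
    return q

# The example triple from this slide:
assert check_triple(lambda q: q['y'] <= q['x'], c, lambda q: q['y'] < q['z'])
```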

9 Assertions for IMP

{A} c {B} is a partial correctness assertion. It does not imply termination of c.

[A] c [B] is a total correctness assertion, meaning that:
– if A holds in state q,
– then there exists q' such that q –c→ q' and B holds in state q'.

Now let's be more formal:
– formalize the language of assertions A and B;
– say when an assertion holds in a state;
– give rules for deriving Hoare triples.

10 The Assertion Language

We use first-order predicate logic with IMP expressions:

  A ::= true | false | e1 = e2 | e1 ≥ e2 | A1 ∧ A2 | A1 ∨ A2 | A1 ⇒ A2 | ∀x.A | ∃x.A

Note that we are somewhat sloppy and mix the logical variables and the program variables. Implicitly, for us all IMP variables range over the integers. All IMP boolean expressions are also assertions.

11 Semantics of Assertions

Having introduced a language of assertions, we need to assign meanings to assertions.

We write q ⊨ A to say that assertion A holds in a given state q.
– This is well-defined when q is defined on all variables occurring in A.

The ⊨ judgment is defined inductively on the structure of assertions. It relies on the semantics of arithmetic expressions from IMP.

12 Semantics of Assertions

  q ⊨ true         always
  q ⊨ e1 = e2      iff ⟨e1⟩q ⇓ n1 and ⟨e2⟩q ⇓ n2 with n1 = n2
  q ⊨ e1 ≥ e2      iff ⟨e1⟩q ⇓ n1 and ⟨e2⟩q ⇓ n2 with n1 ≥ n2
  q ⊨ A1 ∧ A2      iff q ⊨ A1 and q ⊨ A2
  q ⊨ A1 ∨ A2      iff q ⊨ A1 or q ⊨ A2
  q ⊨ A1 ⇒ A2      iff q ⊨ A1 implies q ⊨ A2
  q ⊨ ∀x.A         iff ∀n ∈ Z. q[x := n] ⊨ A
  q ⊨ ∃x.A         iff ∃n ∈ Z. q[x := n] ⊨ A
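The inductive definition translates almost line by line into an evaluator sketch. Assertions are nested tuples of our own devising, and the quantifiers range over a small finite stand-in for Z, so this is only an approximation of the real semantics:

```python
INTS = range(-2, 8)     # finite stand-in for Z (demo assumption)

def eval_expr(e, q):
    """Value of an arithmetic expression in state q."""
    if isinstance(e, int):
        return e
    if isinstance(e, str):
        return q[e]
    op, e1, e2 = e
    a, b = eval_expr(e1, q), eval_expr(e2, q)
    return {'+': a + b, '*': a * b}[op]

def holds(A, q):
    """q ⊨ A, with assertions as nested tuples mirroring the inductive definition."""
    tag = A[0]
    if tag == 'true':
        return True
    if tag == 'false':
        return False
    if tag in ('=', '>='):
        _, e1, e2 = A
        v1, v2 = eval_expr(e1, q), eval_expr(e2, q)
        return v1 == v2 if tag == '=' else v1 >= v2
    if tag == 'and':
        return holds(A[1], q) and holds(A[2], q)
    if tag == 'or':
        return holds(A[1], q) or holds(A[2], q)
    if tag == 'implies':
        return (not holds(A[1], q)) or holds(A[2], q)
    if tag == 'forall':       # q[x := n] ⊨ A for every n in the finite domain
        _, x, body = A
        return all(holds(body, {**q, x: n}) for n in INTS)
    if tag == 'exists':
        _, x, body = A
        return any(holds(body, {**q, x: n}) for n in INTS)
    raise ValueError(tag)

# y ≤ x expressed as x ≥ y, checked in a concrete state:
assert holds(('>=', 'x', 'y'), {'x': 3, 'y': 1})
assert holds(('exists', 'z', ('=', 'z', ('+', 'x', 1))), {'x': 2})
```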

13 Semantics of Hoare Triples

Now we can define formally the meaning of a partial correctness assertion:

  ⊨ {A} c {B}  iff  ∀q ∈ Q. ∀q' ∈ Q. q ⊨ A ∧ q –c→ q' ⇒ q' ⊨ B

and the meaning of a total correctness assertion:

  ⊨ [A] c [B]  iff  ∀q ∈ Q. q ⊨ A ⇒ ∃q' ∈ Q. q –c→ q' ∧ q' ⊨ B

or even better:

  ∀q ∈ Q. ∀q' ∈ Q. q ⊨ A ∧ q –c→ q' ⇒ q' ⊨ B
  ∧ ∀q ∈ Q. q ⊨ A ⇒ ∃q' ∈ Q. q –c→ q' ∧ q' ⊨ B

14 Inferring Validity of Assertions

Now we have a formal mechanism for deciding when {A} c {B} holds.
– But it is not satisfactory,
– because ⊨ {A} c {B} is defined in terms of the operational semantics.
– We practically have to run the program to verify an assertion.
– Also, it is impossible to effectively verify the truth of a ∀x.A assertion (by using the definition of validity).

So we define a symbolic technique for deriving valid assertions from others that are known to be valid.
– We start with validity of first-order formulas.

15 Inference Rules

We write ⊢ A when A can be inferred from basic axioms. The inference rules for ⊢ A are the usual ones from first-order logic with arithmetic.

Natural deduction style rules:

  ∧-intro:  from ⊢ A and ⊢ B, infer ⊢ A ∧ B
  ∨-intro:  from ⊢ A (or from ⊢ B), infer ⊢ A ∨ B
  ∀-intro:  from ⊢ A[a/x], where a is fresh, infer ⊢ ∀x. A
  ∀-elim:   from ⊢ ∀x. A, infer ⊢ A[e/x]
  ∃-intro:  from ⊢ A[e/x], infer ⊢ ∃x. A
  ∃-elim:   from ⊢ ∃x. A and a derivation of B from A[a/x], where a is fresh, infer ⊢ B
  ⇒-intro:  from a derivation of B under assumption A, infer ⊢ A ⇒ B
  ⇒-elim:   from ⊢ A ⇒ B and ⊢ A, infer ⊢ B

16 Inference Rules for Hoare Triples

Similarly, we write ⊢ {A} c {B} when we can derive the triple using inference rules. There is one inference rule for each command in the language, plus the rule of consequence:

  from ⊢ A' ⇒ A, ⊢ {A} c {B}, and ⊢ B ⇒ B', infer ⊢ {A'} c {B'}.

17 Inference Rules for Hoare Logic

One rule for each syntactic construct:

  Skip:        ⊢ {A} skip {A}
  Assignment:  ⊢ {A[e/x]} x := e {A}
  Sequence:    from ⊢ {A} c1 {B} and ⊢ {B} c2 {C}, infer ⊢ {A} c1; c2 {C}
  Conditional: from ⊢ {A ∧ b} c1 {B} and ⊢ {A ∧ ¬b} c2 {B}, infer ⊢ {A} if b then c1 else c2 {B}
  Loop:        from ⊢ {I ∧ b} c {I}, infer ⊢ {I} while b do c {I ∧ ¬b}

18 Hoare Rules

For some constructs, multiple rules are possible.

Alternative "forward axiom" for assignment (Floyd style):

  ⊢ {A} x := e {∃x0. x = e[x0/x] ∧ A[x0/x]}

Alternative rule for while loops:

  from ⊢ I ∧ b ⇒ C, ⊢ {C} c {I}, and ⊢ I ∧ ¬b ⇒ B, infer ⊢ {I} while b do c {B}.

These alternative rules are derivable from the previous rules, plus the rule of consequence.

19 Exercise: Hoare Rules

Is the following alternative rule for assignment still correct?

  ⊢ {true} x := e {x = e}
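One way to probe the exercise (spoiler: this settles it) is to evaluate the proposed postcondition in a concrete post-state when e mentions x, say e ≡ x + 1:

```python
# The proposed axiom {true} x := e {x = e} is fine when e does not mention x,
# but consider e ≡ x + 1:
def assign_x_plus_1(q):
    q['x'] = q['x'] + 1      # x := x + 1
    return q

q = assign_x_plus_1({'x': 0})
# In the post-state, the assertion "x = x + 1" compares the NEW x with x + 1,
# which is unsatisfiable over the integers:
assert not (q['x'] == q['x'] + 1)
```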

20 Example: Conditional

  D1 :: ⊢ {true ∧ y ≤ 0} x := 1 {x > 0}
  D2 :: ⊢ {true ∧ y > 0} x := y {x > 0}
  ──────────────────────────────────────────────────
  ⊢ {true} if y ≤ 0 then x := 1 else x := y {x > 0}

D1 is obtained by consequence and assignment:

  ⊢ true ∧ y ≤ 0 ⇒ 1 > 0    ⊢ {1 > 0} x := 1 {x > 0}
  ────────────────────────────────────────────────────
  ⊢ {true ∧ y ≤ 0} x := 1 {x > 0}

D2 is also obtained by consequence and assignment:

  ⊢ true ∧ y > 0 ⇒ y > 0    ⊢ {y > 0} x := y {x > 0}
  ────────────────────────────────────────────────────
  ⊢ {true ∧ y > 0} x := y {x > 0}

21 Example: a simple loop

We want to infer that

  ⊢ {x ≤ 0} while x ≤ 5 do x := x + 1 {x = 6}

Use the rule for while with invariant I ≡ x ≤ 6:

  ⊢ x ≤ 6 ∧ x ≤ 5 ⇒ x + 1 ≤ 6    ⊢ {x + 1 ≤ 6} x := x + 1 {x ≤ 6}
  ──────────────────────────────────────────────────────────────────
  ⊢ {x ≤ 6 ∧ x ≤ 5} x := x + 1 {x ≤ 6}
  ──────────────────────────────────────────────────────────────────
  ⊢ {x ≤ 6} while x ≤ 5 do x := x + 1 {x ≤ 6 ∧ x > 5}

Then finish off with the rule of consequence:

  ⊢ x ≤ 0 ⇒ x ≤ 6    ⊢ x ≤ 6 ∧ x > 5 ⇒ x = 6    ⊢ {x ≤ 6} while... {x ≤ 6 ∧ x > 5}
  ───────────────────────────────────────────────────────────────────────────────────
  ⊢ {x ≤ 0} while... {x = 6}
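The three side conditions used in this proof are simple enough to spot-check numerically (a sanity check of ours over a finite range, not a proof):

```python
def verify_simple_loop(domain=range(-3, 10)):
    """Spot-check the three implications used in the proof for each x in the range."""
    for x in domain:
        # invariant preservation: x ≤ 6 ∧ x ≤ 5 ⇒ x + 1 ≤ 6
        if x <= 6 and x <= 5:
            assert x + 1 <= 6
        # consequence on entry: x ≤ 0 ⇒ x ≤ 6
        if x <= 0:
            assert x <= 6
        # consequence on exit: x ≤ 6 ∧ x > 5 ⇒ x = 6
        if x <= 6 and x > 5:
            assert x == 6
    return True

assert verify_simple_loop()
```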

22 Example: a more interesting program

We want to derive that

  {n ≥ 0}
  p := 0;
  x := 0;
  while x < n do
    x := x + 1;
    p := p + m
  {p = n * m}

23 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

The only applicable rule (except for the rule of consequence) is the sequence rule:

  from ⊢ {A} c1 {C} and ⊢ {C} c2 {B}, infer ⊢ {A} c1; c2 {B}

Here A ≡ n ≥ 0, B ≡ p = n * m, c1 is the initialization p := 0; x := 0, and c2 is the loop. So we need:

  ⊢ {n ≥ 0} p := 0; x := 0 {C}
  ⊢ {C} while x < n do (x := x + 1; p := p + m) {p = n * m}

24 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

What is C?

  ⊢ {n ≥ 0} p := 0; x := 0 {C}
  ⊢ {C} while x < n do (x := x + 1; p := p + m) {p = n * m}

Look at the next possible matching rules for c2! The only applicable rule (except for the rule of consequence) is:

  from ⊢ {I ∧ b} c {I}, infer ⊢ {I} while b do c {I ∧ ¬b}

We can match {I} with {C}, but we cannot match {I ∧ ¬b} and {p = n * m} directly. We need to apply the rule of consequence first!

25 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

Rule of consequence:

  from ⊢ A' ⇒ A, ⊢ {A} c' {B}, and ⊢ B ⇒ B', infer ⊢ {A'} c' {B'}

Here c' is the while loop, and we instantiate I = A = A' = C: the loop invariant serves both as the precondition and as the intermediate assertion C.

26 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

What is I? Let's keep it as a placeholder for now!

  ⊢ {n ≥ 0} p := 0; x := 0 {I}
  ⊢ {I} while x < n do (x := x + 1; p := p + m) {p = n * m}

Applying the while rule and the rule of consequence to the second obligation:

  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ {I} while x < n do (x := x + 1; p := p + m) {I ∧ x ≥ n}
  ⊢ {I ∧ x < n} x := x + 1; p := p + m {I}

27 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

Remaining obligations:

  ⊢ {n ≥ 0} p := 0; x := 0 {I}
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ {I} while x < n do (x := x + 1; p := p + m) {I ∧ x ≥ n}
  ⊢ {I ∧ x < n} x := x + 1; p := p + m {I}

28 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

  ⊢ {n ≥ 0} p := 0; x := 0 {I}
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ {I} while x < n do (x := x + 1; p := p + m) {I ∧ x ≥ n}

What is C? Look at the next possible matching rules for the loop body! The only applicable rule (except for the rule of consequence) is the assignment rule:

  ⊢ {A[e/x]} x := e {A}

Applying it to p := p + m splits the body at the intermediate assertion I[p+m/p]:

  ⊢ {I[p+m/p]} p := p + m {I}
  ⊢ {I ∧ x < n} x := x + 1 {I[p+m/p]}

29 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

  ⊢ {n ≥ 0} p := 0; x := 0 {I}
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ {I[p+m/p]} p := p + m {I}
  ⊢ {I ∧ x < n} x := x + 1 {I[p+m/p]}

Applying the assignment rule again, now to x := x + 1:

  ⊢ {I[x+1/x, p+m/p]} x := x + 1 {I[p+m/p]}

and, by the rule of consequence:

  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]

30 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

The obligations collected so far:

  ⊢ {n ≥ 0} p := 0; x := 0 {I}
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]

31 Example: a more interesting program

  ⊢ {n ≥ 0} p := 0; x := 0; while x < n do (x := x + 1; p := p + m) {p = n * m}

  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]

Let's just remember the open proof obligations! ... Continue with the remaining part of the proof tree, as before:

  ⊢ {I[0/x]} x := 0 {I}
  ⊢ {n ≥ 0} p := 0 {I[0/x]}, which follows from
  ⊢ {I[0/p, 0/x]} p := 0 {I[0/x]}
  ⊢ n ≥ 0 ⇒ I[0/p, 0/x]

Now we only need to solve the remaining constraints!

32 Example: a more interesting program

Find I such that all constraints are simultaneously valid:

  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
  ⊢ n ≥ 0 ⇒ I[0/p, 0/x]

Choose I ≡ p = x * m ∧ x ≤ n. The constraints become:

  ⊢ p = x * m ∧ x ≤ n ∧ x ≥ n ⇒ p = n * m
  ⊢ p = x * m ∧ x ≤ n ∧ x < n ⇒ p + m = (x+1) * m ∧ x+1 ≤ n
  ⊢ n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n

All constraints are valid!
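With I ≡ p = x * m ∧ x ≤ n plugged in, the three obligations can be spot-checked by brute force over a small grid (a sanity check of ours; the implications of course hold over all of Z):

```python
from itertools import product

def check_constraints(domain=range(-2, 6)):
    """Spot-check the three proof obligations for I ≡ p = x*m ∧ x ≤ n."""
    for n, m, p, x in product(domain, repeat=4):
        I = (p == x * m and x <= n)
        # exit: I ∧ x ≥ n ⇒ p = n*m
        if I and x >= n:
            assert p == n * m
        # preservation: I ∧ x < n ⇒ I[x+1/x, p+m/p]
        if I and x < n:
            assert p + m == (x + 1) * m and x + 1 <= n
        # entry: n ≥ 0 ⇒ I[0/p, 0/x]
        if n >= 0:
            assert 0 == 0 * m and 0 <= n
    return True

assert check_constraints()
```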

33 Using Hoare Rules

Hoare rules are mostly syntax directed. There are three obstacles to automating Hoare logic proofs:
– When do we apply the rule of consequence?
– What invariant do we use for while?
– How do we prove the implications involved in the rule of consequence?

The last one is how theorem proving gets into the picture.
– This turns out to be doable!
– The loop invariants turn out to be the hardest problem!
– Should the programmer give them?

34 Hoare Logic: Summary

We have a language for asserting properties of programs. We know when such an assertion is true. We also have a symbolic method for deriving assertions.

  A, {A} P {B}        assertions (syntax)
  ⊨ A, ⊨ {A} P {B}    semantics (validity)
  ⊢ A, ⊢ {A} P {B}    theorem proving (derivability)

Derivability relates to validity via soundness (everything derivable is valid) and completeness (everything valid is derivable).

35 Verification Conditions

Goal: given a Hoare triple {A} P {B}, derive a single assertion VC(A,P,B) such that

  ⊨ VC(A,P,B)  iff  ⊨ {A} P {B}

VC(A,P,B) is called a verification condition.

Verification condition generation factors out the hard work:
– finding loop invariants;
– finding function specifications.

Assume programs are annotated with such specifications:
– We will assume that the new form of the while construct includes an invariant: {I} while b do c.
– The invariant formula I must hold every time before b is evaluated.

36 Verification Condition Generation

Idea for VC generation: propagate the post-condition backwards through the program:
– from {A} P {B},
– generate A ⇒ F(P, B).

This backwards propagation F(P, B) can be formalized in terms of weakest preconditions.

37 Weakest Preconditions

The weakest precondition WP(c,B) holds for any state q whose c-successor states all satisfy B:

  q ⊨ WP(c,B)  iff  ∀q' ∈ Q. q –c→ q' ⇒ q' ⊨ B

Compute WP(P,B) recursively, following the structure of the program P.

38 Loop-Free Guarded Commands

Introduce loop-free guarded commands as an intermediate representation of the verification condition:

  c ::= assume b | assert b | havoc x | c1 ; c2 | c1 □ c2

(□ is nondeterministic choice.)

39 Operational Semantics of GCs

States of guarded commands are variable assignments plus a flow component:

  Q = (L → Z) × Flow
  Flow ::= Norm | Error

Extend satisfiability of assertions to GC states:

  (s, flow) ⊨ A  iff  flow = Norm ∧ s ⊨ A

40 Operational Semantics of GCs

  havoc:            for any n ∈ Z, (s, Norm) –havoc x→ (s[x := n], Norm).
  assume:           if s ⊨ b, then (s, Norm) –assume b→ (s, Norm).
                    (If s ⊨ ¬b, there is no transition: execution blocks.)
  assert (success): if s ⊨ b, then (s, Norm) –assert b→ (s, Norm).
  assert (failure): if s ⊨ ¬b, then (s, Norm) –assert b→ (s, Error).

41 Operational Semantics of GCs

  Sequence:         if (s, Norm) –c1→ q' and q' –c2→ q'', then (s, Norm) –c1; c2→ q''.
  Choice (left):    if (s, Norm) –c1→ q', then (s, Norm) –c1 □ c2→ q'.
  Choice (right):   if (s, Norm) –c2→ q', then (s, Norm) –c1 □ c2→ q'.
  Error propagates: (s, Error) –c→ (s, Error) for every command c.

42 From Programs to Guarded Commands

  GC(skip)                   = assume true
  GC(x := e)                 = assume tmp = x; havoc x; assume x = e[tmp/x]
                               (where tmp is fresh)
  GC(c1; c2)                 = GC(c1); GC(c2)
  GC(if b then c1 else c2)   = (assume b; GC(c1)) □ (assume ¬b; GC(c2))
  GC({I} while b do c)       = ?

43 Guarded Commands for Loops

  GC({I} while b do c) =
    assert I;
    havoc x1; ...; havoc xn;
    assume I;
    ( (assume b; GC(c); assert I; assume false)
      □ assume ¬b )

where x1, ..., xn are the variables modified in c.

44 Computing Weakest Preconditions

  WP(assume b, B)  = b ⇒ B
  WP(assert b, B)  = b ∧ B
  WP(havoc x, B)   = B[a/x]    (a fresh in B)
  WP(c1; c2, B)    = WP(c1, WP(c2, B))
  WP(c1 □ c2, B)   = WP(c1, B) ∧ WP(c2, B)
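These five equations can be run as code if we read formulas as state predicates. In the sketch below (encodings ours), havoc quantifies over a small finite domain standing in for Z, and validity of the final VC is checked over all states on that grid. As a demo we verify {y ≤ x} z := x; z := z + 1 {y < z} after desugaring the assignments in the style of slide 42 (t plays the role of the fresh tmp; the first assignment needs no tmp since x does not mention z):

```python
from itertools import product

DOM = range(0, 4)   # finite stand-in for Z: havoc and validity range over DOM

def wp(c, B):
    """WP over loop-free guarded commands, with postconditions as state predicates."""
    tag = c[0]
    if tag == 'assume':
        b = c[1]
        return lambda q: (not b(q)) or B(q)                  # b ⇒ B
    if tag == 'assert':
        b = c[1]
        return lambda q: b(q) and B(q)                       # b ∧ B
    if tag == 'havoc':
        x = c[1]
        return lambda q: all(B({**q, x: n}) for n in DOM)    # B[a/x], a arbitrary
    if tag == 'seq':
        return wp(c[1], wp(c[2], B))
    if tag == 'choice':
        p1, p2 = wp(c[1], B), wp(c[2], B)
        return lambda q: p1(q) and p2(q)                     # WP(c1,B) ∧ WP(c2,B)
    raise ValueError(tag)

def seq(*cs):
    """Right-nest a list of commands into binary seq nodes."""
    c = cs[-1]
    for d in reversed(cs[:-1]):
        c = ('seq', d, c)
    return c

# c_H = assume A; GC(z := x; z := z + 1); assert B
c_H = seq(('assume', lambda q: q['y'] <= q['x']),
          ('havoc', 'z'), ('assume', lambda q: q['z'] == q['x']),
          ('assume', lambda q: q['t'] == q['z']),
          ('havoc', 'z'), ('assume', lambda q: q['z'] == q['t'] + 1),
          ('assert', lambda q: q['y'] < q['z']))

vc = wp(c_H, lambda q: True)
# Validity: the VC holds in every state over the grid (t included, being fresh):
assert all(vc(dict(zip('xyzt', v))) for v in product(DOM, repeat=4))
```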

45 Putting Everything Together

Given a Hoare triple H ≡ {A} P {B}:
– Compute cH = assume A; GC(P); assert B.
– Compute VCH = WP(cH, true).
– Infer ⊢ VCH using a theorem prover.

46 Example: VC Generation

  {n ≥ 0}
  p := 0;
  x := 0;
  {p = x * m ∧ x ≤ n}
  while x < n do
    x := x + 1;
    p := p + m
  {p = n * m}

47 Example: VC Generation

Computing the guarded command cH = assume n ≥ 0; GC(P); assert p = n * m step by step (first unfolding the two assignments, then the annotated loop) yields:

  assume n ≥ 0;
  assume p0 = p; havoc p; assume p = 0;
  assume x0 = x; havoc x; assume x = 0;
  assert p = x * m ∧ x ≤ n;
  havoc x; havoc p;
  assume p = x * m ∧ x ≤ n;
  ( (assume x < n;
     assume x1 = x; havoc x; assume x = x1 + 1;
     assume p1 = p; havoc p; assume p = p1 + m;
     assert p = x * m ∧ x ≤ n;
     assume false)
    □ assume x ≥ n );
  assert p = n * m

48 Example: VC Generation

Computing the weakest precondition WP(cH, true) backwards through the guarded command (each assume becomes an implication, each assert a conjunct, each havoc a fresh variable pa_i or xa_i) yields:

  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
    ⇒ pa3 = xa3 * m ∧ xa3 ≤ n
      ∧ (pa2 = xa2 * m ∧ xa2 ≤ n
         ⇒ ((xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m)
             ⇒ pa1 = xa1 * m ∧ xa1 ≤ n)
           ∧ (xa2 ≥ n ⇒ pa2 = n * m))

49 Example: VC Generation

The resulting VC is equivalent to the conjunction of the following implications:

  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
    ⇒ pa3 = xa3 * m ∧ xa3 ≤ n

  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
    ∧ pa2 = xa2 * m ∧ xa2 ≤ n ∧ xa2 ≥ n
    ⇒ pa2 = n * m

  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
    ∧ pa2 = xa2 * m ∧ xa2 ≤ n ∧ xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m
    ⇒ pa1 = xa1 * m ∧ xa1 ≤ n

50 Example: VC Generation

Simplifying the constraints yields:

  n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
  xa2 ≤ n ∧ xa2 ≥ n ⇒ xa2 * m = n * m
  xa2 < n ⇒ xa2 * m + m = (xa2 + 1) * m ∧ xa2 + 1 ≤ n

All of these implications are valid, which proves that the original Hoare triple was valid, too.

51 The Diamond Problem

  assume A; (c □ d); (c' □ d'); assert B

  A ⇒ WP(c, WP(c', B) ∧ WP(d', B))
    ∧ WP(d, WP(c', B) ∧ WP(d', B))

The number of paths through the program can be exponential in the size of the program, so the size of the weakest precondition can be exponential in the size of the program.
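The blow-up is easy to see with a back-of-the-envelope node count (the counting model is ours): every diamond duplicates the remaining postcondition, so the naive WP roughly doubles per diamond:

```python
def naive_wp_size(k, post_size=1):
    """Rough formula-node count of the naive WP of k sequential diamonds.
    WP(c □ d, B) = WP(c, B) ∧ WP(d, B): B is copied into both branches,
    so each diamond doubles the size (plus a constant for the connectives)."""
    size = post_size
    for _ in range(k):
        size = 2 * (size + 1) + 1   # two branches of shape "b ⇒ B", joined by ∧
    return size

assert naive_wp_size(1) == 5
assert naive_wp_size(10) > 2 ** 10   # exponential in the number of diamonds
```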

52 Avoiding the Exponential Explosion

Defer the work of exploring all paths to the theorem prover. WP'(c, B, C) returns a pair of a precondition formula and a side constraint:

  WP'(assume b, B, C) = (b ⇒ B, C)
  WP'(assert b, B, C) = (b ∧ B, C)
  WP'(havoc x, B, C)  = (B[a/x], C)    (a fresh in B)
  WP'(c1; c2, B, C)   = let (F2, C2) = WP'(c2, B, C) in WP'(c1, F2, C2)
  WP'(c1 □ c2, B, C)  = let X = fresh propositional variable in
                        let (F1, C1) = WP'(c1, X, true)
                        and (F2, C2) = WP'(c2, X, true) in
                        (F1 ∧ F2, C ∧ C1 ∧ C2 ∧ (X ⇔ B))

  WP(P, B) = let (F, C) = WP'(P, B, true) in C ⇒ F

53 Translating Method Calls to GCs

  requires P;
  assignable x1, ..., xn;
  ensures Q;
  T m(T1 p1, ..., Tk pk) { ... }

A method call y = x.m(y1, ..., yk); is desugared into the guarded command

  assert P[x/this, y1/p1, ..., yk/pk];
  havoc x1; ...; havoc xn;
  havoc y;
  assume Q[x/this, y/\result]

54 Handling More Complex Program State

When is the following Hoare triple valid?

  {A} x.f = 5 {x.f + y.f = 10}

A ought to imply "y.f = 5 ∨ x = y".

The IMP Hoare rule for assignment would give us:

  (x.f + y.f = 10)[5/x.f]  ≡  5 + y.f = 10  ≡  y.f = 5

(we lost one case). How come the rule does not work?

55 Modeling the Heap

We cannot have side effects in assertions.
– While generating the VC we must remove side effects!
– But how to do that when lacking precise aliasing information?

Important technique: postpone alias analysis to the theorem prover.

Model the state of the heap as a symbolic mapping from addresses to values. If e denotes an address and h a heap state, then:
– sel(h, e) denotes the contents of the memory cell;
– upd(h, e, v) denotes a new heap state obtained from h by writing v at address e.

56 Heap Models

We allow variables to range over heap states, so we can quantify over all possible heap states.

Model 1:
– one "heap" for each object;
– one index constant for each field; we postulate f1 ≠ f2;
– r.f1 is sel(r, f1), and r.f1 = e is r := upd(r, f1, e).

Model 2 (Burstall-Bornat):
– one "heap" for each field;
– the object address is the index;
– r.f1 is sel(f1, r), and r.f1 = e is f1 := upd(f1, r, e).

57 Hoare Rule for Field Writes

To model writes correctly, we use heap expressions. A field write changes the heap of that field:

  {B[upd(f, e1, e2)/f]} e1.f = e2 {B}

Important technique: model the heap as a semantic object, and defer reasoning about heap expressions to the theorem prover with inference rules such as McCarthy's:

  sel(upd(h, e1, e2), e3) = e2          if e1 = e3
  sel(upd(h, e1, e2), e3) = sel(h, e3)  if e1 ≠ e3
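The sel/upd theory has an obvious executable model: heaps as immutable dictionaries (our sketch; a real prover treats sel and upd as uninterpreted functions constrained by McCarthy's axiom rather than evaluating them):

```python
def sel(h, e):
    """Read field map h at address e (0 stands for an unwritten cell)."""
    return h.get(e, 0)

def upd(h, e, v):
    """Functional update: a NEW heap state with address e mapped to v."""
    h2 = dict(h)
    h2[e] = v
    return h2

# McCarthy's select-of-update rule, checked on concrete addresses:
f = {}                       # the heap for field f (model 2: one map per field)
x, y = 'addr1', 'addr2'      # two distinct addresses (assumption of the demo)
f2 = upd(f, x, 5)
assert sel(f2, x) == 5               # e1 = e3 case
assert sel(f2, y) == sel(f, y)       # e1 ≠ e3 case

# The slide's example: after x.f = 5, the post-state satisfies
# x.f + y.f = 10 provided y.f = 5 (or x = y):
f3 = upd(upd(f, y, 5), x, 5)
assert sel(f3, x) + sel(f3, y) == 10
```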

58 Example: Hoare Rule for Field Writes

Consider again: {A} x.f = 5 {x.f + y.f = 10}. We obtain:

  A ≡ (x.f + y.f = 10)[upd(f, x, 5)/f]
    ≡ (sel(f, x) + sel(f, y) = 10)[upd(f, x, 5)/f]
    ≡ sel(upd(f, x, 5), x) + sel(upd(f, x, 5), y) = 10     (*)
    ≡ 5 + sel(upd(f, x, 5), y) = 10
    ≡ if x = y then 5 + 5 = 10 else 5 + sel(f, y) = 10
    ≡ x = y ∨ y.f = 5                                      (**)

Up to (*) is theorem generation. From (*) to (**) is theorem proving.

59 Modeling new Statements

Introduce:
– a new predicate isAllocated(e, t), denoting that object e is allocated at allocation time t;
– a new variable allocTime, denoting the current allocation time.

Add background axioms:

  allocTime = 0
  ∀x t. isAllocated(x, t) ⇒ isAllocated(x, t+1)
  isAllocated(null, 0)

Translate the constructor call new x.T() to:

  havoc x;
  assume ¬isAllocated(x, allocTime);
  assume Type(x) = T;
  assume isAllocated(x, allocTime + 1);
  allocTime := allocTime + 1

