1 Rigorous Software Development CSCI-GA 3033-011 Instructor: Thomas Wies Spring 2012 Lecture 12

2 Axiomatic Semantics
An axiomatic semantics consists of:
- a language for stating assertions about programs;
- rules for establishing the truth of assertions.
Some typical kinds of assertions:
- This program terminates.
- If this program terminates, the variables x and y have the same value throughout the execution of the program.
- The array accesses are within the array bounds.
Some typical languages of assertions:
- first-order logic
- other logics (temporal, linear)
- special-purpose specification languages (Z, Larch, JML)

3 Assertions for IMP
The assertions we make about IMP programs are of the form:
  {A} c {B}
with the meaning that:
- if A holds in state q and q →c q',
- then B holds in q'.
A is the pre-condition and B is the post-condition.
For example:
  {y ≤ x} z := x; z := z + 1 {y < z}
is a valid assertion.
These are called Hoare triples or Hoare assertions.

4 Semantics of Hoare Triples
Now we can define formally the meaning of a partial correctness assertion:
  ⊨ {A} c {B}  iff  ∀q ∈ Q. ∀q' ∈ Q. q ⊨ A ∧ q →c q' ⇒ q' ⊨ B
and the meaning of a total correctness assertion:
  ⊨ [A] c [B]  iff  ∀q ∈ Q. q ⊨ A ⇒ ∃q' ∈ Q. q →c q' ∧ q' ⊨ B
or even better:
  ∀q ∈ Q. ∀q' ∈ Q. (q ⊨ A ∧ q →c q' ⇒ q' ⊨ B)
  ∧ ∀q ∈ Q. (q ⊨ A ⇒ ∃q' ∈ Q. q →c q' ∧ q' ⊨ B)

5 Inference Rules for Hoare Triples
We write ⊢ {A} c {B} when we can derive the triple using inference rules.
There is one inference rule for each command in the language, plus the rule of consequence:

  ⊢ A' ⇒ A    ⊢ {A} c {B}    ⊢ B ⇒ B'
  -------------------------------------
            ⊢ {A'} c {B'}

6 Inference Rules for Hoare Logic
One rule for each syntactic construct:

  ⊢ {A} skip {A}

  ⊢ {A[e/x]} x := e {A}

  ⊢ {A ∧ b} c1 {B}    ⊢ {A ∧ ¬b} c2 {B}
  --------------------------------------
  ⊢ {A} if b then c1 else c2 {B}

  ⊢ {A} c1 {B}    ⊢ {B} c2 {C}
  ----------------------------
  ⊢ {A} c1 ; c2 {C}

  ⊢ {I ∧ b} c {I}
  -----------------------------
  ⊢ {I} while b do c {I ∧ ¬b}

7 Example: A Proof in Hoare Logic
We want to derive that
  {n ≥ 0}
  p := 0; x := 0;
  while x < n do (x := x + 1; p := p + m)
  {p = n * m}

8 Example: A Proof in Hoare Logic
Goal: ⊢ {n ≥ 0} p:=0; x:=0; while x < n do (x:=x+1; p:=p+m) {p = n * m}
Only applicable rule (except for the rule of consequence) is sequential composition:

  ⊢ {A} c1 {C}    ⊢ {C} c2 {B}
  ----------------------------
  ⊢ {A} c1 ; c2 {B}

Here c1 is p:=0; x:=0 and c2 is the while loop, so we need:
  ⊢ {n ≥ 0} p:=0; x:=0 {C}
  ⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}

9 Example: A Proof in Hoare Logic
Open subgoals:
  ⊢ {n ≥ 0} p:=0; x:=0 {C}
  ⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is C? Look at the next possible matching rules for c2.
Only applicable rule (except for the rule of consequence) is the while rule:

  ⊢ {I ∧ b} c {I}
  -----------------------------
  ⊢ {I} while b do c {I ∧ ¬b}

We can match {I} with {C}, but we cannot match {I ∧ ¬b} and {p = n * m} directly. We need to apply the rule of consequence first!

10 Example: A Proof in Hoare Logic
Open subgoals:
  ⊢ {n ≥ 0} p:=0; x:=0 {C}
  ⊢ {C} while x < n do (x:=x+1; p:=p+m) {p = n * m}
What is C? Look at the next possible matching rules for c2.
Only applicable rule (except for the rule of consequence) is the while rule:

  ⊢ {I ∧ b} c {I}
  -----------------------------
  ⊢ {I} while b do c {I ∧ ¬b}

Rule of consequence:

  ⊢ A' ⇒ A    ⊢ {A} c {B}    ⊢ B ⇒ B'
  -------------------------------------
            ⊢ {A'} c {B'}

Apply the rule of consequence with A' = A = C and take the loop invariant I = C (only the postcondition is weakened).

11 Example: A Proof in Hoare Logic
What is I? Let's keep it as a placeholder for now!
By the rule of consequence (with postcondition I ∧ x ≥ n) and the while rule, the open subgoals are:
  ⊢ {n ≥ 0} p:=0; x:=0 {I}
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ {I} while x < n do (x:=x+1; p:=p+m) {I ∧ x ≥ n}
  ⊢ {I ∧ x < n} x:=x+1; p:=p+m {I}
Next applicable rule (for the loop body) is sequential composition:

  ⊢ {A} c1 {C}    ⊢ {C} c2 {B}
  ----------------------------
  ⊢ {A} c1 ; c2 {B}

12 Example: A Proof in Hoare Logic
Current subgoal for the loop body: ⊢ {I ∧ x < n} x:=x+1; p:=p+m {I}
By sequential composition this splits into:
  ⊢ {I ∧ x < n} x:=x+1 {C}
  ⊢ {C} p:=p+m {I}
What is C? Look at the next possible matching rules for c2.
Only applicable rule (except for the rule of consequence) is the assignment rule:
  ⊢ {A[e/x]} x:=e {A}

13 Example: A Proof in Hoare Logic
Applying the assignment rule ⊢ {A[e/x]} x:=e {A} to p:=p+m gives C = I[p+m/p]:
  ⊢ {I[p+m/p]} p:=p+m {I}
The remaining subgoal for the loop body is:
  ⊢ {I ∧ x < n} x:=x+1 {I[p+m/p]}

14 Example: A Proof in Hoare Logic
Current subgoal: ⊢ {I ∧ x < n} x:=x+1 {I[p+m/p]}
Only applicable rule (except for the rule of consequence) is again the assignment rule ⊢ {A[e/x]} x:=e {A}, which yields:
  ⊢ {I[x+1/x, p+m/p]} x:=x+1 {I[p+m/p]}
We need the rule of consequence to match {I ∧ x < n} against {I[x+1/x, p+m/p]}.

15 Example: A Proof in Hoare Logic
The rule of consequence closes the loop body, leaving:
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
  ⊢ {I[x+1/x, p+m/p]} x:=x+1 {I[p+m/p]}
  ⊢ {I[p+m/p]} p:=p+m {I}
Let's just remember the open proof obligations:
  ⊢ I ∧ x ≥ n ⇒ p = n * m
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]

16 Example: A Proof in Hoare Logic
Continue with the remaining part of the proof tree, ⊢ {n ≥ 0} p:=0; x:=0 {I}, as before:
  ⊢ {I[0/x]} x:=0 {I}
  ⊢ {n ≥ 0} p:=0 {I[0/x]}
the latter via the assignment rule and the rule of consequence:
  ⊢ {I[0/p, 0/x]} p:=0 {I[0/x]}
  ⊢ n ≥ 0 ⇒ I[0/p, 0/x]
Open proof obligations:
  ⊢ n ≥ 0 ⇒ I[0/p, 0/x]
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
  ⊢ I ∧ x ≥ n ⇒ p = n * m
Now we only need to solve the remaining constraints!

17 Example: A Proof in Hoare Logic
Find I such that all constraints are simultaneously valid:
  ⊢ n ≥ 0 ⇒ I[0/p, 0/x]
  ⊢ I ∧ x < n ⇒ I[x+1/x, p+m/p]
  ⊢ I ∧ x ≥ n ⇒ p = n * m
Choose I ≡ p = x * m ∧ x ≤ n. The constraints become:
  ⊢ n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
  ⊢ p = x * m ∧ x ≤ n ∧ x < n ⇒ p + m = (x+1) * m ∧ x+1 ≤ n
  ⊢ p = x * m ∧ x ≤ n ∧ x ≥ n ⇒ p = n * m
All constraints are valid!

18 Using Hoare Rules
Hoare rules are mostly syntax-directed.
There are three obstacles to automation of Hoare logic proofs:
- When to apply the rule of consequence?
- What invariant to use for while?
- How do you prove the implications involved in the rule of consequence?
The last one is how theorem proving gets in the picture:
- This turns out to be doable!
- The loop invariants turn out to be the hardest problem!
- Should the programmer give them?

19 Verification Conditions
Goal: given a Hoare triple {A} P {B}, derive a single assertion VC(A,P,B) such that
  ⊨ VC(A,P,B)  iff  ⊨ {A} P {B}
VC(A,P,B) is called the verification condition.
Verification condition generation factors out the hard work:
- finding loop invariants
- finding function specifications
Assume programs are annotated with such specifications:
- We will assume that the new form of the while construct includes an invariant: {I} while b do c
- The invariant formula I must hold every time before b is evaluated.

20 Verification Condition Generation
Idea for VC generation: propagate the post-condition backwards through the program:
- from {A} P {B}
- generate A ⇒ F(P, B)
This backwards propagation F(P, B) can be formalized in terms of weakest preconditions.

21 Weakest Preconditions
The weakest precondition WP(c,B) holds for any state q whose c-successor states all satisfy B:
  q ⊨ WP(c,B)  iff  ∀q' ∈ Q. q →c q' ⇒ q' ⊨ B
Compute WP(P,B) recursively according to the structure of the program P.

22 Loop-Free Guarded Commands
Introduce loop-free guarded commands as an intermediate representation of the verification condition:
  c ::= assume b | assert b | havoc x | c1 ; c2 | c1 [] c2

23 From Programs to Guarded Commands
GC(skip) = assume true
GC(x := e) = assume tmp = x; havoc x; assume (x = e[tmp/x])   (where tmp is fresh)
GC(c1 ; c2) = GC(c1) ; GC(c2)
GC(if b then c1 else c2) = (assume b; GC(c1)) [] (assume ¬b; GC(c2))
GC({I} while b do c) = ?

24 Guarded Commands for Loops
GC({I} while b do c) =
  assert I ;
  havoc x1 ; ... ; havoc xn ;
  assume I ;
  ( (assume b; GC(c); assert I ; assume false)
    []
    assume ¬b )
where x1, ..., xn are the variables modified in c
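A minimal executable sketch of this translation. The tuple encoding of commands, the plain-string formulas, the explicit list of variables modified by a loop, and the naive textual substitution are all assumptions of the sketch, not part of the lecture:

```python
# Sketch only: IMP commands and guarded commands are nested tuples, formulas are strings.
import itertools

_ctr = itertools.count()

def fresh(prefix):
    return f"{prefix}{next(_ctr)}"

def seq(*cs):
    """Fold a sequence of guarded commands into nested ('seq', _, _) pairs."""
    acc = cs[0]
    for c in cs[1:]:
        acc = ('seq', acc, c)
    return acc

def gc(cmd):
    """Translate an IMP command into a loop-free guarded command."""
    match cmd:
        case ('skip',):
            return ('assume', 'true')
        case ('assign', x, e):
            tmp = fresh('tmp')
            # assume tmp = x; havoc x; assume x = e[tmp/x]  (naive textual substitution)
            return seq(('assume', f'{tmp} = {x}'),
                       ('havoc', x),
                       ('assume', f'{x} = {e.replace(x, tmp)}'))
        case ('seq', c1, c2):
            return ('seq', gc(c1), gc(c2))
        case ('if', b, c1, c2):
            return ('choice',
                    ('seq', ('assume', b), gc(c1)),
                    ('seq', ('assume', f'!({b})'), gc(c2)))
        case ('while', inv, b, body, modified):
            # assert I; havoc modified vars; assume I;
            # (assume b; GC(body); assert I; assume false) [] assume !b
            havocs = [('havoc', x) for x in modified]
            return seq(('assert', inv), *havocs, ('assume', inv),
                       ('choice',
                        seq(('assume', b), gc(body), ('assert', inv), ('assume', 'false')),
                        ('assume', f'!({b})')))

# Example: the annotated loop of the running example
loop = ('while', 'p = x*m && x <= n', 'x < n',
        ('seq', ('assign', 'x', 'x + 1'), ('assign', 'p', 'p + m')), ['x', 'p'])
print(gc(loop))
```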

25 Computing Weakest Preconditions
WP(assume b, B) = b ⇒ B
WP(assert b, B) = b ∧ B
WP(havoc x, B) = B[a/x]   (a fresh in B)
WP(c1 ; c2, B) = WP(c1, WP(c2, B))
WP(c1 [] c2, B) = WP(c1, B) ∧ WP(c2, B)
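A matching sketch of WP over these guarded commands, using the same assumed tuple/string encoding as above; the fresh-variable scheme and the textual substitution B[a/x] are simplifications:

```python
# Sketch only: guarded commands as nested tuples, formulas as strings.
import itertools

_fresh = itertools.count()

def wp(c, B):
    match c:
        case ('assume', b):
            return f'(({b}) ==> ({B}))'          # WP(assume b, B) = b ==> B
        case ('assert', b):
            return f'(({b}) && ({B}))'           # WP(assert b, B) = b && B
        case ('havoc', x):
            a = f'{x}_a{next(_fresh)}'           # a fresh in B
            return B.replace(x, a)               # naive textual B[a/x]
        case ('seq', c1, c2):
            return wp(c1, wp(c2, B))
        case ('choice', c1, c2):
            return f'({wp(c1, B)} && {wp(c2, B)})'

# e.g. WP(assume x < n; assert p = x*m, true)
print(wp(('seq', ('assume', 'x < n'), ('assert', 'p = x*m')), 'true'))
```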

26 Putting Everything Together
Given a Hoare triple H ≡ {A} P {B}:
- compute c_H = assume A; GC(P); assert B
- compute VC_H = WP(c_H, true)
- infer ⊢ VC_H using a theorem prover.
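With the gc(...) and wp(...) sketches above (their names and encodings are assumptions of the sketches), the whole pipeline fits in a few lines:

```python
# Sketch: VC of a Hoare triple {A} P {B}, reusing gc(...) and wp(...) from the sketches above.
def vc(A, P, B):
    c_H = ('seq', ('assume', A), ('seq', gc(P), ('assert', B)))   # assume A; GC(P); assert B
    return wp(c_H, 'true')                                        # VC_H = WP(c_H, true)
```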

27 Example: VC Generation
{n ≥ 0}
p := 0; x := 0;
{p = x * m ∧ x ≤ n} while x < n do (x := x + 1; p := p + m)
{p = n * m}

28 Example: VC Generation
Computing the guarded command step by step. Starting point:
  assume n ≥ 0;
  GC(p := 0; x := 0; {p = x * m ∧ x ≤ n} while x < n do (x := x + 1; p := p + m));
  assert p = n * m
Translating the two assignments and then the annotated loop yields:
  assume n ≥ 0;
  assume p0 = p; havoc p; assume p = 0;
  assume x0 = x; havoc x; assume x = 0;
  assert p = x * m ∧ x ≤ n;
  havoc x; havoc p;
  assume p = x * m ∧ x ≤ n;
  ( (assume x < n;
     assume x1 = x; havoc x; assume x = x1 + 1;
     assume p1 = p; havoc p; assume p = p1 + m;
     assert p = x * m ∧ x ≤ n;
     assume false)
    []
    assume x ≥ n );
  assert p = n * m

29 Example: VC Generation
Computing the weakest precondition of the guarded command from the previous slide:
  WP(assume n ≥ 0; ... ; assert p = n * m, true)
Unfolding the WP rules (the fresh variables pa3, xa3 replace the havocked p, x at the initial assignments, pa2, xa2 at the loop head, and pa1, xa1 in the loop body) yields:
  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ⇒
    pa3 = xa3 * m ∧ xa3 ≤ n
    ∧ (pa2 = xa2 * m ∧ xa2 ≤ n ⇒
         ((xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m) ⇒ pa1 = xa1 * m ∧ xa1 ≤ n)
         ∧ (xa2 ≥ n ⇒ pa2 = n * m))

30 Example: VC Generation
The resulting VC is equivalent to the conjunction of the following implications:
  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0
    ⇒ pa3 = xa3 * m ∧ xa3 ≤ n
  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 ≤ n
    ⇒ (xa2 ≥ n ⇒ pa2 = n * m)
  n ≥ 0 ∧ p0 = p ∧ pa3 = 0 ∧ x0 = x ∧ xa3 = 0 ∧ pa2 = xa2 * m ∧ xa2 < n ∧ x1 = xa2 ∧ xa1 = x1 + 1 ∧ p1 = pa2 ∧ pa1 = p1 + m
    ⇒ pa1 = xa1 * m ∧ xa1 ≤ n

31 Example: VC Generation
Simplifying the constraints yields:
  n ≥ 0 ⇒ 0 = 0 * m ∧ 0 ≤ n
  xa2 ≤ n ∧ xa2 ≥ n ⇒ xa2 * m = n * m
  xa2 < n ⇒ xa2 * m + m = (xa2 + 1) * m ∧ xa2 + 1 ≤ n
All of these implications are valid, which proves that the original Hoare triple was valid, too.

32 The Diamond Problem
Consider a sequence of choices, e.g.:
  assume A; (c [] d); (c [] d); assert B
Its VC is:
  A ⇒ WP(c, WP(c, B) ∧ WP(d, B)) ∧ WP(d, WP(c, B) ∧ WP(d, B))
The number of paths through the program can be exponential in the size of the program.
The size of the weakest precondition can be exponential in the size of the program.

33 Avoiding the Exponential Explosion
Defer the work of exploring all paths to the theorem prover:
  WP(assume b, B, C) = (b ⇒ B, C)
  WP(assert b, B, C) = (b ∧ B, C)
  WP(havoc x, B, C) = (B[a/x], C)   (a fresh in B)
  WP(c1 ; c2, B, C) = let F2, C2 = WP(c2, B, C) in WP(c1, F2, C2)
  WP(c1 [] c2, B, C) = let X = fresh propositional variable in
                       let F1, C1 = WP(c1, X, true) and F2, C2 = WP(c2, X, true) in
                       (F1 ∧ F2, C ∧ C1 ∧ C2 ∧ (X ⇔ B))
  WP(P, B) = let F, C = WP(P, B, true) in C ⇒ F
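A sketch of this two-result WP in the same assumed string/tuple encoding as the earlier sketches; the fresh-name scheme and the spelling of the '<=>' connective are assumptions:

```python
# Sketch only: returns a pair (F, C) where C collects the definitional constraints.
import itertools

_fresh = itertools.count()

def wp2(c, B, C):
    match c:
        case ('assume', b):
            return f'(({b}) ==> ({B}))', C
        case ('assert', b):
            return f'(({b}) && ({B}))', C
        case ('havoc', x):
            a = f'{x}_a{next(_fresh)}'                   # a fresh in B
            return B.replace(x, a), C
        case ('seq', c1, c2):
            F2, C2 = wp2(c2, B, C)
            return wp2(c1, F2, C2)
        case ('choice', c1, c2):
            X = f'X{next(_fresh)}'                       # fresh propositional variable
            F1, C1 = wp2(c1, X, 'true')
            F2, C2 = wp2(c2, X, 'true')
            return (f'({F1} && {F2})',
                    f'({C} && {C1} && {C2} && ({X} <=> ({B})))')

def linear_vc(P, B):
    F, C = wp2(P, B, 'true')
    return f'(({C}) ==> ({F}))'                          # WP(P, B) = C ==> F
```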

34 Translating Method Calls to GCs
Given a method with JML specification:
  /*@ requires P;
    @ assignable x1, ..., xn;
    @ ensures Q;
    @*/
  T m(T1 p1, ..., Tk pk) { ... }
a method call y = x.m(y1, ..., yk); is desugared into the guarded command:
  assert P[x/this, y1/p1, ..., yk/pk];
  havoc x1; ...; havoc xn; havoc y;
  assume Q[x/this, y/\result]

35 Handling More Complex Program State
When is the following Hoare triple valid?
  {A} x.f = 5 {x.f + y.f = 10}
A ought to imply y.f = 5 ∨ x = y.
The IMP Hoare rule for assignment would give us:
  (x.f + y.f = 10)[5/x.f] ≡ 5 + y.f = 10 ≡ y.f = 5   (we lost one case)
Why does the rule not work?

36 Modeling the Heap
We cannot have side-effects in assertions:
- While generating the VC we must remove side-effects!
- But how to do that when lacking precise aliasing information?
Important technique: postpone alias analysis to the theorem prover.
Model the state of the heap as a symbolic mapping from addresses to values. If e denotes an address and h a heap state, then:
- sel(h,e) denotes the contents of the memory cell at address e
- upd(h,e,v) denotes a new heap state obtained from h by writing v at address e

37 Heap Models
We allow variables to range over heap states, so we can quantify over all possible heap states.
Model 1:
- one heap for each object
- one index constant for each field; we postulate f1 ≠ f2
- r.f1 is sel(r,f1) and r.f1 = e is r := upd(r,f1,e)
Model 2 (Burstall-Bornat):
- one heap for each field
- the object address is the index
- r.f1 is sel(f1,r) and r.f1 = e is f1 := upd(f1,r,e)

38 Hoare Rule for Field Writes
To model writes correctly, we use heap expressions:
- a field write changes the heap of that field
  {B[upd(f, e1, e2)/f]} e1.f = e2 {B}
Important technique: model the heap as a semantic object and defer reasoning about heap expressions to the theorem prover, with inference rules such as McCarthy's select-update axiom:
  sel(upd(h, e1, e2), e3) = e2          if e1 = e3
  sel(upd(h, e1, e2), e3) = sel(h, e3)  if e1 ≠ e3

39 Example: Hoare Rule for Field Writes
Consider again: {A} x.f = 5 {x.f + y.f = 10}
We obtain:
  A ≡ (x.f + y.f = 10)[upd(f, x, 5)/f]
    ≡ (sel(f, x) + sel(f, y) = 10)[upd(f, x, 5)/f]
    ≡ sel(upd(f, x, 5), x) + sel(upd(f, x, 5), y) = 10      (*)
    ≡ 5 + sel(upd(f, x, 5), y) = 10
    ≡ if x = y then 5 + 5 = 10 else 5 + sel(f, y) = 10
    ≡ x = y ∨ y.f = 5                                       (**)
Up to (*) is theorem generation. From (*) to (**) is theorem proving.
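This last equivalence can be checked mechanically. Below is a small sketch using the Z3 SMT solver's Python API; the use of Z3, the Burstall-Bornat array encoding of field f, and integer-valued object references are assumptions of the sketch:

```python
# Sketch: field f modeled as an array from object ids (ints) to ints (pip install z3-solver).
from z3 import Array, IntSort, Ints, Select, Store, Or, prove

f = Array('f', IntSort(), IntSort())   # the heap of field f
x, y = Ints('x y')                     # object references modeled as ints

# Weakest precondition of {x.f + y.f = 10} under x.f := 5, per the field-write rule:
pre = Select(Store(f, x, 5), x) + Select(Store(f, x, 5), y) == 10

# The claimed simplification from the slide: x = y or y.f = 5
simplified = Or(x == y, Select(f, y) == 5)

prove(pre == simplified)   # prints "proved"
```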

40 Modeling new Statements
Introduce:
- a new predicate isAllocated(e, t), denoting that object e is allocated at allocation time t,
- and a new variable allocTime, denoting the current allocation time.
Add background axioms:
  allocTime = 0
  ∀x t. isAllocated(x, t) ⇒ isAllocated(x, t+1)
  isAllocated(null, 0)
Translate the constructor call new x.T() to the guarded command:
  havoc x;
  assume ¬isAllocated(x, allocTime);
  assume Type(x) = T;
  assume isAllocated(x, allocTime + 1);
  allocTime := allocTime + 1;

41 Invariant Generation

42 Tools such as ESC/Java enable automated program verification by
- automatically generating verification conditions and
- automatically checking validity of the generated VCs.
The user still needs to provide the invariants. This is often the hardest part.
Can we generate invariants automatically?

43 Programs as Systems of Constraints
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
(Control-flow graph: locations ℓ1, ℓ2, ℓ3, ℓerr, ℓexit, with edges labeled y ≥ z, x < y ∧ x' = x + 1, x ≥ y, x < z, x ≥ z.)
  ρ1 = move(ℓ1, ℓ2) ∧ y ≥ z ∧ skip(x,y,z)
  ρ2 = move(ℓ2, ℓ2) ∧ x < y ∧ x' = x + 1 ∧ skip(y,z)
  ρ3 = move(ℓ2, ℓ3) ∧ x ≥ y ∧ skip(x,y,z)
  ρ4 = move(ℓ3, ℓerr) ∧ x < z ∧ skip(x,y,z)
  ρ5 = move(ℓ3, ℓexit) ∧ x ≥ z ∧ skip(x,y,z)
where
  move(ℓ1, ℓ2) = (pc = ℓ1 ∧ pc' = ℓ2)
  skip(x1, ..., xn) = (x1' = x1 ∧ ... ∧ xn' = xn)

44 Program P = (V, init, R, error)
- V: finite set of program variables
- init: initiation condition, given by a formula over V
- R: a finite set of transition constraints
  - a transition constraint ρ ∈ R is given by a formula over V and their primed versions V'
  - we often think of R as the disjunction of its elements: R = ρ1 ∨ ... ∨ ρn
- error: error condition, given by a formula over V

45 Programs as Systems of Constraints
For the example program, P = (V, init, R, error) with:
  V = {pc, x, y, z}
  init = (pc = ℓ1)
  R = {ρ1, ρ2, ρ3, ρ4, ρ5} where
    ρ1 = move(ℓ1, ℓ2) ∧ y ≥ z ∧ skip(x,y,z)
    ρ2 = move(ℓ2, ℓ2) ∧ x < y ∧ x' = x + 1 ∧ skip(y,z)
    ρ3 = move(ℓ2, ℓ3) ∧ x ≥ y ∧ skip(x,y,z)
    ρ4 = move(ℓ3, ℓerr) ∧ x < z ∧ skip(x,y,z)
    ρ5 = move(ℓ3, ℓexit) ∧ x ≥ z ∧ skip(x,y,z)
  error = (pc = ℓerr)

46 Programs as Transition Systems
- States Q are the valuations of the program variables V.
- Initial states Qinit are the states satisfying the initiation condition init:
    Qinit = {q ∈ Q | q ⊨ init}
- The transition relation → is the relation defined by the transition constraints in R:
    q1 → q2 iff q1, q2 ⊨ R  (q1 interprets the variables V, q2 the primed variables V')
- Error states Qerr are the states satisfying the error condition error:
    Qerr = {q ∈ Q | q ⊨ error}

47 Partial Correctness of Programs
- A state q is reachable in P if it occurs in some computation of P:
    q0 → q1 → q2 → ... → q,  where q0 ∈ Qinit
- Denote by Qreach the set of all reachable states of P.
- A program P is safe if no error state is reachable in P:
    Qreach ∩ Qerr = ∅
  or, if Qreach is expressed as a formula reach over V:
    ⊨ reach ∧ error ⇒ false

48 Partial Correctness of Programs
(Diagram: the state space Q, containing the initial states Qinit, the reachable states Qreach, and the error states Qerr.)

49 Example: Reachable States of a Program
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
Reachable states:
  reach = pc = ℓ1
        ∨ pc = ℓ2 ∧ y ≥ z
        ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
        ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
What is the connection with invariants? Can we compute reach?

50 Invariants of Programs
- An invariant QI of a program P is a superset of its reachable states: Qreach ⊆ QI
- An invariant QI is safe if it does not contain any error states: QI ∩ Qerr = ∅
  or, if QI is expressed as a formula I over V: ⊨ I ∧ error ⇒ false
- reach is the smallest invariant of P. In particular, if P is safe then so is the invariant reach.

51 Partial Correctness of Programs
(Diagram: as on slide 48, with a safe invariant QI that contains the reachable states Qreach but no error states Qerr.)

52 Strongest Postconditions
The strongest postcondition post(ρ,A) holds for any state q' that is a ρ-successor state of some state satisfying A:
  q' ⊨ post(ρ,A)  iff  ∃q ∈ Q. q ⊨ A ∧ q, q' ⊨ ρ
or equivalently
  post(ρ, A) = (∃V. A ∧ ρ)[V/V']
Compute reach by applying post iteratively to init.
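A small sketch of computing post mechanically with the Z3 Python API, using its quantifier-elimination tactic; the use of Z3, the integer encoding of pc, and the primed-variable names are assumptions of the sketch. It reproduces the example on the next slide:

```python
from z3 import Ints, And, Exists, Tactic, substitute

pc, x, y, z = Ints('pc x y z')
pcp, xp, yp, zp = Ints("pc' x' y' z'")
L2 = 2                                   # location l2 encoded as an integer

A   = And(pc == L2, y >= z)
rho = And(pc == L2, pcp == L2, x < y, xp == x + 1, yp == y, zp == z)

# post(rho, A) = (exists V. A /\ rho)[V/V']
projected = Tactic('qe')(Exists([pc, x, y, z], And(A, rho))).as_expr()
post_A = substitute(projected, (pcp, pc), (xp, x), (yp, y), (zp, z))
print(post_A)    # logically equivalent to: pc = l2 /\ y >= z /\ x <= y
```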

53 Example: Application of post
  A = (pc = ℓ2 ∧ y ≥ z)
  ρ = move(ℓ2, ℓ2) ∧ x < y ∧ x' = x + 1 ∧ skip(y,z)
  post(ρ, A) = (∃V. A ∧ ρ)[V/V']
    = (∃pc x y z. pc = ℓ2 ∧ y ≥ z ∧ pc = ℓ2 ∧ pc' = ℓ2 ∧ x < y ∧ x' = x + 1 ∧ y' = y ∧ z' = z)[pc/pc', x/x', y/y', z/z']
    = (y' ≥ z' ∧ pc' = ℓ2 ∧ x' ≤ y')[pc/pc', x/x', y/y', z/z']
    = y ≥ z ∧ pc = ℓ2 ∧ x ≤ y

54 Iterating post
  post^0(ρ, A) = A
  post^i(ρ, A) = post(ρ, post^(i-1)(ρ, A))   for i > 0
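Iterating post until nothing new is added can be sketched as below, with Z3 deciding the convergence check. Here `post` is assumed to be any function from Z3 formulas to Z3 formulas (for instance the quantifier-elimination sketch above, lifted to all of R), and the helper names and iteration bound are assumptions:

```python
from z3 import Or, Implies, Not, Solver, unsat

def is_valid(formula):
    s = Solver()
    s.add(Not(formula))
    return s.check() == unsat

def iterate_post(init, post, max_iters=100):
    """Accumulate init \/ post^1 \/ post^2 \/ ... until a fixed point is reached."""
    reach = init
    for _ in range(max_iters):
        step = post(reach)
        if is_valid(Implies(step, reach)):     # post added nothing new: fixed point
            return reach
        reach = Or(reach, step)
    return None                                # gave up: the iteration need not terminate
```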

55 Finite Iteration of post May Suffice

56 Example Iteration
With ρ1, ..., ρ5 as before:
  post^0(R, init) = init = (pc = ℓ1)
  post^1(R, init) = post(ρ1, init) = (pc = ℓ2 ∧ y ≥ z)
  post^2(R, init) = post(ρ2, post^1(R, init)) ∨ post(ρ3, post^1(R, init))
                  = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y  ∨  pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
  post^3(R, init) = post(ρ2, post^2(R, init)) ∨ post(ρ3, post^2(R, init)) ∨ post(ρ4, post^2(R, init)) ∨ post(ρ5, post^2(R, init))

57 Example Iteration
  post^2(R, init) = post(ρ2, post^1(R, init)) ∨ post(ρ3, post^1(R, init))
                  = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y  ∨  pc = ℓ3 ∧ y ≥ z ∧ x ≥ y
  post^3(R, init) = post(ρ2, post^2(R, init)) ∨ post(ρ3, post^2(R, init)) ∨ post(ρ4, post^2(R, init)) ∨ post(ρ5, post^2(R, init))
                  = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y  ∨  pc = ℓ3 ∧ y ≥ z ∧ x = y  ∨  pc = ℓexit ∧ y ≥ z ∧ x ≥ y  ∨  false

58 Example Iteration
  post^3(R, init) = pc = ℓ2 ∧ y ≥ z ∧ x ≤ y  ∨  pc = ℓ3 ∧ y ≥ z ∧ x ≥ y  ∨  pc = ℓexit ∧ y ≥ z ∧ x ≥ y
  post^4(R, init) = post^3(R, init)
Fixed point:
  reach = post^0(R, init) ∨ post^1(R, init) ∨ post^2(R, init) ∨ post^3(R, init)
        = pc = ℓ1  ∨  pc = ℓ2 ∧ y ≥ z  ∨  pc = ℓ3 ∧ y ≥ z ∧ x ≥ y  ∨  pc = ℓexit ∧ y ≥ z ∧ x ≥ y

59 Checking Safety
An inductive invariant I contains the initial states and is closed under successors:
  ⊨ init ⇒ I  and  ⊨ post(R, I) ⇒ I
A program is safe if there exists a safe inductive invariant.
reach is the strongest inductive invariant.
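These checks are exactly what an SMT solver can discharge. Below is a sketch with the Z3 Python API for the example program; the integer encoding of the locations and the primed-variable names are assumptions. It verifies the safe inductive invariant that appears on the following slides:

```python
from z3 import Ints, And, Or, Implies, Not, prove

# Locations encoded as integer constants (an assumption of this sketch).
L1, L2, L3, LERR, LEXIT = 1, 2, 3, 4, 5

pc, x, y, z = Ints('pc x y z')
pc1, x1, y1, z1 = Ints("pc' x' y' z'")

def same(*pairs):
    """skip(...): each primed variable equals its unprimed counterpart."""
    return And([post == pre for pre, post in pairs])

init  = pc == L1
error = pc == LERR
R = Or(
    And(pc == L1, pc1 == L2, y >= z, same((x, x1), (y, y1), (z, z1))),      # rho1
    And(pc == L2, pc1 == L2, x < y, x1 == x + 1, same((y, y1), (z, z1))),   # rho2
    And(pc == L2, pc1 == L3, x >= y, same((x, x1), (y, y1), (z, z1))),      # rho3
    And(pc == L3, pc1 == LERR, x < z, same((x, x1), (y, y1), (z, z1))),     # rho4
    And(pc == L3, pc1 == LEXIT, x >= z, same((x, x1), (y, y1), (z, z1))),   # rho5
)

def I(pc, x, y, z):   # candidate safe inductive invariant
    return Or(pc == L1,
              And(pc == L2, y >= z),
              And(pc == L3, y >= z, x >= y),
              pc == LEXIT)

prove(Implies(init, I(pc, x, y, z)))                        # initiation
prove(Implies(And(I(pc, x, y, z), R), I(pc1, x1, y1, z1)))  # closed under successors
prove(Not(And(I(pc, x, y, z), error)))                      # safe: contains no error states
```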

60 Inductive Invariants for Example Program
Weakest inductive invariant: true
- the set of all states
- contains error states
Strongest inductive invariant: reach
  pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit ∧ y ≥ z ∧ x ≥ y
Slightly weaker inductive invariant:
  pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit
Can we drop another conjunct in one of the disjuncts?

61 Inductive Invariants for Example Program
1: assume y ≥ z;
2: while x < y do x := x + 1;
3: assert x ≥ z
Safe inductive invariant:
  pc = ℓ1 ∨ pc = ℓ2 ∧ y ≥ z ∨ pc = ℓ3 ∧ y ≥ z ∧ x ≥ y ∨ pc = ℓexit

62 Computing Inductive Invariants
We can compute the strongest inductive invariant by iterating post on init.
Can we ensure that this process terminates?
In general, no: checking safety of programs is undecidable.
But we can compute weaker inductive invariants:
- conservatively abstract the behavior of the program
- iterate an abstraction of post that is guaranteed to terminate.

