1 Logic for Computer Security Protocols John Mitchell Stanford University

2 Outline
- Perspective: math foundations vs. computer science
- Motivating example: Floyd-Hoare logic of programs
- Security protocols
  - Examples of protocols
  - Toward logics of security protocols

3 This is a logic conference …
- General properties of logical systems
  - First-order logic – model theory
  - Constructive logic – proof theory
- Specific theories
  - Set theory – large cardinals, independence results, …

4 “All mathematics is set theory”
- Numbers represented as sets
  - “There is only one principle of induction, induction on the integers… That’s what the axiom is.” – P. Cohen
- Every other mathematical object is represented using sets: functions, graphs, algebraic structures, geometry (?), …
- Platonic universe = set theory?

5 CS View
- Numbers are a special case
- Sets are a special case too, but less important since sets are unordered
- Many other important structures
  - Data structures: lists, trees, graphs, …
  - Algorithmic concepts: programs, functions, objects, …

6 Some CS Goals
- General theories
  - Frameworks for defining and reasoning about computational concepts
    - Data structures and computations
    - Numbers and sets could be examples in the framework, but not especially important
- Specific theories
  - Logic of simple imperative programs
  - Logic of recursive functional programs
  - Data type specification and refinement

7 Disclaimer/Context (Digression)
- Theoretical computer science
  - Quantitative theory – how many steps
    - Algorithm design and complexity theory
    - Dominant subject in US computer science
  - Qualitative theory – understand the computational universe and its properties
    - Programming languages and semantics
    - Logics for specification and verification

8 Part II: Logic of Programs
- Historical references: Floyd, …, Hoare, …

9 Before-after assertions
- Main idea
  - If F is true before executing P, then G is true after
- Two variants
  - Total correctness, written F [P] G: if F holds before, then P will halt in a state satisfying G
  - Partial correctness, written F {P} G: if F holds before, and if P halts, then G holds after

10 While programs
- Programs
  P ::= x := e | P ; P | if B then P else P | while B do P
  where
  - x is any variable
  - e is any integer expression
  - B is a Boolean expression (true or false)
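
As a rough illustration (not from the talk), this grammar might be represented as an abstract syntax tree with a small interpreter; the class and function names below are my own:

```python
from dataclasses import dataclass
from typing import Callable, Dict

State = Dict[str, int]                      # program state: variable name -> integer value

@dataclass
class Assign:                               # x := e
    var: str
    expr: Callable[[State], int]

@dataclass
class Seq:                                  # P ; P
    first: object
    second: object

@dataclass
class If:                                   # if B then P else P
    cond: Callable[[State], bool]
    then_branch: object
    else_branch: object

@dataclass
class While:                                # while B do P
    cond: Callable[[State], bool]
    body: object

def run(p, s: State) -> State:
    """Execute a while-program on a state (may not terminate, just like the language)."""
    if isinstance(p, Assign):
        return {**s, p.var: p.expr(s)}
    if isinstance(p, Seq):
        return run(p.second, run(p.first, s))
    if isinstance(p, If):
        return run(p.then_branch if p.cond(s) else p.else_branch, s)
    if isinstance(p, While):
        while p.cond(s):
            s = run(p.body, s)
        return s
    raise TypeError(f"not a while-program: {p!r}")

# Example: while x != 0 do x := x-1, started with x = 3, ends with x = 0.
prog = While(lambda s: s["x"] != 0, Assign("x", lambda s: s["x"] - 1))
assert run(prog, {"x": 3}) == {"x": 0}
```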

11 Assertion about assignment
- Assignment axiom
  F(t) { x := t } F(x)
- Examples
  - 7=7 { x := 7 } x=7
  - (y+1)>0 { x := y+1 } x>0
  - x+1=2 { x := x+1 } x=2
- This is not the most general case; we need to assume there is no aliasing…
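
To see why the axiom works, read assertions as Boolean-valued functions of the state: the precondition F(t) is just the postcondition evaluated after the value of x has been replaced by the value of t. A minimal sketch (the helper name is mine, not from the talk):

```python
# Assertions as predicates over states (dicts mapping variable names to values).
def assignment_pre(post, var, expr):
    """Precondition of `var := expr` for postcondition `post`: F(t) { x := t } F(x)."""
    return lambda state: post({**state, var: expr(state)})

# Slide example: x+1=2 { x := x+1 } x=2
post = lambda s: s["x"] == 2
pre = assignment_pre(post, "x", lambda s: s["x"] + 1)
assert pre({"x": 1})        # x+1 = 2 holds when x = 1
assert not pre({"x": 5})    # ... and fails when x = 5
```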

12 Rule of consequence
- If F { P } G
- and F’ ⇒ F and G ⇒ G’
- then F’ { P } G’

13 Example
- Assertion: y>0 { x := y+1 } x>0
- Proof
  - (y+1)>0 { x := y+1 } x>0   (assignment axiom)
  - y>0 { x := y+1 } x>0       (consequence)
- Assertion: x=1 { x := x+1 } x=2
- Proof
  - x+1=2 { x := x+1 } x=2     (assignment axiom)
  - x=1 { x := x+1 } x=2       (consequence)

14 Conditional
- Rule:
  F ∧ B { P1 } G        F ∧ ¬B { P2 } G
  --------------------------------------
  F { if B then P1 else P2 } G
- Example: true { if y ≥ 0 then x := y else x := -y } x ≥ 0

15 Sequence
- Rule:
  F { P1 } G        G { P2 } H
  -----------------------------
  F { P1 ; P2 } H
- Example: x=0 { x := x+1 ; x := x+1 } x=2

16 Loop Invariant
- Rule:
  F ∧ B { P } F
  -----------------------------
  F { while B do P } F ∧ ¬B
- Example: true { while x ≠ 0 do x := x-1 } x=0

17 Example: Compute d=x-y
- Assertion: y ≤ x { d:=0; while (y+d)<x do d := d+1 } y+d=x
- Main ideas in proof
  - Choose loop invariant y+d ≤ x
    - y+d ≤ x ∧ B { P1 } y+d ≤ x
    - y+d ≤ x { while B do P1 } y+d ≤ x ∧ ¬B
  - Use the assignment axiom and the sequence rule to complete the proof of this property of P1
  (where P0 is d:=0, B is (y+d)<x, and P1 is d := d+1)
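
A small runtime check of the argument above (an executable sanity check, not a proof): run the program from the slide and assert the invariant y+d ≤ x at every loop test, and the postcondition y+d = x on exit.

```python
def difference(x: int, y: int) -> int:
    assert y <= x                  # precondition: y <= x
    d = 0                          # P0: d := 0
    while True:
        assert y + d <= x          # loop invariant: y + d <= x
        if not (y + d < x):        # loop guard B: (y + d) < x
            break
        d = d + 1                  # P1: d := d + 1
    assert y + d == x              # invariant plus "not B" gives y + d = x
    return d

assert difference(7, 3) == 4
```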

18 Facts about Hoare logic
- Compositional
  - Proof follows the structure of the program
- Sound
- “Relative completeness”
  - Properties of computation over N are provable from properties of N
  - Some technical issues …
- Important concept: loop invariant!!!
  - Common practice beyond Hoare logic

19 Part III: Protocol Examples
- Warning: bait and switch

20 Contract signing
- Both parties want to sign a contract
- Neither wants to commit first
  - Immunity deal

21 General protocol outline
- Trusted third party can force contract
  - Third party can declare the contract binding if presented with the first two messages
- Messages A → B: “I am going to sign the contract”; “Here is my signature”

22 Asokan-Shoup-Waidner protocol
[Diagram of the agree, abort, and resolve subprotocols over a network with trusted party T]
- Agree (A ↔ B): m1 = sign(A, ⟨c, hash(r_A)⟩); then sign(B, ⟨m1, hash(r_B)⟩); then r_A; then r_B
- Abort (A ↔ T): A sends a1; if not already resolved, T replies sig_T(a1, abort)
- Resolve (via T over the network): present m1, m2; T replies sig_T(m1, m2)
- Attack?

23 Needham-Schroeder Key Exchange
- A → B: { A, Nonce_a } Kb
- B → A: { Nonce_a, Nonce_b } Ka
- A → B: { Nonce_b } Kb
- Result: A and B share two private numbers not known to any observer without Ka^-1, Kb^-1
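
A symbolic sketch of this run, with public-key encryption modelled as a tagged tuple that only the matching private key can open (the enc/dec helpers and key constants are illustrative, not part of the protocol definition):

```python
# Symbolic public-key encryption: ciphertexts are ("enc", public_key, payload) tuples.
def enc(pub, payload):
    return ("enc", pub, payload)

def dec(priv, ciphertext):
    tag, pub, payload = ciphertext
    assert tag == "enc" and priv == ("inv", pub), "cannot decrypt without the matching private key"
    return payload

Ka, Kb = "Ka", "Kb"                              # public keys of A and B
Ka_inv, Kb_inv = ("inv", Ka), ("inv", Kb)        # private keys Ka^-1, Kb^-1
Na, Nb = "Na", "Nb"                              # nonces freshly chosen by A and B

msg1 = enc(Kb, ("A", Na))                        # A -> B: { A, Na }Kb
a, na = dec(Kb_inv, msg1)                        # only B can open msg1
msg2 = enc(Ka, (na, Nb))                         # B -> A: { Na, Nb }Ka
na2, nb = dec(Ka_inv, msg2)                      # only A can open msg2
msg3 = enc(Kb, (nb,))                            # A -> B: { Nb }Kb

# After the run, A and B share Na and Nb; no message opens without Ka^-1 or Kb^-1.
assert dec(Kb_inv, msg3) == (Nb,)
```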

24 Anomaly in Needham-Schroeder [Lowe]
- A → E: { A, N_a } Ke
- E → B (posing as A): { A, N_a } Kb
- B → A: { N_a, N_b } Ka
- A → E: { N_b } Ke
- Evil agent E tricks honest A into revealing B’s private nonce N_b. Evil E can then fool B.
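
The same symbolic style shows the interleaving behind the anomaly: A willingly starts a session with E, and E re-encrypts A's messages to impersonate A to B (E's key Ke and the helper names are illustrative):

```python
def enc(pub, payload):                           # same symbolic encryption as in the previous sketch
    return ("enc", pub, payload)

def dec(priv, ciphertext):
    tag, pub, payload = ciphertext
    assert tag == "enc" and priv == ("inv", pub)
    return payload

Ka, Kb, Ke = "Ka", "Kb", "Ke"
Ka_inv, Kb_inv, Ke_inv = ("inv", Ka), ("inv", Kb), ("inv", Ke)
Na, Nb = "Na", "Nb"

m1 = enc(Ke, ("A", Na))                          # A -> E: { A, Na }Ke  (A intends to talk to E)
m2 = enc(Kb, dec(Ke_inv, m1))                    # E -> B: { A, Na }Kb  (E re-encrypts, posing as A)
m3 = enc(Ka, (dec(Kb_inv, m2)[1], Nb))           # B -> A: { Na, Nb }Ka (B believes the session is with A)
m4 = enc(Ke, (dec(Ka_inv, m3)[1],))              # A -> E: { Nb }Ke     (A returns the nonce to E)

nb_learned_by_E = dec(Ke_inv, m4)[0]             # E now holds B's secret nonce Nb ...
m5 = enc(Kb, (nb_learned_by_E,))                 # ... and can send { Nb }Kb to finish B's run as "A"
assert nb_learned_by_E == Nb
```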

25 Analysis Methods
- Simplified model
  - Exhaustive search methods
    - Limited to a finite model
  - Grungy proof methods based on traces
    - Isabelle, etc.; attempts to identify reasoning methods
- Not-so-simple model
  - Ad hoc proofs
  - Attempts to identify reasoning methods

26 Part IV: A Logic for Security Protocols
(work with N. Durgin, D. Pavlovic)

27 Intuition
- Reason about local information
  - I chose a new number
  - I sent it out encrypted
  - I received it decrypted
  - Therefore: someone decrypted it
- Incorporate knowledge about the protocol
  - Protocol: the server only sends m if it received m’
  - If the server is not corrupt and I receive m signed by the server, then the server received m’

28 Intuition: Picture
[Diagram: honest principals and the attacker exchange send/receive actions according to the protocol]
- Alice’s information
  - Protocol
  - Private data
  - Sends and receives

29 Logical assertions
- Modal operator
  - [ actions ]_P φ – after actions, P reasons φ
- Predicates in φ
  - Sent(X,m) – principal X sent message m
  - Created(X,m) – X assembled m from parts
  - Decrypts(X,m) – X has m and a key to decrypt m
  - Knows(X,m) – X created m, or received a message containing m and has the keys to extract m from it
  - Source(m, X, S) – any Y ≠ X can only learn m from the set S
  - Honest(X) – X follows the rules of the protocol

30 Semantics
- Protocol Q
  - Defines a set of roles (e.g., initiator, responder)
  - A run R of Q is a sequence of actions by principals following roles, plus the attacker
- Satisfaction
  - Q, R |= [ actions ]_P φ : some role of P in R does exactly actions, and φ is true in the state after the actions are completed
  - Q |= [ actions ]_P φ : Q, R |= [ actions ]_P φ for all runs R of Q
- A formula is [ actions ]_P φ where φ contains no [ … ]_P’

31 Sample axioms about actions
- New data
  - [ new x ]_P Knows(P,x)
  - [ new x ]_P Knows(Y,x) ⊃ Y=P
- Send
  - [ send m ]_P Sent(P,m)
- Receive
  - [ recv m ]_P Knows(P,m)
- Decryption
  - [ x := decrypt(k, encrypt(k,m)) ]_P Decrypts(P, encrypt(k,m))

32 Reasoning about “Knows”
- Pairing
  - Knows(X, ⟨m,n⟩) ⊃ Knows(X,m) ∧ Knows(X,n)
- Encryption
  - Knows(X, encrypt(k,m)) ∧ Knows(X, k^-1) ⊃ Knows(X,m)
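
These rules can be read operationally as a saturation procedure over a set of symbolic terms: keep projecting pairs and opening ciphertexts whose inverse key is already known. A rough sketch, with pairs as ("pair", m, n), ciphertexts as ("enc", k, m), and inverse keys as ("inv", k) (the representations are mine, not the logic's):

```python
def knows_closure(terms):
    """Close a set of symbolic terms under the pairing and encryption rules for Knows."""
    known = set(terms)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            derived = set()
            if isinstance(t, tuple) and t[0] == "pair":
                derived = {t[1], t[2]}                      # Knows <m,n>  =>  Knows m and Knows n
            elif isinstance(t, tuple) and t[0] == "enc" and ("inv", t[1]) in known:
                derived = {t[2]}                            # Knows {m}k and Knows k^-1  =>  Knows m
            if not derived <= known:
                known |= derived
                changed = True
    return known

# X received { <m, n> }Kb and holds Kb^-1, so X can extract both m and n.
seen = {("enc", "Kb", ("pair", "m", "n")), ("inv", "Kb")}
assert {"m", "n"} <= knows_closure(seen)
```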

33 Source predicate
- New data
  - [ new x ]_P Source(x, P, { })
- Message
  - From [ acts ]_P Source(x, Q, S) infer [ acts; send m ]_P Source(x, Q, S ∪ {m})
- Reasoning
  - Source(m, X, { encrypt(k,m) }) ∧ Knows(Y,m) ∧ Y ≠ X ⊃ ∃Z. Decrypts(Z, encrypt(k,m))

34 Bidding conventions (motivation)
- Blackwood response to 4NT
  - 5♣: 0 or 4 aces
  - 5♦: 1 ace
  - 5♥: 2 aces
  - 5♠: 3 aces
- Reasoning
  - If my partner is following Blackwood, then if she bid 5♥, she must have 2 aces

35 Honesty rule (rule scheme)
  ∀ roles R of Q. ∀ initial segments A ⊆ R.  Q |- [ A ]_X φ
  -----------------------------------------------------------
  Q |- Honest(X) ⊃ φ
- This is a finitary rule:
  - A typical protocol has 2-3 roles
  - A typical role has 1-3 receives
  - Only need to consider A waiting to receive

36 Honesty hypotheses for NSL
- “Alice” (initiator) segments:
  - [ ]_X φ
  - [ new m; send encrypt( Key(Y), ⟨X,m⟩ ) ]_X φ
  - [ new m; send encrypt( Key(Y), ⟨X,m⟩ ); recv encrypt( Key(X), ⟨m, B, n⟩ ); send encrypt( Key(B), n ) ]_X φ
- “Bob” (responder) segments:
  - [ recv encrypt( Key(X), ⟨Y,m⟩ ); new n; send encrypt( Key(Y), ⟨m, X, n⟩ ) ]_X φ
  - [ recv encrypt( Key(B), ⟨A,m⟩ ); new n; send encrypt( Key(A), ⟨m, B, n⟩ ); recv encrypt( Key(B), n ) ]_X φ

37 Honesty rule (example use)
  ∀ roles R of Q. ∀ initial segments A ⊆ R.  Q |- [ A ]_X φ
  -----------------------------------------------------------
  Q |- Honest(X) ⊃ φ
- Example use:
  - If Y receives a message from X, and Honest(X) ⊃ (Sent(X,m) ⊃ Received(X,m’)), then Y can conclude Honest(X) ⊃ Received(X,m’)

38 Benchmarks
- Can prove the repaired NSL protocol correct
- Proof fails for the original NS protocol

39 Correctness of NSL
- Bob knows he’s talking to Alice
  [ recv encrypt( Key(B), ⟨A,m⟩ ); new n; send encrypt( Key(A), ⟨m, B, n⟩ ); recv encrypt( Key(B), n ) ]_B
  Honest(A) ⊃ Csent(A, msg1) ∧ Csent(A, msg3)
  where Csent(A, …) ≡ Created(A, …) ∧ Sent(A, …), and msg1, msg3 are the first and third protocol messages

40 Proof uses “honesty rule”
- Consequent of honesty rule
  Honest(A) ⊃ ( Decrypts(A, encrypt( Key(A), ⟨m, B, n⟩ )) ⊃ Csent(A, msg1) ∧ Csent(A, msg3) )
- The implication Decrypts(X,…) ⊃ Csent(X,…) ∧ Csent(X,…) is an invariant of the protocol. It is true at each waiting state of each role X.

41 Initiator role OK for NS, NSL
- Alice wants similar conclusions, e.g.,
  [ new m; send encrypt( Key(B), ⟨A,m⟩ ); recv encrypt( Key(A), ⟨m, B, n⟩ ); send encrypt( Key(B), n ) ]_A
  Honest(B) ⊃ Csent(B, msg2)

42 Proof uses “honesty rule”
- Consequent of honesty rule
  Honest(X) ⊃ ( Csent(X, encrypt( Key(Y), ⟨m, Z, n⟩ )) ⊃ X=Z )
- (Note on the original slide: FIX THIS !!! Do not use slide as is!)

43 Failure for “buggy” NS
- Bob hopes he’s talking to Alice
  [ recv encrypt( Key(B), ⟨A,m⟩ ); new n; send encrypt( Key(A), ⟨m, B, n⟩ ); recv encrypt( Key(B), n ) ]_B
  Honest(A) ⊃ Csent(A, m1(B)) ∧ Csent(A, m3(B))
  where Csent(A, …) ≡ Created(A, …) ∧ Sent(A, …), and m1(B), m3(B) are the first and third messages naming B

44 Attempted use of “honesty rule”
- Protocol invariant
  Honest(A) ⊃ ( Decrypts(A, encrypt( Key(A), ⟨m, B, n⟩ )) ⊃ Csent(A, m1(Z)) ∧ Csent(A, m3(Z)) )
- Cannot prove Z=B

45 Status of this logic
- Just starting
  - This is a first attempt at a style of protocol logic
- Lots more to do
  - Work out more examples
  - Simplify the formalism
    - Express Created using Knows and Sent, Decrypts using Knows, …
  - Refine the proof rules and semantics
  - Make sure others can use this

46 Conclusion
- Computer security is a good application area
  - People care about correctness
  - Cryptography is well understood; its usage less so
- Security protocols
  - Simple distributed programs
  - Complexity comes from possible attacks
- Proposed protocol logic
  - Hide the behavior of the attacker in the model
  - Identify concepts that give understanding

