
1 Proving Security Protocols Correct— Correctly Jonathan Herzog 21 March 2006 The author's affiliation with The MITRE Corporation is provided for identification purposes only, and is not intended to convey or imply MITRE's concurrence with, or support for, the positions, opinions or viewpoints expressed by the author.

2 Introduction This talk: soundness of symbolic proofs for security protocols Think: Are proofs in an ‘ideal’ world meaningful in the real world? Even when national secrets are on the line? Answer: mostly ‘yes,’ but sometimes ‘no’ But first: what are security protocols? Scenario: A and B want to create shared secret key Must communicate over unsecured network

3 Needham-Schroeder protocol (Previously: A and B obtain each other's public encryption keys)
Message 1. A -> B: E_KB(A || Na)
Message 2. B -> A: E_KA(Na || Nb)
Message 3. A -> B: E_KB(Nb)
A outputs (B, K); B outputs (A, K). Version 1: K = Na. Version 2: K = Nb.

4 Security goals Authentication of A to B: “If B outputs (A,K), then A outputs (B,K’)” Mutual authentication: both A to B and B to A Key agreement: If A outputs (X,K) and B outputs (Y,K’), then K=K’ Secrecy: surprisingly tricky to define Intuition: only people who can know K should be A, B Does Needham-Schroeder achieve any of these?

5 Needham-Schroeder: broken (Lowe, 1995). A = Alice, B = Alice's bank, M = on-line merchant. Alice buys goods from the merchant; the merchant masquerades as Alice to her bank:
Message 1. A -> M: E_KM(A || Na)
Message 2. M(as A) -> B: E_KB(A || Na)
Message 3. B -> A: E_KA(Na || Nb)   (relayed to A by M)
Message 4. A -> M: E_KM(Nb)
Message 5. M(as A) -> B: E_KB(Nb)
A outputs (M, K); B outputs (A, K), believing it shares the key with A.
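To see why the merchant ends up knowing the nonce, here is a minimal sketch in Python (my own toy model, not the talk's formalism): an "encryption" is simply a (recipient, payload) pair readable only by its recipient, and the names enc/dec are illustrative only.

```python
# Toy model: an "encryption" is (recipient, payload), readable only by the recipient.
def enc(recipient, payload):
    return (recipient, payload)

def dec(me, ct):
    return ct[1] if ct[0] == me else None

Na, Nb = "Na", "Nb"
m1 = enc("M", ("A", Na))           # A -> M: Alice starts a session with the merchant
m1_fwd = enc("B", dec("M", m1))    # M(as A) -> B: M re-encrypts Alice's nonce for the bank
m2 = enc("A", (Na, Nb))            # B -> A (routed via M): unreadable to M, so M just forwards it
m3 = enc("M", Nb)                  # A -> M: Alice finishes *her* session with the merchant...
m3_fwd = enc("B", dec("M", m3))    # M(as A) -> B: ...which hands M exactly what B expects

assert dec("M", m3) == Nb          # the merchant now knows Nb (the session key in version 2)
# B accepted the run as coming from A: authentication of A to B has failed.
```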

6 Needham-Schroeder-Lowe protocol: the ‘fix’ by Lowe (1995) adds B's name to the 2nd message.
Message 1. A -> B: E_KB(A || Na)
Message 2. B -> A: E_KA(Na || Nb || B)
Message 3. A -> B: E_KB(Nb)
A outputs (B, K); B outputs (A, K). Is this secure? Is TLS? Kerberos? SSH? More importantly: how to analyze?

7 The symbolic model Analysis framework for security protocols Originally proposed by Dolev & Yao (1983) General philosophy: be as high-level as possible Three general intuitions: Axiomatize the messages Axiomatize the adversary Security is unreachability

8 Axiomatize the message space Messages are parse trees Use symbols to represent atomic messages Countable symbols for keys (K, K', KA, KB, KA^-1, KB^-1, …) Countable symbols for nonces (N, N', Na, Nb, …) Countable symbols for names (A, B, …) Just symbols: no a priori relationships or structure Helper functions: keyof(A) = KA, inv(KA) = KA^-1 Encryption (E_K(M)) and pairing (M || N) are constructors Protocols described (mostly) by messages sent/received

9 Axiomatize the adversary Described by explicitly enumerated powers Interact with countable number of participants Each participant can play any role Adversary also legitimate participant Knowledge of all public values, non-secret keys Limited set of re-write rules, which the adversary can (non-deterministically) compose from these atomic abilities:
M1, M2 -> M1 || M2
M1 || M2 -> M1, M2
M, K -> E_K(M)
E_K(M), K^-1 -> M
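As an illustration, here is a minimal sketch (my own encoding, not from the talk) of the decomposition rules as a knowledge-closure computation in Python; pairs and encryptions are nested tuples, and the "^-1" suffix is an assumed naming convention for private keys.

```python
def inv(key):
    # "KA" <-> "KA^-1" by an assumed naming convention.
    return key[:-3] if key.endswith("^-1") else key + "^-1"

def closure(known):
    """Everything derivable by decomposing terms in `known`; the composition
    rules (pairing, encrypting) are applied on demand by real tools, since
    they can generate infinitely many terms."""
    known = set(known)
    changed = True
    while changed:
        changed = False
        for m in list(known):
            if isinstance(m, tuple) and m[0] == "pair":
                new = {m[1], m[2]} - known            # M1 || M2  ->  M1, M2
            elif isinstance(m, tuple) and m[0] == "enc" and inv(m[1]) in known:
                new = {m[2]} - known                  # E_K(M), K^-1  ->  M
            else:
                new = set()
            if new:
                known |= new
                changed = True
    return known

# Example: the ciphertext alone reveals nothing, but with KB^-1 the adversary gets Nb.
print("Nb" in closure({("enc", "KB", "Nb")}))             # False
print("Nb" in closure({("enc", "KB", "Nb"), "KB^-1"}))    # True
```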

10 Security is unreachability Some state is unreachable via chain of adversary actions Secrecy (symbolic model): "If A or B output (X,K), then no composition of adversary actions can result in K" Authentication of A to B: "If B outputs (A,K), then no composition of adversary actions can result in A outputting (X,K') where X≠B" Main advantage of symbolic model: security proofs are simple Automatable, in fact! Demo 1: NSL provides mutual authentication, key agreement, and secrecy for both Na and Nb

11 A biased sample of previous work (symbolic model) Analysis methods/mathematical frameworks Many, many, many proposed Two main survivors: spi calculus [AG] & strand spaces [THG] Automation Undecidable in general [EG, HT, DLMS] but: Decidable with bounds [DLMS, RT] Also, general case can be automatically verified in practice Cryptographic Protocol Shape Analyzer [DHGT] Many others Extensions Diffie-Hellman [MS, H] Trust-management / higher-level applications [GTCHRS] Compilation Cryptographic Protocol Programming Language (CPPL) [GHRS]

12 Central issue of this talk So what? Symbolic model has weak adversary, strong assumptions No a priori guarantees about stronger adversaries 1. Real adversaries can make up new “ciphertexts” 2. Real adversaries can try decrypting with wrong key 3. Real adversaries can exploit relationships between nonces/keys Symbolic proofs may not apply! This talk: ways in which symbolic proofs are (and are not) meaningful in the computational model Can we trust symbolic security proofs in the ‘real world’?

13 The computational model Outgrowth of complexity theory
                     Symbolic model           Computational model
Keys, names, etc.    Symbols                  Bit-strings
Encryption           Constructor              Poly-time algorithm
Ciphertexts          Compound parse-trees     Bit-strings
Adversary            Re-write rules           Arbitrary poly-time algorithm
Proof method         Reachability analysis    Reduction to hard problem
Security             Unreachability           Particular asymptotic property

14 Example: semantic security [GM] Described as game between ref and adversary: 1. Ref generates fresh key-pair 2. Ref gives public key to adversary 3. Adversary provides two messages: m0 and m1 4. Ref chooses one randomly (b ← U(0,1)), encrypts it 5. Adversary gets resulting ciphertext 6. Adversary guesses which was encrypted (g) Semantic security: no adversary can do better than chance, i.e. for all poly-time A: Pr[b = g] ≈ 0.5
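A minimal sketch of the game as code, under my own toy framing: keygen and encrypt stand for a hypothetical public-key scheme, and the adversary is any object exposing choose and guess; none of these names come from the talk.

```python
import secrets

def semantic_security_game(keygen, encrypt, adversary):
    """One run of the game; returns True iff the adversary guesses the referee's coin."""
    pk, sk = keygen()                        # 1-2. referee generates a key-pair, reveals pk
    m0, m1, state = adversary.choose(pk)     # 3. adversary submits two messages
    b = secrets.randbits(1)                  # 4. referee flips a coin...
    c = encrypt(pk, (m0, m1)[b])             #    ...and encrypts m_b
    g = adversary.guess(state, c)            # 5-6. adversary sees the ciphertext and guesses
    return g == b

# Semantic security: for every poly-time adversary, Pr[game returns True] ~ 0.5.
# A baseline adversary that ignores the ciphertext already wins half the time:
class CoinFlipAdversary:
    def choose(self, pk):
        return b"m0", b"m1", None
    def guess(self, state, c):
        return secrets.randbits(1)
```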

15 Example II: real-or-random secrecy (‘universally composable’ version) Another game, between adversary and protocol participants: 1. Participants engage in protocol 2. Adversary has control over network 3. When any participant finishes the protocol, it outputs either the real key or a random key 4. Other participants continue protocol, output same key 5. Adversary guesses ‘real’ or ‘random’ Real-or-random secrecy: no adversary can do better than chance, i.e. for all poly-time A: Pr[A is correct] ≈ 0.5

16 Soundness Computational properties are strong, but complex and hard to prove Symbolic proofs are much easier, but unconvincing Soundness: symbolic proofs imply computational properties (Diagram: proving a computational property of a protocol directly is hard; proving the symbolic property is easy; the soundness step from symbolic to computational is hard, but done once) Result: automated proof-methods yield strong properties!

17 Previous work (soundness) [AR]: soundness for indistinguishability Passive adversary [MW, BPW]: soundness for general trace properties Includes mutual authentication; active adversary Many, many others Remainder of talk: 2 non-soundness results Key-cycles (joint work with Adao, Bana, Scedrov) Secrecy (joint work with Canetti)

18 Key cycles When a key is used to encrypt itself: E_K(K) More generally: K1 encrypts K2, K2 encrypts K3, … until Kn encrypts K1: E_K1(…K2…), E_K2(…K3…), …, E_Kn(…K1…) Problem for soundness Symbolic model: key-cycles are like any other encryption Computational model: standard security defs don't apply

19 Semantic security, revisited Adversary generates m0 and m1 based on public key only! The definition doesn't talk about messages based on private keys Easy to devise semantically secure schemes that fail in presence of key-cycles

20 Counter-example Let E be a semantically-secure encryption algorithm Let E' be: E'_K(M) = E_K(M) if M ≠ K, and E'_K(M) = K if M = K E' is still semantically secure, but leaks the key outright when it encounters a key-cycle Contrived example, but valid counterexample Symbolic encryption stronger than semantic security Soundness requires new computational security definition
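A minimal sketch of the contrived scheme, using the symmetric-style notation of the slide; enc is an assumed semantically secure encryption function, and the wrapper is illustrative only.

```python
def make_eprime(enc):
    """Wrap an (assumed) semantically secure enc(key, msg) into the contrived E'."""
    def enc_prime(key, msg):
        if msg == key:            # the key-cycle E'_K(K): the single bad case
            return key            # ...which hands the adversary the key outright
        return enc(key, msg)      # every other message is encrypted as usual
    return enc_prime

# E' remains semantically secure: in the game the adversary never submits the
# (secret) key as a plaintext except with negligible probability, so the bad
# branch is essentially never exercised. Yet any protocol that actually creates
# the key-cycle E'_K(K) leaks K completely.
```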

21 Resolution: ‘KDM security’ ‘Key-dependent message security’ Proposed by [BRS/AC] Implies soundness in presence of key cycles [ABHS] Future work Devise a KDM-secure encryption algorithm Find a non-contrived non-KDM algorithm Define & implement KDM-secure hashing Note: hash-based key-cycles occur in TLS and SSH!

22 Soundness for secrecy Does symbolic secrecy imply computational secrecy? Implies weakened notion [CW], but… Unfortunately, not the UC definition Counter-example (Demo): NSL satisfies symbolic secrecy for Nb, but cannot provide UC real-or-random secrecy
Symbolic model: "If A or B output (X,K), then no composition of adversary actions can result in K" (key does not totally leak)
Computational model: "No adversary can distinguish real key from random key" (no partial leaks)

23 The ‘Rackoff attack’ (on NSL) A and B run the protocol as usual:
Message 1. A -> B: E_KB(A || Na)
Message 2. B -> A: E_KA(Na || Nb || B)
Message 3. A -> B: E_KB(Nb)
The adversary is then handed a candidate key K and must decide: K =? Nb. It encrypts the candidate and sends E_KB(K) to B; B's reaction differs depending on whether the decryption equals Nb, so the adversary distinguishes the real key from a random one.

24 Achieving soundness Every single symbolic secrecy proof has been (not wrong, but) weak: symbolic secrecy implies only weak computational properties ‘Real’ soundness requires new symbolic definition of secrecy [BPW]: ‘traditional’ secrecy + ‘non-use’ Thm: new definition implies secrecy But: must analyze infinite concurrent sessions and all resulting protocols Here: ‘traditional’ secrecy + symbolic real-or-random Non-interference property; close to ‘strong secrecy’ [B] Thm: new definition equivalent to UC real-or-random Demonstrably automatable (Demo 2)

25 Decidability of secrecy
                      Traditional secrecy                    Symbolic real-or-random
Unbounded sessions    Undecidable [EG, HT, DLMS]             Undecidable [B]
Bounded sessions      Decidable (NP-complete) [DLMS, RT]     Decidable (NP-complete)
Side effect of proof method: computational crypto automagically prevents cross-session interaction Thus, suffices to analyze single session in isolation

26 More future work Soundness Implement decision procedure for symbolic real-or-random Extend result past public-key encryption (e.g., hashing, symmetric encryption) Apply analysis to real-world protocols (TLS, SSH, etc.) What is traditional symbolic secrecy good for? Symbolic model Apply methods to new problems (crypto APIs) Unify compilation, analysis tools Symbolic notions for new properties (e.g., anonymity)

27 Conclusion Want to prove protocols secure Easy to prove security in ‘ideal’ setting (symbolic model) Meaningful to prove security in ‘real’ setting (computational model) Soundness: ‘ideal’ proof implies ‘real’ security Two aspects of symbolic model are not sound Key-cycles: must strengthen computational encryption Secrecy: must strengthen symbolic definition Important side-effect: soundness for new definition implies decidability

28 Thanks!

29 KDM-secure encryption (oversimplified) Adversary provides two functions f0 and f1 Referee chooses one at random (b ← U(0,1)), applies it to the private key, and encrypts the result: c = E_K(f_b(K^-1)) Adversary sees c and guesses b KDM security: no adversary can do better than random Strictly stronger than semantic security
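A minimal sketch of the KDM game in the same toy framing as the semantic-security sketch above; keygen, encrypt, and the adversary interface are assumptions for illustration, not the definitions from [BRS/AC].

```python
import secrets

def kdm_game(keygen, encrypt, adversary):
    """One run of the (oversimplified) KDM game from the slide."""
    pk, sk = keygen()
    f0, f1, state = adversary.choose(pk)     # adversary submits two *functions*
    b = secrets.randbits(1)
    c = encrypt(pk, (f0, f1)[b](sk))         # referee encrypts f_b applied to the private key
    return adversary.guess(state, c) == b

# KDM security: no poly-time adversary guesses b noticeably better than chance.
# Choosing f0 and f1 to be constant functions recovers the ordinary
# semantic-security game, so KDM security implies semantic security; the E'
# counter-example above shows the implication is strict.
```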

30 Overview This talk: symbolic analysis can guarantee universally composable (UC) key exchange (Paper also includes mutual authentication) Symbolic (Dolev-Yao) model: high-level framework Messages treated symbolically; adversary extremely limited Despite (general) undecidability, proofs can be automated Result: symbolic proofs are computationally sound (UC) For some protocols For strengthened symbolic definition of secrecy With UC theorems, suffices to analyze single session Implies decidability!

31 Two approaches to analysis Standard (computational) approach: reduce attacks to weakness of encryption Alternate approach: apply methods of the symbolic model Originally proposed by Dolev & Yao (1983) Cryptography without: probability, security parameter, etc. Messages are parse trees Countable symbols for keys (K, K', …), names (A, B, …) and nonces (N, N', Na, Nb, …) Encryption (E_K(M)) and pairing (M || N) are constructors Participants send/receive messages Output some key-symbol

32 The symbolic adversary Explicitly enumerated powers Interact with countable number of participants Knowledge of all public values, non-secret keys Limited set of re-write rules:
M1, M2 -> M1 || M2
M1 || M2 -> M1, M2
M, K -> E_K(M)
E_K(M), K^-1 -> M

33 ‘Traditional’ symbolic secrecy Conventional goal for symbolic secrecy proofs: “If A or B output K, then no sequence of interactions/rewrites can result in K” Undecidable in general [EG, HT, DLMS] but: Decidable with bounds [DLMS, RT] Also, general case can be automatically verified in practice Demo 1: analysis of both NSLv1, NSLv2 So what? Symbolic model has weak adversary, strong assumptions We want computational properties! …But can we harness these automated tools?

34 Two challenges 1. Traditional secrecy is undecidable for: Unbounded message sizes [EG, HT] or Unbounded number of concurrent sessions (Decidable when both are bounded) [DLMS] 2. Traditional secrecy is unsound Cannot imply standard security definitions for computational key exchange Example: NSLv2 (Demo)

35 Prior work: BPW New symbolic definition Implies UC key exchange (Public-key & symmetric encryption, signatures)

36 Our work New symbolic definition: ‘real-or-random’ Equiv. to UC key exchange (Public-key encryption [CH], signatures [P]) UC suffices to examine single protocol run Automated verification + finite system: decidability? Demo 3: UC security for NSLv1

37 Our work: solving the challenges Soundness: requires new symbolic definition of secrecy Ours: purely symbolic expression of ‘real-or-random’ security Result: new symbolic definition equivalent to UC key exchange UC theorems: sufficient to examine single protocol in isolation Thus, bounded numbers of concurrent sessions Automated verification of our new definition is decidable!… Probably

38 Summary Summary: Symbolic key-exchange sound in UC model Computational crypto can now harness symbolic tools Now have the best of both worlds: security and automation! Future work

39 Secure key-exchange: UC? Answer: yes, it matters Negative result [CH]: traditional symbolic secrecy does not imply universally composable key exchange

40 Secure key-exchange: UC? Adversary gets key when output by participants Does this matter? (Demo 2)

41 Secure key-exchange [CW] Adversary interacts with participants Afterward, receives real key and random key (K, K') Protocol secure if adversary unable to distinguish them NSLv1, NSLv2 satisfy symbolic def of secrecy Therefore, NSLv1, NSLv2 meet this definition as well

42 KE? Adversary unable to distinguish real/ideal worlds Effectively: real or random keys Adversary gets candidate key at end of protocol NSL1, NSL2 secure by this defn.

43 Analysis strategy (diagram): Would like: concrete protocol realizes the UC key-exchange functionality. Instead: translate the concrete protocol to a Dolev-Yao protocol (natural translation for a large class of protocols), prove Dolev-Yao key-exchange for it (simple, automated), and use the main result of the talk to conclude UC key exchange (need only be done once).

44 Proof overview (soundness) Chain of reductions: symbolic key-exchange => single-session UC KE (ideal crypto) [construct simulator; information-theoretic] => multi-session UC KE (ideal crypto) [UC theorem] => multi-session KE (CCA-2 crypto) [UC w/ joint state [CR]] Must strengthen notion of UC public-key encryption Intermediate step: trace properties (as in [MW, BPW]) Every activity-trace of UC adversary could also be produced by symbolic adversary Rephrase: UC adversary no more powerful than symbolic adversary

45 “Simple” protocols Concrete protocols that map naturally to Dolev-Yao framework Two cryptographic operations: Randomness generation Encryption/decryption (This talk: asymmetric encryption) Example: Needham-Schroeder-Lowe P1P2 {P1, N1} K2 {P2, N1, N2} K1 {N2} K2

46 UC Key-Exchange Functionality F_KE (diagram): for a session between P1 and P2, F_KE chooses k ← {0,1}^n and delivers (Key, k) to P1 (for session (P1, P2)) and to P2 (for session (P2, P1)); the adversary A is only notified that the keys were delivered and does not receive k.

47 The Dolev-Yao model Participants, adversary take turns Participant turn (diagram): the adversary delivers a message M1 to a participant, which replies with a message M2 and may produce a local output L Local output: not seen by adversary

48 The Dolev-Yao adversary Adversary turn (diagram): the adversary applies its deduction rules to its knowledge set Know

49 Dolev-Yao adversary powers
Already in Know          Can add to Know
M1, M2                   Pair(M1, M2)
Pair(M1, M2)             M1 and M2
M, K                     Enc(M, K)
Enc(M, K), K^-1          M
Always in Know: randomness generated by adversary, private keys generated by adversary, all public keys

50 The Dolev-Yao adversary (diagram): on its turn the adversary sends a message M, derived from Know, to a participant

51 Dolev-Yao key exchange Assume that last step of (successful) protocol execution is local output of (Finished Pi Pj K) 1. Key Agreement: If P1 outputs (Finished P1 P2 K) and P2 outputs (Finished P2 P1 K’) then K = K’. 2. Traditional Dolev-Yao secrecy: If Pi outputs (Finished Pi Pj K), then K can never be in adversary’s set Know Not enough!

52 Goal of the environment Recall that the environment Z sees outputs of participants Goal: distinguish real protocol from simulation In protocol execution, output of participants (session key) related to protocol messages In ideal world, output independent of simulated protocol If there exists a detectable relationship between session key and protocol messages, environment can distinguish Example: last message of protocol is {“confirm”} K where K is session key Can decrypt with participant output from real protocol Can’t in simulated protocol
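A minimal sketch (my own toy instantiation, not from the talk) of that distinguishing environment: a keyed tag stands in for the final {"confirm"}_K message, and the environment checks the participant's output key against it.

```python
import hmac, hashlib, secrets

def toy_enc(key, msg):
    # Stand-in "encryption" of the confirm message: a keyed tag is enough
    # to model the consistency check the environment performs.
    return hmac.new(key, msg, hashlib.sha256).digest()

session_key = secrets.token_bytes(16)
last_message = toy_enc(session_key, b"confirm")   # final protocol message, seen by the environment

real_output = session_key               # real execution: output key matches the messages
ideal_output = secrets.token_bytes(16)  # ideal execution: output key is independent of them

def environment_guess(output_key):
    matches = hmac.compare_digest(toy_enc(output_key, b"confirm"), last_message)
    return "real" if matches else "ideal"

print(environment_guess(real_output))    # -> real
print(environment_guess(ideal_output))   # -> ideal (with overwhelming probability)
```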

53 Real-or-random (1/3) Need: real-or-random property for session keys Can think of traditional goal as "computational" Need a stronger "decisional" goal Expressed in Dolev-Yao framework Let π be a protocol Let π_r be π, except that when a participant outputs (Finished Pi Pj Kr), Kr is added to Know Let π_f be π, except that when any participant outputs (Finished Pi Pj Kr), a fresh key Kf is added to the adversary set Know Want: adversary can't distinguish the two protocols

54 Real-or-random (2/3) Attempt 1: Let Traces(π) be the traces the adversary can induce on π. Then require: Traces(π_r) = Traces(π_f) Problem: Kf not in any traces of π_r Attempt 2: Traces(π_r) = Rename(Traces(π_f), Kf -> Kr) Problem: two different traces may "look" the same Example protocol: if a participant receives the session key, it encrypts "yes" under its own (secret) key; otherwise, it encrypts "no" instead Traces differ, but the adversary can't tell

55 Real-or-random (3/3) Observable part of trace: Abadi-Rogaway pattern Undecipherable encryptions replaced by a "blob" Example: t = {N1, N2}_K1, {N2}_K2, K1^-1 Pattern(t) = {N1, N2}_K1, {blob}_K2, K1^-1 Final condition: Pattern(Traces(π_r)) = Pattern(Rename(Traces(π_f), Kf -> Kr))
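A minimal sketch (illustrative, not the formal definition) of computing Abadi-Rogaway patterns, reusing the tuple encoding and the "^-1" key-naming convention from the closure sketch earlier; the recoverable_keys helper is deliberately simplistic.

```python
def inv(key):
    return key[:-3] if key.endswith("^-1") else key + "^-1"

def recoverable_keys(msgs):
    # Deliberately simple for the sketch: only private keys sent in the clear.
    return {m for m in msgs if isinstance(m, str) and m.endswith("^-1")}

def pattern(msg, known_keys):
    if isinstance(msg, tuple) and msg[0] == "pair":
        return ("pair", pattern(msg[1], known_keys), pattern(msg[2], known_keys))
    if isinstance(msg, tuple) and msg[0] == "enc":
        if inv(msg[1]) in known_keys:                 # decipherable: look inside
            return ("enc", msg[1], pattern(msg[2], known_keys))
        return ("enc", msg[1], "BLOB")                # undecipherable: only the key shows
    return msg

# Example from the slide: t = {N1, N2}_K1, {N2}_K2, K1^-1
t = [("enc", "K1", ("pair", "N1", "N2")), ("enc", "K2", "N2"), "K1^-1"]
keys = recoverable_keys(t)
print([pattern(m, keys) for m in t])
# -> [('enc', 'K1', ('pair', 'N1', 'N2')), ('enc', 'K2', 'BLOB'), 'K1^-1']
```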

56 Main results Let key-exchange in the Dolev-Yao model be: key agreement, traditional Dolev-Yao secrecy of the session key, and real-or-random Let π be a simple protocol that uses UC asymmetric encryption. Then: DY(π) satisfies Dolev-Yao key exchange iff UC(π) securely realizes F_KE

57 Future work How to prove Dolev-Yao real-or-random? Needed for UC security Not previously considered in the Dolev-Yao literature Can it be automated? Weaker forms of DY real-or-random Similar results for symmetric encryption and signatures

58 Summary & future work Result: symbolic proofs are computationally sound (UC) For some protocols For strengthened symbolic definition of secrecy With UC theorems, suffices to analyze single session Implies decidability! Additional primitives Have public-key encryption, signatures [P] Would like symmetric encryption, MACs, PRFs… Symbolic representation of other goals Commitment schemes, ZK, MPC…


Download ppt "Proving Security Protocols Correct— Correctly Jonathan Herzog 21 March 2006 The author's affiliation with The MITRE Corporation is provided for identification."

Similar presentations


Ads by Google