
1 Towards a Theory of Onion Routing Aaron Johnson Yale University 5/27/2008

2 Overview 1.Anonymous communication and onion routing 2.Formally model and analyze onion routing (Financial Cryptography 2007) 3.Probabilistic analysis of onion routing (Workshop on Privacy in the Electronic Society 2007) 1

3 Anonymous Communication: What? Setting 2

4 Anonymous Communication: What? Setting –Communication network 2

5 Anonymous Communication: What? Setting –Communication network –Adversary 2

6 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity 2

7 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity –Sender anonymity 2

8 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity –Sender anonymity –Receiver anonymity 2

9 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity –Sender anonymity –Receiver anonymity w.r.t. a message 2

10 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity –Sender anonymity –Receiver anonymity –Unlinkability w.r.t. a message 2

11 Anonymous Communication: What? Setting –Communication network –Adversary Anonymity –Sender anonymity –Receiver anonymity –Unlinkability w.r.t. a message w.r.t. all communication 2

12 Anonymous Communication: Why? 3

13 Useful –Individual privacy online –Corporate privacy –Government and foreign intelligence –Whistleblowers 3

14 Anonymous Communication: Why? Useful –Individual privacy online –Corporate privacy –Government and foreign intelligence –Whistleblowers Interesting –How to define? –Possible in communication networks? –Cryptography from anonymity 3

15 Anonymous Communication Protocols Mix Networks (1981) Dining cryptographers (1988) Onion routing (1999) Anonymous buses (2002) 4

16 Anonymous Communication Protocols Mix Networks (1981) Dining cryptographers (1988) Onion routing (1999) Anonymous buses (2002) Crowds (1998) PipeNet (1998) Xor-trees (2000) 4 Tarzan (2002) Hordes (2002) Salsa (2006) ISDN, pool, Stop-and-Go, timed, cascade mixes, etc.

17 Deployed Anonymity Systems anon.penet.fi Freedom Mixminion Mixmaster Tor JAP FreeNet anonymizer.com and other single-hop proxies I2P MUTE Nodezilla etc. 5

18 Onion Routing Practical design with low latency and overhead Open source implementation (http://tor.eff.org) Over 1000 volunteer routers Estimated 200,000 users Sophisticated design 6

19 Anonymous Communication Mix Networks Dining cryptographers Onion routing Anonymous buses Deployed Analyzed 7

20 A Model of Onion Routing with Provable Anonymity Johnson, Feigenbaum, and Syverson Financial Cryptography 2007 Formally model onion routing using input/output automata Characterize the situations that provide possibilistic anonymity 8

21 How Onion Routing Works User u running client Internet destination d Routers running servers ud 12 3 4 5 9

22 How Onion Routing Works ud 1. u creates l-hop circuit through routers 12 3 4 5 9

23 How Onion Routing Works ud 1. u creates l-hop circuit through routers 12 3 4 5 9

24 How Onion Routing Works ud 1. u creates l-hop circuit through routers 12 3 4 5 9

25 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 12 3 4 5 9

26 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {{{m}_3}_4}_1 12 3 4 5 9

27 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {{m}_3}_4 12 3 4 5 9

28 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {m}_3 12 3 4 5 9

29 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged m 12 3 4 5 9

30 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged m 12 3 4 5 9

31 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {m}_3 12 3 4 5 9

32 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {{m}_3}_4 12 3 4 5 9

33 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged {{{m}_3}_4}_1 12 3 4 5 9

34 How Onion Routing Works ud 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged. 4.Stream is closed. 12 3 4 5 9

35 How Onion Routing Works u 1. u creates l-hop circuit through routers 2. u opens a stream in the circuit to d 3.Data are exchanged. 4.Stream is closed. 5.Circuit is changed every few minutes. 12 3 4 5 d 9
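The layered encryption in step 3 can be made concrete with a short sketch. The snippet below is illustrative only, not Tor's actual cell protocol: Fernet from the `cryptography` package stands in for the per-hop symmetric keys negotiated during circuit setup, and names such as `build_onion` and `peel_layer` are invented for the example. The client wraps m once per router on the circuit 1, 4, 3, and each router strips exactly one layer, so only the exit router ever sees m and only the entry router ever sees u.

```python
# A minimal sketch of the layered encryption {{{m}_3}_4}_1 above.
# Fernet is a stand-in for the per-hop symmetric keys; all names here
# are illustrative, not part of the Tor protocol.
from cryptography.fernet import Fernet

def build_onion(message: bytes, hop_keys: list) -> bytes:
    """Wrap the message once per hop; the innermost layer is for the exit."""
    onion = message
    for key in reversed(hop_keys):          # encrypt for the exit router first
        onion = Fernet(key).encrypt(onion)
    return onion

def peel_layer(onion: bytes, key: bytes) -> bytes:
    """Each router removes exactly one layer with its own key."""
    return Fernet(key).decrypt(onion)

# Circuit u -> 1 -> 4 -> 3 -> d: u shares one key with each of routers 1, 4, 3.
keys = [Fernet.generate_key() for _ in range(3)]   # keys for routers 1, 4, 3 in circuit order
onion = build_onion(b"request for d", keys)        # {{{m}_3}_4}_1
for k in keys:                                     # routers 1, 4, 3 peel in turn
    onion = peel_layer(onion, k)
assert onion == b"request for d"                   # the exit router recovers m
```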

36 How Onion Routing Works u 12 3 4 5 d 10

37 How Onion Routing Works u 12 3 4 5 d 11

38 How Onion Routing Works u 12 3 4 5 d Theorem 1: Adversary can only determine parts of a circuit it controls or is next to. 11

39 How Onion Routing Works u 12 3 4 5 d Theorem 1: Adversary can only determine parts of a circuit it controls or is next to. u12 11

40 Model Constructed with I/O automata (Lynch & Tuttle, 1989) –Models asynchrony –Relies on abstract properties of cryptosystem Simplified onion-routing protocol –Each user constructs a circuit to one destination –No separate destinations –No circuit teardowns Circuit identifiers 12

41 Automata Protocol u v w 13

42 Automata Protocol u v w 13

43 Automata Protocol u v w 13

44 Automata Protocol u v w 13

45 Automata Protocol u v w 13

46 Automata Protocol u v w 13

47 Automata Protocol u v w 13

48 Automata Protocol u v w 13

49 Automata Protocol u v w 13

50 Automata Protocol u v w 13

51 Creating a Circuit u123 15

52 Creating a Circuit [0, {CREATE}_1] 1.CREATE/CREATED u123 15

53 Creating a Circuit [0,CREATED] 1.CREATE/CREATED u123 15

54 Creating a Circuit 1.CREATE/CREATED u123 15

55 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED [0, {[EXTEND, 2, {CREATE}_2]}_1] u123 15

56 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED [l_1, {CREATE}_2] u123 15

57 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED [l_1, CREATED] u123 15

58 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED [0, {EXTENDED}_1] u123 15

59 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] [0, {{[EXTEND, 3, {CREATE}_3]}_2}_1] u123 15

60 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] u123 [l_1, {[EXTEND, 3, {CREATE}_3]}_2] 15

61 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] [l_2, {CREATE}_3] u123 15

62 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] [l_2, CREATED] u123 15

63 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] [l_1, {EXTENDED}_2] u123 15

64 Creating a Circuit 1.CREATE/CREATED 2.EXTEND/EXTENDED 3.[Repeat with layer of encryption] [0, {{EXTENDED}_2}_1] u123 15
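The CREATE/EXTEND exchange on slides 52-64 can be summarized as a symbolic sketch. Here `Enc(key, payload)` stands for the abstract encryption {payload}_key of the model rather than real cryptography, and the identifiers (hop1, k1, and so on) are invented for the illustration; the point is only that each EXTEND cell carries the next CREATE handshake under one more layer, so each router learns its neighbors and nothing else.

```python
# Symbolic sketch of the telescoping circuit construction above.
# Enc(key, payload) models the abstract encryption {payload}_key; it is
# not real crypto, and all identifiers are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Enc:                      # {payload}_key
    key: str
    payload: object

def extend(next_router: str, next_key: str) -> list:
    # An EXTEND cell carries a CREATE handshake for the next router,
    # readable only by that router.
    return ["EXTEND", next_router, Enc(next_key, "CREATE")]

# u builds the circuit u -> 1 -> 2 -> 3 (slides 51-64):
hop1 = ["0", Enc("k1", "CREATE")]                     # [0, {CREATE}_1]
hop2 = ["0", Enc("k1", extend("2", "k2"))]            # [0, {[EXTEND,2,{CREATE}_2]}_1]
hop3 = ["0", Enc("k1", Enc("k2", extend("3", "k3")))] # [0, {{[EXTEND,3,{CREATE}_3]}_2}_1]

# Router 1 strips the k1 layer and forwards on link l_1; router 2 strips k2
# and forwards on l_2; CREATED/EXTENDED replies travel back the same way,
# gaining one layer per hop.
print(hop1, hop2, hop3, sep="\n")
```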

65 Input/Output Automata States Actions transition between states Alternating state/action sequence is an execution In fair executions actions enabled infinitely often occur infinitely often In cryptographic executions no encrypted protocol messages are sent before they are received unless the sender possesses the key 14
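A toy example may help fix the I/O-automaton vocabulary just listed; the one-slot channel below has a state, an always-enabled input action, an output action with an enabling condition, and an execution recorded as an alternating state/action sequence. It is only an illustration of the framework, not the paper's user or server automata.

```python
# Toy I/O automaton (Lynch-Tuttle style): a one-slot FIFO channel.
# Illustrative only; not the user/server automata of the model.
from typing import Optional

class OneSlotChannel:
    def __init__(self):
        self.buffer: Optional[str] = None            # the state

    def send(self, m: str) -> None:                  # input action: always enabled
        self.buffer = m

    def receive_enabled(self) -> bool:               # precondition of the output action
        return self.buffer is not None

    def receive(self) -> str:                        # output action: effect clears the slot
        assert self.receive_enabled()
        m, self.buffer = self.buffer, None
        return m

# An execution is an alternating state/action sequence s0, a1, s1, a2, s2, ...
ch = OneSlotChannel()
execution = [ch.buffer]
ch.send("CREATE");      execution += ["send(CREATE)", ch.buffer]
m = ch.receive();       execution += ["receive()", ch.buffer]
print(execution)        # [None, 'send(CREATE)', 'CREATE', 'receive()', None]
# Fairness: if receive stays enabled forever, it must eventually occur.
```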

66 I/O Automata Model Automata –User –Server –Complete network of FIFO Channels –Adversary replaces some servers with arbitrary automata Notation –U is the set of users –R is the set of routers –N = U ∪ R is the set of all agents –A ⊆ N is the adversary –K is the keyspace –l is the (fixed) circuit length –k(u,c,i) denotes the ith key used by user u on circuit c 16

67 User automaton 17

68 Server automaton 18

69 Anonymity 19 Definition (configuration): A configuration is a function U → R^l mapping each user to his circuit.

70 Anonymity Definition (indistinguishable executions): Executions α and α′ are indistinguishable to adversary A when his actions in α are the same as his actions in α′ after possibly applying the following: a permutation on the keys not held by A, and a permutation on the messages encrypted by a key not held by A. Definition (configuration): A configuration is a function U → R^l mapping each user to his circuit. 19

71 Anonymity 20 Definition (indistinguishable configurations): Configurations C and D are indistinguishable to adversary A when, for every fair, cryptographic execution of C, there exists a fair, cryptographic execution of D that is indistinguishable to A.

72 Anonymity Definition (unlinkability): User u is unlinkable to d in configuration C with respect to adversary A if there exists an indistinguishable configuration D in which u does not talk to d. 20 Definition (indistinguishable configurations): Configurations C and D are indistinguishable to adversary A when, for every fair, cryptographic execution of C, there exists a fair, cryptographic execution of D that is indistinguishable to A.

73 C u v 12 3 4 5 21 Main Theorems

74 3 2 D 21 Main Theorems C u v 12 3 4 5

75 21 Main Theorems C u v 12 3 4 5 3 2 D v u 225 4

76 21 C u v 12 3 4 5 Main Theorems D u v 12 3 4 5

77 Theorem 1: Let C and D be configurations for which there exists a permutation π : U → U such that C_i(u) = D_i(π(u)) if C_i(u) or D_i(π(u)) is compromised or is adjacent to a compromised router. Then C and D are indistinguishable. 21 Main Theorems

78 Theorem 1: Let C and D be configurations for which there exists a permutation π : U → U such that C_i(u) = D_i(π(u)) if C_i(u) or D_i(π(u)) is compromised or is adjacent to a compromised router. Then C and D are indistinguishable. 21 Main Theorems Theorem 2: Given configuration C, let (r_{i-1}, r_i, r_{i+1}) be three consecutive routers in a circuit such that {r_{i-1}, r_i, r_{i+1}} ∩ A = ∅. Let D be identical to configuration C except r_i has been replaced with r_i′ ∉ A. Then C and D are indistinguishable.

79 Theorem 1: Let C and D be configurations for which there exists a permutation π : U → U such that C_i(u) = D_i(π(u)) if C_i(u) or D_i(π(u)) is compromised or is adjacent to a compromised router. Then C and D are indistinguishable. 21 Main Theorems Theorem 2: Given configuration C, let (r_{i-1}, r_i, r_{i+1}) be three consecutive routers in a circuit such that {r_{i-1}, r_i, r_{i+1}} ∩ A = ∅. Let D be identical to configuration C except r_i has been replaced with r_i′ ∉ A. Then C and D are indistinguishable. Theorem 3: If configurations C and D are indistinguishable, then D can be reached from C by applying a sequence of transformations of the type described in Theorems 1 and 2.

80 Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except the circuits of users u and v are switched. C and D are indistinguishable to A. 22

81 Proof: Given execution α of C, construct α′: 1. Replace any message sent or received between u (v) and C_1(u) (C_1(v)) in α with a message sent or received between v (u) and C_1(u) (C_1(v)). 22 Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except the circuits of users u and v are switched. C and D are indistinguishable to A.

82 Proof: Given execution α of C, construct α′: 1. Replace any message sent or received between u (v) and C_1(u) (C_1(v)) in α with a message sent or received between v (u) and C_1(u) (C_1(v)). 2. Let the permutation π send u to v and v to u and other users to themselves. Apply π to the encryption keys. 22 Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except the circuits of users u and v are switched. C and D are indistinguishable to A.

83 Proof: Given execution α of C, construct α′: 1. Replace any message sent or received between u (v) and C_1(u) (C_1(v)) in α with a message sent or received between v (u) and C_1(u) (C_1(v)). 2. Let the permutation π send u to v and v to u and other users to themselves. Apply π to the encryption keys. i. α′ is an execution of D. ii. α′ is fair. iii. α′ is cryptographic. iv. α′ is indistinguishable. 22 Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except the circuits of users u and v are switched. C and D are indistinguishable to A.

84 Unlinkability Corollary: A user is unlinkable to its destination when: 23

85 Unlinkability 2 3 u 4? 5? The last router is unknown. Corollary: A user is unlinkable to its destination when: 23

86 OR Unlinkability 2 3 u 4? 5? The last router is unknown. 1 2 4 The user is unknown and another unknown user has an unknown destination. 5 2? 5? 4? Corollary: A user is unlinkable to its destination when: 23

87 OR 1 2 4 The user is unknown and another unknown user has a different destination. 5 1 2 Unlinkability 2 3 u 4? 5? The last router is unknown. 1 2 4 The user is unknown and another unknown user has an unknown destination. 5 2? 5? 4? Corollary: A user is unlinkable to its destination when: 23

88 Model Robustness Only single encryption still works Can include data transfer Can allow users to create multiple circuits 24

89 A Probabilistic Analysis of Onion Routing in a Black-box Model Johnson, Feigenbaum, and Syverson Workshop on Privacy in the Electronic Society 2007 Use a black-box abstraction to create a probabilistic model of onion routing Analyze unlinkability Provide upper and lower bounds on anonymity Examine a typical case 25

90 Anonymity u12 3 4 5 d 1. 2. 3. 4. v w e f 26

91 Anonymity u12 3 4 5 d 1.First router compromised 2. 3. 4. v w e f 26

92 Anonymity u12 3 4 5 d 1.First router compromised 2.Last router compromised 3. 4. v w e f 26

93 Anonymity u12 3 4 5 d 1.First router compromised 2.Last router compromised 3.First and last compromised 4. v w e f 26

94 Anonymity u12 3 4 5 d 1.First router compromised 2.Last router compromised 3.First and last compromised 4.Neither first nor last compromised v w e f 26

95 Black-box Abstraction ud v w e f 27

96 Black-box Abstraction ud v w e f 1. Users choose a destination 27

97 Black-box Abstraction ud v w e f 1. Users choose a destination 2.Some inputs are observed 27

98 Black-box Abstraction ud v w e f 1. Users choose a destination 2.Some inputs are observed 3.Some outputs are observed 27

99 Black-box Anonymity ud v w e f The adversary can link observed inputs and outputs of the same user. 28

100 Black-box Anonymity ud v w e f The adversary can link observed inputs and outputs of the same user. Any configuration consistent with these observations is indistinguishable to the adversary. 28

101 Black-box Anonymity ud v w e f The adversary can link observed inputs and outputs of the same user. Any configuration consistent with these observations is indistinguishable to the adversary. 28

102 Black-box Anonymity ud v w e f The adversary can link observed inputs and outputs of the same user. Any configuration consistent with these observations is indistinguishable to the adversary. 28

103 Probabilistic Black-box ud v w e f 29

104 Probabilistic Black-box ud v w e f Each user v selects a destination from distribution p_v p_u 29

105 Probabilistic Black-box ud v w e f Each user v selects a destination from distribution p_v Inputs and outputs are observed independently with probability b p_u 29

106 Black Box Model Let U be the set of users. Let Δ be the set of destinations. Configuration C: User destinations C_D : U → Δ Observed inputs C_I : U → {0,1} Observed outputs C_O : U → {0,1} Let X be a random configuration such that: Pr[X=C] = Π_u [p_u^{C_D(u)}] [b^{C_I(u)} (1-b)^{1-C_I(u)}] [b^{C_O(u)} (1-b)^{1-C_O(u)}] 30
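The distribution Pr[X=C] above factors over users, so sampling a random configuration is straightforward. The sketch below assumes destinations and probabilities are given as plain dictionaries; `sample_configuration` and the example distribution are invented for the illustration.

```python
# Sample a configuration X = (X_D, X_I, X_O) from the black-box model:
# each user draws a destination from its distribution p_v, and each
# input/output is observed independently with probability b.
import random

def sample_configuration(p, b, rng=random):
    """p: dict user -> dict destination -> probability;  b: observation probability."""
    C_D, C_I, C_O = {}, {}, {}
    for v, dist in p.items():
        dests, weights = zip(*dist.items())
        C_D[v] = rng.choices(dests, weights=weights)[0]   # destination of v
        C_I[v] = rng.random() < b                         # first router compromised?
        C_O[v] = rng.random() < b                         # last router compromised?
    return C_D, C_I, C_O

p_example = {"u": {"d": 0.5, "e": 0.5}, "v": {"d": 0.2, "e": 0.8}, "w": {"f": 1.0}}
print(sample_configuration(p_example, b=0.25))
```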

107 Probabilistic Anonymity ud v w e f ud v w e f ud v w e f ud v w e f Indistinguishable configurations 31 Conditional distribution: Pr[u → d] = 1

108 Probabilistic Anonymity The metric Y for the unlinkability of u and d in C is: Y(C) = Pr[X_D(u)=d | X ≈ C] Exact Bayesian inference Adversary after long-term intersection attack Worst-case adversary Unlinkability given that u visits d: E[Y | X_D(u)=d] 32
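For very small instances the metric Y can be computed by brute force: enumerate every configuration, keep the ones that produce the same adversary observation as C, and take the posterior weight of those in which u talks to d. The observation function below (linked pairs, senders seen without a destination, destinations seen without a sender) is one simple reading of the black-box observations, and every name in the sketch is invented; it is reference code for toy examples, not the paper's analysis.

```python
# Brute-force posterior Pr[X_D(u)=d | X indistinguishable from C] for toy
# instances.  The observation function is a simplified reading of the
# black-box model; enumeration is exponential in the number of users.
from itertools import product

def observation(C_D, C_I, C_O):
    linked    = frozenset((v, C_D[v]) for v in C_D if C_I[v] and C_O[v])
    senders   = frozenset(v for v in C_D if C_I[v] and not C_O[v])
    receivers = tuple(sorted(C_D[v] for v in C_D if C_O[v] and not C_I[v]))
    return linked, senders, receivers

def posterior_Y(p, b, C, u, d):
    """Y(C): probability that u talks to d, given what the adversary saw in C."""
    users = list(p)
    target = observation(*C)
    numer = denom = 0.0
    for dests in product(*(list(p[v]) for v in users)):
        for ins in product([False, True], repeat=len(users)):
            for outs in product([False, True], repeat=len(users)):
                C_D = dict(zip(users, dests))
                C_I = dict(zip(users, ins))
                C_O = dict(zip(users, outs))
                pr = 1.0
                for v in users:            # Pr[X=C] from the previous slide
                    pr *= p[v][C_D[v]]
                    pr *= b if C_I[v] else 1 - b
                    pr *= b if C_O[v] else 1 - b
                if observation(C_D, C_I, C_O) == target:
                    denom += pr
                    if C_D[u] == d:
                        numer += pr
    return numer / denom
```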

109 Anonymity Bounds 1.Lower bound: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d 33

110 Anonymity Bounds 1.Lower bound: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d 2.Upper bounds: a. p_v^{e*} = 1 for all v ≠ u, where p_u^{e*} ≤ p_u^e for e ≠ d b. p_v^d = 1 for all v ≠ u 33

111 Anonymity Bounds 1.Lower bound: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d 2.Upper bounds: a. p_v^{e*} = 1 for all v ≠ u, where p_u^{e*} ≤ p_u^e for e ≠ d E[Y | X_D(u)=d] ≤ b + (1-b) p_u^d + O(√(log n / n)) b. p_v^d = 1 for all v ≠ u E[Y | X_D(u)=d] ≈ b^2 + (1-b^2) p_u^d + O(√(log n / n)) 33

112 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d 34

113 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: 34

114 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0] 34

115 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0] 34

116 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let C_i be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. 34

117 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let C_i be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])^2 / (Pr[C_i] Pr[X_D(u)=d]) 34

118 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let C_i be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])^2 / (Pr[C_i] Pr[X_D(u)=d]) ≥ (Σ_i Pr[D_i])^2 / ((Σ_i Pr[C_i]) Pr[X_D(u)=d]) by Cauchy-Schwarz 34

119 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let C_i be the configuration equivalence classes. Let D_i be the event C_i ∧ X_D(u)=d. E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])^2 / (Pr[C_i] Pr[X_D(u)=d]) ≥ (Σ_i Pr[D_i])^2 / ((Σ_i Pr[C_i]) Pr[X_D(u)=d]) = p_u^d by Cauchy-Schwarz 34

120 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0] 34

121 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0] ≥ b^2 + b(1-b) p_u^d + (1-b) p_u^d 34

122 Lower Bound Theorem 2: E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Proof: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) E[Y | X_D(u)=d ∧ X_I(u)=0] ≥ b^2 + b(1-b) p_u^d + (1-b) p_u^d = b^2 + (1-b^2) p_u^d 34
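Using the two sketches above (`sample_configuration` and `posterior_Y`), the lower bound can be checked numerically on a toy instance: draw configurations conditioned on u talking to d, average Y, and compare against b^2 + (1-b^2) p_u^d. Everything here is illustrative; the parameter values are arbitrary and the run is feasible only for a handful of users.

```python
# Monte Carlo check of E[Y | X_D(u)=d] >= b^2 + (1-b^2) p_u^d on a toy
# instance, reusing sample_configuration and posterior_Y from the sketches
# above.  Parameters are arbitrary illustrations.
import random

p = {"u": {"d": 0.3, "e": 0.7},
     "v": {"d": 0.5, "e": 0.5},
     "w": {"d": 0.2, "e": 0.8}}
b, u, d = 0.4, "u", "d"

rng = random.Random(0)
total, samples = 0.0, 0
while samples < 2000:
    C = sample_configuration(p, b, rng)
    if C[0][u] != d:                      # condition on X_D(u) = d
        continue
    total += posterior_Y(p, b, C, u, d)
    samples += 1

print("estimated E[Y | X_D(u)=d]:", total / samples)
print("lower bound b^2+(1-b^2)p_u^d:", b**2 + (1 - b**2) * p[u][d])
```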

123 Upper Bound 35

124 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{e*} = 1 for all v ≠ u OR 2. p_v^d = 1 for all v ≠ u Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ … 35

125 Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{e*} = 1 for all v ≠ u OR 2. p_v^d = 1 for all v ≠ u Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ … Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v. 35

126 Show max. occurs when, for all v ≠ u, e_v = d or e_v = e*. Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{e*} = 1 for all v ≠ u OR 2. p_v^d = 1 for all v ≠ u Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ … Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v. 35

127 Show max. occurs when, for all v ≠ u, e_v = d or e_v = e*. Upper Bound Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when 1. p_v^{e*} = 1 for all v ≠ u OR 2. p_v^d = 1 for all v ≠ u Let p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ … Show max. occurs when, for all v ≠ u, p_v^{e_v} = 1 for some e_v. Show max. occurs when e_v = d for all v ≠ u, or when e_v = e* for all v ≠ u. 35

128 Upper-bound Estimates Let n be the number of users. 36

129 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] Let n be the number of users. 36

130 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] Theorem 5: When p_v^d = 1 for all v ≠ u: E[Y | X_D(u)=d] = b^2 + b(1-b) p_u^d + (1-b) p_u^d /(1-(1-p_u^d)b) + O(√(log n / n)) Let n be the number of users. 36

131 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] Let n be the number of users. 36

132 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] ≈ b + (1-b) p_u^d Let n be the number of users. For p_u^{e*} small 36

133 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] ≈ b + (1-b) p_u^d E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let n be the number of users. For p_u^{e*} small 36

134 Upper-bound Estimates Theorem 4: When p_v^{e*} = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b) p_u^d + (1-b)^2 p_u^d [ (1-b)/(1-(1-p_u^{e*})b) + O(√(log n / n)) ] ≈ b + (1-b) p_u^d E[Y | X_D(u)=d] ≥ b^2 + (1-b^2) p_u^d Let n be the number of users. Increased chance of total compromise from b^2 to b. For p_u^{e*} small 36

135 Typical Case Let each user select from the Zipfian distribution: p^{d_i} ∝ 1/i^s Theorem 6: E[Y | X_D(u)=d] = b^2 + (1-b^2) p_u^d + O(1/n) 37

136 Typical Case Let each user select from the Zipfian distribution: p^{d_i} ∝ 1/i^s Theorem 6: E[Y | X_D(u)=d] = b^2 + (1-b^2) p_u^d + O(1/n) E[Y | X_D(u)=d] ≈ b^2 + (1-b^2) p_u^d 37
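A small sketch of the Zipfian destination distribution used in the typical case; the exponent s, the number of destinations, and the parameter values are assumptions of the illustration, and the last line only evaluates the b^2 + (1-b^2) p_u^d estimate from Theorem 6.

```python
# Zipfian destination distribution p^{d_i} proportional to 1/i^s, normalized
# over a finite set of destinations; values are illustrative.
def zipf_distribution(destinations, s=1.0):
    weights = [1.0 / (i ** s) for i in range(1, len(destinations) + 1)]
    total = sum(weights)
    return {dst: w / total for dst, w in zip(destinations, weights)}

dests = [f"d{i}" for i in range(1, 101)]
p_zipf = zipf_distribution(dests, s=1.0)
b, p_u_d = 0.25, p_zipf["d1"]                       # suppose u's destination is d1
print("b^2 + (1-b^2)*p_u^d =", b**2 + (1 - b**2) * p_u_d)   # Theorem 6 estimate
```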

137 Future Work Investigate improved protocols to defeat timing attacks. Examine how quickly users' distributions are learned. Formally analyze scalable, P2P designs. 38

138 Related work A Formal Treatment of Onion Routing Jan Camenisch and Anna Lysyanskaya CRYPTO 2005 A formalization of anonymity and onion routing S. Mauw, J. Verschuren, and E.P. de Vink ESORICS 2004 I/O Automaton Models and Proofs for Shared-Key Communication Systems Nancy Lynch CSFW 1999 5

139 Overview Formally model onion routing using input/output automata –Simplified onion-routing protocol –Non-cryptographic analysis Characterize the situations that provide anonymity 6

140 Overview Formally model onion routing using input/output automata –Simplified onion-routing protocol –Non-cryptographic analysis Characterize the situations that provide anonymity –Send a message, receive a message, communicate with a destination –Possibilistic anonymity 6

141 Future Work Construct better models of time Exhibit a cryptosystem with the desired properties Incorporate probabilistic behavior by users 26

142 Related Work A Model of Onion Routing with Provable Anonymity J. Feigenbaum, A. Johnson, and P. Syverson FC 2007 Towards an Analysis of Onion Routing Security P. Syverson, G. Tsudik, M. Reed, and C. Landwehr PET 2000 An Analysis of the Degradation of Anonymous Protocols M. Wright, M. Adler, B. Levine, and C. Shields NDSS 2002

