1
A Formal Analysis of Onion Routing. 10/26/2007. Aaron Johnson (Yale), with Joan Feigenbaum (Yale) and Paul Syverson (NRL)

2
Papers 1. A Model of Onion Routing with Provable Anonymity. Financial Cryptography and Data Security 2007. 2. A Probabilistic Analysis of Onion Routing in a Black-box Model. Workshop on Privacy in the Electronic Society 2007.

3
Anonymous Communication. Sender anonymity: the adversary can't determine the sender of a given message. Receiver anonymity: the adversary can't determine the receiver of a given message. Relationship anonymity: the adversary can't determine who talks to whom.


5
How Onion Routing Works. User u runs a client; routers run servers; d is an Internet destination.

6
How Onion Routing Works 1. u creates a 3-hop circuit through the routers


9
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d

10
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged: {{{m}_3}_4}

11
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged: {{m}_3}

12
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged: {m}

13
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged: m


18
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged. 4. The stream is closed.

19
How Onion Routing Works 1. u creates a 3-hop circuit through the routers 2. u opens a stream in the circuit to d 3. Data is exchanged. 4. The stream is closed. 5. The circuit is changed every few minutes.
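The layered data exchange in step 3 can be sketched as a toy model. Encryption is kept abstract, as on the slides ({x}_k is just a pair tagged with key k); the key names and helper functions are illustrative, not part of the talk.

```python
def enc(k, x):
    """Abstract encryption {x}_k, as on the slides: a pair tagged with key k."""
    return ("enc", k, x)

def dec(k, c):
    """Strip one layer; fails if the wrong key is used."""
    tag, key, x = c
    assert tag == "enc" and key == k, "wrong key for this layer"
    return x

def wrap(m, keys):
    """Build the onion for hop keys [k1, k2, k3]: {{{m}_k3}_k2}_k1."""
    for k in reversed(keys):
        m = enc(k, m)
    return m

keys = ["k1", "k2", "k3"]   # one key shared with each router on the circuit
cell = wrap("m", keys)
for k in keys:              # each router strips exactly one layer, in order
    cell = dec(k, cell)
print(cell)                 # the exit router recovers m and forwards it to d
```

Each router can remove only its own layer, so no single router sees both u and the plaintext destination traffic.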

20
Results


22
Results 1. Formally model onion routing using input/output automata. 2. Analyze relationship anonymity: a. characterize situations with possibilistic anonymity; b. bound probabilistic anonymity in worst-case and typical situations.

23
Related Work. A Formal Treatment of Onion Routing, Jan Camenisch and Anna Lysyanskaya, CRYPTO 2005. A formalization of anonymity and onion routing, S. Mauw, J. Verschuren, and E.P. de Vink, ESORICS 2004. Towards an Analysis of Onion Routing Security, P. Syverson, G. Tsudik, M. Reed, and C. Landwehr, PET 2000.

24
Model Constructed with I/O automata – models asynchrony – relies on abstract properties of the cryptosystem. Simplified onion-routing protocol – no key distribution – no circuit teardowns – no separate destinations – each user constructs a circuit to one destination – circuit identifiers

25
Automata Protocol u v w


35
Creating a Circuit: u builds a circuit through routers 1, 2, 3.

36
Creating a Circuit 1. CREATE/CREATED: [0, {CREATE}_1]

37

Creating a Circuit 1. CREATE/CREATED: [0, CREATED]

38

Creating a Circuit 1. CREATE/CREATED

39

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED: [0, {[EXTEND, 2, {CREATE}_2]}_1]

40

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED: [l_1, {CREATE}_2]

41

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED: [l_1, CREATED]

42

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED: [0, {EXTENDED}_1]

43

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [0, {{[EXTEND, 3, {CREATE}_3]}_2}_1]

44

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [l_1, {[EXTEND, 3, {CREATE}_3]}_2]

45

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [l_2, {CREATE}_3]

46

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [l_2, CREATED]

47

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [l_1, {EXTENDED}_2]

48

Creating a Circuit 1. CREATE/CREATED 2. EXTEND/EXTENDED 3. [Repeat with a layer of encryption]: [0, {{EXTENDED}_2}_1]
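The CREATE/EXTEND exchange above can be sketched in code. This is a toy simulation, not the paper's I/O automata: the abstract {x}_k encryption, link-id bookkeeping, and all class and function names are illustrative.

```python
def enc(k, x):
    """Abstract encryption {x}_k: a pair tagged with key k."""
    return ("enc", k, x)

def dec(k, c):
    tag, key, x = c
    assert tag == "enc" and key == k
    return x

class Router:
    """Toy onion router: answers CREATE, and relays EXTEND one hop onward."""
    def __init__(self, key):
        self.key = key
        self.next_link = 1      # fresh link ids for outgoing hops (l_1, l_2, ...)

    def handle(self, link, cell, network):
        body = dec(self.key, cell)
        if body == "CREATE":
            return (link, "CREATED")                  # sent back unencrypted
        if isinstance(body, tuple) and body[0] == "EXTEND":
            _, nxt, create_cell = body
            out = self.next_link
            self.next_link += 1
            # forward the inner CREATE to the next router, relay its answer
            _, inner = network[nxt].handle(out, create_cell, network)
            assert inner == "CREATED"
            return (link, enc(self.key, "EXTENDED"))  # relayed back encrypted

network = {1: Router("k1"), 2: Router("k2")}

# 1. CREATE/CREATED: u -> router 1 on link 0
_, reply = network[1].handle(0, enc("k1", "CREATE"), network)
print(reply)                                          # CREATED

# 2. EXTEND/EXTENDED: u asks router 1 to extend the circuit to router 2
cell = enc("k1", ("EXTEND", 2, enc("k2", "CREATE")))
_, reply = network[1].handle(0, cell, network)
print(dec("k1", reply))                               # EXTENDED
```

Step 3 on the slides repeats this pattern with one more layer of encryption, so router 1 relays an EXTEND it cannot read to router 2.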

49
Adversary: active and local.

50
Possibilistic Anonymity

51
Anonymity Let U be the set of users. Let R be the set of routers. Let l be the path length.

52
Anonymity Definition (configuration): A configuration is a function U → R^l mapping each user to his circuit. Let U be the set of users. Let R be the set of routers. Let l be the path length.

53
Anonymity Definition (indistinguishability): Executions α and β are indistinguishable to adversary A when his actions in α are the same as in β after possibly applying the following: φ: a permutation on the keys not held by A; ψ: a permutation on the messages encrypted by a key not held by A. Definition (configuration): A configuration is a function U → R^l mapping each user to his circuit. Let U be the set of users. Let R be the set of routers. Let l be the path length.

54
Anonymity. Definition (relationship anonymity): u and d have relationship anonymity in configuration C with respect to adversary A if, for every fair, cryptographic execution of C in which u talks to d, there exists a fair, cryptographic execution that is indistinguishable to A in which u does not talk to d. Definition (fair): In fair executions, actions enabled infinitely often occur infinitely often. Definition (cryptographic): In cryptographic executions, no encrypted control messages are sent before they are received unless the sender possesses the key.
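The definition can be restated compactly. The symbols α, β for executions and ≈_A for indistinguishability are reconstructed notation, not the slide's original:

```latex
% FC(C): the fair, cryptographic executions of configuration C
% \approx_A: indistinguishability to adversary A (after permuting keys and
%            messages that A does not hold)
\[
  u,\ d \text{ have relationship anonymity in } C
  \;\iff\;
  \forall \alpha \in FC(C) \text{ with } u \leadsto d \text{ in } \alpha,\;
  \exists \beta \in FC(C):\;
  \alpha \approx_A \beta \ \wedge\ u \not\leadsto d \text{ in } \beta .
\]
```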

55
Theorem 1: Let C and D be configurations for which there exists a permutation π: U → U such that C_i(u) = D_i(π(u)) whenever C_i(u) or D_i(π(u)) is compromised or is adjacent to a compromised router. Then for every fair, cryptographic execution of C there exists an indistinguishable, fair, cryptographic execution of D. The converse also holds.

56
[Diagram slides 56-59: example configurations C and D for users u and v, illustrating Theorem 1.]

60
Relationship anonymity. Corollary: A user and destination have relationship anonymity iff:


63
Relationship anonymity. Corollary: A user and destination have relationship anonymity iff: the last router is unknown, OR the user is unknown and another unknown user has an unknown destination, OR the user is unknown and another unknown user has a different destination.

64
Probabilistic Anonymity

65
Adding probability 1. To configurations: a. user u selects destination d with probability p_u^d; b. users select routers randomly. 2. To executions: a. each execution in a configuration is equally likely.

66
Definition (relationship anonymity): u and d have relationship anonymity in configuration C with respect to adversary A if, for every fair, cryptographic execution of C in which u talks to d, there exists a fair, cryptographic execution that is indistinguishable to A in which u does not talk to d.

67
Probabilistic relationship anonymity metric: Let X be a random configuration. Let X_D: U → D be the destinations in X. The metric Y for the relationship anonymity of u and d in C is: Y(C) = Pr[X_D(u) = d | X ≈ C], where ≈ denotes indistinguishability to the adversary.

68
Probabilistic Anonymity. Other measures (e.g. entropy, min-entropy). Measures the effect on the individual user. Exact Bayesian inference – adversary after a long-term intersection attack – worst-case adversary.

69
Probabilistic Anonymity 1. Fixing u and d doesn't determine C. 2. Y(C) thus has a distribution. 3. This distribution depends on each p_v. 4. E[Y | X_D(u)=d]: a. worst case; b. typical case.

70
Worst-case Anonymity Theorem 2: The maximum of E[Y | X_D(u)=d] over the distributions (p_v)_{v≠u} occurs when either 1. p_v^1 = 1 for all v ≠ u, OR 2. p_v^d = 1 for all v ≠ u, where the destinations are numbered so that p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d-1} ≥ p_u^{d+1} ≥ …

71
Worst-case Estimates Let n be the number of nodes. Let b be the fraction of compromised nodes.

72
Worst-case Estimates Theorem 3: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)). Let n be the number of nodes. Let b be the fraction of compromised nodes.

73
Worst-case Estimates Theorem 3: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)). Theorem 4: When p_v^d = 1 for all v ≠ u: E[Y | X_D(u)=d] = b^2 + b(1-b)p_u^d + (1-b)p_u^d/(1-(1-p_u^d)b) + O(√(log n / n)). Let n be the number of nodes. Let b be the fraction of compromised nodes.


75
Worst-case Estimates Theorem 3: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ≈ b + (1-b)p_u^d. Let n be the number of nodes. Let b be the fraction of compromised nodes.

76
Worst-case Estimates Theorem 3: When p_v^1 = 1 for all v ≠ u: E[Y | X_D(u)=d] = b + b(1-b)p_u^d + (1-b)^2 p_u^d (1-b)/(1-(1-p_u^1)b) + O(√(log n / n)) ≈ b + (1-b)p_u^d. So b^2 + (1-b^2)p_u^d ≤ E[Y | X_D(u)=d] ≤ b + (1-b)p_u^d. Let n be the number of nodes. Let b be the fraction of compromised nodes.
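As a numeric sanity check on the two bounds on this slide, b^2 + (1-b^2)p and b + (1-b)p can be compared directly; the b and p values below are chosen purely for illustration.

```python
def lower(b, p):
    """b^2 + (1 - b^2) p: the lower bound on E[Y | X_D(u)=d]."""
    return b * b + (1 - b * b) * p

def upper(b, p):
    """b + (1 - b) p: the worst-case upper bound."""
    return b + (1 - b) * p

for b in (0.05, 0.2, 0.5):          # fraction of compromised nodes
    for p in (0.01, 0.1, 0.5):      # u's probability of destination d
        lo, hi = lower(b, p), upper(b, p)
        assert lo <= hi
        print(f"b={b}, p={p}: {lo:.4f} <= E[Y | X_D(u)=d] <= {hi:.4f}")
```

The gap between the bounds shrinks as p grows: a user whose destination is already very likely a priori gains little anonymity either way.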

77
Typical Case Let each user select from a Zipfian distribution: p^{d_i} ∝ 1/i^s. Theorem 5: E[Y | X_D(u)=d] = b^2 + (1-b^2)p_u^d + O(1/n)

78
Typical Case Let each user select from a Zipfian distribution: p^{d_i} ∝ 1/i^s. Theorem 5: E[Y | X_D(u)=d] = b^2 + (1-b^2)p_u^d + O(1/n), so E[Y | X_D(u)=d] ≈ b^2 + (1-b^2)p_u^d
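The Zipfian setup and Theorem 5's leading terms can be sketched as follows. The slide gives only the 1/i^s shape, so the normalization is assumed, and the n, s, b values are illustrative.

```python
def zipf(n, s):
    """Normalized Zipfian weights: p_i proportional to 1/i^s over n destinations."""
    w = [1.0 / (i ** s) for i in range(1, n + 1)]
    total = sum(w)
    return [x / total for x in w]

p = zipf(1000, 1.0)        # destination popularity, most popular first
b = 0.1                    # fraction of compromised routers
# Theorem 5's leading terms: E[Y | X_D(u)=d] ~ b^2 + (1 - b^2) p_u^d
approx = [b * b + (1 - b * b) * pd for pd in p]
print(f"most popular:  p={p[0]:.4f}, E[Y] ~ {approx[0]:.4f}")
print(f"least popular: p={p[-1]:.6f}, E[Y] ~ {approx[-1]:.4f}")
```

For unpopular destinations the estimate approaches b^2, the chance that both ends of the circuit are compromised.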

79
Results 1. Formally model onion routing using input/output automata. 2. Analyze relationship anonymity: a. characterize situations with possibilistic anonymity; b. bound probabilistic anonymity in worst-case and typical situations.

80
Future Work 1. Extend the analysis to other types of anonymity and to other systems. 2. Examine how quickly users' distributions are learned. 3. Analyze timing attacks.
