
Slide 1: Modeling and Analysis of Anonymous-Communication Systems
Joan Feigenbaum, http://www.cs.yale.edu/homes/jf
WITS’08; Princeton, NJ; June 18, 2008
Acknowledgement: Aaron Johnson

Slide 2: Outline
– Anonymity: What and why
– Examples of anonymity systems
– Theory: Definition and proof
– Practice: Onion Routing
– Theory meets practice

Slide 3: Anonymity: What and Why
The adversary cannot tell who is communicating with whom. This is not the same as confidentiality (and hence is not solved by encryption).
Pro: facilitates communication by whistleblowers, political dissidents, members of 12-step programs, etc.
Con: inhibits accountability.

Slide 4: Outline
– Anonymity: What and why
– Examples of anonymity systems
– Theory: Definition and proof
– Practice: Onion Routing
– Theory meets practice

Slide 5: Anonymity Systems
– Remailers / mix networks: anon.penet.fi, MixMaster, Mixminion
– Low-latency communication: anonymous proxies (anonymizer.net), Freedom, Tor, JAP
– Data publishing: FreeNet

Slide 6: Mix Networks
First outlined by Chaum in 1981. Mix networks provide anonymous communication that is:
– high-latency
– message-based (“message-oriented”)
– one-way or two-way

Slides 7–9: Mix Networks
[Figure: users send messages through a network of mixes to their destinations; an adversary observes parts of the network.]

Slides 10–17: Mix Networks (protocol)
[Figure: user u sends message m through mixes M1, M2, M3 to destination d, under partial observation by the adversary.]
Protocol:
1. The user selects a sequence of mixes and a destination.
2. The user onion-encrypts the message: proceeding in reverse order of the path, encrypt (message, next hop) with the public key of each mix. For the path M1, M2, M3 this yields {{{m, d}_M3, M3}_M2, M2}_M1.
3. The user sends the onion; each mix removes one layer of encryption and forwards the rest: M1 recovers {{m, d}_M3, M3}_M2 and sends it to M2, M2 recovers {m, d}_M3 and sends it to M3, and M3 recovers (m, d) and delivers m to d.
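The three protocol steps above can be sketched in Python. This is a toy model, not real cryptography: encryption under a mix's public key is represented abstractly as a labeled tuple, and the mix names are illustrative.

```python
def enc(mix, payload):
    """Toy stand-in for encryption under mix's public key."""
    return ("enc", mix, payload)

def dec(mix, ciphertext):
    """Toy stand-in for decryption with mix's private key."""
    tag, owner, payload = ciphertext
    assert tag == "enc" and owner == mix, "wrong mix key"
    return payload

def onion_encrypt(message, destination, path):
    """Build {{{m,d}_M3, M3}_M2, M2}_M1 for path = [M1, M2, M3]."""
    onion = enc(path[-1], (message, destination))
    # wrap outward: each layer names the next hop and is keyed to the previous mix
    for prev, nxt in zip(reversed(path[:-1]), reversed(path[1:])):
        onion = enc(prev, (onion, nxt))
    return onion

def route(onion, first_mix):
    """Each mix peels one layer and forwards to the named next hop."""
    mix, cell = first_mix, onion
    while True:
        payload, nxt = dec(mix, cell)
        if isinstance(payload, tuple) and payload[0] == "enc":
            cell, mix = payload, nxt   # still an onion: forward to the next mix
        else:
            return payload, nxt        # last mix holds (message, destination)

onion = onion_encrypt("hello", "d", ["M1", "M2", "M3"])
print(route(onion, "M1"))  # ('hello', 'd')
```

Note that each mix learns only its predecessor and successor, which is exactly the property the next slides discuss.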

Slides 18–20: Mix Networks: Anonymity?
1. No single mix knows both the source and the destination.
2. The adversary cannot follow multiple messages through the same mix.
3. More users provide more anonymity (e.g., senders u, v, w mixing traffic to destinations d, e, f).
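Point 2 is what batching and reordering buy. A sketch of the reordering step alone (a real mix also decrypts and pads each message so outputs cannot be matched to inputs by content or size):

```python
import random

def mix_batch(messages):
    """A mix collects a batch, then forwards it in random order, so an
    observer of both links cannot match inputs to outputs by position."""
    batch = list(messages)
    random.shuffle(batch)
    return batch

inputs = [f"msg-from-{s}" for s in ("u", "v", "w", "x")]
outputs = mix_batch(inputs)
# The adversary sees the same set of messages leave,
# but output position no longer identifies the sender.
```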

Slide 21: Outline
– Anonymity: What and why
– Examples of anonymity systems
– Theory: Definition and proof
– Practice: Onion Routing
– Theory meets practice

Slide 22: Provable Anonymity in Mix Networks
Setting:
– N users.
– Passive, local adversary: the adversary observes some of the mixes and the links; a fraction f of the links are not observed.
– Users and mixes are roughly synchronized.
– Users choose mixes uniformly at random.

Slide 23: Provable Anonymity in Mix Networks
Definition: Users should be unlinkable to their destinations. Let Π be a random permutation that maps users to destinations, and let C be the traffic matrix observed by the adversary during the protocol, where C_ei is the number of messages on link e in round i.
[Figure: example traffic matrix with rows for links e1, e2 and 0/1 entries for rounds 1–5.]

Slide 24: Provable Anonymity in Mix Networks
Information-theory background: use information theory to quantify the information gained from observing C.
– H(X) = Σ_x −Pr[X=x] log(Pr[X=x]) is the entropy of the random variable X.
– I(X : Y) = H(X) − H(X | Y) = Σ_{x,y} Pr[X=x ∧ Y=y] log( Pr[X=x ∧ Y=y] / (Pr[X=x] Pr[Y=y]) ) is the mutual information between X and Y.
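A small numeric sketch of these definitions, computing entropy and mutual information for discrete random variables given as probability tables (illustrative values only):

```python
from math import log2
from collections import defaultdict

def entropy(dist):
    """H(X) for a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X:Y) = H(X) + H(Y) - H(X,Y), an equivalent form of H(X) - H(X|Y),
    for a joint distribution given as {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return entropy(px) + entropy(py) - entropy(joint)

# Independent variables carry zero mutual information...
indep = {(x, y): 0.25 for x in "ab" for y in "cd"}
print(mutual_information(indep))  # 0.0
# ...while a perfectly correlated pair carries H(X) = 1 bit.
corr = {("a", "c"): 0.5, ("b", "d"): 0.5}
print(mutual_information(corr))  # 1.0
```

In the slides' terms, an unlinkable protocol keeps I(C : Π) near the independent case.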

Slide 25: Provable Anonymity in Synchronous Protocols
Definition: The protocol is ε(N)-unlinkable if I(C : Π) ≤ ε(N).
Definition: An ε(N)-unlinkable protocol is efficient if:
1. it takes T(N) = O(polylog(N/ε(N))) rounds, and
2. it uses O(N · T(N)) messages.
Theorem (Berman, Fiat, and Ta-Shma, 2004): The basic mixnet protocol is ε(N)-unlinkable and efficient when T(N) is on the order of log(N) log²(N/ε(N)).

Slide 26: Outline
– Anonymity: What and why
– Examples of anonymity systems
– Theory: Definition and proof
– Practice: Onion Routing
– Theory meets practice

Slide 27: Onion Routing [GRS’96]
– Practical design with low latency and overhead.
– Connection-oriented, two-way communication.
– Open-source implementation (http://tor.eff.org).
– Over 1000 volunteer routers; an estimated 200,000 users.

Slides 28–31: How Onion Routing Works
[Figure: user u runs a client; routers 1–5 run servers; d is an Internet destination.]
1. u creates a 3-hop circuit through routers chosen uniformly at random.

Slides 32–42: How Onion Routing Works (continued)
1. u creates a 3-hop circuit through routers chosen uniformly at random (here routers 1, 4, and 3).
2. u opens a stream in the circuit to d.
3. Data are exchanged. Outbound, u sends {{{m}_3}_4}_1; router 1 peels a layer to get {{m}_3}_4, router 4 peels a layer to get {m}_3, and router 3 recovers m and forwards it to d. The reply m′ is layered in the other direction: router 3 sends {m′}_3, router 4 adds a layer to produce {{m′}_3}_4, router 1 produces {{{m′}_3}_4}_1, and u removes all three layers.
4. The stream is closed.
5. The circuit is changed every few minutes.
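The circuit data flow above can be sketched the same way as the mixnet onion, but with symmetric session keys shared between the client and each router. Encryption is again modeled abstractly as a labeled tuple, and the key names are illustrative.

```python
def add_layers(data, keys):
    """Wrap data under each key; the last key in the list ends up innermost."""
    cell = data
    for key in reversed(keys):
        cell = ("cell", key, cell)
    return cell

def peel_layers(cell, keys):
    """Remove one layer per key, in list order (one hop per key)."""
    for key in keys:
        tag, k, inner = cell
        assert tag == "cell" and k == key, "layer/key mismatch"
        cell = inner
    return cell

circuit = ["k1", "k4", "k3"]               # entry, middle, exit session keys
outbound = add_layers("GET /", circuit)    # client builds {{{m}_3}_4}_1
assert peel_layers(outbound, circuit) == "GET /"   # routers peel hop by hop
reply = add_layers("200 OK", circuit)      # routers add layers on the way back
assert peel_layers(reply, circuit) == "200 OK"     # client removes all three
```

The same two functions cover both directions because the exit router's layer is innermost either way; only who wraps and who peels changes.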

Slide 43: Adversary
The adversary is active and local, controlling a subset of the routers.
[Figure: u, routers 1–5, and d, with compromised routers marked.]

Slide 44: Outline
– Anonymity: What and why
– Examples of anonymity systems
– Theory: Definition and proof
– Practice: Onion Routing
– Theory meets practice

Slides 45–49: Formal Analysis (Feigenbaum, Johnson, and Syverson, 2007)
[Figure: users u, v, w connect through routers 1–5 to destinations d, e, f.]
Timing attacks result in four cases:
1. First router compromised.
2. Last router compromised.
3. First and last routers compromised.
4. Neither first nor last router compromised.

Slide 50: Black-Box Onion-Routing Model
Let U be the set of users and Δ the set of destinations, and let the adversary control a fraction b of the routers. A configuration C consists of:
– user destinations C_D : U → Δ
– observed inputs C_I : U → {0,1}
– observed outputs C_O : U → {0,1}
Let X be a random configuration such that
Pr[X=C] = Π_u [p_u^{C_D(u)}] [b^{C_I(u)} (1−b)^{1−C_I(u)}] [b^{C_O(u)} (1−b)^{1−C_O(u)}],
where p_u^d is the probability that user u chooses destination d.
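This distribution can be transcribed directly: each user independently picks a destination, and each of her first and last routers is compromised with probability b. The helper name and the user/destination labels below are illustrative, not from the paper.

```python
def config_prob(config, p, b):
    """Probability of one configuration, where config maps each user u to
    (destination, input observed?, output observed?)."""
    prob = 1.0
    for u, (d, c_in, c_out) in config.items():
        prob *= p[u][d]                    # u chooses d with probability p_u^d
        prob *= b if c_in else (1 - b)     # first router compromised w.p. b
        prob *= b if c_out else (1 - b)    # last router compromised w.p. b
    return prob

p = {"u": {"d": 0.7, "e": 0.3}}
print(config_prob({"u": ("d", 1, 0)}, p, b=0.2))  # 0.7 * 0.2 * 0.8 = 0.112
```

Summing over all eight configurations for this single user gives total probability 1, as a product distribution must.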

Slides 51–52: Indistinguishability
[Figure: four configurations over users u, v, w and destinations d, e, f that are indistinguishable to the adversary.]
Note: Indistinguishability of configurations is an equivalence relation.

Slides 53–56: Probabilistic Anonymity
The metric Y for the linkability of u and d in configuration C is
Y(C) = Pr[X_D(u)=d | X ≡ C],
where ≡ denotes indistinguishability of configurations. Note that this differs from the mutual-information metric used to analyze mix networks. Y captures:
– exact Bayesian inference,
– by an adversary after a long-term intersection attack,
– in the worst case.
The linkability given that u visits d is E[Y | X_D(u)=d].
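For a toy instance of the model, this metric can be computed by brute force: enumerate all configurations, group them by what the adversary can observe, and take the exact posterior. The observation signature used here (observed inputs, end-to-end linked pairs, and unlinked observed destinations) is a simplified reading of indistinguishability assumed for illustration, not the paper's exact definition.

```python
from collections import defaultdict
from itertools import product

def configurations(p, b):
    """Yield (config, probability): each user picks destination d with
    probability p[u][d]; first/last routers compromised with probability b."""
    users = sorted(p)
    options = [[(d, ci, co) for d in sorted(p[u])
                for ci in (0, 1) for co in (0, 1)] for u in users]
    for choice in product(*options):
        config, prob = dict(zip(users, choice)), 1.0
        for u, (d, ci, co) in config.items():
            prob *= p[u][d] * (b if ci else 1 - b) * (b if co else 1 - b)
        yield config, prob

def signature(config):
    """What the adversary can tell apart: which inputs it sees, which users
    it links end to end (timing attack), and unlinked observed destinations."""
    users = sorted(config)
    return (tuple(config[u][1] for u in users),
            tuple(config[u][0] if config[u][1] and config[u][2] else None
                  for u in users),
            tuple(sorted(config[u][0] for u in users
                         if config[u][2] and not config[u][1])))

def expected_linkability(p, b, u, d):
    """E[Y | X_D(u)=d], with Y(C) = Pr[X_D(u)=d | X indistinguishable from C]."""
    total, match = defaultdict(float), defaultdict(float)
    for config, prob in configurations(p, b):
        s = signature(config)
        total[s] += prob
        if config[u][0] == d:
            match[s] += prob
    p_ud = sum(match.values())
    return sum(m * m / total[s] for s, m in match.items() if m > 0) / p_ud

p = {"u": {"d": 0.5, "e": 0.5}, "v": {"d": 0.5, "e": 0.5}}
print(round(expected_linkability(p, 0.3, "u", "d"), 4))
```

With b = 0 the result collapses to p_u^d (the prior), and with b = 1 it is 1 (total compromise), matching the intuition behind the bounds on the following slides.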

Slides 57–59: Anonymity Bounds
1. Lower bound: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d.
2. Upper bounds:
a. If p_v^⊥ = 1 for all v ≠ u, where p_v^⊥ ≥ p_v^e for e ≠ d, then
E[Y | X_D(u)=d] ≤ b + (1−b) p_u^d + O(√(log n / n)).
b. If p_v^d = 1 for all v ≠ u, then
E[Y | X_D(u)=d] ≤ b² + (1−b²) p_u^d + O(√(log n / n)).

Slides 60–70: Lower Bound
Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d.
Proof: Conditioning on whether u's first and last routers are compromised,
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0].
Let {C_i} be the configuration equivalence classes, and let D_i be the event C_i ∧ X_D(u)=d. Then
E[Y | X_D(u)=d ∧ X_I(u)=0] = Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])
≥ (Σ_i Pr[D_i] √Pr[C_i] / √Pr[C_i])² / Pr[X_D(u)=d]   (by Cauchy–Schwarz)
= p_u^d.
Therefore
E[Y | X_D(u)=d] ≥ b² + b(1−b) p_u^d + (1−b) p_u^d = b² + (1−b²) p_u^d.
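A quick numeric check of the algebra in the proof's last step, b² + b(1−b)p + (1−b)p = b² + (1−b²)p, over a few sampled values:

```python
# b(1-b)p + (1-b)p = (1-b)(1+b)p = (1-b^2)p, so the two sides agree.
for b in (0.0, 0.25, 0.5, 0.9):
    for p in (0.0, 0.1, 0.7, 1.0):
        lhs = b**2 + b * (1 - b) * p + (1 - b) * p
        rhs = b**2 + (1 - b**2) * p
        assert abs(lhs - rhs) < 1e-12
print("identity holds on all sampled values")
```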

Slides 71–75: Upper Bound
Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^⊥ = 1 for all v ≠ u, OR
2. p_v^d = 1 for all v ≠ u.
Proof outline: Order u's destination probabilities p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ … ≥ p_u^⊥. Then:
– Show the maximum occurs when, for all v ≠ u, p_v^{e_v} = 1 for some destination e_v.
– Show the maximum occurs when, for all v ≠ u, e_v = d or e_v = ⊥.
– Show the maximum occurs when e_v = d for all v ≠ u, or when e_v = ⊥ for all v ≠ u.

Slides 76–82: Upper-Bound Estimates
Let n be the number of users.
Theorem 4: When p_v^⊥ = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^⊥)b) + O(√(log n / n)).
For p_u^⊥ small, this is approximately b + (1−b) p_u^d. Compared with the lower bound E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d, the chance of total compromise increases from b² to b.
Theorem 5: When p_v^d = 1 for all v ≠ u:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) p_u^d / (1−(1−p_u^d)b) + O(√(log n / n)).
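An illustrative comparison of the two regimes (the values of b and p_u^d are chosen arbitrarily): the worst case behaves like b + (1−b)p_u^d, against the lower-bound profile b² + (1−b²)p_u^d.

```python
# Worst case vs. lower bound: the gap is b(1-b)(1-p), i.e. the extra
# probability that compromising just the first router already links u to d.
p_ud = 0.1
for b in (0.05, 0.1, 0.2, 0.4):
    worst = b + (1 - b) * p_ud          # upper-bound regime of Theorem 4
    bound = b**2 + (1 - b**2) * p_ud    # lower bound of Theorem 2
    print(f"b={b:.2f}  worst≈{worst:.3f}  lower bound={bound:.3f}")
```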

Slide 83: Conclusions
Many challenges remain in the design, implementation, and analysis of anonymous-communication systems. It is hard to prove theorems about real systems, or even to figure out what to prove. “Nothing is more practical than a good theory!” (Tanya Berger-Wolf, UIC)

