
1 Personalized Social Recommendations – Accurate or Private? A. Machanavajjhala (Yahoo!), with A. Korolova (Stanford) and A. Das Sarma (Google)

2 Social Advertising
Recommend ads based on the private shopping histories of “friends” in the social network.
[Figure: Alice and Betty in a social graph, with shopping histories including Armani, Gucci, Prada and Nikon, HP, Nike.]

3 Social Advertising … in the real world
“A product that is followed by your friends …”
Items (products/people) liked by Alice’s friends are better recommendations for Alice.

4 Social Advertising … the privacy problem
If only the items (products/people) liked by Alice’s friends are recommended to Alice, the fact that Betty liked “VistaPrint” is leaked to Alice.
[Figure: Alice and Betty in the social graph.]

5 Social Advertising … the privacy problem
Recommending irrelevant items sometimes improves privacy, but reduces accuracy.
[Figure: Alice and Betty in the social graph.]

6 Social Advertising … the privacy problem
Alice is recommended ‘X’. Can we provide accurate recommendations to Alice based on the social network, while ensuring that Alice cannot deduce that Betty likes ‘X’?

7 Outline of this talk
Formal social recommendations problem
– Privacy for social recommendations
– Accuracy of social recommendations
– Example private algorithm and its accuracy
Privacy–Accuracy trade-off
– Properties satisfied by a general algorithm
– Theoretical bound

8 Social Recommendations
A set of agents – Yahoo/Facebook users, medical patients.
A set of recommended items – other users (friends), advertisements, products (drugs).
A network of edges connecting the agents and items – social network, patient–doctor and patient–drug history.
Problem: recommend a new item i to agent a based on the network.

9 Social Recommendations (this talk)
Same agents, items, and network as above; the problem specializes to: recommend a new friend i to target user a based on the social network.

10 Social Recommendations
Utility function u(a, i): the utility of recommending candidate i to target a. Examples [Liben-Nowell et al. 2003]: number of common neighbors, number of weighted paths, Personalized PageRank.
[Figure: target node a with candidate recommendations i1, i2, i3 and utilities u(a, i1), u(a, i2), u(a, i3).]

11 Non-Private Recommendation Algorithm
For each target node a, compute, for each candidate i, the probability p(a, i) that maximizes the expected utility Σ_i u(a, i)·p(a, i); then randomly pick one candidate with probability p(a, i). Without a privacy constraint, the maximizing distribution puts all of its mass on the highest-utility candidate. (A runnable sketch follows slide 12 below.)

12 Example: Common Neighbors Utility
Common Neighbors utility: “Alice and Bob are likely to be friends if they have many common neighbors.” In the example graph, u(a, i1) = f(2), u(a, i2) = f(3), u(a, i3) = f(1), where f is increasing in the number of common neighbors.
Non-private algorithm: return the candidate with maximum u(a, i), or randomly pick a candidate with probability proportional to u(a, i).
[Figure: target a with candidates i1, i2, i3.]
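
A minimal sketch of the non-private algorithm from slides 11–12, assuming the graph is stored as a dict of adjacency sets; all names here are illustrative, not from the talk:

```python
import random

def common_neighbors_utility(graph, a, i):
    """Utility of recommending candidate i to target a:
    the number of neighbors that a and i share."""
    return len(graph[a] & graph[i])

def non_private_recommend(graph, a, proportional=False):
    """Recommend a non-neighbor of a: either the max-utility candidate,
    or one sampled with probability proportional to its utility."""
    candidates = [i for i in graph if i != a and i not in graph[a]]
    utilities = {i: common_neighbors_utility(graph, a, i) for i in candidates}
    if not proportional:
        return max(utilities, key=utilities.get)
    return random.choices(candidates,
                          weights=[utilities[i] for i in candidates])[0]

# Toy graph: a shares 2 neighbors with i1, 3 with i2, 1 with i3.
graph = {
    "a":  {"n1", "n2", "n3", "n4"},
    "i1": {"n1", "n2"},
    "i2": {"n1", "n2", "n3"},
    "i3": {"n4"},
    "n1": {"a", "i1", "i2"}, "n2": {"a", "i1", "i2"},
    "n3": {"a", "i2"}, "n4": {"a", "i3"},
}
print(non_private_recommend(graph, "a"))  # i2 (3 common neighbors)
```

Sampling proportionally to utility already trades some accuracy for diversity, but it is not private: candidates with zero utility are never picked, so their absence leaks information (cf. slide 20).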

13 Outline of this talk (recap; next: Privacy for social recommendations)

14 Differential Privacy [Dwork 2006]
For every pair of inputs D1 and D2 that differ in one value, and for every output O, the adversary should not be able to distinguish between D1 and D2 based on O:
| log( Pr[D1 → O] / Pr[D2 → O] ) | ≤ ε

15 Privacy for Social Recommendations
Sensitive information: a recommendation should not disclose the existence of an edge between two nodes. For graphs G1 and G2 that differ in one edge:
| log( Pr[recommending i to a | G1] / Pr[recommending i to a | G2] ) | < ε
[Figure: G1 and G2, identical except for one edge incident to i.]

16 Outline of this talk (recap; next: Accuracy of social recommendations)

17 Measuring loss in utility due to privacy
Suppose algorithm A recommends node i of utility u_i with probability p_i. The accuracy of A is its expected utility Σ_i p_i u_i normalized by the maximum utility u_max, i.e., a comparison with the utility of the non-private algorithm, which always picks the best candidate.
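
The exact formula was on a lost slide image; the normalization above is an assumption consistent with the percentage plots on later slides. A tiny sketch:

```python
def accuracy(probs, utils):
    """Expected utility of a randomized recommender, normalized by the
    best achievable utility (what the non-private argmax achieves)."""
    expected = sum(p * u for p, u in zip(probs, utils))
    return expected / max(utils)

# The non-private argmax puts probability 1 on the best candidate:
print(accuracy([0.0, 1.0, 0.0], [2.0, 3.0, 1.0]))  # 1.0
# A flatter (more private) distribution pays in accuracy:
print(accuracy([1/3, 1/3, 1/3], [2.0, 3.0, 1.0]))  # ~0.67
```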

18 Outline of this talk (recap; next: Example private algorithm and its accuracy)

19 Algorithms for Differential Privacy
Theorem: no deterministic algorithm guarantees differential privacy.
– Exponential Mechanism: sample the output space with probabilities derived from a score (utility) function.
– Laplace Mechanism: add noise from a Laplace distribution to query answers.
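
As a quick illustration of the second option (not the mechanism this talk uses for recommendations), a minimal Laplace-mechanism sketch; the counting-query framing is an illustrative assumption:

```python
import random

def laplace_noise(scale):
    """Laplace(0, scale) as the difference of two exponential draws."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Release a numeric query answer with epsilon-differential privacy
    by adding Laplace(sensitivity / epsilon) noise. For a counting query
    such as "how many users liked X?", sensitivity = 1: changing one
    record changes the true count by at most 1."""
    return true_answer + laplace_noise(sensitivity / epsilon)

print(laplace_mechanism(42, sensitivity=1.0, epsilon=0.5))  # 42 + noise of scale 2
```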

20 Privacy Preserving Recommendations
Exponential Mechanism [McSherry and Talwar 2007]: randomly pick a candidate with probability proportional to exp( ε·u(a,i) / Δ ), where Δ is the maximum change in utilities caused by changing one edge. This satisfies ε-differential privacy. Note that it must pick every node with non-zero probability, even one with u = 0.
[Figure: target a with candidates and utilities u(a, i1), u(a, i2), u(a, i3).]
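
A minimal sketch of this sampler, reusing the utilities from the toy graph above. Taking Δ = 1 is an assumption that fits common-neighbors utility with f the identity, since one edge change moves any single utility by at most 1:

```python
import math
import random

def exponential_mechanism(candidates, utility, epsilon, delta):
    """Pick a candidate with probability proportional to
    exp(epsilon * u / delta). Unlike proportional-to-utility sampling,
    every candidate keeps non-zero probability, even with utility 0."""
    weights = [math.exp(epsilon * utility(c) / delta) for c in candidates]
    return random.choices(candidates, weights=weights)[0]

# Utilities 2, 3, 1 from the earlier toy graph; epsilon = 0.5, delta = 1:
utils = {"i1": 2.0, "i2": 3.0, "i3": 1.0}
pick = exponential_mechanism(list(utils), lambda c: utils[c],
                             epsilon=0.5, delta=1.0)
print(pick)  # i2 most likely (~51%), but i1 (~31%) and i3 (~19%) still occur
```

This flattening of the distribution is exactly where accuracy is lost; the next two slides quantify the loss on real graphs.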

21 Accuracy of Exponential Mechanism + Common Neighbors Utility
[Plot: per-user accuracy on the WikiVote network, ε = 0.5.] 60% of users have accuracy < 10%.

22 Accuracy of Exponential Mechanism + Common Neighbors Utility
[Plot: per-user accuracy on a Twitter sample, ε = 1.] 98% of users have accuracy < 5%.

23 Can we do better?
Maybe common-neighbors utility is an especially non-private utility … – consider general utility functions that follow intuitive axioms.
Maybe the Exponential Mechanism does not guarantee sufficient accuracy … – consider any algorithm that satisfies differential privacy.

24 Outline of this talk (recap; next: Properties satisfied by a general algorithm)

25 Axioms on Utility Functions
Exchangeability: if two candidates i3 and i4 are identical with respect to ‘a’ (the graph looks the same from a’s viewpoint when they are swapped), then u(a, i3) = u(a, i4).
[Figure: target a with candidates i1 … i4, where i3 and i4 are symmetric.]

26 Axioms on Utility Functions
Concentration: “most of the utility of recommendation to a target is concentrated on a small number of candidates.”

27 Outline of this talk (recap; next: Theoretical bound)

28 Accuracy–Privacy Tradeoff
Common Neighbors & Weighted Paths utility*: to achieve constant accuracy for target node a, ε = Ω(log n / degree(a)).
* under some mild assumptions on the weighted-paths utility
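
To make the bound concrete, a back-of-the-envelope instantiation; treating the hidden Ω-constant as 1 is our assumption, not a claim from the talk:

```latex
% n = 10^7 nodes, a well-connected user with degree(a) = 100:
\varepsilon \;\gtrsim\; \frac{\log n}{\mathrm{degree}(a)}
  \;=\; \frac{\ln 10^7}{100} \;\approx\; \frac{16.1}{100} \;\approx\; 0.16
% A low-degree user with degree(a) = 10 would need eps >~ 1.6, which is
% already a weak privacy guarantee, and most real users have low degree.
```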

29 Implications of Accuracy-Privacy Tradeoff
[Plot: WikiVote network, ε = 0.5.] 60% of users have accuracy < 55%.

30 Implications of Accuracy-Privacy Tradeoff
[Plot: Twitter sample, ε = 1.] 95% of users have accuracy < 5%.

31 Takeaway …
“For the majority of the nodes in the network, recommendations must either be inaccurate or violate differential privacy!”
– Maybe this is a “bad idea.”
– Or maybe differential privacy is too strong a privacy definition to shoot for.

32 Intuition behind main result

33 Intuition behind main result
Consider graphs G1 and G2 that differ in one edge incident to candidate i. Let u_k(a, i) and p_k(a, i) denote the utility and recommendation probability in G_k. Differential privacy requires
p1(a,i) / p2(a,i) < e^ε.
[Figure: G1 and G2 with target a and candidates i, j.]

34 Intuition behind main result
Similarly, let G3 differ from G1 in one edge incident to candidate j, so that
p1(a,i) / p2(a,i) < e^ε and p3(a,j) / p1(a,j) < e^ε.
[Figure: graphs G1, G2, G3.]

35 Using Exchangeability
G3 is isomorphic to G2 (the roles of i and j are swapped). By exchangeability, u2(a,i) = u3(a,j), which implies p2(a,i) = p3(a,j), together with
p1(a,i) / p2(a,i) < e^ε and p3(a,j) / p1(a,j) < e^ε.

36 Using Exchangeability
Chaining the two inequalities through p2(a,i) = p3(a,j) gives
p1(a,i) / p1(a,j) < e^{2ε}.

37 Using Exchangeability
In general, if any node i can be “transformed” into node j in t edge changes, then
p1(a,i) / p1(a,j) < e^{tε},
i.e., the probability of recommending the highest-utility node is at most e^{tε} times the probability of recommending the worst-utility node.

38 Final Act: Using Concentration
Few nodes have high utility for target a (tens of nodes share a common neighbor with a), while many nodes have low utility (millions of nodes don’t share a common neighbor with a). An accurate algorithm must put most of its probability mass on the few high-utility nodes, so there exist i and j such that
Ω(n) = p1(a,i) / p1(a,j) < e^{tε}.
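
Solving that last inequality for ε recovers the bound from slide 28. A sketch of the step, where taking t on the order of degree(a) (the number of edge changes needed to turn a low-utility node into a top candidate) is our reading of the construction:

```latex
\Omega(n) \;\le\; e^{t\varepsilon}
\;\Longrightarrow\;
\varepsilon \;\ge\; \frac{\log \Omega(n)}{t}
\;=\; \Omega\!\left(\frac{\log n}{t}\right)
\;=\; \Omega\!\left(\frac{\log n}{\mathrm{degree}(a)}\right).
```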

39 Summary of Social Recommendations
Question: “Can social recommendations be made while guaranteeing strong privacy conditions?” – for general utility functions satisfying natural axioms, and for any algorithm satisfying differential privacy.
Answer: “For the majority of nodes in the network, recommendations must either be inaccurate or violate differential privacy!” – Maybe this is a “bad idea,” or maybe differential privacy is too strong a privacy definition to shoot for.

40 Summary of Social Recommendations
Answer (recap): for the majority of nodes, recommendations must either be inaccurate or violate differential privacy.
Open question: “What is the minimum amount of personal information that a user must be willing to disclose in order to get personalized recommendations?”

41 Thank you

