Privacy and Trust in Social Networks


Privacy and Trust in Social Networks
Michelle Hong, 2009/03/02

Outline
What is privacy and trust?
Privacy in social networks
- Basic privacy requirements
- Privacy in graphs
Trust in social networks
References

What is Privacy?
Privacy is the ability of an individual or group to seclude themselves, or information about themselves, and thereby reveal themselves selectively.
Privacy boundaries and content differ between individuals and cultures.
Privacy can be voluntarily sacrificed.
It concerns uniquely identifiable data relating to a person or persons.

What is Trust?
Trust is a relationship of reliance; it is a prediction of reliance on an action.
Trust is not tied to good character or morals.
Trust does not require an action that both parties are mutually engaged in.
Trust is conditional.

Privacy and Trust: a Tradeoff
Legal rights are needed to protect privacy.
Users reveal more data to people they trust, and grant them access rights.
Conversely, opening up sensitive data can help gain trust.

Outline
What is privacy and trust?
Privacy in social networks
- Basic privacy requirements
- Privacy in graphs
Trust in social networks
References

k-anonymity [1]
Each released record must be indistinguishable from at least k-1 other records on its quasi-identifier attributes, so an attacker who links published data with external data gets at least k candidate answers for a target's sensitive value.
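As a minimal sketch (with hypothetical field names, not from the slides), k-anonymity of a generalized table can be checked by counting how often each quasi-identifier combination occurs:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every quasi-identifier combination appears at least k times."""
    counts = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(c >= k for c in counts.values())

# Toy generalized table: zip and age are quasi-identifiers, disease is sensitive.
table = [
    {"zip": "130**", "age": "<30", "disease": "flu"},
    {"zip": "130**", "age": "<30", "disease": "cancer"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
]
print(is_k_anonymous(table, ["zip", "age"], 2))  # True: each class has 2 records
print(is_k_anonymous(table, ["zip", "age"], 3))  # False
```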

l-diversity [2]
Each equivalence class must contain at least l well-represented distinct sensitive values, so that an attacker who locates a target's class still cannot infer the sensitive value.
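A sketch of the simplest (distinct) form of l-diversity, again with hypothetical field names; the same toy table is k-anonymous for k=2 but fails l-diversity for l=2, which is exactly the gap l-diversity closes:

```python
from collections import defaultdict

def is_l_diverse(records, quasi_ids, sensitive, l):
    """True if each equivalence class holds at least l distinct sensitive values."""
    groups = defaultdict(set)
    for r in records:
        groups[tuple(r[q] for q in quasi_ids)].add(r[sensitive])
    return all(len(vals) >= l for vals in groups.values())

table = [
    {"zip": "130**", "age": "<30", "disease": "flu"},
    {"zip": "130**", "age": "<30", "disease": "cancer"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
]
# The second class contains only "flu", so an attacker learns the disease.
print(is_l_diverse(table, ["zip", "age"], "disease", 2))  # False
```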

t-closeness [3]
The distribution of the sensitive attribute within each equivalence class must be within a distance t of its distribution in the whole table, limiting the semantic information an attacker gains from locating a class.
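A minimal sketch of the t-closeness check for categorical data, using variational distance in place of the paper's Earth Mover's Distance (field names are hypothetical):

```python
from collections import Counter, defaultdict

def max_class_distance(records, quasi_ids, sensitive):
    """Worst-case variational distance between a class's sensitive-value
    distribution and the table-wide distribution. The table is t-close
    (in this simplified categorical sense) when the result is at most t."""
    n = len(records)
    global_dist = {v: c / n for v, c in Counter(r[sensitive] for r in records).items()}
    groups = defaultdict(list)
    for r in records:
        groups[tuple(r[q] for q in quasi_ids)].append(r[sensitive])
    worst = 0.0
    for vals in groups.values():
        local = Counter(vals)
        dist = 0.5 * sum(abs(local.get(v, 0) / len(vals) - p)
                         for v, p in global_dist.items())
        worst = max(worst, dist)
    return worst

table = [
    {"zip": "130**", "age": "<30", "disease": "flu"},
    {"zip": "130**", "age": "<30", "disease": "cancer"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
    {"zip": "148**", "age": ">=40", "disease": "flu"},
]
print(max_class_distance(table, ["zip", "age"], "disease"))  # 0.25
```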

Dynamic Anonymization [4]

Outline
What is privacy and trust?
Privacy in social networks
- Basic privacy requirements
- Privacy in graphs
Trust in social networks
References

Possible Attacks on Anonymized Graphs
Attack methods [5]: identify a target by its structural (neighborhood) information. They include:
- Vertex refinement queries
- Sub-graph queries
- Hub fingerprint queries
Attack types [6]:
- Active attacks: create a small number of new user accounts and link them to target users before the anonymized graph is generated
- Passive attacks: attackers identify themselves in the published graph and re-identify their neighbors
- Semi-passive attacks: existing users create the necessary links to target users

Vertex Refinement Queries
H0 labels every node identically; each refinement step H(i+1) extends a node's label with the multiset of its neighbors' H(i) labels, so H1 corresponds to degree knowledge. H*'s computation is linear in the number of edges in the graph, so it is very efficient.
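A minimal sketch of one refinement round (my own illustration of the idea, not the paper's code): nodes that still share a label after i rounds are indistinguishable to an adversary with H(i) knowledge.

```python
def vertex_refinement(adj, rounds):
    """Iterative vertex refinement: start with identical labels, then
    repeatedly relabel each node by (its label, sorted multiset of its
    neighbors' labels). One round separates nodes by degree (H1)."""
    labels = {v: 0 for v in adj}
    for _ in range(rounds):
        sigs = {v: (labels[v], tuple(sorted(labels[u] for u in adj[v])))
                for v in adj}
        # Map each distinct signature to a small integer label.
        canon = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        labels = {v: canon[sigs[v]] for v in adj}
    return labels

# Path graph a-b-c: the two degree-1 endpoints are indistinguishable under H1.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(vertex_refinement(adj, 1))
```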

Sub-graph Queries
The adversary's knowledge is a subgraph adjacent to the target node; candidate nodes are those whose neighborhoods contain that subgraph. More expressive than vertex refinement, but computationally intensive (subgraph matching).

Hub Fingerprint Queries
Suppose Dave and Ed are selected as hubs. A node's fingerprint Fi is the vector of its shortest-path lengths to each hub, with hubs farther than i recorded as 0:
F1(Fred) = (1, 0)
F2(Fred) = (1, 2)
In the open world the adversary's knowledge may be incomplete, so if F1(Fred) = (1, 0), nodes with fingerprints (1, 0) and (1, 1) are both candidates.
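A sketch of fingerprint computation on a toy chain matching the slide's example (graph and node names are my illustration):

```python
from collections import deque

def hub_fingerprint(adj, node, hubs, limit):
    """F_limit(node): shortest-path distance to each hub, with hubs farther
    than `limit` (or unreachable) recorded as 0."""
    dist = {node: 0}
    q = deque([node])
    while q:  # plain BFS from the target node
        v = q.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                q.append(u)
    return tuple(dist[h] if h in dist and dist[h] <= limit else 0 for h in hubs)

# Chain Fred - Dave - Ed, with Dave and Ed as hubs.
adj = {"fred": ["dave"], "dave": ["fred", "ed"], "ed": ["dave"]}
print(hub_fingerprint(adj, "fred", ["dave", "ed"], 1))  # (1, 0)
print(hub_fingerprint(adj, "fred", ["dave", "ed"], 2))  # (1, 2)
```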

Avoiding These Attacks
Require confirmation of links: users confirm friend requests, and the website verifies users.
Identify and remove attack nodes: look for nodes with unusual structure.

k-degree anonymity [7]
Attack addressed: vertex refinement queries with degree knowledge (H1).
Objective: in the published graph, for every node v there exist at least k-1 other nodes with the same degree as v; a minimum number of edges is added so the graph's shape is preserved as much as possible.
Method: add edges to the original graph. First compute a new degree sequence that satisfies k-degree anonymity, then generate the new graph from this degree sequence.
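A greedy sketch of the degree-sequence phase (the paper uses dynamic programming to find the minimum-cost grouping; this simplification only illustrates the idea): sort degrees in decreasing order and raise each block of k to the block's maximum, so every degree value is shared by at least k nodes.

```python
def anonymize_degrees(degrees, k):
    """Greedy k-anonymization of a degree sequence: each block of k
    (in decreasing order) is raised to the block's maximum degree."""
    d = sorted(degrees, reverse=True)
    out = []
    for i in range(0, len(d), k):
        group = d[i:i + k]
        if len(group) < k and out:
            # Fold a short tail into the previous block's degree value.
            out.extend([out[-1]] * len(group))
        else:
            out.extend([group[0]] * len(group))
    return out

print(anonymize_degrees([4, 3, 3, 2, 1, 1], 2))  # [4, 4, 3, 3, 1, 1]
```

Since degrees are only ever raised, the second phase can realize the new sequence by adding edges, as the slide describes.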

k-neighborhood anonymity [8]

Resisting neighborhood attacks through graph generalization [5]
Step 1: Partition the graph so that each partition contains at least k nodes.
Step 2: For each partition, generate a super-node.
Step 3: Draw edges between partitions, weighted by the number of original edges between them, and a self-edge on each super-node, weighted by the number of edges within that partition.
The paper uses simulated annealing to find the partition that maximizes a likelihood function.
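A minimal sketch of the super-node construction on a toy 4-cycle (the partition search itself is omitted; this only shows how edge weights are tallied once a partition is fixed):

```python
def generalize(adj, partition):
    """Collapse each partition block into a super-node. An edge weight
    counts original edges between two blocks; a self-edge weight counts
    the edges inside one block."""
    block_of = {v: i for i, block in enumerate(partition) for v in block}
    weights = {}
    for v in adj:
        for u in adj[v]:
            if v < u:  # count each undirected edge once
                key = tuple(sorted((block_of[v], block_of[u])))
                weights[key] = weights.get(key, 0) + 1
    return weights

# 4-cycle a-b-c-d-a, partitioned into {a, b} and {c, d}.
adj = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}
print(generalize(adj, [{"a", "b"}, {"c", "d"}]))
```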

Outline
What is privacy and trust?
Privacy in social networks
- Basic privacy requirements
- Privacy in graphs
Trust in social networks
References

Mining Privacy in Social Networks
Problems in Web 2.0:
- Activity streams: users are not aware that some of their actions appear as mini-feeds on profiles
- Unwelcome linkage: a friend explicitly posts a link to another user's profile
- Merged social graphs: links of links expose connections across networks

Privacy in Social Data
Different users have different opinions about which data is sensitive.
Websites let users set up access permissions.
A trust network can be constructed from social data.

References
[1] L. Sweeney. k-anonymity: A model for protecting privacy. International Journal on Uncertainty, Fuzziness and Knowledge-Based Systems, 10(5):557-570, 2002.
[2] A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. l-diversity: Privacy beyond k-anonymity. ACM Transactions on Knowledge Discovery from Data (TKDD), 1(1), March 2007.
[3] N. Li, T. Li, and S. Venkatasubramanian. t-closeness: Privacy beyond k-anonymity and l-diversity. ICDE 2007.
[4] X. Xiao and Y. Tao. Dynamic anonymization: Accurate statistical analysis with privacy preservation. SIGMOD 2008, pages 107-120.
[5] M. Hay, G. Miklau, D. Jensen, D. Towsley, and P. Weis. Resisting structural re-identification in anonymized social networks. PVLDB 2008.
[6] L. Backstrom, C. Dwork, and J. Kleinberg. Wherefore art thou R3579X? Anonymized social networks, hidden patterns, and structural steganography. WWW 2007.
[7] K. Liu and E. Terzi. Towards identity anonymization on graphs. SIGMOD 2008.
[8] B. Zhou and J. Pei. Preserving privacy in social networks against neighborhood attacks. ICDE 2008.