1 Information Security CS 526, Topic 21: Data Privacy

2 What is Privacy?
Privacy is the protection of an individual's personal information.
Privacy also covers the rights and obligations of individuals and organizations with respect to the collection, use, retention, disclosure, and disposal of personal information.
Privacy ≠ Confidentiality

3 OECD Privacy Principles
1. Collection Limitation Principle
– There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
2. Data Quality Principle
– Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.

4 OECD Privacy Principles
3. Purpose Specification Principle
– The purposes for which personal data are collected should be specified not later than at the time of data collection, and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
4. Use Limitation Principle
– Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with Principle 3 except:
– a) with the consent of the data subject; or
– b) by the authority of law.

5 OECD Privacy Principles
5. Security Safeguards Principle
– Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.
6. Openness Principle
– There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.

6 OECD Privacy Principles
7. Individual Participation Principle
– An individual should have the right:
– a) to request to know whether or not the data controller has data relating to him;
– b) to request data relating to him, …
– c) to be given reasons if a request is denied; and
– d) to request the data to be rectified, completed or amended.
8. Accountability Principle
– A data controller should be accountable for complying with measures which give effect to the principles stated above.

7 Areas of Privacy
Anonymity
– Anonymous communication: e.g., the Tor software, which defends against traffic analysis
Web privacy
– Understanding and controlling what web sites collect and retain about personal data
Mobile data privacy, e.g., location privacy
Privacy-preserving data usage

8 Privacy-Preserving Data Sharing
The need to share data
– For research purposes, e.g., social, medical, technological
– Mandated by laws and regulations, e.g., the census
– For security/business decision making, e.g., network flow data for Internet-scale alert correlation
– For system testing before deployment
– …
However, publishing data may result in privacy violations

9 GIC Incident [Sweeney 2002]
Group Insurance Commission (GIC, Massachusetts)
– Collected patient data for ~135,000 state employees.
– Gave the data to researchers and sold it to industry.
– The medical record of the former state governor was re-identified.
The published medical records (names removed) could be linked to a public voter list containing names (Bob, Carl, Daisy, Emily, Flora, Gabriel) via the quasi-identifiers:

  Age  Sex  Zip code  Disease
  69   M    47906     Cancer
  65   M    47907     Cancer
  52   F    47902     Flu
  43   F    46204     Gastritis
  42   F    46208     Hepatitis
  47   F    46203     Bronchitis

Re-identification occurs!
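To make the linking concrete, here is a minimal Python sketch of the attack using pandas; the few rows shown and the voter-list contents are illustrative assumptions, not the actual GIC data:

```python
import pandas as pd

# Hypothetical published medical table (no names) and public voter list.
medical = pd.DataFrame({
    "zip": [47906, 47907, 47902], "age": [69, 65, 52],
    "sex": ["M", "M", "F"], "disease": ["Cancer", "Cancer", "Flu"],
})
voters = pd.DataFrame({
    "name": ["Bob", "Carl", "Daisy"],
    "zip": [47906, 47907, 47902], "age": [69, 65, 52], "sex": ["M", "M", "F"],
})

# Join on the quasi-identifiers; unique matches re-identify records.
linked = medical.merge(voters, on=["zip", "age", "sex"])
print(linked[["name", "disease"]])
```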

10 AOL Data Release [NYTimes 2006]
In August 2006, AOL released the search queries of 650,000 users over a 3-month period.
– User IDs were replaced by random numbers.
– 3 days later, AOL pulled the data from public access.
The queries of AOL searcher #4417749 included: "landscapers in Lilburn, GA", queries on the last name "Arnold", "homes sold in shadow lake subdivision Gwinnett County, GA", "num fingers", "60 single men", "dog that urinates on everything".
The New York Times linked these queries to Thelma Arnold, a 62-year-old widow who lives in Lilburn, GA, has three dogs, and frequently searches for her friends' medical ailments.
Re-identification occurs!

11 Netflix Movie Rating Data [Narayanan and Shmatikov 2009]
Netflix released anonymized movie rating data for its Netflix Prize challenge
– With the date and value of each movie rating
Knowing 6-8 approximate movie ratings and dates is enough to uniquely identify a record with over 90% probability
– Correlating with a set of 50 users from imdb.com yielded two identified records
Netflix cancelled the second phase of the challenge
Re-identification occurs!

12 Genome-Wide Association Study (GWAS) [Homer et al. 2008]
A typical study examines thousands of single-nucleotide polymorphism (SNP) locations in a given population of patients for statistical links to a disease.
From the aggregated statistics, one individual's genome, and knowledge of SNP frequencies in the background population, one can infer participation in the study.
– The frequency of each SNP gives a very noisy signal of participation; combining thousands of such signals gives a high-confidence prediction

  SNP       Study group avg  Population avg  Target individual
  SNP 1=A   43%              42%             yes
  SNP 2=A   11%                              no
  SNP 3=A   58%              59%             no
  SNP 4=A   23%              22%             yes
  …

Membership disclosure occurs!
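A simplified sketch, in the spirit of the Homer et al. approach but not their exact test statistic, showing how many weak per-SNP signals can be combined; all numbers are illustrative:

```python
# For each SNP, ask whether the target's allele indicator (0 or 1) is
# closer to the published study-group frequency than to the background
# population frequency, and sum the differences across SNPs.
target = [1, 0, 0, 1]                # target has allele A at SNP 1 and SNP 4
study  = [0.43, 0.11, 0.58, 0.23]    # published study-group frequencies
pop    = [0.42, 0.11, 0.59, 0.22]    # background population frequencies (illustrative)

score = sum(abs(t - p) - abs(t - s) for t, s, p in zip(target, study, pop))
# A positive score (by a margin calibrated on known non-members) suggests
# membership; each SNP is a very noisy signal, but thousands add up.
print(score)
```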

13 Need for Data Privacy Research
Identification disclosure (GIC, AOL, Netflix)
– Reveals which individual a record belongs to
Attribute disclosure
– Leaks more precise information about the attribute values of some individual
Membership disclosure (GWAS)
– Leaks whether an individual participated in the dataset
Research program: develop theory and techniques to anonymize data so that they can be beneficially used without privacy violations.
– How to define privacy for anonymized data?
– How to publish data that satisfies privacy while providing utility?

14 k-Anonymity [Sweeney, Samarati]
Privacy is "protection from being brought to the attention of others."
k-Anonymity: each record is indistinguishable from ≥ k-1 other records when only "quasi-identifiers" are considered
– These k records form an equivalence class
To achieve k-anonymity, use:
– Generalization: replace values with less-specific ones (e.g., Gender: Male, Female → *; Zipcode: 47677, 47678, 47602 → 476**; Age: 29, 27, 22 → 2*)
– Suppression: remove outliers

15 Example of k-Anonymity & Generalization
The microdata (QI = Zipcode, Age, Gen; SA = Disease):

  Zipcode  Age  Gen  Disease
  47677    29   F    Ovarian Cancer
  47602    22   F    Ovarian Cancer
  47678    27   M    Prostate Cancer
  47905    43   M    Flu
  47909    52   F    Heart Disease
  47906    47   M    Heart Disease

The generalized table, a 3-anonymous table:

  Zipcode  Age      Gen  Disease
  476**    2*       *    Ovarian Cancer
  476**    2*       *    Ovarian Cancer
  476**    2*       *    Prostate Cancer
  4790*    [43,52]  *    Flu
  4790*    [43,52]  *    Heart Disease
  4790*    [43,52]  *    Heart Disease

The adversary knows Alice's QI values (47677, 29, F), but does not know which of the first 3 records corresponds to Alice's record.
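A minimal Python sketch of checking k-anonymity on the generalized table above; the helper name and record format are our own, not from any particular anonymization library:

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """Check whether every combination of quasi-identifier values appears
    in at least k records, i.e., every equivalence class has size >= k."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

# The generalized table from the example above.
table = [
    {"zip": "476**", "age": "2*", "gen": "*", "disease": "Ovarian Cancer"},
    {"zip": "476**", "age": "2*", "gen": "*", "disease": "Ovarian Cancer"},
    {"zip": "476**", "age": "2*", "gen": "*", "disease": "Prostate Cancer"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Flu"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Heart Disease"},
    {"zip": "4790*", "age": "[43,52]", "gen": "*", "disease": "Heart Disease"},
]
print(is_k_anonymous(table, ["zip", "age", "gen"], k=3))  # True
```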

16 Attacks on k-Anonymity
A 3-anonymous patient table (each row represents an equivalence class of records):

  Zipcode  Age  Disease
  476**    2*   Heart Disease
  4790*    ≥40  Flu, Heart Disease, Cancer
  476**    3*   Heart Disease, Cancer

k-anonymity does not provide privacy if:
– Sensitive values lack diversity: homogeneity attack. Bob (Zipcode 47678, Age 27) falls in the first class, so Bob must have heart disease.
– The attacker has background knowledge: background knowledge attack. Carl (Zipcode 47673, Age 36) falls in the third class; knowing that Carl does not have heart disease reveals his disease.

17 l-Diversity [Machanavajjhala et al. 2006]
Principle
– Each equivalence class contains at least l "well-represented" sensitive values
Instantiations
– Distinct l-diversity: each equivalence class contains at least l distinct sensitive values
– Entropy l-diversity: entropy(equivalence class) ≥ log2(l)
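A small Python sketch of the two instantiations, computed for a single equivalence class; the function names are illustrative:

```python
import math
from collections import Counter

def distinct_l_diversity(sensitive_values):
    """Number of distinct sensitive values in one equivalence class."""
    return len(set(sensitive_values))

def entropy_l_diversity(sensitive_values):
    """Largest l such that entropy(class) >= log2(l), i.e., 2**entropy."""
    counts = Counter(sensitive_values)
    n = len(sensitive_values)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 2 ** entropy

# Example equivalence class with three distinct diseases.
print(distinct_l_diversity(["Flu", "Flu", "Cancer", "Gastritis"]))  # 3
print(entropy_l_diversity(["Flu", "Flu", "Cancer", "Gastritis"]))   # ~2.83
```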

18 The Skewness Attack on l-Diversity
l-diversity does not consider the overall distribution of sensitive values
Suppose the sensitive attribute has two values: HIV positive (1% of the population) and HIV negative (99%)
An equivalence class with the highest possible diversity can still pose a serious privacy risk
– Consider an equivalence class containing an equal number of positive and negative records: anyone in it appears HIV positive with probability 50%, far above the 1% baseline
l-diversity also does not differentiate between:
– Equivalence class 1: 49 positive + 1 negative
– Equivalence class 2: 1 positive + 49 negative

19 The Similarity Attack on l-Diversity
A 3-diverse patient table (each row represents an equivalence class of 3 records):

  Zipcode  Age  Salary          Disease
  476**    2*   20K, 30K, 40K   Gastric Ulcer, Gastritis, Stomach Cancer
  4790*    ≥40  50K, 100K, 70K  Gastritis, Flu, Bronchitis
  476**    3*   60K, 80K, 90K   Bronchitis, Pneumonia, Stomach Cancer

Bob (Zipcode 47678, Age 27) falls in the first equivalence class. Conclusions:
1. Bob's salary is in [20K, 40K], which is relatively low.
2. Bob has some stomach-related disease.
l-diversity does not consider the semantic meanings of sensitive values

20 t-Closeness
Principle: the distribution of sensitive attribute values in each equivalence class should be close to that of the overall dataset (distance ≤ t)
– Assumes that publishing a completely generalized table is always acceptable
– Earth Mover's Distance is used to capture semantic relationships among sensitive attribute values
(n,t)-closeness: the distribution of sensitive attribute values in each equivalence class should be close to that of some natural super-group consisting of at least n tuples
N. Li, T. Li, S. Venkatasubramanian: t-Closeness: Privacy Beyond k-Anonymity and l-Diversity. ICDE 2007. Journal version in TKDE 2010.
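For a numerical sensitive attribute with the ordered-distance metric, the Earth Mover's Distance reduces to a normalized sum of cumulative differences. A minimal sketch, assuming both distributions are given over the same ordered buckets:

```python
def emd_ordered(p, q):
    """Earth Mover's Distance between two distributions over the same
    ordered domain of m values, using the normalized rank distance."""
    m = len(p)
    cumulative, total = 0.0, 0.0
    for pi, qi in zip(p, q):
        cumulative += pi - qi
        total += abs(cumulative)
    return total / (m - 1)

# Overall salary distribution over 9 buckets vs. a class concentrated
# at the three lowest salaries (cf. the similarity-attack example).
overall = [1 / 9] * 9
class1 = [1 / 3, 1 / 3, 1 / 3, 0, 0, 0, 0, 0, 0]
print(emd_ordered(class1, overall))  # 0.375: a large distance, violating small t
```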

21 From Syntactic Privacy Notions to Differential Privacy
Limitations of the previous privacy notions:
– They require identifying which attributes are quasi-identifiers or sensitive, which is not always possible
– They are difficult to pin down because of background knowledge
– They are syntactic in nature (properties of the anonymized dataset), and not exhaustive in the inferences they prevent
Differential Privacy [Dwork et al. 2006]
– Privacy is not violated if one's information is not included
– The output should not depend too much on any single tuple

22 Variants of Differential Privacy
Bounded differential privacy: D and D' are neighbors if and only if D' can be obtained from D by replacing one tuple with another tuple
– D and D' have the same number of tuples
– Revealing the size of the dataset does not affect privacy
Unbounded differential privacy: D and D' are neighbors if and only if D' can be obtained from D by adding or removing one tuple
– The numbers of tuples in D and D' differ by 1
In most cases, either one can be used. Other definitions of neighboring datasets have also been considered.

23 One Primitive to Satisfy Differential Privacy: Add Noise to the Output
Intuition: f(D) can be released accurately when f is insensitive to individual entries x1, …, xn
Global sensitivity: GS_f = max over neighbors D, D' of ||f(D) - f(D')||_1
– Example: GS_average = 1/n for sets of n numbers between 0 and 1
Theorem: releasing f(D) + Lap(GS_f / ε) is ε-indistinguishable (i.e., satisfies ε-differential privacy)
– The noise is generated from the Laplace distribution
(Setting: the user asks the database for f(D); the database returns f(D) + noise.)
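A minimal sketch of the Laplace mechanism using NumPy; the average query and its 1/n sensitivity follow the example above, and the parameter choices are illustrative:

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=np.random.default_rng()):
    """Return true_answer + Lap(sensitivity / epsilon), which satisfies
    epsilon-DP for a query with the given global sensitivity."""
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Average of n values in [0, 1] has global sensitivity 1/n.
data = [0.2, 0.9, 0.4, 0.7, 0.5]
noisy_avg = laplace_mechanism(np.mean(data), sensitivity=1 / len(data), epsilon=0.5)
print(noisy_avg)
```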

24 Sensitivity with Laplace Noise

25 Another Primitive for Satisfying Differential Privacy: the Exponential Mechanism
The goal is to output f(D), where f(D) ∈ R, a set of candidate outputs
– E.g., which item is purchased the most frequently
Define a quality function q(D, r) for r ∈ R
– A real number describing the desirability of outputting r on input dataset D
Compute the sensitivity of the quality function
– Δq = max over r and over neighbors D, D' of |q(D, r) - q(D', r)|
Returning r with probability proportional to exp(ε · q(D, r) / (2 · Δq)) satisfies ε-DP
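A minimal sketch of the exponential mechanism for the "most frequently purchased item" example; the quality function and dataset are illustrative:

```python
import numpy as np

def exponential_mechanism(candidates, quality, dataset, epsilon, sensitivity,
                          rng=np.random.default_rng()):
    """Sample an index r with probability proportional to
    exp(epsilon * q(D, r) / (2 * sensitivity)); this satisfies epsilon-DP."""
    scores = np.array([quality(dataset, r) for r in candidates], dtype=float)
    # Subtract the max score for numerical stability before exponentiating.
    weights = np.exp(epsilon * (scores - scores.max()) / (2 * sensitivity))
    probs = weights / weights.sum()
    return rng.choice(len(candidates), p=probs)

# q(D, r) = number of purchases of item r; one purchase changes a count by <= 1.
purchases = ["apple", "apple", "banana", "apple", "cherry"]
items = ["apple", "banana", "cherry"]
count = lambda D, r: D.count(r)
idx = exponential_mechanism(items, count, purchases, epsilon=1.0, sensitivity=1.0)
print(items[idx])
```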

26 Properties of Differential Privacy
Composability
– If A1 satisfies ε1-DP and A2 satisfies ε2-DP, then outputting both A1 and A2 satisfies (ε1 + ε2)-DP
Some queries, such as counting queries, can be answered relatively accurately
– One tuple affects the result by at most 1
– A small amount of noise (following the Laplace distribution) suffices to achieve DP
Some queries are hard to answer
– E.g., max, since it can be greatly affected by a single tuple
Challenge in using DP
– Finding suitable queries to ask so that the noisy answers provide the most utility
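A small sketch of sequential composition: a total budget of ε = 1.0 is split across two counting queries, each answered with Laplace noise; the data and the budget split are illustrative:

```python
import numpy as np

rng = np.random.default_rng()
ages = [29, 22, 27, 43, 52, 47]
eps1, eps2 = 0.5, 0.5  # sequential composition: eps1 + eps2 = total budget of 1.0

# Counting queries have sensitivity 1: one tuple changes a count by at most 1.
noisy_under_30 = sum(a < 30 for a in ages) + rng.laplace(scale=1 / eps1)
noisy_over_40 = sum(a > 40 for a in ages) + rng.laplace(scale=1 / eps2)
print(noisy_under_30, noisy_over_40)  # releasing both answers satisfies 1.0-DP
```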

27 Relaxing Differential Privacy
Satisfying differential privacy can result in too much distortion in many applications
– There have been several efforts to relax differential privacy
Differential privacy means that one cannot distinguish between D ∪ {t} and D
– Even with precise knowledge of both D and t
Relaxation: one cannot distinguish between D ∪ {t} and D given precise knowledge of t but only statistical knowledge of D
– This is the setting of the GWAS attack
– The GIC, AOL, and Netflix attacks require only precise knowledge of t

28 Differential Privacy under Sampling
N. Li, W. Qardaji, D. Su: On Sampling, Anonymization, and Differential Privacy: or, k-Anonymization Meets Differential Privacy. ASIACCS 2012.

29 The Amplification Effect of Sampling
Randomly sampling the input before running a differentially private algorithm amplifies the privacy guarantee: a smaller sampling rate achieves stronger privacy protection.
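As a concrete illustration, the commonly cited amplification-by-sampling bound can be computed as follows; this is stated as the standard bound rather than the paper's exact theorem, so see the paper for the precise statement and conditions:

```python
import math

def amplified_epsilon(epsilon, beta):
    """Standard privacy-amplification-by-sampling bound: running an
    epsilon-DP algorithm on a random beta-fraction sample of the data
    satisfies eps'-DP with eps' = ln(1 + beta * (e^epsilon - 1))."""
    return math.log(1 + beta * (math.exp(epsilon) - 1))

print(amplified_epsilon(epsilon=1.0, beta=0.1))  # ~0.16, much stronger than 1.0
```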

30 Safe k-Anonymization Meets Differential Privacy
k-Anonymization here has two steps:
1. Output how the domain is partitioned; the algorithm is strongly safe if this step does not depend on the dataset
2. Output how many tuples there are in each cell, outputting 0 when a cell contains fewer than k tuples
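A minimal sketch of these two steps; the fixed zip-code-prefix partition and the record format are illustrative assumptions, and the differential privacy guarantee comes from combining this with random pre-sampling of the input, which is not shown here:

```python
from collections import Counter

def strongly_safe_k_anonymize(sample, cell_of, cells, k):
    """Step 1: the partition (cell_of, cells) is fixed without looking at
    the data, so it is 'strongly safe'. Step 2: release the tuple count of
    each cell, outputting 0 for cells with fewer than k tuples."""
    counts = Counter(cell_of(r) for r in sample)
    return {cell: (counts[cell] if counts[cell] >= k else 0) for cell in cells}

# Data-independent partition: group records by a fixed zip-code prefix.
cells = ["476**", "479**"]
cell_of = lambda r: r["zip"][:3] + "**"
sample = [{"zip": "47677"}, {"zip": "47678"}, {"zip": "47602"}, {"zip": "47905"}]
print(strongly_safe_k_anonymize(sample, cell_of, cells, k=3))  # {'476**': 3, '479**': 0}
```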

31 Towards Membership Privacy
A series of papers has challenged differential privacy:
– Differential privacy is not robust to arbitrary background knowledge (Kifer and Machanavajjhala)
– It is difficult to choose ε in DP; differential identifiability is proposed instead (Lee and Clifton)
– Differential privacy does not prevent attribute disclosure (Cormode)
A framework is needed to examine these privacy notions
– Privacy violation = positive membership disclosure
– No membership disclosure implies no attribute disclosure and no re-identification disclosure

32 Intuition for Membership Privacy
The adversary has some prior belief about what the input dataset is (a probability distribution over all possible datasets)
– This gives the prior probability of any tuple t's membership
The adversary updates this belief after observing the output of the algorithm, via Bayes' rule
– This yields the posterior probability of any t's membership
For any t, the posterior belief should not change too much from the prior
– Whether this holds depends on the prior distribution
Membership privacy is therefore relative to the family of prior distributions the adversary is allowed to have
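A small sketch of the Bayes update, showing that when the observed output is at most e^ε times more likely with t in the data than without, the posterior stays close to the prior; the numbers are illustrative:

```python
import math

def posterior_membership(prior, p_output_if_in, p_output_if_out):
    """Bayes update of the adversary's belief that tuple t is in the
    dataset, given the probability of the observed output with t in
    versus out of the data."""
    joint_in = prior * p_output_if_in
    joint_out = (1 - prior) * p_output_if_out
    return joint_in / (joint_in + joint_out)

eps = 0.5
# Output is e^eps times more likely when t is in the data: prior 0.5 -> posterior ~0.62.
print(posterior_membership(prior=0.5,
                           p_output_if_in=math.exp(eps) * 0.1,
                           p_output_if_out=0.1))
```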

33 The Membership Privacy Framework
N. Li, W. Qardaji, D. Su, Y. Wu, W. Yang: Membership Privacy: A Unifying Framework for Privacy Definitions. ACM CCS 2013.

34 Results from Membership Privacy
Membership privacy for the family of all possible distributions is infeasible
– It requires publishing similar output distributions for two completely different datasets
– The output then has (almost) no utility
Moral: one has to make some assumptions about the adversary's prior belief
– The assumptions need to be clearly specified and reasonable

35 Results from Membership Privacy
Differential privacy is equivalent to membership privacy under the family of all distributions in which the entities/tuples are mutually independent
Differential privacy is insufficient for membership privacy without this independence assumption

36 Instances of Membership Privacy Notions
Families of prior distributions and the privacy notions they induce:
– All distributions: privacy with almost no utility
– Mutually independent (MI) distributions, Pr[T] = ∏_{t∈T} p_t · ∏_{t∉T} (1 - p_t): unbounded differential privacy
– Bounded MI distributions (MI distributions conditioned on all datasets having the same size): bounded differential privacy
– MI distributions where each p_t is either 0 or β: differential privacy under sampling
– MI distributions where each p_t is either 1 or 1/m, where m is the number of tuples whose probability is not 1: differential identifiability
– MI distributions where each p_t is 1/2: a new privacy notion

37 Provably Private Publishing of Datasets While Preserving Utility
Non-interactive data publishing rather than interactive query answering
Diverse problem domains require different methods
– e.g., depending on the number of dimensions
Methodology: combine analysis of how an algorithm performs with experimental validation

38 An Example: Differentially Private Publishing of Geospatial Data
Goal: publish a synopsis of a 2-D dataset while ensuring high accuracy for range queries
Uniform Grid method:
– Add noise to the point count of each grid cell
– Choose the grid size to balance the error due to added noise against the error due to non-uniformity within cells
Adaptive Grid method:
– Two-level partitioning
W. Qardaji, W. Yang, N. Li: Differentially Private Grids for Geospatial Data. ICDE 2013.
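A rough sketch of the Uniform Grid method; the grid-size heuristic m ≈ sqrt(n·ε/c) follows the flavor of the paper's error analysis, but the constant c and the example data are purely illustrative:

```python
import math
import numpy as np

def uniform_grid_synopsis(points, bounds, epsilon, c=10.0, rng=np.random.default_rng()):
    """Partition the 2-D domain into an m x m grid and release Laplace-noised
    cell counts. The grid size balances noise error (finer grids add more
    noise) against non-uniformity error (coarser cells hide structure)."""
    (xmin, xmax), (ymin, ymax) = bounds
    n = len(points)
    m = max(1, int(round(math.sqrt(n * epsilon / c))))  # illustrative heuristic
    counts = np.zeros((m, m))
    for x, y in points:
        i = min(int((x - xmin) / (xmax - xmin) * m), m - 1)
        j = min(int((y - ymin) / (ymax - ymin) * m), m - 1)
        counts[i, j] += 1
    # Counting queries have sensitivity 1 per cell.
    return counts + rng.laplace(scale=1 / epsilon, size=counts.shape)

pts = [(0.1, 0.2), (0.15, 0.22), (0.8, 0.9)]
print(uniform_grid_synopsis(pts, bounds=((0, 1), (0, 1)), epsilon=1.0))
```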

39 Coming Attractions … Role-Based and Attribute-Based Access Control

