
1 An Introduction to Differential Privacy and its Applications [1]
Ali Bagherzandi, Ph.D. candidate, University of California at Irvine
[1] Most slides in this presentation are from Cynthia Dwork's talk.

2 Goal of Differential Privacy: Privacy-Preserving Statistical Analysis of Confidential Data
- Finding statistical correlations
  - Analyzing medical data to learn genotype/phenotype associations
  - Correlating a cough outbreak with a chemical plant malfunction
  - Can't be done with HIPAA safe-harbor sanitized data
- Noticing events
  - Detecting a spike in ER admissions for asthma
- Official statistics
  - Census contingency table release (statistics for small sets of attributes)
  - Model fitting, regression analyses, etc.
- Standard data mining tasks
  - Clustering; learning association rules, decision trees, separators; principal component analysis

3 Two Models: Non-Interactive
Data are sanitized and released.
[Diagram: Database → sanitizer K → Sanitized Database]

4 Two Models: Interactive
Multiple queries, adaptively chosen.
[Diagram: Database, mechanism K, and the analyst's queries and answers]

5 Achievements for Statistical Databases
- Defined differential privacy; proved the classical goal (Dalenius, 1977) unachievable
- An "ad omnia" definition, independent of linkage information
- A general approach with a rigorous proof: relates the degree of distortion to the (mathematical) sensitivity of the computation needed for the analysis ("how much" can the data of one person affect the outcome?)
- A cottage industry: redesigning algorithms to be insensitive
- Assorted extensions: when noise makes no sense; when actual sensitivity is much less than worst-case; when the database is distributed; …
- Lower bounds on distortion, initiated by Dinur and Nissim

6 Semantic Security for Confidentiality [Goldwasser-Micali '82]
Vocabulary:
- Plaintext: the message to be transmitted
- Ciphertext: the encryption of the plaintext
- Auxiliary information: anything else known to the attacker
Guarantee: the ciphertext leaks no information about the plaintext.
Formalization: compare the ability of someone seeing aux and ciphertext to guess (anything about) the plaintext with the ability of someone seeing only aux to do the same thing. The difference should be "tiny".

7 Semantic Security for Privacy?
Dalenius, 1977: anything that can be learned about a respondent from the statistical database can be learned without access to the database.
Happily, this formalizes to semantic security.
Unhappily, it is unachievable [Dwork and Naor 2006], for both non-serious and serious reasons.

8 Semantic Security for Privacy? The Serious Reason (told as a parable)
- The database teaches the average heights of population subgroups.
- "Terry Tao is four inches taller than the average Swedish woman."
- Access to the DB teaches Terry's height: it is learnable from the DB, not learnable without it.
- The proof extends to "any" notion of privacy breach.
- The attack works even if Terry is not in the DB!
- This suggests a new notion of privacy, the risk incurred by joining the DB: compare the risk when in vs. not in the DB, rather than before vs. after interacting.

9 Differential Privacy: Formal Definition
K gives ε-differential privacy if, for all values of DB and DB' differing in a single element, and all S in Range(K):
Pr[K(DB) ∈ S] / Pr[K(DB') ∈ S] ≤ e^ε ≈ (1 + ε)
[Figure: the two output distributions; the ratio of Pr[t] under DB and DB' is bounded.]

10 Achieving Differential Privacy: f + noise
f: DB → R^d
K(f, DB) = f(DB) + [Noise]^d
E.g., Count(P, DB) = # rows in DB with property P.
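The output-perturbation template on this slide can be sketched in a few lines of Python. This is an illustrative sketch, not code from the talk: the toy database, the property predicate, and the generic noise_sampler parameter are assumptions here; slide 12 fixes the noise to a Laplace distribution calibrated to the sensitivity of f.

```python
import random

def count_query(db, has_property):
    """Count(P, DB): the number of rows in DB satisfying property P."""
    return sum(1 for row in db if has_property(row))

def release(true_answer, noise_sampler):
    """K(f, DB) = f(DB) + noise, for a scalar-valued f (d = 1)."""
    return true_answer + noise_sampler()

# Toy database: each row is a record with an 'asthma' flag.
db = [{"asthma": True}, {"asthma": False}, {"asthma": True}, {"asthma": False}]
true_count = count_query(db, lambda row: row["asthma"])
noisy_count = release(true_count, lambda: random.uniform(-1.0, 1.0))  # placeholder noise
print(true_count, noisy_count)
```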

11 Sensitivity of a Function: How Much Can f(DB + Me) Exceed f(DB - Me)?
Recall: K(f, DB) = f(DB) + noise. The question asks: what difference must the noise obscure?
Δf = max_{H(DB, DB') = 1} ||f(DB) - f(DB')||_1
E.g., ΔCount = 1.

12 Calibrate Noise to Sensitivity
Δf = max_{H(DB, DB') = 1} ||f(DB) - f(DB')||_1
Lap(R): density f(x) = (1/2R) exp(-|x|/R)
Theorem: To achieve ε-differential privacy, use scaled symmetric noise [Lap(R)]^d with R = Δf/ε.
Increasing R flattens the curve: more privacy. The noise depends on f and ε, not on the database.
[Figure: Lap(R) density plotted over -4R … 5R.]
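A minimal sketch of the calibration theorem, assuming NumPy; the ε values and the counting query with ΔCount = 1 are illustrative choices, not from the talk.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Release f(DB) + [Lap(R)]^d with R = Δf / ε."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon                        # R = Δf / ε
    noise = rng.laplace(loc=0.0, scale=scale, size=np.shape(true_answer))
    return true_answer + noise

# ΔCount = 1, so a counting query needs only Lap(1/ε) noise.
true_count = 42
for eps in (0.1, 1.0):    # smaller ε means larger R: flatter curve, more privacy
    print(eps, laplace_mechanism(true_count, sensitivity=1.0, epsilon=eps))
```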

13 Why does it work? (d = 1)
Pr[K(f, DB') = t] / Pr[K(f, DB) = t] = exp((|t - f(DB)| - |t - f(DB')|)/R) ≤ exp(Δf/R) = e^ε
where Δf = max_{DB, DB-Me} |f(DB) - f(DB-Me)|.
Theorem: To achieve ε-differential privacy, use scaled symmetric noise Lap(R) with R = Δf/ε.
[Figure: Lap(R) densities centered at f(DB) and f(DB'), plotted over -4R … 5R.]
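The inequality can be checked numerically. In the sketch below (an illustration, not from the talk) the values f(DB) = 10.0, f(DB') = 10.8, Δf = 1, and ε = 1 are assumptions; the maximum density ratio over a grid of outputs t stays below e^ε.

```python
import numpy as np

def laplace_density(t, center, scale):
    """Lap(R) density: (1 / 2R) * exp(-|t - center| / R)."""
    return np.exp(-np.abs(t - center) / scale) / (2 * scale)

delta_f, eps = 1.0, 1.0
R = delta_f / eps                       # calibrate the noise scale to the sensitivity
f_db, f_db_prime = 10.0, 10.8           # neighbouring databases: |f(DB) - f(DB')| ≤ Δf

t = np.linspace(0.0, 20.0, 2001)        # a grid of possible outputs
ratio = laplace_density(t, f_db_prime, R) / laplace_density(t, f_db, R)
print(ratio.max(), "<=", np.exp(eps))   # max ratio is exp(0.8) ≈ 2.23, below e^1 ≈ 2.72
```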

14 Differential Privacy: Example
Query: average claim. Aux input: a fraction C of the entries.

ID   Claim ('000 $)    ID   Claim ('000 $)
 1    9.91              7   10.63
 2    9.16              8   12.54
 3   10.59              9    9.29
 4   11.27             10    8.92
 5   10.50             11   20.00
 6   11.89             12    8.55

Without noise, an adversary can learn my claim with probability C.
What privacy level is enough for me? I feel safe if I am hidden among n people, i.e., Pr[identify] = 1/n.
With n = 100 and C = 0.01 as in this example, ε = 1. In most applications the same ε gives an even stronger sense of privacy, e.g., because the adversary's identification is itself probabilistic. Typical values: ε = 0.1, 1.0, ln(2).

15 Differential Privacy: Example
Query: average claim. Aux input: 12 entries. ε = 1 ⇒ hide me among 1200 people.
(Same claims table as on slide 14.)
Δf = max_{DB, DB-Me} |f(DB) - f(DB-Me)| = max_{DB, DB-Me} |Avg(DB) - Avg(DB-Me)| = 11.10 - 10.29 = 0.81
Add noise: Lap(Δf/ε) = Lap(0.81/1) = Lap(0.81)
Var = 1.31 ⇒ roughly, respond with the true value ± 1.31 ⇒ the average is not a sensitive function!
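The numbers on this slide can be reproduced with a short script; the claims array below is the table from slide 14 and is the only input, and the rest follows the slide's computation (leave-one-out change of the average, then Lap(Δf/ε) noise with ε = 1).

```python
import numpy as np

claims = np.array([9.91, 9.16, 10.59, 11.27, 10.50, 11.89,
                   10.63, 12.54, 9.29, 8.92, 20.00, 8.55])
eps = 1.0

avg_all = claims.mean()
# Δf: the largest change in the average when any one record is removed.
delta_f = max(abs(avg_all - np.delete(claims, i).mean()) for i in range(len(claims)))
print(round(avg_all, 2), round(delta_f, 2))              # ≈ 11.10 and ≈ 0.81

scale = delta_f / eps                                    # Lap(0.81)
noisy_avg = avg_all + np.random.default_rng().laplace(0.0, scale)
print(round(noisy_avg, 2), "variance =", round(2 * scale**2, 2))   # 2R² ≈ 1.31
```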

16 Differential Privacy: Example
Query: max claim. Aux input: 12 entries. ε = 1 ⇒ hide me among 1200 people.
(Same claims table as on slide 14.)
Δf = max_{DB, DB-Me} |f(DB) - f(DB-Me)| = max_{DB, DB-Me} |Max(DB) - Max(DB-Me)| = 20 - 12.54 = 7.46
Add noise: Lap(Δf/ε) = Lap(7.46/1) = Lap(7.46)
Var ≈ 111.30 ⇒ roughly, respond with the true value ± 111.30 ⇒ the max function is very sensitive!
Useless: the error exceeds the sampling error.
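The same script adapted to the max query (an illustration on the same toy data) shows why the calibrated noise swamps the answer: removing the largest claim changes the maximum from 20.00 to 12.54, so Δf = 7.46.

```python
import numpy as np

claims = np.array([9.91, 9.16, 10.59, 11.27, 10.50, 11.89,
                   10.63, 12.54, 9.29, 8.92, 20.00, 8.55])
eps = 1.0

true_max = claims.max()
# Δf for the max: the largest change when any one record is removed.
delta_f = max(abs(true_max - np.delete(claims, i).max()) for i in range(len(claims)))
print(true_max, round(delta_f, 2))                       # 20.0 and 7.46

scale = delta_f / eps                                    # Lap(7.46)
noisy_max = true_max + np.random.default_rng().laplace(0.0, scale)
print(round(noisy_max, 2), "variance =", round(2 * scale**2, 2))   # 2R² ≈ 111.30: useless
```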

17 A Natural Relaxation: (ε, δ)-Differential Privacy
For all DB, DB' differing in at most one element, and for all subsets S of Range(K):
Pr[K(DB) ∈ S] ≤ e^ε · Pr[K(DB') ∈ S] + δ
where δ(n) is negligible.
Cf.: ε-differential privacy is unconditional, independent of n.

18 Feasibility Results [Dwork Nissim '04] (here: a revisionist version)
Database D = {d_1, …, d_n}, rows in {0,1}^k.
T "counting queries" q = (S, g), where S ⊆ [1..n] and g: rows → {0,1}. "How many rows in S satisfy g?": q(D) = ∑_{i ∈ S} g(d_i).
Privacy mechanism: add binomial noise, r = q(D) + B(f(n), ½).
SuLQ Theorem: (δ, T)-differential privacy when T(n) = O(n^c) for 0 < c < 1, δ = 1/O(n^{c'}) for 0 < c' < c/2, and f(n) = (T(n)/δ²) log⁶(n).
If √T(n)/δ < O(√n), then |noise| < |sampling error|!
E.g., T = n^{3/4} and δ = 1/10 give √T/δ = 10·n^{3/8} << √n.
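A hedged sketch of this mechanism in Python follows. The centering of the binomial noise (subtracting f(n)/2 so answers are unbiased), the toy 0/1 data, the subset S, and the predicate g are assumptions of the sketch, not details from the slide; the theorem's choice f(n) = (T(n)/δ²)·log⁶(n) is noted in a comment, while the demo uses a small f_n so the output stays readable.

```python
import numpy as np

rng = np.random.default_rng()
n, k = 1000, 4
D = rng.integers(0, 2, size=(n, k))          # n rows in {0,1}^k

def counting_query(D, S, g):
    """q(D) = number of rows d_i, i in S, that satisfy g."""
    return sum(int(g(D[i])) for i in S)

def sulq_answer(D, S, g, f_n):
    """r = q(D) + (B(f_n, 1/2) - f_n/2): centered binomial noise.
    The theorem takes f_n = (T(n) / delta**2) * log(n)**6."""
    noise = rng.binomial(f_n, 0.5) - f_n / 2
    return counting_query(D, S, g) + noise

S = range(0, n, 2)                           # an arbitrary subset of the rows
g = lambda row: row[0] == 1                  # an arbitrary 0/1 predicate on rows
print(counting_query(D, S, g), sulq_answer(D, S, g, f_n=400))
```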

19 Impossibility Results: a Line of Research Initiated by Dinur and Nissim '03
Blatant non-privacy: the adversary correctly guesses all but o(n) rows.
2^n queries: noise bound E permits correctly guessing n - 4E rows [DiNi]. Depressing implications for the non-interactive case.
Blatant non-privacy holds, even against an adversary restricted to the stated number of queries and computation, if:
- all n answers are within o(√n) of the true answer, with n queries and poly(n) computation [Y];
- … of the cn answers are within o(√n) of the true answer, with cn queries and poly(n) computation [DMT];
- (1/2 + …) of the c'n answers are within o(√n) of the true answer, with c'n queries and exp(n) computation [DMT].
These results are independent of how the noise is distributed. A variant model permits poly(n) computation in the final case [DY].
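A toy illustration of the Dinur-Nissim reconstruction idea, under stated assumptions: the queries are random 0/1 subset masks, the noise bound is E = 1, and the candidate search is a brute-force enumeration over all 2^n databases (so n must be tiny); the attacks referenced on this slide use far more efficient linear-programming decoders. Any candidate consistent with all the noisy answers must agree with the true database on most rows.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, num_queries, E = 10, 40, 1            # tiny database; every answer is off by at most E

secret = rng.integers(0, 2, size=n)                       # the true 0/1 database
queries = rng.integers(0, 2, size=(num_queries, n))       # random subsets as 0/1 masks
answers = queries @ secret + rng.integers(-E, E + 1, size=num_queries)  # noisy subset sums

# Find any candidate database whose answers are within E of every released answer.
for bits in itertools.product([0, 1], repeat=n):
    candidate = np.array(bits)
    if np.all(np.abs(queries @ candidate - answers) <= E):
        break

agreement = np.mean(candidate == secret)
print(f"candidate agrees with the secret database on {agreement:.0%} of rows")
```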

20 Thank You!

