
1 Negotiated Privacy CS551/851 Cryptography Applications Bistro, Mike McNett, 30 March 2004. References: Stanislaw Jarecki, Pat Lincoln, Vitaly Shmatikov. Negotiated Privacy. Dawn Xiaodong Song, David Wagner, Adrian Perrig. Practical Techniques for Searches on Encrypted Data. Brent R. Waters, Dirk Balfanz, Glenn Durfee, and D. K. Smetters. Building an Encrypted and Searchable Audit Log.

2 Negotiated Privacy Necessary? World Wide Web Consortium (W3C) Platform for Privacy Preferences (P3P) Project (http://www.w3.org/P3P/) "The Platform for Privacy Preferences Project (P3P) … is emerging as an industry standard providing a simple, automated way for users to gain more control over the use of personal information on Web sites they visit. … P3P enhances user control by putting privacy policies where users can find them, in a form users can understand, and, most importantly, enables users to act on what they see." NOTE: 10 February 2004, W3C P3P 1.1 First Public Working Draft

3 Why is it Really Necessary? “The way to have good and safe government, is not to trust it all to one, but to divide it among the many...[It is] by placing under every one what his own eye may superintend, that all will be done for the best.” Thomas Jefferson to Joseph Cabell (Feb. 2, 1816) It’s necessary because Mr. Jefferson said so!

4 Outline Application Areas Options for Privacy Management What Negotiated Privacy Is What Negotiated Privacy Is Not Implementation Details Limitations Conclusion

5 Application Areas Health data (diseases, bio-warfare, epidemics, drug interactions, etc.) Banking (money laundering, tax avoidance, etc.) National security (terrorist tracking, money transfers, etc.) Digital media (copies, access rights, etc.) Note: Many applications require –Security –Guarantees of privacy

6 Options for Privacy Management Trust the collectors / analysts (the people / organizations accessing the data)? IRS, DMV, WalMart Trust the users the data is about? P3P Combination of the above? –Negotiate what is reportable and what isn't

7 What Negotiated Privacy Is Subjects of monitoring escrow their own private data (personal data escrow) Pre-negotiated thresholds (with the interested parties) Conditional release: meeting a threshold "unlocks" the private data Ensures both accuracy and privacy Only allows authorized queries (i.e., has a threshold been met?)

8 What Negotiated Privacy Is Not Private Information Retrieval (PIR) –enforces privacy when data is retrieved Digital Cash –enforces privacy of multiple “digital coins” –can’t verify that a user has “too many” coins Privacy Preserving Datamining –sanitizes or splits data –can’t control conditions for exposing information Searching on Encrypted Data –Allows efficient (secure, but not private) searches –Paper by Song, Wagner, Perrig

9 “Practical Techniques for Searches on Encrypted Data” Song, Wagner, Perrig Several schemes – Last one supports: –Provable Secrecy (the untrusted server cannot learn anything about the plaintext given only the ciphertext) –Controlled Searching (the untrusted server cannot search for a word without the user’s authorization) –Hidden Queries (the user may ask the untrusted server to search for a secret word without revealing the word to the server) –Query Isolation (the untrusted server learns nothing more than the search result about the plaintext) Note – Negotiated Privacy has “Provable Secrecy” and is only slightly related to “Controlled Searching”

10 Basic Idea (details later) Example database: one record per copied song, per user (User / Artist / Song). Parties: User, Analyst, Service Provider, PKI / Magistrate.
1. User escrows the activity with the Analyst (e.g., making one copy of a song)
2. Analyst validates the escrow
3. Analyst issues a receipt, or requests disclosure
4. User reports activity t to the Service Provider
5. If P(t), the user also gives the receipt
6. Service Provider validates, then provides or denies service

11 Details - DDH Let F_q be a finite field of q elements, so that q = p^n for some prime p and integer n. The multiplicative group of nonzero elements of F_q, denoted F_q*, is a cyclic group of order q-1. If α is a generator of this multiplicative group, then every nonzero element β in F_q is given by β = α^x for some integer x; in fact, for each β there is a unique integer x in the range 0 ≤ x < q-1 with this property. The inverse problem, i.e., the problem of finding, for given α and β, the x in the range 0 ≤ x < q-1 satisfying β = α^x, is the discrete logarithm problem; it is believed to be hard for many fields. Reference: http://www.math.clemson.edu/faculty/Gao/crypto_mod/node4.html
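For reference, the Decisional Diffie-Hellman (DDH) assumption underlying the escrow scheme, in its standard form (not quoted from the slides):

```latex
% DDH in a cyclic group G_q = <g> of prime order q: for random a, b, c in Z_q,
% (g, g^a, g^b, g^{ab}) is computationally indistinguishable from (g, g^a, g^b, g^c).
\[
\bigl(g,\ g^{a},\ g^{b},\ g^{ab}\bigr)\ \approx_{c}\ \bigl(g,\ g^{a},\ g^{b},\ g^{c}\bigr),
\qquad a, b, c \xleftarrow{\$} \mathbb{Z}_q .
\]
```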

12 Details Reference: http://www.math.clemson.edu/faculty/Gao/crypto_mod/node4.html

13 Details Required "tools" / data:
–Asymmetric key system (x = private key; y = public key = (g, g^x))
–Activity t (plaintext)
–Predicate P(t)
–core(t) = the part of the data that determines the value of P(t)
–s = fresh random element in G_q
–Personal data escrow [t]_x = (tag, c, Enc_s{t}, k), where:
  tag = h^x, with h = hash(core(t)) hashing deterministically into G_q
  c = s^x
  k = threshold value
Sig_KM(U, y) protects against a malicious User; Sig_KA([t]_x) protects against a malicious Analyst / Provider.
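A minimal sketch of escrow generation in Python, under assumed toy parameters (a tiny safe-prime subgroup, SHA-256 as the hash, and a stand-in XOR cipher); none of the names or primitives below come from the paper:

```python
# Sketch of personal data escrow generation (toy parameters, far too small for real use).
import hashlib, secrets

q = 1019                 # prime subgroup order (assumption)
p = 2 * q + 1            # p = 2039, a safe prime
g = 4                    # 4 = 2^2 generates the order-q subgroup of Z_p*

def hash_to_group(data: bytes) -> int:
    """Deterministically hash into G_q (toy construction)."""
    e = int.from_bytes(hashlib.sha256(data).digest(), "big") % q
    return pow(g, e, p)                  # g^e always lies in the subgroup

def derive_key(s: int) -> bytes:
    """'Hash s into keyspace' before symmetric encryption (slide 14, step 4)."""
    return hashlib.sha256(str(s).encode()).digest()

def enc(key: bytes, t: bytes) -> bytes:
    """Toy XOR stream cipher (t assumed <= 32 bytes); decryption is the same call."""
    return bytes(a ^ b for a, b in zip(t, hashlib.sha256(key + b"ks").digest()))

def make_escrow(x: int, t: bytes, core: bytes, k: int):
    h = hash_to_group(core)              # h = hash(core(t))
    tag = pow(h, x, p)                   # tag = h^x
    s = pow(g, secrets.randbelow(q - 1) + 1, p)  # fresh random element of G_q
    c = pow(s, x, p)                     # c = s^x
    return (tag, c, enc(derive_key(s), t), k), s
```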

14 Details
1. U → Magistrate: g, y
2. Magistrate verifies that U knows x (e.g., Schnorr authentication)
3. Magistrate → U: Sig_KM(U, y)
4. U generates the escrow [t]_x:
  tag = h^x, where h = hash(core(t))
  s = fresh random element in G_q
  hash s into the keyspace, then compute Enc_s{t}
  c = s^x
  k = threshold value
5. U → Analyst: [t]_x
6. Analyst validates the escrow:
  –escrow freshness
  –if count(tag) < k-1, issue a receipt
  –else the user must disclose the other records with the same tag
7. Analyst → U: Sig_KA([t]_x), or a request for disclosure

15 Details (steps 1-7 as on the previous slide, then:)
8. U → Service Provider: report activity t; if P(t), also give s, Sig_KA([t]_x), Sig_KM(U, y), and a proof that tag = h^x, c = s^x, and y = g^x
9. Service Provider verifies:
  –the signatures
  –that the identity is U
  –that t matches the activity
  –that the reported k is correct for this activity
  –h = hash(core(t)), recomputed
  –the proof information (tag = h^x, c = s^x, y = g^x)
10. Service Provider provides or denies service

16 More Details Disclosure:
–Triggered when count(tag) ≥ k-1
–Not automatic: A must request U to disclose
–Only escrows with the same relevant tag are disclosed
–A gives U all the relevant escrows for U to open
–U opens each [t]_x by computing:
  s = c^{1/x}
  t = Dec_s(Enc_s{t})
  h = hash(core(t))
–For each [t]_x, U sends to A: h, s, Sig_KM(U, y), and a proof that tag = h^x, c = s^x, and y = g^x
–A learns U and t
Lemma 4:
–A learns the number of other reportable activities by U
–But the plaintexts of U's other activities are not leaked to A
Example database rows: User D Evans, Artist Britney Spears, Song Toxic (four identical records, one per copy)
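A sketch of the opening step, continuing the toy group and helpers from the escrow-generation snippet above; the modular-inverse trick for s = c^{1/x} is standard group arithmetic, and the assumption that core(t) = t is mine:

```python
# Opening an escrow during disclosure (continues the toy sketch above).
def open_escrow(escrow, x: int):
    tag, c, ct, k = escrow
    x_inv = pow(x, -1, q)            # 1/x mod q (x is invertible since q is prime)
    s = pow(c, x_inv, p)             # s = c^{1/x}, valid because c = s^x in G_q
    t = enc(derive_key(s), ct)       # XOR cipher: decryption = encryption
    h = hash_to_group(t)             # h = hash(core(t)); here core(t) = t (assumption)
    assert pow(h, x, p) == tag       # consistency check: tag = h^x
    return t, s, h                   # sent to A along with signatures and proofs
```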

17 Limitations Social, legal, etc. questions Upfront threshold and query negotiations are required Query limitations – dynamic queries are difficult (impossible??) Can't do "group" thresholds (since all records must have the same tag) No automatic disclosure of records (but could go to the magistrate, if necessary) U can get an escrow receipt but decide not to get served Can't completely stop impersonation (use biometrics??) Doesn't stop threats due to collusion among entities

18 Conclusion Good initial move towards supporting reasonable negotiated privacy Provides unique functionality for niche applications Don’t ask Dave for copies of his music

19 Searching on Encrypted Data Presented by Leonid Bolotnyy March 30, 2004 @UVA

20 Outline Practical Techniques for Searches on Encrypted Data Building an Encrypted and Searchable Audit Log

21 Practical Techniques for Searches on Encrypted Data

22 Goals Provable Security –Untrusted server learns nothing about the plaintext given only ciphertext Controlled Searching –Untrusted server cannot perform the search without user authorization Hidden Queries –Untrusted server does not know the query Query Isolation –Untrusted server does not learn more than the search results

23 Basic Scheme Encryption

24 Basic Scheme Search and Decryption To Search: To Decrypt:
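The encryption, search, and decryption formulas on slides 23-24 were images. A minimal reconstruction of the Song-Wagner-Perrig basic scheme follows; the word/checksum lengths and the HMAC-based primitives are illustrative assumptions:

```python
# Reconstruction of the SWP basic scheme: C_i = W_i XOR (S_i || F_{k_i}(S_i)).
import hashlib, hmac

N = 16          # word length in bytes (assumption)
M = 8           # checksum length in bytes (assumption)

def prf(key: bytes, data: bytes, outlen: int) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()[:outlen]

def stream(seed: bytes, i: int) -> bytes:
    """Pseudorandom value S_i (N - M bytes) for position i."""
    return prf(seed, i.to_bytes(4, "big"), N - M)

def encrypt_word(w: bytes, i: int, seed: bytes, k_i: bytes) -> bytes:
    s_i = stream(seed, i)
    t_i = s_i + prf(k_i, s_i, M)                     # T_i = S_i || F_{k_i}(S_i)
    return bytes(a ^ b for a, b in zip(w, t_i))      # C_i = W_i XOR T_i

def matches(c_i: bytes, w: bytes, k_i: bytes) -> bool:
    """Server-side search: does C_i XOR W have the form s || F_{k_i}(s)?"""
    t = bytes(a ^ b for a, b in zip(c_i, w))
    return t[N - M:] == prf(k_i, t[:N - M], M)

def decrypt_word(c_i: bytes, i: int, seed: bytes, k_i: bytes) -> bytes:
    s_i = stream(seed, i)
    return bytes(a ^ b for a, b in zip(c_i, s_i + prf(k_i, s_i, M)))
```

In this basic form, searching requires revealing both W and the keys k_i to the server, which is exactly what the next slides address.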

25 Basic Scheme Issues Bad vs. good ways of choosing the per-position keys k_i (the four example formulas on this slide were images and are not recoverable)

26 Controlled Searching How do we decrypt now? The issue of hiding search queries is still unresolved.
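The controlled-searching construction derives each k_i from the word itself, so revealing one derived key authorizes a search for exactly one word; a sketch, continuing the code above (K_PRIME and the helper name are assumptions):

```python
# Controlled searching: derive the per-word key from the word itself.
K_PRIME = b"user's master search key"   # k' (assumption)

def word_key(w: bytes) -> bytes:
    return prf(K_PRIME, w, 32)          # k_i = f_{k'}(W_i)

# To authorize a search for w, the user reveals only word_key(w); the server
# then calls matches(c_i, w, word_key(w)) at every position i.
# Decryption becomes circular: recomputing k_i requires W_i, the very
# plaintext being decrypted; hence "How do we decrypt now?".
```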

27 Hidden Searches The problem with decryption still remains
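Hidden searches run the same scheme over deterministically pre-encrypted words, so the server sees neither the stored words nor the queried word; a sketch continuing the code above (the PRF here is a stand-in, since the real construction uses an invertible block cipher E):

```python
# Hidden queries: apply the scheme to X_i = E_{k''}(W_i) instead of W_i.
K_DPRIME = b"pre-encryption key"        # k'' (assumption)

def pre_encrypt(w: bytes) -> bytes:
    return prf(K_DPRIME, w, N)          # stand-in for E_{k''}(W_i); not invertible

# To search for w, the user hands the server X = pre_encrypt(w) together with
# word_key(X); the plaintext word w itself is never revealed.
```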

28 Solving Decryption Problem
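The fix in the final scheme is to split the pre-encrypted word X_i = ⟨L_i, R_i⟩ and key the checksum only on the left half, which decryption can recover first; a sketch under the same toy conventions as above:

```python
# Final scheme sketch: key the checksum on the left half L_i of X_i = <L_i, R_i>.
def encrypt_word_v4(x_i: bytes, i: int, seed: bytes) -> bytes:
    k_i = word_key(x_i[:N - M])         # k_i = f_{k'}(L_i): left half only
    s_i = stream(seed, i)
    return bytes(a ^ b for a, b in zip(x_i, s_i + prf(k_i, s_i, M)))

def decrypt_word_v4(c_i: bytes, i: int, seed: bytes) -> bytes:
    s_i = stream(seed, i)
    l_i = bytes(a ^ b for a, b in zip(c_i[:N - M], s_i))   # recover L_i from S_i
    k_i = word_key(l_i)                 # rebuild k_i without knowing R_i ...
    return bytes(a ^ b for a, b in zip(c_i, s_i + prf(k_i, s_i, M)))  # ... then all of X_i
```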

29 Scheme Conclusions "Efficient" encryption, decryption, and search, taking O(n) block cipher and stream cipher operations Provable security with controlled searching, hidden queries, and query isolation Possible support for composed queries Possible support for variable-length words –Padding to fixed-length blocks –Variable-length words (store the length)

30 Building an Encrypted and Searchable Audit Log

31 Reasons to Encrypt Audit Logs The log may be stored at a site that is not completely trusted (secure) To prevent tampering with the log To restrict access to the log –Allow access to only certain parts of the log –Allow only certain entities to access the log

32 Characteristics of a Secure Audit Log Tamper Resistant –Guarantee that only the authorized entity can create entries and that, once created, entries cannot be altered Verifiable –Allow verification that all entries are present and have not been altered Searchable with data access control –Allow the log to be "efficiently" searched only by authorized entities

33 Notation and Setup

34 Symmetric Key Scheme
H – pseudorandom function keyed with S
S – secret key for this log, chosen by the escrow agent
flag – constant bit string of length l

35 Search and Decryption To search for all entries with keyword w: To decrypt: ???
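The formulas on this slide were images. Below is a reconstruction of the symmetric scheme using the notation from slide 34; the exact message layout is my reading of the Waters et al. paper, and all concrete names and sizes are illustrative:

```python
# Reconstruction sketch of the symmetric audit-log scheme (layout is an assumption).
import hashlib, hmac, secrets

L = 8                               # flag length in bytes (assumption)
FLAG = b"\x00" * L                  # 'flag': constant bit string of length l
S = b"log secret held by escrow"    # S: secret key for this log

def H(key: bytes, data: bytes) -> bytes:
    """H: pseudorandom function (HMAC-SHA256 as a stand-in)."""
    return hmac.new(key, data, hashlib.sha256).digest()

def toy_enc(key: bytes, m: bytes) -> bytes:
    """Toy XOR cipher standing in for a real symmetric scheme (m <= 32 bytes)."""
    return bytes(a ^ b for a, b in zip(m, H(key, b"stream")))

def log_entry(m: bytes, keywords):
    K = secrets.token_bytes(16)     # fresh per-entry key
    r = secrets.token_bytes(16)     # fresh per-entry nonce
    blocks = [bytes(x ^ y for x, y in
                    zip(FLAG + K, H(H(S, w), r)))   # (flag || K) XOR H_{H_S(w)}(r)
              for w in keywords]
    return r, blocks, toy_enc(K, m)

def search(entry, d_w: bytes):
    """Given capability d_w = H(S, w) from the escrow agent, find and decrypt a match."""
    r, blocks, ct = entry
    pad = H(d_w, r)
    for b in blocks:
        t = bytes(x ^ y for x, y in zip(b, pad))
        if t[:L] == FLAG:           # flag check; false-positive prob. about 2^-(8L)
            return toy_enc(t[L:], ct)   # t[L:] is the recovered entry key K
    return None
```

Decryption outside of search is the open question the slide's "???" points at: the entry key K is only recovered when a keyword capability matches.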

36 Issues and Problems flag size and the possibility of false positives Capabilities for different keywords appear random The adversary may be able to learn S, which is known to the server Updating keys requires a constant connection to the escrow agent + numerous key-management problems + high search time STORE AS LITTLE SECRET INFORMATION ON THE SERVER AS POSSIBLE

37 Identity Based Encryption Identity Based Encryption allows arbitrary strings to be used as public keys Master secret key stored with a trusted escrow agent allows generation of a private key after the public key has been selected

38 IBE Setup and Key generation Setup: Key generation:

39 IBE Encryption and Decryption Encryption Decryption
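Slides 38 and 39 carried the IBE formulas as images; a standard statement of the Boneh-Franklin construction the audit-log paper builds on (reconstructed, not copied from the slides):

```latex
% Boneh-Franklin IBE over groups G_1, G_2 with a bilinear pairing e : G_1 x G_1 -> G_2.
\textbf{Setup:}\quad s \xleftarrow{\$} \mathbb{Z}_q^{*},\qquad P_{pub} = sP,
\qquad \text{public parameters } (P,\, P_{pub},\, H_1,\, H_2).\\[2pt]
\textbf{Key generation:}\quad Q_{ID} = H_1(ID) \in G_1,\qquad d_{ID} = s\,Q_{ID}.\\[2pt]
\textbf{Encryption of } M:\quad r \xleftarrow{\$} \mathbb{Z}_q^{*},\qquad
C = \bigl(rP,\; M \oplus H_2\!\bigl(e(Q_{ID}, P_{pub})^{r}\bigr)\bigr).\\[2pt]
\textbf{Decryption of } C = (U, V):\quad
M = V \oplus H_2\!\bigl(e(d_{ID}, U)\bigr),
\quad\text{since } e(d_{ID}, rP) = e(Q_{ID}, P_{pub})^{r}.
```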

40 Asymmetric Scheme using IBE To encrypt: To search: To decrypt: ???
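This slide's formulas were also images. Reconstructed from the paper's design, the asymmetric scheme replaces the keyed hash with IBE, using each keyword as an identity (notation below is mine):

```latex
% Reconstruction of the asymmetric scheme (keyword = IBE identity), hedged.
\textbf{Encrypt entry } m \text{ with keywords } w_1,\dots,w_n:\quad
K \xleftarrow{\$} \{0,1\}^{k},\qquad
c_i = \mathsf{IBE\text{-}Enc}_{w_i}(\mathsf{flag}\,\|\,K),\qquad
\text{store } (c_1,\dots,c_n,\; E_K(m)).\\[2pt]
\textbf{Search capability for } w:\quad d_w = \mathsf{Extract}(\mathsf{msk},\, w),
\ \text{issued by the escrow agent.}\\[2pt]
\textbf{Search:}\ \text{for each } c_i,\ t = \mathsf{IBE\text{-}Dec}_{d_w}(c_i);
\ \text{if } t = \mathsf{flag}\,\|\,K' \text{, output } D_{K'}(E_K(m)).
```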

41 Comments on the IBE Scheme Note: Each server stores only public parameters Compromising the server does not allow an attacker to search the data Possible to separate search and decryption by encrypting the key under some other public key (requires an extra access to the escrow agent for decryption) A drawback: a tremendous increase in computation time

42 Scheme Optimizations Pairing Reuse Indexing Randomness Reuse

