1 Chapter 43: Privacy-Enhancing Technologies
Computer and Information Security Handbook
Simone Fischer-Hübner and Stefan Berthold, Karlstad University, Sweden
The University of Adelaide, School of Computer Science, 22 September 2018
Copyright © 2014, Elsevier Inc. All rights Reserved

2 The Concept of Privacy
In most legal systems, the right to privacy applies to people only, not institutions
Privacy can be in conflict with other human rights
General scope of privacy laws: define principles of collecting, processing, and storing personal data
This chapter addresses informational privacy

3 Legal Privacy Principles
Legitimacy of need to collect/store PII: informed consent, legal obligation, or contractual agreement
Purpose specification and purpose binding: data cannot be processed in a way incompatible with the stated purpose
Data minimization: limit the data collected to the minimum required

4 Legal Privacy Principles
Transparency and rights of data subjects: the data subject must be informed of the purpose and circumstances of data processing
Security: appropriate security mechanisms must be used to ensure personal data integrity, confidentiality, and availability

5 Classification of PETs
Privacy-enhancing technologies (PETs): technologies that enforce legal privacy principles
Three classes of PETs:
Those that minimize or avoid the collection and use of personal data
Those that enforce legal privacy requirements
Those that combine characteristics of the first two classes

6 Traditional Privacy Goals of PETs
Anonymity of a subject: the subject is not identifiable within a set of subjects
Unlinkability: for example, not being able to link the sender and recipient of a message
Unobservability: the strongest privacy goal; combines undetectability and anonymity
Pseudonymity: the use of pseudonyms as identifiers

7 Privacy Metrics
Goal of privacy metrics: quantify the effectiveness of schemes or technologies in meeting privacy goals
Simple metric: the anonymity set, that is, all subjects that may have caused an event observed by the adversary
K-anonymity: a property or requirement for databases that must not leak sensitive information
Other metrics: l-diversity, t-closeness
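The k-anonymity requirement above can be illustrated with a small sketch (the table, values, and function name are hypothetical, not from the chapter): a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the largest k for which the table is k-anonymous."""
    groups = Counter(
        tuple(rec[qi] for qi in quasi_identifiers) for rec in records
    )
    return min(groups.values())

# Toy table with generalized quasi-identifiers (zip, age)
table = [
    {"zip": "476**", "age": "2*", "disease": "flu"},
    {"zip": "476**", "age": "2*", "disease": "cancer"},
    {"zip": "479**", "age": "3*", "disease": "flu"},
    {"zip": "479**", "age": "3*", "disease": "flu"},
]
print(k_anonymity(table, ["zip", "age"]))  # 2: every QI group has >= 2 rows
```

Any record in a group is indistinguishable from the other k-1 records sharing its quasi-identifier values; l-diversity and t-closeness additionally constrain how the sensitive attribute (here, disease) is distributed within each group.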

8 Data Minimization Technologies
Anonymous communication: users cannot be identified by their IP addresses
Chaum's DC network protocol: an anonymous communication protocol, not easily used in practice

9 Data Minimization Technologies
Figure 43.1 A DC network with three users
The sender sends a message, but by observing the communication in the network alone, the adversary cannot tell whether it was sent by a, b, or c, or even whether a meaningful message was sent at all.
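The DC network idea can be sketched with XOR arithmetic (a minimal toy sketch, not from the chapter; all names are hypothetical). Each pair of users shares a random key; each user broadcasts the XOR of its keys, and the sender additionally XORs in the message. Every key appears in exactly two broadcasts, so XOR-ing all three outputs cancels the keys and reveals only the message, with no indication of who sent it.

```python
import secrets

MSG_LEN = 16

def xor(*chunks):
    """XOR several byte strings of length MSG_LEN together."""
    out = bytearray(MSG_LEN)
    for chunk in chunks:
        for i, byte in enumerate(chunk):
            out[i] ^= byte
    return bytes(out)

# Pairwise shared keys, one per pair of users
k_ab = secrets.token_bytes(MSG_LEN)
k_ac = secrets.token_bytes(MSG_LEN)
k_bc = secrets.token_bytes(MSG_LEN)

message = b"hello dc-net!   "  # 16 bytes; user a is the sender
out_a = xor(k_ab, k_ac, message)  # sender XORs in the message
out_b = xor(k_ab, k_bc)           # non-senders broadcast only keys
out_c = xor(k_ac, k_bc)

# Each key appears exactly twice, so the keys cancel pairwise
print(xor(out_a, out_b, out_c) == message)  # True
```

Each individual broadcast looks uniformly random to an observer, which is why the adversary cannot tell which user embedded the message, or whether any meaningful message was sent at all.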

10 Data Minimization Technologies
Mix nets: more practical than DC networks; most anonymity networks are built on the mix net concept
Mix: a relay or proxy server

11 Data Minimization Technologies
Four steps used by the mix:
Duplicates are discarded
Messages are randomly delayed
Messages are recoded using cryptography
Messages are resent in a sequence independent of the receiving sequence
Several mixes can be used in a chain, or cascade
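The four mix steps can be sketched as a toy batch processor (class and method names are hypothetical). The recoding step here is a stand-in XOR; a real mix decrypts one layer of public-key encryption. The shuffle stands in for the random delay and the arrival-independent resend order.

```python
import random

class Mix:
    """Toy mix node illustrating the four processing steps."""

    def __init__(self, key: int = 0x5A):
        self.key = key
        self.seen = set()  # replay cache for duplicate detection

    def recode(self, msg: bytes) -> bytes:
        # Stand-in for cryptographic recoding (real mixes decrypt a layer)
        return bytes(b ^ self.key for b in msg)

    def process(self, incoming):
        batch = []
        for msg in incoming:
            if msg in self.seen:            # 1. discard duplicates
                continue
            self.seen.add(msg)
            batch.append(self.recode(msg))  # 3. recode cryptographically
        random.shuffle(batch)   # 2 & 4. buffer and resend in an order
        return batch            #        independent of arrival order

mix = Mix()
out = mix.process([b"msg1", b"msg2", b"msg1"])  # replayed b"msg1" dropped
print(len(out))  # 2
```

Discarding replays matters because resubmitting a captured message and watching which output repeats would let an adversary link input to output; the recoding and reordering defeat bitwise and timing correlation.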

12 Data Minimization Technologies
Figure 43.2 Processing steps within a mix
Different recoding functions are used.

13 Data Minimization Technologies
Figure 43.3 Sender anonymity with two mixes
Using a single mix can only provide anonymity if it is completely trustworthy and cannot be compromised.

14 Data Minimization Technologies
AN.ON: an anonymity service developed and operated since the late 1990s at the Technical University of Dresden
Uses a network of mixes for low-latency traffic routing; messaging delays are minimal to none

15 Data Minimization Technologies
Advantages of stable mix cascades:
Mixes can be audited and certified
Cascades can be designed to cross national boundaries
Security measures focus on a small number of mixes
Disadvantages:
Each mix is a potential bottleneck
Setting up and operating mixes is expensive

16 Data Minimization Technologies
Onion routing/Tor: a low-latency, mix-based routing protocol developed in the 1990s at the U.S. Naval Research Laboratory
Provides anonymous socket connections using proxy servers
Uses the mix net concept of layers of public-key encryption
Tor, the second generation of onion routing, uses Diffie-Hellman key exchange to provide forward secrecy

17 Data Minimization Technologies
Figure 43.4 The Tor key negotiation and a simple Web site request
Diffie-Hellman is used to provide forward secrecy.
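The layered encryption behind onion routing can be sketched with a toy example (XOR in place of the real per-hop ciphers; in Tor the per-hop keys come from the Diffie-Hellman handshakes, and the relay names below are hypothetical). The client wraps the payload in one layer per relay; each relay strips exactly one layer, so no single relay sees both the sender and the plaintext destination.

```python
def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'cipher': XOR with a repeating key (self-inverse)."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

# One session key per hop, negotiated via Diffie-Hellman in real Tor
hop_keys = [b"key-guard", b"key-middle", b"key-exit"]

def wrap(payload: bytes) -> bytes:
    # Client encrypts for the exit first, then middle, then guard,
    # so the guard's layer is outermost.
    for key in reversed(hop_keys):
        payload = xor_layer(payload, key)
    return payload

def route(cell: bytes) -> bytes:
    # Each relay in turn strips exactly one layer with its own key.
    for key in hop_keys:
        cell = xor_layer(cell, key)
    return cell

print(route(wrap(b"GET / HTTP/1.1")) == b"GET / HTTP/1.1")  # True
```

Forward secrecy follows from negotiating `hop_keys` fresh per circuit: once the ephemeral Diffie-Hellman keys are discarded, recorded traffic cannot be decrypted even if a relay's long-term key is later compromised.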

18 Data Minimization Technologies
Data minimization at the application level: techniques most often based on cryptographic protocols
Blind signatures: an extension of digital signatures in which the signer is not aware of the content of the document they are signing
Invented by Chaum as a building block for anonymous eCash

19 Data Minimization Technologies
Figure 43.5 The flow of eCash's untraceable electronic money
A protocol for offline electronic money has also been presented by Chaum.
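Chaum's blind signature construction can be sketched with textbook RSA (tiny, insecure parameters, for illustration only). The user multiplies the message by a random factor raised to the public exponent before sending it; the signer signs without ever seeing the message, and the user divides the factor back out to obtain an ordinary, valid signature.

```python
import math
import secrets

# Signer's RSA key (toy primes, far too small for real use)
p, q = 1009, 1013
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

m = 42  # the message (in practice, a hash of the document or coin)

# User: pick a blinding factor r coprime to n, send m * r^e mod n
while True:
    r = secrets.randbelow(n - 2) + 2
    if math.gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# Signer: sign the blinded value without learning m
blind_sig = pow(blinded, d, n)  # = m^d * r mod n, since r^(e*d) = r

# User: unblind by dividing out r, leaving an ordinary signature on m
s = (blind_sig * pow(r, -1, n)) % n
print(s == pow(m, d, n))  # True: a valid RSA signature on m
```

In the eCash flow this is what lets the bank certify a coin's serial number without seeing it, so the bank cannot later link the withdrawn coin to the spender.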

20 Transparency-Enhancing Tools
Tools for end users that make personal data processing more transparent
Four classes of transparency-enhancing tools:
Those that provide information about the intended data collection and processing
Those that provide the data subject with an overview of disclosed personal data
Those that provide the data subject with online access to personal data
Those that provide counter-profiling capabilities

21 Transparency-Enhancing Tools
Ex-ante TETs
Examples: privacy policy language tools; human-computer interaction components that make privacy policies more transparent
Visualization techniques based on the concept of a "nutrition label"

22 Transparency-Enhancing Tools
Figure 43.7 "Send data?" PPL user interface
The PPL engine interacts with the user to display the result of policy matching, to support identity/credential selection, and to obtain informed consent.

23 Transparency-Enhancing Tools
Ex-post TETs
The Data Track: a user-side transparency tool with history and online access functions; transaction records are stored at the user side or in the cloud
Google Dashboard: shows users a summary of the data stored within a user account, but does not show how all of that data has been used
The PrimeLife Project's secure logging system

24 Summary
Privacy-enhancing technologies (PETs) can be classified into three categories
Privacy metrics attempt to quantify the concepts of anonymity, unlinkability, unobservability, and pseudonymity
Transparency-enhancing technologies exist to make the processing of personal data more transparent
PET solutions have not been widely adopted by industry or users

