Quantification of Integrity
Michael Clarkson and Fred B. Schneider
Cornell University
IEEE Computer Security Foundations Symposium
July 17, 2010

Goal: an information-theoretic quantification of programs' impact on the integrity of information [Denning 1982]

What is Integrity?
Common Criteria: protection of assets from unauthorized modification
Biba (1977): guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality
Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data
…no universal definition

Our Notions of Integrity

Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression

Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output
…distinct, but interact

Contamination
Goal: model taint analysis
Untrusted input contaminates trusted output
[Diagram: program with trusted input/output (User) and untrusted input/output (Attacker)]

Contamination
u contaminates o:    o := (t, u)
(Can't u be filtered from o?)

Quantification of Contamination
Use information theory: information is surprise
X, Y, Z: distributions
I(X, Y): mutual information between X and Y (in bits)
I(X, Y | Z): conditional mutual information

Quantification of Contamination
Contamination = I(U_in, T_out | T_in)
[Diagram: program with trusted input T_in and untrusted input U_in; trusted output T_out]
[Newsome et al. 2009]; dual of [Clark et al. 2005, 2007]

Example of Contamination
o := (t, u)
Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k − 1]
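A minimal sketch (mine, not from the talk) that checks this number: it computes I(U; O | T) directly from the joint distribution induced by o := (t, u), assuming U uniform on [0, 2^k − 1] and T uniform on a small range.

import itertools
import math

def cond_mutual_info(joint):
    # I(U; O | T) = sum of p(u,o,t) * log2( p(t)*p(u,o,t) / (p(u,t)*p(o,t)) )
    p_t, p_ut, p_ot = {}, {}, {}
    for (u, o, t), p in joint.items():
        p_t[t] = p_t.get(t, 0) + p
        p_ut[(u, t)] = p_ut.get((u, t), 0) + p
        p_ot[(o, t)] = p_ot.get((o, t), 0) + p
    return sum(p * math.log2(p_t[t] * p / (p_ut[(u, t)] * p_ot[(o, t)]))
               for (u, o, t), p in joint.items() if p > 0)

k = 3
us, ts = range(2 ** k), range(4)                   # U uniform on [0, 2^k - 1]; T uniform
joint = {(u, (t, u), t): 1 / (len(us) * len(ts))   # the program o := (t, u)
         for u, t in itertools.product(us, ts)}
print(cond_mutual_info(joint))                     # -> 3.0: k bits of contamination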

Program Suppression
Goal: model program (in)correctness
[Diagram: a Specification maps the Sender's trusted input to the correct output for the Receiver; the Implementation, which also takes untrusted input from the Attacker, produces the real output]
(Specification must be deterministic)
Implementation might suppress information about the correct output from the real output

Example of Program Suppression
a[0..m-1]: trusted;  a[m]: untrusted

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }    (suppression: a[0] missing; no contamination)
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }   (suppression: a[m] added; contamination)

Suppression vs. Contamination
[Diagram: program output := input with trusted and attacker-controlled inputs and outputs; untrusted inputs cause contamination, trusted information lost from outputs is suppression]

Quantification of Program Suppression
[Diagram: the Specification maps trusted input to Spec; the Implementation maps trusted input T_in and untrusted input U_in to Impl]
Program transmission = I(Spec, Impl)

Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
  H(Spec): total info to learn about Spec
  I(Spec, Impl): info actually learned about Spec by observing Impl
  H(Spec | Impl): info NOT learned about Spec by observing Impl

Program suppression = H(Spec | Impl)

Example of Program Suppression
A = distribution of individual array elements
Impl. 1: suppression = H(A)
Impl. 2: suppression ≤ H(A)
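A minimal sketch (mine, not from the talk) estimating H(Spec | Impl) for both implementations, under the assumption that each array element is an independent uniform bit and m = 3:

import itertools
import math

def cond_entropy(pairs):
    # H(Spec | Impl) from a list of equally likely (spec, impl) outcomes
    n = len(pairs)
    joint, p_impl = {}, {}
    for s, i in pairs:
        joint[(s, i)] = joint.get((s, i), 0) + 1 / n
        p_impl[i] = p_impl.get(i, 0) + 1 / n
    return -sum(p * math.log2(p / p_impl[i]) for (s, i), p in joint.items())

m = 3
outcomes1, outcomes2 = [], []
for a in itertools.product([0, 1], repeat=m + 1):   # a[0..m-1] trusted, a[m] untrusted
    spec = sum(a[:m])
    outcomes1.append((spec, sum(a[1:m])))           # Impl. 1 drops a[0]
    outcomes2.append((spec, sum(a[:m + 1])))        # Impl. 2 adds untrusted a[m]
print(cond_entropy(outcomes1))   # 1.0 = H(A): a[0] is entirely suppressed
print(cond_entropy(outcomes2))   # < 1.0, i.e. <= H(A): noise from a[m] hides part of Spec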

Belief-based Metrics
What if the user's/receiver's distribution on unobservable inputs is wrong?
- Belief-based information flow [Clarkson et al. 2005]
Belief-based generalizes information-theoretic:
- on single executions, the same
- in expectation, the same …if the user's/receiver's distribution is correct

Suppression and Confidentiality
Declassifier: program that reveals (leaks) some information; suppresses the rest
Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]
Thm. Leakage + suppression is a constant
- What isn't leaked is suppressed (intuitively, the identity I(X, Y) + H(X | Y) = H(X))

Database Privacy
Statistical database anonymizes query results (anon. resp. := resp.):
…sacrifices utility for privacy's sake
…suppresses to avoid leakage
…sacrifices integrity for confidentiality's sake
[Diagram: User sends query; Database computes response; Anonymizer returns anonymized response to User]

Database Privacy Security Conditions

k-anonymity [Sweeney 2002]:
- every individual must be anonymous within a set of size k
- every output corresponds to k inputs
…no bound on leakage or suppression

L-diversity [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]:
- every individual's sensitive information should appear to have L roughly equally likely values
- every output corresponds to L (roughly) equally likely inputs
…implies suppression ≥ log L

Differential privacy [Dwork et al. 2006, Dwork 2006]:
- no individual loses privacy by including data in the database
- output reveals almost no information about an individual input beyond what other inputs already reveal
…implies almost all information about an individual is suppressed
…quite similar to noninterference
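To make the suppression concrete, a minimal sketch (mine, not from the talk) of the standard Laplace mechanism for differential privacy: the noise added to a count query suppresses information about any single individual's row.

import random

def laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.1):
    # Add Laplace(sensitivity/epsilon) noise; a Laplace variate is the
    # difference of two i.i.d. exponentials.
    scale = sensitivity / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

print(laplace_mechanism(42))   # anonymized response: 42 plus noise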

Summary
Measures of information corruption:
- Contamination (generalizes taint analysis; dual to leakage)
- Suppression (generalizes program correctness; no dual)
Application: database privacy (model anonymizers; relate utility and privacy; security conditions)

More Integrity Measures
- Channel suppression: same as the channel model from information theory, but with an attacker
- Attacker- and program-controlled suppression
Granularity:
- average over all executions
- single executions
- sequences of executions …interaction of attacker with program
Application: error-correcting codes

Beyond Contamination and Suppression?
- Clark–Wilson integrity policy
- Relational integrity constraints
- Software testing metrics
- …

Confidentiality Dual to Suppression?
[Diagram: output := input with secret and public channels; public inputs lost from public outputs]
- Classic duality of confidentiality and integrity is incomplete

Value of Information
What if some bits are worth more than others?
- Discrete worth: security levels (top secret, secret, confidential, unclassified)
- Continuous worth: ?

Bounded Reasoning
Our agents are logically omniscient. Bounds?
Algorithmic knowledge, computational entropy, …

Information
Information is surprise.
X: random variable on a set of events {e, …}
I(e): self-information conveyed by event e
I(e) = −log2 Pr[X = e]   (unit is bits)
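A minimal worked example (mine) of the definition:

import math

def self_information(p):
    # I(e) = -log2 Pr[X = e], in bits
    return -math.log2(p)

print(self_information(0.5))    # 1.0 bit: a fair coin flip carries 1 bit of surprise
print(self_information(1 / 8))  # 3.0 bits: rarer events are more surprising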

Suppression vs. Contamination

n := rnd(); o := t xor n    (t suppressed by noise; no contamination)
o := t xor u                (t suppressed by u; o contaminated by u)
o := (t, u)                 (no suppression)
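A minimal sketch (mine, not from the talk) computing channel-style suppression H(T | O) for these three programs, assuming t, u, and the noise n are independent uniform bits:

import itertools
import math

def suppression(program):
    # H(T | O) over all equally likely (t, u, n) triples
    outcomes = [(t, program(t, u, n))
                for t, u, n in itertools.product([0, 1], repeat=3)]
    total = len(outcomes)
    joint, p_o = {}, {}
    for t, o in outcomes:
        joint[(t, o)] = joint.get((t, o), 0) + 1 / total
        p_o[o] = p_o.get(o, 0) + 1 / total
    return -sum(p * math.log2(p / p_o[o]) for (t, o), p in joint.items())

print(suppression(lambda t, u, n: t ^ n))    # 1.0: t fully suppressed by noise
print(suppression(lambda t, u, n: t ^ u))    # 1.0: t fully suppressed by u
print(suppression(lambda t, u, n: (t, u)))   # 0.0: no suppression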

Echo Specification
Specification: output := input (trusted input T_in from Sender to Receiver)
[Diagram: the Implementation takes trusted input T_in and untrusted input U_in, and produces trusted output T_out]
Simplifies to the information-theoretic model of channels, with an attacker

Channel Suppression
Channel transmission = I(T_in, T_out)
Channel suppression = H(T_in | T_out)
(T_out depends on U_in)
[Diagram: channel with trusted input T_in (Sender), untrusted input U_in (Attacker), trusted output T_out (Receiver)]
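A minimal sketch (mine, not from the talk) for a binary symmetric channel, where the attacker chooses the probability p of flipping each trusted bit; for a uniform T_in, the suppression H(T_in | T_out) is the binary entropy of p.

import math

def channel_suppression(p):
    # H(T_in | T_out) for a uniform input bit and flip probability p
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(channel_suppression(0.0))   # 0.0: perfect channel, nothing suppressed
print(channel_suppression(0.5))   # 1.0: output is pure noise, T_in fully suppressed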

Probabilistic Specifications    (e.g., o := rnd(1))
Correct output: a distribution on outputs
Correct output distribution: a distribution on distributions on outputs
Continuous distributions, differential entropy?

Duality

Contamination vs. Leakage
Contamination = I(U_in, T_out | T_in)   (untrusted/trusted inputs and outputs)
Leakage = I(S_in, P_out | P_in)   (secret/public inputs and outputs)
Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]
- Contamination is dual to leakage

Confidentiality Dual to Suppression? No.

L-diversity
Every individual's sensitive information should appear to have L (roughly) equally likely values. [Machanavajjhala et al. 2007]
Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]
H(T_in | t_out) ≥ log L   (if T_in uniform)
…implies suppression ≥ log L
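A minimal sketch (mine, not from the talk) of the entropy L-diversity check, computing H(anon. block) for the sensitive attribute within one anonymized block:

import math
from collections import Counter

def block_entropy(sensitive_values):
    # Shannon entropy of the sensitive attribute within the block
    counts = Counter(sensitive_values)
    n = len(sensitive_values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

block = ["flu", "cancer", "asthma"]           # L = 3 equally likely sensitive values
print(block_entropy(block) >= math.log2(3))   # True: entropy L-diverse for L = 3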

"To Measure is to Know"
When you can measure what you are speaking about…you know something about it; but when you cannot measure it…your knowledge is…meager and unsatisfactory… You have scarcely…advanced to the state of Science.
—Lord Kelvin