
1 Quantification of Integrity Michael Clarkson and Fred B. Schneider Cornell University IEEE Computer Security Foundations Symposium July 17, 2010

2 Goal: an information-theoretic quantification of programs' impact on the integrity of information [Denning 1982]

3 What is Integrity?
Common Criteria: protection of assets from unauthorized modification.
Biba (1977): guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality.
Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data.
…no universal definition

4 Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity

5 Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output
…distinct, but they interact

6 Contamination
Goal: model taint analysis
(Diagram: the user supplies trusted inputs and the attacker supplies untrusted inputs to the program; outputs flow back to the user and the attacker.)

7 Contamination
Goal: model taint analysis
Untrusted input contaminates trusted output.
(Diagram: as before, with untrusted input flowing through the program into trusted output.)

8 Contamination
u contaminates o in the program  o := (t, u)

9 Contamination
u contaminates o in  o := (t, u)
(Can't u simply be filtered from o?)

10 Quantification of Contamination
Use information theory: information is surprise.
X, Y, Z: distributions
I(X, Y): mutual information between X and Y (in bits)
I(X, Y | Z): conditional mutual information
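
As a concrete companion to these definitions, here is a minimal Python sketch (not from the talk; the helper names are mine) that computes I(X, Y) and I(X, Y | Z) from explicit joint distributions:

import math
from collections import defaultdict

def mutual_information(joint):
    """I(X, Y) in bits, from a dict {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def conditional_mutual_information(joint3):
    """I(X, Y | Z) in bits, from a dict {(x, y, z): probability}."""
    pz, by_z = defaultdict(float), defaultdict(dict)
    for (x, y, z), p in joint3.items():
        pz[z] += p
        by_z[z][(x, y)] = by_z[z].get((x, y), 0.0) + p
    # I(X, Y | Z) = sum over z of Pr[Z=z] * I(X, Y | Z=z)
    return sum(pz[z] * mutual_information(
                   {xy: p / pz[z] for xy, p in sub.items()})
               for z, sub in by_z.items())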

11 Quantification of Contamination
(Diagram: the same trusted/untrusted program picture as before.)

12 Quantification of Contamination
(Diagram: as before, now labeling the attacker's input U_in, the user's trusted input T_in, and the trusted output T_out.)

13 Quantification of Contamination
Contamination = I(U_in, T_out | T_in)
(Diagram: as before, with U_in, T_in, T_out.)
[Newsome et al. 2009] Dual of [Clark et al. 2005, 2007]

14 Example of Contamination
o := (t, u)
Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k − 1]
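
A hedged check of this claim, reusing conditional_mutual_information from the sketch above: with T and U independent and uniform on k-bit values, the program o := (t, u) yields exactly k bits of contamination.

k = 3
n = 2 ** k
joint3 = {}
for t in range(n):
    for u in range(n):
        o = (t, u)                        # the program's output
        joint3[(u, o, t)] = 1 / n ** 2    # T, U independent and uniform
print(conditional_mutual_information(joint3))   # -> 3.0, i.e. k bits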

15 Our Notions of Integrity
Starting point          Corruption measure
Taint analysis          Contamination
Program correctness     Suppression
Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output

16 Program Suppression
Goal: model program (in)correctness
(Diagram: a sender feeds the specification, which produces the correct output for the receiver.)
(Specification must be deterministic)

17 Program Suppression
Goal: model program (in)correctness
(Diagram: the specification maps the sender's input to the correct output; the implementation, with trusted input from the sender and untrusted input from the attacker, produces the real output.)

18 Program Suppression
Goal: model program (in)correctness
The implementation might suppress information about the correct output from the real output.
(Diagram: as before.)

19 Example of Program Suppression
Spec.:  for (i=0; i<m; i++) { s := s + a[i]; }
a[0..m-1]: trusted

20 Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 1: suppression (a[0] missing); no contamination.
a[0..m-1]: trusted

21 Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }
Impl. 1: suppression (a[0] missing); no contamination.
Impl. 2: suppression and contamination (untrusted a[m] added).
a[0..m-1]: trusted; a[m]: untrusted
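
For concreteness, a runnable Python rendering of the three loops (my translation of the slide's pseudocode; the name a_m is my stand-in for the untrusted value at index m):

def spec(a):                 # for (i=0; i<m; i++) s := s + a[i]
    return sum(a)

def impl1(a):                # loop starts at i=1: a[0] is never read
    return sum(a[1:])

def impl2(a, a_m):           # loop runs to i=m: reads one past the end
    return sum(a) + a_m      # the untrusted a[m] flows into the sum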

22 Suppression vs. Contamination
(Diagram: for the program output := input, arrows contrast contamination, where attacker input reaches trusted output, with suppression, where trusted input fails to reach trusted output.)

23 Quantification of Program Suppression
(Diagram: the specification maps the sender's input to its output; the implementation takes trusted input from the sender and untrusted input from the attacker.)

24 Quantification of Program Suppression
(Diagram: as before, labeling the specification's input In and its output Spec.)

25 Quantification of Program Suppression
(Diagram: as before, also labeling the implementation's inputs U_in and T_in and its output Impl.)

26 Quantification of Program Suppression
Program transmission = I(Spec, Impl)
(Diagram: as before, with U_in, T_in, Impl, In, and Spec.)

27 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

28 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
H(Spec): total information to learn about Spec

29 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
H(Spec): total information to learn about Spec
I(Spec, Impl): information actually learned about Spec by observing Impl

30 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
H(Spec): total information to learn about Spec
I(Spec, Impl): information actually learned about Spec by observing Impl
H(Spec | Impl): information NOT learned about Spec by observing Impl

31 Quantification of Program Suppression
H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y
Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)
Program suppression = H(Spec | Impl)

32 Example of Program Suppression
Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }
Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }
Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }
Impl. 1: suppression = H(A)
Impl. 2: suppression ≤ H(A)
A = distribution of individual array elements
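
A hedged numeric check of the Impl. 1 claim, under the assumption (mine, not the talk's) that the array holds m i.i.d. uniform bits: exact enumeration gives H(Spec | Impl) = H(A) = 1 bit.

import itertools, math
from collections import defaultdict

m = 4
joint = defaultdict(float)                # Pr[Spec = s, Impl1 = i]
for a in itertools.product((0, 1), repeat=m):
    joint[(sum(a), sum(a[1:]))] += 1 / 2 ** m

p_impl = defaultdict(float)
for (s, i), p in joint.items():
    p_impl[i] += p

# H(Spec | Impl) = -sum over (s,i) of Pr[s,i] * log2(Pr[s,i] / Pr[i])
suppression = -sum(p * math.log2(p / p_impl[i])
                   for (s, i), p in joint.items() if p > 0)
print(suppression)                        # -> 1.0 bit = H(A) for a uniform bit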

33 Belief-based Metrics
What if the user's/receiver's distribution on unobservable inputs is wrong?
- Belief-based information flow [Clarkson et al. 2005]
Belief-based generalizes information-theoretic:
- on single executions, the same
- in expectation, the same …if the user's/receiver's distribution is correct

34 Suppression and Confidentiality
Declassifier: a program that reveals (leaks) some information and suppresses the rest.
Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]
Thm. Leakage + Suppression is a constant.
- What isn't leaked is suppressed.
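
The theorem is an instance of the chain rule H(S) = I(S, O) + H(S | O). A hedged one-example check (my construction: a declassifier that reveals only the top bit of a 2-bit secret), reusing mutual_information from the earlier sketch:

import math
from collections import defaultdict

joint = defaultdict(float)          # Pr[S = s, O = o]
for s in range(4):                  # secret: two uniform bits
    joint[(s, s >> 1)] += 1 / 4     # declassifier reveals the top bit

leakage = mutual_information(joint)            # I(S, O) = 1 bit
p_o = defaultdict(float)
for (s, o), p in joint.items():
    p_o[o] += p
suppression = -sum(p * math.log2(p / p_o[o])   # H(S | O) = 1 bit
                   for (s, o), p in joint.items() if p > 0)
print(leakage + suppression)        # -> 2.0 = H(S), a constant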

35 Database Privacy
A statistical database anonymizes query results:
…sacrifices utility for privacy's sake
(Diagram: the user's query goes to the database; the response passes through an anonymizer, which returns an anonymized response.)

36 Database Privacy
A statistical database anonymizes query results:
…sacrifices utility for privacy's sake
…suppresses to avoid leakage
(Diagram: as before; a trivial anonymizer would be  anon. resp. := resp.)

37 Database Privacy
A statistical database anonymizes query results:
…sacrifices utility for privacy's sake
…suppresses to avoid leakage
…sacrifices integrity for confidentiality's sake
(Diagram: as before.)

38 Database Privacy Security Conditions
k-anonymity [Sweeney 2002]:
- Every individual must be anonymous within a set of size k.
- Every output corresponds to k inputs.
…no bound on leakage or suppression
L-diversity [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]:
- Every individual's sensitive information should appear to have L roughly equally likely values.
- Every output corresponds to L (roughly) equally likely inputs.
…implies suppression ≥ log L
Differential privacy [Dwork et al. 2006, Dwork 2006]:
- No individual loses privacy by including data in the database.
- The output reveals almost no information about an individual input beyond what the other inputs already reveal.
…implies almost all information about the individual is suppressed
…quite similar to noninterference

39 Summary
Measures of information corruption:
- Contamination (generalizes taint analysis; dual to leakage)
- Suppression (generalizes program correctness; no dual)
Application: database privacy (model anonymizers; relate utility and privacy; security conditions)

40 More Integrity Measures
- Channel suppression: the same as the channel model from information theory, but with an attacker
- Attacker- and program-controlled suppression
Granularity:
- average over all executions
- single executions
- sequences of executions …interaction of the attacker with the program
Application: error-correcting codes

41 Quantification of Integrity Michael Clarkson and Fred B. Schneider Cornell University IEEE Computer Security Foundations Symposium July 17, 2010

42 Beyond Contamination and Suppression?
- Clark–Wilson integrity policy
- Relational integrity constraints
- Software testing metrics
- …

43 Beyond Contamination & Suppression?
(Diagram: the trusted/untrusted program picture again, with the corruption measures left open.)

44 Confidentiality Dual to Suppression?
Public inputs lost from public outputs:  output := input
(Diagram: sender and attacker supply secret and public inputs; receiver and attacker observe public outputs.)
- The classic duality of confidentiality and integrity is incomplete.

45 Value of Information
What if some bits are worth more than others?
- Discrete worth: security levels (top secret, secret, confidential, unclassified)
- Continuous worth: ?

46 Bounded Reasoning
Our agents are logically omniscient. Bounds?
Algorithmic knowledge, computational entropy, …

47 Information
Information is surprise.
X: random variable on a set of events {e, …}
I(e): self-information conveyed by event e
I(e) = − log2 Pr[X = e]  (unit is bits)

48 Suppression vs. Contamination
n := rnd(); o := t xor n    (t suppressed by noise; no contamination)
o := t xor u                (t suppressed by u; o contaminated by u)
o := (t, u)                 (o contaminated by u; no suppression)
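
A hedged illustration (my construction) computing both measures for one-bit versions of these programs, with T, U, and the noise N independent fair coins, reusing conditional_mutual_information from the earlier sketch; "suppression" here is the channel-style H(T | O):

import math
from collections import defaultdict

def h_t_given_o(joint_to):                     # H(T | O) from {(t, o): p}
    po = defaultdict(float)
    for (t, o), p in joint_to.items():
        po[o] += p
    return -sum(p * math.log2(p / po[o])
                for (t, o), p in joint_to.items() if p > 0)

programs = {
    "o := t xor n": lambda t, u, n: t ^ n,
    "o := t xor u": lambda t, u, n: t ^ u,
    "o := (t, u)":  lambda t, u, n: (t, u),
}
for name, prog in programs.items():
    j_to, j_uot = defaultdict(float), defaultdict(float)
    for t in (0, 1):
        for u in (0, 1):
            for n in (0, 1):
                o = prog(t, u, n)
                j_to[(t, o)] += 1 / 8
                j_uot[(u, o, t)] += 1 / 8
    print(name, "suppression =", h_t_given_o(j_to),
          "contamination =", conditional_mutual_information(j_uot))
# -> o := t xor n: suppression 1, contamination 0
#    o := t xor u: suppression 1, contamination 1
#    o := (t, u):  suppression 0, contamination 1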

49 Echo Specification
output := input
(Diagram: the sender's trusted input T_in flows to the receiver.)

50 Echo Specification
output := input
(Diagram: the specification echoes T_in to the receiver; the implementation takes T_in and the attacker's U_in and produces T_impl.)

51 Echo Specification
output := input
(Diagram: as before, with the implementation's output labeled T_out.)

52 Echo Specification
output := input
(Diagram: as before.)
This simplifies to the information-theoretic model of channels, with an attacker.

53 Channel Suppression
Channel transmission = I(T_in, T_out)
Channel suppression = H(T_in | T_out)
(T_out depends on U_in)
(Diagram: the sender's T_in and the attacker's U_in enter the channel; the receiver observes T_out.)
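
A hedged numeric example (my construction): a one-bit channel where the attacker's influence flips T_in with probability q. Channel suppression H(T_in | T_out) runs from 0 bits (perfect channel) to 1 bit (useless channel):

import math

def channel_suppression(q):
    # T_in a fair coin; T_out = T_in flipped with probability q
    joint = {(0, 0): (1 - q) / 2, (0, 1): q / 2,
             (1, 0): q / 2, (1, 1): (1 - q) / 2}
    p_out = {0: 0.5, 1: 0.5}
    return -sum(p * math.log2(p / p_out[o])
                for (t, o), p in joint.items() if p > 0)

print(channel_suppression(0.0))    # -> 0.0: T_out determines T_in
print(channel_suppression(0.1))    # -> ~0.47 bits lost to the attacker
print(channel_suppression(0.5))    # -> 1.0: T_out says nothing about T_in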

54 Probabilistic Specifications
Correct output: a distribution on outputs, e.g.  o := rnd(1)
Correct output distribution: a distribution on distributions on outputs
Continuous distributions, differential entropy?

55 Duality

56 Contamination vs. Leakage
Contamination = I(U_in, T_out | T_in)
Leakage = I(S_in, P_out | P_in)
(Diagrams: the integrity picture, with trusted and untrusted inputs, beside the confidentiality picture, with secret and public inputs.)

57 Contamination vs. Leakage
Contamination = I(U_in, T_out | T_in)
Leakage = I(S_in, P_out | P_in)
[Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

58 Contamination vs. Leakage
Contamination = I(U_in, T_out | T_in)
Leakage = I(S_in, P_out | P_in)
[Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]
- Contamination is dual to leakage.

59 Confidentiality Dual to Suppression?
No.

60 L-diversity
Every individual's sensitive information should appear to have L (roughly) equally likely values. [Machanavajjhala et al. 2007]
Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]
H(T_in | t_out) ≥ log L (if T_in uniform)
…implies suppression ≥ log L
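
A hedged sketch (my construction) of the entropy L-diversity check for a single anonymized block of sensitive values:

import math
from collections import Counter

def entropy_l_diverse(block, L):
    """True iff H(sensitive values in this block) >= log2 L."""
    counts = Counter(block)
    n = len(block)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h >= math.log2(L)

print(entropy_l_diverse(["flu", "hiv", "cold", "flu"], 2))   # -> True  (H = 1.5)
print(entropy_l_diverse(["flu", "flu", "flu", "hiv"], 2))    # -> False (H < 1)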

61 “To Measure is to Know” When you can measure what you are speaking about…you know something about it; but when you cannot measure it…your knowledge is…meager and unsatisfactory… You have scarcely…advanced to the state of Science. —Lord Kelvin

