
1 Decentralized Information Flow: a paper by Myers and Liskov
Fall 2011, Privacy & Security, Virginia Tech, Computer Science

2 Overview
- Motivation & goals
- Example
- Model
- Labels: confidentiality (reading); integrity (writing) [presented later]
- Principal hierarchy
- Relabeling
- Declassification
- Jif (Java Information Flow)
- Static and dynamic checking

3 Motivation & Goals
- End-to-end confidentiality
  - Need to control the propagation/release of information beyond what access control guarantees
  - Need for a more precise and practical model
- Model
  - Decentralized label model: independent principals, not a central authority
  - Copes with untrusted code and mutual suspicion
  - Protects users and groups (not just the organization)
- Richer notion of declassification
  - Needed in practical systems to avoid "label creep"
  - Needs no trusted subject (each principal declassifies its own data)
- Finer grain of protection via Jif

4 Example
- Bob and the tax preparer must cooperate
- Each has sensitive information to protect from the other
- Bob cannot inspect the tax preparer's code
- Both must trust the execution platform

5 Model: Principals
- A process can act on behalf of a (set of) principal(s)
- p acts for q: p has all the powers of q; written p ⪰ q
- A principal represents an individual or a group/role
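The acts-for relation on this slide can be modeled as the reflexive, transitive closure of declared delegation edges. A minimal Python sketch (the principal names and the dict representation are illustrative assumptions, not code from the paper):

```python
# Direct delegations: p -> principals that p acts for directly (assumed example data).
DELEGATIONS = {
    "manager": {"group"},
    "group": {"alice"},
}

def acts_for(p, q):
    """True if p can act for q: p == q, or a chain of delegations leads from p to q."""
    if p == q:
        return True
    seen, stack = set(), [p]
    while stack:
        cur = stack.pop()
        for nxt in DELEGATIONS.get(cur, ()):
            if nxt == q:
                return True
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False
```

Because the relation is transitive, "manager" acts for "alice" here through "group".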

6 Model: Labels
- Confidentiality labels (integrity later)
- Form: {policy1, policy2, ..., policyn}
  - A policy is owner: list of readers
  - Example: L = {o1: r1, r2; o2: r2, r3}
- All policies in a label must be satisfied
  - In the example, r2 can read an object labeled by L
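The reader check implied by "all policies must be satisfied" can be sketched as follows; a label is represented as a dict from owner to reader set (an illustrative encoding, with owners treated as implicit readers of their own policies):

```python
# L = {o1: r1, r2; o2: r2, r3} from the slide, as a dict of reader sets.
L = {"o1": {"r1", "r2"}, "o2": {"r2", "r3"}}

def may_read(label, p):
    """p may read only if every policy grants it: p is a listed reader
    or the policy's own owner."""
    return all(p == owner or p in readers for owner, readers in label.items())
```

Only r2 appears in both reader lists, so only r2 satisfies every policy of L, matching the slide's example.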

7 Intuition: Confidentiality
- Confidentiality labels form a lattice ordered by restrictiveness: low confidentiality is less restrictive, high confidentiality is more restrictive
- Assignment moves data upward, from a source label to a destination label at least as restrictive
- Declassification moves data downward, from the more restrictive label before to a less restrictive label after
- Join (⊔): union of all policies (each of which must be met); the reader sets of a given owner are intersected
- Meet (⊓): intersection of all policies; the reader sets of a given owner are unioned
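The join and meet described above can be written directly against the dict-of-reader-sets encoding (an illustrative sketch, not the paper's notation):

```python
def join(l1, l2):
    """Least upper bound: union of the policies; where both labels carry a
    policy for the same owner, that owner's reader sets are intersected."""
    out = {o: set(rs) for o, rs in l1.items()}
    for o, rs in l2.items():
        out[o] = (out[o] & rs) if o in out else set(rs)
    return out

def meet(l1, l2):
    """Greatest lower bound: only policies whose owner appears in both labels
    survive, and that owner's reader sets are unioned."""
    return {o: l1[o] | l2[o] for o in l1.keys() & l2.keys()}
```

For example, joining {o1: r1, r2} with {o1: r2, r3; o2: r3} yields {o1: r2; o2: r3}: both of o1's policies must be met, so only r2 remains as o1's reader.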

8 Relabeling
- Assignment (value with label L1 → variable with label L2) is permitted iff it is safe
- Safety: L2 is at least as restrictive as L1, so every policy in L1 will be enforced by L2; written L1 ⊑ L2
- A policy J "covers" a policy I when:
  - J's owner can act for I's owner (or is the same owner)
  - J's readers are a subset of I's readers (or are the same)
- Incremental relabeling rules:
  - Remove a reader
  - Add a policy
  - Add a reader r' if some reader r is in the policy and r' ⪰ r
  - Replace an owner o with o' if o' ⪰ o
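Ignoring the principal hierarchy for the moment (so "covers" means same owner, readers only removed), the safety check L1 ⊑ L2 reduces to a subset test per owner. A minimal sketch over the assumed dict-of-reader-sets encoding:

```python
def may_relabel(l1, l2):
    """Safe iff every policy in l1 is covered by a policy in l2 with the
    same owner and a subset of its readers; l2 may also add whole policies."""
    return all(owner in l2 and l2[owner] <= readers
               for owner, readers in l1.items())
```

Removing a reader and adding a policy both pass this check; adding a reader or dropping an owner's policy fails it, matching the incremental rules.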

9 Relabeling Examples

10 Complete relabeling rule
- For a policy I:
  - Owner: o(I)
  - Explicit readers: r(I)
  - Implicit readers: principals that can act for the owner o(I) or for some explicit reader in r(I)
- Rule: L1 ⊑ L2 iff every policy I in L1 is covered by some policy J in L2, i.e. o(J) ⪰ o(I) and every explicit reader of J is an implicit reader of I
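The complete rule folds the acts-for hierarchy into the cover test. In the sketch below (the encoding and the toy hierarchy are assumptions), a policy J covers a policy I when J's owner can act for I's owner and every reader J grants can act for some principal in r(I) ∪ {o(I)}, i.e. is an implicit reader of I:

```python
def covers(j_owner, j_readers, i_owner, i_readers, acts_for):
    """J covers I: o(J) acts for o(I), and each of J's readers acts for
    some explicit reader of I or for I's owner (an implicit reader)."""
    if not acts_for(j_owner, i_owner):
        return False
    allowed = set(i_readers) | {i_owner}
    return all(any(acts_for(r2, r1) for r1 in allowed) for r2 in j_readers)

def may_relabel(l1, l2, acts_for):
    """L1 ⊑ L2 iff every policy in l1 is covered by some policy in l2."""
    return all(any(covers(jo, jr, io, ir, acts_for) for jo, jr in l2.items())
               for io, ir in l1.items())

def hierarchy(p, q):
    """Toy acts-for relation: boss acts for alice (plus reflexivity)."""
    return p == q or (p, q) == ("boss", "alice")
```

Under this hierarchy, {alice: alice} may be relabeled to {boss: boss}, since boss can act for both the owner and the reader.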

11 Combining Information

12 Declassification: Confidentiality Labels
- Goal: a deliberate action by a process to weaken/relax confidentiality
- Authority: the set of principals on whose behalf a process is allowed to act
- Performed on a per-owner basis
  - No centralized declassifier needed
  - Owners cannot affect each other's policies
- Rule: L1 may be declassified to L2 if L1 ⊑ L2 ⊔ LA, where LA contains the policy {p: } (owner p, no readers) for every principal p in the authority of the process
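The per-owner nature of declassification can be sketched as an operation that rewrites only one owner's policy and refuses to act without that owner's authority (the dict encoding and function shape are illustrative assumptions):

```python
def declassify(label, authority, owner, new_readers):
    """Relax owner's own policy to new_readers. Requires the process to hold
    owner's authority; every other owner's policy is left untouched."""
    if owner not in authority:
        raise PermissionError("process lacks authority to act for " + owner)
    out = {o: set(rs) for o, rs in label.items()}
    out[owner] = set(new_readers)
    return out
```

In the tax example, a process running with the preparer's authority could add Bob as a reader of the preparer's policy on the final form, without touching any policy that Bob owns.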

13 Declassification: example
- The example requires WebTax to declassify the final tax form so that it is readable by Bob.

14 Model: Integrity Labels
- Form: {policy1, policy2, ..., policyn}
  - A policy is owner: list of writers
  - Example: L = {o1: w1, w2; o2: w2, w3}
- Interpretation of a policy:
  - A guarantee by the policy owner that the data can have been affected only by the listed writers
  - The fewer the writers, the less restrictive the label and the stronger the integrity guarantee
- The most restrictive label is {}:
  - It states no guarantee about sources or contamination
  - All users may have written to the data
  - Such data can be used only where the receiver imposes no integrity requirements

15 Intuition: Integrity
- Integrity labels form a lattice: high integrity carries a strong guarantee and is less restrictive; low integrity carries a weak guarantee and is more restrictive
- Assignment moves data from a source label to a destination label at least as restrictive
- Declassification moves data from the label before to the label after, relaxing the guarantee
- Join (⊔): intersection of all policies; the writer sets of a given owner are unioned
- Meet (⊓): union of all policies; the writer sets of a given owner are intersected

16 Relabeling Integrity Labels
- Assignment (value with label L1 → variable with label L2) is permitted iff it is safe
- Safety: L2 is at least as restrictive as L1
- Examples:
  - {o: w1} → {o: w1, w2} is allowed
  - {o: w1, w3} → {o: w1, w2} is not allowed
  - {o: w1; o': w3} → {o: w1, w2} is allowed
- Incremental relabeling rules:
  - Add a writer
  - Remove a policy
  - Replace a writer w' by a writer w where w' ⪰ w
  - Add a policy J that is identical to a policy I except that o(I) ⪰ o(J)
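The three examples above follow from a single subset test: for integrity, an assignment is safe when every policy the destination claims is backed by a source policy with the same owner and no fewer writers excluded, i.e. writers may only be added and whole policies only removed. A sketch over a dict-of-writer-sets encoding (illustrative; the acts-for hierarchy is ignored):

```python
def may_relabel_integrity(l1, l2):
    """Safe iff each policy in l2 is backed by a policy in l1 with the same
    owner whose writer set is a subset of l2's (writers only added)."""
    return all(owner in l1 and l1[owner] <= writers
               for owner, writers in l2.items())
```

It accepts {o: w1} → {o: w1, w2} and {o: w1; o': w3} → {o: w1, w2}, and rejects {o: w1, w3} → {o: w1, w2}, because silently dropping w3 would strengthen the guarantee.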

17 Declassification: Integrity Labels
- Goal: a deliberate act by a process to weaken/relax integrity guarantees
- Authority: the set of principals on whose behalf a process is allowed to act
- Performed on a per-owner basis
  - No centralized declassifier needed
  - Owners cannot affect each other's policies
- Rule: L1 can be declassified to L2 if L1 ⊓ LA ⊑ L2, where LA is an integrity label with a policy {p: all} for every principal p in the authority of the process; all is the list of all users

18 Jif: Java Information Flow
- Extends Java with information-flow checking
- New features:
  - Static checking of code privileges, plus dynamic granting/checking of authority
  - Label polymorphism
  - Run-time checking when needed; the run-time checks are themselves checked to guard against leaks
  - Automatic label inference reduces the need for manual labeling
  - Implicit flows are accounted for by associating a static program-counter label (pc) with every statement
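The pc-label idea in the last bullet can be illustrated dynamically (Jif performs this check statically; the set-of-taints encoding below is an assumption for illustration): every assignment must be permitted by the value's label joined with the pc, so code branching on secret data cannot write to a public variable.

```python
def check_assign(target_label, value_label, pc):
    """An assignment is legal only if the target's label is at least as
    restrictive as value_label joined with the current pc label.
    Labels here are sets of taints; superset = more restrictive."""
    if not (value_label | pc) <= target_label:
        raise TypeError("information flow not permitted by target label")

# In `if (secret) { x = 1; }`, the pc inside the branch includes secret's
# label, so even the constant 1 carries that label when assigned to x.
```

With pc = {"alice"}, assigning to a variable labeled {"alice"} succeeds, while assigning to an unlabeled (public) variable is rejected: that rejection is the implicit flow being caught.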

19 Overview
- Labeled types: a type t is associated with an information-flow label l, written t{l}
- Label checking ensures that the apparent label of a value is at least as restrictive as the actual label of every value that might affect it
- Additional features:
  - A declassify operator
  - An actsFor statement
  - Procedure calls may delegate a portion of the caller's authority
  - The type "label" permits run-time label checking

20 Example

