Information Security: Confidentiality Policies (Chapter 4)
Dr. Shahriar Bijani, Shahed University


1 Information Security: Confidentiality Policies (Chapter 4). Dr. Shahriar Bijani, Shahed University

2 Slides References. Matt Bishop, Computer Security: Art and Science, slides from the author's homepage, 2002-2004. Chris Clifton, CS 526: Information Security course, Purdue University, 2010.

3 Chapter 5: Confidentiality Policies. Overview: what a confidentiality model is; the Bell-LaPadula Model (general idea, informal description of rules, formal description of rules); tranquility; controversy (*-property, System Z).

4 Overview. Bell-LaPadula: informally, formally; example instantiation; tranquility; controversy; System Z.

5 Confidentiality Policy. Goal: prevent the unauthorized disclosure of information. Deals with information flow. Multi-level security models are the best-known examples; the Bell-LaPadula Model is the basis for many, if not most, of these.

6 Background. Clearance levels: Top Secret: in-depth background check; highly trusted individual. Secret: routine background check; trusted individual. For Official Use Only/Sensitive: no background check, but limited distribution; minimally trusted individuals; may be exempt from disclosure. Unclassified: unlimited distribution; untrusted individuals.

7 Bell-LaPadula Model (Step 1). Security levels arranged in a linear ordering: Top Secret (highest), Secret, Confidential, Unclassified (lowest). Each subject has a security clearance L(s) = l_s; each object has a security classification L(o) = l_o. Clearances/classifications are ordered: l_i < l_i+1. This is mandatory access control.

8 Example.
security level     subject   object
l4: Top Secret     Bill      Personnel Files
l3: Secret         Samuel    E-Mail Files
l2: Confidential   Claire    Activity Logs
l1: Unclassified   John      Telephone Lists
Bill can read all files. Claire cannot read the Personnel or E-Mail Files. John can only read the Telephone Lists.

9 Reading Information. Information flows up, not down: "reads up" disallowed, "reads down" allowed. Simple Security Condition (Step 1): subject s can read object o iff L(o) ≤ L(s) and s has permission to read o. Note: this combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no reads up" rule.

10 Writing Information. Information flows up, not down: "writes up" allowed, "writes down" disallowed. *-Property (Step 1): subject s can write object o iff L(s) ≤ L(o) and s has permission to write o. Note: this combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no writes down" rule.
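The two step-1 rules can be sketched as simple checks over the linear ordering. This is an illustrative sketch, not code from the slides; the level names follow slide 7, and the boolean permission flags stand in for the discretionary access controls.

```python
# Linear ordering of clearances/classifications from slide 7 (lowest first).
LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]

def rank(level):
    return LEVELS.index(level)

def can_read(subject_level, object_level, has_read_perm):
    # Simple security condition (step 1): L(o) <= L(s), plus the
    # discretionary read permission ("no reads up").
    return rank(object_level) <= rank(subject_level) and has_read_perm

def can_write(subject_level, object_level, has_write_perm):
    # *-property (step 1): L(s) <= L(o), plus the discretionary
    # write permission ("no writes down").
    return rank(subject_level) <= rank(object_level) and has_write_perm
```

Note how each check conjoins a mandatory part (comparison of levels) with a discretionary part (the permission flag), exactly as the slides describe.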

11 Basic Security Theorem, Step 1. If a system is initially in a secure state, and every transition of the system satisfies the simple security condition (step 1) and the *-property (step 1), then every state of the system is secure. Proof: induct on the number of transitions.
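The induction can be illustrated with a toy simulation (an assumption-laden sketch, not the formal proof): model a state as the set of current accesses, grant a request only if it passes both step-1 checks, and observe that the security invariant holds after every transition.

```python
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def ok(request):
    # Both step-1 checks: no reads up, no writes down.
    s, o, act = request
    return LEVELS[o] <= LEVELS[s] if act == "read" else LEVELS[s] <= LEVELS[o]

def secure(state):
    # A state (set of (subject_level, object_level, action) accesses)
    # is secure if every current access passes the checks.
    return all(ok(r) for r in state)

def transition(state, request):
    # Grant the request only if it satisfies both conditions.
    return state | {request} if ok(request) else state

state = set()                 # initial state: no accesses, trivially secure
for req in [("Secret", "Confidential", "read"),
            ("Confidential", "Secret", "read"),    # a read up: denied
            ("Confidential", "Secret", "write")]:
    state = transition(state, req)
    assert secure(state)      # the invariant holds after every transition
```

The base case is the secure initial state; the inductive step is that `transition` never adds an access violating either rule.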

12 Basics: Partially Ordered Set. A set S with relation ≤ (written (S, ≤)) is called a partially ordered set if ≤ is: antisymmetric (if a ≤ b and b ≤ a, then a = b); reflexive (for all a in S, a ≤ a); transitive (for all a, b, c in S, a ≤ b and b ≤ c implies a ≤ c).
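The three axioms can be checked mechanically on a finite set. A small illustrative sketch, using the subset relation on the powerset of {1, 2} (this example set is an assumption, chosen to match the poset examples on the next slide):

```python
from itertools import product

def is_partial_order(elems, leq):
    # Check the three partial-order axioms by brute force on a finite set.
    reflexive = all(leq(a, a) for a in elems)
    antisymmetric = all(a == b for a, b in product(elems, repeat=2)
                        if leq(a, b) and leq(b, a))
    transitive = all(leq(a, c) for a, b, c in product(elems, repeat=3)
                     if leq(a, b) and leq(b, c))
    return reflexive and antisymmetric and transitive

# The powerset of {1, 2} under subset inclusion is a partial order.
sets = [frozenset(), frozenset({1}), frozenset({2}), frozenset({1, 2})]
assert is_partial_order(sets, lambda a, b: a <= b)
```

By contrast, strict less-than fails the reflexivity axiom, so it is not a partial order under this definition.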

13 Background: Poset Examples. Natural numbers with less than (a total order); sets under the subset relation (not a total order); natural numbers ordered by divisibility.

14 Background: Lattice. A partially ordered set (S, ≤) with two operations: greatest lower bound (glb X), the greatest element less than or equal to all elements of the set X; least upper bound (lub X), the least element greater than or equal to all elements of the set X. Every lattice has a bottom (glb L), a least element, and a top (lub L), a greatest element.
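In the powerset lattice that the next slides use for security categories, lub is set union and glb is set intersection. A minimal sketch (the category names are taken from the examples below):

```python
def lub(x, y):
    # Least upper bound in the powerset lattice: the smallest set
    # containing both x and y, i.e. their union.
    return x | y

def glb(x, y):
    # Greatest lower bound: the largest set contained in both,
    # i.e. their intersection.
    return x & y

a, b = {"NUC", "EUR"}, {"NUC", "US"}
# Each bound is comparable (by subset) to both arguments.
assert glb(a, b) <= a <= lub(a, b) and glb(a, b) <= b <= lub(a, b)
```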

15 Background: Lattice Examples. Natural numbers in an interval 0..n with less than; likewise the linear order of clearances (U < FOUO < S < TS). The powerset of a set of generators under inclusion, e.g. the powerset of the security categories {NUC, Crypto, ASI, EUR}. The divisors of a natural number under divisibility.

16 Bell-LaPadula Model (Step 2). A total order of classifications is not flexible enough. Solution: categories. S can access O if C(O) ⊆ C(S). Combining with clearance: (L, C) dominates (L', C') iff L' ≤ L and C' ⊆ C. This induces a lattice instead of linearly ordered levels. Expand the notion of security level to include categories: a security level is a pair (clearance, category set).

17 Bell-LaPadula Model (BLP). [Two lattice diagrams.] Example 1: security levels such as (Top Secret, {NUC, EUR, ASI}), (Secret, {NUC, ASI}), and (Confidential, {EUR, ASI}). Example 2: the powerset lattice of the categories {NUC, EUR, US}, from {NUC, EUR, US} at the top, through the two-element and one-element subsets, down to the empty set at the bottom.

18 Levels and Lattices. The dom (dominates) relation: (L, C) dom (L', C') iff L' ≤ L and C' ⊆ C. Examples: (Top Secret, {NUC, ASI}) dom (Secret, {NUC}); (Secret, {NUC, EUR}) dom (Confidential, {NUC, EUR}); (Top Secret, {NUC}) ¬dom (Confidential, {EUR}). Let C be the set of clearances and K the set of categories. The set of security levels L = C × 2^K with dom forms a lattice, with lub((L, C), (L', C')) = (max(L, L'), C ∪ C') and glb((L, C), (L', C')) = (min(L, L'), C ∩ C').
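The dom relation and the lattice operations on (clearance, category set) pairs can be sketched directly from these definitions. An illustrative sketch; the clearance names are from the slides, and levels are represented as (clearance, frozen category set) tuples:

```python
CLEARANCES = ["Unclassified", "Confidential", "Secret", "Top Secret"]

def dom(a, b):
    # (L, C) dom (L', C') iff L' <= L and C' is a subset of C.
    (L, C), (Lp, Cp) = a, b
    return CLEARANCES.index(Lp) <= CLEARANCES.index(L) and Cp <= C

def lub(a, b):
    # Least upper bound: the higher clearance, union of categories.
    (L, C), (Lp, Cp) = a, b
    return (max(L, Lp, key=CLEARANCES.index), C | Cp)

def glb(a, b):
    # Greatest lower bound: the lower clearance, intersection of categories.
    (L, C), (Lp, Cp) = a, b
    return (min(L, Lp, key=CLEARANCES.index), C & Cp)

# The three examples from the slide:
assert dom(("Top Secret", {"NUC", "ASI"}), ("Secret", {"NUC"}))
assert dom(("Secret", {"NUC", "EUR"}), ("Confidential", {"NUC", "EUR"}))
assert not dom(("Top Secret", {"NUC"}), ("Confidential", {"EUR"}))
```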

19 Levels and Ordering. Security levels are only partially ordered: any pair of security levels may (or may not) be related by dom. "Dominates" serves the role of "greater than" from step 1, but "greater than" is a total ordering, whereas dom is not.

20 Reading Information. Information flows up, not down: "reads up" disallowed, "reads down" allowed. Simple Security Condition (Step 2): subject s can read object o iff L(s) dom L(o) and s has permission to read o. Note: this combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no reads up" rule.

21 Writing Information. Information flows up, not down: "writes up" allowed, "writes down" disallowed. *-Property (Step 2): subject s can write object o iff L(o) dom L(s) and s has permission to write o. Note: this combines mandatory control (the relationship of security levels) and discretionary control (the required permission). Sometimes called the "no writes down" rule.

22 Basic Security Theorem (Step 2). If a system is initially in a secure state, and every transition of the system satisfies the simple security condition (step 2) and the *-property (step 2), then every state of the system is secure. Proof: induct on the number of transitions.

23 Example. George is cleared into security level (Secret, {NUC, EUR}); DocA is classified as (Confidential, {NUC}), DocB as (Secret, {EUR, US}), and DocC as (Secret, {EUR}). Then: George dom DocA, as Confidential ≤ Secret and {NUC} ⊆ {NUC, EUR}; George ¬dom DocB, as {EUR, US} ⊄ {NUC, EUR}; George dom DocC, as Secret ≤ Secret and {EUR} ⊆ {NUC, EUR}. So George can read DocA and DocC but not DocB (assuming the discretionary access controls allow such access). Now suppose Paul is cleared as (Secret, {EUR, US, NUC}) and has discretionary read access to DocB. Paul can read DocB; were he to copy its contents to DocA and set its access permissions accordingly, George could then read DocB! The *-property (step 2) prevents this.
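The George example above can be replayed as an illustrative check (the names and levels come from the slide; the `dom` helper is a sketch, not library code):

```python
ORDER = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def dom(subj, obj):
    # (L, C) dom (L', C') iff L' <= L and C' is a subset of C.
    (L, C), (Lo, Co) = subj, obj
    return ORDER[Lo] <= ORDER[L] and Co <= C

george = ("Secret", {"NUC", "EUR"})
assert dom(george, ("Confidential", {"NUC"}))      # DocA: readable
assert not dom(george, ("Secret", {"EUR", "US"}))  # DocB: not readable
assert dom(george, ("Secret", {"EUR"}))            # DocC: readable

paul = ("Secret", {"EUR", "US", "NUC"})
assert dom(paul, ("Secret", {"EUR", "US"}))        # Paul can read DocB
```

Paul's attempted copy of DocB into DocA would be a write from (Secret, {EUR, US, NUC}) down to (Confidential, {NUC}), which fails the step-2 *-property test L(DocA) dom L(Paul).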

24 Problem. A Colonel has (Secret, {NUC, EUR}) clearance; a Major has (Secret, {EUR}) clearance. The Major can talk to the Colonel ("write up" or "read down"), but the Colonel cannot talk to the Major ("read up" or "write down"). Not desired!

25 Solution. Define maximum and current levels for subjects, with maxlevel(s) dom curlevel(s). Example: treat the Major as an object (the Colonel is writing to him). The Colonel has maxlevel (Secret, {NUC, EUR}) and sets curlevel to (Secret, {EUR}). Now L(Major) dom curlevel(Colonel), so the Colonel can write to the Major without violating "no writes down".
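A minimal sketch of this maximum/current level fix, assuming writes are checked against the subject's current level (the representation is illustrative):

```python
ORDER = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def dom(a, b):
    # (L, C) dom (L', C') iff L' <= L and C' is a subset of C.
    (L, C), (Lb, Cb) = a, b
    return ORDER[Lb] <= ORDER[L] and Cb <= C

colonel_max = ("Secret", {"NUC", "EUR"})
major = ("Secret", {"EUR"})

# At maxlevel, the *-property blocks the write: L(Major) ¬dom maxlevel.
assert not dom(major, colonel_max)

colonel_cur = ("Secret", {"EUR"})      # Colonel drops NUC from curlevel
assert dom(colonel_max, colonel_cur)   # maxlevel dom curlevel still holds
assert dom(major, colonel_cur)         # now the write to the Major is allowed
```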

26 Systems Built on Bell-LaPadula (BLP). BLP was a simple model; the intent was that it could be enforced by simple mechanisms, and file system access control was the obvious choice. Multics (1965) implemented BLP, and Unix inherited its discretionary access control from Multics.

