
1 Chapter 4 of Bishop: Security Policies

2 Security Policy
A statement that partitions all possible system states into:
- Authorized (secure) states
- Unauthorized (non-secure) states
A secure system:
- Starts in an authorized state
- Never enters an unauthorized state
[Figure: states S1, S2, S3, S4 partitioned into authorized and unauthorized, with a trajectory starting from S2]
Discussion: 1. What does "statement" mean here? 2. This is a state-based requirements specification. (See the sketch below.)
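The state-based view can be made concrete. Below is a toy Python sketch with made-up states and transitions (the specific sets are illustrative, not from Bishop): a system is secure from a start state iff every state reachable from it is authorized.

# Toy state-based security check (states and transitions are made up).
transitions = {"S1": {"S2"}, "S2": {"S3"}, "S3": {"S3"}, "S4": {"S1"}}
authorized = {"S1", "S2", "S3"}   # S4 is an unauthorized state

def secure(start):
    # Secure iff every state reachable from start is authorized.
    seen, stack = set(), [start]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if s not in authorized:
            return False
        stack.extend(transitions.get(s, ()))
    return True

print(secure("S2"))  # True: S2 -> S3 stays within authorized states
print(secure("S4"))  # False: S4 itself is unauthorized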

3 The "CIA" of Security
Confidentiality: prevent unauthorized subjects from obtaining information. Why so important? Privacy?
Availability: provide information whenever an authorized entity asks for it. Why so important? Disaster?
Integrity: information may not be changed in unauthorized ways. Why so important? Loss of valuable data?

4 Confidentiality
X: a set of subjects; I: information.
I has the confidentiality property w.r.t. X if no x ∈ X can obtain information about I.
Example: X = students, I = final exam questions.
Subtleties:
- Why "w.r.t. X"? I may not be confidential to students in another class.
- Why "information about I"? What counts as obtaining? Learning "no question in I is easier than question 4.12" or "question 4.12 is not included in I" reveals partial information about I even though I itself is never disclosed.

5 Integrity
Integrity = prevent unauthorized modification of information.
I has the integrity property w.r.t. X if all x ∈ X trust the information in I.
Types of integrity:
- Trust in the origin/identity of I (origin integrity, or authentication). Counterexample: phishing.
- Trust in the conveyance/storage of I (data integrity). Counterexample: popular software hosted by a disreputable website.
- Trust in the specification of a resource I (assurance). Counterexample: software with a backdoor.
Is trust a mechanism to ensure integrity?

6 Availability
I has the availability property w.r.t. X if all x ∈ X can access I.
"Can access" usually means "can access in reasonable time." Psychologically acceptable web response time: about 8 seconds.
A DoS (DDoS) attack may slow a system down instead of crashing it, perhaps to a sluggish but nominally "available" speed.

7 Policies and Their Models
Policy = statement of safe states; an abstract description of a policy = an abstract statement of safe states.
Basic things specified in policies:
- Confidentiality: prohibit direct (rights leakage) or indirect (information flow) disclosure of information, possibly under temporal conditions. Example: no one may read another's homework, and no one may write their answers elsewhere, before the in-class discussion.
- Integrity: whether information can be altered and, if so, under what conditions and how. Example: the software may be distributed, but only without modification.
- Availability: what service must be provided, and at what quality (QoS). Example: a response must reach the browser within 8 seconds.

8 Policy: Many Dimensions
Formal vs. informal (with degrees of formality in between):
- Formal = using a formally specified syntax.
- Informal = human language or pictures.
Informal policies are used by linguists (they must be amenable to linguistic argument and stand up to academic criticism) and by congressional policy makers, lawyers, etc., to give the executive and judicial branches a sense of non-ambiguity.
Failure mode: the Supreme Court; no clear division between good and bad.

9 From Formal to Machine-Understandable?
Why formality? The desire for non-ambiguity.
What can we do with formal policies? Analyze them for:
- Consistency: are they non-contradictory?
- Completeness: do they cover all the cases?
How do we analyze them? By committee? By hand? By machine?

10 Formal Policies During/After Analysis
During analysis: can we use machines to analyze them? There are many proof systems, calculi, etc.
What happens after analysis? We enforce them. Machine enforcement requires executable policies. Compare: executable policies vs. executable code.

11 Behavioral Policies
Govern the behavior of subjects, objects, and actions: what subjects can and cannot do.
Example: statements about socially acceptable behavior (equal opportunity, affirmative action, discrimination).
Safety vs. liveness:
- Safety: bad things to avoid. Example: do not divulge private information.
- Liveness: good things that must be done. Examples: always inform user groups of changes; new obligations placed on lenders, etc.

12 Traditional Examples of Security Policies
Military (DoD) security policy: primarily protects confidentiality. (Behavior against attack; the Privacy Act and HIPAA.)
Commercial security policy: primarily protects integrity and availability. Example: a student account server.
But note the narrower usage: a "confidentiality policy" means one that protects only confidentiality, and an "integrity policy" means one that protects only integrity.

13 Integrity vs. Consistency
Integrity concerns modifications in malign situations. Example: software with a Trojan horse in it.
Consistency concerns benign situations, e.g., in a distributed system:
- You want to withdraw $20 from an ATM.
- The bank debits $20 from your account.
- The ATM powers down before it gives you the money.
Does the customer lose $20? Not really. Why not? Because it is a transaction: two-phase commit rolls it back. This is not a traditional security mechanism. (See the sketch below.)
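The two-phase commit mentioned above can be sketched in a few lines. This is a minimal illustration with made-up names (Participant, two_phase_commit), not the textbook's treatment:

# Minimal two-phase commit sketch (illustrative; not from the textbook).
# Phase 1: the coordinator asks every participant to prepare (vote).
# Phase 2: commit only if all voted yes; otherwise abort and roll back.

class Participant:
    def __init__(self, name, will_succeed=True):
        self.name = name
        self.will_succeed = will_succeed
        self.committed = False

    def prepare(self):
        # e.g., the bank tentatively debits $20; the ATM reserves the cash
        return self.will_succeed

    def commit(self):
        self.committed = True

    def abort(self):
        self.committed = False  # roll back the tentative change

def two_phase_commit(participants):
    if all(p.prepare() for p in participants):   # phase 1: all vote yes?
        for p in participants:
            p.commit()                           # phase 2: commit everywhere
        return "committed"
    for p in participants:
        p.abort()                                # phase 2: abort everywhere
    return "aborted"

# The ATM powers down before dispensing: its vote fails, so the bank's
# tentative debit is rolled back and the customer does not lose $20.
print(two_phase_commit([Participant("bank"), Participant("ATM", will_succeed=False)]))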

14 Trust
Dictionary senses:
a: reliance on the character, ability, strength, or truth of someone or something
b: one in whom confidence is placed
c: dependence on future payment for property delivered

15 Trust and Security
Is trust a basic assumption of truthfulness? (That would be circular.)
Trust is axiomatic: we need to begin by assuming that something is true, and build the rest on that assumption.
Bases for trust:
- Belief (possibly social, blind faith, unfounded, or even unjustified)
- Experience: prior knowledge of good behavior
- Reputation: e.g., a credit score from several credit agencies; it is up to the individual to decide how to use the data in making a judgment

16 Roles of Trust in System Management (from the textbook)
An administrator who installs a patch trusts that:
1. The patch came from the vendor and was not tampered with in transit. Example: the fake Windows patch (see Appendix 1).
2. The vendor tested the patch thoroughly.
3. The vendor's test environment corresponds to the local environment.
4. The patch is installed correctly.

17 Formal Verification
Formal verification gives a mathematical proof that, given input i, the protection system P works as specified.
"I have invented an encryption algorithm, and I have proved it to be perfectly secure."
This is not a concept of trust; it is only a proof system built on some axioms. Important points:
- What are the assumptions of the proof?
- What is the final claim of the proof?
- Does the final claim match the context in which the result is used?
- How complex is the proof?

18 Why Believe in Formal Methods?
The proof-based argument rests on assumptions:
1. Proofs have no errors. (There are infamous examples of incorrect proofs.)
2. Preconditions hold in the actual environment. (Not true in general.)
3. The specification is transformed into executable code whose actions follow the source code. (Hard to guarantee: compiler bugs, linker/loader/library problems. See the real story of Ken Thompson's compiler back door in Appendix 2.)
4. There are no hardware faults and the hardware executes the program as intended. (Hardware bugs exist, e.g., the Pentium f00f bug: http://www.x86.org/errata/dec97/f00fbug.htm)

19 Overview
- Policies
- The Role of Trust
- Types of Access Control
- Policy Expression Languages
- Limits on Precise Security Mechanisms

20 Common Types of Access Control
Discretionary Access Control (DAC, IBAC):
- An individual user sets the access control mechanism to allow or deny access to an object.
- Modeled with the access control matrix (see the sketch below).
Mandatory Access Control (MAC):
- A system mechanism controls access to objects; individuals cannot alter that access.
- Used in military and governmental settings, reflecting strictly hierarchical organizations.
- Modeled with lattices.
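A minimal sketch of the access control matrix behind DAC, with made-up subjects, objects, and rights; the "discretionary" part shows up as the rule that only a subject holding the "own" right may change entries:

# Minimal DAC access-control-matrix sketch (illustrative names and rights).
# Rows: subjects; columns: objects; entries: sets of rights.
acm = {
    ("alice", "grades.txt"): {"read", "write", "own"},
    ("bob",   "grades.txt"): {"read"},
}

def check(subject, obj, right):
    return right in acm.get((subject, obj), set())

def grant(owner, subject, obj, right):
    # Discretionary: only a subject holding "own" may change the matrix.
    if check(owner, obj, "own"):
        acm.setdefault((subject, obj), set()).add(right)

print(check("bob", "grades.txt", "write"))    # False
grant("alice", "bob", "grades.txt", "write")  # alice owns the file
print(check("bob", "grades.txt", "write"))    # True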

21 Role-Based Access Control
Popular in the business and military worlds.
Three basic entities: subjects, roles, permissions.
Two basic mappings: subject-to-role and role-to-permission. A subject gets all permissions assigned to its roles; constraints are taken as binary.
[Figure: Subjects mapped to Roles via the subject-to-role mapping, Roles mapped to Permissions via the role-to-permissions mapping]
(See the sketch below.)
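A minimal RBAC sketch with made-up role and permission names, showing the two mappings and the derived subject-to-permission check:

# Minimal RBAC sketch (illustrative role and permission names).
subject_roles = {"alice": {"instructor"}, "bob": {"student"}}
role_perms = {
    "instructor": {"read_grades", "write_grades"},
    "student":    {"read_own_grades"},
}

def permissions(subject):
    # A subject gets the union of the permissions of all its roles.
    return set().union(*(role_perms[r] for r in subject_roles.get(subject, set())))

def allowed(subject, perm):
    return perm in permissions(subject)

print(allowed("alice", "write_grades"))  # True
print(allowed("bob", "write_grades"))    # False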

22 Types of Access Control: Another Popular Concept
Originator Controlled Access Control (ORCON):
- The originator (creator/owner) of information controls who can access it.
- The "own" right no longer implies control of rights.
- Examples: copyright, patents.
Some things to remember: delegation of rights, ownership, and the right to recall the system or its rights.

23 Other Issues in Access Control
- Identity-based systems vs. attribute-based systems
- Credential-based access control (different from capability-based systems)
- Distributed access control: certificate schemas, chasing credential chains, etc.
- Federations and loosely coupled organizations; some aspects of Kerberos

24 Overview
- Policies
- The Role of Trust
- Types of Access Control
- Policy Expression Languages
- Limits on Precise Security Mechanisms

25 Why Policy Expression Languages?
Natural-language policies are ambiguous.
Example: "a student may not copy another student's homework." What does "copy" mean? And what if the due date is past?
Expressing policies unambiguously requires precise languages. Options:
- Mathematical logic
- Programming-like languages
- Rule languages

26 Policy Languages at Different Levels
High-level languages: policies expressed abstractly; entities of the enforcement mechanism are not part of the language syntax.
Low-level languages: policy constraints expressed in terms of program options, input, or specific system characteristics; more closely tied to enforcement mechanisms.
Borrowing from programming languages: we need to pass context and scope independently of the enforcement level, i.e., the translation should be fully abstract.

27 From High Level to Low Level
Policy refinement: enforcement requires policies to be translated into a formal syntax describing the operational semantics of the enforcement environment.
Example:
- High level: every student can read his/her own data.
- Low level: written as an ACL or some other constraint on process behavior.
[Figure: high-level policies, applying to selected or all environments, translated into low-level operational constraints or metadata-plus-behavior specifications; the translation procedure can be parameterized by policies]

28 Purity Measures on Refinement
Full abstraction: suppose P and Q are higher-level policies and F : (higher level) -> (lower level) is a refinement. Then F is fully abstract if
  P = Q iff F(P) ~ F(Q)
where = is equality of higher-level policies and ~ is equality of lower-level policies. (This is a definition cooked for this lecture; for the proper definition from programming languages see http://www.fabfac.org/intro.html)
Compositionality: F is compositional iff
  F(P ∪ Q) ~ F(P) ∪ F(Q)
(See the sketch below.)
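Both properties can be sanity-checked on a toy refinement. Here F maps a high-level policy, modeled as a set of allowed (subject, object) pairs, to a set of low-level ACL strings; everything is cooked up for illustration, like the definitions above:

# Toy illustration of full abstraction and compositionality (not a real tool).
# A high-level policy is a set of allowed (subject, object) pairs; the
# refinement F maps it to a set of low-level ACL strings.

def F(policy):
    return {f"allow {subj}:{obj}" for (subj, obj) in policy}

P = {("alice", "grades"), ("bob", "notes")}
Q = {("bob", "notes"), ("alice", "grades")}

# Full abstraction: P = Q iff F(P) ~ F(Q)  (here ~ is set equality).
print((P == Q) == (F(P) == F(Q)))   # True for this F

# Compositionality: F(P ∪ Q) ~ F(P) ∪ F(Q).
print(F(P | Q) == F(P) | F(Q))      # True: this F distributes over union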

29 Example 1: A Policy Language for Java
Goal: restrict the actions of Java programs that are downloaded and executed by a web browser.
The language expresses constraints as conditions restricting the creation of a Java class or the invocation of its methods.
It is independent of enforcement mechanisms (Windows and Unix would need different mechanisms).

30 Example 1: Syntax of the Language
Subjects (objects) are classes and methods.
- Class: file, socket
- Method: file.read()
Operations:
- Instantiation creates an instance of a class, e.g. -| socket
- Invocation executes a method, e.g. |-> file.read
Access constraints have the form deny(s op s') when b:
- While b is true, subject s cannot perform op on (subject or object) s'.
- An empty s means all subjects.

31 Example 1: Application Scenarios
A downloaded Java program cannot access the password file:
  deny(|-> file.read) when (file.getFileName() == "/etc/passwd")
The program cannot open a network connection when there are already 100 or more connections:
  deny(-| socket) when (network.numberOfConns >= 100)

32 Example 2: DTEL
Domain-type enforcement language (DTEL). A policy has three parts:
1. Assign processes (principals/subjects) to domains.
2. Assign files (objects) to types.
3. Specify the rights that domains have over types.
The model types all files; access granularity is restricted to type names.

33 Example 2: DTEL Syntax
Domains:
- d_user (ordinary user processes)
- d_admin (administrator processes)
- d_login (login processes)
- d_daemon (system daemons)
Types:
- t_sysbin (executable system files)
- t_readable (readable files)
- t_writable (writable files)
- t_dte (data used by DTEL)
- t_generic (data generated by user processes)

34 Example 2: DTEL Syntax (Cont'd)
Policy in English: only administrator processes can write to system binaries; others cannot.

domain d_admin = (/usr/bin/sh, /usr/bin/csh, /usr/bin/ksh),
                 (crwxd->t_generic),
                 (crwxd->t_readable, t_writable, t_dte, t_sysbin),
                 (sigtstp->d_daemon);

(The last line says a process in d_admin can suspend a daemon process.)

domain d_user = (/usr/bin/sh, /usr/bin/csh, /usr/bin/ksh),
                (crwxd->t_generic),
                (rxd->t_sysbin),
                (crwd->t_writable),
                (rd->t_readable, t_dte);

35 Example 3: X Window Access Policy
In the UNIX X11 windowing system, access to the X11 display is controlled by a list saying which hosts are allowed or disallowed access:
  xhost +groucho -chico
- Connections from host groucho are allowed.
- Connections from host chico are not allowed.
Property of the syntax: it allows both permissions and prohibitions, that is, positive (+) and negative (-) permissions on accesses.

36 Example 4: Policy in English
GMU Responsible Use of Computing Policy: http://www.gmu.edu/catalog/0001/genpoli2.html

37 XACML: Specifying Access Control in XML
Defined by an OASIS Technical Committee, XACML is a markup language for specifying access control over XML-formatted documents.
The example on the next slide is taken from http://www.idealliance.org/papers/dx_xmle04/papers/04-01-04/04-01-04.html

38 Example XACML rule: "John can open the door." (abbreviated; closing tags and some wrapper elements were dropped on the slide)
<Rule RuleId="" Effect="Permit">
  <SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
    <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">John</AttributeValue>
    <SubjectAttributeDesignator
        AttributeId="urn:oasis:names:tc:xacml:1.0:subject:subject-id"
        DataType="http://www.w3.org/2001/XMLSchema#string"/>
  </SubjectMatch>
  <ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:anyURI-equal">
    ... resource: door ...
  </ResourceMatch>
  ... action: open ...
</Rule>

39 Precise Enforcement of Policies
Once we have a policy, does there always exist a mechanism to enforce it? If so, can we devise a generic procedure for developing such mechanisms?
[Figure: the set of states reachable with mechanisms, labeled "secure" and "precise", drawn inside the set of secure states]

40 The A. Jones + R. Lipton Model
A program p is modeled as a function p: I1 × I2 × ... × In → R.
Assumption on observability: all information available about the inputs is encoded in the output p(i1, i2, ..., in).
A protection mechanism for p: I1 × I2 × ... × In → R is a function m such that, for every input,
  m(i1, i2, ..., in) = p(i1, i2, ..., in)  or  m(i1, i2, ..., in) ∈ E
That is, m produces either the same output as p or an error.

41 A Model of Mechanisms
The objective is to secure a program p that takes inputs i1, i2, ..., in and outputs some result r.
A protection mechanism m takes the same inputs and outputs either the same r or some error e.
[Figure: the set of states reachable without mechanisms vs. the set of secure states]
(See the sketch below.)
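A minimal sketch of such a mechanism m wrapping a program p. The names (p, m, ERROR) and the password-checking scenario are illustrative choices, not from Jones and Lipton:

# Minimal sketch of a protection mechanism m for a program p (illustrative).
# m must return either exactly p's output or an element of the error set E.

ERROR = "E: access denied"   # a distinguished error output

def p(user, password, db):
    # the program being protected: password authentication
    return "good" if db.get(user) == password else "bad"

def m(user, password, db):
    # the mechanism: refuse queries about accounts outside the database
    if user not in db:
        return ERROR              # error output, m(i) in E
    return p(user, password, db)  # otherwise identical to p, m(i) = p(i)

db = {"alice": "secret"}
print(m("alice", "secret", db))   # "good": same output as p
print(m("mallory", "x", db))      # the error output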

42 The A. Jones + R. Lipton Model (Cont'd)
Definition: a confidentiality policy for p: I1 × I2 × ... × In → R is a function c: I1 × I2 × ... × In → A, where A is a subset of I1 × I2 × ... × In.
Definition: a security mechanism m is secure with respect to a confidentiality policy c iff there is a function m': A → R ∪ E satisfying
  m(i1, i2, ..., in) = m'(c(i1, i2, ..., in))
Example: consider a password-checking function auth over a database Db with output {good, bad}:
  auth: U × P × Db → {good, bad}
where Db contains the (u, pwd) pairs that are allowed. Take the confidentiality policy allow(i1, i2, i3) = (i1, i2), i.e., the output may reveal only the supplied user and password, not the database. Then there is NO function auth' satisfying
  auth'(allow(i1, i2, i3)) = auth'(i1, i2) = auth(i1, i2, i3)
because auth's output genuinely depends on i3. (See the sketch below.)
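Over tiny finite input spaces, the condition "m's output is a function of c's output alone" can be checked by enumeration, which makes the impossibility in the example visible. A toy sketch with made-up users, passwords, and two candidate databases:

# Illustrative check of the Jones-Lipton security condition m = m' . c
# over tiny finite input spaces (names and values are made up).

from itertools import product

U  = ["alice", "bob"]                            # users
P  = ["pw1", "pw2"]                              # passwords
Db = [(("alice", "pw1"),), (("alice", "pw2"),)]  # two possible databases

def auth(u, pw, db):
    return "good" if (u, pw) in db else "bad"

def allow(u, pw, db):
    # the confidentiality policy c: expose only (u, pw)
    return (u, pw)

def secure(mechanism, policy):
    # m is secure w.r.t. c iff m(i) is a function of c(i) alone:
    # equal policy values must never map to different mechanism outputs.
    table = {}
    for i in product(U, P, Db):
        key, out = policy(*i), mechanism(*i)
        if table.setdefault(key, out) != out:
            return False   # same c(i), different m(i): no m' can exist
    return True

print(secure(auth, allow))  # False: auth's output depends on the database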

43 Precision
Mechanisms for enforcing policies are typically overly restrictive.
Let m1 and m2 be distinct mechanisms for program p under the same policy. m1 is as precise as m2 (written m1 ≈ m2) if, for all inputs i1, ..., in,
  m2(i1, ..., in) = p(i1, ..., in)  implies  m1(i1, ..., in) = p(i1, ..., in)
[Figure: within the set of secure states, the states on which m1 agrees with p contain those on which m2 does]

44 Combining Mechanisms
Define m3 = m1 ∪ m2 as follows: on inputs where m1 or m2 outputs the same value as p, m3 does also; otherwise m3 returns the same value as m1.
Theorem: if m1 and m2 are secure, then m3 is secure; moreover, m3 ≈ m1 and m3 ≈ m2.
(See the sketch below.)
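A toy illustration of the union construction; p, m1, and m2 are made up, and the point is only that m3 agrees with p wherever either component does:

# Illustrative combination m3 = m1 ∪ m2 of two mechanisms (toy setup).

ERROR = "E"

def p(x):
    return x * x                        # the unprotected program

def m1(x):
    return p(x) if x >= 0 else ERROR    # precise on non-negative inputs

def m2(x):
    return p(x) if x <= 0 else ERROR    # precise on non-positive inputs

def m3(x):
    # If either mechanism matches p on this input, m3 does too;
    # otherwise m3 falls back to m1's output.
    if m1(x) == p(x) or m2(x) == p(x):
        return p(x)
    return m1(x)

# m3 agrees with p wherever m1 or m2 does, so m3 ≈ m1 and m3 ≈ m2.
print([m1(x) for x in (-2, 0, 2)])  # ['E', 0, 4]
print([m2(x) for x in (-2, 0, 2)])  # [4, 0, 'E']
print([m3(x) for x in (-2, 0, 2)])  # [4, 0, 4]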

45 Existence Theorem
For any program p and security policy c, there exists a maximally precise, secure mechanism m* such that, for all secure mechanisms m associated with p and c, m* ≈ m.
Construction: m* = m1 ∪ m2 ∪ ..., the union over all secure mechanisms mi.
[Figure: the union of the mechanisms' agreement sets inside the set of secure states]

46 Lack of an Effective Procedure
Theorem: there is no effective procedure that determines a maximally precise, secure mechanism for an arbitrary policy and program.
The proof is analogous to standard undecidability proofs.
However, it is possible to obtain a maximally precise secure mechanism in specific cases.

47 Key Points
- Policies describe what is and is not allowed.
- Trust underlies everything.
- Common access control types: DAC, MAC (and ORCON).
- Formal languages are required to specify policies precisely.
- Precise enforcement of policies is generally difficult.

48 Appendix 1: Fake Windows Patch Is a Windows Killer
(Source: http://www.pcmag.com/article2/0,1895,1853366,00.asp)
From: update@microsoft.com
Subject: What You Need to Know About the Zotob.A Worm
What You Should Know About Zotob
Published: August 14, 2005 | Updated: August 19, 2005
Severity: Virus (Green)
Supported software affected: Windows, all versions
Microsoft Security Advisory 899588: Zotob.A, Zotob.B, Zotob.C, Zotob.D, Zotob.E, Bobax.O, Esbot.A, Rbot.MA, Rbot.MB, Rbot.MC
The attachment is named MS05-039.EXE. It is 21,229 bytes and is compressed with the MEW program. When the attachment is executed, it first downloads a second Trojan program, Agent.AII, and executes it. This program downloads additional malware which logs keystrokes and accesses multiple web sites. It also attempts to modify the settings of security programs on the user's computer.
Zotob is a worm that targets all Windows computers and takes advantage of a security issue that was addressed by Microsoft Security Bulletin MS05-039. This worm installs malicious software, and then searches for other computers to infect. If you have installed the update released with Security Bulletin MS05-039, you are protected from Zotob and its variants. If you are using any supported version of Windows, you are not at risk.

49 Appendix 2: True Story about a Back Door
Ken Thompson's 1983 Turing Award lecture to the ACM admitted the existence of a back door in early Unix versions that may have qualified as the most fiendishly clever security hack of all time. In this scheme, the C compiler contained code that would recognize when the 'login' command was being recompiled, and would insert code recognizing a password chosen by Thompson. So the compiled Unix system had a back door even though the source code was clean. More amazingly, Thompson also arranged for the compiler to recognize when it was compiling a version of itself, and to insert into the recompiled compiler both the code needed to plant the password back door and the code to recognize itself and do the whole thing again the next time around. Consequently, anyone who suspected the compiler and attempted to recompile it from clean source had to use the hacked compiler to do so, which of course produced a hacked compiler again. The hack perpetuated itself invisibly, leaving the back door in place and active but with no trace in the sources. (See the full story at http://www.acm.org/classics/sep95/)

