9- 1 Last time ● User Authentication ● Beyond passwords ● Biometrics ● Security Policies and Models ● Trusted Operating Systems and Software ● Military and Commercial Security Policies

9- 2 This time ● Security Policies and Models ● Bell-La Padula and Biba Security Models ● Information Flow Control ● Trusted Operating System Design ● Design Elements ● Security Features

9- 3 Lattices ● The dominance relationship ≥ defined in the military security model is reflexive, transitive, and antisymmetric, i.e., a partial order ● For two levels a and b, neither a ≥ b nor b ≥ a might hold ● However, for every a and b, there is a least upper bound u for which u ≥ a and u ≥ b, and a greatest lower bound l for which a ≥ l and b ≥ l ● A partial order with these bounds defines a lattice ● There are also two elements U and L that dominate/are dominated by all levels ● In the example, U = (“Top Secret”, {“Soviet Union”, “East Germany”}) and L = (“Unclassified”, ∅)
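As a rough illustration (not from the slides), the dominance relation and its bounds over (level, compartment-set) labels can be sketched in a few lines of Python; the rank assigned to each level name is an assumption, apart from the levels appearing in the example:

    # Level ranks are assumed: a higher number means a more sensitive level.
    LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

    def dominates(a, b):
        """a >= b iff a's level is at least b's and a's compartments include b's."""
        (la, ca), (lb, cb) = a, b
        return LEVELS[la] >= LEVELS[lb] and ca >= cb   # set >= means superset

    def lub(a, b):
        """Least upper bound: higher of the two levels, union of compartments."""
        (la, ca), (lb, cb) = a, b
        return (la if LEVELS[la] >= LEVELS[lb] else lb, ca | cb)

    def glb(a, b):
        """Greatest lower bound: lower of the two levels, intersection of compartments."""
        (la, ca), (lb, cb) = a, b
        return (la if LEVELS[la] <= LEVELS[lb] else lb, ca & cb)

    a = ("Secret", {"Soviet Union"})
    b = ("Secret", {"East Germany"})
    print(lub(a, b))   # ("Secret", {"Soviet Union", "East Germany"})
    print(glb(a, b))   # ("Secret", set())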

9- 4 Example Lattice [Figure: Hasse diagram of the example lattice, with top element (“Top Secret”, {“Soviet Union”, “East Germany”}), intermediate elements (“Top Secret”, {“Soviet Union”}), (“Secret”, {“Soviet Union”, “East Germany”}), (“Secret”, {“Soviet Union”}), and (“Secret”, {“East Germany”}), and bottom element (“Unclassified”, ∅)]

9- 5 Bell-La Padula Confidentiality Model ● Regulates information flow in MLS policies, e.g., lattice-based ones ● Users should get information only according to their clearance ● Should subject s with clearance C(s) have access to object o with classification C(o)? ● ss-property (“no read up”): s should have read access to o only if C(s) ≥ C(o) ● *-property (“no write down”): s should have write access to o only if C(o) ≥ C(s) ● Comment: The version in the textbook is a little less strict, but this is how the *-property is typically defined
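A minimal sketch of a reference-monitor check enforcing both properties, reusing the dominates() helper above; the mode names are my own labels, not Bell-La Padula terminology:

    def blp_allows(subject_label, object_label, mode):
        """Bell-La Padula check: labels are (level, compartments) pairs as above."""
        if mode == "read":    # ss-property, "no read up": C(s) >= C(o)
            return dominates(subject_label, object_label)
        if mode == "write":   # *-property, "no write down": C(o) >= C(s)
            return dominates(object_label, subject_label)
        return False

    s = ("Top Secret", {"Soviet Union"})
    o = ("Confidential", set())
    print(blp_allows(s, o, "read"))    # True: reading down is allowed
    print(blp_allows(s, o, "write"))   # False: writing down is blocked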

9- 6 Example ● No read up is straightforward ● No write down avoids the following information leak: ● James Bond reads top-secret document and summarizes it in a confidential document ● Miss Moneypenny with clearance “confidential” now gets access to top-secret information ● In practice, subjects are really programs (acting on behalf of users) ● Else James Bond couldn’t even talk to Miss Moneypenny ● If program accesses top-secret information, OS ensures that it won’t be allowed to write to confidential file later ● Even if program does not leak information to confidential file ● Might need explicit declassification operation for usability purposes

9- 7 Biba Integrity Model ● Prevent inappropriate modification of data ● Dual of Bell-La Padula model ● Subjects and objects are ordered by an integrity classification scheme, I(s) and I(o) ● Should subject s have access to object o? ● Write access: s can modify o only if I(s) ≥ I(o) ● Unreliable person cannot modify file containing high integrity information ● Read access: s can read o only if I(o) ≥ I(s) ● Unreliable information cannot “contaminate” subject
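The dual checks can be sketched the same way; here integrity levels are assumed to be plain comparable values (e.g., integers, higher = more trustworthy):

    def biba_allows(subject_integrity, object_integrity, mode):
        """Biba strict integrity check (sketch), the dual of the sketch above."""
        if mode == "write":   # s may modify o only if I(s) >= I(o)
            return subject_integrity >= object_integrity
        if mode == "read":    # s may read o only if I(o) >= I(s)
            return object_integrity >= subject_integrity
        return False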

9- 8 Low Watermark Property ● Biba’s access rules are very restrictive: a subject can never view a lower-integrity object ● Can use dynamic integrity levels instead ● Subject Low Watermark Property: If subject s reads object o, then I(s) = glb(I(s), I(o)), where glb() = greatest lower bound ● Object Low Watermark Property: If subject s modifies object o, then I(o) = glb(I(s), I(o)) ● Integrity of subjects/objects can only go down, information flows down ● What kind of property does Bell-La Padula imply?
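A small sketch of the two low-watermark rules, again assuming integer integrity levels so that glb() is simply min(); the class names are illustrative only:

    class Subject:
        def __init__(self, integrity):
            self.integrity = integrity   # dynamic integrity level

    class Object:
        def __init__(self, integrity):
            self.integrity = integrity

    def low_watermark_read(s, o):
        """Subject Low Watermark: reading drags the subject down to glb(I(s), I(o))."""
        s.integrity = min(s.integrity, o.integrity)

    def low_watermark_write(s, o):
        """Object Low Watermark: writing drags the object down to glb(I(s), I(o))."""
        o.integrity = min(s.integrity, o.integrity)

    # Once a high-integrity subject reads a low-integrity object, anything it
    # later writes is marked low-integrity as well: information flows down.
    s, high, low = Subject(3), Object(3), Object(1)
    low_watermark_read(s, low)     # s.integrity drops from 3 to 1
    low_watermark_write(s, high)   # high.integrity drops from 3 to 1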

9- 9 Review of Bell-La Padula & Biba ● Both models are very simple, which makes it possible to prove properties about them ● E.g., can prove that if a system starts in a secure state, the system will remain in a secure state ● Probably too simple for great practical benefit ● Need declassification ● Need both confidentiality and integrity, not just one ● What about object creation? ● Information leaks might still be possible through covert channels

9- 10 Information Flow Control ● An information flow policy describes authorized paths along which information can flow ● For example, Bell-La Padula describes a lattice-based information flow policy ● In compiler-based information flow control, a compiler checks whether the information flow in a program could violate an information flow policy ● How does information flow from a variable x to a variable y? ● Explicit flow: E.g., y := x; or y := x / z; ● Implicit flow: If x = 1 then y := 0; else y := 1
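For illustration, both flows from the slide translate into Python as follows; note that the implicit flow leaks the tested fact about x into y even though x is never assigned to y directly:

    def explicit_flow(x, z):
        y = x / z      # y is computed directly from x (and z)
        return y

    def implicit_flow(x):
        if x == 1:     # no direct assignment from x to y,
            y = 0      # yet y reveals whether x == 1
        else:
            y = 1
        return y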

9- 11 Information Flow Control (cont.) ● See text for other sample statements ● Input and output variables of a program each have a (lattice-based) security classification S() associated with them ● For each program statement, the compiler verifies whether information flow is secure ● For example, x := y + z is secure only if S(x) ≥ lub(S(y), S(z)), where lub() is the least upper bound ● Program is secure if each of its statements is secure
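A compiler-style certification of a single assignment can be sketched with the lub() and dominates() helpers from the lattice sketch; S is assumed to be a mapping from variable names to labels:

    def assignment_is_secure(S, target, sources):
        """x := f(y, z, ...) is secure iff S(x) >= lub of all S(source)."""
        bound = ("Unclassified", set())   # bottom element of the example lattice
        for v in sources:
            bound = lub(S[v], bound)
        return dominates(S[target], bound)

    S = {"x": ("Secret", {"Soviet Union"}),
         "y": ("Confidential", set()),
         "z": ("Secret", {"Soviet Union"})}
    print(assignment_is_secure(S, "x", ["y", "z"]))   # True: S(x) >= lub(S(y), S(z))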

9- 12 Trusted System Design Elements ● Design must address which objects can be accessed, in what way, and by which subjects ● As defined in the security policy and model ● Security must be part of the design early on ● Hard to retrofit security, see Windows 95/98 ● Design principles for security ● Least privilege (sketched below) ● Operate using the fewest privileges possible ● Economy of mechanism ● Protection mechanism should be simple and straightforward ● Open design ● Avoid security by obscurity ● Rely on secret keys or passwords, but not on secret algorithms
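As one concrete (Unix-only) reading of the least-privilege principle, a server that needs root only to bind a privileged port can drop to an unprivileged user right afterwards; the uid/gid values here are placeholders:

    import os
    import socket

    def serve_with_least_privilege(unprivileged_uid, unprivileged_gid):
        # Privileged step: binding port 80 requires root on most Unix systems.
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.bind(("", 80))
        sock.listen(5)
        # Drop privileges as early as possible (group before user).
        os.setgid(unprivileged_gid)
        os.setuid(unprivileged_uid)
        return sock   # the rest of the server runs without root privileges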

9- 13 Security Design Principles (cont.) ● Complete mediation ● Every access attempt must be checked ● Permission based ● Default should be denial of access ● Separation of privileges ● Two or more conditions must be met to get access ● Least common mechanism ● Every shared mechanism could potentially be used as a covert channel ● Ease of use ● If protection mechanism is difficult to use, nobody will use it or it will be used in the wrong way

9- 14 Security Features of Trusted OS ● Identification and authentication ● See earlier ● Access control ● Object reuse protection ● Complete mediation ● Trusted path ● Accountability and audit ● Intrusion detection

9- 15 Access Control ● Mandatory access control (MAC) ● Central authority establishes who can access what ● Good for military environments ● For implementing Chinese Wall, Bell-La Padula, Biba ● Discretionary access control (DAC) ● Owners of an object have (some) control over who can access it ● You can grant others access to your home directory ● In UNIX, Windows,… ● Role-based access control (RBAC) is neither MAC nor DAC ● Possible to use a combination of these mechanisms
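A tiny Unix illustration of the difference (the file name is hypothetical): under DAC, the owner changes the permission bits at their own discretion, whereas under MAC a central policy, e.g. SELinux labels, decides and the owner cannot override it:

    import os
    import stat

    # DAC: the owner grants group read access to a file in their home directory.
    path = os.path.expanduser("~/notes.txt")   # hypothetical file
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)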

9- 16 Object Reuse Protection ● Alice allocates memory from OS and stores her password in this memory ● After using password, she returns memory to OS ● By calling free() or simply by exiting procedure if memory is allocated on stack ● Later, Bob happens to be allocated the same piece of memory and he finds Alice’s password in it ● OS should erase returned memory before handing it out to other users ● Defensive programming: Erase sensitive data yourself before returning it to OS ● Similar problem exists for files, registers and storage media
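A sketch of the defensive-programming advice above: keep the password in a mutable buffer and overwrite it before the memory is released; authenticate() is a hypothetical helper, and this is best-effort only (copies may still survive, e.g. in swap):

    def use_password():
        # A mutable bytearray (unlike an immutable str) can be overwritten in place.
        password = bytearray(b"hunter2")      # hypothetical secret
        try:
            authenticate(bytes(password))     # hypothetical helper
        finally:
            for i in range(len(password)):    # best-effort zeroization before release
                password[i] = 0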

9- 17 Hidden Data ● Hidden data is related to object reuse protection ● You think that you deleted some data, but it is still hidden somewhere ● Deleting a file will not physically erase the file on disk ● Deleting an email in GMail will not remove it from Google’s backups ● Deleting text in MS Word might not remove the text from the document ● Putting a black box over text in a PDF leaves the text in the PDF ● Shadow Copy feature of Windows Vista keeps file snapshots to enable restores

9- 18 Complete Mediation / Trusted Path ● Complete mediation ● All accesses must be checked ● Preventing access to OS memory is of little use if it is possible to access the swap space on disk ● Trusted path ● Give assurance to user that her keystrokes and mouse clicks are sent to legitimate receiver application ● Remember the fake login screen? ● Turns out to be quite difficult for existing desktop environments, both Linux and Windows ● Don’t run sudo if you have an untrusted application running on your desktop

9- 19 Recap ● Security Policies and Models ● Bell-La Padula and Biba Security Models ● Information Flow Control ● Trusted Operating System Design ● Design Elements ● Security Features

9- 20 Next time ● Trusted Operating System Design ● Security Features ● Trusted Computing Base ● Least Privilege in Popular OSs ● Assurance ● Security in Networks ● Network Concepts ● Threats in Networks ● Network Security Controls ● Firewalls ● Intrusion Detection Systems