CSCE 727 Awareness and Training Secure System Development and Monitoring.

CSCE 727 Awareness and Training Secure System Development and Monitoring

Information Warfare - Farkas
Reading
Reading for this lecture:
– Rainbow Series Library
– Common Criteria
– NIST, Guide for Applying the Risk Management Framework to Federal Information Systems, 2010
– D. Young, "Huawei, ZTE Banned From Selling to U.S. Government"

SYSTEM CERTIFICATION

Reading list
– Denning: Chapter 14
– Rainbow Series Library
– Common Criteria

Building It Secure
1960s: US Department of Defense (DoD) recognizes the risk of unsecured information systems
1970s:
– 1977: DoD Computer Security Initiative
– US Government and private concerns
– National Bureau of Standards (NBS, now NIST)
  Responsible for standards for the acquisition and use of federal computing systems
  Federal Information Processing Standards (FIPS PUBs)

NBS
Two initiatives for security:
– Cryptography standards
  1973: invitation for technical proposals for ciphers
  1977: Data Encryption Standard
  2001: Advanced Encryption Standard (NIST)
– Development and evaluation processes for secure systems
  Conferences and workshops involving researchers, system builders, vendors, software developers, and users
  1979: Mitre Corporation entrusted to produce an initial set of criteria for evaluating the security of a system handling classified data

National Computer Security Center
1981: National Computer Security Center (NCSC) established within NSA
– To provide technical support and serve as a reference for government agencies
– To define a set of criteria for the evaluation and assessment of security
– To encourage and perform research in the field of security
– To develop verification and testing tools
– To increase security awareness in both the federal and private sectors
1985: Trusted Computer System Evaluation Criteria (TCSEC), known as the Orange Book

Orange Book
Objectives:
– Guide what security features to build into new products
– Provide a yardstick for evaluating the security of systems
– Serve as a basis for specifying security requirements
Covers both security features and assurances.
Trusted Computing Base (TCB): the security-relevant components of the system (hardware, software, and firmware) plus the reference monitor.

What the Orange Book Supplies
– Users: evaluation metrics to assess how reliably a system protects classified or sensitive information, whether it is a commercial product or an internally developed system
– Developers/vendors: a design guide identifying the security features to include in commercial systems
– Designers: a guide for specifying security requirements

Orange Book
A set of criteria and requirements in three main categories:
– Security policy: the protection level offered by the system
– Accountability: of the users and user operations
– Assurance: of the reliability of the system

Security Policy
Concerns the definition of the policy regulating users' access to information:
– Discretionary Access Control
– Mandatory Access Control
– Labels: for objects and subjects
– Object reuse: basic storage elements must be cleared before being released to a new user
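The two access-control styles above can be contrasted in a short sketch (illustrative only, not from the slides; the level names and helper functions are assumptions): mandatory control compares security labels by dominance, while discretionary control consults an owner-managed ACL, and a request must pass both checks.

```python
# Illustrative sketch of TCSEC-style access control:
# MAC via label dominance plus DAC via per-object ACLs.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def dominates(subject_label, object_label):
    """MAC read check (simple-security rule): the subject's level must be at
    least the object's, and its compartments must cover the object's."""
    s_level, s_comps = subject_label
    o_level, o_comps = object_label
    return LEVELS[s_level] >= LEVELS[o_level] and s_comps >= o_comps

def dac_allows(acl, user, right):
    """DAC check: the object's ACL explicitly grants the user the right."""
    return right in acl.get(user, set())

# A read is granted only if BOTH policies allow it.
acl = {"alice": {"read", "write"}}
subject = ("SECRET", {"crypto"})
obj = ("CONFIDENTIAL", {"crypto"})
print(dominates(subject, obj) and dac_allows(acl, "alice", "read"))  # True
```

Note that the MAC decision never consults the ACL: even the object's owner cannot grant access across labels, which is exactly what distinguishes mandatory from discretionary control.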

Accountability
– Identification/authentication
– Audit
– Trusted path: a protected channel between the user and the TCB, guaranteeing that users are not tricked into authenticating to a spoofed login program

Assurance
– Reliable hardware/software/firmware components that can be evaluated separately
– Operational reliability
– Development reliability

Operational Reliability
During system operation:
– System architecture: the TCB is isolated from user processes, and the security kernel is isolated from non-security-critical portions of the TCB
– System integrity: correct operation (verified with diagnostic software)
– Covert channel analysis
– Trusted facility management: separation of duties
– Trusted recovery: security features are restored after TCB failures

Development Reliability
The system must be shown reliable during the development process, using formal methods:
– System testing: security features tested and verified
– Design specification and verification: design and implementation are correct with respect to the security policy; the TCB's formal specifications are proved
– Configuration management: the configuration of the system components and its documentation
– Trusted distribution: no unauthorized modifications between evaluation and delivery

Documentation
A defined set of documents; minimal set:
– Trusted facility manual
– Security features user's guide
– Test documentation
– Design documentation
– Personnel information: operators, users, developers, maintainers

Orange Book Levels
Highest Security
– A1 Verified Protection
– B3 Security Domains
– B2 Structured Protection
– B1 Labeled Security Protection
– C2 Controlled Access Protection
– C1 Discretionary Security Protection
– D Minimal Protection
No Security
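Because the classes form a strict ladder from D up to A1, "does this product meet at least B1?" reduces to an index comparison. A tiny sketch (the helper name is an illustrative assumption):

```python
# TCSEC classes ordered from lowest to highest assurance, as on the slide.
TCSEC_ORDER = ["D", "C1", "C2", "B1", "B2", "B3", "A1"]

def meets(rating, required):
    """True if `rating` provides at least the assurance of `required`."""
    return TCSEC_ORDER.index(rating) >= TCSEC_ORDER.index(required)

print(meets("B2", "B1"))  # True
print(meets("C2", "B1"))  # False
```

The same total-order pattern applies to the Common Criteria EALs discussed later (EAL1 through EAL7).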

Orange Book
– C1, C2: simple enhancements of existing systems; do not break applications
– B1: relatively simple enhancement of an existing system; may break some applications
– B2: major enhancement of existing systems; will break many applications
– B3: essentially an A1-style design without full formal verification (informally, a "failed A1")
– A1: top-down design and implementation of a new system from scratch

NCSC Rainbow Series
– Orange: Trusted Computer System Evaluation Criteria
– Yellow: Guidance for applying the Orange Book
– Red: Trusted Network Interpretation
– Lavender: Trusted Database Interpretation

Evaluation Process
Preliminary technical review (PTR)
– Preliminary technical report: architecture and potential for the target rating
Vendor assistance phase (VAP)
– Review of the documentation needed for the evaluation process, e.g., security features user's guide, trusted facility manual, design documentation, test plan. For B or higher ratings, additional documentation is needed, e.g., covert channel analysis, formal model, etc.
Design analysis phase (DAP)
– Initial product assessment report (IPAR): detailed information about the hardware, software architecture, security-relevant features, team assessments, etc.
– Technical Review Board
– Recommendation to the NCSC

Evaluation Process (cont.)
Formal evaluation phase (FEP)
– Product Bulletin: formal, public announcement
– Final Evaluation Report: information from the IPAR and testing results, additional tests, code review (B2 and up), formal policy model, proof
– Recommends a rating for the system
– The National Computer Security Center (NCSC) decides the final rating
Rating maintenance phase (RAMP)
– Minor changes and revisions are reevaluated under a rating maintenance plan

European Criteria
– German Information Security Agency: German Green Book (1988)
– British Department of Trade and Industry and Ministry of Defence: several volumes of criteria
– Canada, Australia, France: their own work on evaluation criteria
1991: Information Technology Security Evaluation Criteria (ITSEC)
– For the European Community
– Decoupled features from assurance
– Introduced new functionality requirement classes
– Accommodated commercial security requirements

Common Criteria
January 1996: Common Criteria
– Joint work with Canada and Europe
– Separates functionality from assurance
– Nine classes of functionality: audit, communications, user data protection, identification and authentication, privacy, protection of trusted functions, resource utilization, establishing user sessions, and trusted path
– Seven classes of assurance: configuration management, delivery and operation, development, guidance documents, life cycle support, tests, and vulnerability assessment

Common Criteria
Evaluation Assurance Levels (EAL)
– EAL1: functionally tested
– EAL2: structurally tested
– EAL3: methodically tested and checked
– EAL4: methodically designed, tested, and reviewed
– EAL5: semiformally designed and tested
– EAL6: semiformally verified design and tested
– EAL7: formally verified design and tested

National Information Assurance Partnership (NIAP)
1997: National Institute of Standards and Technology (NIST), National Security Agency (NSA), and industry
– Aims to improve the efficiency of evaluation
– Transfers methodologies and techniques to private-sector laboratories
– Functions: developing tests, test methods, and tools for evaluating and improving security products; developing protection profiles and associated tests; establishing a formal, international scheme for the CC

NATIONAL SECURITY ISSUES

National Security and IW
U.S. agencies responsible for national security run a large, complex information infrastructure.
1990: Defense Information Infrastructure (DoD). Supports:
– Critical war-fighting functions
– Peacetime defense planning
– Information for logistical support
– Defense support organizations
The "digitized battlefield" requires proper functioning of the information infrastructure.

National Security and IW
– Increased reliance on the information infrastructure
– Heavily connected to commercial infrastructure: 95% of DoD's unclassified communication travels over public networks
– Attacks respect no geographic boundaries, are cost-effective for the attacker, and are often ambiguous in origin

National Security and IW
Vital human services:
– Law enforcement
– Firefighters
– Emergency telephone system
– Federal Emergency Management Agency
Other government services and public utilities:
– Financial sector
– Transportation
– Communications
– Power
– Health system

Next Class
No class on Monday, April 4. Schedule back to normal on April 6.