
1 IS 2620: Developing Secure Systems Assurance and Evaluation Lecture 8 March 15, 2012

2 Reference: Matt Bishop, Chapters 18, 19, 21

3 Trust
- A trustworthy entity has sufficient credible evidence leading one to believe that the system will meet a set of requirements
- Trust is a measure of trustworthiness, relying on that evidence
- Assurance is confidence that an entity meets its security requirements, based on evidence provided by the application of assurance techniques
  - E.g., SDLC, formal methods, design analysis, testing

4 Relationships
- Evaluation standards:
  - Trusted Computer System Evaluation Criteria (TCSEC)
  - Information Technology Security Evaluation Criteria (ITSEC)
  - Common Criteria (CC)

5 Problem Sources (Neumann)
1. Requirements definitions, omissions, and mistakes
2. System design flaws
3. Hardware implementation flaws, such as wiring and chip flaws
4. Software implementation errors, program bugs, and compiler bugs
5. System use and operation errors and inadvertent mistakes
6. Willful system misuse
7. Hardware, communication, or other equipment malfunction
8. Environmental problems, natural causes, and acts of God
9. Evolution, maintenance, faulty upgrades, and decommissions

6 Types of Assurance
- Policy assurance: evidence establishing that the security requirements in the policy are complete, consistent, and technically sound
- Design assurance: evidence establishing that the design is sufficient to meet the requirements of the security policy
- Implementation assurance: evidence establishing that the implementation is consistent with the security requirements of the security policy
- Operational assurance: evidence establishing that the system sustains the security policy requirements during installation, configuration, and day-to-day operation

7 Waterfall Life Cycle Model

8 Other Models of Software Development
- Exploratory programming
  - Develop a working system quickly
  - No requirements or design specification, so low assurance
- Prototyping (similar to exploratory)
  - Objective is to establish system requirements
  - Future iterations (after the first) allow assurance techniques
- Formal transformation
  - Create a formal specification
  - Very conducive to assurance methods

9 Models
- System assembly from reusable components
  - Depends on whether the components are trusted
  - Must assure the connections and the composition as well
  - Very complex, difficult to assure
  - This is a common approach to building secure and trusted systems
- Extreme programming
  - Rapid prototyping and "best practices"
  - Project driven by business decisions
  - Requirements open until the project is complete
  - Components tested and integrated several times

10 Architectural considerations for assurance
- Determine the focus of control of the security enforcement mechanism
  - Operating system: focus is on data
  - Applications: more on operations/transactions
- Centralized or distributed?
  - Mechanisms can be distributed among systems/components
  - Tradeoffs? Generally easier to "assure" a centralized system
- Security mechanisms may exist in any layer

11 Architectural considerations
- Example: four-layer architecture
  - Application layer: transaction control
  - Services/middleware layer: support services for applications (e.g., DBMS, object request brokers)
  - Operating system layer: memory management, scheduling, and process control
  - Hardware layer: includes firmware

12 Trusted Computing Base (security an integral part)
- Reference monitor (total mediation!)
  - Mediates all accesses to objects by subjects
- The reference validation mechanism must be:
  1. Tamperproof
  2. Never bypassed
  3. Small enough to be subject to analysis and testing, so that its completeness can be assured
- Security kernel: hardware + software implementing a reference monitor
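The reference monitor idea on this slide can be sketched in a few lines. This is an illustrative toy, not code from the lecture: the `ReferenceMonitor` class and its ACL-based policy are invented here to show total mediation through one small, analyzable check.

```python
# Toy reference monitor: every subject/object access goes through one
# small check function (total mediation). The ACL policy is an
# illustrative assumption, not part of the reference monitor concept.

class ReferenceMonitor:
    """Mediates all accesses to objects by subjects."""

    def __init__(self, acl):
        # acl maps (subject, object) -> set of permitted rights, e.g. {"read"}
        self._acl = acl

    def check(self, subject, obj, right):
        # The single, compact decision point: easy to analyze and test.
        return right in self._acl.get((subject, obj), set())


acl = {("alice", "payroll.db"): {"read"}}
rm = ReferenceMonitor(acl)
print(rm.check("alice", "payroll.db", "read"))   # True
print(rm.check("alice", "payroll.db", "write"))  # False
```

Keeping the decision logic this small is exactly what property 3 above asks for: a mechanism compact enough that its completeness can be argued.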

13 Trusted Computing Base
- The TCB consists of all protection mechanisms within a computer system that are responsible for enforcing a security policy
- The TCB monitors four basic interactions:
  - Process activation
  - Execution domain switching
  - Memory protection
  - I/O operations
- A unified TCB may be too large
  - Solution: create a security kernel

14 Techniques for Design Assurance
- Modularity & layering
  - Well-defined, independent modules simplify the system and make it more understandable
  - Data hiding makes modules easy to understand and analyze
  - Different layers capture different levels of abstraction:
    - Subsystem (memory management, I/O subsystem, credit-card processing function)
    - Subcomponent (I/O management, I/O drivers)
    - Module: a set of related functions and data structures
- Use principles of secure design
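The data-hiding bullet above can be made concrete with a small sketch. The `Account` class and its operations are invented for illustration: the point is only that internal state is reachable solely through a narrow, checkable interface.

```python
# Data-hiding sketch: internal state is private to the module and can
# only change through a small interface that enforces invariants.

class Account:
    def __init__(self, balance):
        self.__balance = balance  # name-mangled: hidden from casual access

    def deposit(self, amount):
        # The invariant (no non-positive deposits) is enforced at the
        # only place the hidden state can change.
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.__balance += amount

    def balance(self):
        return self.__balance


acct = Account(100)
acct.deposit(50)
print(acct.balance())  # 150
```

Because the state is only touched inside the class, an analyst can check the invariant by reading two short methods instead of the whole program, which is the assurance payoff of data hiding.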

15 Design meets requirements?
- Techniques are needed to prevent requirements and functionality from being discarded, forgotten, or ignored at lower levels of design
- Requirements tracing: the process of identifying the specific security requirements that are met by parts of a description
- Informal correspondence: the process of showing that a specification is consistent with an adjacent level of specification

16 Requirement mapping and informal correspondence
  Security Functional Requirements → External Functional Specification → Internal Design Specification → Implementation Code
  (Requirement tracing links the security requirements to each level; informal correspondence links adjacent levels of specification)

17 Design meets requirements?
- Informal arguments
  - Protection profiles: define threats to systems and security objectives, and provide a rationale (an argument)
  - Security targets: identify mechanisms and provide justification
- Formal methods: proof techniques
  - Formal proof mechanisms are usually based on logic (predicate calculus)
- Review: used when an informal assurance technique is applied
  - Reviews of guidelines
  - Conflict resolution methods
  - Completion procedures

18 Implementation considerations for assurance
- Modularity with minimal interfaces
- Language choice (e.g., C vs. Java)
- Configuration management tools
  - Control the refinement and modification of configuration items such as source code, documentation, etc.

19 Implementation meets Design?
- Security testing
  - Functional testing (FT, black-box testing): testing an entity to determine how well it meets its specification
  - Structural testing (ST, white-box testing): testing based on an analysis of the code to develop test cases
- Testing occurs at different times
  - Unit testing (usually ST): testing a code module before integration
  - System testing (FT): on integrated modules
  - Security testing: product security
    - Security functional testing (against security issues)
    - Security structural testing (security implementation)
    - Security requirements testing
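The black-box vs. white-box distinction above can be shown on a tiny example. The function under test and its thresholds are invented for this sketch; the contrast is in where the test cases come from.

```python
# Contrast between functional (black-box) and structural (white-box)
# test derivation on an invented password-strength check.

def is_strong(password):
    # Spec (assumed for the example): at least 8 chars AND one digit.
    return len(password) >= 8 and any(c.isdigit() for c in password)

# Functional (black-box) tests: derived from the specification alone,
# without looking at the code.
assert is_strong("s3curepass") is True
assert is_strong("short1") is False

# Structural (white-box) test: derived by reading the code's branches --
# this input exercises the "length ok but no digit" path specifically,
# which a spec-only tester might not think to cover.
assert is_strong("longenough") is False

print("all tests passed")
```

Unit tests in the sense above would run these checks on the module before integration; system-level functional tests would exercise the same behavior through the integrated product's interface.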

20 Code development and testing (flow diagram)
  Development flow: code → unit test → integrate → build system → execute system; code bugs feed back into coding
  Test flow: code tests → test the tests on the current build → integrate tested tests into the automated test suite → build test suite

21 Operation and maintenance assurance
- Bugs found in the operational phase need fixing
- Hot fix: an immediate fix, for bugs that are serious and critical
- Regular fix: for less serious bugs, or as a long-term solution after a hot fix

22 Evaluation (courtesy of Professors Chris Clifton & Matt Bishop)

23 What is Formal Evaluation?
- A method to achieve trust
  - Not a guarantee of security
- An evaluation methodology includes:
  - Security requirements
  - Assurance requirements showing how to establish that the security requirements are met
  - Procedures to demonstrate that the system meets the requirements
  - Metrics for results (level of trust)
- Examples: TCSEC (Orange Book), ITSEC, CC

24 Formal Evaluation: Why?
- Organizations require assurance
  - Defense
  - Telephone / utilities
  - "Mission-critical" systems
- Formal verification of entire systems is not feasible
- Instead, organizations develop formal evaluation methodologies
  - Products passing evaluation are trusted
  - Evaluation may be required to do business with the organization

25 Mutual Recognition Arrangement
- The National Information Assurance Partnership (NIAP), in conjunction with the U.S. State Department, negotiated a Recognition Arrangement that:
  - Provides recognition of Common Criteria certificates by 24 nations (was 19 in 2005)
  - Eliminates the need for costly security evaluations in more than one country
  - Offers excellent global market opportunities for the U.S. IT industry

26 An Evolutionary Process
Two decades of research and development:
- US-DOD TCSEC (1983-85)
- European national/regional initiatives (1989-93)
- Canadian initiatives (1989-93)
- US-NIST MSFR (1990)
- Europe ITSEC (1991)
- Federal Criteria (1992)
- Canada TCPEC (1993)
- Common Criteria (1993-98)
- ISO 15408 Common Criteria (1999)

27 Common Criteria: Origin

28 TCSEC
- Known as the Orange Book, DoD 5200.28-STD
- Four trust rating divisions (classes):
  - D: Minimal protection
  - C (C1, C2): Discretionary protection
  - B (B1, B2, B3): Mandatory protection
  - A (A1): Highly secure

29 TCSEC: The Original
- Trusted Computer System Evaluation Criteria
  - U.S. Government security evaluation criteria
  - Used for evaluating commercial products
- Policy model based on Bell-LaPadula
- Enforcement: reference validation mechanism
  - Every reference checked by a compact, analyzable body of code
- Emphasis on confidentiality
- Metric: seven trust levels: D, C1, C2, B1, B2, B3, A1
  - D is "tried but failed"
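The Bell-LaPadula policy model mentioned above has two core rules that fit in a few lines. This sketch assumes a simple linear ordering of classification levels; the level names are the usual illustrative ones, not taken from the slides.

```python
# Sketch of Bell-LaPadula's two core rules over a linear lattice of
# classification levels (an assumption for this demo).

LEVELS = ["UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]
RANK = {name: i for i, name in enumerate(LEVELS)}

def can_read(subject_level, object_level):
    # Simple security property: "no read up" -- the subject must
    # dominate the object's level.
    return RANK[subject_level] >= RANK[object_level]

def can_write(subject_level, object_level):
    # *-property: "no write down" -- writing is only allowed to an
    # object at or above the subject's level.
    return RANK[subject_level] <= RANK[object_level]

print(can_read("SECRET", "CONFIDENTIAL"))   # True  (read down is fine)
print(can_read("SECRET", "TOP SECRET"))     # False (no read up)
print(can_write("SECRET", "CONFIDENTIAL"))  # False (no write down)
```

Together the two rules ensure information can only flow upward in the lattice, which is why the model (and TCSEC with it) emphasizes confidentiality rather than integrity.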

30 TCSEC Class Assurances
- C1: Discretionary protection
  - Identification
  - Authentication
  - Discretionary access control
- C2: Controlled access protection
  - Object reuse and auditing
- B1: Labeled security protection
  - Mandatory access control on a limited set of objects
  - Informal model of the security policy

31 TCSEC Class Assurances (continued)
- B2: Structured protection
  - Trusted path for login
  - Principle of least privilege
  - Formal model of the security policy
  - Covert channel analysis
  - Configuration management
- B3: Security domains
  - Full reference validation mechanism
  - Constraints on the code development process
  - Documentation and testing requirements
- A1: Verified protection
  - Formal methods for analysis and verification
  - Trusted distribution

32 How is Evaluation Done?
- Government-sponsored independent evaluators
- Application: determine if the government cares
- Preliminary technical review
  - Discussion of process and schedules
  - Development process
  - Technical content, requirements
- Evaluation phase

33 TCSEC: Evaluation Phase
- Three phases:
  - Design analysis: review of the design based on documentation
  - Test analysis
  - Final review
- Trained independent evaluators
  - Results presented to a Technical Review Board
  - The board must approve before the next phase starts
- Ratings Maintenance Program
  - Determines when updates trigger a new evaluation

34 TCSEC: Problems
- Based heavily on confidentiality
  - Did not address integrity or availability
- Tied security and functionality together
- The base TCSEC was geared to operating systems
  - TNI: Trusted Network Interpretation
  - TDI: Trusted Database Management System Interpretation

35 Later Standards
- CTCPEC: Canada
- ITSEC: European standard
  - Did not define criteria
  - Levels correspond to the strength of the evaluation
  - Includes code evaluation and development methodology requirements
  - Known vulnerability analysis
- CISR: commercial outgrowth of TCSEC
- FC: modernization of TCSEC
- FIPS 140: cryptographic module validation
- Common Criteria: international standard
- SSE-CMM: evaluates the developer, not the product

36 ITSEC: Levels
- E1: Security target defined and tested; must have an informal architecture description
- E2: Informal description of design; configuration control, distribution control
- E3: Correspondence between code and security target
- E4: Formal model of security policy; structured approach to design; design-level vulnerability analysis
- E5: Correspondence between design and code; source code vulnerability analysis
- E6: Formal methods for architecture; formal mapping of design to security policy; mapping of executable to source code

37 ITSEC Problems
- No validation that the security requirements made sense
  - The product meets its goals, but do those goals meet user expectations?
- Inconsistency in evaluations
- Not as formally defined as TCSEC

38 Common Criteria
- Replaced TCSEC and ITSEC
- 7 evaluation levels (from functionally tested to formally designed and tested)
- Functional requirements, assurance requirements, and an evaluation methodology
- Functional and assurance requirements are organized hierarchically into class, family, component, and element
  - Components may have dependencies

39 PP/ST Framework

40 IT Security Requirements
The Common Criteria defines two types of IT security requirements:
- Functional requirements: define the security behavior of the IT product or system; implemented requirements become security functions
  - Examples: identification & authentication, audit, user data protection, cryptographic support
- Assurance requirements: establish confidence in the security functions
  - Correctness of implementation
  - Effectiveness in satisfying the security objectives
  - Examples: development, configuration management, life cycle support, testing, vulnerability analysis


42 Documentation
- Part 1: Introduction and General Model
- Part 2: Security Functional Requirements
- Part 3: Security Assurance Requirements
- CEM (evaluation methodology)
- Latest version: 3.1 (variations exist)
- http://www.commoncriteriaportal.org/public/expert/index.php?menu=2

43 Class Decomposition: Class → Family → Components → Elements
  Note: applicable to both the functional and the assurance documents

44 CC Evaluation 1: Protection Profile
- An implementation-independent, domain-specific set of security requirements
- Narrative overview
- Product/system description
- Security environment (threats, overall policies)
- Security objectives: system, environment
- IT security requirements
  - Functional requirements drawn from the CC set
  - Assurance level
- Rationale for objectives and requirements

45 CC Evaluation 2: Security Target
- Specific requirements used to evaluate a system
- Narrative introduction
- Environment
- Security objectives and how they are met
- Security requirements
  - Environment and system
  - Drawn from the CC set
- Mapping of functions to requirements
- Claims of conformance to a Protection Profile

46 Common Criteria: Functional Requirements
- 314-page document
- 11 classes: security audit, communication, cryptography, user data protection, identification/authentication, security management, privacy, protection of security functions, resource utilization, access, trusted paths
- Several families per class
- Lattice of components in a family

47 Class Example: Communication
- Non-repudiation of origin
  1. Selective proof: capability to request verification of origin
  2. Enforced proof: all communication includes verifiable origin

48 Class Example: Privacy
1. Pseudonymity
   - The TSF shall ensure that [assignment: set of users and/or subjects] are unable to determine the real user name bound to [assignment: list of subjects and/or operations and/or objects]
   - The TSF shall be able to provide [assignment: number of aliases] aliases of the real user name to [assignment: list of subjects]
   - The TSF shall [selection: determine an alias for a user, accept the alias from the user] and verify that it conforms to the [assignment: alias metric]
2. Reversible pseudonymity …
3. Alias pseudonymity …
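One way to picture the pseudonymity requirement above is a TSF that hands out aliases instead of real user names. This sketch is an invented illustration of the idea, not the CC's mechanism: the keyed-HMAC alias scheme, key, and function names are all assumptions for the demo.

```python
# Illustrative alias scheme for pseudonymity: observers see only the
# alias, and the real user name cannot be recovered without the TSF's
# internal key. The HMAC construction is an assumption for this sketch.

import hashlib
import hmac

SECRET_KEY = b"tsf-internal-key"  # known only to the TSF in this sketch

def alias_for(user, context):
    # Deterministic per-context alias: stable within a context, but
    # different contexts get unlinkable aliases.
    mac = hmac.new(SECRET_KEY, f"{user}:{context}".encode(), hashlib.sha256)
    return "user-" + mac.hexdigest()[:8]

a_forum = alias_for("alice", "forum")
a_billing = alias_for("alice", "billing")
print(a_forum != a_billing)                 # different alias per context
print(a_forum == alias_for("alice", "forum"))  # stable within a context
```

Because the TSF keeps the key, it could also support the "reversible pseudonymity" variant by recomputing (or storing) the user-to-alias binding, while outside observers cannot.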

49 Common Criteria: Assurance Requirements
- 231-page document
- 10 classes: protection profile evaluation, security target evaluation, configuration management, delivery and operation, development, guidance, life cycle, tests, vulnerability assessment, maintenance
- Several families per class
- Lattice of components in a family

50 Common Criteria: Evaluation Assurance Levels
1. EAL1: Functionally tested
2. EAL2: Structurally tested
3. EAL3: Methodically tested and checked
4. EAL4: Methodically designed, tested, and reviewed
5. EAL5: Semiformally designed and tested
6. EAL6: Semiformally verified design and tested
7. EAL7: Formally verified design and tested

51 Common Criteria: Evaluation Process
- A national authority authorizes evaluators
  - U.S.: NIST accredits commercial organizations
  - A fee is charged for evaluation
- Team of four to six evaluators
  - Develop a work plan and clear it with NIST
  - Evaluate the Protection Profile first
  - If successful, can evaluate the Security Target

52 Defining Requirements
- ISO/IEC Standard 15408: a flexible, robust catalogue of standardized IT security requirements (features and assurances)
- Protection Profiles: consumer-driven security requirements in specific information technology areas
  - Example areas: operating systems, database systems, firewalls, smart cards, applications, biometrics, routers, VPNs
  - Example requirements: access control, identification, authentication, audit, cryptography

53 Industry Responds
- Protection Profile: a consumer statement of IT security requirements to industry in a specific information technology area (e.g., firewall security requirements)
- Security Targets: vendor statements of security claims (security features and assurances) for their IT products
  - E.g., CISCO firewall, Lucent firewall, Checkpoint firewall, Network Assoc. firewall

54 Demonstrating Conformance
- Vendors bring IT products to independent, impartial testing facilities for security evaluation of their security features and assurances
- Private-sector, accredited security testing laboratories (Common Criteria Testing Labs) conduct the evaluations
- Test reports are submitted to the National Information Assurance Partnership (NIAP) for post-evaluation validation

55 Validating Test Results
- The laboratory submits its test report to the Validation Body
- The Common Criteria Validation Body validates the laboratory's test results
- NIAP issues a Validation Report and a Common Criteria Certificate

56 Common Criteria: Status
- About 80 registered products (2005)
  - Only one at level 5 (Java Smart Card)
  - Several operating systems at level 4
  - Likely many more not registered
- 223 validated products (Oct 2007)
  - Tenix Interactive Link Data Diode Device Version 2.1 at EAL 7+

