Chapter 21: Evaluating Systems Dr. Wayne Summers Department of Computer Science Columbus State University

1  Chapter 21: Evaluating Systems
Dr. Wayne Summers
Department of Computer Science, Columbus State University
Summers_wayne@colstate.edu
http://csc.colstate.edu/summers

2  Goals of Formal Evaluation
 Provide a set of requirements defining the security functionality for the system or product.
 Provide a set of assurance requirements that delineate the steps for establishing that the system or product meets its functional requirements.
 Provide a methodology for determining that the product or system meets the functional requirements, based on analysis of the assurance evidence.
 Provide a measure of the evaluation result that indicates how trustworthy the product or system is with respect to the security functional requirements defined for it.

3  TCSEC: 1983-1999
 Trusted Computer System Evaluation Criteria (the "Orange Book"): classes D, C1, C2, B1, B2, B3, A1
 Emphasized confidentiality
 TCSEC functional requirements:
– Discretionary access control (DAC)
– Object reuse requirements
– Mandatory access control (MAC) (>= B1)
– Label requirements (>= B1)
– Identification and authentication requirements
– Trusted path requirements (>= B2)
– Audit requirements

4  TCSEC: 1983-1999
 TCSEC assurance requirements:
– Configuration management (>= B2)
– Trusted distribution (A1)
– TCSEC systems architecture (C1-B3): mandates modularity, minimizes complexity, keeps the TCB as small and simple as possible
– Design specification and verification (>= B1)
– Testing requirements
– Product documentation requirements

5  TCSEC: 1983-1999
 TCSEC evaluation classes:
– C1 – discretionary protection
– C2 – controlled access protection
– B1 – labeled security protection
– B2 – structured protection
– B3 – security domains
– A1 – verified protection
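The TCSEC classes above form a strict hierarchy: each class includes all of the requirements of the classes below it. A minimal sketch of that ordering in Python (the list and function names here are hypothetical, chosen only for illustration):

```python
# TCSEC evaluation classes in increasing order of trust; each class
# subsumes the functional and assurance requirements of those before it.
TCSEC_CLASSES = ["D", "C1", "C2", "B1", "B2", "B3", "A1"]

def meets(evaluated: str, required: str) -> bool:
    """True if a system evaluated at class `evaluated` satisfies a
    policy requiring at least class `required`."""
    return TCSEC_CLASSES.index(evaluated) >= TCSEC_CLASSES.index(required)

print(meets("B2", "B1"))  # True: B2 adds structured protection on top of B1
print(meets("C2", "B1"))  # False: MAC and labeling only begin at B1
```

Because the classes are totally ordered, a single list index comparison is enough; no per-requirement bookkeeping is needed for this kind of check.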

6  International Efforts and the ITSEC: 1991-2001
 Information Technology Security Evaluation Criteria: European standard since 1991, with assurance levels E0, E1, E2, E3, E4, E5, E6
– Did not include tamperproof reference validation mechanisms, process isolation, the principle of least privilege, a well-defined user interface, or a requirement for system integrity
– Did require assessment of the security measures used in the developer environment during development and maintenance, submission of code, procedures for delivery, and an ease-of-use analysis

7  ITSEC
 E1 – requires a security target and an informal description of the architecture
 E2 – requires an informal description of the detailed design, configuration control, and a distribution control process
 E3 – more stringent requirements on the detailed design and on correspondence between source code and security requirements
 E4 – requires a formal model of the security policy, a more rigorous structured approach to architectural and detailed design, and a design-level vulnerability analysis
 E5 – requires correspondence between the detailed design and source code, and a source-code-level vulnerability analysis
 E6 – requires extensive use of formal methods

8  Common Criteria: 1998-Present
 CC – de facto standard for the U.S. and many other countries; ISO Standard 15408
– TOE (Target of Evaluation) – the product or system that is the subject of the evaluation
– TSP (TOE Security Policy) – the set of rules that regulate how assets are managed, protected, and distributed
– TSF (TOE Security Functions) – the hardware, software, and firmware that must be relied on to enforce the TSP (a generalization of the TCSEC's trusted computing base (TCB))

9  Common Criteria
 CC Protection Profile (PP) – an implementation-independent set of security requirements for a category of products or systems that meet specific consumer needs
– Introduction (PP Identification and PP Overview)
– Product/System Family Description
– Product/System Family Security Environment
– Security Objectives (product/system; environment)
– IT Security Requirements (functional and assurance)
– Rationale (objectives and requirements)

10  Common Criteria
 Security Target (ST) – the set of security requirements and specifications used as the basis for evaluation of an identified product or system
– Introduction (ST Identification and ST Overview)
– Product/System Family Description
– Product/System Family Security Environment
– Security Objectives (product/system; environment)
– IT Security Requirements (functional and assurance)
– Product/System Summary Specification
– PP Claims (claims of conformance)
– Rationale (objectives, requirements, TOE summary specification, PP claims)
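The relationship between a PP and an ST can be pictured as a containment check: an ST that claims conformance to a PP must address at least the PP's functional and assurance requirements. A hedged sketch of that idea (the class and function names are invented for illustration; the CC defines these documents in prose, not as data structures):

```python
from dataclasses import dataclass

@dataclass
class ProtectionProfile:
    name: str
    functional_reqs: frozenset  # e.g. CC functional classes such as "FAU", "FIA"
    assurance_reqs: frozenset   # e.g. assurance classes such as "ATE", "AVA"

@dataclass
class SecurityTarget:
    toe: str                    # the product/system under evaluation
    functional_reqs: frozenset
    assurance_reqs: frozenset

def claims_conformance(st: SecurityTarget, pp: ProtectionProfile) -> bool:
    """Naive check: the ST must cover every requirement the PP states."""
    return (pp.functional_reqs <= st.functional_reqs
            and pp.assurance_reqs <= st.assurance_reqs)

pp = ProtectionProfile("firewall-pp", frozenset({"FAU", "FIA"}), frozenset({"ATE"}))
st = SecurityTarget("FW-1", frozenset({"FAU", "FIA", "FDP"}), frozenset({"ATE", "AVA"}))
print(claims_conformance(st, pp))  # True: the ST covers all PP requirements
```

An ST may add requirements beyond the PP (as FW-1 does with FDP and AVA); conformance only fails when something the PP demands is missing.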

11  Common Criteria - Security Functional Requirements
 Class FAU: Security Audit
 Class FCO: Communication
 Class FCS: Cryptographic Support
 Class FDP: User Data Protection
 Class FIA: Identification and Authentication
 Class FMT: Security Management
 Class FPR: Privacy
 Class FPT: Protection of Security Functions
 Class FRU: Resource Utilization
 Class FTA: TOE Access
 Class FTP: Trusted Path

12  Common Criteria - Assurance Requirements
 Class APE: Protection Profile Evaluation
 Class ASE: Security Target Evaluation
 Class ACM: Configuration Management
 Class ADO: Delivery and Operation
 Class ADV: Development
 Class AGD: Guidance Documentation
 Class ALC: Life Cycle
 Class ATE: Tests
 Class AVA: Vulnerability Assessment
 Class AMA: Maintenance of Assurance

13  Common Criteria – Evaluation Assurance Levels
 EAL1: Functionally Tested
 EAL2: Structurally Tested
 EAL3: Methodically Tested and Checked
 EAL4: Methodically Designed, Tested and Reviewed
 EAL5: Semiformally Designed and Tested
 EAL6: Semiformally Verified Design and Tested
 EAL7: Formally Verified Design and Tested
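Like the TCSEC classes, EALs are strictly ordered: each level adds assurance requirements on top of the previous one (the CC also permits "augmented" levels such as EAL4+, which this sketch ignores). The names below come from the list above; the helper function is hypothetical:

```python
# CC Evaluation Assurance Levels; levels are cumulative, so a higher
# EAL subsumes the assurance requirements of every lower one.
EAL_NAMES = {
    1: "Functionally Tested",
    2: "Structurally Tested",
    3: "Methodically Tested and Checked",
    4: "Methodically Designed, Tested and Reviewed",
    5: "Semiformally Designed and Tested",
    6: "Semiformally Verified Design and Tested",
    7: "Formally Verified Design and Tested",
}

def meets_assurance(achieved: int, required: int) -> bool:
    """True if a product evaluated at EAL `achieved` satisfies a
    policy requiring at least EAL `required`."""
    return achieved >= required

print(meets_assurance(4, 2))  # True
print(EAL_NAMES[4])           # Methodically Designed, Tested and Reviewed
```

Note that an EAL measures assurance in the evaluation, not the strength of the security functions themselves; two products at the same EAL can have very different functional requirements.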

14  SSE-CMM: 1997-Present
 Systems Security Engineering Capability Maturity Model – a process-oriented methodology for developing secure systems, based on the SE-CMM
– Assesses the capabilities of security engineering processes
– Provides guidance in designing and improving them
– Provides an evaluation technique for an organization's security engineering

15  SSE-CMM Model
 Process capability – range of expected results that can be achieved by following the process
 Process performance – measure of actual results achieved
 Process maturity – extent to which a process is explicitly defined, managed, measured, controlled, and effective

16  SSE-CMM Process Areas
 Administer Security Controls
 Assess Impact
 Assess Security Risks
 Assess Threat
 Assess Vulnerability
 Build Assurance Argument
 Coordinate Security
 Monitor System Security Posture
 Provide Security Input
 Specify Security Needs
 Verify and Validate Security

17  SSE-CMM Capability Maturity Levels
 Performed Informally – base processes are performed
 Planned and Tracked – project-level definition, planning, and performance verification issues are addressed
 Well-Defined – focus on defining and refining standard practice and coordinating it across the organization
 Quantitatively Controlled – focus on establishing measurable quality goals and objectively managing their performance
 Continuously Improving – organizational capability and process effectiveness are improved
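One simplified way to summarize an assessment is to rate each process area on the five levels above and report the minimum across areas as the organization's overall capability; note that SSE-CMM itself reports a profile per process area rather than a single number, so the aggregation below is an illustrative assumption, as are all the names in the sketch:

```python
# The five SSE-CMM capability maturity levels, keyed by level number.
SSE_CMM_LEVELS = {
    1: "Performed Informally",
    2: "Planned and Tracked",
    3: "Well-Defined",
    4: "Quantitatively Controlled",
    5: "Continuously Improving",
}

def overall_capability(area_levels: dict) -> int:
    """Summarize a per-process-area assessment by its weakest area.
    (A simplification for illustration, not mandated by SSE-CMM.)"""
    return min(area_levels.values())

assessment = {
    "Assess Threat": 3,
    "Coordinate Security": 2,
    "Verify and Validate Security": 4,
}
level = overall_capability(assessment)
print(level, SSE_CMM_LEVELS[level])  # 2 Planned and Tracked
```

The min-over-areas rule reflects the intuition that an organization's security engineering is only as mature as its least mature process.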

