Presentation on theme: "CSCE 548 Secure System Standards Risk Management."— Presentation transcript:

1 CSCE 548 Secure System Standards Risk Management

2 Announcement Job Openings: – Daniel Rusu, Dreamgol, LLC – drusu9@gmail.com – 803-727-5634 – www.dreamgol.com CSCE 548 - Farkas2

3 Announcements Job openings: – Peter J. Johnson, Staffing Consultant – 978.927.7000 (m) / premagni@verizon.net (e) – www.linkedin.com/in/peterjjohnson (LinkedIn) – Massachusetts, Maryland, Virginia, and Ohio for computer scientists and software engineers with credentials in the fields of security, networking, and privacy – US citizenship required

4 Announcement Job Openings: – Charleston, system development – C, Python, and Java; Linux, specifically Fedora 14; cellular communications; electrical engineering or its basic principles; and MySQL – Android and iPhone/iPad developers to build the follow-on versions of a current geo-location/SMS application – Contact info upon request

5 Project Requirements available at http://www.cse.sc.edu/~farkas/csce548-2012/csce548-project-requirements.htm Useful links: – OWASP, Open Web Application Security Project, https://www.owasp.org/index.php/Main_Page – Sample projects

6 Homework 1 Choose a team member among your classmates. This selection is for this exercise only. List the steps of the RMF for the "KillerAppCo's iWare 1.0 Server" example given in your textbook. (3 points) Carry out a similar RMF on the computing resources owned by your team member. For example, understanding the "business" context may include goals like graduating from USC, making a profit from writing software for a company, etc. Document your RMF activities and findings. (7 points) BONUS (2 points): Have your partner evaluate your risk management report and comment on it.

7 Reading This lecture: – McGraw: Chapter 2 – Recommended: Rainbow Series Library, http://www.fas.org/irp/nsa/rainbow.htm Common Criteria, http://www.commoncriteriaportal.org/ Next lecture: – Software Development Lifecycle – Dr. J. Vidal

8 Risk Assessment [Diagram: Risk as the intersection of Threats, Vulnerabilities, and Consequences]

9 Financial Loss [Chart: dollar amount losses by type; total loss (2006): $53,494,290. Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute]

10 Security Protection [Charts: percentage of organizations using ROI, NPV, or IRR metrics; percentage of IT budget spent on security. Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute]

11 Real Cost of Cyber Attack Damage to the target may not reflect the real amount of damage Other services may rely on the attacked service, causing cascading and escalating damage Need: support for decision makers to – Evaluate risk and consequences of cyber attacks – Support methods to prevent, deter, and mitigate consequences of attacks

12 System Security Engineering (Traditional View) Specify System Architecture → Identify Threats, Vulnerabilities, Attacks → Estimate Risk → Prioritize Vulnerabilities → Identify and Install Safeguards → repeat until risk is acceptably low

13 Risk Management Framework (Business Context) Understand Business Context → Identify Business and Technical Risks → Synthesize and Rank Risks → Define Risk Mitigation Strategy → Carry Out Fixes and Validate, with Measurement and Reporting throughout

14 Understand the Business Context “Who cares?” Identify business goals, priorities, and circumstances, e.g., – Increasing revenue – Meeting service-level agreements – Reducing development cost – Generating high return on investment Identify which software risks to consider

15 Identify Business and Technical Risks “Why should business care?” Business risk – Direct threat – Indirect threat Consequences – Financial loss – Loss of reputation – Violation of customer or regulatory constraints – Liability Tying technical risks to the business context in a meaningful way

16 Synthesize and Rank the Risks “What should be done first?” Prioritization of identified risks based on business goals Allocating resources Risk metrics: – Risk likelihood – Risk impact – Risk severity – Number of emerging risks
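As a toy illustration of ranking by these metrics, risk severity is commonly computed as likelihood times impact; the risk entries and numbers below are invented for illustration, not taken from the lecture.

```python
# Toy ranking of identified risks by severity = likelihood x impact.
# Risk names, likelihoods, and impact scores are invented examples.

risks = [
    {"name": "SQL injection",  "likelihood": 0.6, "impact": 9},
    {"name": "weak passwords", "likelihood": 0.8, "impact": 5},
    {"name": "outdated TLS",   "likelihood": 0.3, "impact": 7},
]

for r in risks:
    r["severity"] = r["likelihood"] * r["impact"]

# Highest-severity risks get resources first.
ranked = sorted(risks, key=lambda r: r["severity"], reverse=True)
for r in ranked:
    print(f'{r["name"]}: severity {r["severity"]:.1f}')
```

In a real assessment the likelihood and impact values would come from the business context identified in the previous stages, not from guesswork.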

17 Define the Risk Mitigation Strategy “How to mitigate risks?” Available technology and resources Constrained by the business context: what can the organization afford, integrate, and understand Need validation techniques

18 Carry Out Fixes and Validate Perform actions defined in the previous stage Measure “completeness” against the risk mitigation strategy – Progress against risk – Remaining risks – Assurance of mechanisms Testing

19 Measuring and Reporting Continuous and consistent identification and storage of risk information over time Maintain risk information at all stages of risk management Establish measurements, e.g., – Number of risks, severity of risks, cost of mitigation, etc.

20 Assets-Threat Model (1) Threats compromise assets Threats have a probability of occurrence and severity of effect Assets have values Assets are vulnerable to threats [Diagram: Threats → Assets]

21 Assets-Threat Model (2) Risk: expected loss from the threat against an asset R = V × P × S, where R is the risk, V the value of the asset, P the probability of occurrence of the threat, and S the severity of the threat’s effect (the vulnerability of the asset to the threat)
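Plugging hypothetical numbers into R = V × P × S makes the formula concrete; the dollar value and probabilities below are invented for illustration.

```python
# Expected loss from a threat against an asset: R = V * P * S.
# V: asset value in dollars, P: probability the threat occurs,
# S: severity -- fraction of the asset's value lost if it does.
# All numbers are invented examples.

def expected_loss(value, probability, severity):
    return value * probability * severity

# A $100,000 database, a 10% yearly chance of compromise,
# and a loss of 50% of the asset's value in that event:
r = expected_loss(value=100_000, probability=0.10, severity=0.5)
print(f"Expected annual loss: ${r:,.0f}")  # $5,000
```

The result is an expected annual loss, useful for comparing threats against each other and against the cost of safeguards.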

22 System-Failure Model Estimate probability of highly undesirable events Risk: likelihood of undesirable outcome [Diagram: Threat → System → Undesirable outcome]
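In the system-failure view, risk is just the probability of the undesirable outcome. A minimal sketch, assuming independent threats each with an invented per-threat probability of producing the bad outcome:

```python
# Probability that at least one of several independent threats
# produces the undesirable outcome: 1 - prod(1 - p_i).
# The per-threat probabilities are invented for illustration.
from math import prod

threat_probs = [0.05, 0.10, 0.02]  # per-threat chance of bad outcome
risk = 1 - prod(1 - p for p in threat_probs)
print(f"Risk of undesirable outcome: {risk:.4f}")
```

Independence rarely holds in practice (the slide on cascading damage makes that point), so this is a lower-bound sketch, not a realistic model.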

23 Risk Acceptance Certification: how well the system meets the security requirements (technical) Accreditation: management’s approval of an automated system (administrative)

24 NEXT SLIDES ARE RECOMMENDED ONLY

25 Incident Handling Computer Security Incident Handling Guide, Recommendations of the National Institute of Standards and Technology, http://csrc.nist.gov/publications/nistpubs/800-61-rev1/SP800-61rev1.pdf

26 How to Respond? Actions to avoid further loss from intrusion Terminate intrusion and protect against reoccurrence Law enforcement – prosecute Enhance defensive security Reconstructive methods based on: – Time period of intrusion – Changes made by legitimate users during the affected period – Regular backups, audit-trail-based detection of affected components, semantics-based recovery, minimal roll-back for recovery

27 Roles and Responsibilities User: – Vigilant for unusual behavior – Report incidents Manager: – Awareness training – Policies and procedures System administrator: – Install safeguards – Monitor system – Respond to incidents, including preservation of evidence

28 Computer Incident Response Team Assist in handling security incidents – Formal – Informal Incident reporting and dissemination of incident information Computer Security Officer – Coordinate computer security efforts Others: law enforcement coordinator, investigative support, media relations, etc.

29 Incident Response Process 1. Preparation – Baseline protection – Planning and guidance – Roles and responsibilities – Training – Incident response team

30 Incident Response Process 2. Identification and assessment – Symptoms – Nature of incident Identify perpetrator, origin, and extent of attack Can be done during or after the attack – Gather evidence Keystroke monitoring, honeynets, system logs, network traffic, etc. Note legislation on monitoring! – Report on preliminary findings

31 Incident Response Process 3. Containment – Reduce the chance of the incident spreading – Determine sensitive data – Terminate suspicious connections, personnel, applications, etc. – Move critical computing services – Handle human aspects, e.g., perception management, panic, etc.

32 Incident Response Process 4. Eradication – Determine and remove the cause of the incident if economically feasible – Improve defenses: software, hardware, middleware, physical security, etc. – Increase awareness and training – Perform vulnerability analysis

33 Incident Response Process 5. Recovery – Determine course of action – Reestablish system functionality – Reporting and notifications – Documentation of incident handling and evidence preservation

34 Follow-Up Procedures Incident evaluation: – Quality of the response (preparation, time to respond, tools used, evaluation of response, etc.) – Cost of incident (monetary cost, disruption, lost data, hardware damage, etc.) Preparing a report Revise policies and procedures

35 Security Awareness and Training Major weakness: users’ unawareness Organizational effort Educational effort Customer training Federal Trade Commission: program to educate customers about web scams

36 Building It Secure 1960s: US Department of Defense (DoD) recognizes the risk of unsecured information systems 1970s: – 1977: DoD Computer Security Initiative – US Government and private concerns – National Bureau of Standards (NBS – now NIST) Responsible for standards for acquisition and use of federal computing systems Federal Information Processing Standards (FIPS PUBs)

37 NBS Two initiatives for security: – Cryptography standards 1973: invitation for technical proposals for ciphers 1977: Data Encryption Standard 2001: Advanced Encryption Standard (NIST) – Development and evaluation processes for secure systems Conferences and workshops Involving researchers, builders, vendors, software developers, and users 1979: Mitre Corporation entrusted to produce an initial set of criteria to evaluate the security of a system handling classified data

38 National Computer Security Center 1981: National Computer Security Center (NCSC) was established within NSA – To provide technical support and reference for government agencies – To define a set of criteria for the evaluation and assessment of security – To encourage and perform research in the field of security – To develop verification and testing tools – To increase security awareness in both the federal and private sectors 1985: Trusted Computer System Evaluation Criteria (TCSEC) == Orange Book

39 Orange Book Orange Book objectives – Guidance on what security features to build into new products – Provide measurements to evaluate the security of systems – Basis for specifying security requirements Security features and assurances Trusted Computing Base (TCB): the security components of the system (hardware, software, and firmware) plus the reference monitor

40 What the Orange Book Supplies Users: evaluation metrics to assess the reliability of the security system for protection of classified or sensitive information, whether – a commercial product or – an internally developed system Developers/vendors: a design guide showing security features to be included in commercial systems Designers: a guide for the specification of security requirements

41 Orange Book Set of criteria and requirements Three main categories: – Security policy – protection level offered by the system – Accountability – of the users and user operations – Assurance – of the reliability of the system

42 Security Policy Concerns the definition of the policy regulating the access of users to information – Discretionary Access Control – Mandatory Access Control – Labels: for objects and subjects – Reuse of objects: basic storage elements must be cleaned before being released to a new user

43 Accountability Identification/authentication Audit Trusted path: ensures no users are accessing the system fraudulently

44 Assurance Reliable hardware/software/firmware components that can be evaluated separately Operation reliability Development reliability

45 Operation reliability During system operation – System architecture: TCB isolated from user processes, security kernel isolated from non-security critical portions of the TCB – System integrity: correct operation (use diagnostic software) – Covert channel analysis – Trusted facility management: separation of duties – Trusted recovery: recover security features after TCB failures

46 Development reliability The system must be reliable during the development process; formal methods are used. – System testing: security features tested and verified – Design specification and verification: correct design and implementation with respect to the security policy; the TCB’s formal specifications are proved – Configuration management: configuration of the system components and its documentation – Trusted distribution: no unauthorized modifications

47 Documentation Defined set of documents Minimal set: – Trusted facility manual – Security features user’s guide – Test documentation – Design documentation – Personnel info: Operators, Users, Developers, Maintainers

48 Orange Book Levels (highest to lowest security): – A1 Verified Protection – B3 Security Domains – B2 Structured Protection – B1 Labeled Security Protection – C2 Controlled Access Protection – C1 Discretionary Security Protection – D Minimal Protection (no security)
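Because the ratings form a strict ordering from D up to A1, checking whether a system's rating satisfies a required minimum is a simple index comparison. A small sketch (the `meets` helper is invented for illustration):

```python
# TCSEC (Orange Book) ratings ordered from least to most secure.
LEVELS = ["D", "C1", "C2", "B1", "B2", "B3", "A1"]

def meets(rating, required):
    """True if `rating` is at least as secure as `required`.
    (Hypothetical helper, not part of any standard API.)"""
    return LEVELS.index(rating) >= LEVELS.index(required)

print(meets("B2", "C2"))  # True: B2 is a higher level than C2
print(meets("C1", "B1"))  # False: C1 falls below B1
```

This kind of check mirrors how procurement requirements were stated in practice, e.g. "C2 or higher" for systems handling sensitive information.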

49 NCSC Rainbow Series Orange: Trusted Computer System Evaluation Criteria Yellow: Guidance for applying the Orange Book Red: Trusted Network Interpretation Lavender: Trusted Database Interpretation

50 Evaluation Process Preliminary technical review (PTR) – Preliminary technical report: architecture, potential for target rating Vendor assistance phase (VAP) – Review of the documentation needed for the evaluation process, e.g., security features user’s guide, trusted facility manual, design documentation, test plan. For B or higher, additional documentation is needed, e.g., covert channel analysis, formal model, etc. Design analysis phase (DAP) – Initial product assessment report (IPAR): 100–200 pages, detailed info about the hardware and software architecture, security-relevant features, team assessments, etc. – Technical Review Board – Recommendation to the NCSC

51 Evaluation Process Formal evaluation phase (FEP) – Product Bulletin: formal and public announcement – Final Evaluation Report: information from the IPAR and testing results, additional tests, code review (B2 and up), formal policy model, proof – Recommends a rating for the system – NCSC decides the final rating Rating maintenance phase (RAMP) – Minor changes and revisions – Reevaluation – Rating maintenance plan

52 European Criteria German Information Security Agency: German Green Book (1988) British Department of Trade and Industry and Ministry of Defence: several volumes of criteria Canada, Australia, France: work on evaluation criteria 1991: Information Technology Security Evaluation Criteria (ITSEC) – For the European community – Decoupled features from assurance – Introduced new functionality requirement classes – Accommodated commercial security requirements

53 Common Criteria January 1996: Common Criteria – Joint work with Canada and Europe – Separates functionality from assurance – Nine classes of functionality: audit, communications, user data protection, identification and authentication, privacy, protection of trusted functions, resource utilization, establishing user sessions, and trusted path – Seven classes of assurance: configuration management, delivery and operation, development, guidance documents, life cycle support, tests, and vulnerability assessment

54 Common Criteria Evaluation Assurance Levels (EAL) – EAL1: functionally tested – EAL2: structurally tested – EAL3: methodically tested and checked – EAL4: methodically designed, tested, and reviewed – EAL5: semiformally designed and tested – EAL6: semiformally verified design and tested – EAL7: formally verified design and tested

55 National Information Assurance Partnership (NIAP) 1997: National Institute of Standards and Technology (NIST), National Security Agency (NSA), and industry Aims to improve the efficiency of evaluation Transfer methodologies and techniques to private-sector laboratories Functions: developing tests, test methods, and tools for evaluating and improving security products; developing protection profiles and associated tests; establishing a formal and international scheme for the CC

56 Next Class Software Development Lifecycle

