
1 Risk Management ©USC-CSSE

2 ©USC-CSSE

3 Risk vs Issue A risk is an uncertain event that could impact your chosen path should it be realised. Risks are events that are not currently affecting you – they haven't happened yet. Once a risk is realised, it has the potential to become an issue. Source: ©USC-CSSE

4 Risk Management What is risk? What about problem, concern, issue?
©USC-CSSE

5 Reactively Managing a Software Development Problem
System Integration Time: We just started integrating the various software components to be used in our project and found out that COTS* products A and B can't talk to one another.
This is a problem caused by a previously unrecognized risk that materialized: the risk that two COTS products might not be able to talk to one another (specifically, that A and B might not be able to talk to one another).
We've got too much tied up in A and B to change. Our best solution is to build wrappers around A and B to get them to talk via CORBA**.
This will result in: a 3-month schedule overrun = $100K contract penalty, and a $300K cost overrun.
*COTS: Commercial off-the-shelf  **CORBA: Common Object Request Broker Architecture ©USC-CSSE

6 Proactively Managing a Risk (assessment)
System Design Time: A and B are our strongest COTS choices, but there is some chance that they can't talk to one another.
Probability that A and B can't talk to one another = probability of loss: P(L). From previous experience with COTS like A and B, we assess P(L) at 50%.
If we commit to using A and B and find out at integration time that they can't talk to one another, the size of loss is S(L) = $300K + $100K = $400K.
We have a risk exposure of RE = P(L) * S(L) = (0.5) * ($400K) = $200K. ©USC-CSSE
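The arithmetic on this slide is easy to script; here is a minimal sketch in Python (not part of the original slides), using the P(L) and S(L) values above:

```python
def risk_exposure(p_loss: float, size_of_loss: float) -> float:
    """Risk exposure RE = P(L) * S(L)."""
    return p_loss * size_of_loss

# Values from the COTS A/B example above, in $K.
p_loss = 0.5              # assessed probability that A and B can't talk to one another
size_of_loss = 300 + 100  # $300K cost overrun + $100K contract penalty

print(risk_exposure(p_loss, size_of_loss))  # 200.0, i.e. RE = $200K
```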

7 Risk Management Strategy 1: Buying Information
System Design Time: Let's spend $30K and 2 weeks prototyping the integration of A and B. This will buy information on the magnitudes of P(L) and S(L). If RE = P(L) * S(L) is small, we'll accept and monitor the risk; if RE is large, we'll use one or more of the other strategies. ©USC-CSSE
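A sketch of the decision rule this strategy implies, in Python; the $50K acceptance threshold and the updated P(L) value are illustrative assumptions, not figures from the slide:

```python
PROTOTYPE_COST_K = 30    # $30K spent prototyping the A/B integration (from the slide)
ACCEPT_THRESHOLD_K = 50  # hypothetical cutoff below which RE counts as "small"

def next_step(p_loss: float, size_of_loss_k: float) -> str:
    """Decide what to do once prototyping has refined P(L) and S(L)."""
    re_k = p_loss * size_of_loss_k
    if re_k <= ACCEPT_THRESHOLD_K:
        return f"RE = ${re_k:.0f}K: accept and monitor the risk"
    return f"RE = ${re_k:.0f}K: apply avoidance / transfer / reduction"

# Example: the prototype suggests the integration usually works.
print(next_step(p_loss=0.1, size_of_loss_k=400))  # accept and monitor
```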

8 Other Risk Management Strategies
Risk Avoidance: COTS product C is almost as good as B, and we know, from having used A and C, that C can talk to A. Delivering on time is worth more to the customer than the small performance loss.
Risk Transfer: If the customer insists on using A and B, have them establish a risk reserve, to be used in case A and B can't talk to each other.
Risk Reduction: If we build the wrappers and the CORBA connections right now, we add cost but minimize the schedule delay.
Risk Acceptance: If we can solve the A and B interoperability problem, we'll have a big competitive edge on future procurements. Let's do this on our own money, and patent the solution. ©USC-CSSE

9 Software Risk Management
Risk Management
  Risk Assessment
    Risk Identification: checklists; decision driver analysis; assumption analysis; decomposition
    Risk Analysis: performance models; cost models; network analysis; decision analysis; quality factor analysis
    Risk Prioritization: risk exposure; risk leverage; compound risk reduction
  Risk Control
    Risk Management Planning: buying information; risk avoidance; risk transfer; risk reduction; risk element planning; risk plan integration
    Risk Resolution: prototypes; simulations; benchmarks; analyses; staffing
    Risk Monitoring: milestone tracking; top-10 tracking; risk reassessment; corrective action
©USC-CSSE

10 Top 10 Risk Categories: 1989 and 1995
1989:
1. Personnel shortfalls
2. Schedules and budgets
3. Wrong software functions
4. Wrong user interface
5. Gold plating
6. Requirements changes
7. Externally-furnished components
8. Externally-performed tasks
9. Real-time performance
10. Straining computer science

1995:
1. Personnel shortfalls
2. Schedules, budgets, process
3. COTS, external components
4. Requirements mismatch
5. User interface mismatch
6. Architecture, performance, quality
7. Requirements changes
8. Legacy software
9. Externally-performed tasks
10. Straining computer science
©USC-CSSE

11 Primary CS577 Risk Categories (all on 1995 list) and Examples
Personnel shortfalls: commitment (this is a team member's last course; only needs a C to graduate); compatibility; communication problems; skill deficiencies (management, Web design, Java, Perl, CGI, data compression, …)
Schedule: project scope too large for 24 weeks; IOC content; critical-path items (COTS, platforms, reviews, …)
COTS: see next slide re multi-COTS
Rqts, UI: mismatch to user needs (recall overdue book notices)
Performance: #bits; #bits/sec; overhead sources
Externally-performed tasks: Client/Operator preparation; commitment for transition effort ©USC-CSSE

12 COTS and External Component Risks
COTS risks: immaturity; inexperience; COTS incompatibility with application, platform, other COTS; controllability
Non-commercial off-the-shelf components: reuse libraries, government, universities, etc.
Qualification testing; benchmarking; inspections; reference checking; compatibility analysis ©USC-CSSE

13 Risk Exposure Factors (Satellite Experiment Software)
Unsatisfactory Outcome (UO): Prob(UO), Loss(UO), Risk Exposure
A. S/W error kills experiment: 3-5, 10, 30-50
B. S/W error loses key data: 3-5, 8, 24-40
C. Fault tolerance features cause unacceptable performance: 4-8, 7, 28-56
D. Monitoring software reports unsafe condition as safe: 5, 9, 45
E. Monitoring software reports safe condition as unsafe: 5, 3, 15
F. Hardware delay causes schedule overrun: 6, 4, 24
G. Data reduction software errors cause extra work: 8, 1, 8
H. Poor user interface causes inefficient operation: 6, 5, 30
I. Processor memory insufficient: 1, 7, 7
J. DBMS software loses derived data: 2, 2, 4
©USC-CSSE
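Risk prioritization by RE is just a sort; a small Python sketch using the single-valued rows (D-J) of the reconstructed table above:

```python
# Prob(UO) and Loss(UO) on the slide's relative 1-10 scales; RE = Prob * Loss.
outcomes = {
    "D. Monitoring software reports unsafe condition as safe": (5, 9),
    "E. Monitoring software reports safe condition as unsafe": (5, 3),
    "F. Hardware delay causes schedule overrun": (6, 4),
    "G. Data reduction software errors cause extra work": (8, 1),
    "H. Poor user interface causes inefficient operation": (6, 5),
    "I. Processor memory insufficient": (1, 7),
    "J. DBMS software loses derived data": (2, 2),
}

# Highest risk exposure first: D (45), H (30), F (24), E (15), G (8), I (7), J (4).
for name, (prob, loss) in sorted(outcomes.items(),
                                 key=lambda kv: kv[1][0] * kv[1][1],
                                 reverse=True):
    print(f"RE = {prob * loss:2d}  {name}")
```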

14 Risk Reduction Leverage (RRL) = Change in Risk Exposure / Cost to implement avoidance method
RRL = (RE before - RE after) / Risk Reduction Cost
Spacecraft example (BEFORE, either option): Loss(UO) = $20M, Prob(UO) = 0.2, RE = $4M
Long Duration Test (AFTER): Prob(UO) = 0.05, RE = $1M, Cost = $2M, RRL = (4 - 1) / 2 = 1.5
Failure Mode Tests (AFTER): Prob(UO) = 0.07, RE = $1.4M, Cost = $0.26M, RRL = (4 - 1.4) / 0.26 = 10
©USC-CSSE
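A minimal Python sketch of the RRL computation for the two spacecraft test options above (figures in $M, as on the slide):

```python
def rrl(re_before: float, re_after: float, cost: float) -> float:
    """Risk reduction leverage: (RE_before - RE_after) / cost of the reduction."""
    return (re_before - re_after) / cost

re_before = 0.2 * 20  # Prob(UO) = 0.2, Loss(UO) = $20M  ->  RE = $4M

long_duration_test = rrl(re_before, re_after=0.05 * 20, cost=2.0)   # (4 - 1) / 2
failure_mode_tests = rrl(re_before, re_after=0.07 * 20, cost=0.26)  # (4 - 1.4) / 0.26

print(round(long_duration_test, 2))  # 1.5
print(round(failure_mode_tests, 2))  # 10.0 -> failure mode tests give far more leverage
```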

15 Risk Management Plans For Each Risk Item, Answer the Following Questions: 1. Why? Risk Item Importance, Relation to Project Objectives 2. What, When? Risk Resolution Deliverables, Milestones, Activity Nets 3. Who, Where? Responsibilities, Organization 4. How? Approach (Prototypes, Surveys, Models, …) 5. How Much? Resources (Budget, Schedule, Key Personnel) ©USC-CSSE

16 Risk Management Plan: Fault Tolerance Prototyping
1. Objectives (The "Why"): Determine, reduce level of risk of the software fault tolerance features causing unacceptable performance; create a description of and a development plan for a set of low-risk fault tolerance features.
2. Deliverables and Milestones (The "What" and "When"):
By week 3:
1. Evaluation of fault tolerance option
2. Assessment of reusable components
3. Draft workload characterization
4. Evaluation plan for prototype exercise
5. Description of prototype
By week 7:
6. Operational prototype with key fault tolerance features
7. Workload simulation
8. Instrumentation and data reduction capabilities
9. Draft description, plan for fault tolerance features
By week 10:
10. Evaluation and iteration of prototype
11. Revised description, plan for fault tolerance features
©USC-CSSE

17 Risk Management Plan: Fault Tolerance Prototyping (concluded)
Responsibilities (The "Who" and "Where"):
System Engineer: G. Smith (Tasks 1, 3, 4, 9, 11; support of tasks 5, 10)
Lead Programmer: C. Lee (Tasks 5, 6, 7, 10; support of tasks 1, 3)
Programmer: J. Wilson (Tasks 2, 8; support of tasks 5, 6, 7, 10)
Approach (The "How"):
Design-to-schedule prototyping effort, driven by hypotheses about fault tolerance-performance effects
Use real-time OS, add prototype fault tolerance features
Evaluate performance with respect to representative workload
Refine prototype based on results observed
Resources (The "How Much"):
$60K - Full-time system engineer, lead programmer, programmer: (10 weeks) * (3 staff) * ($2K/staff-week)
$0K - 3 dedicated workstations (from project pool)
$0K - 2 target processors (from project pool)
$0K - 1 test co-processor (from project pool)
$10K - Contingencies
$70K - Total
©USC-CSSE
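The resource figures above reduce to a couple of lines of arithmetic; a quick Python check (values from the plan, in $K):

```python
labor = 10 * 3 * 2   # 10 weeks * 3 full-time staff * $2K per staff-week = $60K
equipment = 0        # workstations, target processors, co-processor come from the project pool
contingency = 10     # $10K contingencies
total = labor + equipment + contingency
print(total)         # 70 -> $70K total, matching the plan
```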

18 Risk Monitoring Milestone Tracking Top-10 Risk Item Tracking
Milestone Tracking: monitoring of Risk Management Plan milestones
Top-10 Risk Item Tracking: identify top-10 risk items; highlight these in monthly project reviews; focus on new entries and slow-progress items; focus review on manager-priority items
Risk Reassessment
Corrective Action ©USC-CSSE

19 Project Top 10 Risk Item List: Satellite Experiment Software
Risk item and risk resolution progress (each item is also tracked by monthly ranking: this month, last month, and number of months on the list):
Replacing sensor-control software developer: top replacement candidate unavailable
Target hardware delivery delays: procurement procedural delays
Sensor data formats undefined: action items to software, sensor teams; due next month
Staffing of design V&V team: key reviewers committed; need fault-tolerance reviewer
Software fault-tolerance may compromise performance: fault tolerance prototype successful
Accommodate changes in data bus design: meeting scheduled with data bus designers
Testbed interface definitions: some delays in action items; review meeting scheduled
User interface uncertainties: user interface prototype successful
TBDs in experiment operational concept: TBDs resolved
Uncertainties in reusable monitoring software: required design changes small, successfully made
©USC-CSSE
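A sketch, in Python, of the bookkeeping behind a monthly top-10 list like the one above. The data structure, field names, and the example ranking values are illustrative assumptions (the slide's ranking numbers did not survive extraction), and rank movement is only one proxy for the "new entries, slow-progress items" review focus:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackedRisk:
    """One row of a monthly top-10 risk item list (illustrative structure)."""
    item: str
    rank_this_month: int
    rank_last_month: Optional[int]  # None if the item is new to the list
    months_on_list: int
    resolution_progress: str

    def flag_for_review(self) -> bool:
        # Highlight new entries and items that rose in priority since last month.
        return (self.rank_last_month is None
                or self.rank_this_month < self.rank_last_month)

# Hypothetical ranking values; item and progress text are from the slide.
risk = TrackedRisk("Target hardware delivery delays", 2, 5, 2,
                   "Procurement procedural delays")
print(risk.flag_for_review())  # True: the item moved up the list, so review it
```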

20 Early Risk Management in 577
Project Task | Risk Management Skills | Skill-Building Activities
Select projects; form teams | Project risk identification; staffing risk assessment and resolution | Readings, lectures, homework, case study, guidelines
Plan early phases | Schedule/budget risk assessment, planning; risk-driven processes (ICSM) | Readings, lectures, homework, guidelines, planning and estimating tools
Formulate, validate concept of operation | Risk-driven level of detail | Readings, lecture, guidelines, project
Manage to plans | Risk monitoring and control |
Develop, validate FC package | Risk assessment and prioritization |
FC Architecture Review | Risk-driven review process; review of top-N project risks | Readings, lecture, case studies, review
©USC-CSSE

21 Software Risk Management Techniques
Source of Risk: Risk Management Techniques
1. Personnel shortfalls: staffing with top talent; key personnel agreements; team-building; training; tailoring process to skill mix; walkthroughs
2. Schedules, budgets, process: detailed, multi-source cost and schedule estimation; design to cost; incremental development; software reuse; requirements descoping; adding more budget and schedule; outside reviews
3. COTS, external components: benchmarking; inspections; reference checking; compatibility prototyping and analysis
4. Requirements mismatch: requirements scrubbing; prototyping; cost-benefit analysis; design to cost; user surveys
5. User interface mismatch: prototyping; scenarios; user characterization (functionality, style, workload); identifying the real users
©USC-CSSE

22 Software Risk Management Techniques
Source of Risk: Risk Management Techniques
6. Architecture, performance, quality: simulation; benchmarking; modeling; prototyping; instrumentation; tuning
7. Requirements changes: high change threshold; information hiding; incremental development (defer changes to later increments)
8. Legacy software: reengineering; code analysis; interviewing; wrappers; incremental deconstruction
9. COTS, externally-performed tasks: pre-award audits; award-fee contracts; competitive design or prototyping
10. Straining computer science: technical analysis; cost-benefit analysis; prototyping; reference checking
©USC-CSSE

23 Validation Results on Process Adoption
Incidents of process selection and direction changes. Results on project process selection (# of teams):
8/14: selected the right process pattern from the beginning
3/14: unclear project scope; re-selected the right pattern at the end of the Exploration phase
1/14: minor changes in project scope; right pattern at the end of the Valuation phase
Major change in Foundations phase; infeasible project scope
©USC-CSSE

24 Top 10 Risk Categories: 1995 and 2010
1995:
1. Personnel shortfalls
2. Schedules, budgets, process
3. COTS, external components
4. Requirements mismatch
5. User interface mismatch
6. Architecture, performance, quality
7. Requirements changes
8. Legacy software
9. Externally-performed tasks
10. Straining computer science

2010:
1. Customer-developer-user team cohesion
2. Personnel shortfalls
3. Architecture complexity; quality tradeoffs
4. Budget and schedule constraints
5. COTS and other independently evolving systems
6. Lack of domain knowledge
7. Process quality assurance
8. Requirements volatility; rapid change
9. User interface mismatch
10. Requirements mismatch
©USC-CSSE

25 Primary CS577 Risk Categories (all on 1995 list) and Examples
Personnel shortfalls: commitment (this is a team member's last course; only needs a C to graduate); compatibility; communication problems; skill deficiencies (management, Web design, Java, Perl, CGI, data compression, …)
Schedule: project scope too large for 24 weeks; IOC content; critical-path items (COTS, platforms, reviews, …)
COTS: see next slide re multi-COTS
Rqts, UI: mismatch to user needs (recall overdue book notices)
Performance: #bits; #bits/sec; overhead sources
Externally-performed tasks: Client/Operator preparation; commitment for transition effort ©USC-CSSE

26 Top 11 - Risk distribution in CSCI577
©USC-CSSE

27 Comparing risks between Fall and Spring
©USC-CSSE

28 Conclusions Risk management starts on Day One
Delay and denial are serious career risks.
Data is provided to support early investment.
The Win-Win spiral model provides a process framework for early risk resolution: stakeholder identification and win-condition reconciliation; anchor point milestones.
Risk analysis helps determine "how much is enough" testing, planning, specifying, prototyping, …
Buying information to reduce risk. ©USC-CSSE

29 Quality Management ©USC-CSSE

30 Outline Quality Management In CMMI 1.3 In ISO 15288 In CSCI577ab
(c) USC-CSSE

31 Objectives of QM: To ensure a high-quality process
in order to deliver high-quality products (c) USC-CSSE

32 Quality Management in CMMI 1.3
Process Areas:
Causal Analysis and Resolution (CAR)
Configuration Management (CM)
Decision Analysis and Resolution (DAR)
Integrated Project Management (IPM)
Measurement and Analysis (MA)
Organizational Performance Management (OPM)
Organizational Process Definition (OPD)
Organizational Process Focus (OPF)
Organizational Process Performance (OPP)
Organizational Training (OT)
Process and Product Quality Assurance (PPQA)
Product Integration (PI)
Project Monitoring and Control (PMC)
Project Planning (PP)
Quantitative Project Management (QPM)
Requirements Development (RD)
Requirements Management (REQM)
Risk Management (RSKM)
Supplier Agreement Management (SAM)
Technical Solution (TS)
Validation (VAL)
Verification (VER)
(c) USC-CSSE

33 PPQA - Product and Process Quality Assurance
(c) USC-CSSE

34 PPQA - Product and Process Quality Assurance
(c) USC-CSSE

35 PPQA for Agile development
(c) USC-CSSE

36 CM – Configuration Management
(c) USC-CSSE

37 CM – Configuration Management
(c) USC-CSSE

38 CM – Configuration Management
(c) USC-CSSE

39 MA – Measurement and Analysis
(c) USC-CSSE

40 VER - Verification (c) USC-CSSE

41 VER - Verification (c) USC-CSSE

42 VAL - Validation (c) USC-CSSE

43 VAL - Validation (c) USC-CSSE

44 Quality Management in ISO 15288
Activities:
Plan quality management: establish quality management policies; establish organization quality management objectives; define responsibilities and authority for implementation of quality management.
Assess quality management: assess customer satisfaction and report; conduct periodic reviews of project quality plans; monitor the status of quality improvements on products and services.
Perform quality management corrective action: plan corrective actions when quality management goals are not achieved; implement corrective actions and communicate results through the organization.
(c) USC-CSSE

45 Configuration Management in ISO 15288
Activities:
Plan configuration management: define a configuration management strategy; identify items that are subject to configuration control.
Perform configuration management: maintain information on configurations with an appropriate level of integrity and security; ensure that changes to configuration baselines are properly identified, recorded, evaluated, approved, incorporated, and verified.
(c) USC-CSSE

46 Quality Management in 577ab
IIV&V Configuration Management Defect Reporting and Tracking Testing Buddy Review Architecture Review Board Core Capability Drive through Design Code Review Document template Sample artifacts (c) USC-CSSE

47 Quality Guidelines Design Guidelines Coding Guidelines
Design Guidelines: describe design guidelines on how to improve or maintain modularity, reuse, and maintainability, and how the design will map to the implementation.
Coding Guidelines: describe how to document the code in such a way that it can easily be communicated to others. (c) USC-CSSE
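As a minimal illustration of documenting code so it can easily be communicated to others (an assumed Python example, not the course's actual coding guideline):

```python
def risk_exposure(p_loss: float, size_of_loss_k: float) -> float:
    """Return the risk exposure RE = P(L) * S(L).

    Args:
        p_loss: Probability of the loss occurring, in [0.0, 1.0].
        size_of_loss_k: Size of the loss if it occurs, in $K.

    Returns:
        Risk exposure in $K.
    """
    # Keep units explicit (here $K) so callers do not mix $ and $K.
    return p_loss * size_of_loss_k
```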

48 Coding Guidelines C: http://www.gnu.org/prep/standards/standards.html
Java: Visual Basic: (c) USC-CSSE

49 Quality Guidelines Version Control and History
Version Control and History: chronological log of the changes introduced to this unit.
Implementation Considerations: detailed design and implementation for as-built considerations.
Unit Verification: unit / integration test; code walkthrough / review / inspection. (c) USC-CSSE

50 Quality Assessment Methods
Methods, tools, techniques, and processes that can: identify problems; detect and report problems; measure the quality of the software system.
Three methods of early defect identification: peer review, IIV&V, and automated analysis. (c) USC-CSSE

51 Peer Review Reviews performed by peers in the development team
Can range from Fagan inspections to simple buddy checks.
Peer Review: items; participants / roles; schedule. (c) USC-CSSE

52 Defect Removal Profiles
(c) USC-CSSE

