Slide 1: Determine How Much Dependability is Enough: A Value-Based Approach
LiGuo Huang, Barry Boehm
University of Southern California
(© USC-CSE)

Slide 2: Software Dependability Business Case
Software dependability in a competitive world:
- Software dependability requirements often conflict with schedule/cost requirements
How much dependability is enough?
- When to stop testing and release the product
- Determining a risk-balanced "sweet spot" operating point

Slide 3: Competing on Schedule and Dependability: A Risk Analysis Approach
Risk exposure: RE = Prob(Loss) × Size(Loss)
- "Loss": financial; reputation; future prospects; ...
For multiple sources of loss:
RE = Σ_sources [Prob(Loss_source) × Size(Loss_source)]
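
A minimal sketch of the multiple-sources sum above; the loss sources and all probability and dollar figures here are illustrative inventions, not values from the slides:

```python
# Risk exposure: RE = sum over loss sources of Prob(Loss) * Size(Loss).
# Sources and numbers are made up for illustration.
loss_sources = {
    # source: (probability of loss, size of loss in $)
    "defect-induced outage": (0.10, 2_000_000),
    "reputation damage":     (0.05, 5_000_000),
    "market share erosion":  (0.30,   900_000),
}

total_re = sum(p * s for p, s in loss_sources.values())
print(f"Total risk exposure: ${total_re:,.0f}")  # $720,000
```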

Slide 4: Example RE Profile: Time to Ship, Loss Due to Unacceptable Dependability
[Figure: RE = P(L) × S(L) plotted against time to ship (amount of testing). Shipping early leaves many defects (high P(L)), some critical (high S(L)); shipping after more testing leaves few defects (low P(L)) and only minor ones (low S(L)), so this RE curve falls as testing continues.]

Slide 5: Example RE Profile: Time to Ship, Adding Loss Due to Market Share Erosion
[Figure: same axes as slide 4, with a second RE curve for market share erosion. Shipping early: few and weak rivals, low P(L) and low S(L); shipping late: many and strong rivals, high P(L) and high S(L). The market-erosion RE curve rises with time to ship while the dependability RE curve falls.]

Slide 6: Example RE Profile: Time to Ship, Sum of Risk Exposures
[Figure: the sum of the two RE curves from slide 5 is U-shaped; its minimum is the risk-balanced "sweet spot" time to ship.]
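
The sweet spot is just the ship time that minimizes the summed risk exposure. A minimal sketch under assumed curve shapes (a falling exponential for dependability RE and a rising line for market-erosion RE; both functions and constants are inventions for illustration):

```python
# Sweet spot: time-to-ship minimizing RE_dependability + RE_market.
# Both curve shapes below are assumptions for illustration only.
import math

def re_dependability(t):  # falls as testing time t (weeks) grows
    return 100 * math.exp(-0.15 * t)

def re_market(t):         # rises as rivals capture market share
    return 2.0 * t

total = {t: re_dependability(t) + re_market(t) for t in range(53)}
sweet_spot = min(total, key=total.get)
print(f"Sweet spot: ship after {sweet_spot} weeks of testing "
      f"(total RE = {total[sweet_spot]:.1f})")  # minimum near t = 13
```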

Slide 7: Software Development Cost/Reliability Tradeoff (COCOMO II calibration to 161 projects)

RELY Rating | Defect Risk                  | Rough MTBF (mean time between failures) | Typical Project           | Relative Cost/Source Instruction
Very Low    | Slight inconvenience         | 1 hour                                  | Startup demo              | 0.82
Low         | Low, easily recoverable loss | 10 hours                                | Commercial cost leader    | 0.92
Nominal     | Moderate recoverable loss    | 300 hours                               | In-house support software | 1.00
High        | High financial loss          | 10K hours                               | Commercial quality leader | 1.10
Very High   | Loss of human life           | 300K hours                              | Safety-critical           | 1.26
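
Reading the table, the calibration implies, for example, that a safety-critical system (Very High RELY) costs about 1.26/0.92 ≈ 1.37 times as much per source instruction as a commercial cost leader (Low RELY). A small lookup sketch; note the Very Low and Low multipliers (0.82, 0.92) are the published COCOMO II.2000 values, added here because the transcript preserved only 1.00, 1.10, and 1.26:

```python
# COCOMO II RELY effort multipliers from the table above.
RELY_MULTIPLIER = {
    "Very Low": 0.82, "Low": 0.92, "Nominal": 1.00,
    "High": 1.10, "Very High": 1.26,
}

def relative_cost(target, baseline):
    """Cost ratio of developing at `target` vs. `baseline` reliability."""
    return RELY_MULTIPLIER[target] / RELY_MULTIPLIER[baseline]

print(f"{relative_cost('Very High', 'Low'):.2f}")  # 1.37
```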

Slide 8: Current COQUALMO System
[Diagram: COQUALMO wraps COCOMO II. Inputs: software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing). COCOMO II outputs the software development effort, cost, and schedule estimate; COQUALMO's Defect Introduction Model and Defect Removal Model output the number of residual defects and the defect density per unit of size.]
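
The core COQUALMO bookkeeping is: residual defects = defects introduced minus defects removed. A minimal sketch of that arithmetic, assuming illustrative rates (the 60/KSLOC introduction rate echoes slide 10; the 94% removal fraction is a placeholder, not a calibrated value):

```python
# COQUALMO-style bookkeeping: residual defects = introduced - removed.
# Rates below are illustrative placeholders, not calibrated values.
ksloc = 50                  # estimated size in KSLOC
introduced_per_ksloc = 60   # nominal defect introduction rate (slide 10)
removal_fraction = 0.94     # e.g., a high composite removal rating

introduced = ksloc * introduced_per_ksloc
residual = introduced * (1 - removal_fraction)
print(f"Residual defects: {residual:.0f} "
      f"({residual / ksloc:.1f} per KSLOC)")  # 180 (3.6 per KSLOC)
```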

Slide 9: Defect Removal Rating Scales (COCOMO II, p. 263)

Automated Analysis:
- Very Low: Simple compiler syntax checking
- Low: Basic compiler capabilities
- Nominal: Compiler extension; basic requirements and design consistency
- High: Intermediate-level module; simple requirements/design
- Very High: More elaborate requirements/design; basic distributed processing
- Extra High: Formalized specification and verification; advanced distributed processing

Peer Reviews:
- Very Low: No peer review
- Low: Ad-hoc informal walkthrough
- Nominal: Well-defined preparation, review, minimal follow-up
- High: Formal review roles, well-trained people, and basic checklist
- Very High: Root cause analysis, formal follow-up using historical data
- Extra High: Extensive review checklist; statistical control

Execution Testing and Tools:
- Very Low: No testing
- Low: Ad-hoc test and debug
- Nominal: Basic test; test criteria based on checklist
- High: Well-defined test sequence and basic test coverage tool system
- Very High: More advanced test tools and preparation; distributed monitoring
- Extra High: Highly advanced tools; model-based test

Slide 10: Defect Removal Estimates, Nominal Defect Introduction Rate (60 defects/KSLOC)
[Chart: delivered defects/KSLOC vs. composite defect removal rating, with residual-defect factors 1.0, 0.475, 0.24, 0.125, 0.06, and 0.03 across the six rating levels; lower delivered defect density implies lower probability of loss P(L).]
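
Multiplying the nominal introduction rate by each residual factor reproduces the chart's delivered densities. A quick check, with the factors taken from the slide and their mapping to rating levels assumed from slide 9's six-level scale:

```python
# Delivered defect density = 60 defects/KSLOC * residual factor per rating.
# Factor-to-rating mapping is inferred from the six-level scale on slide 9.
NOMINAL_RATE = 60  # defects introduced per KSLOC
factors = {"Very Low": 1.0, "Low": 0.475, "Nominal": 0.24,
           "High": 0.125, "Very High": 0.06, "Extra High": 0.03}

for rating, f in factors.items():
    print(f"{rating:>10}: {NOMINAL_RATE * f:5.1f} delivered defects/KSLOC")
# Very Low: 60.0 ... Extra High: 1.8
```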

Slide 11: Relations Between COCOMO II and COQUALMO
The COQUALMO rating scales for levels of investment in defect removal via automated analysis, peer reviews, and execution testing and tools have been aligned with the COCOMO II RELY rating levels.

Slide 12: Typical Marketplace Competition Value Estimating Relationships
[Figure: three panels, each plotting market share loss VL(Td) against system delivery time Td: (1) Fixed-schedule event support: value of on-time system delivery; (2) Off-line data processing: value loss vs. system delivery time; (3) Internet services, wireless infrastructure: value loss vs. system delivery time.]
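
A hedged sketch of the curve shapes these panels suggest: a step for fixed-schedule event support (the system has no value after the event), a shallow ramp for off-line data processing, and a steep ramp for internet services. The functional forms and constants are my assumptions for illustration, not the slides' calibrated relationships:

```python
# Illustrative market-share-loss curves VL(Td) for the three panels.
# Shapes and constants are assumptions, not calibrated relationships.
def vl_event_support(td, event_date=10):
    return 0.0 if td <= event_date else 1.0  # all value lost after the event

def vl_offline_processing(td):
    return min(1.0, 0.02 * td)               # shallow, gradual erosion

def vl_internet_services(td):
    return min(1.0, 0.10 * td)               # steep; rivals move fast

for td in (5, 10, 15):
    print(td, vl_event_support(td), vl_offline_processing(td),
          vl_internet_services(td))
```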

Slide 13: How Much Dependability is Enough?
[Figure: risk exposure vs. COCOMO II added % test time. COQUALMO supplies P(L); separate dependability-risk curves are drawn for early startup, commercial, and high finance S(L) values, plus a rising market share erosion risk curve RE_m; each combination has its own sweet spot.]
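
Building on the slide 6 sketch, the point here is that one P(L) curve combined with segment-specific sizes of loss moves the sweet spot: a small S(L) (early startup) favors shipping early, a large S(L) (high finance) justifies much more testing. A minimal sketch with assumed functions and S(L) values:

```python
# How the sweet spot shifts with segment-specific size of loss S(L).
# Curve shapes and S(L) values are assumptions for illustration.
import math

def p_loss(t):      # P(L) falls with added % test time t
    return math.exp(-0.08 * t)

def re_market(t):   # market-erosion RE rises with delay
    return 0.5 * t

S_L = {"Early startup": 10, "Commercial": 50, "High finance": 300}

for segment, s in S_L.items():
    total = {t: p_loss(t) * s + re_market(t) for t in range(101)}
    t_star = min(total, key=total.get)
    print(f"{segment:>14}: sweet spot at ~{t_star}% added test time")
# Early startup: ~6%, Commercial: ~26%, High finance: ~48%
```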

Slide 14: 20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing, by customer type; a few customer types account for most of the value. An automated test generation tool, by contrast, treats all tests as having equal value.]
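
If 20% of the features carry 80% of the value, a fixed test budget should be spent in value order rather than uniformly. A minimal sketch contrasting the two orderings, using made-up feature values:

```python
# Value-based vs. value-neutral test ordering (feature values are made up).
features = {"search": 6, "theming": 4, "billing": 80,
            "easter_egg": 2, "reporting": 8}  # % of business value

neutral_order = list(features)                           # arbitrary order
value_order = sorted(features, key=features.get, reverse=True)

budget = 2  # only enough time to test two features
neutral_cov = sum(features[f] for f in neutral_order[:budget])
value_cov = sum(features[f] for f in value_order[:budget])
print(f"Value-neutral covers {neutral_cov}% of value; "
      f"value-based covers {value_cov}%")  # 10% vs. 88%
```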

Slide 15: Value-Based vs. Value-Neutral Testing: High Finance
[Figure: risk exposure vs. COCOMO II added % test time, with COQUALMO supplying P(L). Value-based testing uses an exponential S(L) profile, value-neutral testing a linear one, and the two yield different sweet spots.]

Slide 16: Reasoning About the Value of Dependability: iDAVE
iDAVE: Information Dependability Attribute Value Estimator
Extend the iDAVE model to enable risk analyses:
- Determine how much dependability is enough

Slide 17: iDAVE Model Framework
[Figure: iDAVE model framework diagram.]

Slide 18: Usage Scenario of iDAVE Combined Risk Analyses: How Much Dependability is Enough?
1. Estimate software size in terms of value-adding capabilities.
2. Enter the project size and cost drivers into iDAVE to obtain the project's delivered defect density (= (defects introduced - defects removed)/KSLOC) for the range of dependability driver (RELY) ratings from Very Low to Very High.
3. Involve stakeholders in determining the sizes of loss, based on the value estimating relationships for software dependability attributes.
4. Involve stakeholders in determining the risk exposures of market erosion, based on the delivery time of the product.
5. Apply iDAVE to assess the probability of loss for the range of RELY ratings from Very Low to Very High.
6. Apply iDAVE to combine the dependability risk exposures and market erosion risk exposures to find the sweet spot (see the sketch below).
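
A compact sketch tying the six steps together; every function and constant here is a hypothetical stand-in for the corresponding iDAVE/COQUALMO step, not a calibrated model:

```python
# Hypothetical end-to-end walk through the six steps above; all numbers
# and functional forms are illustrative stand-ins, not iDAVE outputs.
import math

RELY_LEVELS = ["Very Low", "Low", "Nominal", "High", "Very High"]
DENSITY = {"Very Low": 60, "Low": 28.5, "Nominal": 14.4,
           "High": 7.5, "Very High": 3.6}  # step 2: delivered defects/KSLOC
S_LOSS = 100                               # step 3: size of dependability loss

def re_market(level):   # step 4: higher RELY -> more testing -> later delivery
    return 18 * RELY_LEVELS.index(level)

def p_loss(level):      # step 5: P(L) grows with delivered defect density
    return 1 - math.exp(-0.05 * DENSITY[level])

totals = {lvl: p_loss(lvl) * S_LOSS + re_market(lvl) for lvl in RELY_LEVELS}
best = min(totals, key=totals.get)         # step 6: combined sweet spot
print(f"Sweet spot RELY rating: {best}")   # -> High for these made-up numbers
```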

Slide 19: Conclusions
There is an increasing need for value-based approaches to software dependability achievement and evaluation:
- Methods and models are emerging to address these needs.
Value-based dependability cost/quality models such as COCOMO II and COQUALMO can be combined with Value Estimating Relationships (VERs) to:
- Perform risk analyses that determine "how much dependability is enough"
- Perform sensitivity analyses to find the most appropriate dependability investment level for different cost-of-delay situations
- Determine the relative payoff of value-based vs. value-neutral testing