
Slide 1: The Economics of Software Quality
Barry Boehm, USC (boehm@sunset.usc.edu, http://sunset.usc.edu)
Motorola Quality Workshop, September 15, 2005

Slide 2: Outline
The business case for software quality
– Example: net value of test aids
– Value depends on stakeholder value propositions
What qualities do stakeholders really value?
– Safety, accuracy, response time, …
– Attributes often conflict
Software quality in a competitive world
Conclusions

Slide 3: Software Quality Business Case
Software quality investments compete for resources
– With investments in functionality, response time, adaptability, speed of development, …
Each investment option generates curves of net value and ROI
– Net Value: NV = PV(benefit flows − cost flows)
– Present Value: PV = flows at future times, discounted to the present
– Return on Investment: ROI = NV / PV(cost flows)
Value of benefits varies by stakeholder and role
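
To make the slide's definitions concrete, here is a minimal Python sketch of NV, PV, and ROI for one quality-investment option. The discount rate and the cost/benefit flows are made-up illustrative numbers, not figures from the talk.

def present_value(flows, rate=0.10):
    """Discount a list of yearly flows (year 0 first) back to today."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical flows for one option (e.g., added test tooling):
costs    = [100, 20, 20, 20]   # upfront purchase plus yearly upkeep
benefits = [0, 60, 80, 90]     # avoided rework and field-defect losses

pv_benefits = present_value(benefits)
pv_costs    = present_value(costs)
net_value   = pv_benefits - pv_costs      # NV = PV(benefit flows - cost flows)
roi         = net_value / pv_costs        # ROI = NV / PV(cost flows)
print(f"NV = {net_value:.1f}, ROI = {roi:.2f}")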

Slide 4: Software Testing Business Case
Vendor proposition
– Our test data generator will cut your test costs in half
– We'll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original costs, you're 20% ahead
Any concerns with the vendor proposition?

Slide 5: Software Testing Business Case
Vendor proposition
– Our test data generator will cut your test costs in half
– We'll provide it to you for 30% of your test costs
– After you run all your tests for 50% of your original costs, you're 20% ahead
Any concerns with the vendor proposition?
– The test data generator is value-neutral*
– Every test case and defect is treated as equally important
– Usually, 20% of test cases cover 80% of the business value
* As are most current software engineering techniques

Slide 6: 20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: percent of value for correct customer billing vs. customer type. An automated test generation tool treats all tests as having equal value, while the actual value is concentrated in a small fraction of the customer types.]

Slide 7: Value-Based Testing Provides More Net Value
[Chart: net value NV vs. percent of tests run. The test data generator curve peaks at (100% of tests, NV = 20); the value-based testing curve peaks at (30% of tests, NV = 58).]

% Tests run | Test Data Generator (Cost / Value / NV) | Value-Based Testing (Cost / Value / NV)
      0     |             30 /   0 / -30              |               0 /   0 /   0
     10     |             35 /  10 / -25              |              10 /  50 /  40
     20     |             40 /  20 / -20              |              20 /  75 /  55
     30     |             45 /  30 / -15              |              30 /  88 /  58
     40     |             50 /  40 / -10              |              40 /  94 /  54
     …      |                                         |
    100     |             80 / 100 / +20              |             100 / 100 /   0
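
A short sketch that recomputes the net-value columns above. The cumulative-value curve for value-based testing is read off the slide, and the costs follow slide 4's vendor proposition (30% of the test budget for the generator, then half-cost test runs); nothing else is assumed.

# Recompute the net-value comparison in the table above.
value_curve = {0: 0, 10: 50, 20: 75, 30: 88, 40: 94, 100: 100}  # % of business value

for pct in sorted(value_curve):
    # Test data generator: 30% license fee, half-cost runs, every test equally valuable.
    tdg_cost  = 30 + 0.5 * pct
    tdg_value = pct                    # value accrues linearly with tests run
    # Value-based testing: full test cost, but highest-value tests run first.
    vbt_cost  = pct
    vbt_value = value_curve[pct]
    print(f"{pct:3d}% tests  TDG NV = {tdg_value - tdg_cost:6.1f}   "
          f"VBT NV = {vbt_value - vbt_cost:6.1f}")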

Slide 8: There is No Universal Quality-Value Metric
Different stakeholders rely on different value attributes
– Protection: safety, security, privacy
– Robustness: reliability, availability, survivability
– Quality of Service: performance, accuracy, ease of use
– Adaptability: evolvability, interoperability
– Affordability: cost, schedule, reusability
Value attributes continue to tier down
– Performance: response time, resource consumption (CPU, memory, communications)
Value attributes are scenario-dependent
– e.g., 5 seconds normal response time; 2 seconds in a crisis
Value attributes often conflict
– Most often with performance and affordability

Slide 9: Major Information System Quality Stakeholders
[Diagram: stakeholder classes surrounding the information system]
– Information Suppliers: citizens, companies
– Information Brokers: financial services, news media, entertainment
– Information Consumers: decisions, education
– Mission Controllers: pilots, distribution controllers
– Dependents: passengers, patients
– Developers, Acquirers, Administrators

Slide 10: Overview of Stakeholder/Value Dependencies
Strength of direct dependency on each value attribute: ** = critical; * = significant; blank = insignificant or indirect
Attributes: Protection, Robustness, Quality of Service, Adaptability, Affordability
Stakeholders: Developers/Acquirers; Mission Controllers/Administrators; Information Consumers; Information Brokers; Information Suppliers/Dependents
[Table: matrix of dependency ratings by stakeholder and attribute]

Slide 11: Elaboration of Stakeholder/Value Dependencies
[Table: detailed stakeholder/value dependencies]

Slide 12: Implications for Quality Engineering
There is no universal quality metric to optimize
Need to identify the system's success-critical stakeholders
– And their quality priorities
Need to balance satisfaction of stakeholder dependencies
– Stakeholder win-win negotiation
– Quality-attribute tradeoff analysis
Need value-of-quality models, methods, and tools

Slide 13: Outline
The business case for software quality
What qualities do stakeholders really value?
Software quality in a competitive world
– Is market success a monotone function of quality?
– Is quality really free?
– Is "faster, better, cheaper" really achievable?
– Value-based vs. value-neutral methods
Conclusions

Slide 14: Competing on Cost and Quality - adapted from Michael Porter, Harvard
[Chart: ROI vs. price/cost point. ROI is low when a product is too cheap (low quality) or overpriced, peaks for the cost leader (acceptable quality) and for the quality leader (acceptable cost), and dips for firms stuck in the middle.]

Slide 15: Competing on Schedule and Quality - A risk analysis approach
Risk Exposure: RE = Prob(Loss) * Size(Loss)
– "Loss": financial; reputation; future prospects, …
For multiple sources of loss:
RE = Σ over sources [Prob(Loss_source) * Size(Loss_source)]

Slide 16: Example RE Profile: Time to Ship - Loss due to unacceptable quality
[Chart: RE = P(L) * S(L) vs. time to ship (amount of testing). With little testing there are many defects (high P(L)) and critical defects (high S(L)); with more testing there are few defects (low P(L)) and only minor defects (low S(L)), so this RE curve falls as testing continues.]

Slide 17: Example RE Profile: Time to Ship - Loss due to unacceptable quality and to market share erosion
[Chart: adds a second RE curve for market-share erosion, which rises with time to ship. With few, weak rivals, P(L) and S(L) are low; with many, strong rivals, they are high.]

Slide 18: Example RE Profile: Time to Ship - Sum of Risk Exposures
[Chart: the sum of the quality-loss RE curve (falling with more testing) and the market-erosion RE curve (rising with more testing) has a minimum, the "sweet spot" for the amount of testing before shipping.]
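
A minimal sketch of the sweet-spot idea on slides 16-18. The two risk-exposure curves below are hypothetical shapes (quality risk falling with testing, market-erosion risk rising), not calibrated data; the sweet spot is simply where their sum is smallest.

import math

def re_quality(t):
    # Loss from shipping with defects: P(L) and S(L) fall as testing continues (weeks).
    return 100 * math.exp(-0.15 * t)

def re_market(t):
    # Loss from market-share erosion: rivals ship first, so RE grows with delay.
    return 5 * t

total = {t: re_quality(t) + re_market(t) for t in range(0, 41)}
sweet_spot = min(total, key=total.get)
print(f"Sweet spot at ~{sweet_spot} weeks of testing, total RE = {total[sweet_spot]:.1f}")

Raising the size of loss for defects (a safety-critical system) shifts the sweet spot rightward; raising the size of loss for delays (an internet startup) shifts it leftward, as the next two slides show.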

Slide 19: Comparative RE Profile: Safety-Critical System
[Chart: with a higher S(L) for defects, the quality-loss curve rises, moving the sweet spot rightward from the mainstream sweet spot toward more testing (a high-Q sweet spot).]

Slide 20: Comparative RE Profile: Internet Startup
[Chart: with a higher S(L) for delays, the market-erosion curve rises, moving the sweet spot leftward toward less testing (a low-TTM sweet spot). TTM: time to market.]

Slide 21: How Much Architecting is Enough? - A COCOMO II Analysis
[Chart: total % added schedule vs. percent of project schedule devoted to initial architecture and risk resolution, for 10, 100, and 10,000 KSLOC projects. Added schedule devoted to rework (the COCOMO II RESL factor) falls as architecting increases; the sum has a sweet spot that moves rightward with project size. Sweet-spot drivers: rapid change moves it leftward, high assurance moves it rightward.]

Slide 22: Interim Conclusions
Unwise to try to compete on both cost/schedule and quality
– Some exceptions: major technology or marketplace edge
There are no one-size-fits-all cost/schedule/quality strategies
Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough
– Buying information to reduce risk
Often difficult to determine parameter values
– Some COCOMO II values are discussed next

Slide 23: Software Development Cost/Quality Tradeoff - COCOMO II calibration to 161 projects
[Chart: relative cost/source instruction (0.8 to 1.3) vs. RELY rating (Very Low to Very High). Each rating corresponds to a defect risk (slight inconvenience; low, easily recoverable loss; moderate, recoverable loss; high financial loss; loss of human life) and a rough MTBF (mean time between failures) of 1 hour, 1 day, 1 month, 2 years, or 100 years. One calibration point is marked: in-house support software at 1.0 (Nominal).]

Slide 24: Software Development Cost/Quality Tradeoff - COCOMO II calibration to 161 projects
[Same chart, with two more calibration points added: commercial cost leader at 0.92 (Low) and commercial quality leader at 1.10 (High), alongside in-house support software at 1.0 (Nominal).]

Slide 25: Software Development Cost/Quality Tradeoff - COCOMO II calibration to 161 projects
[Same chart, completed:]
RELY rating | Defect risk                  | Rough MTBF | Relative cost/instruction | Example
Very Low    | Slight inconvenience         | 1 hour     | 0.82                      | startup demo
Low         | Low, easily recoverable loss | 1 day      | 0.92                      | commercial cost leader
Nominal     | Moderate, recoverable loss   | 1 month    | 1.0                       | in-house support software
High        | High financial loss          | 2 years    | 1.10                      | commercial quality leader
Very High   | Loss of human life           | 100 years  | 1.26                      | safety-critical
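
The calibration points above, collected into a small Python lookup. The multipliers, MTBF bands, and example project types are the ones shown on the slide; the baseline development cost is a made-up illustration.

RELY = {
    "Very Low":  {"cost_mult": 0.82, "mtbf": "1 hour",    "example": "startup demo"},
    "Low":       {"cost_mult": 0.92, "mtbf": "1 day",     "example": "commercial cost leader"},
    "Nominal":   {"cost_mult": 1.00, "mtbf": "1 month",   "example": "in-house support software"},
    "High":      {"cost_mult": 1.10, "mtbf": "2 years",   "example": "commercial quality leader"},
    "Very High": {"cost_mult": 1.26, "mtbf": "100 years", "example": "safety-critical"},
}

baseline_cost = 2_000_000  # hypothetical development cost at Nominal reliability
for rating, r in RELY.items():
    print(f"{rating:9s}  x{r['cost_mult']:.2f}  ~MTBF {r['mtbf']:9s}  "
          f"cost ~ ${baseline_cost * r['cost_mult']:,.0f}  ({r['example']})")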

Slide 26: "Quality is Free"
Did Philip Crosby's book get it all wrong?
Investments in dependable systems:
– Cost extra for simple, short-life systems
– Pay off for high-value, long-life systems

Slide 27: Software Life-Cycle Cost vs. Quality
Relative cost to develop, by COCOMO II RELY rating:
Very Low 0.82 | Low 0.92 | Nominal 1.0 | High 1.10 | Very High 1.26

Slide 28: Software Life-Cycle Cost vs. Quality
[Chart adds relative cost to maintain by RELY rating, alongside the development costs 0.82-1.26. Maintenance values shown include 1.23 (Very Low) and 1.10 (Low), plus 0.99 and 1.07.]

Slide 29: Software Life-Cycle Cost vs. Quality
[Chart adds the combined relative cost to develop and maintain, with maintenance taken as 70% of life-cycle cost ("70% Maint."). Combined values shown include 1.11 (Very Low), 1.05 (Low), 1.07, and 1.20.]
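
A sketch of how the combined bars appear to be formed, assuming the combined figures weight development at 30% and maintenance at 70% of life-cycle cost (the slide's "70% Maint." note). The development and maintenance multipliers are the ones read off the charts; the arithmetic reproduces the 1.11 and 1.05 combined values.

# Combined relative life-cycle cost, assuming a 30% development / 70% maintenance split.
develop  = {"Very Low": 0.82, "Low": 0.92, "Nominal": 1.00}
maintain = {"Very Low": 1.23, "Low": 1.10, "Nominal": 0.99}

for rating in develop:
    lifecycle = 0.30 * develop[rating] + 0.70 * maintain[rating]
    print(f"{rating:9s}  relative life-cycle cost ~ {lifecycle:.2f}")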

Slide 30: Software Ownership Cost vs. Quality
[Chart adds the total relative cost to develop, maintain, own, and operate, under two cases: operational-defect cost = 0, and operational-defect cost at Nominal dependability equal to the software life-cycle cost. In the latter case the low-dependability ratings become far more expensive (Very Low = 2.55, Low = 1.52); values of 0.76 and 0.69 also appear on the chart.]

Slide 31: Outline
The business case for software quality
What qualities do stakeholders really value?
Software quality in a competitive world
– Is market success a monotone function of quality?
– Is quality really free?
– Is "faster, better, cheaper" really achievable?
– Value-based vs. value-neutral methods
Conclusions

Slide 32: Cost, Schedule, Quality: Pick Any Two?
[Diagram: triangle with C, S, and Q at its corners.]

Slide 33: Cost, Schedule, Quality: Pick Any Two?
[Diagram: the C-S-Q triangle, repeated.]
Consider C, S, Q as the independent variables and the feature set as the dependent variable

Slide 34: C, S, Q as Independent Variable
Determine the Desired Delivered Defect Density (D4)
– Or a value-based equivalent
Prioritize desired features
– Via QFD, IPT, stakeholder win-win
Determine the core capability
– 90% confidence of achieving D4 within cost and schedule
– Balance parametric models and expert judgment
Architect for ease of adding next-priority features
– Hide sources of change within modules (Parnas)
Develop the core capability to the D4 quality level
– Usually in less than the available cost and schedule
Add next-priority features as resources permit
Versions of this process were used successfully on 32 of 34 USC digital library projects
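
A sketch of the "feature set as dependent variable" idea in the process above: fix the cost/schedule budget and the quality level, then greedily add the highest-priority features that still fit. The feature names, priorities, and costs are made up for illustration.

features = [  # (name, stakeholder priority, estimated cost) -- hypothetical
    ("order entry",       10, 30),
    ("customer billing",   9, 25),
    ("status reporting",   6, 20),
    ("usage analytics",    4, 15),
    ("theming / skins",    2, 10),
]
budget = 80  # effort available after reserving enough margin to hit the D4 quality target

core, spent = [], 0
for name, priority, cost in sorted(features, key=lambda f: -f[1]):
    if spent + cost <= budget:
        core.append(name)      # feature fits: add it to the core capability
        spent += cost

print("Core capability:", core, f"(effort used: {spent}/{budget})")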

Slide 35: Value-Based vs. Value-Neutral Methods
Value-based defect reduction
Constructive Quality Model (COQUALMO)
Information Dependability Attribute Value Estimation (iDAVE) model

Slide 36: Value-Based Defect Reduction Example: Goal-Question-Metric (GQM) Approach
Goal: Our supply-chain software packages have too many defects. We need to get their defect rates down.
Question: ?

Slide 37: Value-Based GQM Approach – I
Q: How do software defects affect system value goals? Ask why the initiative is needed
– Order processing
– Too much downtime on the operations critical path
– Too many defects in operational plans
– Too many new-release operational problems
G: New system-level goal: decrease software-defect-related losses in operational effectiveness
– With the high-leverage problem areas above as specific subgoals
New Q: ?

Slide 38: Value-Based GQM Approach – II
New Q: Perform system problem-area root-cause analysis: ask why problems are happening, via models
Example: downtime on the critical path
– Process steps: Order → Validate order → Validate items in stock → Schedule packaging and delivery → Prepare delivery packages → Deliver order (with status reports produced along the way)
Where are the primary software-defect-related delays? Where are the biggest improvement-leverage areas?
– Reducing software defects in the Scheduling module
– Reducing non-software order-validation delays
– Taking Status Reporting off the critical path
– Downstream, getting a new Web-based order entry system
Ask "why not?" as well as "why?"

Slide 39: Value-Based GQM Results
Defect tracking weighted by system-value priority
– Focuses defect removal on the highest-value effort
Significantly higher effect on bottom-line business value
– And on customer satisfaction levels
Engages software engineers in system issues
– Fits the increasing system-criticality of software
Strategies often helped by quantitative models
– COQUALMO, iDAVE
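
A sketch of value-weighted defect tracking: each defect's priority is its business-value impact times its likelihood of being triggered, rather than a raw count. The defects and numbers below are hypothetical; in practice the weights come from the stakeholder value priorities identified earlier.

# Rank the defect backlog by expected business-value loss instead of raw counts.
defects = [  # (id, module, value at risk if triggered, probability of triggering)
    ("D-101", "scheduling",       80_000, 0.30),
    ("D-102", "status reporting",  5_000, 0.60),
    ("D-103", "order validation", 40_000, 0.20),
    ("D-104", "theming",           1_000, 0.90),
]

ranked = sorted(defects, key=lambda d: d[2] * d[3], reverse=True)
for did, module, value, prob in ranked:
    print(f"{did}  {module:16s} expected loss ~ ${value * prob:,.0f}")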

Slide 40: Current COQUALMO System
[Diagram: COCOMO II plus COQUALMO (Defect Introduction Model and Defect Removal Model).]
Inputs: software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing)
Outputs: software development effort, cost, and schedule estimate; number of residual defects; defect density per unit of size
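
A structural sketch of the COQUALMO flow described above: defects are introduced as a function of size and driver attributes, then filtered by the three defect-removal profiles. The introduction rate and removal fractions below are placeholders, not the calibrated COQUALMO values.

size_ksloc = 50
intro_rate_per_ksloc = 30          # placeholder nominal introduction rate

removal_fraction = {               # placeholder efficiencies by removal profile
    "automated_analysis": 0.30,
    "peer_reviews":       0.50,
    "execution_testing":  0.60,
}

introduced = size_ksloc * intro_rate_per_ksloc
residual = introduced
for profile, frac in removal_fraction.items():
    residual *= (1.0 - frac)       # each profile removes its fraction of what remains

print(f"Introduced: {introduced:.0f}, residual: {residual:.0f} "
      f"({residual / size_ksloc:.1f} defects/KSLOC)")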

Slide 41: Defect Removal Rating Scales (COCOMO II, p. 263)
Automated Analysis:
– Very Low: simple compiler syntax checking
– Low: basic compiler capabilities
– Nominal: compiler extensions; basic requirements and design consistency checking
– High: intermediate-level module checking; simple requirements/design checking
– Very High: more elaborate requirements/design checking; basic distributed-processing checking
– Extra High: formalized specification and verification; advanced distributed processing
Peer Reviews:
– Very Low: no peer review
– Low: ad-hoc informal walk-throughs
– Nominal: well-defined preparation and review, minimal follow-up
– High: formal review roles, well-trained people, and basic checklists
– Very High: root-cause analysis, formal follow-up, use of historical data
– Extra High: extensive review checklists; statistical control
Execution Testing and Tools:
– Very Low: no testing
– Low: ad-hoc testing and debugging
– Nominal: basic testing; test criteria based on checklists
– High: well-defined test sequences and a basic test-coverage tool system
– Very High: more advanced test tools and preparation; distributed monitoring
– Extra High: highly advanced tools; model-based testing

Slide 42: Defect Removal Estimates - Nominal Defect Introduction Rates
[Chart: delivered defects per KSLOC vs. composite defect removal rating.]

Slide 43: iDAVE Model

Slide 44: ROI Analysis Results Comparison

Slide 45: Conclusions: Software Quality (SWQ)
There is no universal SWQ metric to optimize
– Need to balance stakeholder SWQ value propositions
Increasing need for value-based approaches to SWQ
– Methods and models are emerging to address these needs
"Faster, better, cheaper" is feasible
– If feature content can be a dependent variable
"Quality is free" for stable, high-value, long-life systems
– But not for dynamic, lower-value, short-life systems
Future trends intensify SWQ needs and challenges
– Criticality, complexity, decreased control, faster change

