Value-Based Software Engineering: Motivation and Key Practices


1 Value-Based Software Engineering: Motivation and Key Practices
Barry Boehm, USC CS 510 Lecture Fall 2009

2 Outline
- Value-based software engineering (VBSE) motivation, examples, and definitions
- VBSE key practices
  - Benefits realization analysis
  - Stakeholder Win-Win negotiation
  - Business case analysis
  - Continuous risk and opportunity management
  - Concurrent system and software engineering
  - Value-based monitoring and control
  - Change as opportunity
- Conclusions and references

3 Software Testing Business Case
Vendor proposition:
- Our test data generator will cut your test costs in half
- We'll provide it to you for 30% of your test costs
- After you run all your tests for 50% of your original cost, you are 20% ahead
Any concerns with the vendor proposition?

4 Software Testing Business Case
Vendor proposition:
- Our test data generator will cut your test costs in half
- We'll provide it to you for 30% of your test costs
- After you run all your tests for 50% of your original cost, you are 20% ahead
Any concerns with the vendor proposition?
- 34 reasons in a 2004 ABB experience paper (C. Persson and N. Yilmazturk, Proceedings, ASE 2004): unrepresentative test coverage, too much output data, lack of test validity criteria, poor test design, instability due to rapid feature changes, lack of preparation and experience (automated chaos yields faster chaos), ...
- But one more significant reason

5 Software Testing Business Case
Vendor proposition:
- Our test data generator will cut your test costs in half
- We'll provide it to you for 30% of your test costs
- After you run all your tests for 50% of your original cost, you are 20% ahead
Any concerns with the vendor proposition?
- The test data generator is value-neutral*
  - It assumes every test case and defect is equally important
  - Usually, 20% of test cases cover 80% of the business case
*As are most current software engineering techniques

6 20% of Features Provide 80% of Value: Focus Testing on These (Bullock, 2000)
[Chart: % of value for correct customer billing vs. customer type; a few customer types account for most of the billing value, while an automated test generation tool treats all tests as having equal value]

7 Value-Based Testing Provides More Net Value
[Chart: net value NV vs. percent of tests run. Value-based testing peaks at roughly (30% of tests, NV = 58), while the test data generator's all-tests-equal curve only reaches (100% of tests, NV = 20); running the remaining low-value tests adds cost faster than value.]
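To make the net-value argument concrete, here is a minimal Python sketch (not from the lecture) that compares running tests in descending business-value order against an arbitrary, value-neutral order; the per-test values and their Pareto-like shape are assumptions chosen only for illustration.

```python
# Hypothetical illustration of value-based vs. value-neutral test ordering.
# The per-test business values are invented to mimic a Pareto shape
# (a few tests carry most of the value); nothing here comes from real data.
import random

def net_value(ordered_values, cost_per_test, fraction_run):
    """Net value = business value of the tests actually run, minus their cost."""
    n_run = int(len(ordered_values) * fraction_run)
    return sum(ordered_values[:n_run]) - cost_per_test * n_run

values = [100.0 / (i + 1) ** 1.5 for i in range(100)]  # Pareto-like value distribution
cost_per_test = 1.0

value_based = sorted(values, reverse=True)   # highest-value tests first
value_neutral = values[:]                    # arbitrary (value-neutral) order
random.Random(42).shuffle(value_neutral)

for fraction in (0.3, 1.0):
    vb = net_value(value_based, cost_per_test, fraction)
    vn = net_value(value_neutral, cost_per_test, fraction)
    print(f"{int(fraction * 100):3d}% of tests run: "
          f"value-based NV = {vb:6.1f}, value-neutral NV = {vn:6.1f}")
```

Under these assumed numbers the value-based ordering delivers most of its net value after the best 30% of tests, and running the full suite actually lowers net value, which is the shape the slide's chart illustrates.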

8 Value-Based Reading (VBR) Experiment — Keun Lee, ISESE 2005
By Number (p-value, % Group A higher):
- Average of Concerns: p = 0.202, 34%
- Average of Problems: p = 0.056, 51%
- Average of Concerns per hour: p = 0.026, 55%
- Average of Problems per hour: p = 0.023, 61%
By Impact (p-value, % Group A higher):
- Average Impact of Concerns: p = 0.049, 65%
- Average Impact of Problems: p = 0.012, 89%
- Average Cost Effectiveness of Concerns: p = 0.004, 105%
- Average Cost Effectiveness of Problems: p = 0.007, 108%
Group A: 15 IV&V personnel using VBR procedures and checklists
Group B: 13 IV&V personnel using previous value-neutral checklists (significantly higher numbers of trivial typo and grammar faults)

9 Motivation for Value-Based SE
Current SE methods are basically value-neutral:
- Every requirement, use case, object, test case, and defect is equally important
- Object-oriented development is a logic exercise
- "Earned value" systems don't track business value
- Separation of concerns: the SE's job is to turn requirements into verified code
- Ethical concerns are separated from daily practices
Value-neutral SE methods are increasingly risky:
- Software decisions increasingly drive system value
- Corporate adaptability to change is achieved via software decisions
- System value-domain problems are the chief sources of software project failures

10 Why Software Projects Fail

11 The “Separation of Concerns” Legacy
“The notion of ‘user’ cannot be precisely defined, and therefore has no place in CS or SE.” - Edsger Dijkstra, ICSE 4, 1979
“Analysis and allocation of the system requirements is not the responsibility of the SE group but is a prerequisite for their work.” - Mark Paulk et al., SEI Software CMM* v1.1, 1993
*Capability Maturity Model

12 Resulting Project Social Structure
[Diagram: project groups (SOFTWARE, MGMT., AERO., ELEC., G&C, MFG., COMM, PAYLOAD), with the software group thinking, "I wonder when they'll give us our requirements?"]

13 20% of Fires Cause 80% of Property Loss: Focus Fire Dispatching on These?
[Chart: % of property loss vs. % of fires; a small fraction of the fires accounts for most of the property loss]

14 Missing Stakeholder Concerns: Fire Dispatching System
- Dispatch to minimize value of property loss
  - Neglects safety and least-advantaged property owners
- English-only dispatcher service
  - Neglects least-advantaged immigrants
- Minimal recordkeeping
  - Reduced accountability
- Tight budget; design for nominal case
  - Neglects reliability, safety, and crisis performance

15 Key Definitions
Value (from Latin "valere", to be worth):
- A fair return or equivalent in goods, services, or money
- The monetary worth of something
- Relative worth, utility, or importance
Software validation (also from Latin "valere"):
- Validation: Are we building the right product?
- Verification: Are we building the product right?

16 Conclusions So Far
- Value considerations are software success-critical
- "Success" is a function of key stakeholder values
  - Risky to exclude key stakeholders
  - Values vary by stakeholder role
- Non-monetary values are important: fairness, customer satisfaction, trust
- A value-based approach integrates ethics into daily software engineering practice

17 Outline
- Value-based software engineering (VBSE) motivation, examples, and definitions
- VBSE key practices
  - Benefits realization analysis
  - Stakeholder Win-Win negotiation
  - Business case analysis
  - Continuous risk and opportunity management
  - Concurrent system and software engineering
  - Value-based monitoring and control
  - Change as opportunity
- Conclusions and references

18 DMR/BRA* Results Chain
[Results chain diagram:]
- ASSUMPTION: Order-to-delivery time is an important buying criterion
- INITIATIVE: Implement a new order entry system
- Contribution: Reduce time to process order
- INTERMEDIATE OUTCOME: Reduced order-processing cycle
- Contribution: Reduce time to deliver product
- OUTCOME: Increased sales
*DMR Consulting Group's Benefits Realization Approach
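As a rough illustration of how such a results chain could be captured in code, here is a small Python sketch; the node kinds and field names are hypothetical and are not part of DMR's Benefits Realization Approach tooling.

```python
# Hypothetical representation of the slide's results chain as plain data.
# Node kinds and names are illustrative only, not DMR/BRA notation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    kind: str                              # "INITIATIVE", "OUTCOME", or "ASSUMPTION"
    description: str
    contributes_to: List["Node"] = field(default_factory=list)

increased_sales = Node("OUTCOME", "Increased sales")
reduced_cycle = Node("OUTCOME", "Reduced order-processing cycle (intermediate outcome)",
                     contributes_to=[increased_sales])
new_order_entry = Node("INITIATIVE", "Implement a new order entry system",
                       contributes_to=[reduced_cycle])
buying_criterion = Node("ASSUMPTION", "Order-to-delivery time is an important buying criterion")

def print_chain(node: Node, depth: int = 0) -> None:
    """Walk the chain from an initiative toward the outcomes it contributes to."""
    print("  " * depth + f"{node.kind}: {node.description}")
    for downstream in node.contributes_to:
        print_chain(downstream, depth + 1)

print_chain(new_order_entry)
print(f"{buying_criterion.kind}: {buying_criterion.description}")
```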

19 Expanded Order Processing System Benefits Chain

20 The Model-Clash Spider Web: MasterNet
- Stakeholder value propositions (win conditions)

21 Projecting Yourself Into Others’ Win Situations
Counterexample: The Golden Rule
- Do unto others ...
- ... as you would have others do unto you

22 Projecting Yourself Into Others’ Win Situations
Counterexample: The Golden Rule
- Do unto others ... as you would have others do unto you
- Build computer systems to serve users and operators ... assuming users and operators like to write programs and know computer science
- Computer science world (compilers, OS, etc.): users are programmers
- Applications world: users are pilots, doctors, tellers

23 EasyWinWin OnLine Negotiation Steps

24 Red cells indicate lack of consensus.
Oral discussion of the cell graph reveals unshared information, unnoticed assumptions, hidden issues, constraints, etc.

25 Example of Business Case Analysis
ROI = Present Value[(Benefits - Costs) / Costs]
[Chart: return on investment vs. time for Option A and Option B; both ROI curves start negative and rise over time, with Option B reaching the higher long-run ROI]

26 Example of Business Case Analysis
ROI = Present Value[(Benefits - Costs) / Costs]
[Chart: the same ROI-vs.-time curves for Option A and Option B, with an added "Option B-Rapid" curve]
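A small Python sketch of the slide's formula, ROI = Present Value[(Benefits - Costs)/Costs], reading it as (PV of benefits minus PV of costs) over PV of costs; the discount rate and the yearly cash flows for the two options are invented for illustration and are not the lecture's numbers.

```python
# Illustrative ROI calculation: ROI = PV[(Benefits - Costs) / Costs].
# Discount rate and cash flows are assumptions, not data from the lecture.

def present_value(amount: float, year: int, rate: float = 0.08) -> float:
    """Discount a cash flow in the given year back to year 0 (8% rate assumed)."""
    return amount / (1.0 + rate) ** year

def roi(costs_by_year, benefits_by_year, rate: float = 0.08) -> float:
    pv_costs = sum(present_value(c, y, rate) for y, c in enumerate(costs_by_year))
    pv_benefits = sum(present_value(b, y, rate) for y, b in enumerate(benefits_by_year))
    return (pv_benefits - pv_costs) / pv_costs

# Option A: smaller investment, earlier but flatter benefits.
# Option B: larger investment, later but faster-growing benefits.
option_a = roi(costs_by_year=[100, 20, 20, 20], benefits_by_year=[0, 60, 90, 100])
option_b = roi(costs_by_year=[180, 40, 40, 40], benefits_by_year=[0, 40, 180, 320])
print(f"Option A ROI: {option_a:.2f}")
print(f"Option B ROI: {option_b:.2f}")
```

With these assumed cash flows Option B ends up with the higher discounted ROI even though it is deeper underwater early on, which is the trade-off the two curves in the chart depict.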

27 Examples of Utility Functions: Response Time
[Small value-vs.-response-time utility curves for different domains: real-time control and event support; mission planning and competitive time-to-market (with a critical region); event prediction (weather) and software sizing; data archiving; priced quality of service]

28 How Much Testing is Enough?
[Sweet-spot chart: risk exposure RE = P(L) × S(L) vs. added % test time, trading off risk due to low dependability (for Early Startup, Commercial, and High Finance stakeholders) against risk due to market share erosion; each stakeholder class has a different sweet spot]
- COCOMO II added % test time: 12, 22, 34, 54
- COQUALMO P(L): 1.0, .475, .24, .125, 0.06
- Early Startup: .33, .19, .11, .06, .03
- Commercial: .56, .32, .18, .10
- High Finance: 3.0, 1.68, .96, .54, .30
- Market risk REm: .008, .027, .09
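The sweet-spot idea can be sketched numerically: total risk exposure is the dependability risk (which falls as test time is added) plus the market-erosion risk (which rises). The Python below reuses the slide's P(L) figures but treats S(L) and part of the market-risk series as labeled assumptions, so the resulting sweet spot is only illustrative.

```python
# Illustrative sweet-spot calculation: total RE = dependability RE + market-erosion RE.
# P(L) values echo the slide's COQUALMO-style numbers; S(L) and the market-risk
# series are assumptions made for this sketch, not calibrated model output.

added_test_time = [0, 12, 22, 34, 54]           # % added test time
p_loss = [1.0, 0.475, 0.24, 0.125, 0.06]        # P(L): residual defect risk
s_loss_commercial = 0.56                        # S(L) for a commercial product (assumed)
re_market = [0.0, 0.008, 0.027, 0.09, 0.30]     # market share erosion RE (partly assumed)

re_total = [p * s_loss_commercial + m for p, m in zip(p_loss, re_market)]
for t, re in zip(added_test_time, re_total):
    print(f"{t:2d}% added test time -> total RE = {re:.3f}")

sweet = min(range(len(re_total)), key=re_total.__getitem__)
print(f"Sweet spot (for these assumed numbers) near {added_test_time[sweet]}% added test time")
```

For a higher-consequence stakeholder such as "High Finance", raising S(L) shifts the minimum toward more testing; for an early startup, lowering it shifts the minimum toward less, which is the pattern of sweet spots the chart shows.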

29 Value-Based Defect Reduction Example: Goal-Question-Metric (GQM) Approach
Goal: Our supply chain software packages have too many defects. We need to get their defect rates down.
Question: ?

30 Value-Based GQM Approach – I
Q: How do software defects affect system value goals? (Ask why the initiative is needed.)
- Order processing:
  - Too much downtime on the operations critical path
  - Too many defects in operational plans
  - Too many new-release operational problems
G: New system-level goal: decrease software-defect-related losses in operational effectiveness, with the high-leverage problem areas above as specific subgoals
New Q: ?

31 Value-Based GQM Approach – II
New Q: Perform system problem-area root cause analysis: ask why problems are happening, via models.
Example: downtime on the critical path
[Order-processing flow: Order → Validate order → Validate items in stock → Schedule packaging, delivery → Prepare delivery packages → Deliver items; Produce status reports]
- Where are the primary software-defect-related delays?
- Where are the biggest improvement-leverage areas?
  - Reducing software defects in the Scheduling module
  - Reducing non-software order-validation delays
  - Taking status reporting off the critical path
  - Downstream, getting a new Web-based order entry system
- Ask "why not?" as well as "why?"

32 Value-Based GQM Results
- Defect tracking weighted by system-value priority
  - Focuses defect removal on the highest-value effort
- Significantly higher effect on bottom-line business value
  - And on customer satisfaction levels
- Engages software engineers in system issues
  - Fits the increasing system-criticality of software
- Strategies often helped by quantitative models (COQUALMO, iDAVE)
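A minimal sketch of value-weighted defect tracking, assuming a hypothetical business-value-at-stake score attached to each defect (the field names and numbers are invented): the value-neutral view just counts defects, while the value-based view orders the backlog by value at stake.

```python
# Hypothetical value-based defect tracking: weight each defect by the business
# value at stake instead of counting all defects equally. All data is invented.

defects = [
    {"id": "D-101", "area": "Scheduling module", "value_at_stake": 90},
    {"id": "D-102", "area": "Status reporting",  "value_at_stake": 10},
    {"id": "D-103", "area": "Order validation",  "value_at_stake": 60},
    {"id": "D-104", "area": "Report formatting", "value_at_stake": 5},
]

# Value-neutral view: every defect counts the same.
print(f"Open defects (value-neutral count): {len(defects)}")

# Value-based view: work the backlog in descending order of value at stake.
for d in sorted(defects, key=lambda d: d["value_at_stake"], reverse=True):
    print(f'{d["id"]}  {d["area"]:<18}  value at stake = {d["value_at_stake"]}')
```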

33 Outline
- Value-based software engineering (VBSE) motivation, examples, and definitions
- VBSE key practices
  - Benefits realization analysis
  - Stakeholder Win-Win negotiation
  - Business case analysis
  - Continuous risk and opportunity management
  - Concurrent system and software engineering
  - Value-based monitoring and control
  - Change as opportunity
- Conclusions and references

34 Is This A Risk?
- We just started integrating the software, and we found out that COTS* products A and B just can't talk to each other
- We've got too much tied into A and B to change
- Our best solution is to build wrappers around A and B to get them to talk via CORBA**
- This will take 3 months and $300K
- It will also delay integration and delivery by at least 3 months
*COTS: commercial off-the-shelf
**CORBA: Common Object Request Broker Architecture

35 Is This A Risk?
- We just started integrating the software and we found out that COTS products A and B just can't talk to each other
- We've got too much tied into A and B to change
No, it is a problem:
- It is being dealt with reactively
- Risks involve uncertainties and can be dealt with proactively
- Earlier, this problem was a risk

36 Earlier, This Problem Was A Risk
- A and B are our strongest COTS choices, but there is some chance that they can't talk to each other: probability of loss P(L)
- If we commit to using A and B, and we find out in integration that they can't talk to each other, we'll add more cost and delay delivery by at least 3 months: size of loss S(L)
- We have a risk exposure of RE = P(L) * S(L)
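The formula translates directly into code; in this sketch the probability, the monthly delay cost, and the decision threshold are all assumed numbers for the COTS A/B example, not figures from the lecture.

```python
# Risk exposure for the COTS A/B example: RE = P(L) * S(L).
# The probability and the cost of delay are assumptions for illustration.

def risk_exposure(p_loss: float, s_loss: float) -> float:
    """Risk exposure = probability of loss times size of loss."""
    return p_loss * s_loss

p_loss = 0.4                    # assumed chance that A and B can't interoperate
wrapper_cost = 300_000          # from the slide: wrappers cost $300K
delay_cost = 3 * 100_000        # assumed: 3 months of delay at $100K/month
re = risk_exposure(p_loss, wrapper_cost + delay_cost)
print(f"RE = ${re:,.0f}")

# "Buying information": a $30K prototype (next slide) looks attractive whenever
# it costs far less than the risk exposure it helps resolve.
prototype_cost = 30_000
print("Prototype first" if prototype_cost < re else "Accept and monitor the risk")
```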

37 How Can Risk Management Help You Deal With Risks?
- Buying information
- Risk avoidance
- Risk transfer
- Risk reduction
- Risk acceptance

38 Risk Management Strategies: Buying Information
- Let's spend $30K and 2 weeks prototyping the integration of A's and B's HCIs
- This will buy information on the magnitude of P(L) and S(L)
- If RE = P(L) * S(L) is small, we'll accept and monitor the risk
- If RE is large, we'll use one or more of the other strategies

39 Other Risk Management Strategies
Risk avoidance:
- COTS product C is almost as good as B, and its HCI is compatible with A's
- Delivering on time is worth more to the customer than the small performance loss
Risk transfer:
- If the customer insists on using A and B, have them establish a risk reserve, to be used to the extent that A and B have incompatible HCIs to reconcile
Risk reduction:
- If we build the wrapper right now, we add cost but minimize the schedule delay
Risk acceptance:
- If we can solve the A and B HCI compatibility problem, we'll have a big competitive edge on future procurements
- Let's do this on our own money, and patent the solution

40 Is Risk Management Fundamentally Negative?
- It usually is, but it shouldn't be
- As illustrated by the risk acceptance strategy, it is equivalent to opportunity management
- Opportunity exposure OE = P(Gain) * S(Gain) = expected value
  - P(Gain): Are we likely to get there before the competition?
  - S(Gain): How big is the market for the solution?
- Buying information and the other risk strategies have their opportunity counterparts

41 What Else Can Risk Management Help You Do?
- Determine “How much is enough?” for your products and processes
  - Functionality, documentation, prototyping, COTS evaluation, architecting, testing, formal methods, agility, discipline, …
  - What's the risk exposure of doing too much?
  - What's the risk exposure of doing too little?
- Tailor and adapt your life cycle processes
  - Determine what to do next (specify, prototype, COTS evaluation, business case analysis)
  - Determine how much of it is enough
  - Examples: risk-driven spiral model and extensions (Win-Win, anchor points, RUP, MBASE, CeBASE Method)
- Get help from higher management
  - Organize management reviews around the top-10 risks

42 Example Large-System Risk Analysis: How Much Architecting is Enough?
- The large system involves subcontracting to over a dozen software/hardware specialty suppliers
- An early procurement package means an early start
  - But later delays due to inadequate architecture
  - And resulting integration rework delays
- Developing thorough architecture specs reduces rework delays
  - But increases subcontractor startup delays

43 How Soon to Define Subcontractor Interfaces?
[Chart: risk exposure RE = Prob(Loss) * Size(Loss) due to rework delays vs. time spent defining and validating the architecture. With little architecture time there are many interface defects (high P(L)) including critical ones (high S(L)); with more architecture time there are few interface defects (low P(L)) and only minor ones (low S(L))]

44 How Soon to Define Subcontractor Interfaces?
[Chart: two risk exposure curves vs. time spent defining and validating the architecture: loss due to rework delays (many/critical interface defects, high P(L) and S(L), when little architecture time is spent; few/minor defects when more is spent) and loss due to late subcontract startups (few/short delays with little architecture time; many/long delays with more)]

45 How Soon to Define Subcontractor Interfaces?
[Chart: the sum of the two risk exposures (rework delays plus late subcontract startups) vs. time spent defining and validating the architecture has a sweet spot at an intermediate value, between the extremes of many/critical interface defects on one side and many/long startup delays on the other]

46 How Soon to Define Subcontractor Interfaces? Very Many Subcontractors
[Chart: with very many subcontractors there are many more interfaces, so P(L) and S(L) are higher and the many-subcontractors sweet spot sits at more time spent defining and validating the architecture than the mainstream sweet spot]

47 How Much Architecting Is Enough?
A COCOMO II Analysis
[Chart: total % added schedule vs. percent of project schedule devoted to initial architecture and risk resolution, where the added schedule devoted to rework is the COCOMO II RESL factor; the 10 KSLOC, 100 KSLOC, and 10,000 KSLOC curves have sweet spots at increasing amounts of architecting]
Sweet spot drivers: rapid change moves the sweet spot leftward; high assurance moves it rightward.
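As a rough numerical companion to the chart, the sketch below models total added schedule as architecting time plus rework time, with rework shrinking exponentially as architecting grows and scaling with project size. The coefficients are tuned only to reproduce the chart's qualitative behavior; they are not the calibrated COCOMO II RESL parameters.

```python
# Illustrative "how much architecting is enough" sweet spot: total added schedule
# = architecting time + rework time. The exponential decay and the size scaling
# below are invented for illustration; they are NOT the COCOMO II RESL calibration.
import math

def total_added_schedule(arch_percent: float, size_ksloc: float) -> float:
    rework_percent = 12.0 * size_ksloc ** 0.29 * math.exp(-arch_percent / 15.0)
    return arch_percent + rework_percent

for size in (10, 100, 10_000):
    best = min(range(0, 61), key=lambda a: total_added_schedule(a, size))
    print(f"{size:>6} KSLOC: sweet spot near {best}% of schedule spent on architecting")
```

Under these assumed coefficients the sweet spot moves from under 10% for a 10-KSLOC project to the high 30s for a 10,000-KSLOC project, echoing the chart's pattern that larger (or higher-assurance) projects justify more up-front architecting while rapid change pushes the sweet spot left.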

48 Outline
- Value-based software engineering (VBSE) motivation, examples, and definitions
- VBSE key practices
  - Benefits realization analysis
  - Stakeholder Win-Win negotiation
  - Business case analysis
  - Continuous risk and opportunity management
  - Concurrent system and software engineering
  - Value-based monitoring and control
  - Change as opportunity
- Conclusions and references

49 Sequential Engineering Neglects Risk
[Chart: architecture cost vs. response time (1 to 5 sec): Arch. A (custom, many cache processors) at about $100M vs. Arch. B (modified client-server) at about $50M, with the original spec and the after-prototyping requirement marked on the response-time axis]

50 Change As Opportunity: Agile Methods
- Continuous customer interaction
- Short value-adding increments
- Tacit interpersonal knowledge (stories, planning game, pair programming)
  - Explicit documented knowledge is expensive to change
- Simple design and refactoring vs. Big Design Up Front

51 Five Critical Decision Factors
Represent five dimensions: size, criticality, dynamism, personnel, culture
[Polar "home ground" chart, one axis per dimension; ratings near the center favor agile methods, ratings toward the periphery favor plan-driven methods:]
- Personnel: % Level 1B (40, 30, 20, 10, 0) / % Level 2&3 (15, 20, 25, 30, 35)
- Criticality (loss due to impact of defects): comfort, discretionary funds, essential funds, single life, many lives
- Dynamism (% requirements change/month): 1, 5, 10, 30, 50
- Size (# of personnel): 3, 10, 30, 100, 300
- Culture (% thriving on chaos vs. order): 90, 70, 50, 30, 10

52 Conclusions
- Marketplace trends favor transition to the VBSE paradigm
  - Software is a (or the) major source of product value
  - Software is the primary enabler of adaptability
- VBSE involves 7 key elements:
  - Benefits realization analysis
  - Stakeholders' value proposition elicitation and reconciliation
  - Business case analysis
  - Continuous risk and opportunity management
  - Concurrent system and software engineering
  - Value-based monitoring and control
  - Change as opportunity
- Processes for implementing VBSE are emerging: ICM, Lean Development, DMR/BRA, Balanced Scorecard, Quality Economics, Agile Methods

53 References
- C. Baldwin and K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999.
- B. Boehm, "Value-Based Software Engineering," ACM Software Engineering Notes, March 2003.
- B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- B. Boehm and L. Huang, "Value-Based Software Engineering: A Case Study," Computer, March 2003.
- B. Boehm and K. Sullivan, "Software Economics: A Roadmap," The Future of Software Engineering, A. Finkelstein (ed.), ACM Press, 2000.
- B. Boehm and R. Turner, Balancing Agility and Discipline: A Guide for the Perplexed, Addison Wesley, 2003.
- J. Bullock, "Calculating the Value of Testing," Software Testing and Quality Engineering, May/June 2000.
- S. Faulk, D. Harmon, and D. Raffo, "Value-Based Software Engineering (VBSE): A Value-Driven Approach to Product-Line Engineering," Proceedings, First Intl. Conf. on Software Product Line Engineering, August 2000.

54 References (continued)
- R. Kaplan and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996.
- C. Persson and N. Yilmazturk, "Establishment of Automated Regression Testing at ABB," Proceedings, ASE 2004, August 2004.
- D. Reifer, Making the Software Business Case, Addison Wesley, 2002.
- K. Sullivan, Y. Cai, B. Hallen, and W. Griswold, "The Structure and Value of Modularity in Software Design," Proceedings, ESEC/FSE 2001, ACM Press.
- J. Thorp and DMR, The Information Paradox, McGraw-Hill, 1998.
- Economics-Driven Software Engineering Research (EDSER) web site:
- MBASE web site: sunset.usc.edu/research/MBASE

