
1 Software Process Performance Measures
James Over
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213-3890
© 2006 by Carnegie Mellon University

2 Purpose
Why are you interested in process improvement? Hopefully for the process performance benefits. If so, process performance measurement is a key concern.
Many of the examples in this presentation are from the Team Software Process (SM); however, the concepts are broadly applicable.
SM Team Software Process is a registered service mark of Carnegie Mellon University.

3 Team Software Process
The Team Software Process (TSP) is an integrated set of practices for developing software. TSP is a process-based solution to common software engineering and management issues:
- cost and schedule predictability
- productivity and product quality
- process improvement
Unlike other methods, TSP teams are self-directed. TSP emphasizes measurement and quality management, provides immediate and measurable benefits, and accelerates CMMI-based improvement.

4 TSP Performance Summary -1
TSP Impact Study (2003)* vs. typical industry performance (Standish Group)**:
- Schedule error average: 6%
- Schedule error range: -20% to +27%
* From a study of 20 projects in 13 organizations conducted in 2003
** Of the unsuccessful projects, average schedule error was 222%

5 TSP Performance Summary -2
TSP Impact Study (2003)* vs. typical industry performance:
- System test defects per thousand instructions: TSP 0.4 avg. (0.0 to 0.9); industry 2 to 14
- Released defects per thousand instructions: TSP 0.06 avg. (0.0 to 0.2); industry 1 to 7
- System test effort (% of total effort): TSP 4% avg. (2% to 7%); industry 40%
* From a study of 20 projects in 13 organizations conducted in 2003

6 TSP Performance Summary -3
An analysis of 20 projects in 13 organizations showed that TSP teams averaged 0.06 defects per thousand lines of new or modified code. Approximately one third of these projects were defect-free. These results are substantially better than those achieved in high maturity organizations.
Source: CMU/SEI-2003-TR-014

7 TSP-CMMI Overall Coverage

8 Topics
- Process management concepts
- TSP measurement framework
- Performance measures

9 SEI Process Management Premise
"The quality of a software system is governed by the quality of the process used to develop and evolve it." - Watts Humphrey

10 Managed Process
The CMMI defines a managed process as a performed process that:
- is planned and executed in accordance with policy
- employs skilled people who have adequate resources to produce controlled outputs
- involves relevant stakeholders
- is monitored, controlled, and reviewed
- is evaluated for adherence to its process description

11 Process Management
Process ownership: responsibility for designing, establishing, and implementing the process, and for the mechanisms for measurement and corrective action, is assigned.
Process definition: the design and formal documentation of the components of the process and their relationships.
Process control: the function of ensuring that the process output meets specifications, including:
- measurement
- control variable(s)
- feedback loop(s)
- defect detection, correction, and prevention
Source: Quality Process Management by Gabriel Pall
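
To make the control function concrete, here is a minimal sketch of one pass through a feedback loop. The control variable (a review rate), its specification limits, and the corrective actions are illustrative assumptions, not content from the presentation.

```python
def control_step(measured_value, lower_spec, upper_spec):
    """One pass of a process control feedback loop: compare a measured
    control variable against its specification limits and feed back a
    corrective action. Thresholds and actions are illustrative."""
    if measured_value < lower_spec:
        return "below spec: take corrective action on process inputs"
    if measured_value > upper_spec:
        return "above spec: take corrective action on process inputs"
    return "in control: no action needed"

# Hypothetical control variable: a code review rate in LOC per hour,
# with an assumed specification of 100-200 LOC/hour.
print(control_step(measured_value=350, lower_spec=100, upper_spec=200))
```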

12 Process Management Concept
[Diagram: a work process transforms input into output, subject to control]

13 Example
[Diagram: an inspection process transforms input into output; review rate is the control variable, and process yield and system test yield are the measured outputs]
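
As a sketch of the yield measure named in the diagram: phase yield is conventionally the percentage of defects present in a phase that the phase removes (a standard TSP-style definition; the numbers below are invented).

```python
def phase_yield(escapes_entering, injected_in_phase, removed_in_phase):
    """Phase yield: percentage of the defects present in a phase
    (escapes from earlier phases plus defects injected during the
    phase itself) that the phase removes."""
    present = escapes_entering + injected_in_phase
    return 100.0 * removed_in_phase / present if present else 100.0

# Example: 20 defects escape into an inspection, none are injected
# during it, and the inspection finds 14 -> yield is 70%.
print(phase_yield(escapes_entering=20, injected_in_phase=0, removed_in_phase=14))
```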

14 Process Management Conclusions
A defined process is a prerequisite for process management. The enactment of the process should not differ from the defined process in any substantive way.
The key determinants of process performance must be instrumented and measured. Failure to measure, or limited measurement scope, can lead to sub-optimization or "process tampering."
The process and measures should be designed to support process management from the start.

15 Topics
- Process management concepts
- TSP measurement framework
- Performance measures

16 Process Measurement Issues
Some common process measurement issues:
- substantial variation in measurement reporting requirements across development groups and suppliers
- few measures of quality
- standards emphasize derived measures instead of common base measures
- inability to summarize, aggregate, drill down, or extend
- cannot benchmark or make comparisons
- limited use as a management indicator
- lack of accountability
- measurement framework tied literally to CMMI process areas and to CMMI's examples of derived measures

17 Measurement System Design
A "systems" approach to measurement design solves many of these issues:
- Define a few common base measurement categories and establish standards for the most used instances.
- Develop a measurement framework that relates the base measures to the key elements of software process work.
- Create derived measures from the standard base measures.
- Identify process performance models and benchmarks that predict future performance.
- Integrate into monitoring and decision-making processes.

18 TSP Measurement Framework -1
Base measurement categories: size, effort, schedule, defects.
Example derived measures:
- estimation accuracy, prediction intervals
- productivity, cost performance index
- planned value, earned value, predicted earned value
- defect density, defect density by phase
- defect removal rate by phase, defect removal leverage
- review rates, process yield, phase yield
- failure cost of quality, appraisal cost of quality, appraisal/failure COQ ratio
- percent defect free, defect removal profiles
- quality profile, quality profile index
- ...
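
A sketch of how derived measures fall out of the four base categories; the phase names, hours, and counts below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class PhaseRecord:
    """Base measures for one process phase: effort and defects."""
    phase: str
    planned_hours: float
    actual_hours: float
    defects_removed: int

# Illustrative base data: size plus per-phase effort/defect records.
new_and_changed_kloc = 5.0
phases = [
    PhaseRecord("design", 100, 120, 0),
    PhaseRecord("design review", 50, 55, 18),
    PhaseRecord("code", 110, 100, 0),
    PhaseRecord("code review", 55, 50, 25),
    PhaseRecord("unit test", 40, 30, 12),
]

# Derived: cost performance index = planned effort / actual effort.
cpi = sum(p.planned_hours for p in phases) / sum(p.actual_hours for p in phases)

# Derived: defect removal rate by phase (defects per task hour).
removal_rate = {p.phase: p.defects_removed / p.actual_hours for p in phases}

# Derived: unit test defect density (defects per KLOC).
ut_defect_density = phases[-1].defects_removed / new_and_changed_kloc

print(round(cpi, 2), round(ut_defect_density, 1))
```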

19 TSP Measurement Framework -2
A model of key process elements and their relations provides a context for the base measures:
- processes and phases
- projects and sub-projects
- products and parts
- teams and team members
- tasks
- period (week, month, etc.)
The model facilitates analysis:
- aggregation and drill-down
- queries and views
- scalability
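
A sketch of the idea: tag every base measure with the key elements (phase, project, team member, product part, task, period), and aggregation or drill-down becomes a group-by along any one dimension. Names and records are illustrative.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class EffortRecord:
    """One base measure tagged with the framework's key elements."""
    phase: str
    project: str
    team_member: str
    product_part: str
    task: str
    week: int
    task_hours: float

def rollup(records, dimension):
    """Aggregate task hours along any key element (rollup/drill-down)."""
    totals = defaultdict(float)
    for r in records:
        totals[getattr(r, dimension)] += r.task_hours
    return dict(totals)

records = [
    EffortRecord("code", "proj-a", "alice", "parser", "task-1", 6, 4.5),
    EffortRecord("code review", "proj-a", "bob", "parser", "task-2", 6, 2.0),
    EffortRecord("code", "proj-a", "alice", "lexer", "task-3", 7, 3.5),
]

print(rollup(records, "team_member"))  # summarize by person
print(rollup(records, "week"))         # summarize by period
```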

20 Estimated and Actual Size
Size is a measure of the magnitude of the software deliverable, e.g., lines of code or function points. Size is estimated, and actual size is measured, for each component.
Five size accounting categories are used:
- base
- modifications to the base
- deletions from the base
- added or new
- reused
Size data are used to:
- estimate effort
- track progress
- normalize other measures
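
A sketch of the size accounting arithmetic. The totaling convention ("new and changed" = added + modified; total = base - deleted + added + reused) is a common TSP-style convention, stated here as an assumption; the numbers are invented.

```python
def size_accounting(base, modified, deleted, added, reused):
    """Combine the five size accounting categories.

    Assumed convention: 'new and changed' = added + modified (used to
    normalize effort and defect measures), and total size =
    base - deleted + added + reused.
    """
    new_and_changed = added + modified
    total = base - deleted + added + reused
    return new_and_changed, total

# Illustrative component: 10,000 LOC base, 500 modified, 200 deleted,
# 3,000 added, 1,200 reused.
print(size_accounting(10000, 500, 200, 3000, 1200))  # (3500, 14000)
```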

21 Estimated and Actual Effort
Effort is a measure of time on task. The TSP effort measure is called a task hour.
Task hours are estimated and measured by:
- process phase
- task
- day or week
How many task hours are there in a 40-hour week? About 15 to 20.
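
A small sketch of the planning implication: if only 15 to 20 hours of a 40-hour week are task hours, available task hours scale by a ratio of roughly 0.375 to 0.5. The ratio used below is an assumption within that range.

```python
def weekly_task_hours(staff, hours_per_week=40.0, task_hour_ratio=0.45):
    """Available task hours per week for a team, assuming each
    40-hour week yields roughly 15-20 task hours per person."""
    return staff * hours_per_week * task_hour_ratio

print(weekly_task_hours(staff=5))  # 90.0 task hours per week
```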

22 Estimated and Actual Schedule
Schedule has two components:
- resource availability
- task completion dates
Planned task dates are calculated from estimates of resource availability and planned task hours. Actual date completed is recorded as tasks are finished. Actual resource availability is derived from actual task hours.
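
A sketch of the planned-date calculation: spread cumulative planned task hours across weekly resource availability to get a calendar date per task. Task names and numbers are illustrative.

```python
from datetime import date, timedelta

def planned_task_dates(tasks, start, task_hours_per_week):
    """Planned completion date for each task: cumulative planned task
    hours divided by weekly resource availability gives the number of
    calendar weeks from the start date."""
    schedule, cumulative = [], 0.0
    for name, planned_hours in tasks:
        cumulative += planned_hours
        schedule.append((name, start + timedelta(weeks=cumulative / task_hours_per_week)))
    return schedule

# Illustrative plan: three tasks against 60 available task hours/week.
plan = [("design", 120.0), ("code", 150.0), ("test", 90.0)]
for name, finish in planned_task_dates(plan, date(2006, 2, 6), task_hours_per_week=60.0):
    print(name, finish)
```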

23 Estimated and Actual Defects
Defects are the measure of quality. Definition: a defect is a work product element that must be changed, after the work product was completed, in order to ensure proper design, implementation, test, use, or maintenance.
Defect measures include:
- estimates of the number of defects injected and removed
- a count of the actual number of defects injected and removed
Defect data include:
- component
- phase injected
- phase removed
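
A sketch of the defect record and one analysis it enables (finding defects that escaped the phase that should have removed them); the components and phases below are invented.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    """Minimal defect record: component, phase injected, phase removed."""
    component: str
    phase_injected: str
    phase_removed: str

defects = [
    Defect("parser", "design", "design review"),
    Defect("parser", "code", "unit test"),
    Defect("lexer", "design", "system test"),
]

# Design defects that escaped past the design review phase:
escaped = [d for d in defects
           if d.phase_injected == "design" and d.phase_removed != "design review"]
print([d.component for d in escaped])  # ['lexer']
```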

24 Topics
- Process management concepts
- TSP measurement framework
- Performance measures

25 TSP Performance Measures
The most often used TSP performance measures are:
- planned value, earned value, predicted earned value
- planned and actual task hours
- estimation error
- growth
- defect density
- percent defect-free
- quality profile and index
These measures support planning and tracking. Combined with historical data and/or benchmarks, they also support process performance modeling.
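
A sketch of earned value in the TSP sense: each task's planned value is its share of total planned task hours, and that value is earned only when the task is complete. The plan below is illustrative.

```python
def earned_value(tasks):
    """Earned value as a percentage: the sum of the planned-value
    shares of completed tasks. tasks is a list of
    (planned_hours, is_done) pairs."""
    total = sum(hours for hours, _ in tasks)
    return sum(100.0 * hours / total for hours, done in tasks if done)

# Two of three tasks done; their planned hours are 270 of 360 total.
plan = [(120.0, True), (150.0, True), (90.0, False)]
print(earned_value(plan))  # 75.0 percent of planned value earned
```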

26 Process Performance Models
[Diagram: project data, together with historical data and benchmarks, feeds a process performance model that produces predicted project performance]

27 Example: Quality Profile
Project data:
- time in design, design review, coding, and code review
- defects found in compile and unit test
- product size
Process performance model: quality profile.
Predicted value: likelihood of post-system-test defects.
Benchmarks:
- development time ratio criteria
- defect density criteria

28 Quality Profile Benchmarks
These software quality benchmarks predict post-development defects. Modules that meet these criteria were found to be largely defect-free in system test and after deployment.
Software Quality Benchmarks (derived measure: desired value)
- Design time vs. code time ratio: 1 to 1
- Design vs. design review time ratio: 2 to 1
- Code vs. code review time ratio: 2 to 1
- Compile defect density: < 10 per KLOC
- Unit test defect density: < 5 per KLOC
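
A sketch that scores a component against the five benchmarks. The normalization (meeting a benchmark scores 1.0, shortfalls score proportionally less) is an assumption for illustration; TSP tools define their own component formulas.

```python
def ratio_score(actual, desired):
    """1.0 when the benchmark is met, proportionally less otherwise
    (assumed normalization)."""
    return min(1.0, actual / desired)

def quality_profile(design_hrs, design_review_hrs, code_hrs,
                    code_review_hrs, compile_dd, unit_test_dd):
    """Score the five quality profile dimensions, each in [0, 1]."""
    return {
        "design vs. code time (1:1)":      ratio_score(design_hrs, code_hrs),
        "design review time (1/2 design)": ratio_score(design_review_hrs, design_hrs / 2),
        "code review time (1/2 code)":     ratio_score(code_review_hrs, code_hrs / 2),
        "compile defects (<10/KLOC)":      1.0 if compile_dd < 10 else 10.0 / compile_dd,
        "unit test defects (<5/KLOC)":     1.0 if unit_test_dd < 5 else 5.0 / unit_test_dd,
    }

# Illustrative component: weak on design review time, fine elsewhere.
print(quality_profile(50, 15, 50, 25, compile_dd=8.0, unit_test_dd=4.0))
```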

29 Quality Profile
The quality profile is a process performance model that provides an early warning indicator for post-development defects. It uses the five software quality benchmarks; satisfied criteria are plotted at the outside edge of the chart.
[Charts: a high-quality component vs. a poor-quality component; in the poor-quality component, inadequate design review time results in design defects escaping to test and production]

30 Using the Quality Profile

31 Quality Performance Index
The Quality Performance Index (QPI) is the product of the five parameters in the quality profile. QPI predicts the likelihood of post-development defects in a system.
[Chart: Quality Performance Index vs. Post-Development Defect Density]
Interpreting the Quality Performance Index:
- 0.0 to 0.2: re-inspect; test and post-development defects likely
- 0.2 to 0.4: re-inspect if test defects are found
- 0.4 to 1.0: component is of high quality
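
A sketch of the QPI calculation and the interpretation ranges from this slide; the profile values below are illustrative.

```python
from math import prod  # Python 3.8+

def interpret_qpi(qpi):
    """Interpretation ranges from the slide."""
    if qpi < 0.2:
        return "re-inspect; test and post-development defects likely"
    if qpi < 0.4:
        return "re-inspect if test defects are found"
    return "component is of high quality"

# Five quality profile dimensions, each in [0, 1] (illustrative values).
profile = {"design/code time": 1.0, "design review time": 0.6,
           "code review time": 1.0, "compile defects": 1.0,
           "unit test defects": 0.8}

qpi = prod(profile.values())  # QPI = product of the five dimensions
print(round(qpi, 2), "->", interpret_qpi(qpi))  # 0.48 -> high quality
```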

32 Conclusion
Measurement and process management are inseparable; incorporate measurement into your processes from the start.
A common problem with software process measurement is the lack of an integrated, well-designed measurement system, which results in unnecessary complexity and in usability issues such as poor scalability and extensibility.
Process management can be successfully applied to the software process with a few simple derived measures that are integrated into a measurement framework.

