Chapter 4: Software Process and Project Metrics


Slide 2: Measure
- A measure is a mapping from a set of entities and attributes in the real world to a representation or model in the mathematical world.
- One can manipulate the numbers or symbols in the mathematical world to obtain more information and understanding about the real world.

Slide 3: Measurement
- What do we use as a basis? Size? Function?
- [Diagram: measurement applied to the process yields process metrics; to the project, project metrics; to the product, product metrics.]

Slide 4: Measurement
- Advantages:
  - for the software process
  - for the software project
  - for the software engineer

Slide 5: Measures, Metrics & Indicators

Slide 6: Indicator
- Private indicators vs. public indicators
- Process and project indicators:
  - assess the quality of an ongoing project
  - modify the technical approach to improve quality
  - adjust work flow or tasks to avoid delay
  - track potential risks
  - uncover problem areas
  - evaluate the project team's ability

Slide 7: Process Metrics
- Provide indicators that lead to long-term software process improvement
- Organization: gain insight into the efficacy of an existing process
- Managers and practitioners: assess what works and what doesn't

Slide 8: Statistical Software Process Improvement (SSPI)
- Failure analysis:
  - categorize the defects
  - record the cost of correcting each defect
  - count and rank the defects
  - compute the overall cost in each category
  - uncover the highest-cost categories
  - develop a plan to modify the process
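The failure-analysis steps above can be sketched in a few lines of Python. The defect categories and correction costs below are hypothetical illustration data, not taken from the slides.

```python
# Sketch of SSPI failure analysis: categorize defects, record correction
# cost, then rank categories by overall cost. Data is illustrative.
from collections import defaultdict

# Each defect record: (category, cost to correct, in dollars)
defects = [
    ("logic", 520), ("data handling", 310), ("logic", 480),
    ("documentation", 90), ("interface", 260), ("logic", 450),
    ("data handling", 330), ("interface", 240), ("documentation", 110),
]

counts = defaultdict(int)
costs = defaultdict(float)
for category, cost in defects:        # categorize and record cost
    counts[category] += 1
    costs[category] += cost

# Rank categories by overall cost to uncover the highest-cost ("vital few")
ranked = sorted(costs.items(), key=lambda kv: kv[1], reverse=True)
for category, total in ranked:
    print(f"{category}: {counts[category]} defects, ${total:.0f} total")
```

The ranked output is what would feed the Pareto/fishbone analysis on the following slides, and the plan to modify the process targets the top categories first.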

Slide 9: Statistical Software Process Improvement (SSPI) (figure)

Slide 10: Fishbone Diagram

Slide 11: Process Metrics Guidelines (Grady)
- Use common sense and organizational sensitivity when interpreting metrics data.
- Provide regular feedback to the individuals and teams who have worked to collect measures and metrics.
- Don't use metrics to appraise individuals.
- Work with practitioners and teams to set clear goals and the metrics that will be used to achieve them.
- Never use metrics to threaten individuals or teams.
- Metrics data that indicate a problem area should not be considered "negative." These data are merely an indicator for process improvement.
- Don't obsess on a single metric to the exclusion of other important metrics.

Slide 12: Project Metrics
- Measure:
  - inputs: resources required to do the work
  - outputs: work products created during the process
  - results: effectiveness of the deliverables
- Project indicators:
  - assess the quality of an ongoing project
  - modify the technical approach to improve quality
  - adjust work flow or tasks to avoid delay
  - track potential risks
  - uncover problem areas
  - evaluate the project team's ability
- Example project metrics:
  - errors uncovered per review hour
  - scheduled vs. actual milestone dates
  - ...

Slide 13: Software Measurement
- Direct measures: defects, LOC produced, execution speed, memory size, ...
- Indirect measures: quality, complexity, efficiency, ...

Slide 14: Typical Size-Oriented Metrics
Metrics (normalization value: KLOC):
- errors per KLOC
- defects per KLOC
- $ per KLOC
- pages of documentation per KLOC
- errors per person-month
- LOC per person-month
- $ per page of documentation
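Computing these size-normalized metrics is simple division by KLOC (thousands of lines of code). The project data below is illustrative, not from the slides.

```python
# Size-oriented metrics: normalize raw project measures by KLOC.
# All project numbers here are hypothetical example data.
loc = 12_100          # lines of code delivered
errors = 134          # problems found before release
defects = 29          # problems found after release
cost = 168_000        # project cost in dollars
doc_pages = 365       # pages of documentation
effort_pm = 24        # effort in person-months

kloc = loc / 1000
print(f"errors per KLOC: {errors / kloc:.2f}")
print(f"defects per KLOC: {defects / kloc:.2f}")
print(f"$ per KLOC: {cost / kloc:.0f}")
print(f"pages of documentation per KLOC: {doc_pages / kloc:.1f}")
print(f"errors per person-month: {errors / effort_pm:.2f}")
print(f"LOC per person-month: {loc / effort_pm:.0f}")
```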

Slide 15: Typical Size-Oriented Metrics
- Advantages:
  - easily counted
- Disadvantages:
  - dependent on the programming language
  - can penalize well-designed but smaller programs
  - size data are available only late in the project

Slide 16: Function-Oriented Metrics
- Function points are derived from:
  - countable (direct) measures of the software's information domain
  - assessments of software complexity

Slide 17: Function Points (FP)
Steps:
1. Analyze the information domain of the application and develop counts (establish counts for the input domain and system interfaces).
2. Weight each count by assessing its complexity (assign a level of complexity, or weight, to each count).
3. Assess the influence of global factors that affect the application (grade the significance of external factors F_i such as reuse, concurrency, OS, ...).
4. Compute function points:
   degree of influence: N = sum(F_i)
   complexity multiplier: C = 0.65 + 0.01 * N
   function points = C * sum(count * weight)

Slide 18: Function Points (FP) (figure)

Slide 19: Function Points (FP)
Complexity adjustment values (F_i, i = 1..14):
1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
...
14.
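The FP computation from the preceding slides can be sketched directly. The counts, chosen weights, and F_i grades below are illustrative; the weights assume "average" complexity values from the standard FP weighting table, and the 0.65/0.01 constants come from the usual complexity-multiplier formula.

```python
# Sketch of the function-point computation:
# FP = sum(count * weight) * (0.65 + 0.01 * sum(F_i)).
# All counts, weights, and F_i grades are hypothetical example data.

# (count, weight) per information-domain value; weights here are the
# standard "average" complexity weights.
domain = {
    "external inputs":          (32, 4),
    "external outputs":         (60, 5),
    "external inquiries":       (24, 4),
    "internal logical files":   (8, 10),
    "external interface files": (2, 7),
}

count_total = sum(count * weight for count, weight in domain.values())

# Fourteen complexity-adjustment values, each graded 0 (no influence)
# to 5 (essential); all set to 3 here purely for illustration.
F = [3] * 14
N = sum(F)                # degree of influence
C = 0.65 + 0.01 * N       # complexity multiplier
fp = count_total * C
print(f"count total = {count_total}, FP = {fp:.1f}")
```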

Slide 20: Function Points (Example) (figure)

Slide 21 (figure)

Slide 22: Function-Oriented Metrics
- errors per FP
- defects per FP
- $ per FP
- pages of documentation per FP
- FP per person-month

Slide 23: Function-Oriented Metrics
- Advantages:
  - independent of the programming language
  - data available early in the project
- Disadvantages:
  - subjective
  - data can be difficult to collect
  - no direct physical meaning

Slide 24: Quality Metrics
- There are different answers for which characteristics of software contribute to its overall quality:
  - users: external view
  - practitioners: internal view
- Build models that relate the user's external view to the developer's internal view of the software.

Slide 25: External Quality / Product Quality (figure)

Slide 26: Metrics for Software Quality
- Correctness: operates correctly (defects per KLOC)
- Maintainability: amenable to change (mean-time-to-change, MTTC)
- Integrity: impervious to outside attack (integrity = sum[1 - threat * (1 - security)])
- Usability: easy to use (learning time, productivity, assessment by users, ...)
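The integrity formula above treats threat as the probability that an attack of a given type occurs and security as the probability that such an attack is repelled. A minimal sketch, using illustrative probabilities:

```python
# Sketch of the integrity metric for one attack type:
# integrity = 1 - threat * (1 - security).
# The threat/security values are hypothetical example data.
threat = 0.25     # probability an attack of this type occurs
security = 0.95   # probability the attack is repelled

integrity = 1 - threat * (1 - security)
print(f"integrity = {integrity:.4f}")
```

For multiple attack types, the slide's summation applies the same term per type; a value near 1 indicates a system that is largely impervious to the attacks considered.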

Slide 27: Defect Removal Efficiency (DRE)
- For a project:
  DRE = E / (E + D)
  where E = errors (problems found before release)
        D = defects (problems found after release)
- For a process (framework activity i):
  DRE_i = E_i / (E_i + E_{i+1})
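Both DRE forms are single ratios; a short sketch with hypothetical counts:

```python
# Sketch of the DRE formulas above. All counts are illustrative.

def dre(errors_before: int, defects_after: int) -> float:
    """Project-level DRE = E / (E + D)."""
    return errors_before / (errors_before + defects_after)

print(f"project DRE = {dre(134, 29):.2f}")

# Process-level: DRE_i = E_i / (E_i + E_{i+1}), where E_i are errors found
# in activity i and E_{i+1} are errors traceable to activity i but found
# only in the following activity.
e_i, e_next = 24, 6
dre_i = e_i / (e_i + e_next)
print(f"DRE for activity i = {dre_i:.2f}")
```

A DRE approaching 1.0 means filtering activities (reviews, testing) are catching problems before they escape to the next activity or to the customer.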

Slide 28: Metrics Baseline
Attributes of baseline data:
- reasonably accurate
- collected from as many projects as possible
- consistent
- from similar applications

Slide 29: Managing Variation (Control Chart Approach)
Figure 4.8: Metrics data for errors uncovered per review hour

Slide 30: Moving Range Control Chart (stable)
- mean of the moving range: mR = 1.71
- Upper Control Limit: UCL = 3.268 * mR = 5.57

Slide 31: Individual Control Chart (control)
- Upper Natural Process Limit: UNPL = Am + 2.66 * mR = 8.55
- Lower Natural Process Limit: LNPL = Am - 2.66 * mR = 0.55
- standard deviation = (UNPL - Am) / 3 = 1.52
(Am = mean of the individual metrics values)

Slide 32: Individual Control Chart (control)
Out of control if any one of the following is true:
- A single metrics value lies outside the UNPL.
- Two out of three successive metrics values lie more than two standard deviations away from Am.
- Four out of five successive metrics values lie more than one standard deviation away from Am.
- Eight consecutive metrics values lie on one side of Am.
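The limits from the previous slide and the four out-of-control tests above can be sketched together. The metrics series (e.g. errors uncovered per review hour) is illustrative, and the 2.66 and 3.268 constants are the standard individuals/moving-range chart constants.

```python
# Sketch of the individual (XmR) control-chart computations and the four
# out-of-control tests. The data series is hypothetical example data.
values = [5.3, 4.1, 3.2, 4.8, 5.9, 4.2, 3.0, 4.5]

# Moving ranges and natural process limits (standard XmR constants)
mr = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(mr) / len(mr)          # mean of the moving range
ucl_mr = 3.268 * mr_bar             # moving-range chart UCL

am = sum(values) / len(values)      # Am: mean of the individual values
unpl = am + 2.66 * mr_bar           # upper natural process limit
lnpl = am - 2.66 * mr_bar           # lower natural process limit
sigma = (unpl - am) / 3             # approximate standard deviation

def out_of_control(xs):
    # Test 1: a single value outside the natural process limits
    if any(x > unpl or x < lnpl for x in xs):
        return True
    # Test 2: two of three successive values more than 2 sigma from Am
    for i in range(len(xs) - 2):
        if sum(abs(x - am) > 2 * sigma for x in xs[i:i + 3]) >= 2:
            return True
    # Test 3: four of five successive values more than 1 sigma from Am
    for i in range(len(xs) - 4):
        if sum(abs(x - am) > sigma for x in xs[i:i + 5]) >= 4:
            return True
    # Test 4: eight consecutive values on one side of Am
    for i in range(len(xs) - 7):
        window = xs[i:i + 8]
        if all(x > am for x in window) or all(x < am for x in window):
            return True
    return False

print("out of control" if out_of_control(values) else "stable")
```

A stable result means observed variation is attributable to natural process noise, so any "improvement" smaller than the limits should not be read as a real trend.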

Slide 33: Metrics for Small Organizations
- Keep the measurements simple.
- Tailor measurements to each organization.
- Ensure the measurements produce valuable information.
- The cost of collecting and computing metrics ranges from 3% to 8% of the project budget during the learning phase, then drops to less than 1%.

Slide 34: Software Metrics
- What is it? A quantitative measure of the degree to which a system, component, or process possesses an attribute.
- Why is it important? Objective evaluation, better estimates, true improvement, ...
- How does it work? Measures are collected, then analyzed, compared, and assessed.
- Who does it? Software engineers collect measures; software managers analyze and assess metrics.

Slide 35: Software Process and Project Metrics
- What is it?
- Who does it?
- Why is it important?
- What are the steps?
- What is the work product?
- How do I ensure that I've done it right?

