
1
Misleading Metrics and Unsound Analyses
Presenter: Gil Hartman
Authors: Barbara Kitchenham, David Ross Jeffery, and Colin Connaughton
IEEE Software 24(2), Mar.-Apr. 2007

2
About the authors
Barbara Kitchenham - Professor of quantitative software engineering at Keele University, UK
David Ross Jeffery - Professor of software engineering at the University of NSW, Australia
Colin Connaughton - Metrics consultant for IBM's Application Management Services, Sydney

3
Introduction
Software project management means predicting and monitoring software development projects.
Measurement is a valuable software-management support tool.
Unfortunately, some "expert" advice can encourage the use of misleading metrics.

4
Metrics in AMS
The data come from the Application Management Services (AMS) delivery group of IBM Australia - a CMM level 5 organization using standard metrics and analyses.
The measurement program was intended to confirm each project's productivity and to set improvement targets for future projects.

5
ISO/IEC Software Measurement Process
Indicator: average productivity.
Function: divide each project's lines of code by that project's hours of effort.
Model: compute the mean and standard deviation of all project productivity values.
Decision criteria: compute confidence intervals based on the standard deviation.
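The measurement process above can be sketched in a few lines. This is an illustrative reconstruction, not the AMS data: the (lines of code, hours of effort) pairs below are invented, and the confidence interval uses the normal-theory approximation implied by the slide.

```python
# Sketch of the described analysis: per-project productivity ratios,
# their mean and standard deviation, and a normal-theory 95% confidence
# interval for the mean. Project figures are made up for illustration.
import math

# (lines of code, hours of effort) per project -- invented values
projects = [(12000, 800), (3000, 150), (45000, 2000), (700, 90), (9000, 600)]

productivity = [loc / effort for loc, effort in projects]  # LOC per hour
n = len(productivity)
mean = sum(productivity) / n
sd = math.sqrt(sum((p - mean) ** 2 for p in productivity) / (n - 1))

# 95% confidence interval for the mean, assuming normality (z ~ 1.96)
half_width = 1.96 * sd / math.sqrt(n)
ci = (mean - half_width, mean + half_width)
print(mean, sd, ci)
```

As the rest of the talk argues, this is exactly the analysis that goes wrong when the productivity values are not normally distributed.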

6
Non-normal data distributions
Frequency plot of the AMS productivity data over four years. The simple average isn't a good estimate of a typical project's productivity.
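The point of the frequency plot can be demonstrated numerically. The sample below is invented to be right-skewed, like the AMS data: a couple of extreme values pull the mean well above what a typical project achieves, while the median stays representative.

```python
# Sketch of the slide's point: with right-skewed data a few high values
# drag the mean upward, so the mean is a poor "typical project" summary.
# The productivity values below are invented to be skewed.
from statistics import mean, median

productivity = [4, 5, 5, 6, 6, 7, 8, 9, 11, 60, 75]  # two extreme projects

print(mean(productivity))    # pulled up by the two outliers (~17.8)
print(median(productivity))  # closer to a typical project (7)
```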

7
Productivity for application 1
The standard deviation across all projects is very large. The mean and standard deviation of the total data don't necessarily relate to a specific application.

8
Application 2 What can we conclude from the standard run plot?

9
Scatter plot vs run chart

10
Productivity = Function points / Effort
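The slide's formula is a simple ratio, transcribed directly below with invented figures:

```python
# Direct transcription of the slide's formula: productivity measured as
# function points delivered per hour of effort. Input values are invented.
def productivity(function_points: float, effort_hours: float) -> float:
    """Function points delivered per unit of effort."""
    return function_points / effort_hours

print(productivity(250, 1000))  # 0.25 FP per hour
```

Because this is a ratio of two measures, summaries of it behave in non-obvious ways, which is what the later "DON'T" slide warns about.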

11
Application 3

12
Run charts
Advantages:
- can identify productivity trends over time
- provide a comparison with overall mean values
Disadvantages:
- actual productivity values are difficult to interpret
- the mean and standard deviation can be inflated by high-productivity values for small, unimportant projects

13
Lessons learned - DO
- Base all analysis of project data on data from similar projects.
- Use graphical representations of productivity data.
- Use the relationship between effort and size to develop regression models:
  - logarithmic transformations
  - actual effort vs. predicted effort
  - statistical confidence intervals
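The regression advice above can be sketched as follows. This is an illustrative fit under invented data, not the paper's model: effort is regressed on size after a logarithmic transformation of both, and the fitted line is then used to predict effort for a new project.

```python
# Sketch of the "DO" advice: fit a regression of log(effort) on log(size)
# and use it to predict effort for a new project. All figures are invented.
import math

sizes = [100, 250, 400, 800, 1500]      # function points
efforts = [400, 900, 1400, 2600, 4800]  # hours

xs = [math.log(s) for s in sizes]
ys = [math.log(e) for e in efforts]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Ordinary least squares on the log-log scale: log(E) = a + b * log(S)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = my - b * mx

def predicted_effort(size_fp: float) -> float:
    """Back-transform the log-log fit to predict effort in hours."""
    return math.exp(a + b * math.log(size_fp))

print(round(predicted_effort(500)))  # predicted hours for a 500-FP project
```

Comparing actual effort against such predictions, with confidence intervals around the fitted line, is far more informative than a single mean-productivity figure.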

14
Lessons learned - DON'T
- Use the mean and standard deviation for either monitoring or prediction purposes.
- Analyze dissimilar projects together simply to get more data.
- Use any metric constructed as the ratio of two independent measures unless you're sure you understand its implications.
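The warning about ratio metrics can be made concrete. With invented figures, the mean of per-project productivity ratios differs sharply from overall productivity (total output over total effort), because averaging ratios gives small projects the same weight as large ones:

```python
# Sketch of the ratio-metric pitfall: the mean of per-project productivity
# ratios need not equal overall productivity (total output over total
# effort). Project figures are invented.
projects = [(50, 10), (60, 12), (4000, 2000)]  # (function points, hours)

mean_of_ratios = sum(fp / h for fp, h in projects) / len(projects)
ratio_of_sums = sum(fp for fp, _ in projects) / sum(h for _, h in projects)

print(mean_of_ratios)  # (5 + 5 + 2) / 3 = 4.0, dominated by tiny projects
print(ratio_of_sums)   # 4110 / 2022 ~= 2.03, dominated by the big project
```

Which summary is "right" depends on the question being asked, which is exactly why the slide says to understand a ratio metric's implications before using it.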

15
Conclusion
Charts and metrics can sometimes be misleading, but used carefully they help display statistics and data in an accessible way.
