Slide 1: Evaluating software methods and tools using the DESMET Methodology
Barbara Kitchenham, Steve Linkman, Susan Linkman
Keele University
© 1997, BAK.

Slide 2: Agenda
- Evaluation methods
- Selecting an appropriate method

Slide 3: Evaluation methods
Two aspects:
- Nature of the evaluation outcome
  - assessment of suitability (qualitative/subjective)
  - measurable benefits (quantitative/objective)
- Organisation of the evaluation
  - formal experiment
  - case study
  - survey

Slide 4: Qualitative methods
Feature analysis:
- user requirements mapped to method/tool features
- subjective assessment:
  - how well is the feature supported?
  - how usable is the functionality?
Problems:
- selection of features
- subjectivity of ratings
- collation of results
- too many features
(A collation sketch follows this slide.)
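To make the collation step concrete, here is a minimal sketch assuming a simple weighted-sum scoring model. The feature names, importance weights, and 0-5 ratings are hypothetical illustrations, not part of DESMET itself:

```python
# Hypothetical feature -> importance weight mapping for a feature analysis.
FEATURES = {
    "version control integration": 3,
    "code generation": 2,
    "usability": 5,
}

# Hypothetical subjective 0-5 ratings of each candidate tool per feature.
RATINGS = {
    "Tool A": {"version control integration": 4, "code generation": 2, "usability": 3},
    "Tool B": {"version control integration": 2, "code generation": 5, "usability": 4},
}

def weighted_score(ratings):
    """Collate per-feature ratings into a single weighted score."""
    total_weight = sum(FEATURES.values())
    return sum(FEATURES[f] * r for f, r in ratings.items()) / total_weight

for tool, scores in RATINGS.items():
    print(f"{tool}: {weighted_score(scores):.2f}")
```

Note how the collation and subjectivity problems show up directly: the final ranking depends entirely on the chosen weights and on the raters' judgments.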

Slide 5: Quantitative methods
Measured benefits of a method/tool.
Objective assessment:
- measure quality and/or productivity
- compare results using different methods/tools
Problems:
- not all benefits are quantitative
- some quantitative benefits are hard to measure
(A comparison sketch follows this slide.)
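A minimal sketch of such a comparison, assuming productivity is measured per project and compared across two methods with a standard Welch t statistic; the figures are invented for illustration only:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical productivity measures (e.g. units per person-day) from
# projects using the old and the new method.
old_method = [12.1, 9.8, 11.4, 10.2, 13.0]
new_method = [14.3, 12.9, 15.1, 13.6, 14.8]

def welch_t(a, b):
    """Welch's t statistic: mean difference scaled by the pooled standard error."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(b) - mean(a)) / se

print(f"old mean = {mean(old_method):.1f}, new mean = {mean(new_method):.1f}, "
      f"t = {welch_t(old_method, new_method):.2f}")
```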

Slide 6: Hybrid methods
Specific techniques:
- Benchmarking
  - objective performance measures
  - subjective selection of "tests"
- Qualitative effects analysis
  - subjective expert opinion about quantitative benefits

Slide 7: Formal experiment
- Scientific paradigm
- Many subjects (engineers) perform specified task(s)
- Subjects assigned to methods at random
- Randomisation and replication:
  - minimise bias
  - ensure results are trustworthy
- Best for precise answers to limited questions
(An assignment sketch follows this slide.)
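A minimal sketch of randomised assignment with replication; the subject names and the two treatments are hypothetical:

```python
import random

subjects = [f"engineer_{i}" for i in range(12)]   # hypothetical subject pool
treatments = ["method A", "method B"]

random.shuffle(subjects)                          # randomisation: removes selection bias
groups = {t: subjects[i::len(treatments)]         # equal-sized groups give replication
          for i, t in enumerate(treatments)}

for treatment, members in groups.items():
    print(treatment, members)
```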

Slide 8: Case studies
- Method/tool tried out on a "real" project
- Results scale to the real world
- Limited replication, so comparisons are problematic

Slide 9: Surveys
- For "mature" methods/tools
- People/groups that use the method or tool are polled
- Database of results analysed

Slide 10: Nine evaluation methods
- Feature analysis
  - formal experiment
  - case study
  - survey
  - screening mode
- Quantitative evaluation
  - formal experiment
  - case study
  - survey
- Qualitative effects analysis
- Benchmarking

Slide 11: Problem
- Nine evaluation methods: an embarrassment of riches
- Which method should you use?
- It depends on what you want to do

Slide 12: Seven selection criteria
- Evaluation project goals
- Evaluation capability of the organisation
- Nature of the evaluation object
- Nature of the impact
- Scope of the impact
- Maturity of the evaluation object
- Learning curve

Slide 13: Evaluation goals
- Choice of methods for an individual project
- Selection of methods and tools for an organisation
- Monitoring changes as part of a process improvement program:
  - evaluation of a proposed change
  - effect of adoption of the change
- Selection of a method/tool for resale

Slide 14: Evaluation capability
Characteristics of an organisation affect its ability to perform evaluations.
Four types of organisational capability:
1. Severely limited: each project is different
2. Qualitative evaluation capability: projects follow the same standards
3. Quantitative and qualitative: all projects keep project metrics
4. Full evaluation capability: the organisation maintains a store of project data

Slide 15: Nature of the evaluation object
- Method (or method/tool combination)
  - likely to have a major impact
  - quantitative assessment advisable
- Tool
  - comparing alternatives suggests feature analysis
  - tool v. no tool suggests quantitative assessment
- Generic method
  - e.g. object-oriented v. structured methods
  - can only try out specific methods/tools
  - generic assessment needs expert opinion

Slide 16: Scope of impact
- Product granularity:
  - whole product
  - modules
- Extent of impact:
  - seen immediately
  - seen over several phases or the whole lifecycle
  - seen on subsequent projects

Slide 17: Impact on selection of method
- Formal experiments are more viable for impacts with small scope:
  - easier to impose the necessary control
  - easier to provide replication
- Case studies are appropriate for larger scope
- For impacts affecting later projects (e.g. the effect of reusability), consider surveys

Slide 18: Maturity of item
- If currently in widespread use: surveys are possible
- If the method/tool is new: case study or formal experiment

Slide 19: Learning time
- Time to understand the principles
- Time to become proficient
- A long learning time reduces the feasibility of a formal experiment

Slide 20: Feasibility of selection
Other, non-technical factors affect method selection:
- timescales for the evaluation
- level of confidence required in the result
- cost of the evaluation

Slide 21: Timescales for evaluation
- Long (3 months plus):
  - case study (quantitative or qualitative)
- Medium (several months):
  - feature analysis - survey
- Short (several weeks):
  - experiments (quantitative or qualitative)
  - benchmarking
  - feature analysis - screening mode
- Very short (a few days):
  - quantitative survey
  - qualitative effects analysis

Slide 22: Risk of "wrong" result
- Very high:
  - qualitative effects analysis
  - feature analysis - screening mode
- High:
  - quantitative case study ("sister project")
  - feature analysis - case study
- Medium:
  - quantitative case study ("organisation baseline")
  - feature analysis - survey

Slide 23: Risk of "wrong" result (continued)
- Low:
  - quantitative case study ("within-project baseline")
  - formal feature analysis experiment
  - quantitative survey
- Very low:
  - formal quantitative experiment

Slide 24: Cost of an evaluation
- High:
  - formal experiment
- Medium:
  - case study
  - feature analysis - survey or screening mode
  - benchmarking
- Low (assuming the infrastructure exists):
  - quantitative survey
  - qualitative effects analysis
(A selection sketch combining the timescale, risk, and cost categorisations follows this slide.)
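The categorisations on slides 21-24 can be combined mechanically. Here is a minimal sketch of constraint-based method shortlisting; the table encodes the ratings given on those slides (the case study entry uses the lowest-risk "within-project baseline" variant, and benchmarking's risk is not rated on the slides, so it is left as None), while the filtering rule itself is an illustrative assumption, not part of DESMET:

```python
RISK_ORDER = ["very low", "low", "medium", "high", "very high"]
TIME_ORDER = ["very short", "short", "medium", "long"]
COST_ORDER = ["low", "medium", "high"]

# method: (timescale, risk of a "wrong" result, cost), from slides 21-24.
METHODS = {
    "formal quantitative experiment":     ("short",      "very low",  "high"),
    "quantitative case study":            ("long",       "low",       "medium"),
    "quantitative survey":                ("very short", "low",       "low"),
    "formal feature analysis experiment": ("short",      "low",       "high"),
    "feature analysis - case study":      ("long",       "high",      "medium"),
    "feature analysis - survey":          ("medium",     "medium",    "medium"),
    "feature analysis - screening mode":  ("short",      "very high", "medium"),
    "benchmarking":                       ("short",      None,        "medium"),
    "qualitative effects analysis":       ("very short", "very high", "low"),
}

def candidates(max_timescale, max_risk, max_cost):
    """Return the methods that fit within all three constraints."""
    def ok(order, value, limit):
        # Unrated attributes (None) conservatively fail the check.
        return value is not None and order.index(value) <= order.index(limit)
    return [m for m, (t, r, c) in METHODS.items()
            if ok(TIME_ORDER, t, max_timescale)
            and ok(RISK_ORDER, r, max_risk)
            and ok(COST_ORDER, c, max_cost)]

# Example: a few weeks available, medium risk tolerable, medium budget.
print(candidates(max_timescale="short", max_risk="medium", max_cost="medium"))
```

On these ratings the example constraints leave only the quantitative survey, which illustrates the summary slide's point: tightening time, confidence, and cost requirements can rule out every "technically appropriate" choice.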

Slide 25: Summary
- There is no best evaluation method
- An appropriate evaluation method is context-dependent
- An "appropriate" technical choice can be infeasible if it:
  - takes too long
  - costs too much
  - doesn't provide sufficiently trustworthy results

