Key Measurements For Testers
Pamela Perrott
© 2003 Construx Software Builders Inc. All Rights Reserved.
Construx: Delivering Software


Slide 1: Key Measurements For Testers
Pamela Perrott
http://www.construx.com
© 2003 Construx Software Builders Inc. All Rights Reserved.
Construx: Delivering Software Project Success

Slide 2: Precision vs. Accuracy
(Slide footer, repeated throughout the deck: consulting · training · project outsourcing · products · construx.com)
- Accuracy
  - Saying pi = 3 is accurate, but not precise
  - "I'm 2 meters tall" is accurate, but not precise
- Precision
  - Saying pi = 4.378383 is precise, but not accurate
  - Airline flight times are precise to the minute, but not accurate
- The number of significant digits is the key
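The slide's distinction can be illustrated with a minimal Python sketch. The `error` and `sig_digits` helpers are hypothetical; `sig_digits` is a deliberately naive digit counter, just enough to contrast the two pi estimates above.

```python
import math

def error(estimate: float) -> float:
    """Absolute error of an estimate of pi (a proxy for accuracy)."""
    return abs(estimate - math.pi)

def sig_digits(text: str) -> int:
    """Naive significant-digit count (a proxy for precision);
    ignores trailing-zero subtleties."""
    return len(text.replace(".", "").lstrip("0"))

# "3" is accurate (error ~0.14) but imprecise (1 significant digit).
print(error(3.0), sig_digits("3"))
# "4.378383" is precise (7 digits) but wildly inaccurate (error > 1).
print(error(4.378383), sig_digits("4.378383"))
```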

Slide 3: Precision vs. Accuracy (continued)
- People make assumptions about accuracy based on precision
- "365 days" is not the same as "1 year" or "4 quarters" or even "52 weeks"
- "10,000 staff hours" is not the same as "5 staff years"
- Unwarranted precision is the enemy of accuracy (e.g., 395.7 days +/- 6 months)

Slide 4: Introduction: Good Goals
- A goal should be SMART:
  - Specific
  - Measurable/Testable
  - Attainable
  - Relevant
  - Time-bound
- Can also use a Purpose, Issue, Object format

Slide 5: Introduction: GQM Hierarchy
[Diagram: goals (Goal 1, Goal 2) at the top, each refined into questions, each question answered by measures]

Slide 6: Introduction: GQM Example
- Goal (Purpose, Issue, Object (process), Viewpoint): improve by 10% the timeliness of change request processing, from the project manager's viewpoint
- Question: What is the current change request processing speed?
  - Measures: average cycle time; standard deviation; % of cases outside the upper limit
- Question: Is the performance of the process improving?
  - Measures: (current average cycle time / baseline average cycle time) * 100; subjective rating of the manager's satisfaction
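The "is performance improving?" measure on this slide is a one-line ratio; a minimal sketch, with hypothetical cycle-time values:

```python
def cycle_time_ratio(current_avg: float, baseline_avg: float) -> float:
    """The slide's measure: current average cycle time as a
    percentage of the baseline average cycle time."""
    return current_avg / baseline_avg * 100

# Hypothetical values: the goal is a 10% improvement, so a ratio at
# or below 90% of baseline meets it.
print(cycle_time_ratio(4.5, 5.0))
```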

Slide 7: Project Evaluation (Quality): Test Planning and Resources
Do we have enough testing resources?
- How many tests do we need to run (estimated)?
- How long does each test case take to design and write?
- How long does each test take to run, on average?
- How many full testing cycles do we expect? (often more than one, especially for early test cycles)
- How many person-days do we need (# tests * time per test * # of cycles)?
- How many testing staff do we have?
- How long will the testing phase take with our current staff?
- Is the testing phase too long (i.e., our current staff is not sufficient)? Do we have to test less, or can we add staff?

Slide 8: Project Evaluation (Quality): Reported/Corrected Software Defects
[Chart: cumulative defects found, fixed, and open (0 to 100%) over the testing phase, from start to end of testing]
From Manager's Handbook for Software Development, Revision 1, NASA Software Engineering Laboratory, 1990

Slide 9: Project Evaluation (Quality): Reported/Corrected Software Defects, Actual Project
[Chart: number of defect reports in thousands (0 to 1.0) over 40 weeks of testing, showing found, open, and fixed curves]

Slide 10: Project Evaluation (Quality): Defect Rate
[Chart: defect rate, with 95%, 99%, and 99.9% levels marked]

Slide 11: Project Evaluation (Quality): Statistics on Effort per Defect
- Data on the time required to fix defects, categorized by defect type, provides a basis for estimating the remaining defect-correction work
- This requires collecting fix-time data in the defect tracking system
- Data on the phases in which defects are injected and later detected measures the efficiency of the development process; if 95% of defects are detected in the same phase in which they were created, the project has an efficient process
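The first bullet reduces to a small calculation. A sketch, assuming we have historical average fix times per defect type; the types and hour values below are hypothetical:

```python
# Hypothetical average fix times (hours) by defect type, mined from
# the defect tracking system as the slide recommends.
AVG_FIX_HOURS = {"requirements": 12.0, "design": 6.0, "coding": 2.5}

def remaining_fix_effort(open_defects: dict) -> float:
    """Estimated hours to fix the currently open defects, by type."""
    return sum(count * AVG_FIX_HOURS[kind]
               for kind, count in open_defects.items())

# 3 open requirements defects, 10 design, 40 coding.
print(remaining_fix_effort({"requirements": 3, "design": 10, "coding": 40}))
```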

Slide 12: Project Evaluation (Quality): A Defect Fix Time Model for Testing
From Software Metrics: Establishing a Company-wide Program, by Robert B. Grady and Deborah L. Caswell, 1987

Slide 13: Product Characterization (Quality): Defects
- Defects are one of the most frequently used measures of quality
- Definitions of "defect" differ:
  - Only items found by customers? By testers?
  - Items found during upstream reviews?
  - Only non-trivial items?
  - Small enhancements?
- The timing of defect detection is an important part of defect characterization
- A "product defect" may be different from a "process defect"

Slide 14: Product Evaluation (Testing): System Test Profile
From NASA, Recommended Approach to Software Development, 1992

Slide 15: Product Evaluation (Testing): System Test Profile (continued)
From NASA, Recommended Approach to Software Development, 1992

Slide 16: Product Evaluation (Testing): Cumulative Defects Found in Testing
From Manager's Handbook for Software Development, Revision 1, NASA Software Engineering Laboratory, 1990

Slide 17: Product Evaluation (Testing): Cumulative Defects, Actual Project
From Manager's Handbook for Software Development, Revision 1, NASA Software Engineering Laboratory, 1990

Slide 18: Product Prediction: Predicting Future Defect Rates
Increasing factors:
- System size
- Application complexity
- Compressing the schedule (up to a 4x increase)
- More staff
- Lower productivity
Decreasing factors:
- Simplifying the application/problem at hand
- Extending the planned development time (can cut defects in half)
- Fewer staff
- Higher productivity

Slide 19: Product Prediction: Defect Density Prediction
- To judge whether we've found all the defects in an application, estimate its defect density
- This requires statistics on the defect density of past, similar projects
- Use that data to predict the expected density on the current project
- Example: if our prior projects had defect densities between 7 and 9.5 defects/KLOC, we expect a similar density on the new project
  - If the new project has 100,000 lines of code, we expect to find between 700 and 950 defects in total
  - If we've found 600 defects so far, we're not done: we expect to find between 100 and 350 more defects
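The worked example above can be sketched directly, using the same numbers as the slide (7 to 9.5 defects/KLOC, 100 KLOC, 600 defects found so far):

```python
def predicted_defect_range(kloc: float, low_density: float,
                           high_density: float) -> tuple:
    """Expected total defects from historical density (defects/KLOC)."""
    return kloc * low_density, kloc * high_density

def remaining_defects(found: int, low_total: float, high_total: float) -> tuple:
    """How many more defects we expect to find."""
    return max(0, low_total - found), max(0, high_total - found)

low, high = predicted_defect_range(100, 7, 9.5)   # 700 and 950, as on the slide
print(remaining_defects(600, low, high))          # between 100 and 350 more
```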

Slide 20: Product Prediction: Distribution of Software Defect Origins and Severities
- The highest-severity faults come from requirements and design
[Chart: defect counts by origin and severity level]

Slide 21: Product Prediction: Defect Modeling
- Model the number of defects expected, based on past experience
- Model the number of defects in requirements, design, construction, etc.
- Two approaches:
  - Model defects based on effort hours, i.e., X defects will be introduced per hour worked
  - Model defects per KSLOC (or other size unit), based on past experience and the code growth curve

Slide 22: Product Prediction: Defect Modeling (continued)
- Approach 1 (SEI data, based on PSP data):
  - Design: 1.76 defects injected per hour
  - Coding: 4.20 defects injected per hour
- Approach 2: total defects/KSLOC are about 40 (range 30-85), distributed roughly as:
  - 10% requirements (4/KLOC)
  - 25% design (10/KLOC)
  - 40% coding (16/KLOC)
  - 15% user documentation (6/KLOC)
  - 10% bad fixes (4/KLOC)
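Both modeling approaches can be sketched directly from the slide's rates; the effort split in the usage lines (100 design hours, 200 coding hours, 10 KLOC) is hypothetical:

```python
# Approach 1: SEI/PSP injection rates from the slide (defects per hour).
INJECTED_PER_HOUR = {"design": 1.76, "coding": 4.20}

def defects_from_effort(design_hours: float, coding_hours: float) -> float:
    """Defects injected, modeled from effort hours."""
    return (design_hours * INJECTED_PER_HOUR["design"]
            + coding_hours * INJECTED_PER_HOUR["coding"])

# Approach 2: defects per KLOC by origin, totalling about 40/KLOC.
DENSITY_PER_KLOC = {"requirements": 4, "design": 10, "coding": 16,
                    "user documentation": 6, "bad fixes": 4}

def defects_from_size(kloc: float) -> dict:
    """Defects by origin, modeled from code size."""
    return {origin: kloc * d for origin, d in DENSITY_PER_KLOC.items()}

print(defects_from_effort(100, 200))   # roughly 1016 defects injected
print(defects_from_size(10))           # e.g. 160 coding defects in 10 KLOC
```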

Slide 23: Product Prediction: Predicted and Actual Defects Found, by Development Phase
[Chart: predicted vs. actual defects per development phase, with a size re-estimate marked]
From Edward F. Weller, "Practical Applications of Statistical Process Control," IEEE Software, May/June 2000

Slide 24: Product Prediction: Defect Profile by Type (Example)
[Chart: sources of defects, by type]

Slide 25: Release Measures: Defect Counts
- Defect counts give a quantitative handle on how much work the project team still has to do before it can release the software
- Graph the cumulative reported, open, and fixed defects
- As the software nears release, the number of open defects should trend downward, and the fixed-defects line should approach the reported-defects line

Slide 26: Release Measures: Defect Trends Near Release, All Defects
[Chart: number of defect reports in thousands (0 to 1.0) over 40 weeks of testing, showing found, open, and fixed curves against a target]

Slide 27: Release Measures: Defect Trends Near Release, Severity 1 and 2
[Chart: number of defect reports in thousands (0 to 1.0) over 40 weeks of testing, showing found, open, and fixed curves against a target]

Slide 28: Release Measures: Construx Measurable Release Criteria
- Acceptance testing successfully completed
- All open change requests dispositioned
- System testing successfully completed
- All requirements implemented, based on the spec
- All review goals met
- Declining defect rates
- Declining change rates
- No open Priority A defects in the database
- Code growth has stabilized

Slide 29: Release Measures: HP Measurable Release Criteria
- Breadth: testing coverage of user-accessible and internal functions
- Depth: branch-coverage testing
- Reliability: continuous hours of operation under stress; stability; ability to recover gracefully from defect conditions
- Remaining defect density at release
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 30: Release Measures: Post-Release Defect Density, by Whether Release Criteria Were Met
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 31: Release Measures (Defect Counts): Defect Plot Before Release
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 32: Detection Effectiveness [Jones86]
[Table: defect-detection effectiveness by technique, from Jones, 1986]

Slide 33: Process Evaluation: Status Model
[Chart: units created, units reviewed, and units tested over time]

Slide 34: Process Evaluation: Status Example 1
From NASA, Manager's Handbook for Software Development, Revision 1, 1990

Slide 35: Goal #1, Improve Software Quality: Post-Release Discovered Defect Density
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 36: Goal #1, Improve Software Quality: Pre-Release Defect Density
Question: How can we predict software quality based on early development processes?
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 37: Goal #3, Improve Productivity: Defect Repair Efficiency
Question: How efficient are defect-fixing activities? Are we improving?
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 38: Goal #4, Maximize Customer Satisfaction: Mean Time to Fix Critical and Serious Defects
Question: How long does it take to fix a problem?
Chart legend: AR = awaiting release; QA = final QA testing; KP = known problem; AD = awaiting data; LC = lab classification; MR = marketing review
From Robert B. Grady, Practical Software Metrics for Project Management and Process Improvement, 1992

Slide 39: Pamela Perrott
- 23+ years in IT
- Application programmer, systems programmer, and programmer support at two insurance companies
- Several years at Boeing, in the re-engineering and repository groups
- Several years in wireless (stints in QA and SEPG groups)
- Currently an instructor/consultant at Construx Software

Slide 40: Construx Software
- Steve McConnell: owner, CEO, and Chief Software Engineer; author of Code Complete, Rapid Development, Software Project Survival Guide, and Professional Software Development
- Founded in 1996
- Staff of around 15, mostly software engineers
- Located in the Pacific Northwest
- Two primary lines of business: training and consulting

Slide 41: Contact Information
consulting@construx.com · sales@construx.com
www.construx.com
(425) 636-0100
- Custom software projects
- Consulting
- Seminars
Construx: Delivering Software Project Success

