1 SMU CSE 8314 Software Measurement and Quality Engineering
Module 35: Advanced Defect Measurement Techniques
Copyright 1995-2007, Dennis J. Frailey
CSE8314 M35 - Version 7.09

2 What is a Defect?
Example:
- A payroll program generates correct paychecks in 49 states.
- But it fails in California because of an unusual income-tax rule that was omitted from the specification.
  – Failure-free in 49 states
  – Serious failure in 1 state

3 Analysis
- It is a defect if the program is sold in California or intended for use in California
- The source of the failure is a defect in the requirements specification
  – This kind of defect is the most costly to find and to fix

4 Defects Occur in All Phases of Software Development
- An important element of software reliability is understanding where defects originate
- Another important element is changing the process to remove defects as early as possible
- It is widely reported, though much debated, that the earliest-introduced defects are the most costly if not found until later

5 Cost of Defects

6 Cost to Fix a Requirements Defect vs. Phase where Detected

7 General Model of Defects
Process step: deletes some incoming defects; introduces new defects; deletes some of the new defects.
Flows: incoming defects → process step → outgoing defects; within the step, new defects are introduced and some defects are detected and removed.
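The defect flow through a single process step can be sketched as a small function; this is a minimal illustration of the model on this slide, and the removal rates and counts below are invented, not from the module.

```python
# Hypothetical sketch of the general defect-flow model: a process step
# removes some incoming defects, introduces new ones, and removes some
# of those new ones before passing work downstream.

def process_step(incoming: int,
                 removal_rate: float,
                 new_defects: int,
                 new_removal_rate: float) -> int:
    """Return the number of defects flowing out of one process step."""
    surviving_old = incoming - round(incoming * removal_rate)
    surviving_new = new_defects - round(new_defects * new_removal_rate)
    return surviving_old + surviving_new

# Example: 100 incoming defects with 60% removed, plus 30 new defects
# with 50% removed, leaves 40 + 15 = 55 defects flowing out.
outgoing = process_step(100, 0.60, 30, 0.50)
print(outgoing)  # 55
```

Chaining such steps phase by phase shows why defects that survive early phases accumulate downstream.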

8 Examples of Requirements Specification Defects
- Higher-level system requirement not reflected in the software requirements
- Incomplete information in the software requirements
  – Assumptions about designers' underlying knowledge
  – Failure to include an "implicit" requirement
- Failure to control and/or communicate changes

9 Examples of Requirements Specification Defects (continued)
- Infeasible specification
- Conflicts/inconsistencies within the specification
- Start-up conditions misstated
- Software specification inconsistent with hardware specification

10 Examples of Design Defects
- Out-of-range data
  – e.g., character set on punched cards
- Infinite loops
  – code
  – circular procedure calls
- Incorrect use of reserved resources
  – special registers
  – temporary memory blocks

11 Examples of Design Defects (continued)
- Improper analysis of computational errors or conditions
  – trigonometric functions may go to infinity at certain values
  – round-off errors
- Inconsistent data representation
  – caller passes integer data
  – subroutine expects floating-point data

12 Examples of Design Defects (continued)
- Inconsistent interface documentation
  – hardware provides 17 lines; software expects 16 lines
  – subroutine expects 3 parameters; caller provides only 2
- Failure to defend against bad input
- Failure to fully test exceptional cases
- Failure to conform to requirements

13 Prevention and Detection of Design Defects
- Inspections
- Walkthroughs
- Measurement of problems and time to correct
- Trace matrices

14 Examples of Coding Defects
- Any programming error (bug)
- Failure to implement the design
- Failure to implement the requirements
- Failure to test the software
- Failure to test the design
- Failure to test the requirements
- etc.

15 Another Type of Coding Defect
- All errors that result from attempts to "correct" a design error in code without first correcting the design
  – These may only show up later, when the design is corrected
  – Or they may show up during maintenance, when the design does not match the code

16 Related Defects
- Documentation errors
- Installation errors
- Testing errors
- etc.

17 Defects due to the Software Development Process
- Failure to document specifications and assumptions
- Failure to control changes to specifications, design, etc.
- Failure to control the configuration of code, documents, and data
  – Which part goes with what?
- Failure to take the time to inspect, evaluate, review, test, etc.

18 Minimizing Defects via Prevention Techniques
- Good specifications, design practices, and standards
- Good staff
  – Experience
  – Training in the application, language, methods, and process
- Good process practices
  – Check for completion of exit criteria
  – Good standards, such as complexity limits

19 Minimizing Defects via Prevention Techniques (continued)
- Configuration management
- Good management techniques
- Good programming practices
- Continuous process improvement

20 Programming Standards and Practices; Process Standards
Must be...
– developed
– documented
– kept up to date
– communicated
– "bought into" by practitioners
– followed
– measured
– evaluated and improved
All of this costs money and effort. It can be done by a software QA organization or by the software engineers themselves.

21 Defect Detection Methods
- Reviews
- Inspections
- Walkthroughs
- Tests
- Measures
  – Tabulate the number of defects
  – Track each defect to its source
  – Measure the effectiveness of process steps
- Audits

22 Defect Removal
- There are many debugging techniques
- Debugging tools can help somewhat
- But studies show debugging should be the last resort
  – It costs the most
  – It is the least effective
  – It is the most error-prone

23 Guidelines for Audits, Reviews, Inspections, Walkthroughs, etc.
- Confirm that work is traceable to predecessor tasks
  – Code traceable to design
  – Design traceable to requirements
  – Tests traceable to design or requirements, depending on the type of test
- Confirm completeness of work
- Confirm conformance to standards
- Ascertain correctness
- Ascertain quality, or the lack thereof
  – Undue complexity, etc.

24 Additional Guidelines
- Focus on finding errors
  – Not on proving that the software works
- Make a list of action items
  – And follow up: "track to completion"
  – The corrective-action process should be documented
- Automate as much as possible
  – Spell checkers, etc.
- But do not omit human judgment

25 Different Tests Find Different Kinds of Defects
- White-box tests tend to find more of the defects related to coding errors
- Black-box tests tend to find more of the defects related to requirements satisfaction

26 Tracking Defect Containment
- The concept is to learn where defects are coming from
- This requires collecting information about each defect

27 Step 1 -- Track Defects and Record Phase of Origin
- For each step in the process, measure the defects detected
- For each defect, categorize it by the step that originated it
Defect report: Phase where found ____  Phase where introduced ____

28 Step 2 -- At the End of Each Phase, Record by Phase of Origin
Matrix: phase of injection × phase of detection.
Cell (i, j) indicates the number of defects created in phase i and detected in phase j.
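Tallying defect reports into this matrix is straightforward; here is a minimal sketch, where the phase names and sample reports are illustrative assumptions, not from the module.

```python
# Hypothetical sketch of Step 2: count each defect report into a matrix
# keyed by (phase of injection, phase of detection).
from collections import Counter

PHASES = ["Requirements", "Design", "Code", "Test"]

# Illustrative defect reports, each recording where the defect was
# introduced and where it was found.
reports = [
    {"introduced": "Requirements", "found": "Requirements"},
    {"introduced": "Requirements", "found": "Test"},
    {"introduced": "Design", "found": "Code"},
    {"introduced": "Code", "found": "Code"},
]

matrix = Counter((r["introduced"], r["found"]) for r in reports)

# Cell (i, j): defects created in phase i and detected in phase j.
for i in PHASES:
    print(f"{i:>12}: {[matrix[(i, j)] for j in PHASES]}")
```

A `Counter` keyed on the (injection, detection) pair keeps the tally sparse: phases with no defects simply have no entry and read back as zero.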

29 Step 3 -- Measure Defect Containment
- Defects that are "in-phase" (detected in the same phase where they originated) are said to be "contained"
- Defects that are "out-of-phase" (detected in a later phase) are said to be "leaking"

30 Contained and Leaking Defects
Chart: phase of injection × phase of detection; in-phase (diagonal) cells are contained, out-of-phase (off-diagonal) cells are leaking.
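The contained/leaking split is just the diagonal versus off-diagonal cells of the matrix from Step 2; a minimal sketch, with invented cell values for illustration:

```python
# Hypothetical sketch of Step 3: split a containment matrix into
# "contained" (injection phase == detection phase) and "leaking"
# (detected in a later phase) counts. Values are made up.

matrix = {
    ("Requirements", "Requirements"): 12,
    ("Requirements", "Test"): 5,
    ("Design", "Design"): 8,
    ("Design", "Code"): 3,
}

contained = sum(n for (inj, det), n in matrix.items() if inj == det)
leaking = sum(n for (inj, det), n in matrix.items() if inj != det)

print(contained, leaking)  # 20 8
containment_rate = contained / (contained + leaking)
print(round(containment_rate, 2))  # 0.71
```

Tracking the containment rate over time is one simple way to see whether process changes are catching defects earlier.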

31 Large Numbers Indicate Process Problems
- Large numbers in any column indicate that you are generating many defects in that process phase
- A large number in a "leaking" cell means you are also paying a lot for rework
This tells you where to focus process-improvement efforts.

32 Defect Containment Chart
The least costly defects are on the diagonal -- "contained" within the step where they were caused.

33 Escaping Defects Are Those Not Detected until After Release
Escaping defects cost the most of all.

34 Issues with This Method
- The definition of a defect must be adhered to consistently across the project
- As shown, there is no distinction by type or severity of defect (but this distinction can be made if type and severity information are collected with the defect data)

35 Optional -- Record Type, Severity, or Other Information
Defect report: Phase where found ____  Phase where introduced ____  Severity ____  Type ____  Estimated cost to fix ____  etc.

36 Key Lessons Learned from These Methods
- Detecting and correcting defects early greatly reduces cost and reduces post-release defects
- There may be reluctance to collect defect data during development
  – The most professional software engineers develop an appreciation for the value of this type of information

37 Other Information Can Be Recorded -- Such as Labor Cost
Matrix: phase of injection × phase of detection.
Cell (i, j) indicates the average labor cost to repair a defect created in phase i and detected in phase j.

38 Total Repair Cost
- Multiplying the defect containment chart by the cost-to-fix chart, cell by cell, gives total repair cost
Defect counts × cost to fix (cell-wise multiplication) = repair cost
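The cell-wise multiplication is the Hadamard product of the two matrices; a minimal sketch with invented 2×2 matrices (rows = phase of injection, columns = phase of detection):

```python
# Hypothetical sketch: total repair cost as the cell-wise product of the
# defect-count matrix and the average cost-to-fix matrix. All numbers
# are illustrative.

counts = [[12, 5],            # defects injected in phase 1, detected in phases 1, 2
          [0,  8]]            # defects injected in phase 2
cost_to_fix = [[1.0, 10.0],   # in-phase fixes are cheap; leaked fixes cost more
               [0.0, 2.0]]

repair_cost = [[c * k for c, k in zip(row_c, row_k)]
               for row_c, row_k in zip(counts, cost_to_fix)]

total = sum(sum(row) for row in repair_cost)
print(repair_cost)  # [[12.0, 50.0], [0.0, 16.0]]
print(total)        # 78.0
```

Note how the 5 leaked defects dominate the total (50.0 of 78.0) even though the diagonal holds more defects: leakage, not raw defect count, drives cost.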

39 Total Repair Costs
Matrix: phase of injection × phase of detection.
Cell (i, j) indicates the total cost to repair all defects created in phase i and detected in phase j.

40 Rework Costs
Matrix: phase of injection × phase of detection.
Costs off the diagonal are rework costs.

41 This Can Help You Justify Process Improvements
- Rework costs are the equivalent of "software scrap"
- If you can reduce scrap by investing in defect-prevention activities, you can save a lot of money (see the earlier modules on software quality)

42 Cost to Prevent Can Also Be Measured
- Simply record the cost of each defect-prevention activity
  – and then compare that cost with the resulting savings in repair cost
- Some organizations are saving millions of dollars on major projects
Chart: investment vs. rework reduction.
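The comparison the slide describes reduces to a one-line calculation; the dollar figures and defect counts below are invented for illustration.

```python
# Hypothetical sketch: net savings from a defect-prevention activity,
# i.e. the rework it avoids minus what it costs to perform.

def prevention_net_savings(activity_cost: float,
                           defects_prevented: int,
                           avg_repair_cost: float) -> float:
    """Savings on repair minus the investment in prevention."""
    return defects_prevented * avg_repair_cost - activity_cost

# Example: inspections cost $40,000 and prevent 120 leaked defects that
# would each have cost about $500 to repair later.
net = prevention_net_savings(40_000, 120, 500)
print(net)  # 20000 -- positive, so the investment pays off
```

The average repair cost per leaked defect comes straight from the cost-to-fix matrix discussed earlier, which is what makes this justification data-driven rather than anecdotal.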

43 Analyzing Defect Data at the Organizational Level
- By collecting data from many projects, we can show historical costs for rework
- We can also show patterns of defect containment

44 Organizational Analysis of Defect Containment Data
- Analysis of defect containment data for many projects over a period of time may reveal such process information as:
  – The most frequent types of defects
  – The most costly defects
  – The time required to fix defects
  – The process steps generating the most defects
  – Which design standards help or hurt defect rates

45 Example of a Process Metric
Charts: defect data from SA/SD projects vs. defect data from OO projects, showing the SA/SD defect pattern and the OO defect pattern.

46 Overview of the Cost Data Collection Process for an Organization
Chart (recoverable content): Each project logs hours expended per defect in a phase-of-injection × phase-of-detection table over the phases RA, PD, DD, CUT, and I&T. Tables from several products are combined into a capability matrix: each table entry is analyzed for its mean and standard deviation, yielding an expected range (±2 standard deviations) of hours per defect. A program's defect history, in hours per defect, is then validated against this historical range, and entries outside the range are flagged ("Out of range here!").

47 In Summary
- Defect containment data are relatively simple to collect
- But if you collect them consistently and analyze them thoroughly, you can manage many aspects of your project
- And you can predict and manage many of your project's costs and quality results

48 References
Basili, Victor R. "Applying the Goal/Question/Metric Paradigm in the Experience Factory," 10th Annual CSR Workshop, October 1993.
Hedstrom, John, and Dan Watson. "Developing Software Defect Prediction," Proceedings, Sixth International Conference on Applications of Software Measurement, 1995.
Snyder, Terry, and Ken Shumate. Defect Prevention in Practice (draft white paper), Hughes Aircraft Company, October 22, 1993.

49 References (concluded)
Ross, Sheldon M. Introduction to Probability Models, Academic Press, 1993.
Xie, M. Software Reliability Modeling.

50 END OF MODULE 35

