
SE 652 Team Software Project (TSP), June 19, 2007: High Level Designs, Code Inspections & Measurement (2007_06_19_Overview_Inspections.ppt)



Presentation transcript:

1 Team Software Project (TSP), June 19, 2007: High Level Designs, Code Inspections & Measurement

2 Outline
– High Level Design Phase Review
– Inspection Questions
– Power
– Code Inspection
– Measurement
– System Test Plan Review
– Next Phases (Implementation & Test)

3 Due Today
– Completed & inspected High Level Design (SDS) & Integration Test Plan
– Completed & inspected Configuration Management Plan
– SRS inspection quality records (LOGD, INS, baseline sheets)
– Completed & inspected Integration Test Plans (incl. LOGD, INS, baseline forms)
– Completed & inspected coding, naming & design standards (incl. LOGD, INS, baseline forms)
– System Test Plans (draft for inspection)
– Updated project schedule & measurement data collected
– Time, defect & size data collected
– Updated project notebook

4 Software Design Specification (SDS)
Input:
– Conceptual Design
– Requirements (SRS)
Design objectives:
– Principal parts
– How the parts interact
– How they are put together

5 Design Standards
– Naming conventions: function, file, variable & parameter names; defines, globals, publics, statics, etc.
– Interface formats
– Variable handling
– Error codes
– System & error messages
– Defect standards: severities, defect types, root-cause bucketing
– LOC counting

6 Design for Reuse
Reusable functions should be:
– Self-contained
– Cleanly isolated
– Clearly & concisely documented (usage, interfaces, returns, errors)
Examples of successful reusable components?

7 Design for Testability
– Unit test harnesses
– Simulation testing
– Black box testing: verifies the program's external interfaces
– White box testing: also considers the program's logical paths & structure; typically requires special tools (e.g. code coverage) and supporting code

8 Integration Test Plan
Objective: verify all system component interfaces.
Activity:
– Review all interfaces (as defined in the SDS)
– Specify how to test them
Recommendation: inspect the SDS & Integration Test Plan simultaneously; the two could also be combined into a single document.

9 System Test Plan
Areas to cover:
– Installation
– Start-up
– All required functions available & working as specified
– Diabolical cases (e.g. power failures, corner cases, incorrect handling)
– Performance
– Usability
Includes:
– Test cases you plan to run (numbered / named)
– Expected results
– Ordering of testing & dependencies
– Supporting materials needed
– Traceability to requirements

10 Power! What is it?
– “The probability that one actor within a social relationship will be in a position to carry out his own will despite resistance” – Max Weber
– “An interpersonal relationship in which one individual (or group) has the ability to cause another individual (or group) to take an action that would not be taken otherwise” – Steers & Black

11 Types of Power
– Referent/Charismatic: based on personal qualities, characteristics, reputation
– Expert: knowledge or expertise relevant to the person; power only in the domain of expertise
– Legitimate: the person has the right to exert power in a specified domain
– Reward: controls rewards a person wants
– Coercive: based on fear; the person can administer punishment

12 Effectiveness
– Which types of power are most effective?
– Power within the class teams?

13 Inspections
Inspection objectives:
– Find defects at the earliest possible point
– Verify to specification (e.g. design to requirements)
– Verify to standards
– Collect element and process data
– Set a baseline point
Exit criteria:
– All detected defects resolved
– Outstanding, non-blocking issues tracked
Techniques & methods:
– Generic checklists & standards
– Inspectors prepared in advance
– Focus on problems, not on resolution
– Peers only
– “Mandatory” data collection
Roles: moderator, reader, recorder, inspector, author

14 Inspection Logistics
– Identify the moderator (for TSPi, use the process manager)
– Inspection briefing (identify inspection roles, set date/time for the inspection)
– Review the product:
  – Individual reviews
  – Record time spent reviewing
  – Identify defects, but do not log them on the LOGD form (defects are recorded during the inspection on the INS & LOGD forms)
  – Typically allow 3-5 days for an adequate review period
– Inspection meeting:
  – Obtain & record preparation data
  – Step through the product one line or section at a time
  – Raise defects or questions
  – Defects recorded by the moderator on the INS form
  – Defects recorded by the producer on the LOGD form (no need to use Change Requests)
  – Peripheral issues & action items should be recorded in the ITL log

15 Inspection Logistics (continued)
– Estimate remaining defects: TBD (but, for each defect, record all members who identified it)
– Conclude the meeting:
  – Agree on a verification method for defects
  – Agree on disposition (e.g. approved, approved with modification, re-inspect)
– Rework the product & verify fixes (e.g. by the moderator)
– Obtain signatures of all inspectors on the baseline sheet (file as a quality record)

16 Spelling Code Inspection

17 Measurement Data & Metrics
Base metrics:
– # & type of defects found (major, minor)
– For each defect, who found it
– # of pages inspected, preparation time (per inspector), inspection time
Measures:
– Preparation rate = # pages / average preparation time
– Inspection rate = # pages / inspection time
– Inspection defect rate = # major defects / inspection time
– Defect density = # estimated defects / # of pages
– Inspection yield = # defects found / # estimated defects (individual & team)
– SRS phase defect containment (%) = 100 * # defects removed at the step / (incoming defects + injected defects)
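The measures above are simple ratios; a minimal sketch in Python, using made-up illustrative inputs (the page counts, times, and defect counts are assumptions, not data from the course):

```python
def inspection_measures(pages, prep_times, inspection_time,
                        major_defects, estimated_defects):
    """Compute the basic inspection measures defined on the slide."""
    avg_prep = sum(prep_times) / len(prep_times)
    return {
        "preparation_rate": pages / avg_prep,            # pages per prep hour
        "inspection_rate": pages / inspection_time,      # pages per meeting hour
        "defect_rate": major_defects / inspection_time,  # major defects per hour
        "defect_density": estimated_defects / pages,     # estimated defects per page
        "yield_pct": 100 * major_defects / estimated_defects,
    }

# Hypothetical inspection: 20 pages, three inspectors, a 2-hour meeting.
m = inspection_measures(pages=20, prep_times=[2.0, 1.5, 2.5],
                        inspection_time=2.0, major_defects=8,
                        estimated_defects=10)
print(m["preparation_rate"])  # 10.0 pages/hour
print(m["yield_pct"])         # 80.0
```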

18 Inspections: Estimating Defects (Capture-Recapture Example)
– Catch 20 fish in a lake, tag & release them
– Catch 25 more; 5 are tagged
– How many fish are in the lake?
– 5 out of 25 = 20 out of total population; Total = ?

19 Capture-Recapture Formula
Fishing example: 5 out of 25 = 20 out of Total
– C = # caught in both tries (e.g. 5)
– A = # from the first try (e.g. 20)
– B = # from the second try (e.g. 25)
So C out of B = A out of Total:
C/B = A/Total, therefore Total = A*B/C
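The formula above answers the fishing question directly; a one-line sketch:

```python
def estimate_total(a, b, c):
    """Capture-recapture estimate: C/B = A/Total, so Total = A*B/C."""
    return a * b / c

# Fishing example from the slide: tag 20, recatch 25, 5 of them tagged.
print(estimate_total(20, 25, 5))  # 100.0 fish estimated in the lake
```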

20 Estimating Defects: the 2-Developer Case
– C = # found in both tries = # found by both developers
– A = # from the first try = # found by developer A
– B = # from the second try = # found by developer B
Total # defects = A * B / C
Yield = # found / total # defects, expressed as a percentage:
  = 100 * (A+B-C) / (A*B/C) = 100 * (A+B-C)*C / (A*B)
Humphrey's 2-developer example: A found 7, B found 5, with 3 defects in common.
Total estimated # defects = (7*5)/3 ≈ 12 (rounded up)
Yield = 9 / 12 = 75%
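Humphrey's example can be checked mechanically. A sketch of the 2-developer estimate, rounding the total up as the slide does:

```python
import math

def two_developer_estimate(found_a, found_b, found_both):
    """2-developer capture-recapture: Total = A*B/C, found = A+B-C."""
    total = found_a * found_b / found_both  # estimated total defects in product
    found = found_a + found_b - found_both  # distinct defects actually found
    return total, found

# Humphrey's example: A found 7, B found 5, 3 in common.
total, found = two_developer_estimate(7, 5, 3)
total_rounded = math.ceil(total)            # 35/3 ~ 11.67, rounded up to 12
yield_pct = 100 * found / total_rounded
print(total_rounded, found, yield_pct)  # 12 9 75.0
```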

21 Estimating Defects: a 3-Developer Example
Three developers in an inspection identified 10 unique defects (numbered 1 to 10):
– Harry found defects 1, 2, 3, 4 & 5
– Chapin found defects 1, 2, 4, 6 & 7
– Sue found defects 4, 6, 7, 8, 9 & 10
Estimate the total # of defects in the product prior to the inspection, and the total inspection yield.
– Sue found the most defects unique to her (3), and identified 6 defects in all
– Combining Harry's & Chapin's defects gives 7 identified, 3 of them in common with Sue
– Total product defects = 6 * 7 / 3 = 14
– Yield % = 100 * 10/14 = 71%
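The 3-developer case reduces to the 2-developer formula by treating Sue as one "fishing try" and the combined Harry+Chapin set as the other. A sketch using Python sets with the defect numbers from the slide:

```python
harry  = {1, 2, 3, 4, 5}
chapin = {1, 2, 4, 6, 7}
sue    = {4, 6, 7, 8, 9, 10}

# Sue has the most defects unique to her, so she is one "try"
# and the merged Harry+Chapin set is the other.
others = harry | chapin                # {1..7}: 7 defects
common = len(sue & others)             # {4, 6, 7}: 3 in common
total = len(sue) * len(others) / common
found = len(harry | chapin | sue)      # 10 distinct defects found
print(total, found, round(100 * found / total))  # 14.0 10 71
```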

22 Capture-Recapture Assumptions & Cautions
– The population is homogeneous
– The population is randomly distributed
– The sample sizes are reasonably large

23 Backup Slides

