
1 Peer Reviews A. Winsor Brown Jan. 13, 2010
CS577b, Spring 2010. Peer Reviews. A. Winsor Brown, Jan. 13, 2010. (c) AWBrown & CSSE

2 Goals of Presentation
Background:
  CeBASE types of defects vs. methods of removal
  COQUALMO Model
Reviews per IEEE J-STD-016-1995 [repeat from 577a03]: Standard for Information Technology, Software Life Cycle Processes, Software Development, Acquirer-Supplier Agreement
Quality Models and Metrics
Peer Reviews as practiced in CS577
Fagan's Inspections

3 CeBASE Empirical Data Shows
CeBASE: Center for Empirically Based Software Engineering
An early-2000 joint effort with UMD (Vic Basili); funded by NSF
Factor-of-100 growth in software cost-to-fix from requirements to test [other studies show a factor of 1000]
Technique selection guidance: peer reviews vs. test

4 Factor-of-100 Growth in Software Cost-to-Fix
[Figure: relative cost-to-fix of a software defect by lifecycle phase, growing by roughly a factor of 100 from requirements to test; chart not reproduced in the transcript.]

5 Technique Selection Guidance
“Under specified conditions, …”
  Peer reviews are more effective than functional testing for faults of omission and incorrect specification (UMD, USC).
  Functional testing is more effective than reviews for faults concerning numerical approximations and control flow (UMD, USC).

6 COQUALMO
The COQUALMO model contains two sub-models:
1) The Defect Introduction model: uses the required subset of COCOMO cost drivers and three internal baseline defect rates (requirements, design, and code baselines).
2) The Defect Removal model: uses three (orthogonal) defect removal profile levels,
  Automated Analysis
  People Reviews
  Execution Testing and Tools
along with the prediction produced by the Defect Introduction model to estimate the resultant defect density.
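How the two sub-models combine can be sketched in a few lines: the residual defects of each artifact type are the introduced defects reduced by one minus each removal profile's defect removal fraction, and the total is expressed per KSLOC. The sketch below is only a schematic of that form, written for this transcript; the baseline rates and removal fractions are illustrative placeholders, not calibrated COQUALMO values, and the function name is invented.

```python
# Schematic COQUALMO-style estimate of residual defect density.
# Rates and DRF values are illustrative placeholders, NOT calibrated COQUALMO constants.

INTRODUCED_PER_KSLOC = {"requirements": 10, "design": 20, "code": 30}  # assumed baselines

# Assumed defect removal fractions per (artifact type, removal profile), in 0..1.
DRF = {
    "requirements": {"automated_analysis": 0.10, "people_reviews": 0.40, "execution_testing": 0.30},
    "design":       {"automated_analysis": 0.15, "people_reviews": 0.45, "execution_testing": 0.35},
    "code":         {"automated_analysis": 0.25, "people_reviews": 0.50, "execution_testing": 0.55},
}

def residual_defect_density(ksloc: float) -> float:
    """Residual defects per KSLOC after applying all three removal profiles."""
    residual = 0.0
    for artifact, introduced_rate in INTRODUCED_PER_KSLOC.items():
        remaining = introduced_rate * ksloc
        for fraction in DRF[artifact].values():
            remaining *= (1.0 - fraction)   # each profile removes its fraction of what is left
        residual += remaining
    return residual / ksloc

if __name__ == "__main__":
    print(f"Estimated residual defect density: {residual_defect_density(100):.1f} defects/KSLOC")
```

Raising any of the three profile levels in the real model raises the corresponding removal fractions, which is how COQUALMO rewards investment in analysis tools, peer reviews, or testing.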

7 COQUALMO
[Figure: COQUALMO model structure, showing the Defect Introduction and Defect Removal sub-models; diagram not reproduced in the transcript.]

8 IEEE J-STD-016-1995
Standard for Information Technology: Software Life Cycle Processes, Software Development, Acquirer-Supplier Agreement.
Is usable with any development strategy: it is structured to accommodate incremental, evolutionary, and other development models better than the traditional "waterfall" model. It avoids time-oriented dependencies and implications, provides alternatives to formal reviews (which can force a waterfall development model), and explains how to apply the standard across multiple builds or iterations.

9 Reviews per IEEE J-STD-016-1995
Joint technical reviews - objectives:
a) Review evolving software products, using as criteria the software product evaluation criteria in annex L; review and demonstrate proposed technical solutions; provide insight and obtain feedback on the technical effort; surface and resolve technical issues.
b) Review project status; surface near- and long-term risks regarding technical, cost, and schedule issues.
c) ...
d) ...
e) ...

10 Reviews per IEEE J-STD-016-1995
Joint management reviews - objectives:
a) Keep management informed about project status, directions being taken, technical agreements reached, and the overall status of evolving software products.
b) Resolve issues that could not be resolved at joint technical reviews.
c) Arrive at agreed-upon mitigation strategies for near- and long-term risks that could not be resolved at joint technical reviews.
d) Identify and resolve management-level issues and risks not raised at joint technical reviews.
e) Obtain commitments and acquirer approvals needed for timely accomplishment of the project.

11 Agenda
Reviews per IEEE J-STD-016-1995
Quality Models and Metrics
Peer Reviews as practiced in CS577

12 Quality Model Types
All four types represented: Process, Product, Property, Success (P3S)
Product
  What's a defect? Problem reports
Property
  Defects [over time]: removal and residual injection rates
  Defect density
Success
  Defect removal rate
  Problem/Trouble Reports open over time
Process
  Macro: defect injection and removal; workflow
  Micro: ETVX, defect removal techniques, etc.
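Two of the measures named above, defect density and defect removal rate, reduce to simple ratios. The sketch below uses invented counts purely for illustration, and it takes "defect removal rate" in the fraction-removed sense; the slide could equally mean removals per unit time.

```python
# Illustrative computation of two metrics named above: defect density
# (defects per KSLOC) and defect removal rate. All counts are invented examples.

def defect_density(defects_found: int, ksloc: float) -> float:
    """Defects per thousand source lines of code."""
    return defects_found / ksloc

def defect_removal_rate(removed: int, total_present: int) -> float:
    """Fraction of the known defects that were removed (0..1)."""
    return removed / total_present if total_present else 0.0

if __name__ == "__main__":
    print(f"Density: {defect_density(42, 12.5):.2f} defects/KSLOC")
    print(f"Removal rate: {defect_removal_rate(35, 42):.0%}")
```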

13 Product Models Related to Quality
What's a defect?
  An instance of non-conformance with the initiating requirements, standards, exit criteria, or other "checklists"
  Can exist in the accuracy/completeness of requirements, standards, and associated interface/reference documents
  Determined ONLY by the responsible author of an artifact
  Typically starts out as a concern in informal or agile reviews
What's an "issue"?
  A concern that can NOT be fixed by the author of the artifact under review
  In developments with large numbers of people or cycles, issues are usually tracked to closure.

14 Defect Categories
Severity
Major
  A condition that causes an operational failure or malfunction, or prevents attainment of an expected or specified result
  Information that would lead to an incorrect response or misinterpretation of the information by the user
  An instance of non-conformance that would lead to a discrepancy report if implemented as is
Minor
  A violation of standards, guidelines, or rules that would not lead to a discrepancy report
  Information that is undesirable but would not cause a malfunction or unexpected results (bad workmanship)
  Information that, if left uncorrected, may decrease maintainability

15 Defect Categories (continued)
Class
Missing
  Information that is specified in the requirements or standards but is not present in the document
Wrong
  Information that is specified in the requirements or standards and is present in the document, but the information is incorrect
Extra
  Information that is not specified in the requirements or standards but is present in the document

16 Defect Categories (continued)
Type
Unavoidable
  Unavoidable defects (AKA changes) arise because the methods, techniques, or approaches being followed necessitate changes. Examples include changes arising from the dynamics of learning, exploration in IKIWISI situations, code or screen-content reorganizations taken on as an "afterthought", replacement of stubs or placeholders in code, etc. Such situations are often "planned for" and expected to occur.
Avoidable
  Changes in analysis, design, code, or documentation arising from human error, which could have been avoided through better analysis, design, training, etc. Examples include stub replacement that violates win conditions or requirements such as execution time or memory space: for instance, the replacement of a "stub" that breaks a critical timing constraint.

17 Defect Categories (continued)
Severity   Class     Type
Major      Missing   Avoidable
Minor      Wrong     Unavoidable
           Extra
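Together the three dimensions give a complete label for any logged defect. Below is a minimal sketch of a defect record built on that taxonomy; the class names, fields, and example values are hypothetical, chosen only to mirror the slide's categories.

```python
# Minimal sketch of a defect record using the Severity/Class/Type taxonomy above.
# Field names beyond the three categories, and the example values, are invented.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    MAJOR = "major"
    MINOR = "minor"

class DefectClass(Enum):
    MISSING = "missing"
    WRONG = "wrong"
    EXTRA = "extra"

class DefectType(Enum):
    AVOIDABLE = "avoidable"
    UNAVOIDABLE = "unavoidable"

@dataclass
class Defect:
    artifact: str          # which document or code unit the defect was found in
    description: str
    severity: Severity
    defect_class: DefectClass
    defect_type: DefectType

# Example: a missing requirement that would trigger a discrepancy report if left as is.
example = Defect(
    artifact="requirements document, section 2.1",
    description="login timeout requirement omitted",
    severity=Severity.MAJOR,
    defect_class=DefectClass.MISSING,
    defect_type=DefectType.AVOIDABLE,
)
```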

18 Agenda
Reviews per IEEE J-STD-016-1995
Quality Models and Metrics
Peer Reviews as practiced in CS577

19 Role Based Agile Internal/Informal Review
The main activities are the following:
  Planning
  Overview (optional)
  Preparation
  Review Meeting
  Rework
Participants (four recommended):
  Review Leader (recommended: the quality focal point for CS577)
  "Coder" (or the next type of person in the artifact's chain)
  "Tester" (reviews external interfaces; generates gedanken test approaches)
  Author

20 Agile Internal/Informal Review
Planning
Overview
Preparation
Concern Log
Problem List
Review
Rework
Review Result Summary
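One plausible reading of this flow, assuming the Concern Log is filled in during Preparation and the confirmed concerns become the Problem List at the review meeting (the transcript does not state this association explicitly), is sketched below with invented names.

```python
# Sketch of one Agile Internal/Informal Review (AIR) pass: the activities listed
# above as an ordered sequence, and the named forms as simple records.
# The attachment of forms to particular steps is an assumption for illustration.
from dataclasses import dataclass, field
from typing import List

AIR_STEPS = ["Planning", "Overview", "Preparation", "Review", "Rework"]  # order from the slide

@dataclass
class ConcernLogEntry:
    reviewer: str
    location: str      # where in the artifact the concern was raised
    concern: str

@dataclass
class AirReview:
    artifact: str
    concern_log: List[ConcernLogEntry] = field(default_factory=list)
    problem_list: List[str] = field(default_factory=list)   # concerns the author confirms as defects
    summary: str = ""                                        # Review Result Summary

    def log_concern(self, reviewer: str, location: str, concern: str) -> None:
        """Recorded during Preparation (assumed), before the review meeting."""
        self.concern_log.append(ConcernLogEntry(reviewer, location, concern))

    def close(self, confirmed: List[str], summary: str) -> None:
        """After Review and Rework, confirmed concerns become the problem list."""
        self.problem_list = confirmed
        self.summary = summary
```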

21 ETVX Paradigm
Relationships:
  A software development process, e.g., specifying, designing, coding
  Role-based peer review: tasks distributed to team members
  Entry Criteria -> Task -> Validate -> Exit
Emphasizes SIMPLE process management: simple because we are not accounting for forks and joins, i.e., processes that produce multiple products (used by different subsequent processes) or use multiple inputs from different predecessor processes.
At the highest level we see a major task with a transition, where the validate box would in this case be the PDR. Below that we see the division of work amongst the software development team. On the right-hand side of the foil we see Fagan's Inspection as the in-process, work-product validation.
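A minimal sketch of the ETVX (Entry criteria, Task, Validation, eXit criteria) pattern itself, assuming the criteria can be expressed as boolean checks; all names are illustrative, not part of the course material.

```python
# Minimal ETVX sketch: a step runs only when entry criteria hold, its work product
# is validated in-process, and exit criteria gate the transition to the next step.
# Function and parameter names are illustrative assumptions.
from typing import Callable, List

def run_etvx_step(
    entry_criteria: List[Callable[[], bool]],
    task: Callable[[], object],
    validate: Callable[[object], bool],
    exit_criteria: List[Callable[[object], bool]],
):
    """Run one process step under the ETVX discipline and return its work product."""
    if not all(check() for check in entry_criteria):
        raise RuntimeError("Entry criteria not met; step cannot start")
    product = task()                       # e.g., specifying, designing, coding
    if not validate(product):              # in-process work-product validation
        raise RuntimeError("Validation found blocking defects")
    if not all(check(product) for check in exit_criteria):
        raise RuntimeError("Exit criteria not met")
    return product
```

In the setting described on this slide, the validation callable could stand in for the role-based peer review (or Fagan's Inspection) shown on the right of the foil.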

22 CS577 IICM-Sw Defect Reporting Concepts
Range of defect identification and reporting mechanisms:
One at a time:
  Problem report system
Multiple issues/problems found by a single reviewer:
  Agile Artifact Review (AAR): only two types of forms (Issues/Concern Log and Defect List)
  Agile Internal/Informal Review (AIR): two types of forms
  Agile Formal Review: three different types of forms
  Internal/Informal Review: four different types of bigger forms
  Formal Review: four different types of bigger forms
  Fagan's Inspection: five different types of forms

