1 Reviews: Software Inspections
Henrik Bærbak Christensen, DAIMI

2 The Two Aspects
Testing has two conflicting goals:
– to detect defects: "be as mean as possible"
– to ensure quality: "be as nice as possible"
These goals can be supported by two radically different types of analysis:
– Static analysis
– Dynamic analysis
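For example, a compiler warning about an unused variable is a static finding, whereas a failing unit test is a dynamic one.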

3 Static Analysis
– the "module view" in software architecture
– Read artefacts: code, UML/design, requirements, etc.
– by humans: developers, architects, testers, customers, ...
  or by tools: parsers, compilers, static analysis tools, ...
– when there is something to read, that is, early!

4 Dynamic Analysis
– the "execution (component-connector) view"
– Execute artefacts: code, (formal models)
– by humans: testers, developers
  or by tools: automatic test cases, capture/replay tools, formal verification tools
– when artefacts are executable, that is, late!

5 Classification?
Is static analysis (reviewing) a testing activity?
Burnstein:
– Yes: testing is any technique applicable to detecting defects and estimating quality
Many others (including me?):
– No: testing is by definition execution-based and thus a dynamic technique

6 Reviews
A review is a matter of reading material.
Reviews have a number of advantages over other detection techniques, like testing:
– they can be made early!
– no need for running code

7 Review Types
A review is reading material with the intent of finding defects and assessing quality.
Review is the "superclass" of more specific types of material-reading processes:
– Inspection: a highly formalized process whose result is a written report.
– Walk-through: an informal process whose result is often simply learning.

8 Review Focus
Reviews are means to:
– identify parts that must be improved (they contain defects)
– identify parts that do not need improvement (quality assurance)
– identify specific anomalies (broader than defects)
– ensure conformance to organizational standards (readability, etc.)

9 Review Advantages
Reviews are beneficial because they:
– can be made early
– can be made on any type of artefact: requirements, design, plans, test cases, ...
– find other types of defects/anomalies than testing does: documentation defects, standards violations, naming, ...
– enable mutual learning across the organization: testers/developers, junior/senior, ...
– spot reuse opportunities
– raise awareness of quality issues
– enable more effective test planning
– help build the organization's own checklists and review expertise

10 A Code-Related Advantage
Review is a one-phase approach:
– it does defect detection and localization at the same time!
Testing is a two-phase approach:
– a broken test detects a failure
– debugging/inspection is then needed to localize the defect

11 Inspections
Fagan (1976) defined a rigorous process for doing reviews: inspections.
It basically formalizes the review by:
– defining stakeholder roles
– defining a specific process to be carried out, outlining phases and criteria
– defining artifacts to be used in the process
… and he reports improved quality with less effort …

12 (Fagan) Roles
Each team member has one or more roles:
– Author: the person who wrote the object of inspection (diagrams, specs, code)
– Moderator: the person who manages and facilitates the process
– Scribe: records the results of the inspection
– Inspector: finds anomalies (faults, omissions, inconsistencies, etc.)
Authors are notoriously bad at inspecting their own code!
But the moderator and scribe usually also act as inspectors.

13 (Burnstein) Roles
[Table of Burnstein's review roles, not reproduced in the transcript; only the annotation "(moderator)" survives.]

14 Process
– [Overview: whole team] (omitted in Burnstein's outline)
– Preparation: individual
– Inspection: whole team
– Rework: author
– Follow-up: moderator, or a new inspection

15 Process
Overview:
– the author describes the context and outlines what he has done
– the produced object is handed out (code/design …)
Preparation:
– each participant 'does their homework'
– reads and tries to understand …
– checklists help, and so does experience

16 Process
Inspection:
– the author paraphrases the code/design (see the sketch below)
  i.e. not "if (balance > CREDIT_LIMIT)" but rather "test whether the credit limit is exceeded"
– "Every piece of logic is covered at least once, and every branch taken at least once."
– the objective is to find anomalies!
– not to:
  find solutions (beware!!!)
  make the author appear foolish
– the moderator should be very aware of this
– output: a written report on the inspection
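A minimal sketch in C of the difference between reading tokens and paraphrasing intent; the account example, names, and limit are invented for illustration:

```c
#define CREDIT_LIMIT 1000

/* Hypothetical fragment under inspection. The comments show how the
   author paraphrases the intent instead of reading the tokens aloud. */
int charge(int balance, int amount) {
    balance += amount;            /* "add the charge to the balance" */
    if (balance > CREDIT_LIMIT) { /* "test whether the credit limit
                                      is exceeded" */
        return -1;                /* "reject: over the limit" */
    }
    return balance;               /* "accept and return the new balance" */
}
```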

17 Process
Rework:
– the author resolves the anomalies
Follow-up:
– the moderator checks that the anomalies have been resolved
– if too much has to be reworked (Fagan: > 5%), then a new full inspection is required
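To illustrate the threshold with invented figures: in a 400-line module, rework touching more than 5% of it, i.e. more than 20 lines, would call for a full re-inspection.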

18 Artifacts to Help
Common-error checklists:
– i.e. awareness in the inspection team about common pitfalls
– checklists must relate to the type of document reviewed: code, design, requirements, plans, ...
– they contain both language-independent and language-specific parts:
  Common: is every file open paired with a file close on all paths?
  Specific (C): are there any "if ( x = y ) { … }" mistakes or malloc errors? (both are illustrated in the sketch below)
– organization checklists are important to maintain
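A small C sketch, deliberately seeded with the pitfalls named above (the function, file, and buffer size are invented), showing the kind of code a checklist-armed inspector should flag:

```c
#include <stdio.h>
#include <stdlib.h>

/* Deliberately defective: an inspector using the checklist above
   should flag three anomalies in this function. */
int count_lines(const char *path, int x, int y) {
    FILE *f = fopen(path, "r");
    if (f == NULL)
        return -1;

    char *buf = malloc(4096);      /* malloc error: result never
                                      checked before use */
    if (x = y) {                   /* assignment where the comparison
                                      x == y was intended */
        free(buf);
        return 0;                  /* open/close not paired: this path
                                      leaks the open FILE* */
    }

    int lines = 0;
    while (fgets(buf, 4096, f) != NULL)
        lines++;

    free(buf);
    fclose(f);                     /* close happens only on this path */
    return lines;
}
```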

19 General Checklist
[Checklist table not reproduced in the transcript; it clearly has a document focus.]

20 General Programming Checklist

21 Programming Language Specific

22 OO Checklist
You can find checklists in:
– Cem Kaner: Testing Computer Software (2nd Ed.), International Thomson Computer Press, 1993, §4 and Appendix A
– Fagan
Many of them are non-OO …
Dunsmore produced the following OO checklist (IEEE Trans. Software Eng., vol. 29, no. 8, 2003). [The checklist itself is not reproduced in the transcript.]

23 Reporting
Inspection output is a written report containing:
– comments on all checklist items (those of a general nature)
– summary/status, signed by all reviewers
– defect list: defect class, severity, x-ref to line/page
– review metrics: statistical data for process learning
– status: accept, conditional accept, or reject

24 Defect List
Defects must of course be documented to be corrected.
Burnstein and Fagan suggest severity and defect classes:
– Type: shorthand for the defect type as defined by the checklist
– Severity:
  Major: high impact on quality, must be corrected
  Minor: low impact, we may live with it
– Classification: missing, incorrect, superfluous
– X-ref: where exactly is the defect in the artefact?

25 Reporting
Anomalies must be documented in a compact form. Example scheme: X/Y/Z
– X = type
– Y = missing, wrong, extra
– Z = severity
A concrete instance follows below.
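An invented instance: "Logic/missing/Major, account.c line 17: no check that amount is non-negative." Here the type is a logic defect, the anomaly is a missing check, and the severity is major (the file and line number are made up for illustration).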

26 Reporting
IEEE classification of anomalies:
– missing
– extra (superfluous)
– ambiguous
– inconsistent
– improvement desirable
– non-standard
– risk-prone
– incorrect
– not implementable
– editorial

27 Accept Review
Accept signals a project milestone:
– the artefact has the required degree of quality
– it becomes a baseline item for configuration management (a "release"); changes must now be approved by the configuration management board
– project progress has been achieved

28 Reject Review
Conditional accept / reject signals non-completion:
– there is a follow-up phase where anomalies are corrected
– the artefact must go through a new formal inspection to be accepted

29 Review Metrics
Collect data on the review process itself:
– artefact size
– review time, number of persons
– defects found
– defects not found (those that escaped): found in later reviews, in testing, or by the customer
… and throw some metrics at it:
– defects found per hour, etc. (a worked example follows below)
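A worked example with invented figures: if three inspectors each spend two hours on a 250-line artefact and together find 12 defects, the yield is 12 / (3 × 2) = 2 defects per person-hour, at an inspection rate of 250 / 2 = 125 lines per hour.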

30 Process Issues
Be aware of the human aspects!
The author easily feels like a "sitting duck".
The purpose is to improve the software, not to make someone feel uncomfortable (or to boost your own ego)!
Thus find the right attitude:
– how do we make good software even better?

31 Walk-through
"Manual execution" of code:
– the team plays the role of the computer on which the program executes, tracing the program state by hand (a sketch follows below)
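A minimal invented example of what the team would trace on a whiteboard:

```c
/* Walk-through target: the team "executes" sum_to(3) by hand,
   tracking i and total step by step. */
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++)
        total += i;   /* trace for n = 3:
                         i=1 -> total=1
                         i=2 -> total=3
                         i=3 -> total=6 */
    return total;     /* returns 6 */
}
```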

32 Discussion
Then why spend time on testing at all???
Can't we just do software inspections instead?

33 Summary
Review:
– reading material to find defects and assess quality
– project milestone: artefact accept
Inspection:
– a formalized process with a written report as output
– roles: moderator, author, recorder, reviewer
– phases: planning, preparation, meeting, follow-up
Artefacts:
– checklists: anomalies that must be checked
– reporting: defects found, severity, class, x-reference

