
1 Technical Reviews
A Presentation by Manuel Richey for EECS 814
November 17th, 2005

2 Overview of This Presentation
- There are numerous types of technical reviews, and numerous methods for implementing reviews.
- This presentation will briefly describe several types and techniques of reviews.
- It will go into greater detail on one type and method for technical reviews.
- Finally, we will perform a technical review in class.

3 Purpose of Reviews
- Oversight
- To create buy-in or a sense of ownership
- To disseminate or transfer knowledge
- To improve the quality of a document or work product
- Other purposes?

4 Types of Reviews
- Customer Reviews
- Management Reviews
- Peer Reviews
- Personal Reviews
- Other types of reviews?

5 Customer Reviews
- Often specified as milestones in a contract (e.g., preliminary and critical design reviews).
- Formal documentation is submitted prior to the review.
- A long walkthrough/review is conducted with the customer.
- Often leads up to customer sign-off of a milestone.

6 Management Reviews
- Each company has its own format for these.
- Typically a PowerPoint presentation to management or technical leadership, followed by a Q&A session.
- The usual objective is to either approve a project or monitor project status.
- Inputs are typically project plans and schedules or status reports.
- The management team makes decisions and charts or approves a course of action to ensure progress and properly allocate resources.

7 Technical Peer Review
- A structured encounter in which a group of technical personnel analyze a work product with the following primary objectives:
  - improve the quality of the work product
  - improve the quality of the review process

8 Why Hold Technical Peer Reviews?
- Software development is a very error-prone process.
- Early detection of defects is cost effective, and peer reviews find errors early.
- Peer reviews find many of the same errors as testing, but earlier and with less effort.
- They serve to educate the participants and provide training.
- They raise a team's core competence by setting standards of excellence.

9 A Generic Review Process
- Plan Review: assess readiness of the work product, assign team members, send out the announcement and review package.
- Detect Defects: each reviewer looks for defects in the work product.
- Collect Defects: in a meeting or via email, etc.
- Correct Defects: the author corrects the work product.
- Follow Up: verify rework, document the review.

10 Methods for Technical Reviews
- Ad Hoc
- Personal
- Walkthrough
- Fagan-Style Inspection
- Asynchronous Review
- N-Fold Inspection
- Many other techniques

11 Ad Hoc Reviews
- Provide no process or instructions on how to detect defects.
- Defect detection depends on the inspector's skill and experience.
- Still valuable for:
  - enforcing standards
  - project status evaluations
  - improved communications
  - training and knowledge dissemination

12 Personal Reviews
- An integral part of the Personal Software Process (PSP) by Watts Humphrey.
- Involves only the author of a work.
- Employs checklists and metrics (if following PSP).
- For PSP, code review occurs before the first compile.
- May be performed by the author prior to a formal technical review.

13 Walkthroughs
- A meeting in which the author presents a work product in a sequential manner and clarifies the product as necessary.
- No preparation by meeting attendees.
- May be held as part of another type of review.
- For example, a walkthrough may be held for a work product prior to distributing the review packet to the reviewers for a Fagan-style software inspection.

14 Fagan-Style Software Inspections
- A method of technical review that involves a meeting-based process and specific roles.
- Process: reviewers detect defects separately, but hold a meeting to collect, classify, and discuss defects.
- Defined roles: Moderator, Author, Presenter, Recorder, etc.
- We will examine this technique in detail.

15 Asynchronous Inspections
- No meeting, so the review can be distributed in space and time.
- Doesn't involve the author.
- The process is as follows:
  - The Moderator sends out material via email.
  - Individual reviewers create lists of defects.
  - Defect lists are circulated to all inspectors and discussed via email.
  - Individual reviewers update their defect lists and send them to the Moderator.
  - The Moderator compiles the final defect list, sends it to the author, and follows up (eliminates group approval).

16 N-Fold Inspections
- Several independent teams inspect the same work product using traditional inspection methods.
- Many of the defects found by the teams overlap, but each team typically also finds unique defects.
- The Moderator collects faults from the independent teams and composes the final defect list.
- This is an expensive process, used when high reliability is desired.

17 Fagan Inspection Process
A six-step review process (see the sketch after this list):
- The Author submits work products for review.
- The Moderator assesses the product's readiness, assigns the review team, and announces the review.
- Reviewers prepare for the review.
- Reviewers hold the review meeting.
- The Author corrects defects.
- The Moderator verifies rework and closes the review.
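The six steps proceed strictly in order. As a minimal illustrative sketch (not from the presentation), they can be modeled as an ordered workflow; the phase names and the no-skipping rule below are assumptions made for illustration only.

```python
# A minimal sketch of the six Fagan phases as a strictly ordered workflow;
# the enum names and the in-order rule are illustrative assumptions.
from enum import IntEnum

class Phase(IntEnum):
    SUBMIT = 1      # Author submits the work product
    PLAN = 2        # Moderator checks readiness, assigns team, announces review
    PREPARE = 3     # Reviewers prepare individually
    MEET = 4        # Review meeting is held
    REWORK = 5      # Author corrects defects
    FOLLOW_UP = 6   # Moderator verifies rework and closes the review

def advance(current: Phase) -> Phase:
    """Move an inspection to the next phase; phases may not be skipped."""
    if current is Phase.FOLLOW_UP:
        raise ValueError("inspection is already closed")
    return Phase(current + 1)

phase = Phase.SUBMIT
while phase is not Phase.FOLLOW_UP:
    phase = advance(phase)
    print(phase.name)
```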

18 Standards and Checklists
Standards:
- Rules for requirements/design/coding that all work products must adhere to.
- Typically either project or company specific.
- Improve software maintenance and quality.
Checklists:
- A list of questions for the inspectors to answer while reading the document.
- Should be less than a page long.
- Should be derived from the most common past defects.
- Should be periodically updated.

19 Inspection Package
- The work product to be inspected (line numbered if possible).
- Supporting documentation (the requirements or work product from which the inspected work product was derived).
- Checklists and standards (made available to the reviewers).
- Inspection meeting notice (often sent by email).

20 Fagan Inspection Roles
- Producer/Author – creates the product being reviewed and answers questions.
- Moderator – prepares the review package, moderates the meeting, verifies rework.
- Presenter/Reader – presents the product during the meeting.
- Recorder/Scribe – records defects during the meeting.
- Reviewer – everyone is a reviewer, but there may be reviewers who don't have another role.

21 Reviewer's Responsibilities
- Responsible for objectively inspecting the work product.
- Responsible for tracking the amount of time spent preparing for the inspection meeting.
- Actively participates in the inspection meeting by providing defects found during examination of the work product.

22 Producer's Responsibilities
- Provides required reference material for the inspection.
- Finds defects.
- Provides clarification.
- Answers questions.
- Modifies the inspected work product to correct defects.

23 Moderator's Responsibilities
- Ensures entry criteria are met.
- Distributes the inspection package to the review team.
- Ensures that all reviewers are prepared prior to the inspection meeting.
- Facilitates the inspection meeting.
- Also participates in the review as a reviewer.
- Assures that all items logged at the meeting are dispositioned.
- Collects the data and completes the inspection record.

24 Presenter's Responsibilities
- Presents the product in a logical fashion, paraphrased at a suitable rate.
- Typical review rates are 100-200 LOC/hour (or 10-12 pages/hour for documents); a quick worked example follows below.
- Rates can vary significantly due to the following factors:
  - language
  - comments and readability
  - type of software
  - structure of software
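As a quick worked example of those rates (the module size below is hypothetical, not from the presentation), the quoted 100-200 LOC/hour range translates directly into meeting time:

```python
# Estimating meeting time for a code inspection from the 100-200 LOC/hour
# presentation rates quoted above; the 600-LOC module size is made up.

def estimate_meeting_hours(loc: int, rate_low: float = 100, rate_high: float = 200):
    """Return (best-case, worst-case) meeting hours for `loc` lines of code."""
    return loc / rate_high, loc / rate_low

loc = 600  # hypothetical module size
fast, slow = estimate_meeting_hours(loc)
print(f"{loc} LOC -> {fast:.1f} to {slow:.1f} hours of meeting time")
# 600 LOC -> 3.0 to 6.0 hours, i.e. likely two or three sessions if meetings
# are kept to roughly two hours each.
```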

25 Recorder's Responsibilities
- Completes the defect log (see the sketch of a defect record below).
- Defects should be classified, based on team consensus, by:
  - severity (Major, Minor)
  - type
  - class
- Should use techniques to minimize defect logging time.
- Not a secretary.
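A minimal sketch of what one entry in the defect log might look like, assuming the severity/type/class breakdown above; the field names and the example classification values are illustrative, not taken from any standard form.

```python
# One entry in a defect log; field names and example values are illustrative.
from dataclasses import dataclass

@dataclass
class DefectRecord:
    location: str        # e.g. file name and line number, or document section
    description: str     # what the reviewers observed
    severity: str        # "Major" or "Minor"
    defect_type: str     # e.g. "logic", "interface", "standards violation"
    defect_class: str    # e.g. "missing", "wrong", "extra"

# Example entry logged by the Recorder during the meeting:
entry = DefectRecord(
    location="radio_ctrl.c:142",
    description="return value of init_dsp() is not checked",
    severity="Major",
    defect_type="logic",
    defect_class="missing",
)
print(entry)
```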

26 Review Meeting
- Reviewers fill out and sign the inspection form, indicating the time spent reviewing the product.
- Reviewers collectively decide if they are ready for the review to be held.
- The Presenter progresses through the review product, eliciting defects as he goes.
- The Recorder records defects on the defect form.
- Reviewers collectively disposition the review as "Accept As Is", "Accept with Corrections", or "Re-review".

27 Review Comprehension Methods
- Front to Back
  - Start at the front of a document or the top of a code module and proceed to the end in sequential order.
  - Use with documents, or if already familiar with the code design.
- Bottom Up
  - Start at the lowest-level routines and work up.
  - Used when the code is new to the inspector.
- Top Down
  - Start at the main SW entry points and review those, then review the routines they call.
  - Used when the inspector is familiar with the code.
- Integrated
  - Use both top-down and bottom-up approaches as appropriate.
A sketch of how top-down and bottom-up reading orders differ follows below.
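The sketch below (not from the presentation) shows how the two orders fall out of a toy call graph; the function names are hypothetical.

```python
# Top-down vs bottom-up reading order over a toy call graph (caller -> callees).
from collections import deque

call_graph = {
    "main": ["init", "run_loop"],
    "run_loop": ["read_input", "update_state"],
    "init": [],
    "read_input": [],
    "update_state": [],
}

def top_down_order(graph, entry="main"):
    """Breadth-first from the entry point: callers are read before callees."""
    order, queue, seen = [], deque([entry]), {entry}
    while queue:
        fn = queue.popleft()
        order.append(fn)
        for callee in graph.get(fn, []):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return order

def bottom_up_order(graph, entry="main"):
    """Post-order from the entry point: callees are read before their callers."""
    order, seen = [], set()
    def visit(fn):
        if fn in seen:
            return
        seen.add(fn)
        for callee in graph.get(fn, []):
            visit(callee)
        order.append(fn)
    visit(entry)
    return order

print(top_down_order(call_graph))   # ['main', 'init', 'run_loop', 'read_input', 'update_state']
print(bottom_up_order(call_graph))  # ['init', 'read_input', 'update_state', 'run_loop', 'main']
```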

28 What Makes a Good Reviewer
- A good reviewer is thorough.
- Is prepared (most peer review postponements are due to lack of team preparation).
- Reviews the product, not the producer.
- Raises issues, doesn't resolve them.
- Doesn't give the author the benefit of the doubt.

29 What Makes a Good Moderator
- Encourages individuals to prepare and participate.
- Controls the meeting (starts on time, keeps focus on the agenda, eliminates problem solving, etc.).
- Nurtures inexperienced reviewers.
- Is sensitive to the Author.
- Feels ownership in the quality of the product.

30 How to Select Reviewers
- Participants are typically selected by the Moderator.
- Important criteria for selecting participants:
  - ability to detect defects (expertise)
  - knowledge of source documents
  - need to understand the work product (recipients of the work product)
  - motivation and other personal qualities

31 Review Metrics
- Help measure the effectiveness of reviews.
- Aid in continuous process improvement.
- Provide feedback to management.
- Typical metrics are (see the computation sketch below):
  - average preparation effort per unit of material (typically LOC, KLOC, or pages)
  - average examination effort per unit of material
  - average explanation rate per unit of material
  - average number of defects and major defects found per unit of material
  - average hours per defect and per major defect
  - percentage of re-inspections
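A minimal sketch of computing a few of these metrics from per-inspection records; the record fields and all of the numbers are made up for illustration.

```python
# Computing a few review metrics from hypothetical per-inspection records.
inspections = [
    # KLOC inspected, prep hours (all reviewers), meeting hours,
    # defects found, major defects, whether a re-inspection was required
    {"kloc": 1.2, "prep_h": 6.0, "meet_h": 4.0, "defects": 30, "major": 8, "reinspect": False},
    {"kloc": 0.8, "prep_h": 4.5, "meet_h": 3.0, "defects": 22, "major": 5, "reinspect": True},
]

total_kloc    = sum(i["kloc"] for i in inspections)
total_effort  = sum(i["prep_h"] + i["meet_h"] for i in inspections)
total_defects = sum(i["defects"] for i in inspections)
total_major   = sum(i["major"] for i in inspections)

print(f"Prep effort per KLOC:   {sum(i['prep_h'] for i in inspections) / total_kloc:.1f} h")
print(f"Hours per defect:       {total_effort / total_defects:.2f} h")
print(f"Hours per major defect: {total_effort / total_major:.2f} h")
print(f"Re-inspection rate:     {100 * sum(i['reinspect'] for i in inspections) / len(inspections):.0f}%")
```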

32 Industry Experience
- Aetna Insurance Company:
  - FTR found 82% of errors, 25% cost reduction.
- Bell-Northern Research:
  - Inspection cost: 1 hour per defect.
  - Testing cost: 2-4 hours per defect.
  - Post-release cost: 33 hours per defect.
- Hewlett-Packard:
  - Estimated inspection savings (1993): $21,454,000.
- IBM:
  - Reported 83% defect detection through inspections.
- AT&T:
  - Reported 92% defect detection through inspections.
A back-of-the-envelope calculation using the Bell-Northern figures follows below.
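The sketch below applies the Bell-Northern per-defect costs to a hypothetical defect count; the count and the split between testing and post-release are made up purely to illustrate the cost ratio.

```python
# Back-of-the-envelope comparison using the Bell-Northern Research figures above;
# the defect count and the testing/post-release split are illustrative assumptions.
defects = 100                     # hypothetical defects in a work product
inspection_cost = defects * 1     # 1 hour per defect found by inspection

# Same defects found later instead: say 90 in testing (3 h each, mid-range)
# and 10 after release (33 h each).
late_cost = 90 * 3 + 10 * 33

print(f"Found by inspection: {inspection_cost} hours")   # 100 hours
print(f"Found later:         {late_cost} hours")         # 600 hours
print(f"Ratio:               {late_cost / inspection_cost:.0f}x more effort")
```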

33 Personal Case Study: The KDR 510
- A VHF aviation radio and modem.
- A real-time, embedded, safety-critical DSP system.
- Won the Editor's Choice award from Flying Magazine.
- Formal peer reviews were the main QA activity.

34 Quality Data for the KDR 510
- KDR 510 reviews detected many errors:
  - 72% of SW requirements defects
  - 90.7% of SW design defects
  - 90.6% of SW coding defects
- Total review time was approximately 5% of total project time.
- Only 23% of total project time was spent in integration and test.
- Only one error escaped into the field.

35 The Real Reasons for Holding Reviews
- Reviews improve schedule performance.
- Reviews reduce rework.
- Rework accounts for 44% of development cost!
[Figure: two project timelines (Req, Design, Code, Test), one with reviews (R) after each phase and one without, illustrating the shorter overall schedule when reviews are held.]

36 Tools for Technical Reviews
- Various tools exist for different inspection methods:
  - ICICLE – for inspection of C and C++ programs
  - Scrutiny and InspeQ – for specific inspection processes
  - ASSIST – supports a generic inspection process
  - For a larger list, see: http://www2.ics.hawaii.edu/~johnson/FTR/
- Home-grown tools:
  - Typically built with an Access database.
  - Reviewers enter defects offline into the database (a sketch of such a defect table follows below).
  - Eliminates the Recorder and Reader roles.
  - Gives the author time to consider defects before the meeting.
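A minimal sketch of what such a home-grown defect database might hold; the slide mentions Access, but the same table is shown in SQLite here purely for illustration, and the table name, columns, and sample values are assumptions rather than details of any actual tool.

```python
# Hypothetical defect table for a home-grown review tool (illustrative only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE defect (
        id          INTEGER PRIMARY KEY,
        review_id   TEXT NOT NULL,       -- which inspection this belongs to
        reviewer    TEXT NOT NULL,
        location    TEXT NOT NULL,       -- file:line or document section
        description TEXT NOT NULL,
        severity    TEXT CHECK (severity IN ('Major', 'Minor')),
        status      TEXT DEFAULT 'open'  -- open / fixed / rejected
    )
""")

# Each reviewer enters defects offline before the meeting, so the author can
# read and consider them ahead of time.
conn.execute(
    "INSERT INTO defect (review_id, reviewer, location, description, severity) "
    "VALUES (?, ?, ?, ?, ?)",
    ("CR-042", "reviewer1", "modem_tx.c:88", "buffer length not validated", "Major"),
)
for row in conn.execute("SELECT reviewer, location, severity FROM defect"):
    print(row)
```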

37 Some References
- A great website for Formal Technical Reviews: http://www2.ics.hawaii.edu/~johnson/FTR/
- Watts S. Humphrey, A Discipline for Software Engineering, Addison-Wesley, January 1995.
- M. E. Fagan, "Design and code inspections to reduce errors in program development," IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-211.
- G. M. Schneider, J. Martin, and W. T. Tsai, "An Experimental Study of Fault Detection in User Requirements Documents," ACM Transactions on Software Engineering and Methodology, Vol. 2, No. 2, April 1992, pp. 188-204.
- A. Porter, H. Siy, C. Toman, and L. Votta, "An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development," IEEE Transactions on Software Engineering, Vol. 23, No. 6, June 1997, pp. 329-346.

38 Questions?

39 Review Workshop
- Objective: allow everyone to take a role in a Fagan-style code review.
- Combine results to create an N-Fold inspection.
- Break into teams of 4.
- Handouts: source code files, supplementary material, review forms.
- Schedule: 20 minutes to prepare for the review, 20 minutes for the review, a 10-minute break for everyone but moderators, and 5 minutes to summarize results.

40 Discussion on Review Workshop
- Results from the N-Fold inspection.
- What did you learn from this code review?
- Was it effective?
- How long would it have taken to detect some of these defects by testing?
- Other comments or conclusions?

41 Conclusion
- Reviews complement software testing.
- Reviews are cost-effective techniques for improving the quality of a developed product. They pay for themselves.
- Reviews improve the maintainability of a developed product.
- One size doesn't fit all: an organization's size, culture, and industry should be considered when deciding on the methods to use for reviews.

