Software Quality Engineering
Lecture # 5
Verification: Static Testing
- Informal reviews
- Walkthrough
- Inspection
Comparison of verification methods
True cost of review
A five-person review costs five person-days. Consider an error that takes one person-day to fix if found during the requirements phase: the later it is found, the more it costs.
[Chart: cost of review + fix at requirements vs. fix during implementation vs. fix after delivery]
Example
A 50 KLOC system takes 60 person-months (PM) to build. The IBM rule of thumb is that inspection adds 15% to the resources required, i.e., an additional 9 PM.
Example
Data show that it takes 1.58 person-hours (PH) to find a single defect in inspections. There are 1,188 PH in 9 PM, so one can expect to find about 752 defects in those 9 PM. It typically takes 9 PH to fix a defect found after delivery, so these 752 defects would require 51 PM to fix, almost as long as it took to build the software.
Example
If the same defects are found earlier by inspection, say no later than the coding stage, each takes only 1 PH to fix. Therefore, with inspections, these 752 defects require just 5.7 PM to fix.
Example
Without inspection: 60 PM + 51 PM = 111 PM. With inspection: 60 PM + 9 PM + 5.7 PM = 74.7 PM. That is a savings over the lifecycle of about 32%.
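The arithmetic of this running example can be reproduced in a few lines. Below is a minimal Python sketch; the only figure not stated on the slides is the conversion factor of 132 person-hours per person-month, which is implied by "1,188 PH in 9 PM":

```python
PH_PER_PM = 1188 / 9        # 132 person-hours per person-month (implied above)

build_pm = 60               # effort to build the 50 KLOC system
inspect_pm = 0.15 * build_pm                  # IBM rule of thumb: +15% = 9 PM
defects = inspect_pm * PH_PER_PM / 1.58       # 1.58 PH to find each defect -> ~752

fix_late_pm = defects * 9 / PH_PER_PM         # 9 PH/defect after delivery -> ~51 PM
fix_early_pm = defects * 1 / PH_PER_PM        # 1 PH/defect at coding -> ~5.7 PM

without_inspection = build_pm + fix_late_pm               # ~111 PM
with_inspection = build_pm + inspect_pm + fix_early_pm    # ~74.7 PM
savings = 1 - with_inspection / without_inspection

print(f"defects found: {defects:.0f}")        # 752
print(f"lifecycle savings: {savings:.1%}")    # ~32.9% (the slides round to 32%)
```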
Review Metrics
- Preparation effort, Ep: the effort (in person-hours) required to review a work product prior to the actual review meeting
- Assessment effort, Ea: the effort (in person-hours) expended during the actual review
- Rework effort, Er: the effort (in person-hours) dedicated to correcting the errors uncovered during the review
- Work product size, WPS: a measure of the size of the work product reviewed (e.g., the number of UML models, document pages, or lines of code)
- Minor errors found, Errminor: the number of errors found that can be categorized as minor (requiring less than some pre-specified effort to correct)
- Major errors found, Errmajor: the number of errors found that can be categorized as major (requiring more than some pre-specified effort to correct)
Review Metrics
The total review effort and the total number of errors discovered are defined as:
Ereview = Ep + Ea + Er
Errtot = Errminor + Errmajor
Defect density represents the errors found per unit of work product reviewed:
Defect density = Errtot / WPS
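These metrics translate directly into code. A small Python sketch, with names mirroring the symbols above (the example figures in the usage comment are hypothetical):

```python
def total_review_effort(e_p: float, e_a: float, e_r: float) -> float:
    """Ereview = Ep + Ea + Er, all in person-hours."""
    return e_p + e_a + e_r

def total_errors(err_minor: int, err_major: int) -> int:
    """Errtot = Errminor + Errmajor."""
    return err_minor + err_major

def defect_density(err_tot: int, wps: float) -> float:
    """Errors found per unit of work product (e.g., per page, model, or KLOC)."""
    return err_tot / wps

# Hypothetical review of a 32-page requirements model: 18 minor and 4 major
# errors found; 10 PH preparation, 14 PH meeting, 18 PH rework.
print(total_review_effort(10, 14, 18))   # 42 person-hours
print(total_errors(18, 4))               # 22 errors
print(defect_density(22, 32))            # 0.6875 errors per page
```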
Example
If past history indicates that the average defect density for a requirements model is 0.6 errors per page, and a new requirements model is 32 pages long, a rough estimate suggests that your software team will find about 19 or 20 errors during the review of the document. If you find only 6 errors, either you have done an extremely good job of developing the requirements model or your review approach was not thorough enough.
Example
The effort required to correct a minor model error (immediately after the review) was found to be 4 person-hours; for a major requirements error, 18 person-hours. Examining the review data collected, you find that minor errors occur about 6 times more frequently than major errors. Therefore, the average effort to find and correct a requirements error during review is about (6 x 4 + 1 x 18) / 7 = 6 person-hours. Requirements-related errors uncovered during testing require an average of 45 person-hours to find and correct. Using these averages:
Effort saved per error = Etesting - Ereviews = 45 - 6 = 39 person-hours/error
Since 22 errors were found during the review of the requirements model, a saving of about 858 person-hours of testing effort would be achieved. And that's just for requirements-related errors.
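A sketch of this estimate in Python, combining the 0.6 errors/page history from the previous slide with the per-error averages above:

```python
expected_errors = 0.6 * 32        # historical density x pages -> 19.2 (19-20 errors)

e_reviews = (6 * 4 + 1 * 18) / 7  # weighted average review cost -> 6 PH/error
e_testing = 45                    # average cost to find and fix during testing
saved_per_error = e_testing - e_reviews          # 39 PH/error

errors_found = 22
total_saved = errors_found * saved_per_error     # 858 person-hours

print(expected_errors, saved_per_error, total_saved)   # 19.2 39.0 858.0
```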
Conducting the Review
Review process: Entry criteria → Planning → Preparation → Review meeting → Rework → Follow-up → Exit criteria
Entry criteria
- Work product completed
- Work product independent (if it depends on other work products, they must be completed)
- Reviewers selected
- Reviewers trained
- In case of a re-review, previous review comments resolved
Planning
- Intended goal
- Review type
- Review team
- Roles and responsibilities
Preparation
- Work product distributed
- Meeting scheduled
Review meeting
- Introduction (participants and objectives of the review)
- Work presented
- Concerns and issues raised (determine the validity of review comments)
- Review log sent to all participants
Rework
- Review comments analyzed
- Effort to fix each review comment estimated
- Need for a re-review determined
- Status of comments updated
Follow-up
- Defect resolution and status update
Exit criteria
- Goal satisfied
- Defects tracked to closure
Review Checklist (Requirements)
Requirements are hard to get right! A review checklist for the SRS covers:
- Correctness
- Ambiguity
- Completeness
- Consistency
- Verifiability
- Modifiability
- Traceability
- Feasibility
Correctness
Every requirement stated in the SRS should correctly represent an expectation from the proposed software. There are no standards, guidelines, or tools that can guarantee the correctness of requirements. For example, if the expectation is that the software should respond to all button presses within 2 seconds, but the SRS states that 'the software shall respond to all button presses within 20 seconds', then that requirement is incorrectly documented.
Ambiguity
A stated requirement may be ambiguous. If a requirement conveys more than one meaning, that is a serious problem: every requirement must have a single interpretation only. One simple check is to give a portion of the SRS document (containing one or two requirements) to 10 people and ask for their interpretations; if we get more than one interpretation, the requirement(s) may be ambiguous. Hence, requirement statements should be short, explicit, precise, and clear. This is difficult to achieve because natural languages (like English) are inherently ambiguous. A checklist should therefore focus on ambiguous words and include potential ambiguity indicators.
Completeness
The SRS document should contain all significant functional and non-functional requirements. It should also specify forms (external interfaces) with validity checks, constraints, and attributes, and give full labels and references for all figures, tables, diagrams, etc. The completeness of the SRS document must be checked thoroughly against a checklist.
Consistency
The document is consistent if no stated requirement conflicts with another stated requirement within the SRS. For example, the overall description of the SRS may state that the passing percentage in 'result management software' is 50, while elsewhere it is mentioned as 40. Or one section may say that the semester mark sheet is issued to colleges, while another says it is issued directly to students. These are examples of inconsistencies and should be avoided. The checklist should be designed to find and highlight such inconsistencies.
Verifiability
The SRS document is verifiable if, and only if, every requirement stated therein is verifiable. Non-verifiable requirements use vague terms like 'good interfaces', 'excellent response time', 'usually', or 'well'; such statements should not be used.
Modifiability
The SRS document should accommodate modifications without disturbing its structure and style, so that changes can be made easily, completely, and consistently while retaining the framework. Modifiability is a very important characteristic because requirements change frequently. What is constant in life? Change; and if we handle it properly, it can have a very positive impact on the quality of the SRS document.
Traceability
The SRS document is traceable if the origin of each requirement is clear; traceability also helps future development. It can help structure the document and should find a place in the design of the checklist.
Feasibility
Some requirements may not be feasible to implement for technical reasons or due to a lack of resources. Such requirements should be identified and removed from the SRS document. A checklist can also help find non-feasible requirements.
References
- Pressman, Roger S. Software Engineering: A Practitioner's Approach, 8th ed.
- Singh, Yogesh. Software Testing.