1 Software Testing and Quality Assurance, Lecture 17: Test Analysis & Design Models (Chapter 4, A Practical Guide to Testing Object-Oriented Software)

2 Lecture Outline: evaluation criteria and organization of the guided inspection activity; preparing for the guided inspection.

3 Guided inspection: evaluation criteria. Correctness is a measure of the accuracy of the model: in analysis, it is the accuracy of the problem description; in design, it is how accurately the model represents the solution to the problem. The model is correct with respect to a set of test cases if every test case produces the expected result.
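To make the correctness criterion concrete, here is a minimal Python sketch of checking a suite of test cases against a model walkthrough. The TestCase record and the trace function are hypothetical stand-ins: in a guided inspection the trace is performed by hand on the diagrams rather than by executing code.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestCase:
    name: str
    inputs: dict
    expected: str  # the result predicted by the domain expert

def is_correct(trace: Callable[[dict], str], cases: list) -> bool:
    """The model is correct w.r.t. the suite if every case matches.
    'trace' stands in for hand-tracing a scenario through the model."""
    return all(trace(tc.inputs) == tc.expected for tc in cases)

# Toy walkthrough of a model: selecting "play" starts the game.
trace = lambda inputs: "game started" if inputs.get("menu") == "play" else "ignored"
print(is_correct(trace, [TestCase("tc1", {"menu": "play"}, "game started")]))  # True
```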

4 Guided inspection: evaluation criteria (cont.). Completeness is a measure of the inclusiveness of the model: are any necessary, or at least useful, elements missing? In an iterative, incremental process, completeness is judged relative to how mature the current increment is expected to be. For example, do all objects needed for a sequence diagram come from classes in the class diagram? The model is complete if the results of executing the test cases can be adequately represented using only the content of the model.
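The sequence-diagram check mentioned above can be sketched as a set comparison. The diagram representations are hypothetical; in practice the inspector reads the UML diagrams directly.

```python
def missing_classes(sequence_diagram_objects, class_diagram_classes):
    """Return classes referenced by sequence-diagram objects that are
    absent from the class diagram (a completeness defect if non-empty)."""
    referenced = {obj["class"] for obj in sequence_diagram_objects}
    return referenced - set(class_diagram_classes)

# Example: a Timer object appears in a scenario but Timer was never modeled.
objs = [{"name": "aGame", "class": "Game"}, {"name": "t1", "class": "Timer"}]
print(missing_classes(objs, ["Game", "Player"]))  # -> {'Timer'}
```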

5 Guided inspection: evaluation criteria (cont.). Consistency is a measure of whether there are contradictions within the model, or between the current model and the model upon which it is based. Contradictions or conflicts may be internal to a single diagram or may arise between two diagrams. The model is inconsistent if there are different representations within it for similar test cases.
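One concrete form of inconsistency is the same operation carrying different signatures in two diagrams. A small sketch, assuming each diagram has been reduced to a hypothetical name-to-signature map:

```python
def signature_conflicts(diagram_a, diagram_b):
    """Flag operations that appear in both diagrams with different
    signatures, one mechanical check for model inconsistency."""
    return {op: (diagram_a[op], diagram_b[op])
            for op in diagram_a.keys() & diagram_b.keys()
            if diagram_a[op] != diagram_b[op]}

class_model = {"move": "(direction: int) -> None"}
interaction_model = {"move": "(dx: int, dy: int) -> None"}
print(signature_conflicts(class_model, interaction_model))
```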

6 Guided inspection: evaluation criteria (cont.). Other qualities, such as performance goals, define a number of system attributes that the development team may wish to verify. The guided inspection test cases can be reused as scenarios for performance testing.

7 Organization of the guided inspection activity. Basic roles: the domain expert is the source of the expected results; the tester conducts the analysis necessary to select effective test cases; the developers are the creators of the models under test. In individual inspections, testers complete a checklist specific to the type of model being inspected; this process can be automated.

8 Preparing for the guided inspection: specifying the inspection. The scope of an inspection is defined by specifying a set of use cases, a set of packages, or a set of abstract classes and interfaces. The depth of an inspection is defined by specifying the layers in the aggregation hierarchy below which messages are not sent.
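A sketch of how the scope and depth of a planned inspection might be recorded; the InspectionSpec structure is an illustrative assumption, not notation from the book.

```python
from dataclasses import dataclass, field

@dataclass
class InspectionSpec:
    """Scope: which use cases, packages, and interfaces are under test.
    Depth: the aggregation layer below which messages are not traced."""
    use_cases: list = field(default_factory=list)
    packages: list = field(default_factory=list)
    interfaces: list = field(default_factory=list)
    max_depth: int = 1  # 1 = inspect only the top aggregation layer

spec = InspectionSpec(use_cases=["Play game"], packages=["ui"], max_depth=2)
```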

9 Preparing for the guided inspection: realistic models. A layered approach produces more individual diagrams, but each diagram is sufficiently modular to fit within the scope of a specific inspection.

10 Preparing for the guided inspection: realistic models, layered approach.

11 Preparing for the guided inspection: realistic models, layered approach (cont.)

12 Preparing for the guided inspection: selecting test cases for the inspection. Test cases can be selected to ensure that specific types of coverage are achieved or to find specific types of defects.

13 Preparing for the guided inspection: selecting test cases for the inspection. Test case selection methods: orthogonal defect classification, which is most likely to identify defects by covering the different categories of system actions that trigger them; use profiles, which build confidence in the reliability of the product by identifying which parts of the program are used the most; and risk analysis.

14 Preparing for the guided inspection: selecting test cases for the inspection (cont.). Orthogonal defect classification (ODC): the activities that cause a defect to be detected are classified as triggers. The guided inspection technique uses several of these triggers as a guide for selecting test cases.

15 Preparing for the guided inspection: selecting test cases for the inspection (cont.). By structuring the guided inspection process so that as many of these triggers as possible are encountered, you ensure that the tests guiding the inspection are more likely to expose failures.
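A sketch of tracking trigger coverage during selection. The trigger names are a plausible subset drawn from the ODC literature, and the test case representation is a hypothetical assumption.

```python
# Assumed subset of ODC review/inspection triggers for illustration.
TRIGGERS = {"design conformance", "concurrency", "backward compatibility",
            "lateral compatibility", "rare situation"}

def uncovered_triggers(test_cases):
    """Triggers not yet exercised by any selected test case; each case
    is a hypothetical dict carrying the set of triggers it targets."""
    covered = set().union(*(tc["triggers"] for tc in test_cases))
    return TRIGGERS - covered

suite = [{"name": "tc1", "triggers": {"design conformance"}},
         {"name": "tc2", "triggers": {"rare situation", "concurrency"}}]
print(uncovered_triggers(suite))  # categories still needing a test case
```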

16 Preparing for the guided inspection: selecting test cases for the inspection (cont.). Use profiles: a use profile for a system is an ordering of the individual test cases based on a combination of the frequency and criticality values of the corresponding use cases.
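A sketch of building a use profile by ordering test cases on a combined score. The linear weighting is an assumption; the slide leaves the exact combination of frequency and criticality open.

```python
def use_profile(test_cases, weight=0.5):
    """Order test cases by a combined frequency/criticality score
    (ratings pre-mapped to numbers, e.g. low=1, medium=2, high=3)."""
    def score(tc):
        return weight * tc["frequency"] + (1 - weight) * tc["criticality"]
    return sorted(test_cases, key=score, reverse=True)

cases = [{"name": "play", "frequency": 1, "criticality": 3},
         {"name": "pause", "frequency": 2, "criticality": 1}]
print([tc["name"] for tc in use_profile(cases)])  # -> ['play', 'pause']
```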

17 Preparing for the guided inspection: selecting test cases for the inspection (cont.). Risk as a test case selector: risk can be used as the basis for test case selection, and it is useful during development. It is not appropriate after development, when we are trying to achieve some measure of the software's reliability; at that point the use profile technique is better, because it reflects the way the software will actually be used.
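Risk-based selection can be sketched as a filter over the same kind of ratings; the numeric mapping and threshold are assumptions for illustration.

```python
def select_by_risk(test_cases, threshold=2):
    """Keep test cases whose use case meets a risk threshold
    (ratings assumed mapped low=1, medium=2, high=3)."""
    return [tc for tc in test_cases if tc["risk"] >= threshold]

cases = [{"name": "play", "risk": 2}, {"name": "quit", "risk": 1}]
print([tc["name"] for tc in select_by_risk(cases)])  # -> ['play']
```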

18 Preparing for the guided inspection: creating test cases. The use case scenario: the path taken. Alternative paths: scenarios that differ from the use scenario but still represent valid executions. Exceptional paths: scenarios that result in error conditions.

19 Preparing for the guided inspection: an example of a use case. Use case #1. Actor: Player. Use scenario: the user selects the play option from the menu; the system responds by starting the game. Alternate paths: if a match is already in progress, the selection is ignored. Exceptional cases: if the match cannot open the display, an error message is displayed and the game aborts. Frequency: low. Criticality: high. Risk: medium.
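The same use case, captured as a small record so that its frequency, criticality, and risk ratings could feed the selection sketches above; the UseCase structure is hypothetical, not notation from the book.

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    number: int
    actor: str
    scenario: str
    alternate_paths: list = field(default_factory=list)
    exceptional_cases: list = field(default_factory=list)
    frequency: str = "low"
    criticality: str = "low"
    risk: str = "low"

play = UseCase(
    number=1, actor="Player",
    scenario="The user selects the play option; the system starts the game.",
    alternate_paths=["If a match is already in progress, ignore the selection."],
    exceptional_cases=["If the display cannot be opened, show an error and abort."],
    frequency="low", criticality="high", risk="medium")
```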

20 Testing specific types of models. The level of detail in the model becomes greater as development proceeds, and the amount of information also increases. The exact interpretation of the evaluation criteria can be made more specific for each particular model, and the membership of the inspection team changes for different models.

21 Key points. Evaluation criteria: correctness, completeness, consistency, and other qualities. Test case selection methods: orthogonal defect classification, use profiles, and risk analysis.

