
1 Software Testing and Quality Assurance: Inspection Reading Assignment: –John McGregor and David A. Sykes, A Practical Guide to Testing Object-Oriented Software, Addison-Wesley, 2001, ISBN: 0-201-32564-0. Chapter 4: Testing Analysis and Design Models

2 Objectives To learn how to inspect the semantics of UML models. To be able to set up an inspection session. To learn a technique for testing a model for extensibility.

3 Topics covered Introduction Guided inspection technique Testing specific types of models –Requirements model –Analysis models –Design models Testing models for additional qualities

4 Introduction Developers model the software they are constructing because: –It assists in understanding the problem they are solving –It helps manage the complexity of the system being developed. High-quality models make a valuable contribution to the project. Guided inspection: an enhanced inspection technique for verifying the models as they are created and for validating the completed models against project requirements.

5 Introduction (cont...) Ordinary reviews focus on what is in the model rather than what should be in the model. Guided inspection applies the testing perspective early in the development process. Guided inspection requires valuable resources, and the time and attention of project personnel.

6 Introduction: the V Model [Figure: the V model — System Models, Architecture Models, and Component Models on the specification side correspond to System Implementation, Subsystem Integration, and Component Implementation on the implementation side. Guided inspection applies to the models; code-based testing techniques apply to the implementations.]

7 Place of Guided Inspection in the Development Process The last activity in each phase in the development process should be a verification of the required qualities of the developed product. Differences between succeeding models: –The scope of the content –The level of abstraction

8 Guided Inspection in the Development Process: Models and Phases
Phase/model | Content | Transformed from
Domain analysis | Domain concepts, standard algorithms | The minds of the domain experts
Application analysis | Concepts needed to explain the specific problem; standard algorithms | The domain analysis model and the requirements model
Architectural design | Basic structure of interfaces and their interactions | Standard architectural patterns and the creativity of designers
Detailed design | Each interface in the architecture is implemented by one or more components | The architectural design model and standard design patterns and algorithms

9 The Basics of Guided Inspection The guided inspection technique provides a means of objectively and systematically searching a work product for faults by using explicit test cases. The basic steps in guided inspection –Define the test space –Select values from the test space using a specific strategy –Apply the test values to the product being tested –Evaluate the results and the percentage of the model covered by the test

10 The Basics of Guided Inspection (cont...) The previous steps are specialized to the following steps: –Specify the scope and depth of the inspection –Identify the basis models from which the model under test (MUT) was created –Develop test cases for each of the evaluation criteria to be applied using the contents of the basis model as input. –Establish criteria for measuring test coverage. –Perform the static analysis using the appropriate checklist. –“Execute” the test cases. –Evaluate the effectiveness of tests using the coverage measurement. –If the coverage is insufficient, expand the test suite and apply the additional tests, otherwise, terminate the testing.

11 The Basics of Guided Inspection: Example We are in the initial stages of developing Brickles. The following work products have been produced: –Design-level class diagrams [Ref. Figure 2.18] –State diagrams [Ref. Figure 2.19] –Sequence diagrams [Ref. Figure 2.20] The goal is to test the design model before coding starts.

12 The Basics of Guided Inspection: Example (cont...) Assign the inspection team; the team includes: –Two developers –A system tester from the company –The company's process person (moderator) The tester will develop a set of test cases from the use case diagram. The developers will show how the classes in the design model handle each test case. The moderator will define the inspection boundaries, schedule the guided inspection sessions, and so on. The following slides walk through the steps of the guided inspection.

13 The Basics of Guided Inspection : Example (cont...) Step 1: Specify the scope and depth of the inspection –The scope is defined by a set of use cases. We will cover all use cases –The depth is defined by identifying levels of containment in the composition hierarchies. Test only those objects that represent the state of the match (BricklesDoc object)

14 The Basics of Guided Inspection : Example (cont...) Step 2: Identify the basis models from which the model under test (MUT) was created –Design Model Design-level class diagrams [Ref. Figure 2.18] State diagrams [Ref. Figure 2.19] Sequence diagrams [Ref. Figure 2.20]

15 The Basics of Guided Inspection: Example (cont...) Step 3: Develop test cases for each of the evaluation criteria to be applied using the contents of the basis model as input. –The tester will write all test cases based on the use case diagrams, e.g., Figures 2.11 and 2.12 (see next slide) –Each test case should be in the following format The use case: A player stops a Brickles match by selecting QUIT from the File Menu Preconditions: The player has started Brickles, has moved the paddle, and has broken some bricks Test input: The player selects Quit Expected output: All game action freezes and the game window disappears
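The test-case format above maps naturally onto a small data structure that the tester can fill in for each scenario. The following Java fragment is a minimal sketch of that format, assuming nothing beyond what the slide states; the record name, field names, and factory method are hypothetical, not taken from the book.

```java
// Minimal sketch of the guided-inspection test-case format described above.
// The type and member names are assumptions for illustration only.
public record GuidedInspectionTestCase(
        String useCase,        // the use case being exercised
        String preconditions,  // the state the system must be in before the test
        String testInput,      // the stimulus applied during the inspection
        String expectedOutput  // the result the domain expert expects
) {
    // The QUIT scenario from the slide, expressed in this format.
    public static GuidedInspectionTestCase quitExample() {
        return new GuidedInspectionTestCase(
                "A player stops a Brickles match by selecting QUIT from the File Menu",
                "The player has started Brickles, moved the paddle, and broken some bricks",
                "The player selects Quit",
                "All game action freezes and the game window disappears");
    }
}
```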

16 The Basics of Guided Inspection: Example― Use Cases for Brickles (cont...) Use cases for Brickles

17 The Basics of Guided Inspection : Example (cont...) Step 4: Establish criteria for measuring test coverage. –All use cases should be tested. Coverage will be 100 % for use cases and 60 % for composition hierarchies.
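As a rough illustration of how the use-case coverage figure in Step 4 could be computed, the helper below reports the percentage of use cases exercised by at least one test case. The class and method names are assumptions, not part of the deck or the book.

```java
import java.util.Set;

// Hypothetical helper for the coverage criterion of Step 4: the percentage of
// use cases that are exercised by at least one guided-inspection test case.
public final class UseCaseCoverage {
    private UseCaseCoverage() {}

    /** Returns coverage as a percentage, e.g. 100.0 when every use case is tested. */
    public static double percentCovered(Set<String> allUseCases, Set<String> testedUseCases) {
        if (allUseCases.isEmpty()) {
            return 0.0;
        }
        long covered = allUseCases.stream().filter(testedUseCases::contains).count();
        return 100.0 * covered / allUseCases.size();
    }
}
```

For example, percentCovered(Set.of("Start Brickles", "Stop Brickles"), Set.of("Start Brickles")) would return 50.0.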

18 The Basics of Guided Inspection : Example (cont...) Step 5: Perform the static analysis using the appropriate checklist. –Developers complete the design model checklist [Ref. Figure 4.3] –Developers compare class diagram from analysis model [Ref. Figure 2.13] with the class diagram in the design model.

19 The Basics of Guided Inspection : Example — Design Model Checklist (cont...)

20 The Basics of Guided Inspection : Example (cont...) Step 6: “Execute” the test cases. –Moderator schedules a meeting and makes the relevant material available to all involved –During the meeting, the developers show how test cases will be handled by the classes in the design model

21 The Basics of Guided Inspection: Example (cont...) Step 7: Evaluate the effectiveness of tests using the coverage measurement. –The moderator will note down the problems found during the symbolic execution of test cases. –The bugs found will be analyzed by the team, and a proper description of each bug will be developed.

22 The Basics of Guided Inspection : Example (cont...) Step 8: If the coverage is insufficient, expand the test suite and apply the additional tests, otherwise, terminate the testing. –The moderator will prepare and submit a report to the project manager that contains: The problems found during the execution of the test cases. Where execution terminated. The sequence diagram used to record the test execution.

23 Guided Inspection Issues Should test cases be available to the developers prior to the inspection? –No, developers should not have the scenarios prior to the inspection. Should testers only use test cases for the current increment in an inspection? –No, running a test scenario from a previous increment as a regression check on the model is a useful idea. Coverage in models –The higher the level of abstraction, the higher the level of coverage.

24 Guided Inspection: Evaluation Criteria Correctness: is a measure of the accuracy of the model: –In analysis: it is the accuracy of the problem description. –In design: it is how accurately the model represents the solution of the problem. –The model is correct with respect to a set of test cases if every test case produces the expected result. Completeness: is a measure of the inclusiveness of the model (are any necessary, or at least useful, elements missing from the model?) –In an iterative, incremental process, completeness is considered relative to how mature the current increment is expected to be. –Do all objects needed for the sequence diagram come from classes in the class diagram? –The model is complete if the result of executing the test cases can be adequately represented using only the content of the model.

25 Guided Inspection: Evaluation Criteria (cont...) Consistency: is a measure of whether there are contradictions within the model or between the current model and the model upon which it is based. –Consistency can determine whether there are contradictions or conflicts present either internal to a single diagram or between two diagrams. –The model is inconsistent if there are different representations within the model for similar test cases. Other qualities, such as performance goals, define a number of system attributes that the development team might wish to verify. –The guided inspection test cases can be used as scenarios for testing performance.

26 Organization of the Guided Inspection Activity Basic roles: –Domain expert: the source of the expected results. –Tester: conducts the analysis necessary to select effective test cases. –Developer: the creator of the models under test. Individual inspections: testers complete a checklist specific to the type of model being inspected. –This process can be automated.

27 Preparing for the Guided Inspection: Specifying the Inspection Scope of an inspection is defined by specifying a set of use cases, a set of packages, or abstract classes/interfaces. Depth of an inspection is defined by specifying layers in aggregation hierarchies under which messages are not sent.

28 Preparing for the Guided Inspection: Realistic Models Layered approach: more individual diagrams, but each diagram is sufficiently modular to fit within the scope of a specific inspection.

29 Preparing for the Guided Inspection: Realistic Models — Layered Approach

30 Preparing for the Guided Inspection: Realistic Models — Layered Approach (cont)

31 Preparing for the Guided Inspection: Realistic Models — Layered Approach (cont) Layering of sequence diagrams –The diagram terminates at an interface or abstract class –A sequence diagram is then constructed for each class that implements the interface or specializes the abstract class

32 Preparing For The Guided Inspection: Realistic Models— Layered Approach (cont)

33 Preparing for the Guided Inspection: Selecting Test Cases for the Inspection Test cases can be selected to ensure that specific types of coverage are achieved or to find specific types of defects. Test case selection methods: –Orthogonal defect classification: most likely to identify defects by covering the different categories of system actions that trigger defects. –Use profiles: give confidence in the reliability of the product by identifying which parts of the program are used the most. –Risk as a test case selector

34 Preparing for the Guided Inspection: Selecting Test Cases for the Inspection (cont) Orthogonal defect classification (ODC) –The activities that caused a defect to be detected are classified as triggers. –The guided inspection technique uses several of these triggers as a guide to select test cases. –By structuring the guided inspection process so that as many of these triggers as possible are encountered, you ensure that the tests that guide the inspection are more likely to trigger as many failures as possible.

35 Preparing for the Guided Inspection: Selecting Test Cases for the Inspection (cont) Some review and inspection triggers –Design conformance is addressed by comparing the basis model to the MUT –Concurrency is a trigger that will be visible in the design model; scenarios can be generated that explicitly explore thread interactions –Lateral compatibility is activated by the trace of scenarios between objects on sequence diagrams. –Other triggers include rare situations (examining unexpected behavior of the system) and backward compatibility (comparison to previous products).

36 Preparing for the Guided Inspection: Selecting Test Cases for the Inspection (cont) Use profiles: –A use profile for a system is an ordering of the individual test cases based on a combination of the frequency and criticality values for the individual use cases. –The traditional operational profile used for procedural systems is based strictly on frequency-of-use information. –Examples: logo, attaching a local db server

37 Preparing for the Guided Inspection: Selecting Test Cases for the Inspection (cont) Risk as a test case selector: –Risk can be used as the basis for test case selection –It is useful during development –It is not appropriate after development, when we are trying to achieve some measure of the reliability of the software; at that point the use-profile technique is better because it reflects the way the software will be used. –For guided inspection, the frequency/criticality information is used instead of the risk information

38 Preparing for the Guided Inspection: Creating Test Cases Use scenario: the typical path taken Alternative paths: scenarios that differ from the use scenario but still represent valid executions Exceptional paths: scenarios that result in error conditions

39 Preparing for the Guided Inspection: an Example of a Use Case Use case # 1 Actor: Player Use Scenario: The user selects the play option from the menu. The system responds by starting the game. Alternate Paths: If a match is already in progress, the selection is ignored. Exceptional Cases: If the match cannot open the display, an error message is displayed and the game aborts. Frequency: Low Criticality: High Risk: Medium

40 Testing Specific Types of Models The level of detail in the model becomes greater as development proceeds. The amount of information also increases as development proceeds. The exact interpretation of the evaluation criteria can be made more specific for a specific model. The membership of the inspection team changes for different models.

41 Testing Specific Types of Models: Requirements Model Acceptance testing often finds faults that result from problems with the requirements (missing requirements, contradictions in requirements). There is no UML model on which the requirements are based, so comparison to a basis model refers to documents produced by marketing, systems engineering, or the client organization.

42 Testing Specific Types of Models: Requirements Model (Cont...) Outline of testing the requirements models: –Develop the ranking of use cases by computing combined frequency and criticality information for a use case. –Determine the total number of test cases that can be constructed given the amount of resources available. –Ration the tests based on the ranking –Write scenarios based only on the knowledge of those in the domain expert’s role –Check for completeness, consistency, and correctness

43 Testing Specific Types of Models: Requirements Model (Cont...)
Use case | Frequency | Criticality | Combined value | Rank | Number of test cases
Start Brickles | Medium | High | High | 1 | 3
Pause Brickles | Low | Low | Low | 3 | 1
Stop Brickles | Medium | Low | Medium | 2 | 2
Break brick | High | High | High | 1 | 3
Wins | Medium | High | High | 1 | 3
Loses | Medium | Low | Medium | 2 | 2
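One plausible way to reproduce the ranking in the table is sketched below: each use case carries a frequency and a criticality level, the combined value is taken as the stronger of the two (an assumption that happens to match the table, not a rule stated in the book), and use cases are ordered by that combined value so that more test cases can be rationed to the top ranks.

```java
import java.util.Comparator;
import java.util.List;

// Sketch of ranking use cases by combined frequency/criticality.
// The max-based combination rule is an assumption that matches the table above.
public class UseCaseRanking {
    enum Level { LOW, MEDIUM, HIGH }

    record RankedUseCase(String name, Level frequency, Level criticality) {
        Level combined() {
            // take the stronger of the two values
            return frequency.ordinal() >= criticality.ordinal() ? frequency : criticality;
        }
    }

    public static void main(String[] args) {
        List<RankedUseCase> useCases = List.of(
                new RankedUseCase("Start Brickles", Level.MEDIUM, Level.HIGH),
                new RankedUseCase("Pause Brickles", Level.LOW, Level.LOW),
                new RankedUseCase("Stop Brickles", Level.MEDIUM, Level.LOW));

        // highest combined value first; more test cases go to the top-ranked use cases
        useCases.stream()
                .sorted(Comparator.comparing(RankedUseCase::combined).reversed())
                .forEach(u -> System.out.println(u.name() + " -> " + u.combined()));
    }
}
```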

44 Testing Specific Types of Models: Requirements Model (Cont...) Criteria for requirements inspection:
Criteria | Interpretation for the requirements model
Completeness | The use cases represent all of the functionality needed for a satisfactory product. No use case is needed that is not required functionality.
Correctness | Each use case accurately represents a requirement.
Consistency | Any system functionality is specified in the same manner everywhere it is described.

45 Testing Specific Types of Models: Analysis Models Domain analysis model Application analysis model

46 Testing Specific Types of Models: Analysis Models — Domain Analysis Model A domain model is a representation of the knowledge in a domain as seen through the eyes of a set of domain experts. Two groups are formed: –Group one: the developers group, which creates the domain model –Group two: the testers and domain experts group, which serves as the tester of that model (this group creates the test cases for the model).

47 Testing Specific Types of Models: Analysis Models — Domain Analysis Model (Cont...) Criteria for domain analysis model inspection:
Criteria | Interpretation for domain modeling
Completeness | The concepts are sufficient to cover the scope of the content specified. Sufficient detail is given to describe concepts to the required depth.
Correctness | The descriptions of domain concepts are accurate; the algorithms will produce the expected results.
Consistency | Model elements are consistent with the company's definitions and meanings.

48 Testing Specific Types of Models: Analysis Models — Application Analysis Model Multiple domain models contribute to the single application analysis model. Criteria for application analysis model inspection:
Criteria | Interpretation for application modeling
Completeness | The ideas expressed in each use case can be represented by the concepts and algorithms in the model. No design information is included in the model.
Correctness | Experts agree with the attributes and behaviors assigned to each concept, with the steps in each algorithm, and with the major states for each conceptual entity.
Consistency | Where there are multiple ways to represent a concept or action, those ways are equivalent.

49 Testing Specific Types of Models: Design Models In an OO project there are three levels of design: –Architectural –Mechanistic –Detailed We will focus on two basic design models that encompass those three levels: –Architectural design model: provides the basic structure of the application by defining how a set of interfaces are related. –Detailed class design model: provides the precise semantics of each class and identifies the architectural interface to which the class corresponds.

50 Testing Specific Types of Models: Design Models — Architectural Model The architectural model is the skeleton of the entire application. Non-functional requirements are blended with functional requirements. A software architectural model is the basic structure that defines the system in terms of computational components and interactions among those components.

51 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Representations of the architecture: –An architecture is represented by relationships, states, and algorithms. –Tool support: Rational Rose performs a variety of consistency checks on the static-relationship model. These checks will prevent certain types of connection from being established; however, they are not sufficient for determining whether the functionality is correctly implemented. Architecture description languages provide the capability to represent a system at a high level of abstraction (e.g. Rapide).

52 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Testing the architecture: –The Software Architecture Testing (SAT) technique is a special type of guided inspection that requires the following steps: Test cases are constructed. The tests are conducted on the product. The results of the test are evaluated for correctness.

53 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Constructing test cases: –Test cases are constructed from the use cases. –The test cases for the architecture are defined at a higher level than those for more detailed design models.

54 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Criteria for the architectural design model inspection:
Criteria | Interpretation for the architectural design model
Completeness | A sufficient set of interfaces is defined to provide all of the services needed for the application's functionality. The relationships between the interfaces allow for the flow of control and data necessary to realize all of the uses described in the use case diagram.
Correctness | The architecture satisfies its constraints, uses the appropriate architectural patterns, and represents the interactions between interfaces.
Consistency | Each use of the system is handled by only one set of interfaces.

55

56 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Test execution: –Test cases are executed by constructing a message-sequence diagram. –The diagram reflects the preconditions for a test case.
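When no modeling tool is at hand, one lightweight way to capture such a symbolic execution is to record the trace as an ordered list of messages and review or draw the sequence diagram from it afterwards. The sketch below does just that; BricklesDoc appears earlier in the deck, while BricklesView and Timer are assumed class names used only for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: recording the symbolic "execution" of a test case as an ordered
// message trace that stands in for the sequence diagram described above.
public class MessageTrace {
    record Message(String sender, String receiver, String message) {}

    private final List<Message> trace = new ArrayList<>();

    public void send(String sender, String receiver, String message) {
        trace.add(new Message(sender, receiver, message));
    }

    public void print() {
        trace.forEach(m ->
                System.out.println(m.sender() + " -> " + m.receiver() + " : " + m.message()));
    }

    public static void main(String[] args) {
        MessageTrace t = new MessageTrace();
        t.send("Player", "BricklesView", "selectQuit()");      // test input
        t.send("BricklesView", "BricklesDoc", "stopMatch()");   // assumed collaboration
        t.send("BricklesDoc", "Timer", "stop()");               // assumed collaboration
        t.print();
    }
}
```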

57

58 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Verification of results: –When the output from the test is in the form of diagrams, the resulting diagrams must be verified after each test execution by domain experts. –When the output is the result of an execution, the test result can be verified by having those domain experts construct event sequences that would be produced by an architecture that performs correctly.

59 Testing Specific Types of Models: Design Models — Architectural Model (Cont...) Evaluating performance and scalability: –An architecture should be evaluated beyond correctness, completeness, and consistency, e.g., for performance and scalability. –The test cases are symbolically executed, and the message-sequence diagrams can be analyzed from a performance perspective. –Sequence diagrams can also be used to evaluate scalability.

60 Testing Specific Types of Models: Design Models — Detailed Class Design Model The detailed class design model populates the architectural model with classes that will implement the interfaces defined in the architecture. This model typically includes: –A set of class diagrams –The Object Constraint Language (OCL) pre- and postconditions for every method of every class –Activity diagrams of significant algorithms –A state diagram for each class The test cases at this level are very much like final system test cases.
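To make the OCL pre- and postconditions concrete, here is a small sketch of how one such contract from a detailed class design might look when carried into code. The Paddle class, its attributes, and the OCL text are assumptions used only for illustration; they are not taken from the book's model.

```java
// Sketch: an OCL-style contract from the detailed class design expressed as
// Java assertions. The class, attributes, and constraint text are assumptions.
public class Paddle {
    private int position;       // current horizontal position
    private final int maxRight; // right edge of the playing field

    public Paddle(int maxRight) {
        this.maxRight = maxRight;
    }

    /**
     * Assumed OCL: context Paddle::moveRight(distance : Integer)
     *   pre:  distance > 0 and position + distance <= maxRight
     *   post: position = position@pre + distance
     */
    public void moveRight(int distance) {
        assert distance > 0 && position + distance <= maxRight : "precondition violated";
        int before = position;
        position += distance;
        assert position == before + distance : "postcondition violated";
    }
}
```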

61 Testing Specific Types of Models: Design Models — Detailed Class Design Model (cont) Criteria for the class design model inspection:
Criteria | Interpretation for the detailed class design model
Completeness | Classes are defined for each interface in the architecture. The preconditions for each method specify sufficient information so that the user can safely use the method. The postconditions for a method show error conditions as well as the normally expected result.
Correctness | Each class accurately implements the semantics of an interface. For those classes that correspond to interfaces in the architecture, the class's specification must correspond to the interface specified by the architecture.
Consistency | The behavior in the interface of each class provides either a single way to accomplish a task or, if there are multiple ways, they provide the same behavior but with different preconditions.

62 Testing Specific Types of Models: Design Models — Detailed Class Design Model (cont) As the test progresses: –The executors select methods that will be invoked –The state model of the receiver is checked to be certain the target object can receive the message –The messages are then added to the sequence diagram, and the state models are updated to reflect changes in state. –Sequence diagrams will have dead-end objects beyond which testers will not attempt to examine the logic. –Implementation and code-based testing start after this point.

63 Testing Models for Additional Qualities Projects are chartered to achieve more aggressive objectives such as: –The development of extensible designs –The design of reusable frameworks –Highly portable systems A change case is a use case that is not a requirement of the system, but it is an anticipated change to the system.

64 Testing Models for Additional Qualities (cont...) Explicitly state the objective that the change will address. –The design will be easily extensible to accommodate new games. Construct a “change case” including a specific scenario that illustrates the objective. –The framework is to be used to implement pinball games that are user configurable. The obstacles to be available include posts, flippers, and bumpers.

65 Testing Models for Additional Qualities (cont...) Create test cases by sampling from the range permitted by the change case. –A pinball game is to be created. States may be added to the Brickles state machine. Enumerate the work needed to achieve the objective by specifying the differences in state and behavior required for the new objectives. –The StationarySprite class will be subclassed to provide the new obstacle.

66 Testing Models for Additional Qualities (cont...) Evaluate the current design relative to the design required to achieve the objective. –The necessary base classes and methods are present. The needed attributes can be added without conflict with existing attributes. Repeat with additional test scenarios until all proposed changes are examined. The output of this process is a set of potential changes needed to achieve the desired system quality, such as extensibility.
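The change case above can be "executed" against the design by sketching the subclass it would require. The fragment below illustrates the kind of extension the evaluation anticipates; StationarySprite is named on the previous slide, while the Post class and its behavior are assumptions used only to illustrate the extensibility check.

```java
// Sketch: evaluating the pinball change case by expressing the new obstacle
// as a subclass of the existing StationarySprite. Post and its behavior are
// hypothetical; only StationarySprite is named in the deck.
public class ExtensibilitySketch {

    // Assumed shape of the class already present in the Brickles design.
    abstract static class StationarySprite {
        abstract void collideWith(String movingSprite);
    }

    // The anticipated change: a pinball post that deflects the puck
    // instead of being broken like a brick.
    static class Post extends StationarySprite {
        @Override
        void collideWith(String movingSprite) {
            System.out.println("Post deflects " + movingSprite + " and remains in play");
        }
    }

    public static void main(String[] args) {
        StationarySprite obstacle = new Post();
        obstacle.collideWith("puck");
    }
}
```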

67 Guided Inspection Process Checklist Decide how completeness, consistency, and correctness will be judged for the particular model under test (MUT) Determine which scenarios to sample from the use case model to use as test cases Create test cases by supplementing the scenarios with specific data Select the model/notation that will record the results of each execution Conduct the tests Evaluate the results of the test executions to determine which tests the model passed and which it failed Record the results for use in guiding the repair and testing processes in the next iteration.

68 Key Points The guided inspection technique provides a means of objectively and systematically searching a work product for faults by using explicit test cases. The basic steps in guided inspection –Define the test space –Select values from the test space using a specific strategy –Apply the test values to the product being tested –Evaluate the results and the percentage of the model covered by the test Evaluation criteria: –Correctness –Completeness –Consistency –Other qualities

69 Key Points (cont...) Test case selection methods: –Orthogonal defect classification –Use profiles –Risk Testing the requirements models: –Develop the ranking of use cases by computing combined frequency and criticality information for a use case. –Determine the total number of test cases that can be constructed given the amount of resources available. –Ration the tests based on the ranking –Write scenarios based only on the knowledge of those in the domain expert’s role –Check for completeness, consistency, and correctness

70 Key Points (cont...) A domain model is a representation of the knowledge in a domain as seen through the eyes of a set of domain experts. Multiple domain models contribute to the single application analysis model. Design models –Architectural design model –Detailed class design model Projects are chartered to achieve more aggressive objectives such as: –The development of extensible designs –The design of reusable frameworks –Highly portable systems

