Lecture 9 Software Testing Techniques. OOD Case Study.


Lecture 9 Software Testing Techniques

OOD Case Study

Software Testing
Testing is the process of exercising a program with the specific intent of finding (and removing) errors prior to delivery to the end user.

What Testing Shows
errors
requirements conformance
performance
an indication of quality

Who Tests the Software?
The developer understands the system, but will test "gently" and is driven by "delivery."
The independent tester must learn about the system, but will attempt to break it and is driven by quality.

Testing Principles
All tests should be traceable to customer requirements.
Tests should be planned long before testing begins.
The Pareto (80/20) principle applies to software testing.
Testing should begin "in the small" and progress toward testing "in the large."
Exhaustive testing is not possible.
Testing should be conducted by an independent third party.

Testability
Operability—it operates cleanly.
Observability—the results of each test case are readily observed.
Controllability—the degree to which testing can be automated and optimized.
Decomposability—testing can be targeted.
Simplicity—reduce complex architecture and logic to simplify tests.
Stability—few changes are requested during testing.
Understandability—of the design.

What Is a Good Test?
Has a high probability of finding an error: develop a classification of perceivable errors and design a test for each case.
Not redundant: no two tests intend to uncover the same error.
Representative: has the highest likelihood of uncovering an error.
Has the right level of complexity: neither too simple nor too complex.

Exhaustive Testing
(Figure: a flow graph containing a loop that may execute up to 20 times.)
There are 10^14 possible paths! If we execute one test per millisecond, it would take 3,170 years to test this program!!

Selective Testing
(Figure: the same loop flow graph, with one selected path highlighted.)
Exercise selected paths only.

Software Testing Methods
White-box methods: test internal operations, i.e., procedural details.
Black-box methods: test functions (at the software interface).

Test Case Design
"Bugs lurk in corners and congregate at boundaries..." (Boris Beizer)
OBJECTIVE: to uncover errors
CRITERIA: in a complete manner
CONSTRAINT: with a minimum of effort and time

White-Box Testing
...our goal is to ensure that all statements and conditions have been executed at least once...
Exercises: paths, decisions, loops, internal data.

Why Cover?
Logic errors and incorrect assumptions are inversely proportional to a path's execution probability.
We often believe that a path is not likely to be executed; in fact, reality is often counterintuitive.
Typographical errors are random; it's likely that untested paths will contain some.

Flow Graph
(Figure: a flowchart and its corresponding flow graph; flowchart nodes are merged into flow-graph nodes such as 2,3 and 4,5, and the graph encloses regions R1-R4.)

Flow Graph (cont.)
Structured constructs (sequence, if, while, until, case) each map to a flow-graph fragment.
A compound condition adds one predicate node per condition:
IF a OR b
  THEN procedure x
  ELSE procedure y
ENDIF
(Figure: the graph tests a, then b; either true branch reaches x, and the false branch reaches y.)

Basis Path Testing
(Figure: the flow graph with regions R1-R4 and its basis path set of four independent paths; path 1 is 1-11, and each remaining path adds one new edge through a decision node.)

Cyclomatic Complexity
A number of industry studies have indicated that the higher V(G), the higher the probability of errors.
(Figure: a plot of modules against V(G); modules in the high-V(G) range are more error prone.)
V(G) = number of regions = edges - nodes + 2 = predicate nodes + 1
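The equivalent formulas can be checked on a small graph; a minimal sketch in Python (the four-node graph below is an invented example, not the slide's figure):

```python
def cyclomatic_complexity(nodes, edges):
    """V(G) = E - N + 2 for a connected flow graph."""
    return len(edges) - len(nodes) + 2

# Invented example: a single IF adds one predicate node (node 2).
nodes = [1, 2, 3, 4]                      # node 2 is the predicate node
edges = [(1, 2), (2, 3), (2, 4), (3, 4)]  # if/else branches rejoin at 4
print(cyclomatic_complexity(nodes, edges))  # 4 - 4 + 2 = 2 = P + 1
```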

Derive Test Cases
1. Draw a flow graph based on the design or code.
2. Determine cyclomatic complexity.
3. Determine the basis set of linearly independent paths.
4. Prepare test cases for the basis set.

Example
PROCEDURE average;
INTERFACE RETURNS average, total.input, total.valid;
INTERFACE ACCEPTS value, minimum, maximum;
TYPE value[1:100] IS SCALAR ARRAY;
TYPE average, total.input, total.valid, minimum, maximum, sum IS SCALAR;
TYPE i IS INTEGER;
i = 1;
total.input = total.valid = 0;
sum = 0;
DO WHILE value[i] <> -999 AND total.input < 100
  increment total.input by 1;
  IF value[i] >= minimum AND value[i] <= maximum
    THEN increment total.valid by 1;
         sum = sum + value[i];
    ELSE skip
  ENDIF
  increment i by 1;
ENDDO
IF total.valid > 0
  THEN average = sum / total.valid;
  ELSE average = -999;
ENDIF
END average
(Flow graph: nodes = 13, edges = 17, regions = 6, labeled R1-R6.)

Example (cont.)
Predicate nodes = 2, 3, 5, 6, 10; V(G) = 6.
(Figure: the six basis paths through the flow graph; "…" marks a loop back to the loop header.)
Path 1 test case:
value(k) = valid input, where k < i for 2 ≤ i ≤ 100
value(i) = -999 where 2 ≤ i ≤ 100
Expected results: correct average based on k values and proper totals.
Note: must be tested as part of the path 4, 5, and 6 tests.
Path 2 test case:
value(i) = -999
Expected results: average = -999.

Example (cont.)
Path 3 test case:
Attempt to process 101 or more values; the first 100 values should be valid.
Expected results: same as test case 1.
Path 4 test case:
value(i) = valid input where i < 100
value(k) < minimum where k < i
Expected results: correct average based on k values and proper totals.
Path 5 test case:
value(i) = valid input where i < 100
value(k) > maximum where k ≤ i
Expected results: correct average based on k values and proper totals.
Path 6 test case:
value(i) = valid input where i < 100
Expected results: correct average based on k values and proper totals.
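A quick way to sanity-check these test cases is to transliterate the PDL into runnable code; the Python below is a hypothetical rendering (0-based indexing replaces the PDL's 1-based array):

```python
def average(value, minimum, maximum):
    """Transliteration of PROCEDURE average: -999 is the end sentinel,
    at most 100 inputs are read, and only values in [minimum, maximum]
    contribute to the average."""
    i = 0
    total_input = total_valid = 0
    total = 0
    while i < len(value) and value[i] != -999 and total_input < 100:
        total_input += 1
        if minimum <= value[i] <= maximum:
            total_valid += 1
            total += value[i]
        i += 1
    avg = total / total_valid if total_valid > 0 else -999
    return avg, total_input, total_valid

# Path 2: first value is the sentinel -> average = -999
assert average([-999], 0, 100) == (-999, 0, 0)
# Path 4: a value below minimum is counted as input but not as valid
assert average([5, 20, -999], 10, 100) == (20.0, 2, 1)
```

Paths 3, 5, and 6 can be exercised the same way by varying the input list.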

Loop Testing
Loop classes: simple loops, nested loops, concatenated loops, unstructured loops.

Simple Loops
Minimum conditions for a simple loop:
1. Skip the loop entirely.
2. Only one pass through the loop.
3. Two passes through the loop.
4. m passes through the loop, where m < n.
5. (n-1), n, and (n+1) passes through the loop, where n is the maximum number of allowable passes.
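The five conditions collapse to a small set of pass counts; a sketch (the helper name and the choice of a typical m are ours):

```python
def simple_loop_pass_counts(n, m=None):
    """Pass counts to exercise for a simple loop with at most n passes:
    0, 1, 2, some m < n, and n-1, n, n+1."""
    if m is None:
        m = n // 2  # a typical mid-range count, m < n
    return sorted({0, 1, 2, m, n - 1, n, n + 1})

print(simple_loop_pass_counts(20))  # [0, 1, 2, 10, 19, 20, 21]
```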

Nested Loops
1. Start at the innermost loop. Set all outer loops to their minimum iteration parameter values.
2. Test the min+1, typical, max-1, and max values for the innermost loop, while holding the outer loops at their minimum values. Add other tests for out-of-range or excluded values.
3. Move out one loop and set it up as in step 2, holding all other loops at typical values. Continue this step until the outermost loop has been tested.
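Step 2 can be sketched as a generator of (outer, inner) pass-count pairs (a hypothetical helper; the outer loop's minimum is passed in):

```python
def innermost_loop_tests(outer_min, inner_max):
    """Step 2: exercise the innermost loop at min+1, typical, max-1,
    and max passes while the outer loop is held at its minimum."""
    inner_counts = [1, inner_max // 2, inner_max - 1, inner_max]
    return [(outer_min, k) for k in inner_counts]

print(innermost_loop_tests(0, 10))  # [(0, 1), (0, 5), (0, 9), (0, 10)]
```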

Concatenated Loops
If the loops are independent of one another, treat each as a simple loop; otherwise, treat them as nested loops.
Redesign unstructured loops if possible.

Black-Box Testing
(Figure: the system as a black box; requirements drive the events and input applied, and only the output is observed.)

Equivalence Partitioning
(Figure: the input domain, partitioned into classes: user queries, mouse clicks, output formats, prompts, function-key input, data.)
An equivalence class represents a set of valid or invalid states for input conditions.
A test case can uncover classes of errors.

Sample Equivalence Classes
Valid data:
user-supplied commands
responses to system prompts
file names
computational data (physical parameters, bounding values, initiation values)
output data formatting
responses to error messages
graphical data (e.g., mouse clicks)
Invalid data:
data outside the bounds of the program
physically impossible data
proper value supplied in the wrong place
If a condition needs a range or a specific value: one valid and two invalid equivalence classes.
If a condition needs membership in a set or a Boolean: one valid and one invalid equivalence class.

Bank ID Example
Area code: blank or three-digit number. Class: Boolean (code may or may not be present).
Prefix: three-digit number not beginning with 0 or 1. Class: range (200-999, with specific exceptions).
Suffix: four-digit number. Class: value (four digits).
Password: six-digit alphanumeric string. Classes: Boolean (password may or may not be present), value (six-character string).
Commands: check, deposit, bill payment, and the like. Class: set (valid commands).
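One hypothetical way to encode these classes as executable checks (the field names, regexes, and command set are illustrative, not from the slide):

```python
import re

def equivalence_class(field, text):
    """Classify an input string against the bank-ID equivalence classes."""
    rules = {
        "area_code": r"|\d{3}",               # blank, or three digits
        "prefix":    r"[2-9]\d{2}",           # 200-999: no leading 0 or 1
        "suffix":    r"\d{4}",                # four digits
        "password":  r"|[0-9A-Za-z]{6}",      # absent, or six alphanumerics
        "command":   r"check|deposit|bill payment",  # member of the set
    }
    return "valid" if re.fullmatch(rules[field], text) else "invalid"

print(equivalence_class("prefix", "199"))  # invalid: begins with 1
```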

Boundary Value Analysis
(Figure: the input and output domains; test cases concentrate at the edges of each domain.)
Concentrates on the boundary conditions!

BVA Guidelines
For a range bounded by a and b: test with a, b, and values just above and below a and b.
For a set of values: test with the minimum, the maximum, and values just above and below them.
Apply the two guidelines above to output conditions as well.
Test data structures at their boundaries (e.g., an array's first and last elements).
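The range guideline as a small helper (a sketch, assuming integer inputs where "just above/below" means one step away):

```python
def boundary_values(a, b):
    """BVA candidates for an integer input range [a, b]: the bounds
    plus the values just below and just above each."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```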

Testing Strategy
Unit test (components)
Integration test (architecture)
Validation test (requirements)
System test (interfaces with other systems)

Unit Testing
(Figure: the software engineer applies white-box test cases to the module to be tested and examines the results.)

Unit Test Environment
A driver feeds test cases to the module under test; stubs (dummies) stand in for subordinate modules; results are collected.
The test exercises: the interface, local data structures, boundary conditions, independent paths, and error-handling paths.
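A minimal driver-and-stub sketch in Python (the `tax` module and its rate lookup are invented for illustration):

```python
def tax(amount, rate_lookup):
    """Module under test: delegates the rate to a subordinate module."""
    return amount * rate_lookup(amount)

def stub_rate_lookup(amount):
    """Stub: replaces the real rate-table module with a canned answer."""
    return 0.5  # invented flat rate

def driver():
    """Driver: applies test cases to the module and checks the results."""
    assert tax(100, stub_rate_lookup) == 50.0  # independent path
    assert tax(0, stub_rate_lookup) == 0.0     # boundary condition
    return "all unit tests passed"

print(driver())
```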

Integration Testing
Options: the "big bang" approach, or an incremental construction strategy.

Top-Down Integration
(Figure: module hierarchy with A at the top; B and C below; D, E, F, G at the leaves.)
The top module is tested with stubs.
Stubs are replaced one at a time, "depth first"; a "breadth first" approach would instead replace stubs of the same level first.
As new modules are integrated, some subset of tests is re-run.
Disadvantages: incurs development overhead (stubs must be written); it can be difficult to determine the real cause of an error.

Bottom-Up Integration
(Figure: the same module hierarchy, built from the leaves upward in clusters/builds.)
Worker modules are grouped into builds and integrated.
Drivers are replaced one at a time, "depth first."
Disadvantage: you don't have a full product until the test is finished.

Sandwich Testing
Top modules are tested with stubs; worker modules are grouped into builds and integrated.
(Figure: the hierarchy tested top-down at the top and bottom-up in clusters/builds at the leaves.)

Regression Testing
Retest whenever a module is corrected or a new module is added during integration.
Reuse part of the previously used test cases; automated capture/playback tools can also be used.
The regression test suite should contain:
a representative sample of tests
test cases for functions likely to be affected by the integration
test cases that focus on the changed components

Validation Testing
Aims to test conformance with requirements.
A test plan outlines the classes of tests to be conducted; a test schedule defines specific test cases to be used.
Configuration review (audit): make sure the correct software configuration is developed.
User acceptance test: the user "test drives" the software.
Alpha test: conducted at the developer's site.
Beta test: conducted at the customer's site.

System Testing
Tests the interfaces with other systems.
Recovery testing: force failures and verify that the system can recover (e.g., measure mean-time-to-repair).
Security testing: verify that protection mechanisms are correctly implemented.
Stress testing: test with abnormal quantity, frequency, or volume of transactions.
Performance testing: test the run-time performance of the software within the context of an integrated system.

The Debugging Process
(Figure: test cases produce results; debugging proceeds from suspected causes to identified causes, then to corrections, regression tests, and new test cases.)

Debugging Effort
Total effort = time required to diagnose the symptom and determine the cause + time required to correct the error and conduct regression tests.

Symptoms & Causes
The symptom and the cause may be geographically separated.
The symptom may disappear when another problem is fixed.
The cause may be due to a combination of non-errors.
The cause may be due to a system or compiler error.
The cause may be due to assumptions that everyone believes.
The symptom may be intermittent.

Consequences of Bugs
Damage ranges from mild, annoying, disturbing, serious, and extreme to catastrophic (and even infectious).
Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.

Debugging Techniques
brute force / testing
backtracking
induction
deduction

Debugging: Final Thoughts
Don't run off half-cocked; think about the symptom you're seeing.
Use tools (e.g., a dynamic debugger) to gain more insight.
If at an impasse, get help from someone else.
Be absolutely sure to conduct regression tests when you do "fix" the bug.

Object-Oriented Testing

OOT begins by evaluating the correctness and consistency of the OOA and OOD models.
The testing strategy changes: the concept of the "unit" broadens due to encapsulation; integration focuses on classes and their execution across a "thread" or in the context of a usage scenario; validation uses conventional black-box methods.
Test case design draws on conventional methods, but also encompasses special features.
Encapsulation and inheritance make OOT difficult.

Testing the CRC Model
1. Revisit the CRC model and the object-relationship model.
2. Inspect the description of each CRC index card to determine if a delegated responsibility is part of the collaborator's definition.
3. Invert the connection to ensure that each collaborator that is asked for service is receiving requests from a reasonable source.
4. Using the inverted connections examined in step 3, determine whether other classes might be required or whether responsibilities are properly grouped among the classes.
5. Determine whether widely requested responsibilities might be combined into a single responsibility.
6. Apply steps 1 to 5 iteratively to each class and through each evolution of the OOA model.

OOT Strategy
Class testing is the equivalent of unit testing: operations within the class are tested, and the state behavior of the class is examined.
Integration applies three different strategies:
thread-based testing: integrates the set of classes required to respond to one input or event
use-based testing: integrates the set of classes required to respond to one use case
cluster testing: integrates the set of classes required to demonstrate one collaboration
Validation testing uses use cases.

Test Case Design
1. Each test case should be uniquely identified and should be explicitly associated with the class to be tested.
2. The purpose of the test should be stated.
3. A list of testing steps should be developed for each test and should contain:
a. a list of specified states for the object that is to be tested
b. a list of messages and operations that will be exercised as a consequence of the test
c. a list of exceptions that may occur as the object is tested
d. a list of external conditions (i.e., changes in the environment external to the software that must exist in order to properly conduct the test)
e. supplementary information that will aid in understanding or implementing the test. [BER93]

Random Testing
Identify operations applicable to a class and define constraints on their use.
Identify a minimum test sequence: an operation sequence that defines the minimum life history of the class (object).
Generate a variety of random (but valid) test sequences to exercise other (more complex) class instance life histories.
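A sketch of random testing against a hypothetical Account class (the class, its constraints, and the sequence generator are all invented for illustration):

```python
import random

class Account:
    """Hypothetical class under test; its minimum life history is
    open -> deposit -> withdraw -> close."""
    def __init__(self):
        self.balance = 0
        self.is_open = True
    def deposit(self, n):
        assert self.is_open
        self.balance += n
    def withdraw(self, n):
        assert self.is_open and 0 < n <= self.balance  # constraint on use
        self.balance -= n
    def close(self):
        self.is_open = False

def random_sequence(rng, length=5):
    """Generate one random (but valid) operation sequence: withdrawals
    are bounded by the current balance, per the class's constraints."""
    acct, ops = Account(), ["open"]
    for _ in range(length):
        if acct.balance > 0 and rng.random() < 0.5:
            n = rng.randint(1, acct.balance)
            acct.withdraw(n)
            ops.append(f"withdraw({n})")
        else:
            n = rng.randint(1, 100)
            acct.deposit(n)
            ops.append(f"deposit({n})")
    acct.close()
    ops.append("close")
    return ops

print(random_sequence(random.Random(42)))
```

Seeding the generator makes each random sequence reproducible when a failure must be rerun.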

Partition Testing
Reduces the number of test cases required to test a class, in much the same way as equivalence partitioning for conventional software.
State-based partitioning: categorize and test operations based on their ability to change the state of a class.
Attribute-based partitioning: categorize and test operations based on the attributes that they use.
Category-based partitioning: categorize and test operations based on the generic function each performs.

Inter-Class Testing
For each client class, use the list of class operators to generate a series of random test sequences; the operators will send messages to other server classes.
For each message that is generated, determine the collaborator class and the corresponding operator in the server object.
For each operator in the server object (that has been invoked by messages sent from the client object), determine the messages that it transmits.
For each of those messages, determine the next level of operators that are invoked and incorporate these into the test sequence.