
Slide 1: CS 240: Software Project
Fall 2003, Sections 1 & 2
Dr. Badrul M. Sarwar, San Jose State University
Lecture #18

Slide 2: Agenda
- Software Testing Principles
- Integration Testing
- Usability Testing
- Practical Testing Aspects: Fault-based Testing Approach
- System Testing

Slide 3: Integration testing
- Tests complete systems or subsystems composed of integrated components
- Integration testing should be black-box testing, with tests derived from the specification
- The main difficulty is localising errors
- Incremental integration testing reduces this problem

Slide 4: Incremental integration testing (diagram)

Slide 5: Product with 13 Modules (diagram)

Slide 6: Approaches to integration testing
- Big bang testing
- Top-down testing
  - Start with the high-level system and integrate downward, replacing individual components with stubs where appropriate
- Bottom-up testing
  - Integrate individual components in levels until the complete system is created
- In practice, most integration involves a combination of these strategies, known as sandwich testing

Slide 7: Big Bang Integration Testing
- Assumes that all components are first tested individually and then tested together as a single system
- No additional test stubs or drivers are needed
- It sounds simple, but it is expensive
  - If a test uncovers a failure (which we expect to happen), it is difficult to pinpoint the specific component responsible for the failure
  - In addition, it is impossible to distinguish failures in an interface from failures within a component
- The better approach: incremental or phased (or "snowballing") integration

Slide 8: Test Stubs and Test Drivers
- To test module a, modules b, c, and d must be stubs
  - An empty module, or
  - One that prints a message ("Procedure radarCalc called"), or
  - One that returns precooked values from preplanned test cases
- To test module h on its own requires a driver, which calls it
  - Once, or
  - Several times, or
  - Many times, each time checking the value returned
- Testing module d requires a driver and two stubs
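Below is a minimal Java sketch of a stub and a driver, assuming a hypothetical module d that doubles a value obtained from a lower-level radarCalc module; the class and method names are illustrative, not from the original slides.

    // Stub for the lower-level module: prints a trace message and returns a
    // precooked value instead of performing the real computation.
    class RadarCalcStub {
        double radarCalc(double angle) {
            System.out.println("Procedure radarCalc called");
            return 42.0;  // precooked value from a preplanned test case
        }
    }

    // Module under test: would normally call the real radarCalc.
    class ModuleD {
        private final RadarCalcStub radar = new RadarCalcStub();
        double range(double angle) {
            return 2 * radar.radarCalc(angle);  // simplified logic for the sketch
        }
    }

    // Driver: calls the module under test and checks the value returned.
    public class ModuleDDriver {
        public static void main(String[] args) {
            double result = new ModuleD().range(30.0);
            System.out.println(result == 84.0 ? "PASS" : "FAIL: got " + result);
        }
    }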

Slide 9: Problems with the "Implementation, Then Integration" Approach
- Problem 1
  - Stubs and drivers must be written, then thrown away after module testing is complete
- Problem 2
  - Lack of fault isolation
  - A fault could lie in any of the 13 modules or 13 interfaces
  - In a large product with, say, 103 modules and 108 interfaces, there are 211 places where a fault might lie
- Solution to both problems
  - Combine module and integration testing into an "implementation and integration" phase

Slide 10: Top-down Implementation and Integration
- If module m1 calls module m2, then m1 is implemented and integrated before m2
- One possible top-down ordering: a, b, c, d, e, f, g, h, i, j, k, l, m
- Another possible ordering (modules in brackets are already integrated): a; [a] b, e, h; [a] c, d, f, i; [a, d] g, j, k, l, m

Slide 11: Top-down Implementation and Integration: Advantages
- Fault isolation
  - If a previously successful test case fails when mNew is added to what has been tested so far, the fault must lie in mNew or its interfaces
- Stubs are not wasted
  - Each stub is expanded into the corresponding complete module at the appropriate step
- Major design flaws show up early
  - Logic modules contain the decision-making flow of control
    - In the example, modules a, b, c, d, g, j
  - Operational modules perform the actual operations of the product
    - In the example, modules e, f, h, i, k, l, m
  - Logic modules are developed before operational modules

Slide 12: Top-down Implementation and Integration: Disadvantages
- Problem 1
  - Potentially reusable modules are not properly tested
  - Lower-level (operational) modules are not tested frequently
  - The situation is aggravated if the product is well designed, because defensive programming (fault shielding) in the calling modules hides faults; see the sketch below
  - Example: if (x >= 0) y = computeSquareRoot(x, errorFlag);
  - computeSquareRoot is never tested with x < 0, which has implications if the module is later reused
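A minimal sketch of why bottom-up drivers help here, assuming a hypothetical computeSquareRoot that sets an error flag on negative input; top-down callers shield the x < 0 path, but a driver can reach it directly.

    public class SquareRootDriver {
        // Hypothetical operational module: flags the error instead of throwing.
        static double computeSquareRoot(double x, boolean[] errorFlag) {
            if (x < 0) { errorFlag[0] = true; return 0.0; }
            errorFlag[0] = false;
            return Math.sqrt(x);
        }

        public static void main(String[] args) {
            boolean[] errorFlag = new boolean[1];
            // The defensive caller never passes x < 0; the driver does.
            computeSquareRoot(-1.0, errorFlag);
            System.out.println(errorFlag[0] ? "PASS: error flagged" : "FAIL: error missed");
        }
    }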

Slide 13: Bottom-up Implementation and Integration
- If module m1 calls module m2, then m2 is implemented and integrated before m1
- One possible bottom-up ordering: l, m, h, i, j, k, e, f, g, b, c, d, a
- Another possible ordering (modules in brackets are already integrated): h, e, b; i, f, c; d; l, m, j, k, g [d]; a [b, c, d]

Slide 14: Bottom-up Implementation and Integration: Advantages
- Advantage 1
  - Operational modules are thoroughly tested
- Advantage 2
  - Operational modules are tested with drivers, not by fault-shielding, defensively programmed calling modules
- Advantage 3
  - Fault isolation

Slide 15: Bottom-up Implementation and Integration: Disadvantages
- Difficulty 1
  - Major design faults are detected late
- Solution
  - Combine the top-down and bottom-up strategies, making use of their strengths and minimizing their weaknesses

Slide 16: Sandwich Implementation and Integration
- Logic modules are implemented and integrated top-down
- Operational modules are implemented and integrated bottom-up
- Finally, the interfaces between the two groups are tested

Slide 17: Sandwich Implementation and Integration (contd)
- Advantage 1
  - Major design faults are caught early
- Advantage 2
  - Operational modules are thoroughly tested
  - They may be reused with confidence
- Advantage 3
  - There is fault isolation at all times

Slide 18: Top-down testing (diagram)

Slide 19: Bottom-up testing (diagram)

Slide 20: Testing approaches
- Architectural validation
  - Top-down integration testing is better at discovering errors in the system architecture
- System demonstration
  - Top-down integration testing allows a limited demonstration at an early stage in the development
- Test implementation
  - Often easier with bottom-up integration testing
- Test observation
  - A problem with both approaches; extra code may be required to observe tests

Slide 21: Interface testing
- Takes place when modules or sub-systems are integrated to create larger systems
- The objective is to detect faults due to interface errors or invalid assumptions about interfaces
- Particularly important for object-oriented development, as objects are defined by their interfaces

Slide 22: Interface testing (diagram)

Slide 23: Interface types
- Parameter interfaces
  - Data passed from one procedure to another
- Shared memory interfaces
  - A block of memory is shared between procedures
- Procedural interfaces
  - A sub-system encapsulates a set of procedures to be called by other sub-systems
- Message passing interfaces
  - Sub-systems request services from other sub-systems

Slide 24: Interface errors
- Interface misuse
  - A calling component calls another component and makes an error in its use of its interface, e.g. passing parameters in the wrong order (see the sketch below)
- Interface misunderstanding
  - A calling component embeds incorrect assumptions about the behaviour of the called component
- Timing errors
  - The called and the calling component operate at different speeds, so out-of-date information is accessed
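A minimal sketch of interface misuse, using a hypothetical transfer API: because both account parameters have the same type, swapping them compiles cleanly, and only a test can catch the error.

    public class TransferMisuse {
        // Hypothetical procedural interface: moves money between accounts.
        static void transfer(int fromId, int toId, double amount) {
            System.out.println("Moving " + amount + " from " + fromId + " to " + toId);
        }

        public static void main(String[] args) {
            transfer(101, 202, 50.0);  // correct: debit 101, credit 202
            transfer(202, 101, 50.0);  // misuse: caller meant the opposite direction
        }
    }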

Slide 25: Interface testing guidelines
- Design tests so that parameters to a called procedure are at the extreme ends of their ranges
- Always test pointer parameters with null pointers
- Design tests which cause the component to fail
- Use stress testing in message passing systems
- In shared memory systems, vary the order in which components are activated
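A minimal sketch of the first three guidelines, assuming a hypothetical findMax procedure under test: parameters at the extreme ends of their range, then a null reference (Java's analogue of a null pointer parameter) to force a failure.

    public class InterfaceGuidelineTests {
        static int findMax(int[] values) {
            if (values == null || values.length == 0)
                throw new IllegalArgumentException("no values");
            int max = values[0];
            for (int v : values) if (v > max) max = v;
            return max;
        }

        public static void main(String[] args) {
            // Extreme ends of the parameter's range
            int max = findMax(new int[] { Integer.MIN_VALUE, Integer.MAX_VALUE });
            System.out.println(max == Integer.MAX_VALUE ? "PASS: extremes" : "FAIL");
            // Force the component to fail with a null parameter
            try {
                findMax(null);
                System.out.println("FAIL: null accepted");
            } catch (IllegalArgumentException expected) {
                System.out.println("PASS: null rejected");
            }
        }
    }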

Slide 26: Usability (User Interface) Testing
- GUIs work as a collection of controls that a human can stimulate via mouse or keyboard
- Some controls are simple (e.g., buttons), but others are complex (e.g., list boxes), as they pass control as well as data
- Testing data-passing controls is challenging
- Another concern is the order in which GUI controls are used
- As testers, our job is to understand the events and data combinations that can originate from the GUI and to ensure that all interesting cases get tested

Slide 27: Attacking the GUI
- We'll look at four categories of attack on a GUI
  - Input
  - Output
  - Storing of data
  - Performing computation
- Strategies for testing GUI input
  - Apply inputs that force all error messages to occur
  - Apply inputs that force the software to establish default values
  - Explore allowable character sets and data types
    - Typing the URL file://c:\AUX into Internet Explorer v5.5 breaks the program
  - Overflow input buffers
    - We have seen one such boundary-value attack on MS Word

Slide 28: Attacking the GUI (contd)
- Strategies for testing GUI input (contd)
  - Repeat the same input or series of inputs numerous times
    - The Microsoft Equation Editor fails this test
- Strategies for testing GUI output
  - Force different outputs to be generated for each input
  - Force invalid outputs to be generated
    - Earlier versions of Windows NT showed 2001 as a leap year
  - Force properties of an output to change
    - Example: MS PowerPoint
  - Force the screen to refresh
    - MS PowerPoint fails this test

Slide 29: Attacking the GUI (contd)
- Strategies for testing GUI data and computation
  - Apply inputs using a variety of initial conditions
  - Force a data structure to store too many or too few values
  - Use alternative ways to modify internal data constraints
  - Force a function to call itself recursively
  - Force a computation result to be too large or too small
- Strategies for testing the GUI file system
  - Fill the file system to its capacity
  - Force the media to be busy or unavailable
  - Damage the media
  - Assign an invalid file name
  - Vary or corrupt file contents
  - Change file access permissions, etc.

Slide 30: System Testing
- Unit and integration testing focus on finding faults in individual components and in the interfaces between components
- After integration, system testing is performed
- There are several system testing activities
  - Functional testing
  - Nonfunctional (performance) testing
  - Pilot testing
  - Acceptance testing
  - Installation testing

Slide 31: Functional Testing
- A.k.a. requirements testing; it finds differences between the functional requirements and the system
- System testing is black-box testing
- Test cases are derived from the use case model
- Although usability testing also uses the use case model, there is a subtle difference between the two
  - Functional testing finds differences between the use case model and the observed system behavior
  - Usability testing finds differences between the use case model and the user's expectation of the system

Slide 32: Performance Testing
- Finds the differences between the design goals selected during system design and the observed system
- The following tests are performed during performance testing
  - Stress testing
  - Volume testing
  - Security testing
  - Timing tests
  - Recovery tests

Slide 33: Stress testing
- Exercises the system beyond its maximum design load; stressing the system often causes defects to come to light
- Stressing the system tests failure behaviour: systems should not fail catastrophically, and stress testing checks for unacceptable loss of service or data
- Particularly relevant to distributed systems, which can exhibit severe degradation as a network becomes overloaded
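A minimal stress-test sketch: many client threads push requests at an in-process service (a hypothetical stand-in with a design load of 100 queued requests) at 100 times that load, checking that the system degrades by rejecting work rather than crashing or losing data.

    import java.util.concurrent.*;
    import java.util.concurrent.atomic.AtomicInteger;

    public class StressSketch {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Integer> service = new ArrayBlockingQueue<>(100);
            AtomicInteger rejected = new AtomicInteger();

            // One slow consumer drains the queue.
            Thread consumer = new Thread(() -> {
                try { while (true) { service.take(); Thread.sleep(1); } }
                catch (InterruptedException e) { /* shut down */ }
            });
            consumer.start();

            // 50 client threads submit 100x the design load.
            ExecutorService clients = Executors.newFixedThreadPool(50);
            for (int i = 0; i < 10_000; i++) {
                clients.submit(() -> {
                    if (!service.offer(1)) rejected.incrementAndGet();  // degrade, don't crash
                });
            }
            clients.shutdown();
            clients.awaitTermination(60, TimeUnit.SECONDS);
            consumer.interrupt();
            System.out.println("Rejected under load: " + rejected.get() + " of 10000");
        }
    }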

Slide 34: Pilot Testing
- A.k.a. field test
- The system is installed and used by a selected set of users
- Useful when the system is built without a specific set of requirements
- An alpha test is a pilot test with users exercising the system in the development environment
- A beta test is an acceptance test done by a limited number of end users in the target environment

Slide 35: Acceptance Testing
- There are three ways a client evaluates a system during acceptance testing
  - Benchmark test
    - Uses a set of test cases under which the system should operate
  - Competitor test
    - If the product is replacing an earlier product, it is tested against that product
  - Shadow test
    - The new and legacy systems run in parallel and their outputs are compared (see the sketch below)
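A minimal shadow-test sketch: a legacy and a new implementation of the same function (both hypothetical stand-ins) run side by side over the same inputs, and any divergence in their outputs is reported.

    import java.util.function.IntUnaryOperator;

    public class ShadowTest {
        public static void main(String[] args) {
            IntUnaryOperator legacy = n -> n * n;                       // trusted legacy system
            IntUnaryOperator replacement = n -> (int) Math.pow(n, 2);   // new system under evaluation
            int divergences = 0;
            for (int n = -1000; n <= 1000; n++) {
                int expected = legacy.applyAsInt(n);
                int actual = replacement.applyAsInt(n);
                if (expected != actual) {
                    divergences++;
                    System.out.println("Divergence at " + n + ": " + expected + " vs " + actual);
                }
            }
            System.out.println("Shadow run complete, " + divergences + " divergences");
        }
    }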

Slide 36: Object-oriented testing
- The components to be tested are object classes that are instantiated as objects
- Classes are larger-grained than individual functions, so approaches to white-box testing have to be extended
- There is no obvious 'top' to the system for top-down integration and testing

Slide 37: Testing levels
- Testing operations associated with objects
- Testing object classes
- Testing clusters of cooperating objects
- Testing the complete OO system

Slide 38: Object class testing
- Complete test coverage of a class involves
  - Testing all operations associated with an object
  - Setting and interrogating all object attributes
  - Exercising the object in all possible states
- Inheritance makes it more difficult to design object class tests, as the information to be tested is not localised
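A minimal sketch of complete coverage for a hypothetical two-state Valve class: every operation is called, the attribute is set and interrogated, and both states are exercised.

    public class ValveTest {
        static class Valve {
            private boolean openState = false;          // attribute
            void open()  { openState = true; }          // operation
            void close() { openState = false; }         // operation
            boolean isOpen() { return openState; }      // interrogates the attribute
        }

        public static void main(String[] args) {
            Valve v = new Valve();
            System.out.println(!v.isOpen() ? "PASS: starts closed" : "FAIL");
            v.open();   // state transition: closed -> open
            System.out.println(v.isOpen() ? "PASS: opens" : "FAIL");
            v.close();  // state transition: open -> closed
            System.out.println(!v.isOpen() ? "PASS: closes" : "FAIL");
        }
    }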

Slide 39: Weather station object interface
- Test cases are needed for all operations
- Use a state model to identify state transitions for testing
- Examples of testing sequences (a sketch of the second appears below)
  - Shutdown -> Waiting -> Shutdown
  - Waiting -> Calibrating -> Testing -> Transmitting -> Waiting
  - Waiting -> Collecting -> Waiting -> Summarising -> Transmitting -> Waiting
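A minimal sketch of driving the second sequence against a hypothetical WeatherStation whose state is observable; the operation names are illustrative, not from the original slides.

    public class WeatherStationStateTest {
        enum State { SHUTDOWN, WAITING, CALIBRATING, TESTING, TRANSMITTING }

        static class WeatherStation {
            State state = State.SHUTDOWN;
            void startup()   { state = State.WAITING; }
            void calibrate() { state = State.CALIBRATING; }
            void test()      { state = State.TESTING; }
            void transmit()  { state = State.TRANSMITTING; }
            void finish()    { state = State.WAITING; }
        }

        static void check(State actual, State expected) {
            System.out.println(actual == expected ? "PASS: " + actual : "FAIL: " + actual);
        }

        public static void main(String[] args) {
            WeatherStation ws = new WeatherStation();
            // Waiting -> Calibrating -> Testing -> Transmitting -> Waiting
            ws.startup();    check(ws.state, State.WAITING);
            ws.calibrate();  check(ws.state, State.CALIBRATING);
            ws.test();       check(ws.state, State.TESTING);
            ws.transmit();   check(ws.state, State.TRANSMITTING);
            ws.finish();     check(ws.state, State.WAITING);
        }
    }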

Slide 40: Object integration
- Levels of integration are less distinct in object-oriented systems
- Cluster testing is concerned with integrating and testing clusters of cooperating objects
- Identify clusters using knowledge of the operation of objects and of the system features that are implemented by these clusters

Slide 41: Approaches to cluster testing
- Use-case or scenario testing
  - Testing is based on user interactions with the system
  - Has the advantage that it tests system features as experienced by users
- Thread testing
  - Tests the system's response to events as processing threads through the system
- Object interaction testing
  - Tests sequences of object interactions that stop when an object operation does not call on services from another object

Slide 42: Scenario-based testing
- Identify scenarios from use cases and supplement them with interaction diagrams that show the objects involved in the scenario
- Consider the scenario in the weather station system where a report is generated

Slide 43: Collect weather data (diagram)

Slide 44: Weather station testing
- Thread of methods executed
  - CommsController:request -> WeatherStation:report -> WeatherData:summarise
- Inputs and outputs
  - Input of a report request, with an associated acknowledgement and a final output of a report
  - Can be tested by creating raw data and ensuring that it is summarised properly (see the sketch below)
  - Use the same raw data to test the WeatherData object on its own
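A minimal sketch of that thread with hypothetical stand-in classes: known raw data goes in through CommsController:request, and the test checks the summarised report that comes out.

    import java.util.List;

    public class WeatherThreadTest {
        static class WeatherData {
            private final List<Double> readings;
            WeatherData(List<Double> readings) { this.readings = readings; }
            double summarise() {  // the summary here is just the mean of the readings
                return readings.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
            }
        }

        static class WeatherStation {
            private final WeatherData data;
            WeatherStation(WeatherData data) { this.data = data; }
            String report() { return "MEAN " + data.summarise(); }
        }

        static class CommsController {
            private final WeatherStation station;
            CommsController(WeatherStation station) { this.station = station; }
            String request() { return station.report(); }  // acknowledge and forward
        }

        public static void main(String[] args) {
            WeatherData data = new WeatherData(List.of(10.0, 20.0, 30.0));  // known raw data
            String report = new CommsController(new WeatherStation(data)).request();
            System.out.println("MEAN 20.0".equals(report) ? "PASS" : "FAIL: " + report);
        }
    }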

Slide 45: Testing workbenches
- Testing is an expensive process phase; testing workbenches provide a range of tools to reduce the time required and the total testing costs
- Most testing workbenches are open systems, because testing needs are organisation-specific
- They are difficult to integrate with closed design and analysis workbenches

Slide 46: A testing workbench (diagram)

Slide 47: Testing workbench adaptation
- Scripts may be developed for user interface simulators, and patterns for test data generators
- Test outputs may have to be prepared manually for comparison
- Special-purpose file comparators may be developed (a sketch appears below)
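A minimal sketch of a special-purpose file comparator: it compares an actual output file against a manually prepared expected file, line by line. The file names are hypothetical.

    import java.io.IOException;
    import java.nio.file.*;
    import java.util.List;

    public class FileComparator {
        public static void main(String[] args) throws IOException {
            List<String> expected = Files.readAllLines(Path.of("expected.txt"));
            List<String> actual   = Files.readAllLines(Path.of("actual.txt"));
            int lines = Math.max(expected.size(), actual.size());
            for (int i = 0; i < lines; i++) {
                String e = i < expected.size() ? expected.get(i) : "<missing>";
                String a = i < actual.size()   ? actual.get(i)   : "<missing>";
                if (!e.equals(a))
                    System.out.println("Line " + (i + 1) + ": expected '" + e + "', got '" + a + "'");
            }
            System.out.println("Comparison complete");
        }
    }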

