Software Testing CS 560 (presentation transcript)

1 Software Testing CS 560

2 Software testing strategies: a strategic approach to testing
Testing confirms that the software:
- meets the requirements that guided its design;
- responds correctly to all types of input;
- can be installed/replicated and run in its intended environment;
- achieves the results that stakeholders desire.

3 Introduction to software testing
A testing strategy provides a road map that describes:
- the steps to be taken (how to test);
- when to test (milestones, component design, integration, etc.);
- how much effort, time, and resources will be required.
It incorporates:
- test planning;
- test case design;
- test execution;
- test result collection and evaluation.
Software testing provides guidance for the development team: because of time constraints, progress must be measurable and problems must surface as early as possible.

4 General characteristics of software testing
To perform effective testing, a software team should conduct effective formal technical reviews:
- the objective is to arrive at a superior version of the product being reviewed, whether by correcting defects or introducing new functionality.
Testing begins at the component level:
- make sure the individual functions of a component work, then move toward integration testing of the entire system;
- different testing techniques apply to components and to groups of components.
Testing and debugging are different activities, but debugging must be accommodated in any testing strategy:
- Testing: "based on the requirements, does function a do x?"
- Debugging: function a does not work with function b because function a calls function b incorrectly; correct the call to function b inside function a.

5 Verification and Validation
Software testing is part of a broader group of activities called verification and validation.
Verification (are the algorithms coded correctly?):
- activities that ensure the software correctly implements a specific function or algorithm;
- difficult because of the number of possible test cases that exist even in a simple design;
- example: a search algorithm works correctly for the specified input.
Validation (does it meet user requirements?):
- activities that ensure the software that has been built is traceable to client/customer requirements;
- example: a web search engine returns results for a user query within a specified time n.
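The verification example above (a search algorithm checked against specified inputs) can be sketched as a few concrete checks; the `search` routine below is illustrative, not code taken from the slides:

```python
def search(items, target):
    """Linear search: return the index of target in items, or -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Verification: confirm the algorithm is coded correctly for specified inputs.
assert search([3, 1, 4], 4) == 2    # target present
assert search([3, 1, 4], 9) == -1   # target absent
assert search([], 5) == -1          # empty input
```

Even for a routine this small, exhaustive verification is impractical, which is why representative inputs are chosen.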

6 Software Testing
Testing should aim at "breaking" the software.
Common misconceptions:
- the developer of the software should do no testing at all;
- the software should be given to a secret team of testers who will test it unmercifully;
- the testers get involved with the project only when the testing steps are about to begin.

7 Levels of Testing for Conventional Software (documentation)
Unit (component) testing:
- concentrates on each component/function of the software as implemented in the source code;
- used to generate documentation of how each function works.
Integration testing:
- focuses on the design and construction of the software architecture;
- individual components are combined and tested as a group.
Validation testing:
- requirements are validated against the constructed software to make sure it meets stakeholder requirements.
System testing:
- the software and other system elements are tested as a whole;
- system testing should require no knowledge of the system architecture or code.

8 Ensuring a Successful Software Test Strategy
- Specify product requirements in a quantifiable manner before testing commences.
- State testing objectives explicitly in measurable terms.
- Understand the users of the software and develop a profile for each user category.
- Develop a continuous-improvement approach for the testing process through the gathering of metrics.

9 Unit Testing
Focuses testing on a single function or software module, concentrating on the internal program logic and data structures.

10 Targets for Unit Test Cases
Function interface:
- ensure that information flows properly into and out of the function.
Local data structures:
- ensure that data stored temporarily maintains its integrity during all steps of an algorithm's execution.
Boundary conditions:
- ensure that the function operates properly at the boundary values established to limit or restrict processing.
Independent paths (basis paths):
- exercise paths so that all statements in a module are executed at least once.
Error-handling paths:
- ensure that the algorithm responds correctly to specific error conditions.
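As an illustration of the boundary-condition target, a hypothetical `clamp` function can be unit-tested exactly at and just beyond its limits (the function and the values 0 and 10 are assumptions for the sake of the example):

```python
def clamp(value, low, high):
    """Restrict value to the inclusive range [low, high]."""
    if value < low:
        return low
    if value > high:
        return high
    return value

# Boundary-condition cases: exercise the limits themselves and values just outside them.
assert clamp(0, 0, 10) == 0     # exactly at the lower bound
assert clamp(10, 0, 10) == 10   # exactly at the upper bound
assert clamp(-1, 0, 10) == 0    # just below the lower bound
assert clamp(11, 0, 10) == 10   # just above the upper bound
assert clamp(5, 0, 10) == 5     # interior value
```

Off-by-one errors cluster at exactly these points, which is why boundary values get their own test cases.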

11 Integration Testing
A systematic technique for constructing the software architecture:
- conduct tests to uncover errors associated with combined components and their interfaces.
Two approaches:
- non-incremental integration testing;
- incremental integration testing.

12 Non-incremental Integration Testing
All components are combined in advance and the entire program is tested as a whole. The results are typically chaotic:
- many seemingly unrelated errors are encountered;
- correction is difficult because isolating the causes is complicated;
- once a set of errors is corrected, more errors appear, and testing seems to enter an endless loop.

13 Incremental Integration Testing
The program is constructed and tested in small increments, so errors are easier to isolate and correct.
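One common way to integrate incrementally is to test a real component against a stub for a component that is not yet ready; the sketch below (the component names and values are invented for illustration) shows the idea:

```python
def parse_order(raw):
    """Component A (real): parse 'item, qty' into a structured order."""
    item, qty = raw.split(",")
    return {"item": item.strip(), "qty": int(qty)}

def price_lookup_stub(item):
    """Stub standing in for component B, whose real price service is not yet integrated."""
    return {"widget": 2.50}.get(item, 0.0)

def order_total(raw, lookup=price_lookup_stub):
    """The increment under test: A combined with a stand-in for B."""
    order = parse_order(raw)
    return order["qty"] * lookup(order["item"])

# Because only one new component is in play, any failure here must lie
# in A or in the A-to-B interface, which keeps errors easy to isolate.
assert order_total("widget, 4") == 10.0
```

When the real component B arrives, the stub is swapped out and the same tests are rerun.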

14 Validation testing
Validation testing follows integration testing. It focuses on user-visible actions and user-recognizable output from the system, and demonstrates functionality based on the requirements. It is designed to ensure that:
- all functional requirements are satisfied;
- all behavioral characteristics are achieved;
- all performance requirements are attained;
- documentation is correct;
- usability and other requirements are met.

15 System testing: different types
Recovery testing:
- tests recovery from system faults;
- forces the software to fail in a variety of ways and verifies that recovery is properly performed;
- checks reinitialization, data recovery, and restart for correctness.
Security testing:
- verifies that the protection mechanisms built into a system will, in fact, protect it from improper access.
Stress testing:
- executes the system in a manner that demands resources in abnormal quantity, frequency, or volume.
Performance testing:
- tests the run-time performance of the product;
- often coupled with stress testing;
- can uncover situations that lead to degradation and possible system failure.

16 Three Questions to Ask Before Correcting an Error
Is the cause of the error reproduced in another part of the program?
- similar errors may be occurring elsewhere in the program.
What new error might be introduced by the fix I am about to make?
- the source code (and even the design) should be studied first.
What could we have done to prevent this bug in the first place?
- this is the first step toward software quality assurance;
- by correcting the process as well as the product, the bug is removed from the current program and may be eliminated from all future programs.

17 Dynamic verification testing: stages of testing
Testing is most effective when divided into stages:
- user interface testing (carried out separately);
- unit testing: unit test;
- system testing: integration test, function test, performance test, installation test;
- acceptance testing (carried out separately).

18 Dynamic testing: user interface testing (documentation)
A subset of user interface testing guidelines:
General:
- every action that alters user data can be undone;
- all application settings can be restored to their defaults;
- the most frequently used functions are found at the top level of the menu structure.
Keyboard:
- efficient keyboard access is provided to all application features;
- no awkward reaches for frequently performed keyboard operations.
Mouse:
- the mouse pointer is never restricted to part of the screen by the application.

19 Dynamic testing: unit testing, interfaces (documentation)
Interface types:
- Parameter interfaces: data is passed from one procedure to another.
- Shared-memory interfaces: a block of memory is shared between procedures.
- Procedural interfaces: a sub-system encapsulates a set of procedures to be called by other sub-systems.
- Message-passing interfaces: sub-systems request services from other sub-systems.

20 Dynamic testing: unit testing, basis paths (documentation)
The objective of path testing is to build a set of test cases such that each path through the program is executed at least once:
- this ensures statement/branch coverage;
- if every condition in a compound condition is also considered, condition coverage can be achieved as well.
Steps for basis path testing:
- draw a (control) flow graph from the source code (line numbers can label the nodes);
- calculate the cyclomatic complexity from the flow graph;
- determine a basis set of linearly independent paths;
- design test cases to exercise each path in the basis set.

21 Dynamic testing: unit testing, basis paths (documentation)
Flow graphs:
- used to depict program control structure;
- can be drawn from a piece of source code;
- notation: composed of edges and nodes, where an edge starts at one node and ends at another.

22 Dynamic testing: unit testing, basis paths (documentation)
Linear search flow graph (figure not reproduced in this transcript).

23 Dynamic testing: unit testing, basis paths (documentation)
Calculating cyclomatic complexity:
- E = number of edges
- N = number of nodes
- P = number of connected components
- CC = E - N + 2P
- CC = 11 - 10 + 2(1) = 3
Independent paths through the program:
- 12, 1, 2, 3, 5, 6, 7, 10
- 12, 1, 2, 3, 5, 6, 7, 8, 7, 10
- 12, 1, 2, 3, 5, 6, 7, 8, 9, 7, 10
Test cases should be derived so that all of these paths are executed.
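The slide's arithmetic can be checked mechanically. The edge list below is a reconstruction of the flow graph from the paths listed above (an assumption, since the graph itself is not reproduced in this transcript; nodes are the slide's line-number labels):

```python
# Edges reconstructed from the independent paths above.
edges = [(12, 1), (1, 2), (2, 3), (3, 5), (5, 6), (6, 7),
         (7, 8), (8, 7), (8, 9), (9, 7), (7, 10)]
nodes = {n for edge in edges for n in edge}

E = len(edges)   # 11 edges
N = len(nodes)   # 10 nodes
P = 1            # one connected component
cc = E - N + 2 * P
print(cc)  # 3: expect three linearly independent paths
```

The result matches the three independent paths enumerated on the slide.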

24 Dynamic testing: unit testing, basis paths (documentation)
Designing the test cases:
Path 1 (list of size 1): 12, 1, 2, 3, 5, 6, 7, 10
- input data: [4]; expected output: 4
Path 2 (max in position 1): 12, 1, 2, 3, 5, 6, 7, 8, 7, 10
- input data: [6, 2, 5, 1, 3]; expected output: 6
Path 3 (max not in position 1): 12, 1, 2, 3, 5, 6, 7, 8, 9, 7, 10
- input data: [5, 2, 1, 1, 8, 3, 4]; expected output: 8
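The expected outputs above are consistent with a routine that finds the maximum of a list; a plausible reconstruction (an assumption, since the slides do not reproduce the source code, and the node mapping in the comments is likewise inferred) is:

```python
def find_max(items):
    """Return the largest element of a non-empty list (illustrative reconstruction)."""
    largest = items[0]
    for value in items[1:]:
        if value > largest:   # the update branch corresponds to the 8 -> 9 edge above
            largest = value
    return largest

# One test case per basis path:
assert find_max([4]) == 4                    # path 1: single element, loop body skipped
assert find_max([6, 2, 5, 1, 3]) == 6        # path 2: compare without updating
assert find_max([5, 2, 1, 1, 8, 3, 4]) == 8  # path 3: compare and update
```

Running all three cases exercises every edge of the flow graph at least once.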

25 Dynamic testing: system testing, integration test (documentation)
Tests complete systems or subsystems composed of integrated components. The main difficulty is localising errors:
- errors may not surface until components rely on each other to function properly.
Incremental integration testing reduces this problem.

26 Dynamic testing: system testing, performance/stress test (documentation)
Exercises the system beyond its maximum design load:
- stressing the system often causes defects to come to light.
Stress testing also examines failure behaviour:
- systems should not fail catastrophically;
- stress testing checks for unacceptable loss of service or data.
Particularly relevant to distributed systems, which can exhibit severe degradation as the network becomes overloaded.

27 Dynamic testing: acceptance testing (documentation)
Used to determine whether the requirements of a software product are met.
Gather the key acceptance criteria/requirements:
- the list of features/functions that will be evaluated before testing the product.
Determine the testing approach:
- types of acceptance tests: stress, timing, compliance, capacity;
- testing levels: system level, component level, integration level;
- test methods and tools.
Test data recording:
- a description of how the acceptance test will be recorded.

28 Key points of software testing
- Test the parts of a system that are commonly used rather than those that are rarely executed.
- Acceptance testing is based on the system specifications and requirements.
- Flow graphs identify test cases that cause all paths through the program to be executed.
- Interface defects arise from specification misreading, misunderstanding, errors, or invalid timing assumptions.

29 Acceptance testing
The complete system, including documentation, training materials, installation scripts, etc., is tested against the requirements by the client:
- the client is assisted by the developers;
- developers create test cases and scenarios for the client;
- each requirement is tested separately;
- scenarios are used to compare the expected outcomes with what the system actually does;
- emphasis is placed on how the system handles problems, errors, restarts, and other difficulties.
Is the system we have built the system that you wanted? Does it meet your requirements?

30 Acceptance testing
Three major objectives of acceptance testing:
- confirm that the system meets the agreed-upon requirements;
- identify and resolve any conflicts;
- determine the readiness of the system for live operation.

31 Acceptance criteria (documentation)
Defined by the following attributes:
- functional correctness and completeness
- accuracy
- data integrity
- data conversion
- backup and recovery
- competitive edge
- usability
- performance
- start-up time
- stress
- reliability and availability
- maintainability and serviceability
- robustness
- timeliness
- confidentiality and availability
- compliance
- installability and upgradability
- scalability
- documentation

32 Acceptance tests
- Closed-box testing performed by the client without knowledge of the internals.
- The entire system is tested as a whole.
- The emphasis is on whether the system meets the requirements.
- The tests should use real data in realistic situations, with actual users, administrators, and operators.
The acceptance tests must be completed successfully before the new system can go live or replace a legacy system.

33 Acceptance test report (documentation)

34 Project Delivery Summary
A good delivery package results in:
- a happy client;
- happy users;
- less expense in support and maintenance.
But many projects fall short because of:
- poor packaging/documentation;
- poor or no training materials;
- improperly tested components;
- general neglect of parts of the software process.

