Dr. Pedro Mejia Alvarez
Slide 1: Software Testing: Building Test Cases
Slide 2: Topics covered
- Software testing process
- Software testability
- Test cases
- Black-box testing
- Search and sort cases
- Debugging the application
Slide 3: Dynamics of Faults (Introduction to Software Testing)
- Fault detection: waiting for (or causing, or finding) the error or failure to occur.
- Fault location: finding where the fault(s) occurred, their causes, and their consequences.
- Fault recovery: fixing the fault without introducing new ones.
- Regression testing: testing the software again with the same data that caused the original fault.
Slide 4: Static and dynamic verification
- Software inspections: concerned with analysis of the static system representation to discover problems (static verification). May be supplemented by tool-based document and code analysis.
- Software testing: concerned with exercising and observing product behaviour (dynamic verification). The system is executed with test data and its operational behaviour is observed. Used to detect faults.
- Software debugging: concerned with finding and removing the faults.
- Fault tolerance: if faults still occur during operation, detect them, remove them, and recover the system.
Slide 5: The software testing process
Slide 6: Software Testability
- The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met.
- Plainly speaking: how hard it is to find faults in the software.
- Testability is determined by two practical problems: how to provide the test values to the software, and how to observe the results of test execution.
Slide 7: Observability and Controllability
- Observability: how easy it is to observe the behaviour of a program in terms of its outputs and its effects on the environment and on other hardware and software components. Software that affects hardware devices, databases, or remote files has low observability.
- Controllability: how easy it is to provide a program with the needed inputs, in terms of values, operations, and behaviours. Software that takes its inputs from a keyboard is easy to control; inputs from hardware sensors or distributed software are harder.
- Data abstraction reduces both controllability and observability.
Slide 8: Components of a Test Case
- A test case is a multipart artifact with a definite structure.
- Test case values: the values that directly satisfy one test requirement.
- Expected results: the result that will be produced when executing the test if the program satisfies its intended behaviour.
Slide 9: Affecting Controllability and Observability
- Preconditions: any conditions necessary for the correct execution of the software.
- Postconditions: any conditions that must be observed after executing the software, including:
  1. Verification values: values needed to see the results of the test case values.
  2. Exit commands: values needed to terminate the program or otherwise return it to a stable state.
- Executable test script: a test case prepared in a form that can be executed automatically on the software under test and produce a report.
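The components described in the last two slides can be sketched in code. The following is an illustrative sketch, not the course's actual material: the struct fields, the `square` function under test, and all names are assumptions chosen to show how test case values and expected results travel together.

```c
#include <stdio.h>

/* Illustrative sketch: a test case bundles its test case
   values with the expected result and a description. */
typedef struct {
    const char *description;  /* what the case checks */
    int input;                /* test case value      */
    int expected;             /* expected result      */
} TestCase;

/* Hypothetical function under test. */
static int square(int x) { return x * x; }

/* Runs one case; returns 1 on pass, 0 on fail. */
int run_case(const TestCase *tc) {
    int actual = square(tc->input);
    int ok = (actual == tc->expected);
    printf("%s: %s\n", tc->description, ok ? "PASS" : "FAIL");
    return ok;
}
```

A full harness would also handle the preconditions and exit commands from slide 9; this sketch keeps only the values/expected-results core.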
Slide 10: Algorithms to Test
- Search: binary search, interpolation search.
- Sort: heap sort, merge sort.
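As a concrete subject for the exercises that follow, here is a minimal sketch of the first algorithm listed, binary search. This is a generic textbook version, not the course's code; the half-open interval convention is a design choice of this sketch.

```c
#include <stddef.h>

/* Minimal iterative binary search over a sorted int array.
   Searches the half-open range [lo, hi); returns the index
   of key, or -1 if it is absent. */
int binary_search(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;  /* avoids overflow */
        if (a[mid] == key)
            return (int)mid;
        if (a[mid] < key)
            lo = mid + 1;   /* key is in the upper half */
        else
            hi = mid;       /* key is in the lower half */
    }
    return -1;
}
```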
Slide 11: Black-Box Testing: Testing the Algorithms
- Study the algorithms.
- Program them.
- Define the pre-conditions and post-conditions of each algorithm.
- Develop test cases.
- Develop an oracle.
- Run the tests.
- Record inputs and outputs.
- Record invalid outputs.
- If there are invalid outputs or execution errors, debug the program to find the fault.
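The oracle-driven loop above can be sketched as follows. Everything here is an illustrative assumption: the routine under test is a binary search, the oracle is a slow-but-trusted linear search, and mismatches are recorded as the slide prescribes.

```c
#include <stdio.h>
#include <stddef.h>

/* Routine under test (binary search over a sorted array). */
static int search_under_test(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return (int)mid;
        if (a[mid] < key) lo = mid + 1; else hi = mid;
    }
    return -1;
}

/* Oracle: slow but obviously correct linear search. */
static int oracle(const int *a, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (a[i] == key) return (int)i;
    return -1;
}

/* Runs every key through both routines, records invalid
   outputs, and returns the number of disagreements. */
int run_tests(const int *a, size_t n, const int *keys, size_t nkeys) {
    int failures = 0;
    for (size_t i = 0; i < nkeys; i++) {
        int got  = search_under_test(a, n, keys[i]);
        int want = oracle(a, n, keys[i]);
        if (got != want) {
            printf("FAIL: key=%d got=%d want=%d\n", keys[i], got, want);
            failures++;
        }
    }
    return failures;
}
```

With distinct array elements the two routines must return the same index, so any disagreement points at a fault to debug.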
Slide 12: Test case design
- Involves designing the test cases (inputs and outputs) used to test the system.
- The goal of test case design is to create a set of tests that are effective in validation and defect testing.
- Design approaches: black-box testing (partition testing); white-box testing (structural testing, with criteria based on structures).
Slide 13: Black-box testing
Slide 14: Equivalence partitioning
[Figure labels: test data generation, random number generation, oracle, valid/invalid output]
Slide 15: Testing guidelines
- Testing guidelines are hints that help the testing team choose tests that will reveal defects in the system:
- Choose inputs that force the system to generate valid outputs.
- Choose inputs that force the system to generate faults (invalid outputs).
- Check whether valid inputs generate invalid outputs.
- Check whether invalid inputs generate valid outputs.
- Design inputs that cause buffers to overflow.
- Repeat the same input, or the same series of inputs, several times.
- Force computation results to be too large or too small.
Slide 16: Partition testing
- Input data and output results often fall into classes where all members of a class are related.
- Each of these classes is an equivalence partition (or domain) where the program behaves in an equivalent way for every class member.
- Test cases should be chosen from each partition.
Slide 17: Search routine - input partitions
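The partition figure itself is not reproduced in this transcript. The sketch below guesses at a plausible set of input partitions for a search routine, one test case per partition; the partition choices and the linear-search subject are assumptions, not the slide's actual content.

```c
#include <stddef.h>

/* Routine under test: plain linear search.
   Returns the index of key, or -1 if absent. */
int search(const int *a, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (a[i] == key) return (int)i;
    return -1;
}

/* Illustrative input partitions, one test case each:
   (1) single-element array, key present;
   (2) single-element array, key absent;
   (3) multi-element array, key is the first element;
   (4) multi-element array, key is the last element;
   (5) multi-element array, key is in the middle;
   (6) multi-element array, key not in the array. */
```

Choosing one case from each partition exercises the boundary behaviours (first, last, absent) where search routines most often fail.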
Slide 18: Sort routine - partitions
- Sequences (arrays of 1,000 elements):
  - sort an array with no elements in sorted order;
  - sort an array with all elements already sorted;
  - sort an array with many equal elements.
- Oracle (simple):
  - check that each element of the array is less than or equal to the next;
  - check how many elements are not correctly sorted;
  - check which input produces an incorrect output.
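The "simple" oracle described above can be sketched in a few lines; the function name is illustrative.

```c
#include <stddef.h>

/* Simple sort oracle: walk the array once and count the
   adjacent pairs that are out of order (a[i] > a[i+1]).
   A return value of 0 means the output is correctly
   sorted; a nonzero count says how many places violate
   the ordering. */
size_t count_unsorted(const int *a, size_t n) {
    size_t bad = 0;
    for (size_t i = 0; i + 1 < n; i++)
        if (a[i] > a[i + 1])
            bad++;
    return bad;
}
```

Note that this oracle only checks ordering; a fuller oracle would also verify that the output is a permutation of the input, since a buggy sort could drop or duplicate elements and still pass this check.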
Slide 19: Testing and debugging
- Defect testing and debugging are distinct processes.
- Verification and validation are concerned with establishing the existence of defects in a program.
- Debugging is concerned with locating and repairing those defects.
- Debugging involves formulating a hypothesis about program behaviour and then testing that hypothesis to find the system error.
Slide 20: The debugging process
Slide 21: Debugging
What can debuggers do?
- Run programs.
- Make the program stop at specified places or under specified conditions.
- Give information about current variable values, the memory, and the stack.
- Let you examine program execution step by step (stepping).
- Let you examine how the values of program variables change (tracing).
Note: to be able to debug your program, you must compile it with the -g option (which creates the symbol table), e.g. CC -g my_prog
Slide 22: GDB - Running Programs
- run (or r): creates an inferior process that runs your program.
- If there are no execution errors, the program finishes and its results are displayed.
- In case of an error, GDB shows the line the program stopped on and a short description of what it believes caused the error.
Slide 23: Test automation
- Testing is an expensive process phase. Testing workbenches provide a range of tools to reduce the time required and the total testing cost.
- Systems such as JUnit support the automatic execution of tests.
- Most testing workbenches are open systems because testing needs are organisation-specific.
- They are sometimes difficult to integrate with closed design and analysis workbenches.
Slide 24: A testing workbench
Slide 25: What do we need to improve testing?
- Run programs