Chapter 9, Testing.


Terminology
- Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
- Failure: Any deviation of the observed behavior from the specified behavior.
- Error: The system is in a state such that further processing by the system will lead to a failure.
- Fault (“Bug”): The mechanical or algorithmic cause of an error.
There are many different types of errors, and many different ways to deal with them.
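The fault/error/failure chain can be seen in a few lines of code. In this illustrative sketch (the function names are invented, not from the slides), an off-by-one loop bound is the algorithmic fault; after the first iteration the running total is in an erroneous state; the wrong return value is the observable failure.

```c
#include <assert.h>

/* Fault: the loop starts at i = 1, silently skipping scores[0].
 * Error: after the loop begins, `total` is in an erroneous state.
 * Failure: the caller observes a wrong sum. */
int sum_scores_faulty(const int scores[], int n) {
    int total = 0;
    for (int i = 1; i < n; i++)   /* fault: should start at i = 0 */
        total += scores[i];
    return total;
}

/* Corrected version: the fault is repaired, so no error state arises. */
int sum_scores_fixed(const int scores[], int n) {
    int total = 0;
    for (int i = 0; i < n; i++)
        total += scores[i];
    return total;
}
```

Note that the fault exists even when no failure is observed: calling the faulty function with an empty array returns the correct result.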

What is this? [Image omitted] An erroneous state (“error”)? An algorithmic fault? A mechanical fault?

How do we deal with Errors and Faults? Modular redundancy? Patching? Testing? Declaring the bug as a feature?

Examples of Faults and Errors
Faults in the interface specification:
- Mismatch between what the client needs and what the server offers
- Mismatch between requirements and implementation
Algorithmic faults:
- Missing initialization
- Branching errors (too soon, too late)
- Missing test for nil
Mechanical faults (very hard to find):
- Documentation does not match actual conditions or operating procedures
Errors:
- Stress or overload errors
- Capacity or boundary errors
- Timing errors
- Throughput or performance errors
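Two of the algorithmic faults above are common enough to sketch concretely. The function names here are illustrative, not from the slides: one pair shows a missing test for nil, the other shows how a missing initialization is repaired.

```c
#include <string.h>
#include <stddef.h>

/* Missing test for nil: unsafe_len crashes when s is NULL,
 * because strlen dereferences its argument unconditionally. */
size_t unsafe_len(const char *s) { return strlen(s); }

/* Corrected: guard the nil case before calling strlen. */
size_t safe_len(const char *s) { return s ? strlen(s) : 0; }

/* Missing initialization, repaired: `max` is seeded from the first
 * element instead of being read uninitialized (assumes n >= 1). */
int max_of(const int a[], int n) {
    int max = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > max)
            max = a[i];
    return max;
}
```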

How to Deal with Errors
Error prevention (fault avoidance), before the system is released:
- Use good programming methodology to reduce complexity
- Use version control to prevent inconsistent systems (configuration management)
- Apply verification to prevent algorithmic bugs
- Review code with an inspection team
Error detection (fault detection), while the system is running:
- Testing: create failures in a planned way
- Debugging: start from an unplanned failure
- Monitoring: deliver information about the system state; find performance bugs
Error recovery (fault tolerance), recovering from failures while the system is running:
- Database systems (atomic transactions)
- Modular redundancy
- Recovery blocks

Some Observations
- It is impossible to completely test any nontrivial module or system
  - Theoretical limitation: the halting problem
  - Practical limitation: prohibitive in time and cost
- Testing can only show the presence of bugs, not their absence (Dijkstra)
- It is often more effective to have an independent tester
  - Programmers often stick to the data set that makes the program work
  - A program often does not work when tried by somebody else; don’t let this be the end user!

Testing Activities
[Diagram omitted] Each subsystem’s code is unit-tested against the Requirements Analysis and System Design documents, yielding tested subsystems. Tested subsystems are combined and integration-tested, yielding integrated subsystems. The integrated subsystems undergo a functional test against the Requirements Analysis document and the user manual, yielding a functioning system. All of these tests are performed by the developer.

Testing Activities (continued)
[Diagram omitted] The functioning system undergoes a performance test against the global requirements (by the developer), yielding a validated system; then an acceptance test against the client’s understanding of the requirements (by the client), yielding an accepted system; and finally an installation test in the user environment, yielding a usable system and, ultimately, the system in use (tests, if any, by the user).

Quality Assurance Encompasses Testing
Usability testing:
- Scenario testing, prototype testing, product testing
Fault avoidance:
- Verification
- Configuration management
Fault detection:
- Reviews: walkthrough, inspection
- Debugging: correctness debugging, performance debugging
- Testing: component testing, integration testing, system testing
Fault tolerance:
- Atomic transactions
- Modular redundancy

Testing Concepts
- Component: a part of the system that can be isolated for testing
- Test case: a set of inputs and expected results that tries to cause failures and detect faults
- Black-box test: tests the input/output behavior of a component
- White-box test: tests the internal structure of a component
- Test stub: a partial implementation of a component on which the tested component depends
- Test driver: a partial implementation of a component that depends on the tested component
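The stub/driver relationship can be sketched in a few lines. In this hypothetical example (all names are invented for illustration), the component under test depends on a catalog subsystem; a test stub replaces that dependency with canned answers, and a test driver exercises the component and checks its results.

```c
/* Dependency of the component under test: normally provided by a
 * catalog subsystem, here replaced by a test stub (below). */
int price_of(int item_id);

/* Component under test: 10% discount on prices above 100. */
int discounted_price(int item_id) {
    int p = price_of(item_id);
    return p > 100 ? p - p / 10 : p;
}

/* Test stub: a partial implementation of the dependency,
 * returning canned prices instead of querying the real catalog. */
int price_of(int item_id) {
    return item_id == 1 ? 200 : 50;
}

/* Test driver: a partial implementation of a caller that exercises
 * the tested component; returns 1 if all checks pass. */
int run_driver(void) {
    return discounted_price(1) == 180   /* 200 - 20 */
        && discounted_price(2) == 50;   /* below threshold: unchanged */
}
```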

Testing Concepts (continued)
- Correction: a change made to a component to repair a fault. A correction can introduce new faults! Then what?
- Problem tracking: documentation of every failure, error, and fault; used with configuration management to find the new faults
- Regression testing: after a change, re-execute all prior tests to ensure that previously working functionality has not been affected
- Rationale maintenance: documentation of the rationale for each change; developers can avoid new faults by reviewing the assumptions and rationales of other developers

Component Testing
Unit testing:
- Tests individual subsystems
- Carried out by developers
- Goal: confirm that each subsystem is correctly coded and carries out the intended functionality
Integration testing:
- Tests groups of subsystems (collections of classes) and eventually the entire system
- Goal: test the interfaces among the subsystems

System Testing
System testing:
- Tests the entire system
- Carried out by developers
- Goal: determine whether the system meets the requirements (functional and global)
Acceptance testing:
- Evaluates the system delivered by the developers
- Carried out by the client; may involve executing typical transactions on site on a trial basis
- Goal: demonstrate that the system meets customer requirements and is ready to use
Implementation (coding) and testing go hand in hand.

Unit Testing
Informal:
- Incremental coding
Static analysis:
- Hand execution: reading the source code
- Walk-through (informal presentation to others)
- Code inspection (formal presentation to others)
- Automated tools checking for syntactic and semantic errors, and for departures from coding standards
Dynamic analysis:
- Black-box testing (test the input/output behavior)
- White-box testing (test the internal logic of the subsystem or object)
- Data-structure-based testing (data types determine test cases)

Black-box Testing
Focus: input/output behavior. If, for any given input, we can predict the output, then the module passes the test.
It is almost always impossible to generate all possible inputs (“test cases”), so we need to reduce the number of test cases:
Equivalence testing:
- Divide input conditions into equivalence classes
- Choose test cases for each equivalence class (example: if all negative inputs should behave the same, just test one negative input)
Boundary testing:
- A special case of equivalence testing: select elements from the “edges” of the equivalence classes
- Developers often overlook special cases at the boundary (example: use of >= instead of >)
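A minimal sketch of equivalence and boundary testing, using an invented pass/fail predicate with its boundary at 60 (the function and threshold are illustrative, not from the slides). One representative is picked from each equivalence class, plus the two values straddling the boundary, which would catch the classic `>` vs. `>=` fault.

```c
/* Component under test: pass/fail classification with boundary at 60. */
int passes(int score) {
    return score >= 60;     /* the boundary fault would be using > here */
}

/* Equivalence classes: failing scores (< 60) and passing scores (>= 60).
 * The suite tests one representative per class plus both boundary values. */
int boundary_suite(void) {
    return !passes(30)      /* representative of the failing class */
        &&  passes(90)      /* representative of the passing class */
        && !passes(59)      /* just below the boundary */
        &&  passes(60);     /* exactly on the boundary */
}
```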

White-box Testing
Focus: thoroughness (coverage). Every statement in the component is executed at least once.
Path testing:
- Exercise all possible paths through a piece of code
- Start with the flow graph, and enumerate all possible paths (including all choices at branch statements and loops)
- Limitation: does not always detect things that are missing
State-based testing:
- Focuses more on object-oriented systems
- Derive test cases with a start state, transition stimuli, and an expected end state
- Difficulty: sequences must put the system into the desired start state before the transitions are tested
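State-based test cases have the shape (start state, stimulus, expected end state). A hypothetical two-state component makes this concrete; the door model and stimulus characters below are invented for illustration.

```c
/* A simple two-state component for state-based testing. */
typedef enum { CLOSED, OPEN } DoorState;

/* Transition function: 'o' opens a closed door, 'c' closes an open
 * door; any other stimulus leaves the state unchanged. */
DoorState handle(DoorState s, char stimulus) {
    if (s == CLOSED && stimulus == 'o') return OPEN;
    if (s == OPEN   && stimulus == 'c') return CLOSED;
    return s;
}
```

Each test case then fixes a start state, applies one stimulus, and compares the result against the expected end state; longer stimulus sequences are needed when the desired start state is not directly constructible.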

White-box Testing Example: Determining the Paths

void FindMean(FILE *ScoreFile)
{
    float SumOfScores = 0.0;
    int NumberOfScores = 0;
    float Mean = 0.0;
    float Score;

    Read(ScoreFile, Score);
    while (!EOF(ScoreFile)) {
        if (Score > 0.0) {
            SumOfScores = SumOfScores + Score;
            NumberOfScores++;
        }
        Read(ScoreFile, Score);
    }

    /* Compute the mean and print the result */
    if (NumberOfScores > 0) {
        Mean = SumOfScores / NumberOfScores;
        printf("The mean score is %f\n", Mean);
    } else
        printf("No scores found in file\n");
}
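The paths through this example can be exercised by a small driver. The sketch below uses a compilable variant of the slide's function (renamed `find_mean`, with the pseudocode Read/EOF replaced by `fscanf` and the mean returned so it can be checked; `file_with` is an invented helper, and error handling for `tmpfile` is omitted). Each test case forces a different path: loop taken with positive scores, loop never entered, and loop taken with only non-positive scores.

```c
#include <stdio.h>

/* Compilable variant of FindMean: returns the mean of the positive
 * scores in the file, or 0.0 if there are none. */
float find_mean(FILE *score_file) {
    float sum = 0.0f, score;
    int count = 0;
    while (fscanf(score_file, "%f", &score) == 1) {
        if (score > 0.0f) {     /* branch node in the flow graph */
            sum += score;
            count++;
        }
    }
    return count > 0 ? sum / count : 0.0f;
}

/* Helper: build an in-memory score file for one test path. */
FILE *file_with(const char *text) {
    FILE *f = tmpfile();        /* error handling omitted for brevity */
    fputs(text, f);
    rewind(f);
    return f;
}
```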

Constructing the Logic Flow Diagram
[Diagram omitted: the flow graph of FindMean, with a node for each statement and an edge for each possible transfer of control]

Comparison of White-box and Black-box Testing
White-box testing:
- Often tests what is done, instead of what should be done
- Cannot detect missing use cases
Black-box testing:
- Potential combinatorial explosion of test cases (valid and invalid data)
- Often not clear whether the selected test cases uncover a particular error
- Does not discover extraneous use cases (“features”)
Both types of testing are needed; white-box and black-box testing are the extreme ends of a testing continuum.