Software Testing. SE, Testing, Hans van Vliet, ©2008


Software Testing

Nasty question  Suppose you are asked to lead the team that tests the software controlling a new ATM. Would you take that job?  What if the contract says you will be charged with the maximum punishment in case a patient dies because of a malfunction of the software?

State-of-the-art  A substantial number of errors are made per 1,000 lines of source code; even extensively tested software still contains errors  Testing is often postponed; as a consequence, the later an error is discovered, the more it costs to fix it  Error distribution: 60% design, 40% implementation; 66% of the design errors are not discovered until the software has become operational

Relative cost of error correction  [Figure: correction cost rises steeply across the phases requirements engineering, design, code, test, and operation]

Lessons  Many errors are made in the early phases  These errors are discovered late  Repairing those errors is costly

How then to proceed?  Exhaustive testing is most often not feasible  Random statistical testing does not work either if you want to find errors  Therefore, we look for systematic ways to proceed during testing

Classification of testing techniques  Classification based on the criterion to measure the adequacy of a set of test cases:  coverage-based testing  fault-based testing  error-based testing  Classification based on the source of information to derive test cases:  black-box testing (functional, specification-based)  white-box testing (structural, program-based)

Some preliminary questions  What exactly is an error?  What does the testing process look like?  When is test technique A superior to test technique B?  What do we want to achieve during testing?  When do we stop testing?

Error, fault, failure  an error is a human activity resulting in software containing a fault  a fault is the manifestation of an error  a fault may result in a failure

What exactly is a failure?  Failure is a relative notion: e.g., a failure w.r.t. the specification document  Verification: evaluate a product to see whether it satisfies the conditions specified at the start: have we built the system right?  Validation: evaluate a product to see whether it does what we think it should do: have we built the right system?

Point to ponder: Maiden flight of Ariane 5

What is our goal during testing?  Objective 1: find as many faults as possible  Objective 2: make you feel confident that the software works OK

Test documentation (IEEE 829)  Test plan  Test design specification  Test case specification  Test procedure specification  Test item transmittal report  Test log  Test incident report  Test summary report

Software Testing Methods  [Diagram: testing strategies divide into white-box methods and black-box methods]

Test Case Design Strategies  Black-box or behavioral testing (knowing the specified function a product is to perform and demonstrating correct operation based solely on its specification without regard for its internal logic)  White-box or glass-box testing (knowing the internal workings of a product, tests are performed to check the workings of all independent logic paths)

White-Box Testing... our goal is to ensure that all statements and conditions have been executed at least once...

Why Cover? logic errors and incorrect assumptions are inversely proportional to a path's execution probability we often believe that a path is not likely to be executed; in fact, reality is often counterintuitive typographical errors are random; it's likely that untested paths will contain some

Basis Path Testing  White-box technique usually based on the program flow graph

First, we compute the cyclomatic complexity: the number of enclosed areas + 1. In this case, V(G) = 4.

Next, we derive the independent paths. Since V(G) = 4, there are four paths:
Path 1: 1,2,3,6,7,8
Path 2: 1,2,3,5,7,8
Path 3: 1,2,4,7,8
Path 4: 1,2,4,7,2,4,...,7,8
Finally, we derive test cases to exercise these paths.
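As a sketch of the computation above: cyclomatic complexity can equivalently be computed as V(G) = E - N + 2 for a connected flow graph. The node and edge lists below are an assumption reconstructed from the four paths listed on this slide, not taken verbatim from the original flow chart.

```python
# Sketch: cyclomatic complexity via V(G) = E - N + 2.
# Nodes/edges are reconstructed from the listed paths (an assumption).

def cyclomatic_complexity(nodes, edges):
    """V(G) = E - N + 2 for a single connected flow graph."""
    return len(edges) - len(nodes) + 2

nodes = [1, 2, 3, 4, 5, 6, 7, 8]
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (3, 6),
         (5, 7), (6, 7), (4, 7), (7, 2), (7, 8)]  # (7, 2) is the back edge

print(cyclomatic_complexity(nodes, edges))  # 4, matching the slide
```

With 10 edges and 8 nodes, V(G) = 10 - 8 + 2 = 4, which agrees with the enclosed-areas-plus-one count.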

Basis Path Testing Notes you don't need a flow chart, but the picture will help when you trace program paths count each simple logical test; compound tests count as 2 or more basis path testing should be applied to critical modules

Control Structure Testing  White-box techniques focusing on the control structures present in the software  Condition testing (e.g. branch testing) focuses on testing each decision statement in a software module; it is important to ensure coverage of all logical combinations of data that may be processed by the module (a truth table may be helpful)
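The truth-table idea can be sketched as follows; `decision` is a made-up compound condition used purely for illustration, not one from the slides.

```python
# Sketch: enumerating all logical combinations of a compound decision,
# as condition testing with a truth table would.  `decision` is a
# hypothetical example predicate.
from itertools import product

def decision(a, b, c):
    return (a and b) or c  # example compound condition

# Exercise every combination of the three operands.
for a, b, c in product([False, True], repeat=3):
    print(a, b, c, "->", decision(a, b, c))
```

Eight test cases cover the full truth table; branch testing alone would need only two of them, which is why condition testing is the stronger criterion.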

 Data flow testing selects test paths according to the locations of variable definitions and uses in the program (e.g. definition-use chains)  Loop testing focuses on the validity of program loop constructs (i.e. simple loops, concatenated loops, nested loops, unstructured loops) and involves checking that loops start and stop when they are supposed to (unstructured loops should be redesigned whenever possible)

Loop Testing  [Figure: the four loop classes: simple loops, nested loops, concatenated loops, unstructured loops]
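For a simple loop, the usual recipe is to run it zero times, once, twice, a typical number of times, and at one below, at, and one above its maximum. A minimal sketch, where `sum_first` is a hypothetical loop under test (an assumption, not code from the slides):

```python
# Sketch: simple-loop testing.  `sum_first` is a made-up function whose
# loop runs at most len(values) times.

def sum_first(values, n):
    """Sum the first n items of values (the loop under test)."""
    total = 0
    for i in range(min(n, len(values))):
        total += values[i]
    return total

data = list(range(1, 11))  # 1..10, so the loop's maximum is 10 passes
# 0, 1, 2 passes, a typical count, then n-1, n, and n+1 passes:
for passes in (0, 1, 2, 5, 9, 10, 11):
    print(passes, sum_first(data, passes))
```

The n+1 case (11 here) probes the off-by-one boundary: a correct loop must not read past the end of the data.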

Graph-based Testing Methods  Black-box methods based on the nature of the relationships (links) among the program objects (nodes), test cases are designed to traverse the entire graph  Transaction flow testing (nodes represent steps in some transaction and links represent logical connections between steps that need to be validated)

Equivalence Partitioning  Black-box technique that divides the input domain into classes of data from which test cases can be derived  An ideal test case single-handedly uncovers a class of errors that might otherwise require many arbitrary test cases to be executed before the general error is observed
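A minimal sketch of the idea: partition the input domain of a hypothetical age validator into classes and test one representative per class. Both `is_valid_age` and the chosen class representatives are illustrative assumptions.

```python
# Sketch: equivalence partitioning for a hypothetical validator that
# accepts integer ages in [0, 120].

def is_valid_age(age):
    return isinstance(age, int) and 0 <= age <= 120

# One representative value per equivalence class (values are arbitrary
# members of their class; any member should behave the same).
classes = {
    "valid in-range":  35,
    "below range":     -5,
    "above range":    200,
    "wrong type":  "forty",
}
for name, value in classes.items():
    print(name, "->", is_valid_age(value))
```

Four test cases stand in for the entire input domain: if the validator mishandles one member of a class, it should mishandle them all.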

Boundary Value Analysis  Black-box technique that focuses on the boundaries of the input domain rather than its center  BVA guidelines:
1. If an input condition specifies a range bounded by values a and b, test cases should include a and b, and values just above and just below a and b
2. If an input condition specifies a number of values, test cases should exercise the minimum and maximum numbers, as well as values just above and just below them
3. Apply guidelines 1 and 2 to output conditions; test cases should be designed to produce the minimum and maximum output reports
4. If internal program data structures have boundaries (e.g. size limitations), be certain to test the boundaries
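Guideline 1 can be sketched as a small generator of boundary test inputs; `boundary_values` is a hypothetical helper, not part of any slide or library.

```python
# Sketch: BVA guideline 1 -- for a range [a, b], test a, b, and the
# values just below and just above each boundary.

def boundary_values(a, b):
    """Boundary-value test inputs for an integer range [a, b]."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

Compare with equivalence partitioning above: BVA picks the class representatives at the edges of each class, where experience shows errors cluster.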

Comparison Testing  Black-box testing for safety critical systems in which independently developed implementations of redundant systems are tested for conformance to specifications  Often equivalence class partitioning is used to develop a common set of test cases for each implementation

Summary  Test as early as possible  Testing is a continuous process  Design with testability in mind  Test activities must be carefully planned, controlled and documented  No single reliability model performs best consistently