Testing Overview
CS 4311
References: Frank Tsui, Orlando Karam, and Barbara Bernal, Essentials of Software Engineering, 3rd edition, Jones & Bartlett Learning, selected sections. Hans van Vliet, Software Engineering: Principles and Practice, 3rd edition, John Wiley & Sons, Chapter 13.

Outline
- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques

Verification and Validation (V&V)
The textbook uses the term "testing" in a general/wider sense to mean V&V.
Q: What is V&V in software testing?
- Groups of 2
- What? Why? Who? Against what? When? How?
- 5 minutes

What is V&V?
The terms are used differently by different people, e.g., formal vs. informal and static vs. dynamic.
Verification
- Evaluation of an object to demonstrate that it meets its specification. ("Did we build the system right?")
- Evaluation of the work product of a development phase to determine whether it satisfies the conditions imposed at the start of that phase.
Validation
- Evaluation of an object to demonstrate that it meets the customer's expectations. ("Did we build the right system?")

V&V and the Software Lifecycle
V&V activities run throughout the software lifecycle, e.g., as organized in the V-model.

Requirement Engineering
- Determine the general test strategy/plan (techniques, criteria, team)
- Test the requirements specification for:
  - Completeness
  - Consistency
  - Feasibility (functional and performance requirements)
  - Testability (specific, unambiguous, quantitative, traceable)
- Generate acceptance/validation testing data

Design
- Determine the system and integration test strategy
- Assess/test the design for:
  - Completeness
  - Consistency
  - Handling of scenarios
  - Traceability (to and from requirements)
- Design walkthroughs and inspections

Implementation and Maintenance
Implementation
- Determine the unit test strategy
- Techniques (static vs. dynamic)
- Tools and supporting scaffolding (drivers/harnesses, stubs)
Maintenance
- Determine the regression test strategy
- Documentation maintenance (vital!)
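The driver/stub scaffolding mentioned above can be sketched in a few lines. This is a hypothetical example; `compute_total` and `tax_rate_stub` are names invented here for illustration:

```python
# Hypothetical sketch: unit-testing compute_total() in isolation.
# A *stub* stands in for a collaborator that is not yet written;
# the *driver* is the code that feeds inputs to the unit and
# checks its outputs.

def tax_rate_stub(region):
    # Canned answer instead of a real tax-table lookup.
    return 0.10

def compute_total(price, region, rate_lookup):
    return price * (1 + rate_lookup(region))

# Driver: exercise the unit with the stub in place.
result = compute_total(100.0, "TX", tax_rate_stub)
assert abs(result - 110.0) < 1e-9
```

Because the stub always returns the same canned value, a test failure here can only implicate the unit under test, not its collaborators.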

Hierarchy of V&V Techniques
[Figure: a taxonomy of V&V techniques.]
- Dynamic techniques: testing (this is "testing" in the narrow sense), symbolic execution
- Static techniques:
  - Informal analysis: reading, inspection, walkthrough
  - Formal analysis: model checking, static analysis, proof
The static and dynamic families are complementary.

Definitions of V&V Terms
"Correct" program and specification
- The program matches its specification
- The specification matches the client's intent
Error (a.k.a. mistake)
- A human activity that leads to the creation of a fault
- A human error results in a fault, which may, at runtime, result in a failure
Fault (a.k.a. bug)
- The physical manifestation of an error that may result in a failure
- A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain
Failure (a.k.a. symptom, problem, incident)
- Observable misbehavior: the actual output does not match the expected output
- Can only happen when the software is being used
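The error → fault → failure chain can be shown with a small invented example: the programmer's mistake (the error) leaves an off-by-one bug (the fault) in the code, which misbehaves (the failure) only for some inputs:

```python
def max_of(xs):
    """Return the largest element of a non-empty list."""
    m = xs[0]
    # Fault: the loop stops one element early; the error was the
    # programmer writing len(xs) - 1 where len(xs) was intended.
    for i in range(1, len(xs) - 1):
        if xs[i] > m:
            m = xs[i]
    return m

print(max_of([3, 1, 2]))  # 3 -- the fault is present, but no failure occurs
print(max_of([1, 2, 3]))  # 2 -- failure: the expected output is 3
```

The fault exists on every run, but a failure is observable only when the maximum happens to sit in the last position.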

Definitions
Fault identification and correction
- Fault identification: the process of determining what fault caused a failure
- Fault correction: the process of changing the system to remove the fault
Debugging
- The act of finding and fixing program errors
Testing
- The act of designing, debugging, and executing tests
Test case and test set
- Test case: a particular set of inputs and the expected output
- Test set: a finite set of test cases working together toward the same purpose
Test oracle
- Any means used to predict the outcome of a test
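In code, a test case is just an (input, expected output) pair, and the expected value plays the role of the oracle. A tiny illustrative test set for Python's built-in abs:

```python
# A test set: test cases working together for one purpose,
# here checking abs() on negative, zero, and positive inputs.
test_set = [
    (-3, 3),   # (input, expected output)
    (0, 0),
    (5, 5),
]

for x, expected in test_set:
    actual = abs(x)
    # The expected value acts as the oracle for this test case.
    assert actual == expected, f"abs({x}) = {actual}, expected {expected}"
print("all test cases passed")
```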

Where Do the Errors Come From?
Q: What kinds of errors? Who makes them?
- Groups of 2
- 3 minutes

Where Do the Errors Come From?
Kinds of errors:
- Missing information
- Wrong information/design/implementation
- Extra information
Facts about errors:
- To err is human (but different people have different error rates).
- Studies indicate 30 to 85 errors per 1,000 lines of code; after extensive testing, 0.5 to 3 errors per 1,000 lines remain.
- The longer an error goes undetected, the more costly it is to correct.

Types of Faults
List all the types and causes of faults: what can go wrong in the development process?
- Groups of 2
- 3 minutes

Sample Types of Faults
- Algorithmic: the algorithm or logic does not produce the proper output for the given input
- Syntax: improper use of language constructs
- Computation (precision): a formula is implemented incorrectly, or the result is not computed to the correct degree of accuracy
- Documentation: the documentation does not match what the program does
- Stress (overload): data structures are filled past capacity
- Capacity: the system's performance becomes unacceptable as activity reaches its specified limit
- Timing: the code coordinating events is inadequate
- Throughput: the system does not perform at the required speed
- Recovery: a failure is encountered and the system does not behave as required

Sample Causes of Faults
- Requirements: incorrect or missing requirements
- System design: incorrect translation of the requirements
- Program design: incorrect design specification
- Program implementation: incorrect design interpretation, incorrect semantics, incorrect documentation
- Unit and system testing: incomplete testing, new faults introduced while correcting others

Sample V&V Techniques
Applied across the lifecycle (requirements, design, implementation, testing, operation, maintenance):
- Reviews: walkthroughs/inspections
- Synthesis
- Model checking
- Correctness proofs
- Runtime monitoring

Outline
- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques

Question
How do you know your software works correctly?
Answer: Try it.
Example: I have a function, say f, of one integer input. I tried f(6). It returned 35.
- Is my program correct?
- My function is supposed to compute x*6-1. Is it correct?
- Groups of 2, 1 minute
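The point of the exercise: a single passing test does not establish correctness. A hypothetical buggy implementation (invented here) can agree with the specification x*6-1 on the one input tried:

```python
def f_spec(x):
    return x * 6 - 1      # the intended function

def f_buggy(x):
    return x * x - 1      # a wrong formula that also maps 6 to 35

print(f_spec(6), f_buggy(6))   # 35 35 -- the test f(6) cannot tell them apart
print(f_spec(7), f_buggy(7))   # 41 48 -- a second input exposes the difference
```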

Goals of Testing
I want to show that my program is correct; i.e., that it produces the right answer for every input.
Q: Can we write tests to show this?
- Groups of 2
- 1 minute

Goals of Testing
Can we prove a program is correct by testing? Yes, but only if we can test it exhaustively: every combination of inputs in every environment.

How Long Will It Take?
Consider X+Y for 32-bit integers.
- How many test cases are required?
- How long will it take at 1 test per second? At 1,000 tests per second? At 1,000,000 tests per second?
- Groups of 2, 1 minute

How Long?
Consider X+Y for 32-bit integers.
How many test cases are required?
- 2^32 * 2^32 = 2^64 ≈ 10^19
- (The universe is about 4*10^17 seconds old.)
How long will it take?
- 1 test per second: 580,000,000,000 years
- 1,000 tests per second: 580,000,000 years
- 1,000,000 tests per second: 580,000 years
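The arithmetic on this slide is easy to reproduce:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Exhaustive testing of X+Y over 32-bit integers:
cases = 2**32 * 2**32          # = 2**64, about 1.8e19 test cases

for rate in (1, 1_000, 1_000_000):
    years = cases / rate / SECONDS_PER_YEAR
    print(f"{rate:>9} tests/s: about {years:,.0f} years")
```

At one test per second this comes to roughly 5.8 * 10^11 years, which rounds to the 580 billion years quoted above.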

Another Example
[Figure: a flow graph with nodes A, B, and C; a loop returns to A, and each pass through the loop body offers 5 distinct paths.]
We want to count the number of paths. The maximum number of iterations of the loop is 20. How many paths are there?

Another Example
Suppose the loop does not repeat: only one pass executes, giving 5 distinct paths.

Another Example
Suppose the loop repeats exactly once: 5*5 = 25 distinct paths.
If it repeats at most once: 5 + 5*5 paths.

Another Example
What if it repeats exactly n times? 5^n paths.

Another Example
What if it repeats at most n times? ∑5^k = 5^n + 5^(n-1) + … + 5 paths.
For n = 20 the sum is about 1.2 * 10^14 paths, which takes about 3.8 years at 1,000,000 tests per second.
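The path count above can be checked directly:

```python
# Paths through a loop whose body offers 5 distinct paths,
# executed between 1 and 20 times.
paths = sum(5**k for k in range(1, 21))   # 5 + 5**2 + ... + 5**20

SECONDS_PER_YEAR = 60 * 60 * 24 * 365
years = paths / 1_000_000 / SECONDS_PER_YEAR
print(f"{paths:.2e} paths, about {years:.1f} years at 1e6 tests/s")
```

Even this modest flow graph defeats exhaustive path testing; a geometric series of per-iteration choices dominates everything else.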

Yet Another Example
Consider testing a Java compiler. How many inputs would be needed to test every possible input program?

Limits of Testing
You can't test a program completely:
- You can't test all valid inputs.
- You can't test all invalid inputs.
- You can't test all edited inputs.
- You can't test in every environment.
- You can't test all variations on timing.
- You can't even test every path. (A path is the sequence of statements executed from start to finish.)

Why Bother?
Testing cannot show the absence of faults, but it can show their presence!

Goals of Testing
Identify errors:
- Make errors repeatable (when do they occur?)
- Localize errors (where are they?)
The purpose of testing is to find problems in programs so that they can be fixed.

Cost of Testing
Testing accounts for between 30% and 90% of the total cost of software. Microsoft employs one tester for each developer.
We want to reduce the cost:
- Increase test efficiency (#defects found per test)
- Reduce the number of tests
- Find more defects
How? Organize!

A Good Test:
- Has a reasonable probability of catching an error
- Is not redundant
- Is neither too simple nor too complex
- Reveals a problem; a test that doesn't reveal a problem is itself a failure

Outline
- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques

Levels of Software Testing
- Unit/Component testing
  - Verify the implementation of each software element
  - Trace each test to the detailed design
- Integration testing
  - Combine software units and test until the entire system has been integrated
  - Trace each test to the high-level design
- System testing
  - Test the integration of hardware and software
  - Ensure the software, as a complete entity, complies with its operational requirements
  - Trace each test to the system requirements
- Acceptance testing
  - Determine whether test results satisfy the acceptance criteria of the project stakeholders
  - Trace each test to stakeholder requirements
- Installation testing
  - Perform testing with the application installed on its target platform
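The first two levels can be contrasted in a tiny invented system made of a parser and a calculator (all names here are hypothetical, for illustration only):

```python
def parse(text):
    """Unit 1: split '2 + 3' into (2, '+', 3)."""
    a, op, b = text.split()
    return int(a), op, int(b)

def apply_op(a, op, b):
    """Unit 2: apply a binary operator."""
    return a + b if op == "+" else a - b

def evaluate(text):
    """The integrated behavior: parse, then compute."""
    a, op, b = parse(text)
    return apply_op(a, op, b)

# Unit tests: each element exercised in isolation,
# traced to its detailed design.
assert parse("2 + 3") == (2, "+", 3)
assert apply_op(2, "+", 3) == 5

# Integration test: the combined units working together,
# traced to the high-level design.
assert evaluate("7 - 5") == 2
```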

Testing Phases: V-Model
[Figure: the V-model. The left side descends through requirements specification, system specification, system design, and detailed design; the bottom is unit code and test; the right side ascends through sub-system integration testing and system integration testing to acceptance testing and service. Each left-side level feeds a test plan for the corresponding right-side level: the acceptance test plan from the requirements specification, the system integration test plan from the system specification, and the sub-system integration test plan from the system design.]

Hierarchy of Testing
[Figure: a taxonomy of testing.]
- Program testing
  - Unit testing
    - Black box: equivalence, boundary, decision table, state transition, use case, domain analysis
    - White box: control flow, data flow
  - Integration testing: top down, bottom up, big bang, sandwich
- System testing: function, performance, reliability, availability, properties, security, usability, documentation, portability, capacity
- Acceptance testing: benchmark, pilot, alpha, beta, ad hoc