Chapter 11, Testing.

Outline of the 3 Lectures on Testing
Today: Terminology; Testing activities; Unit testing
Next lecture: Integration testing; Testing strategy; Design patterns & testing; System testing; Function testing; Acceptance testing
Third lecture: Model-based testing; U2TP Testing Profile

Famous Problems
F-16: crossing the equator using the autopilot. Result: the plane flipped over. Reason: reuse of autopilot software
The Therac-25 accidents (1985-1987), quite possibly the most serious non-military computer-related failure ever in terms of human life (at least five died). Reason: bad event handling in the GUI
NASA Mars Climate Orbiter destroyed due to incorrect orbit insertion (September 23, 1999). Reason: unit conversion problem

Terminology
Failure: Any deviation of the observed behavior from the specified behavior
Erroneous state (error): The system is in a state such that further processing by the system can lead to a failure
Fault: The mechanical or algorithmic cause of an error ("bug")
Validation: Activity of checking for deviations between the observed behavior of a system and its specification
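These three terms can be made concrete with a minimal Java sketch (the class and method names are invented for illustration): a fault in the code produces an erroneous state, which surfaces as a failure when the observed output deviates from the specified one.

```java
// Hypothetical example: one fault, one error, one failure.
public class Terminology {

    // Specified behavior: return the arithmetic mean of the values.
    // Fault: the divisor should be values.length, not values.length - 1.
    static double average(double[] values) {
        double sum = 0;
        for (double v : values) {
            sum += v;
        }
        return sum / (values.length - 1); // <- the fault ("bug")
    }

    public static void main(String[] args) {
        // The returned value 6.0 is an erroneous state: further
        // processing based on it can lead to a failure.
        double result = average(new double[] {2.0, 4.0});

        // Failure: the observed output (6.0) deviates from the
        // specified behavior (3.0).
        System.out.println(result);
    }
}
```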

What is this? A failure? An error? A fault? We need to describe specified and desired behavior first!

Erroneous State (“Error”)

Algorithmic Fault

Mechanical Fault

F-16 Bug What is the failure? What is the error? What is the fault? Fault: bad use of implementation inheritance; a Plane is not a Rocket.

Examples of Faults and Errors
Faults in the interface specification: Mismatch between what the client needs and what the server offers; Mismatch between requirements and implementation
Algorithmic faults: Missing initialization; Incorrect branching condition; Missing test for null
Mechanical faults (very hard to find): Operating temperature outside of equipment specification
Errors: Null reference errors; Concurrency errors; Exceptions

How do we deal with Errors, Failures and Faults?

Modular Redundancy

Declaring the Bug as a Feature

Patching

Testing

Another View on How to Deal with Faults
Fault avoidance (before the system is released): Use a development methodology to reduce complexity; Use configuration management to prevent inconsistency; Apply verification to prevent algorithmic faults; Use reviews
Fault detection (while the system is running): Testing: activity to provoke failures in a planned way; Debugging: find and remove the cause (the fault) of an observed failure; Monitoring: deliver information about the state, used during debugging and to find performance bugs
Fault tolerance (recover from failures once the system is released): Exception handling; Atomic transactions (database systems); Modular redundancy

Taxonomy for Fault Handling Techniques
Fault avoidance: Methodology; Configuration management; Verification
Fault detection: Testing (unit testing, integration testing, system testing); Debugging
Fault tolerance: Atomic transactions; Modular redundancy

Observations
It is impossible to completely test any nontrivial module or system
Practical limitations: complete testing is prohibitive in time and cost
Theoretical limitations: e.g. the halting problem
"Testing can only show the presence of bugs, not their absence" (Dijkstra)
Testing is not free => define your goals and priorities

Testing takes creativity
To develop an effective test, one must have: Detailed understanding of the system; Application and solution domain knowledge; Knowledge of the testing techniques; Skill to apply these techniques in an effective and efficient manner
Testing is done best by independent testers: We often develop a certain mental attitude that the program should behave in a certain way when in fact it does not; Programmers often stick to the data set that makes the program work ("Don't mess up my code!"); A program often does not work when tried by somebody else ("Every new user uncovers a new class of bugs")
Testing is often viewed as dirty work

Testing Activities
Unit testing: tested against the Object Design Document (carried out by the developer)
Integration testing: tested against the System Design Document (developer)
System testing: tested against the Requirements Analysis Document (developer)
Acceptance testing: tested against the client's expectations (client)

Types of Testing
Unit testing: Individual component (class or subsystem); Carried out by developers; Goal: confirm that the component or subsystem is correctly coded and carries out the intended functionality
Integration testing: Groups of subsystems (collections of subsystems) and eventually the entire system; Goal: test the interfaces among the subsystems

Types of Testing continued...
System testing: The entire system; Carried out by developers; Goal: determine if the system meets the requirements (functional and nonfunctional)
Acceptance testing: Evaluates the system delivered by developers; Carried out by the client, may involve executing typical transactions on site on a trial basis; Goal: demonstrate that the system meets the requirements and is ready to use

When should you write a test?
Traditionally: after the source code is written
In XP: before the source code is written
Test-Driven Development cycle: 1. Add a test 2. Run the automated tests => see the new one fail 3. Write some code => see the tests succeed 4. Refactor the code
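One pass through this cycle can be sketched as follows. Plain assertions stand in for a JUnit test runner so the sketch is self-contained, and the Stack class is a made-up unit written only to make the test pass.

```java
import java.util.ArrayList;
import java.util.List;

// A sketch of one test-driven development cycle.
public class TddCycle {

    // Step 3: the minimal code, written AFTER the test below,
    // just enough to make the test succeed.
    static class Stack {
        private final List<Integer> items = new ArrayList<>();
        void push(int x) { items.add(x); }
        int pop() { return items.remove(items.size() - 1); }
        int size() { return items.size(); }
    }

    public static void main(String[] args) {
        // Steps 1-2: this test was written first; running it before
        // Stack existed made it fail (it did not even compile).
        Stack s = new Stack();
        s.push(42);
        if (s.pop() != 42 || s.size() != 0) {
            throw new AssertionError("Stack test failed");
        }
        System.out.println("Stack test passed");
        // Step 4: refactor the code, then re-run the test.
    }
}
```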

Static Testing (at compile time)
Static analysis: Manual execution (reading the source code); Automated tools checking for syntactic and semantic errors and for departures from coding standards
Review: Walk-through (informal presentation to others); Code inspection (formal presentation to others)
Dynamic Testing (at run time)
Black-box testing (test the input/output behavior)
White-box testing (test the internal logic of the subsystem or object)
Data-structure based testing (data types determine the test cases)

Static Analysis with Eclipse
Compiler warnings and errors: Possibly uninitialized variable; Undocumented empty block; Assignment has no effect
Checkstyle: Check for code guideline violations http://checkstyle.sourceforge.net
FindBugs: Check for code anomalies http://findbugs.sourceforge.net
Metrics: Check for structural anomalies http://metrics.sourceforge.net

Black-box testing
Focus: input/output behavior. If for any given input we can predict the output, then the component passes the test
Requires a test oracle
It is almost always impossible to generate all possible inputs ("test cases")
Goal: reduce the number of test cases by equivalence partitioning: Divide input conditions into equivalence classes; Choose test cases for each equivalence class (example: if an object is supposed to accept a negative number, testing one negative number is enough)

Black-box testing: Test case selection
a) Input is valid across a range of values. The developer selects test cases from 3 equivalence classes: Below the range; Within the range; Above the range
b) Input is only valid if it is a member of a discrete set. The developer selects test cases from 2 equivalence classes: Valid discrete values; Invalid discrete values
No rules, only guidelines

Black-box testing: An example

public class MyCalendar {
    public int getNumDaysInMonth(int month, int year)
            throws InvalidMonthException { … }
}

Representation for month: 1: January, 2: February, ..., 12: December
Representation for year: 1904, ..., 1999, 2000, ..., 2006, ...
How many test cases do we need for the black-box testing of getNumDaysInMonth()? It depends on the calendar (Gregorian):
Month parameter equivalence classes: 30-day months; 31-day months; February; no such month (0, 13, -1)
Year parameter equivalence classes: normal year; leap year (divisible by 4, by 100, by 400); before Christ / after Christ (e.g. year -200); before, at and after 2000
=> 12 test cases
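The equivalence classes above can each be exercised with one representative value. The implementation of getNumDaysInMonth() below is an assumed one, supplied here only so the test cases can run; the slide specifies just the signature.

```java
// Hypothetical implementation plus one test case per equivalence class.
public class MyCalendar {

    static class InvalidMonthException extends RuntimeException {
        InvalidMonthException(int month) { super("invalid month: " + month); }
    }

    public int getNumDaysInMonth(int month, int year) {
        if (month < 1 || month > 12) throw new InvalidMonthException(month);
        switch (month) {
            case 4: case 6: case 9: case 11:
                return 30;                          // 30-day class
            case 2:                                 // February class
                boolean leap = (year % 4 == 0 && year % 100 != 0)
                        || year % 400 == 0;
                return leap ? 29 : 28;
            default:
                return 31;                          // 31-day class
        }
    }

    public static void main(String[] args) {
        MyCalendar c = new MyCalendar();
        System.out.println(c.getNumDaysInMonth(6, 1901));  // 30-day month: 30
        System.out.println(c.getNumDaysInMonth(1, 1901));  // 31-day month: 31
        System.out.println(c.getNumDaysInMonth(2, 1901));  // Feb, normal year: 28
        System.out.println(c.getNumDaysInMonth(2, 1904));  // Feb, div. by 4: 29
        System.out.println(c.getNumDaysInMonth(2, 1900));  // Feb, div. by 100: 28
        System.out.println(c.getNumDaysInMonth(2, 2000));  // Feb, div. by 400: 29
        try {
            c.getNumDaysInMonth(0, 1901);                  // invalid month class
        } catch (InvalidMonthException e) {
            System.out.println(e.getMessage());
        }
    }
}
```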

White-box testing overview
Code coverage: Branch coverage; Condition coverage; Path coverage
=> Details in the exercise session about testing
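The difference between branch coverage and path coverage can be seen on a two-branch example (the function is invented for this sketch): two tests suffice to take every branch, but four are needed to cover every path.

```java
// Branch coverage vs. path coverage on a function with two branches.
public class Coverage {

    static int classify(int x, int y) {
        int r = 0;
        if (x > 0) r += 1;   // branch A: true / false
        if (y > 0) r += 2;   // branch B: true / false
        return r;
    }

    public static void main(String[] args) {
        // Branch coverage: these two tests take all four branch
        // outcomes (A-true, B-false, A-false, B-true) ...
        System.out.println(classify(1, -1));   // 1
        System.out.println(classify(-1, 1));   // 2
        // ... but path coverage needs all 2 x 2 = 4 combinations:
        System.out.println(classify(1, 1));    // 3
        System.out.println(classify(-1, -1));  // 0
    }
}
```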

Unit Testing Heuristics
1. Create unit tests when the object design is completed. Black-box test: test the functional model; white-box test: test the dynamic model
2. Develop the test cases. Goal: find the minimal number of test cases that cover as many paths as possible
3. Cross-check the test cases to eliminate duplicates. Don't waste your time!
4. Desk check your source code. Sometimes reduces testing time
5. Create a test harness. Test drivers and test stubs are needed for integration testing
6. Describe the test oracle. Often the result of the first successfully executed test
7. Execute the test cases. Re-execute the tests whenever a change is made ("regression testing")
8. Compare the results of the test with the test oracle. Automate this if possible

JUnit: Overview
A Java framework for writing and running unit tests: Test cases and fixtures; Test suites; Test runner
Written by Kent Beck and Erich Gamma
Written with "test first" and pattern-based development in mind: Tests written before code; Allows for regression testing; Facilitates refactoring
JUnit is open source: www.junit.org; JUnit version 4 released March 2006

JUnit Classes (class diagram)
Test (interface): run(TestResult)
TestCase (implements Test): run(TestResult), setUp(), tearDown(), runTest(); attribute testName: String
TestSuite (implements Test): run(TestResult), addTest(); contains many (*) Test objects
ConcreteTestCase (extends TestCase): overrides setUp(), tearDown(), runTest(); exercises the methods under test of UnitToBeTested

An example: Testing MyList
Unit to be tested: MyList
Methods under test: add(), remove(), contains(), size()
Concrete test case: MyListTestCase

In the JUnit class diagram, MyListTestCase takes the role of ConcreteTestCase (with setUp(), tearDown(), runTest(), testAdd(), testRemove()) and MyList the role of the unit to be tested (with add(), remove(), contains(), size()).

Writing TestCases in JUnit

public class MyListTestCase extends TestCase {

    public MyListTestCase(String name) {
        super(name);
    }

    public void testAdd() {
        // Set up the test
        List aList = new MyList();
        String anElement = "a string";
        // Perform the test
        aList.add(anElement);
        // Check if the test succeeded
        assertTrue(aList.size() == 1);
        assertTrue(aList.contains(anElement));
    }

    protected void runTest() {
        testAdd();
    }
}

Writing Fixtures and Test Cases

public class MyListTestCase extends TestCase {
    // …
    private MyList aList;
    private String anElement;

    // Test fixture
    public void setUp() {
        aList = new MyList();
        anElement = "a string";
    }

    // Test case
    public void testAdd() {
        aList.add(anElement);
        assertTrue(aList.size() == 1);
        assertTrue(aList.contains(anElement));
    }

    // Test case
    public void testRemove() {
        aList.remove(anElement);
        assertTrue(aList.size() == 0);
        assertFalse(aList.contains(anElement));
    }
}

A test fixture is a set of instances and links that are used as a dummy environment for the test cases. By factoring out the initialization of the fixture into the setUp() method, fixtures can be reused between test cases. The tearDown() method is used to clean up after a test, so that the fixture can be rebuilt from scratch for the next test case. The white paper from which these concepts are taken can be found at: http://junit.sourceforge.net/doc/cookbook/cookbook.htm

Collecting TestCases into TestSuites

public static Test suite() {
    TestSuite suite = new TestSuite();
    suite.addTest(new MyListTestCase("testAdd"));
    suite.addTest(new MyListTestCase("testRemove"));
    return suite;
}

Composite pattern: a TestSuite composes many (*) Test objects, so a suite can contain both individual TestCases and other TestSuites.

Design patterns in JUnit
Command pattern: a test is an object (Test) with a run(TestResult) method
Composite pattern: TestSuite composes many (*) Test objects
Template Method pattern: TestCase.run() calls setUp(), runTest(), tearDown(), which ConcreteTestCase overrides
Adapter pattern: ConcreteTestCase adapts the TestedUnit to the Test interface


Other JUnit features
Textual and GUI interface: Displays the status of tests; Displays a stack trace when tests fail
Integrated with Maven and Continuous Integration: http://maven.apache.org (build and release management tool); http://maven.apache.org/continuum (continuous integration server for Java programs); All tests are run before release (regression tests); Test results are advertised as a project report
Many specialized variants: Unit testing of web applications; J2EE applications

Additional Readings JUnit Website www.junit.org/index.htm