CS 432 Object-Oriented Analysis and Design

Week 7: Testing

Software Testing
Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.
"You can't test in quality. If it's not there before you begin testing, it won't be there when you're finished testing."

Testing
Testing is a process of identifying defects by developing test cases and test data.
A test case is a formal description of:
• A starting state
• One or more events to which the software must respond
• The expected response or ending state
Test data is a set of starting states and events used to test a module, a group of modules, or an entire system.
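For illustration, a test case can be captured as a simple value object. A minimal Java sketch; the names (TestCase and its fields) are invented for this example, not part of the course materials:

    // Hypothetical sketch: one way to represent a test case as a value object.
    import java.util.List;

    public record TestCase(
            String id,                 // test case ID
            String startingState,      // description of the starting state
            List<String> events,       // events the software must respond to
            String expectedResponse) { // expected response or ending state
    }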

Figure: Testing discipline activities

Testing is Use Case Driven
Terms associated with testing:
• Verification: are we building the system right?
• Validation: are we building the right product?
Testing the Use Case Model is a validation task that determines whether the customer's needs are being met by the application. Testing the remaining models is a verification task that determines whether the application correctly satisfies the requirements specified in the Use Case Model.

Testing Workflow
Defines the primary activities required to develop the Test Model.
Develop Test Plan:
• Outlining the testing strategies to be used, including model tests
• Estimating the resources required to complete the testing activities
• Scheduling the testing as part of the overall project plan

Testing Workflow
Design Model Tests:
• Determine strategies for testing a model's correctness
• Determine strategies for testing a model's completeness
• Determine strategies for testing a model's consistency
Design Test:
• Identify and describe test cases for each build
• Identify and structure test procedures, specifying how to perform or carry out the test cases that test the executable application components and their integration

Testing Workflow
Implement Test: automate the test procedures by creating test components that execute all or part of the test procedures, where possible.
Perform Tests: execute the test cases as part of an iteration release or implementation build, including:
• Unit testing: the white-box technique in which programmers verify that the software they implemented satisfies its design requirements (see Section 6.3)
• Regression testing: verifies that changes to one part of an application do not break other previously working parts or components of the application (see the sketch below)
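Unit and regression tests like these are commonly automated with a framework such as JUnit. A minimal sketch, assuming JUnit 5 is on the classpath; the Account class here is a stand-in written just for the example:

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Stand-in class for the example.
    class Account {
        private final int id;
        private int balance;
        Account(int id) { this.id = id; }
        void deposit(int amount) { balance += amount; }
        int getBalance() { return balance; }
    }

    class AccountRegressionTest {
        // Re-running this test after every change to Account is what
        // makes it a regression test.
        @Test
        void depositIncreasesBalance() {
            Account account = new Account(42);
            account.deposit(100);
            assertEquals(100, account.getBalance());
        }
    }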

Testing Workflow
• Integration testing: verifies the communication and cooperation among major system components that have previously been unit tested
• User acceptance testing: tests whether the users can actually use the system and what difficulties they encounter when doing so (often a cognitive-psychology issue)
• Stress testing: ensures the system can handle significant loads, e.g., 5,000 simultaneous users (see the sketch below)
• Performance testing: ensures the system meets performance objectives, e.g., response within 3 seconds
Evaluate Test: evaluate the results of the tests and the testing strategy.
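A rough sketch of what an automated stress test might look like in plain Java; the request() method is a hypothetical stand-in for a call into the system under test:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class StressTestSketch {
        public static void main(String[] args) throws InterruptedException {
            final int users = 5_000;                  // simulated simultaneous users
            ExecutorService pool = Executors.newFixedThreadPool(200);
            AtomicInteger failures = new AtomicInteger();
            for (int i = 0; i < users; i++) {
                pool.submit(() -> {
                    try {
                        request();                    // hypothetical call to the system
                    } catch (Exception e) {
                        failures.incrementAndGet();   // count any failed request
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.MINUTES);
            System.out.println("failures: " + failures.get() + " / " + users);
        }

        private static void request() { /* call the system under test here */ }
    }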

Testing Categories
• Black-box testing techniques use only functional knowledge of the entity being tested, e.g., knowledge of its use cases or interfaces
• White-box testing techniques use knowledge of the internal control structure and logic of the entity being tested; behavioral diagrams capture some aspect of the internal behavior, i.e., how an entity realizes the functionality associated with a use case

Models – How to Test
In general, develop test cases that address these issues:
• Correctness: tests both the syntactic and semantic accuracy of a model. From a syntactic perspective, the model must be tested to verify that the various UML elements used within it are used correctly. From a semantic perspective, the elements within the model must accurately correspond to the reality being represented by the model.

Models – How to Test
• Completeness: tests whether a model is complete by verifying whether any required elements are missing from it. This is typically accomplished by determining whether the model can appropriately handle scenarios developed for this purpose.
• Consistency: tests whether the elements in a model are used in a conflicting way. It also verifies that the elements of a model do not contradict the elements of other models on which it depends.

Use Case Model
Testing the Use Case Model must consider the following issues:
• Correctness: does each use case accurately represent a requirement?
• Completeness: do the use cases represent all of the functionality needed for a satisfactory product?
• Consistency: generally this requires examining extension and included use cases to ensure that their relation to other use cases is consistent. It is possible to declare two use cases that contradict each other, but this is rare in practice.

Analysis Model Testing
Testing the analysis model amounts to determining whether the application being modeled correctly interprets the application domain:
• Correctness: the descriptions of the domain concepts are accurate; the algorithms will produce the expected results; the concepts and algorithms cover the use cases
• Completeness: the concepts are sufficient to cover the scope of the content specified; sufficient detail is given to describe concepts to the required depth; experts agree with the attributes and behaviors assigned to each class
• Consistency: model elements should be consistent with the business's definitions and meanings; where there are multiple ways to represent a concept or action, those ways should be modeled equivalently

Design Model Testing
Testing the design model is conceptually similar to testing the analysis model, but also requires addressing the following issues:
• Correctness: each class accurately implements the semantics of an interface; classes corresponding to interfaces must implement those interfaces
• Completeness: classes are defined for each interface in the architecture; preconditions for method use are appropriately specified; postconditions and error conditions are specified
• Consistency: the behaviors in the interface of each class provide either a single way to accomplish a task or, if there are multiple ways, the same behavior with different preconditions

Design Model Testing
Testing an object-oriented design presents some additional challenges not typically found in testing procedural designs:
• Interfaces: classes implement interfaces, and it is necessary to ensure that each class correctly implements the functionality required by its interface.
• Inheritance: inheritance introduces coupling problems in which a change to one class results in a change to all of its subclasses. This requires ensuring that the change being inherited is appropriate for every subclass. Delegation can often alleviate many of the issues associated with inheritance.
• Delegation: the delegation of tasks to other objects, though similar to the use of modules in procedural languages, presents its own issues. In a class, the delegation is encapsulated behind a public interface, hence it is necessary to ensure that the implementation of the interface is still correct when the class to which the functionality has been delegated is changed (see the sketch below).
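To make the delegation point concrete, here is a small illustrative sketch (all names invented): the delegate is hidden behind the class's public interface, so when the delegate class changes, the interface tests must be re-run to confirm the contract still holds.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    interface Sorter {
        List<Integer> sort(List<Integer> items);
    }

    class InsertionSorter implements Sorter {
        public List<Integer> sort(List<Integer> items) {
            List<Integer> copy = new ArrayList<>(items);
            Collections.sort(copy);   // stands in for a real insertion sort
            return copy;
        }
    }

    class Catalog {
        private final Sorter sorter; // delegation is encapsulated here

        Catalog(Sorter sorter) { this.sorter = sorter; }

        // Public interface: clients never see which Sorter is used, so
        // swapping the delegate must not change this method's behavior.
        List<Integer> sortedIds(List<Integer> ids) {
            return sorter.sort(ids);
        }
    }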

Implementation Model Testing
Testing the implementation model focuses on verifying that the implemented code satisfies the design in the design model (and the requirements of the use-case model):
• Correctness: the executable components of the system build correctly (e.g., compile and deploy) and adhere to the design model
• Completeness: the executable components provide all of the functionality specified in the design model
• Consistency: the executable components of the system are appropriately integrated in a way that provides the functionality specified by the use-case requirements

Testing Framework
The object-oriented concepts you have learned in this class can easily be used to develop a simple testing framework for any application you are building. The execution of such tests can then be performed automatically every night as part of the most recent build.

Example – Test Method
(The example method's code appeared as a figure on the original slide.)

Example – Test Handler
• Create a new class that handles the common testing behaviors; this can be done in Java or C#, which support runtime reflection, but not in C++
• The handler's assertion specifies that the execution of the given message should result in a returned value equal to the given object
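A sketch of how such a handler might be written with Java's reflection API; the class and method names here are illustrative, not the course's actual framework:

    import java.lang.reflect.Method;

    public class TestHandler {
        // Invokes the named method (the "message") on the receiver with the
        // given parameters, then checks that the returned value equals the
        // expected object. Note: parameters with primitive declared types
        // (e.g., int) would need their wrapper classes mapped to primitive
        // types; that mapping is omitted for brevity.
        protected void assertReturns(String message, Object receiver,
                                     Object[] parameters, Object expected) {
            try {
                Class<?>[] types = new Class<?>[parameters.length];
                for (int i = 0; i < parameters.length; i++) {
                    types[i] = parameters[i].getClass();
                }
                Method method = receiver.getClass().getMethod(message, types);
                Object actual = method.invoke(receiver, parameters);
                if (!expected.equals(actual)) {
                    System.err.println(message + " returned " + actual
                            + " but " + expected + " was expected");
                }
            } catch (ReflectiveOperationException e) {
                System.err.println("could not invoke " + message + ": " + e);
            }
        }
    }

The verifyTest() example that follows assumes a handler of this shape.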

verifyTest() - AccountCatalog

public void verifyTest() {
    // Get a newly created unique account id.
    int id = nextId();
    // Create a new account object with this id.
    Account account = new Account(id);
    // Save the newly created account object.
    save(account);
    // Package the id as the single parameter for the find method.
    Object[] parameters = new Object[1];
    parameters[0] = new Integer(id);
    // Assert that executing the find method on the current
    // AccountCatalog object (this) with the single id parameter
    // returns a value equal to the account object. If not, the
    // handler logs an appropriate error message to the log file.
    // (Named assertReturns here because "assert" is a reserved
    // word in Java 1.4 and later and cannot be a method name.)
    assertReturns("find", this, parameters, account);
}

General Testing Methodologies

Figure 13-3: Test types and detected defects

Unit Testing
The process of testing individual methods, classes, or components before they are integrated with other software.
Two methods for isolated testing of units (see the sketch below):
• Driver: simulates the behavior of a method that sends a message to the method being tested
• Stub: simulates the behavior of a method that has not yet been written
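A minimal sketch of a driver and a stub, with invented names; the stub returns a canned answer so the unit under test can be exercised before its collaborator exists:

    // Hypothetical sketch: TaxCalculator is the unit under test; RateService
    // is a collaborator that has not been written yet.
    interface RateService {
        double rateFor(String region);
    }

    // Stub: simulates the behavior of a method that has not yet been written.
    class RateServiceStub implements RateService {
        public double rateFor(String region) {
            return 0.25; // canned answer, enough to exercise the unit under test
        }
    }

    class TaxCalculator {
        private final RateService rates;
        TaxCalculator(RateService rates) { this.rates = rates; }
        double taxOn(double amount, String region) {
            return amount * rates.rateFor(region);
        }
    }

    // Driver: simulates the caller, sending messages to the method being tested.
    public class TaxCalculatorDriver {
        public static void main(String[] args) {
            TaxCalculator calc = new TaxCalculator(new RateServiceStub());
            double tax = calc.taxOn(100.0, "US-CA");
            System.out.println(tax == 25.0 ? "PASS" : "FAIL: got " + tax);
        }
    }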

Figure: Unit-test environment. The software engineer derives test cases and exercises the module to be tested, checking its interface, local data structures, boundary conditions, independent paths, and error-handling paths. A driver feeds test cases to the module, stubs stand in for the modules it calls, and results are collected.

Test Cases
Test cases should uncover errors such as:
• Comparison of different data types
• Incorrect logical operators or precedence
• Expectation of equality when precision error makes equality unlikely (see the sketch below)
• Incorrect comparison of variables
• Improper or nonexistent loop termination
• Failure to exit when divergent iteration is encountered
• Improperly modified loop variables
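The precision pitfall in particular is easy to demonstrate in Java: floating-point values should be compared against a tolerance, never with ==.

    public class PrecisionExample {
        public static void main(String[] args) {
            double sum = 0.1 + 0.2;
            // Equality fails due to binary floating-point rounding:
            System.out.println(sum == 0.3);                 // prints false
            // A tolerance-based comparison behaves as intended:
            System.out.println(Math.abs(sum - 0.3) < 1e-9); // prints true
        }
    }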

Integration Testing
• Evaluates the behavior of a group of methods or classes
• Identifies interface incompatibilities, unexpected parameter values or state interactions, and run-time exceptions
• System test: integration test of the behavior of an entire system or independent subsystem
• Build and smoke test: system test performed daily or several times a week

Integration Testing Strategies
Options:
• The "big bang" approach
• An incremental construction strategy

Top-Down Integration
• The top module is tested with stubs
• Stubs are replaced one at a time, "depth first"
• As new modules are integrated, some subset of tests is re-run
Main disadvantage: the need for stubs and the attendant testing difficulties that can be associated with them. Testing major control functions early helps mitigate this.

Bottom-Up Integration
• Worker modules are grouped into builds (clusters) and integrated
• Drivers are replaced one at a time, "depth first"
Major disadvantage: the program as an entity does not exist until the last module is added.

Usability Testing
Determines whether a method, class, subsystem, or system meets user requirements.
Performance test: determines whether a system or subsystem can meet time-based performance criteria (see the sketch below):
• Response time specifies the desired or maximum allowable time limit for software responses to queries and updates
• Throughput specifies the desired or minimum number of queries and transactions that must be processed per minute or hour
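A rough sketch of an automated response-time check against a hypothetical 3-second limit; query() stands in for the operation under test:

    public class ResponseTimeSketch {
        private static final long LIMIT_MILLIS = 3_000; // maximum allowable response time

        public static void main(String[] args) {
            long start = System.nanoTime();
            query();                                    // hypothetical system call
            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
            System.out.println(elapsedMillis <= LIMIT_MILLIS
                    ? "PASS (" + elapsedMillis + " ms)"
                    : "FAIL: " + elapsedMillis + " ms exceeds " + LIMIT_MILLIS + " ms");
        }

        private static void query() { /* execute the query under test here */ }
    }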

User Acceptance Testing
• Determines whether the system fulfills user requirements
• Involves the end users
• Acceptance testing is a very formal activity in most development projects

Object-Oriented Testing
• Begins by evaluating the correctness and consistency of the OOA and OOD models
• Testing strategy changes: the concept of the "unit" broadens due to encapsulation; integration focuses on classes and their execution across a "thread" or in the context of a usage scenario; validation uses conventional black-box methods
• Test case design draws on conventional methods, but also encompasses special features

Who Tests Software?
• Programmers: unit testing; testing "buddies" can test other programmers' code
• Users: usability and acceptance testing; volunteers are frequently used to test beta versions
• Quality assurance personnel: all testing types except unit and acceptance; develop test plans and identify needed changes

Debugging: A Diagnostic Process

Figure: The debugging process. Test cases yield results; unexpected results suggest suspected causes; debugging narrows these to identified causes, which lead to corrections, followed by regression tests and new test cases.

Debugging Effort
Debugging effort divides into the time required to diagnose the symptom and determine the cause, and the time required to correct the error and conduct regression tests.

Symptoms & Causes
• The symptom and the cause may be geographically separated
• The symptom may disappear when another problem is fixed
• The cause may be due to a combination of non-errors
• The cause may be due to a system or compiler error
• The cause may be due to assumptions that everyone believes
• The symptom may be intermittent

Consequences of Bugs
Damage ranges from mild through annoying, disturbing, serious, extreme, and catastrophic to infectious.
Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.

Debugging Techniques
• Brute force / testing: run-time traces, output statements
• Backtracking: start where the error is found and work backwards
• Induction: cause elimination, binary partitioning (see the sketch below)
• Deduction
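Binary partitioning can be sketched as a toy program: repeatedly halve the input until the smallest failing piece is isolated. Everything here is hypothetical, including the fails() predicate:

    import java.util.List;

    public class BinaryPartitionSketch {
        // Returns the index of a single failing element, assuming at least one
        // exists and that any sublist containing it also fails.
        static int isolate(List<Integer> data) {
            int lo = 0, hi = data.size();
            while (hi - lo > 1) {
                int mid = (lo + hi) / 2;
                if (fails(data.subList(lo, mid))) {
                    hi = mid;   // the failure lies in the lower half
                } else {
                    lo = mid;   // otherwise it must be in the upper half
                }
            }
            return lo;
        }

        private static boolean fails(List<Integer> slice) {
            return slice.contains(13); // hypothetical: element 13 triggers the bug
        }

        public static void main(String[] args) {
            System.out.println(isolate(List.of(1, 5, 13, 21, 34))); // prints 2
        }
    }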

Debugging: Final Thoughts
1. Don't run off half-cocked; think about the symptom you're seeing.
2. Use tools (e.g., a dynamic debugger) to gain more insight.
3. If at an impasse, get help from someone else.
4. Be absolutely sure to conduct regression tests when you do "fix" the bug.

Configuration and Change Management
• Controls the complexity associated with testing and supporting a system through multiple development and operational versions
• Integrally related to project management, implementation, testing, and deployment activities
• Change control procedures are typically developed in the first iteration, before development
• The need for formal procedures depends on the size and cohesiveness of the project

Figure 13-7: Configuration and change management discipline activities

Versioning
• Alpha version: test version that is incomplete but ready for some level of rigorous integration or usability testing
• Beta version: test version that is stable enough to be tested by end users for an extended period of time
• Production version: system version that is formally distributed to users or made operational for long-term use
• Maintenance release: system update that provides bug fixes and small changes to existing features

Submitting Change Requests and Error Reports
Typical change control procedures include:
• Standard change request forms, completed by a user or system owner
• Review of requests by a change control committee, which assesses the impact on the system, security, and budget
• Extensive planning for design and implementation
Bug reports are often handled separately because of the need for an immediate fix.

Figure 13-11: A sample change request form

Figure 13-12: A sample change review form

Planning and Managing Testing
• Testing activities must be distributed throughout the project
• Unit and integration testing occur whenever software is developed, acquired, or combined with other software
• Usability testing occurs whenever requirements or design decisions need to be evaluated
• User acceptance tests are conducted as a final validation of the requirements, design, and implementation activities

Development Order
• Input, process, output (IPO) development: implements input modules first, process modules next, and output modules last; important user interfaces are developed early
• Top-down: implements top-level modules first; there is always a working version of the program
• Bottom-up: implements low-level detailed modules first; programmers can be put to work immediately

Framework Development
• Foundation classes: an object framework that covers most or all of the domain and data access layer classes; reused in many parts of the system and across applications
• Whenever possible, developers choose use cases for early iterations that rely on many foundation classes
• Testing early finds bugs before dependent code is developed

Direct Deployment
Installs a new system, quickly makes it operational, and immediately turns off any overlapping systems.
• Advantage: simplicity
• Disadvantage: risk of system unavailability
Used when a new system is not replacing an old system and/or downtime can be tolerated.

Figure: Direct deployment and cutover

Parallel Deployment
Operates both the old and new systems for an extended time period.
• Advantage: relatively low risk of system failure
• Disadvantage: cost of operating both systems
Used for mission-critical applications. Partial parallel deployment can be implemented, with increased risk of undetected errors.

Figure 13-24: Parallel deployment and operation

Phased Deployment
Installs a new system and makes it operational in a series of steps or phases.
• Advantage: reduced risk
• Disadvantage: increased complexity
Useful when a system is large, complex, and composed of relatively independent subsystems.

Figure 13-25: Phased deployment with direct cutover and parallel operation

Personnel Issues
• New system deployment places significant demands on personnel
• Temporary and contract personnel may be hired to increase manpower, especially during a parallel deployment
• System operators: personnel with experience in hardware or software deployment and configuration
• Employee productivity decreases temporarily with a new system due to the learning curve