CS 432 Object-Oriented Analysis and Design

1 CS 432 Object-Oriented Analysis and Design
Week 7: Testing

2 Software Testing
Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user.
"You can't test in quality. If it's not there before you begin testing, it won't be there when you're finished testing."

3 Testing
Testing is a process of identifying defects: develop test cases and test data.
A test case is a formal description of:
- A starting state
- One or more events to which the software must respond
- The expected response or ending state
Test data is a set of starting states and events used to test a module, group of modules, or entire system. A sketch of a test case as data appears below.
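
To make the definition concrete, a test case can be captured directly as a small data structure. The following is a minimal, hypothetical Java sketch; the class and field names are illustrative, not from the course materials:

// Hypothetical sketch: a test case as data, mirroring the three parts
// named above (starting state, events, expected response/ending state).
public class TestCase {
    private final String startingState;  // e.g., "catalog contains no accounts"
    private final String[] events;       // events the software must respond to
    private final String expectedResult; // expected response or ending state

    public TestCase(String startingState, String[] events, String expectedResult) {
        this.startingState = startingState;
        this.events = events;
        this.expectedResult = expectedResult;
    }

    public String getStartingState()  { return startingState; }
    public String[] getEvents()       { return events; }
    public String getExpectedResult() { return expectedResult; }
}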

4 Testing discipline activities

5 Testing is Use Case Driven
Terms associated with testing:
- Verification – are we building the system right?
- Validation – are we building the right product?
Testing the Use Case Model is a validation task that determines whether the customer's needs are being met by the application.
Testing the remaining models is a verification task that determines whether the application correctly satisfies the requirements specified in the Use Case Model.

6

7 Testing Workflow
Defines the primary activities required to develop the Test Model:
Develop Test Plan:
- Outlining the testing strategies to be used, including model tests,
- Estimating the resources required to complete the testing activities,
- Scheduling the testing as part of the overall project plan.

8 Testing Workflow
Design Model Tests:
- Determine strategies for testing a model's correctness,
- Determine strategies for testing a model's completeness,
- Determine strategies for testing a model's consistency.
Design Test:
- Identify and describe test cases for each build,
- Identify and structure test procedures specifying how to perform or carry out the test cases that test the executable application components and their integration.

9 Testing Workflow
Implement Test – automate test procedures by creating test components that define executable components automating all or part of the test procedures, if possible.
Perform Tests – execute the test cases as part of an iteration release or implementation build, including:
- Unit testing – the white-box technique in which programmers verify that the software they implemented satisfies its design requirements (see Section 6.3),
- Regression testing – verifies that changes to one part of an application do not break other previously working parts/components of the application. A unit-test sketch follows below.
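
As a concrete illustration of a unit test, here is a minimal sketch using JUnit 5; the Account class and its deposit/getBalance methods are assumed for illustration and are not defined in these slides:

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

public class AccountTest {

    // Unit test: verifies one class in isolation against its design.
    // Re-running this test after later changes is exactly what
    // regression testing automates.
    @Test
    public void depositIncreasesBalance() {
        Account account = new Account(42);  // assumed constructor taking an id
        account.deposit(100.0);             // assumed method
        assertEquals(100.0, account.getBalance(), 0.001);
    }
}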

10 Testing Workflow
- Integration testing – verifies the communication and cooperation among major system components that have been previously unit tested,
- User acceptance testing – tests whether the users can actually use the system and the difficulties they encounter when doing so (often a cognitive psychology issue),
- Stress testing – ensures the system can handle significant loads, e.g., 5,000 simultaneous users,
- Performance testing – ensures the system meets performance objectives, e.g., response within 3 seconds.
Evaluate Test – evaluate the results of the tests and the testing strategy.

11 Testing Categories
Black-box testing techniques use only functional knowledge of the entity being tested, e.g., knowledge of its use cases or interfaces.
White-box testing techniques use knowledge of the internal control-structure logic of the entity being tested. Behavioral diagrams capture some aspect of the internal behavior of how an entity realizes the functionality associated with a use case. A sketch contrasting the two styles follows.
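
As a hypothetical contrast between the two styles, suppose a Classifier.classify(int) method exists (it is invented for this sketch): a black-box test is derived only from its specification, while a white-box test targets its internal branches:

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

public class ClassifierTest {

    // Black-box: derived only from the specification
    // ("negative inputs are rejected"), with no knowledge of internals.
    @Test
    public void rejectsNegativeInput() {
        assertThrows(IllegalArgumentException.class, () -> Classifier.classify(-1));
    }

    // White-box: derived from the internal control structure,
    // chosen to exercise the boundary between two known branches.
    @Test
    public void boundaryBetweenSmallAndLargeBranch() {
        assertEquals("small", Classifier.classify(99));
        assertEquals("large", Classifier.classify(100)); // assumed internal threshold
    }
}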

12 Models – How to Test
In general, develop use cases that address these issues:
Correctness – tests both the syntactic and semantic accuracy of a model. From a syntactic perspective, the model must be tested to verify that the various UML elements used within the model are used correctly. From a semantic perspective, the elements within the model must accurately correspond to the reality being represented by the model.

13 Models – How to Test
Completeness – tests whether a model is complete by verifying whether any required elements are missing from the model. This is typically accomplished by determining whether the model can appropriately handle scenarios developed for this purpose.
Consistency – tests whether the elements in a model are used in a conflicting way. It also verifies that the elements of a model don't contradict the elements of other models on which it depends.

14 Use Case Model
Testing the Use Case Model must consider the following issues:
Correctness – does each use case accurately represent a requirement?
Completeness – do the use cases represent all of the functionality needed for a satisfactory product?
Consistency – generally this requires examining extension and included use cases to ensure that their relation to other use cases is consistent. Naturally, it's possible to declare two use cases that contradict each other, but this is rare in practice.

15 Analysis Model Testing
Testing the analysis model amounts to determining whether the application being modeled correctly interprets the application domain:
Correctness – the descriptions of the domain concepts are accurate; the algorithms will produce the expected results; the concepts and algorithms cover the use cases.
Completeness – the concepts are sufficient to cover the scope of the content specified; sufficient detail is given to describe concepts to the required depth; experts agree with the attributes and behaviors assigned to each class.
Consistency – model elements should be consistent with the business's definitions and meanings. Where there are multiple ways to represent a concept or action, those ways should be modeled equivalently.

16 Design Model Testing
Testing the design model is conceptually similar to testing the analysis model, but also requires addressing the following issues:
Correctness – each class accurately implements the semantics of an interface; classes corresponding to interfaces must implement the interface.
Completeness – classes are defined for each interface in the architecture; preconditions for method use are appropriately specified; post-conditions and error conditions are specified.
Consistency – the behaviors in the interface of each class provide either a single way to accomplish a task or, if there are multiple ways, they provide the same behavior but with different preconditions.

17 Design Model Testing
Testing an object-oriented design presents some additional challenges not typically found in the testing of procedural programming designs:
Interfaces – classes implement interfaces, and it's necessary to ensure that each class correctly implements the functionality required by its interface.
Inheritance – the use of inheritance introduces coupling problems in which a change to one class results in a change to all of its subclasses. This requires ensuring that the change that is now being inherited is appropriate for all of the subclasses. Delegation can often alleviate many of the issues associated with inheritance.
Delegation – the delegation of tasks to other objects, though similar to the use of modules in procedural programming languages, also presents its own set of issues. In a class, the delegation is encapsulated behind a public interface, hence it's necessary to ensure that the implementation of the interface is still correct when the class to which the functionality has been delegated is changed.

18 Implementation Model Testing
Testing the implementation model focuses on verifying that the implemented code satisfies the design in the design model (and the requirements of the use-case model):
Correctness – the executable components of the system build correctly (e.g., compile and deploy) and adhere to the design model.
Completeness – the executable components provide all of the functionality specified in the design model.
Consistency – the executable components of the system are appropriately integrated in a way that provides the functionality specified by the use-case requirements.

19 Testing Framework
The object-oriented concepts you have learned in this class can easily be used to develop a simple testing framework for any application you are building. The execution of such tests can then be performed automatically every night as part of the most recent build process.

20 Example – Test Method

21 Example – Test Handler
Create a new class that handles the common testing behaviors. This can be done with reflection in Java or C#, but not easily in standard C++. The handler specifies that the execution of the given message should result in a returned value equal to the given object. A sketch of such a helper follows the code on the next slide.

22 verifyTest() - AccountCatalog
public void verifyTest() {
    // Get a newly created unique account id.
    int id = nextId();

    // Create a new account object with this id.
    Account account = new Account(id);

    // Save the newly created account object.
    save(account);

    // Package the id as the single parameter for the call.
    Object[] parameters = new Object[1];
    parameters[0] = new Integer(id);

    // Assert that executing the find method on the current
    // AccountCatalog object (this) with the single id parameter
    // returns a value equal to the account object. If not, the
    // handler will log an appropriate error message to the log file.
    assert("find", this, parameters, account);
}
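
A sketch of the kind of reflective helper the test handler could provide is shown below. Note that assert became a reserved word in Java 1.4, so the helper here is named assertReturns; the names and error handling are assumptions, not the course's actual framework code:

import java.lang.reflect.Method;

public class TestHandler {

    // Invokes methodName on receiver with the given parameters and
    // logs an error if the returned value does not equal expected.
    protected void assertReturns(String methodName, Object receiver,
                                 Object[] parameters, Object expected) {
        try {
            Class<?>[] types = new Class<?>[parameters.length];
            for (int i = 0; i < parameters.length; i++) {
                types[i] = parameters[i].getClass();
            }
            // Caveat: a method declared with primitive parameters
            // (e.g., find(int)) would need Integer.class mapped to int.class.
            Method method = receiver.getClass().getMethod(methodName, types);
            Object actual = method.invoke(receiver, parameters);
            if (!expected.equals(actual)) {
                System.err.println("FAIL: " + methodName + " returned "
                        + actual + ", expected " + expected);
            }
        } catch (Exception e) {
            System.err.println("ERROR invoking " + methodName + ": " + e);
        }
    }
}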

23 General Testing Methodologies

24 Figure 13-3: Test types and detected defects

25 Unit Testing
The process of testing individual methods, classes, or components before they are integrated with other software.
Two methods for isolated testing of units:
- Driver – simulates the behavior of a method that sends a message to the method being tested,
- Stub – simulates the behavior of a method that has not yet been written.
A sketch of both appears below.
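
Here is a minimal, hypothetical sketch of both aids; TaxCalculator and RateService are invented names, not classes from the course:

// Collaborator interface whose real implementation does not exist yet.
interface RateService {
    double currentRate();
}

// The unit under test.
class TaxCalculator {
    private final RateService rates;
    TaxCalculator(RateService rates) { this.rates = rates; }
    double taxOn(double amount) { return amount * rates.currentRate(); }
}

// Stub: simulates the behavior of the unwritten collaborator.
class RateServiceStub implements RateService {
    public double currentRate() {
        return 0.05; // canned answer instead of a real lookup
    }
}

// Driver: plays the caller's role, sending messages to the unit under test.
public class TaxCalculatorDriver {
    public static void main(String[] args) {
        TaxCalculator calc = new TaxCalculator(new RateServiceStub());
        double tax = calc.taxOn(200.0);
        System.out.println(Math.abs(tax - 10.0) < 1e-9 ? "PASS" : "FAIL: got " + tax);
    }
}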

26 Unit Testing
(figure: the software engineer designs test cases for the module to be tested and examines the results)
27 Unit Testing
(figure: test cases exercise the module's interface, local data structures, boundary conditions, independent paths, and error handling paths)

28 Unit Test Environment
(figure: a driver feeds test cases to the module under test, exercising its interface, local data structures, boundary conditions, independent paths, and error handling paths; stubs stand in for the modules it calls; results are collected)

29 Test Cases
Test cases should uncover errors such as:
- Comparison of different data types,
- Incorrect logical operators or precedence,
- Expectation of equality when precision error makes equality unlikely,
- Incorrect comparison of variables,
- Improper or nonexistent loop termination,
- Failure to exit when divergent iteration is encountered,
- Improperly modified loop variables.
A small illustration of the precision pitfall follows.
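
For example, the third error class above, expecting exact equality despite floating-point precision error, can be seen in a few lines of Java:

public class PrecisionExample {
    public static void main(String[] args) {
        double sum = 0.1 + 0.2;

        // Fragile: fails because sum is actually 0.30000000000000004.
        System.out.println(sum == 0.3);                   // false

        // Robust: compare within a tolerance instead.
        System.out.println(Math.abs(sum - 0.3) < 1e-9);   // true
    }
}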

30 Integration Testing
Evaluates the behavior of a group of methods or classes. Identifies interface compatibility problems, unexpected parameter values or state interactions, and run-time exceptions.
System test – integration test of the behavior of an entire system or independent subsystem.
Build and smoke test – system test performed daily or several times a week.

31 Integration Testing Strategies
Options:
• the "big bang" approach
• an incremental construction strategy

32 Top Down Integration
(figure: modules A through G integrated from the top down)
The top module is tested with stubs. Stubs are replaced one at a time, "depth first". As new modules are integrated, some subset of tests is re-run.
Main disadvantage – the need for stubs and the attendant testing difficulties that can be associated with them. Testing major control functions early helps with this.

33 Bottom-Up Integration
(figure: modules integrated from the bottom up in clusters)
Drivers are replaced one at a time, "depth first". Worker modules are grouped into builds (clusters) and integrated.
Major disadvantage – the program as an entity does not exist until the last module is added.

34 Usability Testing
Determines whether a method, class, subsystem, or system meets user requirements.
Performance test – determines whether a system or subsystem can meet time-based performance criteria:
- Response time specifies the desired or maximum allowable time limit for software responses to queries and updates,
- Throughput specifies the desired or minimum number of queries and transactions that must be processed per minute or hour.
A minimal response-time check is sketched below.
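
A minimal sketch of checking a response-time criterion against, say, the 3-second objective mentioned on slide 10 might look like this; handleQuery() is a hypothetical stand-in for the operation under test:

public class ResponseTimeTest {
    public static void main(String[] args) {
        long start = System.nanoTime();
        handleQuery();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

        // Compare the measured time against the stated objective.
        System.out.println(elapsedMillis <= 3000
                ? "PASS (" + elapsedMillis + " ms)"
                : "FAIL (" + elapsedMillis + " ms)");
    }

    private static void handleQuery() {
        // placeholder for the real query-handling code
    }
}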

35 User Acceptance Testing
Determines whether the system fulfills user requirements. Involves the end users. Acceptance testing is a very formal activity in most development projects.

36 Object-Oriented Testing
- Begins by evaluating the correctness and consistency of the OOA and OOD models,
- Testing strategy changes: the concept of the 'unit' broadens due to encapsulation; integration focuses on classes and their execution across a 'thread' or in the context of a usage scenario,
- Validation uses conventional black-box methods,
- Test case design draws on conventional methods, but also encompasses special features.

37 Who Tests Software?
Programmers – unit testing; testing buddies can test other programmers' code.
Users – usability and acceptance testing; volunteers are frequently used to test beta versions.
Quality assurance personnel – all testing types except unit and acceptance; develop test plans and identify needed changes.

38 Debugging: A Diagnostic Process

39 The Debugging Process
(figure: test cases produce results; debugging moves from suspected causes to identified causes; corrections are verified with regression tests and new test cases)

40 Debugging Effort
Debugging effort divides into:
- time required to diagnose the symptom and determine the cause,
- time required to correct the error and conduct regression tests.

41 Symptoms & Causes symptom cause symptom and cause may be
geographically separated symptom may disappear when another problem is fixed cause may be due to a combination of non-errors cause may be due to a system or compiler error cause may be due to symptom assumptions that everyone cause believes symptom may be intermittent

42 Consequences of Bugs
(figure: damage ranges from mild through annoying, disturbing, serious, extreme, and catastrophic to infectious, depending on bug type)
Bug categories: function-related bugs, system-related bugs, data bugs, coding bugs, design bugs, documentation bugs, standards violations, etc.

43 Debugging Techniques
- Brute force / testing – run-time traces, output statements,
- Backtracking – start where the error is found and work backwards,
- Induction and deduction – cause elimination, binary partitioning.
A small brute-force tracing example follows.
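
As a small illustration of the brute-force technique, temporary trace output can expose a loop's behavior; the method below is invented for the sketch:

public class TraceExample {

    static int findFirstNegative(int[] data) {
        for (int i = 0; i < data.length; i++) {
            // Temporary run-time trace: observe each iteration's state.
            System.err.println("trace: i=" + i + " data[i]=" + data[i]);
            if (data[i] < 0) {
                return i;
            }
        }
        return -1; // no negative element found
    }

    public static void main(String[] args) {
        System.out.println(findFirstNegative(new int[] {3, 7, -2, 5})); // prints 2
    }
}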

44 Debugging: Final Thoughts
1. Don't run off half-cocked; think about the symptom you're seeing.
2. Use tools (e.g., a dynamic debugger) to gain more insight.
3. If at an impasse, get help from someone else.
4. Be absolutely sure to conduct regression tests when you do "fix" the bug.

45 Configuration and Change Management
Controls the complexity associated with testing and supporting a system through multiple development and operational versions. Integrally related to project management, implementation, testing, and deployment activities. Change control procedures are typically developed in the first iteration, before development. The need for formal procedures depends on the size and cohesiveness of the project.

46 Figure 13-7 Configuration and change management discipline activities

47 Versioning
Alpha version – test version that is incomplete but ready for some level of rigorous integration or usability testing.
Beta version – test version that is stable enough to be tested by end users for an extended period of time.
Production version – system version that is formally distributed to users or made operational for long-term use.
Maintenance release – system update that provides bug fixes and small changes to existing features.

48 Submitting Change Requests and Error Reports
Typical change control procedures include:
- Standard change request forms, completed by a user or system owner,
- Review of requests by a change control committee to assess impact on system, security, and budget,
- Extensive planning for design and implementation.
Bug reports are often handled separately because of the need for an immediate fix.

49 Figure 13-11: A sample change request form

50 Figure 13-12: A sample change review form

51 Planning and Managing Testing
Testing activities must be distributed throughout the project:
- Unit and integration testing occur whenever software is developed, acquired, or combined with other software,
- Usability testing occurs whenever requirements or design decisions need to be evaluated,
- User acceptance tests are conducted as a final validation of the requirements, design, and implementation activities.

52 Development Order
Input, process, output (IPO) development – implements input modules first, process modules next, and output modules last; important user interfaces are developed early.
Top-down – implements top-level modules first; there is always a working version of the program.
Bottom-up – implements low-level detailed modules first; programmers can be put to work immediately.

53 Framework Development
Foundation classes – an object framework that covers most or all of the domain and data access layer classes; reused in many parts of the system and across applications.
Whenever possible, developers choose use cases for early iterations that rely on many foundation classes. Testing early finds bugs before dependent code is developed.

54 Direct Deployment
Installs a new system, quickly makes it operational, and immediately turns off any overlapping systems.
Advantage – simplicity.
Disadvantage – risk of system unavailability.
Used when a new system is not replacing an old system and/or downtime can be tolerated.

55 Direct Deployment and Cutover

56 Parallel Deployment
Operates both old and new systems for an extended time period.
Advantage – relatively low risk of system failure.
Disadvantage – cost to operate both systems.
Used for mission-critical applications. Partial parallel deployment can be implemented, with increased risk of undetected errors.

57 Figure 13-24: Parallel deployment and operation

58 Phased Deployment
Installs a new system and makes it operational in a series of steps or phases.
Advantage – reduced risk.
Disadvantage – increased complexity.
Useful when a system is large, complex, and composed of relatively independent subsystems.

59 Figure 13-25: Phased deployment with direct cutover and parallel operation

60 Personnel Issues
New system deployment places significant demands on personnel:
- Temporary and contract personnel may be hired to increase manpower, especially during a parallel deployment,
- System operators – personnel with experience in hardware or software deployment and configuration,
- Employee productivity decreases temporarily with a new system due to the learning curve.

