Presentation on theme: "test the system against user & system requirements"— Presentation transcript:

1 DATABASE DESIGN & DEVELOPMENT
test the system against user & system requirements
Zatil Ridh'wah Hj Darot

2 testing? Testing aims to find defects within the system, as well as to verify whether the application behaves as expected and according to what was documented in the requirements analysis phase.

3 what to test?


5 Test Team
The test team draws on several roles: Professional Tester, Analyst, System Designer, User, and Configuration Management Specialist.
The Programmer is usually left off the test team: too familiar with the code.

6 test planning
A Test Plan:
covers all types and phases of testing
guides the entire testing process: who, why, when, what
is developed as the requirements, functional specification, and high-level design are developed
should be done before __________________

7 A test plan includes:
test objectives
schedule and logistics
test strategies
test ___________: procedure, data, expected result
procedures for handling problems
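One entry in such a plan, with its procedure, data, and expected result, can be sketched as a simple record. This is a minimal Python sketch; the `TestCase` record and the `apply_discount` system under test are made-up examples, not part of the original deck.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    # One entry in the test plan: what to check, how, with what data,
    # and what result we expect.
    objective: str
    procedure: str           # steps to execute
    data: dict               # input data for the test
    expected_result: object  # what the system should return

def run_case(case, system_under_test):
    """Execute a test case; report pass/fail plus the actual result."""
    actual = system_under_test(**case.data)
    return actual == case.expected_result, actual

# Hypothetical system under test: a discount calculator.
def apply_discount(price, percent):
    return round(price * (1 - percent / 100), 2)

case = TestCase(
    objective="10% discount is applied correctly",
    procedure="call apply_discount with price and percent",
    data={"price": 200.0, "percent": 10},
    expected_result=180.0,
)
passed, actual = run_case(case, apply_discount)
```

Keeping the expected result alongside the data, as the plan prescribes, lets a problem-handling procedure log exactly which expectation failed.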

8 testing level
1. Unit Testing: Individual subsystem
Carried out by developers
Goal: Confirm that _________________ and carries out the intended functionality
2. Integration Testing: Groups of subsystems (collections of classes) and eventually the entire system
Goal: Test the interfaces among the subsystems
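A unit test at the first level exercises one individual subsystem in isolation, as the developer would. A minimal sketch using Python's `unittest`; the `validate_age` function is a made-up unit under test.

```python
import unittest

def validate_age(age):
    """Unit under test: accept integer ages 0-150, reject anything else."""
    return isinstance(age, int) and 0 <= age <= 150

class ValidateAgeTest(unittest.TestCase):
    # Unit testing: one subsystem, carried out by the developer,
    # confirming it carries out the intended functionality.
    def test_accepts_normal_age(self):
        self.assertTrue(validate_age(30))

    def test_rejects_negative_age(self):
        self.assertFalse(validate_age(-1))

    def test_rejects_non_integer(self):
        self.assertFalse(validate_age("thirty"))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ValidateAgeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Integration testing would then combine `validate_age` with the subsystems that call it and test the interface between them rather than each unit alone.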

9 testing level (cont'd)
3. System Testing: The entire system
Carried out by developers
Goal: Determine if the system meets the _________________ (functional and global)
4. Acceptance Testing: Evaluates the system delivered by the developers
Carried out by the client. May involve executing typical transactions on site on a trial basis
Goal: Demonstrate that the system meets customer requirements and is ready to use
Implementation (coding) and testing go hand in hand


11 testing methods

12 white box testing White-box testing (also known as clear box testing, glass box testing, transparent box testing, and structural testing) __________________________ and uses that knowledge as part of the testing process. If, for example, an exception is thrown under certain conditions, a test might want to reproduce those conditions. White-box testing requires internal knowledge of the system and programming skills. It provides an internal perspective of the software under test.

13 Some of the advantages of white-box testing are:
Efficient in finding errors and problems
Required knowledge of the internals of the software under test is beneficial for thorough testing
Allows finding hidden errors
Programmer introspection
Helps optimize the code
Due to the required internal knowledge of the software, maximum coverage is obtained
Some of the disadvantages of white-box testing are:
Might not find _________________________ features
Requires high-level knowledge of the internals of the software under test
Requires code access
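The white-box idea above, including reproducing the conditions under which an exception is thrown, can be sketched as follows. The `classify_bmi` function is a made-up unit; the point is that each test input is chosen by reading the branches in the source.

```python
def classify_bmi(weight_kg, height_m):
    # Internal branches a white-box tester would read in the source:
    if height_m <= 0:
        raise ValueError("height must be positive")
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    elif bmi < 25:
        return "normal"
    else:
        return "overweight"

# White-box tests: one input per branch, chosen by inspecting the code.
results = {
    "underweight": classify_bmi(50, 1.80),   # bmi ~ 15.4
    "normal": classify_bmi(70, 1.80),        # bmi ~ 21.6
    "overweight": classify_bmi(90, 1.80),    # bmi ~ 27.8
}
try:
    classify_bmi(70, 0)          # deliberately reproduce the exception branch
    exception_covered = False
except ValueError:
    exception_covered = True
```

This is how white-box testing obtains maximum coverage: the tester knows there are exactly four paths and writes one case for each.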

14 black box testing Black-box testing treats the software under test as a black box, without knowing its internals. Tests use the software's interfaces and try to ensure that they work as expected. As long as ______________________, tests should pass even if the internals are changed. The tester is aware of what the program should do but does not know how it does it.

15 Some of the advantages of black-box testing are:
Efficient for large segments of code
Code access is not required
Separation between the user's and the developer's perspectives
Some of the disadvantages of black-box testing are:
Limited coverage, since only a fraction of test scenarios is performed
_________________ due to the tester's lack of knowledge about software internals
Blind coverage, since the tester has limited knowledge about the application
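A black-box test, by contrast, touches only the public interface and the documented behaviour. A minimal sketch; `sort_scores` and its specification are made up for illustration.

```python
# Hypothetical public interface of the unit under test; a black-box test
# uses only this signature and the documented behaviour, never the source.
def sort_scores(scores):
    """Spec: return the scores ordered from highest to lowest."""
    return sorted(scores, reverse=True)

# Black-box test cases derived from the specification alone:
typical = sort_scores([3, 1, 2])   # ordinary input
empty = sort_scores([])            # edge case taken from the spec, not the code
single = sort_scores([5])          # smallest non-empty input
# As long as the interface behaviour is unchanged, these checks keep
# passing even if the sorting algorithm inside is swapped out.
```

Note what the slide calls blind coverage: nothing here tells the tester whether some internal branch of the implementation was never executed.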

16 White vs Black-box Testing
White-box Testing:
Potentially infinite number of paths have to be tested
White-box testing often tests what is done, instead of what should be done
Cannot detect missing use cases
Black-box Testing:
Potential combinatorial explosion of test cases (valid & invalid data)
Often not clear whether the selected test cases uncover a particular error
Does not discover extraneous use cases ("features")
_______________________
White-box testing and black-box testing are the extreme ends of a testing continuum. Any choice of test case lies in between and depends on the following:
Number of possible logical paths
Nature of input data
Amount of computation
Complexity of algorithms and data structures

17 testing documentation
Testing documents are prepared at different stages:
Before Testing
Testing starts with test case generation. The following documents are needed for reference:
SRS document - Functional Requirements document
Test Policy document - This _________________________________________________
Test Strategy document - This mentions detailed aspects of the test team, the responsibility matrix, and the rights/responsibilities of the test manager and test engineer.
Traceability Matrix document - This is an SDLC document related to the requirement gathering process. As new requirements come in, they are added to this matrix. These matrices help testers know the source of a requirement. Requirements can be traced forward and backward.
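The traceability matrix described above is essentially a two-way mapping between requirements and test cases. A minimal sketch; the requirement and test-case IDs are made up for illustration.

```python
# A traceability matrix sketched as a mapping from requirement IDs to
# the test cases that cover them (all IDs are hypothetical).
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],               # newly added requirement, not yet covered
}

def trace_forward(req_id):
    """Forward trace: requirement -> tests that exercise it."""
    return traceability.get(req_id, [])

def trace_backward(test_id):
    """Backward trace: test -> requirements it traces back to."""
    return [req for req, tests in traceability.items() if test_id in tests]

# A gap report tells testers which requirements have no source of coverage.
uncovered = [req for req, tests in traceability.items() if not tests]
```

As new requirements come in they are added as new keys, and the `uncovered` report immediately flags them until test cases are written.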

18 While Being Tested
The following documents may be required while testing is started and being done:
Test Case document - This document contains the list of tests required to be conducted. It includes the Unit test plan, Integration test plan, System test plan, and Acceptance test plan.
Test description - This document is a detailed description of all test cases and the procedures to execute them.
Test case report - This document contains the test case report as a result of the test.
Test logs - This _________________________________________.

19 After Testing
The following documents may be generated after testing:
Test summary - This test summary is ________________ of all test reports and logs. It summarizes and concludes whether the software is ready to be launched. If it is ready to launch, the software is released under the version control system.

20 system testing
Functional Testing - validates functional requirements
Performance Testing - validates non-functional requirements
Acceptance Testing - validates the client's expectations

21 Functional Testing
Goal: Test the functionality of the system
Test cases are designed from ________________________ (better: user manual) and centered around requirements and key functions (use cases)
The system is treated as a black box
Unit test cases can be reused, but new test cases have to be developed as well.
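A functional test case designed from a use case, with the system treated as a black box, might look like this. The "withdraw cash" use case and the `withdraw` function are hypothetical examples, not from the deck.

```python
# Hypothetical use case: "withdraw cash". The functional test is written
# from the requirement text, treating the system as a black box.
def withdraw(balance, amount):
    """Requirement: reject overdrafts; otherwise return the new balance."""
    if amount > balance:
        return balance, "rejected"
    return balance - amount, "dispensed"

# Test cases centred on the use case's main and alternate flows:
new_balance, outcome = withdraw(100, 40)          # main flow: normal withdrawal
over_balance, over_outcome = withdraw(100, 150)   # alternate flow: overdraft
```

Each flow in the use case description yields at least one functional test case, which is why the user manual (written in terms of use cases) is a good source.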

22 functionality tests

23 Performance Testing
Goal: Try to violate non-functional requirements
Test how the system ____________________________. Can bottlenecks be identified? (First candidates for redesign in the next iteration)
Try unusual orders of execution: call a receive() before send()
Check the system's response to large volumes of data: if the system is supposed to handle 1000 items, try it with 1001 items
What is the amount of time spent in different use cases? Are typical cases executed in a timely fashion?
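The "1000 items, so try 1001" idea above can be sketched as a small volume-and-timing harness. `process_items` is a stand-in for a real use case; the 1000-item limit is the one quoted in the slide.

```python
import time

def process_items(items):
    # Stand-in for a use case that is specified to handle up to 1000 items.
    return [item * 2 for item in items]

# Volume check: the spec says 1000 items, so also try 1001 (just past the limit).
for n in (1000, 1001):
    start = time.perf_counter()
    result = process_items(list(range(n)))
    elapsed = time.perf_counter() - start
    assert len(result) == n   # output stays complete at and just past the limit
    # In a real performance test, `elapsed` would be compared against the
    # non-functional requirement, e.g. elapsed < some stated budget.
last_len = len(result)
```

Timing each run per input size is also how bottlenecks are identified: the use case whose elapsed time grows fastest is the first candidate for redesign.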

24 Types of Performance Testing
Stress testing - stress the limits of the system
Volume testing - test what happens if large amounts of data are handled
Configuration testing - test the various software and hardware configurations
Compatibility testing - test backward compatibility with existing systems
Timing testing - evaluate response times and the time to perform a function
Security testing - try to violate security requirements
Environmental testing - test tolerances for heat, humidity, motion
Quality testing - test reliability, maintainability & availability
Recovery testing - test the system's response to the presence of errors or loss of data
Human factors testing - test with end users

25 Acceptance Testing
Alpha test:
The client uses the software in the developer's environment.
The software is used in a controlled setting, with the developer always ready to fix bugs.
Beta test:
Conducted in the client's environment (the developer is not present)
The software gets a realistic workout in the target environment
Goal: Demonstrate that the system is ready for operational use
The choice of tests is made by the client
Many tests can be taken from integration testing
The acceptance test is performed by the client, not by the developer.

26 differences between system & acceptance testing

27 software validation Validation is the process of examining whether or not the software satisfies the user requirements. If the software matches the requirements for which it was made, it is validated. Validation ensures the product under _____________________________. Validation answers the question: "Are we developing the product which attempts all that the user needs from this software?" Validation emphasizes user requirements.

28 software verification
Verification is the process of _____________________________ requirements, and is developed adhering to the proper specifications and methodologies. Verification ensures the product being developed is according to the design specifications. Verification answers the question: "Are we developing this product by firmly following all design specifications?" Verification concentrates on the design and system specifications.

29 software verification (cont'd)
Targets of the test are:
Errors - These are actual coding mistakes made by developers. In addition, a difference between the output of the software and the desired output is considered an error.
Fault - When an error exists, a fault occurs. A fault, also known as a bug, is a result of an error which can cause the system to fail.
Failure - Failure is the inability of the system to perform the desired task. Failure occurs when a fault exists in the system.

30 robustness tests Robustness means how sensitive a system is to _______________ and to changes in its operational environment. Tests in this category are designed to verify how gracefully the system behaves in error situations and in a changed operational environment.

31 Boundary value
Boundary value tests are designed to cover boundary conditions, special values, and system defaults. The tests include providing invalid input data to the system and observing how the system reacts to the invalid input.
Power cycling
Power cycling tests are executed to ensure that, when there is a power glitch in a deployment environment, the system ______________________ in normal operation after power is restored.
On-line insertion and removal
On-line Insertion and Removal (OIR) tests are designed to ensure that on-line insertion and removal of modules, incurred during both idle and heavy load operations, are gracefully handled and recovered.
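The boundary value tests described above, covering boundaries, a step past them, and the system default, can be sketched as follows. The `set_volume` function and its 0-100 range are made up for illustration.

```python
def set_volume(level=50):
    """System under test: accept 0-100, default to 50, reject out-of-range."""
    if not 0 <= level <= 100:
        raise ValueError("level out of range")
    return level

# Boundary value tests: each boundary, one step outside it, and the default.
assert set_volume(0) == 0        # lower boundary
assert set_volume(100) == 100    # upper boundary
assert set_volume() == 50        # system default
for invalid in (-1, 101):        # just outside each boundary: invalid input
    try:
        set_volume(invalid)
        raise AssertionError("invalid input was accepted")
    except ValueError:
        pass                     # the system rejected invalid input gracefully
```

The invalid inputs exercise exactly what the slide asks for: how the system reacts when fed data it should refuse.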

32 High Availability
The concept of high availability is also known as ___________________. High availability tests are designed to verify the redundancy of individual modules, including the software that controls these modules. The goal is to verify that the system gracefully and quickly recovers from hardware and software failures without adversely impacting the operation of the system. High availability is realized by means of proactive methods to maximize service up-time and to minimize downtime.
Degraded Node
Degraded node (also known as failure containment) tests verify the operation of a system after a portion of the system becomes non-operational. This is a useful test for all mission-critical applications.

33 data validation checks
Data Validation Testing allows you to make sure that:
the data you deal with is correct and complete
your data and database can go successfully through _________________________________
your database can deal with specific and incorrect data in a proper way
finally, all the data you expect to see in the front end of your system is represented correctly, corresponding to the input.

34 There are a number of data validation testing techniques and approaches:
Data Accuracy Testing - makes sure that data is correct;
Data Completeness Testing - makes sure that data is complete;
Data Transformation Testing - makes sure that data goes successfully through transformations;
Data Quality Testing - makes sure that bad data is handled well;
Database Comparison Testing - compares the source and target databases even though their structure and volume differ;
Data Comparison Testing - compares data between different points of the data flow;
End-To-End Testing - final system testing that makes sure that at the end point we have correct data according to what we put into the start point of the data flow;
Data Warehouse Testing - makes sure that data goes successfully through all points of the system that uses the data warehouse.
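The first two techniques in the list, accuracy and completeness testing, can be sketched as simple checks over database rows. The customer records, required fields, and email format rule here are all hypothetical examples.

```python
# Hypothetical customer records pulled from a database.
rows = [
    {"id": 1, "name": "Ana", "email": "ana@example.com"},
    {"id": 2, "name": "Ben", "email": None},            # incomplete record
    {"id": 3, "name": "Cy",  "email": "not-an-email"},  # inaccurate record
]

REQUIRED = ("id", "name", "email")

def completeness_errors(rows):
    """Data completeness: every required field must be present and non-empty."""
    return [r["id"] for r in rows
            if any(r.get(f) in (None, "") for f in REQUIRED)]

def accuracy_errors(rows):
    """Data accuracy: a crude format check on the email field."""
    return [r["id"] for r in rows
            if r.get("email") and "@" not in r["email"]]

incomplete = completeness_errors(rows)   # ids failing the completeness check
inaccurate = accuracy_errors(rows)       # ids failing the accuracy check
```

The same pattern extends to the other techniques: transformation testing compares these checks before and after an ETL step, and data comparison testing runs them at different points of the data flow.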

35 Help menus
A part of a computer program that gives instructions and information about how to use the program.
Pop-ups
A window that suddenly appears (pops up) when you select an option with a mouse or press a special function key. Usually, the pop-up window contains a menu of commands and stays on the screen only until you select one of the commands. It then disappears.
Hot-spots
A page in either the index or data file that every job wants to access at the same time.

36 Resources
