Agenda
- Test Planning
- Test Design
- Test Analysis
- Test Design Techniques
- Static Techniques
- Dynamic Techniques
- Choosing a Test Design Technique
- Test Design Specification Structure
- Test Design Specification Examples
- Homework
Test Planning
Test Planning – the process of defining and documenting the strategy that will be used to verify and ensure that a product or system meets its design specifications and other requirements.
The Test Plan document should be created by QC management (QC Analyst / QC Lead / QC Manager) and answer the following questions:
- How will the testing be done?
- Who will do it?
- What will be tested?
- How long will it take?
- What will the test coverage be, i.e. what quality level is required?
Test Plan formats can be as varied as the products and organizations to which they apply, but three major elements should be described in every Test Plan:
- Test Coverage
- Test Methods
- Test Responsibilities
Test Plan According to the IEEE 829 Standard
IEEE 829 – Standard for Software Test Documentation. According to IEEE 829, a Test Plan consists of:
- Test plan identifier
- Introduction
- Test items
- Features to be tested
- Features not to be tested
- Approach
- Item pass/fail criteria
- Suspension criteria and resumption requirements
- Test deliverables
- Testing tasks
- Environmental needs
- Responsibilities
- Staffing and training needs
- Schedule
- Risks and contingencies
- Approvals
Test Design
Test Design Phase – in software engineering, the process of reviewing and analyzing the test basis, selecting test design techniques, and creating the test cases, checklists and scenarios for testing the software.
Test Design Specification – a document that describes the features to be tested and specifies the list of all test scenarios or test cases that should be designed to provide the testing of the software.
The test design does not record the values to be entered for a test, but describes the requirements for defining those values.
Test design may require all or some of:
- Knowledge of the software and the business area it operates in
- Knowledge of the functionality being tested
- Knowledge of testing techniques and heuristics
- Planning skills, to schedule the order in which the test cases should be designed, given the effort, time and cost needed, or the consequences for the most important and/or risky features
Training Content (process flow)
Inputs: SRS, mock-ups, Test Plan
Steps: Review and Analyze Test Basis → Select Test Design Techniques → Create Test Design Specification → Create Test Case Specification
Outputs: Test Design Specification, Test Case Specification
Test Analysis
Test Analysis is the process of looking at something that can be used to derive test information. This basis for the tests is called the 'test basis'.
Test analysis has the following major tasks, in approximately the following order:
- Review the test basis
- Define test conditions
- Evaluate the testability of the requirements and system
- Define the test environment
Test Basis – all documents from which the requirements of a component or system can be inferred (the documentation on which the test cases are based).
Test Condition – an item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute or structural element.
Traceability – the ability to identify related items in documentation and software, such as requirements with associated tests. There are two kinds:
- Horizontal traceability
- Vertical traceability
Test Design Techniques
Test design techniques are used to derive and/or select test cases. Why are they important? There are two main categories of test design techniques:
- Static: the fundamental objective of static testing is to improve the quality of software work products by helping engineers recognize and fix their own defects early in the software development.
- Dynamic: testing that involves the execution of the software of a component or system.
Static Techniques
- Informal Review – a review not based on a formal (documented) procedure.
- Walkthrough – a step-by-step presentation by the author of a document in order to gather information and to establish a common understanding of its content.
- Technical Review – a peer group discussion activity that focuses on achieving consensus on the technical approach to be taken.
- Inspection – a type of peer review that relies on visual examination of documents to detect defects. The most formal review technique, and therefore always based on a documented procedure.
- Control Flow Analysis – a form of static analysis based on a representation of unique paths (sequences of events) in the execution through a component or system. It evaluates the integrity of control flow structures, looking for possible anomalies such as closed loops or logically unreachable process steps.
- Data Flow Analysis – a form of static analysis based on the definition and usage of variables.
Dynamic Techniques
- Specification-Based – testing, either functional or non-functional, without reference to the internal structure of the component or system:
  - Equivalence Partitioning
  - Boundary Value Analysis
  - Decision Tables
  - State Transition
  - Use Case Testing
- Structure-Based:
  - Statement
  - Decision
  - Condition
  - Multiple Condition
- Experience-Based:
  - Error Guessing
  - Exploratory Testing
Equivalence Partitioning
Equivalence partitioning (EP) – a black-box test design technique in which test cases are designed to execute representatives from equivalence partitions. In principle, test cases are designed to cover each partition at least once.
Idea: divide (i.e. partition) the set of test conditions into groups or sets that can be considered the same (i.e. the system should handle them equivalently), hence "equivalence partitioning".
Example: a bank introduces a new deposit program for corporate clients. Depending on the amount of money deposited, the client earns a different interest rate. The minimum that can be deposited is $1; the maximum is $999. A client who deposits less than $500 earns 5% interest; if the amount deposited is $500 or higher, the client earns 10% more (i.e. 15%).
Partitions:
- Invalid: $0 and below
- Valid, 5% interest: $1–$499
- Valid, 15% interest: $500–$999
- Invalid: $1000 and above
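The partitions above can be sketched in code. This is a minimal illustration, not part of the original material: `deposit_interest` is a hypothetical function that implements the stated rules, and the test picks one representative value per partition.

```python
def deposit_interest(amount):
    """Return the interest rate (%) for a deposit, per the example rules.

    Valid deposits are $1..$999; below $500 earns 5%,
    $500 and above earns 10% more, i.e. 15%.
    """
    if amount < 1 or amount > 999:
        raise ValueError("deposit must be between $1 and $999")
    return 5 if amount < 500 else 15

# Equivalence partitioning: one representative value per partition.
assert deposit_interest(250) == 5     # valid partition $1-$499
assert deposit_interest(700) == 15    # valid partition $500-$999
for invalid in (0, 1000):             # invalid partitions below and above the range
    try:
        deposit_interest(invalid)
        assert False, "expected the invalid amount to be rejected"
    except ValueError:
        pass
```

Four test cases are enough here: any other value inside a partition should, by the EP assumption, behave the same as its representative.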
Boundary Value Analysis
Boundary value – an input or output value on the edge of an equivalence partition, or at the smallest incremental distance on either side of an edge; for example, the minimum or maximum value of a range.
Boundary value analysis (BVA) – a black-box test design technique in which test cases are designed based on boundary values.
Idea: divide test conditions into sets and test the boundaries between these sets.
Example: for the same deposit program (minimum $1, maximum $999, 5% interest below $500, 15% from $500 up), the boundaries are:
- $0 / $1 – invalid / valid for 5% interest
- $499 / $500 – valid for 5% / valid for 15% interest
- $999 / $1000 – valid for 15% interest / invalid
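The boundary pairs can likewise be checked directly. A minimal sketch, reusing the same hypothetical `deposit_interest` rules; the six test values sit exactly on either side of each partition edge.

```python
def deposit_interest(amount):
    """Hypothetical implementation of the deposit example rules."""
    if amount < 1 or amount > 999:
        raise ValueError("deposit must be between $1 and $999")
    return 5 if amount < 500 else 15

# Boundary value analysis: test values on each side of every edge.
valid_boundaries = {1: 5, 499: 5, 500: 15, 999: 15}
for amount, expected_rate in valid_boundaries.items():
    assert deposit_interest(amount) == expected_rate

for invalid_boundary in (0, 1000):    # just outside the valid range
    try:
        deposit_interest(invalid_boundary)
        assert False, "expected the out-of-range amount to be rejected"
    except ValueError:
        pass
```

Note how BVA adds cases that EP alone would miss: an off-by-one bug such as `amount <= 500` would pass the EP representatives ($250, $700) but fail at the $500 boundary.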
Decision Tables
Decision table – a table showing combinations of inputs and/or stimuli (causes) with their associated outputs and/or actions (effects), which can be used to design test cases.
Example: if you hold an 'over 60s' rail card, you get a 34% discount on whatever ticket you buy. If you hold a family rail card and you are traveling with a child (under 16), you get a 50% discount on any ticket. If you are traveling with a child (under 16) but do not have a family rail card, you get a 10% discount. You can use only one type of rail card.
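The rules can be turned into a decision table with three conditions (over-60s card, family card, child under 16) and one action (discount). A sketch, with one test case per table column; the function and its tie-breaking rule when two discounts could apply (the larger one wins) are assumptions, not stated in the example.

```python
def discount(over60_card, family_card, child_under_16):
    """Hypothetical implementation of the rail-card decision table.

    Assumptions: only one rail card may be held at a time, and when
    two rules could apply the larger discount is granted.
    """
    if over60_card and family_card:
        raise ValueError("only one type of rail card may be used")
    if family_card and child_under_16:
        return 50
    if over60_card:
        return 34
    if child_under_16:
        return 10
    return 0

# One test case per decision-table column:
assert discount(True,  False, False) == 34  # over-60s card
assert discount(False, True,  True)  == 50  # family card + child
assert discount(False, False, True)  == 10  # child, no family card
assert discount(False, True,  False) == 0   # family card alone: no rule applies
assert discount(False, False, False) == 0   # no card, no child
```

Writing the table out first makes the gaps visible: the source text never says what a family card without a child earns, which is exactly the kind of question a decision table forces you to ask.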
State Transition
State transition – a transition between two states of a component or system.
State transition testing – a black-box test design technique in which test cases are designed to execute valid and invalid state transitions.
Example: entering a Personal Identity Number (PIN) for a bank account. In the original diagram the states are shown as circles, the transitions as arrows, and the events as text near the transitions:
- Start → (card inserted) → Wait for PIN (1st try)
- Any try → (PIN OK) → Access to account
- 1st try → (PIN not OK) → 2nd try; 2nd try → (PIN not OK) → 3rd try
- 3rd try → (PIN not OK) → Eat card
Use Case Testing
Use case testing – a technique that helps us identify test cases that exercise the whole system on a transaction-by-transaction basis from start to finish.
- Use cases describe the process flows through a system based on its most likely use.
- This makes the test cases derived from use cases particularly good for finding defects in the real-world use of the system.
- Each use case usually has a mainstream (most likely) scenario and sometimes additional alternative branches (covering, for example, special cases or exceptional conditions).
- Each use case must specify any preconditions that need to be met for the use case to work.
- Use cases must also specify postconditions: the observable results and a description of the final state of the system after the use case has been executed successfully.
Structure-Based Techniques
Procedure to derive and/or select test cases based on an analysis of the internal structure of a component or system. The structure-based (white-box) branch of the dynamic techniques comprises:
- Statement
- Decision
- Condition
- Multiple Condition
Structure-Based Techniques
Types of structure-based techniques:
- Statement – testing aimed at exercising programming statements. If we aim to test every executable statement, we call this full or 100% statement coverage.
- Decision – a white-box test design technique in which test cases are designed to execute decision outcomes.
- Condition – a white-box test design technique in which test cases are designed to execute condition outcomes, i.e. the evaluation of a single condition to True or False.
- Multiple Condition – a white-box test design technique in which test cases are designed to execute combinations of single condition outcomes (within one statement).
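The difference between condition and multiple condition coverage is easiest to see on one compound decision. A sketch; the `approve` function is a made-up example, not from the source.

```python
def approve(has_income, has_collateral):
    # One decision containing two single conditions.
    if has_income and has_collateral:
        return "approved"
    return "rejected"

# Condition coverage: each single condition takes both True and False.
# Two cases suffice, yet note the decision as a whole is False in both:
assert approve(True, False) == "rejected"   # has_income=True, has_collateral=False
assert approve(False, True) == "rejected"   # has_income=False, has_collateral=True

# Multiple condition coverage: all 2^2 combinations of the single conditions.
expected = {(True, True): "approved", (True, False): "rejected",
            (False, True): "rejected", (False, False): "rejected"}
for args, outcome in expected.items():
    assert approve(*args) == outcome
```

This is why multiple condition coverage is the stronger criterion: the two-case condition-coverage suite above never even executes the "approved" branch.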
Statement Testing
Statement – an entity in a programming language, typically the smallest indivisible unit of execution.
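A small illustrative example (not from the source) of what 100% statement coverage demands:

```python
def describe(n):
    result = "non-negative"   # statement 1: always executed
    if n < 0:
        result = "negative"   # statement 2: executed only when n < 0
    return result             # statement 3: always executed

# A single call with n >= 0 executes statements 1 and 3 but misses
# statement 2, so statement coverage is 2/3. Adding one negative
# input brings it to 100%:
assert describe(5) == "non-negative"
assert describe(-3) == "negative"   # now every statement has been executed
```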
Decision Testing
A decision is an IF statement, a loop control statement (e.g. DO-WHILE or REPEAT-UNTIL), or a CASE statement: any point where there are two or more possible exits or outcomes.
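An illustrative example (not from the source): decision coverage requires each outcome of every decision to be exercised at least once.

```python
def classify(n):
    if n % 2 == 0:          # one decision with two outcomes: True and False
        return "even"
    return "odd"

# Decision coverage needs both outcomes of the IF:
assert classify(4) == "even"   # decision outcome True
assert classify(7) == "odd"    # decision outcome False
```

Note that a single call such as `classify(4)` would already give 100% statement coverage of the True branch's statements, but only 50% decision coverage, which is why decision coverage is the stronger of the two criteria.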
Experience-Based Techniques
Procedure to derive and/or select test cases based on the tester's experience, knowledge and intuition. The experience-based branch of the dynamic techniques comprises:
- Error Guessing
- Exploratory Testing
Experience-Based Techniques
- Error guessing – a technique that should always be used as a complement to other, more formal techniques. Its success depends heavily on the skill of the tester, as good testers know where defects are most likely to lurk.
- Exploratory testing – a hands-on approach in which testers are involved in minimum planning and maximum test execution.
Choosing a Test Design Technique
Which technique is best? This is the wrong question! Each technique is good for certain things and not as good for others. Some techniques are more applicable to certain situations and test levels; others are applicable to all test levels.
Internal factors that influence the decision about which technique to use:
- Tester knowledge and experience
- Expected defects
- Test objectives
- Documentation
- Life cycle model
External factors that influence the decision:
- Risks
- Customer and contractual requirements
- System type
- Regulatory requirements
- Time and budget
Test Design Specification Structure
According to the IEEE 829 standard, the template is structured as follows:
1. Test Design Specification Identifier
   1.1 Purpose
   1.2 References
   1.3 Definitions, acronyms and abbreviations
2. Features to be Tested
3. Approach Refinements
4. Test Identification
   4.1 <Test Item 1>
   4.2 <Test Item …>
   4.3 <Test Item N>
5. Feature Pass/Fail Criteria
Test Design Specification Structure
The Test Design Specification Identifier section covers:
- Purpose of the document
- Scope of the document
- List of references, which should include references to the test plan, functional specification, test case specification, etc.
- Definitions, acronyms and abbreviations used in the Test Design Specification
Features to be Tested identifies the test items and describes the features and combinations of features that are the object of this design specification. A reference to the Functional Specification should be included for each feature or combination of features.
The Approach Refinements section describes:
- Specific test techniques to be used for testing features or combinations of features
- Types of testing that will be performed
- Methods of analyzing test results
- Test results reporting
- Whether automation of test cases will be provided or not
- Any other information that describes the approach to testing
Test Design Specification Structure
Feature Pass/Fail Criteria specifies the criteria used to determine whether the feature or feature combination has passed or failed. The following items can be considered pass/fail criteria:
- The feature works according to the stated requirements
- The feature works correctly on the test platforms
- The feature works correctly with other modules of the application
- All issues with High and Medium priority have been verified and closed
Test Design Specification Structure
The Test Identification section is divided into sub-sections, one per test item, identifying the future documentation that will be created for testing the features or combinations of features that are the object of this design specification.
Features can be covered by test objectives in different ways, depending on project needs, testing approaches, etc. Consider three examples of such coverage:
- Feature covered by test cases
- Feature covered by test scenarios
- Feature covered by a checklist
Real Example: User Registration Page
Business value: I, as an Administrator user, should be able to create a simple user account to log in to the application.
Functional requirements: the 'User Registration' page should contain three fields – 'User Name', 'Password', 'Confirm Password' – and two buttons – 'Save' and 'Cancel'.
- The 'User Name' field is limited to 10 symbols and should contain Latin letters only. The field is empty by default. The user name should be unique in the system.
- The 'Password' field should be no less than 4 symbols long and should include only numbers and Latin letters. The field is empty by default.
- The 'Confirm Password' field should be equal to 'Password'. The field is empty by default.
- The 'Cancel' button cancels account creation and closes the 'User Registration' page.
- The 'Save' button validates the data entered into the fields on the 'User Registration' page and creates the user account if the entered data are correct, or shows error dialogs if validation fails. Validation should be performed in the following order: User Name, Password, Confirm Password.
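The requirements above translate directly into a validation routine. A minimal sketch, assuming the stated field rules and validation order; the function name, error-message texts and the `existing_users` parameter are illustrative assumptions, not part of the specification.

```python
import re

def validate_registration(user_name, password, confirm_password, existing_users):
    """Validate the fields in the required order: User Name, Password, Confirm.

    Returns None on success, or the first error found. Message texts
    are placeholders, not taken from the spec.
    """
    # User Name: 1-10 Latin letters, unique in the system.
    if not (1 <= len(user_name) <= 10) or not re.fullmatch(r"[A-Za-z]+", user_name):
        return "invalid user name"
    if user_name in existing_users:
        return "user name already exists"
    # Password: at least 4 symbols, only Latin letters and digits.
    if len(password) < 4 or not re.fullmatch(r"[A-Za-z0-9]+", password):
        return "invalid password"
    # Confirm Password: must match Password.
    if confirm_password != password:
        return "passwords do not match"
    return None

assert validate_registration("Alice", "pass1234", "pass1234", set()) is None
assert validate_registration("Alice2", "pass1234", "pass1234", set()) == "invalid user name"
assert validate_registration("Alice", "p1", "p1", set()) == "invalid password"
assert validate_registration("Alice", "pass1234", "other", set()) == "passwords do not match"
assert validate_registration("Alice", "pass1234", "pass1234", {"Alice"}) == "user name already exists"
```

The check order in the code mirrors the required validation order, which matters for the test designs that follow: a test entering a bad user name *and* a bad password must expect the user-name error dialog first.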
Real Example: Test Item "User Registration"

Requirement: 'Save' button functionality
- Creating new user account and save – verifies that a user account can be created if all fields on the 'User Registration' page are filled with correct data, and that the page is closed on the save action.

Requirement: 'Cancel' button functionality
- Creating new user account and cancel – verifies that a user account is not created after filling in the fields on the 'User Registration' page and canceling, and that the page is closed on the cancel action.

Requirement: Default values
- Default values on the 'User Registration' page – verifies that all fields on the 'User Registration' page are blank by default.

Requirement: 'User Name' field validation
- Error dialog on saving user account with too long user name – verifies that an error dialog appears on save if the user name is too long: 1) boundary length – 11 characters; 2) restricted length – more than 11 characters.
- Error dialog on saving user account with blank 'User Name' field – verifies that an error dialog appears on save if the 'User Name' field is blank.
- Verify boundary length for user name – verifies that a user account with a user name of boundary length 1 or 10 can be created.
- Error dialog on saving user account with wrong user name – verifies that an error dialog appears on save if the 'User Name' field includes: 1) special symbols; 2) numbers; 3) both.
- Error dialog on saving already existing user account – verifies that an error dialog appears on save if the user already exists in the system.

Requirement: 'Password' field validation
- Error dialog on saving user account with too short password – verifies that an error dialog appears on save if the password is too short: 1) boundary length – 3 characters; 2) restricted length – less than 3 characters.
- Error dialog on saving user account with blank 'Password' field – verifies that an error dialog appears on save if the password is blank.
- Verify boundary length for password – verifies that a user account with a password of boundary length 4 can be created.
- Error dialog on saving user account with incorrect password – verifies that an error dialog appears on save if the 'Password' field includes special symbols.

Requirement: 'Confirm Password' field validation
- Error dialog on saving user account with unequal password and confirm password – verifies that an error dialog appears on save if: 1) the 'Confirm Password' field is blank; 2) the password and confirm password do not match.
Test Design and Techniques: Homework
- Create a Test Design Specification based on the Software Requirements Specification.
- Practice using test design techniques: design test objectives using the dynamic test design techniques.