How to Create a Test Strategy


1 How to Create a Test Strategy
How to Create a Test Strategy
Paul Gerrard, Intelligent Assurance and Testing, 13-Apr-17
Web:

2 Intelligent Testing and Assurance
Paul Gerrard is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them. Educated at the University of Oxford and Imperial College London, Paul won the EuroSTAR European Testing Excellence Award in 2010. In 2012 he co-authored "The Business Story Pocketbook" with Susan Windsor. He is Principal of Gerrard Consulting Limited and is the host of the UK Test Management Forum and the UK Business Analysis Forum.

3 Intelligent Testing and Assurance
Agenda
Your test strategy challenges?
What is test strategy?
Test strategy approach
Test axioms as thinking tools
Using the First Equation of Testing
Testing in staged projects
Goals, risks and designing the test process
Goals, risks and coverage-based test reporting
Communicating test strategies
Case study exercise
Your challenges revisited (if we have time).
NB some slides are hidden and won’t be presented today.

4 Overview This is a full day course (or 2 day, customised for clients)
I won’t talk over every slide. We might not get through the whole Case Study exercise in detail, but you WILL get some practice through the Q&A and conversations about test strategy.

5 What Are Your Test Strategy Challenges?

6 What Are Your Test Strategy Challenges?
Ever written a test strategy that no one read?

7 What is Test Strategy? Dig a Hole? Test a system?

8 Test strategy answers questions
Before you can plan a test, you need to have a lot of questions answered. The strategy:
Presents some decisions that can be made ahead of time
Defines the process, method or information that will allow decisions to be made (in project)
Sets out the principles (or process) to follow for uncertain situations or unplanned events.

9 Test Strategy ‘evolution’
Success-based: test to show it works
Defect-based: test to find bugs
Coverage-based: analyse requirements and code to achieve test coverage targets
Risk-based: use risk to focus testing and to inform risk assessment
Goal-based: use business goals and risk to focus testing and support decision-making.
FOCUS: from programmer → tester → stakeholder

10 Intelligent Testing and Assurance
Goal-Based Testing
Multiple objectives in multiple test stages
Objectives vary with product readiness (testing in the small → testing in the large)
Objectives vary by role: developer → tester → user → sponsor
Multiple stakeholders: sponsor, users, developers, support staff, project management
Different test models for each test objective
Projects are unique; their goals and risks are unique, and therefore the test approach is unique too.

11 Test Strategy defines…
Who the testing stakeholders are
What their concerns are and how those concerns will be used to scope testing
What, when and how testing will report to stakeholders
Who will perform the testing and reporting
The human and technical resources required to deliver
What might change, what could go wrong and how testers can recover the situation.

12 Who are the stakeholders?
Those who focus on risk and business goals:
Sponsors, project stakeholders
Business users
Project management
Those who focus on contractual aspects, stage payments etc:
Software suppliers
Contracts people.

13 Who are the stakeholders? 2
Those who focus on their risks and responsibilities:
Suppliers
Developers
System Test Team
UAT Team
Those who focus on meeting business and technical requirements:
Technical Architects
Operations
Technical Support.

14 Test Strategy and Approach
Strategy is a thought process not a document

15 Intelligent Testing and Assurance
Test Strategy ToC from 1994
1. Introduction: 1.1 Version History; 1.2 Purpose; 1.3 Scope; 1.4 Background; 1.5 Assumptions and Dependencies; 1.6 Summary
2. Overview of User Testing: 2.1 Objectives; 2.2 User Testing and the Overall Project; 2.3 Functional Requirements Testing; 2.4 The Need for Technical Requirements Testing; 2.5 Starting Early - Front-loading
3. User Test Policy: 3.1 Baseline for Testing; 3.2 Contract Acceptance Criteria (XXX and Supplier Responsibilities, Testing and QA, Quality Plan); 3.3 Testing Criteria; 3.4 Risk Criteria; 3.5 Starting Criteria; 3.6 Policy for Re-Testing; 3.7 Policy for Regression Testing; 3.8 Completion Criteria; 3.9 Handover to Production (Documentation Plan, User Test Strategy, Test Plan, Test Log, Incident Log, Error Log, Test Report)
4. Functional Requirements Testing: 4.1 Approach; 4.2 Process; 4.3 Special Application Needs
5. Technical Requirements Testing: 5.1 Usability testing (Requirements, Conducting Usability Tests); 5.2 Performance testing (Requirements for Performance Testing, Performance Test Cases); 5.3 Conversion Testing; 5.4 Security testing (Security Tests, Security Test Cases); 5.5 Documentation Testing; 5.6 Volume testing; 5.7 Stress testing; 5.8 Storage testing; 5.9 Recovery Testing; Installation testing; Reliability Testing; Serviceability Testing; Portability Testing; Tests Not Required by the Users
6. User Test Infrastructure: 6.1 Test Environment (Support, Roles and Responsibilities, Test Environment); 6.2 Tools and Automation (Comparators, Test Data Generators, Capture/Replay Tools, Testing Information Systems, Database Query and Maintenance Facilities, Transaction Simulators)
7. Schedule: 7.1 Milestone Plan; 7.2 Activities to be Resourced; 7.3 Skills Required
8. User Test Execution: 8.1 Acceptance Test Procedure (Pre-Test Meeting, During the Test, Post-Test Meeting); 8.2 Software Delivery; 8.3 Testing to Plan; 8.4 Handling Failures; 8.5 Logging Tests; 8.6 Error Classification; 8.7 Controlling Releases of New Versions; 8.8 Regression Testing; 8.9 Documentation
This is the table of contents of a 51-page document for an acceptance test of an outsourced development (written by me). A large safety-related system might have 150 pages of test strategy supported by other risk-related documents. "Does size matter?"

16 Contexts of Test Strategy
[Diagram] Test Strategy at the centre of its contexts: Axioms; Communication; Early Testing; Risks; De-Duplication; Opportunities; Goals; Automation; Culture; Contract; User involvement; Constraints (human resource, artefacts, skills, environment, process (lack of?), timescales).

17 Introducing the Test Axioms

18 Intelligent Testing and Assurance
Test Axioms Formulated as a context-neutral set of rules for testing systems They represent the critical thinking processes required to test any system There are clear opportunities to advance the practice of testing using them Testers Pocketbook: Test Axioms Website

19 How can we use Test Axioms?
Test Axioms are not beginners’ guides
They can help you to think critically about testing
They expose flaws in other people’s thinking and their arguments about testing
They generate some useful by-products
They help you to separate context from values
Interesting research areas! First Equation of Testing, Testing Uncertainty Principle, Quantum Theory, Relativity, Exclusion Principle... You can tell I like physics

20 The Axioms are thinking tools

21 Intelligent Testing and Assurance
The Axioms Stakeholder Basis Oracle Fallibility Scope Value Coverage Never-Finished Delivery Good-Enough Environment Repeat-Test Event Design Prioritisation Sequencing

22 Intelligent Testing and Assurance
Grouping the Axioms Stakeholder Basis Oracle Fallibility Scope Value Coverage Never-Finished Delivery Good-Enough Environment Repeat-Test Event Design Prioritisation Sequencing

23 Intelligent Testing and Assurance
The three axiom groups Stakeholder Value Scope Fallibility Good-Enough Delivery Repeat-Test Sequence Environment Event Never-finished Design Basis Coverage Prioritisation Oracle

24 Testing needs stakeholders (p64)
Summary: Identify and engage the people or organisations that will use and benefit from the test evidence we are to provide
Consequence if ignored or violated: There will be no mandate or authority for testing. Reports of passes, fails or enquiries have no audience.
Questions: Who are they? Whose interests do they represent? What evidence do they want? What do they need it for? When do they want it? In what format? How often?

25 Test design is based on models (p68)
Summary: Choose test models to derive tests that are meaningful to stakeholders. Recognise the models’ limitations and the assumptions that the models make
Consequence if ignored or violated: Test design will be meaningless and not credible to stakeholders.
Questions: Are design models available to use as test models? Are they mandatory? What test models could be used to derive tests from the Test Basis? Which test models will be used? Are test models to be documented or are they purely mental models? What are the benefits of using these models? What simplifying assumptions do these models make? How will these models contribute to the delivery of evidence useful to the acceptance decision makers? How will these models combine to provide sufficient evidence without excessive duplication? How will the number of tests derived from models be bounded?
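As a minimal sketch of how a test model bounds the number of tests, here is a hypothetical equivalence-class model of a numeric input field (an "age" field accepting 18..65 is invented for illustration; the rule under test and all names are assumptions, not from the slides):

```python
# Hypothetical sketch: deriving test values from a simple equivalence-class
# model of an input field (an "age" field accepting 18..65, invented here).
def partitions(lo, hi):
    """Return (class name, representative value) pairs for a range model."""
    return [
        ("below-range", lo - 1),        # invalid: just under the lower bound
        ("lower-bound", lo),            # valid: on the lower bound
        ("mid-range", (lo + hi) // 2),  # valid: a typical value
        ("upper-bound", hi),            # valid: on the upper bound
        ("above-range", hi + 1),        # invalid: just over the upper bound
    ]

def is_valid_age(age, lo=18, hi=65):
    """The (assumed) rule under test: accept ages from lo to hi inclusive."""
    return lo <= age <= hi

# Each partition yields exactly one test case, so the model itself bounds
# the number of tests derived from it.
cases = partitions(18, 65)
results = {name: is_valid_age(value) for name, value in cases}
```

The simplifying assumption the model makes (all values inside one class behave alike) is exactly the kind of assumption the axiom says you must recognise.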

26 Test execution requires a known, controlled environment (p77)
Summary: Establish the need and requirements for an environment and test data to be used for testing, including a mechanism for managing changes to that environment – in good time.
Consequence if ignored or violated: Environments are not available in time or are unsuitable for testing. This will delay testing or cause tests to be run in the wrong environment and undermine the credibility of evidence produced.
Questions: Who is responsible for the acquisition, configuration and support of test environments? What assumptions regarding test environments do our test models make? How will requirements for test environments be articulated and negotiated? How will the validity and usability of test environments be assured? How will changes to environments be managed, consistent with changes in requirements and other deliverables under test? How will the state of environments, including backed-up and restored versions, be managed?
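One way to act on this axiom is a pre-flight check that the environment really is the known, controlled one before any test runs. This is a sketch only; the baseline keys and values (`app_version`, `db_schema`, `test_data_set`) are invented for illustration:

```python
# Hypothetical sketch: verify the test environment matches the agreed
# baseline before execution starts. The expected values are invented.
EXPECTED = {"app_version": "2.3.1", "db_schema": 14, "test_data_set": "regression-v7"}

def environment_ok(actual, expected=EXPECTED):
    """Compare the actual environment record against the expected baseline.
    Returns (ok, mismatches) where mismatches maps key -> (expected, actual)."""
    mismatches = {k: (expected[k], actual.get(k))
                  for k in expected if actual.get(k) != expected[k]}
    return (not mismatches, mismatches)

# A drifted environment (wrong schema version) is caught before testing.
ok, diffs = environment_ok({"app_version": "2.3.1", "db_schema": 13,
                            "test_data_set": "regression-v7"})
```

Failing this check early is cheaper than discovering, after a test run, that the evidence was produced in the wrong environment.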

27 Creating a Test Strategy

28 First Equation of Testing
The First Equation of Testing: Axioms + Context + Values + Thinking = Approach
Not an equation in the mathematical sense: you need to consider the three key aspects (axioms, context, values) and do a lot of thinking.

29 Why is the equation useful?
Separation of Axioms, context, values and thinking. Tools, methodologies, certification and maturity models promote approaches without reference to your context or values: no thinking is required! Without a unifying test theory you have no objective way of assessing these products.

30 One context, multiple approaches
Given a context, practitioners can promote different approaches based on their values. Values are preferences or beliefs:
Pre-planned v exploratory
Predefined v custom process
Requirements-driven v goal-based
Standard documentation v face-to-face comms.
Some contexts preclude certain practices: “No best practices”

31 Axioms allow (ensure) different approaches and expose positions
Separating axioms, context and values clarifies positions, for example:
‘Structured’ test advocates have little (useful) to say about Agile contexts
Exploratory test advocates have little (useful) to say about contract/requirements-based acceptance
The disputes between these positions are more about values than practices in context.

32 Testing in Staged Projects
Testing in Staged Projects: The V-Model, W-Model and Goal-Based Approaches (PAUL hands over to NEIL)

33 Intelligent Assurance and Testing
V-Model: Requirements → User Acceptance Test; Functional Specification → System Test; Physical Design → Integration Test; Program Specification → Unit Test
Is there ever a one-to-one relationship between baseline documents and testing?
Where is the static testing (reviews, inspections, static analysis etc.)?
NEIL: Now, you might be thinking, isn’t the V-model old-fashioned; doesn’t this risk-based testing replace it? Our answers are yes and no: yes, the V-model is old-fashioned, but it need not be as rigid as many people think; and no, risk-based testing doesn’t replace it as such, but we base our approach on an enhancement of the V-model. So what’s wrong with the basic V-model? Many criticisms have been made of it, but we pick on two here: [ANIMATION 1] one-to-one arrows; [ANIMATION 2] no visibility of static testing. Our enhancement of the V-model we call... [NEXT SLIDE] <2 MINUTES>

34 Intelligent Assurance and Testing
Work products
Project documents: schedule, quality plan, test strategy, standards
Deliverables: requirements, designs, specifications, user documentation, procedures
Software: custom built or COTS components, sub-systems, systems, interfaces
Infrastructure: hardware, O/S, network, DBMS
Transition plans, conversion software, training...
NEIL: In describing the W-model we emphasised how all work products should get tested somewhere, but not necessarily in a rigid, one-to-one correspondence with test stages. In the widest sense, everything on this slide is a work product. But in the same way as we distinguished (in an earlier slide) product risks from project and process risks, it’s worth clarifying our scope as testers by distinguishing those work products (we call them deliverables for convenience) which are the basis for risk-based testing. The risks associated with project documents are project and process risks, and are the concern of project management and quality management (DON’T GET BOGGED DOWN IN QA V QC). The risks of faults occurring in deliverables, whether they be documents or executable software, are what we base testing on. <1 MINUTE>

35 What do we mean by testing?
What do we mean by testing? Testing is the process of evaluating the deliverables of a software project to:
detect faults so they can be removed
demonstrate products meet their requirements
gain confidence that products are ready for use
measure and reduce risk
Testing includes: static tests (reviews, inspections etc.) and dynamic tests (unit, system, acceptance tests etc.)
NEIL: So... to summarise our introduction, Part I of today’s tutorial... (QUESTION FOR PAUL: DOES TESTING TEST PROJECT SCHEDULE, QUALITY PLAN, TEST STRATEGY, STANDARDS ETC? I GUESS NOT, BUT IF YOU MEAN IT TO, WE NEED TO CHANGE deliverables BACK TO products ON THIS SLIDE AND I NEED TO CHANGE EXPLANATION OF PREVIOUS SLIDE) <HALF MINUTE>

36 Intelligent Assurance and Testing
W-Model: Write Requirements → Test the Requirements; Specify System → Test the Specification; Design System → Test the Design; Write Code → Unit Test; Build Software → Integration Test; Build System → System Test; Install System → Acceptance Test
NEIL: ... the W-model. It retains the correspondence of test activities to specification and software development activities, which is the basis of many project and quality management methods, but: it adds static testing to the overall framework for verification and validation (this includes the practice of early test preparation as part of “test the requirements, test the specification etc.”); and it subtracts the rigid arrows between test activities and baseline documents. We might not have a full set of those documents, and some of those we do have may be out-of-date or faulty, so we may need, for example, to base system testing on elements of the original requirements and the design in addition to the system spec. In extreme cases some of those levels might exist only in people’s heads, yet we are still asked to test. This is one aspect of exploratory testing, which is receiving a lot of attention currently. To sum up: it’s a major point in the W-model that all work products get tested somewhere, whether those products are actual software or “merely” documents <3 MINUTES>

37 W-Model and static testing
W-Model and static testing
[Diagram] Examples of static testing overlaid on the W-Model: Requirements Animation; Early Test Case Preparation; Scenario Walkthroughs; Reviews; Inspections; Static Analysis; Inspection
NEIL: Here are some examples of static testing, in the broad sense <HALF MINUTE>

38 W-Model and dynamic testing
W-Model and dynamic testing
[Diagram] Examples of dynamic testing overlaid on the W-Model: Business Integration Testing; System Integration Testing; Performance Testing; Usability Testing; Security Testing; Boundary Value Testing; Equivalence Partitioning; Exploratory Testing; Path Testing
NEIL: ...and here are some examples of dynamic test types, approaches and techniques. IF ANYONE ASKS: path testing, equivalence partitioning and boundary value testing are techniques; usability, security, performance, SIT and BIT are techniques which are mapped, for e-business testing in particular, to test types in RBEBT; exploratory testing is a wider approach <1 MINUTE>

39 It’s usually more complicated: A real high level test plan
There were three streams of testing:
Core systems: the SAP modules plus new applications and interfaces
Non-core systems: enhancements and changes to existing legacy systems
All systems: the integrated whole, tested as a system of systems.
FDR is Full Dress Rehearsal (for the cutover). Programme managed

40 Intelligent Assurance and Testing
Goals:
The fundamental business objectives of the system(s) to be built, implemented and used
The benefits of undertaking the project
The payoff(s) that underpin and justify the project
Risks are what threaten the goals of a project. (PAUL)

41 Goal Based Test Strategy
The test strategy must set out how:
Achievements (the goals) of a project are evidenced or demonstrated
The risks that threaten goals will be explored, re-assessed and deemed acceptable (or not)
We need to understand the goals and how achievement will be measured
We need to understand (in particular, product) risks and how they are explored and exposed.

42 A goal network (aka results chain or logic model)
Every project has a network of dependent interim and ultimate goals, threatened by risks, leading to the ultimate business goal. Your strategy will identify the test activities that will measure goal achievement and provide evidence about these risks.

43 Intelligent Assurance and Testing
Introduction to Risk (PAUL)

44 Intelligent Assurance and Testing
The definition of risk
Italian dictionary: risicare, “to dare”
Simple generic definition: “The probability that undesirable events will occur”
In this tutorial, we will use this definition: “A risk threatens one or more of a project’s goals and has an uncertain probability” (PAUL)

45 Some general statements about risk
Some general statements about risk
Risks only exist where there is uncertainty
If the probability of a risk is zero or 100%, it is not a risk
Unless there is the potential for loss, there is no risk (“nothing ventured, nothing gained”)
There are risks associated with every project
Software development is inherently risky. (PAUL)

46 Three types of software risk
Three types of software risk
Project Risk: resource constraints, external interfaces, supplier relationships, contract restrictions
Process Risk: variances in planning and estimation, shortfalls in staffing, failure to track progress, lack of quality assurance and configuration management. Primarily a management responsibility: planning and the development process are the main issues here.
Product Risk: lack of requirements stability, complexity, design quality, coding quality, non-functional issues, test specifications. Testers are mainly concerned with Product Risk.
(PAUL) Requirements risks are the most significant risks reported in risk assessments.

47 Risk Management Process
Risk Management Process (PAUL)

48 Intelligent Assurance and Testing
Process
Risk identification: what are the risks to be addressed?
Risk analysis: nature, probability, consequences, exposure
Risk response planning: pre-emptive or reactive risk reduction measures
Risk resolution and monitoring
Stakeholders should be involved at all stages. (PAUL)
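The risk-analysis step above can be sketched as a simple exposure calculation. This is a hedged illustration, not the slide's method: the 1-5 probability and consequence scales, the exposure-as-product rule and the risk names are all assumptions commonly used in risk-based testing:

```python
# Hypothetical sketch of risk analysis: score each risk for probability and
# consequence (assumed 1-5 scales), compute exposure as their product, and
# rank risks so test effort follows the biggest exposures first.
risks = [
    {"risk": "calculations wrong", "probability": 4, "consequence": 5},
    {"risk": "poor performance",   "probability": 3, "consequence": 3},
    {"risk": "unusable UI",        "probability": 2, "consequence": 4},
]

def rank_by_exposure(risks):
    """Attach exposure = probability x consequence and sort, highest first."""
    for r in risks:
        r["exposure"] = r["probability"] * r["consequence"]
    return sorted(risks, key=lambda r: r["exposure"], reverse=True)

ranked = rank_by_exposure(risks)   # "calculations wrong" tops the list (20)
```

Ranked this way, the response-planning step can start with the risks whose exposure most threatens the project's goals.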

49 Intelligent Assurance and Testing
The danger slope Where we want to move all risks (PAUL)

50 Risk response planning
Risk response planning
Do nothing!
Pre-emptive risk reduction measures: information buying, process model, risk influencing, contractual transfer
Reactive risk reduction measures: contingency plans, insurance
But this all sounds highly theoretical – we could never get this to work in my company!
Where testing fits in (PAUL)

51 Even penguins know how to manage risk!

52 Role of Testing in Product Risk Management
Role of Testing in Product Risk Management (PAUL)

53 Faults, failure and risk
Faults, failure and risk
System failures are what we fear; the faults that cause failures are our prey. Uncertainty is what makes us concerned: What type of faults are present in the system? How many faults are in the system? Did testing remove all the serious faults? Testing helps us to address these uncertainties. (PAUL)

54 Testing helps to reduce risk
Testing helps to reduce risk
If risk assessment steers test activity, we design tests to detect faults and so reduce the risks caused by faulty products. Faults found early reduce rework, cost and time lost in later stages. Faults found are corrected and re-tested, and so the quality of all products is improved. (PAUL)

55 Testing can measure risk
Testing can measure risk
Testing is a measurement activity. Tests that aim to find faults provide information on the quality of the product: which parts of the software are faulty, and which parts are not. Tests help us understand the risk of release; understanding the risks helps us to make a risk-based decision on release. After testing, our risk assessment can be refined. (PAUL)

56 Intelligent Assurance and Testing
The test passes…
The risk could be unchanged because:
Risk probability higher because:
Risk probability lower because:
Risk consequence higher because:
Risk consequence lower because:
(PAUL) Ask the class to think of reasons why each of the five statements could be true for a test that passes.

57 Intelligent Assurance and Testing
The test fails…
The risk could be unchanged because:
Risk probability higher because:
Risk probability lower because:
Risk consequence higher because:
Risk consequence lower because:
(PAUL) Ask the class to think of reasons why each of the five statements could be true for a test that fails.

58 Why use risks to define test objectives?
If we focus on risks, we know that bugs relating to the selected mode of failure are bound to be important
If we focus on particular bug types, we will probably be more effective at finding those bugs
If testers provide evidence that certain failure modes do not occur in a range of test scenarios, we will become more confident that the system will work in production.

59 Risks as failure modes or bug types
Risks as failure modes or bug types
Risks describe ‘what we don’t want to happen’. Typical modes of failure:
calculations don’t work
pages don’t integrate
performance is poor
the user experience is uncomfortable
Think of them as ‘generic bug types’. (PAUL)

60 Defining a test objective from risk
Defining a test objective from risk
We ‘turn around’ the failure mode or risk:
Risk: a BAD thing happens, and that’s a problem for us
Test objective: demonstrate using a test that the system works without the BAD thing happening
The test: execute important user tasks and verify the BAD things don’t happen in a range of scenarios. (PAUL)
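The 'turn around' above can be sketched in code. Everything here is invented for illustration: the risk is "an order total is miscalculated", the system under test is a made-up pricing rule (10% discount on totals over 100), and the scenarios are the range over which we verify the BAD thing does not happen:

```python
# Hypothetical sketch: risk -> test objective -> scenarios.
def order_total(prices):
    """System under test (assumed rule): sum prices; 10% discount over 100."""
    total = sum(prices)
    return round(total * 0.9, 2) if total > 100 else round(total, 2)

# Scenarios drawn from the risk: typical, empty, boundary and discount cases.
# Each pair is (input prices, expected total).
scenarios = [
    ([10, 20], 30.0),    # typical order
    ([], 0.0),           # empty basket
    ([100], 100.0),      # boundary: no discount at exactly 100
    ([60, 60], 108.0),   # over the boundary: discount applies
]

# The test objective met: the BAD thing (a wrong total) happens in none of them.
failures = [(p, exp, order_total(p)) for p, exp in scenarios if order_total(p) != exp]
```

An empty `failures` list is the evidence the objective asks for; a non-empty one is the risk showing itself.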

61 Risk-based test objectives are usually not enough
Other test objectives relate to broader issues:
contractual obligations
acceptability of a system to its users
demonstrating that all or specified functional or non-functional requirements are met
non-negotiable test objectives, which might relate to mandatory rules imposed by an industry regulatory authority
and so on
Generic test objectives complete the definition of your test stages.

62 Generic test objectives
Generic test objectives (NEIL, to PAUL’s slide)

63 Tests as demonstrations
Tests as demonstrations
“Demonstrate” is the word most often used in test objectives. It is better than “prove”, which implies mathematical certainty (which is impossible). But is the word “demonstrate” too weak? No:
it represents exactly what we will do
we provide evidence for others to make a decision
we can only run a tiny fraction of the possible tests, so we really are only doing a demonstration of a small sample of tests. (NEIL, to PAUL’s slide)

64 But tests should aim to locate faults, shouldn't they?
The tester’s goal: to locate faults. We use boundary tests, extreme values, invalid data, exceptional conditions etc. to expose faults:
if we find faults, these are fixed and re-tested
we are left with tests that were designed to detect faults, some of which did detect faults, but do so no longer
We are left with evidence that the feature works correctly and our test objective is met. There is no conflict between strategic risk-based test objectives and the tactical goal of locating faults.
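The fault-finding-then-evidence lifecycle above can be sketched with a classic off-by-one bug. The qualifying rule (a pass mark of 50) and both implementations are invented for illustration; the point is that the same boundary-value test first locates the fault and, after the fix, stands as evidence:

```python
# Hypothetical sketch: a boundary test designed to locate a fault.
def qualifies_before_fix(score):
    return score > 50      # bug: a score of exactly 50 is wrongly rejected

def qualifies_after_fix(score):
    return score >= 50     # fixed: 50 and above qualify

# Boundary values: just below, on, and just above the assumed pass mark.
boundary_cases = [49, 50, 51]

# The boundary test detects the fault: the two versions disagree at 50.
found_fault = any(qualifies_before_fix(s) != qualifies_after_fix(s)
                  for s in boundary_cases)
```

After the fix, the identical boundary cases all pass, and the test's job quietly changes from locating faults to evidencing that the feature works.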

65 Testing and meeting requirements
Testing and meeting requirements
Risk-based test objectives do not change the methods of test design much.
Functional requirements: we use formal or informal test design techniques as normal.
Non-functional requirements: test objectives are often detailed enough to derive specific tests. (NEIL, to PAUL’s slide)

66 Goals and Risks and Designing the Test Process
Goals and Risks and Designing the Test Process (PAUL)

67 Test activities overlay the goal network (not all goals in scope)
[Diagram] Test phases/activities overlaid on the network of goals and risks.

68 Risks, deliverables and test types
[Table] Rows: goals/risks, including non-functional risks (security, performance, usability, backup/recovery, failover/restart, volume, stress, etc.). Columns: deliverables (requirements, HL design, tech design, program spec, code, sub-system, system) and test types (walkthrough, review, inspection, prototype, early test preparation, static analysis, unit test, integration test, system test, acceptance test).

69 Risks, objectives and test stages
[Table] For each goal/risk: the test objective, the sub-system testing technique and the system testing technique.

70 From goals/risks to test process
From goals/risks to test process
Identify/analyse the goals and risks: what evidence must testing provide?
Define, for each goal/risk:
Objective(s)
Object under test
Test and coverage model(s), coverage target(s)
Entry, exit and acceptance criteria
Responsibility, deliverables
Environment, tools, techniques, methods
Select a test activity to achieve the test objective
Collect test activities into stages and align them with the goal network or project plan. (PAUL)
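The define-then-collect steps above can be sketched as a small catalogue that is folded into stages. All the risks, objectives, objects, models and stage names below are invented for illustration; only the shape of the mapping comes from the slide:

```python
# Hypothetical sketch: each goal/risk gets an objective, an object under
# test, a test model and a stage; activities are then collected by stage.
catalogue = [
    {"risk": "wrong calculations", "objective": "demonstrate calculations are correct",
     "object": "pricing module", "model": "boundary values", "stage": "system test"},
    {"risk": "slow response",      "objective": "demonstrate response times under load",
     "object": "whole system",    "model": "operational profile", "stage": "performance test"},
    {"risk": "unusable screens",   "objective": "demonstrate users can complete key tasks",
     "object": "UI",              "model": "user scenarios", "stage": "UAT"},
]

def activities_by_stage(catalogue):
    """Collect test activities into stages, as the final step requires."""
    stages = {}
    for entry in catalogue:
        stages.setdefault(entry["stage"], []).append(entry["objective"])
    return stages

stages = activities_by_stage(catalogue)
```

Kept as data rather than prose, the same catalogue can later drive reporting: every stage traces back to the goal or risk that justified it.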

71 Intelligent Testing and Assurance
Testing Uncertainty
Planning relies on predictions of the future, but how can you predict test status at a future date? The answer is… you can’t. The Testing Uncertainty Principle: one can predict test status, but not when it will be achieved; one can predict when a test will end, but not its status.

72 Testing in the real world
Testing in the real world
Time and cost limit what can be done
Some risks may be deemed acceptable with little or no testing
Some goals will be deemed ‘achieved’ without testing at all
Items on the plan will be de-scoped to squeeze the plan into the available timescales or budget: mark de-scoped line items ‘out of scope’; if someone asks later what happened, you have evidence that the goal/risk was considered, but waived. (PAUL)

73 Acceptance Criteria

74 Intelligent Testing and Assurance
Acceptance criteria represent the overall readiness to commit to going live, considering:
The readiness of the solution
The readiness of the business
The ability to implement (and roll back, if necessary)
The ability to live with the difficulties of the early days
The ability to support the system in its early days
The need to be compliant
Here is a generic but comprehensive set of acceptance criteria for a large programme.

75 Level 1 Criteria (example)
(The criteria span Steady State Operation, Implementation and Early Life)
The Solution (system, process and data) is ready
The Organisation (business and IT) is ready
We are ready to Implement the solution (and roll back if necessary)
We are ready to Support the solution
Operational Risks are understood and mitigating actions taken
We meet the necessary Regulatory and Compliance requirements

76 Level 2 Criteria (example)
The Solution (system, process and data) is ready: The users have proven that the solution (system and data) supports the business processes. The quality level is high (demonstrated by a low quantity and severity of defects), with agreed workarounds for significant defects. The system performance is sufficient and it is reliable, stable and robust to failure.
The Organisation (business and IT) is ready: New organisational structures are in place and necessary positions are filled. Sufficient staff have been trained in the new solution and competency has been assessed to be acceptable. Third parties understand the changes and their readiness has been confirmed. Benefit realisation plans are in place.
We are ready to Implement the solution (and roll back if necessary): Implementation and roll-back plans have been adequately tested, rehearsed and communicated. Roles, responsibilities and the escalation path over the cutover period have been agreed. Temporary business workarounds during the cutover period have been agreed.
We are ready to Support the solution: Early Life support processes and people are in place. Sufficient transition and handover has been completed with the support and operations groups to enable the solution to be supported in Early Life. Processes and metrics are in place to provide early warning of operational problems during Early Life.
Operational Risks are understood and mitigating actions agreed: For the significant operational risks, mitigating actions have been agreed including, where possible, tested contingency plans. Any residual risk (i.e. not fully mitigated) has been understood and accepted by senior management.
We meet the necessary Regulatory and Compliance requirements: The necessary regulatory and compliance approvals have been received (Audit, SOX, System Security).

77 Test exit criteria = Assumptions
A test exit criterion is, in effect, a planning assumption.
If exit criteria are met on time or earlier, our planning assumptions were sound: we are where we want to be.
If exit criteria are not met, or not met on time, our plan was optimistic: the plan needs adjustment, or we must relax the criteria.
So what do test exit criteria actually mean?
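The "exit criteria are planning assumptions" logic above can be sketched in code. This is a minimal illustration only; the criterion names and dates are invented, not taken from the course material:

```python
from datetime import date

def assess_exit_criteria(criteria, today):
    """Classify each exit criterion against the plan.

    Each criterion has a 'planned' date by which it should be met and a
    'met_on' date (None if still unmet). Met on time means the planning
    assumption held; met late, or unmet past the planned date, means the
    plan was optimistic and needs adjusting (or the criterion relaxing).
    """
    sound, late, unmet = [], [], []
    for c in criteria:
        if c["met_on"] is not None and c["met_on"] <= c["planned"]:
            sound.append(c["name"])      # assumption held
        elif c["met_on"] is not None:
            late.append(c["name"])       # plan was optimistic
        elif today > c["planned"]:
            unmet.append(c["name"])      # adjust plan or relax criterion
    return sound, late, unmet

# Hypothetical exit criteria for a system test phase
criteria = [
    {"name": "all severity-1 defects fixed", "planned": date(2017, 4, 1),
     "met_on": date(2017, 3, 30)},
    {"name": "95% of planned tests run", "planned": date(2017, 4, 1),
     "met_on": date(2017, 4, 5)},
    {"name": "performance targets met", "planned": date(2017, 4, 1),
     "met_on": None},
]
sound, late, unmet = assess_exit_criteria(criteria, today=date(2017, 4, 7))
```

Run against the sample data, the first criterion confirms the plan was sound, the second shows it was optimistic, and the third is still outstanding.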

78 Goal, Risk and Coverage-Based Test Reporting

79 Test Strategy overview
(Table: test phases — Reqs, Design, Build, Integ, Systest, UAT, Trial, Prod. — across the top; rows for Test Driver (business goals and risks), Test Objectives and Coverage target. Objectives for each test phase are easily identified.)

80 Risk-based reporting
(Chart: residual risks plotted against progress through the test plan, from the start through ‘today’ to the planned end. All risks are ‘open’ at the start; the residual risks remaining at ‘today’ are the risks of releasing TODAY.)

81 Benefits of risk-based test reporting
Risk of release is known:
on the day you start, and throughout the test phase
on the day before testing is squeezed
Progress through the test plan brings positive results – risks are checked off, benefits available.
Pressure: to eliminate risks, and on testers to provide evidence that risks are gone.
We assume the system does not work until we have evidence – “guilty until proven innocent”.
Reporting is in the language that management and stakeholders understand.
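The reporting idea above reduces to a simple calculation: every risk is open at the start, and whatever has not been closed by test evidence is the residual risk of releasing today. A sketch, with invented risk names:

```python
def residual_risks(all_risks, closed):
    """Risks still open if we released today: 'guilty until proven innocent'."""
    return [r for r in all_risks if r not in closed]

# Hypothetical risk register for a test phase
all_risks = ["wrong prize calculation", "slow draw processing",
             "insecure ticket data"]

# Day 0: no test evidence yet, so every risk is open.
open_at_start = residual_risks(all_risks, closed=set())

# Mid-phase: testing has produced evidence that closes two risks; what
# remains is the residual risk of releasing TODAY.
open_today = residual_risks(
    all_risks, closed={"wrong prize calculation", "slow draw processing"})
```

The point of the chart is exactly this: `open_at_start` is the whole register, and the release decision is a judgement on whatever is left in `open_today`.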

82 Goal/Risk based reporting
(Diagram: risks marked Open or Closed, mapped against the goals/benefits they block; the goals/benefits available for release are those whose risks are all Closed.)

83 Benefits of benefit-based test reporting
The risk(s) that block every benefit are known:
on the day you start, and throughout the test phase
before testing is squeezed
Progress through the test plan brings positive results – benefits are delivered.
Pressure: to eliminate risks, and on testers to provide evidence that benefits are delivered.
We assume that the system has no benefits to deliver until we have evidence.
Reporting is in the language that management and stakeholders understand.
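The benefit-based rule — a benefit is only available for release once every risk blocking it is closed — can be written down directly. The benefits and blocking risks below are invented examples:

```python
def available_benefits(blocking, closed_risks):
    """A benefit is available for release only when every risk that
    blocks it has been closed by test evidence."""
    return [benefit for benefit, risks in blocking.items()
            if all(r in closed_risks for r in risks)]

# Hypothetical benefits and the risks that block each of them
blocking = {
    "sell tickets online":  {"payment failures", "site overload"},
    "publish draw results": {"wrong winning numbers"},
}

# 'site overload' is still open, so only one benefit is deliverable.
closed_risks = {"wrong winning numbers", "payment failures"}
available = available_benefits(blocking, closed_risks)
```

This is the report stakeholders actually want: not test counts, but which benefits they could have today and which risks still stand in the way.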

84 Communicating Test Strategies

85 Based on IEEE Standard 829-1998
IEEE 829 Test Plan Outline:
1. Test Plan Identifier
2. Introduction
3. Test Items
4. Features to be Tested
5. Features not to be Tested
6. Approach
7. Item Pass/Fail Criteria
8. Suspension Criteria and Resumption Requirements
9. Test Deliverables
10. Testing Tasks
11. Environmental Needs
12. Responsibilities
13. Staffing and Training Needs
14. Schedule
15. Risks and Contingencies
16. Approvals
There is an IEEE standard for a Test Plan. This standard can be applied to the Test Strategy, the Master Test Plan or any of the Phase Test Plans.

86 Intelligent Testing and Assurance
I’m no fan of IEEE 829:
used as a strategy checklist, it is scarily vague (don’t go there)
used as a documentation template/standard, it is flexible, not prescriptive, but it encourages a copy-and-edit mentality (documents that no one reads)
But many, many testers seek guidance on:
what to consider in a test strategy
communicating their strategy to stakeholders and project participants.

87 Intelligent Testing and Assurance
IEEE 829 Plan and Axioms
Items 1, 2 – Administration
Items – Scope Management, Prioritisation
Item 6 – All the Axioms are relevant
Items 7+8 – Good-Enough, Value
Item 9 – Stakeholder, Value, Confidence
Item 10 – All the Axioms are relevant
Item 11 – Environment
Item 12 – Stakeholder
Item 13 – All the Axioms are relevant
Item 14 – All the Axioms are relevant
Item 15 – Fallibility, Event
Item 16 – Stakeholder Axioms

88 A Better Test Strategy and Plan
Stakeholder Objectives
  Stakeholder management
  Goal and risk management
  Decisions to be made and how (acceptance)
  How testing will provide confidence and be assessed
  How scope will be determined
Design approach
  Test phases and sequence
  Sources of knowledge (bases and oracles)
  Sources of uncertainty
  Models to be used for design and coverage
  Prioritisation approach
Delivery approach
  Test sequencing policy
  Repeat test policies
  Environment requirements
  Information delivery approach
  Incident management approach
  Execution and end-game approach
Plan (high or low-level)
  Scope
  Tasks
  Responsibilities
  Schedule
  Approvals
  Risks and contingencies

89 Reaching your audience
It’s all about consensus:
the strategy must address stakeholders’ concerns
present the strategy aligned with those concerns
“their part” of the strategy sets out HOW it addresses those concerns:
  Will it evidence goal achievement?
  Does it address the risks?
  Does it set out MY and OTHERS’ responsibilities?
  How do MY activities fit into the context of the larger plan?
Involve stakeholders in risk workshops and reviews, and consult them before presenting your proposal.

90 Intelligent Testing and Assurance
Audience messages
“Business goals and risk” focus: project stakeholders, management, users
Are my goals being evidenced?
Has every risk been identified?
Has every risk been addressed?
Is the right group addressing the risk?
Will tests address MY concerns?

91 Intelligent Testing and Assurance
Audience messages 2
“Contractual aspects” focus: software suppliers and contract people
Does the strategy match the contract?
Are test activities aligned with stage payments?
Is the strategy fair, and does it allow me to be paid?
Does the strategy impose the right level of test activity on the supplier?
How will testing demonstrate that we get what we want?

92 Intelligent Testing and Assurance
Audience messages 3
“Risks and responsibilities” focus: suppliers, developers, system test team, UAT team
Do I know what I have to do in my testing?
Who covers the things I don’t cover?
When do I start testing? When do I stop?
What evidence do I need to provide to address the stakeholder risks?
How do I report progress?

93 Intelligent Testing and Assurance
Audience messages 4
“Meeting business and technical requirements” focus: business analysts, technical architects, operations, technical support
How will testing show the architecture “works”?
How will testing give me confidence the system “works”?
How will testing demonstrate the system can be operated and supported?
Is the system ready to be deployed?

94 Communicating with the audience
Presentations: appropriate for management teams or larger groups; Q&A can stay at a high level.
Walkthroughs: smaller groups get more involved in the detail.
In person: involve individuals in workshops, reviews and handover.
Focus on the audience messages regardless of the medium.

95 Closing Comments

96 Intelligent test strategy
Helps stakeholders:
they get more involved and buy in
they have better visibility of the test process
Helps testers:
clear guidance on the focus of testing
approval to test against risks in scope
approval not to test against risks out of scope
clearer test objectives upon which to design tests.

97 Intelligent test execution and reporting
Helps stakeholders: they have better visibility of the results available and the risks that block results.
Helps management:
to see progress in terms of risks addressed and results that are available for delivery
to manage the risks that block acceptance
to better make the release decision.
Helps project managers, helps testers.

98 We work for the operator of a national lottery
The Case Study: we work for the operator of a national lottery.

99 We’ll work through several exercises
Introduction to the project (30-40m): regulatory framework and stakeholders; stakeholder objectives/benefits and concerns
The lottery process (more risks) (20m)
Project Goal network and Test Process (20m)
Test Phase Responsibilities (20m)
Test Phase Definition (20m)
What’s Left?

100 If the information you need is not provided – make it up
If there is an unknown – don’t get stuck: in the team, think of a possible scenario and assume that.
There won’t be time for me to invent answers and explain… and your ideas are probably more interesting.

101 Introduction to the project (30-40 minutes)
On your own:
read the Case Study overview (page ONLY)
read the regulatory framework
learn about the stakeholders
identify 1-2 goals and 3-4 risks
As a TEAM, discuss your goals and risks and list them in your workbook on page 12.
If you need more space, use page 25 onwards…

102 Intelligent Testing and Assurance
Introduction retro
Did everyone find the same goals/risks?
Is it hard to find goals? Is it hard to find risks?
Should these goals/risks be in scope for testing? Which ones aren’t suitable for testing?
Does it depend on how you define testing?
Get your scope right/agreed!

103 The lottery process (20 mins)
Read through the lottery process stage by stage (page ONLY).
Each stage has a short narrative.
Can you think of any new risks? Add them to the table on page 12.

104 Intelligent Testing and Assurance
Lottery process retro
How many risks do you think there are to be documented?
How many are in scope for testing?
Who has the most risks of concern?
For each risk – is there a single mitigation, or more than one?

105 The Project Goal network and The Test Process (20 mins)
The project goal network on page 16 is an outline of the goals of the project.
Read and discuss the diagram as a team.
Some test activities are shown, but are merged with development activities.
Mark up the Project Goal network with your test phases, or use page 19 to sketch out the network of goals and test phases that you think you need.

106 Intelligent Testing and Assurance
Project Goals retro
Did having the system schematic help?
Each test phase measures achievement.
Each test phase is associated with a project goal, activity or deliverable.
If a test phase ‘passes’, then the goal is achieved.
Does exiting a test phase confirm that a project goal has been achieved?

107 Test stage responsibilities (RACI/RASCI chart) (10 mins)
For each test phase, define some objectives (goals to demonstrate or risks to be mitigated) and assign responsibilities.
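One way to capture such a RASCI chart is as plain data with a consistency check: every objective should have exactly one Responsible and one Accountable role. This is a hypothetical sketch — the phases, objectives and role names are invented, not the course's chart format:

```python
# RASCI codes: R=Responsible, A=Accountable, S=Supportive,
# C=Consulted, I=Informed. All entries below are illustrative.
rasci = {
    ("System Test", "mitigate: wrong prize calculation"): {
        "supplier test team": "R", "test manager": "A",
        "business analysts": "C", "regulator": "I",
    },
    ("UAT", "demonstrate: draw process works end to end"): {
        "user team": "R", "business sponsor": "A",
        "supplier": "S", "test manager": "C",
    },
}

def rasci_problems(chart):
    """Flag objectives that lack exactly one Responsible and one Accountable."""
    problems = []
    for objective, roles in chart.items():
        codes = list(roles.values())
        if codes.count("R") != 1 or codes.count("A") != 1:
            problems.append(objective)
    return problems

problems = rasci_problems(rasci)
```

An empty `problems` list means every test phase objective has an unambiguous owner, which is the whole point of the exercise.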

108 Test Phase Definition (all day)
Now you have a much clearer understanding of the goals and risks.
You have identified a set of test stages and begun to assign goals/risks and responsibilities.
Read the page 20 guidelines for completing the full test stage definitions.
Create a full definition of one of your test stages (I suggest a system or integration test stage to start).

109 What’s Left/Missing? Lots!

110 Intelligent Testing and Assurance
What’s missing?
No requirements! Do you need them (yet)?
No discussion of what the sources of knowledge (bases and oracles) are, or of their quality/usefulness.
No discussion of test models: how do we identify things to test? What coverage targets could be used; can we define them; can we measure them?
No discussion of our capabilities (the operator) and the supplier’s.
No mention of environments or technology.

111 Intelligent Testing and Assurance
What’s missing 2
No discussion of acceptance criteria (consider slides etc.).
No discussion of what evidence stakeholders actually require, and in what format.
No mention of the incident process.
No discussion of test execution tools (developer or system level).
No discussion of test design/record-keeping.
And lots more, I know…

112 Have we addressed all of your concerns?
Email me your test strategy questions – answers are often good ideas for blogs.

113 Or… ask us to run a test strategy workshop at your site
Thank You!

114 How to Create a Test Strategy
Intelligent Assurance and Testing 13-Apr-17 How to Create a Test Strategy @paul_gerrard Paul Gerrard Web:
