
1 How to Create a Test Strategy
Paul Gerrard
Twitter: @paul_gerrard
Web:

2 Paul Gerrard is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and a publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them. Educated at the universities of Oxford and Imperial College London, Paul won the EuroSTAR European Testing Excellence Award in 2010. In 2012, with Susan Windsor, he co-authored "The Business Story Pocketbook". He is Principal of Gerrard Consulting Limited and is the host of the UK Test Management Forum and the UK Business Analysis Forum. Intelligent Testing and Assurance, Slide 2

3
- Your test strategy challenges?
- What is test strategy?
- Test strategy approach
- Test axioms as thinking tools
- Using the First Equation of Testing
- Testing in staged projects
- Goals, risks and designing the test process
- Goals, risks and coverage-based test reporting
- Communicating test strategies
- Case study exercise
- Your challenges revisited (if we have time).
NB some slides are hidden and won't be presented today.

4
- This is a full-day course (or a two-day course, customised for clients)
- I won't talk over every slide
- We might not get through the whole Case Study exercise in detail
- But you WILL get some practice through the many Q&A sessions and conversations about test strategy.


6 Ever written a test strategy that no one read?

7 Dig a Hole? Test a system?

8 Before you can plan a test, you need to have a lot of questions answered. The strategy:
- Presents some decisions that can be made ahead of time
- Defines the process, method or information that will allow decisions to be made (in project)
- Sets out the principles (or process) to follow for uncertain situations or unplanned events.

9
1. Success-based: test to show it works
2. Defect-based: test to find bugs
3. Coverage-based: analyse requirements and code to achieve test coverage targets
4. Risk-based: use risk to focus testing and to inform risk assessment
5. Goal-based: use business goals and risk to focus testing and support decision-making.
FOCUS: from programmer → tester → stakeholder

10
- Multiple objectives in multiple test stages
- Objectives vary with product readiness (testing in the small → testing in the large)
- Objectives vary by role: developer → tester → user → sponsor
- Multiple stakeholders: sponsor, users, developers, support staff, project management
- Different test models for each test objective
- Projects are unique; their goals and risks, and therefore the test approach, are unique too.

11 A test strategy sets out:
- Who the testing stakeholders are
- What their concerns are and how those concerns will be used to scope testing
- What, when and how testing will report to stakeholders
- Who will perform the testing and reporting
- The human and technical resources required to deliver
- What might change, what could go wrong and how testers can recover the situation.

12
- Those who focus on risk and business goals:
  - Sponsors, project stakeholders
  - Business users
  - Project management
- Those who focus on contractual aspects, stage payments etc.:
  - Software suppliers
  - Contracts people.

13
- Those who focus on their risks and responsibilities:
  - Suppliers
  - Developers
  - System Test Team
  - UAT Team
- Those who focus on meeting business and technical requirements:
  - Technical Architects
  - Operations
  - Technical Support.

14 Strategy is a thought process, not a document

15
1. Introduction
  1.1 Version History
  1.2 Purpose
  1.3 Scope
  1.4 Background
  1.5 Assumptions and Dependencies
  1.6 Summary
2. Overview of User Testing
  2.1 Objectives
  2.2 User Testing and the Overall Project
  2.3 Functional Requirements Testing
  2.4 The Need for Technical Requirements Testing
  2.5 Starting Early - Front-loading
3. User Test Policy
  3.1 Baseline for Testing
  3.2 Contract
    3.2.1 Acceptance Criteria
    3.2.2 XXX and Supplier Responsibilities
    3.2.3 Testing and QA
    3.2.4 Quality Plan
  3.3 Testing Criteria
  3.4 Risk Criteria
  3.5 Starting Criteria
  3.6 Policy for Re-Testing
  3.7 Policy for Regression Testing
  3.8 Completion Criteria
  3.9 Handover to Production
  3.10 Documentation Plan
    3.10.1 User Test Strategy
    3.10.2 Test Plan
    3.10.3 Test Log
    3.10.4 Incident Log
    3.10.5 Error Log
    3.10.6 Test Report
4. Functional Requirements Testing
  4.1 Approach
  4.2 Process
  4.3 Special Application Needs
5. Technical Requirements Testing
  5.1 Usability Testing
    5.1.1 Requirements
    5.1.2 Conducting Usability Tests
  5.2 Performance Testing
    5.2.1 Requirements for Performance Testing
    5.2.2 Performance Test Cases
  5.3 Conversion Testing
  5.4 Security Testing
    5.4.1 Security Tests
    5.4.2 Security Test Cases
  5.5 Documentation Testing
  5.6 Volume Testing
  5.7 Stress Testing
  5.8 Storage Testing
  5.9 Recovery Testing
  5.10 Installation Testing
  5.11 Reliability Testing
  5.12 Serviceability Testing
  5.13 Portability Testing
  5.14 Tests Not Required by the Users
6. User Test Infrastructure
  6.1 Test Environment
    6.1.1 Support
    6.1.2 Roles and Responsibilities
    6.1.3 Test Environment
  6.2 Tools and Automation
    6.2.1 Comparators
    6.2.2 Test Data Generators
    6.2.3 Capture/Replay Tools
    6.2.4 Testing Information Systems
    6.2.5 Database Query and Maintenance Facilities
    6.2.6 Transaction Simulators
7. Schedule
  7.1 Milestone Plan
  7.2 Activities to be Resourced
  7.3 Skills Required
8. User Test Execution
  8.1 Acceptance Test Procedure
    8.1.1 Pre-Test Meeting
    8.1.2 During the Test
    8.1.3 Post-Test Meeting
  8.2 Software Delivery
  8.3 Testing to Plan
  8.4 Handling Failures
  8.5 Logging Tests
  8.6 Error Classification
  8.7 Controlling Releases of New Versions
  8.8 Regression Testing
  8.9 Documentation
This is the table of contents of a 51-page document for an acceptance test of an outsourced development (written by me). A large safety-related system might have 150 pages of test strategy supported by 10-20 other risk-related documents. "Does size matter?"

16 [Mind map] Test Strategy, shaped by: Risks; Goals; Constraints (human resource, environment, timescales, process (lack of?), contract, culture); Opportunities (user involvement, automation, de-duplication, early testing, skills); Communication; Axioms; Artefacts.


18
- Formulated as a context-neutral set of rules for testing systems
- They represent the critical thinking processes required to test any system
- There are clear opportunities to advance the practice of testing using them
- Testers Pocketbook:
- Test Axioms Website

19
- Test Axioms are not beginners' guides
- They can help you to think critically about testing
- They expose flaws in other people's thinking and their arguments about testing
- They generate some useful by-products
- They help you to separate context from values
- Interesting research areas! First Equation of Testing, Testing Uncertainty Principle, Quantum Theory, Relativity, Exclusion Principle... You can tell I like physics.

20 The Axioms are thinking tools

21 The axioms: Design, Coverage, Value, Scope, Prioritisation, Fallibility, Event, Oracle, Never-Finished, Good-Enough, Environment, Basis, Repeat-Test, Delivery, Sequencing, Stakeholder.

22 The axioms: Design, Coverage, Value, Scope, Prioritisation, Fallibility, Event, Oracle, Never-Finished, Good-Enough, Environment, Basis, Repeat-Test, Delivery, Sequencing, Stakeholder.

23 The axioms in three groups:
- Stakeholder, Value, Scope, Fallibility, Good-Enough
- Delivery, Repeat-Test, Sequence, Environment, Event, Never-Finished
- Design, Basis, Coverage, Prioritisation, Oracle

24 Summary: Identify and engage the people or organisations that will use and benefit from the test evidence we are to provide.
Consequence if ignored or violated: There will be no mandate or authority for testing. Reports of passes, fails or enquiries have no audience.
Questions:
- Who are they?
- Whose interests do they represent?
- What evidence do they want?
- What do they need it for?
- When do they want it?
- In what format?
- How often?

25 Summary: Choose test models to derive tests that are meaningful to stakeholders. Recognise the models' limitations and the assumptions that the models make.
Consequence if ignored or violated: Test design will be meaningless and not credible to stakeholders.
Questions:
- Are design models available to use as test models? Are they mandatory?
- What test models could be used to derive tests from the Test Basis?
- Which test models will be used?
- Are test models to be documented, or are they purely mental models?
- What are the benefits of using these models?
- What simplifying assumptions do these models make?
- How will these models contribute to the delivery of evidence useful to the acceptance decision makers?
- How will these models combine to provide sufficient evidence without excessive duplication?
- How will the number of tests derived from models be bounded?

26 Summary: Establish the need and requirements for an environment and test data to be used for testing, including a mechanism for managing changes to that environment, in good time.
Consequence if ignored or violated: Environments are not available in time or are unsuitable for testing. This will delay testing or cause tests to be run in the wrong environment, undermining the credibility of evidence produced.
Questions:
- Who is responsible for the acquisition, configuration and support of test environments?
- What assumptions regarding test environments do our test models make?
- How will requirements for test environments be articulated and negotiated?
- How will the validity and usability of test environments be assured?
- How will changes to environments be managed, consistent with changes in requirements and other deliverables under test?
- How will the state of environments, including backed-up and restored versions, be managed?


28 [Diagram] The First Equation of Testing: Axioms + Context + Values + Thinking = Approach. Not an equation in the mathematical sense: you need to consider the three key aspects and do a lot of thinking.

29
- Separation of Axioms, context, values and thinking
- Tools, methodologies, certification and maturity models promote approaches without reference to your context or values
- No thinking is required!
- Without a unifying test theory you have no objective way of assessing these products.

30
- Given a context, practitioners can promote different approaches based on their values
- Values are preferences or beliefs:
  - Pre-planned vs exploratory
  - Predefined vs custom process
  - Requirements-driven vs goal-based
  - Standard documentation vs face-to-face comms
- Some contexts preclude certain practices
- "No best practices"

31
- Separating axioms, context and values clarifies positions, for example:
  - 'Structured' test advocates have little (useful) to say about Agile contexts
  - Exploratory test advocates have little (useful) to say about contract/requirements-based acceptance
- The disputes between these positions are more about values than practices in context.

32 The V-Model, W-Model and Goal-Based Approaches

33 [V-Model diagram] Baselines on the left: Requirements, Functional Specification, Physical Design, Program Specification. Test stages on the right: User Acceptance Test, System Test, Integration Test, Unit Test.
Is there ever a one-to-one relationship between baseline documents and testing? Where is the static testing (reviews, inspections, static analysis etc.)?

34
- Project documents: schedule, quality plan, test strategy, standards
- Deliverables:
  - requirements, designs, specifications, user documentation, procedures
  - software: custom-built or COTS components, sub-systems, systems, interfaces
  - infrastructure: hardware, O/S, network, DBMS
  - transition plans, conversion software, training...

35
- Testing is the process of evaluating the deliverables of a software project to:
  - detect faults so they can be removed
  - demonstrate products meet their requirements
  - gain confidence that products are ready for use
  - measure and reduce risk
- Testing includes:
  - static tests: reviews, inspections etc.
  - dynamic tests: unit, system, acceptance tests etc.

36 [W-Model diagram] Build activities: Write Requirements, Specify System, Design System, Write Code, Build Software, Build System, Install System. Testing activities: Test the Requirements, Test the Specification, Test the Design, Unit Test, Integration Test, System Test, Acceptance Test.

37 [W-Model diagram as before, annotated with static test techniques: Reviews, Inspections, Scenario Walkthroughs, Early Test Case Preparation, Static Analysis, Requirements Animation.]

38 [W-Model diagram as before, annotated with dynamic test techniques: Security Testing, Path Testing, Performance Testing, Usability Testing, Business Integration Testing, System Integration Testing, Equivalence Partitioning, Boundary Value Testing, Exploratory Testing.]

39 [Diagram: a programme-managed test process]

40
- The fundamental business objectives of the system(s) to be built, implemented and used
- The benefits of undertaking the project
- The payoff(s) that underpin and justify the project
- Risks are what threaten the goals of a project.

41
- The test strategy must set out how:
  - Achievements (the goals) of a project are evidenced or demonstrated
  - The risks that threaten goals will be explored, re-assessed and deemed acceptable (or not)
- We need to understand the goals and how achievement will be measured
- We need to understand (in particular, product) risks and how they are explored and exposed.

42 [Diagram: a goal/risk network] Every project has a network of dependent interim and ultimate goals, threatened by risks; at the end of the network sits the ultimate business goal. Your strategy will identify the test activities that will measure goal achievement and provide evidence about these risks.


44
- From the Italian risicare, "to dare"
- Simple generic definition: "The probability that undesirable events will occur"
- In this tutorial, we will use this definition: "A risk threatens one or more of a project's goals and has an uncertain probability"

45
- Risks only exist where there is uncertainty
- If the probability of a risk is zero or 100%, it is not a risk
- Unless there is the potential for loss, there is no risk ("nothing ventured, nothing gained")
- There are risks associated with every project
- Software development is inherently risky.
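The two rules above (a risk must be uncertain, and it must carry potential loss) can be captured in a small sketch. The `Risk` class and the exposure formula (probability weighted by consequence) are illustrative conventions, not something the slides prescribe:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """A risk threatens a project goal and has an uncertain probability."""
    name: str
    probability: float  # strictly between 0 and 1: certainty either way is not a risk
    consequence: int    # relative loss if it materialises, e.g. 1 (low) to 5 (high)

    def __post_init__(self):
        if not 0.0 < self.probability < 1.0:
            raise ValueError("not a risk: probability must be uncertain (0 < p < 1)")
        if self.consequence <= 0:
            raise ValueError("not a risk: there must be potential for loss")

    def exposure(self) -> float:
        """One common way to rank risks: probability weighted by consequence."""
        return self.probability * self.consequence
```

Attempting to construct a "risk" with probability 1.0, or with no potential loss, raises an error, mirroring the definition above.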

46
Project Risk: resource constraints, external interfaces, supplier relationships, contract restrictions.
Process Risk: variances in planning and estimation, shortfalls in staffing, failure to track progress, lack of quality assurance and configuration management.
(Project and process risk are primarily a management responsibility; planning and the development process are the main issues here.)
Product Risk: lack of requirements stability, complexity, design quality, coding quality, non-functional issues, test specifications. Requirements risks are the most significant risks reported in risk assessments.
Testers are mainly concerned with Product Risk.


48
- Risk identification: what are the risks to be addressed?
- Risk analysis: nature, probability, consequences, exposure
- Risk response planning: pre-emptive or reactive risk reduction measures
- Risk resolution and monitoring
- Stakeholders should be involved at all stages.
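A minimal sketch of the process above. The risk names, probabilities and the exposure threshold are illustrative assumptions, not taken from the slides:

```python
# Risk identification: list the risks to be addressed.
risks = [
    {"name": "calculations don't work", "probability": 0.3, "consequence": 5},
    {"name": "pages don't integrate",   "probability": 0.6, "consequence": 3},
    {"name": "performance is poor",     "probability": 0.2, "consequence": 4},
]

def exposure(risk):
    # Risk analysis: probability weighted by consequence.
    return risk["probability"] * risk["consequence"]

# Risk response planning: higher exposures get pre-emptive measures,
# the rest get reactive (contingency) measures.
for risk in risks:
    risk["response"] = "pre-emptive" if exposure(risk) >= 1.5 else "reactive"

# Risk resolution and monitoring: review the register, highest exposure first.
register = sorted(risks, key=exposure, reverse=True)
```

In practice the probabilities and consequences, and the thresholds for each response, would be agreed with stakeholders at each stage.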

49 [Chart: probability/consequence grid, showing where we want to move all risks]

50
- Do nothing!
- Pre-emptive risk reduction measures: information buying, process model, risk influencing, contractual transfer
- Reactive risk reduction measures: contingency plans, insurance
- But this all sounds highly theoretical – we could never get this to work in my company!
[Annotation: where testing fits in]

51 Even penguins know how to manage risk!


53
- System failures are what we fear
- The faults that cause failures are our prey
- Uncertainty is what makes us concerned:
  - what type of faults are present in the system?
  - how many faults are in the system?
  - did testing remove all the serious faults?
- Testing helps us to address these uncertainties.

54
- If risk assessment steers test activity:
  - we design tests to detect faults
  - we reduce the risks caused by faulty products
- Faults found early reduce rework, cost and time lost in later stages
- Faults found are corrected and re-tested, and so the quality of all products is improved.

55
- Testing is a measurement activity
- Tests that aim to find faults provide information on the quality of the product:
  - which parts of the software are faulty
  - which parts of the software are not faulty
- Tests help us understand the risk of release
- Understanding the risks helps us to make a risk-based decision on release
- After testing, our risk assessment can be refined.

56 After testing, for each risk consider:
- The risk could be unchanged because:
- Risk probability higher because:
- Risk probability lower because:
- Risk consequence higher because:
- Risk consequence lower because:

57 After testing, for each risk consider:
- The risk could be unchanged because:
- Risk probability higher because:
- Risk probability lower because:
- Risk consequence higher because:
- Risk consequence lower because:

58
- If we focus on risks, we know that bugs relating to the selected mode of failure are bound to be important
- If we focus on particular bug types, we will probably be more effective at finding those bugs
- If testers provide evidence that certain failure modes do not occur in a range of test scenarios, we will become more confident that the system will work in production.

59
- Risks describe 'what we don't want to happen'
- Typical modes of failure:
  - calculations don't work
  - pages don't integrate
  - performance is poor
  - user experience is uncomfortable
- Think of them as 'generic bug types'.

60 We 'turn around' the failure mode or risk:
- Risk: a BAD thing happens, and that's a problem for us
- Test objective: demonstrate, using a test, that the system works without the BAD thing happening
- The test: execute important user tasks and verify the BAD things don't happen in a range of scenarios.
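The 'turn around' is mechanical enough to sketch. The function and its wording are illustrative; a real strategy would phrase each objective by hand:

```python
def objective_from_risk(bad_thing: str) -> str:
    """'Turn around' a failure mode into a demonstrable test objective."""
    return f"Demonstrate that the system works without '{bad_thing}' happening"

# The generic failure modes from the previous slide become test objectives:
failure_modes = ["calculations don't work", "pages don't integrate",
                 "performance is poor"]
objectives = [objective_from_risk(m) for m in failure_modes]
```

Each objective is then met by executing important user tasks across a range of scenarios and verifying the BAD thing does not occur.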

61
- Other test objectives relate to broader issues:
  - contractual obligations
  - acceptability of a system to its users
  - demonstrating that all or specified functional or non-functional requirements are met
  - non-negotiable test objectives might relate to mandatory rules imposed by an industry regulatory authority, and so on
- Generic test objectives complete the definition of your test stages.


63
- "Demonstrate" is most often used in test objectives
- Better than "prove", which implies mathematical certainty (which is impossible)
- But is the word "demonstrate" too weak?
  - it represents exactly what we will do
  - we provide evidence for others to make a decision
  - we can only run a tiny fraction of the possible tests
  - so we really are only doing a demonstration with a small sample of tests.

64
- The tester's goal: to locate faults
- We use boundary tests, extreme values, invalid data, exceptional conditions etc. to expose faults:
  - if we find faults, these are fixed and re-tested
  - we are left with tests that were designed to detect faults; some did detect faults, but do so no longer
- We are left with evidence that the feature works correctly, and our test objective is met
- There is no conflict between:
  - strategic risk-based test objectives and
  - the tactical goal of locating faults.

65
- Risk-based test objectives do not change the methods of test design much
- Functional requirements: we use formal or informal test design techniques as normal
- Non-functional requirements: test objectives are often detailed enough to derive specific tests.


67 [Diagram: test phases/activities positioned on the goal/risk network]

68 [Diagram mapping baselines to test activities] Baselines: Requirements, High-Level Design, Technical Design, Program Specification, System, Sub-System, Code, Goal/Risk. Activities: Walkthrough, Review, Inspect, Prototype, Early test preparation, Unit Test, Static analysis, Integration Test, System Test, Acceptance Test, plus non-functional tests (Security, Performance, Usability, Backup/recovery, Failover/restart, Volume, Stress, etc.).

69 [Table: for each goal/risk, the test objective and the sub-system and system testing techniques that address it]

70
1. Identify/analyse the goals and risks
2. Ask: what evidence must testing provide?
3. Define, for each goal/risk:
   a) Objective(s)
   b) Object under test
   c) Test and coverage model(s), coverage target(s)
   d) Entry, exit and acceptance criteria
   e) Responsibility, deliverables
   f) Environment, tools, techniques, methods
4. Select a test activity to achieve the test objective
5. Collect test activities into stages and align them with the goal network or project plan.
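Steps 3-5 above can be sketched as data plus one grouping function. Every entry (goal/risk names, objects under test, coverage models, stage names) is an illustrative assumption:

```python
# Step 3: a definition per goal/risk (a subset of the fields listed above).
definitions = [
    {"goal_risk": "calculations don't work",
     "objective": "demonstrate calculations are correct",
     "object_under_test": "pricing engine",
     "coverage_model": "boundary values",
     "stage": "system test"},
    {"goal_risk": "performance is poor",
     "objective": "demonstrate response times meet targets",
     "object_under_test": "whole system",
     "coverage_model": "peak-load scenarios",
     "stage": "system test"},
    {"goal_risk": "users can't complete key tasks",
     "objective": "demonstrate key business scenarios end to end",
     "object_under_test": "whole system",
     "coverage_model": "business scenarios",
     "stage": "acceptance test"},
]

def collect_into_stages(defs):
    """Steps 4-5: group the chosen test activities into test stages."""
    plan = {}
    for d in defs:
        plan.setdefault(d["stage"], []).append(d["objective"])
    return plan
```

The resulting stage plan is then aligned with the goal network or project plan.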

71
- Planning relies on predictions of the future, but how can you predict test status at a future date?
- The answer is... you can't
- The Testing Uncertainty Principle:
  - One can predict test status, but not when it will be achieved;
  - One can predict when a test will end, but not its status.

72
- Time and cost limit what can be done
- Some risks may be deemed acceptable with little or no testing
- Some goals will be deemed 'achieved' without testing at all
- Items on the plan will be de-scoped to squeeze the plan into the available timescales or budget:
  - mark de-scoped line items 'out of scope'
  - if someone asks later what happened, you have evidence that the goal/risk was considered, but waived.


74
- Represents the overall readiness to commit to going live, considering:
  - The readiness of the solution
  - The readiness of the business
  - The ability to implement (and roll back, if necessary)
  - The ability to live with the difficulties of early days
  - The ability to support the system in its early days
  - The need to be compliant
- Here's a generic but comprehensive set of Acceptance Criteria for a large programme.

75 Timeline: Implementation → Early Life → Steady State Operation
1. The Solution (system, process and data) is ready
2. The Organisation (business and IT) is ready
3. We are ready to Implement the solution (and roll back if necessary)
4. We are ready to Support the solution
5. Operational Risks are understood and mitigating actions taken
6. We meet the necessary Regulatory and Compliance requirements

76 Level 1 Criteria → Level 2 Criteria
- The Solution (system, process and data) is ready: The users have proven that the solution (system and data) supports the business processes. The quality level is high (demonstrated by a low quantity and severity of defects), with agreed workarounds for significant defects. The system performance is sufficient and it is reliable, stable and robust to failure.
- The Organisation (business and IT) is ready: New organisational structures are in place and necessary positions are filled. Sufficient staff have been trained in the new solution and competency has been assessed to be acceptable. Third parties understand the changes and their readiness has been confirmed. Benefit realisation plans are in place.
- We are ready to Implement the solution (and roll back if necessary): Implementation and roll-back plans have been adequately tested, rehearsed and communicated. Roles, responsibilities and the escalation path over the cutover period have been agreed. Temporary business workarounds during the cutover period have been agreed.
- We are ready to Support the solution: Early Life support processes and people are in place. Sufficient transition and handover has been completed with the support and operations groups to enable the solution to be supported in Early Life. Processes and metrics are in place to provide early warning of operational problems during Early Life.
- Operational Risks are understood and mitigating actions agreed: For the significant operational risks, mitigating actions have been agreed including, where possible, tested contingency plans. Any residual risk (i.e. not fully mitigated) has been understood and accepted by senior management.
- We meet the necessary Regulatory and Compliance requirements: The necessary regulatory and compliance approvals have been received (Audit, SOX, System Security).

77
- A test exit criterion is, in effect, a planning assumption
- If exit criteria are met on time or earlier, our planning assumptions are sound: we are where we want to be
- If exit criteria are not met, or not met on time, our plan was optimistic: our plan needs adjustment, or we must relax the criteria
- What do test exit criteria actually mean?


79 [Table] Test drivers (business goals, risks, coverage targets) map to test objectives across the project/test phases: Requirements, Design, Build, Integration, System Test, UAT, Trial, Production. Objectives for each test phase are easily identified.

80 [Chart: residual risks versus progress through the test plan] All risks are 'open' at the start; as testing progresses towards the planned end, risks are closed. The height of the curve at 'today' shows the residual risks of releasing TODAY.

81
- The risk of release is known:
  - on the day you start and throughout the test phase
  - on the day before testing is squeezed
- Progress through the test plan brings positive results: risks are checked off, benefits become available
- Pressure: to eliminate risks and for testers to provide evidence that risks are gone
- We assume the system does not work until we have evidence: "guilty until proven innocent"
- Reporting is in the language that management and stakeholders understand.
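The reporting model above reduces to a tiny sketch: every risk starts 'open', test evidence closes it, and the open set on any day is the residual risk of releasing that day. The risk names and statuses here are illustrative:

```python
# Risk register status at some point mid-way through the test phase.
risk_status = {
    "calculations don't work": "closed",   # closed by test evidence
    "pages don't integrate":   "open",
    "performance is poor":     "closed",
    "user experience is uncomfortable": "open",
}

def residual_risks(status):
    """The risks you would be accepting if you released TODAY."""
    return sorted(name for name, s in status.items() if s == "open")
```

Reported this way, a squeezed test phase is visible immediately as a longer residual-risk list, in language stakeholders understand.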

82 [Diagram: goals and the risks that block them] Each goal is linked to its blocking risks, each marked open or closed; the goals/benefits whose blocking risks are all closed are available for release.

83
- The risk(s) that block every benefit are known:
  - on the day you start and throughout the test phase
  - before testing is squeezed
- Progress through the test plan brings positive results: benefits are delivered
- Pressure: to eliminate risks and for testers to provide evidence that benefits are delivered
- We assume that the system has no benefits to deliver until we have evidence
- Reporting is in the language that management and stakeholders understand.
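The benefits-based variant is the same sketch viewed from the goal side: a benefit is deliverable only when every risk blocking it is closed. All names here are illustrative:

```python
# Each goal/benefit mapped to the risks that block it.
blocking_risks = {
    "take payments online": {"calculations don't work", "pages don't integrate"},
    "reduce call-centre volume": {"performance is poor"},
}
# Risks closed so far by test evidence.
closed = {"calculations don't work", "performance is poor"}

def deliverable_benefits(blocking_risks, closed):
    """Benefits available for release: all their blocking risks are closed."""
    return sorted(goal for goal, risks in blocking_risks.items()
                  if risks <= closed)  # subset test: every blocker closed
```

Management then sees progress as benefits unlocked, not test cases executed.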


85
1. Test Plan Identifier
2. Introduction
3. Test Items
4. Features to be Tested
5. Features not to be Tested
6. Approach
7. Item Pass/Fail Criteria
8. Suspension Criteria and Resumption Requirements
9. Test Deliverables
10. Testing Tasks
11. Environmental Needs
12. Responsibilities
13. Staffing and Training Needs
14. Schedule
15. Risks and Contingencies
16. Approvals
Based on IEEE Standard 829-1998

86
- Used as a strategy checklist: scarily vague (don't go there)
- Used as a documentation template/standard: flexible, not prescriptive, but encourages a copy-and-edit mentality (documents that no one reads)
- But many testers seek guidance on:
  - what to consider in a test strategy
  - communicating their strategy to stakeholders and project participants

87
- Items 1, 2 – Administration
- Items 3+4+5 – Scope Management, Prioritisation
- Item 6 – All the Axioms are relevant
- Items 7+8 – Good-Enough, Value
- Item 9 – Stakeholder, Value, Confidence
- Item 10 – All the Axioms are relevant
- Item 11 – Environment
- Item 12 – Stakeholder
- Item 13 – All the Axioms are relevant
- Item 14 – All the Axioms are relevant
- Item 15 – Fallibility, Event
- Item 16 – Stakeholder

88
1. Stakeholder Objectives
  - Stakeholder management
  - Goal and risk management
  - Decisions to be made and how (acceptance)
  - How testing will provide confidence and be assessed
  - How scope will be determined
2. Design approach
  - Test phases and sequence
  - Sources of knowledge (bases and oracles)
  - Sources of uncertainty
  - Models to be used for design and coverage
  - Prioritisation approach
3. Delivery approach
  - Test sequencing policy
  - Repeat-test policies
  - Environment requirements
  - Information delivery approach
  - Incident management approach
  - Execution and end-game approach
4. Plan (high- or low-level)
  - Scope
  - Tasks
  - Responsibilities
  - Schedule
  - Approvals
  - Risks and contingencies

89
- It's all about consensus
- The strategy must address stakeholders' concerns
- Present the strategy aligned with those concerns
- "Their part" of the strategy sets out HOW it addresses those concerns:
  - Will it evidence goal achievement?
  - Does it address the risks?
  - Does it set out MY and OTHERS' responsibilities?
  - How do MY activities fit into the context of the larger plan?
- Involve stakeholders in risk workshops and reviews, and consult them before presenting your proposal.

90 "Business goals and risk" focus: project stakeholders, management, users
- Are my goals being evidenced?
- Has every risk been identified?
- Has every risk been addressed?
- Is the right group addressing the risk?
- Will tests address MY concerns?

91 "Contractual aspects" focus: software suppliers and contracts people
- Does the strategy match the contract?
- Are test activities aligned with stage payments?
- Is the strategy fair, and does it allow me to be paid?
- Does the strategy impose the right level of test activity on the supplier?
- How will testing demonstrate we get what we want?

92 "Risks and responsibilities" focus: suppliers, developers, System Test Team, UAT Team
- Do I know what I have to do in my testing?
- Who covers the things I don't cover?
- When do I start testing?
- When do I stop?
- What evidence do I need to provide to address the stakeholder risks?
- How do I report progress?

93 "Meeting business and technical requirements" focus: business analysts, Technical Architects, Operations, Technical Support
- How will testing show the architecture "works"?
- How will testing give me confidence the system "works"?
- How will testing demonstrate the system can be operated and supported?
- Is the system ready to be deployed?

94
- Presentations
  - Appropriate for management teams or larger groups
  - Q&A can stay at a high level
- Walkthroughs
  - Smaller groups get more involved in the detail
- In person
  - Involve individuals in workshops, reviews, handover
- Focus on the messages for the audience, regardless of the medium.


96 Helps stakeholders:
- They get more involved and buy in
- They have better visibility of the test process
Helps testers:
- Clear guidance on the focus of testing
- Approval to test against risks in scope
- Approval to not test against risks out of scope
- Clearer test objectives upon which to design tests.

97 Helps stakeholders:
- They have better visibility of the results available and the risks that block results
Helps management:
- To see progress in terms of risks addressed and results that are available for delivery
- To manage the risks that block acceptance
- To better make the release decision
Helps project managers, helps testers.
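The risk-based reporting described above is, at bottom, a status roll-up. A minimal sketch — the risk names and statuses below are invented for illustration:

```python
# Each risk carries a status: "open" (not yet tested),
# "addressed" (evidence delivered), or "blocked" (something
# prevents the result being accepted).
risks = {
    "R1 payment fails under load": "addressed",
    "R2 draw results mis-published": "blocked",
    "R3 audit trail has gaps": "open",
}

def progress_summary(risks):
    """Roll risks up into the view a release decision needs."""
    addressed = sum(1 for s in risks.values() if s == "addressed")
    blocking = [r for r, s in risks.items() if s == "blocked"]
    return {"addressed": addressed, "total": len(risks), "blocking": blocking}

summary = progress_summary(risks)
print(f"{summary['addressed']}/{summary['total']} risks addressed; "
      f"blocking acceptance: {summary['blocking']}")
```

Reporting in these terms lets management see progress as "risks addressed" rather than "tests run".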

98 We work for the operator of a national lottery

99
1. Introduction to the project (30-40m)
   1. Regulatory framework and stakeholders
   2. Stakeholder objectives/benefits and concerns
2. The lottery process (more risks) (20m)
3. Project Goal network and Test Process (20m)
4. Test Phase Responsibilities (20m)
5. Test Phase Definition (20m)
6. What's Left?

100
- If there is an unknown, don't get stuck
- In the team, think of a possible scenario and assume it
- There won't be time for me to invent answers and explain
- … and your ideas are probably more interesting.

101
1. On your own:
   1. Read the Case Study overview (pages 10-11 ONLY)
   2. Read the regulatory framework
   3. Learn about the stakeholders
   4. Identify 1-2 goals and 3-4 risks
2. As a TEAM, discuss your goals and risks:
   1. List them in your workbook, page 12
   2. If you need more space, use page 25 onwards…

102
- Did everyone find the same goals/risks?
- Is it hard to find goals?
- Is it hard to find risks?
- Should these goals/risks be in scope for testing?
- Which ones aren't suitable for testing?
- Does it depend on how you define testing?
- Get your scope right/agreed!

103
1. Read through the lottery process stage by stage (pages 13-15 ONLY)
   1. Each stage has a short narrative
2. Can you think of any new risks?
   1. Add them to the table on page 12

104
- How many risks do you think there are to be documented?
- How many are in scope for testing?
- Who has the most risks of concern?
- For each risk, is there a single mitigation or more than one?

105
- The project goal network on page 16 is an outline of the goals of the project
- Read and discuss the diagram as a team
- Some test activities are shown, but are merged with development activities
- Mark up the project goal network with your test phases, or…
- Use page 19 to sketch out the network of goals and test phases that you think you need.

106
- Did having the system schematic help?
- Each test phase measures achievement
- Each test phase is associated with a project goal, activity or deliverable
- If a test phase 'passes', then the goal is achieved
- Does exiting a test phase confirm that a project goal has been achieved?

107 For each test phase, define some objectives (goals to demonstrate or risks to be mitigated) and assign responsibilities
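One way to record that assignment is a simple phase-to-objectives map, which also makes gaps visible. The phase names, risk identifiers and owners below are purely illustrative:

```python
# Hypothetical phase definitions: each phase names the objectives it
# covers (goals to demonstrate, risks to mitigate) and its owner.
phases = {
    "System Test": {
        "objectives": ["demonstrate core workflow", "mitigate R1"],
        "responsible": "supplier test team",
    },
    "UAT": {
        "objectives": ["evidence business goals", "mitigate R3"],
        "responsible": "operator UAT team",
    },
}

def unassigned(phases, all_risks):
    """Risks no phase claims to mitigate -- a gap in the strategy."""
    covered = {obj.split()[-1]
               for p in phases.values()
               for obj in p["objectives"] if obj.startswith("mitigate")}
    return sorted(set(all_risks) - covered)

print(unassigned(phases, ["R1", "R2", "R3"]))  # -> ['R2']
```

Running the gap check during strategy review catches risks that every phase assumes someone else is covering.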

108
- Now you have a much clearer understanding of the goals and risks
- You have identified a set of test stages and begun to assign goals/risks and responsibilities
- Read the page 20 guidelines for completing the full test stage definitions
- Create a full definition of one of your test stages (I suggest a system or integration test stage to start)

109 Lots!

110
- No requirements! Do you need them (yet)?
- No discussion of what the sources of knowledge (bases and oracles) are, or their quality/usefulness
- No discussion of test models
  - How do we identify things to test?
  - What coverage targets could be used; can we define them; can we measure them?
- No discussion of our capabilities (the operator's) and the supplier's
- No mention of environments or technology

111
- No discussion of acceptance criteria (consider slides 73-77 etc.)
- No discussion of what evidence stakeholders actually require, and in what format
- No mention of the incident process
- No discussion of test execution tools (developer or system level)
- No discussion of test design/record-keeping
- And lots more, I know…

112 Email me your test strategy questions – answers are often good ideas for blogs

113 Thank You!

114 How to Create a Test Strategy Paul Gerrard Twitter: @paul_gerrard Web:
